Push Notifications in PWA Using Firebase and React


Many sites send notifications to their users through the browser for various events occurring within the web app. We can do this easily with Firebase Cloud Messaging, which lets us send a message to any device with a simple HTTP request.

Here are the basic steps required for pushing the notifications to a web app using Firebase.


Setup

Firstly, it’s necessary to have a Firebase account. Once you have an account, go to the ‘Create a Project’ section.

For this demo, we are creating the project using the create-react-app command.

In addition to this, we need the Firebase library. For that, open your terminal and run the command npm install firebase --save.


Coding

  1. Let’s create a file inside the project directory named push-notification.js. In it, we create a function that initializes Firebase and holds our project’s keys.

  2. Next, we must call the respective function.

Here, the initializeFirebase() function is called when the app starts.
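Since the original screenshots are not reproduced here, a minimal sketch of what push-notification.js could look like follows. Every config value is a placeholder you must replace with your own Firebase project settings, and the v7-style firebase package (with a firebase global/default export) is assumed:

```javascript
// push-notification.js (sketch) - every config value below is a placeholder;
// copy the real ones from your Firebase console (Project settings > General).
// In the real file you would add: import firebase from 'firebase';
function initializeFirebase() {
  firebase.initializeApp({
    apiKey: '<your-api-key>',
    authDomain: '<your-project-id>.firebaseapp.com',
    databaseURL: 'https://<your-project-id>.firebaseio.com',
    projectId: '<your-project-id>',
    storageBucket: '<your-project-id>.appspot.com',
    messagingSenderId: '<your-sender-id>',
  });
}
```

In index.js, import the function and call it once during start-up, for example: `import { initializeFirebase } from './push-notification'; initializeFirebase();`.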

  • Service Worker: A service worker is a script that the browser runs in the background, separate from the web page. Among other things, it lets the app serve data from a local cache, and that cache survives page refreshes.

Next, we import the service worker (serviceWorker) into the index.js file, where the other packages are also imported. To have something to import, let’s create a new file called serviceWorker.js that exports the registration logic.

In the register function, a default export is used so that users can import it under any name they like in index.js. For example, import sw from './serviceWorker'; works exactly like import serviceWorker from './serviceWorker';.

The if (navigator.serviceWorker.controller) {} branch is the point at which updated pre-cached content has been fetched, but the previous service worker still serves the older content until all of the client’s tabs are closed. In other words, new content is available and will be used once all tabs for this page are closed.
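The register function in serviceWorker.js can be sketched roughly like this. It is a simplified version of what create-react-app generates (the real file performs extra environment and URL checks), and the registered service-worker file name is an assumption:

```javascript
// serviceWorker.js (simplified sketch) - the real CRA file performs extra
// checks (environment, localhost, valid SW URL) that are omitted here.
function register() {
  // Bail out in browsers without service-worker support.
  if (!('serviceWorker' in navigator)) return false;

  window.addEventListener('load', () => {
    navigator.serviceWorker.register('/firebase-messaging-sw.js').then((registration) => {
      registration.onupdatefound = () => {
        const installingWorker = registration.installing;
        installingWorker.onstatechange = () => {
          if (installingWorker.state === 'installed') {
            if (navigator.serviceWorker.controller) {
              // Updated pre-cached content was fetched, but the old worker
              // keeps serving the page until all tabs are closed.
              console.log('New content is available; it will be used once all tabs are closed.');
            } else {
              console.log('Content is cached for offline use.');
            }
          }
        };
      };
    });
  });
  return true;
}

// In the real file this is the default export, so any import name works:
// export default register;
```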

  1. serviceWorker.js will basically import all the scripts needed to show notifications while your app is running in the background.
  2. It is mandatory to have a firebase-messaging-sw.js file in the location where all the other files are served.

  3. In this file, pass the messagingSenderId to the firebase.initializeApp function, using the same sender id as in push-notification.js.
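A sketch of firebase-messaging-sw.js follows. The SDK version in the URLs is an assumption (use the one matching your app), and the typeof guard only exists so the sketch can be parsed outside a worker context; you would not need it in the real file:

```javascript
// firebase-messaging-sw.js (sketch) - this file runs in the service-worker
// context, where importScripts and the firebase global are available.
if (typeof importScripts === 'function') {
  importScripts('https://www.gstatic.com/firebasejs/7.14.2/firebase-app.js');
  importScripts('https://www.gstatic.com/firebasejs/7.14.2/firebase-messaging.js');

  firebase.initializeApp({
    // The same sender id that push-notification.js uses.
    messagingSenderId: '<your-sender-id>',
  });

  // Retrieving a messaging instance lets the SDK handle
  // background messages for you.
  firebase.messaging();
}
```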
  • Requesting Permission to Show Notifications: It is best practice to let the user choose whether or not they wish to see notifications. For that, let’s create a function in the push-notification.js file that makes the request and generates a token. The askForPermissionToReceiveNotifications() function first requests permission, then calls the getToken() method to generate the token, and saves it to local storage.
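A sketch of that function, assuming the v7-style messaging API where requestPermission() and getToken() live on the messaging instance:

```javascript
// push-notification.js (sketch, continued) - `firebase` and `localStorage`
// are browser globals here; error handling is minimal on purpose.
async function askForPermissionToReceiveNotifications() {
  try {
    const messaging = firebase.messaging();
    await messaging.requestPermission(); // prompts the user for permission
    const token = await messaging.getToken();
    localStorage.setItem('notification-token', token);
    return token;
  } catch (error) {
    console.error(error);
  }
}
```

You can then call it from index.js, or wire it to a button's onClick handler.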

Note: You can inspect the generated token in DevTools under Application > Local Storage > <the requested URL> > notification-token. The value stored under the notification-token key is the generated token.

Now, after generating the token, we need to call the function. Here, we call it in the index.js file, but you can call it wherever you need to generate the token, for example on the click of a button.

  • Send the Notification to the User: To send a notification, we make a request to the Firebase API, telling it which token the notification should go to. Note: In the example below we use Postman, but any HTTP client will work.
  • The steps required to make the request are as follows:
  1. Firstly, we make a POST request to the Google API endpoint https://fcm.googleapis.com/fcm/send, sending JSON in the request body.
  2. Our JSON should look like:
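A request body along these lines would work; the title, body, icon, and URL values below are only examples:

```javascript
// The FCM request body. "to" is the device token generated earlier;
// click_action is the URL opened when the user clicks the notification.
const payload = {
  notification: {
    title: 'New message',
    body: 'You have a new message!',
    click_action: 'http://localhost:3000/',
    icon: 'http://localhost:3000/icon.png',
  },
  to: '<the-generated-device-token>',
};
```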

  3. click_action in the JSON describes the URL to open, and to holds the token that was generated.
  4. In the request headers, we pass two keys: Content-Type with the value application/json, and Authorization with the value key=<server key of our project>:

This key is the server key which you can find in your Firebase account:

1. Log in to your Firebase account.

2. Go to Project settings via the settings icon next to Project Overview.

3. Open the Cloud Messaging tab, where you can find the server key for your project.

Now just hit Send on the request in Postman, and everything is good to go.
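Instead of Postman, the same request can be sent from code. Here is a sketch using the browser fetch API; the function name and parameters are my own, not part of the Firebase SDK:

```javascript
// Sends a push notification through the FCM HTTP endpoint.
// serverKey comes from Project settings > Cloud Messaging in the console.
function sendPushNotification(serverKey, token, notification) {
  return fetch('https://fcm.googleapis.com/fcm/send', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': 'key=' + serverKey,
    },
    body: JSON.stringify({ notification: notification, to: token }),
  });
}
```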

Note: Remember that the notification will only appear when your app is in the background or minimized.


Thanks for reading



Angular Architecture Patterns and Best Practices (that help to scale)
<p class="ql-align-center">Originally published by Bartosz Pietrucha at angular-academy.com</p><p>In order to deal with the mentioned factors, to maintain a high quality of delivery and prevent technical debt, a robust and well-grounded architecture is necessary. Angular itself is a quite opinionated framework, forcing developers to do things the proper way, yet there are a lot of places where things can go wrong. In this article, I will present high-level recommendations for a well-designed Angular application architecture, based on best practices and battle-proven patterns. Our ultimate goal is to learn how to design an Angular application in order to maintain a sustainable development speed and ease of adding new features in the long run. To achieve these goals, we will apply:</p>
    <li>proper abstractions between application layers,</li><li>unidirectional data flow,</li><li>reactive state management,</li><li>modular design,</li><li>smart and dumb components pattern.</li>
<p></p>

Problems of scalability in front-end

<p>Let's think about the scalability problems we can face in the development of modern front-end applications. Today, front-end applications are not "just displaying" data and accepting user inputs. Single Page Applications (SPAs) provide users with rich interactions and use the backend mostly as a data persistence layer. This means far more responsibility has been moved to the front-end part of software systems, which leads to a growing complexity of front-end logic that we need to deal with. Not only does the number of requirements grow over time, but the amount of data we load into the application increases as well. On top of that, we need to maintain application performance, which can easily be hurt. Finally, our development teams are growing (or at least rotating - people come and go) and it is important for newcomers to get up to speed as fast as possible.</p><p></p><p>One of the solutions to the problems described above is a solid system architecture. But this comes with a cost: the cost of investing in that architecture from day one. It can be very tempting for us developers to deliver new features quickly when the system is still small. At this stage, everything is easy and understandable, so development goes really fast. But unless we care about the architecture, after a few developer rotations, tricky features, refactorings, and a couple of new modules, the speed of development slows down radically. The diagram below presents how it has usually looked in my development career. This is not a scientific study, it's just how I see it.</p><p></p>

Software architecture

<p>To discuss architecture best practices and patterns, we need to answer the question of what software architecture is in the first place. Martin Fowler defines architecture as the "highest-level breakdown of a system into its parts". On top of that, I would say that software architecture describes how the software is composed of its parts and what the rules and constraints of the communication between those parts are. Usually, the architectural decisions that we make in our system development are hard to change as the system grows over time. That's why it is very important to pay attention to those decisions from the very beginning of our project, especially if the software we build is supposed to be running in production for many years. Robert C. Martin once said: the true cost of software is its maintenance. Having a well-grounded architecture helps to reduce the costs of the system's maintenance.</p>
Software architecture is the way the software is composed of its parts and the rules and constraints of the communication between those parts

High-level abstraction layers

<p>The first way we will decompose our system is through abstraction layers. The diagram below depicts the general concept of this decomposition. The idea is to place the proper responsibility into the proper layer of the system: the core, abstraction, or presentation layer. We will be looking at each layer independently and analyzing its responsibility. This division of the system also dictates communication rules. For example, the presentation layer can talk to the core layer only through the abstraction layer. Later, we will learn what the benefits of this kind of constraint are.</p><p></p>

Presentation layer

<p>Let's start analyzing our system break-down from the presentation layer. This is the place where all our Angular components live. The only responsibilities of this layer are to present and to delegate. In other words, it presents the UI and delegates the user's actions to the core layer, through the abstraction layer. It knows what to display and what to do, but it does not know how the user's interactions should be handled.</p><p>The code snippet below contains CategoriesComponent, which uses a SettingsFacade instance from the abstraction layer to delegate the user's interaction (via addCategory() and updateCategory()) and present some state in its template (via isUpdating$).</p><pre class="ql-syntax" spellcheck="false">@Component({
  selector: 'categories',
  templateUrl: './categories.component.html',
  styleUrls: ['./categories.component.scss']
})
export class CategoriesComponent implements OnInit {

@Input() cashflowCategories$: CashflowCategory[];
newCategory: CashflowCategory = new CashflowCategory();
isUpdating$: Observable<boolean>;

constructor(private settingsFacade: SettingsFacade) {
this.isUpdating$ = settingsFacade.isUpdating$();
}

ngOnInit() {
this.settingsFacade.loadCashflowCategories();
}

addCategory(category: CashflowCategory) {
this.settingsFacade.addCashflowCategory(category);
}

updateCategory(category: CashflowCategory) {
this.settingsFacade.updateCashflowCategory(category);
}

}
</pre>

Abstraction layer

<p>The abstraction layer decouples the presentation layer from the core layer and also has its own well-defined responsibilities. This layer exposes streams of state and an interface for the components in the presentation layer, playing the role of a facade. This kind of facade sandboxes what components can see and do in the system. We can implement facades by simply using Angular class providers. The classes here may be named with a Facade postfix, for example SettingsFacade. Below, you can find an example of such a facade.</p><pre class="ql-syntax" spellcheck="false">@Injectable()
export class SettingsFacade {

constructor(private cashflowCategoryApi: CashflowCategoryApi, private settingsState: SettingsState) { }

isUpdating$(): Observable<boolean> {
return this.settingsState.isUpdating$();
}

getCashflowCategories$(): Observable<CashflowCategory[]> {
// here we just pass the state without any projections
// it may happen that it is necessary to combine two or more streams and expose to the components
return this.settingsState.getCashflowCategories$();
}

loadCashflowCategories() {
return this.cashflowCategoryApi.getCashflowCategories()
.pipe(tap(categories => this.settingsState.setCashflowCategories(categories)));
}

// optimistic update
// 1. update UI state
// 2. call API
addCashflowCategory(category: CashflowCategory) {
this.settingsState.addCashflowCategory(category);
this.cashflowCategoryApi.createCashflowCategory(category)
.subscribe(
(addedCategoryWithId: CashflowCategory) => {
// success callback - we have id generated by the server, let's update the state
this.settingsState.updateCashflowCategoryId(category, addedCategoryWithId)
},
(error: any) => {
// error callback - we need to rollback the state change
this.settingsState.removeCashflowCategory(category);
console.log(error);
}
);
}

// pessimistic update
// 1. call API
// 2. update UI state
updateCashflowCategory(category: CashflowCategory) {
this.settingsState.setUpdating(true);
this.cashflowCategoryApi.updateCashflowCategory(category)
.subscribe(
() => this.settingsState.updateCashflowCategory(category),
(error) => console.log(error),
() => this.settingsState.setUpdating(false)
);
}
}
</pre>

Abstraction interface

<p>We already know the main responsibilities of this layer: to expose streams of state and an interface for the components. Let's start with the interface. The public methods loadCashflowCategories(), addCashflowCategory() and updateCashflowCategory() abstract away the details of state management and the external API calls from the components. We are not using API providers (like CashflowCategoryApi) in components directly, as they live in the core layer. Also, how the state changes is not a concern of the components. The presentation layer should not care about how things are done, and components should just call the methods from the abstraction layer when necessary (delegate). Looking at the public methods in our abstraction layer should give us a quick insight into the high-level use cases in this part of the system.</p><p>But we should remember that the abstraction layer is not the place to implement business logic. Here we just want to connect the presentation layer to our business logic, abstracting the way it is connected.</p>

State

<p>When it comes to the state, the abstraction layer makes our components independent of the state management solution. Components are given Observables with data to display in the templates (usually with the async pipe) and don't care how and where this data comes from. To manage our state, we can pick any state management library that supports RxJS (like NgRx) or simply use BehaviorSubjects to model our state. In the example above we are using a state object that internally uses BehaviorSubjects (the state object is a part of our core layer). In the case of NgRx, we would dispatch actions to the store.</p><p>Having this kind of abstraction gives us a lot of flexibility and allows us to change the way we manage state without even touching the presentation layer. It's even possible to seamlessly migrate to a real-time backend like Firebase, making our application real-time. I personally like to start with BehaviorSubjects to manage the state. If later, at some point in the development of the system, there is a need to use something else, with this kind of architecture it is very easy to refactor.</p>

Synchronization strategy

<p>Now, let's take a closer look at another important aspect of the abstraction layer. Regardless of the state management solution we choose, we can implement UI updates in either an optimistic or a pessimistic fashion. Imagine we want to create a new record in a collection of some entities. This collection was fetched from the backend and displayed in the DOM. In a pessimistic approach, we first try to update the state on the backend side (for example with an HTTP request) and, in case of success, we update the state in the frontend application. In an optimistic approach, we do it in the opposite order. First, we assume that the backend update will succeed and update the frontend state immediately. Then we send a request to update the server state. In case of success, we don't need to do anything, but in case of failure, we need to roll back the change in our frontend application and inform the user about the situation.</p>
An optimistic update changes the UI state first and then attempts to update the backend state. This provides the user with a better experience, as they do not see any delays caused by network latency. If the backend update fails, the UI change has to be rolled back.
A pessimistic update changes the backend state first and only in case of success updates the UI state. Usually, it is necessary to show some kind of spinner or loading bar during the execution of the backend request, because of network latency.
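Framework aside, the two strategies can be sketched in a few lines. Here `api` and `state` are hypothetical stand-ins for the API service and state object described in this article:

```javascript
// Optimistic update: change UI state first, roll back on API failure.
async function optimisticAdd(api, state, item) {
  state.add(item);           // 1. update UI state immediately
  try {
    await api.create(item);  // 2. call the API
  } catch (err) {
    state.remove(item);      // rollback on failure
    throw err;               // let the caller inform the user
  }
}

// Pessimistic update: call the API first, update UI state only on success.
async function pessimisticUpdate(api, state, item) {
  state.setUpdating(true);   // show a spinner while waiting for the backend
  try {
    await api.update(item);  // 1. call the API
    state.update(item);      // 2. update UI state only in case of success
  } finally {
    state.setUpdating(false);
  }
}
```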

Caching

<p>Sometimes, we may decide that the data we fetch from the backend will not be a part of our application state. This may be useful for read-only data that we don't want to manipulate at all and just pass (via the abstraction layer) to the components. In this case, we can apply data caching in our facade. The easiest way to achieve this is to use the shareReplay() RxJS operator, which replays the last value in the stream for each new subscriber. Take a look at the code snippet below with RecordsFacade using RecordApi to fetch, cache, and filter the data for the components.</p><pre class="ql-syntax" spellcheck="false">@Injectable()
export class RecordsFacade {

private records$: Observable<Record[]>;

constructor(private recordApi: RecordApi) {
this.records$ = this.recordApi
.getRecords()
.pipe(shareReplay(1)); // cache the data
}

getRecords() {
return this.records$;
}

// project the cached data for the component
getRecordsFromPeriod(period?: Period): Observable<Record[]> {
return this.records$
.pipe(map(records => records.filter(record => record.inPeriod(period))));
}

searchRecords(search: string): Observable<Record[]> {
return this.recordApi.searchRecords(search);
}
}
</pre><p>To sum up, what we can do in the abstraction layer is to:</p>

    <li>expose methods for the components in which we:</li><li class="ql-indent-1">delegate logic execution to the core layer,</li><li class="ql-indent-1">decide about data synchronization strategy (optimistic vs. pessimistic),</li><li>expose streams of state for the components:</li><li class="ql-indent-1">pick one or more streams of UI state (and combine them if necessary),</li><li class="ql-indent-1">cache data from external API.</li>
<p>As we can see, the abstraction layer plays an important role in our layered architecture. It has clearly defined responsibilities, which helps to better understand and reason about the system. Depending on your particular case, you can create one facade per Angular module or one per entity. For example, the SettingsModule may have a single SettingsFacade, if it's not too bloated. But sometimes it may be better to create more granular abstraction facades for each entity individually, like a UserFacade for the User entity.</p>

Core layer

<p>The last layer is the core layer. Here is where the core application logic is implemented. All data manipulation and outside-world communication happen here. If we were using a solution like NgRx for state management, this would be the place to put our state definition, actions, and reducers. Since in our examples we are modeling state with BehaviorSubjects, we can encapsulate it in a convenient state class. Below, you can find the SettingsState example from the core layer.</p><pre class="ql-syntax" spellcheck="false">@Injectable()
export class SettingsState {

private updating$ = new BehaviorSubject<boolean>(false);
private cashflowCategories$ = new BehaviorSubject<CashflowCategory[]>(null);

isUpdating$() {
return this.updating$.asObservable();
}

setUpdating(isUpdating: boolean) {
this.updating$.next(isUpdating);
}

getCashflowCategories$() {
return this.cashflowCategories$.asObservable();
}

setCashflowCategories(categories: CashflowCategory[]) {
this.cashflowCategories$.next(categories);
}

addCashflowCategory(category: CashflowCategory) {
const currentValue = this.cashflowCategories$.getValue();
this.cashflowCategories$.next([...currentValue, category]);
}

updateCashflowCategory(updatedCategory: CashflowCategory) {
const categories = this.cashflowCategories$.getValue();
const indexOfUpdated = categories.findIndex(category => category.id === updatedCategory.id);
categories[indexOfUpdated] = updatedCategory;
this.cashflowCategories$.next([...categories]);
}

updateCashflowCategoryId(categoryToReplace: CashflowCategory, addedCategoryWithId: CashflowCategory) {
const categories = this.cashflowCategories$.getValue();
const updatedCategoryIndex = categories.findIndex(category => category === categoryToReplace);
categories[updatedCategoryIndex] = addedCategoryWithId;
this.cashflowCategories$.next([...categories]);
}

removeCashflowCategory(categoryRemove: CashflowCategory) {
const currentValue = this.cashflowCategories$.getValue();
this.cashflowCategories$.next(currentValue.filter(category => category !== categoryRemove));
}
}
</pre><p>In the core layer, we also implement HTTP queries in the form of class providers. Such classes could have an Api or Service name postfix. API services have only one responsibility: to communicate with API endpoints and nothing else. We should avoid any caching, logic or data manipulation here. A simple example of an API service can be found below.</p><pre class="ql-syntax" spellcheck="false">@Injectable()
export class CashflowCategoryApi {

readonly API = '/api/cashflowCategories';

constructor(private http: HttpClient) {}

getCashflowCategories(): Observable<CashflowCategory[]> {
return this.http.get<CashflowCategory[]>(this.API);
}

createCashflowCategory(category: CashflowCategory): Observable<any> {
return this.http.post(this.API, category);
}

updateCashflowCategory(category: CashflowCategory): Observable<any> {
return this.http.put(`${this.API}/${category.id}`, category);
}

}
</pre><p>In this layer, we could also place any validators, mappers or more advanced use-cases that require manipulating many slices of our UI state.</p><p>We have covered the topic of the abstraction layers in our frontend application. Every layer has its well-defined boundaries and responsibilities. We also defined strict rules of communication between layers. This all helps to better understand and reason about the system over time as it becomes more and more complex.</p><p></p><p>If you need help with your project, check out Angular Academy Workshops or write an email to [email protected].</p>

Unidirectional data flow and reactive state management

<p>The next principle we want to introduce in our system concerns the data flow and propagation of change. Angular itself uses unidirectional data flow on the presentation level (via input bindings), but we will impose a similar restriction on the application level. Together with reactive state management (based on streams), it gives us a very important property of the system: data consistency. The diagram below presents the general idea of unidirectional data flow.</p><p></p><p>Whenever any model value changes in our application, the Angular change detection system takes care of the propagation of that change. It does so via input property bindings from the top to the bottom of the whole component tree. This means that a child component can only depend on its parent and never vice versa. This is why we call it unidirectional data flow. It allows Angular to traverse the component tree only once (as there are no cycles in the tree structure) to achieve a stable state, which means that every value in the bindings is propagated.</p><p>As we know from the previous chapters, there is the core layer above the presentation layer, where our application logic is implemented. There are the services and providers that operate on our data. What if we apply the same principle of data manipulation on that level? We can place the application data (the state) in one place "above" the components and propagate the values down to the components via Observable streams (Redux and NgRx call this place a store). The state can be propagated to multiple components and displayed in multiple places, but never modified locally. The change may come only "from above" and the components below only "reflect" the current state of the system. This gives us the important system property mentioned before - data consistency - and the state object becomes the single source of truth. Practically speaking, we can display the same data in multiple places and not be afraid that the values would differ.</p><p>Our state object exposes methods for the services in our core layer to manipulate the state. Whenever there is a need to change the state, it can happen only by calling a method on the state object (or dispatching an action in case of using NgRx). Then, the change is propagated "down", via streams, to the presentation layer (or any other service). This way, our state management is reactive. Moreover, with this approach, we also increase the level of predictability in our system, because of the strict rules of manipulating and sharing the application state. Below you can find a code snippet modeling the state with BehaviorSubjects.</p><pre class="ql-syntax" spellcheck="false">@Injectable()
export class SettingsState {

private updating$ = new BehaviorSubject<boolean>(false);
private cashflowCategories$ = new BehaviorSubject<CashflowCategory[]>(null);

isUpdating$() {
return this.updating$.asObservable();
}

setUpdating(isUpdating: boolean) {
this.updating$.next(isUpdating);
}

getCashflowCategories$() {
return this.cashflowCategories$.asObservable();
}

setCashflowCategories(categories: CashflowCategory[]) {
this.cashflowCategories$.next(categories);
}

addCashflowCategory(category: CashflowCategory) {
const currentValue = this.cashflowCategories$.getValue();
this.cashflowCategories$.next([...currentValue, category]);
}

updateCashflowCategory(updatedCategory: CashflowCategory) {
const categories = this.cashflowCategories$.getValue();
const indexOfUpdated = categories.findIndex(category => category.id === updatedCategory.id);
categories[indexOfUpdated] = updatedCategory;
this.cashflowCategories$.next([...categories]);
}

updateCashflowCategoryId(categoryToReplace: CashflowCategory, addedCategoryWithId: CashflowCategory) {
const categories = this.cashflowCategories$.getValue();
const updatedCategoryIndex = categories.findIndex(category => category === categoryToReplace);
categories[updatedCategoryIndex] = addedCategoryWithId;
this.cashflowCategories$.next([...categories]);
}

removeCashflowCategory(categoryRemove: CashflowCategory) {
const currentValue = this.cashflowCategories$.getValue();
this.cashflowCategories$.next(currentValue.filter(category => category !== categoryRemove));
}
}

</pre><p>Let's recap the steps of handling a user interaction, keeping in mind all the principles we have introduced. First, let's imagine that some event happens in the presentation layer (for example, a button click). The component delegates the execution to the abstraction layer, calling the method settingsFacade.addCategory() on the facade. Then, the facade calls the methods on the services in the core layer - categoryApi.create() and settingsState.addCategory(). The order of invocation of those two methods depends on the synchronization strategy we choose (pessimistic or optimistic). Finally, the application state is propagated down to the presentation layer via the observable streams. This process is well-defined.</p><p></p>

Modular design

<p>We have covered the horizontal division in our system and the communication patterns across it. Now we are going to introduce a vertical separation into feature modules. The idea is to slice the application into feature modules representing different business functionalities. This is yet another step to deconstruct the system into smaller pieces for better maintainability. Each of the feature modules shares the same horizontal separation into the core, abstraction, and presentation layers. It is important to note that these modules can be lazily loaded (and preloaded) into the browser, reducing the initial load time of the application. Below you can find a diagram illustrating the feature module separation.</p><p></p><p>Our application also has two additional modules for more technical reasons. We have a CoreModule that defines our singleton services, single-instance components, configuration, and exports any third-party modules needed in AppModule. This module is imported only once, in AppModule. The second module is SharedModule, which contains common components/pipes/directives and also exports commonly used Angular modules (like CommonModule). SharedModule can be imported by any feature module. The diagram below presents the import structure.</p><p></p>

Module directory structure

<p>The diagram below presents how we can place all the pieces of our SettingsModule inside the directories. We can put the files inside folders with names representing their function.</p><p></p>
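Since the diagram is not reproduced here, one possible layout is sketched below. The folder and file names are an illustration of the idea, not a prescription:

```text
settings/
├── components/          dumb components (presentation layer)
├── containers/          smart components (presentation layer)
├── api/                 API services (core layer)
├── state/               state object (core layer)
├── settings.facade.ts   the facade (abstraction layer)
└── settings.module.ts
```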

Smart and dumb components

<p>The final architectural pattern we introduce in this article concerns the components themselves. We want to divide components into two categories, depending on their responsibilities. First, there are the smart components (aka containers). These components usually:</p>
    <li>have facade/s and other services injected,</li><li>communicate with the core layer,</li><li>pass data to the dumb components,</li><li>react to the events from dumb components,</li><li>are top-level routable components (but not always!).</li>
<p>The previously presented CategoriesComponent is smart. It has SettingsFacade injected and uses it to communicate with the core layer of our application.</p><p>In the second category, there are dumb components (aka presentational). Their only responsibilities are to present a UI element and to delegate user interaction "up" to the smart components via events. Think of a native HTML element like <button>Click me</button>. That element does not have any particular logic implemented. We can think of the text 'Click me' as an input for this component. It also has some events that can be subscribed to, like the click event. Below you can find a code snippet of a simple presentational component with one input and no output events.</p><pre class="ql-syntax" spellcheck="false">@Component({
selector: 'budget-progress',
templateUrl: './budget-progress.component.html',
styleUrls: ['./budget-progress.component.scss'],
changeDetection: ChangeDetectionStrategy.OnPush
})
export class BudgetProgressComponent {

@Input()
budget: Budget;
today: string;

}
</pre>

Summary

<p>We have covered a couple of ideas on how to design the architecture of an Angular application. These principles, if applied wisely, can help to maintain a sustainable development speed over time, and allow new features to be delivered easily. Please don't treat them as strict rules, but rather as recommendations that can be employed when they make sense.</p><p>We have taken a close look at abstraction layers, unidirectional data flow and reactive state management, modular design, and the smart/dumb components pattern. I hope that these concepts will be helpful in your projects and, as always, if you have any questions, I am more than happy to chat with you.</p>


Why Django is the Best Web Framework for Charting Uncharted Territory


Wrapping CommonJS library in Angular 8 using Mark.js


Introduction

From time to time in my daily tasks I have to implement some functionality that was already implemented by someone in a neat vanilla JS library, but… no Angular version or even an ES6 module of it is available, so it can't easily be dropped into your Angular 8 application.

Yes, you can attach this lib in index.html with a script tag, but from my point of view, that hurts maintainability. You would also have to do the same for every other Angular project where you might use it.

A much better approach is to create an Angular wrapper directive (or component) and publish it as an npm package, so everyone (and you, of course :) can easily re-use it in other projects.

One such library is mark.js, quite a solid solution for highlighting search text inside a specified section of a web page.

<figcaption class="av ej me mf hr do dm dn mg mh aq cv">mark.js</figcaption>

How mark.js works

In the original implementation, mark.js can be connected to a project in two ways:

$ npm install mark.js --save-dev

// in JS code
const Mark = require('mark.js');
let instance = new Mark(document.querySelector("div.context"));
instance.mark(keyword [, options]);

OR

<script src="vendor/mark.js/dist/mark.min.js"></script>

// in JS code
let instance = new Mark(document.querySelector("div.context"));
instance.mark(keyword [, options]);

And the result looks like this:

<figcaption class="av ej me mf hr do dm dn mg mh aq cv">mark.js run result (taken from [official Mark.js page](https://markjs.io/configurator.html))</figcaption>

You can play with it more on mark.js configurator page.

But can we use it in an Angular way? Say, like this:

// some Angular module
imports: [
  ...
  MarkjsModule // imports markjsHighlightDirective
  ...
]

// in some component template
<div class="content_wrapper" 
     [markjsHighlight]="searchValue"
     [markjsConfig]="config"
>

Let's also add some additional functionality. Say, scroll the content_wrapper to the first highlighted word:

<div class="content_wrapper" 
     [markjsHighlight]="searchText"
     [markjsConfig]="config"
     [scrollToFirstMarked]="true"
>

Now let's implement and publish an Angular library, with a demo application, that will contain MarkjsHighlightDirective and its module.
We will name it ngx-markjs.

Planning Angular project structure

To generate an Angular project for our lib we will use Angular CLI.

npm install -g @angular/cli

Now let's create our project and add ngx-markjs lib to it:

ng new ngx-markjs-demo --routing=false --style=scss
// a lot of installation goes here
cd ngx-markjs-demo
ng generate lib ngx-markjs

And now let's add a MarkjsHighlightDirective starter to our ngx-markjs lib:

ng generate directive markjsHighlight --project=ngx-markjs

After deleting ngx-markjs.component.ts and ngx-markjs.service.ts (created automatically by the Angular CLI) in the projects/ngx-markjs/src/lib/ folder, we get the following directory structure for our project:

<figcaption class="av ej me mf hr do dm dn mg mh aq cv">ngx-markjs-demo project with ngx-markjs lib</figcaption>

To conveniently build our library, let's add two more lines to the scripts section of the project's package.json file:

"scripts": {
  "ng": "ng",
  "start": "ng serve --port 4201",
  "build": "ng build",
  "build:ngx-markjs": "ng build ngx-markjs && npm run copy:lib:dist", 
  "copy:lib:dist": "cp -r ./projects/ngx-markjs/src ./dist/ngx-markjs/src",
  "test": "ng test",
  "lint": "ng lint",
  "e2e": "ng e2e"
},

build:ngx-markjs — runs build for ngx-markjs library (but not for parent demo project)

copy:lib:dist — it is convenient to have source files in npm packages as well, so this command copies the library sources to the /dist/ngx-markjs folder (where the compiled module is placed by the build:ngx-markjs command).

Now it's time to add the implementation code!

*Remark: the official Angular documentation about creating libraries recommends generating the starter without a main parent project, like this:
ng new my-workspace --create-application=false
But I decided to keep the main project and make it a demo application, just for my convenience.

Connecting commonJS lib into Angular app

We need to do a few preparatory steps before we start implementing our directive:

#1. Load mark.js

The mark.js library which we want to wrap is provided in CommonJS format.

There are two ways to connect a CommonJS script:

a) Add it with script tag to index.html:

<script src="vendor/mark.js/dist/mark.min.js"></script>

b) Add it to the angular.json file in the project root, so the Angular builder will grab and apply it (as if it were included with a script tag):

"sourceRoot": "src",
"prefix": "app",
"architect": {
  "build": {
    "builder": "@angular-devkit/build-angular:browser",
    "options": {
      "outputPath": "dist/ngx-markjs-demo",
      "index": "src/index.html",
      "main": "src/main.ts",
      "polyfills": "src/polyfills.ts",
      "tsConfig": "tsconfig.app.json",
      "aot": false,
      "assets": [
        "src/favicon.ico",
        "src/assets"
      ],
      "styles": [
        "src/styles.scss"
      ],
      "scripts": []
    },

#2. Adding mark.js to lib package.json

Now we should add the mark.js lib as a dependency to our library's package.json in the <root>/projects/ngx-markjs/src folder (don't mix it up with the main parent project's package.json in the root folder).
We can add it to the peerDependencies section; in that case, the consumer has to install mark.js manually before installing our wrapper package.

Or we can add mark.js to the dependencies section; then the mark.js package will be installed automatically when you run npm i ngx-markjs.

You can read more about the difference between package.json dependencies and peerDependencies in this great article.
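For illustration, here is a hypothetical fragment of the library's package.json using the peerDependencies approach (the package name comes from the article; the version numbers are assumptions):

```json
{
  "name": "ngx-markjs",
  "version": "0.0.1",
  "peerDependencies": {
    "mark.js": "^8.11.0"
  }
}
```

Moving `"mark.js": "^8.11.0"` into a `"dependencies"` section instead would switch to the automatic-install behavior described above.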

#3. Get the entity with a require call.

const Mark = require('mark.js');

In our case, I would prefer to use require, since the mark.js code should then be present only inside the markjsHighlight lib module and not in the whole application (unless we actually use it there).

A small remark: some tslint configurations prevent using require in order to encourage ES6 modules, so in that case just wrap the require with /* tslint:disable */ comments, like this:

/* tslint:disable */
const Mark = require('mark.js');
/* tslint:enable */

The project is ready. Now it is time to implement our markjsHighlightDirective.

Wrapping mark.js in a directive

OK, so let's plan how our MarkjsHighlightDirective will work:

  1. It should be applied to the element with content, to get the HTML element where the text will be searched, and receive the search text (markjsHighlight input)
  2. It should accept a mark.js configuration object (markjsConfig input)
  3. And we should be able to switch the 'scroll to marked text' feature on and off (scrollToFirstMarked input)

For example:

<div class="content_wrapper" 
     [markjsHighlight]="searchText"
     [markjsConfig]="config"
     [scrollToFirstMarked]="true"
>

Now it is time to implement these requirements.

Adding mark.js to the library

Install mark.js to our project

npm install mark.js

And get a reference to it in the projects/ngx-markjs/src/lib/markjs-highlight.directive.ts file:

import {Directive} from '@angular/core';


declare var require: any;
const Mark = require('mark.js');


@Directive({
  selector: '[markjsHighlight]'
})
export class MarkjsHighlightDirective {

  constructor() {}

}

To prevent TypeScript warnings, I declared require as a global variable.

Creating a basic directive starter

The very first starter for MarkjsHighlightDirective will be:

@Directive({
  selector: '[markjsHighlight]' // our directive
})
export class MarkjsHighlightDirective implements OnChanges {

  @Input() markjsHighlight = '';  // our inputs
  @Input() markjsConfig: any = {};
  @Input() scrollToFirstMarked: boolean = false;

  @Output() getInstance = new EventEmitter<any>();

  markInstance: any;

  constructor(
    private contentElementRef: ElementRef, // host element ref
    private renderer: Renderer2 // we will use it to scroll
  ) {
  }

  ngOnChanges(changes) { // if searchText is changed - redo marking
    if (!this.markInstance) { // emit mark.js instance (if needed)
      this.markInstance = new Mark(this.contentElementRef.nativeElement);
      this.getInstance.emit(this.markInstance);
    }

    this.hightlightText(); // should be implemented

    if (this.scrollToFirstMarked) {
      this.scrollToFirstMarkedText(); // should be implemented
    }
  }
}

Ok, so let's go through this starter code:

  1. We defined three inputs for the searchText value, the config, and the scrolling on/off functionality (as we planned earlier)
  2. The ngOnChanges lifecycle hook emits the Mark.js instance to the parent component (in case you want to implement some additional Mark.js behavior).
    Also, each time searchText changes, we redo the text highlighting (since the search text is different now); this functionality will be implemented in the this.hightlightText method.
    And if scrollToFirstMarked is set to true, we run this.scrollToFirstMarkedText.
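The create-once-then-re-mark flow above can be sketched framework-free. In this sketch, FakeMark, HighlightSketch, and onChanges are hypothetical stand-ins for Mark, the directive, and the ngOnChanges hook:

```javascript
// Hypothetical stand-in for the mark.js constructor
class FakeMark {
  constructor(element) {
    this.element = element;
    this.calls = []; // records every mark() call
  }
  mark(text) {
    this.calls.push(text);
  }
}

// Hypothetical stand-in for the directive; onChanges mimics ngOnChanges
class HighlightSketch {
  constructor(element) {
    this.element = element;
    this.markInstance = null;
    this.emitted = []; // stands in for the getInstance EventEmitter
  }
  onChanges(searchText) {
    if (!this.markInstance) {
      // the instance is created and emitted only on the first change
      this.markInstance = new FakeMark(this.element);
      this.emitted.push(this.markInstance);
    }
    // marking is redone on every change
    this.markInstance.mark(searchText);
  }
}

const sketch = new HighlightSketch({ tag: 'div' });
sketch.onChanges('foo');
sketch.onChanges('bar');
console.log(sketch.emitted.length);     // 1
console.log(sketch.markInstance.calls); // [ 'foo', 'bar' ]
```

The point of the `if (!this.markInstance)` guard is exactly this: one instance per host element, however many times the search text changes.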

Implementing highlight functionality

Our method this.hightlightText should receive the searchText value, unmark previous search results, and do the new text highlighting. It can be done with this code:

hightlightText() {
  this.markjsHighlight = this.markjsHighlight || '';

  if (this.markjsHighlight && this.markjsHighlight.length <= 2) {
    this.markInstance.unmark();
    return;
  } else {
    this.markInstance.unmark({
      done: () => {
        this.markInstance.mark((this.markjsHighlight || ''), this.markjsConfig);
      }
    });
  }
}

The code is mostly self-explanatory: we check that the markjsHighlight value is not null or undefined (because with these values the Mark.js instance throws an error).

Then we check the text length. If it is too short (two characters or fewer), we just unmark the previous results and return.

Otherwise, we unmark the previously highlighted text and start a new highlighting process.
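A minimal stub (only the mark/unmark calls from the mark.js surface, with a simplified done option) illustrating why the new mark() call sits inside unmark()'s done callback: old highlights must be fully removed before the new ones are drawn.

```javascript
const log = [];

// Stub standing in for a Mark instance
const markInstance = {
  unmark(options) {
    log.push('unmark');
    // mark.js invokes the done callback when unmarking is finished
    if (options && options.done) options.done();
  },
  mark(text) {
    log.push(`mark:${text}`);
  }
};

function highlightText(searchText, config) {
  searchText = searchText || '';
  if (searchText && searchText.length <= 2) {
    markInstance.unmark(); // too short: just clear old highlights
    return;
  }
  markInstance.unmark({
    done: () => markInstance.mark(searchText, config)
  });
}

highlightText('an');      // cleared only
highlightText('angular'); // cleared, then re-marked
console.log(log); // [ 'unmark', 'unmark', 'mark:angular' ]
```

If mark() were called outside the callback, it could race with the asynchronous unmark pass and leave stale highlights behind.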

Implementing a "scroll to first marked result" feature

One important remark before we start implementing the scroll feature: the content wrapper element we apply our directive to should have a CSS position other than static (for example, position: relative). Otherwise, the offset to scroll to will be calculated improperly.

OK, let's code the this.scrollToFirstMarkedText method:

constructor(
  private contentElementRef: ElementRef,
  private renderer: Renderer2
) {
}
...

scrollToFirstMarkedText() {
  const content = this.contentElementRef.nativeElement;

  // calculating offset to the first marked element
  const firstOffsetTop = (content.querySelector('mark') || {}).offsetTop || 0;

  this.scrollSmooth(content, firstOffsetTop); // start scroll
}

scrollSmooth(scrollElement, firstOffsetTop) {
  const renderer = this.renderer;

  if (cancelAnimationId) {
    cancelAnimationFrame(cancelAnimationId);
  }
  const currentScrollTop = scrollElement.scrollTop;
  const delta = firstOffsetTop - currentScrollTop;

  animate({
    duration: 500,
    timing(timeFraction) {
      return timeFraction;
    },
    draw(progress) {
      const nextStep = currentScrollTop + progress * delta;
      // set scroll with Angular renderer
      renderer.setProperty(scrollElement, 'scrollTop', nextStep);
    }
  });
}

...

let cancelAnimationId;

// helper function for smooth scroll
function animate({timing, draw, duration}) {
  const start = performance.now();
  cancelAnimationId = requestAnimationFrame(function animate2(time) {
    // timeFraction goes from 0 to 1
    let timeFraction = (time - start) / duration;
    if (timeFraction > 1) {
      timeFraction = 1;
    }
    // calculate the current animation state
    const progress = timing(timeFraction);
    draw(progress); // draw it
    if (timeFraction < 1) {
      cancelAnimationId = requestAnimationFrame(animate2);
    }
  });
}

How it works:

  1. We get the content wrapper element (injected in the constructor by Angular) and query for the first highlighted text node (to highlight text, Mark.js wraps it in a <mark></mark> HTML element).
  2. Then we start the this.scrollSmooth function. scrollSmooth cancels the previous scroll (if any), calculates the scroll difference delta (the diff between the current scroll position and the offsetTop of the marked element), and calls the animate function, which calculates the timings for smooth scrolling and does the actual scroll (by calling renderer.setProperty(scrollElement, 'scrollTop', nextStep)).
  3. The animate function is a helper taken from a very good JavaScript learning site, javascript.info.
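The interpolation performed by the draw callback can be checked synchronously. This sketch (scrollSteps is a hypothetical helper, not part of the directive) computes the per-frame scrollTop values for the linear timing function, showing the scroll ends exactly at the target offset:

```javascript
// For linear timing(t) = t, each frame's scrollTop is
// currentScrollTop + progress * delta.
function scrollSteps(currentScrollTop, firstOffsetTop, frames) {
  const delta = firstOffsetTop - currentScrollTop;
  const steps = [];
  for (let frame = 1; frame <= frames; frame++) {
    const timeFraction = frame / frames; // goes from 0 to 1
    const progress = timeFraction;       // linear timing function
    steps.push(currentScrollTop + progress * delta);
  }
  return steps;
}

console.log(scrollSteps(100, 500, 4)); // [ 200, 300, 400, 500 ]
```

Swapping the timing function (e.g. an ease-out curve) changes how the steps are spaced, but the last frame always lands on firstOffsetTop because progress reaches 1.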

Our directive is ready! You can take a look at a full code here.

The only thing left to do is add the directive to the NgxMarkjsModule module:

import { NgModule } from '@angular/core';
import { MarkjsHighlightDirective } from './markjs-highlight.directive';



@NgModule({
  declarations: [MarkjsHighlightDirective],
  imports: [
  ],
  exports: [MarkjsHighlightDirective]
})
export class NgxMarkjsModule { }

Applying Result

Now let's use it in our demo application:

1. Import NgxMarkjsModule to app.module.ts:

...
import {NgxMarkjsModule} from 'ngx-markjs';

@NgModule({
  declarations: [
    AppComponent
  ],
  imports: [
    BrowserModule,
    NgxMarkjsModule
  ],
  providers: [],
  bootstrap: [AppComponent]
})
export class AppModule { }

2. I added some content to app.component.html and applied the directive to it:

<div class="search_input">
  <input placeholder="Search..." #search type="text">
</div>
<div class="content_wrapper"
     [markjsHighlight]="searchText$ | async"
     [markjsConfig]="searchConfig"
     [scrollToFirstMarked]="true"
>
  <p>Lorem ipsum dolor sit amet, consectetur... a lot of text further</p>

3. In app.component.ts we should subscribe to the input change event and feed the search text to the markjsHighlight directive with the async pipe:

@Component({
  selector: 'app-root',
  templateUrl: './app.component.html',
  styleUrls: ['./app.component.scss']
})
export class AppComponent implements AfterViewInit {
  title = 'ngx-markjs-demo';
  @ViewChild('search', {static: false}) searchElemRef: ElementRef;
  searchText$: Observable<string>;
  searchConfig = {separateWordSearch: false};

  ngAfterViewInit() {
    // create a stream from the input change event with the rxjs 'fromEvent' function
    this.searchText$ = fromEvent(this.searchElemRef.nativeElement, 'keyup').pipe(
      map((e: Event) => (e.target as HTMLInputElement).value),
      debounceTime(300),
      distinctUntilChanged()
    );
  }
}
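To see what distinctUntilChanged contributes to this pipeline, here is a plain-JavaScript sketch of its behavior on a list of values (distinctConsecutive is a hypothetical helper, not the rxjs operator itself): consecutive duplicate search strings are dropped, so mark.js is not re-run for an unchanged query.

```javascript
// Drops consecutive duplicates, like rxjs distinctUntilChanged does on a stream.
function distinctConsecutive(values) {
  const out = [];
  let previous; // undefined sentinel: the first value always passes
  for (const value of values) {
    if (value !== previous) {
      out.push(value);
      previous = value;
    }
  }
  return out;
}

console.log(distinctConsecutive(['a', 'an', 'an', 'ang', 'ang', 'an']));
// [ 'a', 'an', 'ang', 'an' ]
```

Note that non-adjacent repeats ('an' at the end) still pass through; only back-to-back duplicates are filtered, which is exactly what we want after debounceTime.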

Let's start it and take a look at the result:

ng serve


<figcaption class="av ej me mf hr do dm dn mg mh aq cv">It works!</figcaption>

We did it!
The last thing to do: we should publish our directive to the npm registry:

npm login
npm run build:ngx-markjs
cd ./dist/ngx-markjs
npm publish

And here it is in a registry: ngx-markjs.

Conclusion

Have you come across a neat vanilla JS library that you want to use in Angular? Now you know how to do that!

Pros

  1. Now we can easily import our directive into an Angular 8 project.
  2. The additional scroll functionality is quite neat — use it to improve the user experience.

Cons

  1. Mark.js is possibly implemented only for the browser, so if you plan to run your app on other platforms (Angular allows it; you can read more about it here), it may not work.

Related links:

  1. Mark.js
  2. ngx-markjs github repo.

Build RESTful API In Laravel 5.8 Example

Build RESTful API In Laravel 5.8 Example


If you want to create web services with PHP, I strongly suggest using Laravel 5.8 to build your APIs, because Laravel provides a structure with authentication using Passport. With this structure it becomes very easy to create REST APIs.

Just a few days ago, Laravel released its new version, Laravel 5.8. As we know, Laravel is popular partly because of its security features, so many developers choose Laravel to create REST APIs for mobile app development. Web services are very important when you build web and mobile apps, because both can share the same database and work with the same data.

Follow the steps below to create a RESTful API example in a Laravel 5.8 app.

Step 1: Download Laravel 5.8

I am going to explain step by step from scratch, so we need a fresh Laravel 5.8 application. Open your terminal or command prompt and run the command below:

<pre class="ql-syntax" spellcheck="false">composer create-project --prefer-dist laravel/laravel blog </pre>

Step 2: Install Passport

In this step we need to install Passport via the Composer package manager, so open your terminal and run the command below:

<pre class="ql-syntax" spellcheck="false">composer require laravel/passport </pre>

After successfully installing the package, we need to run the default migrations to create the new Passport tables in our database, so let's run the command below:

<pre class="ql-syntax" spellcheck="false">php artisan migrate </pre>

Next, we need to set up Passport using the passport:install command, which will create the encryption keys needed to generate secure access tokens. So let's run the command below:

<pre class="ql-syntax" spellcheck="false">php artisan passport:install </pre>

Step 3: Passport Configuration

In this step, we have to make configuration changes in three places: the model, the service provider, and the auth config file. Just make the following changes in those files.

In the model, we add the HasApiTokens trait of Passport.

In AuthServiceProvider, we add Passport::routes().

In auth.php, we add the API auth configuration.

app/User.php

<pre class="ql-syntax" spellcheck="false"><?php

namespace App;

use Illuminate\Notifications\Notifiable;
use Illuminate\Contracts\Auth\MustVerifyEmail;
use Laravel\Passport\HasApiTokens;
use Illuminate\Foundation\Auth\User as Authenticatable;

class User extends Authenticatable implements MustVerifyEmail
{
use HasApiTokens, Notifiable;

/**
 * The attributes that are mass assignable.
 *
 * @var array
 */
protected $fillable = [
    'name', 'email', 'password',
];

/**
 * The attributes that should be hidden for arrays.
 *
 * @var array
 */
protected $hidden = [
    'password', 'remember_token',
];

}
</pre>

app/Providers/AuthServiceProvider.php

<pre class="ql-syntax" spellcheck="false"><?php

namespace App\Providers;

use Laravel\Passport\Passport;
use Illuminate\Support\Facades\Gate;
use Illuminate\Foundation\Support\Providers\AuthServiceProvider as ServiceProvider;

class AuthServiceProvider extends ServiceProvider
{
/**
* The policy mappings for the application.
*
* @var array
*/
protected $policies = [
'App\Model' => 'App\Policies\ModelPolicy',
];

/**
 * Register any authentication / authorization services.
 *
 * @return void
 */
public function boot()
{
    $this-&gt;registerPolicies();

    Passport::routes();
}

}
</pre>

config/auth.php

<pre class="ql-syntax" spellcheck="false"><?php

return [
.....
'guards' => [
'web' => [
'driver' => 'session',
'provider' => 'users',
],
'api' => [
'driver' => 'passport',
'provider' => 'users',
],
],
.....
]
</pre>

Step 4: Add Product Table and Model

Next, we need to create a migration for the products table using the Laravel 5.8 php artisan command, so first run the command below:

<pre class="ql-syntax" spellcheck="false">php artisan make:migration create_products_table
</pre>

After this command, you will find a new file in the database/migrations path; put the code below in your migration file to create the products table.

<pre class="ql-syntax" spellcheck="false"><?php

use Illuminate\Support\Facades\Schema;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Database\Migrations\Migration;

class CreateProductsTable extends Migration
{
/**
* Run the migrations.
*
* @return void
*/
public function up()
{
Schema::create('products', function (Blueprint $table) {
$table->increments('id');
$table->string('name');
$table->text('detail');
$table->timestamps();
});
}

/**
 * Reverse the migrations.
 *
 * @return void
 */
public function down()
{
    Schema::dropIfExists('products');
}

}
</pre>

After creating the migration, we need to run it with the following command:

<pre class="ql-syntax" spellcheck="false">php artisan migrate
</pre>

After creating the "products" table, you should create a Product model. Create a file at app/Product.php and put the content below in it:


app/Product.php

<pre class="ql-syntax" spellcheck="false"><?php

namespace App;

use Illuminate\Database\Eloquent\Model;

class Product extends Model
{
/**
* The attributes that are mass assignable.
*
* @var array
*/
protected $fillable = [
'name', 'detail'
];
}
</pre>

Step 5: Create API Routes

In this step, we will create the API routes. Laravel provides the api.php file for web service routes, so let's add new routes to that file.

routes/api.php

<pre class="ql-syntax" spellcheck="false"><?php

/*
|--------------------------------------------------------------------------

API Routes
Here is where you can register API routes for your application. These
routes are loaded by the RouteServiceProvider within a group which
is assigned the "api" middleware group. Enjoy building your API!

*/

Route::post('register', 'API\RegisterController@register');

Route::middleware('auth:api')->group( function () {
Route::resource('products', 'API\ProductController');
});
</pre>

Step 6: Create Controller Files

In the next step, we create three new controllers: BaseController, ProductController, and RegisterController. I created a new "API" folder inside the Controllers folder to keep the API controllers separate. So let's create the controllers:

app/Http/Controllers/API/BaseController.php

<pre class="ql-syntax" spellcheck="false"><?php

namespace App\Http\Controllers\API;

use Illuminate\Http\Request;
use App\Http\Controllers\Controller as Controller;

class BaseController extends Controller
{
/**
* success response method.
*
* @return \Illuminate\Http\Response
*/
public function sendResponse($result, $message)
{
$response = [
'success' => true,
'data' => $result,
'message' => $message,
];

    return response()-&gt;json($response, 200);
}

/**
 * return error response.
 *
 * @return \Illuminate\Http\Response
 */
public function sendError($error, $errorMessages = [], $code = 404)
{
	$response = [
        'success' =&gt; false,
        'message' =&gt; $error,
    ];

    if(!empty($errorMessages)){
        $response['data'] = $errorMessages;
    }

    return response()-&gt;json($response, $code);
}

}
</pre>

app/Http/Controllers/API/ProductController.php

<pre class="ql-syntax" spellcheck="false"><?php

namespace App\Http\Controllers\API;

use Illuminate\Http\Request;
use App\Http\Controllers\API\BaseController as BaseController;
use App\Product;
use Validator;

class ProductController extends BaseController
{
/**
* Display a listing of the resource.
*
* @return \Illuminate\Http\Response
*/
public function index()
{
$products = Product::all();

    return $this-&gt;sendResponse($products-&gt;toArray(), 'Products retrieved successfully.');
}

/**
 * Store a newly created resource in storage.
 *
 * @param  \Illuminate\Http\Request  $request
 * @return \Illuminate\Http\Response
 */
public function store(Request $request)
{
    $input = $request-&gt;all();

    $validator = Validator::make($input, [
        'name' =&gt; 'required',
        'detail' =&gt; 'required'
    ]);

    if($validator-&gt;fails()){
        return $this-&gt;sendError('Validation Error.', $validator-&gt;errors());       
    }

    $product = Product::create($input);

    return $this-&gt;sendResponse($product-&gt;toArray(), 'Product created successfully.');
}

/**
 * Display the specified resource.
 *
 * @param  int  $id
 * @return \Illuminate\Http\Response
 */
public function show($id)
{
    $product = Product::find($id);

    if (is_null($product)) {
        return $this-&gt;sendError('Product not found.');
    }

    return $this-&gt;sendResponse($product-&gt;toArray(), 'Product retrieved successfully.');
}

/**
 * Update the specified resource in storage.
 *
 * @param  \Illuminate\Http\Request  $request
 * @param  int  $id
 * @return \Illuminate\Http\Response
 */
public function update(Request $request, Product $product)
{
    $input = $request-&gt;all();

    $validator = Validator::make($input, [
        'name' =&gt; 'required',
        'detail' =&gt; 'required'
    ]);

    if($validator-&gt;fails()){
        return $this-&gt;sendError('Validation Error.', $validator-&gt;errors());       
    }

    $product-&gt;name = $input['name'];
    $product-&gt;detail = $input['detail'];
    $product-&gt;save();

    return $this-&gt;sendResponse($product-&gt;toArray(), 'Product updated successfully.');
}

/**
 * Remove the specified resource from storage.
 *
 * @param  int  $id
 * @return \Illuminate\Http\Response
 */
public function destroy(Product $product)
{
    $product-&gt;delete();

    return $this-&gt;sendResponse($product-&gt;toArray(), 'Product deleted successfully.');
}

}
</pre>

app/Http/Controllers/API/RegisterController.php

<pre class="ql-syntax" spellcheck="false"><?php

namespace App\Http\Controllers\API;

use Illuminate\Http\Request;
use App\Http\Controllers\API\BaseController as BaseController;
use App\User;
use Illuminate\Support\Facades\Auth;
use Validator;

class RegisterController extends BaseController
{
/**
* Register api
*
* @return \Illuminate\Http\Response
*/
public function register(Request $request)
{
$validator = Validator::make($request->all(), [
'name' => 'required',
'email' => 'required|email',
'password' => 'required',
'c_password' => 'required|same:password',
]);

    if($validator-&gt;fails()){
        return $this-&gt;sendError('Validation Error.', $validator-&gt;errors());       
    }

    $input = $request-&gt;all();
    $input['password'] = bcrypt($input['password']);
    $user = User::create($input);
    $success['token'] =  $user-&gt;createToken('MyApp')-&gt;accessToken;
    $success['name'] =  $user-&gt;name;

    return $this-&gt;sendResponse($success, 'User register successfully.');
}

}
</pre>

Now we are ready to run the full RESTful API, including the Passport API, in Laravel. Let's run our example with the command below:

<pre class="ql-syntax" spellcheck="false">php artisan serve
</pre>

Make sure that for the detail APIs we use the following headers, as listed below:

<pre class="ql-syntax" spellcheck="false">'headers' => [
'Accept' => 'application/json',
'Authorization' => 'Bearer '.$accessToken,
]
</pre>
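For illustration, a hedged client-side sketch (apiHeaders is a hypothetical helper; the token value is a placeholder from the register/login response) showing how those headers would be built:

```javascript
// Build the headers the protected product endpoints expect.
function apiHeaders(accessToken) {
  return {
    'Accept': 'application/json',
    'Authorization': 'Bearer ' + accessToken
  };
}

const headers = apiHeaders('eyJ0eXAi...'); // placeholder token
console.log(headers.Authorization.startsWith('Bearer ')); // true
```

These headers would then be passed to your HTTP client of choice (fetch, axios, Guzzle, etc.) on every request to the auth:api routes.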

Here is Routes URL with Verb:

1) Login: Verb:POST, URL:http://localhost:8000/oauth/token

2) Register: Verb:POST, URL:http://localhost:8000/api/register

3) List: Verb:GET, URL:http://localhost:8000/api/products

4) Create: Verb:POST, URL:http://localhost:8000/api/products

5) Show: Verb:GET, URL:http://localhost:8000/api/products/{id}

6) Update: Verb:PUT, URL:http://localhost:8000/api/products/{id}

7) Delete: Verb:DELETE, URL:http://localhost:8000/api/products/{id}

Now you can simply call the URLs listed above, as in the screenshots below:

Login API:

Register API:

Product List API:

Product Create API:

Product Show API:

Product Update API:

Product Delete API:

I hope it can help you...

Thanks for reading ❤



7 DevOps Tools You Should Know In 2020


DevOps culture is now an integral part of every tech-savvy business and plays a role in many business processes, ranging from project planning to software delivery. As cloud services are prevailing today, the requirement for related supplementary services is growing rapidly. DevOps technologies are multiplying as well, so how should one choose the right tools to automate work? There are a lot of opinions.

There are a lot of tools that make DevOps possible, and it would be nearly impossible to cover them all in one article. However, the 7 tools you'll learn about in this article are some of the most popular and powerful DevOps tools.

1. Jenkins

A lot of DevOps engineers call Jenkins the best CI/CD tool available in the market, since it’s incredibly useful. Jenkins is an automation server that is written in Java and is used to report changes, conduct live testing and distribute code across multiple machines. As Jenkins has a built-in GUI and over 1,000 plugins to support building and testing your application, it is considered a really powerful, yet easy to use tool. Thanks to these plugins, Jenkins integrates well with practically every other instrument in the continuous integration and continuous delivery toolchain.

  • Easy to install and a lot of support available from the community.

  • 1,000+ plugins are available and easy to create your own, if needed.

  • It can be used to publish results and send email notifications.
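To make the pipeline idea concrete, here is a minimal, hypothetical declarative Jenkinsfile (the stage names, shell commands, and mail recipient are all assumptions) that builds, tests, and sends an email notification on failure:

```groovy
pipeline {
  agent any
  stages {
    stage('Build') {
      steps { sh 'npm install' }
    }
    stage('Test') {
      steps { sh 'npm test' }
    }
  }
  post {
    failure {
      // email notification, as mentioned in the bullet above
      mail to: 'team@example.com',
           subject: 'Build failed',
           body: 'See the Jenkins console output for details.'
    }
  }
}
```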

2. Terraform


Terraform is an infrastructure-as-code tool that lets you build, change, and manage infrastructure properly. You can consider Terraform to be a provisioning tool. It helps you set up servers, databases, and other kinds of infrastructure that powers full-scale applications.

The code that manages infrastructure using Terraform is written in the Hashicorp Configuration Language (HCL). All of the configurations you need should be in this file as it will include dependencies to let the application run. HCL is declarative, such that you only need to specify the end state you desire, and Terraform will do the rest of the job.

Terraform is not restricted to any particular cloud service provider as it works with multiple cloud providers and environments. There are no issues with compatibility when using Terraform.
Cloud service providers such as AWS, Microsoft Azure, and Google Cloud all integrate seamlessly with Terraform. Version control system hosting providers such as GitHub and Bitbucket also work fine with it.

There is an enterprise and open source version and Terraform can be installed on macOS, Linux and Windows systems.
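As a sketch of HCL's declarative style, here is a minimal, hypothetical configuration (the provider, region, and bucket name are assumptions) that states a desired end state and lets Terraform do the rest:

```hcl
terraform {
  required_providers {
    aws = {
      source = "hashicorp/aws"
    }
  }
}

provider "aws" {
  region = "us-east-1"
}

# Desired end state: one S3 bucket exists with this name.
resource "aws_s3_bucket" "demo" {
  bucket = "my-demo-bucket-name" # hypothetical name
}
```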

3. Ansible


Similar to Terraform, Ansible is also an infrastructure-as-code tool. It helps with the deployment of applications and the provisioning and configuration management of servers. Ansible is built in Python and maintained by Red Hat, but it remains free and open source.

As a configuration management system, you can use Ansible to set up and build multiple servers. You install Ansible on a control machine, without requiring Ansible to run on the other servers, which can vary from web to app to database servers.

Unlike Terraform, Ansible doesn’t make use of HCL for its code. Instead, the configurations are written in Ansible playbooks which are YAML files. Ansible uses a hybrid of a declarative and procedural pattern. This is different from Terraform, which is solely declarative.

Since Ansible works from a control machine that administers the others, it needs a way to communicate with them; it uses SSH for this, pushing modules from the control machine out to the managed servers. Ansible is an agentless system: no deployment agent has to run on the managed machines.

Linux is the most suitable operating system for installing Ansible. However, it also works fine on macOS. For Windows users, it is possible to use Ansible through the bash shell from the Windows Subsystem for Linux.

4- Docker


Docker is a software containerization platform that allows DevOps teams to build, ship, and run distributed processes within containers. This gives developers the ability to create predictable environments that are isolated from the rest of the applications and can run anywhere. Containers are isolated but share the same OS kernel, so you use hardware resources more efficiently than with virtual machines.

Each container can hold a single process, like a web server or database management system. You can create a cluster of containers distributed across different nodes to have your application up and running in both load balancing and high availability modes. Containers can communicate on a private network, as you most likely want to keep some of your application parts private for security purposes. Simply expose your web server to the Internet and you are good to go.

What I like most is that you can install Docker on your computer to run containers locally to make some ad-hoc software tests without installing its dependencies globally. When you are done, you simply terminate your Docker container and your computer is as clean as new.

  • Build once, run anywhere! You can package an application from your laptop and run it unmodified on any public/private cloud or bare metal server.

  • Containers are lightweight and fast.

  • Docker Hub offers many official and community-built public Docker images.

  • Separating different components of a large application into containers has security benefits: if one container is compromised, the others remain unaffected.
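The "build once, run anywhere" workflow above starts with a Dockerfile. Here is a minimal sketch for a small Python web service; the base image, file names, and port are illustrative placeholders:

```dockerfile
# Hypothetical Dockerfile for a small Python web service.
FROM python:3.12-slim

WORKDIR /app

# Copy and install dependencies first so this layer is cached
# between builds when only application code changes.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

EXPOSE 8000
CMD ["python", "app.py"]
```

You would build it with `docker build -t myapp .` and run it locally with `docker run --rm -p 8000:8000 myapp`; when you are done testing, removing the container and image leaves your machine as clean as before.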

5- Kubernetes


Kubernetes (K8s) is an open-source tool, originally developed at Google, that lets you administer Docker containers. Since production systems often run many containers at once, Kubernetes makes it possible to orchestrate them.

It is, however, important to understand why you would orchestrate Docker containers in the first place. When many containers are running, it is hard to monitor them manually and keep them communicating with each other; scaling and load balancing also become difficult.

With Kubernetes, it is possible to bring all these containers under control so a cluster of machines can be administered as one machine. Often compared to Docker Compose, Kubernetes differs in that it makes it easier to deploy, scale, and monitor containers. When containers crash, Kubernetes can self-heal by spinning up new ones as replacements. With K8s, storage orchestration, service discovery, and load balancing all become straightforward.
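The self-healing and scaling behavior described above is driven by declarative manifests. A minimal Deployment sketch might look like this; the names, labels, and container image are placeholders:

```yaml
# Hypothetical Deployment: keep three replicas of one container running.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web
spec:
  replicas: 3
  selector:
    matchLabels:
      app: web
  template:
    metadata:
      labels:
        app: web
    spec:
      containers:
        - name: web
          image: myregistry/myapp:1.0 # placeholder image
          ports:
            - containerPort: 8000
```

Applying this with `kubectl apply -f deployment.yaml` tells Kubernetes the desired state; if a pod crashes, the controller replaces it to keep three replicas running.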

You can install Kubernetes on macOS, Linux, and Windows and use it through the Kubernetes command-line tool.

6- RabbitMQ


RabbitMQ is a great messaging and queuing tool that you can use for applications that run on most operating systems. Managing queues, exchanges, and routing with it is a breeze. Even an elaborate configuration is relatively easy to build, since the tool is well-documented. You can stream a lot of different high-performance processes and avoid system crashes through a friendly user interface. It's a durable and robust messaging broker that is worth your attention. As RabbitMQ's developers like to say, it's "messaging that just works."

  • Guaranteed message delivery.

  • Push work into background processes, freeing your web server up to handle more users.

  • Scale the most frequently used parts of your system, without having to scale everything.

  • Handle heavy traffic with ease and recover gracefully from crashes.
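The "push work into background processes" pattern in the bullets above can be sketched with Python's standard library standing in for a real broker. A real deployment would use a RabbitMQ client library instead; this only illustrates the queuing idea:

```python
import queue
import threading

# Stand-in for a message broker: producers enqueue jobs, a background
# worker dequeues and processes them, freeing the "web server" thread.
jobs = queue.Queue()
results = []

def worker():
    while True:
        job = jobs.get()
        if job is None:  # sentinel value: shut the worker down
            break
        results.append(f"processed {job}")
        jobs.task_done()

t = threading.Thread(target=worker)
t.start()

# The "web server" hands off work and returns immediately.
for i in range(3):
    jobs.put(f"job-{i}")

jobs.join()     # wait until all queued work has been processed
jobs.put(None)  # tell the worker to stop
t.join()

print(results)
```

With RabbitMQ the queue lives in a separate broker process, so the producer and the workers can run on different machines and scale independently.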

7- Packer


Packer is another DevOps tool from HashiCorp on the list. Written in Go, Packer helps you automate the creation of machine images. Building images manually is frustrating and error-prone; Packer eliminates all of that.

With a single JSON template, you can use Packer to create images for multiple platforms. And because nothing interferes with the automated process, a build that works the first time will work the hundredth time. Many cloud service providers work with images, so Packer lets you work with those providers seamlessly by standardizing how images are created for cloud environments.
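A Packer template in the classic JSON format might look like the following sketch. The builder settings, AMI ID, and the inline shell commands are illustrative placeholders:

```json
{
  "builders": [
    {
      "type": "amazon-ebs",
      "region": "us-east-1",
      "source_ami": "ami-0123456789abcdef0",
      "instance_type": "t3.micro",
      "ssh_username": "ubuntu",
      "ami_name": "myapp-{{timestamp}}"
    }
  ],
  "provisioners": [
    {
      "type": "shell",
      "inline": [
        "sudo apt-get update",
        "sudo apt-get install -y nginx"
      ]
    }
  ]
}
```

Running `packer build template.json` launches a temporary instance from the source image, runs the provisioners, and snapshots the result into a new image.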

Packer doesn't have to work as a standalone tool. You can integrate it with Ansible, Chef, and Jenkins so the images it produces are used further down the deployment pipeline. The installation process is straightforward, and you can get started with the tool quickly.

Conclusion

The concept of DevOps can be very beneficial for keeping large-scale applications performant under different kinds of load or traffic. It also makes the software deployment pipeline easier to manage.
However, DevOps concepts are hard to implement without the right tools. There are many tools in this space, and companies make varying choices.

Thanks for reading