Serverless computing (or “serverless” for short) is an execution model in which the cloud provider dynamically manages and allocates resources for you, without you having to provision or manage any infrastructure. Resources are allocated based on the real-time, as-needed use of your application or website, and you are only charged for the resources your code actually consumes.

Everything that is “served up” from a serverless platform runs inside stateless compute containers that are event-triggered. These triggering events are the same ones that would drive an ordinary server: HTTP requests, database events, monitoring alerts, cron jobs, and so forth.

In most cases, the code sent to the cloud provider for execution takes the form of a function. Because of this, serverless is often called “Functions as a Service,” or “FaaS,” a term you have most likely come across as well. There are several considerations to be aware of before transitioning to a serverless environment.
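Before getting to those considerations, here is a minimal sketch of what such a function can look like, written as an AWS Lambda-style Python handler responding to an HTTP request. The exact event shape and handler signature vary by provider, so treat this as illustrative rather than a specific platform's required API.

```python
import json

# A minimal AWS Lambda-style handler (other providers use a similar shape).
# The platform invokes this function with the triggering event; you never
# manage the server it runs on.
def handler(event, context):
    # 'event' describes what triggered the function, e.g. an HTTP request.
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```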

In this article we will cover some of the basics of serverless computing: what serverless is, what it is used for, and some of its pros and cons.

Considerations When Transitioning to Serverless

Microservices

Your application should be structured as a collection of functions. The majority of developers deploy their applications as a single monolithic app (a Rails application, for example), but with serverless you adapt the code to a microservice architecture. You can run an entire application as one large function, but this is not recommended.
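As a hypothetical illustration of that split, each piece of functionality becomes its own small function rather than one route inside a monolith. The handler names and event fields below are assumptions for the sketch, not any specific provider's API.

```python
import json

# Hypothetical sketch: each piece of functionality is deployed as its own
# small function, instead of one monolithic app handling every route itself.

def create_user(event, context):
    # Invoked only for "create user" requests (e.g. POST /users).
    body = json.loads(event.get("body") or "{}")
    # ... persist the user somewhere, then respond ...
    return {"statusCode": 201, "body": json.dumps({"created": body.get("email")})}

def list_users(event, context):
    # Invoked only for "list users" requests (e.g. GET /users).
    return {"statusCode": 200, "body": json.dumps({"users": []})}
```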

Stateless Functions

As stated earlier, functions run inside stateless containers. Be aware that your functions will most likely be invoked in a fresh container each time, because the platform does not keep your code running after the event has completed. That means you cannot rely on anything held in memory persisting between invocations.
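The sketch below shows the kind of assumption that breaks in this model: an in-memory counter that silently resets whenever a new container handles the next invocation.

```python
# Anti-pattern: this counter lives in the container's memory, so its value
# is lost whenever a new container is used for the next invocation.
request_count = 0

def handler(event, context):
    global request_count
    request_count += 1  # May reset to 1 at any time on a fresh container.
    # Anything that must survive between invocations should instead be
    # written to external storage (a database, object store, cache, etc.).
    return {"statusCode": 200, "body": str(request_count)}
```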

Cold Starts

Because our functions run inside stateless containers, a container often has to be started up before a function can respond to a triggered event. Every time this happens, a small amount of start-up latency occurs, which is why it is called a “cold start.”

When a function completes, its container stays active for a short while before being shut down. If another event is triggered while the container is still running, it will respond more quickly; this is called a “warm start.” How long cold starts last depends entirely on the cloud provider and the programming language used to write the function.
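One common way to take advantage of warm starts is to perform expensive setup outside the handler, so it only runs during the cold start. Here is a minimal sketch, with the slow setup simulated rather than using a real database client:

```python
import time

# Runs once per container, during the cold start. A warm container reuses it.
# 'create_expensive_client' stands in for any slow setup, such as opening a
# database connection or loading configuration.
def create_expensive_client():
    time.sleep(1)  # simulate slow initialization
    return {"connected_at": time.time()}

client = create_expensive_client()

def handler(event, context):
    # Warm invocations skip the setup above and respond more quickly.
    return {"statusCode": 200, "body": f"client ready since {client['connected_at']}"}
```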

Now that we know how serverless works, let’s review some of the pros and cons of a serverless architecture.

