Lambda and its Layers

When I first heard about AWS Lambda I was quite confused about what it was, tried to use it to train a simple ML model, and was hit with a hard five-minute execution limit. Fast-forward a few years: Lambda has evolved a lot, and so has people's understanding of event-driven systems and serverless compute. It has become part of many modern application and data architectures.

At re:Invent 2018, Lambda was heavily buffed with custom runtimes and an increase in the execution time limit to 15 minutes. Lambda Layers were also released, which let you share common dependencies to shrink Lambda deployment packages and simplify updates. However, AWS still hasn't addressed the need for a friendly way to bring in Python packages with compiled native extensions, such as Pandas.

The troublesome approaches to bringing in external packages…

Currently, you either have to zip up your Lambda function together with its Linux-compatible dependencies, or upload the dependencies as a Lambda Layer. If you've played around with Google Cloud Functions or Azure Functions before, you'll know it can be as easy as writing a wish list in requirements.txt.
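For comparison, that "wish list" on those platforms is just a requirements.txt shipped alongside the function code (the pins below are illustrative, not from the article):

```text
pandas==0.24.2
requests==2.21.0
```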

To add extra complexity, some Python packages need to compile C or C++ extensions (packages such as NumPy and Pandas). This can be a real hassle if you want to use your macOS or Windows machine to pip install -t . pandas and then zip it up for a Lambda Layer, because Lambda runs in an Amazon Linux environment.

There are a few ways of bringing in Linux-compatible dependencies, whether through the Serverless Framework or an EC2 instance. Now, if you've read some of my blogs before, you'll know I really enjoy using Lambda in hackathons, and since time is of the essence, I want to show you the easiest and quickest way I bring Python dependencies across as a Lambda Layer: using Docker.
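A minimal sketch of that Docker route, under my own assumptions (the lambci/lambda build image was the common choice at the time, and pandas stands in for whatever you need): run pip inside an Amazon Linux-compatible container so the compiled extensions match Lambda's runtime, then zip and publish.

```shell
# Install dependencies inside a Lambda-like container so compiled C
# extensions (numpy, pandas) are built for Amazon Linux, not your laptop.
mkdir -p python
docker run --rm -v "$PWD":/var/task lambci/lambda:build-python3.7 \
    pip install -t python/ pandas

# Zip with the top-level python/ directory that Lambda Layers expect.
zip -r pandas-layer.zip python

# Publish the zip as a new layer version.
aws lambda publish-layer-version \
    --layer-name pandas-layer \
    --zip-file fileb://pandas-layer.zip \
    --compatible-runtimes python3.7
```

Because the container mounts your working directory at /var/task, the installed packages land straight in python/ on your machine, ready to zip.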

