Build a Serverless PyTorch API using AWS Lambda + Docker in just two steps

This month AWS announced container image support for AWS Lambda. For an Artificial Intelligence practitioner, this opens up the possibility of deploying Deep Learning models (PyTorch, TensorFlow) or gradient-boosted tree models (XGBoost) as serverless APIs using AWS Lambda.
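
To make the idea concrete, a Lambda container image for PyTorch inference can be built on top of the AWS-provided Python base image. The sketch below is a minimal, hypothetical example, not the exact code from any repository; the file names app.py, model.pt, and requirements.txt are assumptions:

```dockerfile
# AWS-maintained base image for Python Lambda functions
FROM public.ecr.aws/lambda/python:3.8

# Install inference dependencies (a CPU-only torch build keeps the image smaller)
COPY requirements.txt .
RUN pip install -r requirements.txt --no-cache-dir

# Copy the handler code and the serialized model into the function's task root
COPY app.py model.pt ${LAMBDA_TASK_ROOT}/

# Tell the Lambda runtime which handler to invoke: <module>.<function>
CMD ["app.handler"]
```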

At Kavak, we’ve published a GitHub repo with code examples for deploying PyTorch and XGBoost models as serverless APIs.

In this tutorial, we will follow along with the PyTorch example, using the following AWS resources:

  • ECR Repository. Stores the Docker images.
  • S3 Bucket. Stores the model artifacts and stack templates.
  • Lambda Function. An API event triggers the Lambda function to generate model predictions (a handler sketch follows this list).
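
As a rough illustration of that last point, the handler could look like the sketch below. It assumes a TorchScript model file named model.pt baked into the image and a JSON request body with a "features" field; both names are hypothetical and not taken from the repository:

```python
import json
import torch

# Load the model once per container (at cold start), outside the handler,
# so warm invocations reuse it.
model = torch.jit.load("model.pt")
model.eval()

def handler(event, context):
    # API Gateway proxy events carry the request payload as a JSON string in "body"
    body = json.loads(event.get("body") or "{}")
    features = torch.tensor(body["features"], dtype=torch.float32)

    # Run inference without tracking gradients
    with torch.no_grad():
        prediction = model(features).tolist()

    return {
        "statusCode": 200,
        "body": json.dumps({"prediction": prediction}),
    }
```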

These services are provisioned automatically using AWS CloudFormation and AWS SAM.
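
For reference, an image-packaged Lambda function in a SAM template looks roughly like the snippet below; the resource name, memory size, paths, and tag are illustrative assumptions rather than the repository's actual template:

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31

Resources:
  InferenceFunction:
    Type: AWS::Serverless::Function
    Properties:
      PackageType: Image        # deploy the Docker image instead of a zip archive
      MemorySize: 2048
      Timeout: 30
      Events:
        PredictApi:
          Type: Api             # API Gateway event that triggers the function
          Properties:
            Path: /predict
            Method: post
    Metadata:
      Dockerfile: Dockerfile    # how SAM builds the image locally
      DockerContext: .
      DockerTag: pytorch-inference
```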

Note: make sure you have awscli and aws-sam-cli installed and configured on your system. For more information, refer to the AWS CLI and AWS SAM CLI documentation.
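
With both CLIs configured, building and deploying typically comes down to two commands; sam deploy --guided prompts for a stack name, region, and an ECR repository to push the image to:

```bash
sam build            # builds the Docker image defined in the template
sam deploy --guided  # pushes the image to ECR and creates/updates the CloudFormation stack
```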

