Originally published by PHIL NASH at twilio.com
When AWS launched Lambda in 2014, there was no love for Ruby. Runtimes like Python, Node.js, and Java started the serverless revolution for hosting and running functions in the cloud. At the end of 2018, support for Ruby was finally launched.
You can build with Ruby on Lambda using raw functions and Serverless Application Model (SAM) templates, as described in the getting started guide for Ruby on Lambda, but Ruby is all about developer happiness, and when the config file is longer than your program the process could be described as painful. Enter the Jets framework, a framework that "leverages the power of Ruby to make serverless joyful for everyone."
Jets combines the experience of building a Rails application with the ability to deploy to AWS Lambda and related services, including API Gateway, S3, and DynamoDB. In this post we're going to see how to get started with Jets and deploy a Lambda powered Twilio application written in Ruby.
To keep things simple we're going to build an SMS-based app, and to make it a bit more fun we're going to include a little humour. When you text into the application it will respond with a bad joke. Sorry, misspelling there, I meant a dad joke, courtesy of the icanhazdadjoke API.
This application will show us how to get started with Jets and how to create controllers, actions, and routes. We'll build a single endpoint that responds to HTTP requests from Twilio when a number receives an incoming SMS message. The endpoint will return TwiML with a random dad joke each time and hilarity will ensue.
To follow along with this project you'll need:
Got all that? Then let's get started.
We'll start by installing the jets gem globally. On the command line type:
gem install jets
The Jets executable can then be used, like Rails, as a generator to initiate a new project and then to run commands within the project. We'll create a new project now, but given our scope in this post we are going to limit it a little. First up, we're going to create the project in API mode, as there's no need for HTML views in this application. That also avoids asset compilation with webpacker, which will save on build time. We also don't need a database. Run the following command to create a new project:
jets new dad-jokes-sms --mode api --no-database
Once the generator has finished running, change into the dad-jokes-sms directory and run your application locally with the following command:
jets serve
Open up http://localhost:8888 and you will see a page that looks like this:
If you see this page then your new Jets project is running successfully.
Now that we have a Jets application, we can use the gem to generate parts of our application, from models and controllers to full scaffolds. For our application we're going to need a controller with one action. Generate it with the following:
jets generate controller Messages create
The generator will create and edit a number of files for us. We need to check the new routes, so open config/routes.rb.
Jets generated a GET route, but Twilio webhooks make POST requests by default and I prefer to keep it that way. Set the application up to receive POST request webhooks on the /messages endpoint instead, like so:
Jets.application.routes.draw do
  post 'messages', to: 'messages#create'
  root "jets/public#show"

  # The jets/public#show controller can serve static utf8 content out of the public folder.
  # Note, as part of the deploy process Jets uploads files in the public folder to s3
  # and serves them out of s3 directly. S3 is well suited to serve static assets.
  # More info here: http://rubyonjets.com/docs/assets-serving/
  any "*catchall", to: "jets/public#show"
end
Now, let’s go write our controller action. Open app/controllers/messages_controller.rb and you will see one method for the create action. This is the action which will receive our Twilio webhook and respond with TwiML to send back a dad joke.
To send back a dad joke we need to make a call to the icanhazdadjoke API. Let’s write a quick private method we can use to achieve this.
We’ll use open-uri as it’s useful for making simple web requests (including downloading files and images). The API will respond with plain text if we ask it to, which saves us doing any parsing. Add the following to the MessagesController:
require 'open-uri'

class MessagesController < ApplicationController
  def create
  end

  private

  def random_joke
    open('https://icanhazdadjoke.com/', { 'Accept' => 'text/plain' }).read
  end
end
Now we’re ready to return our joke to Twilio as TwiML.
We’ll build up a response using the helpers from the twilio-ruby helper library. Open the Gemfile and add twilio-ruby:
source "https://rubygems.org"

gem "jets"
gem "twilio-ruby"
On the command line, run bundle install to install the gem. Now, in the create action, instantiate a new TwiML response object, reply to the incoming message using the <Message> TwiML element, and render the XML response, like so:
require 'open-uri'

class MessagesController < ApplicationController
  def create
    twiml = Twilio::TwiML::MessagingResponse.new
    twiml.message body: random_joke
    render xml: twiml.to_xml
  end

  private

  def random_joke
    open('https://icanhazdadjoke.com/', { 'Accept' => 'text/plain' }).read
  end
end
You can read more about how to use the twilio-ruby helper library for generating TwiML in the documentation.
We can run this locally to test that we are getting the expected response. If you stopped the application, start it again with jets serve. Make a POST request to localhost:8888/messages using curl and you’ll see your joke, provided by icanhazdadjoke, in the TwiML response:
curl --data "" http://localhost:8888/messages

<?xml version="1.0" encoding="UTF-8"?>
<Response>
  <Message>What's the advantage of living in Switzerland? Well, the flag is a big plus.</Message>
</Response>
Great, our Jets application is working! Now to deploy it to AWS Lambda.
To deploy our Jets application to AWS we first need to set up our project with credentials to allow it to access AWS services. A good practice here is to create a user that has the minimum number of permissions required to do everything it needs to. The Jets documentation describes the minimum permissions that our user will need. Within our AWS account we’re going to create a policy that contains these permissions and a new user that will be assigned the policy. We can then use that user’s credentials to deploy our application.
In your AWS console find the IAM service (or head straight to the IAM section).
Go to the Policies section and create a new policy.
Choose the JSON tab and enter the following JSON from the Jets documentation:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "apigateway:*",
        "cloudformation:*",
        "dynamodb:*",
        "events:*",
        "iam:*",
        "lambda:*",
        "logs:*",
        "route53:*",
        "s3:*"
      ],
      "Resource": [
        "*"
      ]
    }
  ]
}
Click through to review the policy and give it a name.
Save the policy. Now we need to create a new user and attach the policy to it, giving it the permissions to create the resources Jets needs to deploy. Open the Users section in the IAM console and create a new user.
Give the user a name and select Programmatic Access for the Access Type.
Click Next to choose the permissions for your new user. Choose Attach existing policies directly and filter for the name of the policy you just created. Select that policy and click Next.
Click Next until you reach the success page.
Save the Access key ID and Secret access key from the last screen. We’ll need them to deploy with. Now we’re ready to deploy.
On the command line enter:
AWS_ACCESS_KEY_ID=YOUR_USER_KEY AWS_SECRET_ACCESS_KEY=YOUR_USER_SECRET_KEY jets deploy
Jets will use the credentials as environment variables to set up all the resources in your AWS account to run the application. It takes a little while, but when it is complete you will have a URL where your application is running.
You’ll notice this deployed to a “dev” environment. You can read more about how Jets handles environments in the documentation.
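Jets reads the target environment from an environment variable, so (at the time of writing, and worth double-checking against the Jets docs for your version) a production deploy looks something like this:

JETS_ENV=production AWS_ACCESS_KEY_ID=YOUR_USER_KEY AWS_SECRET_ACCESS_KEY=YOUR_USER_SECRET_KEY jets deploy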
We can now test this URL using curl. Remember, we use the route /messages, so add that to the end of your API Gateway endpoint and make a POST request.
$ curl --data "" https://YOUR_API_GATEWAY_ENDPOINT/messages

<?xml version="1.0" encoding="UTF-8"?>
<Response>
  <Message>I knew a guy who collected candy canes, they were all in mint condition</Message>
</Response>
Now, to have dad jokes on hand at all times, let’s hook this up to a Twilio number.
Head into your Twilio console to your active phone numbers. If you already have a number you want to work with, edit it; otherwise buy a new number that can receive incoming SMS messages. In the field for "A message comes in" enter your application URL.
Save your number and send it a message. You should get a dad joke in response. Now, in celebration, go and tell someone nearby that joke and let me know whether they laugh or groan.
In this post we’ve seen how to get started with Jets to write Ruby applications that you can deploy to AWS Lambda. You can see the full project on GitHub.
There’s a lot more that Jets can help you accomplish, including responding to events, storing data in databases, and even running your existing Rails application. Check out these articles from the Jets documentation for more on what you can do with Jets.
The Basics
AWS KMS is a Key Management Service that lets you create cryptographic keys that you can use to encrypt and decrypt data and also other keys. You can read more about it here.
Important points about Keys
Please note that the customer master keys (CMKs) you generate can only be used to encrypt small amounts of data, such as passwords or RSA keys. You can use AWS KMS CMKs to generate, encrypt, and decrypt data keys. However, AWS KMS does not store, manage, or track your data keys, or perform cryptographic operations with data keys.
You must use and manage data keys outside of AWS KMS. The KMS API uses AWS KMS CMKs in its encryption operations, and these cannot accept more than 4 KB (4096 bytes) of data. To encrypt application data, use the server-side encryption features of an AWS service, or a client-side encryption library such as the AWS Encryption SDK or the Amazon S3 encryption client.
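To make those limits concrete, here is a minimal Ruby sketch (using the aws-sdk-kms gem) of encrypting and decrypting a small secret directly with a CMK. The region and the alias/signup-key alias are assumptions for illustration only; anything over 4 KB should go through a data key or an encryption library instead.

require 'aws-sdk-kms'

kms = Aws::KMS::Client.new(region: 'us-east-1')

# Encrypt a small secret (must be under 4 KB) directly with the CMK.
encrypted = kms.encrypt(
  key_id: 'alias/signup-key',          # hypothetical key alias
  plaintext: 'super-secret-password'
).ciphertext_blob

# Decrypt it again; KMS works out which CMK to use from the ciphertext.
puts kms.decrypt(ciphertext_blob: encrypted).plaintext  # => super-secret-password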
Scenario
We want to create signup and login forms for a website.
Passwords should be encrypted and stored in a DynamoDB database.
What do we need?
Let's implement it using the Serverless Application Model (SAM)!
Let's first create the key that we will use to encrypt and decrypt passwords.
KmsKey:
Type: AWS::KMS::Key
Properties:
Description: CMK for encrypting and decrypting
KeyPolicy:
Version: '2012-10-17'
Id: key-default-1
Statement:
- Sid: Enable IAM User Permissions
Effect: Allow
Principal:
AWS: !Sub arn:aws:iam::${AWS::AccountId}:root
Action: kms:*
Resource: '*'
- Sid: Allow administration of the key
Effect: Allow
Principal:
AWS: !Sub arn:aws:iam::${AWS::AccountId}:user/${KeyAdmin}
Action:
- kms:Create*
- kms:Describe*
- kms:Enable*
- kms:List*
- kms:Put*
- kms:Update*
- kms:Revoke*
- kms:Disable*
- kms:Get*
- kms:Delete*
- kms:ScheduleKeyDeletion
- kms:CancelKeyDeletion
Resource: '*'
- Sid: Allow use of the key
Effect: Allow
Principal:
AWS: !Sub arn:aws:iam::${AWS::AccountId}:user/${KeyUser}
Action:
- kms:DescribeKey
- kms:Encrypt
- kms:Decrypt
- kms:ReEncrypt*
- kms:GenerateDataKey
- kms:GenerateDataKeyWithoutPlaintext
Resource: '*'
The important thing in the above snippet is the KeyPolicy. KMS requires a key administrator and a key user. As a best practice, your key administrator and key user should be two separate users in your organisation. We are also allowing all permissions to the root user, so if your key administrator leaves the organisation, the root user will still be able to delete this key. As you can see, KeyAdmin can manage the key but not use it, and KeyUser can only use the key. ${KeyAdmin} and ${KeyUser} are parameters in the SAM template.
You will be asked to provide values for these parameters during sam deploy.
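To see how the key would be used in the scenario, here is a rough Ruby sketch of a signup handler that encrypts the password with the CMK and stores the ciphertext in DynamoDB. This is an illustration only, not the post's actual code: the Users table, the KMS_KEY_ID environment variable, and the API Gateway proxy event shape are all assumptions.

require 'aws-sdk-kms'
require 'aws-sdk-dynamodb'
require 'base64'
require 'json'

KMS      = Aws::KMS::Client.new
DYNAMODB = Aws::DynamoDB::Client.new

def signup(event:, context:)
  body = JSON.parse(event['body'])

  # Encrypt the password with the CMK defined in the template above.
  ciphertext = KMS.encrypt(
    key_id: ENV['KMS_KEY_ID'],   # hypothetical: passed in via the SAM template
    plaintext: body['password']
  ).ciphertext_blob

  # Store the Base64-encoded ciphertext alongside the username.
  DYNAMODB.put_item(
    table_name: 'Users',         # hypothetical table name
    item: {
      'username' => body['username'],
      'password' => Base64.strict_encode64(ciphertext)
    }
  )

  { statusCode: 201, body: JSON.generate(message: 'user created') }
end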
In this post, I will show you how to use Amazon S3 Object Lambda to resize images on the fly. The Serverless Framework will be used to define the infrastructure as code and to simplify the deployment. Sharp will be used to resize the images. The Lambda function will be written using the Node.js 14.x runtime.
One of the most common Lambda patterns is to transform data stored inside Amazon S3. Generally, a Lambda function is invoked after a file has been uploaded; it retrieves that file, applies any needed transformation (e.g. converting the file type), and stores the result in S3.
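As a reference point, that classic upload-triggered pattern looks roughly like the Ruby sketch below (the post itself uses Node.js; the destination bucket suffix and the placeholder transform are assumptions): the function is invoked by the S3 event, fetches the object, transforms it, and writes the result back to S3.

require 'aws-sdk-s3'

S3 = Aws::S3::Client.new

# Placeholder for the real transformation (the post resizes images with Sharp).
def transform(data)
  data
end

def handler(event:, context:)
  record = event['Records'].first
  bucket = record['s3']['bucket']['name']
  key    = record['s3']['object']['key']

  # Fetch the uploaded object, transform it, and store the result elsewhere.
  original    = S3.get_object(bucket: bucket, key: key).body.read
  transformed = transform(original)
  S3.put_object(bucket: "#{bucket}-processed", key: key, body: transformed)
end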
That pattern works well; however, it requires work to be done on every file regardless of whether it will ever be accessed in the future. If you needed to convert a file on the fly, you would have had to create a Lambda function, invoke it via Amazon API Gateway, and wait for the Lambda to perform the transformation.
AWS recently introduced Amazon S3 Object Lambda, announced in a good post by Danilo Poccia. S3 Object Lambda allows you to create a Lambda function directly connected to an S3 bucket (using S3 Access Points) that is automatically invoked when an object is retrieved from S3!
That means our application only needs to send an S3 GetObject request to retrieve either the original or the transformed data.
Also, a very important peculiarity of Amazon S3 Object Lambda is that the file you want to retrieve doesn't need to exist on S3! We will make use of this in our scenario.
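To make the flow concrete, here is a minimal Ruby sketch of the shape of an S3 Object Lambda handler (again, the post's real implementation is Node.js with Sharp): the event carries a presigned inputS3Url for the requested object, and the result is returned to the caller via WriteGetObjectResponse.

require 'aws-sdk-s3'
require 'net/http'

S3 = Aws::S3::Client.new

def handler(event:, context:)
  ctx = event['getObjectContext']

  # Presigned URL pointing at the original object behind the access point.
  original = Net::HTTP.get(URI(ctx['inputS3Url']))

  # Hypothetical transformation; the post resizes images here instead.
  transformed = original

  # Send the transformed bytes back to the caller of GetObject.
  S3.write_get_object_response(
    request_route: ctx['outputRoute'],
    request_token: ctx['outputToken'],
    body: transformed
  )

  { status_code: 200 }
end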
Note: high-level AWS CLI S3 commands (e.g. aws s3 cp) don't currently support S3 Object Lambda; instead we need to use low-level S3 API commands (e.g. aws s3api get-object).
In his post, Danilo highlighted the most common use cases for Amazon S3 Object Lambda:
2020 was a difficult year for all of us, and it was no different for engineering teams. Many software releases were postponed, and the industry slowed its development speed quite a bit.
But at least at AWS, some teams released updates out of the door at the end of the year. AWS Lambda received two significant improvements:
With these two new features and Lambda Layers, we now have three ways to add code to Lambda that isn’t directly part of our Lambda function.
The question is now: when should we use what?
In this article, I try to shine some light on Lambda Layers, Lambda Extensions, and Docker images for Lambda.
First things first. All these Lambda features can be used together. So if you think about where to put your code, at least your decisions aren’t mutually exclusive. You can upload a Docker image and attach a regular Lambda Layer and a Lambda Extension. The same is possible if your Lambda function is based on a ZIP archive.
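For a ZIP-packaged function, attaching a layer (and extensions are shipped as layers too) is a single AWS CLI call; the function name and layer ARN below are placeholders:

aws lambda update-function-configuration \
  --function-name my-function \
  --layers arn:aws:lambda:eu-west-1:123456789012:layer:my-layer:1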
What does this all mean? Keep reading and find out.
In this Serverless Saturday video, we’ll be going over how to create your first AWS Lambda function!
In the next video, we’ll be covering how to set up CI/CD with your AWS Lambda function so stay tuned and make sure to subscribe!
To get started, log in to your AWS account here: https://aws.amazon.com/console/
In this video we are going to learn the differences between EC2 and Lambda by going over: