Getting Started with AWS Lambda and Go

Originally published by Ran Ribenzaft at https://epsagon.com

Go is undoubtedly one of the fastest-growing languages today. Since its 1.0 release in March 2012, it has been adopted across a vast number of industries, particularly in the cloud computing space. From microservices to the tools and components that power some of the largest cloud infrastructures, it’s hard to ignore Go’s contribution.

Projects such as Docker and Kubernetes have helped shape the way we run code at scale today, while others like Terraform leverage infrastructure as code in the current multi-cloud and multi-provider world.

With such a track record, it’s no wonder that in a recent HackerRank study of over 70,000 developers, 37.2% indicated that Go is a language they wish to learn in 2019. No other language scored as high, making Go, per this particular study, the language most developers want to learn in 2019.

Given this popularity, it’s also no surprise that AWS – a pioneer and de facto leader in serverless computing – had already added Go as a supported runtime to AWS Lambda back at the beginning of 2018.

Advantages of Go for AWS Lambda

While all Lambda runtimes offer the same advantages in terms of scalability and share many concepts, there are some notable advantages to using Go: its runtime versioning scheme, cold start performance, and pricing.

Runtime Versioning

The Go runtime for AWS Lambda has a very significant difference from every other runtime available today. While other runtimes support only specific versions of a language (for example, Python 2.7, 3.6, and 3.7 are three separate runtimes), the Go runtime supports any 1.x release, which already spans over seven years of releases.

Whenever a new version of Go is released, it’s possible to use it from day one, without having to wait for AWS to release a newly updated runtime. Such a feature would not be possible were Go not a statically compiled language with its Go 1 compatibility guarantee.

Cold Starts

When a Lambda function hasn’t had an invocation for a while, or when a spike in traffic requires additional functions to spawn, there’s a small penalty associated with it – the often dreaded cold start. Fortunately, Go has one of the fastest cold start times. You can read more on how to minimize AWS Lambda cold starts here.

Pricing

AWS Lambda pricing has two components: a charge per invocation and a charge for the duration of each invocation, rounded up to the nearest 100ms. The price of each 100ms slice also depends on the amount of memory allocated to the function.

While Go isn’t necessarily as memory-hungry as some dynamic languages, there is a small catch: Lambda allocates CPU power in direct proportion to the memory you configure, so a smaller memory setting also means a slower CPU. Go may squeeze more performance out of a throttled CPU than some other languages, but this remains an important point to consider. If you wish to learn more about this particular topic, be sure to read How to Make Lambda Faster: Memory Performance Benchmark.
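
To make the pricing model concrete, here is a minimal sketch that estimates a monthly bill for an example workload. The per-request and per-GB-second rates below are illustrative only, so check the current AWS Lambda pricing page before relying on them:

package main

import "fmt"

func main() {
    // Illustrative rates only; check the AWS Lambda pricing page for current figures.
    const (
        invocations      = 1000000      // invocations per month
        billedSeconds    = 0.3          // 300ms per invocation, already rounded up to 100ms
        memoryGB         = 0.125        // a 128MB function
        pricePerRequest  = 0.0000002    // USD per request (example rate)
        pricePerGBSecond = 0.0000166667 // USD per GB-second (example rate)
    )

    requestCost := invocations * pricePerRequest
    computeCost := invocations * billedSeconds * memoryGB * pricePerGBSecond
    fmt.Printf("requests: $%.2f  compute: $%.2f  total: $%.2f\n",
        requestCost, computeCost, requestCost+computeCost)
}

For this illustrative workload, a million 300ms invocations of a 128MB function, the total comes out to well under a dollar per month.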

With all this in mind, you can rest assured that Go is an excellent choice for AWS Lambda: A performant, cost-efficient language with the bonus of being able to run the latest (or even prerelease) version of the Go language without waiting for AWS to update their runtimes.

Getting Started

So, let’s get some coding done. You are going to create an HTTP-triggered AWS Lambda function that responds with JSON. We’ll start by returning a response that contains the current time in UTC and then improve it to detect the requester’s timezone using a free third-party API, so the response can also include the local time.

Before we can start, however, there are a few prerequisites we must fulfill.

Prerequisites

You’ll need an AWS account for this. If you don’t yet have one, sign up for a free account here. The AWS free tier includes one million invocations every month and enough credit to run a single 128MB function continuously, all at no charge.

For building and deploying your functions, you’ll be using the Serverless Framework, which is the most widely used tool for the job. Assuming you have a recent version of Node.js installed, you can install the Serverless CLI with the following npm command:

$ npm install -g serverless

Once you have the Serverless CLI installed, you must configure it to use the AWS access keys of your account:

$ serverless config credentials --provider aws --key <access key ID> --secret <secret access key>

If you don’t have Go installed yet, you can either download an installer from the official website or use your favorite package manager to install it.

The Lambda-Time Function

Now that you have everything you need, let’s create a new project using Go modules for your function. Let’s name the module lambda-time. In a new, empty directory, initialize the Go module, and install the AWS Lambda for Go library:

$ go mod init lambda-time
$ go get github.com/aws/aws-lambda-go

After this, you can proceed to create a main.go file that implements your handler function and starts the process:

package main

import (
    "context"
    "encoding/json"
    "time"

    "github.com/aws/aws-lambda-go/events"
    "github.com/aws/aws-lambda-go/lambda"
)

// response defines the JSON body returned for a successful invocation.
type response struct {
    UTC time.Time `json:"utc"`
}

// handleRequest builds the response, serializes it to JSON, and returns it
// with an HTTP 200 status code; a serialization error fails the invocation.
func handleRequest(ctx context.Context, request events.APIGatewayProxyRequest) (events.APIGatewayProxyResponse, error) {
    now := time.Now()
    resp := &response{
        UTC: now.UTC(),
    }
    body, err := json.Marshal(resp)
    if err != nil {
        return events.APIGatewayProxyResponse{}, err
    }
    return events.APIGatewayProxyResponse{Body: string(body), StatusCode: 200}, nil
}

func main() {
    lambda.Start(handleRequest)
}

The previous code can be broken down into a few simple steps:

  • Define a response struct that supports JSON serialization and defines the HTTP response body of a successful invocation of your AWS Lambda function.
  • Create a request handler function, which creates a response struct containing the current time in UTC and then proceeds to serialize it as JSON. In case the serialization fails, you return the error; if everything goes well, you respond with your serialized JSON as the response body and a status code of 200.
  • Register your handler function in the main function using the AWS Lambda for Go library.

The Handler Function

It’s worth taking some time to understand how the handler function works. While there are multiple valid handler signatures, the one used here is the most complete form. The context argument provides information on the invoked function, its environment, and the deadline of the invocation. Returning an error value from the handler signals that the invocation failed and automatically logs the value of the error.
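
For reference, the library also accepts simpler handler shapes; the sketch below shows two of them (the function names and log message are illustrative, not from the original code):

package main

import (
    "context"
    "log"

    "github.com/aws/aws-lambda-go/events"
    "github.com/aws/aws-lambda-go/lambda"
)

// Input only: the event is still deserialized for you, but you give up
// access to the invocation context.
func handleWithoutContext(request events.APIGatewayProxyRequest) (events.APIGatewayProxyResponse, error) {
    return events.APIGatewayProxyResponse{Body: "ok", StatusCode: 200}, nil
}

// Context only: useful for scheduled invocations that carry no meaningful payload.
func handleContextOnly(ctx context.Context) error {
    if deadline, ok := ctx.Deadline(); ok {
        log.Printf("this invocation must finish by %s", deadline)
    }
    return nil
}

func main() {
    // Only one handler can be registered per function; pick whichever shape fits.
    lambda.Start(handleWithoutContext)
}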

That leaves the request and response structs in your handler function signature. Lambda functions are invoked either by AWS services or by using an AWS SDK (e.g., from another Lambda function). Data passed in and out of a Lambda function is in JSON format. In your case, the AWS Lambda for Go library automatically handles the serialization and deserialization between JSON and Go values.

When calling Lambda functions using the AWS SDK, the structure of the input and output JSON data is up to the developer. For AWS Lambda functions invoked by AWS services, the data structure depends on the invoking service. Amazon API Gateway is the service that triggers Lambda functions in response to HTTP calls. For API Gateway, this means the request is always of type events.APIGatewayProxyRequest and the response will always be of type events.APIGatewayProxyResponse.

The AWS Lambda for Go library contains the data definitions for each AWS service that can invoke Lambda functions.
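
For instance, a function triggered by S3 object notifications instead of API Gateway would simply take events.S3Event as its input type. A minimal sketch (the handler name and log message are illustrative):

package main

import (
    "context"
    "log"

    "github.com/aws/aws-lambda-go/events"
    "github.com/aws/aws-lambda-go/lambda"
)

// handleS3 receives S3 notification events; the library deserializes the
// records for you, just as it does for API Gateway requests.
func handleS3(ctx context.Context, event events.S3Event) error {
    for _, record := range event.Records {
        log.Printf("object %q was written to bucket %q",
            record.S3.Object.Key, record.S3.Bucket.Name)
    }
    return nil
}

func main() {
    lambda.Start(handleS3)
}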

Deployment

Your function is now ready, and you can deploy it with the Serverless Framework. For that, first create a serverless.yml file that defines what is being deployed:

service: lambda-time
provider:
  name: aws
  runtime: go1.x
package:
  exclude:
    - ./**
  include:
    - ./bin/**
functions:
  lambda-time:
    handler: bin/lambda-time
    events:
      - http:
          path: /
          method: get

Here, both the service and the function are named lambda-time, but a service can contain multiple functions with different names. The http event configures API Gateway so that the function responds to HTTP requests with a particular method at a given request path, as illustrated below.
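
For illustration, adding a second function to the same service could look like this (the lambda-date name, handler, and route are hypothetical, not part of this walkthrough):

functions:
  lambda-time:
    handler: bin/lambda-time
    events:
      - http:
          path: /
          method: get
  lambda-date:
    handler: bin/lambda-date
    events:
      - http:
          path: /date
          method: get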

Next up, build the code as an x86-64 Linux executable, and deploy it:

$ GOOS=linux GOARCH=amd64 go build -o bin/lambda-time .
$ serverless deploy

Once finished, the command prints the URL for the endpoint. Open it, and make sure it responds with the current time.
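
A quick check from the command line should return a small JSON document; the endpoint URL and timestamp below are illustrative:

$ curl https://abc123.execute-api.us-east-1.amazonaws.com/dev/
{"utc":"2019-08-26T12:34:56.789Z"}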

Improving Your Service

Now that you have a working service, you can improve it by also returning the time in the local timezone of the requester (based on IP address). To do so, add an HTTP client to your Lambda function along with a function that returns the timezone for a given IP address:

// Note: this also requires adding "net/http" and "io/ioutil" to the import block.
var httpClient = &http.Client{}

// timezone looks up the IANA timezone name for the given IP address via the
// free ipapi.co API and returns the corresponding *time.Location, or nil if
// the lookup fails for any reason.
func timezone(ip string) *time.Location {
    resp, err := httpClient.Get("https://ipapi.co/" + ip + "/timezone/")
    if err != nil {
        return nil
    }
    defer resp.Body.Close()

    tz, err := ioutil.ReadAll(resp.Body)
    if err != nil {
        return nil
    }

    loc, err := time.LoadLocation(string(tz))
    if err != nil {
        return nil
    }
    return loc
}
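
The snippet above only performs the lookup. Here is a minimal sketch of how it could be wired into the existing handler; the Local field, its JSON name, and reading the caller's address from the API Gateway request context are assumptions rather than part of the original article:

type response struct {
    UTC   time.Time  `json:"utc"`
    Local *time.Time `json:"local,omitempty"`
}

func handleRequest(ctx context.Context, request events.APIGatewayProxyRequest) (events.APIGatewayProxyResponse, error) {
    now := time.Now()
    resp := &response{UTC: now.UTC()}

    // API Gateway exposes the requester's address in the request context.
    if loc := timezone(request.RequestContext.Identity.SourceIP); loc != nil {
        local := now.In(loc)
        resp.Local = &local
    }

    body, err := json.Marshal(resp)
    if err != nil {
        return events.APIGatewayProxyResponse{}, err
    }
    return events.APIGatewayProxyResponse{Body: string(body), StatusCode: 200}, nil
}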

After rebuilding and redeploying the Lambda function, the response now includes the local time as well (assuming it can be determined from the requester’s IP address):

$ GOOS=linux GOARCH=amd64 go build -o bin/lambda-time .
$ serverless deploy

You have now built a Lambda function that, while simple, demonstrates how to build an actual service. However, to make it production-ready, you’re still missing some crucial elements: monitoring, instrumentation, and tracing. Don’t wait for your first outage before putting these in place.

Monitoring, Instrumentation, and Tracing

Epsagon has designed and built an extremely easy-to-use Go library for this purpose. If you haven’t signed up yet for a free trial, go ahead and get started!

Once you have your Epsagon API token, add it as an environment variable to the provider section of the serverless.yml so that it applies to all of your functions:

provider:
  name: aws
  runtime: go1.x
  environment:
    EPSAGON_TOKEN: "<Epsagon API token>"

When done, install the library:

$ go get github.com/epsagon/epsagon-go

By making only two minimal changes to your service, you will be able to add tracing and instrumentation:

// Additions to the existing import block; "net/http" is still needed for
// http.Client, while the wrappers/net/http package is used under its
// package name, epsagonhttp.
import (
    "github.com/epsagon/epsagon-go/epsagon"
    "github.com/epsagon/epsagon-go/wrappers/net/http"
)

// Wrapping the client traces outbound HTTP calls such as the ipapi.co lookup.
var httpClient = epsagonhttp.Wrap(http.Client{})

func main() {
    lambda.Start(epsagon.WrapLambdaHandler(
        &epsagon.Config{ApplicationName: "lambda-time"},
        handleRequest,
    ))
}

All you had to do was wrap both your Lambda function handler and your HTTP client; the Epsagon library handles the rest. You now get full tracing and instrumentation, not only for your service but for its outbound calls as well:

[Image: Epsagon architecture view]

You also get monitoring for all of your Lambda invocations and much more.

Summary

In this post, we’ve looked at some of the advantages of using Go for writing AWS Lambda functions. We saw what it takes to set up an environment to develop, build, and deploy functions with the Serverless Framework. We then proceeded to create and deploy a Lambda function behind an API Gateway that showcased a real-world usage scenario of calling third-party APIs. Finally, we added monitoring, instrumentation, and tracing by using the Epsagon for Go library.

Equipped with this knowledge, you can now start developing microservices in Go and deploying them in a serverless fashion using AWS Lambda. Remember – Serverless platforms can take care of running and scaling your code, but being able to troubleshoot and fix production issues quickly is vital to running a successful business. For this latter part, you can count on Epsagon to help you effectively monitor and troubleshoot your applications for optimal results.

