1617271200
In this example we are going to use LocalStack and Golang to work with AWS Simple Storage Service (S3). We will create a new bucket, upload an object to a bucket, download an object from a bucket, delete an object from a bucket, and list the objects in a bucket.
├── assets
│   ├── id.txt
│   └── logo.png
├── internal
│   ├── bucket
│   │   └── bucket.go
│   └── pkg
│       └── cloud
│           ├── aws
│           │   ├── aws.go
│           │   └── s3.go
│           ├── client.go
│           └── model.go
├── main.go
└── tmp
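With LocalStack listening on its default edge port, the flow above can be sketched with aws-sdk-go v1. This is a minimal sketch, not the repository's actual code: the endpoint, region, bucket name, and key are assumptions, and the dummy credentials only exist to satisfy the SDK. Path-style addressing keeps the SDK from resolving bucket subdomains against localhost.

```go
package main

import (
	"bytes"
	"fmt"
	"io"
	"log"

	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/credentials"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/s3"
)

func main() {
	// Point the SDK at LocalStack's edge endpoint.
	sess, err := session.NewSession(&aws.Config{
		Region:           aws.String("us-east-1"),
		Endpoint:         aws.String("http://localhost:4566"),
		S3ForcePathStyle: aws.Bool(true),
		Credentials:      credentials.NewStaticCredentials("test", "test", ""),
	})
	if err != nil {
		log.Fatal(err)
	}
	svc := s3.New(sess)

	// Create a new bucket.
	if _, err := svc.CreateBucket(&s3.CreateBucketInput{
		Bucket: aws.String("demo-bucket"),
	}); err != nil {
		log.Fatal(err)
	}

	// Upload an object to the bucket.
	if _, err := svc.PutObject(&s3.PutObjectInput{
		Bucket: aws.String("demo-bucket"),
		Key:    aws.String("id.txt"),
		Body:   bytes.NewReader([]byte("hello")),
	}); err != nil {
		log.Fatal(err)
	}

	// Download the object back and print its contents.
	obj, err := svc.GetObject(&s3.GetObjectInput{
		Bucket: aws.String("demo-bucket"),
		Key:    aws.String("id.txt"),
	})
	if err != nil {
		log.Fatal(err)
	}
	defer obj.Body.Close()
	body, err := io.ReadAll(obj.Body)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(string(body))

	// List the objects in the bucket.
	list, err := svc.ListObjectsV2(&s3.ListObjectsV2Input{
		Bucket: aws.String("demo-bucket"),
	})
	if err != nil {
		log.Fatal(err)
	}
	for _, o := range list.Contents {
		fmt.Println(*o.Key)
	}

	// Delete the object from the bucket.
	if _, err := svc.DeleteObject(&s3.DeleteObjectInput{
		Bucket: aws.String("demo-bucket"),
		Key:    aws.String("id.txt"),
	}); err != nil {
		log.Fatal(err)
	}
}
```

Running it requires LocalStack up (e.g. via Docker) with the S3 service enabled.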
#aws #go #golang
1617291120
In this example we are going to use LocalStack and Golang to work with AWS Simple Notification Service (SNS). We will create a new topic, list all topics, subscribe to a topic, list all topic subscriptions, publish to a topic, and unsubscribe from a topic.
├── internal
│   ├── pkg
│   │   └── cloud
│   │       ├── aws
│   │       │   ├── aws.go
│   │       │   └── sns.go
│   │       ├── client.go
│   │       └── model.go
│   └── pubsub
│       └── pubsub.go
└── main.go
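As a sketch of the same operations against LocalStack with aws-sdk-go v1 (topic name, subscriber endpoint, and credentials are all illustrative, not the repository's code):

```go
package main

import (
	"fmt"
	"log"

	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/credentials"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/sns"
)

func main() {
	// Point the SDK at LocalStack's edge endpoint.
	sess, err := session.NewSession(&aws.Config{
		Region:      aws.String("us-east-1"),
		Endpoint:    aws.String("http://localhost:4566"),
		Credentials: credentials.NewStaticCredentials("test", "test", ""),
	})
	if err != nil {
		log.Fatal(err)
	}
	svc := sns.New(sess)

	// Create a topic (idempotent: returns the existing ARN if present).
	topic, err := svc.CreateTopic(&sns.CreateTopicInput{
		Name: aws.String("demo-topic"),
	})
	if err != nil {
		log.Fatal(err)
	}

	// List all topics.
	topics, err := svc.ListTopics(&sns.ListTopicsInput{})
	if err != nil {
		log.Fatal(err)
	}
	for _, t := range topics.Topics {
		fmt.Println(*t.TopicArn)
	}

	// Subscribe an HTTP endpoint (illustrative) to the topic.
	sub, err := svc.Subscribe(&sns.SubscribeInput{
		TopicArn: topic.TopicArn,
		Protocol: aws.String("http"),
		Endpoint: aws.String("http://localhost:8080/receive"),
	})
	if err != nil {
		log.Fatal(err)
	}

	// List the topic's subscriptions.
	subs, err := svc.ListSubscriptionsByTopic(&sns.ListSubscriptionsByTopicInput{
		TopicArn: topic.TopicArn,
	})
	if err != nil {
		log.Fatal(err)
	}
	for _, s := range subs.Subscriptions {
		fmt.Println(*s.SubscriptionArn)
	}

	// Publish a message to the topic.
	if _, err := svc.Publish(&sns.PublishInput{
		TopicArn: topic.TopicArn,
		Message:  aws.String("hello"),
	}); err != nil {
		log.Fatal(err)
	}

	// Unsubscribe again.
	if _, err := svc.Unsubscribe(&sns.UnsubscribeInput{
		SubscriptionArn: sub.SubscriptionArn,
	}); err != nil {
		log.Fatal(err)
	}
}
```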
#aws #go #golang
1617362280
In this example we are going to use LocalStack and Golang to work with AWS Simple Queue Service (SQS). We will create queues, send messages, and receive messages, as well as do some other minor work.
├── internal
│   ├── message
│   │   └── message.go
│   └── pkg
│       └── cloud
│           ├── aws
│           │   ├── aws.go
│           │   └── sqs.go
│           ├── client.go
│           └── model.go
└── main.go
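The send/receive round trip can be sketched like this with aws-sdk-go v1 against LocalStack (queue name and credentials are assumptions, not the repository's code):

```go
package main

import (
	"fmt"
	"log"

	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/credentials"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/sqs"
)

func main() {
	// Point the SDK at LocalStack's edge endpoint.
	sess, err := session.NewSession(&aws.Config{
		Region:      aws.String("us-east-1"),
		Endpoint:    aws.String("http://localhost:4566"),
		Credentials: credentials.NewStaticCredentials("test", "test", ""),
	})
	if err != nil {
		log.Fatal(err)
	}
	svc := sqs.New(sess)

	// Create a queue; the response carries its URL.
	q, err := svc.CreateQueue(&sqs.CreateQueueInput{
		QueueName: aws.String("demo-queue"),
	})
	if err != nil {
		log.Fatal(err)
	}

	// Send a message.
	if _, err := svc.SendMessage(&sqs.SendMessageInput{
		QueueUrl:    q.QueueUrl,
		MessageBody: aws.String("hello"),
	}); err != nil {
		log.Fatal(err)
	}

	// Receive with a short long-poll, then delete what we got.
	out, err := svc.ReceiveMessage(&sqs.ReceiveMessageInput{
		QueueUrl:            q.QueueUrl,
		MaxNumberOfMessages: aws.Int64(1),
		WaitTimeSeconds:     aws.Int64(2),
	})
	if err != nil {
		log.Fatal(err)
	}
	for _, msg := range out.Messages {
		fmt.Println(*msg.Body)
		if _, err := svc.DeleteMessage(&sqs.DeleteMessageInput{
			QueueUrl:      q.QueueUrl,
			ReceiptHandle: msg.ReceiptHandle,
		}); err != nil {
			log.Fatal(err)
		}
	}
}
```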
#aws #go #golang
1601368680
I would say that it is very rare to find a situation in web development where we don't need both file storage and a database.
The file storage can be used to handle simple situations like application settings, connection strings, and some customer data. At the same time, this resource can be used for complex scenarios like data archiving / analytics, and even for static content.
However, if you need to store complex data, or you need performance and scalability, a database might be the right choice.
So, let's imagine a situation in which we need to extract the data from our file storage and put it into our database, simple as that.
Secondly, we will assume that our file storage is the AWS Simple Storage Service (S3) and the database is Amazon DynamoDB.
And finally, let's suppose that the files are not too big (under 500MB) and that they are in CSV format, following the pattern UserIdentifier,Username,Language. An example of the data can be:
123456,Darth Vader,Python
774477,Yoda,Golang
999000,Chewbacca,Javascript
There are a few approaches that we can use to solve our problem, each pitched by AWS itself:
AWS Data Pipeline: "Easily automate the movement and transformation of data."
AWS Glue: "Simple, flexible, and cost-effective ETL."
Amazon Athena: "Start querying data instantly. Get results in seconds. Pay only for the queries you run."
So, all these tools are awesome and can be used without any issue to achieve what we are looking for.
Nevertheless, there are some scenarios where we don't need to use a shovel to burst a balloon, right? We can just use a needle :)
Assuming that we want a simple solution, easy to maintain, and limited to small/medium files, a Lambda function fits perfectly in this situation.
The idea is very simple: our Lambda function will be triggered by S3, and will save the data in DynamoDB.
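The shape of that trigger can be sketched with the aws-lambda-go runtime library. This skeleton only shows the wiring; the S3 download, CSV parsing, and DynamoDB writes are left as comments, and nothing here is the article's actual implementation:

```go
package main

import (
	"context"
	"fmt"

	"github.com/aws/aws-lambda-go/events"
	"github.com/aws/aws-lambda-go/lambda"
)

// handler is invoked by S3 once per batch of object-created events.
func handler(ctx context.Context, evt events.S3Event) error {
	for _, rec := range evt.Records {
		// Each record identifies one uploaded CSV file.
		fmt.Printf("new object: s3://%s/%s\n",
			rec.S3.Bucket.Name, rec.S3.Object.Key)
		// From here we would:
		//  1. GetObject the CSV from S3,
		//  2. parse its rows, and
		//  3. PutItem / BatchWriteItem them into DynamoDB.
	}
	return nil
}

func main() {
	lambda.Start(handler)
}
```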
Ok, let’s start!
#s3 #cloud #aws #lambda #golang
1627276637
Amazon Web Services, aka AWS, is a leading cloud infrastructure provider for hosting your servers, applications, databases, networking, domain controllers, and Active Directory in a widespread cloud architecture. AWS provides the Simple Storage Service (S3) for storing your objects or data with eleven 9's (99.999999999%) of data durability. AWS S3 is compliant with PCI-DSS, HIPAA/HITECH, FedRAMP, the EU Data Protection Directive, and FISMA, which helps satisfy regulatory requirements.
When you log in to the AWS portal, you can navigate to the S3 service, choose the required bucket, and download or upload files. Doing this manually in the portal is quite a time-consuming task. Instead, you can use the AWS Command Line Interface (CLI), which works best for bulk file operations with easy-to-use scripts. You can schedule the execution of these scripts for unattended object downloads/uploads.
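A minimal sketch of such a script (the bucket name and paths are illustrative, and it assumes credentials are already configured via `aws configure`):

```shell
#!/bin/sh
# Download a single object, then a whole prefix, from the bucket.
aws s3 cp s3://demo-bucket/reports/january.csv ./january.csv
aws s3 cp s3://demo-bucket/reports/ ./reports/ --recursive

# Upload in the other direction; sync transfers only new or changed
# files, which suits an unattended scheduled job well.
aws s3 sync ./reports/ s3://demo-bucket/reports/
```

Saved as a script, this can be run from cron (Linux) or Task Scheduler (Windows) for the unattended transfers mentioned above.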
#aws #aws cli #aws s3