akshay L

AWS Terraform Tutorial | DevOps Terraform | Terraform Tutorial For Beginners | Intellipaat

In this AWS Terraform tutorial for beginners video, you will learn what AWS DevOps Terraform is, get an overview of AWS Cloud and the various Terraform providers, and see how to create an EC2 instance, install Terraform, and work with the various attributes and variables in detail.
link: https://www.youtube.com/watch?v=Fc5c1o9VMUM

#awsterraformtutorial #devopsterraform #terraformtutorial #terraformaws


Getting Started With Terraform Modules

Introduction

In this article, we will see a gentle introduction to Terraform modules: how to pass data into a module, how to get something out of it, and how to create a resource (a GKE cluster). It is intended to be as simple as possible, just so you are aware of what a module is composed of and how you can write your own. Sometimes it makes sense to have modules that abstract implementations you use across several projects, or things that are often repeated within a project. So let's see what it takes to create and use a module.

The source code for this article can be found here. Note that in this example I'm using GCP, since they give you $300 USD of credit for a year to try their services, and it looks pretty good so far. After signing up, you will need to go to IAM, create a service account, and then export its key (this is required for the Terraform provider to talk to GCP).

Composition of a Module

A module can be any folder with a main.tf file in it; that is the only file required for a module to be usable. The recommendation, however, is that you also include a README.md file with a description of the module if it's intended to be used by people (for a sub-module this isn't necessary), along with a variables.tf file and an outputs.tf file. If it's a big module that cannot be split into sub-modules, you can split those files for convenience or readability. Variables should have descriptions so the tooling can show you what they are for. You can read more about the basics of modules here.

Before moving on, let's see the folder structure of our project:

├── account.json
├── LICENSE
├── main.tf
├── module
│   ├── main.tf
│   ├── outputs.tf
│   └── variables.tf
├── README.md
└── terraform.tfvars

1 directory, 8 files

The Project

Let's start with the main.tf that will call our module. Notice that I added a few additional comments, but it's pretty much straightforward: we set the provider, then we define some variables, call our module, and print some output (outputs can also be used to pass data between modules).

## Set the provider to be able to talk to GCP
provider "google" {
  credentials = "${file("account.json")}"
  project     = "${var.project_name}"
  region      = "${var.region}"
}

## Variable definition
variable "project_name" {
  default = "testinggcp"
  type    = "string"
}

variable "cluster_name" {
  default = "demo-terraform-cluster"
  type    = "string"
}

variable "region" {
  default = "us-east1"
  type    = "string"
}

variable "zone" {
  default = "us-east1-c"
  type    = "string"
}

## Call our module, passing zone and cluster_name in
module "terraform-gke" {
  source = "./module"
  zone = "${var.zone}"
  cluster_name = "${var.cluster_name}"
}

## Print the value of k8s_master_version
output "kubernetes-version" {
  value = module.terraform-gke.k8s_master_version
}
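The module itself lives in the module/ folder of the linked repository and is not reproduced here. To make the data flow concrete, here is a minimal sketch of what its three files could contain, consistent with the inputs (zone, cluster_name) and output (k8s_master_version) used above; the resource arguments such as initial_node_count are illustrative assumptions, not the article's exact code:

## module/variables.tf -- the inputs this module expects from the caller
variable "cluster_name" {
  description = "Name of the GKE cluster to create"
  type        = "string"
}

variable "zone" {
  description = "Zone where the cluster will be created"
  type        = "string"
}

## module/main.tf -- the resource the module manages
resource "google_container_cluster" "cluster" {
  name               = "${var.cluster_name}"
  zone               = "${var.zone}"
  initial_node_count = 1   ## assumed node count for this sketch
}

## module/outputs.tf -- values the module exposes back to the caller
output "k8s_master_version" {
  value = "${google_container_cluster.cluster.master_version}"
}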

Then terraform.tfvars has some values to override the defaults that we defined:

project_name = "testingcontainerengine"
cluster_name = "demo-cluster"
region = "us-east1"
zone = "us-east1-c"
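With the variables in place, the standard Terraform workflow applies from the project root:

terraform init     ## downloads the google provider and initialises the local module
terraform plan     ## shows the GKE cluster that would be created
terraform apply    ## creates the cluster and prints the kubernetes-version output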

#tutorial #devops #terraform #gcp cloud #terraform tutorial #kubernetes for beginners #terraform modules

Christa Stehr

How To Unite AWS KMS with Serverless Application Model (SAM)

The Basics

AWS KMS is a Key Management Service that lets you create cryptographic keys that you can use to encrypt and decrypt data, as well as other keys. You can read more about it here.

Important Points About Keys

Please note that the customer master keys (CMKs) generated can only be used to encrypt small amounts of data like passwords or RSA keys. You can use AWS KMS CMKs to generate, encrypt, and decrypt data keys. However, AWS KMS does not store, manage, or track your data keys, or perform cryptographic operations with them.

You must use and manage data keys outside of AWS KMS. The KMS API uses AWS KMS CMKs in its encryption operations, and these operations cannot accept more than 4 KB (4096 bytes) of data. To encrypt application data, use the server-side encryption features of an AWS service, or a client-side encryption library such as the AWS Encryption SDK or the Amazon S3 encryption client.

Scenario

We want to create signup and login forms for a website.

Passwords should be encrypted and stored in a DynamoDB database.

What do we need?

  1. KMS key to encrypt and decrypt data.
  2. DynamoDB table to store passwords (a sketch of this table appears at the end of this post).
  3. Lambda functions & APIs to process the login and sign-up forms.
  4. Sign-up/login forms in HTML.

Let's Implement It as a Serverless Application Model (SAM)!

Let's first create the key that we will use to encrypt and decrypt passwords.

KmsKey:
    Type: AWS::KMS::Key
    Properties: 
      Description: CMK for encrypting and decrypting
      KeyPolicy:
        Version: '2012-10-17'
        Id: key-default-1
        Statement:
        - Sid: Enable IAM User Permissions
          Effect: Allow
          Principal:
            AWS: !Sub arn:aws:iam::${AWS::AccountId}:root
          Action: kms:*
          Resource: '*'
        - Sid: Allow administration of the key
          Effect: Allow
          Principal:
            AWS: !Sub arn:aws:iam::${AWS::AccountId}:user/${KeyAdmin}
          Action:
          - kms:Create*
          - kms:Describe*
          - kms:Enable*
          - kms:List*
          - kms:Put*
          - kms:Update*
          - kms:Revoke*
          - kms:Disable*
          - kms:Get*
          - kms:Delete*
          - kms:ScheduleKeyDeletion
          - kms:CancelKeyDeletion
          Resource: '*'
        - Sid: Allow use of the key
          Effect: Allow
          Principal:
            AWS: !Sub arn:aws:iam::${AWS::AccountId}:user/${KeyUser}
          Action:
          - kms:DescribeKey
          - kms:Encrypt
          - kms:Decrypt
          - kms:ReEncrypt*
          - kms:GenerateDataKey
          - kms:GenerateDataKeyWithoutPlaintext
          Resource: '*'

The important thing in the above snippet is the KeyPolicy. KMS requires a key administrator and a key user, and as a best practice your key administrator and key user should be two separate users in your organisation. We are also allowing all permissions to the root user, so if your key administrator leaves the organisation, the root user will still be able to delete this key. As you can see, **KeyAdmin** can manage the key but not use it, and **KeyUser** can only use the key. **${KeyAdmin}** and **${KeyUser}** are parameters in the SAM template.
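Their declaration is not shown in this excerpt; a minimal sketch of what it could look like, assuming plain String parameters (the descriptions are assumptions):

Parameters:
  KeyAdmin:
    Type: String
    Description: IAM user name of the key administrator
  KeyUser:
    Type: String
    Description: IAM user name allowed to use the key for encrypt/decrypt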

You would be asked to provide values for these parameters during SAM Deploy.
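For item 2 of the list above, the DynamoDB table that stores the encrypted passwords can live in the same template. A minimal sketch, assuming the table is keyed by username (the table and attribute names are illustrative, not the article's exact code):

UsersTable:
    Type: AWS::DynamoDB::Table
    Properties:
      TableName: users
      BillingMode: PAY_PER_REQUEST   # on-demand capacity; an assumption for this sketch
      AttributeDefinitions:
        - AttributeName: username
          AttributeType: S
      KeySchema:
        - AttributeName: username
          KeyType: HASH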

#aws #serverless #aws-sam #aws-key-management-service #aws-certification #aws-api-gateway #tutorial-for-beginners #aws-blogs

Jeromy Lowe

Data Visualization in R with ggplot2: A Beginner Tutorial

A famous general is thought to have said, “A good sketch is better than a long speech.” That advice may have come from the battlefield, but it’s applicable in lots of other areas — including data science. “Sketching” out our data by visualizing it using ggplot2 in R is more impactful than simply describing the trends we find.

This is why we visualize data. We visualize data because it’s easier to learn from something that we can see rather than read. And thankfully for data analysts and data scientists who use R, there’s a tidyverse package called ggplot2 that makes data visualization a snap!

In this blog post, we'll learn how to take some data and produce a visualization using R. To work through it, it's best if you already have an understanding of R programming syntax, but you don't need to be an expert or have any prior experience working with ggplot2.

#data science tutorials #beginner #ggplot2 #r #r tutorial #r tutorials #rstats #tutorial #tutorials

Nella Brown

What is AWS DevOps? - AWS DevOps Tutorial

AWS and DevOps are two of the most powerful technologies, and they can be of great use when combined. This short blog on what AWS DevOps is will help you get a clear idea of AWS, DevOps, and the outcome when these two technologies become one. Later in the course of this blog, you will also learn briefly about the tools and technologies of AWS DevOps, its advantages, and how AWS DevOps and Azure DevOps differ from each other. Get ready to learn in depth about this popular field in this tutorial.

Technology evolves rapidly and will only keep growing in the future. The amount of change this field has seen in the last two decades is far greater than in the years before. AWS and DevOps are two significant technologies in the field of Cloud Computing, a popular IT domain. This blog talks about these two technologies in particular and the benefits they bring when put together.

#aws #cloud computing #devops #aws devops