Justyn Ortiz

Oct 17, 2020

Using the Cloud Foundation Toolkit with Terraform

Last year, we released the Cloud Foundation Toolkit, a set of open source templates that help you quickly build a strong cloud foundation according to best practices. These modules are available both for the Terraform infrastructure-as-code framework and for our own Cloud Deployment Manager.

This blog post details how to build a secure cloud foundation using the Cloud Foundation Toolkit Terraform example foundation. From there, we will explore how to deploy a microservices demo application onto the foundation using Terraform. By the end, we hope you learn how to accomplish the following:

  • Reduce the time required to build out an enterprise cloud foundation that follows Google best practices to less than one day
  • Put your cloud foundation to work by deploying a demo Google Kubernetes Engine (GKE) workload onto it using Terraform
  • Deploy a GKE cluster hardened to Google expert recommendations (a bastion host reached via Identity-Aware Proxy (IAP) TCP forwarding)
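A cluster locked down this way can be sketched in Terraform roughly as follows; the network names, CIDR ranges, and project ID are illustrative assumptions rather than values from the example foundation:

```hcl
# Sketch of a hardened GKE cluster: private nodes, a private control plane,
# and access only through a bastion host reached via IAP TCP forwarding.
# Network names, CIDR ranges, and the project ID are illustrative assumptions.
resource "google_container_cluster" "private" {
  name     = "demo-private-cluster"
  location = "us-central1"
  project  = "my-demo-project"

  network    = "demo-vpc"
  subnetwork = "demo-subnet"

  # Private clusters must be VPC-native; an empty block lets GKE pick ranges.
  ip_allocation_policy {}

  private_cluster_config {
    enable_private_nodes    = true
    enable_private_endpoint = true            # control plane gets no public IP
    master_ipv4_cidr_block  = "172.16.0.0/28"
  }

  master_authorized_networks_config {
    cidr_blocks {
      cidr_block   = "10.0.0.0/24"            # subnet containing the bastion
      display_name = "bastion-subnet"
    }
  }

  remove_default_node_pool = true
  initial_node_count       = 1
}

# Firewall rule admitting IAP's TCP-forwarding source range to the bastion.
resource "google_compute_firewall" "allow_iap_ssh" {
  name    = "allow-iap-to-bastion"
  network = "demo-vpc"
  project = "my-demo-project"

  allow {
    protocol = "tcp"
    ports    = ["22"]
  }

  # 35.235.240.0/20 is the documented source range for IAP TCP forwarding.
  source_ranges = ["35.235.240.0/20"]
  target_tags   = ["bastion"]
}
```

With this shape, `kubectl` traffic flows through an IAP tunnel to the bastion rather than over the public internet.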

Getting started

To get started with using the Cloud Foundation Toolkit, first you need to understand Terraform and Linux command line basics. Then, you will need to make sure you have the following prerequisites.

Prerequisites:

  1. A GCP Organization
  2. A GCP Billing Account
  3. The ability to create Cloud Identity / G Suite groups
  4. Linux command line access with the following installed and configured:
     • Google Cloud SDK
     • Terraform
     • Git

Building out a cloud foundation

First, you will need to clone the Terraform example foundation repository.

git clone https://github.com/terraform-google-modules/terraform-example-foundation.git

This repo contains several distinct Terraform projects, each within its own directory, that must be applied separately but in sequence. Each project layers on top of the previous one, so they run in the following order.

0-bootstrap: The purpose of this step is to bootstrap a GCP organization, creating all the required resources & permissions to start using the Cloud Foundation Toolkit (CFT). This step also configures Cloud Build & Cloud Source Repositories for foundations code in subsequent stages.
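Each stage is applied with a plain `terraform init` and `terraform apply` from its directory. Internally, 0-bootstrap centers on the CFT bootstrap module; a minimal sketch of that style of configuration might look like this, with the org ID, billing account, and group addresses as placeholder assumptions:

```hcl
# Minimal sketch of a bootstrap configuration built on the CFT bootstrap
# module. Org ID, billing account, and group addresses are placeholders.
module "seed_bootstrap" {
  source = "terraform-google-modules/bootstrap/google"

  org_id               = "000000000000"
  billing_account      = "000000-000000-000000"
  group_org_admins     = "gcp-organization-admins@example.com"
  group_billing_admins = "gcp-billing-admins@example.com"
  default_region       = "us-central1"
}
```

Applying it creates a seed project, a Terraform service account, and a state bucket that the later stages build on.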

#google cloud platform #devops & sre

Adaline Kulas

Jul 7, 2020

Multi-cloud Spending: 8 Tips To Lower Cost

A multi-cloud approach is nothing but leveraging two or more cloud platforms for meeting the various business requirements of an enterprise. The multi-cloud IT environment incorporates different clouds from multiple vendors and negates the dependence on a single public cloud service provider. Thus enterprises can choose specific services from multiple public clouds and reap the benefits of each.

Given its affordability and agility, most enterprises opt for a multi-cloud approach in cloud computing now. A 2018 survey on the public cloud services market points out that 81% of the respondents use services from two or more providers. Subsequently, the cloud computing services market has reported incredible growth in recent times. The worldwide public cloud services market is all set to reach $500 billion in the next four years, according to IDC.

By choosing multi-cloud solutions strategically, enterprises can optimize the benefits of cloud computing and aim for some key competitive advantages. They can avoid the lengthy and cumbersome processes involved in buying, installing and testing high-priced systems. The IaaS and PaaS solutions have become a windfall for the enterprise’s budget as it does not incur huge up-front capital expenditure.

However, cost optimization is still a challenge when operating a multi-cloud environment, and many enterprises end up overpaying, often without realizing it. The tips below help you ensure the money is spent wisely on cloud computing services.

  • Deactivate underused or unattached resources

Most organizations tend to get simple things wrong, and those simple things turn out to be the root cause of needless spending and resource wastage. The first step to cost optimization in your cloud strategy is to identify the underutilized resources you have been paying for.

Enterprises often continue to pay for resources that were purchased earlier but are no longer useful. Identifying such unused and unattached resources and deactivating them on a regular basis brings you one step closer to cost optimization. If needed, you can deploy automated cloud management tools, which are largely helpful in providing the analytics needed to optimize cloud spending and cut costs on an ongoing basis.

  • Figure out idle instances

Another key cost optimization strategy is to identify idle computing instances and consolidate them into fewer instances. An idle instance may sit at a CPU utilization level of 1-5%, yet the service provider bills you for 100% of that instance.

Every enterprise has such non-production instances; they constitute unnecessary storage space and lead to overpaying. Re-evaluating your resource allocations regularly and removing unnecessary storage can save you money significantly. Resource allocation is not only a matter of CPU and memory; it is also linked to storage, network, and various other factors.

  • Deploy monitoring mechanisms

The key to efficient cost reduction in cloud computing technology lies in proactive monitoring. A comprehensive view of the cloud usage helps enterprises to monitor and minimize unnecessary spending. You can make use of various mechanisms for monitoring computing demand.

For instance, you can use a heatmap to visualize the highs and lows in computing demand. The heatmap indicates safe start and stop times, which in turn lead to reduced costs; by following it, you can tell whether it is safe to shut down servers on holidays or weekends. You can also deploy automated tools that schedule instances to start and stop on that basis.
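On GCP, such a start/stop schedule can be expressed as a compute resource policy in Terraform; the project, region, time zone, and cron expressions below are illustrative assumptions:

```hcl
# Sketch: automatically start instances on weekday mornings and stop them in
# the evening. Project, region, time zone, and schedules are assumptions.
resource "google_compute_resource_policy" "office_hours" {
  name    = "office-hours-schedule"
  region  = "us-central1"
  project = "my-demo-project"

  instance_schedule_policy {
    vm_start_schedule {
      schedule = "0 8 * * 1-5"   # start at 08:00 on weekdays
    }
    vm_stop_schedule {
      schedule = "0 18 * * 1-5"  # stop at 18:00 on weekdays
    }
    time_zone = "America/New_York"
  }
}
```

The policy takes effect once it is attached to instances, e.g. via the `resource_policies` argument of `google_compute_instance`.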

#cloud computing services #all #hybrid cloud #cloud #multi-cloud strategy #cloud spend #multi-cloud spending #multi cloud adoption #why multi cloud #multi cloud trends #multi cloud companies #multi cloud research #multi cloud market

Adaline Kulas

Jul 7, 2020

What are the benefits of cloud migration? Reasons you should migrate

Moving applications, databases, and other business elements from a local server to a cloud server is called cloud migration. This article deals with migration techniques, requirements, and the benefits of cloud migration.

In simple terms, moving from a local server to a public cloud server is cloud migration. Gartner reports the promised 17.5% revenue growth in cloud migration and has also published a forecast for 2022.

#cloud computing services #cloud migration #all #cloud #cloud migration strategy #enterprise cloud migration strategy #business benefits of cloud migration #key benefits of cloud migration #benefits of cloud migration #types of cloud migration

Zelma Gerlach

May 11, 2021

The Cloud Agnostic Paradigm

This article is a holistic discussion of the unique value propositions of the current cloud tooling across AWS, Azure, and Terraform for rapid deployment of cloud infrastructure within their respective product suites.

Azure Resource Manager (ARM) and AWS CloudFormation are both highly regarded, heavily used tools on their respective cloud providers. We will provide a snapshot-in-time comparison of these tools, and of the product development around them, with HashiCorp Terraform and its roadmap.

For further discussion or questions, please join our Slack group and chat with the team and other passionate DevOps community members.

#terraform #terraform-cloud #devops #cloud-computing #aws-cloud #cloud

Building Pipelines with Terraform Cloud

Having a robust and effective CI/CD pipeline is the key to shorter sprints and effective iterations of cloud-native applications. In order to push updates regularly and successfully, you have to incorporate a number of things into the pipeline, including testing and security.

Terraform is used to build, maintain, and update cloud infrastructure. It runs from your desktop and communicates directly with cloud service providers like AWS.

While Terraform offers all the features you need for infrastructure and policy as code, it is far from challenge-free. The most prominent challenge of all arises when collaboration is needed, because manual changes are still required.
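Terraform Cloud tackles exactly that collaboration gap with shared remote state and remote runs; wiring a configuration into it takes only a small `cloud` block (the organization and workspace names here are placeholders):

```hcl
# Sketch: route plans, applies, and state for this configuration through a
# Terraform Cloud workspace. Organization and workspace names are placeholders.
terraform {
  cloud {
    organization = "example-org"

    workspaces {
      name = "my-app-production"
    }
  }

  required_version = ">= 1.1.0"  # the `cloud` block requires Terraform 1.1+
}
```

After `terraform login` and `terraform init`, plans and applies execute remotely in the shared workspace instead of on each contributor's desktop, eliminating conflicting manual changes.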

#cloud technology #terraform #ci/cd pipeline #terraform cloud #cloud

Thurman Mills

May 29, 2021

CI/CD for Cloud Run with Terraform

In this post we will create a CI/CD pipeline to deploy a webservice written in Deno to GCP Google Cloud Run with Terraform and GitHub actions. This is the second part of a series, where the first part was about the basic setup. If you haven’t read it yet, head over to part I. The source of the sample project can be found here.

GitHub Actions

GitHub released its fully integrated CI/CD workflow tool, GitHub Actions, as GA in November 2019. The tool is event-driven and lets you run a series of commands after an event happens. Events can be internal (e.g. a push or pull request) or external (e.g. triggered from other sources using tokens). The overall configuration is called the workflow file. A workflow is triggered by events and consists of jobs. Jobs are groups of steps that run on the same GitHub runner, so you can easily share data between steps. A step consists of an action or a shell command; steps are the smallest portable blocks of a workflow file. You can define your own actions or include one of the numerous predefined ones.
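The Cloud Run side of such a pipeline can itself be described in Terraform; the service name, region, project, and image below are placeholder assumptions, not values from the sample project:

```hcl
# Sketch: a Cloud Run service plus public invoker access, managed by Terraform.
# Service name, region, project, and image are placeholder assumptions.
resource "google_cloud_run_service" "webservice" {
  name     = "deno-webservice"
  location = "europe-west1"
  project  = "my-demo-project"

  template {
    spec {
      containers {
        image = "gcr.io/my-demo-project/deno-webservice:latest"
      }
    }
  }

  traffic {
    percent         = 100
    latest_revision = true
  }
}

# Allow unauthenticated invocations so the webservice is publicly reachable.
resource "google_cloud_run_service_iam_member" "invoker" {
  service  = google_cloud_run_service.webservice.name
  location = google_cloud_run_service.webservice.location
  project  = google_cloud_run_service.webservice.project
  role     = "roles/run.invoker"
  member   = "allUsers"
}
```

In a pipeline, the CI job would push a freshly built image tag and then run `terraform apply` against a configuration of this shape.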

#devops #terraform-cloud #github-actions #terraform #google-cloud #cloud