narayana reddy

Google Cloud Professional Architect

Ever since working with a great team at the BBC, I’ve been noticing the momentum behind Google Cloud. I’ve been looking to do a certification for a while, so decided to go with Google’s Professional Cloud Architect.

If you do some research on the certification, you’ll find words like “vast” describing the scope of the exam. They’re not wrong. From compute, storage and networking, through business requirements, capacity planning, SRE, regulatory compliance, containers, continuous deployment, even kubectl commands, this isn’t something you can study for straight out of the gate from coding bootcamp.

The range of topics and the layers of knowledge, from CIDR blocks to continuous deployment to cloud migration and hybrid connectivity, set a high bar. That's what makes it both tough and respected. It takes a healthy, broad level of experience to tackle it. What's nice is that the experience required is more than theoretical and more than rote product knowledge; there are common-sense questions in there too that probe real-world experience. Something I particularly like is that the answer isn't always Google.

Preparing for the exam

The advice I was given was to go for the Coursera material. That stood me in good stead. Don’t expect perfection though: this stuff is changing all the time and there are a few bloopers and “human touches” in the content. My favourite is when the instructor’s Google Home starts talking to him in the background, closely followed by the time there’s a rustling sound, as if someone is monkeying around behind the camera, and the straight-faced, earnest instructor can’t quite hold back a lovely smile.

When it comes to the cloud, over-engineered perfection isn't the name of the game, and for me these little foibles add real warmth to what is otherwise a pretty intense process of learning. I listened to much of the content at 1.5x speed, partly to get through it and partly to stop my mind from wandering. There's nothing quite like the feeling of working to keep up to keep you focused. It's important to say you won't get everything from the course material. You'll get good coverage of most areas, but it's unlikely the course content alone will get you through.

I started with the Architecting with Google Cloud Platform Specialization and Preparing for the Google Cloud Professional Cloud Architect Exam. I'd had hands-on experience with AWS and GCP by this point, but hadn't covered their breadth of services, so those courses broadened my horizons but, perhaps more importantly, underscored how many services there are and how much there is to know about each. Having completed the courses comfortably, and hit 80% on the practice exam, I felt I had a pretty good grasp of how under-prepared I was.
I’m strongest on compute: virtual machines, functions, Kubernetes and PaaS are all familiar to me, although I needed to get into the finer detail (e.g. how are storage throughput and network capacity affected by the number of cores on a Compute Engine instance?). I decided to round out the other areas covered by the exam.

I knew I had plenty to learn about the range of storage products, their different aims, use-cases, capacities, advantages and disadvantages, so I decided to do a few courses from the Data Engineering, Big Data, and Machine Learning on GCP Specialization. I also decided, because Kubernetes is a Leviathan with hidden depths, I’d do as much of the Architecting with Google Kubernetes Engine Specialization as I could before the exam. This gave me the detail I needed to answer one or two questions I might otherwise have had to make intelligent guesses for.

Going for the numbers

As with any certification exam, there are diminishing returns to over-studying. You'll likely only work in depth in a few areas, and all areas will change over time, so knowing it all as it stands today doesn't add much value. Knowing enough to pass across the board, staying up to date with most things and getting really good at a couple of things is my idea of pragmatic and practical. I booked the exam and dug in for some more study.

I now felt I had compute and storage covered, plus some more detail on GKE, which got me feeling more comfortable. I have a decent grasp of networking, and by now had a good idea of GCP’s take on VPCs and load balancing. I had touched on Stackdriver both theoretically and in practice, so held off on learning more there. I felt I still needed more detail though, so I went looking for blog posts like this one by Jean-Louis (JL) Marechaux and this one by sathish vj to get some leads on where to deepen my understanding. They were particularly effective in getting me more than adequately terrified about what I was facing.

Vanishing hope

In case you think I’m made of the stuff that causes impostor syndrome in innocent bystanders, someone sliding through with ease, I’d like to share with you the sense of foreboding I felt coming up to the exam. I hope that if you’re studying for PCA and all you see on this glistening Internet are what look to be smug people who sailed through under a light breeze, a glass of prosecco and strawberries in hand, pinkies out, shades on, that I can share a real moment which I hope will bring you solace.

It's a tough exam. The range of things you realise you don't know will grow faster than the number of things you do know. Much like life, you can't "win" this one, but you can show up with the best of your efforts and experience. I ended up trawling through pages and pages of documentation into the night, gleaning details, hoping that the smattering of details I was gathering would get me over the bar, a bar that looked very high.

I found a practice test to take, the night before the exam, on a website somewhere. I gave it a go, just to get an “exit poll” on what I’d learned ahead of exam day. It had 10 questions, two of which I recognised from the official practice exam. I scored 50%, including the two questions I already knew the answer to. It wasn’t good. With hindsight, I think the questions were open to interpretation, or maybe I was tired, but at the time I became increasingly concerned that I was about to face-plant.

Exam day

There's no official pass mark for the exam and you get no feedback. Just a yes or no. It's two hours and 50 questions. I was anxious. There was one timely bright spot: I'd watched this Simon Sinek talk in my worried state the night before. He explains that "anxious" and "excited" are physiologically similar. It turns out that just saying "I'm excited" rather than "I'm nervous" can materially improve your performance.

My exam was at the Pitman Training Centre in central Edinburgh. I arrived early, had my ID ready and they got me started. There’s something about these kinds of testing facilities where it feels like everything is running Windows Vista on a Pentium II and you’re never sure if the next screen is really going to load. I settled in. “I’m excited” I said out loud in my head.

I opted for a strategy of answering everything as a first guess, then reviewing and re-reviewing, gradually whittling down to those last few tough-nut questions. The most important tip I can give you for this exam is also the simplest: read the question. And the answers. Pay attention to the language and don't rush. The questions are well written. Don't assume you know what's being asked. Bookmark questions for review.

My first pass took just over an hour. In the next half hour I did a full review to eliminate the questions I was most confident in. I was left with about 15 to whittle down. I spent my last 20 minutes going over them, committing to an answer one by one. I knew there'd be a few that would have to be educated guesses. I wasn't shooting for 100%, but I knew I had to minimise the chance of slipping under the bar by a few points.

With two seconds left on the clock, I submitted my final answer and the exam was over. A feedback form later, I found myself staring at a white screen with writing on it, explaining what would happen next. After those two hours of intense concentration I was word-blind. I scanned the sentence back and forth, but saw no sign of a test result. Finally my eyes scanned up a little way and landed on a single word: Pass.

Over the rainbow

I’ve never been so pleased, so relieved and so thankful to see those four letters. For me achieving Professional Cloud Architect felt like stepping up to a genuine challenge. If you’re considering it, I’d certainly recommend it, but not for light entertainment.

Knowing what I do now I can say I’ve got a new level of respect for people who’ve done it. More than a learning experience, it’s the closest I’ve come to something that can assess those of us who identify as something like Architect or Tech Lead. There’s something both humbling and satisfying about putting yourself to the test and coming through.

#google #cloudplatform #architecttraining #education #onlinetraining

Adaline Kulas

Multi-cloud Spending: 8 Tips To Lower Cost

A multi-cloud approach means leveraging two or more cloud platforms to meet an enterprise's various business requirements. A multi-cloud IT environment incorporates clouds from multiple vendors and removes the dependence on a single public cloud service provider. Enterprises can thus choose specific services from multiple public clouds and reap the benefits of each.

Given its affordability and agility, most enterprises now opt for a multi-cloud approach. A 2018 survey of the public cloud services market found that 81% of respondents use services from two or more providers. The cloud computing services market has, accordingly, reported incredible growth in recent times. According to IDC, the worldwide public cloud services market is set to reach $500 billion in the next four years.

By choosing multi-cloud solutions strategically, enterprises can optimize the benefits of cloud computing and aim for some key competitive advantages. They can avoid the lengthy and cumbersome processes involved in buying, installing and testing high-priced systems. IaaS and PaaS solutions have become a windfall for enterprise budgets, as they do not incur huge up-front capital expenditure.

However, cost optimization is still a challenge when running a multi-cloud environment, and many enterprises end up overpaying, often without realizing it. The tips below will help you ensure your money is spent wisely on cloud computing services.

  • Deactivate underused or unattached resources

Most organizations tend to get simple things wrong, and these turn out to be the root cause of needless spending and resource wastage. The first step to cost optimization in your cloud strategy is to identify the underutilized resources you have been paying for.

Enterprises often continue to pay for resources that were purchased earlier but are no longer useful. Identifying such unused and unattached resources and deactivating them on a regular basis brings you one step closer to cost optimization. If needed, you can deploy automated cloud management tools, which are largely helpful in providing the analytics needed to optimize cloud spending and cut costs on an ongoing basis.
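
As a concrete illustration, here is a minimal Python sketch of how you might flag unattached persistent disks on GCP using the google-cloud-compute client library. The project ID is a placeholder, and in practice you would feed the output into a review process rather than deleting anything automatically:

```python
from google.cloud import compute_v1

PROJECT_ID = "my-project"  # placeholder: replace with your own project ID

def find_unattached_disks(project_id: str) -> None:
    """Print persistent disks that no instance is currently using."""
    disks_client = compute_v1.DisksClient()
    # aggregated_list walks every zone in the project in a single call
    for zone, scoped_list in disks_client.aggregated_list(project=project_id):
        for disk in scoped_list.disks:
            if not disk.users:  # an empty 'users' list means nothing is attached
                print(f"Unattached: {disk.name} ({disk.size_gb} GB) in {zone}")

if __name__ == "__main__":
    find_unattached_disks(PROJECT_ID)
```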

  • Figure out idle instances

Another key cost optimization strategy is to identify idle computing instances and consolidate them into fewer instances. An idle instance may be running at a CPU utilization level of 1-5%, yet the service provider bills you for 100% of that instance.

Every enterprise will have such non-production instances, which take up unnecessary storage space and lead to overpaying. Re-evaluating your resource allocations regularly and removing unnecessary storage can save you money significantly. Resource allocation is not only a matter of CPU and memory; it is also linked to storage, network, and various other factors.
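
One way to spot such instances is to pull average CPU utilization from Cloud Monitoring and flag anything consistently below a threshold. The sketch below assumes the google-cloud-monitoring client library, a placeholder project ID and a 5% threshold, so treat it as a starting point rather than a finished tool:

```python
import time
from google.cloud import monitoring_v3

PROJECT_ID = "my-project"   # placeholder
IDLE_THRESHOLD = 0.05       # flag instances averaging under 5% CPU

client = monitoring_v3.MetricServiceClient()
now = int(time.time())
interval = monitoring_v3.TimeInterval(
    {"end_time": {"seconds": now}, "start_time": {"seconds": now - 7 * 24 * 3600}}
)
aggregation = monitoring_v3.Aggregation(
    {
        "alignment_period": {"seconds": 3600},  # hourly averages
        "per_series_aligner": monitoring_v3.Aggregation.Aligner.ALIGN_MEAN,
    }
)
series_iter = client.list_time_series(
    request={
        "name": f"projects/{PROJECT_ID}",
        "filter": 'metric.type = "compute.googleapis.com/instance/cpu/utilization"',
        "interval": interval,
        "view": monitoring_v3.ListTimeSeriesRequest.TimeSeriesView.FULL,
        "aggregation": aggregation,
    }
)
for series in series_iter:
    values = [point.value.double_value for point in series.points]
    avg = sum(values) / len(values) if values else 0.0
    if avg < IDLE_THRESHOLD:
        instance = series.metric.labels.get("instance_name", "unknown")
        print(f"Idle candidate: {instance} (average CPU {avg:.1%} over 7 days)")
```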

  • Deploy monitoring mechanisms

The key to efficient cost reduction in cloud computing lies in proactive monitoring. A comprehensive view of cloud usage helps enterprises monitor and minimize unnecessary spending. You can make use of various mechanisms for monitoring computing demand.

For instance, you can use a heatmap to visualize the highs and lows in computing demand. A heatmap indicates when demand starts and stops, which in turn points to where costs can be reduced. You can also deploy automated tools that let you schedule instances to start and stop. By following a heatmap, you can tell whether it is safe to shut down servers on holidays or weekends.
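
To make the start/stop idea concrete, here is a minimal Python sketch that stops a hand-picked list of non-production instances using the google-cloud-compute client library. The project, zone and instance names are placeholders; in practice you might trigger something like this from a scheduler on Friday evenings and run a matching start pass on Monday mornings:

```python
from google.cloud import compute_v1

PROJECT_ID = "my-project"                      # placeholder
ZONE = "europe-west2-a"                        # placeholder
NON_PRODUCTION = ["dev-vm-1", "staging-vm-1"]  # placeholder: instances safe to stop

def stop_instances(project_id: str, zone: str, names: list[str]) -> None:
    """Stop each named instance and wait until the operation completes."""
    client = compute_v1.InstancesClient()
    for name in names:
        operation = client.stop(project=project_id, zone=zone, instance=name)
        operation.result()  # block until the instance has actually stopped
        print(f"Stopped {name}")

if __name__ == "__main__":
    stop_instances(PROJECT_ID, ZONE, NON_PRODUCTION)
```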

#cloud computing services #all #hybrid cloud #cloud #multi-cloud strategy #cloud spend #multi-cloud spending #multi cloud adoption #why multi cloud #multi cloud trends #multi cloud companies #multi cloud research #multi cloud market

Rusty Shanahan

Overview of Google Cloud Essentials Quest

If you are looking to learn about Google Cloud, in depth or in general, with or without any prior knowledge of cloud computing, then you should definitely check this quest out: Link.

Google Cloud Essentials is an introductory-level Quest which is useful for learning the basic fundamentals of Google Cloud. From writing Cloud Shell commands and deploying my first virtual machine, to running applications on Kubernetes Engine or with load balancing, Google Cloud Essentials is a prime introduction to the platform's basic features.

Here is the Quest outline:

  1. A Tour of Qwiklabs and Google Cloud
  2. Creating a Virtual Machine
  3. Getting Started with Cloud Shell & gcloud
  4. Kubernetes Engine: Qwik Start
  5. Set Up Network and HTTP Load Balancers

A Tour of Qwiklabs and Google Cloud was the first hands-on lab, which basically gives an overview of Google Cloud. There were a few questions to answer that check your understanding of the topic, and the rest was about accessing the Google Cloud console, projects in the console, roles and permissions, Cloud Shell and so on.

**Creating a Virtual Machine** was the second lab, in which you create a virtual machine and connect an NGINX web server to it. Compute Engine lets you create virtual machines whose resources live in specific regions or zones. NGINX can also be used as a load balancer; the job of a load balancer is to distribute workloads across multiple computing resources. Creating these two, along with a question, marked the end of the second lab.
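
The lab itself uses the Cloud Console and gcloud, but a similar VM can be created programmatically. Below is a minimal Python sketch using the google-cloud-compute client library; the project ID, zone, instance name and image family are placeholders, and the lab's NGINX installation step is left out:

```python
from google.cloud import compute_v1

PROJECT_ID = "my-project"     # placeholder
ZONE = "us-central1-a"        # placeholder
INSTANCE_NAME = "my-first-vm" # placeholder

def create_vm(project_id: str, zone: str, name: str) -> None:
    """Create a small Debian VM with a default network interface."""
    boot_disk = compute_v1.AttachedDisk(
        boot=True,
        auto_delete=True,
        initialize_params=compute_v1.AttachedDiskInitializeParams(
            source_image="projects/debian-cloud/global/images/family/debian-12",
            disk_size_gb=10,
        ),
    )
    instance = compute_v1.Instance(
        name=name,
        machine_type=f"zones/{zone}/machineTypes/e2-medium",
        disks=[boot_disk],
        network_interfaces=[
            compute_v1.NetworkInterface(network="global/networks/default")
        ],
    )
    client = compute_v1.InstancesClient()
    operation = client.insert(project=project_id, zone=zone, instance_resource=instance)
    operation.result()  # wait for the VM to be created
    print(f"Created instance {name} in {zone}")

if __name__ == "__main__":
    create_vm(PROJECT_ID, ZONE, INSTANCE_NAME)
```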

#google-cloud-essentials #google #google-cloud #google-cloud-platform #cloud-computing #cloud

Zelma Gerlach

Cloud Operations Overview for Google Cloud Professional Architect

Operations Suite (Stackdriver) is a hybrid monitoring, logging, and diagnostics tool suite for applications on the Google Cloud Platform and AWS.

Google purchased Stackdriver, and the product was rebranded as Google Stackdriver after the acquisition.

Google has since rebranded the Stackdriver suite as "Cloud Operations". This is important to know in case the exam has not been updated to reflect the change.

Cloud Operations monitors the cloud's service layers in a single SaaS solution. It maintains native integration with Google Cloud data tools (BigQuery, Cloud Pub/Sub, Cloud Storage, Cloud Datalab) and out-of-the-box integration with all your other application components.

In a nutshell, the Cloud Operations suite allows you to monitor, troubleshoot, and improve application performance in your Google Cloud environment.
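
As a small example of what working with the suite looks like, here is a Python sketch that writes text and structured entries to Cloud Logging (part of Cloud Operations) using the google-cloud-logging client library. The project ID, log name and field values are placeholders:

```python
from google.cloud import logging

PROJECT_ID = "my-project"      # placeholder
LOG_NAME = "checkout-service"  # placeholder

client = logging.Client(project=PROJECT_ID)
logger = client.logger(LOG_NAME)

# A plain text entry, visible in the Logs Explorer under the chosen log name
logger.log_text("Order processed", severity="INFO")

# A structured entry, which is easier to filter on and export to BigQuery
logger.log_struct(
    {"event": "order_processed", "order_id": 1234, "value_gbp": 42.50},
    severity="INFO",
)

print("Wrote two entries to Cloud Logging")
```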

#google-cloud-platform #google-cloud #cloud-computing #cloud-architecture #cloud

Google Cloud: Caching Cloud Storage content with Cloud CDN

In this Lab, we will configure Cloud Content Delivery Network (Cloud CDN) for a Cloud Storage bucket and verify caching of an image. Cloud CDN uses Google’s globally distributed edge points of presence to cache HTTP(S) load-balanced content close to our users. Caching content at the edges of Google’s network provides faster delivery of content to our users while reducing serving costs.

For an up-to-date list of Google’s Cloud CDN cache sites, see https://cloud.google.com/cdn/docs/locations.

Task 1. Create and populate a Cloud Storage bucket

Cloud CDN content can originate from different types of backends:

  • Compute Engine virtual machine (VM) instance groups
  • Zonal network endpoint groups (NEGs)
  • Internet network endpoint groups (NEGs), for endpoints that are outside of Google Cloud (also known as custom origins)
  • Google Cloud Storage buckets

In this lab, we will configure a Cloud Storage bucket as the backend.
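
For reference, Task 1 can also be done programmatically. Here is a minimal Python sketch using the google-cloud-storage client library that creates a bucket and uploads an image with a cache-friendly Cache-Control header; the project ID, bucket name, location and file name are placeholders, and bucket names must be globally unique:

```python
from google.cloud import storage

PROJECT_ID = "my-project"              # placeholder
BUCKET_NAME = "my-cdn-content-bucket"  # placeholder: must be globally unique
LOCAL_IMAGE = "cdn-demo.png"           # placeholder: a local file to upload

client = storage.Client(project=PROJECT_ID)

# Create the bucket that will later sit behind the HTTP(S) load balancer
bucket = client.create_bucket(BUCKET_NAME, location="us-central1")

# Upload the image with a Cache-Control header so Cloud CDN can cache it
blob = bucket.blob("images/cdn-demo.png")
blob.cache_control = "public, max-age=86400"
blob.upload_from_filename(LOCAL_IMAGE)

print(f"Uploaded gs://{BUCKET_NAME}/{blob.name}")
```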

#google-cloud #google-cloud-platform #cloud #cloud storage #cloud cdn

Google Cloud EMEA Retail & Consumer Goods Summit: The Future of Retail

The way consumers make their everyday decisions is evolving, as digital ways of working, shopping and communicating have become the new normal. So now it’s more important than ever for companies in the retail sector to prioritise an insights-driven technology strategy and understand what’s truly important for their customers.

Through its partnerships with some of the world’s leading retailers and brands, Google Cloud provides solutions that address the retail sector’s most challenging problems, whether it’s creating flexible demand forecasting models to optimize inventory or transforming e-commerce using AI-powered apps. Over the past few years, we’ve been observing and analyzing the many facets of changing consumer behaviour. We are here to support retailers and brands as they transform their businesses to adapt to this new landscape.

Featuring consumer research and insights from your peers, Google Cloud’s Retail & Consumer Goods Summit will offer candid conversations to help you solve your challenges. We’ll be joined by industry innovators, including Carrefour Belgium and L’Oréal, who’ll discuss the future of retail and consumer goods.

#cloud native #google cloud platform #google cloud in europe #cloud #google cloud