Alex Voloshyn

Azure Load Balancer Tutorial

Azure Load Balancer distributes incoming network traffic across a group of backend resources or servers. It operates at layer 4 of the Open Systems Interconnection (OSI) model and acts as the single point of contact for clients.

In this episode I give you an introduction to the Azure Load Balancer service and the key concepts around it, including integrations with virtual machine scale sets and availability sets.

This episode includes a live demo of the following steps (a rough CLI sketch of the same steps appears after the list):

  • Creating the load balancer
  • Adding backend pools
  • Creating health probes
  • Creating load balancing rules
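
The demo in the video uses the Azure Portal. As a rough equivalent, here is a minimal Azure CLI sketch of the same steps; the resource names (myRG, myPublicIP, myLB, myFrontEnd, myBackEndPool, myHealthProbe, myHTTPRule) are placeholders of my own, not names from the video.

# Create the load balancer with a frontend IP configuration and a backend pool
az network lb create --resource-group myRG --name myLB --sku Standard \
    --public-ip-address myPublicIP --frontend-ip-name myFrontEnd \
    --backend-pool-name myBackEndPool

# Add a health probe that checks TCP port 80 on the backend instances
az network lb probe create --resource-group myRG --lb-name myLB \
    --name myHealthProbe --protocol tcp --port 80

# Add a rule that sends frontend port 80 traffic to the pool, gated by the probe
az network lb rule create --resource-group myRG --lb-name myLB --name myHTTPRule \
    --protocol tcp --frontend-port 80 --backend-port 80 \
    --frontend-ip-name myFrontEnd --backend-pool-name myBackEndPool \
    --probe-name myHealthProbe

Backend virtual machines (or a virtual machine scale set) can then be attached to myBackEndPool to start receiving traffic.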

Source code: https://github.com/MarczakIO/azure4ev…

Subscribe: https://www.youtube.com/c/Azure4Everyone/featured

#azure

Aayush Singh

Azure Tutorial | Azure Tutorial For Beginners | Learn Azure | Intellipaat

In this Azure tutorial you will learn what cloud computing is, what Azure is, how to create an Azure account, various Azure services, the Azure CLI, Azure virtual machines, and Azure App Service, along with a hands-on demo and interview questions and answers. This is a must-watch session for everyone who wishes to learn Azure and make a career in it.

Who should watch this Microsoft Azure video?

This Intellipaat Azure certification video is for solutions architects and programmers looking to build SaaS, PaaS, and IaaS applications, and it is your first step toward learning Azure. Since it can be taken by anybody, network and systems administrators, graduates and professionals looking to upgrade their skills to cloud technologies, storage and security admins, and virtualization and network engineers can also watch it.

Why should you opt for an Azure career?

If you want to fast-track your career, you should strongly consider Azure. Cloud computing and cloud infrastructure are among the most powerful shifts happening in organizations around the world that want to benefit from strengths such as low cost, instant availability, and high reliability. The industry-designed Intellipaat Microsoft Azure training is for those looking to build a solid career in the Microsoft Azure domain and become Microsoft Azure certified professionals. Salaries for Azure professionals are very good, so this Intellipaat Azure video is your stepping stone to a successful career!

#azure tutorial #azure tutorial for beginners #learn azure

Ron Cartwright

Getting Started With Azure Event Grid Viewer

In the last article, we had a look at how to get started with Azure DevOps: Getting Started With Audit Streaming With Event Grid.

In this article, we will take the next step: creating a subscription and using webhook event handlers to view those logs in our Azure web application.

#cloud #tutorial #azure #event driven architecture #realtime #signalr #webhook #azure web services #azure event grid #serverless architecture #application integration

Hal Sauer

Sample Load balancing solution with Docker and Nginx

Most of today’s business applications use load balancing to distribute traffic among different resources and avoid overloading a single resource.

One of the obvious advantages of a load-balancing architecture is increased availability and reliability of applications: when clients request resources from the backends, the load balancer sits between them and routes each request to the backend that best fits the routing criteria (least busy, most healthy, located in a given region, etc.).

There are many routing criteria, but in this article we will focus on weighted round-robin, meaning each backend receives a fixed share of the traffic, which I think is rarely documented :).

To keep things simple, we will create two backend “applications” based on Flask Python files. We will use NGINX as a load balancer to distribute 60% of the traffic to application1 and 40% to application2.

Let’s start coding. Below is the complete structure of our project:

app1/app1.py

from flask import Flask

app1 = Flask(__name__)

@app1.route('/')
def hello_world():
    # Identify this backend in the response so the traffic split is visible
    return 'Salam alikom, this is App1 :) '

if __name__ == '__main__':
    # Listen on all interfaces so the container is reachable from NGINX
    app1.run(debug=True, host='0.0.0.0')

app2/app2.py

from flask import Flask

app2 = Flask(__name__)

@app2.route('/')
def hello_world():
    return 'Salam alikom, this is App2 :) '

if __name__ == '__main__':
    app2.run(debug=True, host='0.0.0.0')

Then we have to dockerize both applications by adding a requirements.txt file. It will contain only the flask library, since we are building on the python:3 image.
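
The full setup lives in the project repository; as a hedged sketch of the remaining pieces (the file layout, the service names app1 and app2, and the ports are my assumptions, not taken from the original project), the Dockerfile for app1 and the NGINX configuration could look like this:

app1/Dockerfile

FROM python:3
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY app1.py .
# Flask's development server listens on port 5000 by default
CMD ["python", "app1.py"]

nginx/nginx.conf

events {}

http {
    # weight=3 vs. weight=2 splits requests 60/40 between app1 and app2
    upstream flask_backends {
        server app1:5000 weight=3;
        server app2:5000 weight=2;
    }

    server {
        listen 80;
        location / {
            proxy_pass http://flask_backends;
        }
    }
}

app2’s Dockerfile is identical apart from the file name. Assuming the three containers share a Docker network where the apps are reachable as app1 and app2, repeated requests to the NGINX container (for example, curl in a loop) should return App1 roughly 60% of the time and App2 roughly 40%.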

#load-balancing #python-flask #docker-load-balancing #nginx #flask-load-balancing

Tutorial: Loading and Cleaning Data with R and the tidyverse

1. Characteristics of Clean Data and Messy Data

What exactly is clean data? Clean data is accurate, complete, and in a format that is ready to analyze. Characteristics of clean data include data that are:

  • Free of duplicate rows/values
  • Error-free (e.g. free of misspellings)
  • Relevant (e.g. free of special characters)
  • The appropriate data type for analysis
  • Free of outliers (or containing only outliers that have been identified/understood), and
  • Follows a “tidy data” structure

Common symptoms of messy data include data that contain:

  • Special characters (e.g. commas in numeric values)
  • Numeric values stored as text/character data types
  • Duplicate rows
  • Misspellings
  • Inaccuracies
  • White space
  • Missing data
  • Zeros instead of null values

2. Motivation

In this blog post, we will work with five property-sales datasets that are publicly available on the New York City Department of Finance Rolling Sales Data website. We encourage you to download the datasets and follow along! Each file contains one year of real estate sales data for one of New York City’s five boroughs. We will work with the following Microsoft Excel files:

  • rollingsales_bronx.xls
  • rollingsales_brooklyn.xls
  • rollingsales_manhattan.xls
  • rollingsales_queens.xls
  • rollingsales_statenisland.xls

As we work through this blog post, imagine that you are helping a friend launch their home-inspection business in New York City. You offer to help them by analyzing the data to better understand the real-estate market. But you realize that before you can analyze the data in R, you will need to diagnose and clean it first. And before you can diagnose the data, you will need to load it into R!

3. Load Data into R with readxl

The benefits of using tidyverse tools are often evident in the data-loading process. In many cases, the tidyverse package readxl will clean some data for you as Microsoft Excel data is loaded into R. If you are working with CSV data instead, use read_csv() from the tidyverse readr package (we’ll cover that later).

Let’s look at an example. Here’s how the Excel file for the Brooklyn borough looks:

The Brooklyn Excel file

Now let’s load the Brooklyn dataset into R from the Excel file. We’ll use the readxl package. We specify the argument skip = 4 because the row that we want to use as the header (i.e. the column names) is actually row 5. We can ignore the first four rows entirely and load the data into R beginning at row 5. Here’s the code:

library(readxl) # Load Excel files
brooklyn <- read_excel("rollingsales_brooklyn.xls", skip = 4)

Note we saved this dataset with the variable name brooklyn for future use.
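
The other four borough files share the same layout, so, as a minimal sketch (assuming all five .xls files are in the working directory and that their columns match, which holds for these datasets), they could be loaded and stacked in one step:

library(readxl) # read Excel files
library(dplyr)  # bind_rows()

# All five files use the same four-row header, so skip = 4 applies to each
files <- c("rollingsales_bronx.xls", "rollingsales_brooklyn.xls",
           "rollingsales_manhattan.xls", "rollingsales_queens.xls",
           "rollingsales_statenisland.xls")

# Read each file, then stack the results into one data frame
nyc_sales <- bind_rows(lapply(files, read_excel, skip = 4))

For the rest of this post, though, we will keep working with the brooklyn dataset on its own.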

4. View the Data with tibble::glimpse()

The tidyverse offers a user-friendly way to view this data with the glimpse() function, which is part of the tibble package. To use it, we need to load the package in our current session. Rather than loading tibble alone, we can load many of the tidyverse packages at one time. If you do not have the tidyverse collection of packages, install it on your machine using the following command in your R or RStudio session:

install.packages("tidyverse")

Once the package is installed, load it to memory:

library(tidyverse)

Now that tidyverse is loaded into memory, take a “glimpse” of the Brooklyn dataset:

glimpse(brooklyn)
## Observations: 20,185
## Variables: 21
## $ BOROUGH <chr> "3", "3", "3", "3", "3", "3", "…
## $ NEIGHBORHOOD <chr> "BATH BEACH", "BATH BEACH", "BA…
## $ `BUILDING CLASS CATEGORY` <chr> "01 ONE FAMILY DWELLINGS", "01 …
## $ `TAX CLASS AT PRESENT` <chr> "1", "1", "1", "1", "1", "1", "…
## $ BLOCK <dbl> 6359, 6360, 6364, 6367, 6371, 6…
## $ LOT <dbl> 70, 48, 74, 24, 19, 32, 65, 20,…
## $ `EASE-MENT` <lgl> NA, NA, NA, NA, NA, NA, NA, NA,…
## $ `BUILDING CLASS AT PRESENT` <chr> "S1", "A5", "A5", "A9", "A9", "…
## $ ADDRESS <chr> "8684 15TH AVENUE", "14 BAY 10T…
## $ `APARTMENT NUMBER` <chr> NA, NA, NA, NA, NA, NA, NA, NA,…
## $ `ZIP CODE` <dbl> 11228, 11228, 11214, 11214, 112…
## $ `RESIDENTIAL UNITS` <dbl> 1, 1, 1, 1, 1, 1, 1, 1, 2, 1, 1…
## $ `COMMERCIAL UNITS` <dbl> 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0…
## $ `TOTAL UNITS` <dbl> 2, 1, 1, 1, 1, 1, 1, 1, 2, 1, 1…
## $ `LAND SQUARE FEET` <dbl> 1933, 2513, 2492, 1571, 2320, 3…
## $ `GROSS SQUARE FEET` <dbl> 4080, 1428, 972, 1456, 1566, 22…
## $ `YEAR BUILT` <dbl> 1930, 1930, 1950, 1935, 1930, 1…
## $ `TAX CLASS AT TIME OF SALE` <chr> "1", "1", "1", "1", "1", "1", "…
## $ `BUILDING CLASS AT TIME OF SALE` <chr> "S1", "A5", "A5", "A9", "A9", "…
## $ `SALE PRICE` <dbl> 1300000, 849000, 0, 830000, 0, …
## $ `SALE DATE` <dttm> 2020-04-28, 2020-03-18, 2019-0…

The glimpse() function provides a user-friendly way to view the column names and data types for all columns, or variables, in the data frame. With this function, we are also able to view the first few observations in the data frame. This data frame has 20,185 observations, or property sales records. And there are 21 variables, or columns.

#data science tutorials #beginner #r #r tutorial #r tutorials #rstats #tidyverse #tutorial #tutorials

Eric Bukenya

Learn NoSQL in Azure: Diving Deeper into Azure Cosmos DB

This article is part of the series – Learn NoSQL in Azure – where we explore Azure Cosmos DB, a non-relational database system used widely for a variety of applications. Azure Cosmos DB is one of Microsoft’s serverless databases on Azure; it is highly scalable and distributed across all locations that run on Azure. It is offered as a platform as a service (PaaS), and you can use it to build databases with very high throughput and very low latency. Using Azure Cosmos DB, customers can replicate their data across multiple locations around the globe and also across multiple locations within the same region. This makes Cosmos DB a highly available database service, with 99.999% availability for reads and writes in multi-region mode and 99.99% availability in single-region mode.

In this article, we will focus more on how Azure Cosmos DB works behind the scenes and how you can get started with it using the Azure Portal. We will also explore how Cosmos DB is priced and understand the pricing model in detail.

How Azure Cosmos DB works

As already mentioned, Azure Cosmos DB is a multi-model NoSQL database service that is geographically distributed across multiple Azure locations. This lets customers deploy databases in multiple locations around the globe, which is beneficial because it reduces read latency for users of the application.

As you can see in the figure above, Azure Cosmos DB is distributed across the globe. Let’s suppose you have a web application that is hosted in India. In that case, the database in India will be considered the master database for writes, and all the other databases can be considered read replicas. Whenever new data is generated, it is written to the database in India first and then synchronized with the other databases.

Consistency Levels

While maintaining data across multiple regions, the most common challenge is the latency with which the data becomes available in the other regions. For example, when data is written to the database in India, users in India will see that data sooner than users in the US, because of the synchronization latency between the two regions. To manage this, customers can choose from a few modes that define how soon they want their data to be made available in the other regions. Azure Cosmos DB offers five levels of consistency, which are as follows:

  • Strong
  • Bounded staleness
  • Session
  • Consistent prefix
  • Eventual

In most common NoSQL databases, there are only two levels – Strong and Eventual – with Strong being the most consistent level and Eventual the least. As we move from Strong to Eventual, consistency decreases but availability and throughput increase. This is a trade-off that customers need to weigh based on the criticality of their applications. If you want to read about the consistency levels in more detail, the official guide from Microsoft is the easiest to understand. You can refer to it here.
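
As an illustration, here is a minimal Azure CLI sketch of setting the default consistency level when creating a Cosmos DB account; the resource group and account names are placeholders of my own.

# Create a Cosmos DB account whose default consistency level is Session
az cosmosdb create --resource-group myRG --name mycosmosaccount \
    --default-consistency-level Session

The same flag accepts Strong, BoundedStaleness, Session, ConsistentPrefix, and Eventual, and the default level can also be changed later in the Azure Portal.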

Azure Cosmos DB Pricing Model

Now that we have some idea of how to work with the NoSQL database Azure Cosmos DB, let us try to understand how it is priced. To work with any cloud-based service, it is essential to have a sound knowledge of how the service is charged; otherwise, you might end up paying much more than you expected.

If you browse to the pricing page of Azure Cosmos DB, you can see that there are two modes in which the database services are billed.

  • Database Operations – Whenever you execute or run queries against your NoSQL database, some resources are consumed. Azure measures this usage in Request Units, or RUs. The number of RUs consumed per second is aggregated and billed; for example, if each query consumes 5 RUs and you run 100 such queries per second, you are consuming 500 RU/s
  • Consumed Storage – As you start storing data in your database, it takes up space. This storage is billed at the standard SSD-based storage rate across all Azure locations globally

Let’s learn about this in more detail.

#azure #azure cosmos db #nosql #nosql in azure