Using Azure Log Analytics Workspaces to collect Custom Logs from your VM

Motivation:

We have all seen the Key Metrics on the Monitoring tab of the VM page. Yes, they are useful: the **CPU** metric shows whether the CPU has been busy, the **Network In** metric shows when the VM is receiving data from the outside world, and the **Disk Operations/Sec** metric shows whether the VM is doing any write operations. But these metrics tell you nothing about the custom services we build on the VM. So, in this blog, I will show you how to create your own log for a custom service, bring it into an Azure Log Analytics Workspace using its default agent, query it according to our needs, and, even better, create an alert on it.

Prerequisites:

  1. An Azure Account
  2. Azure Virtual Machine Service
  3. Azure Log Analytics Workspace Service
  4. Azure Alert Service

Azure Virtual Machine:

Let’s start with the VM itself. You may already have a service running on the VM but not know how to get its logs into the portal, or you may not have created any logs for your service at all. So, let’s assume the logs don’t exist yet. For this blog, I will use a simple Flask app as the example service. To get it up and running, install the Flask library with pip and then create a Flask app similar to the one below:

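Here is a minimal sketch of such a Flask app. It writes its own log file so the Log Analytics agent has something to collect later; the log path (/var/log/flaskapp/app.log), the route, and the message format are assumptions, so adjust them to fit your own service:

```python
# Minimal Flask service that writes its own custom log file.
# Assumption: /var/log/flaskapp/ exists and is writable by the app user.
import logging
from logging.handlers import RotatingFileHandler

from flask import Flask

app = Flask(__name__)

# Log timestamped lines to a file the Log Analytics agent can later pick up.
handler = RotatingFileHandler("/var/log/flaskapp/app.log",
                              maxBytes=1_000_000, backupCount=3)
handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
app.logger.addHandler(handler)
app.logger.setLevel(logging.INFO)

@app.route("/")
def index():
    # Every request produces one custom log entry.
    app.logger.info("Received a request on /")
    return "Hello from the Flask service!"

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```

Install Flask with `pip install flask` and start the service with `python app.py`; each request to the root URL will then append a line to the log file.
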
#logs #log-analytics #azure #azure log analytics

