Track your Azure Analysis Services Processing Times per Table

Recently I have been working with Azure Analysis Services (AAS) quite a lot. The number of tables in our model was growing quickly, and processing times were fluctuating and increasing as well. Reason enough for me to find out which tables were taking up the most processing time from the AAS perspective.

Here I will explain the steps I took to get detailed information about the processing steps of an Azure Analysis Services Full Process from a Log Analytics workspace into a Power BI report:

  • Setting up Diagnostics Logging to a Log Analytics workspace
  • Exploring the Logs with the Kusto Query Language
  • Getting the data into Power BI

Diagnostic Logs

The starting point, as always, was the Microsoft documentation. Okay, actually it was Google or Bing, but I quickly landed on the documentation pages that include all the needed steps to get started. The link is listed below.

The configuration of the Diagnostic Settings, to send the logs to a Log Analytics workspace, was already done in this case. If you still need to do this, the docs describe the required steps; it’s pretty straightforward.
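
Before writing anything more elaborate, it can be useful to confirm that the diagnostic logs are actually arriving in the workspace. A minimal check could look like the query below (just a sketch; replace <MyServerName> with your own server name).

// Quick check that Azure Analysis Services diagnostic logs are arriving in the workspace
AzureDiagnostics 
	| where ResourceProvider == "MICROSOFT.ANALYSISSERVICES" and Resource =~ "<MyServerName>"
	| where TimeGenerated > ago(1d)
	| summarize Events = count() by OperationName
	| order by Events desc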


Exploring the Logs

The next step is exploring the logs in our Log Analytics workspace (LAW) by writing a query that returns the information we are looking for. The first example query from the documentation was a good starting point, but not exactly what we need, since we are interested in the refresh of the model and not in the query times.

In the end, I wrote **two queries and joined them together in Power BI**. In the first query, I am specifically looking for the highest level of the refresh process.

Filtering the ‘OperationName’ field on the value ‘CommandEnd’ and the ‘TextData_s’ field on ‘<Refresh ’ does the trick in this case. Below is the query; if you want to use it, don’t forget to add your own server and database name in the second line.

let window = AzureDiagnostics 
	| where ResourceProvider == "MICROSOFT.ANALYSISSERVICES" and Resource =~ "<MyServerName>" and DatabaseName_s =~ "<MyDatabaseName>"
	| where TimeGenerated > ago(14d); 
	window 
	// Keep only the top-level refresh commands
	| where OperationName has "CommandEnd" and TextData_s has "<Refresh "
	// Take everything in Duration_s up to the first comma and cast it to a long
	| extend DurationMs=extract(@"([^,]*)", 1, Duration_s, typeof(long)) 
	| project StartTime_t, EndTime_t, ServerName_s, OperationName, RootActivityId_g, TextData_s, DatabaseName_s, ApplicationName_s, Duration_s, Success_s, EffectiveUsername_s, User_s, EventSubclass_s, DurationMs 
	| order by StartTime_t desc 

In the second query, we get the details of the refresh, and we want to filter out as many of the operations we do not need as possible. This query returns all the detailed information we are looking for.

let window = AzureDiagnostics 
	| where ResourceProvider == "MICROSOFT.ANALYSISSERVICES" and Resource =~ "<MyServerName>" and DatabaseName_s =~ "<MyDatabaseName>"
	// Leave out the operations we are not interested in
	| where OperationName !hasprefix "Discover" and OperationName !hasprefix "VertiPaq" and OperationName !hasprefix "LogMetric" 
	| where TimeGenerated > ago(14d);  
	window 
	// Take everything in Duration_s up to the first comma and cast it to a long
	| extend DurationMs=extract(@"([^,]*)", 1, Duration_s, typeof(long)) 
	| project TimeGenerated, RootActivityId_g, ResourceId, OperationName, DatabaseName_s, level_d, EventClass_s, EventSubclass_s, StartTime_t, EndTime_t, ProgressTotal_s, ObjectReference_s, ObjectName_s, ObjectID_s, ObjectPath_s, Duration_s, CPUTime_s, Severity_s, Success_s, Error_s, TextData_s, DurationMs 
	| order by TimeGenerated desc
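
If you mainly want a quick answer to the original question, which tables take the most processing time, you can also aggregate the durations per object straight in Log Analytics. The sketch below assumes the per-object durations are reported on the ‘ProgressReportEnd’ events; the same placeholders apply as before.

// Sketch: total processing duration per object over the last 14 days
AzureDiagnostics 
	| where ResourceProvider == "MICROSOFT.ANALYSISSERVICES" and Resource =~ "<MyServerName>" and DatabaseName_s =~ "<MyDatabaseName>"
	| where TimeGenerated > ago(14d)
	| where OperationName == "ProgressReportEnd"
	| extend DurationMs=extract(@"([^,]*)", 1, Duration_s, typeof(long))
	| summarize TotalDurationMs = sum(DurationMs) by ObjectPath_s, ObjectName_s
	| order by TotalDurationMs desc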

Another option, to get fewer results from the last query, is to join it to the first query on ‘RootActivityId_g’. That way you make sure you only get data that relates to the refreshes found by the first query.
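
If you prefer to do that filtering in the Log Analytics query itself instead of in Power BI, a sketch of the same idea in Kusto (using the same placeholders and 14-day window as above) could look like this:

// Sketch: keep only detail rows whose RootActivityId_g also appears in the top-level refresh query
let refreshes = AzureDiagnostics 
	| where ResourceProvider == "MICROSOFT.ANALYSISSERVICES" and Resource =~ "<MyServerName>" and DatabaseName_s =~ "<MyDatabaseName>"
	| where TimeGenerated > ago(14d)
	| where OperationName has "CommandEnd" and TextData_s has "<Refresh "
	| distinct RootActivityId_g;
AzureDiagnostics 
	| where ResourceProvider == "MICROSOFT.ANALYSISSERVICES" and Resource =~ "<MyServerName>" and DatabaseName_s =~ "<MyDatabaseName>"
	| where TimeGenerated > ago(14d)
	| where OperationName !hasprefix "Discover" and OperationName !hasprefix "VertiPaq" and OperationName !hasprefix "LogMetric"
	| join kind=inner (refreshes) on RootActivityId_g
	| order by TimeGenerated desc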

#cloud #azure-analysis-services #azure #analysis-services #power-bi #data-analysis

