Tyrique Littel

Fun With SQL Using Postgres and Azure Data Studio

I have already written a few posts about PostgreSQL. This time I would like to take a step back, discuss some of the basics, and at the same time share with you my experience of using Azure Data Studio.

Azure Data Studio is a cross-platform database tool for data professionals using the Microsoft family of on-premises and cloud data platforms on Windows, macOS, and Linux. It is very easy to install and offers a modern editor experience with IntelliSense, code snippets, source control integration, and an integrated terminal. It is engineered with the data platform user in mind, with built-in charting of query result sets and customizable dashboards. You can learn more about it on the official website.

ADS also has notebooks, which are similar to Jupyter notebooks for Python and other languages and are great for combining formatted text with code. You can execute queries via a query window or via a notebook window; I will be using both. If you haven’t used ADS before, give it a try; you will probably like it.

Setup

I have installed the PostgreSQL database on my machine, and I have also installed Azure Data Studio, which I will use to execute various queries against the database. You can also use pgAdmin or any other tool as you wish. The following screenshot shows that I am connected to the Postgres server.

[Screenshot: connected to the Postgres server]

Create a Database

OK, let’s start with something simple and create a database. I use a notebook window in Azure Data Studio (ADS) and execute the following SQL, which creates the database as intended.

[Screenshot: creating the database]
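The screenshot contains the statement as run in the notebook; a minimal sketch of what it likely looks like, with the database name being an assumption chosen for illustration (the name in the screenshot may differ):

```sql
-- Create a new database to experiment with; the name is hypothetical.
CREATE DATABASE funwithsql;
```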

And here is the database:

[Screenshot: the example database]

Basic SQL Queries (Create, Alter, Select and Drop Basics)

In this section, we will refresh some basic SQL knowledge and some queries and their purposes.

Creating a Table

Creating a table with DDL (Data Definition Language) SQL is very simple in PostgreSQL. Here is how the typical SQL looks.

[Screenshot: creating a table]
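A minimal sketch of such a table definition, using hypothetical table and column names (the screenshot may use different ones):

```sql
-- Hypothetical example table; 'serial' creates an auto-incrementing integer column.
CREATE TABLE customers (
    id         serial PRIMARY KEY,
    name       text NOT NULL,
    email      text,
    created_at timestamptz DEFAULT now()
);
```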

The keyword serial is PostgreSQL specific; it sets up an auto-incrementing value, which is the typical way to define a primary key.

Deleting a Table

This is also very straightforward. Try not to do that in your production database :)

[Screenshot: deleting a table]
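A sketch of the statement, reusing the hypothetical customers table from above:

```sql
-- Drop the table only if it exists; otherwise this is a no-op instead of an error.
DROP TABLE IF EXISTS customers;
```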

The IF EXISTS clause guards against an error if the table doesn’t exist.

Altering a Table

Altering a table is also very simple. For example, if we want to alter a column in our table, the following is the syntax.

[Screenshot: altering a table]
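A sketch of one such alteration, again using the hypothetical customers table; here the data type of the email column is changed:

```sql
-- Change the data type of an existing column (hypothetical example).
ALTER TABLE customers
    ALTER COLUMN email TYPE varchar(255);
```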

Another example showing **renaming** a column:

[Screenshot: renaming a column]
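A sketch of the rename, with the old and new column names chosen for illustration:

```sql
-- Rename an existing column (hypothetical names).
ALTER TABLE customers
    RENAME COLUMN name TO full_name;
```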

#sql #postgresql #databases

Cayla Erdman

Introduction to Structured Query Language SQL pdf

SQL stands for Structured Query Language. SQL is a language designed to store, manipulate, and query data stored in relational databases. The first incarnation of SQL appeared in 1974, when a group at IBM developed the first prototype of a relational database. The first commercial relational database was released by Relational Software, which later became Oracle.

Standards for SQL exist. However, the SQL that can be used on each of the major RDBMS today comes in various flavors. This is due to two reasons:

1. The SQL standard is fairly complex, and it is not practical to implement the entire standard.

2. Every database vendor needs a way to differentiate its product from others.

Here, differences are noted where appropriate.

#programming books #beginning sql pdf #commands sql #download free sql full book pdf #introduction to sql pdf #introduction to sql ppt #introduction to sql #practical sql pdf #sql commands pdf with examples free download #sql commands #sql free bool download #sql guide #sql language #sql pdf #sql ppt #sql programming language #sql tutorial for beginners #sql tutorial pdf #sql #structured query language pdf #structured query language ppt #structured query language

Ruthie Bugala

How to set up Azure Data Sync between Azure SQL databases and on-premises SQL Server

In this article, you will learn how to set up the Azure Data Sync service. In addition, you will also learn how to create and set up a data sync group between an Azure SQL database and an on-premises SQL Server.

In this article, you will see:

  • Overview of the Azure SQL Data Sync feature
  • Discussion of key components
  • Comparison of Azure SQL Data Sync with the other Azure data options
  • Setting up Azure SQL Data Sync
  • More…

Azure Data Sync

Azure Data Sync is a synchronization service set up on an Azure SQL Database. This service synchronizes data across multiple SQL databases. You can set up bi-directional data synchronization, where the data ingest and egest process happens between the SQL databases; this can be between an Azure SQL database and an on-premises SQL Server, and/or within cloud Azure SQL databases. At this moment, the only limitation is that it does not support Azure SQL Managed Instance.

#azure #sql azure #azure sql #azure data sync #azure sql #sql server

Rylan Becker

Writing U-SQL scripts using Visual Studio for Azure Data Lake Analytics

In the 2nd article of the series for Azure Data Lake Analytics, we will use Visual Studio for writing U-SQL scripts.

Introduction

Azure Data Lake stores unstructured, structured, and semi-structured data in the Azure cloud infrastructure. You can use the Azure portal, Azure Data Factory (ADF), Azure CLI, or various other tools to work with it. In the previous article, An overview of Azure Data Lake Analytics and U-SQL, we explored Azure Data Lake Analytics using a U-SQL script.

In this article, we will understand U-SQL scripts and execute them using Visual Studio.

U-SQL scripts execution in the Visual Studio

U-SQL is known as a big data query language; it combines syntax similar to T-SQL with the power of the C# language. You can extract and transform data into the required format using these scripts. It has a few predefined extractors for CSV, text, and TSV for extracting data from those formats. Similarly, it allows you to convert the output to your desired format. It offers big data processing from gigabyte to petabyte scale. You can combine data from Azure Data Lake Storage, Azure SQL DB, Azure Blob Storage, and Azure SQL Data Warehouse.
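As a rough illustration of the shape of such a script, here is a minimal sketch that reads a TSV file with a predefined extractor and writes the rowset out as CSV; the file paths, rowset name, and column list are assumptions, not taken from the article:

```sql
// Read rows from a TSV file using the predefined Tsv extractor
// (paths and schema below are hypothetical).
@searchlog =
    EXTRACT UserId   int,
            Start    DateTime,
            Region   string,
            Query    string,
            Duration int
    FROM "/Samples/Data/SearchLog.tsv"
    USING Extractors.Tsv();

// Write the rowset to a CSV file using the predefined Csv outputter.
OUTPUT @searchlog
    TO "/output/SearchLog_output.csv"
    USING Outputters.Csv();
```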

You can develop and execute the scripts locally using Visual Studio. Later, you can move your resources to the Azure cloud. This approach allows you to save on the cost of Azure resources (compute and storage), because executions in Visual Studio do not incur Azure charges.

To use these scripts in Visual Studio, you should have the Azure Data Lake and Stream Analytics Tools installed. You can navigate to Visual Studio Installer -> Workloads -> Data Storage and processing -> Azure Data Lake and Stream Analytics.

Launch Visual Studio 2019 and create a new U-SQL project. You also get a few other templates, such as Class Library, Unit Test Project, and Sample Application. We will work with the project template that creates a project for your U-SQL scripts.

#azure #sql azure #visual studio #azure data lake analytics #visual studio #u-sql

Creating and Cataloging SQL pools in Azure SQL Server

This article will walk you through creating a new SQL pool within an existing Azure SQL Server, as well as cataloging it using the Azure Purview service.

Introduction

Data is generated by transactional systems and typically stored in relational data repositories. This data is generally used by live applications and for operational reporting. As this data volume grows, the data is often required by other analytical repositories and data warehouses, where it can be used for referential purposes and for adding more context to other data from across the organization.

Transactional systems (also known as Online Transaction Processing (OLTP) systems) usually need a relational database engine, while analytical systems (also known as Online Analytical Processing (OLAP) systems) usually need analytical data processing engines. On Azure, SQL Server or Azure SQL Database is typically employed for OLTP requirements, while Azure Synapse and other similar services are employed for analytical data processing needs. SQL pools in Azure Synapse host the data in a SQL Server environment that can process the data in a massively parallel processing model, and the address of this environment is generally the name of the Azure Synapse workspace.

At times, when an Azure SQL Server is already in production or in use, the need is to have these SQL pools on the existing Azure SQL Server instance, so that data in these SQL pools can be processed per the requirements of an OLAP system and can be co-located with data generated by OLTP systems. This can be done by creating SQL pools within the Azure SQL Server instance itself. In this article, we will learn to create a new SQL pool within an existing Azure SQL Server, followed by cataloging the same using the Azure Purview service.

Pre-requisite

As we intend to create a new SQL Pool in an existing Azure SQL Server instance, we need to have an instance of Azure SQL in place. Navigate to Azure Portal, search for Azure SQL and create a new instance of it. We can create an instance with the most basic configuration for demonstration purposes. Once the instance is created, we can navigate to the dashboard page of the instance and it would look as shown below.
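As a side note, a dedicated SQL pool can also be created on an existing logical server with T-SQL run against the master database; a minimal sketch, where the pool name and performance level are assumptions chosen for illustration:

```sql
-- Run against the master database of the existing Azure SQL logical server.
-- The pool name and service objective below are hypothetical.
CREATE DATABASE demo_sql_pool
(
    EDITION = 'DataWarehouse',
    SERVICE_OBJECTIVE = 'DW100c'
);
```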

As we are going to catalog the data in the dedicated SQL Pool hosted on the Azure SQL instance, we also need to create an instance of Azure Purview. We will be using the Azure Purview studio from the dashboard of this instance to register this SQL Pool as the source and catalog the instance.

#azure #sql azure #azure sql server #sql #sql #azure

Rylan Becker

How to use SQL Server DACPAC extensions in Azure Data Studio

This article focuses on installing and using the SQL Server DACPAC extension in Azure Data Studio.

Additionally, readers of this article will get a conceptual understanding of this extension along with its implementation in light of a professional scenario. At the end of the article, you will also find some handy tips about the approach supported by this extension.

About SQL Server DACPAC Extension

It is always good to first get familiar with this extension before we start installing and using it.

What is SQL Server DACPAC Extension?

This extension helps with database import and export operations and is primarily built for managing data-tier applications.

What is a data-tier application (DAC)?

According to Microsoft documentation, a data-tier application is a logical database management entity that defines all the SQL Server objects, such as tables and views, associated with a user’s database.

What is a DACPAC?

A DACPAC is a data-tier application package in the form of a Windows file that contains the entire database structure in a single unit.

What is the purpose of DACPAC?

A DACPAC helps developers and DBAs package their database into a single unit that can be handed over to the team responsible for deploying the database to target environments in a manual or automated fashion.

Are there any requirements for standard data-tier application management via DACPAC?

The database must be registered as a data-tier application to be managed via standard DACPAC deployments in a commercial development environment.

In simple words, we can use this extension in Azure Data Studio to manage a specific database deployment strategy (through DACPAC packages) that helps us simplify DLM (Database Lifecycle Management).

Installing SQL Server DACPAC Extension

In order to use the extension, we have to install it first, so let us do that. Please open Azure Data Studio and switch to Extensions.

#azure #development #sql azure #sql #dacpac #azure data studio