Over the weekend, I wanted to do a quick proof of concept of certain Databricks capabilities, using Azure SQL as a source. I faced quite a bit of trouble, and Google was not kind enough to provide a solution. I tried everything from bcp to BULK INSERT with the file on my local computer, and each attempt ended with errors I failed to fix.

Finally I managed to load the CSV file into the database, and I thought I would share the approach with everyone. It is useful if you need to quickly load data into Azure SQL Database and don't want to write a Databricks script or use Azure Data Factory for such a simple task. Before you move ahead, I am assuming you have a fair understanding of the Azure ecosystem, particularly Storage Accounts.

So first things first: upload the file you want to load into Azure SQL Database to a container in an Azure Storage Account. A standard blob container works fine; you don't need Azure Data Lake Storage for this.
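Once the file is in the container, the load itself can be done entirely in T-SQL from Azure SQL Database using `BULK INSERT` with an external data source. The sketch below assumes a storage account named `mystorage`, a container named `csvfiles`, a file named `data.csv`, and an existing target table `dbo.SalesData`; all of these names, and the SAS token placeholder, are illustrative and should be replaced with your own.

```sql
-- Needed once per database before creating a scoped credential.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';

-- Credential holding a SAS token for the storage container
-- (paste the token WITHOUT the leading '?').
CREATE DATABASE SCOPED CREDENTIAL BlobCredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = '<SAS token>';

-- Point an external data source at the blob container.
CREATE EXTERNAL DATA SOURCE CsvBlobStorage
WITH (
    TYPE = BLOB_STORAGE,
    LOCATION = 'https://mystorage.blob.core.windows.net/csvfiles',
    CREDENTIAL = BlobCredential
);

-- Load the CSV straight from blob storage into the target table.
BULK INSERT dbo.SalesData
FROM 'data.csv'
WITH (
    DATA_SOURCE = 'CsvBlobStorage',
    FORMAT = 'CSV',
    FIRSTROW = 2  -- skip the header row
);
```

If the SAS token or container URL is wrong, the `BULK INSERT` step fails with an access error, so it is worth testing the SAS token in a browser or Storage Explorer first.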

#data-engineering #data #azure #azure-sql-database #database

Loading a CSV file into Azure SQL Database from Azure Storage.