How to load data from Azure Data Lake Store into Azure SQL Data Warehouse

In my previous post, I wrote about getting started with Azure Data Lake Analytics and Data Lake Store, where I explained how to create both services.

I also shared a basic example of U-SQL there; the U-SQL output in that post was saved to a Data Lake Store folder, which I am going to load into Azure SQL Data Warehouse in this example.

Also have a look at this article on establishing Azure Data Lake Store authentication; this is a mandatory step for accessing the files in Data Lake Store and loading them into Azure SQL Data Warehouse.

Create an Azure SQL Data Warehouse from the Azure portal as shown below:
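Once the warehouse exists, the load itself is typically done with PolyBase external tables. A minimal T-SQL sketch of the flow, assuming a service principal has already been registered for the store (the names, paths, and secrets below are placeholders):

```sql
-- One-time setup in the data warehouse database
CREATE MASTER KEY;

-- Service principal credential for Data Lake Store (placeholder values)
CREATE DATABASE SCOPED CREDENTIAL ADLSCredential
WITH IDENTITY = '<client id>@https://login.microsoftonline.com/<tenant id>/oauth2/token',
     SECRET   = '<client secret>';

CREATE EXTERNAL DATA SOURCE AzureDataLakeStore
WITH (
    TYPE = HADOOP,
    LOCATION = 'adl://<datalakestore>.azuredatalakestore.net',
    CREDENTIAL = ADLSCredential
);

CREATE EXTERNAL FILE FORMAT TextFileFormat
WITH (FORMAT_TYPE = DELIMITEDTEXT,
      FORMAT_OPTIONS (FIELD_TERMINATOR = ','));

-- External table over the folder holding the U-SQL output
CREATE EXTERNAL TABLE dbo.ExtSearchLog (
    UserId INT,
    Query NVARCHAR(200)
)
WITH (LOCATION = '/Output/',
      DATA_SOURCE = AzureDataLakeStore,
      FILE_FORMAT = TextFileFormat);

-- Load into a physical table with CTAS
CREATE TABLE dbo.SearchLog
WITH (DISTRIBUTION = ROUND_ROBIN)
AS SELECT * FROM dbo.ExtSearchLog;
```

The CTAS step is what actually moves the data in parallel; the external table itself only describes the files in the store.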

Azure Active Directory app registration for Data Lake Store authentication

In my previous post, I wrote about getting started with Azure Data Lake Analytics and Data Lake Store. In this post, let us see how to set up authentication for Data Lake Store using Azure Active Directory.

Azure Data Lake Store uses Azure Active Directory for authentication.
The two main options available for this are:
  • End-user authentication 
  • Service-to-service authentication 

In this post, let us see how to create service-to-service authentication.
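After the app registration, the client ends up with three values: the application (client) ID, a key (client secret), and the tenant's OAuth 2.0 token endpoint. For example, when Azure SQL Data Warehouse later connects to the store with this service principal, those values plug into a database scoped credential like this (placeholders shown):

```sql
-- T-SQL in the data warehouse database; values come from the app registration
CREATE DATABASE SCOPED CREDENTIAL ADLSCredential
WITH IDENTITY = '<client id>@https://login.microsoftonline.com/<tenant id>/oauth2/token',
     SECRET   = '<client secret>';
```

The service principal must also be granted permissions on the Data Lake Store folders it needs to read.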

All about U-SQL

In my previous posts, I shared about getting started with Azure Data Lake Analytics and U-SQL, a basic example of U-SQL, and querying an Azure SQL table using U-SQL.

For those who are familiar with T-SQL, getting used to U-SQL is fairly easy.

U-SQL generic syntax:

@result =
    EXTRACT <column list>
    FROM "input file in Azure Data Lake Store"
    USING Extractors.<format>();

OUTPUT @result
TO "output file in Azure Data Lake Store"
USING Outputters.<format>();

We can also create user-defined objects: extractors, outputters, processors, appliers, combiners, and reducers.
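As a concrete sketch of this pattern (the file paths and columns are hypothetical), a complete script reading a tab-separated file and writing CSV might look like:

```sql
// U-SQL: read a TSV file from the store with an explicit schema
@searchlog =
    EXTRACT UserId int,
            Query  string
    FROM "/Input/SearchLog.tsv"
    USING Extractors.Tsv();

// Write the rowset back to the store as CSV
OUTPUT @searchlog
TO "/Output/SearchLog.csv"
USING Outputters.Csv();
```

Extractors.Tsv(), Csv(), and Text() are the built-in extractors; the outputters mirror them.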

In this post, I am going to share all the U-SQL references. 

Query Azure SQL database tables using U-SQL

In my previous post, I wrote about a basic U-SQL example. In this post, let us see how we can query Azure SQL Database tables using U-SQL.

The first step is to create a U-SQL database. Similar to other database systems and SQL-inspired big data processing systems such as Hive, U-SQL uses the concept of a database to group related objects together.
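A hedged sketch of the flow (the database, credential, and table names are hypothetical, and the credential itself is created separately via Azure PowerShell):

```sql
// U-SQL: group related objects in a database
CREATE DATABASE IF NOT EXISTS SampleDB;
USE DATABASE SampleDB;

// Data source pointing at an Azure SQL database
CREATE DATA SOURCE IF NOT EXISTS MyAzureSQLDB
FROM AZURESQLDB
WITH (
    PROVIDER_STRING = "Database=MyDb;Trusted_Connection=False;Encrypt=True",
    CREDENTIAL = MyCred,
    REMOTABLE_TYPES = (bool, byte, short, int, long, decimal, float, double, string, DateTime)
);

// Query a remote table and save the rows to the store
@rows = SELECT * FROM EXTERNAL MyAzureSQLDB LOCATION "dbo.MyTable";

OUTPUT @rows
TO "/Output/MyTable.csv"
USING Outputters.Csv();
```

REMOTABLE_TYPES lists the types that are allowed to be pushed to the remote database rather than processed locally.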


U-SQL Basic Example

In my previous posts, I covered the basics of creating Azure Data Lake Analytics and Data Lake Store.

In this post, let us see a basic example of a U-SQL script: selecting from a VALUES table value constructor and saving the output to Azure Data Lake Store.

We can create and submit U-SQL scripts from Visual Studio. While installing Visual Studio, we need to select the Azure Data Lake and Stream Analytics tools, or we can download them from here.
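A minimal sketch of such a script (the values and output path are made up for illustration):

```sql
// U-SQL: build an in-memory rowset from a VALUES table value constructor
@data =
    SELECT * FROM
        (VALUES
            (1, "Azure"),
            (2, "U-SQL")
        ) AS T(Id, Name);

// Persist the rowset to the store as CSV
OUTPUT @data
TO "/Output/sample.csv"
USING Outputters.Csv();
```

Because no input file is read, this is a handy first script for verifying that job submission and the store are wired up correctly.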

Getting started with Azure Data Lake Analytics, U-SQL and Data Lake store

Azure Data Lake Analytics is an on-demand analytics job service to simplify big data analytics.

It includes U-SQL, a language that combines a familiar SQL-like declarative language with the extensibility and programmability provided by C# types and the C# expression language, and big data processing concepts such as "schema on read".

U-SQL’s scalable distributed runtime enables you to efficiently analyze data in the store and across SQL Servers in Azure, Azure SQL Database, and Azure SQL Data Warehouse.

Azure data factory - Copy data from Azure blob storage to Azure cosmos DB

In my previous post, we saw how to copy data from Azure blob storage to Azure Cosmos DB using the Azure Data Factory copy wizard. In this post, let us see how to perform the same copy operation by creating JSON definitions for the linked services, datasets, pipeline, and activity from the Azure portal.

First we need to create Azure data factory from Azure portal:

Click New -> Data + Analytics -> Data Factory
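As a sketch of what such a JSON definition looks like (the name and connection string are placeholders), the blob storage linked service for a Data Factory v1 pipeline follows this shape:

```json
{
    "name": "AzureStorageLinkedService",
    "properties": {
        "type": "AzureStorage",
        "typeProperties": {
            "connectionString": "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>"
        }
    }
}
```

Similar JSON definitions are created for the Cosmos DB linked service, the input and output datasets, and the pipeline containing the copy activity.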

Copy data from Azure blob storage to Azure cosmos DB

In this post, let us see an example to copy data from Azure blob storage to Azure cosmos DB SQL API using Azure data factory copy wizard.

I have created Azure blob storage and Azure Cosmos DB SQL API in my previous posts.

Azure Storage Explorer

Azure Blob Storage is a service for storing large amounts of unstructured data. 
We now have the facility to work with blob containers and blobs using Storage Explorer, which is currently in preview.

We can download Storage Explorer for free from here.

Azure blob storage and container

In this post, let us see how we can create Azure blob storage account, container and upload file into container.

Login into Azure portal -> Click New -> Storage -> Storage account - blob, file, table, queue

SQL Vulnerability Assessment for Azure SQL Database and on-premises SQL Server

SQL Vulnerability Assessment (a newly introduced feature in SSMS) can help discover, track, and remediate potential database vulnerabilities. We can use it to proactively improve database security.

From SSMS -> Object Explorer:

Lift and shift SSIS packages to Azure

This article is about moving packages deployed in an on-premises SQL Server to Azure SQL Database.

Once packages are deployed to the cloud, managing and monitoring them is similar to how we do it from the Integration Services catalog.

The Azure-SSIS Integration Runtime engine, introduced as part of Azure Data Factory version 2, is used to execute SSIS packages in the cloud.

To provision the SSISDB in Azure and Azure SSIS Integration runtime:

Getting started with Azure data factory

Azure Data Factory is a cloud data integration service with which we can:

1.  Create ETL/ELT data-driven workflows to process raw unorganised data from various data stores for data analysis

2.  Transform the data by using compute services such as Azure HDInsight Hadoop, Spark, Azure Data Lake Analytics, and Azure Machine Learning

3.  Automate data movement

SSIS Unpivot

In this post, let us see how to unpivot data using SSIS. Let us consider the below sample data.

The below screenshot shows the above data in unpivoted format:
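For comparison, the same reshaping can be expressed in T-SQL with the UNPIVOT operator (the table and column names here are hypothetical):

```sql
-- Hypothetical pivoted source: one row per student, one column per subject
SELECT StudentName, Subject, Marks
FROM (
    SELECT StudentName, Maths, Science, English
    FROM dbo.StudentScores
) AS src
UNPIVOT (
    Marks FOR Subject IN (Maths, Science, English)
) AS unpvt;
```

The SSIS Unpivot transformation does the equivalent in the data flow: the subject columns become input columns, Marks the destination column, and Subject the pivot key value column.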

Import and Export data from Azure Cosmos DB

In my previous posts, I wrote about deploying Azure Cosmos DB and the basics of the Azure Cosmos DB SQL query. In this post, let us see an example of importing data into Azure Cosmos DB.

The Data Migration tool is an open source solution that imports data to Azure Cosmos DB from a variety of sources, including:
  • JSON files
  • MongoDB
  • SQL Server
  • CSV files
  • Azure Table storage
  • Amazon DynamoDB
  • HBase
  • Azure Cosmos DB collections

Download the Azure Cosmos DB Data Migration tool from here; after downloading, we can run it.

Basics of an Azure Cosmos DB SQL query

In my previous post, I wrote about getting started with Azure Cosmos DB. In this post, let us see basic SQL queries against the JSON documents.

Microsoft Azure Cosmos DB supports querying documents using SQL as a JSON query language on SQL API accounts. Azure Cosmos DB is truly schema-free.

To show some examples, I am going to create two sample JSON documents:
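For instance, suppose the two hypothetical documents are {"id": "1", "city": "Chennai"} and {"id": "2", "city": "London"}. Since every document is addressed through the collection alias (conventionally c), a filter query looks like this:

```sql
SELECT c.id, c.city
FROM c
WHERE c.city = "Chennai"
```

This returns only the first document, as a JSON array of matching documents.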

Azure SQL Database elastic query

In this post, let us see how we can achieve cross-database querying between two Azure SQL databases that reside on the same server.

In previous posts, I wrote about 

Going through the above topics will be helpful before creating an elastic query.
An elastic query can be used to make data located in one Azure SQL database available to other Azure SQL databases.

I have created two Azure SQL databases, AzSQLDB & AzSQLDB_OLTP, on the same server azsqldbserv, as shown below:
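The setup on the querying side can be sketched as follows (the credential, login, and the Orders table are hypothetical):

```sql
-- Run in the database that issues the query (e.g. AzSQLDB)
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';

CREATE DATABASE SCOPED CREDENTIAL ElasticCred
WITH IDENTITY = '<sql login>', SECRET = '<password>';

-- RDBMS type marks this as an elastic query data source
CREATE EXTERNAL DATA SOURCE RemoteOLTP
WITH (
    TYPE = RDBMS,
    LOCATION = 'azsqldbserv.database.windows.net',
    DATABASE_NAME = 'AzSQLDB_OLTP',
    CREDENTIAL = ElasticCred
);

-- Schema must match the remote table's definition
CREATE EXTERNAL TABLE dbo.Orders (
    OrderId INT,
    Amount  DECIMAL(10, 2)
)
WITH (DATA_SOURCE = RemoteOLTP);

SELECT * FROM dbo.Orders;  -- executes against AzSQLDB_OLTP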

Azure Cosmos DB

Azure Cosmos DB is Microsoft's globally distributed, multi-model database.
With Azure Cosmos DB, we can quickly create and query document, key/value, and graph databases.

Azure cosmos DB supports multiple models:

  1. SQL : A schema-less JSON database engine with SQL querying capabilities.
  2. MongoDB : A MongoDB database service built on top of Cosmos DB.  Compatible with existing MongoDB libraries, drivers, tools and applications.
  3. Table : A key-value database service built to provide premium capabilities for Azure Table storage applications.
  4. Graph (Gremlin) : A graph database service built following the Apache TinkerPop specification.
  5. Cassandra : A key/value store built on the Apache Cassandra implementation. 
In this post, let us see steps to get started with Azure Cosmos DB SQL API.

What is Microsoft SQL Operations Studio

For various SQL Server database-related operations, SQL Server Management Studio (SSMS) is used.

Until SQL Server 2016, SSMS was included in the SETUP.exe for installing SQL Server; from SQL Server 2016 onwards, it needs to be installed separately.

SQL Operations Studio is now available in preview. What is Microsoft SQL Operations Studio ?

Error code 17051 while starting SQL Server services

It had been a while since I connected to the SQL Server installed on my laptop using SQL Server Management Studio (SSMS).

When I tried connecting using SSMS, I got error as shown in below screenshot:

H1B visa petition case status prediction model using Azure machine learning

In this post, let us see an example for Azure Machine learning - Classification model.

Steps involved to build an Azure ML experiment (classification model) that will predict based on the input data feed:
1) Data preparation
2) Choosing the Algorithm
3) Train, Score & Evaluate the model
4) Deploying the trained model 

SQL Server 2017 - Graph tables

Native graph databases have risen in popularity, being used for social networks, transportation networks, logistics, and much more. Graph database scenarios can easily be found across several business disciplines, including supply chain management, computer or telecommunication networks, detecting fraud attacks, and recommendation engines.

SQL Server offers graph database capabilities to model many-to-many relationships, handle applications with hierarchical data, and analyze interconnected data and relationships.
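A minimal sketch of the feature (the Person and Friends tables are hypothetical):

```sql
-- Node and edge tables (SQL Server 2017+)
CREATE TABLE dbo.Person (
    Id   INT PRIMARY KEY,
    Name NVARCHAR(100)
) AS NODE;

CREATE TABLE dbo.Friends AS EDGE;

INSERT INTO dbo.Person VALUES (1, N'Alice'), (2, N'Bob');

-- Connect Alice -> Bob via the edge table's pseudo-columns
INSERT INTO dbo.Friends ($from_id, $to_id)
VALUES (
    (SELECT $node_id FROM dbo.Person WHERE Id = 1),
    (SELECT $node_id FROM dbo.Person WHERE Id = 2)
);

-- MATCH traverses the relationship
SELECT p2.Name
FROM dbo.Person AS p1, dbo.Friends AS f, dbo.Person AS p2
WHERE MATCH(p1-(f)->p2) AND p1.Name = N'Alice';
```

The $node_id, $from_id, and $to_id pseudo-columns are maintained by the engine; MATCH in the WHERE clause replaces the multi-way joins you would otherwise write by hand.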

SQL Serve(R) visualization on SSRS

We all know SQL Server R Services is one of the new features introduced in the recent release of SQL Server.

Below are the benefits of integrating the R language within SQL Server:

i)  keep analytics close to the data 
ii) eliminate the costs and security risks associated with data movement.

So it means we can use the rich and powerful R language within SQL Server Management Studio (SSMS) to create models and generate predictions using our SQL Server data.
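For visualization, the usual pattern is to have the R script render a plot to a file and return its bytes, which SSRS can then display as an image. A sketch adapted to hypothetical data (not a production-ready script):

```sql
EXEC sp_execute_external_script
    @language = N'R',
    @script = N'
        # Render a histogram to a temporary JPEG, then return its bytes
        image_file <- tempfile(fileext = ".jpg");
        jpeg(filename = image_file, width = 400, height = 400);
        hist(InputDataSet$val);
        dev.off();
        OutputDataSet <- data.frame(
            plot = readBin(image_file, "raw", file.info(image_file)$size));
    ',
    @input_data_1 = N'SELECT CAST(number AS FLOAT) AS val
                      FROM master..spt_values WHERE type = ''P'''
WITH RESULT SETS ((plot VARBINARY(MAX)));
```

The returned VARBINARY column can be bound to an SSRS image item (or inserted into a table the report reads from).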

SQL Server 2016 - Getting started with R Services

R Services (In-Database) provides a platform for developing and deploying intelligent applications that uncover new insights.

R Services (In-Database) integrates the R language with SQL Server, so you can keep analytics close to the data and eliminate the costs and security risks associated with data movement.

To install R Services
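After the feature is installed, external script execution must be enabled before any R code will run. A minimal check:

```sql
-- Enable external scripts, then restart the SQL Server service
EXEC sp_configure 'external scripts enabled', 1;
RECONFIGURE WITH OVERRIDE;

-- Verify R is reachable after the restart
EXEC sp_execute_external_script
    @language = N'R',
    @script = N'OutputDataSet <- data.frame(Version = R.version.string)'
WITH RESULT SETS ((Version NVARCHAR(200)));
```

If the verification query returns the R version string, the integration is working.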

Publish an SSIS Package result set as a SQL View

From SQL Server version 2012 onwards, there is an in-built mechanism to publish an SSIS package result set as a SQL view.

1.    Requires the SSIS package to be designed with the Data Streaming Destination.

This destination component needs to be downloaded separately for 2012 and 2014. From 2016, it is part of the standard toolbox items.

2.    Running Data Feed Publish Wizard

    i)  Creates a linked server – SSISOLEDB.

    ii) Creates a SQL view in the specified database.

Below are the steps to publish an SSIS Package result set as a SQL View

SSRS 2016 Mobile reports

In this post, I have provided a quick walk-through on integrating mobile reports with SSRS 2016.

Below are the pre-requisites to play with mobile reports. Inside the Azure VM, install the below software:

1.       Microsoft Visual C++ 2012 Redistributable (x86) - 11.0.61030
2.       Microsoft SQL Server Mobile Report Publisher
3.       Microsoft® SQL Server® 2016 Report Builder
4.       Configure a firewall for report server access
5.       AdventureWorks Databases and Scripts for SQL Server 2016 CTP3