Accessing SalesForce Data Using SQL with Enzo Unified

In this article I will show you how to access SalesForce data using SQL commands through the Enzo Unified data virtualization server, either in real-time or through a local cache, to simplify data access and analysis. Most customers using SalesForce need to access the data stored in SalesForce tables to run custom reports or to display the information in custom applications. For example, some customers want to build a custom monitoring tool that notifies them when certain data changes are detected, while others want to keep a local copy of the SalesForce data for performance reasons.

High Level Overview

You may need to access SalesForce data for a variety of reasons: reports, data synchronization, real-time data access, business orchestrations, and more. Most organizations that need to extract SalesForce data are looking for real-time access into SalesForce, or access to a local copy of the data for faster processing. Enzo Unified is a data virtualization server that gives you the ability to access SalesForce data in real-time, or through an automatically refreshed local cache, using either SQL or REST commands.

Enzo Unified offers three data access and synchronization methods:

  • Virtual Table
    A virtual table provides a view into a SalesForce object. The virtual table doesn’t actually store any data; it provides a simple way to get to the data through REST and/or native SQL requests.
  • Snapshot
    A snapshot is created on top of a Virtual Table. Snapshots create a local copy of the underlying SalesForce data and make the data available to both REST and SQL requests. Snapshots buffer the data locally inside Enzo Unified, which provides a significant performance boost when querying data. Snapshots can be configured to be refreshed periodically.
  • Integration
    When SalesForce data needs to be copied to an external system, such as SQL Server or SharePoint, Enzo Unified provides an Integration adapter that is designed to copy data changes to a destination. For example, with the Integration adapter, Enzo Unified can detect changes to the Account table in SalesForce and replicate them in near real-time to a SharePoint list. The Integration adapter will be covered in a future article.

For a broader overview of Enzo Unified, read this whitepaper. For a brief justification for this technology, see this article.

SalesForce Virtual Table

Let’s first create a Virtual Table in our SalesForce adapter; the virtual table is called Account1, and points to the Account table in SalesForce. Note that the adapter is already configured with my SalesForce credentials. SalesForce credentials are assigned to an Enzo login; in the example below, the credentials used to connect to my SalesForce environment are attached to the ‘sa’ account in Enzo. Because multiple configuration settings can be saved, each is named; I am showing the ‘bscdev’ configuration, and the (*) next to it means that it is the default setting for this login.

image

To create a virtual table, let’s select the Virtual Tables tab. A virtual table in SalesForce is defined by a SOQL statement, which Enzo Unified runs behind the scenes to fetch the data. Clicking the ellipsis next to the SOQL statement allows you to edit the command, and the edit window allows you to test it. The columns of the virtual table are automatically created for you based on the command you specify.
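For example, a simple SOQL definition for the Account1 virtual table could look like the following (Id, Name, AccountNumber and Industry are standard fields of the SalesForce Account object; adjust the list to your environment):

SELECT Id, Name, AccountNumber, Industry FROM Account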

image image

Once the virtual table is created, retrieving data from the Account table is as simple as running this command:  SELECT * FROM SalesForce.Account1 – this executes the SOQL command against SalesForce behind the scenes, and once the data has been fetched it is returned to you.

You might wonder… where do I run this command from? Since Enzo Unified is a data virtualization server that understands native SQL Server requests, you can use SQL Server Management Studio (SSMS) to connect to Enzo Unified directly. You can also use Excel and connect to Enzo Unified as if it were a SQL Server database, or use ADO.NET from a .NET application and declare a SqlConnection that points to Enzo Unified.

For example, here is the command executed from SSMS; the data came back live from SalesForce.

 image

And here is the data using Excel 2013, with the Connection Properties to Enzo Unified. As you can see, Excel sees the Account1 virtual table and is able to read directly from it.

image image

You can also use Visual Studio and create a SqlConnection to connect to Enzo Unified and fetch data directly using ADO.NET. For example, the following code returns up to 100 records from the Account1 virtual table and binds the result to a data grid:

SqlConnection conn = new SqlConnection("server=localhost,9550;uid=enzo_login;pwd=enzo_password");
conn.Open();
// The SELECT below is translated into a SOQL call against SalesForce by Enzo Unified
SqlCommand cmd = new SqlCommand("SELECT TOP 100 * FROM SalesForce.Account1", conn);
SqlDataReader reader = cmd.ExecuteReader();

DataSet ds = new DataSet();
DataTable dt = new DataTable("Table1");
ds.Tables.Add(dt);
ds.Load(reader, LoadOption.PreserveChanges, ds.Tables[0]);
dataGridViewResult.DataSource = ds.Tables[0];

conn.Close();

 

Virtual Table Snapshot

As described previously, a Snapshot is a local copy of the data kept in Enzo Unified for faster retrieval. This allows you to create a simple local cache of remote data; in this case we will store the SalesForce Account data in a Snapshot called AccountCache. The Snapshot is defined on the Account1 virtual table. Using Enzo Manager, select the Account1 virtual table (in the SalesForce adapter), and select the Snapshot tab. The Snapshot below is refreshed daily; note that you can enter a Cron schedule to refresh the Snapshot at specific intervals.

image

Once created, the Snapshot becomes accessible through Enzo Unified using an EXEC statement:  EXEC SalesForce.AsyncResult 'AccountCache'

The main difference is that the data is now coming from a local cache instead of the SalesForce Account table directly. As a result, the data may be somewhat delayed; however, since the Snapshot offers a refresh schedule, you control how old the data can get.

image

You can also access the Snapshot data from code using ADO.NET. The following code shows you how (note the EXEC call to the AsyncResult object).

SqlConnection conn = new SqlConnection("server=localhost,9550;uid=enzo_login;pwd=enzo_password");
conn.Open();
// The EXEC call reads from the AccountCache Snapshot instead of SalesForce directly
SqlCommand cmd = new SqlCommand("EXEC SalesForce.AsyncResult 'AccountCache'", conn);
SqlDataReader reader = cmd.ExecuteReader();

DataSet ds = new DataSet();
DataTable dt = new DataTable("Table1");
ds.Tables.Add(dt);
ds.Load(reader, LoadOption.PreserveChanges, ds.Tables[0]);
dataGridViewResult.DataSource = ds.Tables[0];

conn.Close();

 

Summary

This article shows you how to access SalesForce data using native SQL commands through Enzo Unified, using various tools and techniques, including Excel, SSMS and .NET code. Because Enzo Unified is a data virtualization server that understands native SQL and REST commands, anyone with the proper access rights can access SalesForce data without learning the underlying APIs. And as with most data virtualization platforms, Enzo Unified offers security options giving you the ability to control who can access which virtual tables and snapshots.

About Herve Roggero

Herve Roggero, Microsoft Azure MVP, @hroggero, is the founder of Enzo Unified (http://www.enzounified.com/). Herve's experience includes software development, architecture, database administration and senior management with both global corporations and startup companies. Herve holds multiple certifications, including an MCDBA, MCSE, MCSD. He also holds a Master's degree in Business Administration from Indiana University. Herve is the co-author of "PRO SQL Azure" and “PRO SQL Server 2012 Practices” from Apress, a PluralSight author, and runs the Azure Florida Association.

Exploring Microsoft Azure DocumentDB

In this blog post, I will provide an introduction to DocumentDB by showing you how to create a collection (a collection is a container that stores your data), and how to connect to your collection to add and fetch data. DocumentDB is a newer no-sql data storage engine that is hosted exclusively in the Microsoft Azure cloud. Because DocumentDB stores records as JSON objects, it is a natural database engine for developers. Unlike other offerings, however, it also provides key features such as automatic indexing and server-side triggers, functions, and stored procedures (written in JavaScript).

Creating a DocumentDB Database

First, let’s create a new DocumentDB database so we can start exploring this service; three things need to be created:  an account, a database, and a collection (where data is actually stored). An account can host multiple databases, and each database can host multiple collections. From your Azure portal (https://portal.azure.com) find DocumentDB in the list of available services, and create a new DocumentDB account. The Resource Group is a logical container allowing you to group, view, and manage related services. The screenshot below shows the information I have provided.

Once the account has been created, the screen changes to show a menu of options from which you can create databases. Of importance, DocumentDB allows you to change the default consistency of your no-sql databases (consistency is an important no-sql concept as it can impact performance and availability – see Consistency levels in DocumentDB); we will keep the default setting. Also note an important configuration property: your keys. Locate the Keys configuration menu to reveal your access keys. Note that DocumentDB allows you to manage read-only keys as well.

Select Overview from the top of this menu, click Add Database, enter a database identifier (which is the database name; my database is called ‘testdb’), and click OK.

Once the database has been created, you will need to create a collection. Select the database to open up a new panel, and click on Add Collection. Enter the necessary information and click OK (see the information I provided below; my collection name is logdata; I also changed the Throughput to 400 to reduce the cost since this is a test collection).  At this point, we are ready to access this collection and start adding records (in proper no-sql speak, we will be adding documents).

Before jumping into the code, let’s make note of the following information since this will be needed later to connect to DocumentDB.

Configuration   Value                                                Comment
Database Id     testdb                                               The database “Id” is the name of the database we created
Collection Id   logdata                                              The collection “Id” is the name of the collection we created
Endpoint        https://YOUR-ACCOUNT-NAME.documents.azure.com:443/   This is the URI of your DocumentDB service; use your account name
Auth Key        {Look under Keys in your DocumentDB account}         This is the Primary Key or Secondary Key of your DocumentDB account

 

Create a VS2015 Project

Let’s create a simple project using Visual Studio 2015 to access the DocumentDB collection. Note that the complete project and source code can be found here:  http://www.bluesyntaxconsulting.com/files/DocumentDBLogData.zip 

We will create a Console Application to perform a few simple operations. Once you have created the project, you will need to add the DocumentDB SDK. To install the SDK, find the Microsoft Azure DocumentDB package in the NuGet Package Manager, or use the following command in the Package Manager Console (if you download the sample code, the package will be downloaded automatically when you first compile):

Install-Package Microsoft.Azure.DocumentDB

Let’s create a class that holds a single log entry in our DocumentDB collection. The class name is LogEntry. We need to have a unique identifier for every document, and it must be called Id.

    public class LogEntry
    {
        public string Id { get; set; } // Guid of log
        public DateTime DateAdded { get; set; }
        public string Message { get; set; }
        public string Category { get; set; }    // Info, Warning, Error
        public int Severity { get; set; } // Low, Medium, High
        public string Source { get; set; } // Application name
    }

Then, we will create a simple Console application that does two things: add a new document to our collection, and list all the documents in the collection. The following is the complete code for the console application; note the private variables at the top, which represent the configuration settings identified previously.

using Microsoft.Azure.Documents;
using Microsoft.Azure.Documents.Client;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;

namespace DocumentDBLogData
{
    class Program
    {

        static string endpoint = "https://YOUR-ACCOUNT-NAME.documents.azure.com:443/";
        static string authKey = "YOUR-PRIMARY-OR-SECONDARY-KEY";
        static string databaseId = "testdb";
        static string collectionId = "logdata";

        static void Main(string[] args)
        {
            Console.WriteLine("STARTING DocumentDB Demo... ");
           
            while(true)
            {
                Console.Clear();
                Console.WriteLine("1: Create new message");
                Console.WriteLine("2: List all messages");
                Console.WriteLine("");

                switch(Console.ReadKey().KeyChar)
                {
                    case '1':

                        Console.WriteLine("");
                        Console.WriteLine("Adding a record to DocumentDB...");

                        AddLogInfoEntry();

                        Console.WriteLine("New record added in DocumentDB. Press ENTER to continue.");
                        Console.ReadLine();

                        break;
                    case '2':

                        Console.WriteLine("");
                        Console.WriteLine("Fetching DocumentDB records...");

                        DisplayLogInfoEntry();

                        Console.WriteLine("");
                        Console.WriteLine("Press ENTER to continue.");
                        Console.ReadLine();

                        break;
                }

            }
        }

        static void AddLogInfoEntry()
        {
            using (DocumentClient client = new DocumentClient(new Uri(endpoint), authKey))
            {
                var collection = UriFactory.CreateDocumentCollectionUri(databaseId, collectionId);
                LogEntry le = new LogEntry()
                {
                    Id = Guid.NewGuid().ToString(),
                    Category = "Info",
                    DateAdded = DateTime.UtcNow,
                    Message = "General message from Console App",
                    Severity = 1,
                    Source = "CONSOLE APP"
                };

                Document newDoc = client.CreateDocumentAsync(collection, le).GetAwaiter().GetResult();

            }
        }

        static void DisplayLogInfoEntry()
        {
            using (DocumentClient client = new DocumentClient(new Uri(endpoint), authKey))
            {
                var collection = UriFactory.CreateDocumentCollectionUri(databaseId, collectionId);

                var docs = client.CreateDocumentQuery<LogEntry>(collection).AsEnumerable();

                foreach(var doc in docs)
                {
                    string output = "{0}\t{1}\t{2}\t{3}\t{4}";
                    output = string.Format(output, doc.Id, doc.DateAdded, doc.Source, doc.Severity, doc.Message);
                    Console.WriteLine(output);
                }

                Console.WriteLine();

            }
        }

    }
}

By pressing 1, the console application connects to DocumentDB and adds a record to the logdata collection. By pressing 2, the console application fetches all available documents and displays them on the screen. Note that if you have a large number of records, you will need to add logic to page the operation (say, 100 documents at a time) and to handle retries when the service is too busy; a sketch of both follows.
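Here is a minimal paging-and-retry sketch that could be added to the Program class above; it reuses the endpoint, authKey, databaseId and collectionId variables, and the 100-document page size is an arbitrary choice (two additional usings are needed: System.Net and Microsoft.Azure.Documents.Linq):

        static async Task DisplayLogInfoEntryPagedAsync()
        {
            using (DocumentClient client = new DocumentClient(new Uri(endpoint), authKey))
            {
                var collection = UriFactory.CreateDocumentCollectionUri(databaseId, collectionId);

                // MaxItemCount caps how many documents the service returns per round trip (one page)
                var query = client.CreateDocumentQuery<LogEntry>(collection,
                    new FeedOptions { MaxItemCount = 100 }).AsDocumentQuery();

                while (query.HasMoreResults)
                {
                    try
                    {
                        foreach (LogEntry doc in await query.ExecuteNextAsync<LogEntry>())
                            Console.WriteLine("{0}\t{1}\t{2}", doc.Id, doc.DateAdded, doc.Message);
                    }
                    catch (DocumentClientException ex) when (ex.StatusCode == (HttpStatusCode)429)
                    {
                        // 429 means the service is throttling this client; wait the
                        // suggested delay, then retry the same page
                        await Task.Delay(ex.RetryAfter);
                    }
                }
            }
        }

From the menu handler, the method can be invoked synchronously with DisplayLogInfoEntryPagedAsync().GetAwaiter().GetResult(), mirroring the CreateDocumentAsync call shown earlier.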

Conclusion

This simple introduction to DocumentDB provides a quick overview of the simplicity of this service, along with a sample project for creating and accessing documents. Although DocumentDB is very easy to configure and use in code, many advanced features (not covered in this introduction) are available around performance, security and availability. For a deeper understanding of DocumentDB, please refer to the online MSDN documentation, and the QuickStart provided in the DocumentDB menu inside the Azure Portal.

About Herve Roggero

Herve Roggero, Microsoft Azure MVP, @hroggero, is the founder of Enzo Unified (http://www.enzounified.com/). Herve's experience includes software development, architecture, database administration and senior management with both global corporations and startup companies. Herve holds multiple certifications, including an MCDBA, MCSE, MCSD. He also holds a Master's degree in Business Administration from Indiana University. Herve is the co-author of "PRO SQL Azure" and “PRO SQL Server 2012 Practices” from Apress, a PluralSight author, and runs the Azure Florida Association.

HANDS-ON LAB: Emulating Devices for Azure IoT Hub with SQL Server

In this post, I will explain how you can easily simulate dozens (or hundreds) of IoT devices to test your Azure IoT Hub configuration. Simulating IoT devices with dynamic data can help you test your Azure IoT Hub configuration settings, along with downstream consumers such as Stream Analytics and Power BI reports. In order to facilitate the communication between SQL Server and an Azure IoT Hub, we will use Enzo Unified (http://www.enzounified.com) which abstracts the underlying cloud APIs, allowing native SQL commands to be executed against the Azure IoT Hub. With Enzo Unified, simulating dozens of IoT devices can be done with a simple T-SQL statement.

image

Prerequisites

To emulate IoT devices, you will need a Microsoft Azure account, and have a desktop (or server) with the following technologies installed:

  • Windows 8 (or higher) with .NET 4.6 installed
  • SQL Server 2014 Express or higher
  • Enzo Unified 1.6 (please contact info@enzounified.com to obtain a download link and installation instructions)

We will use SQL Server to drive test cases (simulating devices) by running SQL scripts, and Enzo Unified as a bridge allowing the SQL scripts to send data to the Azure IoT Hub from SQL Server directly.

Configure the Microsoft Azure IoT Hub

Let's first configure our Azure IoT Hub.

image

  • Click on IoT Hub (under Internet of Things); another window will show up
  • Enter a name for the service (for example: IoTEnzoTest); this name is referred to as YOUR_IOT_HUB_NAME below
  • Specify a Resource Group name, such as EnzoTest
  • You will reuse this resource group later when creating a Streaming Analytics job
  • Change the pricing tier level to F1 if you can to minimize the cost of the Hub
  • Select the East US location
  • Check the Pin to Dashboard checkbox
  • Click Create

image

When the IoT Hub has been created (this may take a few minutes), click Settings on your IoT Hub page. Under Settings, select Shared Access Policies, and select the iothubowner policy; the Shared Access Keys will be displayed.
Save the Connection String Primary Key (you will use the entire connection string when configuring Enzo Unified later). The connection string to the IoT Hub looks like this:

HostName=YOUR_IOT_HUB_NAME.azure-devices.net;SharedAccessKeyName=iothubowner;SharedAccessKey=YOUR_SHARED_ACCESS_KEY

Configure Enzo Unified for the Azure IoT Hub

In this section, we will configure Enzo Unified to connect to the Azure IoT Hub service using SQL Server Management Studio (SSMS). Enzo Unified will serve as the bridge between SQL Server and the Azure IoT Hub environment by allowing SQL Server to send test messages to Azure.

  • Connect to Enzo Unified using SSMS
  • Explore some of the AzureIoTHub methods you have access to by running the built-in help command:

EXEC AzureIoTHub.help

image

One of the available methods is _configCreate; let's use this method to add your IoT connection string. To learn how to use this command, run the following statement:

EXEC AzureIoTHub._configCreate help

image

Let's create a configuration called "default" (replace the IoT Hub name and connection string with your values):

EXEC AzureIoTHub._configCreate 'default', 1, 'YOUR_IOT_HUB_NAME', 'YOUR_CONNECTION_STRING'

  • You are now ready to access the hub through Enzo Unified.
    • If you need to update the configuration settings, use the _configUpdate command (same parameters); then use the _configFlush command to apply the new settings. 
    • If you create multiple configuration settings, you can use the _configUse command to switch the active configuration. 
    • To list all available configuration settings, use the _configList command.

Run the following command to confirm you can connect to the Azure IoT Hub; no records will be returned yet as we have not yet configured our monitoring environment.

EXEC AzureIoTHub.ListDevices

Each SQL Server (i.e. device) has its own Access Key into the IoT Hub; you do not need to remember this information, as Enzo Unified works from the device name (DEVLAP03-SQL2014 in my example).

Create Virtual IoT Devices

We are now ready to add virtual IoT Devices, so that the Azure IoT Hub can accept incoming data from your SQL Server. We will create a few virtual IoT devices for this test.

  • Connect to Enzo Unified using SSMS
  • Run the following command (replace YOUR_DEVICE_NAME with a friendly name for the SQL Server; I used 'DEVLAP03-SQL2014')
  • NOTE: The backslash '\' is not a valid character for a device name; you can use the dash '-' instead to specify a SQL Server instance name

EXEC AzureIoTHub.CreateDevice 'test1'

EXEC AzureIoTHub.CreateDevice 'test2'

EXEC AzureIoTHub.CreateDevice 'test3'

EXEC AzureIoTHub.CreateDevice 'test4'

EXEC AzureIoTHub.CreateDevice 'test5'

You should now see the devices when you run the ListDevices command:

image

Let's test our new IoT Device by sending a JSON document from SSMS (through Enzo Unified):

Run the following command:

EXEC bsc.AzureIoTHub.SendData 'test1', '{"deviceId":"test1", "location":"home", "messurementValue":700, "messurementType":"darkness","localTimestamp":"2016-4-14 16:35:00"}'

You will soon see the message count go up on your Microsoft Azure IoT Hub dashboard.
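Since Enzo Unified looks like a SQL Server on the network, the same test message can also be sent from a .NET application. Here is a minimal ADO.NET sketch; the port (9590, matching the Linked Server used later in this lab) and the credentials are placeholders for your own environment:

using System;
using System.Data.SqlClient;

class SendIoTMessage
{
    static void Main()
    {
        using (SqlConnection conn = new SqlConnection("server=localhost,9590;uid=enzo_login;pwd=enzo_password"))
        {
            conn.Open();

            // Send a JSON payload on behalf of the virtual device 'test1'
            string sql = "EXEC bsc.AzureIoTHub.SendData 'test1', "
                + "'{\"deviceId\":\"test1\", \"location\":\"home\", \"messurementValue\":700, "
                + "\"messurementType\":\"darkness\",\"localTimestamp\":\"2016-4-14 16:35:00\"}'";
            using (SqlCommand cmd = new SqlCommand(sql, conn))
                cmd.ExecuteNonQuery();
        }
    }
}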

Simulating a Test From Multiple Virtual Devices

At this time we are ready to send multiple test messages on behalf of virtual devices to the Azure Hub. To achieve this, we will use the SendTestData method; this method allows you to send messages (with different values to simulate actual devices) from multiple devices. To obtain help on this method, you can run this command:

EXEC AzureIoTHub.SendTestData help

The following command sends at least 12 messages, with a 100ms interval, using four virtual devices. One of the parameters of this method is the list of devices that will participate in the test. The message is built from the template provided, which uses the following functions so that every message sent carries a different data set:

  • #deviceid() – the name of the device
  • #pick(home,work,car) – selects one of the values randomly
  • #rnddouble(a,b) – selects a random double value in the interval [a, b)
  • #utcnow() – the current time in UTC

EXEC bsc.AzureIoTHub.SendTestData 
    12,
    100,
    'test1,test2,test3,test4',
    '{"deviceId":"#deviceid()", "location":"#pick(home,work,car)", "messurementValue":#rnddouble(400.0,700.0), "messurementType":"darkness","localTimestamp":"#utcnow()"}'

The above command will generate at least 12 messages and output the messages that were actually sent to the Azure IoT Hub.

image

Saving Output to SQL Server for Analysis

Last but not least, let’s create a similar test and save the output of the SendTestData method to a local table in SQL Server so that it can be analyzed later. To do this, we will call Enzo Unified through a Linked Server; to configure a Linked Server to Enzo Unified, follow the instructions provided in the Enzo Unified help.

First, connect SSMS to your local SQL Server, and create a database with a table where the data will be stored.

IF (NOT EXISTS(SELECT * FROM master..sysdatabases WHERE name = 'mytestdb'))
CREATE DATABASE mytestdb
GO

IF (NOT EXISTS(SELECT * FROM mytestdb.sys.tables WHERE name = 'iotrawresults'))
  CREATE TABLE mytestdb..iotrawresults (dateCreated datetime, id nvarchar(50), messageId nvarchar(50), data nvarchar(1024), props nvarchar(1024), durationms int)
GO

To save the output of the SendTestData method into the iotrawresults table previously created, you will run the following command:

INSERT INTO mytestdb..iotrawresults EXEC [localhost,9590].bsc.AzureIoTHub.SendTestData
    10,
    100,
    'test1,test2',
    '{"deviceId":"#deviceid()", "location":"#pick(home,work,car)", "messurementValue":#rnddouble(400.0,700.0), "messurementType":"darkness","localTimestamp":"#utcnow()"}'

We have created a simple way to simulate IoT devices and send random data to an Azure IoT Hub; to scale this test system further, you can extend this lab with the following items:

  • Build a SQL Server job to run the test on a schedule
  • Build multiple jobs/tests to increase the number of devices and send a different message mix

Conclusion

This lab introduces you to the Azure IoT Hub, Enzo Unified and its AzureIoTHub adapter, and shows how to leverage SQL Server to create an ecosystem of virtual devices simulating data emission to the Microsoft Azure cloud.

About Herve Roggero

Herve Roggero, Microsoft Azure MVP, @hroggero, is the founder of Enzo Unified (http://www.enzounified.com/). Herve's experience includes software development, architecture, database administration and senior management with both global corporations and startup companies. Herve holds multiple certifications, including an MCDBA, MCSE, MCSD. He also holds a Master's degree in Business Administration from Indiana University. Herve is the co-author of "PRO SQL Azure" and “PRO SQL Server 2012 Practices” from Apress, a PluralSight author, and runs the Azure Florida Association.

Real-Time Data Aggregation

I had the privilege of being interviewed by Microsoft, on Channel 9, regarding real-time data aggregation from distributed heterogeneous data sources using the platform my company has created (Enzo Unified). This video introduces you to Enzo Unified, and shows how to easily merge data from multiple sources and create simple solutions that remove the complexities of APIs and traditional ETL data staging.

You can find the video here:  https://channel9.msdn.com/Shows/SupervisionNotRequired/4

Thank you David Crook for recording this session.

About Herve Roggero

Herve Roggero, Microsoft Azure MVP, @hroggero, is the founder of Enzo Unified (http://www.enzounified.com/). Herve's experience includes software development, architecture, database administration and senior management with both global corporations and startup companies. Herve holds multiple certifications, including an MCDBA, MCSE, MCSD. He also holds a Master's degree in Business Administration from Indiana University. Herve is the co-author of "PRO SQL Azure" and “PRO SQL Server 2012 Practices” from Apress, a PluralSight author, and runs the Azure Florida Association.

Windows 10 Core and Azure IoT Hub

I recently had the opportunity to follow an IoT lab using the instructions provided in the Nightlight workshop, as found here: http://thinglabs.io/workshop/cs/nightlight/. Needless to say, I jumped on the opportunity to learn about Windows 10 Core and have some fun with direct Azure integration and live Power BI reporting in the backend.

You will need the Azure IoT kit in order to go through this lab: https://www.adafruit.com/products/2733 – it costs a bit over $100; money well spent! In the box, you will find a Raspberry Pi2, a breadboard, and electronic components to build a nightlight.

WP_20160412_005

The first thing I needed to do was to upgrade my Windows development laptop from Windows 8 to Windows 10. The process was smooth and everything was compatible, including my Visual Studio 2015 installation (which is required for this lab). One thing to note is that you must install the Universal Windows App Development Tools –> Tools and Windows SDK option to build an app for devices; it is found under Features in the Visual Studio 2015 installer. Another important configuration step is to enable Windows 10 development on your DEV machine. All these prerequisites can be found here: http://thinglabs.io/workshop/cs/nightlight/getting-started/

Building out the nightlight was actually fun; I hadn’t touched electronic components in years, so this was refreshing and a bit challenging at times, especially with the ADC (Analog to Digital Converter) component. But with patience, it all started to make sense and soon enough the wiring was working.

WP_20160412_014

Then came the code part of things… this is where the real fun begins. Controlling the pins on the GPIO was the coolest thing ever… Basically the GPIO exposes pins that you can access programmatically to send commands and receive data.

WP_20160413_001

One of the steps in the lab was to create an Azure IoT Hub, connect to it from the device, and explore live data being sent over to the cloud; in this case, the Raspberry Pi2 was designed to capture light level information, send the light measure to the cloud every second, and turn on or off the nightlight depending on the darkness level of the room. The lab goes into details on how this is done here: http://thinglabs.io/workshop/cs/nightlight/setup-azure-iot-hub/ and here: http://thinglabs.io/workshop/cs/nightlight/sending-telemetry/.

The real surprise of this entire solution was to see data flow in near real-time through Power BI and visualize the darkness level. This is roughly what it looks like at the conclusion of the lab (picture taken from the lab):

Create the Power BI report

Not everything was smooth; in fact it took me nearly two days to get everything working. My biggest frustrations with the lab were two-fold:

  1. Visual Studio 2015 was, at times, unable to communicate with or find the Raspberry Pi2 to start the program
  2. Windows 10 Core pushes an update to the Raspberry Pi2 regardless of whether or not you want it to

The second issue was more concerning because the Windows upgrade failed on me repeatedly, and the only option was to reimage the Raspberry Pi2 with the Windows 10 Core default image. I learned later that it is possible to disable Windows Updates if you use Windows 10 Core Pro.

In all, this was an amazing lab; if you want to learn about Windows 10 Core, Azure IoT Hub, and connect the dots with Power BI, I highly recommend going through this lab.

About Herve Roggero

Herve Roggero, Microsoft Azure MVP, @hroggero, is the founder of Enzo Unified (http://www.enzounified.com/). Herve's experience includes software development, architecture, database administration and senior management with both global corporations and startup companies. Herve holds multiple certifications, including an MCDBA, MCSE, MCSD. He also holds a Master's degree in Business Administration from Indiana University. Herve is the co-author of "PRO SQL Azure" and “PRO SQL Server 2012 Practices” from Apress, a PluralSight author, and runs the Azure Florida Association.

Managing Multiple SQL Servers

If you deal with SQL Server on a regular basis, or if you are a DBA dealing with production databases, you are most likely using a monitoring tool to gather performance statistics and receive alerts. However, when it comes to performing real-time analysis, such as finding out which databases are low on log space, querying all your error logs for a specific event, or finding out which databases contain a specific stored procedure or column name, it can get a bit tricky. Most DBAs will use SQL Server Management Studio (SSMS) to perform real-time queries, along with its built-in tool called Central Management Servers; however, this tool has multiple shortcomings. This blog post presents an overview of Central Management Servers and of a new tool called Enzo SQL Manager, along with the pros and cons of both solutions.

Using SSMS Central Management Servers

SSMS offers a good solution for DBAs to query multiple SQL Server databases at once: Central Management Servers (or CMS). CMS is a technology built directly in SSMS and offers the ability to run SQL commands to one or more SQL Servers. You can think of CMS as a sharding technology, which allows you to run the same SQL command on multiple SQL Server instances at the same time, and show an aggregated result as if the command was run on a single server.

For example, in the screenshot below, SSMS is connected to CMS. The Registered Servers window on the left shows you CMS, with localhost\Enzo as the instance that hosts CMS. Within this instance, two servers are registered: DEVLAP02\SQLSERVER2012, and DEVLAP03\SQL2014. From the naming convention of my instances of SQL Server, you can infer that my SQL Server instances are not the same version of SQL Server.  Yet, I am able to run a single command (SELECT GETDATE()) and obtain the current datetime information on both instances in the result window. You will also notice that CMS automatically adds the ServerName column (although not specifically requested by the command) so that you know which server the information comes from.

image

From a configuration standpoint, CMS requires a network login (SSPI), and as a result is unable to connect to Azure SQL Databases. It also means that your SQL Servers must be joined to the same network.

There are other limitations with CMS; for example CMS can only query SQL Server instances; it cannot send the same SQL statement to all databases within an instance (you can run the undocumented sp_msforeachdb command, but this is not a distributed command; it is a loop operation which does not aggregate results). In addition, it is schema sensitive: sending a command to multiple SQL Server instances of various versions could return an error. For example, the following SQL command (SELECT * FROM sys.databases) fails in my setup because the system view “databases” returns slightly different columns in both SQL Server versions:

image

From an architecture standpoint, you cannot use CMS from a .NET application or any other tool than SSMS, because it is a feature of SSMS and unavailable outside of the application. In other words, you cannot leverage the ability to send parallel queries outside of SSMS.

Using Enzo SQL Manager

Like CMS, Enzo SQL Manager, a solution built on top of Enzo Unified, allows you to run SQL commands against one or more SQL Server instances; it can also run a command against all user databases and automatically adapts to varying schemas. Because Enzo SQL Manager works with database logins, you can include SQL Server instances that are not part of your network, including Azure SQL Databases.

In the screenshot below, SSMS is connected to Enzo SQL Manager running on LOCALHOST,9556 (shown at the bottom of the screenshot), and the command is executed against all the servers registered with Enzo (servers are registered with a separate management interface). Enzo SQL Manager provides views that run built-in SQL commands (or custom-defined commands) against the instances and/or databases; additional columns are also added automatically to identify the machine name, instance name and optionally the database id and name where the data came from. The command below (RowCounts) returns a list of tables in each database with a count of records.

image

Enzo SQL Manager offers a number of built-in commands, and allows you to extend the list of views by providing a name for the view and the SQL command that should be executed. For example, you can define a new view called CPUActivity which returns the SPID, CPU and LoginName of user processes that have a CPU greater than 0 (a sample statement follows the screenshot below). The checkbox “Execute command against all user databases” allows you to control whether the view will execute on each SQL Server instance, or on each user database within each instance.

image
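While the full statement is not visible in the screenshot, the view's underlying command could be as simple as the following sketch against the classic sysprocesses view (the spid > 50 predicate filters out system sessions; adjust to your needs):

SELECT spid, cpu, loginame FROM master..sysprocesses WHERE cpu > 0 AND spid > 50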

Once the view has been defined, you can now run this SQL command against all the registered servers when you are connected to Enzo:

SELECT * FROM MsSqlMgmt.CPUActivity

You can also further filter the results as such:

SELECT * FROM MsSqlMgmt.CPUActivity WHERE loginame <> 'sa'

Although Enzo SQL Manager does not natively understand complex queries (such as a JOIN or a GROUP BY operation), you can create a custom view that contains the complex SQL statement. For example, you could create a custom view that joins the Table and Index system tables and make that complex SQL query available through the view.

Since Enzo SQL Manager is a service, you can connect to it from a .NET application or a dashboard, making it easier to create custom monitoring solutions. For example, you could easily add a SQL Job that calls the CPUActivity custom view, and through Enzo Unified make a phone call or send an SMS text message when a specific condition has been detected using an SQL statement. For example, the following SQL Job monitors blocking calls across all registered servers and sends an SMS when a blocking issue has been detected. A table variable called @tableLocks is declared to store the list of blocking calls returned by Enzo SQL Manager.

DECLARE @tableLocks table (machine_name nvarchar(100), instance_name nvarchar(100), blocked_session_id int)

INSERT INTO @tableLocks
SELECT machine_name,instance_name,blocked_session_id  FROM [localhost,9556].bsc.MsSqlMgmt.Blocking

SELECT * FROM @tableLocks

IF (Exists(SELECT * FROM @tableLocks))
BEGIN
    DECLARE @message nvarchar(250)
    SET @message = 'Blocking issue detected on ' + (SELECT CAST(COUNT(*) as nvarchar(5)) FROM @tableLocks) + ' session(s)!'
    EXEC [localhost,9556].bsc.twilio.sendsms 'YOUR PHONE NUMBER', @message
END
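Because Enzo SQL Manager behaves like a SQL Server on the network, the same check can also run from a small .NET monitor instead of a SQL Job. Here is a minimal sketch, assuming the Enzo instance on localhost,9556 used above and placeholder credentials:

using System;
using System.Data.SqlClient;

class BlockingMonitor
{
    static void Main()
    {
        using (SqlConnection conn = new SqlConnection("server=localhost,9556;uid=enzo_login;pwd=enzo_password"))
        {
            conn.Open();

            // Count blocked sessions across all registered servers
            int blocked = 0;
            using (SqlCommand cmd = new SqlCommand("SELECT blocked_session_id FROM MsSqlMgmt.Blocking", conn))
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read()) blocked++;
            }

            if (blocked > 0)
            {
                // Send the alert through the Twilio adapter, as in the SQL Job above
                string sql = "EXEC twilio.sendsms 'YOUR PHONE NUMBER', 'Blocking issue detected on " + blocked + " session(s)!'";
                using (SqlCommand sms = new SqlCommand(sql, conn))
                    sms.ExecuteNonQuery();
            }
        }
    }
}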

Enzo SQL Manager uses Twilio to send SMS messages; this allows you to send any text message directly from SQL Server by running the Twilio.SendSMS command. This command accepts multiple phone numbers so that you can send a text message to multiple phones. To make this work, you will need to open up an account with Twilio and use the Enzo SQL Management interface to register your Twilio account. This screen is found under Configuration –> Configure Twilio. If you wish to make phone calls from SQL Server, you will also need to make sure Enzo is accessible from the Internet; the Public Enzo Address is the URL where Enzo is accessible from the public Internet. For more information about Twilio, visit http://www.twilio.com.

image

Pros and Cons Summary

Generally speaking, CMS provides more comprehensive support for SQL statements; however, Enzo SQL Manager supports the creation of views that can contain any valid SQL statement. Enzo SQL Manager also offers capabilities that CMS does not: it queries databases in parallel, supports Linked Server access (for integration with SQL Jobs, for example), automatically adapts to varying schema definitions, is fully extensible, and supports Azure SQL Database connectivity.

image

How To Try Enzo SQL Manager

You can visit our website and download a trial version of Enzo SQL Manager at http://www.enzounified.com/enzo-sql-manager/.

About Herve Roggero

Herve Roggero, Microsoft Azure MVP, @hroggero, is the founder of Enzo Unified (http://www.enzounified.com/). Herve's experience includes software development, architecture, database administration and senior management with both global corporations and startup companies. Herve holds multiple certifications, including an MCDBA, MCSE, MCSD. He also holds a Master's degree in Business Administration from Indiana University. Herve is the co-author of "PRO SQL Azure" and “PRO SQL Server 2012 Practices” from Apress, a PluralSight author, and runs the Azure Florida Association.

About Enzo Unified

Enzo Unified is a data platform that helps companies reduce development and data integration project timelines, improve data quality, and increase operational efficiency by solving some of the most complex real-time data consumption challenges. For more information, contact info@enzounified.com, or visit http://www.enzounified.com/.

Azure Bootcamp 2016 – Boca Raton

This post is a summary of the local event, and includes the Azure Data Labs that we are hosting in the Boca Raton location for the Global Azure Bootcamp 2016.

Introduction

We are excited to introduce you to the Microsoft Azure cloud. Many of our local MVPs will be presenting tomorrow on various topics, so be ready for a downpour of information!  We will have two tracks: Dev Ops, and Developer. We will also have a joint session for an overall introduction to Azure in the morning, and another joint session for a general Q&A right after lunch.

Here is the tentative agenda:

image

Data Lab

We are proposing three labs for this event to focus on data: Azure SQL Database, Azure Storage, and the Redis Cache. Note that each lab could take between 30 and 45 minutes to complete; as a result, you may only be able to go through one or two labs at the event.

Azure SQL Database


Objectives: create a new Azure SQL Database server and database using the management portal (portal.azure.com), and connect to it using SQL Server Management Studio and Visual Studio.

Lab 1: Create an Azure SQL Database
https://azure.microsoft.com/en-us/documentation/articles/sql-database-get-started/

Lab 2: Connect using SQL Server Management Studio
https://azure.microsoft.com/en-us/documentation/articles/sql-database-connect-query-ssms/

Lab 3: Connect using C# and Add Rows
https://azure.microsoft.com/en-us/documentation/articles/sql-database-develop-dotnet-simple/

Lab 4: Run System Commands to Monitor Performance
https://azure.microsoft.com/en-us/documentation/articles/sql-database-monitoring-with-dmvs/

 

Azure Storage


Objectives: Create an Azure Table and store no-sql data in the table programmatically. Send a message in an Azure Queue programmatically and retrieve it later.

Lab 1: Create, Query and Update Data in an Azure Table with C#
https://azure.microsoft.com/en-us/documentation/articles/storage-dotnet-how-to-use-tables/

Lab 2: Create and Use Azure Queues using C#
https://azure.microsoft.com/en-us/documentation/articles/storage-dotnet-how-to-use-queues/

 

Redis Cache


Objectives: Create a Redis Cache environment and use it from C#.

Lab 1: Create and use a Redis Azure Cache

https://azure.microsoft.com/en-us/documentation/articles/storage-dotnet-how-to-use-queues/


About the Azure Bootcamp

The event is sponsored globally by many companies (see http://global.azurebootcamp.net/ for general information about this global event). We also have two local sponsors that are making this event possible: Champions Solutions Group (www.championsg.com), and Enzo Unified (www.enzounified.com).

The event is organized by the Azure Florida Association; we host meetings monthly (either online or onsite). Please join the Azure Florida Association on LinkedIn: https://www.linkedin.com/groups/4177626

Our speakers this year are:

Dave Noderer, MVP (@davenoderer, https://www.linkedin.com/in/davenoderer)

Adnan Cartwright, MVP (@adnancartwright, https://www.linkedin.com/in/adnan-cartwright-78633aa)

Jason Milgram, MVP (@jmilgram, https://www.linkedin.com/in/jasonmilgram)

Herve Roggero, MVP (@hroggero, https://www.linkedin.com/in/hroggero)

Accessing SharePoint Online Data Directly From SQL Server and REST

If you have ever worked with SharePoint, either on premises or online, and ever needed to fetch records or manage Lists programmatically, you know how hard this task can be. Learning the SharePoint API presents a significant learning curve, even for senior developers. In this post, you will see how you can interact with SharePoint Online using simple SQL statements or simple REST commands, and how you can tap into SharePoint directly from within SQL Server (such as from triggers, views, and functions). Although this blog post focuses on the online version of SharePoint, the examples below also work with SharePoint on premises. Accessing SharePoint directly from SQL Server (or through simple REST commands) requires the use of Enzo Unified, a real-time integration platform that hides the complexities of APIs.

Managing SharePoint Objects

SharePoint offers a large array of management options through its APIs, including the ability to create Sites, Lists, Folders, and Security. At the time of this writing, Enzo Unified allows you to access basic SharePoint management features using SQL, such as the ability to create a SharePoint list, and add Fields to the list. For example, creating a new SharePoint Task List using SQL looks like this:

EXEC SharePoint.createlist 'Ch9Tweets', 'Tasks', 'Channel9 Tweets Received'

You can also add new fields to a SharePoint list using SQL. For example, let’s add a new field called ScreenName of type Text:

EXEC SharePoint.addfield 'Ch9Tweets', 'ScreenName', 'Text'

Other SQL procedures allow you to create Folders where you can store Documents.

Reading From and Writing To SharePoint Lists

Now that we have created a SharePoint List using SQL, we can add a new record. Let’s use an INSERT statement to add a record to the SharePoint List we just created. The INSERT statement allows you to specify the field names you want to provide a value for.

INSERT INTO sharepoint.list@Ch9Tweets (Title, ScreenName) VALUES ('New tweet received.', '@hroggero')

image

The Messages window shows you that one record was affected. You may have noticed a minor variation to the usual SQL syntax: the table name is called ‘list@Ch9Tweets’. The @ sign is used to provide the actual name of the SharePoint list. The UPDATE and DELETE statements are similar:

UPDATE SharePoint.list@Ch9Tweets SET ScreenName = '@enzo_unified' WHERE ID = 1

DELETE FROM sharepoint.list@Ch9Tweets WHERE ID = 1

Selecting from a SharePoint list allows you to fetch SharePoint data directly using a SQL command, as such:

SELECT ID, Title, ScreenName, Priority FROM sharepoint.list@Ch9Tweets

It is also possible to include the TOP and WHERE clauses, which automatically issue the proper CAML query for SharePoint to filter the data. For example, you could select a subset of records like this:

SELECT TOP 5 ID, Title, ScreenName, Priority FROM sharepoint.list@Ch9Tweets WHERE ID < 10

image
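Since Enzo Unified speaks the SQL Server network protocol, the same list can also be read from .NET code. Here is a minimal ADO.NET sketch, assuming an Enzo Unified instance listening on localhost,9550 (the port used with the Linked Server below) and placeholder credentials:

using System;
using System.Data.SqlClient;

class ReadTweets
{
    static void Main()
    {
        using (SqlConnection conn = new SqlConnection("server=localhost,9550;uid=enzo_login;pwd=enzo_password"))
        {
            conn.Open();

            // Enzo Unified translates the TOP and WHERE clauses into a CAML query for SharePoint
            using (SqlCommand cmd = new SqlCommand("SELECT TOP 5 ID, Title, ScreenName FROM sharepoint.list@Ch9Tweets WHERE ID < 10", conn))
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine("{0}\t{1}\t{2}", reader["ID"], reader["Title"], reader["ScreenName"]);
            }
        }
    }
}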

Integrating SharePoint And SQL Server

The above examples work just fine if you are connected to Enzo Unified directly; however, when used directly from within SQL Server (such as in a trigger), you will need to create a Linked Server to Enzo Unified and add the server name to the query. For example, the previous SELECT command would look like this when called within a Stored Procedure (notice the addition of the linked server):

SELECT TOP 5 ID, Title, ScreenName, Priority FROM [localhost,9550].bsc.sharepoint.list@Ch9Tweets WHERE ID < 10

Most of the operations to access and change SharePoint data can also be performed using specific stored procedures within Enzo Unified. For example, it is possible to insert a new item using the AddListItemRaw command. This command inserts a new list item in the Ch9Tweets list, where the values for the Title and ScreenName fields are provided in the XML document passed in.

EXEC SharePoint.AddListItemRaw 'Ch9Tweets', '<root><Title>New tweet!</Title><ScreenName>@hroggero</ScreenName></root>'

REST Access

In addition to supporting native SQL commands from SQL Server, Enzo Unified also provides a simple REST interface that can be used from mobile phones. Let’s call the GetListItemsEx method using a REST command. To know which parameters to send to Enzo Unified, let’s run the HELP command on GetListItemsEx first:

exec bsc.SharePoint.GetListItemsEx help

image

We can see that GetListItemsEx requires a view name, and optionally accepts the columns and where parameters. Calling the GetListItemsEx method using the HTTP endpoint of Enzo Unified (listening on 19560 in this example) looks like this (note: the call requires an authentication token to work):

image

The JSON response provides the list of tweets that match the WHERE clause:

image
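From application code, the same REST call can be issued with any HTTP client. The sketch below is illustrative only: the URL path and the authentication header name are assumptions, so substitute the endpoint format and token mechanism configured for your Enzo Unified instance:

using System;
using System.Net.Http;

class RestSample
{
    static void Main()
    {
        using (HttpClient http = new HttpClient())
        {
            // Hypothetical header name and URL path; adjust to your Enzo Unified REST configuration
            http.DefaultRequestHeaders.Add("authToken", "YOUR_AUTH_TOKEN");
            string url = "http://localhost:19560/bsc/SharePoint/GetListItemsEx?viewName=YOUR_VIEW_NAME";
            string json = http.GetStringAsync(url).Result;  // JSON list of matching items
            Console.WriteLine(json);
        }
    }
}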

Conclusion

The ability to integrate directly with SharePoint from SQL Server without the need to use an ETL tool and without the need to learn the SharePoint APIs allows developers and DBAs to build direct database-level integration with SharePoint. As such, it is possible to interact directly with SharePoint lists through the following SQL Server objects:

  • Stored Procedures
  • Functions
  • Triggers
  • Views
  • SQL Jobs

In addition to low-level integration with SQL Server, Enzo Unified provides a REST interface allowing developers to communicate securely to SharePoint using HTTP or HTTPS.

About Herve Roggero

Herve Roggero, Microsoft Azure MVP, @hroggero, is the founder of Enzo Unified (http://www.enzounified.com/). Herve's experience includes software development, architecture, database administration and senior management with both global corporations and startup companies. Herve holds multiple certifications, including an MCDBA, MCSE, MCSD. He also holds a Master's degree in Business Administration from Indiana University. Herve is the co-author of "PRO SQL Azure" and “PRO SQL Server 2012 Practices” from Apress, a PluralSight author, and runs the Azure Florida Association.

About Enzo Unified

Enzo Unified is a data platform that helps companies reduce development and data integration project timelines, improve data quality, and increase operational efficiency by solving some of the most complex real-time data consumption challenges. For more information, contact info@enzounified.com, or visit http://www.enzounified.com/.

Accessing No-SQL data from SQL Server 2016 and R Services

In this post, I will show how to easily extend SQL Server 2016 R Services to consume No-SQL data in real-time, such as Azure Tables. This example could easily be modified to access other No-SQL services, such as Couchbase. As you may know, SQL Server 2016 (currently in Release Candidate 1) offers the ability to call an R service for statistical analysis using an SQL statement. What is interesting is that you can execute an R script (containing the statistical computation) directly from SQL Server 2016, and consume SQL Server data directly from within the R script. This means that you can more easily perform advanced mathematical computations in SQL Server by leveraging an external R service. However, there are situations in which parts of the data needed for the analysis are not readily available inside SQL Server, but live in external data stores, such as a No-SQL environment; this usually translates into additional configuration to leverage an ETL tool to load the data into temporary or staging tables. In this blog post, I will show you how SQL Server 2016 with R Services can tap into a No-SQL data store directly, using basic SQL statements and Enzo Unified, without an ETL tool and without temporary/staging tables.

I am assuming that you have installed SQL Server 2016 (RC1), and the necessary R services on the same machine; please follow these links for further information about SQL Server 2016 and how to install the R services. Note that you will need to select the Advanced Analytics Extensions option during the installation of SQL Server 2016.

- SQL Server 2016 RC1: https://www.microsoft.com/en-us/evalcenter/evaluate-sql-server-2016 

- Install R Services: https://msdn.microsoft.com/en-us/library/mt604883.aspx

- Post Installation Server Configuration: https://msdn.microsoft.com/en-us/library/mt590536.aspx

You will also need Enzo Unified installed on the same machine; to obtain Enzo Unified, contact info@enzounified.com. Enzo Unified looks like SQL Server on the network and provides instant access to a large number of endpoints using the SQL language natively, from SQL Server or applications directly. In this scenario, Enzo Unified is configured to allow access to an Azure Table using the following command:

SELECT * FROM AzureStorage.Table@table1

Note: Enzo Unified uses a specific convention to specify an Azure Table name: add the @ symbol followed by the name of the table. The above SQL command will return all available entities and columns from an Azure Table called “table1”.

In order to make this data available to an R script, I created a Linked Server definition in SQL Server that points to Enzo Unified. The Linked Server is called [localhost,9550]. This means that SQL Server can also access No-SQL data directly using the following command (note the addition of the Linked Server):

SELECT * FROM [localhost,9550].bsc.AzureStorage.Table@table1

However, Azure Tables do not have a firm schema definition; each entity in an Azure Table is an XML document with potentially different nodes; as a result, it is necessary to create a virtual table inside Enzo Unified that binds the table to a known schema. The following command creates a virtual table called “SalesData” under a schema called “shard”; the command accepts the name of the virtual table, the connection information to the Azure Table (not shown here to simplify the example), a comment, and the column names (schema) that it returns:

exec shard.CreateVirtualTable
     'SalesData',        -- name of virtual table
     'AllRecentSales',  -- name of the data source pointing to the desired Azure Table
     'Retrieves all sales data',    -- comment
     'int id|||x,datetime effdate|||x,itemname|||x,category|||x,statename|||x,decimal price|||x,int customerid|||x,int orderid|||x,__source__'      -- list of columns

Now that the virtual table has been created in Enzo Unified, the SalesData virtual table returns the desired columns from the Azure Table:

SELECT id, statename, price, customerid FROM [localhost,9550].bsc.shard.salesdata

Once the virtual table has been configured in Enzo Unified, SQL Server 2016 can present Azure Table data to an R script. The R script accepts an SQL command that is used as its input data set. For example, the following R script’s input data set is the SQL statement provided above, which returns four columns from the Azure Table: the id, statename, price and customerid fields.

image

As you can see above, Enzo Unified has abstracted the Azure Table entirely, presenting No-SQL data to an R script as if it were a SQL Server table. This sample script does not actually perform any statistical analysis; it only returns the data it reads, to demonstrate the capability to access No-SQL data from an R script. Because the call to Enzo Unified is performed in real-time, this script could yield a different output every time it is executed if the underlying Azure Table data changes.
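To make the round trip concrete, here is a minimal sketch that drives the same mechanism from ADO.NET; the R script simply echoes its input rows (it performs no statistics), the input query is the linked-server SELECT shown above, and the connection string (a local SQL Server 2016 instance with integrated security) is an assumption:

using System;
using System.Data.SqlClient;

class RServicesSample
{
    static void Main()
    {
        // Connect to the local SQL Server 2016 instance that hosts R Services
        using (SqlConnection conn = new SqlConnection("server=localhost;Integrated Security=true"))
        {
            conn.Open();
            string sql =
                "EXEC sp_execute_external_script " +
                "  @language = N'R', " +
                "  @script = N'OutputDataSet <- InputDataSet;', " +   // echo the input data set
                "  @input_data_1 = N'SELECT id, statename, price, customerid FROM [localhost,9550].bsc.shard.salesdata'";
            using (SqlCommand cmd = new SqlCommand(sql, conn))
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine("{0}\t{1}\t{2}\t{3}", reader[0], reader[1], reader[2], reader[3]);
            }
        }
    }
}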

In conclusion, combining SQL Server 2016 (RC1), R Services, and Enzo Unified allows you to perform statistical analysis on a large number of data sources, including No-SQL databases, without building temporary tables or complex ETL processes to bring the data into SQL Server staging tables.

About Herve Roggero

Herve Roggero, Microsoft Azure MVP, @hroggero, is the founder of Enzo Unified (http://www.enzounified.com). Herve's experience includes software development, architecture, database administration and senior management with both global corporations and startup companies. Herve holds multiple certifications, including an MCDBA, MCSE, MCSD. He also holds a Master's degree in Business Administration from Indiana University. Herve is the co-author of "PRO SQL Azure" and “PRO SQL Server 2012 Practices” from Apress, a PluralSight author, and runs the Azure Florida Association.

About Enzo Unified

Enzo Unified is a data platform that helps companies reduce development and data integration project timelines, improve data quality, and increase operational efficiency by solving some of the most complex real-time data consumption challenges. For more information, contact info@enzounified.com, or visit http://www.enzounified.com/.

Exporting Azure Table Into Excel and SQL Server using Enzo Cloud Backup

Do you have data stored in Azure Tables but can’t find a way to easily export it into Excel, or even SQL Server, for analysis? With Enzo Cloud Backup (a free tool, by the way) you can easily do that. Although Enzo Cloud Backup is a backup tool first and foremost, it can also export Azure Table data. Let me show you how. Backing up an Azure Table allows you to restore data at a later time, while exporting an Azure Table allows you to inspect its content and run reports if desired.

Install and Configure Enzo Cloud Backup

First, you will need to download Enzo Cloud Backup. When you start Enzo Cloud Backup, you will need to log in by providing a Storage Account and a Key; this is the Storage Account the tool uses to save its internal configuration settings, so it is usually best to create a separate Storage Account for Enzo Cloud Backup to use. Once the information is provided, click on Connect. To learn more about how to create a Storage Account in Microsoft Azure, see this post.

image

Once logged in, let’s register the Storage Account that we want to explore, and in which the Azure Table you want to export resides. From the menu, click Connection –> Data Store –> Register –> Azure Storage. This will bring up a window that allows you to register a Storage Account you want to back up or explore. In this example, I am registering a storage account called bsctest1.

image

Once registered, the account will show up in the left menu of the main screen, under Azure Storage. I am showing the bsctest1 Storage Account below. As you can see, a few Azure Tables are listed, along with a few buttons at the top: Backup, Open in Excel, and Explore.

image

Let’s also register a database server to export to, so that we can quickly select it later (note: this step is optional). To do this, go to Connection –> Data Store –> Register –> Database Server, and enter the name of the database server along with credentials. Click OK to save the database connection information.

image

Exploring Azure Table Data

Let’s first explore the content of the enzodebug Azure Table. Click on the enzodebug table, then click on Explore. This will open up a browser showing up to 1,000 entities at a time. You can enter a list of column names separated by commas, and a filter for the data, both of which take effect once you click on Refresh. Clicking on Next allows you to browse the next set of records. The browser allows you to quickly inspect your data, but you cannot export from here. To learn more about filters, visit this page on MSDN and search for the Filter Expressions section.
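As an illustration, filters use the Azure Table Service query syntax; a hypothetical filter restricting the results to a given partition and date range could look like this (the property names are assumptions):

PartitionKey eq 'errors' and Timestamp ge datetime'2016-01-01T00:00:00Z'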

image

Exporting to Excel

Another available feature is exporting Azure Table data to Excel directly from Enzo Cloud Backup. Back on the main screen, click on the Open in Excel button. This will open a screen giving you a few options: you can provide a list of properties to return (all by default) and a filter for the data (none by default). You can also choose a Table Hint, which provides much faster data retrieval when the PartitionKey contains GUIDs or random numbers. Select Optimize Search for GUID Values when your PartitionKey has random values.

image

When you are ready, click on Export. A progress indicator will tell you how far along the load into Excel is.

image

Once the download is complete and Excel has received all the data, you will see it displayed as expected. Depending on how many records you are exporting, and the PartitionKey strategy selected, this operation may take some time.

image

Exporting to SQL Server or Azure SQL Database

You can also export this data to SQL Server, or Azure SQL Database just as easily. Because of data type constraints and other considerations, a few more options are available to you.

From the main window, with the table still selected, click on Backup. Note that unlike the two previous options, this approach allows you to export multiple tables at once. The Backup screen will show up as follows:

image

If you would like to use the GUID strategy as discussed previously, you can do so under the Default Strategy tab:

image

From the General tab, click on Add Backup Device. A panel will be added to the list of destinations. Choose SQL Server or SQL Database from the destination dropdown list, and provide the connection credentials. In this example, I am also creating a new database (TestExport) with a default size of 1GB (this is important: if your data needs more than 1GB of space, you need to change the database size accordingly or the export will fail). Note that if you did not register a database server previously, you can type the server name and the user id/password fields by hand here.
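If you find out later that the database is too small, the maximum size can also be changed with T-SQL; for example, here is a sketch for Azure SQL Database (the 5 GB value is purely illustrative):

-- Increase the maximum size of the TestExport database (Azure SQL Database)
ALTER DATABASE TestExport MODIFY (MAXSIZE = 5 GB);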

image

In the Data Import Options you can change a few settings that dictate the behavior of the export process when unexpected data conditions occur. I chose to create missing tables, and to add Error Columns if one or more columns cannot be loaded in SQL Server (this allows you to load the data even if some columns fail to load).

image

After you click the Start button, you can see the progress of the export by looking at the Tasks.

image

Once completed, we can view our data. Using SQL Server Management Studio (SSMS), let’s connect to the database where the export has occurred. Make sure to pre-select the database name on the Connection Properties tab if you are connecting to an Azure SQL Database.

image

image

Once logged in, simply select from the table:

image
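For reference, the query is an ordinary SELECT against the exported table; something along these lines (the table name is assumed to match the Azure Table):

-- Inspect the first rows of the exported data
SELECT TOP (100) * FROM dbo.enzodebug;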

Note that three fields were added automatically in my export (these fields are only created if there are data errors during the export, and if you selected the Add Error Columns option earlier): __ENZORAWXML__, __ENZOERRORID__, and __ENZOERRORDESC__. The error is telling me that one of the columns could not be exported because of a name mismatch: the TimeStamp column (date/time) already exists. That’s because in XML (the underlying storage format of Azure Tables), property names are case sensitive: in my case, each entity has both a Timestamp and a TimeStamp property (note the case difference). However, by default, SQL Server column names are not case sensitive, so it is not possible to create both fields in a single SQL Server table. While the extra TimeStamp column was not created, the __ENZORAWXML__ field contains the actual value of the field, in XML, so you can still inspect it there.

image
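As an illustration, you could single out the rows that had load errors with a query along these lines (again, the table name is assumed):

-- List rows that could not be fully loaded, along with the error details and raw XML
SELECT __ENZOERRORID__, __ENZOERRORDESC__, __ENZORAWXML__
FROM dbo.enzodebug
WHERE __ENZOERRORID__ IS NOT NULL;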

Conclusion

As shown in this blog post, Enzo Cloud Backup is a tool that allows you not only to back up Azure Storage content, but also to easily browse and export Azure Tables for further analysis in Excel and SQL Server / Azure SQL Database.
