Windows 10 Core and Azure IoT Hub

I recently had the opportunity to work through an IoT lab following the instructions provided in the Nightlight workshop, as found here: Needless to say, I jumped on the opportunity to learn about Windows 10 Core and have some fun with direct Azure integration and live Power BI reporting in the backend.

You will need the Azure IoT kit in order to go through this lab: – it costs a bit over $100; money well spent! In the box, you will find a Raspberry Pi2, a breadboard, and electronic components to build a nightlight.


The first thing I needed to do was upgrade my Windows development laptop from Windows 8 to Windows 10. The process was smooth and everything remained compatible, including my Visual Studio 2015 installation (which is required for this lab). One thing to note here is that you must install the Universal Windows App Development Tools –> Tools and Windows SDK to build an app for devices; that’s an option under Features in the Visual Studio 2015 installer. Another important configuration step is to enable developer mode on your Windows 10 DEV machine. All these prerequisites can be found here:

Building out the nightlight was actually fun; I hadn’t touched electronic components in years, so this was refreshing and a bit challenging at times, especially with the ADC (Analog-to-Digital Converter) component. But with patience, it all started to make sense, and soon enough the wiring was working.


Then came the code part of things… this is where the real fun begins. Controlling the pins on the GPIO was the coolest thing ever… Basically the GPIO exposes pins that you can access programmatically to send commands and receive data.


One of the steps in the lab was to create an Azure IoT Hub, connect to it from the device, and explore live data being sent to the cloud; in this case, the Raspberry Pi2 was designed to capture light level information, send the measurement to the cloud every second, and turn the nightlight on or off depending on the darkness level of the room. The lab goes into detail on how this is done here: and here:
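The decision logic itself is tiny; here is a rough sketch in Python for illustration (the lab implements this in C# as a UWP app, and the threshold value below is a placeholder of my own, not the lab's):

```python
DARKNESS_THRESHOLD = 0.7  # placeholder value; the lab tunes this per room

def should_light_on(light_level):
    """Return True when the room is dark enough to turn the LED on.

    light_level: normalized ADC reading, 0.0 (bright) to 1.0 (dark).
    """
    return light_level >= DARKNESS_THRESHOLD

# Each second: read the ADC, decide, drive the GPIO pin, send telemetry.
readings = [0.2, 0.8, 0.95, 0.4]
states = [should_light_on(r) for r in readings]
print(states)  # [False, True, True, False]
```

The real device loops forever, driving the LED pin with the decision and sending the raw reading to the IoT Hub each iteration.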

The real surprise of this entire solution was to see data flow in near real-time through Power BI and to visualize the darkness level. This is roughly what it looks like at the conclusion of the lab (picture taken from the lab):

Create the Power BI report

Not everything was smooth; in fact it took me nearly two days to get everything working. My biggest frustrations with the lab were two-fold:

  1. Visual Studio 2015 was, at times, unable to find or communicate with the Raspberry Pi2 to start the program
  2. Windows 10 Core pushes updates to the Raspberry Pi2 whether you want them or not

The second issue was more concerning because the Windows upgrade failed on me repeatedly, and the only option was to reimage the Raspberry Pi2 with the Windows 10 Core default image. I learned later that it is possible to disable Windows Updates if you use Windows 10 Core Pro.

All in all, this was an amazing lab; if you want to learn about Windows 10 Core and Azure IoT Hub, and connect the dots with Power BI, I highly recommend going through it.

About Herve Roggero

Herve Roggero, Microsoft Azure MVP, @hroggero, is the founder of Enzo Unified ( Herve's experience includes software development, architecture, database administration and senior management with both global corporations and startup companies. Herve holds multiple certifications, including an MCDBA, MCSE, MCSD. He also holds a Master's degree in Business Administration from Indiana University. Herve is the co-author of "PRO SQL Azure" and “PRO SQL Server 2012 Practices” from Apress, a PluralSight author, and runs the Azure Florida Association.

Managing Multiple SQL Servers

If you deal with SQL Server on a regular basis, or if you are a DBA dealing with production databases, you are most likely using a monitoring tool to gather performance statistics and receive alerts. However, when it comes to performing real-time analysis, such as finding out which databases are low on log space, querying all your error logs for a specific event, or even finding out which databases contain a specific stored procedure or column name, it can get a bit tricky. Most DBAs will use SQL Server Management Studio (SSMS) to perform real-time queries, along with the built-in tool called Central Management Servers; however, this tool has multiple shortcomings. This blog post presents an overview of Central Management Servers, introduces a new tool called Enzo SQL Manager, and discusses the pros and cons of both solutions.

Using SSMS Central Management Servers

SSMS offers a good solution for DBAs to query multiple SQL Server databases at once: Central Management Servers (or CMS). CMS is a technology built directly into SSMS that offers the ability to run SQL commands against one or more SQL Servers. You can think of CMS as a sharding technology: it runs the same SQL command on multiple SQL Server instances at the same time and shows an aggregated result as if the command had been run on a single server.

For example, in the screenshot below, SSMS is connected to CMS. The Registered Servers window on the left shows CMS, with localhost\Enzo as the instance that hosts it. Within this instance, two servers are registered: DEVLAP02\SQLSERVER2012 and DEVLAP03\SQL2014. From the naming convention of my instances, you can infer that they are not running the same version of SQL Server. Yet I am able to run a single command (SELECT GETDATE()) and obtain the current datetime information from both instances in the result window. You will also notice that CMS automatically adds the ServerName column (although not specifically requested by the command) so that you know which server the information comes from.
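Conceptually, this is a scatter-gather: the same statement runs on every registered instance, and the results are unioned with the extra ServerName column. A minimal illustrative sketch of that idea in Python, using in-memory SQLite databases as stand-ins for the two registered instances:

```python
import sqlite3

# Two in-memory databases stand in for two registered SQL Server instances.
servers = {
    "DEVLAP02\\SQLSERVER2012": sqlite3.connect(":memory:"),
    "DEVLAP03\\SQL2014": sqlite3.connect(":memory:"),
}

def run_on_all(sql):
    """Run the same statement on every server and union the results,
    prepending a ServerName column so you know where each row came from."""
    rows = []
    for name, conn in servers.items():
        for row in conn.execute(sql):
            rows.append((name,) + row)
    return rows

# The same command runs once per instance; results come back aggregated.
results = run_on_all("SELECT datetime('now')")
for server_name, value in results:
    print(server_name, value)
```

CMS runs the statements in parallel and handles connection management; this sketch only shows the aggregation pattern.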


From a configuration standpoint, CMS requires a network login (SSPI), and as a result is unable to connect to Azure SQL Databases. It also means that your SQL Servers must be joined to the same network.

There are other limitations with CMS; for example, CMS can only query SQL Server instances; it cannot send the same SQL statement to all databases within an instance (you can run the undocumented sp_msforeachdb command, but this is not a distributed command; it is a loop operation that does not aggregate results). In addition, CMS is schema sensitive: sending a command to multiple SQL Server instances of various versions could return an error. For example, the following SQL command (SELECT * FROM sys.databases) fails in my setup because the system view “databases” returns slightly different columns in the two SQL Server versions:


From an architecture standpoint, you cannot use CMS from a .NET application or any tool other than SSMS, because it is a feature of SSMS and is unavailable outside of the application. In other words, you cannot leverage its ability to send parallel queries outside of SSMS.

Using Enzo SQL Manager

Similarly to CMS, Enzo SQL Manager, a solution built on top of Enzo Unified, allows you to run SQL commands against one or more SQL Server instances; it can also run a command against all user databases and will automatically adapt to various schemas. Because Enzo SQL Manager works with database logins, you can include SQL Server instances that are not part of your network, including Azure SQL Databases.

In the screenshot below, SSMS is connected to Enzo SQL Manager running on LOCALHOST,9556 (shown at the bottom of the screenshot), and the command is executed against all the servers registered with Enzo (servers are registered with a separate management interface). Enzo SQL Manager provides views that run built-in SQL commands (or custom-defined commands) against the instances and/or databases; additional columns are also added automatically to identify the machine name, instance name and optionally the database id and name where the data came from. The command below (RowCounts) returns a list of tables in each database with a count of records.


Enzo SQL Manager offers a number of built-in commands and allows you to extend the list of views by providing a name for the view and the SQL command that should be executed. For example, you can define a new view called CPUActivity which returns the SPID, CPU, and LoginName of user processes with a CPU value greater than 0. The checkbox “Execute command against all user databases” allows you to control whether the view will execute on each SQL Server instance, or on each user database within each instance.


Once the view has been defined, you can now run this SQL command against all the registered servers when you are connected to Enzo:

SELECT * FROM MsSqlMgmt.CPUActivity

You can also further filter the results as such:

SELECT * FROM MsSqlMgmt.CPUActivity WHERE loginame <> 'sa'

Although Enzo SQL Manager does not understand complex queries natively (such as a JOIN or a GROUP BY operation), you can create a custom view with the complex SQL statement. For example, you could create a custom view that joins the Table and Index system tables and makes this complex SQL query available through the view.

Since Enzo SQL Manager is a service, you can connect to it from a .NET application or a dashboard, making it easier to create custom monitoring solutions. For example, you could easily add a SQL Job that calls the CPUActivity custom view, and through Enzo Unified place a phone call or send an SMS text message when a specific condition is detected using an SQL statement. For example, the following SQL Job monitors blocking calls against all registered servers and sends an SMS when a blocking issue has been detected. A table variable called @tableLocks is declared to store the list of blocking calls returned by Enzo SQL Manager.

DECLARE @tableLocks table (machine_name nvarchar(100), instance_name nvarchar(100), blocked_session_id int)

INSERT INTO @tableLocks
SELECT machine_name,instance_name,blocked_session_id  FROM [localhost,9556].bsc.MsSqlMgmt.Blocking

SELECT * FROM @tableLocks

IF (EXISTS(SELECT * FROM @tableLocks))
BEGIN
    DECLARE @message nvarchar(250)
    SET @message = 'Blocking issue detected on ' + (SELECT CAST(COUNT(*) as nvarchar(5)) FROM @tableLocks) + ' session(s)!'
    EXEC [localhost,9556].bsc.twilio.sendsms 'YOUR PHONE NUMBER', @message
END

Enzo SQL Manager uses Twilio to send SMS messages; this allows you to send any text message directly from SQL Server by running the Twilio.SendSMS command. This command accepts multiple phone numbers so that you can send a text message to multiple phones. To make this work, you will need to open up an account with Twilio and use the Enzo SQL Management interface to register your Twilio account. This screen is found under Configuration –> Configure Twilio. If you wish to make phone calls from SQL Server, you will also need to make sure Enzo is accessible from the Internet; the Public Enzo Address is the URL where Enzo is accessible from the public Internet. For more information about Twilio, visit


Pros and Cons Summary

Generally speaking, CMS provides more comprehensive support for SQL statements; however, Enzo SQL Manager supports the creation of views that can contain any valid SQL statement. Enzo Unified also offers capabilities that CMS does not: it can query databases in parallel, works as a Linked Server (for integration with SQL Jobs, for example), automatically adapts to variable schema definitions, is fully extensible, and supports Azure SQL Database connectivity.


How To Try Enzo SQL Manager

You can visit our website and download a trial version of Enzo SQL Manager at


About Enzo Unified

Enzo Unified is a data platform that helps companies reduce development and data integration project timelines, improve data quality, and increase operational efficiency by solving some of the most complex real-time data consumption challenges. For more information, contact, or visit

Azure Bootcamp 2016 – Boca Raton

This post is a summary of the local event, and includes the Azure Data Labs that we are hosting in the Boca Raton location for the Global Azure Bootcamp 2016.


We are excited to introduce you to the Microsoft Azure cloud. Many of our local MVPs will be presenting tomorrow on various topics, so be ready for a downpour of information!  We will have two tracks: Dev Ops, and Developer. We will also have a joint session for an overall introduction to Azure in the morning, and another joint session for a general Q&A right after lunch.

Here is the tentative agenda:


Data Lab

We are proposing three labs for this event that focus on data: Azure SQL Database, Azure Storage, and the Redis Cache. Note that each lab can take between 30 and 45 minutes to complete; as a result, you may only be able to go through one or two labs at the event.

Azure SQL Database

Objectives: create a new Azure SQL Database server and a database using the management portal ( and connect to it using SQL Server Management Studio and Visual Studio.

Lab 1: Create an Azure SQL Database

Lab 2: Connect using SQL Server Management Studio

Lab 3: Connect using C# and Add Rows

Lab 4: Run System Commands to Monitor Performance


Azure Storage

Objectives: Create an Azure Table and store No-SQL data in the table programmatically. Send a message to an Azure Queue programmatically and retrieve it later.

Lab 1: Create, Query and Update Data in an Azure Table with C#

Lab 2: Create and Use Azure Queues using C#


Redis Cache

Objectives: Create a Redis Cache environment and use it from C#.

Lab 1: Create and use a Redis Azure Cache

About the Azure Bootcamp

The event is sponsored globally by many companies (see for general information about this global event). We also have two local sponsors that are making this event possible: Champions Solutions Group (, and Enzo Unified (

The event is organized by the Azure Florida Association; we host meetings monthly (either online or onsite). Please join the Azure Florida Association on LinkedIn:

Our speakers this year are:

Dave Noderer, MVP (@davenoderer,

Adnan Cartwright, MVP (@adnancartwright,

Jason Milgram, MVP (@jmilgram,

Herve Roggero, MVP (@hroggero,

Accessing SharePoint Online Data Directly From SQL Server and REST

If you have ever worked with SharePoint, either on premises or online, and ever needed to fetch records or manage Lists programmatically, you know how hard of a task this can be. Learning the SharePoint API presents a significant learning curve even for senior developers. In this post, you will see how you can interact with SharePoint Online using simple SQL statements or simple REST commands, and how you can tap into SharePoint directly from within SQL Server (such as triggers, views, and functions). Although this blog post focuses on the online version of SharePoint, the examples below also work with SharePoint on premises. Accessing SharePoint directly from SQL Server (or simple REST commands) requires the use of Enzo Unified, a real-time integration platform that hides the complexities of APIs.

Managing SharePoint Objects

SharePoint offers a large array of management options through its APIs, including the ability to create Sites, Lists, Folders, and Security. At the time of this writing, Enzo Unified allows you to access basic SharePoint management features using SQL, such as the ability to create a SharePoint list, and add Fields to the list. For example, creating a new SharePoint Task List using SQL looks like this:

EXEC SharePoint.createlist 'Ch9Tweets', 'Tasks', 'Channel9 Tweets Received'

You can also add new fields to a SharePoint list using SQL. For example, let’s add a new field called ScreenName of type Text:

EXEC SharePoint.addfield 'Ch9Tweets', 'ScreenName', 'Text'

Other SQL procedures allow you to create Folders where you can store Documents.

Reading From and Writing To SharePoint Lists

Now that we have created a SharePoint List using SQL, we can add a new record. Let’s use an INSERT statement to add a record in the SharePoint List we just created. The INSERT statement allows you to specify the field names you want to provide a value for.

INSERT INTO sharepoint.list@Ch9Tweets (Title, ScreenName) VALUES ('New tweet received.', '@hroggero')


The Messages window shows you that one record was affected. You may have noticed a minor variation to the usual SQL syntax: the table name is called ‘list@Ch9Tweets’. The @ sign is used to provide the actual name of the SharePoint list. The UPDATE and DELETE statements are similar:

UPDATE SharePoint.list@Ch9Tweets SET ScreenName = '@enzo_unified' WHERE ID = 1

DELETE FROM sharepoint.list@Ch9Tweets WHERE ID = 1

Selecting from a SharePoint list allows you to fetch SharePoint data directly using a SQL command, as such:

SELECT ID, Title, ScreenName, Priority FROM sharepoint.list@Ch9Tweets

It is also possible to include the TOP and WHERE clauses, which automatically issue the proper CAML query for SharePoint to filter the data. For example, you could select a subset of records like this:

SELECT TOP 5 ID, Title, ScreenName, Priority FROM sharepoint.list@Ch9Tweets WHERE ID < 10
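For illustration, a TOP 5 / WHERE ID < 10 query would correspond to a CAML view roughly like the following (this is standard CAML list-query syntax; the exact query Enzo Unified generates may differ):

```xml
<View>
  <RowLimit>5</RowLimit>
  <Query>
    <Where>
      <Lt>
        <FieldRef Name='ID' />
        <Value Type='Counter'>10</Value>
      </Lt>
    </Where>
  </Query>
</View>
```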


Integrating SharePoint And SQL Server

The above examples work just fine if you are connected to Enzo Unified directly; however, when used from within SQL Server (such as in a trigger), you will need to create a Linked Server to Enzo Unified and add the server name to the query. For example, the previous SELECT command would look like this when called within a Stored Procedure (notice the addition of the linked server):

SELECT TOP 5 ID, Title, ScreenName, Priority FROM [localhost,9550].bsc.sharepoint.list@Ch9Tweets WHERE ID < 10

Most of the operations to access and change SharePoint data can also be performed using specific stored procedures within Enzo Unified. For example, it is possible to insert a new item using the AddListItemRaw command. This command inserts a new list item in the Ch9Tweets list, where the values for the Title and ScreenName fields are provided in the XML document passed in.

EXEC SharePoint.AddListItemRaw 'Ch9Tweets', '<root><Title>New tweet!</Title><ScreenName>@hroggero</ScreenName></root>'
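The raw XML document simply wraps one element per field under a root node; a quick illustrative way to build it (this helper is my own, not an Enzo API):

```python
import xml.etree.ElementTree as ET

def build_item_xml(fields):
    """Wrap each field/value pair in an element under a <root> node."""
    root = ET.Element("root")
    for name, value in fields.items():
        ET.SubElement(root, name).text = value
    return ET.tostring(root, encoding="unicode")

doc = build_item_xml({"Title": "New tweet!", "ScreenName": "@hroggero"})
print(doc)  # <root><Title>New tweet!</Title><ScreenName>@hroggero</ScreenName></root>
```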

REST Access

In addition to supporting native SQL commands from SQL Server, Enzo Unified also provides a simple REST interface that can be used from mobile phones. Let’s call the GetListItemsEx method using a REST command. To know which parameters to send to Enzo Unified, let’s run the HELP command on GetListItemsEx first:

exec bsc.SharePoint.GetListItemsEx help


We can see that GetListItemsEx requires a view name, and optionally accepts the columns and where parameters. Calling the GetListItemsEx method using the HTTP endpoint of Enzo Unified (listening on 19560 in this example) looks like this (note: the call requires an authentication token to work):
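Such a REST call essentially encodes the procedure name and its parameters into the URL. A hypothetical sketch of how the URL might be composed (the path segments and parameter names here are assumptions for illustration, not documented Enzo endpoints):

```python
from urllib.parse import urlencode

# Hypothetical path: adapter schema and procedure name, parameters as a query string.
base = "http://localhost:19560/bsc/SharePoint/GetListItemsEx"
params = {"viewname": "Ch9Tweets", "where": "ID < 10"}
url = base + "?" + urlencode(params)
print(url)
```

An authentication token would also be passed (for example as a header), which is omitted here.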


The JSON response provides the list of tweets that match the WHERE clause:



The ability to integrate directly with SharePoint from SQL Server, without the need for an ETL tool and without having to learn the SharePoint APIs, allows developers and DBAs to build direct database-level integration with SharePoint. As such, it is possible to interact directly with SharePoint lists through the following SQL Server objects:

  • Stored Procedures
  • Functions
  • Triggers
  • Views
  • SQL Jobs

In addition to low-level integration with SQL Server, Enzo Unified provides a REST interface allowing developers to communicate securely to SharePoint using HTTP or HTTPS.



Accessing No-SQL data from SQL Server 2016 and R Services

In this post, I will show how to easily extend SQL Server 2016 R Services to consume No-SQL data in real time, such as Azure Tables. This example could easily be modified to access other No-SQL services, such as Couchbase. As you may know, SQL Server 2016 (currently in Release Candidate 1) offers the ability to call an R service for statistical analysis using an SQL statement. What is interesting is that you can execute an R script (containing the statistical computation) directly from SQL Server 2016 and consume SQL Server data from within the R script. This means that you can more easily perform advanced mathematical computations in SQL Server by leveraging an external R service. However, there are situations in which part of the data needed for the analysis is not readily available inside SQL Server but sits in external data stores, such as a No-SQL environment; this usually translates into additional configuration and an ETL tool to load the data into temporary or staging tables. In this blog post, I will show you how SQL Server 2016 with R Services can tap into a No-SQL data store directly, using basic SQL statements and leveraging Enzo Unified, without an ETL tool and without creating temporary/staging tables.

I am assuming that you have installed SQL Server 2016 (RC1), and the necessary R services on the same machine; please follow these links for further information about SQL Server 2016 and how to install the R services. Note that you will need to select the Advanced Analytics Extensions option during the installation of SQL Server 2016.

- SQL Server 2016 RC1: 

- Install R Services:

- Post Installation Server Configuration:

You will also need Enzo Unified installed on the same machine; to obtain Enzo Unified, contact us. Enzo Unified looks like SQL Server on the network and provides instant access to a large number of endpoints using the SQL language natively, from SQL Server or applications directly. In this scenario, Enzo Unified is configured to allow access to an Azure Table using the following command:

SELECT * FROM AzureStorage.Table@table1

Note: Enzo Unified uses a specific convention to specify an Azure Table name: add the @ symbol followed by the name of the table. The above SQL command will return all available entities and columns from an Azure Table called “table1”.

In order to make this data available to an R script, I created a Linked Server definition in SQL Server that points to Enzo Unified. The Linked Server is called [localhost,9550]. This means that SQL Server can also access No-SQL data directly using the following command (note the addition of the Linked Server):

SELECT * FROM [localhost,9550].bsc.AzureStorage.Table@table1

However, Azure Tables do not have a firm schema definition; each entity in an Azure Table is an XML document with potentially different nodes. As a result, it is necessary to create a virtual table inside Enzo Unified that binds the table to a known schema. The following command creates a virtual table called “SalesData” under a schema called “shard”; the command accepts the name of the virtual table, the connection information to the Azure Table (not shown here to simplify the example), a comment, and the column names (schema) that it returns:

exec shard.CreateVirtualTable
     'SalesData',        -- name of virtual table
     'AllRecentSales',  -- name of the data source pointing to the desired Azure Table
     'Retrieves all sales data',    -- comment
     'int id|||x,datetime effdate|||x,itemname|||x,category|||x,statename|||x,decimal price|||x,int customerid|||x,int orderid|||x,__source__'      -- list of columns
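The column-list string packs an optional type and a name per entry, separated by commas; the meaning of the ||| separator and trailing x flag is Enzo-internal and not documented here, so the parser below is only my guess at the convention, for illustration:

```python
def parse_columns(defn):
    """Split the column-definition string into (type, name) pairs.

    Assumption: the '|||x' suffix is an internal Enzo flag, so this sketch
    simply strips it; entries without an explicit type default to None.
    """
    cols = []
    for entry in defn.split(","):
        entry = entry.split("|||")[0].strip()
        parts = entry.split()
        if len(parts) == 2:
            cols.append((parts[0], parts[1]))
        else:
            cols.append((None, parts[0]))
    return cols

defn = "int id|||x,datetime effdate|||x,itemname|||x,__source__"
print(parse_columns(defn))
```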

Now that the virtual table has been created in Enzo Unified, the SalesData virtual table returns the desired columns from the Azure Table:

SELECT id, statename, price, customerid FROM [localhost,9550].bsc.shard.salesdata

Once the virtual table has been configured in Enzo Unified, SQL Server 2016 can present Azure Table data to an R script. The R script accepts an SQL command that will be used as the input data set.  For example, the following R script’s input data set is the SQL statement provided above that returns four columns: the id, statename, price and customerid fields from the Azure Table.


As you can see above, Enzo Unified has abstracted the Azure Table entirely, presenting No-SQL data as a SQL Server table to an R script; this sample script does not actually perform any statistical analysis; it only returns the data it reads to demonstrate the capability to access No-SQL data from an R script. Because the call to Enzo Unified is performed in real-time, this script could yield a different output every time it is executed if the underlying Azure Table data changes.

In conclusion, configuring SQL Server 2016 (RC1), R Services, and Enzo Unified allows you to perform statistical analysis on a large number of data sources, including No-SQL databases, without building temporary tables or complex ETL processes to bring the data into SQL Server staging tables.



Exporting Azure Table Into Excel and SQL Server using Enzo Cloud Backup

Do you have data stored in Azure Tables but can’t find a way to easily export into Excel or even SQL Server for analysis? With Enzo Cloud Backup (a free tool by the way) you can easily do that. Although Enzo Cloud Backup is a backup tool first and foremost, you can also export Azure Table data. Let me show you how to do this. Backing up an Azure Table allows you to restore data at a later time, while exporting an Azure Table allows you to inspect the content of an Azure Table and run reports if desired.

Install and Configure Enzo Cloud Backup

First and foremost, you will need to download Enzo Cloud Backup. Then, when you start Enzo Cloud Backup, you will need to log in by providing a Storage Account and a Key; this is the Storage Account that the tool uses to save its internal configuration settings, and it is usually best to create a separate Storage Account for Enzo Cloud Backup to use. Once the information is provided, click on Connect. To learn more about how to create a Storage Account in Microsoft Azure, see this post.


Once logged in, let’s register a Storage Account that we want to explore, and in which the Azure Table you want to export resides. From the menu, click Connection –> Data Store –> Register –> Azure Storage. This will bring up a window that allows you to register a Storage Account you want to backup or explore. In this example, I am registering a storage account called bsctest1.


Once registered, the account will show up on the left menu of the main screen, under Azure Storage. I am showing you the bsctest1 Storage Account below. As you can see, there are a few Azure Tables showing up, and a few buttons on top: Backup, Open in Excel and Explore.


Let’s also register a database server to export to, so that we can quickly select it later (note: this step is optional). To do this, go to Connection –> Data Store –> Register –> Database Server, and enter the name of the database server along with credentials. Click OK to save the database connection information.


Exploring Azure Table Data

Let’s first explore the content of the enzodebug Azure Table. Click on the enzodebug table, then click on Explore. This will open up a browser showing up to 1,000 entities at a time. You can enter a list of column names separated by commas, and a filter for the data, which will be applied once you click on Refresh. Clicking Next allows you to browse the next set of records. The browser allows you to quickly inspect your data, but you cannot export from here. To learn more about filters, visit this page on MSDN and search for the Filter Expressions section.
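Filter expressions follow the OData query syntax used by the Azure Table service (comparison operators eq, ne, gt, ge, lt, le, combined with and/or). A tiny illustrative helper for building them (the property values below are made up):

```python
def eq_filter(prop, value):
    """Build a simple OData equality filter, quoting string values."""
    if isinstance(value, str):
        value = "'{}'".format(value.replace("'", "''"))  # escape embedded quotes
    return "{} eq {}".format(prop, value)

# Combine simple comparisons with 'and' / 'or', as the Table service expects.
f = eq_filter("PartitionKey", "sales-2016") + " and RowKey lt '1000'"
print(f)  # PartitionKey eq 'sales-2016' and RowKey lt '1000'
```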


Exporting to Excel

Another available feature is exporting Azure Table data to Excel directly from Enzo Cloud Backup. Back on the main screen, click on the Open in Excel button. This will open a screen giving you a few options: you can provide a list of properties to return (all by default) and a filter for the data (none by default). You can also choose a Table Hint, which provides much faster data retrieval when the PartitionKey contains GUIDs or random numbers. Select Optimize Search for GUID Values when your PartitionKey has random values.


When you are ready, click on Export. A progress indicator will tell you how far you are in the load process in Excel.


Once the download of the data is complete and Excel has all the data, you will see your data in Excel as expected. Depending on how many records you are exporting, and the PartitionKey strategy selected, this operation may take some time.


Exporting to SQL Server or Azure SQL Database

You can also export this data to SQL Server, or Azure SQL Database just as easily. Because of data type constraints and other considerations, a few more options are available to you.

From the main window, with the table still selected, click on Backup. Note that unlike the two previous options, this approach allows you to export multiple tables at once. The Backup screen will show up as follows:


If you would like to use the GUID strategy as discussed previously, you can do so under the Default Strategy tab:


From the General tab, click on Add Backup Device. A panel will be added to the list of destinations. Choose SQL Server or SQL Database from the Destination dropdown list, and provide the connection credentials. In this example, I am also creating a new database (TestExport) with a default size of 1GB (this is important; if your data needs more than 1GB of space, you need to change the database size accordingly or the export will fail). (Note: if you did not register a database server previously, you can type the server name and the user id/pwd fields by hand here.)


In the Data Import Options you can change a few settings that dictate the behavior of the export process when unexpected data conditions occur. I chose to create missing tables, and to add Error Columns if one or more columns cannot be loaded into SQL Server (this allows you to load the data even if some columns fail to load).


After you click the Start button, you can see the progress of the export by looking at the Tasks.


Once completed, we can view our data. Using SQL Server Management Studio (SSMS), let’s connect to the database where the export has occurred. Make sure to pre-select the database name on the Connection Properties tab if you are connecting to an Azure SQL Database.



Once logged in, simply select from the table:


Note that three fields were added automatically in my export (these fields are only created if there are data errors during the export and you selected the Add Error Columns option earlier): __ENZORAWXML__, __ENZOERRORID__, and __ENZOERRORDESC__. The error tells me that one of the columns could not be exported because of a name collision: a TimeStamp column (date/time) already exists. In XML (the underlying storage format of Azure Tables), property names are case sensitive; in my case, each entity has both a Timestamp and a TimeStamp property (note the case difference). By default, however, SQL Server column names are not case sensitive, so it is not possible to create both fields in the same table. While the extra TimeStamp column was not created, the __ENZORAWXML__ field contains the actual value of the field, in XML, so you can still inspect it there.
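If you want to isolate the rows that triggered export errors, you can filter on these error columns directly. A minimal sketch, assuming the export created a table named WebData in the TestExport database (substitute the actual table name your export produced):

```sql
-- Hypothetical table name (WebData); substitute the table created by your export.
-- The __ENZOERRORID__ column is only populated for rows that had a data error.
SELECT [__ENZOERRORID__], [__ENZOERRORDESC__], [__ENZORAWXML__]
FROM [TestExport].[dbo].[WebData]
WHERE [__ENZOERRORID__] IS NOT NULL;
```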



As shown in this blog post, Enzo Cloud Backup is a tool that allows you not only to back up Azure Storage content, but also to easily browse and export Azure Tables for further analysis in Excel and SQL Server / Azure SQL Database.

About Herve Roggero

Herve Roggero, Microsoft Azure MVP, @hroggero, is the founder of Blue Syntax Consulting ( Herve's experience includes software development, architecture, database administration and senior management with both global corporations and startup companies. Herve holds multiple certifications, including an MCDBA, MCSE, MCSD. He also holds a Master's degree in Business Administration from Indiana University. Herve is the co-author of "PRO SQL Azure" and “PRO SQL Server 2012 Practices” from Apress, a PluralSight author, and runs the Azure Florida Association.

Sharing Tables Between Two or More SQL Servers using Couchbase

In this blog post I will show you how you can allow SQL Server to tap into Couchbase directly using regular SQL commands, and use Couchbase Buckets to share data between two or more SQL Server databases.

SQL Server is a powerful database server when it comes to relational data structures for cloud and enterprise applications, and Couchbase is becoming a leading platform for NoSQL distributed databases offering extreme scale and simple geo replication. Many application developers want the benefits of relational data storage and the scale and ease of use of a NoSQL data store. But without the right tools, it is very difficult to combine both technologies seamlessly without creating data silos. In addition, while many DBAs will manage SQL Server effectively, they are usually unable to assist with data integration projects that need to bring both databases together.

In some organizations, it is important to be able to share data between multiple SQL Server databases. This is usually done with built-in features such as Replication or Log Shipping. However, these options can be cumbersome to implement and have specific limitations, such as requiring compatible database versions on both SQL Server instances. More importantly, these features are not as effective when the two databases are geographically separated, when both databases are considered read/write, or when the two databases cannot connect to each other directly.

How It Works

In order to understand how Couchbase and SQL Server can work together, we must first explore the essential differences between the two systems. While both Couchbase and SQL Server are database servers, Couchbase is traditionally used as a document-centric platform storing data as JSON documents accessed using application code, while SQL Server stores data in a proprietary format that is accessed using the SQL language from applications or by analysts running reports.

One of the major differences between the two systems is the format of the data being stored. In Couchbase, data is stored as a JSON document structure, which can have properties, arrays, and nested objects. This means that some JSON documents have a depth in their data structure that is not easily represented using rows and columns. SQL Server, on the other hand, stores and serves data in rows and columns, and as a result is not a friendly data store for JSON documents. Consequently, in order to exchange data between the two database servers, a transformation needs to take place between JSON documents and rows and columns.

In addition to the storage format, Couchbase uses the REST protocol to serve data; REST commands are sent as HTTP requests. This differs from SQL Server, which serves data using a proprietary protocol called TDS (Tabular Data Stream). This difference in protocols means that SQL Server cannot natively connect to a Couchbase server.

Enzo™ Unified is designed to solve the data structure and protocol differences outlined above, providing SQL Server a bridge to communicate natively with Couchbase in real-time, using Linked Server (a feature of SQL Server allowing remote querying of other database systems). With Enzo Unified, SQL Server can connect to Couchbase, fetch data, and modify data inside Couchbase using the SQL language, as if Couchbase understood the TDS protocol and the SQL language of SQL Server.

From an installation standpoint, Enzo Unified is a Windows Service that understands the TDS protocol natively; as such it can be co-hosted with SQL Server on the same machine, or installed on a separate server altogether.


The ability to communicate to Couchbase using the SQL language directly from SQL Server creates unique opportunities to build real-time solutions with less complexity. Let’s take a look at an example on how useful this can be.


Let’s assume we have two SQL Server databases installed at two physical locations that cannot connect directly over the network: one in the USA and one in Europe; these two SQL Server databases hold website users that have previously registered, and we want SQL Server to keep its own copy of locally registered users: the US database will hold users from North America, and the European database will hold users from the old continent. This design provides good performance because each website has its dedicated database in the same geographic region.

However, when a new user registers, we want to make sure that the email address provided is unique across all users worldwide: a new user should not be allowed to register with an existing email address. For example, if a user from Europe attempts to register on the US website with an email address that already exists, the website should prevent the creation of a new account.

In this example we will be using Couchbase to store a copy of the web user records centrally. Because Couchbase is a distributed database, it can be installed both in the USA and in Europe; its buckets can easily be replicated across geographic regions automatically with simple configuration settings (replication in Couchbase is suggested here to improve the performance of requests, but it is not technically necessary). To simplify our diagram, we will show Couchbase as a single unit; however, Couchbase is best installed with multiple instances for performance and availability. An instance of Enzo Unified is installed in each geographic location; Enzo Unified will serve as a real-time bridge between SQL Server and Couchbase, and present the Couchbase bucket as a table to SQL Server. Finally, let’s assume that Couchbase is installed in the cloud so that it can be accessed from both locations (see my blog post on how to install Couchbase in Windows Azure).


Accessing Couchbase From SQL Server

In order to share user data across multiple SQL Server databases, SQL Server must first be able to read and write to Couchbase. This is done through Enzo Unified, either by connecting directly to it using ADO.NET or SQL Server Management Studio (SSMS), or through SQL Server by registering Enzo Unified as a Linked Server. The default bucket from Couchbase is used to store web users: the document id will contain the user's email address, and additional properties are saved (first name, last name, and age). As you can see below, Couchbase contains a single document for this user.


Let’s fetch this record from SSMS.

First, we need to register the schema of this document so that Enzo Unified can represent it as a database table (in Enzo, tables are not used to store data; they provide a bridge to underlying services, so they are called virtual tables). The following command (‘createclass’) creates a new virtual table called ‘webusers’; notice the list of columns we define for this virtual table (id, type, firstname, lastname and age). We will be using the column ‘id’ as the email address of the user, and the type property will be hard-coded as ‘user’.

exec couchbase.createclass 'webusers', 'default', 'string id,string type,string firstname,string lastname,int age'

Once the virtual table has been created, it can be accessed as a regular SQL table (called ‘webusers’), and through stored procedures (insertwebusers, deletewebusers, updatewebusers and selectwebusers).

Next, we will register Enzo Unified as a Linked Server to allow triggers, stored procedures, and views to access Couchbase directly. Generally speaking, it is not required to create a Linked Server against Enzo Unified; a developer can connect directly to Enzo Unified using ADO.NET, for example. However, a Linked Server is necessary for SQL Server to access Couchbase natively (from triggers, for example). Once the Linked Server has been registered, we can access the Couchbase documents with a simple SQL syntax; for example, the following SQL command could be part of a trigger that checks whether the email address provided already exists:
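Registering a Linked Server can be scripted with the standard sp_addlinkedserver procedure. The sketch below is illustrative only: the linked server name (ENZO), the address and port of the Enzo Unified instance, and the four-part name of the virtual table are all assumptions; consult the Enzo Unified documentation for the exact provider settings.

```sql
-- Assumptions: 'ENZO' (linked server name), 'localhost,9550' (Enzo address/port)
-- and the four-part table name below are placeholders, not documented values.
EXEC sp_addlinkedserver
    @server = N'ENZO',
    @srvproduct = N'',
    @provider = N'SQLNCLI',       -- Enzo Unified speaks TDS natively, so the
    @datasrc = N'localhost,9550'; -- SQL Server native client provider can be used

-- Check whether an email address already exists in the Couchbase bucket:
SELECT id FROM [ENZO].[bsc].[couchbase].[webusers]
WHERE id = 'user@example.com';
```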


This record is accessible from both locations (USA and Europe). The performance of this call is primarily tied to the network bandwidth; very little information is actually transmitted over the network. This command ran in less than one second in the US with Couchbase installed in Windows Azure and SSMS running from my office.

When a new web user registers from either website, a trigger could simply call a stored procedure on Enzo Unified to insert a record in Couchbase. This stored procedure was created during the ‘createclass’ call made previously. An INSERT command on the ‘webusers’ table is also possible.
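As a sketch of what such a trigger might look like (the local table dbo.WebUsers, its columns, and the ENZO linked server name are all assumptions, not part of the original lab):

```sql
-- Hedged sketch: table/column names and the 'ENZO' linked server are assumptions.
CREATE TRIGGER trg_WebUsers_Insert ON dbo.WebUsers
AFTER INSERT
AS
BEGIN
    -- For simplicity this handles a single inserted row;
    -- a production version should iterate over the 'inserted' table.
    DECLARE @email NVARCHAR(255), @first NVARCHAR(100),
            @last NVARCHAR(100), @age INT;
    SELECT @email = Email, @first = FirstName, @last = LastName, @age = Age
    FROM inserted;

    -- Call the stored procedure generated by 'createclass' via the Linked Server
    EXEC [ENZO].[bsc].[couchbase].insertwebusers @email, 'user', @first, @last, @age;
END
```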


At this point, we have two users registered. Note that SQL Server can read and write to Couchbase, so it is possible to delete and update records just as easily.

Retrieving Data Using Couchbase Views

An interesting feature of Couchbase is the ability to create Views, which are a projection of a subset of documents that can be accessed for reading. Enzo Unified supports Couchbase views as well.

Let’s create a new view in Couchbase, which returns all email addresses registered; the Couchbase view itself is a JavaScript function that checks for the type of document (a ‘user’ document), and returns the ‘id’ of the document (which is the email address). In our current test, we only have one kind of document (a user); however it is good practice to include a field that describes the kind of document you are dealing with.


Then in SSMS, we simply declare the view as being available in Enzo Unified. After connecting directly to Enzo Unified, you declare the view as such:

exec couchbase.defineview 'emails', 'default', 'emails', 'emails'

The syntax of the ‘defineview’ command in Enzo is beyond the scope of this blog; this command essentially creates a new virtual table called ‘emails’ in Enzo, which we can now query directly. This virtual table is read-only (the associated insert, update and delete commands are not created). You may notice that we are not using a Linked Server in the following SQL call; this is a direct, native call to Enzo Unified from SSMS, so the registered Linked Server is not needed because the call is not made by SQL Server.
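For example, a direct query against the new virtual table might look like this (the ‘couchbase’ schema prefix is an assumption based on the ‘couchbase.defineview’ naming above; the exact naming may differ in your installation):

```sql
-- Direct call to Enzo Unified from SSMS; no Linked Server is involved.
-- 'emails' is the read-only virtual table created by 'defineview'.
SELECT * FROM couchbase.emails;
```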



This blog introduces you to the fundamental differences between SQL Server and Couchbase, and how Enzo Unified for Couchbase can enhance applications by leveraging the two database platforms. Enzo Unified enables SQL Server to tap into the power of in-memory tables provided by Couchbase, and Couchbase buckets can be updated directly through database events to keep data synchronized. In this example, I showed you how data can be shared by two geographically separated SQL Server databases through native SQL commands to enable data validation without complex replication or synchronization jobs.


About Enzo Unified

Enzo Unified is a data platform that helps companies reduce development and data integration project timelines, improve data quality, and increase operational efficiency by solving some of the most complex real-time data consumption challenges. For more information, contact, or visit

Configuring and Using the Redis Azure Cache

In this blog post I will show you how to configure and use a cloud caching service called the Azure Redis Cache. This caching service makes it easy to store information in the cloud to save temporary data, or share data with other services using nothing more than a shared Internet address.

Creating the Azure Redis Cache Service

The first step is to create the Redis cache service itself. For this, you will need an Azure account; log in to the Azure management portal ( To create a new Redis cache, simply browse the available list of services (click on Browse) and find the Redis Cache service. Provide the name for your cache, which will be used as part of the Internet address of your service (the DNS name). In the example below, I am creating the cache as “bsctest”, and as a result the DNS address for my service will be “”. Also note that you will need to select a pricing tier; I picked the C0 Basic pricing tier. WARNING: Make sure to click on the “Select” button at the bottom of the screen for the pricing tier change to take effect. Then click the “Create” button to create the service; this may take a few minutes.


You will soon need the Access Key for the sample code below. To find your access key, go back to the Redis Cache (you can either go back to Browse All, or use the shortcut that was created for you on the main dashboard). From there, click on the newly created cache, and in the Settings window, click on Access Keys. You will only need one of those keys; click on the icon next to the access key and save it somewhere handy.


Using the Azure Redis Cache Service

Once the cache service has been created, using it is a breeze. All we have to do is create a .NET application, reference the Redis client libraries, and interact with the cache data.

First, let’s create a sample Console application in Visual Studio 2013 using C#.


Then right-click on the project name (in the Solution Explorer window) and click Manage NuGet Packages. From there, click on the Online link on the left, and type Redis in the search box. We will use the StackExchange.Redis package; select it and click on Install. You can now close the NuGet window and return to your project.

Note that you may get an exception if the cache hasn’t been created just yet, so make sure the cache service is running before proceeding.

Let’s connect to the cache and store a simple string for our test. First, connect to the Redis Cache using the Connect method on the ConnectionMultiplexer object; then use the StringSet and StringGet methods to write and read data.

Here is the complete code for this simple test:

using System;
using StackExchange.Redis;

class Program
{
    static void Main(string[] args)
    {
        // Replace these placeholders with your cache's DNS name and access key
        string name = "YOUR_CACHE_NAME.redis.cache.windows.net";
        string key = "YOUR_ACCESS_KEY";

        var connection = ConnectionMultiplexer.Connect(
            string.Format("{0},connectTimeout=5000,abortConnect=false,ssl=true,password={1}", name, key));

        var db = connection.GetDatabase();
        Console.WriteLine("Setting a cache value");
        db.StringSet("firstname", "James");

        Console.WriteLine("Reading from the cache now...");
        var val = db.StringGet("firstname");

        Console.WriteLine("  value: " + val);
    }
}
When you run the above code, you should see something like this.


Et voila! The Redis cache offers many features; the example above is only a test, so I invite you to investigate the caching service more in depth before jumping into production with it.


How To Install Couchbase In Microsoft Azure For Test Purposes

In this blog, I will show you how to install and configure an instance of Couchbase (, a powerful NoSQL database, on a Microsoft Azure A0/A1 server configuration with the smallest installation of Couchbase possible, for limited development and test purposes. Note that such an installation is not recommended for production loads; it is only intended to provide the cheapest instance possible in Azure that can be used to explore the features of Couchbase.

Step 1: Create a Virtual Machine in Microsoft Azure


Let’s create a new Virtual Machine (Windows Server 2012 R2 Datacenter) in Microsoft Azure. First, you need to log in to Microsoft Azure ( Click on the Virtual Machine tab on the left, and make sure to select a Basic A2 size for the Virtual Machine, as this will speed up the installation of Couchbase. Selecting an A0/A1 now will likely not work, since one of the installation steps is highly CPU intensive. We will change the size of the VM back to an A0/A1 after the installation of Couchbase is finalized.


Once the Virtual machine is running and ready to be configured with Couchbase, click on Connect inside the menu option located to the right of the Virtual Machine in Azure. This will open up a Remote Desktop Connection screen which will allow you to log into your machine with the administrative credentials you just provided.


Once logged in, turn off Enhanced Security Configuration (ESC) for administrators so that you can more easily browse to the Couchbase website and download the installation package. To turn off ESC, start Server Manager and click on the Local Server tab on the left; find Enhanced Security Configuration in the list of properties, and click on the On link, which will prompt you for additional choices. Select Off for Administrators, and click OK.



Step 2: Install Couchbase

Open your browser and navigate to the Couchbase website to download the installation program. Click on the Download button, and choose Couchbase Server (which expands the page with further download options). Make sure to click on the Community Edition button since we are testing Couchbase, then click on the Download link. A download screen will open; click Save As, and choose your D: drive to save the installation package.



Once downloaded, open up Windows Explorer and select your D drive. You should see the installation package; on my machine, the file name is couchbase-server-community_4.0.0-rc0-windows_amd64.exe, but yours might be different depending on the release you downloaded. Right-click on the Couchbase installation package, select Properties, click on the Unblock button (so that you can run this EXE in the next step), and click OK.


Double-click on the Couchbase installation program you just unblocked. When the installation Wizard starts, click Next, accept all default configuration settings and click Install. The installation process is rather quick, and within a few minutes Couchbase is configured.

A warning about the local windows firewall will come up; click on Yes to disable the Windows Firewall.


Once the installation is complete, you will see this screen showing a successful installation, then click on Finish.


Once you click on Finish, the default browser will open, pointing to the Couchbase Console. The console is used to manage your Couchbase configuration; a shortcut is also added on the Windows Desktop for convenience. Once the browser has opened the Couchbase Console, click on the Setup button.

Step 3: Configure Couchbase



Because we are creating a simple configuration for Couchbase, you can leave most of the default settings as provided. Because we want to run Couchbase on an A0 machine, with only 768MB of total RAM, change the Data RAM Quota to 256MB (the minimum allowed) if you plan to downsize to an A0 later, or 300MB (or higher) if you plan to use an A1 size or higher, and set the Index RAM Quota to 256MB. The new Couchbase cluster will be configured on the local machine.

NOTE: Do not change the path of the local directories to point to the D drive; the D drive is considered volatile and only used for temporary data, which could render Couchbase unusable upon restart of the machine.

Clicking Next will create the configuration of the Couchbase server.


Next, choose zero or one of the sample buckets if you plan to downsize to an A0 later, or any number of sample buckets if you plan to use an A1 or higher, and click Next. A bucket is equivalent to a database and will take no less than 100MB of memory. If you plan to use an A0 Virtual Machine, you should limit your selection to a single bucket because your VM will be low on memory.

Once you are ready to proceed, click Next.


In this screen we configure the default bucket. As mentioned before, the minimum size for a bucket is 100MB, so let’s allocate 100MB per node for the RAM Quota. Leave the other options to their default values.


Uncheck Software Update Notifications for this test installation, and click Next.


In the last step of the configuration wizard, you specify a user name and password for the Server itself. Leave Administrator as the user name, and choose a password. We will need this information later. Click Next to complete the configuration of Couchbase.


Step 4: Explore Couchbase Locally


Once completed, you will see the administration console of Couchbase, including the RAM allocation and disk usage overview. The sample buckets begin to load immediately but could take a few minutes to finalize.


Click on the blue arrow next to the Data Buckets tab to inspect your buckets’ summary data. This gives you a quick overview of the RAM allocation for this bucket, and the storage size on disk. Clicking on the bucket name itself provides detailed utilization metrics.


Clicking on the Documents button allows you to inspect your data. Note that you may need to wait a little while, and possibly refresh your browser before you can inspect the data. The browser for the data is simple, and gives you a short list of JSON documents at a time; you can search for specific documents and browse the list page by page.


Step 5: Finalize Configuration In Azure


We installed Couchbase on an A2 Virtual Machine so that the installation process could complete. Once the data has completely loaded (this will take a few minutes), you can resize the Virtual Machine to an A1, or an A0 if you so choose; note that an A0 is barely able to run Couchbase, so you should select this option only if you want to minimize your monthly bill and are willing to wait a while for your requests to complete. To resize, go back to the Azure portal, find your Virtual Machine, and change the size accordingly by clicking on Basic A0 and then the Select button at the bottom. Resizing the Virtual Machine will take a few minutes.


While you are on the configuration screen of your virtual machine, note down your VM’s IP Address; we will need this later to test the cluster.

Last but not least, we need to open up certain ports in Azure for the Virtual Machine to be accessible to external clients; this is not a required step if your test client will be inside another Virtual Machine in Azure, but since my test machine is at home, I need to open up those ports.

Normally we need to open up two sets of ports: those on the operating system (the Windows Firewall), and those in Azure. Since we turned off the Windows Firewall earlier, we only need to open up the ports in Azure. The list of ports to open can be found in the Couchbase documentation; we will open TCP ports 8091, 8092, 11210, and 11211. To do so, from your Virtual Machine dashboard in the Azure Management Console, click on the Endpoints setting, and add the above ports.

NOTE: Port 8091 is used for the web configuration console; you should only need to open up this port if you want to configure Couchbase remotely.


Once all the TCP rules are added you should see a configuration similar to this:


Now, let’s try accessing Couchbase from a remote browser. Since I opened port 8091 for web administration, I am able to manage Couchbase remotely from my workstation. The URL for the Couchbase admin console is your IP address followed by the port number and the Index page, as such: HTTP://YOUR_SERVER_IP:8091/Index. This will bring up a login screen, where you enter Administrator as the user name (it’s case sensitive) and the password you specified earlier in the configuration wizard.


You should also note that we did not secure the server with an SSL certificate; as a result all traffic is unencrypted and could be inspected on your network in clear text, including the password.

Once logged in, as expected, I can see my Couchbase web management interface and manage my Couchbase installation from my workstation.


And that’s it! You have successfully installed and configured Couchbase on a single node, on a small Virtual Machine in Microsoft Azure for testing purposes.


Developers, Get Ready for Windows 10

The wave of Windows 10 development has already started. With many cool features, and the Universal Windows Platform (with APIs that are guaranteed to be present on all devices running Windows 10), you can start your engines and kick the tires! Here are a few resources you may find useful:

Visual Studio 2015 Download:

Here is an overview of what’s new with Windows 10 for developers:

Here is how to get started with Windows 10 app development:

Last but not least, here are a few videos on the Microsoft Virtual Academy you should check out:

Let’s get started!