About the new Microsoft Innovation Center (MIC) in Miami

Last night I attended a meeting at the new MIC in Miami, run by Blain Barton (@blainbar), Sr. IT Pro Evangelist at Microsoft. The meeting was well attended and is meant to run in a casual, user group format. Many of the local Microsoft MVPs and group leaders were in attendance as well, which gives technical folks a chance to connect with community leaders in the area.

If you live in South Florida, I highly recommend looking out for future meetings at the MIC; most meetings will be about the Microsoft Azure platform, covering both IT Pro and Dev topics.

For more information on the MIC, check out this announcement: http://www.microsoft.com/en-us/news/press/2014/may14/05-02miamiinnovationpr.aspx.

About Herve Roggero

Herve Roggero, Microsoft Azure MVP, @hroggero, is the founder of Blue Syntax Consulting (http://www.bluesyntaxconsulting.com). Herve's experience includes software development, architecture, database administration and senior management with both global corporations and startup companies. Herve holds multiple certifications, including MCDBA, MCSE, and MCSD. He also holds a Master's degree in Business Administration from Indiana University. Herve is the co-author of "PRO SQL Azure" and "PRO SQL Server 2012 Practices" from Apress, a PluralSight author, and runs the Azure Florida Association.

Azure Table JSON vs. AtomPub

A few months ago Microsoft released an update to the Microsoft Azure platform allowing developers to access Azure Tables using JSON payloads. Until then, developers had no choice but to use the AtomPub model; since it was the only data representation available, its use was implicit. With the introduction of JSON payloads, developers can now choose one or the other. Although the choice between AtomPub and JSON is largely transparent (there are, at this time, a couple of minor exceptions), meaning there are no functional differences, developers should use the JSON payload going forward because it is significantly smaller. Take a quick look at the payload examples in the MSDN documentation.

To use the JSON payload, developers can set the PayloadFormat property as follows (tableClient is an instance of CloudTableClient):

  tableClient.DefaultRequestOptions.PayloadFormat = TablePayloadFormat.JsonNoMetadata;

In order to see how much data actually flows through, Microsoft published a small test application that sends a few entities to an Azure Table. The source code provided by Microsoft can be found on that page, so it's easy to try out. However, I wanted to see this in action and use Fiddler to inspect the actual payload leaving my machine. So I took the general idea from Microsoft's sample code and modified it to create my own test harness: a new Console application built with Visual Studio 2012, with the Windows Azure Storage 3.0 library added as a NuGet package. You can download my test harness here. The application saves log data in a test table called LogTest; you can change the 'count' variable to add more records if you want.
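
Here is a minimal sketch of what that harness looks like (the LogEntity class, the message text, and the connection string placeholder are illustrative, not the exact code from the download):

  using System;
  using Microsoft.WindowsAzure.Storage;
  using Microsoft.WindowsAzure.Storage.Table;

  // Illustrative entity type; any class deriving from TableEntity works.
  public class LogEntity : TableEntity
  {
      public LogEntity() { }
      public LogEntity(string partitionKey, string rowKey) : base(partitionKey, rowKey) { }
      public string Message { get; set; }
  }

  class Program
  {
      static void Main()
      {
          var account = CloudStorageAccount.Parse("<your storage connection string>");
          var tableClient = account.CreateCloudTableClient();

          // Switch from the default AtomPub format to the lighter JSON payload.
          tableClient.DefaultRequestOptions.PayloadFormat = TablePayloadFormat.JsonNoMetadata;

          var table = tableClient.GetTableReference("LogTest");
          table.CreateIfNotExists();

          int count = 100; // change 'count' to insert more records
          for (int i = 0; i < count; i++)
          {
              var entity = new LogEntity("logs", i.ToString()) { Message = "Test entry " + i };
              table.Execute(TableOperation.Insert(entity));
          }
      }
  }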

I used Fiddler to capture the tests. The first test was to insert 100 entities in my Azure Table using JSON. The second test was to read those entities back using JSON. The third was to insert 100 entities using AtomPub and the fourth was to read them back using AtomPub. Here are the results of those tests:

                         JSON Payload             AtomPub Payload            Payload Savings
  Insert 100 Entities    97,700 bytes sent        155,870 bytes sent         -37%
  Read 100 Entities      40,550 bytes received    122,578 bytes received     -67%

In my tests, the JSON payload reduced network traffic by 37% when inserting data (bytes sent), and by about 67% when reading data (bytes received). Note that there was virtually no improvement in execution time; the advantage of JSON is limited to smaller network payloads. Still, this is a significant difference: considering that extracting data out of Microsoft's data centers costs money, companies that read lots of data from Azure Tables could realize real cost savings by using JSON payloads.

Discover the API Management Service in Microsoft Azure

Microsoft recently published an API Management service within Microsoft Azure. If you haven't had a chance to review this new feature, still in preview at the time of this writing, you may be surprised… So what is this new feature? Who would use it, and why?

The API Management service offers interesting capabilities in at least two scenarios: transforming existing web services into more modern REST APIs, and leveraging an advanced API management platform.

If you want to move your older public APIs, such as web services, into the era of mobile computing, complete with monitoring, servicing and security management, you are in luck… Azure's API Management service provides a remarkably easy way to build a service proxy that hides older public APIs; it allows you to define, individually, the REST methods you want to expose and how they map to your older web services. The service provides a user interface for defining each API verb and the backend web service endpoint that serves it.

In addition to providing a mechanism to upgrade existing web services to REST services without writing a single line of code, API Management offers a strong management interface allowing you to specify caching options, access keys, access control and monitoring for your REST endpoints.
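
As a simple illustration, here is a short C# sketch of what calling a method exposed through the proxy could look like. The endpoint URL, the orders operation, and the subscription key are hypothetical placeholders; the Ocp-Apim-Subscription-Key header is the one the service uses for access keys:

  using System;
  using System.Net.Http;

  class ApiManagementCallDemo
  {
      static void Main()
      {
          using (var client = new HttpClient())
          {
              // API Management authenticates callers with a subscription key header.
              client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", "<your subscription key>");

              // The proxy receives this REST-style call and forwards it to the older web service.
              var response = client.GetAsync("https://contoso.azure-api.net/orders/12345").Result;
              response.EnsureSuccessStatusCode();
              Console.WriteLine(response.Content.ReadAsStringAsync().Result);
          }
      }
  }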

The API Management service is designed to let companies upgrade their traditional web services to REST services without changing their code base. It also gives small and large organizations a management platform that a simple web service upgrade would not provide out of the box, such as caching and monitoring.

Although API Management is still in preview, I highly recommend you investigate the capabilities of this amazing new feature in Microsoft Azure. To learn more, please visit the Microsoft Azure website: http://azure.microsoft.com/en-us/documentation/articles/api-management-get-started/

Microsoft Azure News: Capturing VM Images

If you have a Virtual Machine (VM) in Microsoft Azure with a specific configuration, it used to be difficult to clone that VM: you had to sysprep the VM and clone its data disks. This was slow, error-prone, and kept you from being productive.

No more!

A new option, called Capture, allows you to easily select a VM, running or not, and automatically copy its OS disk and data disks into a new image. This means you can now easily clone an entire VM without affecting productivity. To capture a VM, simply browse to your Virtual Machines in the Microsoft Azure management website, select the VM you want to clone, and click the Capture button at the bottom. A window will come up asking you to name your image. It took less than one minute for me to build a clone of my server.

And because it is stored as an image, I can easily create a new VM from it. So that's what I did… and that took about 5 minutes total. That's amazing… To create a new VM from your image, click the NEW icon (bottom left), select Compute/Virtual Machine/From Gallery, and pick My Images from the left menu when selecting an image. You will find your newly created image there. Because this is a clone, you will not be prompted for a new login; the user ID/password is the same.

Awesome event: //publish/

Did you hear about //publish/? It's a pretty cool event organized by Microsoft and supported by the community to help you finalize your Windows apps, including Windows 8 and Windows Phone apps. Major prizes are at stake too! You can get free support, testing and even free consulting time with a Microsoft engineer! This event is designed to help you overcome the final blockers and publish your apps.

Hurry up! Visit https://publishwindows.com, find a location near you, and sign up. It’s that easy.

Here is a good post about this event with more details: MVP Guest Post

In the cloud, and back.

Unfortunately, not all projects that try to adopt cloud computing are successful. Some are doomed from the start, while others limp along and eventually succeed. Although there are many successful cloud projects on which businesses depend daily, this post is about an actual adoption failure, and it attempts to present the facts that led to this failure. The company in question is very large, does business at a national level, and is highly seasonal; cloud computing was therefore very attractive for its ability to scale (up and down) and to use multiple data centers for failover and high availability. However, as right as the high-level alignment was, many early warning signs were ignored, and the company ultimately redid almost its entire technical architecture planning away from cloud computing. At least for now.

High Availability Requirements

While high availability is a major selling point of cloud computing vendors, including Microsoft Windows Azure, it is slightly overrated, as some organizations are starting to find out. Indeed, if you read the fine print, high availability is usually expressed per month. A 99.9% monthly availability still allows roughly 43 minutes of downtime every month, and those minutes can fall at the worst possible time; over a full year, that guarantee is in fact relatively weak. While this isn't a significant problem for most organizations, it can spell trouble for a highly seasonal business. If you make 80% of your income within 60 days of the year, your systems had better run at 99.99% or better availability during those 60 days. You just can't afford downtime.

Too Early

While cloud computing has matured significantly over the last year or so, this project relied on very early versions of the Microsoft Windows Azure platform, which at the time only offered Platform as a Service capabilities. While staying on the bleeding edge is important for companies that want to remain competitive, this customer couldn't be successful in that environment. Too many workarounds were implemented and too many unproven cloud architecture patterns were selected; the lack of Virtual Machines and permanent storage disks was a significant burden for this project. The company simply tried to adopt cloud computing too quickly, without starting with smaller projects to build up its knowledge capital.

Bad Taste

Last but not least, the failed cloud adoption left a bad taste with parts of the management team, making it difficult to justify even cloud implementation patterns that are now perfectly valid. This is a shame, because the company now has difficulty thinking beyond its current data center boundaries, for fear of another failure. Indeed, who would walk into a management meeting and propose cloud computing to this customer at this time? Timing, indeed, is of the essence.

While it could take time for this customer to look back and extract the lessons learned from this significant adoption failure, the experience could very well help in unexpected ways the next time around. With a clearer understanding of the benefits of cloud computing, and of some of its adoption challenges, this company will undoubtedly build a more thoughtful approach to cloud adoption over the next few years and, in the end, build better customer solutions. And although I don't wish this kind of adoption pain on anyone, it may be a necessary evil for some corporations while cloud implementation patterns become better defined and more widely understood by management and technical teams alike.

Azure SQL Database = Long-Term Storage?

Here is an interesting concept that I would like to share. I have always looked at Azure SQL Database (the Microsoft PaaS relational database engine) as a first-class database server; and of course it is. But when you compare SQL Server to Azure SQL Database, it quickly becomes evident that SQL Server has more features, performs better, and has fewer limitations. And that makes total sense: SQL Server is a full-blown, configurable database platform, while Azure SQL Database is a limited version of SQL Server running in a shared server environment.

Some Key Differences

Let's first review a few key differences between SQL Server and Azure SQL Database. The following list is not exhaustive, but it covers the key variations that matter in the context of this blog post.

Performance

SQL Server is a highly scalable database server that can process hundreds or even thousands of requests per second, with very high concurrency levels and virtually unlimited throughput (at least, as much as the underlying server allows). For example, there are almost no limits on memory access or disk I/O, as long as the hardware allows it. While SQL Server has internal limitations, they usually far exceed those of Azure SQL Database. Some of the performance limitations of Azure SQL Database are implemented in the form of throttling, with specific error codes, to prevent a single user of a database from impacting other customers on the same server.
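
When throttled, a client is expected to back off and retry. Here is a minimal retry sketch in C#; I am assuming error 40501 ("the service is currently busy") as the throttling code to trap, and the documentation's list of throttling error codes remains the authoritative reference:

  using System;
  using System.Data.SqlClient;
  using System.Threading;

  class ThrottlingRetry
  {
      static void ExecuteWithRetry(string connectionString, string sql, int maxRetries = 3)
      {
          for (int attempt = 1; ; attempt++)
          {
              try
              {
                  using (var conn = new SqlConnection(connectionString))
                  using (var cmd = new SqlCommand(sql, conn))
                  {
                      conn.Open();
                      cmd.ExecuteNonQuery();
                      return; // success
                  }
              }
              catch (SqlException ex)
              {
                  // 40501: the service is busy (throttled); back off, then retry.
                  if (ex.Number != 40501 || attempt > maxRetries) throw;
                  Thread.Sleep(TimeSpan.FromSeconds(10 * attempt));
              }
          }
      }
  }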

Database Features

SQL Server also comes with many features beyond its relational engine, such as Linked Servers, Encryption, Full-Text Indexing, SQL Agent and more. As a result, with a couple of exceptions, it is fair to think of Azure SQL Database as a subset of SQL Server (basically, most of the relational engine). With Azure SQL Database you can create databases, tables and stored procedures, run most T-SQL statements, and build a complete database; however, some of the more advanced features are not available. On the other hand, one of Azure SQL Database's amazing features is the ability to create new databases quickly and easily, without having to worry about low-level configuration or to figure out which server the database will reside on.

Availability Features

Azure SQL Database has a significant advantage over SQL Server in the area of high availability, with 99.9% monthly uptime. While SQL Server offers configuration options that can exceed 99.9%, Azure SQL Database provides its availability by default, without any specific configuration. High availability is built directly into the service and doesn't require specialized knowledge to install or maintain.

Cost

Another important aspect of Azure SQL Database is cost: you pay for what you use. The larger the database, the higher the cost; and the longer you keep the database, the more you pay over time. There are no licenses to worry about, and if you create a database for 24 hours and then drop it, you pay for just 24 hours of uptime. In the US, a 1GB database costs about $9.99 per month for the entry-level editions, which is not very expensive.

A Parallel with Long-Term Storage Disks

Keeping the above information in mind, Azure SQL Database offers interesting capabilities that are difficult to achieve with SQL Server at a reasonable price. Specifically, the ability to programmatically (and quickly) create highly available databases is unparalleled. Let's draw a parallel with long-term storage disks: long-term storage is considered cheaper than hot disks and is usually slower, and its primary purpose is recoverability at a reasonable price. So if we accept 99.9% monthly availability at roughly $9.99 per month per 1GB of data, Azure SQL Database can be used to offload data that is not accessed very often and for which slower access times are acceptable.

This means that Azure SQL Database could be used to store temporary units of work (like batch processes), keep historical data away from the primary database, host temporary tables for reporting purposes, and more. And because SQL Server can communicate directly with Azure SQL Database through a Linked Server, the abstraction can be total from the perspective of an end user: stored procedures could read data directly from the cloud, or even merge it with local data, to provide a unified view of the information. Using Azure SQL Database as a long-term backend store for SQL Server seems to make a lot of sense for many scenarios.
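
Here is a hypothetical T-SQL sketch of such a setup; the server, database, table and credential names are placeholders:

  -- Register the Azure SQL Database as a linked server (names are placeholders).
  EXEC sp_addlinkedserver
      @server = 'AzureArchive',
      @srvproduct = '',
      @provider = 'SQLNCLI',
      @datasrc = 'yourserver.database.windows.net',
      @catalog = 'ArchiveDb';

  EXEC sp_addlinkedsrvlogin
      @rmtsrvname = 'AzureArchive',
      @useself = 'false',
      @rmtuser = 'youruser',
      @rmtpassword = 'yourpassword';

  -- A local stored procedure can now merge cloud history with local data.
  SELECT OrderId, OrderDate
  FROM AzureArchive.ArchiveDb.dbo.OrdersHistory;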

March Events in South Florida

Here are two events I will participate in soon: one on SQL Server, and the other on Windows Azure.

March 19, 6PM – SELECT * FROM Twitter

On March 19, at Carnival Cruise Lines in Miami, I will be speaking about a new concept: no-API. Buckle up and discover how SQL Server can take center stage when it comes to data movement and extraction. Data virtualization can empower the DBA to build real-time and batch solutions without developers, and to gain more control over data quality. You will see how DBAs can tap directly into social media, documents, cloud computing, Internet services and even internal systems like message queuing and legacy systems. You can register here: http://www.fladotnet.com/Reg.aspx?EventID=705. I hope to see you there!

March 29 – Windows Azure Global Bootcamp

On March 29, join us at NOVA University in Fort Lauderdale to talk Windows Azure, as part of a global event driven by the community. This all-day training will include Infrastructure as a Service, Platform as a Service and hands-on labs! Three experts will join me: Adnan Cartwright (@jinnxey), Shanavas Thayyullathil (from Microsoft), and Mir Majeed (@mirmajeed). Register here: http://www.eventbrite.com/myevent?eid=9720104093.

For other great events in South Florida, check out http://www.fladotnet.com.

The Magic of SQL: no-API

Once in a while, you get the unique opportunity to create something different. So different that it feels right, wrong, strange, wonderful and transformative all at the same time. So I decided to blog about a concept that is turning into reality. It has to do with the increasing complexity of APIs, and the magic of SQL. APIs are for developers; Web APIs, Web Services, XML, JSON… that whole ecosystem of documents that represent data is for computers. And that's fine. But it is clearly not for human consumption; at least, not unless you are a software engineer. Have you counted the number of Web APIs on the net? How many data protocols exist out there (such as WMI, FTP, SOAP…)? How many document types (such as Excel, PDF, flat files, zipped documents…)? It's actually mind-blowing… and even developers have a hard time keeping up with technology trends. For example, how long would it take a developer with 5 years of experience to correctly fetch, and page through, Twitter feeds?

On the other hand, we have a wonderful technology available: SQL. It can be used to read, delete, update, and add data, among other things. And with a little more work, you can even join multiple data sets together to bring new data sets to light. SQL is easy to use, easy to learn, widespread, and relatively standard across database vendors. So, how easy would it be to communicate with all these data sources if only they understood SQL? Yeah… as in “SELECT * FROM Twitter.Timeline”. Or for SharePoint: “SELECT * FROM SharePoint.mylist”… or even for Windows Azure Tables: “SELECT * FROM AzureStorage.MyTable1”. Would it be useful? To whom? And why?
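
To make the idea concrete, here is a hypothetical query in the same spirit (the Twitter.Timeline source and all column names are purely illustrative, not an actual product schema) that joins live tweets with a local table:

  -- Hypothetical no-API query: join live tweets with a local customer table.
  SELECT c.AccountManager, t.ScreenName, t.Text
  FROM Twitter.Timeline t
  INNER JOIN dbo.Customers c ON c.TwitterHandle = t.ScreenName;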

Honestly, I am shocked that this problem hasn't been solved before. Sure, there are a few ODBC drivers out there that will "pretend" you are talking SQL. But in reality these solutions are not server-based, and as a result they have severe drawbacks. I won't dive into the specifics right now, but suffice it to say that no one will use specialized ODBC drivers to access a central data service (like a SaaS platform, or an enterprise-class Data as a Service); it's too cumbersome. So I am talking about the need for a real server-based, database-like service that understands SQL on one end and hides the complexity of the APIs on the other.

In other words, a no-API solution.

Who cares?

Actually, many parties could be interested in an SQL-like paradigm. Here are a few: managers who know SQL (a surprisingly large number in IT), junior developers, DBAs, data architects, business analysts, SharePoint administrators, report writers, ETL developers, business intelligence vendors, and probably some enterprise architects as well, looking for technologies that simplify their ecosystems. And why do they care? Because they usually depend on developers to build custom solutions to access their own data…

Would a report developer be interested in accessing SharePoint Lists using SQL to build a report? Probably. No need to learn the SharePoint API anymore.

Would a business analyst be interested in getting Tweets into Excel for analysis because the Marketing department needs some competitive intelligence?  Possibly. And no need to learn the Twitter API.

Would a DBA be interested in saving table records in an FTP site directly using SQL? Probably. No need to code batch programs to do this.

The list goes on and on… I can see a large community of users interested in using SQL for daily tasks. It's easier, it has fewer dependencies, and because the underlying APIs are hidden, they become almost irrelevant. One language to access them all. At least, that's the vision.

But Why?

Ah… but why? Why oh why would anyone care? Because it's simpler. In fact, it's the simplest possible way of accessing data. Imagine what it would take to build a simple report that pulls data from SharePoint and another database. Simple, right? On paper, it's easy. But you need a large technology stack to build that report: an ETL tool to fetch data from both SharePoint and the other database; a web service wrapping the SharePoint API so the ETL tool can call it (and don't you dare tell me you can query the SharePoint database directly… you are not supposed to, and it's not even possible with SharePoint Online/Office 365); a temporary database to store it all; a job to refresh the data every 24 hours, perhaps. And finally you can build the report. Oh, and if you are a larger company, a DEV, TEST and STAGE environment where all of that is duplicated… We are talking weeks or even months of work here…

But if you could do it all using SQL, why even bother with an ETL tool, a temporary database, a job, or even a web service?  Just pull the data, join it on the report, and you are done! And on top of it, it’s real time!  Why? Because the API is virtualized as a data source. So the playing field is even for reporting tools to play nicely without the need to move data around.

Let's be careful: I am not saying a no-API platform replaces the need for ETL, web services or even temporary databases. It simply offers a much simpler alternative for the myriad of projects that don't need heavy infrastructure.

I am saying, however, that a large part of the corporate workforce would become more productive with easy access to the sea of information locked up behind APIs. In my humble opinion, developers are becoming a bottleneck for many parts of an organization because they are in short supply. So let's remove the bottleneck! Let's empower an entire workforce by giving them the magic of no-API through SQL.

So…

So here it is. In my opinion, APIs are getting too complex for many non-developers, yet the data they expose is too valuable to corporations to remain locked away. And SQL is the natural choice for these users. So let's give it to them.

This is a call to action. To learn more, check out our new website, http://www.bluesyntaxconsulting.com, and let us know what you think.

Backup Options for Windows Azure–Summary From Mike Martin

Mike Martin (@TechMike2KX, blog), Windows Azure MVP, did an awesome job of summarizing the various backup options for Windows Azure in this video. He first starts with an overview of redundancy in Windows Azure that covers Microsoft's SLA, then dives into the need for recovery in Windows Azure in general, including both IaaS and PaaS services. Mike covers multiple built-in solutions and even demos third-party tools, such as Red Gate's backup tool (Mike Wood, Windows Azure MVP, @mikewo, blog) and Zudio (Mark Rendle, Windows Azure MVP, @markrendle, blog). I highly recommend you take a look at this video for a quick overview of Windows Azure redundancy and recovery options.

Great job Mike!
