Scott Klein

Friday, August 31, 2012

Effective immediately, my blog is moving. My new blog URL is:

http://www.scottlklein.com

My new blog is hosted on Windows Azure using DasBlog. Over the next few weeks I will be migrating my existing posts over to the new location, and then I will remove this blog.

I'd like to thank GeeksWithBlogs for letting me host my blog on their fantastic site.

Scott


Wednesday, May 2, 2012

Microsoft is running a sweepstakes, giving away a $50 gift certificate each week through June 14. More info here: http://www.brianhprince.com/post/2012/05/02/Cloud-Cash-Contest.aspx

Thursday, April 5, 2012

Effective immediately, new compute and storage resource options are available when selecting a data center in the Windows Azure Portal: "West US" and "East US" are now options for Compute and Storage. SQL Azure options for these two data centers will be available in the next few months. The official announcement can be found here.

In terms of geo-replication:

  • US East and West are paired together for Windows Azure Storage geo-replication
  • US North and South are paired together for Windows Azure Storage geo-replication

These two new data centers are now visible in the Windows Azure Management Portal. Compute and Storage pricing remains the same across all data centers. Get started with Windows Azure through the free 90-day trial.


Monday, March 5, 2012

It is now possible to move your SQL Azure server from one subscription to another. Last week, the Windows Azure team added this new feature to the Windows Azure Management Portal. To move a server between subscriptions, you simply need to be an account administrator, co-administrator, or service administrator in both the source and target subscriptions.

Moving a SQL Azure server is simple: log in to the Windows Azure Management Portal, select the Database option in the Navigation pane, and then select the SQL Azure server you would like to move. Once the server is selected, you will see a new “Move Server” button in the Information pane for the server, shown in the following figure.

[Figure: the “Move Server” button in the Windows Azure Management Portal]

Clicking the “Move Server” button brings up the Move Server dialog, which lists the available target subscriptions to which the selected SQL Azure server can be moved. The list of target subscriptions comes from the active, available subscriptions in which you are an account administrator, co-administrator, or service administrator. Select the subscription you want to move the server to and click “OK”.

[Figure: the Move Server dialog]

After several seconds the portal will refresh, at which time you will see that your SQL Azure server has been moved and is now listed under the target subscription you selected in the Move Server dialog. It is important to note that moving the SQL Azure server does not actually move the server or its associated databases. Keep in mind that a SQL Azure server is not a physical server but a logical server with pointers to the physical databases and their replicas. The move simply re-associates your server from one subscription to another.

Also, moving a server between subscriptions may have an effect on your SQL Azure bill, as different subscriptions may be billed differently depending on the offer each subscription is associated with.


Tuesday, February 14, 2012

Some Background

The AdventureWorks database has been around for over a decade and is a staple among sample databases. The first version of the AdventureWorks database appeared in time for SQL Server 2000. Microsoft has been good at keeping the AdventureWorks sample database up to date as new versions of SQL Server are released. Case in point: SQL Server 2012 is at RC0, and yet you can already find a version of AdventureWorks for it (albeit one that really isn’t that different from the SQL Server 2008 R2 version). There are even multiple versions depending on your needs (Data Warehouse, LT, OLAP, etc.).

As a Corporate Technical Evangelist for SQL Azure, and still somewhat new to Microsoft, I was glad to see a version for SQL Azure. Added to CodePlex in late 2009, the current zip file, AdventureWorks2008R2AZ, contains an install for two databases based on the AdventureWorks database: a small data warehouse database and a small, light version of the full AdventureWorks database. However, neither of these databases is the full AdventureWorks database that we know and love, so I set out to solve that and build a version for SQL Azure that uses the full AdventureWorks database. And, while I was at it, with all of the hype and talk surrounding SQL Azure Federations, I thought it would also be nice to see a federated version of the AdventureWorks database.

Exciting News

Thus, I am happy to let you know of two new additions to the SQL Azure samples page on CodePlex. Starting today, two new installs are available: the full AdventureWorks database for SQL Azure, and a SQL Azure Federations version of the full AdventureWorks database. Both can be downloaded from here:

http://msftdbprodsamples.codeplex.com/releases/view/37304

I’ll spend a few minutes discussing these two databases individually: why the effort was taken to migrate them to SQL Azure and what we hope you will get from them.

Full AdventureWorks for SQL Azure

As far as sample databases go, the AdventureWorks database is the king. It exists simply, yet elegantly, to illustrate the features and functionality of its corresponding version of SQL Server. As such, migrating the full version of the AdventureWorks database to SQL Azure was a must, in part for the following reasons:

· SQL Server as a Service – The primary goal of migrating the AdventureWorks database to SQL Azure is to show that SQL Azure is SQL Server served up as a platform-as-a-service (PaaS) offering. Obviously there are some differences in the logical vs. physical administration aspects, but the bottom line is that SQL Azure is a cloud-based relational database service built on SQL Server technologies, and what better way to prove that than by taking an existing on-premises database and showing how easy it is to migrate it to SQL Azure.

· Supported Functionality and Migration Strategies – As SQL Azure gains adoption, the question continues to exist: what does it take to migrate an existing on-premises database to SQL Azure, what functionality is and is not currently supported, and what steps are necessary in the migration process? This example answers those questions.

Everything that needed to be modified, changed, or removed to ensure support for SQL Azure has been documented on the CodePlex page for this database. For example, all ON PRIMARY clauses have been removed, and we explain this and the reasons why on the CodePlex page. We list these out so you’ll have an idea of what was needed in order to get the AdventureWorks database into SQL Azure.

This is the first foray of the full AdventureWorks database into SQL Azure, and there is much more to come.

Full AdventureWorks with SQL Azure Federations

SQL Azure Federations launched in December 2011. There wasn’t a whole lot of fanfare when it was released, but those who have been keeping up with SQL Azure are certainly aware of its existence, simply because Microsoft had been talking about it for well over a year. Thus, creating a federated version of the AdventureWorks database for SQL Azure was also a must, with the following thoughts in mind:

· Traction – What better way to keep momentum going for SQL Azure Federations than to take a well-known sample database and federate it! Developers can now look at a long-existing sample database that has been federated and use it as a starting point for understanding and working with SQL Azure Federations.

· Example – With SQL Azure Federations so new, it makes sense to provide a real-life example of how to federate an existing database.

· Coolness – Honestly, seeing a Federated version of the AdventureWorks database is just cool. Really.

The current federated version of the AdventureWorks database federates on Customer. We specifically chose to federate on Customer because it provides a great base to build from. There were several candidates we could have federated on, such as Products or People, but for a first cut, and to help the “transition” into understanding Federations, we decided to start somewhat simple.
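To give a flavor of what querying the federated sample looks like from code, here is a minimal sketch (plain ADO.NET, using System.Data.SqlClient). The federation name, distribution key name, and value are hypothetical placeholders, and whether Sales.Customer is a federated table in the sample may differ; check the CREATE FEDERATION script that ships with the download for the actual names.

using (var cnn = new SqlConnection(connString))
{
    cnn.Open();

    using (var cmd = cnn.CreateCommand())
    {
        // Route this connection to the federation member that holds customer 1234.
        // "CustomerFederation" and "cid" are assumed names for illustration only.
        cmd.CommandText =
            "USE FEDERATION CustomerFederation (cid = 1234) WITH RESET, FILTERING = ON";
        cmd.ExecuteNonQuery();

        // With FILTERING = ON, queries against federated tables are scoped to that customer.
        cmd.CommandText = "SELECT * FROM Sales.Customer";
        using (var rdr = cmd.ExecuteReader())
        {
            while (rdr.Read())
            {
                // process this customer's rows
            }
        }
    }
}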

The installs for both databases, the full non-federated version and the federated version, are quite easy. Once installed, you will be able to see all the databases in SQL Server Management Studio’s Object Explorer, including the federation member, as shown in the following figure.

[Figure: the federated AdventureWorks databases, including the federation member, in SSMS Object Explorer]

Even cooler is that you can manage your federations via the SQL Azure Management Portal, as shown in the figure below.

[Figure: managing the federation in the SQL Azure Management Portal]

What’s Next

We are already in the process of creating additional, more advanced, versions of this database, which you will see in the coming weeks and months.

Conclusion

As features and functionality are added to SQL Azure, these databases will be updated correspondingly.

Long Live the AdventureWorks database! Love it, use it!


You asked for it, you got it! To help meet the needs of existing and future SQL Azure customers, Microsoft has announced that it is lowering the price of SQL Azure and introducing a 100MB database option. The new pricing structure results in cost savings of 48% to 75% for SQL Azure databases larger than 1GB. Additionally, the new 100MB option lets new customers start using SQL Azure at half the previous entry price, while still providing the full range of features, including high availability, fault tolerance, and elastic scale-out. The following table lists the cost savings per database size.

GB      Previous Pricing    New Pricing    New Price/GB    Total % Decrease
5       $49.95              $25.99         $5.20           48%
10      $99.99              $45.99         $4.60           54%
25      $299.97             $75.99         $3.04           75%
50      $499.95             $125.99        $2.52           75%
100     $499.95             $175.99        $1.76           65%
150     $499.95             $225.99        $1.51           55%

The goal with this pricing is simply to provide better pricing options for larger deployments and to make it easier and more cost effective for customers to get started with SQL Azure. As you can see from the table, the price per GB drops significantly as your database needs grow. Additionally, this new pricing model provides an inexpensive option for customers who don’t have a big workload but still want to use SQL Azure and take advantage of the full Service Level Agreement.

Additional details on the new pricing changes can be found here and here.


Friday, January 27, 2012

I have seen a number of questions lately regarding how SQL Azure handles throttling and how to determine why throttling occurred. Sometimes those questions are followed by another question: how should an application handle throttling conditions?

Troubleshooting SQL Azure Throttling

GREAT NEWS! There actually is a way to find out whether you are being throttled, why, and what the result of the throttling is. The key is to look at the error message coming back to you. You’ll typically see an error message such as:

“The server is currently busy…”

Or

“The service has encountered an error...”

Or

“The service is experiencing a problem…”

There are a few more, but the key is to look at the END of these messages, because there will be a code at the end. For example:

“The service is currently busy. Retry the request after 10 seconds. Code %d”.

The code is a decimal value, and it is the vital piece of information for tracking down the throttling issue, for example, 131075. Don’t confuse this code with the error code. You’ll actually see two codes in the error message: the actual error code, and the reason code. It is the reason code we are after, the code that follows the error message.

So, before we dive into this reason code, it is also important to understand the throttling modes and throttling types. The types explain why you are getting throttled, and the modes explain how you are being throttled. It is this decimal code value at the end of the message that specifies the mode and type of throttling.

The two tables below show the different throttling types and modes. Throttling types, the reason you are being throttled, fall into either a soft throttling category or a hard throttling category; hard throttling is applied when a limit is substantially exceeded, because those conditions pose a much greater risk to the system and are therefore handled more aggressively. Throttling modes range from no throttling at all to completely rejecting all reads and writes.

Throttling Types

Throttling type                                            Soft throttling limit exceeded    Hard throttling limit exceeded
Temporary disk space problem occurred                      0x01                              0x02
Temporary log space problem occurred                       0x04                              0x08
High-volume transaction/write/update activity exists       0x10                              0x20
High-volume database input/output (I/O) activity exists    0x40                              0x80
High-volume CPU activity exists                            0x100                             0x200
Database quota exceeded                                    0x400                             0x800
Too many concurrent requests occurred                      0x4000                            0x8000

Throttling Modes

Throttling mode    Description                                                   Statements disallowed                   Statements allowed
0x00               AllowAll - No throttling; all queries permitted.              None                                    All statements allowed
0x01               RejectUpsert - Updates and inserts will fail.                 INSERT, UPDATE, CREATE TABLE/INDEX      DELETE, DROP TABLE/INDEX, TRUNCATE
0x02               RejectAllWrites - All writes (including deletes) will fail.   INSERT, UPDATE, DELETE, CREATE, DROP    SELECT
0x03               RejectAll - All reads and writes will fail.                   All statements disallowed               No statements allowed

With the reason code in hand, open Windows Calculator and, from the View menu, select Programmer. We need the calculator to run in Programmer mode because we are programmers. Not really, but because Programmer mode supplies several keys we need.

With the Calculator in Programmer mode, make sure the Dec and Dword options are selected on the left side of the calculator. Next, enter the code from the error message, in this case 131075. Now we’re getting to the good part. Once you have entered the reason code, change the notation from Dec to Hex. The reason code will now show 20003.

This number encodes both the mode and the type. How? The last two hex digits (03) are the throttling mode. The remaining digits, in this case the first three (200), are the throttling type. We can then look these two values up in the two tables and determine that the throttling mode (03) is RejectAll and the throttling type (200) is "High-volume CPU activity exists" with the hard throttling limit exceeded.

Thus, in this example the throttling occurred because too much CPU activity was taking place; it was therefore determined that hard throttling needed to be imposed, and thus all reads and writes will fail.
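If you'd rather not reach for the calculator, the same decode is trivial to do in code. The following is a minimal sketch; the Decode helper is hypothetical (not part of any SQL Azure API) and simply splits the reason code into mode and type the same way the calculator exercise does.

using System;

class Program
{
    // Hypothetical helper: splits a throttling reason code into the throttling mode
    // (the last two hex digits) and the throttling type (the remaining digits).
    static void Decode(long reasonCode, out int mode, out long type)
    {
        mode = (int)(reasonCode & 0xFF); // last two hex digits, e.g. 0x03 = RejectAll
        type = reasonCode >> 8;          // remaining digits, e.g. 0x200 = hard CPU throttling
    }

    static void Main()
    {
        long reasonCode = 131075;        // the decimal code from the error message (0x20003)
        int mode;
        long type;
        Decode(reasonCode, out mode, out type);
        Console.WriteLine("Throttling mode: 0x{0:X2}, throttling type: 0x{1:X}", mode, type);
        // Output: Throttling mode: 0x03, throttling type: 0x200
    }
}

Looking those two values up in the tables above gives the same answer as the calculator: RejectAll, due to high-volume CPU activity.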

Transient Fault Handling Application Block

The next step in our quest is to implement retry logic in the application. Building something like this on your own could take months, so it sure is a good thing that it has been included in the Microsoft Enterprise Library. Now it is almost plug-and-play.

The retry logic built into the Enterprise Library is called the Transient Fault Handling Application Block, which provides a set of reusable components for adding retry logic to your Windows Azure application. You can use these components with SQL Azure, Windows Azure Storage, and the Service Bus and Caching services.

It is easy to add to your applications via NuGet. With your application open, run the following command in the Package Manager Console (also shown in the image below).

Install-Package EnterpriseLibrary.WindowsAzure.TransientFaultHandling

[Figure: the Install-Package command in the Package Manager Console]

The NuGet package installs the Enterprise Library and adds all the necessary references to the project.

[Figure: the references added to the project]

I’m going to highlight three simple examples of how to implement retry logic in your application. However, before using the components, you need to add a few using directives:

using Microsoft.Practices.EnterpriseLibrary.WindowsAzure.TransientFaultHandling.SqlAzure;
using Microsoft.Practices.TransientFaultHandling;
using System.Data.EntityClient;
using System.Data.SqlClient;

The first example uses the ReliableSqlConnection class, which looks very similar to the SqlConnection class of ADO.NET but provides a set of value-add methods. These methods ensure that connections can be reliably established and commands reliably executed against a SQL Azure database.

In the code below, the ReliableSqlConnection is used to establish a reliable connection and execute a query against that connection.

using (var cnn = new ReliableSqlConnection(connString))
{
    cnn.Open();

    using (var cmd = cnn.CreateCommand())
    {
        cmd.CommandText = "SELECT * FROM HumanResources.Employee";

        using (var rdr = cmd.ExecuteReader())
        {
            if (rdr.Read())
            {
                //
            }
        }
    }
}

The previous example is pretty plain and doesn’t really illustrate the flexibility of the retry components, because we haven’t defined a retry policy. Sure, we used the ReliableSqlConnection class as shown in the previous example, and as such it will use the built-in default policy for how many times to retry, and so on. But the real power comes from defining a custom policy via the RetryPolicy class.

The RetryPolicy class allows for creating different policies based on our needs. It contains several constructors that accept input pertaining to the retry count, retry interval timespan, and delta backoff.
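As a quick sketch of the kind of policy you can build, the snippet below creates a policy with a growing (backoff) delay between attempts. The (retryCount, minBackoff, maxBackoff, deltaBackoff) overload is assumed here from the constructor options just described; verify the exact overloads in your version of the block.

// A backoff policy sketch: up to 5 retries, with the delay between attempts starting
// near 1 second and growing by roughly 2-second increments, capped at 30 seconds.
// Assumes the (retryCount, minBackoff, maxBackoff, deltaBackoff) constructor overload.
RetryPolicy backoffPolicy =
    new RetryPolicy<SqlAzureTransientErrorDetectionStrategy>(
        5,
        TimeSpan.FromSeconds(1),
        TimeSpan.FromSeconds(30),
        TimeSpan.FromSeconds(2));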


The next code example illustrates creating a retry policy using the RetryPolicy class, specifying the retry attempts and the fixed time between retries. This policy is then applied to the ReliableSqlConnection both as the policy for the connection and as the policy for the command.

 

// Retry up to 3 times, waiting 30 seconds between attempts.
RetryPolicy myretrypolicy = new RetryPolicy<SqlAzureTransientErrorDetectionStrategy>(3, TimeSpan.FromSeconds(30));

// The policy is passed twice: once as the connection retry policy and once as the command retry policy.
using (ReliableSqlConnection cnn = new ReliableSqlConnection(connString, myretrypolicy, myretrypolicy))
{
    try
    {
        cnn.Open();

        using (var cmd = cnn.CreateCommand())
        {
            cmd.CommandText = "SELECT * FROM HumanResources.Employee";

            // ExecuteCommand<T> runs the command under the connection's retry policy.
            using (var rdr = cnn.ExecuteCommand<IDataReader>(cmd))
            {
                //
            }
        }
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.Message);
    }
}

 

The awesome thing is that retry logic doesn’t just exist for the ADO.NET APIs, but for the Entity Framework as well. The implementation of the retry policy model in the Transient Fault Handling Application Block makes it easy to wrap any user code in the scope of a retry, and for the Entity Framework this is accomplished via the ExecuteAction and ExecuteAction<T> methods of the RetryPolicy class.

 

using (NorthwindEntities dc = new NorthwindEntities())
{
    RetryPolicy myPolicy = new RetryPolicy<SqlAzureTransientErrorDetectionStrategy>(3);

    // Wrap the LINQ to Entities query in the retry policy via ExecuteAction<T>.
    Employee e1 = myPolicy.ExecuteAction<Employee>(() =>
        (from x in dc.Employees
         where x.LastName == "King"
         select x).First());
}

 

Now, there may be times when using the ReliableSqlConnection class just isn’t feasible. Some developers are comfortable with the existing SqlConnection classes of ADO.NET and either don’t want to worry about replacing their existing database logic with new code, or may not yet trust a newer plug-in component. And this makes sense, as the ReliableSqlConnection class is targeted more at new development than at existing applications.

However, the question then becomes how to implement retry logic in existing applications. The answer is an easy one, as shown in the example below. With the block in place, the ADO.NET SqlConnection class gains an OpenWithRetry method that takes a retry policy; we simply define a retry policy and pass it to the OpenWithRetry method. Equally important, the SqlCommand object gains several retry methods that accept a retry policy as well, including ExecuteReaderWithRetry, ExecuteNonQueryWithRetry, ExecuteScalarWithRetry, and ExecuteXmlReaderWithRetry. The example below defines a retry policy and uses it as a parameter both to OpenWithRetry on the connection and to ExecuteReaderWithRetry on the command.

RetryPolicy myretrypolicy = new RetryPolicy<SqlAzureTransientErrorDetectionStrategy>(3, TimeSpan.FromSeconds(30));

using (SqlConnection cnn = new SqlConnection(connString))
{
    // Open the standard ADO.NET connection using the retry policy.
    cnn.OpenWithRetry(myretrypolicy);

    using (var cmd = cnn.CreateCommand())
    {
        cmd.CommandText = "SELECT * FROM HumanResources.Employee";

        // Execute the reader using the same retry policy.
        using (var rdr = cmd.ExecuteReaderWithRetry(myretrypolicy))
        {
            if (rdr.Read())
            {
                //
            }
        }
    }
}

You have several options, but the point is that you don’t need to build retry logic yourself. The Transient Fault Handling Application Block makes it very easy to implement retry logic in new or existing applications. Plus, with your trusty calculator (in Programmer mode) you can now determine why throttling might be happening and handle those situations gracefully.

 

Happy coding!


Friday, October 14, 2011

Blue Syntax is pleased, and very excited, to announce the general availability of Enzo Backup for SQL Azure. It took a monumental effort to put this product together. The hardest part was designing the backup and restore routines in a way that would not constantly trigger the throttling safeguards in SQL Azure. In addition to the typical retry logic, adaptive loading algorithms that know how to "slow down" the data read/load based on specific error conditions were also incorporated, while keeping the internal operations running in parallel.

Microsoft MVPs will have access to this backup utility at no charge. You'll quickly see that the tool is very easy to use, and it is without question the most complete backup utility available to date for SQL Azure. To get a free license, simply email info@bluesyntax.net.

Take a look and download Enzo Backup at http://www.bluesyntax.net/backup.aspx, which also includes the technical overview of the tool for quick reference.


Friday, October 7, 2011

Just confirmed the Azure Boot Camp dates for Honolulu, Hawaii: November 14th and 15th. See http://www.azurebootcamp.com/city/Honolulu.


Thursday, September 22, 2011

The Registration links for the Charlotte, NC and Mountain View, CA Azure Boot Camps are now up on the Azure Boot Camp site:  http://www.azurebootcamp.com/schedule

See you there!