Backing up SQL Azure

That's it!!! After many days and nights... and an amazing set of challenges, I just released the Enzo Backup for SQL Azure BETA product (http://www.bluesyntax.net). Clearly, that was one of the most challenging projects I have done so far.

Why???

Because creating a highly redundant system that expects failures at all times, for an operation that could take anywhere from a couple of minutes to a couple of hours, while still making sure the operation eventually completes, was remarkably challenging. Some routines have more error trapping than actual code...

Here are a few things I had to take into account:

  • Exponential Backoff (explained in another post)
  • Dual dynamic determination of the number of rows to back up 
  • Dynamic reduction of batch rows used to restore the data
  • Implementation of a flexible BULK Insert API that the tool could use
  • Implementation of a custom Storage REST API to handle automatic retries
  • Automatic data chunking based on blob sizes
  • Compression of data
  • Implementation of the Task Parallel Library at multiple levels including deserialization of Azure Table rows and backup/restore operations
  • Full or Partial Restore operations
  • Implementation of a Ghost class to serialize/deserialize data tables

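To give a flavor of the first item, here is a minimal sketch of an exponential backoff retry loop with jitter, the general pattern behind retrying transient SQL Azure and Storage failures. The helper name and parameters are illustrative; they are not taken from the Enzo Backup code.

```python
import random
import time

def with_backoff(operation, max_attempts=5, base_delay=1.0, max_delay=30.0):
    """Retry `operation` on failure, doubling the delay each attempt.

    Hypothetical helper for illustration only: the delay is capped at
    `max_delay` and a random jitter is added so many clients retrying
    at once do not hammer the service in lockstep.
    """
    for attempt in range(max_attempts):
        try:
            return operation()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of attempts; surface the last error
            delay = min(base_delay * (2 ** attempt), max_delay)
            time.sleep(delay + random.uniform(0, delay / 2))
```

A real implementation would also distinguish transient errors (throttling, timeouts) from permanent ones (authentication failures) and only retry the former.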
And that's just a partial list... I will explain what some of those mean in future blog posts. A lot of the complexity had to do with implementing a form of retry logic that varied depending on the resource and the operation.

 

Posted @ Tuesday, June 21, 2011 7:27 AM
