Shaun Xu

The Sheep-Pen of the Shaun


Shaun, the author of this blog, is a semi-geek: a clumsy developer, passionate speaker and incapable architect with about 10 years' experience in .NET. He hopes to prove that software development is an art rather than manufacturing. He's into cloud computing platforms and technologies (Windows Azure, Aliyun) as well as WCF and ASP.NET MVC. Recently he has fallen in love with JavaScript and Node.js.

Currently Shaun works at IGT Technology Development (Beijing) Co., Ltd. as the architect responsible for product framework design and development.

 

In my last post I created a very simple WCF service with user registration functionality. I created an entity for the user data and a DataContext class which provides methods for operating on the entities, such as add, delete, etc. And in the service method I utilized it to add a new entity into the table service. But I didn't perform any validation before registering, which is not acceptable in a real project. So in this post I will first add some validation before the data creation code runs, and show how to use LINQ with the table service.

 

LINQ to Table Service

Since the table service utilizes ADO.NET Data Service to expose its data, and the managed library of ADO.NET Data Service supports LINQ, we can use LINQ to work with the data in the table service. Let me explain with my current example: I would like to ensure that when registering a new user the email address is unique, so I need to check the account entities in the table service before adding.

If you remember, in my last post I mentioned that there's a method in the TableServiceContext class – CreateQuery – which creates an IQueryable instance for a given type of entity. So here I will create a method named Load under my AccountDataContext class that returns an IQueryable<Account>.

public class AccountDataContext : TableServiceContext
{
    private CloudStorageAccount _storageAccount;

    public AccountDataContext(CloudStorageAccount storageAccount)
        : base(storageAccount.TableEndpoint.AbsoluteUri, storageAccount.Credentials)
    {
        _storageAccount = storageAccount;

        // Make sure the table container exists before any operation.
        var tableStorage = new CloudTableClient(_storageAccount.TableEndpoint.AbsoluteUri,
                                                _storageAccount.Credentials);
        tableStorage.CreateTableIfNotExist("Account");
    }

    public void Add(Account accountToAdd)
    {
        AddObject("Account", accountToAdd);
        SaveChanges();
    }

    // Expose the table entities as an IQueryable for LINQ queries.
    public IQueryable<Account> Load()
    {
        return CreateQuery<Account>("Account");
    }
}

The method returns IQueryable<Account> so that I can perform LINQ operations on it. Back in my service class, I will use it to implement my validation.

public bool Register(string email, string password)
{
    var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
    var accountToAdd = new Account(email, password) { DateCreated = DateTime.Now };
    var accountContext = new AccountDataContext(storageAccount);

    // validation
    var accountNumber = accountContext.Load()
        .Where(a => a.Email == accountToAdd.Email)
        .Count();
    if (accountNumber > 0)
    {
        throw new ApplicationException(string.Format("Your account {0} had been used.", accountToAdd.Email));
    }

    // create entity
    try
    {
        accountContext.Add(accountToAdd);
        return true;
    }
    catch (Exception ex)
    {
        Trace.TraceInformation(ex.ToString());
    }
    return false;
}

I used the Load method to retrieve the IQueryable<Account> and the Where method to find the accounts whose email address is the same as the one being registered. If any exist, I throw an exception back to the client side. Let's run it and test it from my simple client application.

[screenshot: the client run fails with an exception saying "Count" is not supported]

Oops! Looks like we encountered an unexpected exception. It says that "Count" is not supported by the ADO.NET Data Service LINQ managed library. That is because the table storage managed library (a.k.a. TableServiceContext) is based on the ADO.NET Data Service and supports only a very limited set of LINQ operations. Although I didn't find a full list or documentation of exactly which LINQ methods it supports, I can refer to a page on MSDN here. It gives a rough summary of which query operations the ADO.NET Data Service managed library supports and which it doesn't. As you can see, the Count method is not in the supported list.

It's not only the query operators: the lambda expressions inside the Where method are limited when using the ADO.NET Data Service managed library as well.

For example, if you add (a => !a.DateDeleted.HasValue) in the Where method to exclude the deleted accounts, it will raise an exception saying "Invalid Input". Based on my experience, you should always use simple comparisons (such as ==, >, <=, etc.) on simple members (such as string, integer, etc.), and avoid any shortcut methods (such as string.Compare, string.IsNullOrEmpty, etc.).
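To make the boundary concrete, here is a small sketch (my own illustration, not from the original post) of query shapes that work against the table service versus ones that fail at execution time:

// Supported: a simple comparison on a simple property.
var registered = accountContext.Load()
    .Where(a => a.Email == email);

// Not supported: these compile, but throw at execution time
// (e.g. "Invalid Input" / NotSupportedException).
var active = accountContext.Load()
    .Where(a => !a.DateDeleted.HasValue);
var blank = accountContext.Load()
    .Where(a => string.IsNullOrEmpty(a.Email));

With that constraint in mind, the workaround for the Count problem above is to execute the query first and then count the results on the client side: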

// validation
var accountNumber = accountContext.Load()
    .Where(a => a.Email == accountToAdd.Email)
    .ToList()
    .Count;
if (accountNumber > 0)
{
    throw new ApplicationException(string.Format("Your account {0} had been used.", accountToAdd.Email));
}

We changed the code a bit – calling ToList to execute the query and counting the results on the client side – and tried again.

[screenshot: the client now reports that the email had been used]

Since I had already created an account with my email address, this time it gave me an exception saying that the email had been used, which is correct.

 

Repository Pattern for Table Service

The AccountDataContext takes the responsibility for saving and loading account entities, but only for that specific entity. Is it possible to have a dynamic or generic DataContext class which can operate on any kind of entity in my system? Of course, yes. Although there's no typical database in the table service, we can treat the entities as records, similar to the data entities when using O/R mapping. Since we can use patterns from ORM architectures here, we should be able to adopt one of them – the Repository Pattern – in this example.

We know that the base class – TableServiceContext – provides 4 methods for operating on the table entities: CreateQuery, AddObject, UpdateObject and DeleteObject. And we can establish a relationship between the entity class, the table container name and the entity set name. So it's really simple to build a generic base class for any kind of entity. Let's rename the AccountDataContext to DynamicDataContext and make the Account type a type parameter of it.

public class DynamicDataContext<T> : TableServiceContext where T : TableServiceEntity
{
    private CloudStorageAccount _storageAccount;
    private string _entitySetName;

    public DynamicDataContext(CloudStorageAccount storageAccount)
        : base(storageAccount.TableEndpoint.AbsoluteUri, storageAccount.Credentials)
    {
        _storageAccount = storageAccount;
        // The table (container) name and the entity set name are both
        // taken from the entity class name.
        _entitySetName = typeof(T).Name;

        var tableStorage = new CloudTableClient(_storageAccount.TableEndpoint.AbsoluteUri,
                                                _storageAccount.Credentials);
        tableStorage.CreateTableIfNotExist(_entitySetName);
    }

    public void Add(T entityToAdd)
    {
        AddObject(_entitySetName, entityToAdd);
        SaveChanges();
    }

    public void Update(T entityToUpdate)
    {
        UpdateObject(entityToUpdate);
        SaveChanges();
    }

    public void Delete(T entityToDelete)
    {
        DeleteObject(entityToDelete);
        SaveChanges();
    }

    public IQueryable<T> Load()
    {
        return CreateQuery<T>(_entitySetName);
    }
}

I saved the name of the entity type at construction time for performance reasons. The table (container) name and the entity set name are the same as the name of the entity class. The Load method returns a generic IQueryable instance which supports lazy loading. Then in my service class I changed the AccountDataContext to DynamicDataContext, and that's all.

var accountContext = new DynamicDataContext<Account>(storageAccount);

Run it again and register another account.

[screenshot: registering another account succeeds]

The DynamicDataContext can now be used with any entity. For example, I would like each account to have a list of notes, each of which contains 3 custom properties: Account Email, Title and Content. We create the note entity class.

public class Note : TableServiceEntity
{
    public string AccountEmail { get; set; }
    public string Title { get; set; }
    public string Content { get; set; }
    public DateTime DateCreated { get; set; }
    public DateTime? DateDeleted { get; set; }

    public Note()
        : base()
    {
    }

    public Note(string email)
        : base(email, string.Format("{0}_{1}", email, Guid.NewGuid().ToString()))
    {
        AccountEmail = email;
    }
}

And with no need to tweak the DynamicDataContext, we can go directly to the service class to implement the logic. Notice that here I utilized two DynamicDataContext instances with different type parameters: Note and Account.

public class NoteService : INoteService
{
    public void Create(string email, string title, string content)
    {
        var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
        var accountContext = new DynamicDataContext<Account>(storageAccount);
        var noteContext = new DynamicDataContext<Note>(storageAccount);

        // validate - the email must exist
        var accounts = accountContext.Load()
            .Where(a => a.Email == email)
            .ToList()
            .Count;
        if (accounts <= 0)
            throw new ApplicationException(string.Format("The account {0} does not exist in the system, please register and try again.", email));

        // save the note
        var noteToAdd = new Note(email) { Title = title, Content = content, DateCreated = DateTime.Now };
        noteContext.Add(noteToAdd);
    }
}

And I updated our client application to test the service.

[screenshot: the client creating a note through the NoteService]

I didn't implement any list service to show all the notes, but we can have a look at the local SQL database if we run it on the local development fabric.

[screenshot: the notes saved in the local development storage database]

 

Summary

In this post I explained a bit about the limited LINQ support in the table service. Then I demonstrated how to use the repository pattern in the table service data access layer and make the DataContext dynamic. The DynamicDataContext I created in this post is just a prototype. In fact we should create the relevant interface to make it testable, and for a better structure we'd better separate the DataContext classes for each individual kind of entity. So we would have IDataContextBase<T> and DataContextBase<T>, and for each entity:

class AccountDataContext : DataContextBase<Account>, IDataContextBase<Account> { … }

class NoteDataContext : DataContextBase<Note>, IDataContextBase<Note> { … }
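A minimal sketch of that refactoring, reusing the DynamicDataContext<T> above as the base implementation (the exact shape of the interface is my assumption, extracted from the public members of DynamicDataContext<T>):

public interface IDataContextBase<T> where T : TableServiceEntity
{
    void Add(T entityToAdd);
    void Update(T entityToUpdate);
    void Delete(T entityToDelete);
    IQueryable<T> Load();
}

// DynamicDataContext<T> already implements every member of the interface,
// so it can serve as the DataContextBase<T>.
public class DataContextBase<T> : DynamicDataContext<T>, IDataContextBase<T>
    where T : TableServiceEntity
{
    public DataContextBase(CloudStorageAccount storageAccount) : base(storageAccount) { }
}

public class AccountDataContext : DataContextBase<Account>
{
    public AccountDataContext(CloudStorageAccount storageAccount) : base(storageAccount) { }
}

public class NoteDataContext : DataContextBase<Note>
{
    public NoteDataContext(CloudStorageAccount storageAccount) : base(storageAccount) { }
}

With the interface in place, the service classes can depend on IDataContextBase<T> rather than a concrete context, which makes them testable with a mock implementation.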

 

Besides saving and loading structured data, another common scenario is saving and loading binary data such as images and files. In my next post I will show how to use the Blob Service to store binary data – making the account able to upload a logo, in my example.

 

Hope this helps,

Shaun

 

All documents and related graphics, codes are provided "AS IS" without warranty of any kind.
Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

 

Shiju Varghese posted an article on his blog saying that the RTM of ASP.NET MVC 2 has been released and is available for download. You can get the installation package and the release notes here. And based on the release notes, there are no breaking changes from RC2 to RTM.

Let's play with the new ASP.NET MVC and look forward to the Visual Studio 2010 RTM.


 

In the Windows Azure platform there are 3 types of storage we can use to save our data in the cloud: Table, Blob and Queue. Before the Chinese New Year, Microsoft announced that Azure SDK 1.1 had been released, and it supports a new type of storage – Drive – which allows us to operate on NTFS files in the cloud. I will cover it in the coming few posts, but for now I would like to talk a bit about the Table Storage.

 

Concept of Table Storage Service

The most common development scenario is to retrieve, create, update and remove data from a data store; normally we communicate with a database. When we attempt to move our application onto the cloud, the most common requirement is a storage service. Windows Azure provides a built-in service that allows us to store structured data, called the Windows Azure Table Storage Service.

The data stored in the table service is a collection of entities, and the entities are similar to rows or records in a traditional database. An entity has a partition key, a row key, a timestamp and a set of properties. You can treat the partition key as a group name, the row key as a primary key, and the timestamp as the identifier for solving concurrency problems.

Unlike a table in a database, the table service does not enforce a schema for tables, which means you can have 2 entities in the same table with different property sets.

The partition key is used by the Azure OS for load balancing and for entity group transactions. As you know, in the cloud you never know which machine is hosting your application and your data; they can be moved around based on the transaction weight and the number of requests. If the Azure OS finds that there are many requests connecting to your Book entities with the partition key equal to "Novel", it can move them to another idle machine to increase performance. So when choosing the partition key for your entities, make sure it indicates the category or group information, so that the Azure OS can perform load balancing as you wish.

 

Consuming the Table

Although the table service looks like a database, you cannot access it the way you are used to – neither ADO.NET nor ODBC. The table service exposes itself through the ADO.NET Data Service protocol, which allows you to consume it in a RESTful style via HTTP requests.

The Azure SDK provides a set of classes for us to connect to it. There are 2 classes we might need: TableServiceContext and TableServiceEntity.

TableServiceContext inherits from DataServiceContext, which represents the runtime context of the ADO.NET Data Service. It provides 4 methods we will mainly use:

  • CreateQuery: Creates an IQueryable instance for a given type of entity.
  • AddObject: Adds the specified entity into the Table Service.
  • UpdateObject: Updates an existing entity in the Table Service.
  • DeleteObject: Deletes an entity from the Table Service.

Before you operate on the table service you need to provide valid account information. It's something like the connection string of a database, but it contains the account name and the account key you received when you created the storage service on the Windows Azure Development Portal.

After getting the CloudStorageAccount you can create a CloudTableClient instance, which provides a set of methods for using the table service. A very useful method is CreateTableIfNotExist: it creates the table container for you if it doesn't exist yet. And then you can operate on the entities in that table through the methods I mentioned above.
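As a minimal sketch (the account name and key in the commented-out connection string are placeholders; the setting name DataConnectionString matches the one used later in this post):

// Resolve the account from the cloud service configuration...
var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
// ...or parse an explicit connection string:
// var storageAccount = CloudStorageAccount.Parse(
//     "DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=<your-key>");

// Create the table client and make sure the table container exists.
var tableStorage = new CloudTableClient(storageAccount.TableEndpoint.AbsoluteUri,
                                        storageAccount.Credentials);
tableStorage.CreateTableIfNotExist("Account");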

Let me explain a bit more through an example; we always prefer code to sentences.

 

Straightforward Access to the Table

Here I would like to build a WCF service on the Windows Azure platform with, for now, just one requirement: allow the client to create an account entity in the table service. The WCF service has a method named Register which accepts an instance of the account the client wants to create. After performing some validation it adds the entity into the table service. So the first thing I should do is create a Cloud Application in my Visual Studio 2010 RC. (The Azure SDK 1.1 only supports VS2008 and VS2010 RC.)

The solution structure looks like this:

[screenshot: the solution structure]

Then I added a configuration item for the storage account through the Settings section of the cloud project. (Double-click the role under the Roles folder and navigate to the Settings section.) This setting will be used to retrieve my storage account information. Since I'm just in the development phase for now, I selected "UseDevelopmentStorage=true".

Then I navigated to the WebRole.cs file under my WCF project. If you have read my previous posts you know that this file defines what happens when the application starts and terminates on the cloud. What I need to do is, when the application starts, set the configuration publisher to load my configuration file by the setting name I specified. So the code would be like below.

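A minimal sketch of the pattern just described, assuming the standard SDK 1.x configuration setting publisher:

public class WebRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        // Tell the storage client how to resolve setting names such as
        // "DataConnectionString" from the cloud service configuration.
        CloudStorageAccount.SetConfigurationSettingPublisher((configName, configSetter) =>
        {
            configSetter(RoleEnvironment.GetConfigurationSettingValue(configName));
        });

        return base.OnStart();
    }
}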

I removed the original service and contract created by the VS template and added my IAccountService contract and its implementation class – AccountService. And I added the service method Register with the parameters email and password; it returns a boolean value to indicate the result, which is very simple.

At this moment, if I press F5 the application will be established on my local development fabric and I can see my service running well through the browser.

[screenshot: the WCF service page in the browser]

Let's implement the service method Register to add a new entity to the table service. As I said before, the entities you want to store in the table service must have 3 properties: partition key, row key and timestamp. You could create a class with these 3 properties yourself, but the Azure SDK provides a base class for that, named TableServiceEntity, in the Microsoft.WindowsAzure.StorageClient namespace. So what we need to do is simpler: create a class named Account and let it derive from TableServiceEntity. Then I add my own properties: Email, Password, DateCreated and DateDeleted. The DateDeleted is a nullable date-time value that indicates whether this entity has been deleted, and when.

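Based on the description, a sketch of the entity class would look roughly like this:

public class Account : TableServiceEntity
{
    public string Email { get; set; }
    public string Password { get; set; }
    public DateTime DateCreated { get; set; }
    public DateTime? DateDeleted { get; set; }
}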

Did you notice that I missed something here? Yes, it's the partition key and row key I didn't assign. The TableServiceEntity base class defines 2 constructors: one is a parameter-less constructor, used to fill values into the properties from the table service when retrieving data; the other takes 2 parameters – the partition key and the row key. As I said above, the partition key affects load balancing and the row key must be unique, so here I would like to use the email as the partition key, and the email plus a Guid as the row key.

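A sketch of the two constructors added inside the Account class, assuming the key scheme just described (the (email, password) signature is an assumption that matches how the service constructs the entity later):

public Account()
    : base()
{
}

public Account(string email, string password)
    : base(email, string.Format("{0}_{1}", email, Guid.NewGuid().ToString()))
{
    Email = email;
    Password = password;
}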

OK, now we've finished the entity class we need to store in the table service. The next step is to create a data access class to add it. The Azure SDK gives us a base class for that, named TableServiceContext, as I mentioned above. So let's create a class for operating on the Account entities.

The TableServiceContext needs the storage account information in its constructor: the combination of the storage service URI that we create on the Windows Azure platform and the relevant account name and key. The TableServiceContext uses this information to find the related address and verify the account when operating on the storage entities. Hence in my AccountDataContext class I need to override this constructor and pass the storage account into it.


All entities are saved in the table storage in one or many tables, which we call "table containers". Before we operate on an entity we need to make sure that the table container has been created in the storage. There's a method we can use for that: CloudTableClient.CreateTableIfNotExist. So I perform it first in the constructor, to make sure all methods are invoked only after the table has been created.

Notice that I pass the storage account endpoint URI and the credentials to specify where my storage is located and who I am.

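A sketch of the constructor described above:

public class AccountDataContext : TableServiceContext
{
    private CloudStorageAccount _storageAccount;

    public AccountDataContext(CloudStorageAccount storageAccount)
        : base(storageAccount.TableEndpoint.AbsoluteUri, storageAccount.Credentials)
    {
        _storageAccount = storageAccount;

        // Make sure the "Account" table container exists before any operation.
        var tableStorage = new CloudTableClient(_storageAccount.TableEndpoint.AbsoluteUri,
                                                _storageAccount.Credentials);
        tableStorage.CreateTableIfNotExist("Account");
    }
}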

Another piece of advice: make your entity class name the same as the table name when creating the table. It will improve performance when you operate on it over the cloud, especially when querying.

Since the Register WCF method adds a new account into the table service, I will create a relevant method to add the account entity. Before implementing it, I should add a reference to System.Data.Services.Client to the project. This assembly provides the common methods of the ADO.NET Data Service which can be used with the Windows Azure Table Service. I will use its AddObject method to create my account entity.
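A sketch of that Add method inside the AccountDataContext:

public void Add(Account accountToAdd)
{
    AddObject("Account", accountToAdd);
    SaveChanges();
}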

Since the table service does not fully implement the ADO.NET Data Service, there are some methods in System.Data.Services.Client that the TableServiceContext doesn't support, such as AddLinks, etc.

Then I implemented the service method to add the account entity through the AccountDataContext.

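A sketch of the Register implementation described in the next paragraph:

public bool Register(string email, string password)
{
    // Load the storage account information from the configuration file.
    var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
    var accountToAdd = new Account(email, password) { DateCreated = DateTime.Now };
    var accountContext = new AccountDataContext(storageAccount);

    try
    {
        accountContext.Add(accountToAdd);
        return true;
    }
    catch (Exception ex)
    {
        Trace.TraceInformation(ex.ToString());
    }
    return false;
}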

You can see that in the service implementation I load the storage account information through my configuration file and create the account table entity from the parameters. Then I create the AccountDataContext; if this is the first time the method is invoked, the constructor of the AccountDataContext will create the table container for me. Then I use the Add method to add the account entity into the table.

Next, let's create a fairly simple client application to test this service. I created a Windows console application and added a service reference to my WCF service.

The metadata information of the WCF service cannot be retrieved when it's deployed on Windows Azure, even though <serviceMetadata httpGetEnabled="true"/> has been set. If we need its metadata we can deploy it on the local development service first, and then change the endpoint to the address on the cloud.

[screenshot: adding the service reference in Visual Studio]

In the client-side app.config file I pointed the endpoint at the local development fabric address. Then I implemented the client to let me input an email and a password, and invoke the WCF service to add my account.

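A sketch of the console client; the generated proxy class name AccountServiceClient is an assumption – it depends on the name chosen when adding the service reference:

static void Main(string[] args)
{
    Console.Write("Email: ");
    var email = Console.ReadLine();
    Console.Write("Password: ");
    var password = Console.ReadLine();

    // The proxy class name is hypothetical; use whatever name
    // the service reference generated.
    var client = new AccountServiceClient();
    var result = client.Register(email, password);
    Console.WriteLine("Register returned: {0}", result);
    client.Close();
}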

Let's run my application and see the result. Of course it returns TRUE to me. And in the local SQL Express I can see the data has been saved in the table.

[screenshot: the account data in the local SQL Express table]

 

Summary

In this post I explained more about the Windows Azure Table Storage Service. I also created a small application to demonstrate how to connect to it and consume it through the ADO.NET Data Service managed library provided with the Azure SDK.

I only showed how to create an entity in the storage service. In the next post I will explain how to query the entities with conditions through LINQ. I will also refactor my AccountDataContext class to make it dynamic for any kind of entity.

 

Hope this helps,

Shaun

 

All documents and related graphics, codes are provided "AS IS" without warranty of any kind.
Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

 

While playing with Windows Azure I've found it a little bit hard to find information about it, since it's new to us. When I was investigating the new CloudDrive feature I could find almost nothing about it except its SDK.

Today I found a very good series of posts in the ASP.NET community by Jeff Widmer. For now there are only the first 3 posts of his series, but I believe he will continue and finish it. Please have a look at his blog:

http://weblogs.asp.net/jeffwids/archive/2010/03/02/getting-started-with-windows-azure-part-0-where-do-i-go-to-get-started.aspx

And I will continue my series as well. :o)

Hope this helps,

Shaun


 

News came that the free developer accounts of Azure expired at the end of January 2010, which means there's no way to play with it for free any more. But I'm lucky that my location is not on their billing list, which means I can use and try it till this June. So I think it's better to explain a bit about how to deploy before the free access is withdrawn by Microsoft. As I mentioned many times in the last 2 posts, deploying an Azure application is a little bit different. I will continue using the application I created in the last post – the Whiteboard – although it sucks.

First we go to the Windows Azure Development Portal. After logging in with my Live ID I went into the portal and could see the default project I requested before.


Clicking into this project, the page shows how many items (services) I have created. If nothing has been created in the account yet, we can see the Storage Account and Hosted Service links and how many of each I can use.


Since the Whiteboard project needs the table storage, we first create a storage account: just click Storage Account. We named this service with the label Whiteboard, typed a proper description and pressed the Next button.


On the next page we give the storage account a public name so that it can be resolved on the internet by our hosted service, which will be created later. We named it so it resolves as http://whiteboard.blob.core.windows.net. We can check the availability of this public storage name by clicking the Check Availability button. Since we need this account to communicate with our hosted service, we create a new affinity group named 'whiteboard', locate it in the nearest region and press Create.


The following summary page shows the account information of this storage service. There are 3 endpoints, indicating the 3 types of storage services provided by Windows Azure: Table, Queue and Blob. Below them are the access keys for consuming them. If you carelessly, like me, exposed an access key to someone else, you can use the Regenerate button to regenerate it. Next come the affinity group we created in the last step and the CDN feature, which is disabled by default. Then let's go back to the portal and create the hosted service, which is the website.


Back in the portal, the storage service we just created appears on the page. Click the New Service link to create the hosted service; this time we click the Hosted Service link.


We named it Whiteboard Web, gave it a proper description and pressed Next. On the creation page we specify the URL of this hosted service, which is the public access URL of our website. We typed whiteboard for the name and checked that it's available. Then, in the affinity group section, since we need this service to be connected with the storage we created before, we select the second radio button, choose the whiteboard affinity group and click Create.


In Windows Azure a hosted service is divided into 2 sections: Staging and Production. This allows us to deploy and test the application in the staging section without changing the production one. Once we feel it's good enough, we can simply move the staging deployment to production, which I will explain later. We used to do a similar job ourselves on a dedicated or virtual server (e.g. adding 2 websites in IIS, one assigned to port 80 for the production app and the other to port 8080 for the staging app), but now Windows Azure takes care of it.


Now we've finished the steps for creating the services on Azure. The next step is to update the configuration of our web application to consume the Azure services rather than the local development fabric. Let's go back to our source code.

If you remember, in the last post we created a configuration setting in the Cloud Service project named DataConnectionString and aimed it at the local development storage. Now let's change it to the real Azure service. Open the Web_Role node under the Cloud Service project's Roles folder, navigate to the Settings panel and change the account information through the popup connection string editor. We fill in the account information based on the storage account we just created and save it.


Then right-click the cloud service project and publish it.


Once finished, Visual Studio opens the local folder that contains the necessary package files, and opens the Azure Development Portal for us. In the portal we go to the hosted service page we created before and click the Deploy button under the staging section. We upload the package and the configuration file published before, assign a version to the deployment – let's say v1.0 – and press the Deploy button.


Although Microsoft said the deployment normally takes no longer than 90 seconds, I'm absolutely the abnormal one: it always takes me at least 5 minutes. I think it's because few people play with it in China, with low bandwidth. Anyway, after it's uploaded and deployed we can see some buttons appear under the staging section:

  • Upgrade: Used when you want to upload a new version of your application.
  • Run: Executes the cloud application. The code in WebRole.cs will be executed. The website cannot be accessed until you run it.
  • Configure: Opens the configuration file content so that you can update it through the portal.
  • Delete: (I don't think you need an explanation.)

So let's run it.


After the Azure portal finishes the deployment and the initialization, you can access the website through the URL under the staging icon. Now let's type something and post it to see if it works well on the cloud.


Once we are happy with it, we can move it from staging to production: just click the recycling icon between the staging and production sections. After another several minutes the application will be deployed to the production section.


You can see that the staging section is now empty and the production section has our application ready.


Let's go to the URL on the page, which is http://whiteboard.cloudapp.net/. Since we are using the same storage account, we can see the comment we posted while in the staging section.


 

In this post I covered the steps to deploy an application onto the Windows Azure platform. Deploying an Azure application is very easy: what we need to do is create the services, update the configuration, upload the packages and migrate to the production section. Windows Azure takes care of file extraction, IIS and firewall configuration, migration between staging and production, etc. I think that is what Microsoft means by "shorten your time to market".

We have only explained and used the web role and the table storage so far. There are 2 other storages (Blob and Queue) and some other roles we didn't mention. In the next few posts I will cover all 3 of these storages more deeply.

 

Hope this helps,

Shaun

All documents and related graphics, codes are provided "AS IS" without warranty of any kind.
Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

Note: Since I recently removed the sample application for this post, you will not be able to see an online example by following the URL mentioned in this post.