December 2009 Entries
Earlier I had written on migrating your Northwind database from SQL Server to SQL Azure. In that post, I had mentioned that the SQL Azure Migration Wizard supports migrating only the schema, and data migration has to be done manually.
Meanwhile, George Huey has published a new version of the SQL Azure Migration Wizard on CodePlex which also does data migration. What this means is that you can take an existing database, generate the scripts, and go ahead and deploy the database along with its data into your SQL Azure database. That’s one hell of a lot of work done in a single step, I would say.
What’s more, the recent version also supports the following options:
SQL Server to SQL Azure (existing)
SQL Azure to SQL Server
SQL Azure to SQL Azure (multiple accounts)
You can download the latest version from http://sqlazuremw.codeplex.com/Release/ProjectReleases.aspx?ReleaseId=32334
In the previous post I had described the steps to secure your Windows Azure tokens, get the necessary Visual Studio templates, and make your web application Azure-ready by adding the cloud project and building against it.
Once you have tested the Development Fabric, the instances, and the application, the next step is to publish it to the Windows Azure platform. Select the “CloudService1” project that you added to the solution, right-click and select “Publish”.
Once you click on “Publish”, if you are connected to the internet, it would try to open the Azure login screen; once you sign in, you would be taken to https://windows.azure.com/Cloud/Provisioning/Default.aspx
If you had received the Windows Azure Tokens, and claimed them (as per my previous posts), you would hit the screen as below upon clicking on your project.
In the above, once you click on “New Service” it opens up the page to choose the type of service you want to create.
As mentioned earlier, in the CTP you get to create 2 storage accounts and one hosted service. In our case, we require a hosted service, so select the same and proceed to the next screen. It provides the option to specify the service name and description (note both are mandatory). I have provided a name as well as a description as per the screen below.
Clicking Next takes you to the screen where you can choose the sub-domain name. Provide the name, click “Check Availability”, then leave the defaults and click “Create”.
The next screen opens up with details of the stuff entered so far, as below:-
Click on “Deploy” and it opens up the next screen to upload your configuration files. When you selected “Publish” in the first step, it would also have opened your solution’s “Publish” folder automatically. You can use that path to browse to the CloudService1.cspkg file for the first file upload control and the ServiceConfiguration.cscfg for the second, and also provide a label for the deployment. After that, click “Deploy”.
This process takes some time and then gets to the following screen. (If you get an error here mentioning that the role instances are out of range, make sure you changed the number of role instances in your configuration file back to “1”, the default, as mentioned in my previous post. The current CTP limits it to 1 instance: although you can change it locally to any number to see the instance simulation in the Development Fabric, when you publish to production in the CTP, it doesn’t permit more than 1. So, change the configuration, rebuild, and then publish to get the updated configuration files, which you can replace.)
Note that at this stage the application is “Stopped”. You need to explicitly click “Run” to start it. This starts the deployment, which takes a few minutes depending on server availability, and then reaches the “Initializing” stage, where the icon changes from blue to yellow and the status to “Initializing”.
At this point the custom URL with our sub-domain (in this sample, http://harishweb.cloudapp.net) would not be available since it is still initializing. Once that is done and it indicates “Ready” as below, you can access the URL to see your app running on the cloud. You can also “Suspend” and do other things from this screen later.
So, with that we have deployed our ASP.NET application to the cloud. If you had followed Part I and Part II of the SQL Azure migration as well, your application is now running its UI layer on Windows Azure and its SQL layer on SQL Azure. That makes it a truly cloud-based solution. Below is a screenshot of my application running on Windows Azure with my custom sub-domain, i.e. http://harishweb.cloudapp.net (note, I would delete the service shortly and hence this URL wouldn’t be available. Since the CTP offers only one hosted service, I keep it available for myself by removing apps as and when I am done with a demo).
I hope you found these posts beneficial and do point out if there are errors / omissions in the steps.
Earlier I had written 2 posts – Taking your Northwind Database to SQL Azure and binding it to an ASP.NET GridView Part I and Part II. I thought I would complete the series with a post on moving your ASP.NET application as well to Windows Azure, making it a truly cloud-based application.
Before we start, there are a bunch of things that you would need to do. First and foremost, you would need a token for Windows Azure. You can request a free token for Windows Azure from here after providing your Live ID and a few more details. I am not aware of the current time it takes to receive a token, but in the past it used to be 24 hours.
Similarly, if you want to have SQL Azure Tokens, you can get it from here and then login to https://sql.azure.com to redeem the token once you receive the same. (if you had migrated your SQL Database to SQL Azure as per my earlier posts, you would have done these already)
Once you receive the tokens for Windows Azure, you would need to visit http://lx.azure.microsoft.com/ and sign in with the Live ID that you used for registering for the tokens. Once you login, you would see a screen as below
Since you haven’t claimed the tokens, there won’t be any projects listed here. You can click on the “Account” tab and click “Manage my Tokens” at the bottom to claim the tokens for Windows Azure. With this token you can create one hosted service account and two blob storage accounts.
And, for development purposes, you would require Visual Studio 2008 SP1 and the Windows Azure Tools for Visual Studio 2008 SP1. You can download the tools from here.
For the rest of the post, I am going to assume that you received the tokens for Windows Azure and SQL Azure and have already migrated the northwind sample database to SQL Azure and did the configuration steps as per Part I and Part II of my posts as well as created the web application with a simple gridview that binds to the SQL Azure database.
Once you login to the portal http://lx.portal.azure.com and click on the project, you will get a screen similar to the one below
In the above, you can see that I already have a blob account named “harishblobs” that I would use to store heavy data such as video. But the point of interest here is the “New Service” link at the top. You would need this when you are migrating your ASP.NET application in the steps to follow.
If all is done as per my earlier steps, you have your ASP.NET application with a simple Webform that has a GridView. The GridView is bound to a SQL DataSource, and the SQL DataSource was initially configured to use the local database instance of the Northwind database; thereafter, you changed the connection string to point it to the SQL Azure database.
The next step is to add a Cloud Project to your ASP.NET Application. In Visual Studio 2008, click on “File – Add – New Project” and choose “Cloud Service”
This provides a screen to choose the role type, as below
Since we already have an existing ASP.NET application, we just have to click “OK” with “ASP.NET Web Role” highlighted as you see in the screen. Make sure the “Cloud Service Solution” is blank as per the above screen.
With this step you will find that a “CloudService1” has been added to the solution and it has a “Roles” folder as well as 2 configuration files.
Right Click on the Roles and select “Add – Web Role Project in Solution”
This would automatically add the existing Web Application in your solution to the Cloud project under “Roles” (you will only see a cloud project icon with the name of your existing web application – the files won’t move)
Once this step is done, you just have to build the whole solution once. If all is well and you run the solution, you will get the page, but there would be a series of notifications as well as status messages in the Visual Studio status bar at the bottom indicating the steps the tool is performing.
After a few minutes you would get a URL of the form http://127.x.x.x that has your page. Remember, we earlier had the localhost URL. This is the Fabric Controller simulation of your application running on the cloud. You can click on the notification area in your task bar to open the “Development Fabric UI” (the Development Fabric is something that gets installed when you install the Windows Azure SDK). You can expand the “Service Deployments” icon in the Development Fabric and then expand the subsequent node to see the instances running. By default it shows a single instance numbered 0, which indicates one instance of your application is running. You can change this number from the “ServiceConfiguration.cscfg” file in the cloud project by setting the instance count to 3 or 4 or whatever you like. Once you do that and re-run the solution you would get a simulation screen in the Fabric Controller as below
Make sure you change back the number of instances to "1" in the file above before publishing to Windows Azure, since in production, as of CTP, it supports only 1 instance.
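For reference, the instance count lives in the ServiceConfiguration.cscfg file of the cloud project. A minimal sketch is below; the service and role names are illustrative, and the schema namespace is the one used by the SDK of that era:

```xml
<?xml version="1.0"?>
<ServiceConfiguration serviceName="CloudService1"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="WebRole1">
    <!-- Keep this at 1 before publishing: the CTP allows only one instance in production -->
    <Instances count="1" />
  </Role>
</ServiceConfiguration>
```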
With this, we are all set to go to the Cloud :) Considering the length of the post, I would put the next set of steps in another post to follow. Read next post
If you have used the Entity Framework that shipped with Visual Studio 2008 SP1, you would really start appreciating the flexibility it offers for building a schema-driven data access layer and getting it to the UI layer either directly or through a middle tier such as WCF RIA Services. Check my earlier post on this if you are interested further :)
Meanwhile, the other exciting piece that has been around is SQL Azure, which is part of the Windows Azure platform. SQL Azure provides relational data over the web, which means the database is hosted and maintained for you, and you get to store your database and query it as if it were running in your local data center or on your own server. Of course, SQL Azure is currently a CTP and you can get free access to it if you have the Azure tokens.
While I had earlier written about migrating your database to SQL Azure, that example used an ASP.NET front end with a GridView doing direct data binding through a SQL DataSource. Obviously, one would want to use some of the more abstract controls such as LINQ DataSource / Entity DataSource.
The trick here, however, is that when you create an Entity DataSource from your local database, the Entity Modelling Designer gets access to pull all the required information and build a nice schema with metadata. However, if you try to point your Entity Model at a SQL Azure database, you wouldn’t get this flexibility since SQL Azure doesn’t currently support it.
Kevin Hoffman provides a nice workaround in his post, which I want to repeat here for the sake of continuity. If you would like to generate an Entity Model out of your SQL Azure database, have a local copy of the database running with the same schema. The trick is to first point your Entity Designer tool at your local database and allow it to pull all the required information from there and build the entity model. Thereafter, you can just visit the SQL Azure Portal at https://sql.azure.com and pick up the identical database’s connection string to be copied into your web.config file, replacing the local connection string (read my earlier post for the steps).
Now, the Entity Framework connection string is a little complicated, with a lot of settings. The underlying database connection string is provided as a property within the main connection string. For example, a typical Entity Framework connection string in the web.config file looks as below:-
connectionString="metadata=res://*/Model1.csdl|res://*/Model1.ssdl|res://*/Model1.msl;provider=System.Data.SqlClient;provider connection string=&quot;CONNECTION STRING;Encrypt=True;MultipleActiveResultSets=False&quot;" providerName="System.Data.EntityClient" /></connectionStrings>
In the above, as you can see, the actual database connection string starts at the provider connection string setting.
You need to replace that portion with the connection string copied from the SQL Azure Portal for your database. Make sure you don’t accidentally break the &quot; escapes and other settings.
Secondly, if you are running Visual Studio 2010 Beta 2 and trying to accomplish this, you need to change the User ID part of the copied connection string to USERNAME@SERVERNAME (read my previous post on this for more information). Also, the default password that is copied from the SQL Azure portal is “mypassword”, which you have to change to your actual password.
Finally, you also need to set MultipleActiveResultSets to False since SQL Azure doesn’t currently support it. Otherwise, you will get an error.
Once you have taken care of all the steps above, your page works seamlessly as it did while binding the Entity Framework to your local database (provided you have a local database identical to the one running on SQL Azure).
I have highlighted the important portions in the connection string so that you can take care of those when binding.
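To put all of the pieces together, here is a hedged sketch of what the fully edited connectionStrings entry might look like. The connection string name, server, user, password and database values below are placeholders, not values from the portal:

```xml
<connectionStrings>
  <add name="NorthwindEntities"
       connectionString="metadata=res://*/Model1.csdl|res://*/Model1.ssdl|res://*/Model1.msl;provider=System.Data.SqlClient;provider connection string=&quot;Server=tcp:SERVERNAME.database.windows.net;Database=Northwind;User ID=USERNAME@SERVERNAME;Password=YOURPASSWORD;Encrypt=True;MultipleActiveResultSets=False&quot;"
       providerName="System.Data.EntityClient" />
</connectionStrings>
```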
While working on SQL Azure connectivity from Visual Studio 2010, I faced the above error. The full error text is as below:-
“Server name cannot be determined. It must appear as the first segment of the server's dns name (servername.database.windows.net). Some libraries do not send the server name, in which case the server name must be included as part of the user name (username@servername). In addition, if both formats are used, the server names must match. “
As the message suggested, I tried putting in username@servername but it didn’t help. The strange thing was, the same connection string worked from Visual Studio 2008. Let me add more clarity to this.
With SQL Azure, you can migrate your on-premise database to SQL Azure (read my previous post on this) and thereafter, just change the connection string in your Web / Desktop Application Configuration file to point to the SQL Azure Database. The SQL Azure portal provides the connection string in the following format
So, in normal cases, you can create an application and do a data binding to get the connection string inserted into the configuration file. Later, once you migrate the data into the SQL Azure service, you can just change the connection string to the above (after adding the correct Servername, UserId, Password, Database name etc.,)
I did the above steps in Visual Studio 2008 and the application showed zero difference between binding from the local database instance and from the SQL Azure instance.
However, when I was trying this in Visual Studio 2010 Beta 2, the local instance bound properly (as expected) and when I replaced the connection string with the SQL Azure Database connection string, it started throwing the error explained in the beginning of the post.
The change I had to make was to specify the User ID in the connection string as USERNAME@SERVERNAME, for example rajiv@sdkldfldfdfd (both username and servername are imaginary; note the servername is just the name of the server, not the fully qualified path with .database.windows.net, which is not required).
Post this change, the application was able to bind data from the SQL Azure Database without any issues.
So, for Visual Studio 2008 it was simply username, while Visual Studio 2010 required username@servername.
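To summarize, a sketch of a plain ADO.NET connection string in the shape that worked from Visual Studio 2010 is below; the connection string name, server, user, password and database values are placeholders:

```xml
<connectionStrings>
  <add name="NorthwindConnectionString"
       connectionString="Server=tcp:SERVERNAME.database.windows.net;Database=Northwind;User ID=USERNAME@SERVERNAME;Password=YOURPASSWORD;Trusted_Connection=False;Encrypt=True;"
       providerName="System.Data.SqlClient" />
</connectionStrings>
```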
Ok, I am playing with the Windows Azure Training Kit November 2009 release and the first sample I wanted to try was “Migrating web applications to Windows Azure”. I believe a whole bunch of people moving to Azure aren’t just going to create new web apps but rather try to move their existing web apps, which is why I thought this exercise is more important.
After following the initial few steps, I came to the place where we manage state providers, and one of the requirements is to use the StorageClient library available as part of the training kit. Now when you add a reference to this library (project) and try to build, you may hit the above error, i.e. unable to find “Microsoft” or “Microsoft.ServiceHosting.ServiceRuntime”, which is one of the primary assemblies used in the “StorageAccountInfo.cs” file.
I went through various searches and found that this has moved to Microsoft.WindowsAzure.ServiceRuntime. What followed was a series of build errors in the same file pointing to various references. So the idea behind this post is to help folks get through this hurdle.
First off, you heard it already: you need to remove the reference to “Microsoft.ServiceHosting.ServiceRuntime” from both the project references and the “using…” statement in the class file. It needs to be replaced with “Microsoft.WindowsAzure.ServiceRuntime” in both places.
Secondly, you will get the error “RoleManager” does not exist. This should be replaced with “RoleEnvironment”
Then you would hit an error at “IsRoleManagerRunning”; that needs to be replaced with IsAvailable, i.e. RoleEnvironment.IsAvailable.
Then, you would also get an error at RoleEnvironment.GetConfigurationSetting. It needs to be replaced with RoleEnvironment.GetConfigurationSettingValue.
Finally, you would also get an error at catch(RoleException). That needs to be replaced with catch(RoleEnvironmentException).
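The replacements above can be summarized in a short before/after sketch. The helper method and setting name below are illustrative, not the exact training kit code, and it needs the Windows Azure SDK assemblies to compile:

```csharp
// Old API (Microsoft.ServiceHosting.ServiceRuntime):
//   RoleManager.IsRoleManagerRunning
//   RoleManager.GetConfigurationSetting("AccountName")
//   catch (RoleException) { ... }

// New API (Microsoft.WindowsAzure.ServiceRuntime):
using Microsoft.WindowsAzure.ServiceRuntime;

static string ReadSetting(string name)
{
    // Guard against running outside the fabric
    if (!RoleEnvironment.IsAvailable)
        return null;

    try
    {
        return RoleEnvironment.GetConfigurationSettingValue(name);
    }
    catch (RoleEnvironmentException)
    {
        // Setting not defined in the .cscfg
        return null;
    }
}
```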
These changes should make your StorageClient project build successfully :)
You will get the above errors in the “AspProviders” project as well and would need to make the same replacements.
Another error that you would hit is around RoleManager.WriteToLog, which has been replaced by the Microsoft.WindowsAzure.Diagnostics infrastructure for logging events. You can read more about it at http://msdn.microsoft.com/en-us/library/ee830425.aspx
However, in my case, I commented that particular line and went ahead with the SDK sample.
I would keep you posted on more explorations as and when I hit them :)
I have been playing with the WCF RIA Services (erstwhile .NET RIA Services) for some time and found that most of the samples out there focus on Silverlight-based applications. While the new WCF RIA Services preview for VS 2010 is awesome in terms of its Silverlight integration, I also wanted to test building plain vanilla ASP.NET applications and using the power of WCF RIA Services to build a middle tier for them.
Ok, to begin with, I already had Visual Studio 2010 Beta 2 installed and went ahead and installed the WCF RIA Services Preview for Visual Studio 2010 Beta 2 (note that if you already have the WCF RIA Services for Visual Studio 2008 SP1 installed, this doesn’t install on top of it – so you have to choose whether to use the one that works with Visual Studio 2008 with SL3 or VS 2010 Beta 2 with SL4 Beta – I chose the latter).
Once I had these installed, I went ahead and created a “File – New Project – Empty ASP.NET Web Application” in Visual Studio 2010 Beta 2. This creates a blank ASP.NET Web Application.
I went ahead and added an “ADO.NET Entity Data Model” giving it a name “Northwind.edmx” and configured it to use my sample northwind database. In the Entity Design Wizard, for choosing tables, I chose Products, Categories & Suppliers tables alone and completed the steps to create an Entity model.
This gave an Entity Data Model with 3 tables as well as the auto generated Designer.cs file with Context and Entities.
Now, in normal cases, you have seen a lot of demos where we just add an ASP.NET Webform, drag and drop a GridView control and configure it to use the Entity DataSource template, which in turn is configured to use the entity model created in the above steps. With a few additional clicks to enable paging, sorting, editing, deleting, etc., your complete grid with CRUD operations would be ready. We could also do this with SQL DataSource, LINQ DataSource, Object DataSource, etc., based on preference.
The general concern with these approaches was that there is no actual middle tier and the UI layer is directly bound to the data access layer. The moment you add a middle tier, the flexibility of binding the Entity Model / LINQ to SQL model, etc., goes away and one has to wire up all the UI => middle tier => data access layer steps manually.
Now, with the power of WCF RIA Services, one can actually generate a middle tier with the underlying Entity Layer.
The next thing I did was to add a “Domain Service Class” (this template appears only when you have installed the WCF RIA Services preview for VS 2010) and give it the name NorthwindBL.cs. Note that before doing this, you have to build the solution after creating the Entity Framework model so that the entity model is available for the WCF RIA Services template to pick up and generate service methods.
Once I provided the name for the “Domain Service Class” and clicked “Add”, it presented the screen for choosing the available DataContexts and the entities (the tables we chose when creating the entity model). I selected all three tables (Products, Categories & Suppliers) and checked “Enable Editing” just for the Products entity. I also unchecked the “Enable Client Access” checkbox at the top, since that is specific to the Silverlight scenario, and checked “Generate associated classes for metadata” at the bottom. This provided a class file with all the CRUD operations for “Products” and Get methods for the “Suppliers” and “Categories” entities.
Note that these methods are completely customizable and provide the platform for adding custom business logic, which was always missing earlier while binding the UI layer directly using SQL DataSource, LINQ DataSource, etc. For example, one of the generated Get methods, for Products, is as below:-
public IQueryable<Product> GetProducts()
{
    return this.ObjectContext.Products;
}
I wanted to make a simple customization to it by filtering the product list with a where condition. I could use a LINQ query and modify the above method as follows:-
public IQueryable<Product> GetProducts()
{
    return this.ObjectContext.Products.Where(p => p.ReorderLevel > 10);
}
While this is not the greatest business condition, it gives you an idea of the amount of customization you can do even for a simple CRUD operation. In addition, I can also add custom methods to this class file.
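For instance, a custom query method one could add alongside the generated methods might look like the sketch below; the method name and filter are hypothetical, but the pattern (returning IQueryable from the ObjectContext) mirrors the generated code:

```csharp
// Hypothetical custom method added to the NorthwindBL domain service;
// it can be wired to a DomainDataSource just like GetProducts.
public IQueryable<Product> GetProductsBySupplier(int supplierId)
{
    return this.ObjectContext.Products
               .Where(p => p.SupplierID == supplierId)
               .OrderBy(p => p.ProductName);
}
```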
The next step is to start building the UI layer. I added a simple Webform to the project and added a GridView from the design view. From here on, there are a few tweaking steps that need to be followed to be able to use the WCF RIA Services.
Once you install the WCF RIA Services, you will get a bunch of assemblies in the location C:\Program Files\Microsoft SDKs\RIA Services\v1.0\Libraries\Server. Out of these, you would require the System.Web.DomainServices.WebControls.dll file to be able to use the DomainDataSource control in the Webform.
Add a reference to it in the project, and add it to the Toolbox as well by right-clicking the Toolbox, selecting “Choose Items” and browsing to the above folder. Once you do that, you can drag and drop a DomainDataSource onto the Webform. It automatically adds the Register tag prefix as well.
Once this step is done, you can go to the design view, open the GridView’s configuration and choose DomainDataSource under the DataSource configuration. That’s all that works as of now – it doesn’t take you further to the available methods etc., as would be the case with EntityDataSource, SQL DataSource, etc.
Then, you can manually modify the DomainDataSource to add further properties, as below:-
<cc1:DomainDataSource ID="DomainDataSource1" runat="server" DomainServiceTypeName="<ProjectName>.NorthwindBL"
EnableInsert="true" EnableUpdate="true" EnableDelete="true" SelectMethod="GetProducts" StoreOriginalValuesInViewState="true"></cc1:DomainDataSource>
For the GridView, a few properties need to be enabled; with that and a template selected, the code for the GridView looks as below:-
<asp:GridView ID="GridView1" runat="server" DataSourceID="DomainDataSource1"
    CellPadding="4" ForeColor="#333333" GridLines="None" AllowPaging="True">
    <AlternatingRowStyle BackColor="White" />
    <EditRowStyle BackColor="#2461BF" />
    <FooterStyle BackColor="#507CD1" Font-Bold="True" ForeColor="White" />
    <HeaderStyle BackColor="#507CD1" Font-Bold="True" ForeColor="White" />
    <PagerStyle BackColor="#2461BF" ForeColor="White" HorizontalAlign="Center" />
    <RowStyle BackColor="#EFF3FB" />
    <SelectedRowStyle BackColor="#D1DDF1" Font-Bold="True" ForeColor="#333333" />
    <SortedAscendingCellStyle BackColor="#F5F7FB" />
    <SortedAscendingHeaderStyle BackColor="#6D95E1" />
    <SortedDescendingCellStyle BackColor="#E9EBEF" />
    <SortedDescendingHeaderStyle BackColor="#4870BE" />
</asp:GridView>
With that, we are good to run the page. Note that the page retrieves and binds to the GridView only the records where ReorderLevel is greater than 10, as per the customization we made to the GetProducts method in the business layer.
Similarly, the Edit and Update operations work automatically, but you can go to the respective methods in NorthwindBL and add conditions and checks as and where required. You can also add authorization to make sure that only permitted folks are allowed to modify data, etc., but I am not covering that as part of this post.
So, with minimal steps, we could build a three-tiered application skeleton, and this can be enhanced greatly into a full-fledged app without you having to manually write out the service layer methods.
I have uploaded the sample solution along with this post and you can download the same from the link below. Note that you would need to add your connection string to the Web.config file.
I will try and cover the other scenarios as well in future posts, but for now, we have a 3 tier application built effortlessly with the power of ASP.NET, WCF RIA Services and Entity Framework.