CloudCasts Blog

Webcasts in the Cloud

Monday, March 17, 2014

Windows Azure Storage In-Depth Course Available on Pluralsight

For the past few months I have been working on a Windows Azure Storage In-Depth course for Pluralsight. The course is now live, and available on their site.

The content is as follows:

  • Introduction
  • Windows Azure Storage Accounts
  • Storage Account Management & Monitoring
  • Windows Azure Storage Client Library
  • Windows Azure Storage Queues
  • Windows Azure Table Storage
  • Windows Azure Blob Storage
  • Windows Azure Storage Scalability and Performance
  • Using Shared Access Signatures
  • Transient Fault Handling

If you are new to Windows Azure and want to get an idea of the usage scenarios, performance, and scalability of Windows Azure Storage, it’s well worth taking a look at the introductory modules and scenarios. If you are an Azure developer, the more advanced sections will help you get to grips with the full feature set of the storage services.

I really enjoyed putting the course together, and doing so taught me a lot about some of the more advanced usage scenarios of the storage services. Hope you find the course useful, and feel free to contact me with any feedback or questions.

The course is available here.

Posted On Monday, March 17, 2014 12:24 PM | Comments (0) | Filed Under [ Azure ]

Friday, March 7, 2014

Game Services and Telemetry Processing in Windows Azure

For the past few months I have been working on a pet project to integrate a 3D car racing game with back-end services hosted in Windows Azure. It has been a really fun project to work on, and has been a great learning experience for using Azure services for game services and telemetry processing, but also getting a look at the world of game development, 3D graphics, and developing physics engines for games.

An Introduction to Red Dog Racing

 

Red Dog Racing is a 3D driving game that makes use of back-end services hosted in Windows Azure. Sector times, lap times, and telemetry data are sent from the game to the Windows Azure Service Bus. A worker role de-queues and processes the data and stores it as entities in Windows Azure Table Storage. Telemetry data is also sent to Windows Azure Blob Storage to provide the game with a replay function that displays ghost cars for the fastest laps driven by other players. The best lap times, overall standings, and driver telemetry data can be viewed on a website.
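As a rough sketch of that first hop, sending a completed lap from the game to a Service Bus queue might look something like the following. The queue name, the LapTime type, and the connection string handling are my illustrative assumptions, not the game’s actual code:

```csharp
// Hypothetical sketch: publishing a completed lap to a Service Bus queue.
// The "laptimes" queue name and the LapTime type are assumptions.
using System;
using System.Runtime.Serialization;
using Microsoft.ServiceBus.Messaging;

[DataContract]
public class LapTime
{
    [DataMember] public string Driver { get; set; }
    [DataMember] public int LapNumber { get; set; }
    [DataMember] public TimeSpan Time { get; set; }
}

public static class LapTimeSender
{
    public static void SendLap(string connectionString, LapTime lap)
    {
        var client = QueueClient.CreateFromConnectionString(connectionString, "laptimes");
        client.Send(new BrokeredMessage(lap)); // body serialized with the DataContractSerializer
        client.Close();
    }
}
```

On the receiving side, a worker role would pick up the BrokeredMessage, call GetBody&lt;LapTime&gt;(), and write the result as an entity to Table Storage.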

I have used the game as one of the scenarios in my recently published Pluralsight course, Windows Azure Storage In-Depth, to show how I use Queues, Table Storage, and Blobs to process and store lap time, telemetry, and replay data from the game.

I will be presenting my Game Services and Telemetry Processing in Windows Azure session at a number of conferences over the next few months.

  • Azure DevDays – Helsinki, 21st March
  • TechDays The Netherlands – The Hague, 16th – 17th April
  • DevSum – Stockholm, 21st – 23rd May
  • Techorama – Belgium, 27th – 28th May
  • NDC – Oslo, 2nd – 6th June

I’ll also be hosting one-day workshops with hands-on labs based on the game at NDC in Oslo, and DevSum in Stockholm, and including labs based on the scenario in my Windows Azure In-Depth training course. I will also be delivering one-day workshops based on the scenario to companies and training centers.

If you want to learn more about telemetry processing or implementing game services in Windows Azure, or would be interested in a Windows Azure training course or workshop, please feel free to contact me through this blog.

In future posts I will be digging deeper into the implementation, and sharing my ideas about what makes Windows Azure such a great platform for developing scalable data storage and telemetry processing applications.

Posted On Friday, March 7, 2014 4:08 PM | Comments (0) |

Thursday, September 19, 2013

CloudBurst Day 1

The CloudBurst conference kicked off in Sweden today with sessions from Mark Brown, Magnus Mårtensson, Mike Martin, Maarten Balliauw, Scott Klein and Andy Cross. It was great to see such a good line-up of presenters and some excellent sessions. Just like last year we had a packed room, with many developers returning after last year’s event, and all sessions were live streamed.

We have a great line up for day 2, and will be live-streaming all the sessions from 09:00 Central European time. You can see the session line up and link to the live stream on the CloudBurst website.

Posted On Thursday, September 19, 2013 9:47 PM | Comments (0) |

Thursday, August 15, 2013

CloudBurst 2013–19th & 20th September, Akalla, Sweden

 

After months of planning today is the day we go public with CloudBurst 2013. We are following on from CloudBurst 2012 with two full days of presentations from community and industry leaders.

Magnus and I decided to run CloudBurst 2013 the day after CloudBurst 2012. Last year’s event saw some excellent presentations and great enthusiasm from the attendees. Following feedback from last year’s attendees, we have shortened the lunch breaks to allow more session time, and there will be a “Mingle & Drinks” event at the end of day 1, where you will get a chance to chat with the presenters and other attendees.

The proposed session line-up is as follows:

  • The State of Windows Azure - Mark Brown - Windows Azure Community Manager
  • How it’s made: MyGet.org - Maarten Balliauw - Windows Azure MVP
  • Big Data for the Win!! - Andy Cross - Windows Azure MVP
  • Bursting to the Cloud in 1 Hour - Patriek van Dorp - Windows Azure MVP
  • Windows Azure Store - Björn Eriksen - Architect Evangelist
  • Getting your Nerd on with Windows Azure data services - Scott Klein - Windows Azure Technical Evangelist
  • To be Decided - Magnus Mårtensson - Windows Azure MVP
  • Windows Azure through the eyes of an IT Pro and how to cope with Developers and Business Stakeholders - Mike Martin - Windows Azure MVP
  • To be Decided - Alan Smith - Windows Azure MVP
  • Connecting the Cloud with your local applications - Sam Vanhoutte - Windows Azure MVP
  • Dependable Cloud Architecture - Michael Wood - Windows Azure MVP

All sessions will be in English. More details are on the event website.

The event will be free to attend*, but places are limited. Register here.

If you would like to be informed of more Sweden Windows Azure Group (SWAG) events, sign up here.

We would like to thank the event sponsors (Microsoft Sweden, Active Solution and AddSkills) for covering the costs of the event and travel costs for the presenters.

* Because this is a free event with very limited attendance capacity where great speakers donate their time that you will get to experience we ask you to very carefully honor your ticket. We reserve the right to invoice you 500 SEK if you reserve a ticket but do not attend the event.

Posted On Thursday, August 15, 2013 10:26 AM | Comments (0) |

Tuesday, August 13, 2013

Successful cloud solutions in practice–Seminar at Active Solution, Stockholm

Jesper Zachrisson, Magnus Mårtensson and I will be presenting a seminar at Active Solution on Wednesday 28th September 11:30 – 12:45.

“Active Solution has invested heavily in the cloud in general, and in Windows Azure in particular. Various kinds of general-purpose cloud services, such as email, are today a common feature at most companies. Active Solution uses the cloud primarily for solutions that are unique to a specific company. Here adoption has been slower, but in many cases there is also more to gain.

In the past year alone we have worked on some fifteen engagements involving customer-specific solutions. We see the cloud being used more and more among “IT-heavy” companies, but many “ordinary” companies and organizations still have a long way to go. The excuses, or explanations, are not always tenable. We think that is wrong, and with this seminar we want to show how others are doing it.”

The event is free to attend. If you would like to attend you can register here.

Posted On Tuesday, August 13, 2013 10:56 AM | Comments (0) |

Friday, June 14, 2013

NDC Oslo

2013 has been a hectic year for conference presentations so far; NDC in Oslo was the 6th conference I have attended, and my session there was my 11th conference presentation this year. I have been meaning to make the short trip over from Stockholm to NDC for a few years, and this was the first time I made it. I had heard a lot of great things about the event, and was impressed with the location, the sessions, and most of all the atmosphere around the event booths and during the party on Thursday evening.

The session I was delivering was my “Grid Computing with 256 Windows Azure Worker Roles & Kinect” demo, which I have delivered at many events over the past 12 months. The demo went fine. I’m always a little nervous when I try to scale out the application to 256 worker roles; it almost always works well and the application will scale in minutes, but very occasionally there can be a longer delay due to the provisioning process in the Windows Azure data centers. This would not be an issue for many scenarios, but when standing on stage in front of a room full of developers you really want things to run smoothly.

A number of people have suggested that I should pre-provision an environment so that it is guaranteed to be there when I run the demo during a session. For me the aim has always been to show the rapid scalability on cloud-based platforms live on stage. Pre-provisioning an environment may make for a more reliable demo but to me that would be cheating, and not half as much fun!

Sunday, April 28, 2013

Global Render Lab at the Global Windows Azure Bootcamp

Yesterday the attendees of the Global Windows Azure Bootcamp took part in a Global Render Lab that was built on the Windows Azure platform. The lab was adapted from a simple demo I wrote in 2010, and then adapted for a lab that I use on my Windows Azure training courses.

clip_image002

 

The lab allowed attendees from events all over the globe to participate and compete in rendering frames in a 3D animation. All the processing would take place in a Windows Azure datacenter.

clip_image004

About 750 attendees from 50 locations in 22 countries took part in the lab. During the event a total of 9904 worker role instances were started, with over 4,000 instances running concurrently for the second half of the event. 724,059 3D ray traced animation frames were rendered with a total render time of 4 years 184 days 2 hours and 46 minutes. The overall compute time used by the 9904 worker roles was almost 7 years.

clip_image006

The Global Render Lab website received 3,718 unique visits, with 40,022 page views during the event. At times there were over 100 simultaneous visitors on the site.

 

clip_image008

The traffic on the website was sustained over the day, with over 5,000 page views per hour at its peak. The website was hosted on a single small reserved instance in Windows Azure Websites, with the ASP.NET cache being used to cache the result sets from the queries to the Windows Azure SQL Database.
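A minimal sketch of that caching pattern, assuming a hypothetical standings query; the cache key and the 30-second expiry are illustrative, not the site’s actual values:

```csharp
// Illustrative sketch of caching a query result set in the ASP.NET cache.
// The "standings" cache key and 30-second expiry are assumptions.
using System;
using System.Collections.Generic;
using System.Web;
using System.Web.Caching;

public static class StandingsCache
{
    public static List<string> GetStandings(Func<List<string>> queryDatabase)
    {
        var cache = HttpRuntime.Cache;
        var standings = cache["standings"] as List<string>;
        if (standings == null)
        {
            // Only hit the SQL Database when the cached copy has expired.
            standings = queryDatabase();
            cache.Insert("standings", standings, null,
                DateTime.UtcNow.AddSeconds(30), Cache.NoSlidingExpiration);
        }
        return standings;
    }
}
```

With 100 simultaneous visitors, a short absolute expiry like this keeps the database load to at most a couple of queries per minute per page.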

clip_image010

 

228 animations were published to the website using Windows Azure Media Services. The peak inbound data was 6.57 GB per hour, and the maximum encoding job queue depth reached 43 jobs.

clip_image012

The worker roles used four storage accounts, for animation, rendering, encoding, and media storage. The rendering storage account peaked at 2,105,873 queue requests per hour, an average of 585 requests per second. The peak for blob storage was 415,435 requests per hour, an average of 115 requests per second.

 

clip_image014

Creating the Global Render Lab – The Two-Dollar Demo

Back in 2010 there was a lot of buzz around Windows Azure and Cloud Computing as they were, and still are, new and rapidly evolving technologies. I had sat through a number of presentations where the scalability of cloud-based solutions was evangelized, but had never seen anyone demonstrate this scalability on stage. I wanted to create a demo that I could show during a 60 minute conference or user group presentation that would demonstrate this scalability.

My first job in IT was as a 3D animator, and I had initially learned to create animations using PolyRay, a text-based 3D ray-tracer. Creating ray-traced animations is very processor intensive, so running PolyRay in a Windows Azure worker role and then scaling the number of worker roles to create an animation would be a great way to demonstrate the scalability of cloud-based solutions. I created a very simple Windows Azure Cloud Service application that used PolyRay to render about 200 frames that I could use to create an animation.
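The worker role’s render loop might be sketched as follows, using the storage client library; the queue and container names and the RenderFrame helper are placeholders I’ve assumed, not the actual implementation:

```csharp
// Hypothetical sketch of the render worker's loop: de-queue a frame
// message, render it with the ray-tracer, upload the result to blob
// storage. Queue/container names and RenderFrame are illustrative.
using System;
using System.IO;
using System.Threading;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

public class RenderWorker
{
    public void Run(CloudStorageAccount account)
    {
        var queue = account.CreateCloudQueueClient().GetQueueReference("frames");
        var container = account.CreateCloudBlobClient().GetContainerReference("rendered");

        while (true)
        {
            CloudQueueMessage message = queue.GetMessage();
            if (message == null)
            {
                Thread.Sleep(1000); // queue empty, back off briefly
                continue;
            }

            string frameFile = RenderFrame(message.AsString); // runs PolyRay on the scene
            container.GetBlockBlobReference(Path.GetFileName(frameFile))
                     .UploadFromFile(frameFile, FileMode.Open);
            queue.DeleteMessage(message); // only delete after a successful render
        }
    }

    private string RenderFrame(string scene) { /* invoke PolyRay here */ return "frame.png"; }
}
```

Because each frame is an independent queue message, adding worker role instances scales rendering throughput almost linearly.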

The first time I showed the demo was in Göteborg, Sweden in October 2010. As I was using the Windows Azure benefits in my MSDN subscription I had 20 cores available, and I demoed the application scaling to 16 cores. As the compute costs at the time were $0.12 per core per hour, 16 cores would cost $1.92, so I joked with the audience that it was my two-dollar demo.

Grid Computing with 256 Worker Roles and Kinect

Running on 16 instances was fine, but I really wanted to make the demo a little more impressive. Scaling to 256 instances seemed like the next logical step, and this would cost a little over $30 to run for an hour. With 256 instances I really needed a more impressive way to be able to create animations. I hit on the idea of using the depth camera in a Kinect sensor to capture depth data that could be used to create a 3D animation.

The image below of my daughter and I is taken using a Kinect depth camera.

 

clip_image016

For the animation I chose to model one of those pin-board desktop toys that were popular in the 80s. I used a simple C# application to do this; it generates a scene file for the PolyRay ray-tracer, using the pixel values of the image to determine the positions of the pins. The image below shows the frame that would be rendered using the image above.

clip_image018

I also added Windows Azure Media Services and Windows Azure Websites into the demo so that the completed animation would be encoded into MP4 format and published on a website.

Scaling an application to 256 worker roles live on stage is an exciting demo to do, but I do get a little nervous every time I do it, as I am heavily reliant on the Windows Azure datacenter I am using being able to allocate the resources on-demand when I need them. I have delivered the Grid Computing with 256 Worker Roles and Kinect demo a number of times at various user groups and conferences and, usually, the demo works fine. It typically takes about 10-20 minutes for the application to scale from 4 roles to 256 roles.
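The scale-out itself is driven by the instance count in the service configuration; a fragment along the following lines (the service and role names are my illustrative assumptions) is all that changes between 4 and 256 instances:

```xml
<!-- Illustrative ServiceConfiguration fragment; service and role names
     are assumptions. Changing the Instances count and applying the
     configuration is what triggers the scale-out. -->
<ServiceConfiguration serviceName="RenderService"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="RenderWorker">
    <Instances count="256" />
  </Role>
</ServiceConfiguration>
```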

Azure Training Course

I have adapted the demo to use as a lab in my Windows Azure training courses. The class is divided into two teams that compete with each other to render the most frames. The lab involves some coding on the solution, then building a deployment package and deploying it to a Windows Azure Cloud Service. The students are free to scale the number of worker roles they are using to compete with the other team.

 

image

I found that the lab really encourages teamwork and cooperation, as when one student gets their solution deployed they will help the others on their team to complete the lab and get more worker roles running. I use a simple WPF application to keep the score.

 

clip_image022

If you are interested in attending one of my courses, details are here.

Global Windows Azure Bootcamp

In early 2013 Magnus Mårtensson and I had discussed the idea of running an Azure bootcamp in Stockholm. We decided it would be a great idea to involve some of the other MVPs in Europe and the US and ask if they were interested in running bootcamps in their regions on the same day. This would make for a great community spirit, and allow us to share ideas and publicize the events.

We set a date for Saturday 27th April, and started to get others involved. At the MVP summit in February we invited Azure MVPs and MVPs from other technologies to organize and run their own bootcamp events on the same day. We got a great response, and it resulted in close to 100 events planned in almost 40 countries, with over 7,000 people registered to attend.

Global Render Lab Concept

Another thing we discussed at the MVP summit was the idea of having some kind of lab, or project that all the events could participate in. This would really help to drive the community spirit and connect the groups in different regions. It would also help to make the event truly global, by having participants around the world cooperating to achieve one goal.

As I had the worker role animation rendering lab for my course ready to go, I suggested it would make a great lab for the Global Windows Azure Bootcamp. It should be fairly easy to convert the lab to work with the different locations working as teams, and create a website that would display the scores.

It would be great fun to have all the different countries and locations competing with each other to render the most animation frames. The challenge would be to ensure that the application would scale to a global level and be able to handle the load that the attendees would place on it.

Creating the Global Render Lab

The challenge I had in creating the lab was to make something that every student could participate in. We originally anticipated that we would have about 10 events, with an average of 50 people at each, giving a maximum of around 500 participants. As the event drew nearer we realized that we had been very conservative in our estimates: the event would be ten times larger than we originally planned, with close to 100 events and over 7,000 attendees.

The challenge for me when I was creating the lab was to make something that could scale to tens of thousands of worker role instances if required. What made this even more challenging was that there would be no way to test this scalability before the event, and I had no control over the instances that were running, as they would be deployed by attendees in different locations around the world. The lab would last for 26 hours, starting in Sydney and Melbourne, Australia, and ending in San Diego, California, meaning that if I was going to be monitoring the lab throughout the event, I was not going to get much sleep.

Another potential issue with the event being much more popular than we expected was the load that it would place on Windows Azure. As I was hosting the storage services in the North Europe datacenter, the attendees would be deploying their worker roles there. We asked the Azure team if there would be any problems if the bootcamp attendees tried to create over 10,000 worker roles in one data center, and were assured that it would not be an issue.

The students would have a deployment package that they could deploy to Windows Azure using their own subscriptions. The events would also be able to use a Windows application and a Kinect controller to create and upload animations that would be processed by the global render farm.

There is a webcast with an overview of the lab here.

Running the Global Render Lab

The event kicked off in Sydney and Melbourne, Australia on the morning of Saturday 27th April. In Sweden it was midnight, and I was at home monitoring the progress. The students would hopefully start deploying their worker roles early on, so I could check that everything was running smoothly and then get some sleep.

To monitor the render lab I added code to the worker roles to send messages to a Windows Azure Storage queue, and used a simple C# console application to receive the messages. This meant I would receive notifications when worker roles started or stopped, when animations were completed, and also any exceptions that were thrown by worker roles. I could keep track of the thousands of running worker roles from one console window. I even used Console.Beep() so that I would know when things were happening if I was not sat at my PC.
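A sketch of that monitoring console, with the queue name and the “ERROR” message convention as my own assumptions:

```csharp
// Hypothetical sketch of the monitoring console: poll a storage queue
// for status messages from the worker roles and beep on errors.
// The "monitoring" queue name and "ERROR" prefix are illustrative.
using System;
using System.Threading;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

public static class MonitorConsole
{
    public static void Run(CloudStorageAccount account)
    {
        var queue = account.CreateCloudQueueClient().GetQueueReference("monitoring");
        while (true)
        {
            CloudQueueMessage message = queue.GetMessage();
            if (message == null)
            {
                Thread.Sleep(500); // nothing new, poll again shortly
                continue;
            }
            Console.WriteLine("{0:HH:mm:ss} {1}", DateTime.UtcNow, message.AsString);
            if (message.AsString.StartsWith("ERROR"))
            {
                Console.Beep(); // audible alert when away from the screen
            }
            queue.DeleteMessage(message);
        }
    }
}
```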

At 01:14 Swedish time Alex Thomas at the Melbourne event in Australia started the first worker role instance, closely followed by other attendees at that event. A total of 93 worker role instances were created at the Melbourne event, which rendered almost 9,000 frames of animation. At 04:00 on Saturday morning the lab was running smoothly, and so I decided to get a few hours sleep before the events in Europe started.

I woke up at 07:00, and some of the Eastern European events, along with the events in India, had started. Things were still running fine, and we had a few hundred instances running. The event in Stockholm that I was running started at 10:00, and I got there early to open up and start the event with a short presentation about the Global Render Lab. Robert Folkesson and Chris Klug did a fantastic job delivering sessions and assisting the students with labs, whilst I spent most of the event monitoring the render lab. Germany, Belgium, Denmark, the UK and The Netherlands really got into the spirit of the lab, with Germany creating over 1,200 worker roles in total.

 

At 10:30 Central European time we had over 1,000 running worker roles, and by 11:30 over 2,000. By 16:00 we had over 4,000 instances running, and this level was maintained for the rest of the event as attendees in Europe deleted their deployments and attendees in the USA deployed and scaled up theirs.

I had also included a “Sharks with Freakin Lasers” easter egg in the animation creator that some of the attendees discovered.

clip_image024

By the time the events in Europe were closing, the events in the USA and Brazil had started. I got home from the Stockholm event at 17:00 and, after some family time, was back monitoring the lab. The USA had about 14 events competing in the render lab, and they were trying to catch up with Germany.

clip_image026

The USA had a total of 2,477 worker roles deployed during the event, compared to Germany’s 1,260, so by the end of the event the USA had taken first place among the countries, with Berlin taking first place among the locations.

 

clip_image028

Issues Running the Lab

Two or three days before the event I was pretty sure that the Global Render Lab would be a failure, and was seriously considering cancelling it at the last minute. About three days before the event I was hitting a lot of reliability issues; I have not had time to diagnose exactly what caused them, but will hopefully cover them in a later report. 12 hours before the event kicked off I hit a major potential show-stopper in my code, but with help from Maarten Balliauw I was able to resolve it quickly.

Thanks to the time invested by some of the event organizers during the testing phases of the lab, I was able to detect a number of other issues that could have been potential show-stoppers on the day. The need to be able to deploy a new version if required, and to put all the running worker roles into an idle state, was quickly identified, as was the need to be able to reduce the load on storage accounts by disabling worker roles by country, location, attendee, or specific role instance. I had no control over the deployment and deletion of the worker roles, but I needed some control over how they ran against the storage accounts.

A number of animations failed to complete and got stuck in the render queue with a status of Encoding. This was mostly due to the way I had implemented the encoding process in the worker role, but also due to the way the students created and deleted deployments. Worker roles were being deleted throughout the event, sometimes at a rate of over 100 per minute, which meant that some long-running tasks would fail to complete.
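One way to mitigate this kind of failure (a sketch under my own assumptions, not the lab’s actual code) is to de-queue messages with a visibility timeout longer than the task, so that a message reappears for another instance if the worker that took it is deleted mid-task:

```csharp
// Sketch of using a queue message visibility timeout so that work
// survives a worker role being deleted mid-task. The 15-minute lease
// is an assumption about the longest expected task duration.
using System;
using Microsoft.WindowsAzure.Storage.Queue;

public static class ReliableDequeue
{
    public static void ProcessNext(CloudQueue queue, Action<string> longRunningTask)
    {
        // The message stays invisible for 15 minutes; if this role is
        // deleted before DeleteMessage runs, the message reappears and
        // another instance can pick the job up.
        CloudQueueMessage message = queue.GetMessage(TimeSpan.FromMinutes(15));
        if (message == null) return;

        longRunningTask(message.AsString);
        queue.DeleteMessage(message); // only after the task completes
    }
}
```

The trade-off is that an abandoned job is only retried once the lease expires, so the timeout should be kept as close to the real task duration as possible.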

Lessons Learned

Overall I felt that the lab was a great success. From the photos captured by the attendees who uploaded animations, it looked like they were enjoying using the application. Many of the events took part in the lab, with some of them taking the competitive aspect seriously. It would have been great to have more of the locations taking part; more effort could have been made to promote the lab and to make sure that content was provided to attendees in their native languages.

On the whole the application stood up to the load we placed on it. Some attendees had to wait a long time for their animations to be rendered and encoded; the job queue depth on the Media Services account indicates that increasing the encoding capacity there would have reduced this time.

There were a few reliability issues that meant some animations never got encoded, so there is scope for improvement there. The range of different animations that can be selected and rendered from the depth data could also be extended.

Global Render Lab 2.0

The project started out as a simple demo in 2010 and has been extended and improved to make the solution we used for the Windows Azure Boot Camp. I plan to continue working with the project when I get the time and make more improvements. I have a large project backlog list for the Global Render Lab, there were so many cool things that I wanted to add to it, but a limited amount of time available.

It was great fun to run the lab, and hopefully there will be opportunities to do something similar in the future. Feel free to contact me via this blog if you have any suggestions or questions about the lab. I’d be happy to deliver sessions detailing the background of the lab at user groups and conferences if there is an opportunity for that.

Posted On Sunday, April 28, 2013 9:46 PM | Comments (0) |

Friday, April 26, 2013

Global Windows Azure Bootcamp – Please Bring Your Kinect

I’m just putting the finishing touches on the Global Render Lab for the Global Windows Azure Bootcamp. The lab will allow bootcamp attendees around the world to join together to create a render farm in Windows Azure that will render 3D ray-traced animations created using depth data from Kinect controllers.

There is a webcast with an overview of the Global Render Lab here.

If you are attending a Global Windows Azure Bootcamp event you will have the chance to deploy an application to Windows Azure that will contribute processing power to the render farm. You will also have the chance to create animations that will be rendered in Windows Azure and published to a website.

A Kinect controller is required to create animations. If you have either a Kinect for Windows, or an Xbox Kinect with a power supply and an adapter for a standard USB connection, please take it with you to the Global Windows Azure Bootcamp event you are attending. Having as many locations as possible where attendees can create and upload animations will make for a great community lab on a global scale.

Posted On Friday, April 26, 2013 12:27 AM | Comments (0) |

Monday, April 1, 2013

Website Authentication with Social Identity Providers and ACS Part 3 - Deploying the Relying Party Application to Windows Azure Websites

In the third part of the series looking at website authentication with social identity providers, I’ll focus on deploying the relying party application to a Windows Azure Website. This will require making some configuration changes in the ACS management console and the web.config file, and also changing the way that session cookies are created.

The other parts of this series are here:

Website Authentication with Social Identity Providers and ACS Part 1

Website Authentication with Social Identity Providers and ACS Part 2 – Integrating ACS with the Universal Profile Provider

The relying party website has now been developed and tested in a development environment. The next stage is to deploy the application to Windows Azure Websites so that users can access it over the internet. The use of the universal profile provider and a Windows Azure SQL Database as a store for the profile information means that no changes in the database or configuration will be required when migrating the application.

Creating a Windows Azure Website

The first step is to create a Windows Azure Website in the Azure management portal. The following screenshot shows the creation of a new website with the URL of relyingpartyapp.azurewebsites.net in the West Europe region.

image

Note that the URL of the website must be unique globally, so if you are working through this solution, you will probably have to choose a different URL.

Configuring a Relying Party Application

With the website created, the relying party information will have to be configured in Windows Azure Active Directory Access Control, formerly known as Windows Azure Access Control Service (ACS). This is because the relying party configuration is specific to the URL of the website.

One option here is to create a new relying party with the URL of the Windows Azure Website, which would allow testing of the on-premises application as well as the Azure-hosted website. This would require the existing identity providers and rules to be recreated and edited for the new application. A quicker option is to modify the existing configuration, which is what I will do here.

The existing relying party application is present in the relying party applications section of the ACS portal.

image

In order to change the configuration for the host application the name, realm and return URL values will be changed to the URL of the Windows Azure Website URL (http://relyingpartyapp.azurewebsites.net/). The screenshot below shows these changes.

image

For consistency, the name of the rule group will also be changed appropriately.

image

With these changes made, ACS will now function for the application when it is hosted in Windows Azure Websites.

 

Configuring the Relying Party Application

For the relying party application website to integrate correctly with ACS, the URL for the website will need to be updated in two places in the web.config file. The following code highlights where the changes are made.

<system.identityModel>
  <identityConfiguration>
    <claimsAuthenticationManager type="RelyingPartyApp.Code.CustomCam, RelyingPartyApp" />
    <certificateValidation certificateValidationMode="None" />
    <audienceUris>
      <add value="http://relyingpartyapp.azurewebsites.net/" />
    </audienceUris>
    <issuerNameRegistry type="System.IdentityModel.Tokens.ConfigurationBasedIssuerNameRegistry, System.IdentityModel, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089">
      <trustedIssuers>
        <add thumbprint="B52D78084A4DF22E0215FE82113370023F7FCAC4" name="https://acsscenario.accesscontrol.windows.net/" />
      </trustedIssuers>
    </issuerNameRegistry>
  </identityConfiguration>
</system.identityModel>

<system.identityModel.services>
  <federationConfiguration>
    <cookieHandler requireSsl="false" />
    <wsFederation
      passiveRedirectEnabled="true"
      issuer="https://acsscenario.accesscontrol.windows.net/v2/wsfederation"
      realm="http://relyingpartyapp.azurewebsites.net/"
      requireHttps="false" />
  </federationConfiguration>
</system.identityModel.services>

 

 

 

Deploying the Relying Party Application to Windows Azure Websites

The next stage is to deploy the relying party application to Windows Azure Websites. Clicking the download publish profile link allows a publish profile for the website to be saved locally; this can then be used by Visual Studio to deploy the website to Windows Azure Websites.

image

Be aware that the publish profile contains the credential information required to deploy the website. This information is sensitive, so adequate precautions must be taken to ensure it stays confidential.

To publish the relying party application from Visual Studio, right-click on the RelyingPartyApp project, and select Publish.

image

Clicking the Import button will allow the publish profile that was downloaded from the Azure management portal to be selected.

image

When the publish profile is imported, the details will be shown in the dialog, and the website can be published.

image

After publication the browser will open, and the default page of the relying party application will be displayed.

image

Testing the Application

In order to verify that the application integrates correctly with ACS, the login functionality will be tested by clicking on the members page link and logging on with a Yahoo account. The authentication process takes place successfully; however, when ACS routes the browser back to the relying party application with the security token, the following error is displayed.

image

Note that I have configured the website to turn off custom errors.

<system.web>

  <customErrors mode="Off"/>

  <authorization>

    <!--<deny users="?" />-->

  </authorization>

  <authentication mode="None" />

</system.web>

 

The next section will explain why the error is occurring, and how the relying party application can be configured to resolve it.

Configuring the Machine Key Session Security Token Handler

The default SessionSecurityTokenHandler used by WIF protects the session cookie using DPAPI, which is not supported in Windows Azure Websites. The resolution is to configure the MachineKeySessionSecurityTokenHandler, which uses the ASP.NET machine key to protect the cookie, to be used instead. This is configured in the identityConfiguration section of the system.identityModel configuration for the website, as shown below.

<system.identityModel>

  <identityConfiguration>

    <securityTokenHandlers>

      <remove type="System.IdentityModel.Tokens.SessionSecurityTokenHandler, System.IdentityModel, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" />

      <add type="System.IdentityModel.Services.Tokens.MachineKeySessionSecurityTokenHandler, System.IdentityModel.Services, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" />

    </securityTokenHandlers>

    <claimsAuthenticationManager type="RelyingPartyApp.Code.CustomCam, RelyingPartyApp" />

    <certificateValidation certificateValidationMode="None" />

    <audienceUris>

      <add value="http://relyingpartyapp.azurewebsites.net/" />

    </audienceUris>

    <issuerNameRegistry type="System.IdentityModel.Tokens.ConfigurationBasedIssuerNameRegistry, System.IdentityModel, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089">

      <trustedIssuers>

        <add thumbprint="B52D78084A4DF22E0215FE82113370023F7FCAC4" name="https://acsscenario.accesscontrol.windows.net/" />

      </trustedIssuers>

    </issuerNameRegistry>

  </identityConfiguration>

</system.identityModel>

 

With those changes made, the website can be deployed, and the authentication functionality tested again.

image

This time the authentication works correctly; the browser is redirected to the members page and the claims in the security token are displayed.

 

 

Posted On Monday, April 1, 2013 10:12 PM | Comments (0) |

Sunday, March 24, 2013

Website Authentication with Social Identity Providers and ACS Part 2 – Integrating ACS with the Universal Profile Provider

This is the second in a series of articles looking at using Windows Azure Active Directory Access Control (formerly Windows Azure Access Control Service) to integrate the authentication services provided by social identity providers to authenticate users with a website. This walkthrough continues on directly from Website Authentication with Social Identity Providers and ACS Part 1, and adds integration with the new .NET Universal Profile Provider.

Integrating ACS and Social Identity Providers with the .NET Universal Profile Provider

The site currently allows users to authenticate with a number of social identity providers, but does not allow users to build up a site profile, or store any user specific information. In many cases, once a user is authenticated, there will be a requirement for the user to store information or configure a site profile. In this step the ASP.NET Universal Profile Provider will be used to store user profile information. The profile provider will need to be integrated with the claims provided by ACS so that the authenticated user can be matched to their saved profile.

The architecture of the proposed implementation is shown below.

image

 

.NET Universal Providers

The .NET Universal Providers provide integration with multiple data sources, including SQL Server, SQL Server Compact and, most importantly for Azure developers, Windows Azure SQL Database. Scott Hanselman has an excellent blog post Introducing System.Web.Providers - ASP.NET Universal Providers for Session, Membership, Roles and User Profile on SQL Compact and SQL Azure, which provides some background on the providers.

 

Installing the Microsoft .NET Universal Providers

The Microsoft ASP.NET Universal Providers are installed as a NuGet package; searching for “universal” in the Manage NuGet Packages dialog box will find the package in the search results. Clicking the Install button will install the universal providers and their dependencies.

image
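The package can also be installed from the Package Manager Console. The package id below is the one the dialog search typically returns for the universal providers; verify it against the NuGet gallery before use, as package names have changed over time.

```
PM> Install-Package Microsoft.AspNet.Providers
```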

 

The installation of the NuGet package will also make changes to the Web.config file for the application to configure the providers.

 

<system.web>

  <authorization>

    <!--<deny users="?" />-->

  </authorization>

  <authentication mode="None" />

  <compilation debug="true" targetFramework="4.5" />

  <httpRuntime targetFramework="4.5" requestValidationMode="4.5" />

  <profile defaultProvider="DefaultProfileProvider">

    <providers>

      <add

        name="DefaultProfileProvider"

        type="System.Web.Providers.DefaultProfileProvider, System.Web.Providers, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"

        connectionStringName="DefaultConnection"

        applicationName="/" />

    </providers>

    <properties>

      <add name="IdentityProvider" />

      <add name="Name" />

      <add name="Email" />

      <add name="Twitter" />

      <add name="Location"/>

    </properties>

  </profile>

  <membership defaultProvider="DefaultMembershipProvider">

    <providers>

      <add name="DefaultMembershipProvider" type="System.Web.Providers.DefaultMembershipProvider, System.Web.Providers, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"

            connectionStringName="DefaultConnection"

    enablePasswordRetrieval="false"

    enablePasswordReset="true"

    requiresQuestionAndAnswer="false"

    requiresUniqueEmail="false"

    maxInvalidPasswordAttempts="5"

    minRequiredPasswordLength="6"

    minRequiredNonalphanumericCharacters="0"

    passwordAttemptWindow="10"

    applicationName="/" />

    </providers>

  </membership>

  <roleManager defaultProvider="DefaultRoleProvider">

    <providers>

      <add connectionStringName="DefaultConnection" applicationName="/"

        name="DefaultRoleProvider" type="System.Web.Providers.DefaultRoleProvider, System.Web.Providers, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" />

    </providers>

  </roleManager>

  <!--

        If you are deploying to a cloud environment that has multiple web server instances,

        you should change session state mode from "InProc" to "Custom". In addition,

        change the connection string named "DefaultConnection" to connect to an instance

        of SQL Server (including SQL Azure and SQL  Compact) instead of to SQL Server Express.

  -->

  <!--<sessionState mode="InProc" customProvider="DefaultSessionProvider">

  <providers>

    <add name="DefaultSessionProvider" type="System.Web.Providers.DefaultSessionStateProvider, System.Web.Providers, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" connectionStringName="DefaultConnection" />

  </providers>

</sessionState>-->

</system.web>

 

A connection string for the providers has also been added at the end of the Web.config file.

  <system.identityModel.services>

    <federationConfiguration>

      <cookieHandler requireSsl="false" />

      <wsFederation passiveRedirectEnabled="true" issuer="https://acsscenario.accesscontrol.windows.net/v2/wsfederation" realm="http://win7base/RelyingPartyApp/" requireHttps="false" />

    </federationConfiguration>

  </system.identityModel.services>

  <connectionStrings>

    <add

      name="DefaultConnection"

      providerName="System.Data.SqlClient"

      connectionString="Data Source=.\SQLEXPRESS;Initial Catalog=aspnet-RelyingPartyApp-20130315201440;Integrated Security=SSPI" />

  </connectionStrings>

</configuration>

 

Instead of using a local database in SQL Server Express, a database named RelyingPartyApp in Windows Azure SQL Database will be used. The connection string is modified as follows, with sensitive information replaced. The RelyingPartyApp database has been created as an empty database in Windows Azure SQL Database, but no tables for the profile information have been added yet.

  <system.identityModel.services>

    <federationConfiguration>

      <cookieHandler requireSsl="false" />

      <wsFederation passiveRedirectEnabled="true" issuer="https://acsscenario.accesscontrol.windows.net/v2/wsfederation" realm="http://win7base/RelyingPartyApp/" requireHttps="false" />

    </federationConfiguration>

  </system.identityModel.services>

  <connectionStrings>

    <add

      name="DefaultConnection"

      providerName="System.Data.SqlClient"

      connectionString="Server=tcp:SERVERNAME.database.windows.net,1433;Database=RelyingPartyApp;User ID=USER@SERVERNAME;Password=PASSWORD;Trusted_Connection=False;Encrypt=True;Connection Timeout=30;" />

  </connectionStrings>

</configuration>

 

 

Configuring the Universal Profile Provider

The universal profile provider will store profile information for the site members. The values that will be stored in the profile for each user are specified in the profile provider configuration. The following configuration is added in the profile section; it specifies that IdentityProvider, Name, Email, Twitter and Location exist as fields for each user's profile.

<profile defaultProvider="DefaultProfileProvider">

  <providers>

    <add

      name="DefaultProfileProvider"

      type="System.Web.Providers.DefaultProfileProvider, System.Web.Providers, Version=1.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"

      connectionStringName="DefaultConnection"

      applicationName="/" />

  </providers>

  <properties>

    <add name="IdentityProvider" />

    <add name="Name" />

    <add name="Email" />

    <add name="Twitter" />

    <add name="Location" />

  </properties>

</profile>

 

 

Creating the Membership and Profile Tables

The next step is to create the storage tables for the site membership and profile information in the Windows Azure SQL Database. This can be performed using the ASP.NET Configuration tool in Visual Studio.

With the RelyingPartyApp project selected in Solution Explorer, the ASP.NET Configuration option can be selected from the Project menu.

image

This will open the ASP.NET Web Site Administration Tool in the browser. When the welcome page is opened, the tool will use the database connection string specified in the configuration file to connect to the database. If the database does not contain the tables required to store the website membership and profile information, they will be created.

image

If the database connection string is configured correctly, the tables will be created in the Windows Azure SQL Database. The following screenshot shows the six empty tables created in the RelyingPartyApp database.

image

 

If the tables are not created, check the database connection string and firewall rules to ensure that the ASP.NET Web Site Administration Tool can connect successfully with the database.

 

Identifying a Unique User

One of the challenges of using social identity providers to authenticate users is identifying users based on the claims supplied by these providers. The claims provided by the three social identity providers are summarized in the table below.

Claim             | Supplied by                      | Notes

Name Identifier   | Microsoft Account, Yahoo, Google | Guaranteed to be unique for a user within the identity provider. Guaranteed to be immutable.

Name              | Google, Yahoo                    | No guarantee of uniqueness or immutability.

Email             | Google, Yahoo                    | Guaranteed to be unique for a user within the identity provider. Guaranteed to be immutable.

Identity Provider | ACS                              | Uniquely identifies the identity provider.

 

The email address is not usable, as it is not supplied by Microsoft Account. The name is not suitable either: it is not supplied by Microsoft Account, there is no guarantee of uniqueness, and the value can be changed by the account holder.

The only claim that is supplied by all three social identity providers is the name identifier claim. This is guaranteed to be unique within the identity provider, and also guaranteed to be immutable, meaning that once an account has been created, the name identifier value for that account will not change.

Name identifier, combined with the identity provider claim supplied by ACS, is the best option, as this ensures that the values are unique for each provider. Although the name identifier alone may seem sufficient, it would be possible for a rogue identity provider to submit a name identifier claim that belongs to an account in another identity provider.

The Users table in the website membership and profile database specifies that the UserName for the user is of datatype nvarchar(50). This is shown in the screenshot below.

image

In order to integrate with the universal profile provider database, the name that is used to uniquely identify the user must be 50 characters or less. The length of the name identifier claim supplied by the social identity providers is typically larger than 50 characters, and when combined with the identity provider claim as well, it will be even larger. One option to resolve this issue is to hash the combination of the identity provider and name identifier claims. This also adds an extra level of security, as the identity provider and name identifier values cannot, in theory, be determined from the hash of their values. This will be done using a custom authentication manager in the website in a later section.
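As a sanity check on the 50-character limit: an MD5 hash is 16 bytes, which renders as 32 hexadecimal characters, so the hashed identifier always fits. The sketch below is in Python purely for illustration (the site itself implements this in C#), and the claim values are made up.

```python
import hashlib

# Hypothetical claim values; real ones are supplied by ACS in the security token.
identity_provider = "uri:WindowsLiveID"
name_identifier = "https://me.yahoo.com/a/AbC123XyZ"

# Combine the identity provider and name identifier claims, then hash,
# mirroring the approach described above.
combined = (identity_provider + name_identifier).encode("ascii")
hashed = hashlib.md5(combined).hexdigest().upper()

print(hashed)       # 32 uppercase hex characters
print(len(hashed))  # always 32, comfortably inside nvarchar(50)
```

Whatever the length of the underlying claims, the stored username is a fixed 32 characters, so it fits the Users table without truncation.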

Modifying Claim Transformation Rules in ACS

As the user is going to be identified in the universal profile provider by the hashed values of the identity provider and name identifier claims, this value will be supplied to the application as the name claim, and used as the username for that identity. This means that the values for the name claim supplied by Google and Yahoo will need to be mapped in ACS and sent to the ASP.NET relying party application as another claim.

The claim transformation rules in ACS can be used to map the name claims supplied by Google and Yahoo to the given name claim. This claim can then be used by the ASP.NET relying party application to set the name used by the site visitor in the profile. In order to do this, the two rules that pass the value of the name claim for Google and Yahoo will need to be modified to output the claim value as the given name claim. The following screenshot shows the modification made to the name claim rule for Yahoo.

image

With these changes made, the rule set can be saved. The same changes are made for the rule for Google, the modified rule group is shown below.

image

Combining the values of the identity provider and name identifier claims and hashing them is currently not possible using the rule sets in ACS. The next section will show how the name claim can be added to the incoming claims from ACS when the user is authenticated in the relying party application.

Creating a Custom Authentication Manager

Windows Identity Foundation is an extensible framework that can be used at a number of levels of complexity. So far we have used the Identity and Access wizard, and modified the configuration file. In this section the core functionality of WIF will be extended to add a custom claims authentication manager to the ASP.NET relying party application, and then the configuration will be modified to make use of it.

Claims Authentication Managers

Claims authentication managers sit in the Windows Identity Foundation authentication pipeline and provide a place where the claims in the incoming security token can be processed before they reach the application code. Common tasks are claims validation, claims filtering, and adding claims to the security token. Developers can create their own custom claims authentication managers in order to process and manipulate claims in the pipeline.

Creating a Custom Authentication Manager

Creating a custom authentication manager for the relying party application is fairly straightforward. A folder named Code is added to the web project, and a class named CustomCam is created in that folder. A project reference to System.IdentityModel is then added, and the CustomCam class is derived from System.Security.Claims.ClaimsAuthenticationManager, with the Authenticate and LoadCustomConfiguration methods overridden.

using System.Linq;

using System.Security.Claims;

using System.Security.Cryptography;

using System.Text;

 

namespace RelyingPartyApp.Code

{

    public class CustomCam : ClaimsAuthenticationManager

    {

        public override ClaimsPrincipal Authenticate

            (string resourceName, ClaimsPrincipal incomingPrincipal)

        {

            return base.Authenticate(resourceName, incomingPrincipal);

        }

 

        public override void LoadCustomConfiguration(System.Xml.XmlNodeList nodelist)

        {

            base.LoadCustomConfiguration(nodelist);

        }

    }

}

 

Modifying Claims in a Custom Authentication Manager

The Authenticate method will extract the values for the incoming identity provider and name identifier claims, combine them together, create an MD5 hash of the resulting string, and then add a new name claim with the hashed string set as the value. The implementation for this is shown below.

 

using System.Linq;

using System.Security.Claims;

using System.Security.Cryptography;

using System.Text;

 

namespace RelyingPartyApp.Code

{

    public class CustomCam : ClaimsAuthenticationManager

    {

        public override ClaimsPrincipal Authenticate

            (string resourceName, ClaimsPrincipal incomingPrincipal)

        {

            // Get the values of the identity provider and name identifier claims.

            string identityProvider = "";

            string nameIdentifier = "";

            foreach (Claim claim in incomingPrincipal.Claims)

            {

                // IdentityProvider

                if (claim.Type.Equals            

                 ("http://schemas.microsoft.com/accesscontrolservice/2010/07/claims/identityprovider"))

                {

                    identityProvider = claim.Value;

                }

 

                // NameIdentifier

                if (claim.Type.Equals(ClaimTypes.NameIdentifier))

                {

                    nameIdentifier = claim.Value;

                }

            }

 

            // Create an MD5 hash of the value.

            string text = identityProvider + nameIdentifier;

            MD5 md5 = MD5.Create();

            byte[] inputBytes = System.Text.Encoding.ASCII.GetBytes(text);

            byte[] hash = md5.ComputeHash(inputBytes);

            StringBuilder sb = new StringBuilder();

            for (int i = 0; i < hash.Length; i++)

            {

                sb.Append(hash[i].ToString("X2"));

            }

            string hashedIdentifier = sb.ToString();

 

            // Add a name claim with the hashed identifier.

            incomingPrincipal.Identities.First().AddClaim(new Claim(ClaimTypes.Name, hashedIdentifier));

 

            return base.Authenticate(resourceName, incomingPrincipal);

        }

 

        public override void LoadCustomConfiguration(System.Xml.XmlNodeList nodelist)

        {

            base.LoadCustomConfiguration(nodelist);

        }

    }

}

 

 

Configuring the Custom Claims Authentication Manager

With the custom claims authentication manager created, it must be defined in the configuration so that the appropriate methods will be executed when the security token is received by the relying party application. The configuration is made in the identityConfiguration section of the system.identityModel section.

<system.identityModel>   

  <identityConfiguration>

    <claimsAuthenticationManager type="RelyingPartyApp.Code.CustomCam, RelyingPartyApp" />       

    <certificateValidation certificateValidationMode="None" />

    <audienceUris>

      <add value="http://win7base/RelyingPartyApp/" />

    </audienceUris>

    <issuerNameRegistry type="System.IdentityModel.Tokens.ConfigurationBasedIssuerNameRegistry, System.IdentityModel, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089">

      <trustedIssuers>

        <add thumbprint="B52D78084A4DF22E0215FE82113370023F7FCAC4" name="https://acsscenario.accesscontrol.windows.net/" />

      </trustedIssuers>

    </issuerNameRegistry>

  </identityConfiguration> 

</system.identityModel>

 

 

Testing the Custom Claims Authentication Manager

With the custom claims authentication manager now created and configured in the relying party application, the functionality can be tested. Logging onto the application using Yahoo as an identity provider shows the following claims.

image

 

The custom claims authentication manager has added a new name claim with a value of the hash of the identity provider and the name identifier. This will provide a unique value that can be inserted in the appropriate tables in the membership and profile database.

 

Creating and Modifying Site Profiles

In this step a web page will be added to allow site members to edit and save their profile information. The profile information will include name, email, Twitter ID and location. A web form named EditProfile.aspx is created in the Members folder; the markup for the page is shown below.

<%@ Page Language="C#" AutoEventWireup="true" CodeBehind="EditProfile.aspx.cs" Inherits="RelyingPartyApp.Members.EditProfile" %>

<!DOCTYPE html>

<html xmlns="http://www.w3.org/1999/xhtml">

<head runat="server">

    <title></title>

</head>

<body>

    <form id="form1" runat="server">

        <div>

            <article>

                <p>Identity Provider: <asp:Label ID="lblIdentityProvider" runat="server" /></p>

                <p></p>

 

                <p>Name</p>

                <p><asp:TextBox ID="txtName" runat="server" /> </p>

                <p></p>

 

                <p>Email address</p>

                <p><asp:TextBox ID="txtEmail" runat="server" /></p>

                <p></p>

 

                <p>Twitter ID</p>

                <p><asp:TextBox ID="txtTwitter" runat="server" /></p>

                <p></p>

 

                <p>Location</p>

                <p><asp:TextBox ID="txtLocation" runat="server" /></p>

                <p></p>

 

                <asp:Button ID="btnSave" Text="Save" OnClick="btnSave_Click" runat="server" />

            </article>

        </div>

    </form>

</body>

</html>

 

When a new member visits the profile page, the name and email address claim values will be displayed on the page. The user will have the option of modifying these values, and also adding Twitter and location information, before the profile is saved.

The code behind that implements this is shown below.

using System;

using System.Security.Claims;

using System.Threading;

using System.Web;

using System.Web.Profile;

 

namespace RelyingPartyApp.Members

{

    public partial class EditProfile : System.Web.UI.Page

    {

        protected void Page_Load(object sender, EventArgs e)

        {

            if (!IsPostBack)

            {

                // Get the claim principal for the user.

                ClaimsPrincipal claimsPrincipal = Thread.CurrentPrincipal as ClaimsPrincipal;

 

                // Set the value for the identity provider.

                foreach (Claim claim in claimsPrincipal.Claims)

                {                   

                    if (claim.Type.Equals

             ("http://schemas.microsoft.com/accesscontrolservice/2010/07/claims/identityprovider"))

                    {

                        lblIdentityProvider.Text = claim.Value;

                    }

                }

 

 

                // Get the current user profile

                ProfileBase profile = HttpContext.Current.Profile;

 

                if (!string.IsNullOrEmpty(profile["Name"].ToString()))

                {

                    // If the profile exists, set the user interface details.

                    txtName.Text = profile["Name"].ToString();

                    txtEmail.Text = profile["Email"].ToString();

                    txtTwitter.Text = profile["Twitter"].ToString();

                    txtLocation.Text = profile["Location"].ToString();

                }

                else

                {

                    // If there is no profile, set the details from the claims provided by ACS.

                    foreach (Claim claim in claimsPrincipal.Claims)

                    {

                        // Set the given name.

                        if (claim.Type.Equals(ClaimTypes.GivenName))

                        {

                            txtName.Text = claim.Value;

                        }

 

                        // Set the email address.

                        if (claim.Type.Equals(ClaimTypes.Email))

                        {

                            txtEmail.Text = claim.Value;

                        }

                    }

                }

            }

        }

 

        protected void btnSave_Click(object sender, EventArgs e)

        {

            // Update the profile with the values in the text boxes.

            ProfileBase profile = HttpContext.Current.Profile;

            profile["Name"] = txtName.Text;

            profile["Email"] = txtEmail.Text;

            profile["Twitter"] = txtTwitter.Text;

            profile["IdentityProvider"] = lblIdentityProvider.Text;

            profile["Location"] = txtLocation.Text;

            profile.Save();

        }

 

 

    }

}

 

 

Testing the Implementation

The implementation can now be tested to verify that the universal profile provider integrates with ACS and Windows Identity Foundation. Browsing to the members page causes ACS to display the identity provider selector page. As the name and email claims will be needed to test the values set in the user interface, Yahoo or Google will need to be used. The application will still function with a Microsoft account, but the name and email claim values will not be present in the security token, and so will not be set in the user interface.

image

 

In this case, Yahoo will be used as an identity provider.

image

When the user navigates to the EditProfile page, the claim values from the security token provided by Yahoo, and transformed by ACS are set in the user interface.

image

Remember that the name claim has been mapped to the given name claim, and its value is displayed in the name text box.

The user can then make modifications to the details, entering a different name and email address, and also adding Twitter and location information. In this example I changed the name from Alan Azure to Alan Smith, and added my Twitter handle and location.

image

When the user saves the profile information, a new user and a new profile are created in the membership and profile database in Windows Azure SQL Database.

The following screenshot shows the new user.

image

 

The next screenshot shows the profile, which is linked to the user with the UserId value. The details entered in the profile are shown in the PropertyValueStrings column.

image

 

The next time the user authenticates using the same social identity, they will be able to access, modify and save their own profile information. The site will also be able to maintain profile information for users authenticating with social identity providers.

 

Posted On Sunday, March 24, 2013 6:39 PM | Comments (0) |

Tuesday, March 19, 2013

Presenting Windows Azure Sessions

A couple of weeks ago I was on the road, presenting at SDC in Gothenburg, TechDays in Finland and TechDays in The Netherlands. I delivered five sessions, the “Grid Computing with 256 Windows Azure Worker Roles & Kinect” at all three events, and also two new sessions, “Migrate an On-Premise Application to the Cloud in 60 Minutes” and “Web Site Authentication with Social Identity Providers” at TechDays in The Netherlands.

It was a hectic week, but the events were great to be a part of. SDC in Göteborg had a great speaker’s dinner, as always. The event coincided with an indoor athletics event, so we were not in the usual rooms, but I did get to eat breakfast at the table next to Team GB. I arrived at TechDays in Finland with an hour or so to prepare before presenting my session, which was second to last on the last day. I saw “TechDays Päättyy” on the program at the end of the day and interpreted it as meaning “TechDays Party”, so I was disappointed when everyone disappeared and the venue emptied. It seemed like a good event; it was a shame I could not be there for longer.

The next day I took a flight to Amsterdam, and made it to TechDays The Netherlands in plenty of time for my session on Thursday. It was great to meet up with Clemens, Nuno, Rob Miles and Iris, and I got to have dinner with Scott Klein. I did another two sessions on the Friday, and then headed back home Saturday morning. The event was very well organized. I was a bit skeptical of having 75-minute sessions instead of the typical 60 minutes, but after engaging in quite a few questions and discussions during my sessions I was glad of the extra time.

There is a recording of the “Migrate an On-Premise Application to the Cloud in 60 Minutes” session on channel 9.

I have some more presentations planned over the next few months…

·         17th April – Designing for the Cloud workshop at ITARC in Sweden

·         6th May – Migrate an On-Premise Application to the Cloud in 60 Minutes and Web Site Authentication with Social Identity Providers at Azure User Group Belgium

·         29th-31st May –  Grid Computing with 256 Windows Azure Worker Roles & Kinect at DevSum in Stockholm

·         12th-14th June – Grid Computing with 256 Windows Azure Worker Roles & Kinect at NDC in Oslo

I’d be happy to present these and other sessions at conferences, user groups, and for companies; feel free to contact me through my blog if you are interested.

Posted On Tuesday, March 19, 2013 8:58 PM | Comments (0) |

Sunday, March 17, 2013

Website Authentication with Social Identity Providers and ACS Part 1

This article is the first in a series of three looking at using the Windows Azure Access Control Service to implement website authentication using social identity providers. It will show how Microsoft, Yahoo and Google accounts can be used to authenticate users with their existing social identities and login credentials, rather than requiring them to create an identity and credentials on the website itself.

 

Website Authentication using Social Identity Providers

In this scenario social identity providers will be used to authenticate users for an ASP.NET website. Microsoft, Google and Yahoo will be used as identity providers, and the Windows Azure Access Control Service (ACS) will be used to convert the security tokens and transform the claims to a common format for the website. A Google account with two-step authentication will then be used to provide administrative access to the site using rules in ACS.

Creating a Relying Party Application

The first step is to create a simple relying party application. This will be created as an ASP.NET application, which can be tested locally, and then deployed to Windows Azure Websites. To keep things as simple as possible, the application will consist of a default page, a page for site members and a page for site administrators.

The solution structure for the website is shown below.

image

The Default.aspx page can be viewed by anyone, but content in the Members folder requires that users are authenticated. This is configured in the Web.config file in the Members folder.

 

<?xml version="1.0"?>

<configuration>

    <system.web>

      <authorization>

        <!-- Deny access to unauthorized users.-->

        <deny users="?"/>

      </authorization>

    </system.web>

</configuration>

 

 

The content in the Admin folder requires that users are authenticated and are members of the Admin role.

 

<?xml version="1.0"?>

<configuration>

    <system.web>

      <authorization>

        <!-- Allow only users who are members of the Admin role. -->

        <allow roles="Admin"/>

        <deny users="*"/>

      </authorization>

    </system.web>

</configuration>

 

 

The Default.aspx page contains links to the other two pages.

<%@ Page Language="C#" AutoEventWireup="true" CodeBehind="Default.aspx.cs" Inherits="RelyingPartyApp.Default" %>

<!DOCTYPE html>

<html xmlns="http://www.w3.org/1999/xhtml">

<head runat="server">

    <title>Relying Party Application</title>

</head>

<body>

    <form id="form1" runat="server">

        <div>Welcome to the Relying Party Application!</div>

        <div><a href="Members/Members.aspx">Members Page</a></div>

        <div><a href="Admin/Admin.aspx">Admin Page</a></div>

    </form>

</body>

</html>

 

The website is configured to be hosted in IIS at http://localhost/RelyingPartyApp. The homepage can be viewed in Internet Explorer.

image

Clicking on either the Members Page or Admin Page links will result in a 401.2 access denied message, as the user is not authenticated.

image

In the next section the website will be configured to use the Windows Azure Access Control Service (ACS) to provide website authentication using Microsoft Accounts.

 

Registering the Relying Party Application with ACS

In this stage the ASP.NET relying party application will be registered with the Windows Azure Access Control Service (ACS) and configured in ACS to use Microsoft Accounts as an identity provider. The web site will then be configured to use ACS to provide a security token to authenticate the users.

Creating a Namespace in ACS

At the time of writing the ACS functionality is being migrated to the new portal; the interface may well change going forward, but the principles will remain the same. Clicking on the Active Directory link in the portal will display the registered namespaces that can be used for ACS. A new namespace will be added for this scenario.

image

The new namespace must be globally unique within ACS. Here the namespace acsscenario is created in the West Europe region.

image

After a couple of minutes the namespace will be activated and ready to use; the Manage button is used to open the namespace management portal.

image

 

Configuring a Relying Party Application

The next stage is to configure a relying party application in ACS. The Realm and Return URL will be set to the root URL for the application, which is http://localhost/RelyingPartyApp/ in my example. I also set this as the Name property, which is displayed in the portal, so that I know exactly which application it refers to. All other settings are left with the default values.

image

Note that SAML 2.0 is set as the security token format, and the lifetime of the tokens will be 600 seconds (10 minutes). Also note that “Windows Live ID” (Microsoft account) is selected as an identity provider, and that a new rule group will be created.

Configuring the Default Rule Group

A new default rule group was created for the relying party application. The rule group will not contain any rules, but these can be generated from the claims provided by ACS by clicking the Generate link.

image

When this is done a new rule will be added to pass through the value of the nameidentifier claim from the “Windows Live ID” (Microsoft account) identity provider to the relying party application.

image

The rule group can now be saved to commit the changes.

 

Configuring the ASP.NET Relying Party Application to use ACS for Authentication

Now that the relying party application has been configured in ACS, the website can be configured to request security tokens from ACS for authentication. In order to do this the Identity and Access Tool for Visual Studio will need to be installed; it is located here:

http://visualstudiogallery.msdn.microsoft.com/e21bf653-dfe1-4d81-b3d3-795cb104066e

Once this tool has been installed, a new Identity and Access option will appear on the web project context menu.

image

I find it easiest to use this wizard with the business identity provider option; to do this you will need the URI for the WS-Federation metadata for your ACS namespace. This is available on the Application Integration page of the ACS management portal.

image

This URI can be copy-pasted into the Identity and Access wizard. Note that the realm for the relying party application matches the realm that was specified in the ACS portal.

image

Clicking OK on the wizard will apply the configuration changes to the Web.config file that will configure the application to use ACS to retrieve security tokens.

The key change in the website configuration is the replacement of the FormsAuthentication module with the WSFederationAuthenticationModule and SessionAuthenticationModule modules from the System.IdentityModel.Services namespace.

<system.webServer>

  <modules>

    <remove name="FormsAuthentication" />

    <add name="WSFederationAuthenticationModule" type="System.IdentityModel.Services.WSFederationAuthenticationModule, System.IdentityModel.Services, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" preCondition="managedHandler" />

    <add name="SessionAuthenticationModule" type="System.IdentityModel.Services.SessionAuthenticationModule, System.IdentityModel.Services, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" preCondition="managedHandler" />

  </modules>

</system.webServer>

 

The authentication modules are configured to use the namespace in ACS as an issuer for security tokens using passive redirect.

<system.identityModel.services>

  <federationConfiguration>

    <cookieHandler requireSsl="false" />

    <wsFederation

      passiveRedirectEnabled="true"

      issuer="https://acsscenario.accesscontrol.windows.net/v2/wsfederation"

      realm="http://localhost/RelyingPartyApp/"

      requireHttps="false" />

  </federationConfiguration>

</system.identityModel.services>

 

Testing Authentication with Windows Live ID

The application is almost ready to be tested; one small change that needs to be made first is the removal of the configuration that denies anonymous users access to the home page of the website. This was added by the Identity and Access wizard, but since access restrictions have been configured for the members and admin pages, it is not required and can be commented out.

<system.web>

  <authorization>

    <!--<deny users="?" />-->

  </authorization>

  <authentication mode="None" />

  <compilation debug="true" targetFramework="4.5" />

  <httpRuntime targetFramework="4.5" requestValidationMode="4.5" />

</system.web>

 

The default page can be viewed as normal, but when the members page is accessed the browser is redirected through ACS to the Microsoft account login page.

image

When the user logs on with a valid Microsoft account, the following error page is displayed, stating that the certificate used by ACS is not trusted.

image

Adding a line to the identity model configuration can override the certificate check. Note that setting certificateValidationMode to None disables certificate validation entirely, so this should only be used in development environments.

<system.identityModel>

  <identityConfiguration>

    <certificateValidation certificateValidationMode="None" />

    <audienceUris>

      <add value="http://localhost/RelyingPartyApp/" />

    </audienceUris>

    <issuerNameRegistry type="System.IdentityModel.Tokens.ConfigurationBasedIssuerNameRegistry, System.IdentityModel, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089">

      <trustedIssuers>

        <add thumbprint="B52D78084A4DF22E0215FE82113370023F7FCAC4" name="https://acsscenario.accesscontrol.windows.net/" />

      </trustedIssuers>

    </issuerNameRegistry>

  </identityConfiguration>

</system.identityModel>

 

Once this change has been made, the authenticated user will be able to view the members page.

image

The admin page will still be restricted as the user is not a member of the Admin role.

 

 

Authenticating with Google and Yahoo Accounts

Now that the website can use Microsoft accounts for authentication it is simple to add other social identity providers, such as Yahoo and Google. No changes in the website will be required for this; the only changes that need to take place will be in the ACS portal.

In the portal we can see that “Windows Live ID” is the only configured identity provider.

image

 

Adding Additional Identity Providers

Clicking on the Add link will allow more identity providers to be added; in the first case, Google will be added.

image

Note that the ASP.NET relying party application is selected to use Google by default. Once the changes are saved, Yahoo can be added in the same way. The ACS namespace now has three configured identity providers.

image

Generating new Rules

As the new identity providers will all submit claims to ACS, the rule group needs to be configured to pass these claims to the relying party application. The new rules are shown below.

image

Note that as well as the nameidentifier claim, Google and Yahoo also include claims for name and emailaddress.

 

Testing the Application

When the user browses to the members page, they are redirected to ACS, which displays a list of the identity providers that can be used for the site.

image

Selecting Google will redirect the browser to the Google login page.

image

Once authenticated, Google asks the user if it is OK to send details of the account holder's name and email address to ACS. (As I am writing this in Sweden, the text is appearing in Swedish.)

image

When this is accepted the members page will be displayed. There will be a very similar procedure when using a Yahoo account.

 

Examining the Security Tokens and Claims

In this section the claims returned by the identity providers will be examined, along with the HTTP request sequence that is used to authenticate the user using ACS and the social identity providers.

Examining the Claims

In order to work effectively with the different social identity providers, it is important to understand the claims that are supplied by these providers. In order to do this, the members page can be modified to include a data grid control that can be data bound to the collection of claims.

<%@ Page Language="C#" AutoEventWireup="true" CodeBehind="Members.aspx.cs" Inherits="RelyingPartyApp.Members.Members" %>

<!DOCTYPE html>

<html xmlns="http://www.w3.org/1999/xhtml">

<head runat="server">

    <title>Members Page</title>

</head>

<body>

    <form id="form1" runat="server">

        <div>Welcome to the Members Page!</div>

        <div><a href="../Default.aspx">Home</a></div>

        <h1>Claims</h1>

        <asp:DataGrid ID="dgrClaims" runat="server" AutoGenerateColumns="false">

            <Columns>

                <asp:BoundColumn HeaderText="Type" DataField="Type" />

                <asp:BoundColumn HeaderText="Value" DataField="Value" />

            </Columns>

        </asp:DataGrid>

    </form>

</body>

</html>

 

The data grid can be data bound from the page load event.

using System;

using System.Security.Claims;

using System.Threading;

 

namespace RelyingPartyApp.Members

{

    public partial class Members : System.Web.UI.Page

    {

        protected void Page_Load(object sender, EventArgs e)

        {

            ClaimsPrincipal claimsPrincipal = Thread.CurrentPrincipal as ClaimsPrincipal;

 

            if (claimsPrincipal != null)

            {

                dgrClaims.DataSource = claimsPrincipal.Claims;

                dgrClaims.DataBind();

            }

        }

    }

}

 

The ClaimsPrincipal class is part of the System.Security.Claims namespace, and allows claims to be viewed and manipulated by relying party applications.

Claims provided by Microsoft.

image

Claims provided by Google.

image

Claims provided by Yahoo.

image

The claims supplied by ACS and the different social identity providers are summarized in the table below.

Claim               Microsoft   Google   Yahoo
Name Identifier     Yes         Yes      Yes
Name                No          Yes      Yes
Email               No          Yes      Yes
Identity Provider   Yes         Yes      Yes

 

From the tests it can be seen that Microsoft only provides the name identifier claim, whilst Google and Yahoo supply this as well as the name and email claims. These claims will be used later on to integrate with the .NET Universal Profile Provider and create a profile on the website.
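Because of these differences, a relying party that works with all three providers cannot assume the name or email claims are present. The sketch below is an illustrative helper, not part of the walkthrough code: it prefers the email claim when the provider supplies it, and falls back to the name identifier, which all three providers guarantee.

```csharp
using System.Security.Claims;
using System.Threading;

// Illustrative helper, assuming .NET 4.5 and the standard WIF claim types.
public static class ClaimHelper
{
    public static string GetDisplayIdentity()
    {
        ClaimsPrincipal principal = Thread.CurrentPrincipal as ClaimsPrincipal;
        if (principal == null) return null;

        // Google and Yahoo supply an email claim; Microsoft accounts do not.
        Claim email = principal.FindFirst(ClaimTypes.Email);
        if (email != null) return email.Value;

        // The name identifier claim is the only one present for all providers.
        Claim nameId = principal.FindFirst(ClaimTypes.NameIdentifier);
        return nameId != null ? nameId.Value : null;
    }
}
```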

Examining the Security Token Exchange

In order to gain an understanding of how the authentication process takes place and the parties involved, the Fiddler2 web debugging tool will be used to intercept the web traffic.

The latest version of Fiddler2 can be downloaded here: http://www.fiddler2.com

 

In order to do this the configuration for the relying party in ACS and the Web.config file will need to be changed so that the host name of the computer, rather than localhost, is used in the URLs. As the authentication exchange takes place over a secure channel, Fiddler must be configured to intercept and decrypt HTTPS traffic. This is done in the Fiddler Options menu.

image

With those changes made, the authentication request sequence can be monitored. The monitoring trace for authentication with a Yahoo account is shown below.

image

 

The sequence is as follows, with the numbers representing the request sequence number.

·         1 – A request is made to the default page for the relying party application, resulting in a 200 response.

·         2 – A request is made to the members page, resulting in a 302 code, redirecting the browser to the acsscenario namespace in ACS.

·         4 – ACS processes the request and provides options for the user to select an identity provider.

·         11 – The request reaches login.yahoo.com, and the Yahoo account authentication takes place.

·         32 – The browser is redirected to ACS again, with the transformed security token.

·         35 – The security token is sent to the application, where it is intercepted by the WS-Federation authentication module and authentication takes place; a 302 code is used to redirect the browser to the members page.

·         36 – The authenticated user can now view the members page, and receive a 200 code.

 

In request 4, the request to ACS, the details regarding the WS-Federation authentication request can clearly be seen. These include the operation to be performed, the relying party application realm, the resource that was requested, and the current system time.

 

https://acsscenario.accesscontrol.windows.net/v2/wsfederation?wa=wsignin1.0&wtrealm=http%3a%2f%2fwin7base%2fRelyingPartyApp%2f&wctx=rm%3d0%26id%3dpassive%26ru%3d%252fRelyingPartyApp%252fMembers%252fMembers.aspx&wct=2013-03-14T18%3a21%3a31Z HTTP/1.1

 

The security token sent in HTTP request 35 contains the SAML security token sent to the relying party by ACS. The information in the token is shown below, the certificate and signature values have been truncated.

It can be seen that the lifetime of the token is 10 minutes, the token applies to the application hosted at http://win7base/RelyingPartyApp/, and is issued from https://acsscenario.accesscontrol.windows.net/. The name identifier is specified as the subject name Id, and the email and name claims are specified as attributes.

<t:RequestSecurityTokenResponse Context="rm=0&amp;id=passive&amp;ru=%2fRelyingPartyApp%2fMembers%2fMembers.aspx" xmlns:t="http://schemas.xmlsoap.org/ws/2005/02/trust">

  <t:Lifetime>

    <wsu:Created xmlns:wsu="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd">2013-03-14T18:21:50.023Z</wsu:Created>

    <wsu:Expires xmlns:wsu="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd">2013-03-14T18:31:50.023Z</wsu:Expires>

  </t:Lifetime>

  <wsp:AppliesTo xmlns:wsp="http://schemas.xmlsoap.org/ws/2004/09/policy">

    <EndpointReference xmlns="http://www.w3.org/2005/08/addressing">

      <Address>http://win7base/RelyingPartyApp/</Address>

    </EndpointReference>

  </wsp:AppliesTo>

  <t:RequestedSecurityToken>

    <Assertion ID="_b5e34903-dc8f-47f7-b369-488f0a19c3b4" IssueInstant="2013-03-14T18:21:50.023Z" Version="2.0" xmlns="urn:oasis:names:tc:SAML:2.0:assertion">

      <Issuer>https://acsscenario.accesscontrol.windows.net/</Issuer>

      <ds:Signature xmlns:ds="http://www.w3.org/2000/09/xmldsig#">

        <ds:SignedInfo>

          <ds:CanonicalizationMethod Algorithm="http://www.w3.org/2001/10/xml-exc-c14n#" />

          <ds:SignatureMethod Algorithm="http://www.w3.org/2001/04/xmldsig-more#rsa-sha256" />

          <ds:Reference URI="#_b5e34903-dc8f-47f7-b369-488f0a19c3b4">

            <ds:Transforms>

              <ds:Transform Algorithm="http://www.w3.org/2000/09/xmldsig#enveloped-signature" />

              <ds:Transform Algorithm="http://www.w3.org/2001/10/xml-exc-c14n#" />

            </ds:Transforms>

            <ds:DigestMethod Algorithm="http://www.w3.org/2001/04/xmlenc#sha256" />

            <ds:DigestValue>6HWloBhG5oosH1Uc9B/noIiEp7E7DhL11/ANePeZWK4=</ds:DigestValue>

          </ds:Reference>

        </ds:SignedInfo>

        <ds:SignatureValue>Bq...TQ==</ds:SignatureValue>

        <KeyInfo xmlns="http://www.w3.org/2000/09/xmldsig#">

          <X509Data>

            <X509Certificate>MIID...==</X509Certificate>

          </X509Data>

        </KeyInfo>

      </ds:Signature>

      <Subject>

        <NameID>https://me.yahoo.com/a/r5swqoIjfda_5ZoxpdR95iX81df1uEGDfbCl#b6a9e</NameID>

        <SubjectConfirmation Method="urn:oasis:names:tc:SAML:2.0:cm:bearer" />

      </Subject>

      <Conditions NotBefore="2013-03-14T18:21:50.023Z" NotOnOrAfter="2013-03-14T18:31:50.023Z">

        <AudienceRestriction>

          <Audience>http://win7base/RelyingPartyApp/</Audience>

        </AudienceRestriction>

      </Conditions>

      <AttributeStatement>

        <Attribute Name="http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress">

          <AttributeValue>alanazuretest2@yahoo.com</AttributeValue>

        </Attribute>

        <Attribute Name="http://schemas.xmlsoap.org/ws/2005/05/identity/claims/name">

          <AttributeValue>Alan Azure</AttributeValue>

        </Attribute>

        <Attribute Name="http://schemas.microsoft.com/accesscontrolservice/2010/07/claims/identityprovider">

          <AttributeValue>Yahoo!</AttributeValue>

        </Attribute>

      </AttributeStatement>

    </Assertion>

  </t:RequestedSecurityToken>

  <t:RequestedAttachedReference>

    <SecurityTokenReference d3p1:TokenType="http://docs.oasis-open.org/wss/oasis-wss-saml-token-profile-1.1#SAMLV2.0" xmlns:d3p1="http://docs.oasis-open.org/wss/oasis-wss-wssecurity-secext-1.1.xsd" xmlns="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd">

      <KeyIdentifier ValueType="http://docs.oasis-open.org/wss/oasis-wss-saml-token-profile-1.1#SAMLID">_b5e34903-dc8f-47f7-b369-488f0a19c3b4</KeyIdentifier>

    </SecurityTokenReference>

  </t:RequestedAttachedReference>

  <t:RequestedUnattachedReference>

    <SecurityTokenReference d3p1:TokenType="http://docs.oasis-open.org/wss/oasis-wss-saml-token-profile-1.1#SAMLV2.0" xmlns:d3p1="http://docs.oasis-open.org/wss/oasis-wss-wssecurity-secext-1.1.xsd" xmlns="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd">

      <KeyIdentifier ValueType="http://docs.oasis-open.org/wss/oasis-wss-saml-token-profile-1.1#SAMLID">_b5e34903-dc8f-47f7-b369-488f0a19c3b4</KeyIdentifier>

    </SecurityTokenReference>

  </t:RequestedUnattachedReference>

  <t:TokenType>urn:oasis:names:tc:SAML:2.0:assertion</t:TokenType>

  <t:RequestType>http://schemas.xmlsoap.org/ws/2005/02/trust/Issue</t:RequestType>

  <t:KeyType>http://schemas.xmlsoap.org/ws/2005/05/identity/NoProofKey</t:KeyType>

</t:RequestSecurityTokenResponse>

 

Windows Identity Foundation will validate this security token, and then use it to authenticate the user.

 

Adding a Site Administrator using a Google Account with Two-Phase Authentication

In this step the rules in ACS will be used to add a claim to a specific Google identity that will grant the user administrative privileges on the relying party application. The ASP.NET relying party application has an Admin folder which contains the Admin.aspx page.

image

 

The Web.config file specifies the authorization configuration that will only allow users who are members of the Admin role to access the folder content.

 

<?xml version="1.0"?>

<configuration>

    <system.web>

      <authorization>

        <!-- Allow only users who are members of the Admin role. -->

        <allow roles="Admin"/>

        <deny users="*"/>

      </authorization>

    </system.web>

</configuration>

 

 

In order for a specific social identity to be able to access the Admin folder, a role claim with the value of Admin will need to be added to the security token that is sent from ACS to the relying party application.
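This works because, once Windows Identity Foundation has processed the token, role claims surface through the standard IPrincipal role mechanism, which is what the allow roles="Admin" element in Web.config checks. The same check can be made programmatically; the following is a minimal sketch, not part of the walkthrough code.

```csharp
using System.Security.Claims;
using System.Threading;

// Illustrative check, assuming WIF has already authenticated the request:
// the Admin role claim added by the ACS rule is visible via IsInRole.
ClaimsPrincipal principal = Thread.CurrentPrincipal as ClaimsPrincipal;
bool isAdmin = principal != null && principal.IsInRole("Admin");
```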

 

Create a Rule in ACS to Assign the Admin Role Claim for a Specified Google Account

In order to create a rule that adds a role claim to the security token, we need a claim that uniquely identifies the user. The claims that we have available are as follows:

Claim               Comments
Name Identifier     Guaranteed to be unique and immutable for the identity.
Name                No guarantee of uniqueness or immutability.
Email Address       Guaranteed to be unique and immutable for the identity.
Identity Provider   Supplied by ACS.

 

The two claims that are contenders for identifying the identity are name identifier and email address. The value of the name identifier claim is typically not accessible, so the email address is the best candidate. Name is not suitable, as another user could create a Google account with the same name, and then gain administrative access to the website.

Note that if a Microsoft account was used the email claim would not be available, and the name identifier claim would have to be used. It may be challenging to obtain the value of the name identifier claim for a specific Microsoft account.

The pseudo code for the rule that will be created is as follows.

if (IdentityProvider == "Google" && InputClaimType == "emailaddress"

    && InputClaimValue == "alanazuretest@gmail.com")

{

    OutputClaimType = "role";

    OutputClaimValue = "Admin";

}

 

The rule configuration in ACS is shown below.

image

Once this rule is saved in ACS the Google user with the email address alanazuretest@gmail.com will have administrative privileges on the site.

 

 

Configuring Two-Phase Authentication in Google

In order to increase security on the website, a two-step authentication scheme can be used. There are various ways to implement this; one of the most common options is to send a verification code to a mobile device. The alanazuretest@gmail.com Google account has been configured to send a verification code as an SMS message to my mobile phone when I log in; the configuration for the account is shown below.

image

Test the Admin Authentication

The application can now be tested to verify that the user can access the administrative section of the site, and that the two-phase authentication functions correctly. When the user browses to the members page, the browser is redirected to the login page for Google, with a message stating that acsscenario.accesscontrol.windows.net is requesting a security token.

image

 

When the user authenticates with the first stage of authentication, username and password, an SMS message containing an access code is sent to the mobile device.

image

The second stage of authentication prompts the user for this access code; a screenshot of this is shown below.

image

 

Once the user is authenticated, the browser is redirected back to the members page, where the claims supplied in the security token are displayed.

image

 

It can be seen that the role claim has been supplied, with the value of Admin. This should now grant the user access to the admin page in the website.

image

 

 

 

 

Conclusions

In this scenario we have used ACS to add authentication by three of the largest social identity providers to a website. We have also configured a Google account to have administrative privileges on the website, and enabled a two-step authentication process on the account. We can see that the authentication model that is commonly used in Web.config files to grant access to specific users and groups integrates well with the claims-based model used by ACS.

We have also seen that different identity providers provide different claims, and that providing an application that can handle all types of identity providers may be challenging.

Posted On Sunday, March 17, 2013 7:56 PM | Comments (0) |

Sunday, November 4, 2012

Sweden Windows Azure Group Meeting in November & Fast with Windows Azure Competition

SWAG November Meeting

There will be a Sweden Windows Azure Group (SWAG) meeting in Stockholm on Monday 19th November. Chris Klug will be presenting a session on Windows Azure Mobile Services, and I will be presenting a session on Web Site Authentication with Social Identity Providers. Active Solution have been kind enough to host the event, and will be providing food and refreshments.

The registration link is here: http://swag14.eventbrite.com

If you would like to join SWAG the link is here: http://swagmembership.eventbrite.com

Fast with Windows Azure Competition

I’ve entered a 3-minute video of rendering a 3D animation using 256 Windows Azure worker roles in the “Fast with Windows Azure” competition. This is the last week of voting, so it would be great if you could check out the video and vote for it if you like it. I have not driven a car for about 15 years, so if I win you can expect a hilarious summary of the track day in Vegas. My preparation for the day would be to play Project Gotham Racing for a weekend, and watch a lot of Top Gear.

image

 

My video is “Rapid Massive On-Demand Scalability Makes Me Fast!”.

The link is here: http://www.meetwindowsazure.com/fast/

Posted On Sunday, November 4, 2012 8:10 PM | Comments (0) |

Monday, October 22, 2012

256 Worker Role 3D Rendering Demo is now a Lab on my Azure Course

Ever since I came up with the crazy idea of creating an Azure application that would spin up 256 worker roles (please vote if you like it!) to render a 3D animation created using the Kinect depth camera I have been trying to think of something useful to do with it.

I have also been busy working on developing training materials for a Windows Azure course that I will be delivering through a training partner in Stockholm, and for customers wanting to learn Windows Azure. I hit on the idea of combining the render demo and a course lab and creating a lab where the students would create and deploy their own mini render farms, which would participate in a single render job, consisting of 2,000 frames.

The architecture of the solution is shown below.

image

As students would be creating and deploying their own applications, I thought it would be fun to introduce some competitiveness into the lab. In the 256 worker role demo I capture the rendering statistics for each role, so it was fairly simple to include the students name in these statistics. This allowed the process monitor application to capture the number of frames each student had rendered and display a high-score table.

When I demoed the application I deployed one instance that started rendering a frame every few minutes, and the challenge for the students was to deploy and scale their applications, and then overtake my single role instance by the end of the lab time. I had the process monitor running on the projector during the lab so the class could see the progress of their deployments, and how they were performing against my implementation and their classmates.

When I tested the lab for the first time in Oslo last week it was a great success; the students were keen to be the first to build and deploy their solution and then watch the frames appear. As the students mostly had MSDN subscriptions they were able to scale to the full 20 worker role instances, and before long we had over 100 worker roles working on the animation.

There were, however, a couple of issues caused by the competitive nature of the lab. The first student to scale the application to 20 instances would render the most frames and win; there was no way for others to catch up. Also, as they were competing against each other, there was no incentive to help others on the course get their application up and running.

I have now re-written the lab to divide the students into teams that will compete to render the most frames. This means that if one developer deploys and scales quickly, the other teams still have a chance to catch up. It also means that if a student finishes quickly and puts their team in the lead they will have an incentive to help the other developers on their team get up and running.

As I was using “Sharks with Lasers” for a lot of my demos, and had reserved the sharkswithfreakinlasers namespaces for some of the Azure services (well, somebody had to do it), the students came up with some creative alternatives, like “Camels with Cannons” and “Honey Badgers with Homing Missiles”. That gave me the idea of having the teams choose a creative name involving animals and weapons.

The team rendering architecture diagram is shown below.

image

 

Render Challenge Rules

In order to ensure fair play a number of rules are imposed on the lab.

·         The class will be divided into teams, and each team chooses a name.

·         The team name must consist of a ferocious animal combined with a hazardous weapon.

·         Teams can allocate as many worker roles as they can muster to the render job.

·         Frame processing statistics and rendered frames will be vigilantly monitored; any cheating, tampering, and other foul play will result in penalties.

The screenshot below shows an example of the team render farm in action; Badgers with Bombs have taken a lead over Camels with Cannons, and both are leaving the Sharks with Lasers standing.

image

If you are interested in attending a scheduled delivery of my Windows Azure or Windows Azure Service Bus courses, or would like on-site training, more details are here.

Posted On Monday, October 22, 2012 12:27 AM | Comments (0) |

Tuesday, September 18, 2012

Windows Azure SQL Data Sync Walkthrough

I’ve just completed a basic implementation using Windows Azure SQL Data Sync that I will be using in a demo for an upcoming course. This is a basic walkthrough that you should be able to follow if you want to get to grips with the new SQL Data Sync functionality in Windows Azure.

The walkthrough is based on the SQL Data Sync August 2012 – Service Update, changes in future versions are possible.

Scenario

The scenario for this walkthrough is shown in the diagram below.

Picture1

Adventure Works has an existing on-premise SQL database that contains its business information. They plan to create an on-line store application hosted in Windows Azure. In order to keep the product information in the on-line store database up to date, SQL data sync will be used to coordinate the synchronization between the on-premise database and the product database hosted in SQL Azure.

Existing Implementation

The existing implementation consists of a slightly modified version of the AdventureWorks2012 database for Windows Azure. The sample database can be downloaded here: http://msftdbprodsamples.codeplex.com/releases/view/37304.

The database has been modified to replace the custom defined data types, which are currently not supported in Data Sync, with regular SQL data types. The modified code to create the Product table is shown below.

CREATE TABLE [Production].[Product](
    [ProductID] [int] IDENTITY(1,1) NOT NULL,
    [Name] [nvarchar](50) NOT NULL,
    [ProductNumber] [nvarchar](25) NOT NULL,
    [MakeFlag] [bit] NOT NULL,
    [FinishedGoodsFlag] [bit] NOT NULL,
    [Color] [nvarchar](15) NULL,
    [SafetyStockLevel] [smallint] NOT NULL,
    [ReorderPoint] [smallint] NOT NULL,
    [StandardCost] [money] NOT NULL,
    [ListPrice] [money] NOT NULL,
    [Size] [nvarchar](5) NULL,
    [SizeUnitMeasureCode] [nchar](3) NULL,
    [WeightUnitMeasureCode] [nchar](3) NULL,
    [Weight] [decimal](8, 2) NULL,
    [DaysToManufacture] [int] NOT NULL,
    [ProductLine] [nchar](2) NULL,
    [Class] [nchar](2) NULL,
    [Style] [nchar](2) NULL,
    [ProductSubcategoryID] [int] NULL,
    [ProductModelID] [int] NULL,
    [SellStartDate] [datetime] NOT NULL,
    [SellEndDate] [datetime] NULL,
    [DiscontinuedDate] [datetime] NULL,
    [rowguid] [uniqueidentifier] NOT NULL,
    [ModifiedDate] [datetime] NOT NULL,
 CONSTRAINT [PK_Product_ProductID] PRIMARY KEY CLUSTERED
(
    [ProductID] ASC
) WITH (STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF)
)

 

With the changes made to AdventureWorks2012ForSQLAzure_Schema.sql the CreateAdventureWorksForSQLAzure.cmd is run specifying local server and credentials to create and populate the database on the local instance of SQL Server.

A very basic WPF line-of-business (LOB) application is created to allow product data to be modified. The XAML for the main window is shown here:

<Window x:Class="ProductEditor.MainWindow"
        xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
        Title="MainWindow" Height="350" Width="525">
    <StackPanel>
        <Button Name="btnUpdate" Content="Update" Click="btnUpdate_Click" />
        <DataGrid Name="dgdProducts" AutoGenerateColumns="True" >
        </DataGrid>
    </StackPanel>
</Window>

 

The code-behind is shown here:

public partial class MainWindow : Window
{
    AdventureWorks2012Entities m_Entities;

    public MainWindow()
    {
        m_Entities = new AdventureWorks2012Entities();
        InitializeComponent();
        List<Product> prods =
            (from p in m_Entities.Products
             where p.ListPrice > 0
             select p).ToList<Product>();
        dgdProducts.ItemsSource = prods;
    }

    private void btnUpdate_Click(object sender, RoutedEventArgs e)
    {
        m_Entities.SaveChanges();
    }
}

 

The application uses an Entity Framework model to connect to the on-premise AdventureWorks2012 database. A screenshot of the application is shown below.

clip_image004[4]

 

Creating Windows Azure SQL Databases

The first step in creating the cloud-based online store is to create the databases that will be used for the online store and the data synchronization. The following two 1 GB web databases are created using the Windows Azure Portal.

·         SyncHubDb – A database to act as the hub database for SQL Data Sync.

·         WebStoreDb – A database for the online web store application.

clip_image006[4]

Both databases are created as 1 GB Web edition databases.

Creating a Sync Group

With the SQL Databases created in Windows Azure, the next task is to use SQL Data Sync to define a sync group to synchronize the product data.

The first step is to provision a SQL Data Sync server in an appropriate region. This is currently done using the old Azure portal. Click on the Data Sync button, then click Provision, select the account, and specify a region; it’s best to use the same region where your Windows Azure SQL Databases are located.

clip_image008[4]

 

Once the Data Sync server has been provisioned, a new sync group can be created. With Sync Groups selected, click Create.

clip_image010[4]

 

Enter the name for the sync group, and then click on the icon for the Sync Hub database.

clip_image012[4]

Enter the database details for the Sync Hub database and click Add.

clip_image014[4]

All data synchronization that is processed by this sync group will pass through the sync hub database, which must be hosted in Windows Azure SQL Database.

Adding an On-Premise Database

In order to use SQL Data Sync with an on-premise database you will need to install and configure Microsoft SQL Data Sync Agent on your local SQL server and register the database with the service.

Click on the icon to add an on-premise database and select Add a new SQL database to the sync group. You can also specify the Sync Direction here; I left the default of Bi-Directional selected.

clip_image016[4]

If you don’t have an agent configured, select Install a new Agent.

clip_image018[4]

Download and install the agent, and then enter a name and generate an agent key.

clip_image020[4]

Start Microsoft SQL Data Sync Agent from the start menu and set the agent key that was generated in the portal. This will open a management console for the on-premise service.

clip_image022[4]

Click on Register, and then select the appropriate database on the local SQL server, and click Save.

clip_image024[4]

You should see that the database is reachable.

clip_image026[4]

Back in the portal, click Next, then Get Database List, select the database, and click Finish.

clip_image028[4]

The on-premise database will now be connected to the Data Sync server in Windows Azure.

Configuring the Sync Schedule

The schedule for data synchronization can now be configured, along with the conflict resolution options. The sync interval must be between 5 minutes and one month; I selected a 5-minute interval and Client Wins, as the short interval is good for demo purposes.

clip_image030[4]

Setting Client Wins for conflict resolution means that the on-premise database will win if there are any conflicts with the data synchronization.
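Conceptually, Client Wins behaves like the sketch below. This is only an illustration of the resolution rule, not the actual Sync Framework implementation; the class, method, and row model are invented for the example.

```csharp
using System.Collections.Generic;

// Illustrative sketch of "Client Wins" conflict resolution, with changed
// rows modelled as key/value pairs. Names are invented for the example.
static class ClientWinsSync
{
    // Merge changed rows from the client (on-premise) and the hub. When the
    // same key was changed on both sides, the client's value is kept.
    public static Dictionary<int, string> Merge(
        IReadOnlyDictionary<int, string> clientChanges,
        IReadOnlyDictionary<int, string> hubChanges)
    {
        var result = new Dictionary<int, string>();
        foreach (var pair in hubChanges)
            result[pair.Key] = pair.Value;
        foreach (var pair in clientChanges)
            result[pair.Key] = pair.Value; // client overwrites hub on conflict
        return result;
    }
}
```

With Hub Wins the two loops would simply be swapped, so the hub's version of a conflicting row would survive instead.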

Defining the Data Set

The next step is to define the data that will be synchronized by the sync group. In this scenario, only the Product table in the Production schema will be synchronized. To do this, select the AdventureWorks2012 database, and then select the Product table, ensuring all columns are selected. Note that some of the tables contain data types that do not meet the schema requirements for synchronization; this is why the data types in the Product table were modified.

clip_image032[4]

 

Now the data set has been selected, the sync group can be deployed. Click the Deploy button to do this.

clip_image034[4]

The Sync Group will be provisioned and the first data synchronization will run. After a few seconds the status of the two databases should transition to good.

clip_image036[4]

Opening the SyncHubDb database in the Windows Azure portal shows the tables that have been created. The Production.Product table is present, with the 504 rows of product data. There are also a number of tables that are used by the Sync Framework to manage the synchronization process.

 

clip_image038[4]

Selecting data from the Production.Product table shows that the product data is now present in the SyncHubDb database. This was synchronized from the on-premise database when the Sync Group was provisioned. The synchronization will run every 5 minutes to synchronize any changes between the Product tables in both databases.

clip_image040[4]
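The same check can be made from any SQL client connected to the SyncHubDb database. The queries below are a minimal sketch; the expected row count of 504 comes from the walkthrough above.

```sql
-- Confirm the product rows arrived in the hub database (expect 504)
SELECT COUNT(*) AS ProductRows
FROM [Production].[Product];

-- Spot-check a few synchronized rows
SELECT TOP (5) [ProductID], [Name], [ListPrice]
FROM [Production].[Product]
ORDER BY [ProductID];
```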

 

Adding a Windows Azure SQL Database

The next step is to add the WebStoreDb database to the sync group. This is done in a similar way to adding the SyncHubDb database. Click on the Add Windows Azure SQL Database icon and specify the WebStoreDb database details. With this database the data synchronization will be one directional, data will be synchronized from the hub to the WebStoreDb database, but not from the WebStoreDb database to the sync hub.

clip_image042[4]

Deploy the Sync Group to save the changes. After a while the provisioning and synchronization will complete, and the topology will be as follows.

clip_image044[4]

Clicking on the Log Viewer icon will show logs from the synchronization process. This can be used to verify that synchronization is taking place, and to diagnose any errors with the synchronization process.

clip_image046[4]

Examining the WebStoreDb database shows that the product data has successfully been synchronized. The synchronization will run using the 5 minute schedule to ensure that any changes in the on-premise database are updated in the on-line store.

 

clip_image048[4]

 

Creating the Online Store Application

The on-line store application is a simple ASP.NET Web Forms application that uses Entity Framework to data-bind to the Product table in the WebStoreDb database.

The default page has been modified to use a Repeater control to display product information.

<%@ Page Title="Home Page" Language="C#" MasterPageFile="~/Site.Master" AutoEventWireup="true" CodeBehind="Default.aspx.cs" Inherits="AdventureWorksStore._Default" %>

<asp:Content runat="server" ID="FeaturedContent" ContentPlaceHolderID="FeaturedContent">
    <section class="featured">
        <div class="content-wrapper">
            <hgroup class="title">
                <h1>Adventure Works Online Store</h1>
            </hgroup>
            <h2>Select * from our Products...</h2>
        </div>
    </section>
</asp:Content>
<asp:Content runat="server" ID="BodyContent" ContentPlaceHolderID="MainContent">
    <h3>This is what we have:</h3>
    <asp:Repeater ID="rptProducts" runat="server">
        <HeaderTemplate>
        </HeaderTemplate>
        <ItemTemplate>
            <div>
                <div>
                    <h1><%# Eval("Name") %> - <%# Eval("ListPrice", "{0:c}") %></h1>
                </div>
            </div>
        </ItemTemplate>
    </asp:Repeater>
</asp:Content>

 

The code-behind file uses Entity Framework to load the product data and data-bind it to the Repeater.

public partial class _Default : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        AdventureWorks2012Entities ents = new AdventureWorks2012Entities();

        List<Product> products =
            (from p in ents.Products where p.ListPrice > 0 select p).ToList<Product>();

        rptProducts.DataSource = products;
        rptProducts.DataBind();
    }
}

 

The connection string is modified to use the WebStoreDb database hosted in Windows Azure SQL Database.

<connectionStrings>
  <add name="DefaultConnection" providerName="System.Data.SqlClient" connectionString="Data Source=(LocalDb)\v11.0;Initial Catalog=aspnet-AdventureWorksStore-20120913144202;Integrated Security=SSPI;AttachDBFilename=|DataDirectory|\aspnet-AdventureWorksStore-20120913144202.mdf" />
  <!--<add name="AdventureWorks2012Entities" connectionString="metadata=res://*/ProductsModel.csdl|res://*/ProductsModel.ssdl|res://*/ProductsModel.msl;provider=System.Data.SqlClient;provider connection string=&quot;data source=win7base;initial catalog=AdventureWorks2012;integrated security=True;MultipleActiveResultSets=True;App=EntityFramework&quot;" providerName="System.Data.EntityClient" />-->
  <add name="AdventureWorks2012Entities" connectionString="metadata=res://*/ProductsModel.csdl|res://*/ProductsModel.ssdl|res://*/ProductsModel.msl;provider=System.Data.SqlClient;provider connection string=&quot;Server=tcp:SERVER.database.windows.net,1433;Database=WebStoreDb;User ID=USER@SERVER;Password=password;Trusted_Connection=False;Encrypt=True;Connection Timeout=30&quot;" providerName="System.Data.EntityClient" />
</connectionStrings>

 

The application is deployed as a Windows Azure Web Site, and tested. The default page successfully displays product information.

clip_image050[4]

 

 

Testing Data Synchronization & Scheduling

In order to test that the data synchronization and scheduling are working correctly, the WPF LOB application will be used to make some changes to the product data. In this test the first three products are put on special offer, and the changes are updated in the on-premise database.

clip_image052[4]
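The same change could also be made directly with T-SQL against the on-premise database. The sketch below is illustrative only; the 20% discount and the "first three products" selection are assumptions, not the exact edit made in the WPF application.

```sql
-- Put the first three sellable products on special offer
-- (the discount and selection criteria are illustrative assumptions)
UPDATE [Production].[Product]
SET [ListPrice] = [ListPrice] * 0.8,
    [ModifiedDate] = GETDATE()
WHERE [ProductID] IN (SELECT TOP (3) [ProductID]
                      FROM [Production].[Product]
                      WHERE [ListPrice] > 0
                      ORDER BY [ProductID]);
```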

After a few minutes the changes have been successfully synchronized with the WebStoreDb database, and the special offer products are viewable in the online store website.

clip_image054[4]

The log files can be viewed to see the results of the synchronization operations.

From the AdventureWorks2012 database to the SyncHubDb database:

Sync completed successfully in 17.27 seconds.
        Upload:   3 changes applied
        Download: 0 changes applied

For more information, provide tracing id 'cd32f136-c1aa-4cdc-a220-5568b897ce14' to customer support.

 

From the SyncHubDb database to the WebStoreDb database:

Sync completed successfully in 1.11 seconds.
        Upload:   0 changes applied
        Download: 3 changes applied

For more information, provide tracing id 'c229816d-eff7-4549-a94a-6bd5985b4777' to customer support.

 

Conclusions

I’ve only taken a quick look at Windows Azure SQL Data Sync, but it seems fairly intuitive to get up and running with, and basic synchronization was straightforward to configure. I’ve always been a believer that companies will “Extend to the Cloud” rather than “Move to the Cloud”, and will focus on creating hybrid applications, with some services hosted in the cloud, whilst maintaining existing on-premise applications. Windows Azure SQL Data Sync, like Windows Azure Service Bus, is a great technology for bridging the gap between on-premise applications and applications hosted in the cloud.

Posted On Tuesday, September 18, 2012 11:28 AM | Comments (0) |

Powered by: