News



The first thing we need to do is set up a service reference to the Import Service. Here are the URLs for the different datacenters. You will need to use the datacenter in which you created your SQL Azure server.
 
North Central US      https://ch1prod-dacsvc.azure.com/DACWebService.svc
South Central US      https://sn1prod-dacsvc.azure.com/DACWebService.svc
North Europe          https://db3prod-dacsvc.azure.com/DACWebService.svc
West Europe           https://am1prod-dacsvc.azure.com/DACWebService.svc
East Asia             https://hkgprod-dacsvc.azure.com/DACWebService.svc
Southeast Asia        https://sg1prod-dacsvc.azure.com/DACWebService.svc
Northwest US          http://dacdc.cloudapp.net/DACWebService.svc
 
Once we have set up a service reference, it is straightforward to execute an import. You will need to have already uploaded the bacpac file to blob storage to perform the import. The first thing we need to do is create our service client. I have named my service reference ImportService.
 

var client = new ImportService.DACWebServiceClient();


Once the service client has been created, we need to create an ImportInput object and populate it with the appropriate values. 
 

var importInput = new ImportService.ImportInput();
importInput.AzureEdition = "Web";
importInput.DatabaseSizeInGB = 5;
importInput.ConnectionInfo = new ImportService.ConnectionInfo();
importInput.ConnectionInfo.DatabaseName = "YourDatabaseName";
importInput.ConnectionInfo.Password = "YourPassword";
importInput.ConnectionInfo.ServerName = "YourServerName";
importInput.ConnectionInfo.UserName = "YourUserName";


 
Now we need to set up our blob credentials. You can use either a shared access key or your storage access key.
 

var credentials = new ImportService.BlobStorageAccessKeyCredentials();
credentials.StorageAccessKey = "YourStorageAccessKey";
credentials.Uri = "Your Uri to the bacpac located in blob storage";

var sharedCredentials = new ImportService.BlobSharedAccessKeyCredentials();
sharedCredentials.SharedAccessKey = "YourSharedAccessKey";
sharedCredentials.Uri = "Your Uri to the bacpac located in blob storage";

 
We then set the credentials on the importInput object.
 

importInput.BlobCredentials = credentials;

  

Once we have everything set up, we just need to call the Import method with the importInput object as a parameter.
 
client.Import(importInput);
 
This call can also be made asynchronously, which I recommend. This should get you started using the Import service. I will do a follow-up post on the Export service.
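If you checked "Generate asynchronous operations" when adding the service reference, the generated client will also expose a Begin/End pair for the Import call. A minimal sketch of the asynchronous version, assuming the generated names BeginImport and EndImport (they may differ depending on your proxy generation settings):

```csharp
// Sketch only: assumes the service reference was generated with
// asynchronous operations enabled, producing BeginImport/EndImport.
var client = new ImportService.DACWebServiceClient();
client.BeginImport(importInput, asyncResult =>
{
    // EndImport completes the call; wrap it in try/catch so a failed
    // import does not throw on the callback thread unobserved.
    try
    {
        client.EndImport(asyncResult);
        Console.WriteLine("Import submitted.");
    }
    catch (Exception ex)
    {
        Console.WriteLine("Import failed: " + ex.Message);
    }
}, null);
```

This keeps your UI or worker thread free while the service call is in flight.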
 



This weekend I was in Boston participating in Startup Weekend. There were over 150 people participating. This was my first Startup Weekend, and there were over 70 ideas pitched on Friday night. Once the voting was in, the pitches were narrowed down to 20 ideas. We formed teams, and I ended up on a team with 5 business people, 2 creative people, and 3 developers. I worked with two Microsoft evangelists, Scott Klein and Nathan Totten, who were in from Redmond. I noticed that there were a lot more business people than developers at the event; we need to get more developers to participate. A number of the ideas that were pitched were very interesting and could turn into a nice startup. The weekend had students from MIT, Harvard, Tufts, and a number of other local colleges and universities participating. If you have not heard about Startup Weekend, you can find information at http://startupweekend.org. There are a number of events all over the United States and around the world.



So I was asked today how to do cross-database joins in SQL Azure using LINQ. Well, the simple answer is you can't; it is not supported. But there are ways around that, and the solution is actually very simple and easy to implement. So here is what I did and how I did it.
I created two SQL Azure databases. The first database is called AccountDb and has a single table named Account, which has Id, CompanyId, and Name columns. The second database, CompanyDb, contains two tables: Company and Address. The Company table has Id and Name columns. The Address table has Id and CompanyId columns. Since we cannot join across databases in SQL Azure, we have to have one of the models preloaded with data. I simply put the accounts into a list and use that list in my join.
 
var accounts = new AccountsModelContainer().Accounts.ToList();
var companies = new CompanyModelContainer().Companies;
var query = from account in accounts
            join company in companies
                on account.CompanyId equals company.Id
            select new AccountView
            {
                AccountName = account.Name,
                CompanyName = company.Name,
                Addresses = company.Addresses
            };
return query.ToList();
 
So as long as you have your data loaded from one of the contexts, you can still execute your queries and get back the data you want.
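The AccountView type projected in the query above is just a plain view class. A minimal sketch of what it might look like (the property names come from the query; the Addresses element type is an assumption based on the Address table in CompanyDb):

```csharp
// Minimal view class matching the projection in the query above.
// The Addresses collection type is assumed; adjust it to whatever
// type your CompanyModelContainer generates for the Address entity.
public class AccountView
{
    public string AccountName { get; set; }
    public string CompanyName { get; set; }
    public ICollection<Address> Addresses { get; set; }
}
```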


Mounting a drive in a VM Role is a little more complicated than in a Web or Worker Role. The Web and Worker Roles offer OnStart and OnStop events, which you can use to mount or unmount your drives. The VM Role does not have these events, so you have to provide another way for the drives to be mounted and unmounted. The problem I have run into is this: what if you have multiple drives and you only want to mount certain ones? How do you let your user mount the drive?

I am not going to go into details on what kind of GUI to present to the user; I have done this in a simple WPF application as well as a console application.
First, we need the storage account details. One thing to note: when you are mounting cloud drives, you cannot use https; you have to use http. We force the use of http by passing false when we create the CloudStorageAccount.
 
StorageCredentialsAccountAndKey credentials = new StorageCredentialsAccountAndKey("AccountName", "AccountKey");
CloudStorageAccount storageAccount = new CloudStorageAccount(credentials, false);
 
Next we need to get a reference to the container.
 
var blobClient = storageAccount.CreateCloudBlobClient();
var container = blobClient.GetContainerReference("ContainerName");
 
Now we need to get a list of the drives in the container.
 
var drives = container.ListBlobs();
 
Now that we have a list of the drives in the container, we can let the user choose which drive to mount. For this example I am just selecting the first drive in the list and getting its Uri.
 
var driveUri = drives.First().Uri;
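In a console application, the user-selection step could be sketched like this, assuming the container reference from above (the prompt text and index-based selection are just illustrative choices):

```csharp
// Sketch: list the page blobs in the container and let the user
// pick one by index. Assumes 'container' is the CloudBlobContainer
// obtained earlier.
var driveList = container.ListBlobs().ToList();
for (int i = 0; i < driveList.Count; i++)
{
    Console.WriteLine("{0}: {1}", i, driveList[i].Uri);
}
Console.Write("Select a drive to mount: ");
int choice = int.Parse(Console.ReadLine());
var selectedUri = driveList[choice].Uri;
```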
 
Now that we have the Uri we need to get the reference to the drive.
var drive = new CloudDrive(driveUri, storageAccount.Credentials);
 
Now all that is left is to mount the drive.
 
var driveLetter = drive.Mount(0, DriveMountOptions.None);
 
To unmount the drive, all you have to do is call Unmount on the drive.
drive.Unmount();
 
You do need to make sure you unmount the drives when you are done with them. I have run into issues with drives being locked until the VM Role is rebooted. I have also managed to have a drive become permanently locked, and I was forced to delete it and upload it again. I have been unable to reproduce the permanent lock, but I am still trying.
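Given those locking issues, one defensive pattern is to wrap the work in try/finally so the drive is unmounted even if something throws. A sketch, using the drive and storage account from above:

```csharp
// Sketch: always unmount when finished, even if work on the drive
// throws, to reduce the chance of leaving the drive locked.
var drive = new CloudDrive(driveUri, storageAccount.Credentials);
var driveLetter = drive.Mount(0, DriveMountOptions.None);
try
{
    // ... use the mounted drive via driveLetter ...
}
finally
{
    drive.Unmount();
}
```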
The CloudDrive class also provides a handy method to retrieve all the mounted drives in the role, which you can use to unmount everything:

foreach (var drive in CloudDrive.GetMountedDrives())
{
    var mountedDrive = storageAccount.CreateCloudDrive(drive.Value.PathAndQuery);
    mountedDrive.Unmount();
}