EPiServer 7 – issue with debugging and creating login user on brand new alloy default site

Today I made a clean, brand new install of EPiServer 7 on a Windows 2012 server using VS2012. For googlers finding this post with any issue present in the title, just look for the bold text to try my solutions for your problem.

I followed the instructions step by step here: http://world.episerver.com/en/Download/EPiServer-CMS/?id=66773&epslanguage=en&version=7

When the installation was finished I added my license file and browsed the site. So far so good. Then, however, I tried to start the site in debug mode to be able to look around for any news under the hood of the Alloy site. Pressing F5 kept sending me to a page stating "This installation contains only core files."

Weird. I opened IIS and chose Browse site. The site started fine. I checked the IIS properties in VS2012. Nothing weird (I changed the site path from the computer name to localhost though). Then I tried Release mode. No luck there either. BUT Release mode without the debugger worked, so I tried to attach to the process while the site was running. This is where I noticed that "Attach to:" was set to the Automatically detect(..) option. Switching it to Managed code solved the problem.

Log in to http://rootsitename:17000/episerver

User name?? Password?? I have no idea. I guess I should go via the ASP.NET Configuration option and set up a user with administrator rights there. But, for whatever reason, I wanted to try setting up the user from code. Here goes:

   protected override void OnLoad(System.EventArgs e)
   {
       base.OnLoad(e);

       if (Master != null) // We only do this initialization for the top-level master page
       {
           return;
       }

       string username = "epiadmin";
       string password = "epiadmin";
       string email = "m.karlsson@outlook.com";

       // Guard so we don't try to create the same user and role again on every page load
       if (Membership.GetUser(username) == null)
       {
           Membership.CreateUser(username, password, email);

           if (!Roles.RoleExists("Administrators"))
           {
               Roles.CreateRole("Administrators");
           }

           Roles.AddUserToRole(username, "Administrators");
       }

       SetupMetaTags();
   }

Compiling and then browsing the start page adds my user to the database and I'm able to log in. Butt-ugly solution, I know, but quick as a weasel!

IntelliTrace bug causes "Operation could destabilize the runtime" exception


We can't use IntelliTrace when we use SimpleMembership to handle external authorizations.


Server Error in '/' Application.

Operation could destabilize the runtime.

Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
Exception Details: System.Security.VerificationException: Operation could destabilize the runtime.
Source Error:

An unhandled exception was generated during the execution of the current web request. Information regarding the origin and location of the exception can be identified using the exception stack trace below.

Stack Trace:

[VerificationException: Operation could destabilize the runtime.]
   DotNetOpenAuth.OpenId.Messages.IndirectSignedResponse.GetSignedMessageParts(Channel channel) +943
   DotNetOpenAuth.OpenId.ChannelElements.ExtensionsBindingElement.GetExtensionsDictionary(IProtocolMessage message, Boolean ignoreUnsigned) +282
   DotNetOpenAuth.OpenId.ChannelElements.<GetExtensions>d__a.MoveNext() +279
   DotNetOpenAuth.OpenId.ChannelElements.ExtensionsBindingElement.ProcessIncomingMessage(IProtocolMessage message) +594
   DotNetOpenAuth.Messaging.Channel.ProcessIncomingMessage(IProtocolMessage message) +933
   DotNetOpenAuth.OpenId.ChannelElements.OpenIdChannel.ProcessIncomingMessage(IProtocolMessage message) +326
   DotNetOpenAuth.Messaging.Channel.ReadFromRequest(HttpRequestBase httpRequest) +1343
   DotNetOpenAuth.OpenId.RelyingParty.OpenIdRelyingParty.GetResponse(HttpRequestBase httpRequestInfo) +241
   DotNetOpenAuth.OpenId.RelyingParty.OpenIdRelyingParty.GetResponse() +361
   DotNetOpenAuth.AspNet.Clients.OpenIdClient.VerifyAuthentication(HttpContextBase context) +136
   DotNetOpenAuth.AspNet.OpenAuthSecurityManager.VerifyAuthentication(String returnUrl) +984
   Microsoft.Web.WebPages.OAuth.OAuthWebSecurity.VerifyAuthenticationCore(HttpContextBase context, String returnUrl) +333
   Microsoft.Web.WebPages.OAuth.OAuthWebSecurity.VerifyAuthentication(String returnUrl) +192
   PrioMvcWebRole.Controllers.AccountController.ExternalLoginCallback(String returnUrl) in c:hiddenforyou
   lambda_method(Closure , ControllerBase , Object[] ) +127
   System.Web.Mvc.ReflectedActionDescriptor.Execute(ControllerContext controllerContext, IDictionary`2 parameters) +250
   System.Web.Mvc.ControllerActionInvoker.InvokeActionMethod(ControllerContext controllerContext, ActionDescriptor actionDescriptor, IDictionary`2 parameters) +39
   System.Web.Mvc.Async.<>c__DisplayClass39.<BeginInvokeActionMethodWithFilters>b__33() +87
   System.Web.Mvc.Async.<>c__DisplayClass4f.<InvokeActionMethodFilterAsynchronously>b__49() +439
   System.Web.Mvc.Async.<>c__DisplayClass4f.<InvokeActionMethodFilterAsynchronously>b__49() +439
   System.Web.Mvc.Async.<>c__DisplayClass37.<BeginInvokeActionMethodWithFilters>b__36(IAsyncResult asyncResult) +15
   System.Web.Mvc.Async.<>c__DisplayClass2a.<BeginInvokeAction>b__20() +34
   System.Web.Mvc.Async.<>c__DisplayClass25.<BeginInvokeAction>b__22(IAsyncResult asyncResult) +221
   System.Web.Mvc.<>c__DisplayClass1d.<BeginExecuteCore>b__18(IAsyncResult asyncResult) +28
   System.Web.Mvc.Async.<>c__DisplayClass4.<MakeVoidDelegate>b__3(IAsyncResult ar) +15
   System.Web.Mvc.Controller.EndExecuteCore(IAsyncResult asyncResult) +42
   System.Web.Mvc.Async.<>c__DisplayClass4.<MakeVoidDelegate>b__3(IAsyncResult ar) +15
   System.Web.Mvc.<>c__DisplayClass8.<BeginProcessRequest>b__3(IAsyncResult asyncResult) +42
   System.Web.Mvc.Async.<>c__DisplayClass4.<MakeVoidDelegate>b__3(IAsyncResult ar) +15
   System.Web.CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute() +523
   System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously) +176

Version Information: Microsoft .NET Framework Version:4.0.30319; ASP.NET Version:4.0.30319.17929

Connect to running web role on Azure using Remote Desktop Connection and VS2012


We want to be able to collect IntelliTrace information from our running app and also use Remote Desktop to connect to the IIS and look around (probably debugging).

1. Create certificate

1.1 Right-click the cloud project (marked in red) and select “Configure remote desktop”.


1.2 In the drop-down list of certificates, choose <create> at the bottom.

1.3 Follow the instructions; you can use the default values.

1.4 When done, choose the certificate and click "Copy to File…" as seen to the left in the picture above.

1.5 Save the file with any name you want.

Now we will save it to the local certificate store to be able to import it into our solution through the Azure configuration manager in step 3.

2. Save certificate to local storage

Now we need to add it to our local certificate store to be able to reach it from the configuration manager in Visual Studio. Microsoft provides the following steps for doing this:


In order to view the Certificates store on the local computer, perform the following steps:

  1. Click Start, and then click Run.
  2. Type "MMC.EXE" (without the quotation marks) and click OK.
  3. Click Console in the new MMC you created, and then click Add/Remove Snap-in.
  4. In the new window, click Add.
  5. Highlight the Certificates snap-in, and then click Add.
  6. Choose the Computer option and click Next.
  7. Select Local Computer on the next screen, and then click OK.
  8. Click Close , and then click OK.
  9. You have now added the Certificates snap-in, which will allow you to work with any certificates in your computer's certificate store. You may want to save this MMC for later use.
Now that you have access to the Certificates snap-in, you can import the server certificate into you computer's certificate store by following these steps:
  1. Open the Certificates (Local Computer) snap-in and navigate to Personal, and then Certificates.
    Note: Certificates may not be listed. If not, it is because there are no certificates installed.
  2. Right-click Certificates (or Personal if that option does not exist.)
  3. Choose All Tasks, and then click Import.
  4. When the wizard starts, click Next. Browse to the PFX file you created containing your server certificate and private key. Click Next.
  5. Enter the password you gave the PFX file when you created it. Be sure the Mark the key as exportable option is selected if you want to be able to export the key pair again from this computer. As an added security measure, you may want to leave this option unchecked to ensure that no one can make a backup of your private key.
  6. Click Next, and then choose the Certificate Store you want to save the certificate to. You should select Personal because it is a Web server certificate. If you included the certificates in the certification hierarchy, it will also be added to this store.
  7. Click Next. You should see a summary of screen showing what the wizard is about to do. If this information is correct, click Finish.
  8. You will now see the server certificate for your Web server in the list of Personal Certificates. It will be denoted by the common name of the server (found in the subject section of the certificate).
Now that you have the certificate backup imported into the certificate store, you can enable Internet Information Services 5.0 to use that certificate (and the corresponding private key). To do this, perform the following steps:
  1. Open the Internet Services Manager (under Administrative Tools) and navigate to the Web site you want to enable secure communications (SSL/TLS) on.
  2. Right-click on the site and click Properties.
  3. You should now see the properties screen for the Web site. Click the Directory Security tab.
  4. Under the Secure Communications section, click Server Certificate.
  5. This will start the Web Site Certificate Wizard. Click Next.
  6. Choose the Assign an existing certificate option and click Next.
  7. You will now see a screen showing that contents of your computer's personal certificate store. Highlight your Web server certificate (denoted by the common name), and then click Next.
  8. You will now see a summary screen showing you all the details about the certificate you are installing. Be sure that this information is correct or you may have problems using SSL or TLS in HTTP communications.
  9. Click Next, and then click OK to exit the wizard.
You should now have an SSL/TLS-enabled Web server. Be sure to protect your PFX files from any unwanted personnel.

Image of a typical MMC.EXE with the certificates snap-in open.



3. Import the certificate into your Visual Studio project.

3.1 Now right-click your equivalent of MvcWebRole1 (as seen in the first picture under the red oval) and choose Properties.

3.2 Choose Certificates. Click the ellipsis to the right of the "thumbprint" field and you should be able to select your newly created certificate here. After selecting it, save the file.



4. Upload the certificate to your Azure subscription.

4.1 Go to the Azure management portal, click the services menu icon to the left and choose the service. Click Upload in the bottom menu.




5. Connect to server.

Since I tried to use my account settings (you have to use another name), we have to set up a new name for the connection. No biggie.

5.1 Go to the Azure management portal, select your service and, in the bottom menu, choose "REMOTE". This will display the configuration for remote connections. It will actually change your ServiceConfiguration.cscfg file, so after you change it here it might be good to choose download and replace the one in your project. Set a name that is neither your Windows Azure account name nor Administrator.


5.2 Go to Visual Studio and open Server Explorer. Choose as selected in the picture below and click "Connect using remote desktop".


5.3 You will now be able to log in with the name and password set up in step 5.1.

And voilà! Windows Server 2012, IIS and other nice stuff!



For this one I've been using http://msdn.microsoft.com/en-us/library/windowsazure/ff683671.aspx, where you can find some of this information and more.

How to use Azure storage for uploading and displaying pictures.

Basic set up of Azure storage for local development and production.

This is something of a companion to the following guide from http://www.windowsazure.com/en-us/develop/net/how-to-guides/blob-storage/ that also includes a practical example of what I believe is a common use case, i.e. uploading and presenting an image from a user.


First we set up for local storage and then we configure for them to work on a web role.


1. Configure connection string locally.

2. Configure model, controllers and razor views.


1. Set up the connection string


1.1 Right click your web role and choose “Properties”.

1.2 Click Settings.

1.3 Add setting.

1.4 Name your setting. This will be the name of the connection string.

1.5 Click the ellipsis to the right. (The ellipsis appears when you select the field.)

1.6 The following window appears. Select "Windows Azure storage emulator" and click OK.
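For reference, steps 1.1–1.6 end up writing matching entries into the cloud project's service definition and configuration files. They look roughly like this (the setting name StorageConnectionString matches the one used in the code further down; use whatever you named yours in step 1.4):

```xml
<!-- ServiceDefinition.csdef: declares that the setting exists -->
<ConfigurationSettings>
  <Setting name="StorageConnectionString" />
</ConfigurationSettings>

<!-- ServiceConfiguration.Local.cscfg: points it at the storage emulator -->
<ConfigurationSettings>
  <Setting name="StorageConnectionString" value="UseDevelopmentStorage=true" />
</ConfigurationSettings>
```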



Now we have a connection string to use. To be able to use it we need to make sure we have the Windows Azure tools for storage.

2.1 Click Tools –> Library Package Manager –> Manage NuGet Packages for Solution.

2.2 This is what it looks like after it has been added.



Now on to what the code should look like.

3.1 First we need a view that collects images to upload. Here: Index.cshtml.

   @model List<string>

   @{
       ViewBag.Title = "Index";
   }

   <h2>Index</h2>
   <form action="@Url.Action("Upload")" method="post" enctype="multipart/form-data">
       <label for="file1">Filename:</label>
       <input type="file" name="file" id="file1" />
       <br />
       <label for="file2">Filename:</label>
       <input type="file" name="file" id="file2" />
       <br />
       <label for="file3">Filename:</label>
       <input type="file" name="file" id="file3" />
       <br />
       <label for="file4">Filename:</label>
       <input type="file" name="file" id="file4" />
       <br />
       <input type="submit" value="Submit" />
   </form>

   @foreach (var item in Model) {
       <img src="@item" alt="Uploaded image" />
   }

3.2 We need a controller to receive the post. Notice the "containername" string I send to the BlobHandler. I use this as a folder for the pictures for each user. If this is not a requirement you could just call it "container" or anything in lowercase letters directly when creating the container (Azure container names must be lowercase).

   public ActionResult Upload(IEnumerable<HttpPostedFileBase> file)
   {
       BlobHandler bh = new BlobHandler("containername");
       bh.Upload(file);
       var blobUris = bh.GetBlobs();

       // A List<string> cannot travel through RedirectToAction's route values,
       // so render the Index view directly with the URIs as the model.
       return View("Index", blobUris);
   }

3.3 The handler model. I’ll let the comments speak for themselves.

   public class BlobHandler
   {
       // Retrieve the storage account from the connection string.
       CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
           CloudConfigurationManager.GetSetting("StorageConnectionString"));

       private string imageDirectoryUrl;

       /// <summary>
       /// Receives the user's id for where the pictures are and creates
       /// a blob container with that name if it does not exist.
       /// </summary>
       /// <param name="imageDirectoryUrl"></param>
       public BlobHandler(string imageDirectoryUrl)
       {
           this.imageDirectoryUrl = imageDirectoryUrl;

           // Create the blob client.
           CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

           // Retrieve a reference to a container.
           CloudBlobContainer container = blobClient.GetContainerReference(imageDirectoryUrl);

           // Create the container if it doesn't already exist.
           container.CreateIfNotExists();

           // Make it available to everyone.
           container.SetPermissions(
               new BlobContainerPermissions
               {
                   PublicAccess = BlobContainerPublicAccessType.Blob
               });
       }

       public void Upload(IEnumerable<HttpPostedFileBase> file)
       {
           // Create the blob client.
           CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

           // Retrieve a reference to the container.
           CloudBlobContainer container = blobClient.GetContainerReference(imageDirectoryUrl);

           if (file != null)
           {
               foreach (var f in file)
               {
                   if (f != null)
                   {
                       CloudBlockBlob blockBlob = container.GetBlockBlobReference(f.FileName);
                       blockBlob.UploadFromStream(f.InputStream);
                   }
               }
           }
       }

       public List<string> GetBlobs()
       {
           // Create the blob client.
           CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

           // Retrieve a reference to the previously created container.
           CloudBlobContainer container = blobClient.GetContainerReference(imageDirectoryUrl);

           List<string> blobs = new List<string>();

           // Loop over the blobs within the container and collect the URI of each.
           foreach (var blobItem in container.ListBlobs())
               blobs.Add(blobItem.Uri.ToString());

           return blobs;
       }
   }

3.4 So, when the files have been uploaded we present them to our user on the index page. Pretty straightforward. In this example we only present the images by sending the URIs to the view. A better way would be to wrap them in a view model containing the URI, metadata, alternate text and other relevant information, but for this example this is all we need.


4. Now press F5 in your solution to try it out. You can see the storage emulator UI here:




4.1 If you get any exceptions or errors I suggest first checking that the service is running correctly. I had problems with this and they seemed related to the installation; a reboot fixed them.




5. Set up for cloud storage. To do this we need to add configuration for the cloud just as we did for local storage in step 1.

5.1 We need our keys to do this. Go to the Windows Azure management portal, select the storage icon and click "Manage keys". (The image is from a different blog post though.)



5.2 Do as in step 1, but replace step 1.6 with:

1.6 Choose "Manually entered credentials". Enter your account name.

1.7 Paste your Account Key from step 5.1 and click OK.



5.3. Save, publish and run!

Please feel free to ask any questions using the comments form at the bottom of this page. I will get back to you to help you solve any problems. Our consultancy also provides services in the Nordic region if you would like further support.

Connect to LocalDB using SQL Server Management Studio


I was trying to find my LocalDB database under localhost etc., but no luck.

The following led me to just connect to it. Kind of obvious really when you look at your connection string, but... it's Sunday morning or something.

From: http://blogs.msdn.com/b/sqlexpress/archive/2011/07/12/introducing-localdb-a-better-sql-express.aspx

High-Level Overview

After the lengthy introduction it's time to take a look at LocalDB from the technical side. At a very high level, LocalDB has the following key properties:

  1. LocalDB uses the same sqlservr.exe as the regular SQL Express and other editions of SQL Server. The application is using the same client-side providers (ADO.NET, ODBC, PDO and others) to connect to it and operates on data using the same T-SQL language as provided by SQL Express.
  2. LocalDB is installed once on a machine (per major SQL Server version). Multiple applications can start multiple LocalDB processes, but they are all started from the same sqlservr.exe executable file from the same disk location.
  3. LocalDB doesn't create any database services; LocalDB processes are started and stopped automatically when needed. The application is just connecting to "Data Source=(localdb)\v11.0" and LocalDB process is started as a child process of the application. A few minutes after the last connection to this process is closed the process shuts down.
  4. LocalDB connections support AttachDbFileName property, which allows developers to specify a database file location. LocalDB will attach the specified database file and the connection will be made to it.
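Given point 3 above, connecting from SQL Server Management Studio is then just a matter of typing the LocalDB instance name in the Connect to Server dialog instead of hunting for a service under localhost:

```
Server name:    (localdb)\v11.0
Authentication: Windows Authentication
```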


Add a SQL Azure database to an Azure web role and persist data with Entity Framework Code First.

By: Magnus Karlsson, http://geekswithblogs.net/MagnusKarlsson

In my last post I went for a warts-n-all approach to setting up a web role on Azure. In this post I'll describe how to add a SQL Azure database to the project. This will be described with as little code and as few screen dumps as possible. All questions are welcome in the comments area. Please don't email, since questions answered in the comments field are made available to other visitors.

As an example we will add a comments section to the site we used in the previous post (link here).


1. Create a Comment entity and then use scaffolding to set up a controller and views, and add the connection string to Web.config.

2. Create the SQL Azure database in the Management Portal and link the new database.

3. Test it online!


1. Right-click the Models folder, choose Add, choose "Class…". Name the class Comment.


1.1 Replace the Code in the class with the following:

using System.Data.Entity;

namespace MvcWebRole1.Models
{
    public class Comment
    {
        public int CommentId { get; set; }

        public string Name { get; set; }

        public string Content { get; set; }
    }

    public class CommentsDb : DbContext
    {
        public DbSet<Comment> CommentEntries { get; set; }
    }
}



Now Entity Framework can create a database and a table named Comment.

Build your project to make sure there are no build errors.


1.2 Right-click the Controllers folder, choose Add, choose "Controller…". Name the class CommentController and fill out the values as in the example below.




1.3 Click Add.

Visual Studio now creates default views for the CRUD operations, and a controller adhering to them, and opens them.


1.4 Open Web.config and add the following connection string in the <connectionStrings> node:

<add name="CommentsDb"
     connectionString="data source=(LocalDB)\v11.0;Integrated Security=SSPI;AttachDbFileName=|DataDirectory|\CommentsDb.mdf;Initial Catalog=CommentsDb;MultipleActiveResultSets=True"
     providerName="System.Data.SqlClient" />


1.5 Save All and press F5 to start the application.

1.6 Go to ~/Comments, which will redirect you through CommentsController to the Index view, which looks like this:




Click Create New. In the Create view, add a name and content and press Create.



   //
   // POST: /Comments/Create

   [HttpPost]
   public ActionResult Create(Comment comment)
   {
       if (ModelState.IsValid)
       {
           db.CommentEntries.Add(comment);
           db.SaveChanges();
           return RedirectToAction("Index");
       }

       return View(comment);
   }


The default view is Index, so that is the view you will come to, looking like this:

   //
   // GET: /Comments/

   public ActionResult Index()
   {
       return View(db.CommentEntries.ToList());
   }

Resulting in the following screen dump (success!):



2. Now, go to the Management portal and Create a new db.



2.1 With the new database created, click the DB icon in the leftmost menu, then click the newly created database. Click DASHBOARD in the top menu. Finally, click Connection strings in the right menu to get the connection string we need to add to our Web.Debug.config file.



2.2 Now take a copy of the connection string added to Web.config earlier and paste it into the <connectionStrings> node of Web.Debug.config.

Replace everything within the quotes in the copied connection string with the one you got from SQL Azure. You will have something like this:
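The result should look roughly like the following; the server name, user and password below are placeholders, so use the values from the connection string the portal gave you in step 2.1:

```xml
<add name="CommentsDb"
     connectionString="Server=tcp:yourserver.database.windows.net,1433;Database=CommentsDb;User ID=youruser@yourserver;Password=yourpassword;Trusted_Connection=False;Encrypt=True;"
     providerName="System.Data.SqlClient" />
```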



2.3 Rebuild the application, right-click the cloud project and choose "Package…" (if you haven't set up a publishing profile, which we will do in the next blog post).

Remember to choose the right config file; use Debug for staging and Release for production so your databases won't collide.

You should see something like this:



2.4 Go to Management Portal and click the Web Services menu, choose your service and click update in the bottom menu.



2.5 Link the newly created database to your application. Click LINKED RESOURCES in the top menu and then click "Link" in the bottom menu. You should get something like this:


3. Alright then. Under the Dashboard you can find the link to your application. Click it to open it in a browser and then go to ~/Comments to try it out just the way we did locally.

Success and end of this story!


Set up and configure an MVC4 project for a Cloud Service (web role) and SQL Azure

Update Oct. 29 2012: Got a question about how to upload Certificate files. See part 4 for instructions.

I aim to keep this blog post updated and to add related posts to it. Since there are a lot of these out there, I link to others that have done roughly the same before me, kind of a blog-DRY pattern that I'm aiming for. I also keep all mistakes and misconceptions in for others to see. As an example: if I hit a stack trace I will google it if I don't directly figure out the reason for it. I will then probably take the most plausible result and try it out. If it fails because I misinterpreted the error I will not delete it from the log but keep it for future reference and for others to see. That way people who find this blog can see multiple solutions for indexed stack traces, and I can better remember how to do stuff. To avoid my errors I recommend you read through it all before going from start to finish.
The steps:
  1. Set up the project in VS2012. (msdn blog)
  2. Set up Azure services. (half of the mpspartners.com blog)
  3. Set up connection strings and configuration files. (msdn blog + notes)
  4. Export certificates.
  5. Create an Azure package from VS2012 and deploy to staging (same steps as for production).
  6. Connection string errors.


  1. Set up the Visual Studio project:

 2. Then log in to Azure to set up the services:
Stop following this guide at the "publish website" part since we'll be uploading a package.

 3. When everything is set up (connection strings for debug and release and all), follow this guide to set up the configuration files:

Trying to package our application at this step will generate the following warning:
3>MvcWebRole1(0,0): warning WAT170: The configuration setting 'Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString' is set up to use the local storage emulator for role 'MvcWebRole1' in configuration file 'ServiceConfiguration.Cloud.cscfg'. To access Windows Azure storage services, you must provide a valid Windows Azure storage connection string.


Right-click the web role under Roles in Solution Explorer and choose Properties. Choose "Service configuration: Cloud". Under "Specify storage account credentials" we will copy/paste our account name and key from the Azure management portal.
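After filling in the credentials, the Cloud configuration file gets a real storage connection string instead of the emulator one, along these lines (account name and key are placeholders for the values from the management portal):

```xml
<Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString"
         value="DefaultEndpointsProtocol=https;AccountName=youraccount;AccountKey=yourkeyfromtheportal" />
```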

4. Right-click the cloud project (called BIAB in the screen shot).
Click "Remote Desktop Configuration". In the Remote Desktop Configuration window,
click the "View..." button next to the drop-down list. I have two certificates in my drop-down list. Remember the name of the certificate selected in the list, since you might wanna name the file you're exporting to something similar at the end of the process. You will do this procedure for each certificate you are using. Clicking "View..." will open a new window named "Certificate".
In this window, click the "Copy to file..." button in the lower right area of the window.
Select "Yes, export the private key" and click Next.
Keep the suggested export format and click Next. Name the certificate and choose Browse to save it somewhere you'll find it from the portal manager later.
Finally, click "Finish". Now do this again for your other certificates before we go to the portal manager.


With the certificates saved to files we can go to the portal manager to upload them to our Cloud Service.
In the portal manager: click "Cloud Services" in the left menu. Select your Cloud Service. Choose staging or production depending on the environment you are configuring. In the top menu to the right, select Certificates. Choose Upload in the bottom menu and browse for your certificates to upload them. Done.

5. Now right-click the cloud project and select Package.

5.1 Showing dialogue box.


5.2 Package success


Now copy the path to the packaged file and go to the management portal again.
Click your web role and choose staging (or production). Upload.

Tick the box about the single instance if that's what you want, or if you don't know what it means. Otherwise the following will happen (see the error under 5.5).
5.4 Dialogue box


When you have clicked the accept button you will see the following screen, with some green indicators down at the right corner. Click them if you want to see the status.
5.5 Information screen.

"Failed to deploy application.
The uploaded application has at least one role with only one instance. We recommend that you deploy at least two instances per role to ensure high availability in case one of the instances becomes unavailable."
To fix this, go to step 5.4.

If you forgot to (or just didn't know you were supposed to) export your certificates, the following error will occur.
Side note: the following thread suggests, to prevent this, "Enable Remote Desktop for all roles" when right-clicking BIAB and choosing "Package". But in my case it was the missing certificates. I found the solution here.

5.8 Success! 

5.9 Nice URL n' all. (More on that at another blog post).

6. If you try to log in and get an error:

When this error occurs, many web sites suggest it is because you need http://nuget.org/packages/Microsoft.AspNet.Providers.LocalDB

or http://nuget.org/packages/Microsoft.AspNet.Providers

But it can also be that you don't have the correct setup for transforming connection strings between your Web.config and your Web.Debug.config (or Web.Release.config, whichever you're using).
Run it as suggested on the "ordinary" project in your solution.
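A minimal sketch of such a transform in Web.Debug.config, assuming a connection string named CommentsDb as an example (the xdt: attributes are the standard Web.config transformation syntax):

```xml
<connectionStrings>
  <add name="CommentsDb"
       connectionString="your SQL Azure connection string here"
       xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
</connectionStrings>
```

This tells the build to find the entry with the same name in Web.config and overwrite its attributes when building that configuration.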

So, I hope you find my notes useful. Please feel free to send more emails, but I would really prefer if you wrote any questions here directly, since the answers would then benefit other visitors as well. Happy coding!





Entity Framework 4.1 Code First: use MSSQLSERVER instead of EXPRESS

This is my first blog post and will be short and concise. I will get back to formatting later.

Edit: Here is the same explanation but from MSDN :)

Been playing around with MVC, scaffolding and Razor for a couple of weeks now in my spare time. I think I should write some stuff down here to share with others!

I started out here, thanks to Steve Sanderson!

However, what is not included is how to set up your database if you prefer to use MS SQL Server instead of Express.
Take the name of the persistence class (probably NAMEOFPROJECTContext.cs) in your Models folder and use it like I use TestContext here:

<add name="TestContext"
        connectionString="Server=.;Database=TestContext.mdf;Integrated Security=SSPI;"
        providerName="System.Data.SqlClient" />

This created a TestContext.mdf database directly when my Initializedb ran.