Sunday, May 19, 2013

Things to Watch Out For with a Production System in Microsoft Azure

At Industrack, we migrated our production backend to Azure. Our platform is a real-time data collection system for GPS data streaming from remote devices. For the most part it is working well, but there are some gaps in services that I expected from Azure and that are hard to find from external vendors. Microsoft has a 99.9% SLA on its Azure services, which covers maintenance windows as well as the transient errors that arise from load throttling and other intermittent connection failures.

I have noticed that since Microsoft has been trying to get so much into the hands of developers lately, they are sacrificing some of their historical consistency. For example, we decided to upgrade our Enterprise Library from 4.1 to 5.0. Ent Lib 6.0 was not an option for us at this point, as we had dependencies on shared libs that have to work on Windows XP, and Ent Lib 6.0 is compiled against .NET 4.5, which is not supported on Windows XP. That said, I ended up having to recompile the Azure Transient Fault Handling Block locally anyway, as it had dependencies on an earlier version of the Azure SDK than the one I was using.

For those using any Azure services, you should be using the Transient Fault Handling Block to handle retries against the service. It has some nice classes that let you define different retry strategies.
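As a rough sketch of what that looks like (this assumes the standalone Microsoft.Practices.TransientFaultHandling namespace from the block; the names differ slightly if you use the Enterprise Library integration assemblies), a retry policy for SQL Azure calls can be wired up like this:

```csharp
using System;
using Microsoft.Practices.TransientFaultHandling;

class RetryExample
{
    static void Main()
    {
        // Retry up to 3 times, starting at 1 second and backing off
        // by 2 more seconds on each attempt.
        var strategy = new Incremental(3, TimeSpan.FromSeconds(1), TimeSpan.FromSeconds(2));

        // SqlAzureTransientErrorDetectionStrategy decides which exceptions
        // are transient (throttling, dropped connections) and worth retrying.
        var policy = new RetryPolicy<SqlAzureTransientErrorDetectionStrategy>(strategy);

        policy.Retrying += (sender, e) =>
            Console.WriteLine("Retry {0} after: {1}", e.CurrentRetryCount, e.LastException.Message);

        policy.ExecuteAction(() =>
        {
            // Open the connection / execute the command here; transient
            // failures are retried, permanent failures still throw.
        });
    }
}
```

The same pattern works for Service Bus and storage calls by swapping in the matching error detection strategy.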

SQL Azure has been working well, although we ran into an instance where we ran out of space in the database tier we had (5GB), and we lost 2 hours of data before we noticed that connections were failing with an exception that described the problem. It seems like the admin on the Azure account SHOULD have received a notification of the condition so it could be resolved. The fix is easy: go to the admin screen, pick the next tier of SQL, and click save. But some notification here would have saved me some big headaches.

We are using the Service Bus to consolidate the data from the different endpoints we support, which has also been working well. However, I recommend spending time learning about Prefetch, Dead-lettering, and PeekLock to use the queues in an efficient manner.
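A minimal sketch of those three features together (the connection string and queue name are placeholders, and this assumes the Microsoft.ServiceBus.Messaging API from the Azure SDK of that era):

```csharp
using Microsoft.ServiceBus.Messaging;

class QueueExample
{
    static void Main()
    {
        // PeekLock locks the message instead of deleting it on receive,
        // so a crash before Complete() does not lose data.
        var client = QueueClient.CreateFromConnectionString(
            "<connection-string>", "<queue-name>", ReceiveMode.PeekLock);

        // Prefetch pulls batches of messages into a local cache,
        // cutting round trips when the queue is busy.
        client.PrefetchCount = 50;

        BrokeredMessage msg = client.Receive();
        if (msg != null)
        {
            try
            {
                // ...process the message...
                msg.Complete();   // remove it from the queue
            }
            catch (System.Exception)
            {
                msg.Abandon();    // release the lock for redelivery
            }
        }
        // Messages that exceed the queue's MaxDeliveryCount are moved to
        // the dead-letter sub-queue automatically for later inspection.
    }
}
```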

We are just starting to use SQL Reporting in Azure, and so far it looks promising. If you have a current installation of SSRS that you are thinking of migrating, make sure you look at the list of things NOT supported in SSRS for Azure so you are aware of everything in your current reports that might not work, such as scheduling and delivery.

My only real disappointment so far is in the Azure VM area. This has proven very unreliable, with the VMs restarting unannounced on the order of once per week. I have no idea why a VM restarts, but from what I have read it can happen due to hardware failures, with the platform moving the VM to different hardware and restarting it. But again, with NO NOTIFICATION! We cannot manage something that does not alert us when actions like this are being taken.

In general, the whole monitoring area of Azure is missing. When you consider the tools Microsoft has for monitoring, such as System Center, and the capabilities they bring to the table, the fact that these features are missing from Azure is really disappointing. If that kind of monitoring were available out of the box, Azure would truly be a world-class platform, differentiating itself from Amazon, Google, etc.

Posted On Sunday, May 19, 2013 9:27 AM | Comments (0)

Monday, January 28, 2013

TFS In the Cloud!

Late last year Microsoft launched TFS in the cloud. It is free for teams of five users or fewer. Microsoft hasn’t completely worked out the pricing for larger teams yet, but that should be out soon. This is a great opportunity to create an account that can be used to learn the ins and outs of TFS. The service does not have SharePoint or Reporting like the full-blown TFS does, but for getting familiar with concepts like Work Items, Changesets, Shelvesets, Branching, and Merging, this is a place where all of that can be experimented with and used for your own personal development projects. Cloud TFS does have build services deployed, so here is another great learning opportunity to see how the TFS build system works without having to find a server or use your local machine.

Instead of SharePoint and Reports for online viewing, there is a web portal that gives users of the project the ability to view work items and reports exposing project health, task burndown, and other visualizations of project data. This eliminates a lot of the extra stuff that comes with SharePoint and SSRS, which tends to distract customers from TFS and muddy the water by losing focus on the data that TFS provides.

The same alerting functionality is available in the cloud version, so emails can be sent for events like Work Item Assigned or Code Checked In, keeping team members aware of what is happening on the project. This can be challenging to set up at large customers, who need many approvals to tie a tool into their internal email systems.

There is much more source control visibility in the cloud implementation of TFS than there is in the current web portal and SharePoint sites. From the web you can view the history of a file and compare versions right in the browser. This is handy for teams that might not be using Visual Studio as their IDE because they are developing in Java or other languages that aren’t native to Microsoft. It is also very useful for seeing changes in non-compiled files such as HTML and XML without having to have Visual Studio installed.

One of the challenges with TFS is that it is such a comprehensive tool. Most organizations have already built out all of the pieces that TFS brings together using other tools, or in many cases the same tools, just not coupled the way TFS couples all project information together. Demonstrating how much more informative their project information can be with TFS usually requires a significant amount of time from project management, infrastructure support, project participants, and up to the ‘C’-level people who can use the information to make critical decisions based on project data. But with TFS in the cloud, a pilot project can be put together quickly, and the focus can be on the power that TFS brings to bear rather than on the effort required to implement it.

Posted On Monday, January 28, 2013 12:45 PM | Comments (0)

Friday, November 16, 2012

Problem Solving vs. Solution Finding

By and large, most developers fall into one of two camps. I will try to explain what I mean by way of example.

A manager gives a developer a task communicated like this: “Figure out why control A is not loading on this form”. Right there it could be argued that the manager should have given better direction and said something more like: “Control A is not loading on the form; fix it”. They might sound like the same thing to most people, but the first statement will have the developer problem solving the reason why it is failing, while the second should have the developer looking for a solution to make it work, not focusing on why it is broken. In the end they might amount to the same thing, but I usually see the first approach take far longer than the second.

The Problem Solver:

The problem solver’s approach to fixing something that is broken is to take the error or behavior being observed and research it using a tool like Google or another search engine. Seven times out of ten this will yield results for the most common of issues. The challenge is in the other 30% of issues, which take the problem solver down the rabbit hole and cause them not to surface for days on end while every avenue is explored for the cause of the problem. In the end they will probably find the cause of the issue and resolve it, but the cost can be days or weeks of work.

The Solution Finder:

The solution finder’s approach to a problem begins the same way the problem solver’s does. The difference comes in the more difficult cases. Rather than stick to the pure “this has to work, so I am going to work with it until it does” approach, the solution finder will look for other ways to satisfy the requirements, which may or may not use the original approach. For example, suppose there are two areas of an application with externally equivalent features, meaning that from a user’s perspective the behavior is the same. Say that for whatever reason area A is now not working, but area B is. The problem solver will dig in to see why area A is broken, whereas the solution finder will investigate the difference between the two areas and solve the problem by potentially working around it.

The other notable difference between the two types of developers is the point they reach before they re-emerge from their task. The problem solver will likely emerge with a triumphant “I have found the problem”, whereas the solution finder will emerge with the more useful “I have the solution”.


At the end of the day, users are what drive features in software development. Without users there is no need for software. In today’s world of software development, with so many tools to use and generally tight schedules, I believe that a workaround that takes 8 hours is a more fruitful approach than the purer solution to the problem that takes 40 hours.

Posted On Friday, November 16, 2012 11:07 AM | Comments (0)

Tuesday, July 31, 2012

How long before I have to pay?

I have some concerns about where I see Microsoft going in terms of making developers pay to use tools they already own. For example, to test out Windows Phone development I had to pay the $99 fee just to deploy MY test app onto MY phone to test against actual hardware rather than the emulator. Now, if I chose to upload my app to the App Store and use all of the services available there, I would gladly pay the fee, as I understand that all of that infrastructure doesn't pay for itself (although maybe if there were more apps it would... chicken and egg?). I have been working with Metro apps on Windows 8 and had to update my "Developer License" today. At the moment it is free, but how long before I have to pay $99 to develop and debug locally there as well? The point I am getting to is that Microsoft has done an amazing job over the years of getting tools into developers' hands to proliferate its footprint in the marketplace. The complaint I hear most from developers today is the 'entry fee' for working on Windows Phone, with the argument that "developing on Android is free". My concern is that the new model of making developers "pay to play" will keep many away from what is an amazing platform, both in terms of Windows Phone 7 and Windows 8 Metro.

Posted On Tuesday, July 31, 2012 10:13 AM | Comments (2)

Thursday, June 21, 2012

Managing common code on Windows 7 (.NET) and Windows 8 (WinRT)

Recent announcements regarding Windows Phone 8, and the fact that it will have WinRT behind it, might make some of this less painful. In the meantime, I discovered that the "XmlDocument" object lives in a new location in WinRT and is almost, but not quite, the same as its .NET sibling:

  • System.Xml.XmlDocument (.NET)
  • Windows.Data.Xml.Dom.XmlDocument (WinRT)

The problem I am trying to solve is how to work with both types in code that performs the same task on both the Windows Phone 7 and Windows 8 platforms. The first thing I did was define my own XmlNode and XmlNodeList classes that wrap the actual Microsoft objects, so that the "#if" compiler directive selects either the WinRT version of the type or the .NET version, and the calling code can work with the wrapper easily.

    public class XmlNode
    {
    #if WIN8
        public Windows.Data.Xml.Dom.IXmlNode Node { get; set; }

        public XmlNode(Windows.Data.Xml.Dom.IXmlNode xmlNode)
        {
            Node = xmlNode;
        }
    #else
        public System.Xml.XmlNode Node { get; set; }

        public XmlNode(System.Xml.XmlNode xmlNode)
        {
            Node = xmlNode;
        }
    #endif
    }

    public class XmlNodeList
    {
    #if WIN8
        public Windows.Data.Xml.Dom.XmlNodeList List { get; set; }

        public int Count { get { return (int)List.Count; } }

        public XmlNodeList(Windows.Data.Xml.Dom.XmlNodeList list)
        {
            List = list;
        }
    #else
        public System.Xml.XmlNodeList List { get; set; }

        public int Count { get { return List.Count; } }

        public XmlNodeList(System.Xml.XmlNodeList list)
        {
            List = list;
        }
    #endif
    }

From there I can use my XmlNode and XmlNodeList in the calling code without having to clutter that code with all of the additional #if switches. The remaining challenge was that the code working directly with the XmlDocument object needed to be separate for each platform, since the method for populating the XmlDocument is completely different on each.
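As an illustration (the method and parameter names here are hypothetical, not from the original project), shared code can now consume the wrappers without any #if blocks:

```csharp
// Hypothetical consumer of the wrapper classes: no #if WIN8 needed here,
// because the platform split lives entirely inside XmlNode/XmlNodeList.
public static bool HasForecastDays(FishingControls.XmlNodeList days)
{
    // Count is exposed with the same signature by both wrapper variants,
    // whether List holds the WinRT or the .NET node list underneath.
    return days != null && days.Count > 0;
}
```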

To solve this issue, I made partial classes: one partial class for .NET and one for WinRT. Both projects link to the partial class file that contains the code common to the majority of the class, while the platform-specific partial class file contains the code unique to that version of XmlDocument.


The files with the little arrow in the lower left corner are 'linked files'; they are shared in multiple projects but exist in only one location in source control. You can see that the _Win7 partial class is included directly in the project, since it includes code that is only for the .NET platform, whereas its cousin the _Win8 class (not pictured above) has all of the code specific to the Win8 platform.

In the _Win7 partial class is this code:

    public partial class WUndergroundViewModel
    {
        public static WUndergroundData GetWeatherData(double lat, double lng)
        {
            WUndergroundData data = new WUndergroundData();
            System.Net.WebClient c = new System.Net.WebClient();
            string req = "[LAT],[LNG].xml";
            req = req.Replace("[LAT]", lat.ToString());
            req = req.Replace("[LNG]", lng.ToString());
            XmlDocument doc = new XmlDocument();
            doc.Load(c.OpenRead(req));
            // item is a System.Xml.XmlNode here, so InnerText is read directly
            foreach (XmlNode item in doc.SelectNodes("/response/features/feature"))
            {
                switch (item.InnerText)
                {
                    case "yesterday":
                        ParseForecast(new FishingControls.XmlNodeList(doc.SelectNodes("/response/forecast/txt_forecast/forecastdays/forecastday")),
                            new FishingControls.XmlNodeList(doc.SelectNodes("/response/forecast/simpleforecast/forecastdays/forecastday")), data);
                        break;
                    case "conditions":
                        ParseCurrent(new FishingControls.XmlNode(doc.SelectSingleNode("/response/current_observation")), data);
                        break;
                    case "forecast":
                        ParseYesterday(new FishingControls.XmlNodeList(doc.SelectNodes("/response/history/observations/observation")), data);
                        break;
                }
            }
            return data;
        }
    }

In the _Win8 partial class is this code:

    public partial class WUndergroundViewModel
    {
        public async static Task<WUndergroundData> GetWeatherData(double lat, double lng)
        {
            WUndergroundData data = new WUndergroundData();
            HttpClient c = new HttpClient();
            string req = "[LAT],[LNG].xml";
            req = req.Replace("[LAT]", lat.ToString());
            req = req.Replace("[LNG]", lng.ToString());
            HttpResponseMessage msg = await c.GetAsync(req);
            string stream = await msg.Content.ReadAsStringAsync();
            XmlDocument doc = new XmlDocument();
            doc.LoadXml(stream, null);
            foreach (IXmlNode item in doc.SelectNodes("/response/features/feature"))
            {
                switch (item.InnerText)
                {
                    case "yesterday":
                        ParseForecast(new FishingControls.XmlNodeList(doc.SelectNodes("/response/forecast/txt_forecast/forecastdays/forecastday")),
                            new FishingControls.XmlNodeList(doc.SelectNodes("/response/forecast/simpleforecast/forecastdays/forecastday")), data);
                        break;
                    case "conditions":
                        ParseCurrent(new FishingControls.XmlNode(doc.SelectSingleNode("/response/current_observation")), data);
                        break;
                    case "forecast":
                        ParseYesterday(new FishingControls.XmlNodeList(doc.SelectNodes("/response/history/observations/observation")), data);
                        break;
                }
            }
            return data;
        }
    }


This method allows me to have common 'business' code for both platforms that is pretty clean, and I manage the technology differences separately. Thank you tostringtheory for your suggestion, I was considering that approach.

Posted On Thursday, June 21, 2012 12:40 PM | Comments (0)

Wednesday, June 13, 2012

TFS Rant *WARNING* negative opinions are being expressed.

It has happened several times now that I end up installing TFS "over the shoulder" of the system admin whose job it will be to "own" the server when I am gone. TFS is challenging enough to stand up when doing it myself on a completely open platform, but at these locations networks are locked down, machines are locked down, and the unexpected always seems to pop up.

I personally have the tolerance for these things as a software developer, but as we are installing I have to listen to all of the 'colorful' remarks being made: "why is it like this?" or "this is a piece of crap". Generally the issues center around SharePoint integration. TFS on its own is straightforward, but the last flavor in everyone's mouth is the SharePoint piece.

As a product I like SharePoint, but installation is a nightmare. In this particular case we are going to use WSS, since the customer would like this separate from their corporate SharePoint 2010 installations; their dev team is really small (one developer), and TFS is being used as a VSS replacement more than as a full-blown ALM tool. The server where it is being installed has a Cisco Security Agent on it that seems to block 'suspicious' activity and, as far as I can tell, is preventing WSS from installing properly. The most confounding thing is that we can find no meaningful log entries to help diagnose the issue.

It didn't help matters that when we tried to contact Microsoft for support, because we mentioned TFS in the list of things we were trying to install, we waited 2 hours and got a TFS support person, NOT the SharePoint person we really needed. After another 2 hours, the SharePoint support person we did get managed to corrupt the registry sufficiently with his 'tools' that we ended up starting over from scratch the next day anyway, after going home at midnight.

My point is this: the system administrator who is going to own this now thinks it is a piece of crap because SharePoint wouldn't install properly. Perception is everything. Everyone today is conditioned to software that installs and works in a very simple manner. When looking at the different options to install TFS with the different "modes", there is inconsistency in the information being presented, which leads to choices that cause headaches and this bad perception before the product is even installed.

I am highlighting this because I love TFS as a product, but I HATE installing it, and would like it to install as simply and elegantly as the product operates once it is installed.

Posted On Wednesday, June 13, 2012 2:11 PM | Comments (0)

Tuesday, June 12, 2012

Windows Phone 7 v. Windows 8 Metro “Same but Different”

I have been doing development on both the Windows Phone 7 and Windows 8 Metro style applications over the past month and have really been enjoying doing both. What is great is that Silverlight is used for both development platforms. What is frustrating is the "Same but Different" nature of both platforms. Many similar services and ways of doing things are available on both platforms, but the objects, namespaces, and ways of handling certain cases are different.

I almost had a heart attack when I thought that XmlDocument had been removed from the new WinRT. I was relieved (but a little annoyed) when I found out it had shifted from the "System.Xml" namespace to the "Windows.Data.Xml.Dom" namespace. In my opinion this is worse than deprecating and reintroducing it, since there isn't the lead time to know the change is coming, make changes, and adjust.

I also think this breaks the compatibility that is advertised between WinRT and the .NET Framework from a programming perspective, as the code base will have to be physically different when compiled for one platform versus the other. Which brings up another issue: the need for separate DLLs for the different platforms that contain the same C# code behind them, which seems like the beginning of a code maintenance headache.

Historically, I have kept source files "co-located" with the projects they are compiled into. After doing some research, I think I will end up keeping "common" files that need to be compiled into DLLs for the different platforms in a separate location in TFS, not directly included in any one Visual Studio project, but added as links in the projects that get compiled for Windows Phone 7 or Windows 8. This will work fine, except for the case where dependencies don't line up for each platform as described above, but it will work fine for base classes that do the raw work at the most basic programming level.

Posted On Tuesday, June 12, 2012 1:24 PM | Comments (1)

Monday, June 4, 2012

Windows 8 Location Services


I spent the afternoon with the Geolocator object in WinRT on the Windows 8 platform. I have also been doing Windows Phone 7 development, and first had to wrap my head around the fact that while similar, it is not the same as the GeoCoordinateWatcher in that environment. I found a nice example here.
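A minimal sketch of the WinRT pattern (the surrounding method is illustrative; the types come from the Windows.Devices.Geolocation namespace): instead of subscribing to GeoCoordinateWatcher events, you await a single position fix.

```csharp
using System.Threading.Tasks;
using Windows.Devices.Geolocation;

// Illustrative only: await one position fix from the WinRT Geolocator.
public async Task<string> GetPositionAsync()
{
    var locator = new Geolocator();
    Geoposition pos = await locator.GetGeopositionAsync();
    return string.Format("{0}, {1}",
        pos.Coordinate.Latitude, pos.Coordinate.Longitude);
}
```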

But the behavior of my app wasn't the same, even after ensuring that location services was enabled by following these instructions:

Location services was still disabled. From everything I read, it sounded like the first time you try to use the Geolocator object, the user would be prompted to allow "Access to your location". After nosing around, I found the issue: you need to add location as a capability in the Package.appxmanifest file.
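Checking the Location box on the Capabilities tab of the manifest designer produces an entry like this in Package.appxmanifest (a sketch of just the relevant fragment):

```xml
<Capabilities>
  <!-- Without this entry, the Geolocator reports location access as
       disabled and the user is never prompted to grant access. -->
  <DeviceCapability Name="location" />
</Capabilities>
```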


After checking the box, I was prompted to allow access to location services as expected the first time I needed to use it.

Posted On Monday, June 4, 2012 5:01 PM | Comments (0)

Friday, April 27, 2012

Installing TFS 2011 Beta in a Virtual Machine


This was about as straightforward an affair as one would think for installing a beta product, once I got over my own memory lapses of what to do first.

VMWare Player:

After spending about an hour trying to find a 32-bit version of Windows Server 2008 (which will be the last 32-bit server OS, according to Microsoft), I gave up and capitulated to using VMWare, since VirtualPC does NOT support 64-bit operating systems. Microsoft's answer to the question "What am I supposed to use?" is Hyper-V, but again, that seemed like more work.


Now that Server 2008 is installed and running (the edition doesn't matter for this), I installed SP1, and a day or so later all of the updates were applied and I was finally ready to start installing TFS! Make sure your VM has a minimum of 4GB of memory allocated, or the install will tell you "You do not have enough memory". When installing SharePoint 2010, it will tell you that performance will be affected unless you have 10GB of RAM installed, but it will work. It is little wonder that MS is ceasing support for 32-bit OSs after 2008, since it seems the memory required by their products cannot be addressed by 32-bit systems anymore.

TFS 2011 Beta:

Maybe it is because I have installed every version of TFS since the first that I can't remember what the installer does versus what I need to do first (or maybe it is that I am almost 40), but after the installer told me that Reporting Services wasn't installed, I remembered that I needed to install SQL Server 2008 first. Make sure to install EVERYTHING if you are doing a standalone instance:

  • Reporting Services
  • Analysis Services
  • Et Al.

SQL Server 2008 R2 SP1 + Hotfixes:

OK, start installing SQL 2008 R2. After that completes, there are a few other things that need to be installed to get SQL up to snuff for TFS 2011. The error message you get from the TFS installer includes a link you can follow to request access to the additional items:

  • Service Pack1 with cumulative updates
  • MasterDataServices
  • RS SharePoint (I never did get this to install, it kept rolling back, but TFS didn’t complain during the install and appears to work)

It seems like I always forget this step as well, but TCP/IP needs to be enabled in SQL Server for TFS to install.


After 3 days (off and on, mind you) I have a working TFS 2011 Beta VM, installed and ready for configuration and playing.

Now I just need to figure out how to get my VS2011 Beta running outside of the VM to talk to it… stay tuned…

Posted On Friday, April 27, 2012 2:09 PM | Comments (0)

Monday, April 23, 2012

VS2011 Monochromatic icons, where’s the color?


I think everyone who uses Visual Studio can agree that Microsoft has packed an amazing amount of functionality into one tool. One thing that makes that possible is the ability to get to all of that functionality quickly, without spending a lot of time sitting around RTFM. I am not sure about the direction the monochromatic icons in VS 2011 are going.

Here is a snippet from the VS2011 tool bar


Here is a snippet from the same tool bar in VS2010


I narrow down all of the little icons I am looking at very quickly by color association. With that gone, I find I spend more time looking at the shape of each icon to understand what its function is.

I know that I am getting older, and as a result am fighting hard against the curmudgeon tendencies that go along with it, but it seems to me that an important dimension, the quick association of icons by color, is being removed in favor of style preferences.

Posted On Monday, April 23, 2012 2:26 PM | Comments (1)