SharePoint Saturday Charlotte 2014

A couple of weeks ago, I presented How Carolinas HealthCare System Governs SharePoint at the Charlotte SharePoint Saturday and I had a great time. The people who attended my session asked lots of questions, which always helps. This was my fourth time presenting at a community Saturday event (SPS Charlotte 2013, SPS Richmond 2013, and Carolina Code Camp 2013) and all of them have been well organized and run.

I thought it would be useful to share my team's experience introducing SharePoint governance over the last three years at CHS (Carolinas HealthCare System). We've learned some great lessons and picked up solid, practical tips for implementing governance in large organizations. Our governance is still a work in progress, but anyone attempting to impose governance should understand that it may take years to get to where you need to be.

For those interested, I've uploaded my slide deck to the site.

Upgrading a low end laptop–What a difference an SSD can make!

I upgraded my daughter’s laptop this weekend, with some incredible results.

We bought this laptop three years ago for general home use and over time it became my daughter’s.  We only spent $350 on it at Best Buy, where it was one of their back to school summer specials.  For the price, it had decent specs and has proven to be rather solid.  It’s a Lenovo B570 with 4GB of RAM, an Intel Pentium processor, a 720p 15” screen, and a 500GB Western Digital Scorpio Blue hard drive.

It’s still running Windows 7 x64 Home Premium.

Since we bought it, I’ve had some experience with machines using solid state drives (SSDs) instead of traditional spinning hard drives.  The price has really come down, so I decided to buy one as an upgrade for this laptop.

I bought the replacement drive from Newegg, a 256GB Crucial MX100 SSD.  When deciding on which SSD to get, I looked at just a few key specs: capacity, IOPS, price, and name brand.  I knew she’d need at least a 256GB drive, since it would be replacing a 500GB drive.  256GB turned out to be plenty, once I cleaned off some of the apps and data that had accumulated over the years.

IOPS is the standard measure of input/output operations per second – the higher the number, the better.  I have a Samsung SSD in my home desktop, rated at 94k IOPS read and 35k IOPS write, and I’ve been happy with it.  The Crucial drive, which is a year newer and has twice the capacity for the same price as I paid for that Samsung, is rated at 85k IOPS read and 70k IOPS write.

The price was pretty good – Newegg was running a special, so I was able to get it for $99, but the price as I write this is $122.  As I said, I paid around that for a slower drive with half the capacity in 2013.

The last decision point was the name brand.  I wanted one that I had at least heard of, so drives from Samsung, Crucial, or Kingston would have been fine.

One last part I found necessary was an EZ-Connect kit that I picked up from Microcenter the last time I was in one of their stores.  This kit has the cables to connect from USB 2.0 to SATA and IDE drives, as well as providing a power cable.  This allowed me to hook up the new drive to the laptop via USB, and copy the contents of the existing hard drive to the new drive (using Acronis True Image software, which came with the kit or the SSD).  It was all very straightforward.

After I copied/cloned the drive, I powered down the laptop and swapped out the old drive for the new drive.  The Lenovo was pretty simple to open and the hard drive compartment was clearly labeled.

I then booted up the laptop using the new drive.

I timed three things before I swapped the drives, so I could measure the difference:

  • Time from when it is powered on until the Windows login prompt is displayed
  • Time from when I log in until the Windows mouse cursor stops displaying “wait” (aka – spinning wheel)
  • Time when shutting down while logged in

With the old drive, these times were:

  • 42 seconds
  • 36 seconds
  • 45 seconds

With the new drive, these times are now:

  • 20 seconds
  • 10 seconds
  • 15 seconds


That’s better than twice as fast for every measure!! Wow.

I also checked out the Windows Experience Index (Windows 7) and the “Disk data transfer rate” went from a score of 5.7 to 7.9!

With this kind of improvement, I expect this laptop will be very usable for at least another couple of years.

Living without a laptop, Part II, aka: Why I cried uncle

About this time last spring, I wrote about how I was going to start Living without a laptop.  I had just switched jobs and had to turn in a good work laptop, which was replaced by a desktop at my new job. After living and working like this for a year, I have finally caved in and ordered a laptop for work.

So how did I get here? Well, for the last year my main computing devices have been: an HP desktop (z400) at work, a custom built AMD desktop at home, and an original Surface RT tablet.  Whenever I was away from my desk at work, I could use my Surface to keep notes (OneNote – awesome) and also keep up with my Outlook.  I was able to do this because I had my Surface connected to the secure Wi-Fi network at the office.

I also had the option of using Remote Desktop to connect to my office workstation if I needed to do something that Windows RT couldn’t do.  A lot of this functionality hinged on being able to connect to the office network.  This all worked until an update to my Surface RT in January killed my ability to do just that.

Surface RT/Windows RT has its limits

Our private wireless network at the office requires that I enter my username and password, which it still prompts me for after the update, but it also prompts me for a network key which we don’t have and the Surface didn’t require before.  Since there isn’t a big group of us Surface RT users, I fell into the category of unsupported (on your own) users at work.

There is still a guest network that I can use, which allows my Surface to get out to the internet, where I store my OneNote notes.  However, I can’t connect using our remote access options, because I can’t install Java for the VPN, nor can I get the Citrix Receiver application to work with our Citrix remote option. Without these remote options, I can’t use Outlook and I can’t use Remote Desktop, making my Surface RT rather limited. (For those wondering, I went so far in my troubleshooting that I completely wiped my Surface and reinstalled everything – hoping that it would work like it did before, but I had no luck with that.)

I also found another scenario where the Surface isn’t a laptop replacement – trying to use the keyboard without a proper desk/table available.  I first ran into this when I attended a conference in 2012.  The conference was set up with rows of chairs, but no tables in front of them.  I had the choice of either using the on-screen keyboard to take notes (which isn’t a great experience for more than brief notes) or using the keyboard and trying to balance it on my lap. My typing would often make the Surface move/teeter, the kickstand didn’t allow a good viewing angle for the screen, and it didn’t feel good as it cut into the tops of my legs.

This isn’t the only place where a table wasn’t available.  Turns out I missed being able to use the keyboard while seated on our couch in front of the TV.  Most people will say that tablets are great for being in front of the TV. Tablets are great for consuming things, like catching up on Twitter or Facebook, or reading blogs, but I’m just not that productive doing email or development without my keyboard.

To the cloud – new goal at the office

While all of this was happening with my mobile options, the game was also changing when it came to my need for a powerful desktop. One of the reasons I needed this computing power was so I could run a SharePoint development environment.  This included running not only Visual Studio, but also the SharePoint server software, which takes gobs of memory and storage. So how did this requirement change? We adopted two changes – moving to Office 365 and using Azure as part of our MSDN subscriptions.

First, we’re moving to SharePoint Online and won’t have SharePoint on-premises in a few months.  This changes our development to be focused on more mainstream web technologies, which don’t require that you develop on a SharePoint server.

Second, Microsoft introduced Azure credits with an MSDN subscription.  I can now use those to spin up virtual machines in Azure for development environments that require that I be running on a server, as some SharePoint projects do (maintaining legacy sandbox solutions).  For those that don’t, I can simply use Visual Studio on my workstation.

After thinking through all of this, I decided to request an Ultrabook laptop for work.  I got a nice little HP Elitebook Folio 9470m, which is pretty slim/small, making it a good note taking device for meetings, but I can also use it to run Outlook and all of our remote access options work.  I also have a docking station and large monitors on my desk, so when I’m there I’m not limited to the smallish screen of the laptop.


I’m still using my home desktop when I’m telecommuting, but if I need to catch up on email, I now have the option of doing so from my couch.

User Profile Web Service error when accessing from InfoPath Forms Services

In the last couple of weeks, we had to move a custom application from one SharePoint 2010 farm to another.  We tested our migration method by moving it to a test farm first, which highlighted a few things that we needed to change, but it was otherwise successful.

However, when we moved it to the new production farm, we ran into an error that we hadn’t seen in test.  It occurred when we opened an InfoPath form that was calling the User Profile web service.  The error presented to the user was a dialog with this: “An entry has been added to the Windows event log of the server. Log ID:7893”.

When we looked it up in SharePoint’s ULS log, we found this entry:

    Area:     InfoPath Forms Services
    Category: Runtime - Data Connection
    Message:  The following data connection (GetUserProfileByName) has exceeded the maximum configured time limit. This threshold can be configured by using the SPIPFormsService -MaxDataConnectionRoundTrip PowerShell commandlet

Scouring the internet, I found tips suggesting that the identity of the application pool for the “SharePoint Web Services Root” should be something other than LocalService.  However, I then found a blog post by Spencer Harbar stating that LocalService was fine.

I also found a tip that suggested increasing the timeout limits on the InfoPath Forms Services. I tried that without luck.
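For anyone who wants to try the same thing, those limits live in the farm’s InfoPath Forms Services settings and can be changed from the SharePoint 2010 Management Shell. This is only a sketch – the millisecond values are made-up examples, not recommendations:

```powershell
# Raise the InfoPath Forms Services data connection timeouts (in milliseconds).
# The 20000/25000 values are illustrative only - pick ones that fit your farm.
Set-SPInfoPathFormsService -DefaultDataConnectionTimeout 20000 `
                           -MaxDataConnectionTimeout 25000
```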

After discounting everything I found online, I went back to my list of differences between our test environment and our production environment.  After hours of combing through service account permissions both in SharePoint and SQL, I finally decided to check the HOSTS file on the SharePoint servers.


Turns out the fix for us was to add an entry to the hosts file that pointed the SharePoint URL to 127.0.0.1 (the loopback address).  We had already configured this for the other three SharePoint web applications we had launched.  We had neglected to do this with our new web application, which was only recently put into production.
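As a sketch, the added line in C:\Windows\System32\drivers\etc\hosts looks like the following (the host name here is a placeholder, not our actual URL):

```
127.0.0.1    sharepoint.contoso.com
```

With an entry like this in place, requests the server makes to its own SharePoint URL resolve locally instead of going back out to the network.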

So, we learned a couple of lessons again:

First – do everything you can to make your test environment mirror your production environment.  I thought our environments were pretty well matched, including having a separate web front end from the app server, a separate SQL server, and even a separate FAST server in test.  The one thing we don’t have is a load balancer with multiple WFEs.

Second – document your processes.  We should have had a checklist to refer to when creating the new web application.

Carolina Code Camp 2013: Introduction to SharePoint Development

I spoke this weekend at the Carolina Code Camp on the CPCC campus here in Charlotte and my topic was an Intro to SharePoint Development for .NET Developers (those that have no knowledge of SharePoint).  It’s a topic that I’ve been talking about often with new and experienced devs alike, who want to know everything from how to set up a development environment to how to start coding and what SharePoint can do.

From my experience, a lot of the intro presentations given focus on writing those first lines of code, which is a great topic – but it’s about two or three steps away from the absolute beginning.  I spoke a little about SharePoint’s version history, my thoughts on setting up an environment, as well as the tools that you use before you open Visual Studio.  I ended with some Visual Studio info and I was hoping to get into code, but I was only able to get through my slides before my hour was up.

As I promised to those in attendance, here’s a link to my slide deck on SkyDrive:

…and here’s a version you can view on Slideshare:

It was actually good that I didn’t need to do live demos, because I didn’t have access to any of the virtual machines that I normally use for presentations (see my previous post for details).  I was actually using my Surface RT tablet for the presentation and it worked great. I bought the special mini HDMI to VGA adapter from the Microsoft store here in Charlotte the previous weekend.  I went with the VGA adapter because I knew they had VGA connections on campus, but I wasn’t sure that they had HDMI.

I was going to do a demo using CloudShare based VMs, but I had trouble getting onto their WiFi and as I said, I didn’t have time for it anyway.

After my presentation, which was the first one of the day for the SharePoint track (one of five tracks with 60 sessions!), I was able to relax and enjoy the other presentations.  I got to attend four: an Intro to 2013 Apps Dev, a 2013 SP Designer New Features, an Agile Dev with TFS, and a 2013 Search Driven UI session.  All of them were very good, and all very relevant to what’s going on at the office.  All of those speakers did a great job.

Living without a laptop

Due to my job change, I'm getting the "opportunity" to re-evaluate my computing needs at home. This is because I've had a company laptop since I first started consulting back in 2004.

Most recently, I had an awesome laptop from work (Cardinal Solutions). It was a Dell Precision M6400 (I think) with an Intel Core i7 processor, 16GB of RAM, a discrete video card, two SSDs for storage, and a high resolution 15-inch screen (I think it was 1080p or better). All of this power was needed when I was running SharePoint virtual machines (or Outlook – for those of you using Outlook, you know what I mean). Just an awesomely powerful computer.

It was my primary computer both at work and at home, since I'd use it for work/home email, web browsing, etc.

So, once I knew I'd be returning it, I started thinking about its replacement. I already had a desktop at my new job and I might request a laptop, but I knew I wasn't going to be using the work laptop as much as I had the other one.

First things first, I thought about what I already had at home. I have a Windows server (custom built using typical workstation hardware) that I use for general file storage, running virtual machines with Hyper-V, serving media (for the Xboxes and TVs in the house), and some network services (DNS, DHCP). I built this box to have lots of storage with moderate CPU power. It has five hard drives totaling 7TB of storage (but with RAID mirroring, I think the available storage is 4TB). It also has several strong, loud fans in it, but since it sits in a closet I don't have to hear it.

We also have a Mac Mini (early 2009) that is my wife's primary computer. It is in the kitchen at her desk and she uses it for email/web browsing, plus tracking our finances in Quicken. We also have over a decade's worth of photos and music on it (iPhoto and iTunes). We're often both using computers at the same time, so sharing a computer wouldn't work.

We have a Lenovo Ideapad (Pentium B940) that we bought two years ago. It was going to be my main computer at home, but the kids soon discovered how cool it was, and I found the screen resolution low (720p) compared to my work laptop. The kids use it a lot, so my daughter was especially fearful once I explained that I had to return my work laptop. Again, sharing it would not work well.

One other piece of hardware we have is a Microsoft Surface RT. It's mine, but I let the kids use it every now and then. I take it to work daily and use it in meetings, which it is great for. The small size, the touch screen, the type cover – great. I'm a heavy OneNote user, so I have it on the Surface as well as my work laptop and the notes sync between the two.

Since I have the Surface, I got to wondering if I really still needed a powerful mobile computer (laptop). As a consultant, I often needed to spin up demos on Virtual Machines, but I don't need to do that anymore. Nor do I need to have a good computer ready for when the client doesn't have a machine for me.

So, after much thinking, I decided to build a desktop computer for my own use. I built a machine that has much more power than if I had spent the same amount of money on a laptop. I also get the ability to upgrade any piece of it I need to over the next few years, pretty much the opposite of a laptop (I only ever upgraded RAM and hard drives in a laptop).

I did a bunch of research (Tom's Hardware guide) and bought the following:

  • AMD A8-5500 Trinity 3.2GHz Quad-Core APU (CPU + GPU)
  • ASRock FM2A85X Extreme6 motherboard
  • Corsair XMS3 8GB DDR3 1600 RAM
  • Samsung 840 Series 120GB SATA III SSD
  • Fractal Design Define R4 Mid Tower Case
  • Rosewill Fortress Series 550W power supply
  • ViewSonic 23" LED monitor
  • Logitech wireless keyboard
  • Microsoft Wireless Mobile Mouse 4000

One of my design goals was to have a quiet computer, since it'll be sitting in my office and not in a closet. I also wanted a moderately powerful computer that I could upgrade, so I went middle of the road on CPU, but picked a newer design/socket so I can still get upgrades later.

The case and power supply were a little more than I would normally pay, but that was the trade-off to get a quiet computer – the case has sound-dampening foam and the power supply is highly energy efficient (so it doesn't need loud fans running to cool it down).

When it came to the processor, I am probably just sentimental about AMD. I do think you get more power per dollar from them, on the low end. If my budget was larger, I would have gone with an Intel Core i7.

I made a huge trade-off when it came to storage – only 120GB (our low end Lenovo laptop has 500GB), but I wanted the speed of a solid state drive (SSD). Not just any SSD, but one of these new Samsung 840 series drives, which have incredible IOPS numbers. One of the first upgrades I'll do will probably be to add something like a 2 or 3TB standard hard drive, but for the OS, I wanted an SSD.

I ordered almost everything from Newegg and put it all together a couple of days ago. So far, I like it. I may miss having a laptop in the future, but for now I'm good.

My own March madness

This past March was a big one for me professionally. I spoke at two SharePoint Saturdays, Charlotte and Richmond, and I (mostly) completed a huge migration project, which involved late nights and one weekend. To cap it all off, I accepted a full time position with the client I've been consulting with for sixteen months – the Carolinas HealthCare System. I started this week as their SharePoint Architect.

First, I want to talk about my experience speaking at SharePoint Saturday – in short, it was great. I want to publicly thank both organizations for allowing me to speak and being great hosts. I've done presentations before for work and I spoke once before at a Microsoft hosted mini conference (wow, that must have been three or four years ago and it was about IE8), but this was my first time speaking at a community hosted event/conference. I really appreciated how well organized they were the day of the conference. I enjoyed my experience so much that I volunteered to speak at another event coming up, the Carolina Code Camp here in Charlotte in early May.

Probably the biggest accomplishment for March was the SharePoint migration project at work. We moved several hundred gigs of data, spread out over 4500+ sites/subsites. We did some reorganization, splitting up one huge site collection into 330, while moving another 50 site collections. We also applied branding and turned on the SharePoint 2010 user interface for the first time. (We had migrated from 2007 to 2010 last year, but left the 2007 UI turned on.) We also trained several hundred site owners and introduced strong governance. We have a lot of work to do, but we're off to a great start.

Given how much I've enjoyed working as a consultant for Carolinas HealthCare System (CHS), I decided to accept their offer for a full time position as a SharePoint Architect. We have executive support, a crucial element, which has allowed us to build both a great infrastructure and a great team. I'm particularly excited about the opportunity to help define how things are going to work, technically in the SharePoint environment as well as the business processes that we do as a team. Things like how we process incoming requests for solutions, what services we're going to offer, plus what tools we're going to build to help our forty thousand plus users solve problems in their workday.

One other thing I should say, in the context of the new job. I'm leaving behind a great organization – Cardinal Solutions. I've been a consultant with them for a few years now and they've been very good to me and I will continue to refer anyone looking for a consulting gig to Cardinal. This career move for me was all about the opportunity I was going to and was nothing negative about Cardinal.


2012 Year in Review and Goals for 2013

It’s been quite a while since I’ve written a blog entry. I seem to start every year with a goal of writing X number of blog entries per month or week or something, so let’s see if I do any better this year, with my goal being at least one lengthy post a month.  Hopefully this post will get the ball rolling.

2012 Year in Review

This past year, I’ve worked with one client who needed help migrating their WSS 3.0 farm to SharePoint 2010.  I started at the client around Thanksgiving 2011 and I quickly discovered their plan for a quick six week project wouldn’t work. Their SharePoint 2010 farm wasn’t configured correctly as it was really just a proof of concept quick install (only one service account, inadequate server resources, etc.).

Once we got a few servers together, we created a test farm and a production farm and I started doing test migrations.  After some false starts, we finally got everything migrated by July. (Yes, a six week project became six months.)

For the last half of the year, I joined a new team at the client who had taken over responsibility for SharePoint and we’ve been working to upgrade and migrate all of the sites again.  This time, we’re reorganizing the information architecture, installing custom branding, and moving to a much bigger farm.  We’re hoping to have all of this work wrapped up by the end of March.

Another big piece of my work life last year was getting to attend the SharePoint Conference in November. I’ve been to a few conferences before (Connections, Business Process and Workflow, and of course SharePoint Saturdays), but this was my first SharePoint Conference.  It was a good year to go, especially with 2013 being launched.

When it came to speaking engagements, I only had a few internal ones.  I also taught a SharePoint class for our business analysts, as we try to train our BAs to fill SharePoint Analyst roles.

2013 Goals

2013 is starting off with a bang with SharePoint Saturday Charlotte, which I’ll be speaking at (a week from today) on the 26th. Usually, I have goals related to public speaking, so the fact that I’m getting one done in January is a good sign.  I’m going to try to attend at least one other SPS, maybe Atlanta, Richmond, or an Ohio one if the travel timing works out.  I thought the SPS DC was pretty cool in 2011, so maybe make a family trip out of that one again.

I’m also wrapping up the migration project at my client in the next few months, which will be a big accomplishment. We have further projects to do, which will be a nice change of pace from migrations.

At some point this year, I want to start digging into SharePoint 2013, with an eye toward passing the certification exams for it.  My client has no interest in moving to 2013 this year, and probably not even next year, but I still want to try to keep pace with the SharePoint community.

Well, those are my modest goals for the year.

How to delete SharePoint alerts for orphaned (deleted) users–WSS 3.0/SharePoint 2007/2010

An issue came up at my client recently, where our webmaster email account was being flooded with non deliverable email notices. Messages were bouncing back because our SharePoint farm was sending alerts to users and email addresses that no longer existed.

This was on a WSS 3.0 farm, and we don’t have any third party tools (like Axceler’s ControlPoint) to help manage scenarios like this.  So, I decided to tackle the problem with PowerShell.

Of course the first step in writing a PowerShell script is to search the net to see if anyone has already written one.  That led me to a couple of useful blog posts/scripts: Check-SharePoint-Orphaned-Users and Delete all alerts for a user in SharePoint.  Based on these two posts, I came up with the following script:

     1 #For WSS 3.0 or MOSS 2007
     2 $12HiveDir = "${env:CommonProgramFiles}\Microsoft Shared\web server extensions\12\"
     3 cd $12HiveDir
     4 [void][reflection.assembly]::Loadwithpartialname("Microsoft.SharePoint") | out-null
     5 [void][reflection.assembly]::Loadwithpartialname("Microsoft.Office.Server.Search") | out-null
     6 [void][reflection.assembly]::Loadwithpartialname("Microsoft.Office.Server") | out-null
     7
     8 $farm = [Microsoft.SharePoint.Administration.SPFarm]::Local
     9 $websvcs = $farm.Services | where -FilterScript {$_.GetType() -eq [Microsoft.SharePoint.Administration.SPWebService]}
    10 $webapps = @()
    11
    12 #Change this to match your domain, or even include the OU
    13 $LDAP_Domain = "LDAP://DC=contoso,DC=local"
    14
    15 #From Check-SharePoint-Orphaned-Users:
    16 function Check_User_In_ActiveDirectory([string]$LoginName, [string]$domaincnx)
    17 {
    18     $returnValue = $false
    19     #Filter on Users and NT groups which still exist
    20     $strFilter = "(&(|(objectCategory=user)(objectCategory=group))(samAccountName=$LoginName))"
    21     $objDomain = New-Object System.DirectoryServices.DirectoryEntry($domaincnx)
    22
    23     $objSearcher = New-Object System.DirectoryServices.DirectorySearcher
    24     $objSearcher.SearchRoot = $objDomain
    25     $objSearcher.PageSize = 1000
    26     $objSearcher.Filter = $strFilter
    27     $objSearcher.SearchScope = "Subtree"
    28
    29     #$objSearcher.PropertiesToLoad.Add("name")
    30
    31     $colResults = $objSearcher.FindAll()
    32
    33     if($colResults.Count -gt 0)
    34     {
    35         #Write-Host "Account exists and Active: ", $LoginName
    36         $returnValue = $true
    37     }
    38     return $returnValue
    39 }
    40
    41 #From Delete alerts for a user:
    42 function DeleteOrphanedAlerts{
    43     foreach ($websvc in $websvcs) {
    44         foreach ($webapp in $websvc.WebApplications) {
    45             Write-Host "Webapp Name -->"$webapp.Name
    46
    47             #limit our search to one web app
    48             if ($webapp.Name -eq "sharepoint80 - content"){
    49                 foreach ($site in $webapp.Sites)
    50                 {
    51                     # get the collection of webs
    52                     $webs = $site.AllWebs
    53                     foreach ($web in $webs)
    54                     {
    55                         # get the alerts
    56                         $alerts = $web.Alerts
    57
    58                         Write-Host "  "$web.Title", Alert count "$alerts.Count
    59
    60                         # if more than 0 alerts, iterate through these alerts to see if there is one for the user
    61                         if ($alerts.Count -gt 0)
    62                         {
    63                             $myalerts = @()
    64                             $mysites += $web.Url
    65                             foreach ($alert in $alerts)
    66                             {
    67                                 if ($alert.user.LoginName -ne $null)
    68                                 {
    69                                     if ($alert.user.LoginName.Contains("\"))
    70                                     {
    71                                         $UserName = $alert.User.LoginName.split("\")
    72
    73                                         $result = Check_User_In_ActiveDirectory $UserName[1] $LDAP_Domain
    74
    75                                         if ($result -eq $false)
    76                                         {
    77                                             echo "  Deleting $($alert.Title) alert for $($alert.User.LoginName)"
    78                                             $myalerts += $alert
    79                                             $alertcount++
    80                                         }
    81                                     }
    82                                 }
    83                             }
    84
    85                             # now we have the alerts for this site, we can delete them
    86                             foreach ($alertdel in $myalerts)
    87                             {
    88                                 $alerts.Delete($alertdel.ID)
    89                             }
    90                         }
    91                     }
    92                 }
    93             }
    94         }
    95     }
    96 }
    97
    98 DeleteOrphanedAlerts
    99 Write-Host "Completed"

This script will loop through all of the Web Applications in the farm, site collections, and webs, looking for all of the alerts.  Once an alert is found, it checks to see if the user is still in Active Directory.  If not, it adds the alert to a list of alerts to delete.

If you only want it to search through one web application, you can modify line 48 to only continue if the name of the web application matches.

Also note, you’ll need to modify the LDAP path, stored in the $LDAP_Domain variable to match your environment.
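For example, narrowing the search to a specific OU would look something like this (the Contoso names are placeholders – substitute your own domain and OU):

```powershell
#Placeholder domain/OU - change this to match your Active Directory
$LDAP_Domain = "LDAP://OU=Employees,DC=contoso,DC=local"
```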

WSS 3.0 to SharePoint 2010: Tips for delaying the Visual Upgrade

My most recent project has been to migrate a bunch of sites from WSS 3.0 (SharePoint 2007) to SharePoint Server 2010.  The users are currently working with WSS 3.0 and Office 2003, so the new ribbon based UI in 2010 will be completely new.  My client wants to avoid the new SharePoint 2010 look and feel until they’ve had time to train their users, so we’ve been testing the upgrades by keeping them with the 2007 user interface.

Permission to perform the Visual Upgrade

One of the first things we noticed was the default permissions for who was allowed to switch the UI from 2007 to 2010.  By default, site collection administrators and site owners can do this.  Since we wanted to more tightly control the timing of the new UI, I added a few lines to the PowerShell script that we are using to perform the migration.  This script creates the web application, sets the User Policy, and then does a Mount-SPDatabase to attach the old 2007 content database to the 2010 farm.  I added the following steps after the Mount-SPDatabase step:

    #Remove the visual upgrade option for site owners
    #   it remains for Site Collection administrators
    foreach ($sc in $WebApp.Sites){
        foreach ($web in $sc.AllWebs){
            #Visual Upgrade permissions for the site/subsite (web)
            $web.UIVersionConfigurationEnabled = $false;
            #persist the change to the web
            $web.Update();
        }
    }

These script steps loop through each Site Collection in a particular web application ($WebApp) and then it loops through each subsite ($web) in the Site Collection ($sc) and disables the Site Owner’s permission to perform the Visual Upgrade. This is equivalent to going to the Site Collection administrator settings page –> Visual Upgrade and selecting “Hide Visual Upgrade”.

Since only IT people have Site Collection administrator privileges, this will allow IT to control the timing of the new 2010 UI rollout.

Newly created subsites

Our next issue was brought to our attention by SharePoint Joel’s blog post last week.  In it, he lists some updates about the 2010 upgrade, and his fourth point was one that I hadn’t seen yet:

4. If a 2007 upgraded site has not been visually upgraded, the sites created underneath it will look like 2010 sites – While this is something I’ve been aware of, I think many don’t realize how this impacts common look and feel for master pages, and how it impacts good navigation and UI. As well depending on your patch level you may see hanging behavior in the list picker. The site and list creation Silverlight control in Internet Explorer is looking for resources that don’t exist in the galleries in the 2007 site, and hence it continues to spin and spin and eventually time out. The work around is to upgrade to SP1, or use Chrome or Firefox which won’t attempt to render the Silverlight control. When the root site collection is a 2007 site and has it’s set of galleries and the children are 2010 sites there is some strange behavior linked to the way that the galleries work and pull from the parent.

Our production SharePoint 2010 Farm has SP1 installed, as well as the December 2011 Cumulative Update, so I think the “hanging behavior” he mentions won’t affect us.

However, since we want to control the roll out of the UI, we are concerned that new subsites will have the 2010 look and feel, no matter what the parent site has.

Ok, time to dust off my developer skills.

I first looked into using feature stapling, but I couldn’t get that to work (although I’m pretty sure I had everything wired up correctly).  Then I stumbled upon SharePoint 2010’s web events – a great way to handle this.

Using Visual Studio 2010, I created a new SharePoint project and added a Web Event Receiver:


In the Event Receiver class, I used the WebProvisioned method to check if the parent site is a 2007 site (UIVersion = 3), and if so, then set the newly created site to 2007:


/// <summary>
/// A site was provisioned.
/// </summary>
public override void WebProvisioned(SPWebEventProperties properties)
{
    try
    {
        SPWeb curweb = properties.Web;
        if (curweb.ParentWeb != null)
        {
            //check if the parent website has the 2007 look and feel
            if (curweb.ParentWeb.UIVersion == 3)
            {
                //since the parent site has the 2007 look and feel,
                //  we'll apply that look and feel to the current web
                curweb.UIVersion = 3;
                curweb.Update();
            }
        }
    }
    catch (Exception)
    {
        //TODO: Add logging for errors
    }
}
This event is part of a Feature that is scoped to the Site Level (Site Collection).  I added a couple of lines to my migration PowerShell script to activate the Feature for any site collections that we migrate.
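Activating the Feature from the migration script is a one-liner; here's a minimal sketch, where the Feature name and site collection URL are hypothetical placeholders, not values from our environment:

```powershell
# Activate the custom web-event Feature on a migrated site collection.
# "WebUIVersionEvents" and the URL are placeholders for illustration.
Enable-SPFeature -Identity "WebUIVersionEvents" -Url "http://sharepoint/sites/finance"
```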

Plan Going Forward

The plan going forward is to perform the visual upgrade after the users for a particular site collection have gone through 2010 training. If we need to do several site collections at once, we’ll use a PowerShell script to loop through each site collection to update the sites to 2010.  If it’s just one or two, we’ll be using the “Update All Sites” button on the Visual Upgrade page for Site Collection Administrators.
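For the bulk case, the loop can be sketched as follows; this assumes an example web application URL and mirrors the earlier permissions script (UIVersion 4 is the 2010 look and feel):

```powershell
# Upgrade every site in a web application to the 2010 UI.
# The URL is an example; adjust for your environment.
$WebApp = Get-SPWebApplication "http://sharepoint.contoso.local"
foreach ($sc in $WebApp.Sites){
    foreach ($web in $sc.AllWebs){
        $web.UIVersion = 4     #4 = SharePoint 2010 look and feel
        $web.Update()
        $web.Dispose()
    }
    $sc.Dispose()
}
```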

The custom code for newly created sites won’t need to be changed, since it relies on the UI version of the parent site.  If the parent is 2010, then the new site will look 2010.

SharePoint 2010 MSDN Labs

Eric Ligman, from Microsoft, posted a great blog post this week listing all of the SharePoint 2010 Virtual Labs that are available from Microsoft.  His blog entry is here:

He posted other resources as well.

I’ve copied his Virtual Lab links here:

SharePoint Server 2010 Virtual Labs

In addition to the SharePoint Server 2010 Virtual Labs, here are a few other SharePoint 2010 resources that I thought you might also be interested in:

How to migrate an InfoPath form from test to production in SharePoint 2010


In order to migrate (copy) an InfoPath form from one environment to another (say, from test to production), there are several things to consider. First, does your form use a data connection to retrieve or submit data? Second, if you update the form in the future, do you want all of the locations where this form is used to be updated as well? Keep these issues in mind while reading the instructions below.

Step 1: Create a form

First, let’s create a simple InfoPath form. For my example, I’ve created an overly simplified leave request form. The PTO (Paid-Time-Off) form has only three fields on it: a leave date, a return date, and an employee name field. I know this isn’t very realistic, but I don’t want to get lost in the details of what’s on the form since we’re focusing only on how to publish it.

Step 2: Publish your form using InfoPath

Our next step is to save and publish our form. In InfoPath, go to File –> Publish and select “SharePoint Server”:


The Publishing Wizard will then appear:

· Enter a SharePoint URL (don’t worry about the URL – you can use a test URL if you like)

· Click next

The next step will ask how you want to publish it. You’ll want to choose “Administrator-approved form template (advanced)”:


After selecting Administrator-Approved and clicking next, InfoPath will then ask you to specify a file location to save the form’s template to. This is the location (file share) that you want to save the file to in order for your administrator to access it. In my example, I’m saving it to a mapped drive:


The next step is to choose any fields in your form that you want to be available as columns in the SharePoint forms library:


For my example, I’ve chosen to include all three fields, but this is up to you.

The final step in the Publishing Wizard is to click “Publish”. This will save your form template to the location you’ve chosen. Once it’s saved, the final confirmation step will appear and you can click “Close”.

Uploading the Form to SharePoint

The next step in our process is to upload the form to the “Managed Form Templates” list in SharePoint’s Central Administration. In order to do this, you will need Farm Administrator permission.

Navigate to Central Administration and select “General Application Settings” in the left navigation. Once you do that, you should see “Upload form template” as a choice under “InfoPath Forms Services” on the right:


Click “Upload form template” and then you’ll see this:


The first thing you do on this screen is click the browse button and locate your published InfoPath form. This will be the location that you entered in the Publishing Wizard in InfoPath. You can then click the Verify button to see if this form is ready to be uploaded. Hopefully, you’ll get the message “This form template is ready to upload to the server.”

If you receive an error at this step, be sure that you selected “Administrator Approved” in the Publishing Wizard, and that you’ve selected the same form to upload to SharePoint. InfoPath also saves a working copy that would normally be considered the author’s master copy, so you will more than likely have multiple copies/versions of the same form – be sure you have the correct one.

When you click “Ok” on the Form Verification Status page, you will be directed back to the Upload page. You’ll have to select your form again, using the browse button. This time, don’t select Verify, since we’ve already verified that this template will work. Instead, look at the Upgrade options and consider if these are applicable.

Once you’ve made your upgrade selections, click the Upload button. You should then see this:


Click “Ok” and you’re sent to the “Manage Form Templates” page. Depending on how quickly you click Ok and how speedy your farm is, you may see the Status as “Installing”:


Just give it a minute and then refresh the page. Once the status is shown as “Ready”, we can do the next step.

Step 3: Activate Form for a Site Collection

Now that we have the form in Central Administration, we can now make it available to the Site Collections where we want it to be used. To do this, hover over the form link and use the drop down menu to select “Activate to a Site Collection”:


You will then be asked to select the site collection to make it available to. Now that the form has been activated for a site collection, let’s go use it.

Step 4: Using the InfoPath form in your Site Collection

Once your form is available in a Site Collection, you still need to use it with a library. So first, navigate to your Site Collection and create a new “Form Library”. After the library is created, then go to “Library Settings” for that library.

We need to add a content type to the library. In order to do this, we go to “Advanced Settings”:


On the Advanced Settings page, you’ll need to select “Yes” under “Allow management of content types?”:


You can scroll down on the page and click “Ok”. You should now see a “Content Types” section added to the Library settings page, just above the columns section.

In the Content Types section, click “Add from existing site content types”. On the “Select Content Types” page, find your InfoPath form in the available content types and click Ok.
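If you prefer scripting, the library setup in this step can also be sketched with the server object model from PowerShell; the URL, library name, and content type name below are illustrative placeholders, not values from this post:

```powershell
# Create a form library, enable content type management,
# and attach the activated InfoPath content type.
$web = Get-SPWeb "http://sharepoint/sites/hr"
$listId = $web.Lists.Add("PTO Requests", "Submitted PTO forms",
    [Microsoft.SharePoint.SPListTemplateType]::XMLForm)
$lib = $web.Lists[$listId]
$lib.ContentTypesEnabled = $true
$ct = $web.AvailableContentTypes["PTO Form"]
$lib.ContentTypes.Add($ct)
$lib.Update()
$web.Dispose()
```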

We’re almost ready, but one thing I like to clean up is the list of available Content Types for this library. As it is currently configured, my library would allow users to submit two different forms: “Form” which is blank and “PTO Form” which is the one I want. So to clean this up, you click on “Change new button order and default content type”.

On this page, I’ll uncheck the box in the “Visible” column for my “Form” and change the position my “PTO Form” to 1. My page now looks like this:


After clicking Ok, I then go back to my library and submit a new form…and there it is -- my PTO form that I published via Central Administration.

Step 5: Updating an already deployed Managed InfoPath form

Once the form is deployed, you can then make changes to the original. Let’s say you need to add a field to the form. You would update the form and use the Publishing Wizard to save it as an “Administrator Approved” form. Your SharePoint farm administrator would then upload the new form to the Managed Form list, paying careful attention to the Upgrade options.

Once the form is uploaded, then SharePoint will find where that form is deployed and update it. How quickly this happens will depend on your farm.
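The upload-and-activate flow can also be done from PowerShell; here's a minimal sketch, with example paths and URLs (the upgrade-specific parameters are covered in Microsoft's documentation):

```powershell
# Upload the new template to Central Administration's managed list,
# then activate it for a site collection. The path and URL are examples.
Install-SPInfoPathFormTemplate -Path "\\fileshare\forms\PTO.xsn"
Enable-SPInfoPathFormTemplate -Identity "PTO.xsn" -Site "http://sharepoint/sites/hr"
```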

Further Information

If you would like further information about the options for upgrading, or even how to update the form using PowerShell steps, please see Microsoft’s documentation on TechNet:

As I mentioned at the beginning of this post, you will need to consider how your form is updated, which I discussed briefly above.  I also mentioned that you would need to think about your data connections.  I’ll cover the data connection topic in my next blog post.

SharePoint 2010 Client Object Model – CAML Query inaccurate results error

I ran into a frustrating scenario today, while working with SharePoint 2010’s Managed Client Object Model.  My application queries a SharePoint document library using the file’s name (the FileLeafRef field).  Given that this field is unique, I was expecting only one result with the following code:

 1  CamlQuery qry = new CamlQuery();
 2
 3  //filter the results to only get back the item with the filename we're looking for
 4  qry.ViewXml = string.Format(
 5      "<Query><Where><Eq><FieldRef Name='FileLeafRef' /><Value Type='File'>{0}</Value></Eq></Where></Query>",
 6      remoteFileName);
 7
 8  ListItemCollection itms = lst.GetItems(qry);
 9  ctx.Load(itms);
10  ctx.ExecuteQuery();

Instead of getting one record (or none), I was getting more than one.  So, I searched some more and found some examples with slightly different XML:

 1  CamlQuery qry = new CamlQuery();
 2
 3  //filter the results to only get back the item with the filename we're looking for
 4  qry.ViewXml = string.Format(
 5      "<View><Query><Where><Eq><FieldRef Name='FileLeafRef' /><Value Type='File'>{0}</Value></Eq></Where></Query></View>",
 6      remoteFileName);
 7
 8  ListItemCollection itms = lst.GetItems(qry);
 9  ctx.Load(itms);
10  ctx.ExecuteQuery();

See the difference?  Look carefully at line 5 – that’s right, in my first attempt I left out the surrounding “<View>” tags!  Without the <View> wrapper, the Client Object Model ignores the query and returns every item in the list.

Error when trying to access SharePoint 2010 via PowerShell in a new environment

I got this error today while trying to access my new SharePoint 2010 environment via PowerShell:

Cannot access the local farm. Verify that the local farm is properly configured, currently available, and that you have the appropriate permissions to access the database before trying again.

If I had just read the error message a little more carefully, I would have realized that I should start by looking at my permissions in SQL – which my account didn’t have.  This happened because I installed SQL Server 2008 R2 in my virtual machine while logged on as the local administrator, but when I attempted to run PowerShell, I was logged on as the domain administrator.  (These are virtual machines that I’m using for demos.)  The local administrator was granted access automatically as part of the SQL install.
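One supported way to grant an account the database permissions that the SharePoint cmdlets need (not necessarily the exact fix I used, but the standard route) is Add-SPShellAdmin, run from a session that already has farm access; the account name below is an example:

```powershell
# Add-SPShellAdmin grants the named account the SharePoint_Shell_Access
# role on the farm's databases so it can run the SharePoint cmdlets.
Add-SPShellAdmin -UserName "CONTOSO\administrator"
```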

Error creating PowerPivot Service Application in SharePoint 2010 (SP1 + June 2011 CU)

A few weeks ago I ran into an error while setting up our test farm.  I was creating the service applications and when I got to PowerPivot, I got the following error:

Login failed for user 'NT AUTHORITY\ANONYMOUS LOGON' during PowerPivot Service application creation

I was able to work around this error by using PowerShell to create the service application.


First, some details of our installation: two web front ends, two application servers, and one SQL server.  All are running Windows Server 2008 R2 x64 with Service Pack 1.  The SQL Server is 2008 R2 with SP1.  The SharePoint servers are SharePoint Server 2010 Enterprise, and I installed SharePoint’s Service Pack 1 and the June 2011 Cumulative Update (by running all of the installers before running the first Configuration Wizard).

Clean Up from the error

Once you run into the error, you’ll see that a new database and a phantom application pool were created.  To clean these up:

  • Delete the database in SQL Server (I did this using SQL Server Management Studio)
  • Delete the database entry in SharePoint, using PowerShell (Please note: this PowerShell line will delete any database entry in SharePoint where the actual database doesn’t exist in SQL):
    Get-SPDatabase | Where{$_.Exists -eq $false} | ForEach {$_.Delete()}
  • Now delete the phantom Application Pool (the error causes a service application pool to be created within SharePoint, but doesn’t actually create it in IIS):
    Remove-SPServiceApplicationPool -Identity "My App Pool"

Just replace the “My App Pool” text with the name of the application pool you want to delete.

Create PowerPivot Service Application using PowerShell

Now, to create the PowerPivot service application, you can use the following PowerShell script:

   1:  $PowerPivotServiceName = "PowerPivot Service"
   2:
   3:  Write-Host "Creating PowerPivot Service Application..."
   4:  New-PowerPivotServiceApplication -ServiceApplicationName $PowerPivotServiceName `
           -DatabaseServerName "SQL.MYDOMAIN.LOCAL" -DatabaseName "Service_PowerPivot_1" -AddToDefaultProxyGroup
   5:  Write-Host "PowerPivot Service Application created"
   6:
   7:  Write-Host "Creating Application Pool"
   8:  $AppPool = New-SPServiceApplicationPool -Name "AppPool_PowerPivot" -Account "DOMAIN\SERVICEACCOUNT_USERNAME"
   9:  Write-Host "App Pool created"
  10:
  11:  Write-Host "Assigning PowerPivot Application Pool"
  12:  $sa = Get-PowerPivotServiceApplication | where {$_.DisplayName -eq $PowerPivotServiceName}
  13:  $sa.ApplicationPool = $AppPool
  14:  $sa.Update()
  15:  Write-Host "PowerPivot Application Pool Assigned"
  16:
  17:  Write-Host "Script Complete"

You’ll need to fill in a few parameters specific to your environment:

Line 4: DatabaseServerName, DatabaseName

Line 8: The Application Pool’s name and the service account that it will run as


Following these steps should give you a working PowerPivot service application.  We did open a support ticket with Microsoft.  They confirmed that other customers have seen this error, but they are still researching the cause.