Living without a laptop, Part II, aka: Why I cried uncle

About this time last spring, I wrote about how I was going to start Living without a laptop.  I had just switched jobs and had to turn in a good work laptop, which was replaced by a desktop at my new job. After living and working like this for a year, I have finally caved in and ordered a laptop for work.

So how did I get here? Well, for the last year my main computing devices have been an HP Z400 desktop at work, a custom-built AMD desktop at home, and an original Surface RT tablet.  Whenever I was away from my desk at work, I could use my Surface to keep notes (OneNote – awesome) and also keep up with Outlook.  I was able to do this because I had my Surface connected to the secure Wi-Fi network at the office.

I also had the option of using Remote Desktop to connect to my office workstation if I needed to do something that Windows RT couldn’t do.  A lot of this functionality hinged on being able to connect to the office network, and it all worked until an update to my Surface RT in January killed my ability to do just that.

Surface RT/Windows RT has its limits

Our private wireless network at the office requires my username and password, which it still prompts me for after the update, but it now also prompts me for a network key – one we don’t have and that the Surface never required before.  Since there isn’t a big group of Surface RT users here, I fell into the category of unsupported (on your own) users at work.

There is still a guest network that I can use, which allows my Surface to get out to the internet, where I store my OneNote notes.  However, I can’t connect using our remote access options, because I can’t install Java for the VPN, nor can I get the Citrix Receiver application to work with our Citrix remote option. Without these remote options, I can’t use Outlook and I can’t use Remote Desktop, which makes my Surface RT rather limited. (For those wondering, I went so far in my troubleshooting that I completely wiped my Surface and reinstalled everything, hoping that it would work like it did before, but I had no luck with that.)

I also found another scenario where the Surface isn’t a laptop replacement – trying to use the keyboard without a proper desk or table available.  I first ran into this when I attended a conference in 2012.  The conference was set up with rows of chairs but no tables in front of them.  I had the choice of either using the on-screen keyboard to take notes (not a great experience for more than brief notes) or using the keyboard and trying to balance it on my lap. My typing would often make the Surface teeter, the kickstand didn’t give the screen a good viewing angle, and it didn’t feel good as it cut into the tops of my legs.

That isn’t the only place where a table wasn’t available.  It turns out I missed being able to use the keyboard while seated on our couch in front of the TV.  Most people will say that tablets are great in front of the TV, and they are great for consuming things – catching up on Twitter or Facebook, reading blogs – but I’m just not that productive doing email or development without my keyboard.

To the cloud – new goal at the office

While all of this was happening with my mobile options, the game was also changing when it came to my need for a powerful desktop. One of the reasons I needed this computing power was so I could run a SharePoint development environment.  That meant running not only Visual Studio, but also the SharePoint server software, which takes gobs of memory and storage. So how did this requirement change? We adopted two changes: moving to Office 365 and using Azure as part of our MSDN subscriptions.

First, we’re moving to SharePoint Online and won’t have SharePoint on premises in a few months.  This shifts our development toward more mainstream web technologies, which don’t require that you develop on a SharePoint server.

Second, Microsoft introduced Azure credits with MSDN subscriptions.  I can now use those credits to spin up virtual machines in Azure for development environments that require me to be running on a server, as some SharePoint projects do (maintaining legacy sandbox solutions).  For those that don’t, I can simply use Visual Studio on my local machine.

After thinking through all of this, I decided to request an Ultrabook for work.  I got a nice little HP EliteBook Folio 9470m, which is slim and small, making it a good note-taking device for meetings, but I can also use it to run Outlook, and all of our remote access options work on it.  I also have a docking station and large monitors on my desk, so when I’m there I’m not limited to the smallish screen of the laptop.


I’m still using my home desktop when I’m telecommuting, but if I need to catch up on email, I now have the option of doing so from my couch.

User Profile Web Service error when accessing from InfoPath Forms Services

In the last couple of weeks, we had to move a custom application from one SharePoint 2010 farm to another.  We tested our migration method by moving it to a test farm first, which highlighted a few things that we needed to change, but it was otherwise successful.

However, when we moved it to the new production farm, we ran into an error that we hadn’t seen in test.  It occurred when we opened an InfoPath form that called the User Profile web service.  The error presented to the user was a dialog containing: “An entry has been added to the Windows event log of the server. Log ID:7893”.

When we looked it up in SharePoint’s ULS log, we found this entry:

Area: InfoPath Forms Services
Category: Runtime – Data Connection
Level: Unexpected
Message: The following data connection (GetUserProfileByName) has exceeded the maximum configured time limit. This threshold can be configured by using the SPIPFormsService -MaxDataConnectionRoundTrip PowerShell commandlet

Scouring the internet, I found tips that suggested the identity of the application pool for the “SharePoint Web Services Root” should be something other than localservice.  However, I then found a blog post by Spence Harbar stating that it was ok to be localservice.

I also found a tip that suggested increasing the timeout limits on the InfoPath Forms Services. I tried that without luck.
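For reference, the change I tried was along these lines (a sketch only – the values are examples in milliseconds, not our actual settings):

#Sketch: raise the InfoPath Forms Services data connection limits (example values)
Set-SPInfoPathFormsService -MaxDataConnectionTimeout 20000 -MaxDataConnectionRoundTrip 20000

It didn’t solve our problem, but it’s worth knowing about if your data connections really are just slow.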

After discounting everything I found online, I went back to my list of differences between our test environment and our production environment.  After hours of combing through service account permissions both in SharePoint and SQL, I finally decided to check the HOSTS file on the SharePoint servers.


It turns out the fix for us was to add an entry to the hosts file that pointed the SharePoint URL to 127.0.0.1 (the loopback address).  We had already configured this for the other three SharePoint web applications we had launched, but we had neglected to do it for our new web application, which was only recently put into production.
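If you need to do the same, the entry can be added by hand in Notepad, or with a line of PowerShell like this sketch (run as administrator on each SharePoint server; portal.contoso.com is just a placeholder for your web application’s host name):

#Sketch: point the web application's host name back at the local server
Add-Content -Path "$env:windir\System32\drivers\etc\hosts" -Value "127.0.0.1`tportal.contoso.com"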

So, we learned a couple of lessons again:

First – do everything you can to make your test environment mirror your production environment.  I thought our environments were pretty well matched, including having a separate web front end from the app server, a separate SQL server, and even a separate FAST server in test.  The one thing we don’t have is a load balancer with multiple WFEs.

Second – document your processes.  We should have had a checklist to refer to when creating the new web application.

Carolina Code Camp 2013: Introduction to SharePoint Development

I spoke this weekend at the Carolina Code Camp on the CPCC campus here in Charlotte, and my topic was an Intro to SharePoint Development for .NET Developers (those who have no knowledge of SharePoint).  It’s a topic that I’ve been talking about often with new and experienced devs alike, who want to know everything from how to set up a development environment to how to start coding and what SharePoint can do.

From my experience, a lot of the intro presentations given focus on writing those first lines of code, which is a great topic – but it’s about two or three steps away from the absolute beginning.  I spoke a little about SharePoint’s version history, my thoughts on setting up an environment, as well as the tools that you use before you open Visual Studio.  I ended with some Visual Studio info and I was hoping to get into code, but I was only able to get through my slides before my hour was up.

As I promised to those in attendance, here’s a link to my slide deck on SkyDrive: http://sdrv.ms/10B0uZZ

…and here’s a version you can view on Slideshare:

It was actually good that I didn’t need to do live demos, because I didn’t have access to any of the virtual machines that I normally use for presentations (see my previous post for details: http://www.kellydjones.com/archive/2013/04/21/152760.aspx).  I was actually using my Surface RT tablet for the presentation and it worked great. I bought the special mini HDMI to VGA adapter from the Microsoft store here in Charlotte the previous weekend.  I went with the VGA adapter because I knew they had VGA connections at the campsite (smile), but I wasn’t sure that they had HDMI.

I was going to do a demo using CloudShare based VMs, but I had trouble getting onto their WiFi and as I said, I didn’t have time for it anyway.

After my presentation, which was the first one of the day for the SharePoint track (one of five tracks with 60 sessions!), I was able to relax and enjoy the other presentations.  I got to attend four: an Intro to 2013 Apps Dev, a 2013 SP Designer New Features, an Agile Dev with TFS, and a 2013 Search Driven UI session.  All of them were very good and very relevant to what’s going on at the office.  All of those speakers did a great job.

Living without a laptop

Due to my job change, I'm getting the "opportunity" to re-evaluate my computing needs at home. This is because I've had a company laptop since I first started consulting back in 2004.

Most recently, I had an awesome laptop from work (Cardinal Solutions). It was a Dell Precision M6400 (I think) with an Intel Core i7 processor, 16GB of RAM, a discrete video card, two SSDs for storage, and a 15-inch screen with a high resolution (I think it was 1080p or better). All of this power was needed when I was running SharePoint virtual machines (or Outlook – for those of you using Outlook, you know what I mean). Just an awesomely powerful computer.

It was my primary computer both at work and at home, since I'd use it for work/home email, web browsing, etc.

So, once I knew I'd be returning it, I started thinking about its replacement. I already had a desktop at my new job and I might request a laptop, but I knew I wasn't going to be using the work laptop as much as I had the other one.

First things first, I thought about what I already had at home. I have a Windows server (custom built using typical workstation hardware) that I use for general file storage, running virtual machines with Hyper-V, serving media (for the Xboxes and TVs in the house), and some network services (DNS, DHCP). I built this box to have lots of storage with moderate CPU power. It has five hard drives that total 7 TB of raw storage (with RAID mirroring, I think the usable storage is about 4 TB). It also has several strong, loud fans in it, but since it sits in a closet I don't have to hear it.

We also have a Mac Mini (early 2009) that is my wife's primary computer. It is in the kitchen at her desk and she uses it for email/web browsing, plus tracking our finances in Quicken. We also have over a decade's worth of photos and music on it (iPhoto and iTunes). We're often both using computers at the same time, so sharing a computer wouldn't work.

We have a Lenovo IdeaPad (Pentium B940) that we bought two years ago. It was going to be my main computer at home, but the kids soon discovered how cool it was, and I found the screen resolution (720p) limiting compared to my work laptop's. The kids use it a lot, so my daughter was especially fearful when I explained that I had to return my work laptop. Again, sharing it would not work well.

One other piece of hardware we have is a Microsoft Surface RT. It's mine, but I let the kids use it every now and then. I take it to work daily and use it in meetings, which it is great for. The small size, the touch screen, the type cover – great. I'm a heavy OneNote user, so I have it on the Surface as well as my work laptop and the notes sync between the two.

Since I have the Surface, I got to wondering if I really still needed a powerful mobile computer (laptop). As a consultant, I often needed to spin up demos on Virtual Machines, but I don't need to do that anymore. Nor do I need to have a good computer ready for when the client doesn't have a machine for me.

So, after much thought, I decided to build a desktop computer for my own use. I built a machine that has much more power than I could have gotten by spending the same amount of money on a laptop. I also get the ability to upgrade any piece of it I need to over the next few years, which is pretty much the opposite of a laptop (I only ever upgraded RAM and hard drives in a laptop).

I did a bunch of research (Tom's Hardware guide) and bought the following:

  • AMD A8-5500 Trinity 3.2GHz Quad-Core APU (CPU + GPU)
  • ASRock FM2A85X Extreme6 motherboard
  • Corsair XMS3 8GB DDR3 1600 RAM
  • Samsung 840 Series 120GB SATA III SSD
  • Fractal Design Define R4 Mid Tower Case
  • Rosewill Fortress Series 550W power supply
  • ViewSonic 23" LED monitor
  • Logitech wireless keyboard
  • Microsoft Wireless Mobile Mouse 4000

One of my design goals was to have a quiet computer, since it'll be sitting in my office and not in a closet. I also wanted a moderately powerful computer that I could upgrade, so I went middle of the road on CPU, but picked a newer design/socket so I can still get upgrades later.

The case and power supply were a little more than I would normally pay, but that was the trade-off to get a quiet computer – the case has sound-dampening foam and the power supply is highly energy efficient (so I don't need loud fans running to cool it down).

When it came to the processor, I am probably just sentimental about AMD. I do think you get more power per dollar from them on the low end. If my budget were larger, I would have gone with an Intel Core i7.

I made a huge trade-off when it came to storage – only 120GB (our low-end Lenovo laptop has 500GB) – but I wanted the speed of a solid state drive (SSD). Not just any SSD, but one of the new Samsung 840 series drives, which have impressive IOPS numbers. One of the first upgrades I'll probably make is to add something like a 2 or 3 TB standard hard drive, but for the OS, I wanted an SSD.

I ordered almost everything from Newegg and put it all together a couple of days ago. So far, I like it. I may miss having a laptop in the future, but for now I'm good.

My own March madness

This past March was a big one for me professionally. I spoke at two SharePoint Saturdays (Charlotte and Richmond), and I had a huge migration project that was (mostly) completed, which involved late nights and one weekend. To cap it all off, I accepted a full-time position with the client I've been consulting with for sixteen months – the Carolinas HealthCare System. I started this week as their SharePoint Architect.

First, I want to talk about my experience speaking at SharePoint Saturday – in short, it was great. I want to publicly thank both organizations for allowing me to speak and for being great hosts. I've done presentations before for work, and I spoke once at a Microsoft-hosted mini conference (wow, that must have been three or four years ago, and it was about IE8), but this was my first time speaking at a community-hosted event/conference. I really appreciated how well organized they were the day of the conference. I enjoyed my experience so much that I volunteered to speak at another upcoming event, the Carolina Code Camp here in Charlotte in early May.

Probably the biggest accomplishment for March was the SharePoint migration project at work. We moved several hundred gigabytes of data, spread out over 4,500+ sites and subsites. We did some reorganization, splitting one huge site collection into 330, while moving another 50 site collections. We also applied branding and turned on the SharePoint 2010 user interface for the first time. (We had migrated from 2007 to 2010 last year, but had left the 2007 UI turned on.) We also trained several hundred site owners and introduced strong governance. We have a lot of work to do, but we're off to a great start.

Given how much I've enjoyed working as a consultant for Carolinas HealthCare System (CHS), I decided to accept their offer for a full time position as a SharePoint Architect. We have executive support, a crucial element, which has allowed us to build both a great infrastructure and a great team. I'm particularly excited about the opportunity to help define how things are going to work, technically in the SharePoint environment as well as the business processes that we do as a team. Things like how we process incoming requests for solutions, what services we're going to offer, plus what tools we're going to build to help our forty thousand plus users solve problems in their workday.

One other thing I should say, in the context of the new job: I'm leaving behind a great organization – Cardinal Solutions. I've been a consultant with them for a few years now, they've been very good to me, and I will continue to refer anyone looking for a consulting gig to Cardinal. This career move was all about the opportunity I was going to, and nothing negative about Cardinal.

 

2012 Year in Review and Goals for 2013

It’s been quite a while since I’ve written a blog entry. I seem to start every year with a goal of writing X number of blog entries per month or week or something, so let’s see if I do any better this year, with my goal being at least one lengthy post a month.  Hopefully this post will get the ball rolling.

2012 Year in Review

This past year, I’ve worked with one client who needed help migrating their WSS 3.0 farm to SharePoint 2010.  I started at the client around Thanksgiving 2011, and I quickly discovered that their plan for a quick six-week project wouldn’t work. Their SharePoint 2010 farm wasn’t configured correctly – it was really just a proof-of-concept quick install (only one service account, inadequate server resources, etc.).

Once we got a few servers together, we created a test farm and a production farm and I started doing test migrations.  After some false starts, we finally got everything migrated by July. (Yes, a six-week project became six months.)

For the last half of the year, I joined a new team at the client who had taken over responsibility for SharePoint and we’ve been working to upgrade and migrate all of the sites again.  This time, we’re reorganizing the information architecture, installing custom branding, and moving to a much bigger farm.  We’re hoping to have all of this work wrapped up by the end of March.

Another big piece of my work life last year was getting to attend the SharePoint Conference in November. I’ve been to a few conferences before (Connections, Business Process and Workflow, and of course SharePoint Saturdays), but this was my first SharePoint Conference.  It was a good year to go, especially with SharePoint 2013 being launched.

When it came to speaking engagements, I only had a few internal ones.  I also taught a SharePoint class for our business analysts, as we try to train our BAs to fill SharePoint Analyst roles.

2013 Goals

2013 is starting off with a bang with SharePoint Saturday Charlotte, where I’ll be speaking (a week from today) on the 26th. Usually I have goals related to public speaking, so the fact that I’m getting one done in January is a good sign.  I’m going to try to attend at least one other SPS – maybe Atlanta, Richmond, or an Ohio one if the travel timing works out.  I thought SPS DC was pretty cool in 2011, so maybe I’ll make a family trip out of that one again.

I’m also wrapping up the migration project at my client in the next few months, which will be a big accomplishment. We have further projects to do, which will be a nice change of pace from migrations.

At some point this year, I want to start digging into SharePoint 2013, with an eye toward passing the certification exams for it.  My client has no interest in moving to 2013 this year, and probably not even next year, but I still want to try to keep pace with the SharePoint community.

Well, those are my modest goals for the year.

How to delete SharePoint alerts for orphaned (deleted) users–WSS 3.0/SharePoint 2007/2010

An issue came up at my client recently, where our webmaster email account was being flooded with non-delivery notices. Messages were bouncing back because our SharePoint farm was sending alerts to users and email addresses that no longer existed.

This was on a WSS 3.0 farm, and we don’t have any third party tools (like Axceler’s ControlPoint) to help manage scenarios like this.  So, I decided to tackle the problem with PowerShell.

Of course the first step in writing a PowerShell script is to search the net to see if anyone has already written one.  That led me to a couple of useful blog posts/scripts: Check-SharePoint-Orphaned-Users and Delete all alerts for a user in SharePoint.  Based on these two posts, I came up with the following script:

#For WSS 3.0 or MOSS 2007
$12HiveDir = "${env:CommonProgramFiles}\Microsoft Shared\web server extensions\12\"
cd $12HiveDir
[void][reflection.assembly]::Loadwithpartialname("Microsoft.SharePoint") | out-null
[void][reflection.assembly]::Loadwithpartialname("Microsoft.Office.Server.Search") | out-null
[void][reflection.assembly]::Loadwithpartialname("Microsoft.Office.Server") | out-null

$farm = [Microsoft.SharePoint.Administration.SPFarm]::Local
$websvcs = $farm.Services | where -FilterScript {$_.GetType() -eq [Microsoft.SharePoint.Administration.SPWebService]}
$webapps = @()

#Change this to match your domain, or even include the OU
$LDAP_Domain = "LDAP://DC=contoso,DC=local"

#From Check-SharePoint-Orphaned-Users: http://sharepointpsscripts.codeplex.com/releases/view/21699
function Check_User_In_ActiveDirectory([string]$LoginName, [string]$domaincnx)
{
    $returnValue = $false
    #Filter on User and NTgroups which only exists
    $strFilter = "(&(|(objectCategory=user)(objectCategory=group))(samAccountName=$LoginName))"
    $objDomain = New-Object System.DirectoryServices.DirectoryEntry($domaincnx)

    $objSearcher = New-Object System.DirectoryServices.DirectorySearcher
    $objSearcher.SearchRoot = $objDomain
    $objSearcher.PageSize = 1000
    $objSearcher.Filter = $strFilter
    $objSearcher.SearchScope = "Subtree"

    #$objSearcher.PropertiesToLoad.Add("name")

    $colResults = $objSearcher.FindAll()

    if($colResults.Count -gt 0)
    {
        #Write-Host "Account exists and Active: ", $LoginName
        $returnValue = $true
    }
    return $returnValue
}

#From Delete alerts for a user: http://www.dunxd.com/2010/12/22/delete-all-alerts-for-a-user-in-sharepoint-with-this-powershell-script/
function DeleteOrphanedAlerts{
    foreach ($websvc in $websvcs) {
        foreach ($webapp in $websvc.WebApplications) {
            Write-Host "Webapp Name -->"$webapp.Name

            #limit our search to one web app
            if ($webapp.Name -eq "sharepoint80 - content"){
                foreach ($site in $webapp.Sites)
                {
                    # get the collection of webs
                    $webs = $site.AllWebs
                    foreach ($web in $webs)
                    {
                        # get the alerts
                        $alerts = $web.Alerts

                        Write-Host " "$web.Title", Alert count "$alerts.Count

                        # if more than 0 alerts, iterate through these alerts to see if there is one for the user
                        if ($alerts.Count -gt 0)
                        {
                            $myalerts = @()
                            $mysites += $web.Url
                            foreach ($alert in $alerts)
                            {
                                if ($alert.user.LoginName -ne $null)
                                {
                                    if ($alert.user.LoginName.Contains("\"))
                                    {
                                        $UserName = $alert.User.LoginName.split("\")

                                        $result = Check_User_In_ActiveDirectory $UserName[1] $LDAP_Domain

                                        if ($result -eq $false)
                                        {
                                            echo "  Deleting $($alert.Title) alert for $($alert.User.LoginName)"
                                            $myalerts += $alert
                                            $alertcount++
                                        }
                                    }
                                }
                            }

                            # now we have alerts for this site, we can delete them
                            foreach ($alertdel in $myalerts)
                            {
                                $alerts.Delete($alertdel.ID)
                            }
                        }
                    }
                }
            }
        }
    }
}

DeleteOrphanedAlerts
Write-Host "Completed"

This script will loop through all of the Web Applications in the farm, site collections, and webs, looking for all of the alerts.  Once an alert is found, it checks to see if the user is still in Active Directory.  If not, it adds the alert to a list of alerts to delete.

If you only want it to search through one web application, you can modify the check on $webapp.Name in the script (the "sharepoint80 - content" comparison) so that it only continues if the name of the web application matches yours.

Also note, you’ll need to modify the LDAP path, stored in the $LDAP_Domain variable to match your environment.
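For example, the variable could be set to something like this (the domain and OU here are placeholders, as the script's comment notes):

$LDAP_Domain = "LDAP://OU=Employees,DC=yourdomain,DC=local"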

WSS 3.0 to SharePoint 2010: Tips for delaying the Visual Upgrade

My most recent project has been to migrate a bunch of sites from WSS 3.0 (SharePoint 2007) to SharePoint Server 2010.  The users are currently working with WSS 3.0 and Office 2003, so the new ribbon based UI in 2010 will be completely new.  My client wants to avoid the new SharePoint 2010 look and feel until they’ve had time to train their users, so we’ve been testing the upgrades by keeping them with the 2007 user interface.

Permission to perform the Visual Upgrade

One of the first things we noticed was the default permissions for who was allowed to switch the UI from 2007 to 2010.  By default, site collection administrators and site owners can do this.  Since we wanted to more tightly control the timing of the new UI, I added a few lines to the PowerShell script that we are using to perform the migration.  This script creates the web application, sets the User Policy, and then does a Mount-SPContentDatabase to attach the old 2007 content database to the 2010 farm.  I added the following steps after the Mount-SPContentDatabase step:

#Remove the visual upgrade option for site owners
#   it remains for Site Collection administrators
foreach ($sc in $WebApp.Sites){
    foreach ($web in $sc.AllWebs){
        #Visual Upgrade permissions for the site/subsite (web)
        $web.UIVersionConfigurationEnabled = $false;
        $web.Update();
    }
}

These script steps loop through each Site Collection ($sc) in a particular web application ($WebApp), then loop through each subsite ($web) in the Site Collection and disable the Site Owner’s permission to perform the Visual Upgrade. This is equivalent to going to the Site Collection administrator settings page –> Visual Upgrade and selecting “Hide Visual Upgrade”.

Since only IT people have Site Collection administrator privileges, this will allow IT to control the timing of the new 2010 UI rollout.

Newly created subsites

Our next issue was brought to our attention by SharePoint Joel’s blog post last week (http://www.sharepointjoel.com/Lists/Posts/Post.aspx?ID=524 ).  In it, he lists some updates about the 2010 upgrade, and his fourth point was one that I hadn’t seen yet:

4. If a 2007 upgraded site has not been visually upgraded, the sites created underneath it will look like 2010 sites – While this is something I’ve been aware of, I think many don’t realize how this impacts common look and feel for master pages, and how it impacts good navigation and UI. As well depending on your patch level you may see hanging behavior in the list picker. The site and list creation Silverlight control in Internet Explorer is looking for resources that don’t exist in the galleries in the 2007 site, and hence it continues to spin and spin and eventually time out. The work around is to upgrade to SP1, or use Chrome or Firefox which won’t attempt to render the Silverlight control. When the root site collection is a 2007 site and has it’s set of galleries and the children are 2010 sites there is some strange behavior linked to the way that the galleries work and pull from the parent.

Our production SharePoint 2010 Farm has SP1 installed, as well as the December 2011 Cumulative Update, so I think the “hanging behavior” he mentions won’t affect us.

However, since we want to control the roll out of the UI, we are concerned that new subsites will have the 2010 look and feel, no matter what the parent site has.

Ok, time to dust off my developer skills.

I first looked into using feature stapling, but I couldn’t get that to work (although I’m pretty sure I had everything wired up correctly).  Then I stumbled upon SharePoint 2010’s web events – a great way to handle this.

Using Visual Studio 2010, I created a new SharePoint project and added a Web Event Receiver:

[screenshot]

In the Event Receiver class, I used the WebProvisioned method to check if the parent site is a 2007 site (UIVersion = 3), and if so, then set the newly created site to 2007:

 

/// <summary>
/// A site was provisioned.
/// </summary>
public override void WebProvisioned(SPWebEventProperties properties)
{
    base.WebProvisioned(properties);
 
    try
    {
        SPWeb curweb = properties.Web;
 
        if (curweb.ParentWeb != null)
        {
 
            //check if the parent website has the 2007 look and feel
            if (curweb.ParentWeb.UIVersion == 3)
            {
                //since parent site has 2007 look and feel
                //  we'll apply that look and feel to the current web
                curweb.UIVersion = 3;
                curweb.Update();
            }
        }
    }
    catch (Exception)
    {
        //TODO: Add logging for errors
    }
}
 

This event is part of a Feature that is scoped to the Site Level (Site Collection).  I added a couple of lines to my migration PowerShell script to activate the Feature for any site collections that we migrate.
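Those activation lines look roughly like this (a sketch only – "WebUIVersionEventReceiver" is a placeholder for the Feature's actual name or GUID):

#Sketch: activate the web event receiver Feature on each migrated site collection
foreach ($sc in $WebApp.Sites){
    Enable-SPFeature -Identity "WebUIVersionEventReceiver" -Url $sc.Url
}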

Plan Going Forward

The plan going forward is to perform the visual upgrade after the users for a particular site collection have gone through 2010 training. If we need to do several site collections at once, we’ll use a PowerShell script to loop through each site collection to update the sites to 2010.  If it’s just one or two, we’ll be using the “Update All Sites” button on the Visual Upgrade page for Site Collection Administrators.
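That bulk script will look something like this sketch, which mirrors the earlier snippet but switches each web to the 2010 version (UIVersion 4) and hides the Visual Upgrade option afterward:

#Sketch: switch every site in the web application to the 2010 look and feel
foreach ($sc in $WebApp.Sites){
    foreach ($web in $sc.AllWebs){
        $web.UIVersion = 4;
        $web.UIVersionConfigurationEnabled = $false;
        $web.Update();
    }
}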

The custom code for newly created sites won’t need to be changed, since it relies on the UI version of the parent site.  If the parent is 2010, then the new site will look 2010.

SharePoint 2010 MSDN Labs

Eric Ligman, from Microsoft, posted a great blog post this week listing all of the SharePoint 2010 Virtual Labs that are available from Microsoft.  His blog entry is here: http://blogs.msdn.com/b/mssmallbiz/archive/2012/03/13/sharepoint-server-2010-msdn-virtual-labs-available-to-you-online-plus-more-sharepoint-2010-resources.aspx

He also posted other resources as well.

I’ve copied his Virtual Lab links here:

SharePoint Server 2010 Virtual Labs

In addition to the SharePoint Server 2010 Virtual Labs, here are a few other SharePoint 2010 resources that I thought you might also be interested in:

How to migrate an InfoPath form from test to production in SharePoint 2010

 

In order to migrate (copy) an InfoPath form from one environment (say test to production), there are several things to consider. First, does your form use a data connection to retrieve or submit data? Second, if you update the form in the future, do you want all of the locations where this form is used to be updated as well? Consider these issues while reading the instructions below.

Step 1: Create a form

First, let’s create a simple InfoPath form. For my example, I’ve created an overly simplified leave request form. The PTO (Paid Time Off) form has only three fields on it: a leave date, a return date, and an employee name. I know this isn’t all that realistic, but I don’t want to get lost in the details of what’s on the form, since we’re focusing only on how to publish it.

Step 2: Publish your form using InfoPath

Our next step is to save and publish our form. In InfoPath, go to File –> Publish and select “SharePoint Server”:

[screenshot]

The Publishing Wizard will then appear:

  • Enter a SharePoint URL (don’t worry about the URL – you can use a test URL if you like)
  • Click Next

The next step will ask how you want to publish it. You’ll want to choose “Administrator-approved form template (advanced)”:

[screenshot]

After selecting Administrator-approved and clicking Next, InfoPath will ask you to specify a file location to save the form’s template to. This is the location (a file share) that you want to save the file to so your administrator can access it. In my example, I’m saving it to a mapped drive:

[screenshot]

The next step is to choose any fields in your form that you want to be available as columns in the SharePoint forms library:

[screenshot]

For my example, I’ve chosen to include all three fields, but this is up to you.

The final step in the Publishing Wizard is to click “Publish”. This will save your form template to the location you’ve chosen. Once it’s saved, the final confirmation step will appear and you can click “Close”.

Uploading the Form to SharePoint

The next step in our process is to upload the form to the “Managed Form Templates” list in SharePoint’s Central Administration. In order to do this, you will need Farm Administrator permission.

Navigate to Central Administration and select “General Application Settings” in the left navigation. Once you do that, you should see “Upload form template” as a choice under “InfoPath Forms Services” on the right:

[screenshot]

Click “Upload form template” and then you’ll see this:

[screenshot]

The first thing you do on this screen is click the Browse button and locate your published InfoPath form. This will be in the location that you entered in the Publishing Wizard in InfoPath. You can then click the Verify button to see if the form is ready to be uploaded. Hopefully, you’ll get the message “This form template is ready to upload to the server.”

If you receive an error at this step, be sure that you selected “Administrator Approved” in the Publishing Wizard, and that you’ve selected the same form to upload to SharePoint. InfoPath also saves a working copy that would normally be considered the author’s master copy, so you will more than likely have multiple copies/versions of the same form – be sure you have the correct one.

When you click “OK” on the Form Verification Status page, you will be directed back to the Upload page. You’ll have to select your form again, using the Browse button. This time, don’t click Verify, since we’ve already verified that this template will work. Instead, look at the upgrade options and consider whether they are applicable.

Once you’ve made your upgrade selections, click the Upload button. You should then see this:

[screenshot]

Click “OK” and you’re sent to the “Manage Form Templates” page. Depending on how quickly you click OK and how speedy your farm is, you may see the status as “Installing”:

[screenshot]

Just give it a minute and then refresh the page. Once the status is shown as “Ready”, we can do the next step.

Step 3: Activate Form for a Site Collection

Now that we have the form in Central Administration, we can make it available to the Site Collections where we want it to be used. To do this, hover over the form link and use the drop-down menu to select “Activate to a Site Collection”:

[screenshot]

You will then be asked to select the site collection to make it available to. Now that the form has been activated for a site collection, let’s go use it.

Step 4: Using the InfoPath form in your Site Collection

Once your form is available in a Site Collection, you still need to use it with a library. So first, navigate to your Site Collection and create a new “Form Library”. After the library is created, then go to “Library Settings” for that library.

We need to add a content type to the library. In order to do this, we go to “Advanced Settings”:

[screenshot]

On the Advanced Settings page, you’ll need to select “Yes” under “Allow management of content types?”:

[screenshot]

You can scroll down on the page and click “Ok”. You should now see a “Content Types” section added to the Library settings page, just above the columns section.

In the Content Types section, click “Add from existing site content types”. On the “Select Content Types” page, find your InfoPath form in the available content types and click Ok.

We’re almost ready, but one thing I like to clean up is the list of available content types for this library. As it is currently configured, my library would allow users to submit two different forms: “Form”, which is blank, and “PTO Form”, which is the one I want. To clean this up, click on “Change new button order and default content type”.

On this page, I’ll uncheck the box in the “Visible” column for “Form” and change the position of my “PTO Form” to 1. My page now looks like this:

[screenshot]

After clicking OK, I go back to my library and submit a new form…and there it is – my PTO form that I published via Central Administration.

Step 5: Updating an already deployed Managed InfoPath form

Once the form is deployed, you can make changes to the original. Let’s say you need to add a field to the form. You would update the form and use the Publishing Wizard to save it as an “Administrator Approved” form. Your SharePoint farm administrator would then upload the new form to the managed form templates list, paying careful attention to the upgrade options.

Once the form is uploaded, then SharePoint will find where that form is deployed and update it. How quickly this happens will depend on your farm.

Further Information

If you would like further information about the options for upgrading, or even how to update the form using PowerShell steps, please see Microsoft’s documentation on TechNet: http://technet.microsoft.com/en-us/library/cc262921.aspx
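For reference, the PowerShell equivalents of the upload and activation steps above look roughly like this sketch (the path, form file name, and site URL are placeholders):

#Sketch: upload the administrator-approved template, then activate it to a site collection
Install-SPInfoPathFormTemplate -Path "\\server\forms\PTORequest.xsn"
Enable-SPInfoPathFormTemplate -Identity "PTORequest.xsn" -Site "http://sharepoint/sites/hr"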

As I mentioned at the beginning of this post, you will need to consider how your form is updated, which I discussed briefly above.  I also mentioned that you would need to think about your data connections.  I’ll cover the data connection topic in my next blog post.

SharePoint 2010 Client Object Model – CAML Query inaccurate results error

I ran into a frustrating scenario today, while working with SharePoint 2010’s Managed Client Object Model.  My application queries a SharePoint document library using the file’s name (the FileLeafRef field).  Given that this field is unique, I was expecting only one result with the following code:

CamlQuery qry = new CamlQuery();

//filter the results to only get back the item with the filename we're looking for
qry.ViewXml = string.Format(
    "<Query><Where><Eq><FieldRef Name='FileLeafRef' /><Value Type='File'>{0}</Value></Eq></Where></Query>",
    remoteFileName);

ListItemCollection itms = lst.GetItems(qry);
ctx.Load(itms);
ctx.ExecuteQuery();

Instead of getting one record (or none), I was getting more than one.  So, I searched some more and found some examples with slightly different XML:

CamlQuery qry = new CamlQuery();

//filter the results to only get back the item with the filename we're looking for
qry.ViewXml = string.Format(
    "<View><Query><Where><Eq><FieldRef Name='FileLeafRef' /><Value Type='File'>{0}</Value></Eq></Where></Query></View>",
    remoteFileName);

ListItemCollection itms = lst.GetItems(qry);
ctx.Load(itms);
ctx.ExecuteQuery();

See the difference?  Look carefully at the ViewXml string – that’s right, I had left out the surrounding “<View>” tags in the first version!

Error when trying to access SharePoint 2010 via PowerShell in a new environment

I got this error today while trying to access my new SharePoint 2010 environment via PowerShell:

Cannot access the local farm. Verify that the local farm is properly configured, currently available, and that you have the appropriate permissions to access the database before trying again.

If I had just read the error message a little more carefully, I would have realized that I should start by looking at my permissions in SQL – which my account didn’t have.  This happened because I installed SQL Server 2008 R2 in my virtual machine while I was logged on as the local administrator, but when I was attempting to run PowerShell, I was logged on as the domain administrator. (These are virtual machines that I’m using for demos….)  The local administrator was granted access automatically as part of the SQL install.
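The longer-term fix (rather than always logging on as the local administrator) is to grant the other account access to the farm. Something along these lines should do it, run from an account that already has rights – the account name here is just a placeholder:

#Sketch: grant the domain account access to the farm's configuration database
Add-SPShellAdmin -UserName "CONTOSO\administrator"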

Error creating PowerPivot Service Application in SharePoint 2010 (SP1 + June 2011 CU)

A few weeks ago I ran into an error while setting up our test farm.  I was creating the service applications and when I got to PowerPivot, I got the following error:

Login failed for user 'NT AUTHORITY\ANONYMOUS LOGON' during PowerPivot Service application creation

I was able to work around this error by using PowerShell to create the service application.

Background

First, some details of our installation: two web front ends, two application servers, and one SQL server.  All are running Windows Server 2008 R2 x64 with Service Pack 1.  The SQL Server is 2008 R2 with SP1.  The SharePoint servers are running SharePoint Server 2010 Enterprise, and I installed SharePoint’s Service Pack 1 and the June 2011 Cumulative Update (by running all of the installers before running the first Configuration Wizard).

Clean Up from the error

Once you run into the error, you’ll see that a new database and a phantom application pool were created.  To clean these up:

  • Delete the database in SQL server (I did this just using SQL Studio)
  • Delete the database in SharePoint, using PowerShell: (Please note: this PowerShell line will delete any database entry in SharePoint where the actual database doesn’t exist in SQL)
    Get-SPDatabase | Where{$_.Exists -eq $false} | ForEach {$_.Delete()} 
  • Now delete the phantom Application Pool (the error causes a service application pool to be created within SharePoint, but doesn’t actually create it in IIS):
Remove-SPServiceApplicationPool -Identity "My App Pool" 

Just replace the “My App Pool” text with the name of the application pool you want to delete.

Create PowerPivot Service Application using PowerShell

Now, to create the PowerPivot service application, you can use the following PowerShell script:

   1:  $PowerPivotServiceName = "PowerPivot Service"
   2:   
   3:  Write-Host "Creating PowerPivot Service Application..." 
   4:  New-PowerPivotServiceApplication -ServiceApplicationName $PowerPivotServiceName -DatabaseServerName "SQL.MYDOMAIN.LOCAL" -DatabaseName "Service_PowerPivot_1" -AddToDefaultProxyGroup 
   5:  Write-Host "PowerPivot Service Application created" 
   6:   
   7:  Write-Host "Creating Application Pool" 
   8:  $AppPool = New-SPServiceApplicationPool -Name "AppPool_PowerPivot" -Account "DOMAIN\SERVICEACCOUNT_USERNAME" 
   9:  Write-Host "App Pool created"
  10:   
  11:  Write-Host "Assigning PowerPivot Application Pool" 
  12:  $sa = Get-PowerPivotServiceApplication | where {$_.DisplayName -eq $PowerPivotServiceName} 
  13:  $sa.ApplicationPool = $AppPool; 
  14:  $sa.Update(); 
  15:  Write-Host "PowerPivot Application Pool Assigned"
  16:   
  17:  Write-Host "Script Complete"

You’ll need to fill in a few parameters specific to your environment:

Line 4: DatabaseServerName, DatabaseName

Line 8: The Application Pool’s name and the service account that it will run as

Summary

Doing these steps should give you a working PowerPivot service application.  We did open a support ticket with Microsoft.  They confirmed that other customers have seen this error, but they are still researching the cause.

SharePoint 2010 Not Indexing OneNote 2010 files

Our SharePoint team at work noticed that our SharePoint 2010 farm wasn’t returning search results based on the contents of OneNote files that were saved in document libraries.  I did a little research and found a solution, by putting together steps from different posts.

First, some details: we’re running SharePoint 2010 SP1 (with the June 2011 CU) on Windows Server 2008 R2 SP1.  However, this issue was seen before we applied the service packs and cumulative updates, so I’m not sure what effect, if any, those will have.

In order to get SharePoint 2010 to index OneNote 2010 files, I had to:

(These steps are from here: http://support.microsoft.com/kb/925765 )
  • Update the registry using the instructions from Microsoft (see the link above) – be sure to change the “12” to “14” in the registry settings, since Microsoft’s instructions refer to SharePoint 2007 (aka Office 12 SharePoint) and not SharePoint 2010 (aka Office 14 SharePoint)

I didn’t need to add the first key that Microsoft lists, because it was already on our servers – this one:

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Shared Tools\Web Server Extensions\14.0\Search\Setup\ContentIndexCommon\Filters\Extension\.one]
@="{B8D12492-CE0F-40AD-83EA-099A03D493F1}"
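For reference, setting just that key from PowerShell would look something like this sketch (this covers only the key quoted above; the rest of the KB’s registry changes aren’t shown):

#Sketch: create the .one filter key with the CLSID quoted from the KB article
$key = "HKLM:\SOFTWARE\Microsoft\Shared Tools\Web Server Extensions\14.0\Search\Setup\ContentIndexCommon\Filters\Extension\.one"
if (-not (Test-Path $key)) { New-Item -Path $key -Force | Out-Null }
Set-ItemProperty -Path $key -Name "(default)" -Value "{B8D12492-CE0F-40AD-83EA-099A03D493F1}"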

(These steps are from here: http://social.technet.microsoft.com/Forums/en-AU/sharepoint2010setup/thread/efb6851c-4a1b-4ecd-bbff-e4886ae15751 )

  • Uninstall the Microsoft Filter Pack 2.0 (which I believe is installed by SharePoint 2010) – ignore the warning/error message about files opened by the Search Service
  • Install Microsoft Filter Pack 2.0 : http://www.microsoft.com/download/en/details.aspx?id=17062
  • Reboot the server
  • Start a full crawl of the content source in SharePoint Central Administration (or from PowerShell – see the sketch after this list)
    • On one of our farms, Test, I also had to do an Index Reset.  I’m not sure why this farm was different.
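The full crawl can be kicked off from PowerShell rather than Central Administration – a sketch, assuming a single Search service application and the default content source name:

#Sketch: start a full crawl of the default content source
$ssa = Get-SPEnterpriseSearchServiceApplication
$cs = Get-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa -Identity "Local SharePoint sites"
$cs.StartFullCrawl()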

I applied these changes to our three environments (Dev, Test, Prod), so I think these steps are pretty accurate.


Update 8/23/2011 4pm:

After some further testing, it appears that our search results are not reflecting changes to saved OneNote files in SharePoint unless we do an Index Reset (!).  It looks like the next step will be to open a support ticket with Microsoft.

SharePoint 2010–Error when deleting Content Type In Use

UPDATE!

July 8th, 2011 – I’ve been reading Chris Poteet’s blog for a while (probably since Day of .Net Ohio in 2008 or 2009), and he recently started a new blog covering odd user interface quirks and inconsistencies with SharePoint.  He’s titled the blog Unexpected Error.  Well, today Chris was kind enough to mention me for Follow Friday on Twitter, and I followed the link to his blog again – and was stunned to see that I had written a post this week that is pretty much the same one he wrote two weeks ago.

I’m sure I read it when he published it, but I had forgotten about it by the time I wrote my post this week. So, my apologies to Chris for copying his idea.

Please read his post “But Where is the Content Type Used?”, along with the rest of his blog.


I’ve found another annoying user interface quirk that you get when you try to delete a content type in SharePoint 2010.  If the content type is still being used, you’ll get an error.  (The message displayed will vary, depending on how the custom errors setting is configured in the web.config files.)

There are a couple of options that the SharePoint team could use to make this friendlier for the user.  First, they could remove “Delete this content type” from the available choices.  Another option would be to grey it out and use tooltips or an asterisk with a footnote to explain why it can’t be deleted.  Another useful option would be to list all of the places where the content type is still in use.

This issue is similar to the one I documented in this post: Illegal characters for SharePoint 2010 Content Type name . In that post, I also detail how to get the full error messages to appear.

Anyway, here’s the sequence of screens that you’ll see.  First, you navigate to the content type settings and then select “Delete this content type”:

[screenshot]

Once you click “OK”, you’ll see one of the following error messages:

[screenshots of the error messages]

None of them explain what the user needs to do in order to actually delete the content type – namely, removing all dependencies on it.
