3 Keys for Managing a Migration Project

I was fortunate to work with a very talented team on a migration project a while back, and it turned out to be quite a success.  The project involved EVERY version of SharePoint except 2013.  It went about as smoothly as it could go, the client was very happy, and with constant process evaluation we managed to finish phase 1 of the project 2 months early.  This post focuses on the 3 key areas that I feel made our project a success: Data, Communication, and Process Evaluation.  Don’t expect any technical talk here folks!  The information isn’t laid out in strict chronological order, but it should give some insight into the things worth considering in a long-term project like this one.

Data Data Data and More Data

As I briefly mentioned, the project involved migrating SharePoint Team Services (yep, the one released in 2002), SharePoint Portal Server 2003, and SharePoint 2007.  The client wanted to migrate the sites gradually (groups of sites at a time) with minimal interruption.  Gathering the necessary data was the most important part of the project; we needed to know what we were dealing with.  The questions we needed to answer were:

  • How large were the sites?  How many subwebs, documents, lists, and users did they contain?
  • Who owned the sites?
  • What customizations, if any, did the site/sites contain?  Web Parts, Branding, 3rd Party solutions, Workflows, InfoPath forms, etc.
  • What kind of security did the sites implement?  Were all of the sites in the group inheriting permissions?

At the beginning of the project, the client had a tool we could use to generate a few reports and get most of that information.  Any missing information required us to go to the site itself and inspect it.  We stored this data in a SharePoint list and built a workflow around it to guide us through the upgrade/communication process.  We also categorized the data by complexity and gave each level of complexity a range of how long it would take to completely migrate a specific group of sites.  More on that later.

My team was very good at constantly evaluating our processes and determining where we could improve.  During one of these evaluations, we decided to create a PowerShell script that would loop through all of the sites, gather all of the information we needed, and output it to a report that we later imported into a SharePoint list.

Having all of this data in a single list had several advantages.

  1. The client was always aware of our progress.
  2. We were able to identify early which sites could pose a problem.
  3. We were able to generate metrics around the data.
    1. Progress charts
    2. Estimated Completion
    3. Percentage of Sites by Complexity

Also, by switching from the tool that we were using to the PowerShell script, we managed to dramatically reduce the amount of time that we spent gathering data.
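As a rough illustration, an inventory script along these lines can be built on the SharePoint server object model (which also works on the older farms we were migrating from, where no SharePoint cmdlets exist).  The web application URL, the captured properties, and the output path below are hypothetical; our actual script collected quite a bit more than this sketch.

```powershell
# Hypothetical inventory sketch using the SharePoint server object model.
# The web application URL and output path are placeholders.
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")

$webApp = [Microsoft.SharePoint.Administration.SPWebApplication]::Lookup("http://portal")
$report = @()

foreach ($site in $webApp.Sites)
{
    foreach ($web in $site.AllWebs)
    {
        # Capture the basics: size, structure, ownership, and security
        $report += New-Object PSObject -Property @{
            Url         = $web.Url
            Title       = $web.Title
            SubWebs     = $web.Webs.Count
            Lists       = $web.Lists.Count
            Users       = $web.Users.Count
            UniquePerms = $web.HasUniqueRoleAssignments
        }
        $web.Dispose()
    }
    $site.Dispose()
}

# Export to CSV so it can be imported into the tracking list
$report | Export-Csv -Path "C:\Reports\SiteInventory.csv" -NoTypeInformation
```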

Communication

I believe that constant communication and a certain level of transparency are essential in all projects.  In this project, we held weekly team meetings with the client.  In those meetings, using a simple dashboard, we provided very clear information regarding our progress: the number of sites migrated during the previous week, the percentage complete, the number of simple/moderate/complex migrations remaining, and more.  We also updated them on any issues we were having and flagged anything that we thought could potentially become an issue.

We also maintained a series of communication “scripts” that we would use when communicating with the site owners.  We had a script for each stage of communication.  This allowed us to maintain a level of consistency should someone have to take over contacting the site owners or if we swapped a team member out (which didn’t happen).  The site owners were always aware of when the migration would happen, who they needed to alert, how long the process should take, updates on any issues that may have come up during the migration, and when the new site was available.

Some of the communication material was used when directly contacting the site owners.  Others were used as part of a workflow created in SharePoint Designer and associated with the list that was populated with the data gathered by the PowerShell script.

The workflow managed the migration process through 5 of its 6 stages.  Once we had evaluated the site to determine complexity and contacted the owner to schedule a migration date, the workflow would be started manually.  The workflow would inform the owner of the migration date.  When the date arrived, we would move the workflow into the next stage, which informed the owner that the site was being migrated.  The next few stages tracked when the site was in quality assurance (where we verified that everything moved properly), when the site was turned over to the owner for sign-off, and finally a closed state, where the workflow informed the owner that the process was complete and that they could begin to use their migrated site.  Sometimes owners would not get back to us, so with the client’s permission we capped the sign-off stage at 7 days.  If there was no response after 7 days, we would push the item to closed.  It wasn’t often that we needed to do this.

Process Evaluation

Through constant evaluation, we were able to gradually identify inefficiencies.  The primary source of this evaluation was the workflow used to move the site into different stages of the migration.  Using the created date for each task associated with the workflow, we could determine how long we were in any given stage and focus our efforts on that area.
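As a sketch of the idea (with made-up task data and field names), the time spent in each stage can be derived by sorting a site’s workflow tasks by created date and diffing consecutive entries:

```powershell
# Sketch: derive time-in-stage from workflow task created dates.
# The task data, stage names, and fields here are made up for illustration.
$tasks = @(
    @{ Site = 'HR'; Stage = 'Scheduled'; Created = [datetime]'2011-03-01' },
    @{ Site = 'HR'; Stage = 'Migrating'; Created = [datetime]'2011-03-08' },
    @{ Site = 'HR'; Stage = 'QA';        Created = [datetime]'2011-03-09' },
    @{ Site = 'HR'; Stage = 'SignOff';   Created = [datetime]'2011-03-12' }
)

$sorted = $tasks | Sort-Object { $_.Created }
for ($i = 0; $i -lt $sorted.Count - 1; $i++)
{
    # Days between this task and the next = time spent in this stage
    $days = ($sorted[$i + 1].Created - $sorted[$i].Created).Days
    Write-Output ("{0}: {1} day(s)" -f $sorted[$i].Stage, $days)
}
```

Aggregating these per-stage durations across all sites is what pointed us at the stages worth improving.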

The PowerShell script mentioned throughout this post was a product of these many moments of evaluation.  The script determined complexity based on our criteria, which allowed us to spend less time reviewing sites.  We also eliminated the process used to determine site owners.  Early on, we used “business partners” to find out who the owners were.  They would typically contact someone related to the department or project, and that person would dig a little more until an owner was identified.  The script instead grabbed the list of names that appeared in the owners group, and we determined ourselves who the owner most likely was.  In most cases, we chose people who were already involved in the process.  Since they were already familiar with it, we would approach them first and cut down on some of the delays caused by unsure or hesitant owners.  If we chose wrong, that person could usually direct us to the correct person.
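Our actual criteria were specific to the client, but a complexity rating along these lines can be computed from the inventory data.  The thresholds, weights, and parameter names below are hypothetical:

```powershell
# Hypothetical complexity rating; thresholds and weights are illustrative only.
function Get-Complexity
{
    param([int]$SubWebs, [int]$Documents, [bool]$HasCustomizations, [bool]$UniquePerms)

    $score = 0
    if ($SubWebs -gt 10)     { $score++ }
    if ($Documents -gt 5000) { $score++ }
    if ($HasCustomizations)  { $score += 2 }   # customizations drove most of the rework
    if ($UniquePerms)        { $score++ }

    switch ($score)
    {
        { $_ -le 1 } { 'Simple';   break }
        { $_ -le 3 } { 'Moderate'; break }
        default      { 'Complex' }
    }
}

Get-Complexity -SubWebs 3 -Documents 200 -HasCustomizations:$false -UniquePerms:$true
```

Pairing each rating with a time range, as we did, is what turned the raw inventory into estimates the client could plan around.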

Early on, we were contacting people on a per-site basis.  We then created an InfoPath form that returned a list of sites where the current user’s name appeared in the owners field.  A drop-down was provided for each site, and the person could select whether they wanted the site Archived or Migrated, or in some cases simply inform us that they weren’t the correct owner.  The form updated our site list with the appropriate action, cutting down on some of the time spent during the initial communications.

In the end, we completed the project 2 months earlier than our initial target and used the remaining 2 months to get a head start on the next phase.  We provided workflows, forms, scripts, and communication materials that the client was able to use for later migrations.  Data, Communication, and constant Process Evaluation made our project a success and earned us the trust of our client.

Access Denied in IE but Works in Chrome

Here’s a simple one that made me want to kick myself once I realized what was going on.  When I tried to access SharePoint in IE, I kept getting an Access Denied page, but whenever I tried the site in Chrome, I had no problems.

[Screenshot: the SharePoint Access Denied page]

So what was the problem?  It turns out that I had provided the local admin credentials (server\administrator) in Chrome once, and Chrome used them to log me in from that point on.  IE, on the other hand, was using the account that I had logged into the server with, which was domain\administrator.

In order to back out of it and provide the credentials that I wanted, I simply used the following address:

http://<server>/_layouts/closeConnection.aspx?loginasanotheruser=true

Upload Large Files using CSOM and Memory Streams

I came across a blog post titled Limitation of Uploading Big File to SharePoint 2013 With CSOM a few days ago.  It mentions a plugin that uses CSOM to upload files to SharePoint and points out that, by default, SharePoint places a 2MB limit on messages received.  The post goes on to describe how to adjust the limit via PowerShell and/or C# code.  That got me wondering about custom code: how would you upload a large file without having to adjust the limit?  The first thing that came to mind was whether there is a way to use a file stream to get large files into a document library.  I did some digging, and there is in fact a way to stream the file.  Microsoft.SharePoint.Client.File contains a SaveBinaryDirect method that accepts the client context, the server-relative destination URL and file name, the stream, and a boolean indicating whether you want to allow a file overwrite.  Make sure to load the document library (the ctx.Load and ctx.ExecuteQuery calls) before calling the SaveBinaryDirect method.  The following example uploads the 9MB SharePoint 2010 Patterns and Practices guidance PDF.


// Requires references to Microsoft.SharePoint.Client and Microsoft.SharePoint.Client.Runtime
ClientContext ctx = new ClientContext("http://sp13");

string file = String.Concat(Environment.CurrentDirectory, @"\Files\SharePointGuidance2010.pdf");

// Load the destination library before calling SaveBinaryDirect
List docLib = ctx.Web.Lists.GetByTitle("Documents");
ctx.Load(docLib);
ctx.ExecuteQuery();

// Stream the file contents instead of sending them in a single CSOM message
using (MemoryStream stream = new MemoryStream(System.IO.File.ReadAllBytes(file)))
{
    Microsoft.SharePoint.Client.File.SaveBinaryDirect(ctx, "/shared documents/spg.pdf", stream, true);
}

Add a Geolocation Column with PowerShell

The following is a simple, reusable PowerShell script that can be used to add a Geolocation field to a list.


Param(
    [string]$url,
    [string]$listName,
    [string]$columnName
)

# Load the SharePoint snap-in when running outside the SharePoint Management Shell
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

[Microsoft.SharePoint.SPSecurity]::RunWithElevatedPrivileges(
{
    $web = Get-SPWeb $url
    $placesList = $web.Lists[$listName]

    # Geolocation is a built-in field type as of SharePoint 2013
    $spFieldType = [Microsoft.SharePoint.SPFieldType]::Geolocation
    $placesList.Fields.Add($columnName, $spFieldType, $false)
    $placesList.Update()

    $web.Dispose()
})