3 Keys for Managing a Migration Project

A while back, I was fortunate to work with a very talented team on a migration project that turned out to be pretty successful.  The project involved EVERY version of SharePoint except for 2013.  It went about as smoothly as it could go, the client was very happy, and with constant process evaluation, we managed to finish phase 1 of the project 2 months early.  This post focuses on the 3 key areas that I feel made our project a success (Data, Communication, Process Evaluation).  Don’t expect any technical talk here, folks!  The information isn’t laid out in strict chronological order, but it should give some insight into the things worth considering in a long-term project like this one.

Data Data Data and More Data

As I briefly mentioned, the project involved migrating SharePoint Team Services (yep, the one released in 2002),  SharePoint Portal Server 2003, and SharePoint 2007.  The client wanted to migrate the sites gradually (groups of sites at a time) with minimal interruption.  Gathering the necessary data was the most important part of the project.  We needed to know what we were dealing with.  The questions that we needed to answer were:

  • How large were the sites?  How many sub webs, documents, lists, and users did they have?
  • Who owned the sites?
  • What customizations, if any, did the site/sites contain?  Web Parts, Branding, 3rd Party solutions, Workflows, InfoPath forms, etc.
  • What kind of security did the sites implement?  Were all of the sites in the group inheriting permissions?

In the beginning of the project, the client had a tool that we could use to generate a few reports and get most of that information.  Any missing information would require us to go to the site itself and inspect it.  We stored this data in a list that we built a workflow around to guide us through the upgrade/communication process.  We also categorized the data by complexity and gave each level of complexity a range of how long it would take to completely migrate a specific group of sites.  More on that later.

My team was very good at constantly evaluating our processes and determining where we could improve.  As part of that evaluation, we decided to create a PowerShell script that would loop through all of the sites, gather all of the information that we needed, and output it to a report that we later imported into a SharePoint list.

Having all of this data in a single list had several advantages.

  1. The client was always aware of our progress.
  2. We were able to identify early which sites could pose a problem.
  3. We were able to generate metrics around the data.
    1. Progress charts
    2. Estimated Completion
    3. Percentage of Sites by Complexity

Also, by switching from the tool that we were using to the PowerShell script, we managed to dramatically reduce the amount of time that we spent gathering data.

Communication

I believe that constant communication and a certain level of transparency are essential in all projects.  In this project, we held weekly team meetings with the client.  In those meetings, using a simple dashboard, we provided very clear information regarding our progress: the number of sites migrated during the previous week, the percentage complete, the number of simple/moderate/complex migrations remaining, and more.  We also updated them on any issues that we were having and informed them of anything that we thought could potentially become an issue.

We also maintained a series of communication “scripts” that we would use when communicating with the site owners.  We had a script for each stage of communication.  This allowed us to maintain a level of consistency should someone have to take over contacting the site owners or if we swapped a team member out (which didn’t happen).  The site owners were always aware of when the migration would happen, who they needed to alert, how long the process should take, updates on any issues that may have come up during the migration, and when the new site was available.

Some of the communication materials were used when directly contacting the site owners.  Others were used as part of a workflow created in SharePoint Designer and associated with the list that was populated with the data gathered by the PowerShell script.

The workflow managed the migration process through 5 of its 6 stages.  Once we evaluated the site to determine complexity and contacted the owner to schedule a migration date, the workflow would be manually started.  The workflow would inform the owner of the migration date.  Once the date arrived, we would move the workflow into the next stage, which would inform the owner that the site was being migrated.  The next few stages of the workflow were used to track when the site was in quality assurance (where we would verify that everything moved properly), when the site was turned over to the owner for sign-off, and finally a closed state, where the workflow would inform the owner that the process had been completed and that they could begin to use their migrated site.  Sometimes owners would not get back to us, so, with the client’s permission, we capped the sign-off stage at 7 days.  If after 7 days there was no response, we would push the item to closed.  It wasn’t often that we needed to do this.

Process Evaluation

Through constant evaluation, we were able to gradually identify inefficiencies.  The primary source of this evaluation was the workflow used to move the site through the different stages of the migration.  Using the created date for each task associated with the workflow, we could determine how long we spent in any given stage and focus our improvement efforts on that area.

The PowerShell script mentioned throughout this post was a result of our many moments of evaluation.  The script would be used to determine complexity based on our criteria and that allowed us to spend less time reviewing sites.  We also eliminated the process used to determine site owners.  Early on, we used “business partners” to find out who the owners were.  They would typically contact someone related to the department or project and that person would dig a little more until an owner was identified.  The script would instead grab the list of names that appeared in the owners group and we would determine ourselves who the owner most likely was.  In most cases, we chose people who were already involved in the process.  Since they were already familiar with the process, we would approach them first and cut down on some of the delays caused by unsure or hesitant owners.  If we chose wrong, that person would usually be able to direct us to the correct person.

Early on, we were contacting people on a per-site basis.  We then decided to create an InfoPath form that would return a list of sites where the current user’s name appeared in a field that contained a list of owners.  A drop-down was provided for each site, and the person was able to select whether they wanted a site Archived or Migrated, or, in some cases, just inform us that they weren’t the correct owner.  The form would update our site list with the appropriate action, and we were able to cut down on some of the time spent during the initial communications.

In the end, we completed the project 2 months earlier than our initial target and used the remaining 2 months to get a head start on the next phase.  We provided workflows, forms, scripts, and communication materials that the client was able to use for later migrations.  Data, Communication, and constant Process Evaluation made our project a success and earned the trust of our client.

Access Denied in IE but Works in Chrome

Here’s a simple one that made me want to kick myself once I realized what was going on.  When I tried to access SharePoint in IE, I kept getting an access denied page but whenever I would try the site in Chrome, I had no problems.

So what’s the problem?  Well, it turns out that I had provided the local admin credentials (server\administrator) in Chrome once, and Chrome used them to log me in from that point on.  IE, on the other hand, was using the account that I had logged into the server with, which was domain\administrator.

In order to back out of it and provide the credentials that I wanted, I simply used the following address:

http://<server>/_layouts/closeConnection.aspx?loginasanotheruser=true

Upload Large Files using CSOM and Memory Streams

I came across a blog post titled Limitation of Uploading Big File to SharePoint 2013 With CSOM a few days ago.  It mentions a plugin that uses CSOM to upload files to SharePoint and points out that, by default, SharePoint places a 2MB limit on messages received.  The post goes on to describe how to adjust the limit via PowerShell and/or C# code.  That got me wondering about custom code: how would you upload a large file without having to adjust the limit?  The first thing that came to mind was whether there is a way to use a file stream to get large files into a document library.  I did some digging, and there is in fact a way to stream the file.  Microsoft.SharePoint.Client.File contains a SaveBinaryDirect method that accepts the client context, the server-relative destination URL and file name, the stream, and a boolean indicating whether you want to allow a file overwrite.  Make sure to load the library (the Load/ExecuteQuery calls in the example) before calling the SaveBinaryDirect method.  The following example uploads the 9MB SharePoint 2010 Patterns and Practices guidance PDF.


using System;
using System.IO;
using Microsoft.SharePoint.Client;

ClientContext ctx = new ClientContext("http://sp13");

// Path to the local file being uploaded
string file = String.Concat(Environment.CurrentDirectory, @"\Files\SharePointGuidance2010.pdf");

// Load the destination library before calling SaveBinaryDirect
List docLib = ctx.Web.Lists.GetByTitle("Documents");
ctx.Load(docLib);
ctx.ExecuteQuery();

// Stream the file to the library; the last argument allows overwriting an existing file
using (MemoryStream stream = new MemoryStream(System.IO.File.ReadAllBytes(file)))
{
    Microsoft.SharePoint.Client.File.SaveBinaryDirect(ctx, "/shared documents/spg.pdf", stream, true);
}
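
As an aside, if you do need to raise the limit instead of (or in addition to) streaming, the farm-level setting mentioned in that post can be adjusted with a few lines of PowerShell on the SharePoint server.  Here’s a quick sketch (run it from the SharePoint Management Shell as a farm administrator; the 10MB value is just an example):

# Raise the maximum message size the client request service will accept (value is in bytes)
$ws = [Microsoft.SharePoint.Administration.SPWebService]::ContentService
$ws.ClientRequestServiceSettings.MaxReceivedMessageSize = 10485760
$ws.Update()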

Add a Geolocation Column with PowerShell

The following is a simple, reusable PowerShell script that can be used to add a Geolocation field to a list.


Param(
    [string]$url,
    [string]$listName,
    [string]$columnName
)

# Load the SharePoint cmdlets if the script isn't being run from the SharePoint Management Shell
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

[Microsoft.SharePoint.SPSecurity]::RunWithElevatedPrivileges(
{
    $web = Get-SPWeb $url
    $placesList = $web.Lists[$listName]

    # Add a non-required field of type Geolocation
    $spFieldType = [Microsoft.SharePoint.SPFieldType]::Geolocation
    $placesList.Fields.Add($columnName, $spFieldType, $false)
    $placesList.Update()

    $web.Dispose()
})
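
Assuming you save the script as Add-GeolocationColumn.ps1 (the file name is just what I’d call it), running it would look something like this:

.\Add-GeolocationColumn.ps1 -url "http://sp13" -listName "Places" -columnName "Location"

Keep in mind that to actually see the map in the new column, the farm still needs a Bing Maps key, which can be set with Set-SPBingMapsKey.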

Client-Side Rendering : Fields

I realize that there are several posts out there regarding this topic but it’s something that I’ve been wanting to play with for a while and I had nothing to do this weekend.

Client-side rendering is simply the process of using JavaScript to manipulate how certain data is rendered.  In this post, I’ll walk through a simple JavaScript file that I threw together that checks the value of a specific field and displays an image in place of its text.

I have a custom list with a column called Gender.  Gender is a choice field with two possible options: “Male” or “Female”.  The example below starts by defining which field we want to manipulate.  The line containing genderContext.Templates.Fields is where we specify which column will be manipulated.  When I created the Gender column, SharePoint gave it an internal name of “y34x”, so that’s how I’m identifying which column we’re dealing with.  Next, you specify what needs to be manipulated.  In this case, I want to update the view, and I’m passing in the name of the function that will do the work.  You can also override the forms if you choose to.

(function () {
  var genderContext = {};

  genderContext.Templates = {};
  // "y34x" is the internal name of the Gender column in my list
  genderContext.Templates.Fields = {
    "y34x": {
      "View": GenderViewTemplate
      //DisplayForm:
      //EditForm:
      //NewForm:
    }
  };

  SPClientTemplates.TemplateManager.RegisterTemplateOverrides( genderContext );
})();

This next section is the function that changes the rendering.  We start by getting the value of the field that we’re interested in and storing it in a variable called gender.  Based on that value, we then return an image tag in place of the field’s text.

function GenderViewTemplate( ctx ) {
  // Value of the Gender field for the current item
  var gender = ctx.CurrentItem[ctx.CurrentFieldSchema.Name];
  var returnValue = '';

  // Swap the text for the matching image; any other value falls through and renders as blank
  switch ( gender ) {
    case "Male":
      returnValue = "<img src='/_layouts/15/images/ClientSideRendering/business_user.png' style='height: 32px; width: 32px;' />";
      break;
    case "Female":
      returnValue = "<img src='/_layouts/15/images/ClientSideRendering/female_business_user.png' style='height: 32px; width: 32px;' />";
      break;
  }

  return returnValue;
}

I published the JavaScript file and images to the file system and now need to reference the JavaScript file where I want to use it.  Edit the list view web part and expand the miscellaneous section.  At the bottom, you’ll see a JS Link textbox where you can specify the path to your js file.


Click OK, and the Gender column in your list view will now render the images in place of the text.
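
As a side note, if you’d rather not set the path through the web part properties each time, the view itself has a JSLink property that can be set with PowerShell.  Here’s a rough sketch (the list name and file path below are just placeholders for my environment):

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$web = Get-SPWeb "http://sp13"
$list = $web.Lists["People"]                 # placeholder list name
$view = $list.DefaultView
$view.JSLink = "/_layouts/15/CSR/gender.js"  # placeholder path to the published js file
$view.Update()
$web.Dispose()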

InfoPath and SharePoint Web Services: GetListItems

I’ve been working on an InfoPath form (which has been driving me crazy, to say the least).  I needed a collection of list items for the current user, so I started by creating a data connection to my list, displaying it in a repeating table, and hiding what I didn’t want to see.  (I know, sloppy.)  It worked, but my client uses IE8, and it would throw JavaScript errors if it had to pull too many list items.  (IE10 and Chrome worked fine.)  The form would eventually load, but that was no good.

I then tried using web services, specified lists.asmx, and attempted to use the GetListItems operation.  No matter what I did, I got the following error message:

The SOAP response indicates that an error occurred on the server:
Microsoft.SharePoint.SoapServer.SoapServerException: Exception of type ‘Microsoft.SharePoint.SoapServer.SoapServerException’ was thrown.
Element of parameter query is missing or invalid.0x82000000

I dug around and it appears to be a common issue, but I haven’t found an alternative suggestion, so I gave REST a try.

1. Start by selecting “From REST Web Service” when creating the data connection.
2. Next, enter the site URL followed by “/_vti_bin/listdata.svc/” (you should be able to add your filters here, but in my case I needed it to be a little flexible, so we’ll do that in a bit).

Now you can access all of the data, but like I said earlier, I needed to filter it, so here’s how I did it using Rules.

1. Create a new action-based rule.  The action we’ll need is “Change REST URL”.
2. We’ll need a function to build the new URL, and I started with concat.  There’s an existing SharePointSiteURL() function that returns the URL of the site where the form resides.  Pass that function into concat first, then fill in the rest of the concat arguments as needed (in my environment the field names didn’t show up in the dialog, but they’re there).
3. The last thing we’ll need is to query the data again.  Just create another rule, select “Query for Data” in the list of actions, and choose your REST Web Service connection.
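
For what it’s worth, the URL that the concat produces ends up with the usual listdata.svc shape, something along the lines of <site url>/_vti_bin/listdata.svc/Sites?$filter=substringof('<user name>', Owners) (the list name, field name, and filter here are just placeholders, so adjust them to match your own columns).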

MOSS Site Info with PowerShell

I started working with a team on a large migration project a few months ago.  The client requested that we approach the migration site by site instead of upgrading the whole environment all at once.  The first phase of the project involved migrating STS to SharePoint 2010 (yes folks, I said STS… as in V1).  Phase two involves upgrading their MOSS environment to 2010 and that’s the focus of this post.  We’re trying to approach phase two in a similar fashion while keeping an eye on streamlining our processes. 

Overview of our Approach
Since we’re approaching this project site by site, communication is VERY important.  I won’t bore you with the details so here’s the gist of the process.

1. We kick off a workflow which tosses our item into a Review state where we gather information about our sites (how big is the site, are there workflows or InfoPath forms, are there any other customizations, etc.).  **This is where PowerShell came in**
2. We contact the owner to schedule some time for the upgrade since their site will experience some downtime.
3. We perform the upgrade (with DocAve)
4. We verify the upgrade
5. We wait for site owner sign off

Enough of the Back Story Already!
The script is long and includes an options menu, error handling and logging so I’ll just show the important parts.

We needed a list of all sites, the site owners, site size, and other information.  In order to use the object model, we need to load the SharePoint assembly.

[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")

Get an instance of the needed SPWeb:
$site = New-Object Microsoft.SharePoint.SPSite($rootURL)
$web = $site.OpenWeb();

Get a list of people in the owners group:
    $owners = $null
    foreach ($u in $web.AssociatedOwnerGroup.Users)
    {
        $owners += $u.LoginName + ';'
    }
   
Next, I needed storage information for each site.  For more info on the following method, see the documentation for SPSite.StorageManagementInformation.

Get list storage information:
$listDataTable = $site.StorageManagementInformation(1, "Increasing", "Size", "100000")

Get document storage information:
$docLibDataTable = $site.StorageManagementInformation(2, "Increasing", "Size", "100000")

Get a total of list and library storage usage:
The previous lines created a data table that contained storage usage for each list and library.  Now we loop through each line and add up the numbers.  Since the data table contains usage for all lists and libraries in the site collection, we’ll look for just the ones that are relevant to the current web by using the web’s ID.

$siteSize = 0

foreach ($row in $docLibDataTable.Rows)
{
    if (([Guid]$row.WebGuid) -eq $web.ID)
    {
        $siteSize += $row.Size
    }
}

foreach ($row in $listDataTable.Rows)
{
    if (([Guid]$row.WebGuid) -eq $web.ID)
    {
        $siteSize += $row.Size
    }
}

Find running workflows and whether an InfoPath form exists:
$availableWorkflows = $null
$containsInfoPathForms = $false

foreach ($list in $web.Lists)
{
    # if one InfoPath form is found on the site, there's no need to search again
    if ($containsInfoPathForms -eq $false)
    {
        $containsInfoPathForms = ($list.BaseType -eq "DocumentLibrary" -and $list.BaseTemplate -eq "XMLForm")
    }

    foreach ($association in $list.WorkflowAssociations)
    {
        $availableWorkflows += $association.Name + ';'
    }
}

 
Output the results to a CSV:
First, I needed a global variable to store the results, since the method that gathers all of the data is called recursively.

$global:outputToCsv = @()

Next, I create a new object to store the data for the current web and append those results to my global variable.

$output = New-Object PSObject
$output | Add-Member -MemberType NoteProperty -Name "Title" -Value $web.URL
$output | Add-Member -MemberType NoteProperty -Name "Size" -Value ($siteSize/1MB)
$output | Add-Member -MemberType NoteProperty -Name "Sub Sites" -Value $web.Webs.Count
$output | Add-Member -MemberType NoteProperty -Name "Infopath Forms" -Value $containsInfoPathForms
$output | Add-Member -MemberType NoteProperty -Name "Owners" -Value $owners
$output | Add-Member -MemberType NoteProperty -Name "Site Template" -Value $web.WebTemplate

$global:outputToCsv += $output
 
Finally, when all the data has been collected, I export it to a CSV:

$global:outputToCsv | Export-Csv (“c:\scripts\OneSiteCollection.csv”) -NoTypeInformation
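
The recursive piece isn’t shown above, but the overall shape of the function that ties these fragments together looked roughly like the sketch below (the function name and structure here are simplified placeholders rather than the actual script):

function Get-WebInfo($web)
{
    # gather the owners, storage, workflow, and InfoPath information for $web
    # (the fragments shown above), build $output, and append it to $global:outputToCsv

    # then recurse into each sub web, disposing as we go
    foreach ($subWeb in $web.Webs)
    {
        Get-WebInfo $subWeb
        $subWeb.Dispose()
    }
}

$global:outputToCsv = @()

$site = New-Object Microsoft.SharePoint.SPSite($rootURL)
Get-WebInfo $site.RootWeb
$site.Dispose()

$global:outputToCsv | Export-Csv ("c:\scripts\OneSiteCollection.csv") -NoTypeInformation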