I am sure we all have our stories highlighting our experiences upgrading from SharePoint 2010 to 2013. I have some lessons learned that might be helpful to note when upgrading to SharePoint 2013 and reorganizing the site structure.
First, here is the scenario:
We were upgrading a fairly large (approximately 150 GB) single-content-database farm to 2013 while splitting the content database into three parts to better align with future growth and capacity needs. We wanted to keep the SharePoint 2010 look and feel, as there wasn't time to upgrade the UI and master pages to 2013.
The client had created SharePoint sites for each of the site templates available in 2010 to see what functionality each one offered. Over time, these sites began to contain real data that needed to be retained and migrated to SharePoint 2013 – I am sure you've seen similar situations. Let's try this BI or Project site template out and see what it does; the next thing you know, it is in production and can't be taken away.
We had several challenges, the first being how to split the content quickly enough that we could migrate everything in one weekend. Our timeline looked like this:
- Backup data on Friday at 4PM
- Detach database / attach database
- Start splitting content into new content databases
- Make sure everything works
- Live and Online Monday 8AM
I won't go into the steps we took to accomplish items 1 and 2 above, as they are thoroughly blogged elsewhere. However, we had some interesting challenges with items 3 and 4.
Item 3 – Splitting content into new content databases
We wanted to move certain department sites to new content databases (for example, /sites/dept/IT would be moved to /dept/IT). PowerShell was our best friend for this process. We found we could split the databases as quickly or more quickly using the PowerShell export/import commands versus a third-party tool (we tried several). This surprised us, as we were certain the tool would outperform PowerShell; it makes some sense, however, since everyone is calling the same object model APIs. In the end, we used a combination of the third-party tool and PowerShell to split the sites into new content databases. Here is how we divided the workload:
- The tool was used to create/copy the site collection top levels to ensure all the security and groups were replicated.
- PowerShell did the heavy lifting of exporting and importing sites and subsites (a minimal sketch of the pattern follows this list).
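For illustration, here is a rough sketch of the export/import pattern – the URLs, file path, and options are placeholders, not our exact script:

# Export the department site from its original location
Export-SPWeb -Identity "http://portal/sites/dept/IT" -Path "E:\migration\IT.cmp" -IncludeVersions All -IncludeUserSecurity
# Import it under the new managed path, whose site collection lives in the new content database
Import-SPWeb -Identity "http://portal/dept/IT" -Path "E:\migration\IT.cmp" -IncludeUserSecurity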
What about InfoPath forms and workflows? Good question! The InfoPath forms came across without many issues; there were several forms where we had to change list URLs in the schema files, but other than that InfoPath was well behaved. The workflows took a bit more work to reconnect to the new lists. First and foremost, there is no way to copy over the running workflow states and content. However, we were able to re-attach the workflows to the new lists, and all new list workflows worked fine. Once again we pulled PowerShell into service and created a script to:
- Read all lists and libraries in the site and subsites
- Map the old site URL to the new site URL
- Check for a site workflow
- Get the task and history list title and ID for the NEW list
$wftasklist=$newweb.Lists[$workflow.TaskListTitle];
$wfhistorylist=$newweb.Lists[$workflow.HistoryListTitle];
- Grab the workflow folder and save all the files to the local drive
$workflowfiles = $newweb.Folders["Workflows"].SubFolders | where {$_.Name -eq $workflow.Name}
- Update the files with the new IDs (note the [xml] cast below, which lets PowerShell treat the file content as an XML document so the element properties resolve)
[xml]$filecontent = Get-Content $tempxmlfilename
$filecontent.WorkflowConfig.Association.TaskListID = [string]("{" + $wftasklist.ID + "}")
$filecontent.WorkflowConfig.Association.HistoryListID = [string]("{" + $wfhistorylist.ID + "}")
- Save the files
- Upload them back to SharePoint (see the sketch just below)
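Pieced together, the save and upload steps looked roughly like this – a sketch following the fragments above, where $filecontent, $tempxmlfilename, and $workflow come from the earlier steps:

# Write the modified XML back to the temp file
$filecontent.Save($tempxmlfilename)
# Upload the file into the workflow's folder, overwriting the original
$wffolder = $newweb.Folders["Workflows"].SubFolders | where {$_.Name -eq $workflow.Name}
$bytes = [System.IO.File]::ReadAllBytes($tempxmlfilename)
$name = [System.IO.Path]::GetFileName($tempxmlfilename)
$wffolder.Files.Add($name, $bytes, $true) | Out-Null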
The challenge was to figure out how to publish the workflow without having to open SharePoint Designer. Again, PowerShell to the rescue!
# Create the proxy for the Web Part Pages web service (our setup; your credential handling may differ)
$proxy = New-WebServiceProxy -Uri ($web.Url + "/_vti_bin/webpartpages.asmx?WSDL") -UseDefaultCredential
# The web service proxy sets the URL to root, so we reset it to the subsite
$proxy.Url = $web.Url + "/_vti_bin/webpartpages.asmx"
$result = $proxy.ValidateWorkflowMarkupAndCreateSupportObjects($configString, $rulesString, $xomlString, "2")
Write-Host $result
$result = $proxy.AssociateWorkflowMarkup($xomlConfigFile.Url, "V" + $xomlConfigFile.UIVersionLabel)
Write-Host $result
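For what it's worth, ValidateWorkflowMarkupAndCreateSupportObjects compiles and validates the markup on the server, and AssociateWorkflowMarkup attaches the published definition to the list – as best we could tell, this mirrors what SharePoint Designer does behind the scenes when you click Publish.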
Item 4 – Testing all the sites
Now that we had the sites back together and the workflows working properly, we needed to figure out why the BI site and the analytics were not working as intended. In fact, whenever we accessed the BI site, we would get correlation errors complaining that the server could not access the Analysis Services database. I followed all the instructions in the Microsoft blog posts and double-checked all the settings. I eventually discovered I needed to install the 2010 reporting services client in addition to the 2012 version. Problem solved, and everything worked perfectly. This made sense once I realized the connection strings pointed to a SQL database with the reporting services still on the 2010 release.
During testing we also found that some InfoPath forms were failing to load properly because they pulled large amounts of data from their lists. We found and edited the web.config property
<httpRuntime maxRequestLength="51200" executionTimeout="1200" />
and added the executionTimeout attribute you see in the line above so InfoPath would wait just a little longer for the data to be fetched from the server.
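As a point of reference, maxRequestLength is measured in kilobytes and executionTimeout in seconds, and ASP.NET only enforces executionTimeout when the compilation element in the same web.config has debug set to false.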
This post doesn't cover everything we went through in testing, installation, and so on, just the items that seemed most unusual. When Monday morning rolled around, everyone was working, and the migration team sat in the conference room staring at each other. Eventually, we went to our respective desks to start on our next projects.
All in all, the migration was very smooth, and with the help of PowerShell, seemingly impossible tasks became possible.