We had a client moving from an on-premises Optimizely CMS 11 instance to CMS 12 on DXP, and a lot of blobs (over 40 GB) needed to be uploaded to DXP as part of the conversion.
This was my first experience doing both a version and environment upgrade and I leaned heavily on Optimizely support to help me get it right. Along the way, I wrote down each step so that I could reference it for any future projects that were doing the same. Perhaps someone else will be in the same spot and can benefit from this tutorial as well.
Blobs and the database can now be uploaded to DXP directly in PowerShell.
To upload blobs to DXP via PowerShell
1. Get the location of the blobs.
   1.1. Find the folder on your machine where the blobs are located and unzipped.
   1.2. Write down that folder path (e.g. “C:\source\MyProject\MyProject.Web\App_Data\blobs”).
   1.3. (Important) Make sure the path ends with “\*” so that you copy only the contents of the blob folder and not the folder itself (e.g. “C:\source\MyProject\MyProject.Web\App_Data\blobs\*”).
   1.4. This full path will be referred to later as $BLOB_LOCATION (a quick sanity check on the folder is sketched right below this step).
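Not required, but before uploading 40+ GB of blobs it is worth a quick sanity check that the path is right. A minimal PowerShell sketch, using the example path from step 1.2:

$blobFolder = "C:\source\MyProject\MyProject.Web\App_Data\blobs"

Test-Path $blobFolder   # should print True

# Rough total size of what will be uploaded, in GB
(Get-ChildItem $blobFolder -Recurse -File | Measure-Object -Property Length -Sum).Sum / 1GB

# Note the trailing \* so only the contents of the folder are copied
$BLOB_LOCATION = "$blobFolder\*"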
2. Get AzCopy from Microsoft.
   2.1. Download AzCopy from Microsoft (https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-v10#download-azcopy).
   2.2. Extract the downloaded archive to a folder on your machine.
   2.3. Traverse the extracted files to find the executable, “azcopy.exe”.
   2.4. Write down the location of the executable (e.g. “%USERPROFILE%\Downloads\azcopy_windows_amd64_10.23.0\azcopy_windows_amd64_10.23.0\” or “C:\Users\Me\Downloads\azcopy_windows_amd64_10.23.0\azcopy_windows_amd64_10.23.0\”). A scripted alternative to steps 2.1–2.4 is sketched right below.
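If you prefer to script the download instead of using the browser, something along these lines works (the aka.ms link is Microsoft’s documented short URL for the Windows 64-bit build; the destination folders are just examples):

# Download and extract AzCopy without the browser
Invoke-WebRequest -Uri "https://aka.ms/downloadazcopy-v10-windows" -OutFile "$env:USERPROFILE\Downloads\azcopy.zip"
Expand-Archive -Path "$env:USERPROFILE\Downloads\azcopy.zip" -DestinationPath "$env:USERPROFILE\Downloads\azcopy" -Force

# Locate azcopy.exe inside the extracted folder and note its location for step 5
Get-ChildItem "$env:USERPROFILE\Downloads\azcopy" -Recurse -Filter azcopy.exe | Select-Object -ExpandProperty FullName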
3. Ask Optimizely support to provide a temporary SAS URL (a shared access signature granting write access to the environment’s blob storage container). This will be referred to later as $SAS_URL. Take note of the container name found in the SAS URL (“https://*.blob.core.windows.net/MYCONTAINERNAME”); a small sketch for pulling it out follows right below.
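The container name is simply the first path segment of the SAS URL, so you can also pull it out programmatically. A small sketch (the URL is a made-up placeholder, not a real token):

$SAS_URL = "https://mysite.blob.core.windows.net/mysitemedia?sp=..."

# The container name is the first (and only) path segment of the SAS URL
$containerName = ([System.Uri]$SAS_URL).AbsolutePath.Trim('/')
$containerName   # -> mysitemedia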
4. Populate the azcopy command using the values from steps 1.4 and 3:
azcopy copy "$BLOB_LOCATION" "$SAS_URL" --recursive=true
Example:
azcopy copy "C:\source\MyProject\MyProject.Web\App_Data\blobs\*" "https://mysite.blob.core.windows.net/MYCONTAINERNAME?sp..." --recursive=true
5. Open a PowerShell window and navigate to the folder from step 2.4.
6. Paste the azcopy command from step 4 and run it (a consolidated version of steps 4–6 is sketched right below).
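Putting steps 4 through 6 together in a single PowerShell session looks roughly like this (the paths and the SAS URL are the placeholder examples from earlier):

# Folder containing azcopy.exe (from step 2.4)
Set-Location "$env:USERPROFILE\Downloads\azcopy_windows_amd64_10.23.0\azcopy_windows_amd64_10.23.0"

# Values from steps 1.4 and 3
$BLOB_LOCATION = "C:\source\MyProject\MyProject.Web\App_Data\blobs\*"
$SAS_URL = "https://mysite.blob.core.windows.net/MYCONTAINERNAME?sp..."

# Upload the contents of the blobs folder into the DXP container
.\azcopy.exe copy "$BLOB_LOCATION" "$SAS_URL" --recursive=true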
7. Make sure you have the following in your Startup.cs so the site reads blobs from the Azure storage container and uses Azure for remote events:

public void ConfigureServices(IServiceCollection services)
{
    // Serve blobs from the DXP storage container the blobs were uploaded to
    services.AddAzureBlobProvider(o =>
    {
        o.ContainerName = "MYCONTAINERNAME";
    });

    // Route remote events (e.g. cache invalidation between instances) through Azure
    services.AddAzureEventProvider();
}
If this has been helpful, check out my two-part series on setting up Local HTTPS for Optimizely, IIS, and Kestrel using certificates.
You can also generate the SAS token yourself, using the EpiCloud PowerShell module, if you prefer not to involve Optimizely support.
See my recent post:
https://www.gulla.net/en/blog/copy-database-and-blobs-to-optimizely-dxp/
Also, in DXP, the default container name is ‘mysitemedia’.
Hi Nick!
Thank you for this blog post! I just wanted to mention that you can perform these steps without even reaching out to support. You can get the SAS link mentioned in step 3 through the deployment API.
To import a DB see:
https://docs.developers.optimizely.com/digital-experience-platform/docs/export-database#import-database
To upload blobs (use the -Writable switch):
https://docs.developers.optimizely.com/digital-experience-platform/docs/storage-containers#get-epistoragecontainersaslink
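Roughly, with the EpiCloud PowerShell module it looks like this (the client key, secret, project ID and environment are placeholders; see the docs above for the exact parameters):

Install-Module EpiCloud -Scope CurrentUser

# Deployment API credentials are created in the DXP portal; these values are placeholders
Connect-EpiCloud -ClientKey '<client-key>' -ClientSecret '<client-secret>' -ProjectId '<project-id>'

# List the storage containers in the target environment
Get-EpiStorageContainer -ProjectId '<project-id>' -Environment Integration

# Get a writable SAS link for the blob container
Get-EpiStorageContainerSasLink -ProjectId '<project-id>' -Environment Integration -StorageContainer 'mysitemedia' -Writable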
Hopefully this makes the process even smoother!
Thanks Tomas!
I had worked on a solution for a different client that used a different container name, which is why I thought it might be dynamic. For this client, however, Opti had created the storage container as ‘mysitemedia’, as you mentioned. The documentation online is confusing: I was using this page in the CMS 12 developer docs to figure out how to deploy blobs: https://docs.developers.optimizely.com/content-management-system/docs/deploying-to-azure-webapps. That page says each account creates its own Blob Storage Account and Service Bus, but for DXP that is handled by Optimizely support.
Thanks for this information. It’s a helpful addition for sure.
Hi Anders,
Thanks for these links. They would have been helpful to me at the time, and hopefully they will help anyone reading this post.
Cheers!
Nick