The Optimizely Content Delivery API is well known as the foundation for headless CMS setups, connecting the CMS to single-page apps, mobile apps, and so on. However, I have found that it also comes in handy for data migration. With the Content Delivery API, you can easily retrieve detailed information about source content in JSON format and map it to your target application. In this article, I will give brief instructions on how to migrate data between Optimizely Content Cloud applications using the API.
Prerequisites
Content Delivery API and Content Search API need to be installed and configured properly in your source application.
Note:
The Content Search API requires a CORS policy in order to work; to read more about the configuration, please refer to this document for details. You also have to reindex the site after installation.
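As a minimal sketch, assuming the source site runs CMS 12 on ASP.NET Core, the CORS setup might look like the following. The policy name and target origin are placeholders, and older .NET Framework-based sites configure this in web.config instead:

using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.DependencyInjection;

// Startup.cs of the source application (illustrative)
public void ConfigureServices(IServiceCollection services)
{
    services.AddCors(options =>
    {
        options.AddPolicy("ContentApiCors", policy =>
            policy.WithOrigins("https://your-target-site.com") // placeholder origin
                  .AllowAnyHeader()
                  .AllowAnyMethod());
    });
}

public void Configure(IApplicationBuilder app)
{
    // Register CORS early in the request pipeline
    app.UseCors("ContentApiCors");
    // ...the rest of the pipeline
}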
Migrating pages/blocks
Once the Content Delivery API is installed in the source application, we will create a scheduled job in the target application. The job will pull the latest data from the source application by calling the API and then map it to the new content types of the target app.
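For reference, a minimal skeleton of such a job might look like this (the class name, display name, and GUID are placeholders of my own, not part of any package):

using EPiServer.PlugIn;
using EPiServer.Scheduler;

// Minimal scheduled job skeleton in the target application
[ScheduledPlugIn(DisplayName = "Migrate content from source site",
    GUID = "0f84dcd7-1f28-4e31-93c2-0f1b2a3c4d5e")] // placeholder GUID
public class ContentMigrationJob : ScheduledJobBase
{
    public override string Execute()
    {
        // 1. Call the source site's search endpoint (see below)
        // 2. Deserialize the JSON response into C# objects
        // 3. Map each item to a target content type and publish it
        return "Migration finished.";
    }
}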
To retrieve a list of content of a specific type (e.g. ProductPage), call this endpoint with the top and skip parameters for pagination:
https://yoursite.com/api/episerver/v3.0/search/content?filter=ContentType/any(t:t eq 'ProductPage')&top=10&orderby=name&skip=10
Note:
The REST API version of the Content Delivery API used in this example is 3.0.0.
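As a hedged sketch, here is one way the job could page through the results with System.Net.Http.HttpClient and Newtonsoft.Json. The host name, page size, and the Results/TotalMatching field names are assumptions on my part; check the exact response shape (and casing) of the version you have installed:

using System.Collections.Generic;
using System.Linq;
using System.Net.Http;
using Newtonsoft.Json.Linq;

var httpClient = new HttpClient();
var allResults = new List<JObject>();
const int pageSize = 10;
var skip = 0;
while (true)
{
    // Spaces and quotes in the filter are escaped when HttpClient builds the Uri
    var url = "https://yoursite.com/api/episerver/v3.0/search/content" +
              "?filter=ContentType/any(t:t eq 'ProductPage')" +
              $"&orderby=name&top={pageSize}&skip={skip}";
    var json = httpClient.GetStringAsync(url).Result;
    var response = JObject.Parse(json);
    // Adjust the field name/casing to match your version's response
    var results = response["Results"] as JArray;
    if (results == null || results.Count == 0)
    {
        break; // no more pages
    }
    allResults.AddRange(results.Cast<JObject>());
    skip += pageSize;
}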
Once the returned JSON string is deserialized into a list of C# objects, you can iterate through the list and map each item to your new content type’s properties.
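To illustrate this step, here is a hedged sketch. The SourceProductPage DTO, its Heading property, the ProductPage target type, and targetParentLink are all assumptions for the example, and _contentRepository is the injected IContentRepository. Note that depending on your configuration, property values in the returned JSON may be wrapped in objects rather than returned as flat values, so adjust the DTO accordingly:

using System.Collections.Generic;
using EPiServer.DataAccess;
using EPiServer.Security;
using Newtonsoft.Json;

// Illustrative DTO containing only the fields this example maps
public class SourceProductPage
{
    public string Name { get; set; }
    public string Heading { get; set; } // assumed custom property
}

// Inside the job, after fetching a page of results as resultsJson:
var sourcePages = JsonConvert.DeserializeObject<List<SourceProductPage>>(resultsJson);
foreach (var source in sourcePages)
{
    // targetParentLink is the ContentReference of the destination container
    var page = _contentRepository.GetDefault<ProductPage>(targetParentLink);
    page.Name = source.Name;
    page.Heading = source.Heading; // TODO: map the remaining properties
    _contentRepository.Save(page, SaveAction.Publish, AccessLevel.NoAccess);
}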
Migrating media files
The concept of media migration is the same as migrating pages/blocks, but with an extra step: you need to save the source file from its URL as a Stream and then write it to a blob. The code snippet below migrates a single PDF file to the target destination.
// Create a new PDF file, named after the source document
var pdfFile = _contentRepository.GetDefault<PdfFile>(parentFolderReference);
pdfFile.Name = sourceDocument.Name;

// Create a blob in the binary container
var blob = _blobFactory.CreateBlob(pdfFile.BinaryDataContainer, ".pdf");

// Use System.Net.Http.HttpClient to get the stream from the source document's URL
var resultStream = httpClient.GetStreamAsync(sourceDocument.Url).Result;

// Write the stream directly to the blob and assign it to the file
blob.Write(resultStream);
pdfFile.BinaryData = blob;

// TODO: map the remaining properties

// Publish the file
var pdfRef = _contentRepository.Save(pdfFile, SaveAction.Publish, AccessLevel.NoAccess);
Conclusion
When it comes to data migration, each project has its own unique problems. The purpose of this article is simply to add another option, so we can consider whether it’s suitable for the task or not. Like any approach, this one has its pros and cons.
Pros
- You don’t have to create any logic or API endpoints for the migration in the source application. Installing the packages is enough for most cases.
- If you’re pulling data from the live site, the content will always be the latest. You don’t have to export the production database every time you run the migration.
Cons
- For security reasons, you might want to add authentication to the API as well as configure the CORS policy.
- When the data migration is running, it might affect the performance of the source site, so consider running the job outside office hours, applying pagination to your queries, etc.