
Using Optimizely Content Delivery API for data migration


Optimizely Content Delivery API is well known as the foundation for headless CMS scenarios, connecting Content Cloud to single-page apps, mobile apps, and so on. However, I have also found it handy for data migration. With the Content Delivery API, you can easily retrieve detailed information about source content in JSON format and map it to your target application. In this article, I will give brief instructions on how to migrate data between Optimizely Content Cloud applications using the API.

Prerequisites

The Content Delivery API and Content Search API need to be installed and configured properly in your source application.

Note:

The Content Search API requires a CORS policy in order to work; please refer to this document for details on the configuration. You also have to reindex the site after installation.

Migrating pages/blocks

Once the Content Delivery API is installed in the source application, we create something like a scheduled job in the target application. The job pulls the latest data from the source application by calling the API and then maps it to the new content types of the target app.

To retrieve a list of content of a specific type (e.g. ProductPage), call this endpoint with the top and skip parameters for pagination:

https://yoursite.com/api/episerver/v3.0/search/content?filter=ContentType/any(t:t eq 'ProductPage')&top=10&orderby=name&skip=10

Note:

The REST API version of the Content Delivery API used in this example is 3.0.0.
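To walk the whole result set, the migration job can generate one request URL per page by incrementing skip in steps of top. A minimal sketch (the host name, page size, and the BuildSearchUrls helper are my own illustration, not part of the API):

```csharp
using System;
using System.Collections.Generic;

// Builds one paged search URL per "page" of results, stepping skip by pageSize.
static IEnumerable<string> BuildSearchUrls(string host, string contentType, int pageSize, int totalItems)
{
    for (var skip = 0; skip < totalItems; skip += pageSize)
    {
        yield return $"https://{host}/api/episerver/v3.0/search/content" +
                     $"?filter=ContentType/any(t:t eq '{contentType}')" +
                     $"&orderby=name&top={pageSize}&skip={skip}";
    }
}

// 25 items with a page size of 10 => three requests (skip=0, 10, 20)
foreach (var url in BuildSearchUrls("yoursite.com", "ProductPage", 10, 25))
{
    Console.WriteLine(url);
}
```

In practice you would keep requesting pages until the API returns fewer items than the page size, rather than knowing the total up front.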

Once the returned JSON string is deserialized into a list of C# objects, you can iterate through the list and do the mapping to your new content type’s properties.
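The deserialization step can be done with System.Text.Json. The sketch below uses a deliberately simplified ContentItem shape and inline sample JSON; the real Content Delivery API response carries many more fields:

```csharp
using System;
using System.Collections.Generic;
using System.Text.Json;

var json = """
[
  { "name": "Product A", "url": "https://yoursite.com/products/a/" },
  { "name": "Product B", "url": "https://yoursite.com/products/b/" }
]
""";

// The API returns camelCase property names, so match case-insensitively.
var options = new JsonSerializerOptions { PropertyNameCaseInsensitive = true };
var items = JsonSerializer.Deserialize<List<ContentItem>>(json, options) ?? new();

foreach (var item in items)
{
    // This is where you would map each item onto the target content type
    Console.WriteLine($"{item.Name} -> {item.Url}");
}

// Simplified shape for illustration; not the full API contract.
public record ContentItem(string Name, string Url);
```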

Migrating media files

The concept of media migration is the same as for pages/blocks, but with an extra step: you need to save the source file from its URL as a Stream and then write it to a blob. The code snippet below migrates a single PDF file to the target destination.

// _contentRepository (IContentRepository), _blobFactory (IBlobFactory) and
// httpClient (HttpClient) are assumed to be injected into the job.

// Create a new PDF file, named after the source document
var pdfFile = _contentRepository.GetDefault<PdfFile>(parentFolderReference);
pdfFile.Name = sourceDocument.Name;

// Create a blob in the file's binary container
var blob = _blobFactory.CreateBlob(pdfFile.BinaryDataContainer, ".pdf");

// Download the source file as a stream (blocking here for brevity;
// prefer await in an async context)
using (var resultStream = httpClient.GetStreamAsync(sourceDocument.Url).Result)
{
    // Write the stream directly to the blob and assign it to the file
    blob.Write(resultStream);
}
pdfFile.BinaryData = blob;

// TODO Mapping properties

// Publish the file
var pdfRef = _contentRepository.Save(pdfFile, SaveAction.Publish, AccessLevel.NoAccess);
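The "TODO Mapping properties" step is usually a field-by-field copy from the deserialized source object onto the new content. A minimal sketch with hypothetical SourceDocument and TargetDocument types (your real content types will differ):

```csharp
using System;

var source = new SourceDocument("Spec sheet", "Product specification PDF");
var target = new TargetDocument();

// Straight copies; real migrations often also need conversions
// (dates, content references, XHTML strings, links to other migrated content).
target.Name = source.Name;
target.Description = source.Description;

Console.WriteLine($"Mapped '{target.Name}'");

// Hypothetical shapes for illustration only.
public record SourceDocument(string Name, string Description);

public class TargetDocument
{
    public string Name { get; set; } = "";
    public string Description { get; set; } = "";
}
```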

Conclusion

When it comes to data migration, each project has its own unique problems. The purpose of this article is simply to add another option to consider when deciding whether it suits the task. Like any other approach, this one has its pros and cons.

Pros

  • You don’t have to create any logic or API endpoints for the migration in the source application; installing the packages is enough for most cases.
  • If you’re pulling data from the live site, the content will always be the latest. You don’t have to export the production database every time you run the migration.

Cons

  • For security reasons, you might want to authenticate the API as well as configure the CORS policy.
  • While the data migration is running, it might affect the performance of the source site, so consider running the job outside office hours, applying pagination to your queries, etc.


Tung Tran

Tung is an Optimizely Certified Content Cloud Developer with experience working with .NET technologies. He is a strong engineering professional skilled in developing both Windows and web applications. In his free time, he enjoys watching and playing soccer, and spending time with his small family.
