In Part 2, we integrated Azure AI Search with Azure Personalizer to build a smarter, user-focused experience in Optimizely CMS. We used ServiceAPI to send CMS content to Azure AI Search. In Part 3, we’ll take things a step further: enabling users to ask natural language questions and get AI-powered answers. But here’s the twist: instead of using ServiceAPI, this time we’ll explore how to query indexed content directly using Azure AI Search’s REST API.
Sometimes users don’t know the exact keywords. They ask things like:
“How do I return a product?” “Can I book a demo?”
With OpenAI on Azure and the semantic capabilities of Azure AI Search, your website can now understand those queries and provide helpful, contextual responses.
You can directly access indexed content using the Azure Search REST API. Here’s an example of how to fetch the top 5 results:
public async Task<List<string>> SearchAzureContentAsync(string query)
{
    var searchServiceName = "<your-search-service-name>";
    var indexName = "<your-index-name>";
    var apiKey = "<your-api-key>";

    using var client = new HttpClient();
    client.DefaultRequestHeaders.Add("api-key", apiKey);

    // Escape the query so spaces and special characters survive the URL.
    var url = $"https://{searchServiceName}.search.windows.net/indexes/{indexName}/docs" +
              $"?api-version=2021-04-30-Preview&search={Uri.EscapeDataString(query)}&$top=5";

    var result = await client.GetStringAsync(url);
    var json = JsonDocument.Parse(result);

    var docs = new List<string>();
    foreach (var item in json.RootElement.GetProperty("value").EnumerateArray())
    {
        docs.Add(item.GetProperty("content").GetString());
    }

    return docs;
}
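The call above is a plain keyword query. If your search service has semantic ranking enabled, the same endpoint can return semantically re-ranked results. Here is a minimal sketch of the modified URL; the semantic configuration name ("default") is an assumption, and depending on your service you may need a newer api-version than the one used above:

// Assumption: the index defines a semantic configuration named "default" and the service tier supports semantic ranking.
var url = $"https://{searchServiceName}.search.windows.net/indexes/{indexName}/docs" +
          $"?api-version=2021-04-30-Preview" +
          $"&search={Uri.EscapeDataString(query)}" +
          "&queryType=semantic" +
          "&semanticConfiguration=default" +
          "&queryLanguage=en-us" +
          "&$top=5";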
Once we retrieve relevant documents, we’ll pass them into Azure OpenAI to generate a contextual answer:
public async Task<string> AskOpenAiAsync(string question, List<string> context)
{
    var prompt = $"""
        You are a helpful assistant. Based on the following content, answer the question:

        {string.Join("\n\n", context)}

        Question: {question}
        Answer:
        """;

    var openAiKey = "<your-openai-key>";
    var endpoint = "https://<your-openai-endpoint>.openai.azure.com/openai/deployments/<deployment-name>/completions?api-version=2022-12-01";

    var payload = new { prompt = prompt, temperature = 0.7, max_tokens = 200 };

    using var client = new HttpClient();
    client.DefaultRequestHeaders.Add("api-key", openAiKey);

    var response = await client.PostAsync(endpoint,
        new StringContent(JsonSerializer.Serialize(payload), Encoding.UTF8, "application/json"));

    var result = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
    return result.RootElement.GetProperty("choices")[0].GetProperty("text").GetString();
}
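The snippet above targets the older completions endpoint. If your Azure OpenAI deployment is a chat model (for example gpt-35-turbo or gpt-4o), the same method can instead call the chat completions endpoint; a rough sketch of the replacement URL and payload, with the endpoint, deployment name, and api-version as placeholders you would adjust:

// Sketch only: chat-style deployment. Replace the endpoint, deployment name, and api-version with your own values.
var chatEndpoint = "https://<your-openai-endpoint>.openai.azure.com/openai/deployments/<chat-deployment-name>/chat/completions?api-version=2024-02-01";
var chatPayload = new
{
    messages = new object[]
    {
        new { role = "system", content = "You are a helpful assistant. Answer using only the provided content." },
        new { role = "user", content = $"{string.Join("\n\n", context)}\n\nQuestion: {question}" }
    },
    temperature = 0.7,
    max_tokens = 200
};
// Post it with the same HttpClient pattern as above and read choices[0].message.content from the JSON response.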
You can create a controller like this:
public class QnAController : Controller
{
    [HttpPost]
    public async Task<ActionResult> Ask(string question)
    {
        // SearchAzureContentAsync and AskOpenAiAsync are the helper methods shown above,
        // assumed to live on (or be injected into) this controller.
        // The view posts the question as form data, so the default model binder maps it here.
        var docs = await SearchAzureContentAsync(question);
        var answer = await AskOpenAiAsync(question, docs);
        return Json(new { answer });
    }
}
And on your view:
<form method="post" id="qnaForm"> <input type="text" name="question" placeholder="Ask a question..." /> <button type="submit">Ask</button> </form> <div id="answer"></div> <script> $('#qnaForm').submit(function(e) { e.preventDefault(); const question = $('input[name=question]').val(); fetch('/QnA/Ask', { method: 'POST', headers: {'Content-Type': 'application/json'}, body: JSON.stringify({ question }) }) .then(r => r.json()) .then(data => $('#answer').text(data.answer)); }); </script>
This wraps up the final puzzle piece: letting users speak freely with your Optimizely site, while AI interprets and responds in real-time. From content indexing to re-ranking to full-on Q&A, your CMS is now intelligent, conversational, and user-first. Want to see this in action? Stay tuned for the sample repo and video walkthrough!
This blog is also published here: Natural Language Q&A in Optimizely CMS Using Azure OpenAI and AI Search.
In the last blog, we discussed integrating the Optimizely CMS website with Azure AI Search. Now let’s move on to a more advanced topic: serving a personalized experience by combining Azure AI Search with Azure Personalizer. Together, they enable you to serve dynamic, customized content and search results based on user behaviour, preferences, and context.
Azure Personalizer is a Cognitive Service for real-time personalization using reinforcement learning. It gives you the ability to serve the content or experiences that are most relevant to a user, informed by past behaviour and current context.
Benefits of AI Personalizer:
Model for Rankable Action
public class RankableDocument
{
    public string Id { get; set; }
    public string Title { get; set; }
    public string Summary { get; set; }
    public string Category { get; set; }
}
Send Info to Personalizer with Context:
private object GetUserContext(HttpRequestBase request)
{
    return new
    {
        timeOfDay = DateTime.Now.Hour,
        device = request.Browser.IsMobileDevice ? "mobile" : "desktop",
        userAgent = request.UserAgent,
        language = request.UserLanguages?.FirstOrDefault() ?? "en"
    };
}

public async Task<List<RankableDocument>> GetPersonalizedResultsAsync(List<RankableDocument> documents, string userId)
{
    var contextFeatures = new[] { GetUserContext(Request) };

    // Each search hit becomes a rankable "action" with its own features.
    // The features array mixes anonymous types, so it must be typed as object[].
    var actions = documents.Select(doc => new
    {
        id = doc.Id,
        features = new object[] { new { category = doc.Category }, new { title = doc.Title } }
    });

    _eventId = Guid.NewGuid().ToString();

    var request = new
    {
        contextFeatures = contextFeatures,
        actions = actions,
        excludedActions = new string[] { },
        eventId = _eventId,
        deferActivation = false
    };

    using var client = new HttpClient();
    client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", "--YOUR API KEY ---");

    var response = await client.PostAsync("--Endpoint--/personalizer/v1.0/rank",
        new StringContent(JsonSerializer.Serialize(request), Encoding.UTF8, "application/json"));

    var result = JsonDocument.Parse(await response.Content.ReadAsStringAsync());

    // Personalizer returns the action it recommends showing first.
    var topActionId = result.RootElement.GetProperty("rewardActionId").GetString();

    // Move the recommended document to the top of the list.
    return documents.OrderByDescending(d => d.Id == topActionId).ToList();
}
Now let’s take the search page controller and view from our previous example and extend them.
Search Controller
public class AzureSearchPageController : PageController<AzureSearchPage>
{
    private static string _eventId;

    public async Task<ActionResult> Index(AzureSearchPage currentPage, string q = "")
    {
        var results = new List<RankableDocument>();

        if (!string.IsNullOrEmpty(q))
        {
            var url = $"https://<search-service>.search.windows.net/indexes/<index-name>/docs?api-version=2021-04-30-Preview&search={Uri.EscapeDataString(q)}";

            using var client = new HttpClient();
            client.DefaultRequestHeaders.Add("api-key", "<your-query-key>");

            var response = await client.GetStringAsync(url);
            var doc = JsonDocument.Parse(response);

            results = doc.RootElement.GetProperty("value")
                .EnumerateArray()
                .Select(x => new RankableDocument
                {
                    Id = x.GetProperty("id").GetString(),
                    Title = x.GetProperty("name").GetString(),
                    Category = x.GetProperty("type").GetString(),
                    Summary = x.GetProperty("content").GetString()
                }).ToList();

            // Re-rank the raw search results with Personalizer.
            results = await GetPersonalizedResultsAsync(results, "user123");
        }

        ViewBag.Results = results;
        ViewBag.Query = q;
        ViewBag.EventId = _eventId;
        return View(currentPage);
    }

    [HttpPost]
    public async Task<ActionResult> Reward(string eventId, double rewardScore)
    {
        using var client = new HttpClient();
        client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", "<your-api-key>");
        var rewardUrl = $"<your-endpoint>/personalizer/v1.0/events/{eventId}/reward";

        // The Reward API expects the reward in a JSON body, for example: { "value": 1.0 }
        var body = JsonSerializer.Serialize(new { value = rewardScore });
        var result = await client.PostAsync(rewardUrl, new StringContent(body, Encoding.UTF8, "application/json"));

        return Json(new { success = result.IsSuccessStatusCode });
    }
}
Search Page View
@model AzureSearchPage

<h1>Personalized Search Results</h1>
<form method="get">
    <input type="text" name="q" value="@ViewBag.Query" placeholder="Search..." />
    <button type="submit">Search</button>
</form>
<ul>
    @foreach (var result in ViewBag.Results as List<RankableDocument>)
    {
        <li>
            <h4>@result.Title</h4>
            <p>@result.Summary</p>
            <button onclick="sendReward('@ViewBag.EventId', 1.0)">Like</button>
            <button onclick="sendReward('@ViewBag.EventId', 0.0)">Not Relevant</button>
        </li>
    }
</ul>

<script>
    function sendReward(eventId, score) {
        // Post as form data so the Reward action's simple parameters bind correctly.
        fetch('/AzureSearchPage/Reward', {
            method: 'POST',
            body: new URLSearchParams({ eventId: eventId, rewardScore: score })
        }).then(r => {
            if (r.ok) alert("Thanks! Your feedback was recorded.");
        });
    }
</script>
With Azure AI Search delivering relevant results and Azure Personalizer re-ranking them based on real-time context, your Optimizely site becomes an intelligent experience engine.
This blog has also been published here.
Want to elevate your Optimizely PaaS CMS site’s search capabilities? Azure AI Search could be just the tool you need! In this blog, I’ll discuss how to connect your CMS with Microsoft’s advanced AI-driven search platform to create fast, smart search experiences that surpass regular keyword searches.
Azure AI Search is Microsoft’s cloud-based search service powered by AI. It enables you to index, search, and analyze extensive amounts of content utilizing full-text searches, faceted navigation, and machine-learning features (such as language comprehension and semantic search).
Why it’s great
In short: it’s a smart search made user-friendly.
Before we get into the benefits, let’s take a moment to consider how Azure AI Search compares to Optimizely’s native search functionalities. Optimizely Search (which relies on Lucene or Find/Search & Navigation) works well for straightforward keyword searches and basic filters, and it’s closely tied to the CMS. However, it doesn’t offer the advanced AI features, scalability, or flexibility that Azure provides right off the bat. Azure AI Search enriches the search experience with functionalities like semantic search, cognitive enhancements, and external data indexing, making it perfect for enterprise-level sites with intricate search requirements.
Here’s why merging these two solutions is beneficial:
To set up Azure AI Search, just follow these steps:
Once created, make sure to note down the Search Service Name and Admin API Key – you’ll need these to push content into the index and to query it later.
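Before any content can be pushed, the target index itself has to exist. Below is a minimal sketch of creating it through the REST API; the index name ("optimizely-content") is an assumption, and the field list is chosen to match the sync job shown later (id, name, content, type):

// Sketch: create (or update) the index definition. Adjust the index name and fields to your own schema.
public async Task CreateIndexAsync()
{
    var indexDefinition = new
    {
        name = "optimizely-content",
        fields = new object[]
        {
            new { name = "id",      type = "Edm.String", key = true },
            new { name = "name",    type = "Edm.String", searchable = true },
            new { name = "content", type = "Edm.String", searchable = true },
            new { name = "type",    type = "Edm.String", filterable = true, facetable = true }
        }
    };

    using var client = new HttpClient();
    client.DefaultRequestHeaders.Add("api-key", "<your-admin-key>");

    var response = await client.PutAsync(
        "https://<search-service>.search.windows.net/indexes/optimizely-content?api-version=2021-04-30-Preview",
        new StringContent(JsonSerializer.Serialize(indexDefinition), Encoding.UTF8, "application/json"));

    response.EnsureSuccessStatusCode();
}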
By utilizing the Optimizely ServiceAPI, we can effectively get updated content and synchronize it with Azure AI Search. This process avoids the need to re-index the entire site, which helps boost performance.
[ScheduledPlugIn(DisplayName = "Sync Updated Content to Azure Search")]
public class AzureSearchJob : ScheduledJobBase
{
    private readonly HttpClient _httpClient;
    private readonly string _serviceApiBaseUrl = "https://yourwebsite.com/episerverapi/content/";

    public AzureSearchJob()
    {
        _httpClient = new HttpClient();
        IsStoppable = true;
    }

    public override string Execute()
    {
        // Step 1: Get content updated in the last 24 hours
        var yesterday = DateTime.UtcNow.AddDays(-1).ToString("o");
        var contentApiUrl = $"{_serviceApiBaseUrl}?updatedAfter={Uri.EscapeDataString(yesterday)}";

        var response = _httpClient.GetAsync(contentApiUrl).Result;
        if (!response.IsSuccessStatusCode)
            return "Failed to fetch updated content from ServiceAPI.";

        var contentJson = response.Content.ReadAsStringAsync().Result;
        var documents = JsonSerializer.Deserialize<JsonElement>(contentJson).EnumerateArray()
            .Select(content => new Dictionary<string, object>
            {
                ["id"] = content.GetProperty("ContentGuid").ToString(),
                ["name"] = content.GetProperty("Name").GetString(),
                ["content"] = content.GetProperty("ContentLink").GetRawText(),
                ["type"] = content.GetProperty("ContentTypeName").GetString()
            }).ToList();

        // Step 2: Push to Azure AI Search
        var json = JsonSerializer.Serialize(new { value = documents });
        var request = new HttpRequestMessage(HttpMethod.Post,
            "https://servicename.search.windows.net/indexes/<index-name>/docs/index?api-version=2021-04-30-Preview")
        {
            Content = new StringContent(json, Encoding.UTF8, "application/json")
        };
        request.Headers.Add("api-key", "<your-admin-key>");

        var result = _httpClient.SendAsync(request).Result;
        return result.IsSuccessStatusCode ? "Success" : "Failed to index in Azure Search.";
    }
}
You can filter and transform the ServiceAPI response further to match your index schema.
Create a new page type to serve as a Search Results page.
[ContentType(DisplayName = "Search Results Page",
    GUID = "3C918F3E-D82B-480B-9FD8-A3A1DA3ECB1B",
    Description = "Search using Azure Search")]
public class AzureSearchPage : PageData
{
    [Display(Name = "Search Placeholder")]
    public virtual string PlaceholderText { get; set; }
}
public class AzureSearchPageController : PageController<AzureSearchPage>
{
    public ActionResult Index(AzureSearchPage currentPage, string q = "")
    {
        var results = new List<string>();

        if (!string.IsNullOrEmpty(q))
        {
            // Query the index directly using the read-only query key.
            var url = $"https://<search-service>.search.windows.net/indexes/<index-name>/docs?api-version=2021-04-30-Preview&search={Uri.EscapeDataString(q)}";

            using var client = new HttpClient();
            client.DefaultRequestHeaders.Add("api-key", "<your-query-key>");

            var response = client.GetStringAsync(url).Result;
            var doc = JsonDocument.Parse(response);

            results = doc.RootElement.GetProperty("value")
                .EnumerateArray()
                .Select(x => x.GetProperty("name").GetString())
                .ToList();
        }

        ViewBag.Results = results;
        ViewBag.Query = q;
        return View(currentPage);
    }
}
@model AzureSearchPage
@{
    Layout = "~/Views/Shared/_Layout.cshtml";
}

<h1>Search Results</h1>
<form method="get">
    <input type="text" name="q" value="@ViewBag.Query" placeholder="@Model.PlaceholderText" />
    <button type="submit">Search</button>
</form>
<ul>
    @foreach (var result in ViewBag.Results as List<string>)
    {
        <li>@result</li>
    }
</ul>
Integrating Azure AI Search with Optimizely CMS can truly take your site search from basic to brilliant. With a bit of setup and some clean code, you’re empowering users with fast, smart, and scalable content discovery.
This blog is also published here
I’m now a couple of months into exploring Optimizely Configured Commerce and Spire CMS. As much as I’m up to speed with the Configured Commerce side of things (having past experience with Customized Commerce), the Spire CMS side is a bit daunting, having worked with traditional Optimizely CMS for a while. One of the challenges is figuring out handlers, a key concept in both Customized Commerce and Spire CMS.
And yes, there is documentation, but it’s more high-level and not enough to understand the inner workings of the code (or maybe I just haven’t had the patience to go through it all yet :)).
Needless to say, I took a rather “figure it out by myself” approach here. I find that this is a much better way to learn and remember stuff :).
In a commerce site, there is an Order History for every customer, with a “Reorder” capability. I will tweak the behavior of this Reorder action to prevent a specific SKU from being added to the cart again when the user clicks “Reorder”.
Depending on what you are looking for and what you need to change, this can involve different files in the Frontend source code.
I start by searching on keywords like “reorder” which do lead me to some files but they are mostly .tsx files aka React components that had the Reorder button on them. What I’m looking for instead is the actual method that passes the current order lines to add to cart, in order to intercept and tweak.
I decided it was time to put my browser skills to good use. I launch the site, open Dev Tools, and hit Reorder to monitor all the network calls that occur. And bravo… I see the API call to the Cart API for a bulk load, which is what this action does. Here’s what that looks like:
api/v1/carts/current/cartlines/batch
with a payload of cart lines to be added to the cart.
Step #1 – I traced this back in code. Looked for “cartlines/batch” and found 1 file – CartService.ts
It’s OOTB code, but people who are new to this, like me, don’t know which folder has what. So, I’ll make this one step easier for you by telling you exactly where this file lives. You will find it at
FrontEnd\modules\client-framework\src\Services\CartService.ts
The method that makes the api call is addLineCollection(parameter: AddCartLinesApiParameter).
Step #2 – I now search for files that called this method. I found quite a few files that call this, but for my specific scenario, I stuck to the ones that said “reorder” specifically. These are the Frontend Handlers in Spire CMS.
Here’s the list and paths of the files that are relevant to the context here :
Once I see the line that makes the call to addLineCollection() method, I check how the parameter is being set.
Step #3 – All that’s left now is to update the code that sets the AddCartLinesApiParameter for this call from the existing order’s order lines. On the OrderLines collection, I add a filter to exclude the one specific SKU that I don’t want re-added to the cart on reorder, before the lines are passed to addLineCollection().
In today’s mobile-first world, delivering personalized experiences to visitors using mobile devices is crucial for maximizing engagement and conversions. Optimizely’s powerful experimentation and personalization platform allows you to define custom audience criteria to target mobile users effectively.
By leveraging Optimizely’s audience segmentation, you can create tailored experiences based on factors such as device type, operating system, screen size, and user behavior. Whether you want to optimize mobile UX, test different layouts, or personalize content for Android vs. iOS users, understanding how to define mobile-specific audience criteria can help you drive better results.
In this blog, we’ll explore how to set up simple custom audience criteria for mobile visitors in Optimizely, the key benefits of mobile targeting, and the best practices to enhance user experiences across devices. Let’s dive in!
This solution is based on Example – Create audience criteria, which you can find in the Optimizely documentation.
First, we need to create two classes in our solution:
Class VisitorDeviceTypeCriterionSettings needs to inherit the CriterionModelBase class, and we need only one property (settings) to determine if the visitor is using a desktop or a mobile device:
public bool IsMobile { get; set; }
The abstract CriterionModelBase class requires you to implement the Copy() method. Because you are not using complex reference types, you can implement it by returning a shallow copy as shown (see Create custom audience criteria):
public override ICriterionModel Copy() { return base.ShallowCopy(); }
The entire class will look something like this:
using EPiServer.Data.Dynamic;
using EPiServer.Personalization.VisitorGroups;

namespace AlloyTest.Personalization.Criteria
{
    [EPiServerDataStore(AutomaticallyRemapStore = true)]
    public class VisitorDeviceTypeCriterionSettings : CriterionModelBase
    {
        public bool IsMobile { get; set; }

        public override ICriterionModel Copy()
        {
            // if this class has reference types that require deep copying, then
            // that implementation belongs here. Otherwise, you can just rely on
            // shallow copy from the base class
            return base.ShallowCopy();
        }
    }
}
Now, we need to implement the criterion class VisitorDeviceTypeCriterion and inherit the abstract CriterionBase class with the settings class as the type parameter:
public class VisitorDeviceTypeCriterion : CriterionBase<VisitorDeviceTypeCriterionSettings>
Add a VisitorGroupCriterion attribute to set the category, name, and description of the criterion (for more available VisitorGroupCriterion properties, see Create custom audience criteria):
[VisitorGroupCriterion(
    Category = "MyCustom",
    DisplayName = "Device Type",
    Description = "Criterion that matches type of the user's device"
)]
The abstract CriterionBase class requires you to implement an IsMatch() method that determines whether the current user matches this audience criterion. In this case, we need to determine from which device the visitor is accessing our site. Because Optimizely doesn’t provide this out of the box, we need to figure out that part.
One of the solutions is to use information from the request header, specifically the User-Agent field, and analyze it to determine the OS and device type. We can do that by writing our match method:
public virtual bool MatchBrowserType(string userAgent)
{
    // Guard against a missing or very short User-Agent header
    // (the Substring call below assumes at least 4 characters).
    if (string.IsNullOrEmpty(userAgent) || userAgent.Length < 4)
    {
        return false;
    }

    var os = new Regex(
        @"(android|bb\d+|meego).+mobile|avantgo|bada\/|blackberry|blazer|compal|elaine|fennec|hiptop|iemobile|ip(hone|od|ad)|iris|kindle|lge |maemo|midp|mmp|mobile.+firefox|netfront|opera m(ob|in)i|palm( os)?|phone|p(ixi|re)\/|plucker|pocket|psp|series(4|6)0|symbian|treo|up\.(browser|link)|vodafone|wap|windows ce|xda|xiino",
        RegexOptions.IgnoreCase | RegexOptions.Multiline);

    var device = new Regex(
        @"1207|6310|6590|3gso|4thp|50[1-6]i|770s|802s|a wa|abac|ac(er|oo|s\-)|ai(ko|rn)|al(av|ca|co)|amoi|an(ex|ny|yw)|aptu|ar(ch|go)|as(te|us)|attw|au(di|\-m|r |s )|avan|be(ck|ll|nq)|bi(lb|rd)|bl(ac|az)|br(e|v)w|bumb|bw\-(n|u)|c55\/|capi|ccwa|cdm\-|cell|chtm|cldc|cmd\-|co(mp|nd)|craw|da(it|ll|ng)|dbte|dc\-s|devi|dica|dmob|do(c|p)o|ds(12|\-d)|el(49|ai)|em(l2|ul)|er(ic|k0)|esl8|ez([4-7]0|os|wa|ze)|fetc|fly(\-|_)|g1 u|g560|gene|gf\-5|g\-mo|go(\.w|od)|gr(ad|un)|haie|hcit|hd\-(m|p|t)|hei\-|hi(pt|ta)|hp( i|ip)|hs\-c|ht(c(\-| |_|a|g|p|s|t)|tp)|hu(aw|tc)|i\-(20|go|ma)|i230|iac( |\-|\/)|ibro|idea|ig01|ikom|im1k|inno|ipaq|iris|ja(t|v)a|jbro|jemu|jigs|kddi|keji|kgt( |\/)|klon|kpt |kwc\-|kyo(c|k)|le(no|xi)|lg( g|\/(k|l|u)|50|54|\-[a-w])|libw|lynx|m1\-w|m3ga|m50\/|ma(te|ui|xo)|mc(01|21|ca)|m\-cr|me(rc|ri)|mi(o8|oa|ts)|mmef|mo(01|02|bi|de|do|t(\-| |o|v)|zz)|mt(50|p1|v )|mwbp|mywa|n10[0-2]|n20[2-3]|n30(0|2)|n50(0|2|5)|n7(0(0|1)|10)|ne((c|m)\-|on|tf|wf|wg|wt)|nok(6|i)|nzph|o2im|op(ti|wv)|oran|owg1|p800|pan(a|d|t)|pdxg|pg(13|\-([1-8]|c))|phil|pire|pl(ay|uc)|pn\-2|po(ck|rt|se)|prox|psio|pt\-g|qa\-a|qc(07|12|21|32|60|\-[2-7]|i\-)|qtek|r380|r600|raks|rim9|ro(ve|zo)|s55\/|sa(ge|ma|mm|ms|ny|va)|sc(01|h\-|oo|p\-)|sdk\/|se(c(\-|0|1)|47|mc|nd|ri)|sgh\-|shar|sie(\-|m)|sk\-0|sl(45|id)|sm(al|ar|b3|it|t5)|so(ft|ny)|sp(01|h\-|v\-|v )|sy(01|mb)|t2(18|50)|t6(00|10|18)|ta(gt|lk)|tcl\-|tdg\-|tel(i|m)|tim\-|t\-mo|to(pl|sh)|ts(70|m\-|m3|m5)|tx\-9|up(\.b|g1|si)|utst|v400|v750|veri|vi(rg|te)|vk(40|5[0-3]|\-v)|vm40|voda|vulc|vx(52|53|60|61|70|80|81|83|85|98)|w3c(\-| )|webc|whit|wi(g |nc|nw)|wmlb|wonu|x700|yas\-|your|zeto|zte\-",
        RegexOptions.IgnoreCase | RegexOptions.Multiline);

    var deviceInfo = string.Empty;

    if (os.IsMatch(userAgent))
    {
        deviceInfo = os.Match(userAgent).Groups[0].Value;
    }

    if (device.IsMatch(userAgent.Substring(0, 4)))
    {
        deviceInfo += device.Match(userAgent).Groups[0].Value;
    }

    return !string.IsNullOrEmpty(deviceInfo);
}
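As a quick sanity check, the method can be exercised with a couple of representative User-Agent strings; this is a hypothetical snippet (for example, from a unit test), and the expected results follow from the regexes above:

// Hypothetical usage; the User-Agent values are illustrative examples.
var criterion = new VisitorDeviceTypeCriterion();

// "iPhone" is matched by the first (OS) regex, so this returns true.
var mobileResult = criterion.MatchBrowserType(
    "Mozilla/5.0 (iPhone; CPU iPhone OS 16_0 like Mac OS X) AppleWebKit/605.1.15");

// A desktop Chrome User-Agent matches neither regex, so this returns false.
var desktopResult = criterion.MatchBrowserType(
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0 Safari/537.36");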
Now, we can go back and implement the IsMatch() method that is required by the abstract CriterionBase class.
public override bool IsMatch(IPrincipal principal, HttpContext httpContext)
{
    return MatchBrowserType(httpContext.Request.Headers["User-Agent"].ToString());
}
In the CMS, we need to create a new audience criterion. When you click the ‘Add Criteria’ button, there will be a ‘MyCustom’ criteria group containing our criterion.
When you select the ‘Device Type’ criterion, you will see its settings, which at this point consist of the IsMobile checkbox without a friendly label.
We can easily add a label for the checkbox by using Optimizely’s translation functionality. Create a new XML file, VisitorGroupCriterion.xml, and place it in the folder where your translation files live (for example, Resources/LanguageFiles in an Alloy-based site).
Put this into the file that you created:
<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<languages>
  <language name="English" id="en-us">
    <visitorgroups>
      <criteria>
        <ismobile>
          <key>Is Mobile Device (Use this setting to show content only on Mobile)</key>
        </ismobile>
      </criteria>
    </visitorgroups>
  </language>
</languages>
There is one more thing to do. In VisitorDeviceTypeCriterionSettings.cs, decorate the IsMobile property with the translation definition. Add this attribute:
[CriterionPropertyEditor(LabelTranslationKey = "/visitorgroups/criteria/ismobile/key")]
It should look like this:
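[CriterionPropertyEditor(LabelTranslationKey = "/visitorgroups/criteria/ismobile/key")]
public bool IsMobile { get; set; }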
Now, in the editor view, we have a label for the checkbox.
Personalize the content by setting the content for this visitor group.
Desktop view:
Mobile view:
You can see that there is content that is only visible if you access the site with a mobile device.
And that’s it!
This series of blog posts will cover the main areas of activity for your marketing, product, and UX teams before, during, and after site migration to a new digital experience platform.
Migrating your site to a different platform can be a daunting prospect, especially if the site is sizable in both page count and number of assets, such as documents and images. However, this can also be a perfect opportunity to freshen up your content, perform an asset library audit, and reorganize the site overall.
Once you’ve hired a consultant, like Perficient, to help you implement your new CMS and migrate your content over, you will work with them to identify several action items your team will need to tackle to ensure successful site migration.
Whether you are migrating from or to one of the major enterprise digital experience platforms like Sitecore, Optimizely, or Adobe, or from the likes of SharePoint or WordPress, there are some common steps to take to make sure content migration runs smoothly and is executed in a manner that adds value to your overall web experience.
One of the first questions you will need to answer is, “What do we need to carry over?” The instinctive answer would be everything. The rational answer is that we will migrate the site over as is and then worry about optimization later. There are multiple reasons why this is usually not the best option.
Even though this activity might take time, it is essential to use this opportunity in the best possible manner. A consultant like Perficient can help drive the process. They will pull up an initial list of active pages, set up simple audit steps, and ensure that decisions are recorded clearly and organized.
The first step is to ensure all current site pages are accounted for. As simple as this may seem, it doesn’t always end up being so, especially on large multi-language sites. You might have pages that are not crawlable, are temporarily unpublished, are still in progress, etc.
Depending on your current system capabilities, putting together a comprehensive list can be relatively easy. Getting a CMS export is the safest way to confirm that you have accounted for everything in the system.
Crawling tools, such as Screaming Frog, are frequently used to generate reports that can be exported for further refinement. Cross-referencing these sources will ensure you get the full picture, including anything that might be housed externally.
Once you’ve ensured that all pages made it to a comprehensive list you can easily filter, edit, and share, the fun part begins.
The next step involves reviewing and analyzing the sitemap and each page. The goal is to determine which pages will stay and which are candidates for removal. Various factors can impact this decision, from business goals, priorities, page views, conversion rates, SEO considerations, and marketing campaigns to compliance and regulations. Ultimately, it is important to assess each page’s value to the business and make decisions accordingly.
This audit will likely require input from multiple stakeholders, including subject matter experts, product owners, UX specialists, and others. It is essential to involve all interested parties at an early stage. Securing buy-in from key stakeholders at this point is critical for the following phases of the process. This especially applies to review and sign-off prior to going live.
Depending on your time and resources, the keep-kill-merge can either be done in full or limited to keep-kill. The merge option might require additional analysis, as well as follow-up design and content work. Leaving that effort for after the site migration is completed might just be the rational choice.
Once the audit process has been completed, it is important to record findings and decisions in a simple, easily consumable format for the teams that will implement those updates. Proper documentation is essential when dealing with large sets of pages and associated content. This will inform the implementation team’s roadmap and timelines.
At this point, it is crucial to establish regular communication between a contact person (such as a product owner or content lead) and the team in charge of content migration from the consultant side. This partnership will ensure that all subsequent activities are carried out respecting the vision and business needs identified at the onset.
Completing the outlined activities properly will help smooth the transition into the next process phase, thus setting your team up for a successful site migration.
With most systems transitioning to cloud-based environments, databases are often hosted across various cloud platforms. However, during the development cycle, there are occasions when having access to a local database environment becomes crucial, particularly for analyzing and troubleshooting issues originating in the production environment.
Sometimes, it is necessary to restore the production database to a local environment to diagnose and resolve production-related issues effectively. This allows developers to replicate and investigate issues in a controlled setting, ensuring efficient debugging and resolution.
In an Azure cloud environment, database backups are often exported as .bacpac files. To work with these databases locally, the file must be imported and restored into a local SQL Server instance. There are several methods to achieve this, including SQL Server Management Studio (SSMS) and the SqlPackage command-line utility. This article will explore the steps to import a .bacpac file into a local environment, focusing on practical and straightforward approaches.
The first approach—using SQL Server Management Studio (SSMS)—is straightforward and user-friendly. However, challenges arise when dealing with large database sizes, as the import process may fail due to resource limitations or timeouts.
The second approach, using the SqlPackage command-line, is recommended in such cases. This method offers more control over the import process, allowing for better handling of larger .bacpac files.

Importing a .bacpac File Using SqlPackage

To import the .bacpac file, run SqlPackage with the following parameters:

/tsn: The server name (IP or hostname) of your SQL Server instance, optionally followed by a port (default: 1433).
/tdn: The name of the target database (must not already exist).
/tu: SQL Server username.
/tp: SQL Server password.
/sf: The path to your .bacpac file (use the full path or ensure the terminal is in the same directory).

A typical invocation looks something like this (the server, database, credentials, and file path below are placeholders to replace with your own values):

SqlPackage /a:Import /tsn:localhost /tdn:MyLocalDatabase /tu:sa /tp:<your-password> /sf:"C:\Backups\production.bacpac" /p:DisableIndexesForDataPhase=True /p:PreserveIdentityLastValues=True

Important: Ensure the target database does not already exist, as .bacpac files can only be imported into a fresh database.

The options /p:DisableIndexesForDataPhase and /p:PreserveIdentityLastValues optimize the import process for large databases and preserve identity column values. SqlPackage provides more reliability and flexibility than SSMS, especially when dealing with more extensive databases.
Reference:
The mark of a successful website is more than just a collection of pages. It’s a structured ecosystem where every piece of content serves a purpose, or it should. When building a new site or migrating to a new platform, content mapping is a critical step that determines how information is organized, accessed, and optimized for performance. Without a thoughtful strategy, businesses risk losing valuable content, creating navigation confusion, and impacting search visibility. It should be a process that is constantly reviewed and refined.
Content mapping starts with a deep understanding of what already exists and how it needs to evolve. This process is especially important when working with both structured and unstructured data—two very different types of content that require distinct approaches. Structured data, such as product catalogs, customer profiles, or metadata, follows a defined format and is easier to categorize. Unstructured data, including blog posts, images, and videos, lacks a rigid framework and demands extra effort to classify, tag, and optimize. While structured data migration is often straightforward, unstructured content requires strategic planning to ensure it remains accessible and meaningful within a new digital experience.
A content audit is the first step in developing a solid content mapping strategy. This involves evaluating existing content to determine what should be migrated, what needs to be refined, and what should be left behind. Without this step, businesses risk carrying over outdated or redundant content, which can clutter the new site and dilute the user experience.
A well-executed audit not only catalogs content but also assesses its performance. Understanding which pages drive the most engagement and which fail to connect with audiences helps inform how content is structured in the new environment. This process also highlights gaps—areas where fresh content is needed to align with business goals or audience expectations.
Beyond performance, content audits reveal inconsistencies in voice, formatting, or taxonomy. A new site presents an opportunity to standardize these elements, ensuring that every piece of content follows best practices for branding, SEO, and user experience.
Once content is audited and mapped, the next step is defining a clear taxonomy and metadata strategy. Taxonomy refers to how content is classified and grouped, making it easier for users to navigate and find relevant information. Metadata, on the other hand, provides the structured details that power search functionality, personalization, and content recommendations.
Without proper taxonomy, even high-quality content can become buried and difficult to access. Establishing consistent tagging, categorization, and metadata ensures that content remains discoverable, whether through site search, filtering options, or AI-driven recommendations. This is particularly important when transitioning to platforms like Acquia, Sitecore, or Optimizely, where personalization and dynamic content delivery depend on well-structured metadata.
Additionally, URL consistency and redirect strategies play a crucial role in maintaining SEO authority. A content mapping plan should account for how legacy URLs will transition to the new site, preventing broken links and preserving search rankings.
Content mapping is not just about migrating existing assets—it’s about creating a structure that supports long-term digital success. The best content strategies anticipate future growth, ensuring that new content can be seamlessly integrated without disrupting site architecture.
This means designing a content model that accommodates personalization, omnichannel distribution, and AI-driven enhancements. As businesses scale, the ability to dynamically deliver content across different devices and user segments becomes increasingly important. Content mapping lays the foundation for this flexibility, making it easier to adapt and evolve without requiring constant restructuring.
A well-planned content mapping strategy transforms website migration from a logistical challenge into a strategic opportunity. By auditing existing content, defining clear taxonomy and metadata structures, and building for scalability, businesses can create a site that is not only organized but optimized for engagement and performance.
Content is the heart of any digital experience, but without proper mapping, it can become fragmented and difficult to manage. Taking the time to strategically align content with user needs, business goals, and technological capabilities ensures that a new site isn’t just a fresh coat of paint—it’s a true step forward in delivering meaningful digital experiences.
In today’s fast-paced digital landscape, businesses need robust and scalable e-commerce solutions to meet the growing demands of B2B buyers. Optimizely Commerce (formerly Insite) is a leading platform that empowers businesses with cutting-edge tools to enhance the online purchasing experience.
In this blog, we’ll explore Optimizely Commerce’s key features, which make it a powerful choice for B2B businesses, and discuss best practices for maximizing its potential.
Unlike B2C platforms, B2B commerce has complex requirements, such as bulk orders, contract pricing, and account-based purchasing. Optimizely Commerce is explicitly designed to address these challenges.
Optimizely Commerce offers a powerful and flexible solution for businesses looking to optimize their B2B e-commerce operations. By leveraging its robust features and following best practices, companies can enhance customer experiences, improve efficiency, and drive revenue growth.
Is your business ready to take B2B e-commerce to the next level? Start optimizing with Optimizely Commerce today!
The Optimizely Configured Commerce SDK and Optimizely Configured Commerce Cloud serve different but complementary purposes within the Optimizely ecosystem. Below is a breakdown of their differences to help clarify their roles:
The SDK is a toolkit developers use to build, extend, and customize Optimizely Configured Commerce solutions.
This is the fully managed, cloud-hosted environment where the Configured Commerce platform operates. It delivers scalability, security, and reliability while offloading the burden of infrastructure management from businesses.
Factor | SDK | Cloud |
---|---|---|
Purpose | Toolkit for building and customizing functionality. | Fully managed, hosted environment for the platform. |
Target Team | Developers and technical teams. | |
Customization | High flexibility for custom features and integrations. | Also supports extending the platform to a large extent, with certain limitations; day-to-day customization is primarily done through configuration. |
Management | Requires development resources to build and deploy changes. | Managed entirely by Optimizely, including updates and maintenance. |
Hosting | Local or self-hosted for development and production purposes. | Hosted by Optimizely with global availability. |
By leveraging Optimizely Cloud capabilities, you can achieve robust, scalable, and tailored eCommerce experiences with minimal operational complexity, while the SDK version gives you a more controlled, customizable website along with control over infrastructure, upgrades, and deployments.
Implementing an Optimizely Configured Commerce platform is a significant milestone for any business looking to scale its digital operations. A well-structured timeline ensures a seamless transition from planning to execution, ultimately delivering a robust eCommerce solution tailored to your needs.
The implementation involves four key phases: Prepare, Build and Verify, Go Live, and Post-Go Live. Let’s examine each phase to understand its importance and components.
Image Source from https://support.optimizely.com/hc/en-us/articles/4413199673229-Configured-Commerce-implementation-timeline
The journey begins with thorough preparation. This phase ensures all stakeholders align on the project goals, requirements, and expectations.
Businesses can minimize risks and ensure a smoother development process by investing time in preparation.
The Build & Verify phase actively constructs the platform and ensures that all functional and technical requirements are met.
This phase involves rigorous testing to verify that the platform meets business needs and performs as intended.
With development and testing completed, the project transitions to the Go Live phase, where the production environment becomes operational.
The platform officially launches, marking the achievement of months of collaboration and hard work.
The implementation process doesn’t end with the platform launch. The Post-Go Live phase ensures that businesses continuously monitor and optimize the production site for performance, scalability, and user experience. Regular maintenance and updates are vital to ensure that the platform remains robust and adaptive to evolving business needs.
A well-planned implementation time frame keeps the project on track and provides flexibility to address unexpected obstacles. Businesses can focus on delivering an efficient and effective commerce solution by breaking the process into distinct, manageable phases.
With this phased approach, implementing the Optimizely Configured Commerce platform becomes manageable. It provides a path to a scalable, high-performing, and user-friendly eCommerce experience. Proper planning, collaboration, and execution are the keys to success in this transformative journey.
Reference URL – https://support.optimizely.com/hc/en-us/articles/4413199673229-Configured-Commerce-implementation-timeline
Optimizely Spire CMS provides a feature to create variants of CMS pages. Page variants are variations of a website page with rules for displaying them to end users. Optimizely Spire CMS supports page variants for all CMS pages, including the header and footer. The primary purpose is to let different users see different variations of the same page.
Optimizely Spire CMS provides a feature to apply rule types to page variants. Optimizely Configured Commerce supports creating custom rule types and rule type options that build on the Rules Engine. The basic requirement for creating a custom rule type, or for using an OOTB one, is that users see different CMS pages according to the applied rule type.
This blog provides complete information on creating custom rule-type options. The steps below will help create a custom rule type option in the Optimizely Configured Commerce solution that can be used in the Optimizely Spire CMS.
References: