Optimizely Mission Control – Part III
https://blogs.perficient.com/2025/09/13/optimizely-mission-control-part-iii/
Sat, 13 Sep 2025

In this article, we will cover all the remaining actions available in Mission Control.

Base Code Deploy

The Optimizely team continuously improves the platform by introducing new features and releasing updated versions. To take advantage of these enhancements and bug fixes, projects must be upgraded to the latest version. After upgrading the project, it needs to be deployed to the appropriate environment. This deployment is carried out using the “Base Code Deploy” option in Mission Control.

How to deploy the Base Code

  • Log in to Mission Control.

  • Navigate to the Customers tab.

  • Select the appropriate Customer.

  • Choose the Environment where you want to deploy the base code changes.

  • Click the Action dropdown in the left pane.

  • Select Base Code Deploy.

  • A pop-up will appear with a scheduler option and a dropdown showing the latest build version.

  • Click Continue to initiate the deployment process.

  • Once the process completes, the base code is successfully deployed to the selected environment.

Reference: Base Code Deploy – Optimizely Support

Extension Deployment

Many customizations are implemented according to project requirements, and these are developed within the extension project following Optimizely framework guidelines. To make these changes available in an environment, the extension project code must be deployed. This can be done using the Extension Deployment option in Mission Control.

Deploy Extension Code

  • Log in to Mission Control.

  • Navigate to the Customers tab.

  • Select the appropriate Customer.

  • Choose the Environment where you want to deploy the extension code.

  • Click the Action dropdown in the left pane.

  • Select Extension Deployment.

  • A pop-up will appear with an optional scheduler and a dropdown showing available extension build versions.

  • Select the desired extension version to deploy.

  • Click Continue to initiate the deployment process immediately.

  • Once the process completes, the extension code is successfully deployed to the selected environment.

Reference: Extension Deployment – Optimizely Support

Production User Files Sync

In any project, there are numerous user files—especially images—which play a crucial role in the website’s appearance and user experience. During development, it’s important to keep these files synchronized across all environments. Typically, the files in lower environments should mirror those in the production environment. Since clients often update files directly in production, the “Production User Files Sync” option in Mission Control becomes extremely useful. It allows developers to easily sync user files from production to lower environments, ensuring consistency during development and testing.

How to sync production user files

  • Log in to Mission Control.

  • Navigate to the Customers tab.

  • Select the appropriate Customer.

  • Choose the lower environment where you want to sync the user files.

  • Click the Action dropdown in the left pane.

  • Select User File Sync from the list of available options.

  • A pop-up will appear with an optional scheduler and a Source Environment dropdown containing all environments available for the selected customer.

  • Select Production as the source (or any environment as required), then click Continue to start the sync process.

  • Depending on the size of the user files and network conditions, the process might take several minutes to complete.

Reference: Production User Files Sync – Optimizely Support

Production Database Sync

This option allows you to synchronize data from the production environment to a lower instance.
Note: Data cannot be synced from a lower instance back to production.

Critical Requirements

  • Matching Website Keys
    • The website keys in both the production and target environments must match.
    • If they do not, the site may experience a startup failure and become unstable.
  • Version Compatibility

    • The target environment must be running on a version that is equal to or newer than the source (production) version.

    • Both source and target environments must be on one of the last three supported long-term versions, or their corresponding short-term support versions.

    • If version requirements are not met, the sync process will fail.

  • Data Loss Warning
    • This is a destructive operation—it will overwrite data on the target (lower) environment.

    • Ensure that no critical or important data exists in the sandbox or lower instance before initiating the sync.

The Production Sync option does not replicate all data, but it does synchronize several key components. Below is the list of data that gets synced:

Product Data

  • Product settings (e.g., ERP Managed, Track Inventory, Quote Required)

  • Attribute values

  • Category assignments

  • Product content (metadata and rich content)

  • Product specifications

  • Child variants

  • Pricing and cost

  • Product number and URL segment

  • Warehouse inventory (stock levels)

  • Shipping information

Category Data

  • Category details (name, description)

  • Category hierarchy

  • Assigned products

  • Category content (metadata and content)

  • Attribute values

CMS Content

  • CMS customizations made via out-of-the-box widgets (non-code changes)

  • Variant page customizations and display rules

Additional Data

  • Attribute types and values

  • Variant types

  • Customer records

  • Website users

Data Not Synced from Production to Sandbox

The following areas are excluded from the Production Sync process and remain unchanged in the target sandbox environment:

  • System Configuration
  • Integration Job Settings
  • Admin & User Data
    • Exceptions

      • If a production admin user has made changes to data being synced (like CustomerOrders, Content, etc.), that admin user is also synced to the sandbox.

      • Admin user roles are also synced to preserve permission context.

      • To prevent role duplication:

        • All sandbox roles are appended with -1.

        • Production roles retain their original names.

      • If a matching admin user exists in both environments:

        • The production user and roles are retained.

        • Sandbox-only users receive roles with the -1 suffix.

  • Logs and Cache

Sync production data

  • Log in to Mission Control.

  • Navigate to the Customers tab.

  • Select the appropriate Customer.

  • Choose the lower environment where you want to sync the production data.

  • Click the Action dropdown in the left pane.

  • Select Production Database Sync from the list of available options.

  • A pop-up will appear with:

      • An optional scheduler, and

      • A Source Environment dropdown (select the production environment).

  • Click Continue to initiate the sync process.

  •  This is a large-scale data transfer operation. The sync process may take several minutes to complete, depending on the volume of data.

Note: Optimizely does not provide a rollback option for this process. Once the sync is complete, any changes previously made in the target environment, such as modifications to stored procedures or database scripts, must be reapplied manually.

Reference: Production Database Sync – Optimizely Support

Optimizely Mission Control – Part II
https://blogs.perficient.com/2025/08/18/optimizely-mission-control-part-ii/
Mon, 18 Aug 2025

In this article, we focus primarily on generating read-only database credentials and using them to connect to the database.

Generate Database Credentials

The Mission Control tool generates read-only database credentials for a targeted instance, which remain active for 30 minutes. These credentials allow users to run select or read-only queries, making it easier to explore data on a cloud instance. This feature is especially helpful for verifying data-related issues without taking a database backup.

Steps to generate database credentials

  1. Log in to Mission Control.

  2. Navigate to the Customers tab.

  3. Select the appropriate Customer.

  4. Choose the Environment for which you need the credentials.

  5. Click the Action dropdown in the left pane.

  6. Select Generate Database Credentials.

  7. A pop-up will appear with a scheduler option.

  8. Click Continue to initiate the process.

  9. After a short time, the temporary read-only credentials will be displayed.

 

Once the temporary read-only credentials are generated, the next step is to connect to the database using those credentials.

To do this:

  1. Download and install Azure Data Studio.

  2. Open Azure Data Studio after installation.

  3. Click “New Connection” or the “Connect” button.

  4. Use the temporary credentials provided by Mission Control to connect:

    • Server Name: Use the server name from the credentials.

    • Authentication Type: SQL Login

    • Username and Password: As provided in the credentials.

  5. Once connected, you can execute SELECT queries to explore or verify data on the cloud instance.
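
If you prefer to script such checks, the query can also be run from code. Below is a minimal C# sketch, assuming the Microsoft.Data.SqlClient package is installed; the server, user, password, database, and table names are placeholders to be replaced with the temporary credentials from Mission Control and a real table:

using System;
using Microsoft.Data.SqlClient; // dotnet add package Microsoft.Data.SqlClient

class ReadOnlyQuery
{
    static void Main()
    {
        // Placeholder values; substitute the temporary credentials
        // generated by Mission Control (valid for 30 minutes).
        var builder = new SqlConnectionStringBuilder
        {
            DataSource = "example-server.database.windows.net",
            UserID = "temp_readonly_user",
            Password = "temp_password",
            InitialCatalog = "example_database",
            Encrypt = true
        };

        using var connection = new SqlConnection(builder.ConnectionString);
        connection.Open();

        // The credentials are read-only, so stick to SELECT statements.
        using var command = new SqlCommand(
            "SELECT TOP (10) * FROM dbo.ExampleTable", connection);
        using var reader = command.ExecuteReader();
        while (reader.Read())
        {
            Console.WriteLine(reader[0]);
        }
    }
}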

 

For more details, refer to the official Optimizely documentation on Generating Database Credentials.

For Part I, visit: Optimizely Mission Control – Part I

Optimizely Mission Control – Part I
https://blogs.perficient.com/2025/08/04/optimizely-mission-control-part-i/
Mon, 04 Aug 2025

Optimizely provides powerful tools that make it easy to build, release, and manage cloud infrastructure efficiently.

Optimizely Mission Control Access

To use this tool, an Opti ID is required. Once you have an Opti ID, request that your organization grant access to your user account. Alternatively, you can raise a ticket with the Optimizely Support team, along with approval from your project organization.

Key Actions

This tool provides various essential actions for managing your cloud environments effectively. These include:

  • Restart Site

    • Restart the application in a specific environment to apply changes or resolve issues.

  • Database Backup

    • Create a backup of the environment’s database for debugging purposes.

  • Generate Database Credentials

    • Generate secure credentials to connect to the environment’s database.

  • Base Code Deploy

    • Deploy the base application code to the selected environment.

  • Extension Deployment

    • Deploy any custom extension changes.

  • Production User Files Sync

    • Synchronize user-generated files (e.g., media, documents) from the production environment to lower environments.

  • Production Database Sync

    • Sync the production database to a lower environment (such as a sandbox) to keep its data up to date.

Let’s walk through each of these actions step by step to understand how to perform them.

Restart Site

We can restart the site using the Mission Control tool. This option is handy when a website restart is required due to configuration changes. For example, updates to the storage or search provider often require a restart. Additionally, if an integration job gets stuck for any reason, the ability to restart the site becomes very helpful in restoring normal functionality.

How to restart the website

  1. Log in to Mission Control.
  2. Navigate to the Customers tab.

  3. Select the appropriate Customer.

  4. Choose the Environment where the restart is needed.

  5. Click on the Action dropdown in the left pane.

  6. Select Restart Site from the list.

  7. A pop-up will appear where you can either schedule the restart or click Continue for an immediate restart.

 

Reference: Restart Site – Optimizely Support

Database Backup

This is another useful feature available in Mission Control.

Using this option, we can take a backup from the Sandbox or Production instance and import it into the local environment. This helps us debug issues that occur in Sandbox or Production environments.

The backup file is generated with a .bacpac extension.

Steps to take a backup

  1. Log in to Mission Control.

  2. Navigate to the Customers tab.

  3. Select the appropriate Customer and choose the Environment you want to back up.

  4. Click the Action dropdown in the left pane and select Database Backup.

  5. A pop-up will appear prompting for a scheduled backup time.

  6. Set Skip Log to False to minimize the backup size.

  7. Click Continue and wait for the process to complete.

  8. Once finished, click on the provided link to download the backup file.

 

Reference: Database Backup – Optimizely Support

Stay tuned for the next blog to explore the remaining actions!

From Cloud to Local: Effortlessly Import Azure SQL Databases
https://blogs.perficient.com/2025/02/26/import-azure-sql-databases/
Wed, 26 Feb 2025

With most systems transitioning to cloud-based environments, databases are often hosted across various cloud platforms. However, during the development cycle, there are occasions when having access to a local database environment becomes crucial, particularly for analyzing and troubleshooting issues originating in the production environment.

Sometimes, it is necessary to restore the production database to a local environment to diagnose and resolve production-related issues effectively. This allows developers to replicate and investigate issues in a controlled setting, ensuring efficient debugging and resolution.

In an Azure cloud environment, database backups are often exported as .bacpac files. The file must be imported and restored locally to work with these databases in a local environment.

There are several methods to achieve this, including:

  1. Using SQL Server Management Studio (SSMS).
  2. Using the SqlPackage command-line tool.

This article will explore the steps to import a .bacpac file into a local environment, focusing on practical and straightforward approaches.

The first approach—using SQL Server Management Studio (SSMS)—is straightforward and user-friendly. However, challenges arise when dealing with large database sizes, as the import process may fail due to resource limitations or timeouts.

The second approach, using the SqlPackage command-line tool, is recommended in such cases. This method offers more control over the import process, allowing better handling of larger .bacpac files.

Steps to Import a .bacpac File Using SqlPackage

1. Download SqlPackage

  • Navigate to the SqlPackage download page: SqlPackage Download.
  • Ensure you download the .NET 6 version of the tool, as the .NET Framework version may have issues processing databases with very large tables.

2. Install the Tool

  • Follow the instructions under the “Windows (.NET 6)” header to download and extract the tool.
  • After extracting, open a terminal in the directory where you extracted SqlPackage.

3. Run SqlPackage

  • Put the .bacpac file into the extracted SqlPackage folder (e.g., C:\sqlpackage-win7-x64-en-162.1.167.1).
  • Use the following example command in the terminal to import the .bacpac file:

    SqlPackage /a:Import /tsn:"localhost" /tdn:"test" /tu:"sa" /tp:"Password1" /sf:"database-backup-filename.bacpac" /ttsc:True /p:DisableIndexesForDataPhase=False /p:PreserveIdentityLastValues=True

4. Adjust Parameters for Your Setup

  • /tsn: The server name (IP or hostname) of your SQL Server instance, optionally followed by a port (default: 1433).
  • /tdn: The name of the target database (must not already exist).
  • /tu: SQL Server username.
  • /tp: SQL Server password.
  • /sf: The path to your .bacpac file (use the full path or ensure the terminal is in the same directory).
  • /ttsc: Set to True to trust the target server's certificate (helpful for local instances without a trusted TLS certificate).

5. Run and Wait

  • Let the tool process the import. The time taken will depend on the size of the database.

Important: Ensure the target database does not already exist, as .bacpac files can only be imported into a fresh database.

The options /p:DisableIndexesForDataPhase and /p:PreserveIdentityLastValues help tune the import for large databases and preserve identity column values. SqlPackage provides more reliability and flexibility than SSMS, especially when dealing with larger databases.
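
If you restore backups regularly, the same command can be scripted. Below is a minimal C# sketch that shells out to SqlPackage; the server, database, credentials, and file name are the same placeholder values as in the command above, and the sketch assumes SqlPackage is on the PATH or in the working directory:

using System;
using System.Diagnostics;

class BacpacImport
{
    static void Main()
    {
        // Placeholder values; adjust the server, database, credentials,
        // and file name for your environment.
        var arguments = string.Join(" ",
            "/a:Import",
            "/tsn:\"localhost\"",
            "/tdn:\"test\"",                        // target DB must not exist yet
            "/tu:\"sa\"",
            "/tp:\"Password1\"",
            "/sf:\"database-backup-filename.bacpac\"",
            "/ttsc:True",
            "/p:DisableIndexesForDataPhase=False",
            "/p:PreserveIdentityLastValues=True");

        using var process = Process.Start(new ProcessStartInfo
        {
            FileName = "SqlPackage",
            Arguments = arguments,
            UseShellExecute = false
        });

        if (process == null)
        {
            Console.WriteLine("Failed to start SqlPackage.");
            return;
        }

        process.WaitForExit();
        Console.WriteLine($"SqlPackage exited with code {process.ExitCode}");
    }
}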

 

Reference:

https://learn.microsoft.com/en-us/azure/azure-sql/database/database-import?view=azuresql&tabs=azure-powershell

Optimizely Configured Commerce – Customizing Elasticsearch v7 Index
https://blogs.perficient.com/2023/12/15/optimizely-configured-commerce-customizing-elasticsearch-v7-index/
Fri, 15 Dec 2023

Optimizely Configured Commerce introduces Elasticsearch v7 for a better search experience. In the dynamic landscape of the commerce world, there is always room for extended code customization, and Optimizely offers detailed instructions on customizing Elasticsearch v7 indexes.

There are many advantages to using Elasticsearch v7, including:

  • Improved Performance
  • Security Enhancements
  • Elasticsearch SQL
  • GeoJSON Support
  • Usability and Developer-Friendly Features

In this post, we will walk through how to add a custom field to the Elasticsearch v7 index, step by step.

Setting Elasticsearch v7 as a Default Provider from the Admin

The very first step is to set a default provider in admin. Below are the steps to set the default provider:

  1. Log in to the Admin Console.
  2. Navigate to Settings.
  3. Search for “Search Provider Name”.
  4. Set Elasticsearch v7 as the “Search Provider Name” and “Search Indexer Name”.
  5. Click Save.

 

Creating Custom Field

After configuring the default provider in the admin section, the site will use Elasticsearch v7, searching against the indexes it newly creates.

To add a new custom field to these indexes, Optimizely provides pipelines that can be extended.

Add a Class into the Solution to Extend the ElasticsearchProduct Class and Create a New Field

In this class, we create a property named StockedInWarehouses, which is a list of strings.

namespace Extensions.Search.ElasticsearchV7.DocumentTypes.Product
{
 using Insite.Search.ElasticsearchV7.DocumentTypes.Product;
 using Nest7;

 [ElasticsearchType(RelationName = "product")]
 public class ElasticsearchProductCustom : ElasticsearchProduct
 {
     public ElasticsearchProductCustom(ElasticsearchProduct source)
         : base(source) // This constructor copies all base code properties.
     {
     }

     [Keyword(Index = true, Name = "stockedInWarehouses")]
     public List<string> StockedInWarehouses { get; set; }
 }
}

Override Pipeline to Insert Data into Custom Property

To populate the custom property, extend the PrepareToRetrieveIndexableProducts pipe. Handle data retrieval within custom code by composing a LINQ query that fetches the required data. The best performance is achieved by returning a dictionary, e.g. .ToDictionary(record => record.ProductId). Here is an example code snippet:

namespace Extensions.Search.ElasticsearchV7.DocumentTypes.Product.Index.Pipelines.Pipes.PrepareToRetrieveIndexableProducts
{
 using Insite.Core.Interfaces.Data;
 using Insite.Core.Plugins.Pipelines;
 using Insite.Data.Entities;
 using Insite.Search.ElasticsearchV7.DocumentTypes.Product.Index.Pipelines.Parameters;
 using Insite.Search.ElasticsearchV7.DocumentTypes.Product.Index.Pipelines.Results;
 using System.Linq;

 public sealed class PrepareToRetrieveIndexableProducts : IPipe<PrepareToRetrieveIndexableProductsParameter, PrepareToRetrieveIndexableProductsResult>
 {
     public int Order => 0; // This pipeline has no base code so Order can be anything.

     public PrepareToRetrieveIndexableProductsResult Execute(IUnitOfWork unitOfWork, PrepareToRetrieveIndexableProductsParameter parameter, PrepareToRetrieveIndexableProductsResult result)
     {
         result.RetrieveIndexableProductsPreparation = unitOfWork.GetRepository<ProductWarehouse>().GetTableAsNoTracking()
               .Join(unitOfWork.GetRepository<Product>().GetTableAsNoTracking(), x => x.ProductId, y => y.Id, (x, y) => new { prodWarehouse = x })
               .Join(unitOfWork.GetRepository<Warehouse>().GetTableAsNoTracking(), x => x.prodWarehouse.WarehouseId, y => y.Id, (x, y) => new { Name = y.Name, productId = x.prodWarehouse.ProductId })
               .GroupBy(z => z.productId).ToList()
               .Select(p => new {
                   productId = p.Key.ToString(),
                   warehouses = string.Join(",", p.Select(i => i.Name))
               })
               .ToDictionary(z => z.productId, x => x.warehouses);

         return result;
     }
 }
}

Assign Data to the Custom Property in Elasticsearch v7

After retrieving data into the “RetrieveIndexableProductsPreparation” result property, set it on the custom property for indexable products. To achieve this, create a class “ExtendElasticsearchProduct” that implements IPipe&lt;CreateElasticsearchProductParameter, CreateElasticsearchProductResult&gt;.

Here, in the Execute method, the parameter contains the RetrieveIndexableProductsPreparation property, which holds our data. Fetch the value using the TryGetValue method.

Avoid placing data-retrieval logic in this CreateElasticsearchProduct extension class; fetching data here would degrade the performance of product index creation.

Here you’ll find an illustrative code snippet:

namespace Extensions.Search.ElasticsearchV7.DocumentTypes.Product.Index.Pipelines.Pipes.CreateElasticsearchProduct
{
 using System;
 using System.Collections.Generic;
 using System.Linq;
 using Insite.Core.Interfaces.Data;
 using Insite.Core.Plugins.Pipelines;
 using Insite.Search.ElasticsearchV7.DocumentTypes.Product.Index.Pipelines.Parameters;
 using Insite.Search.ElasticsearchV7.DocumentTypes.Product.Index.Pipelines.Results;

 public sealed class ExtendElasticsearchProduct : IPipe<CreateElasticsearchProductParameter, CreateElasticsearchProductResult>
 {
     public int Order => 150;

     public CreateElasticsearchProductResult Execute(IUnitOfWork unitOfWork, CreateElasticsearchProductParameter parameter, CreateElasticsearchProductResult result)
     {
         var elasticsearchProductCustom = new ElasticsearchProductCustom(result.ElasticsearchProduct);

          if (((Dictionary<string, string>)parameter.RetrieveIndexableProductsPreparation).TryGetValue(elasticsearchProductCustom.ProductId.ToString(), out var stockedInWarehouses))
         {
            elasticsearchProductCustom.StockedInWarehouses = ExtractList(stockedInWarehouses);
         }

         result.ElasticsearchProduct = elasticsearchProductCustom;

         return result;
     }

     private static List<string> ExtractList(string content)
     {
       if (string.IsNullOrWhiteSpace(content))
         return new List<string>();
       return content
        .Split(new[] { ',' }, StringSplitOptions.RemoveEmptyEntries)
        .ToList();
     }
 }
}

After rebuilding the full product index, the newly created stockedInWarehouses field appears in the product index documents along with its values.

Term Query to filter StockedInWarehouses

Now you can easily use the stockedInWarehouses field in a term query to filter search results.

var currentWarehouseId = SiteContext.Current.PickUpWarehouseDto == null
   ? SiteContext.Current.WarehouseDto.Name
   : SiteContext.Current.PickUpWarehouseDto.Name;  
 
result.StockedItemsOnlyQuery = result
      .SearchQueryBuilder
      .MakeTermQuery("stockedInWarehouses.keyword", currentWarehouseId);

Reference Link : https://docs.developers.optimizely.com/configured-commerce/docs/customize-the-search-rebuild-process

Optimizely CMS 12 – Content Delivery API Integration
https://blogs.perficient.com/2023/07/25/optimizely-cms-12-content-delivery-api-integration/
Tue, 25 Jul 2023

In the Optimizely Content Management System (CMS), a headless approach is achieved using the Content Delivery API. This handy package exposes content as JSON through a REST API. For Single Page Applications, the Content Delivery API works very well with JavaScript frameworks like React, Vue, and Angular.

A headless system works over HTTP requests and is cross-platform. Optimizely CMS does not provide a headless solution directly; the Content Delivery API package is used to meet this need.

Integrate Content Delivery API into CMS 12:

To integrate the Content Delivery API, we need to install the Content Delivery API NuGet package using the following command:

dotnet add package EPiServer.ContentDeliveryApi.Cms


After installing the package, we need to modify the Startup.cs file as follows:

public void ConfigureServices(IServiceCollection services)
{
   services.AddContentDeliveryApi().WithSiteBasedCors();
}

 

The Content Delivery API follows the default CORS policy, which may cause CORS errors such as “the Access-Control-Allow-Origin header is missing.” To overcome this issue, we need to add a CORS policy in the Startup.cs file.

services.AddCors(opt =>
 {
   opt.AddPolicy(name: "CorsPolicy", builder =>
    {
       builder.AllowAnyOrigin()
              .AllowAnyHeader()
              .AllowAnyMethod();
    });
 });    
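
Note that registering the named policy is only half of the setup; it must also be applied to the request pipeline. Below is a minimal sketch of the corresponding Configure method; UseCors must run after UseRouting and before UseEndpoints, and the exact middleware order may vary by project:

public void Configure(IApplicationBuilder app)
{
    app.UseRouting();

    // Apply the named CORS policy registered in ConfigureServices.
    app.UseCors("CorsPolicy");

    app.UseAuthentication();
    app.UseAuthorization();

    app.UseEndpoints(endpoints =>
    {
        // Maps CMS content routes (standard CMS 12 setup).
        endpoints.MapContent();
    });
}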

 

Now we are ready to use the Content Delivery API. To verify it is working properly, enter the following URL into the browser (replace localhost with your site’s domain URL):

https://localhost:5000/api/episerver/v3.0/site/

If the API works properly, we get a 200 OK status along with a JSON result describing the site.

The API exposes several endpoints to retrieve data:

  1. Content by content ID: /api/episerver/v3.0/content/{ContentIdentifier}
  2. Child content: /api/episerver/v3.0/content/{ContentIdentifier}/children
  3. Ancestor content: /api/episerver/v3.0/content/{ContentIdentifier}/ancestors
  4. A list of sites in the system: /api/episerver/v3.0/site
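
Any HTTP client can consume these endpoints. As a small illustration, the C# sketch below (assuming a locally running site at https://localhost:5000) calls the site endpoint and prints the raw JSON:

using System;
using System.Net.Http;
using System.Threading.Tasks;

class ContentDeliveryClient
{
    static async Task Main()
    {
        using var client = new HttpClient
        {
            // Replace with your site's domain URL.
            BaseAddress = new Uri("https://localhost:5000")
        };
        client.DefaultRequestHeaders.Add("Accept", "application/json");

        // Lists the sites in the system (endpoint 4 above).
        using var response = await client.GetAsync("/api/episerver/v3.0/site/");
        response.EnsureSuccessStatusCode();

        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}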

 

As an example, the “About Us” page content can be retrieved in Postman by calling the content endpoint with that page’s content ID.

How does Content Delivery API work?

The flow is straightforward: a client application sends an HTTP request to a Content Delivery API endpoint, the CMS loads and serializes the requested content, and the result is returned to the client as JSON.

This is how the headless Content Delivery API works and is implemented in Optimizely CMS 12.

Reference URL: https://docs.developers.optimizely.com/content-management-system/v1.5.0-content-delivery-api/docs/content-delivery-api

To create a drop-down list using an enum or a database in Optimizely CMS 12, see: Creating Dropdown Lists in Optimizely CMS 12 with Enum and Database (https://blogs.perficient.com/2023/05/15/creating-dropdown-lists-in-optimizely-cms-12-with-enum-and-database/).

Creating Dropdown Lists in Optimizely CMS 12 with Enum and Database
https://blogs.perficient.com/2023/05/15/creating-dropdown-lists-in-optimizely-cms-12-with-enum-and-database/
Mon, 15 May 2023

This blog will help you create single- or multiple-selection list options in CMS 12. Dropdown lists play an important role whenever we want the user to select a value from a predefined list.

In CMS 12, single- and multiple-selection lists are populated using the “SelectOne” and “SelectMany” attributes, which live in the “EPiServer.Shell.ObjectEditing” namespace. You define these attributes on a property, and each requires a reference to a class implementing the “ISelectionFactory” interface. (This walkthrough follows the Optimizely developer documentation.)
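
As a quick illustration of the pattern before the step-by-step walkthroughs, the attributes are applied to content type properties as sketched below; the City property is hypothetical, and CitySelectionFactory is the factory class created later in this post:

// Single selection backed by a selection factory.
[SelectOne(SelectionFactoryType = typeof(CitySelectionFactory))]
public virtual string City { get; set; }

// Multiple selection; the chosen values are stored as a comma-separated string.
[SelectMany(SelectionFactoryType = typeof(CitySelectionFactory))]
public virtual string OtherCities { get; set; }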

Creating Dropdown Lists using Enum

  1. Create an enum named Region.
  2. Create a property Region of the enum type Region.
  3. Create a class named “SelectOneEnumAttribute” and inherit it from “SelectOneAttribute.”
  4. Also implement the “IDisplayMetadataProvider” interface. In CMS 11 this was done with the “IMetadataAware” interface, but in CMS 12 we implement “IDisplayMetadataProvider.”
  5. The method “CreateDisplayMetadata” is not virtual; hence we use the new keyword to hide the method inherited from SelectOneAttribute. Note that we still call the base method as well.
  6. After writing all the code, the Region property is rendered as a single-selection dropdown of the enum values (see the sketch after this list).
  7. If we instead decorate the Region property with “SelectManyEnum,” it is rendered as a multiple-selection list.
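
A minimal sketch of the classes described in the steps above, assuming the standard CMS 12 APIs (SelectOneAttribute, ISelectionFactory, ExtendedMetadata, DisplayMetadataProviderContext); verify the details against your CMS version:

using System;
using System.Collections.Generic;
using EPiServer.Shell.ObjectEditing;
using Microsoft.AspNetCore.Mvc.ModelBinding.Metadata;

public class SelectOneEnumAttribute : SelectOneAttribute, IDisplayMetadataProvider
{
    public SelectOneEnumAttribute(Type enumType) => EnumType = enumType;

    public Type EnumType { get; set; }

    // CreateDisplayMetadata is not virtual in the base class, so it is
    // hidden with the new keyword; the base implementation is still called.
    public new void CreateDisplayMetadata(DisplayMetadataProviderContext context)
    {
        SelectionFactoryType = typeof(EnumSelectionFactory<>).MakeGenericType(EnumType);
        base.CreateDisplayMetadata(context);
    }
}

public class EnumSelectionFactory<TEnum> : ISelectionFactory
    where TEnum : Enum
{
    public IEnumerable<ISelectItem> GetSelections(ExtendedMetadata metadata)
    {
        // One select item per enum value.
        foreach (var value in Enum.GetValues(typeof(TEnum)))
        {
            yield return new SelectItem { Text = value.ToString(), Value = value };
        }
    }
}

// Usage on a content type property:
// [SelectOneEnum(typeof(Region))]
// public virtual Region Region { get; set; }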

Create Dropdown Lists using Database

  1. Create a property “OtherCities” of string type.
  2. Decorate this property with “SelectMany” and give the selection factory type as “CitySelectionFactory.”
  3. Create a class “CitySelectionFactory” and implement it with the “ISelectionFactory” interface.
  4. In this class, we need to get all cities from the database. To do that, write the logic in the “GetSelections” method and return the list.
  5. Create a “SelectManyEnumAttribute” class, inherit it from the “SelectManyAttribute” class, and, as with the single-selection attribute, implement the “IDisplayMetadataProvider” interface.
  6. The method “CreateDisplayMetadata” is not virtual; hence we use the new keyword to hide the method inherited from SelectManyAttribute. Note that we still call the base method as well.
  7. After writing all the code, the OtherCities property is rendered as a multiple-selection list of cities (see the sketch after this list).
  8. If we decorate the “OtherCities” property with “SelectOne” instead, it is rendered as a single-selection dropdown.
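
And a minimal sketch of the CitySelectionFactory described above; the data access is a placeholder, so substitute a real query (for example via a repository or Entity Framework DbContext) for the hard-coded list:

using System.Collections.Generic;
using System.Linq;
using EPiServer.Shell.ObjectEditing;

public class CitySelectionFactory : ISelectionFactory
{
    public IEnumerable<ISelectItem> GetSelections(ExtendedMetadata metadata)
    {
        // Placeholder data access; replace with a query against your
        // cities table.
        IEnumerable<string> cities = new[] { "Pune", "Mumbai", "Nagpur" };

        return cities.Select(city => new SelectItem { Text = city, Value = city });
    }
}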
