Moving to CJA? Sunset Adobe Analytics Without Causing Chaos

Adobe Experience Platform (AEP) and Customer Journey Analytics (CJA) continue to emerge as the preferred solutions for organizations seeking a unified, 360‑degree view of customer behavior. For organizations requiring HIPAA compliance, AEP and CJA are a necessity. Many organizations are now discussing whether to retool or retire their legacy Adobe Analytics implementations. The transition from Adobe Analytics to CJA is far more complex than simply disabling an old tool. Teams must plan carefully, perform detailed analysis, and develop a structured approach to ensure that reporting continuity, data integrity, and downstream dependencies remain intact.

Adobe Analytics remains a strong platform for organizations focused exclusively on web and mobile app measurement; however, enterprises prioritizing cross‑channel data activation, real‑time profiles, and detailed journey analysis should embrace AEP as the future. Of course, you won't want to maintain two platforms after building out CJA, so you must think about how to move on from Adobe Analytics.

Decommissioning Options and Key Considerations

You can approach decommissioning Adobe Analytics in several ways. Your options include: 1) disabling the Adobe Analytics extension; 2) setting s.abort at the top of the AppMeasurement custom‑code block to prevent data from being sent to Adobe Analytics; 3) deleting all legacy rules; or 4) discarding Adobe Analytics entirely and creating a new Launch property for CJA. Although multiple paths exist, the best approach almost always involves preserving your data‑collection methods and keeping the historical Adobe Analytics data. You have likely collected that data for years, and you want it to remain meaningful after migration. Instead of wiping everything out, you can update Launch by removing rules you no longer need or by eliminating references to Adobe Analytics.
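
To make option 2 concrete, here is a minimal sketch of that abort logic, assuming it lives in the AppMeasurement custom‑code block of the Launch Analytics extension, where the tracker object s is in scope:

// Minimal sketch, assuming this runs in the AppMeasurement custom-code
// block where the tracker object `s` is in scope.
s.doPlugins = function (s) {
  s.abort = true; // cancel every Analytics beacon before the request fires
};

Because this silences Adobe Analytics without touching rules or data elements, it can serve as a reversible first step while you verify that nothing downstream breaks.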

Recognizing the challenges involved in combing through the data to make the right decisions during this process, I have developed a specialized tool, Analytics Decommissioner (AD), designed to support organizations as they decommission Adobe Analytics and transition fully to AEP and CJA. The tool programmatically evaluates Adobe Platform Launch implementations using several Adobe API endpoints, enabling teams to quickly identify dependencies, references, and potential risks associated with disabling Adobe Analytics components.

Why Decommissioning Requires More Than a Simple Shutdown

One of the most significant obstacles in decommissioning Adobe Analytics is identifying where legacy tracking still exists and where removing Adobe Analytics could potentially break the website or cause errors. Over the years, many organizations accumulate layers of custom code, extensions, and tracking logic that reference Adobe Analytics variables—often in places that are not immediately obvious. These references may include calls to the s object, hard‑coded AppMeasurement logic, or conditional rules created over the course of several years. Without a systematic way to surface dependencies, teams risk breaking critical data flows that feed CJA or AEP datasets.

Missing or outdated documentation makes the problem even harder. Many organizations fail to maintain complete or current solution design references (SDRs), especially for older implementations. As a result, teams rely on tribal knowledge, attempts to recall discussions from years ago, or manual inspection of the collected data to reverse‑engineer the implementation. This approach is slow, error‑prone, and cannot support large‑scale environments. When documentation lacks clarity, teams struggle to identify which rules, data elements, or custom scripts still matter and which they can safely remove. Now imagine repeating this process for every one of your Launch properties.

This is where Perficient and the AD tool provide significant value.
The AD tool programmatically scans Launch properties and uncovers dependencies that teams may have forgotten or never documented. A manual analysis might easily overlook these dependencies. AD also pinpoints where custom code still references Adobe Analytics variables, highlights rules that have been modified or disabled since deployment, and surfaces AppMeasurement usage that could inadvertently feed into CJA or AEP data ingestion. This level of visibility is essential for ensuring that the decommissioning process does not disrupt data collection or reporting.

How Analytics Decommissioner (AD) Works

The tool begins by scanning all Launch properties across your organization and asking the user to select a property. This is necessary because decommissioning must be done on each property individually—the same way data collection is configured for Adobe Analytics, one Launch property at a time. Once a property is selected, the tool retrieves all production‑level data elements, rules, and rule components, including their revision histories. It ignores rules and data element revisions that developers disabled or never published to production. The tool then performs a comprehensive search for AppMeasurement references and Adobe Analytics‑specific code patterns. These findings show teams exactly where legacy tracking persists, what needs to be updated or modified, and which items can be safely removed. If no dependencies exist, AD can disable the rules and create a development library for testing. When AD cannot confirm whether a dependency exists, it reports the rule names and components where potential issues exist and defers to development experts to make the call. The user always makes the final decisions.
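
As an illustration of this kind of programmatic scan (a simplified sketch, not AD's actual implementation), the snippet below uses Adobe's Reactor API to list a property's rules and flag rule components whose saved settings appear to reference the AppMeasurement s object. The access token, API key, and the simple pattern match are placeholder assumptions; a real scan would also cover data elements, extensions, and revision histories.

// Minimal sketch: flag Launch rules that appear to reference Adobe Analytics.
// ACCESS_TOKEN and API_KEY are placeholder environment variables.
const BASE = 'https://reactor.adobe.io';
const headers = {
  Authorization: `Bearer ${process.env.ACCESS_TOKEN}`,
  'x-api-key': process.env.API_KEY ?? '',
  Accept: 'application/vnd.api+json',
};

async function findAnalyticsReferences(propertyId: string): Promise<void> {
  const res = await fetch(`${BASE}/properties/${propertyId}/rules`, { headers });
  const { data: rules } = await res.json();
  for (const rule of rules) {
    // Pull each rule's components and scan their settings for s.xxx usage.
    const compRes = await fetch(`${BASE}/rules/${rule.id}/rule_components`, { headers });
    const { data: components } = await compRes.json();
    const suspects = components.filter((c: any) =>
      /\bs\.(t|tl|abort|events|eVar\d+|prop\d+)\b/.test(
        JSON.stringify(c.attributes?.settings ?? '')));
    if (suspects.length > 0) {
      console.log(`Rule "${rule.attributes.name}" still references AppMeasurement`);
    }
  }
}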

This tool is especially valuable for large or complex implementations. In one recent engagement, a team used it to scan nearly 100 Launch properties, some of which included more than 300 data elements and 125 active rules. Reviewing that level of complexity manually would have taken weeks, with the lingering risk that critical dependencies were missed. Programmatic scanning ensures accuracy, completeness, and efficiency, allowing teams to move forward with confidence.

A Key Component of a Recommended Decommissioning Approach

The AD tool and a comprehensive review are essential parts of a broader, recommended decommissioning framework. A structured approach typically includes:

  • Inventory and Assessment – Identifying all Adobe Analytics dependencies across Launch, custom code, and environments.
  • Mapping to AEP/CJA – Ensuring all required data is flowing into the appropriate schemas and datasets.
  • Gap Analysis – Determining where additional configuration or migration work needs to be done.
  • Remediation and Migration – Updating Launch rules, removing legacy code, and addressing undocumented dependencies.
  • Validation and QA – Confirming that reporting remains accurate in CJA after removal of Launch rules and data elements created for Adobe Analytics.
  • Sunset and Monitoring – Disabling AppMeasurement, removing Adobe Analytics extensions, and monitoring for errors.

Conclusion

Decommissioning Adobe Analytics is a strategic milestone in modernizing the digital data ecosystem, and using the right tools and processes is essential. The Analytics Decommissioner tool allows organizations to transition to AEP and CJA with confidence. Executed properly, this approach to migration preserves data quality, reduces operational costs, and strengthens governance. By using the APIs and allowing the AD tool to handle the heavy lifting, teams ensure that they don't overlook any dependencies, enabling a smooth, low‑risk transition backed by robust customer experience analytics.

Build a Custom Accordion Component in SPFx Using React – SharePoint

When building modern SharePoint Framework (SPFx) solutions, reusable UI components play a crucial role in keeping your code clean, scalable, and maintainable. In particular, interactive components help improve the user experience without cluttering the interface.

Among these components, the Accordion is a commonly used UI element. It allows users to expand and collapse sections, making it easier to display large amounts of information in a compact and organized layout. In this blog, we’ll walk through how to create a custom accordion component in SPFx using React.


Create the Accordion Wrapper Component

To begin with, we'll create a wrapper component that acts as a container for multiple accordion items. At a high level, this component's responsibility is intentionally simple: it renders child accordion items while keeping styling and layout consistent across the entire accordion. This approach allows individual accordion items to remain focused on their own behavior, while the wrapper handles structure and reusability.

Accordion.tsx

import * as React from 'react';
import styles from './Accordion.module.scss';
import classNames from 'classnames';
import { IAccordionItemProps } from './subcomponents/AccordionItem';

import { ReactElement } from 'react';

export interface IAccordionProps {
  children?:
    | ReactElement<IAccordionItemProps>
    | ReactElement<IAccordionItemProps>[];
  className?: string;
}


const Accordion: React.FunctionComponent<
  React.PropsWithChildren<IAccordionProps>
> = (props) => {
  const { children, className } = props;
  return (
    <div className={classNames(styles.accordionSubcomponent, className)}>
      {children}
    </div>
  );
};

export default Accordion;

Styling with SCSS Modules

Next, let's focus on styling. SPFx supports SCSS modules, which are ideal for avoiding global CSS conflicts and keeping styles scoped to individual components. Below is the styling for the accordion and its items.

Accordion.module.scss

.accordionSubcomponent {
    margin-bottom: 12px;
    .accordionTitleRow {
        display: flex;
        flex-direction: row;
        align-items: center;
        padding: 5px;
        font-size: 18px;
        font-weight: 600;
        cursor: pointer;
        -webkit-touch-callout: none;
        -webkit-user-select: none;
        -khtml-user-select: none;
        -moz-user-select: none;
        -ms-user-select: none;
        user-select: none;
        border-bottom: 1px solid;
        border-color: "[theme: neutralQuaternaryAlt]";
        background: "[theme: neutralLighter]";
    }
    .accordionTitleRow:hover {
        opacity: .8;
    }
    .accordionIconCol {
        padding: 0px 5px;
    }
    .accordionHeaderCol {
        display: inline-block;
        width: 100%;
    }
    .iconExpandCollapse {
        margin-top: -4px;
        font-weight: 600;
        vertical-align: middle;
    }
    .accordionContent {
        margin-left: 12px;
        display: grid;
        grid-template-rows: 0fr;
        overflow: hidden;
        transition: grid-template-rows 200ms;
        &.expanded {
          grid-template-rows: 1fr;
        }
        .expandableContent {
          min-height: 0;
        }
    }
}

Styling Highlights

  • Grid‑based animation for expand/collapse
  • SharePoint theme tokens
  • Hover effects for better UX

Creating Accordion Item Component

Each expandable section is managed by AccordionItem.tsx.

import * as React from 'react';
import styles from '../Accordion.module.scss';
import classNames from 'classnames';
import { Icon, Stack } from '@fluentui/react'; // Stack is used in the render below
import { useState } from 'react';


export interface IAccordionItemProps {
  iconCollapsed?: string;
  iconExpanded?: string;
  headerText?: string;
  headerClassName?: string;
  bodyClassName?: string;
  isExpandedByDefault?: boolean;
}
const AccordionItem: React.FunctionComponent<React.PropsWithChildren<IAccordionItemProps>> = (props: React.PropsWithChildren<IAccordionItemProps>) => {
  const {
    iconCollapsed = 'ChevronUp',   // fallback icons when none are provided
    iconExpanded = 'ChevronDown',
    headerText,
    headerClassName,
    bodyClassName,
    isExpandedByDefault,
    children
  } = props;
  const [isExpanded, setIsExpanded] = useState<boolean>(!!isExpandedByDefault);
  const _toggleAccordion = (): void => {
    setIsExpanded((prevIsExpanded) => !prevIsExpanded);
  }
  return (
    <Stack>
    <div className={styles.accordionTitleRow} onClick={_toggleAccordion}>
        <div className={styles.accordionIconCol}>
            <Icon
                iconName={isExpanded ? iconExpanded : iconCollapsed}
                className={styles.iconExpandCollapse}
            />
        </div>
        <div className={classNames(styles.accordionHeaderCol, headerClassName)}>
            {headerText}
        </div>
    </div>
    <div className={classNames(styles.accordionContent, bodyClassName, {[styles.expanded]: isExpanded})}>
      <div className={styles.expandableContent}>
        {children}
      </div>
    </div>
    </Stack>
  )
}
export default AccordionItem;

Example Usage in SPFx Web Part

<Accordion>
  <AccordionItem headerText="What is SPFx?">
    <p>SPFx is a development model for SharePoint customizations.</p>

  </AccordionItem>

  <AccordionItem
    headerText="Why use custom controls?"
    isExpandedByDefault={true}
  >
    <p>Custom controls improve reusability and UI consistency.</p>
  </AccordionItem>
</Accordion>

Accordion

Conclusion

By building a custom accordion component in SPFx using React, you gain:

  • Full control over UI behavior
  • Lightweight and reusable code
  • Native SharePoint theming

This pattern is perfect for:

  • FAQ sections
  • Configuration panels
  • Dashboard summaries

Building Custom Search Vertical in SharePoint Online for List Items with Adaptive Cards

This blog explains the process of building a custom search vertical in SharePoint Online that targets a specific list using a dedicated content type. It covers indexing important columns and mapping them to managed properties for search. Afterward, a result type is configured with Adaptive Cards JSON to display metadata like title, category, author, and published date in a clear, modern format. Then we add a new vertical on the hub site, giving users a focused tab for Article results. The end result is a streamlined search experience that highlights curated content with consistent metadata and an engaging presentation.

For this example, we will start with the assumption that a custom content type is already in place. This content type includes the following columns:

  • Article Category – internal name article_category
  • Article Topic – internal name article_topic

We’ll also assume that a SharePoint list has been created which uses this content type, with the ContentTypeID: 0x0101009189AB5D4FBA4A9C9BFD5F3F9F6C3B

With the content type and list ready, the next steps focus on configuring search so these items can be surfaced effectively in a dedicated vertical.

Index Columns in the List

Indexing columns optimizes frequently queried metadata, such as category or topic, for faster search. This improves performance and makes it easier to filter and refine results in a custom vertical.

  • Go to List Settings → Indexed Columns.
  • Ensure article_category and article_topic are indexed for faster search queries.

Create Managed Properties

First, check which RefinableString managed properties are available in your environment. After you identify them, configure them as shown below:

Refinable string     Field name         Alias name        Crawled property
RefinableString101   article_topic      ArticleTopic      ows_article_topic
RefinableString102   article_category   ArticleCategory   ows_article_category
RefinableString103   article_link       ArticleLink       ows_article_link

Tip: Creating an alias name for a managed property makes it easier to read and reference. This step is optional — you can also use the default RefinableString name directly.

To configure these fields, follow the steps below:

  • Go to the Microsoft Search Admin Center → Search schema.
  • Go to Search Schema → Crawled Properties
  • Look for the field (e.g., article_topic or article_category) and find its crawled property (it starts with ows_)
  • Click on property → Add mapping
  • A popup will open → look for an unused RefinableString property (e.g., RefinableString101, RefinableString102) → click the "Ok" button
  • Click “Save”
  • Likewise, create managed properties for all the required columns.

Once mapped, these managed properties become queryable, retrievable, and refinable, which means they can be used in search queries, filters, and result types.
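
For example, once RefinableString102 is mapped to the article_category crawled property, a hypothetical KQL query like the one below can scope results to a single category (the "Insurance" value is just an illustration):

ContentTypeId:0x0101009189AB5D4FBA4A9C9BFD5F3F9F6C3B* AND RefinableString102:"Insurance"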

Creating a Custom Search Vertical

This lets you add a dedicated tab that filters results to specific content, improving findability and user experience. It ensures users quickly access targeted items like lists, libraries, or content types without sifting through all search results. In this example, we will set the filter for a specific articles list.

Follow the steps below to create and configure a custom search vertical from the admin center:

  • In “Verticals” tab, add a new value as per following configuration:
    • Name = “Articles”
    • Content source = SharePoint and OneDrive
    • KQL query = This is the actual filter where we specify which items from the specific list to display in search results. In our example, we will set it as: ContentTypeId:0x0101009189AB5D4FBA4A9C9BFD5F3F9F6C3B*
    • Filters: Filters are an optional setting that allows users to narrow search results based on specific criteria. In our example, we can add a filter by category. To add “Categories” filter on search page, follow below steps:
      • Click on add filter
      • Select "RefinableString102" (this is the refinable string managed property for the "article_category" column, as set up in the steps above)
      • Name = “Category” or other desired string to display on search

Set Vertical filter

Creating a Result Type

Creating a new result type in the Microsoft Search Admin Center lets you define how specific content (like items from a list or a content type) is displayed in search results. In this example, we set some rules and use an Adaptive Card template to make results easier to scan and more engaging.

Following are the steps to create a new result type in the admin center.

  • Go to admin center, https://admin.cloud.microsoft
  • Settings → Search & intelligence
  • In “Customizations”, go to “Result types”
  • Add new result types with the following configurations:
    • Name = "ArticlesResults" (Note: specify any name you want to display in the search vertical)
    • Content source = SharePoint and OneDrive
    • Rules
      • Type of content = SharePoint list item
      • ContentTypeId starts with 0x0101009189AB5D4FBA4A9C9BFD5F3F9F6C3B (Note: the content type ID created in the steps above)

        Set Result type
      • Layout = Put the JSON string for the Adaptive Card used to display the search result. The following JSON displays the result:
        {
           "type": "AdaptiveCard",
          "version": "1.3",
          "body": [
            {
              "type": "ColumnSet",
              "columns": [
                {
                  "type": "Column",
                  "width": "auto",
                  "items": [
                    {
                    "type": "Image",
                    "url": <url of image/thumbnail to be displayed for each displayed item>,
                    "altText": "Thumbnail image",
                    "horizontalAlignment": "Center",
                    "size": "Small"
                    }
                  ],
                  "horizontalAlignment": "Center"
                },
                {
                  "type": "Column",
                  "width": 10,
                  "items": [
                    {
                      "type": "TextBlock",
                      "text": "[${ArticleTopic}](${first(split(ArticleLink, ','))})",
                      "weight": "Bolder",
                      "color": "Accent",
                      "size": "Medium",
                      "maxLines": 3
                    },
                    {
                      "type": "TextBlock",
                      "text": "**Category:** ${ArticleCategory}",
                      "spacing": "Small",
                      "maxLines": 3
                    }
                  ],
                  "spacing": "Medium"
                }
              ]
            }
          ],
          "$schema": "http://adaptivecards.io/schemas/adaptive-card.json"
        }

        Set Result type adaptive card

When you set up everything properly, the final output will look like this:

Final search results

Conclusion

We have now created a dedicated search vertical in SharePoint Online for list items with Adaptive Cards, and it changes how users experience search. Important metadata becomes clearly visible when you index key columns, map them to managed properties, and design a tailored result type. The Adaptive Card adds a modern presentation layer that makes results easier to scan and more visually appealing. In the end, publishing a dedicated vertical gives users a focused tab for a curated set of content, which makes search easier to work with and improves the overall user experience.

From Legacy to Modern: Migrating WCF to Web API with the Help of AI

Introduction

The modernization of legacy applications has always been a costly process: understanding old code, uncovering hidden dependencies, translating communication models (for example, from SOAP to REST), and ensuring that nothing breaks in production. This is where artificial intelligence changes the game.

AI does not replace the architect or the developer, but it speeds up the heaviest steps in a migration: it helps read and summarize large codebases, proposes equivalent designs in the new technology, generates drafts of controllers, DTOs, and tests, and even suggests architectural improvements that take advantage of the change. Instead of spending hours on mechanical tasks, the team can focus on what really matters: the business rules and the quality of the new solution.

In this post, we’ll look at that impact applied to a concrete case: migrating a WCF service written in C# to an ASP.NET Core Web API, using a real public repository as a starting point and relying on AI throughout the entire process.

Sample project: a real WCF service to be migrated

For this article, we’ll use the public project jecamayo/t-facturo.net as a real-world example: a .NET application that exposes SOAP services based on WCF to manage advisors and branches, using NHibernate for data access. This kind of solution perfectly represents the scenario of many legacy applications currently running in production, and it will serve as our basis to show how artificial intelligence can speed up and improve their migration to a modern architecture with ASP.NET Core Web API.

Key Steps to Migrate from Legacy WCF to a Modern Web API

Migrating a legacy application is not just about “moving code” from one technology to another: it involves understanding the business context, the existing architecture, and designing a modern solution that will be sustainable over time. To structure that process—and to clearly show where artificial intelligence brings the most value—it’s useful to break the migration down into a few key steps like the ones we’ll look at next.

  1. Define the goals and scope of the migration
    Clarify what you want to achieve with the modernization (for example, moving to .NET 8, exposing REST, improving performance or security) and which parts of the system are in or out of the project, in order to avoid surprises and rework.
  2. Analyze the current architecture and design the target architecture
    Understand how the solution is built today (layers, projects, WCF, NHibernate, database) and, with that snapshot, define the target architecture in ASP.NET Core Web API (layers, patterns, technologies) that will replace the legacy system.
  3. Identify dependencies, models, DTOs, and business rules
    Locate external libraries, frameworks, and critical components; inventory domain entities and DTOs; and extract the business rules present in the code to ensure they are properly preserved in the new implementation.
  4. Design the testing strategy and migration plan
    Decide how you will verify that the new API behaves the same (unit tests, integration tests, comparison of WCF vs Web API responses) and define whether the migration will be gradual or a “big bang”, including phases and milestones.
  5. Implement the new Web API, validate it, and retire the legacy WCF
    Build the Web API following the target architecture, migrate the logic and data access, run the test plan to validate behavior, deploy the new solution and, once its stability has been confirmed, deactivate the inherited WCF service.

How to Use AI Prompts During a Migration

Artificial intelligence becomes truly useful in a migration when we know what to ask of it and how to ask it. It’s not just about “asking for code,” but about leveraging it in different phases: understanding the legacy system, designing the target architecture, generating repetitive parts, proposing tests, and helping document the change. To do this, we can classify prompts into a few simple categories (analysis, design, code generation, testing, and documentation) and use them as a practical guide throughout the entire migration process.

Analysis and Understanding Prompts

These focus on having the AI read the legacy code and help you understand it faster: what a WCF service does, what responsibilities a class has, how projects are related, or which entities and DTOs exist. They are ideal for obtaining “mental maps” of the system without having to review every file by hand.

Usage examples:

  • Summarize what a project or a WCF service does.
  • Explain what responsibilities a class or layer has.
  • Identify domain models, DTOs, or design patterns.

Design and Architecture Prompts

These are used to ask the AI for target architecture proposals in the new technology: how to translate WCF contracts into REST endpoints, what layering structure to follow in ASP.NET Core, or which patterns to apply to better separate domain, application, and infrastructure. They do not replace the architect’s judgment, but they offer good starting points and alternatives.

Usage examples:

  • Propose how to translate a WCF contract into REST endpoints.
  • Suggest a project structure following Clean Architecture.
  • Compare technological alternatives (keeping NHibernate vs migrating to EF Core).

Code Generation and Refactoring Prompts

These are aimed at producing or transforming specific code: generating Web API controllers from WCF interfaces, creating DTOs and mappings, or refactoring large classes into smaller, more testable services. They speed up the creation of boilerplate and make it easier to apply good design practices.

Usage examples:

  • Create a Web API controller from a WCF interface.
  • Generate DTOs and mappings between entities and response models.
  • Refactor a class with too many responsibilities into cleaner services/repositories.

Testing and Validation Prompts

Their goal is to help ensure that the migration does not break existing behavior. They can be used to generate unit and integration tests, define representative test cases, or suggest ways to compare responses between the original WCF service and the new Web API.

Usage examples:

  • Generate unit or integration tests for specific endpoints.
  • Propose test scenarios for a business rule.
  • Suggest strategies to compare responses between WCF and Web API.

Documentation and Communication Prompts

They help explain the before and after of the migration: documenting REST endpoints, generating technical summaries for the team, creating tables that show the equivalence between WCF operations and Web API endpoints, or writing design notes for future evolutions. They simplify communication with developers and non-technical stakeholders.

Usage examples:

  • Write documentation for the new API based on the controllers.
  • Generate technical summaries for the team or stakeholders.
  • Create equivalence tables between WCF operations and REST endpoints.

To avoid making this article too long and to go deeper into each stage of the migration, we'll leave the definition of specific prompts (with real examples applied to the t-facturo.net project) for an upcoming post. In that next article, we'll go through, step by step, what to ask the AI in each phase (analysis, design, code generation, testing, and documentation) and how those prompts directly impact the quality, speed, and risk of a WCF-to-Web-API migration.

Conclusions

The experience of migrating a legacy application with the help of AI shows that its main value is not just in “writing code,” but in reducing the intellectual friction of the process: understanding old systems, visualizing possible architectures, and automating repetitive tasks. Instead of spending hours reading WCF contracts, service classes, and DAOs, AI can summarize, classify, and propose migration paths, allowing the architect and the team to focus their time on key design decisions and business rules.

At the same time, AI speeds up the creation of the new solution: it generates skeletons for Web API controllers, DTOs, mappings, and tests, acting as an assistant that produces drafts for the team to iterate on and improve. However, human judgment remains essential to validate each proposal, adapt the architecture to the organization’s real context, and ensure that the new application not only “works,” but is maintainable, secure, and aligned with business goals.

Microservices: The Emerging Complexity Driven by Trends and Alternatives to Over‑Design

The adoption of microservice‑based architectures has grown exponentially over the past decade, often driven more by industry trends than by a careful evaluation of system requirements. This phenomenon has generated unnecessarily complex implementations—like using a bazooka to kill an ant. Distributed architectures without solid foundations in domain capabilities, workloads, operational independence, or real scalability needs have become a common pattern in the software industry. In many cases, organizations migrate without having a mature discipline in observability, traceability, automation, domain‑driven design, or an operational model capable of supporting highly distributed systems; as a consequence, they end up with distributed monoliths that require coordinated deployments and suffer cascading failures, losing the benefits originally promised by microservices (Iyer, 2025; Fröller, 2025).

Over‑Design

The primary issue in microservices is not rooted in their architectural essence, but in the over‑design that emerges when attempting to implement such architecture without having a clear roadmap of the application’s domains or of the contextual boundaries imposed by business rules. The decomposition produces highly granular, entity‑oriented services that often result in circular dependencies, duplicated business logic, excessive events without meaningful semantics, and distributed flows that are difficult to debug. Instead of achieving autonomy and independent scalability, organizations create a distributed monolith with operational complexity multiplied by the number of deployed services. A practical criterion to avoid this outcome is to postpone decomposition until stable boundaries and non‑functional requirements are fully understood, even adopting a monolith‑first approach before splitting (Fowler, 2015; Danielyan, 2025).

Minimal API and Modular Monolith as Alternatives to Reduce Complexity

In these scenarios, it is essential to explore alternatives that allow companies to design simpler microservices without sacrificing architectural clarity or separation of concerns. One such alternative is the use of Minimal APIs to reduce complexity in the presentation layer: this approach removes ceremony (controllers, conventions, annotations) and accelerates startup while reducing container footprint. It is especially useful for utility services, CRUD operations, and limited API surfaces (Anderson & Dykstra, 2024; Chauhan, 2024; Nag, 2025).

Another effective alternative is the Modular Monolith. A well‑modularized monolith enables isolating functional domains within internal modules that have clear boundaries and controlled interaction rules, simplifying deployment, reducing internal latency, and avoiding the explosion of operational complexity. Additionally, it facilitates a gradual migration toward microservices only when objective reasons exist (differentiated scaling needs, dedicated teams, different paces of domain evolution) (Bächler, 2025; Bauer, n.d.).

Improving the API Gateway and the Use of Event‑Driven Architectures (EDA)

The API Gateway is another critical component for managing external complexity: it centralizes security policies, versioning, rate limiting, and response transformation/aggregation, hiding internal topology and reducing client cognitive load. Patterns such as Backend‑for‑Frontend (BFF) and aggregation help decrease network trips and prevent each public service from duplicating cross‑cutting concerns (Microsoft, n.d.-b; AST Consulting, 2025).

A key principle for reducing complexity is to avoid decomposition by entities and instead guide service boundaries using business capabilities and bounded contexts. Domain‑Driven Design (DDD) provides a methodological compass to define coherent semantic boundaries; mapping bounded contexts to services (not necessarily in a 1:1 manner) reduces implicit coupling, prevents domain model ambiguity, and clarifies service responsibilities (Microsoft, n.d.-a; Polishchuk, 2025).

Finally, the use of Event‑Driven Architectures (EDA) should be applied judiciously. Although EDA enhances scalability and decoupling, poor implementation significantly increases debugging effort, introduces hidden dependencies, and complicates traceability. Mitigating these risks requires discipline in event design/versioning, the outbox pattern, idempotency, and robust telemetry (correlation IDs, DLQs), in addition to evaluating when orchestration (Sagas) is more appropriate than choreography (Three Dots Labs, n.d.; Moukbel, 2025).

Conclusion

The complexity associated with microservices arises not from the architecture itself, but from misguided adoption driven by trends. The key to reducing this complexity is prioritizing cohesion, clarity, and gradual evolution: Minimal APIs for small services, a Modular Monolith as a solid foundation, decomposition by real business capabilities and bounded contexts, a well‑defined gateway, and a responsible approach to events. Under these principles, microservices stop being a trend and become an architectural mechanism that delivers real value (Fowler, 2015; Anderson & Dykstra, 2024).

References

  • Anderson, R., & Dykstra, T. (2024, July 29). Tutorial: Create a Minimal API with ASP.NET Core. Microsoft Learn. https://learn.microsoft.com/en-us/aspnet/core/tutorials/min-web-api?view=aspnetcore-10.0
  • AST Consulting. (2025, June 12). API Gateway in Microservices: Top 5 Patterns and Best Practices Guide. https://astconsulting.in/microservices/api-gateway-in-microservices-patterns
  • Bächler, S. (2025, January 23). Modular Monolith: The Better Alternative to Microservices. ti&m. https://www.ti8m.com/en/blog/monolith
  • Bauer, R. A. (n.d.). On Modular Monoliths. https://www.raphaelbauer.com/posts/on-modular-monoliths/
  • Chauhan, P. (2024, September 30). Deep Dive into Minimal APIs in ASP.NET Core 8. https://www.prafulchauhan.com/blogs/deep-dive-into-minimal-apis-in-asp-net-core-8
  • Danielyan, M. (2025, February 4). When to Choose Monolith Over Microservices. https://mikadanielyan.com/blog/when-to-choose-monolith-over-microservices
  • Fowler, M. (2015, June 3). Monolith First. https://martinfowler.com/bliki/MonolithFirst.html
  • Fröller, J. (2025, October 30). Many Microservice Architectures Are Just Distributed Monoliths. MerginIT Blog. https://merginit.com/blog/31102025-microservices-antipattern-distributed-monolit
  • Iyer, A. (2025, June 3). Why 90% of Microservices Still Ship Like Monoliths. The New Stack. https://thenewstack.io/why-90-of-microservices-still-ship-like-monoliths/
  • Microsoft. (n.d.-a). Domain analysis for microservices. Azure Architecture Center. https://learn.microsoft.com/en-us/azure/architecture/microservices/model/domain-analysis
  • Microsoft. (n.d.-b). API gateways. Azure Architecture Center. https://learn.microsoft.com/en-us/azure/architecture/microservices/design/gateway
  • Moukbel, T. (2025). Event-Driven Architecture: Pitfalls and Best Practices. Undercode Testing. https://undercodetesting.com/event-driven-architecture-pitfalls-and-best-practices/
  • Nag, A. (2025, July 29). Why Minimal APIs in .NET 8 Are Perfect for Microservices Architecture? embarkingonvoyage.com. https://embarkingonvoyage.com/blog/technologies/why-minimal-apis-in-net-8-are-perfect-for-microservices-architecture/
  • Polishchuk. (2025, December 12). Design Microservices: Using DDD Bounded Contexts. bool.dev. https://bool.dev/blog/detail/ddd-bounded-contexts
  • Three Dots Labs. (n.d.). Event-Driven Architecture: The Hard Parts. https://threedots.tech/episode/event-driven-architecture/

HCIC 2025 Takeaway: AI is Changing Healthcare Marketing

At the Healthcare Interactive Conference (HCIC) last month, I got to talk to marketers who are very focused on results. They are also very focused on what will impact their marketing efforts and why. Every conversation came back to AI.

In my previous HCIC takeaway, I wrote about how AI is not a strategy—it’s a tool to solve real problems. Now I want to dig into a specific problem AI is creating for healthcare marketers: how we get found. We need to be thinking about all aspects of how AI can be used. In general, this breaks down into both impact and opportunity.

Impact: AI Search Is Transforming Healthcare Discovery

Several conference sessions alluded to this shift, but marketing experts Brittany Young and Gina Linville gave some deeper insight.

From a marketing perspective, the largest impact is one of being found. Think about how much time a typical hospital marketer puts into being found. I have had many conversations over the years about Search Engine Optimization (SEO) and the importance of having valuable content that the search engines view as unique and relevant.

AI impacts that in ways that are not at first obvious.

The New Reality of Patient Search

Think of how you typically use ChatGPT or how your search engine has evolved. AI now pulls the data and gives you a brief with information culled from multiple online sources. The good news is that the AI tool will typically credit the websites it sources. The bad news is that patients get their answers without ever clicking through to your site.

The scale of this shift is staggering:

AI provides an overview for up to 84% of search queries when it comes to healthcare questions.

Healthcare leads nearly every sector in AI-powered search results—a trend that's accelerating.

Strategic Response: Winning at AI Search in Healthcare

This shift demands a fundamental rethinking of content strategy. Two concepts are emerging as critical:

1) Answer Engine Optimization (AEO)

  • Answer Engine Optimization (AEO) is the practice of structuring and optimizing content so that AI-powered systems, such as Google’s AI Overviews, ChatGPT, Perplexity AI, and voice assistants, can easily identify, extract, and cite it as a direct answer to user queries.
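
One common AEO tactic is pairing FAQ-style content with schema.org structured data so answer engines can extract question-and-answer pairs directly. Below is a minimal sketch; the question and answer copy are hypothetical placeholders.

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "Do I need a referral to see a cardiologist?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Many of our locations accept self-referrals for an initial consultation; check your insurance plan's requirements first."
    }
  }]
}
</script>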

2) Generative Engine Optimization (GEO)

  • Generative Engine Optimization (GEO) is a digital marketing technique designed to improve a brand’s visibility in results produced by generative artificial intelligence (GenAI) platforms. It involves adapting digital content and online presence to ensure that AI systems can accurately interpret, cite, and use the content when generating responses to user queries.

The imperative is clear: Organizations that don’t optimize for AI-powered discovery won’t just lose rankings—they’ll lose visibility entirely.

If you are not already thinking about how to orient your content toward AI-powered discovery, be aware that you will soon feel the impact.

Opportunity: Agentic AI and Productivity

On the flip side of the coin is the opportunity. While the shift above creates openings if you react appropriately, I want to focus on the productivity side. Specifically, think of what Agentic AI can do for your organization.

What Traditional Campaign Development Looks Like

Let me give you a few examples of common tasks and how long they typically take:

  • Create a campaign brief: up to two weeks
  • Create copy across multiple channels: 8-16 hours
  • Create digital assets for the campaign that fit your brand standards and work in each individual channel (a website may allow larger images, while paid search or paid social may offer limited space): 40 hours
  • Create the segment and push it to marketing automation tools: several hours

Now imagine specialized AI agents handling each component—not replacing human strategy and judgment, but accelerating execution while maintaining brand standards and compliance. Getting just one campaign going across multiple channels becomes a multi-person engagement over several weeks. While focused on that, you won't have time for additional campaigns or for honing your craft.

The AI Agent Team Your Marketing Organization Needs

The answer lies with Agentic AI. We believe that AI can cut down on the time necessary to complete these tasks and still keep humans in the loop. Here are a few examples of agents you might need in your organization:

  • Hunter: Prospect identification and acquisition specialist that hunts down leads using predictive AI and behavioral signals.
  • Oracle: Predictive intelligence that forecasts customer behavior, market trends, and campaign performance.
  • Conductor: Omnichannel orchestration that translates strategy into compliant, high-performing journeys.
  • Guardian: Predictive retention specialist that monitors satisfaction, predicts churn, and intervenes to preserve valuable relationships.
  • Artisan: Creative engine that operationalizes GenAI to produce on-brand assets at scale.
  • Advisor: Strategic marketing consultant that provides real-time recommendations and optimizes campaigns based on performance data.
  • Conversational: Engages prospects across chat, email, and social with context awareness.
  • Sentinel: Compliance and security agent that ensures all marketing activities adhere to HIPAA regulations.
  • Segmentation: Discovers audience segments and builds new segments for activation.
  • Bridge: Content migration specialist that seamlessly transfers content between platforms.
  • Scribe: Copywriting specialist that creates compelling, on-brand copy.
  • Forge: App migration specialist that assists with code generation and web development.

Most importantly, this frees your marketing team to focus on what AI can’t do: strategic thinking, creative problem-solving, and understanding the nuanced needs of your community. 

The Path Forward: Integration, Not Replacement

The organizations winning in this new landscape aren’t choosing between human expertise and AI capabilities. They’re strategically integrating both.

Success requires more than technology. It needs an integrated approach:

  1. Rethinking discoverability through AEO and GEO optimization
  2. Deploying specialized AI agents for productivity acceleration
  3. Maintaining human oversight for strategy, creativity, and judgment
  4. Ensuring compliance at every step, particularly in heavily regulated healthcare
  5. Measuring impact against business outcomes, not just operational metrics

Enabling Healthcare Organizations To Lead This Shift

HCIC reminded us that success in healthcare marketing isn’t about chasing technology for its own sake. As I shared in my first HCIC takeaway, AI is not a strategy—it’s a tool to solve real challenges that impact your organization’s ability to connect patients to care.

The search revolution is here. The productivity opportunity is real. The organizations that move quickly to optimize for AI-powered discovery while deploying strategic AI agents will gain a competitive advantage that compounds over time.

Start a conversation with our experts today.

Purpose-Driven AI in Insurance: What Separates Leaders from Followers

Reflecting on this year’s InsureTech Connect Conference 2025 in Las Vegas, one theme stood out above all others: the insurance industry has crossed a threshold from AI experimentation to AI expectation. With over 9,000 attendees and hundreds of sessions, the world’s largest insurance innovation gathering became a reflection of where the industry stands—and where it’s heading.

What became clear: the carriers pulling ahead aren’t just experimenting with AI—they’re deploying it with intentional discipline. AI is no longer optional, and the leaders are anchoring every investment in measurable business outcomes.

The Shift Is Here: AI in Insurance Moves from Experimentation to Expectation

This transformation isn't happening in isolation, though. Each shift represents a fundamental change in how carriers approach, deploy, and govern AI—and together, they reveal why some insurers are pulling ahead while others struggle to move beyond proof-of-concept.

Here’s what’s driving the separation:

  • Agentic AI architectures that move beyond monolithic models to modular, multi-agent systems capable of autonomous reasoning and coordination across claims, underwriting, and customer engagement. Traditional models aren’t just slow—they’re competitive liabilities that can’t deliver the coordinated intelligence modern underwriting demands.
  • AI-first strategies that prioritize trust, ethics, and measurable outcomes—especially in underwriting, risk assessment, and customer experience.
  • A growing emphasis on data readiness and governance. The brutal reality: carriers are drowning in data while starving for intelligence. Legacy architectures can’t support the velocity AI demands.

Success In Action: Automating Insurance Quotes with Agentic AI

Why Intent Matters: Purpose-Driven AI Delivers Measurable Results

What stood out most this year was the shift from “AI for AI’s sake” to AI with purpose. Working with insurance leaders across every sector, we’ve seen the industry recognize that without clear intent—whether it’s improving claims efficiency, enhancing customer loyalty, or enabling embedded insurance—AI initiatives risk becoming costly distractions.

Conversations with leaders at ITC and other industry events reinforced this urgency. Leaders consistently emphasize that purpose-driven AI must:

  • Align with business outcomes. The value is undeniable: new-agent success rates increase up to 20%, premium growth improves by 15%, and customer onboarding costs drop by up to 40%.

  • Be ethically grounded. Trust is a competitive differentiator—AI governance isn’t compliance theater, it’s market positioning.

  • Deliver tangible value to both insurers and policyholders. From underwriting to claims, AI enables real-time decisions, sharpens risk modeling, and delivers personalized interactions at scale. Generative AI accelerates content creation, enables smarter agent support, and transforms customer engagement. Together, these capabilities thrive on modern, cloud-native platforms designed for speed and scalability.

Learn More: Improving CSR Efficiency With a GenAI Assistant

Building the AI-Powered Future: How We’re Accelerating AI in Insurance

So, how do carriers actually build this future? That’s where strategic partnerships and proven frameworks become essential.

At Perficient, we’ve made this our focus. We help clients advance AI capabilities through virtual assistants, generative interfaces, agentic frameworks, and product development, enhancing team velocity by integrating AI team members.

Through our strategic partnerships with industry-leading technology innovators—including AWS, Microsoft, Salesforce, Adobe, and more—we accelerate insurance organizations' ability to modernize infrastructure, integrate data, and deliver intelligent experiences. Together, we shatter boundaries so you have the AI-native solutions you need to boldly advance business.

But technology alone isn’t enough. We take it even further by ensuring responsible AI governance and ethical alignment with our PACE framework—Policies, Advocacy, Controls, and Enablement—to ensure AI is not only innovative, but also rooted in trust. This approach ensures AI is deployed with purpose, aligned to business goals, and embedded with safeguards that protect consumers and organizations.

Because every day your data architecture isn’t AI-ready is a day you’re subsidizing your competitors’ advantage.

You May Also Enjoy: 3 Ways Insurers Can Lead in the Age of AI

Ready to Lead? Partner with Perficient to Accelerate Your AI Transformation

Are you building your AI capabilities at the speed the market demands?

From insight to impact, our insurance expertise helps leaders modernize, personalize, and scale operations. We power AI-first transformation that enhances underwriting, streamlines claims, and builds lasting customer trust.

  • Business Transformation: Activate strategy and innovation ​within the insurance ecosystem.​
  • Modernization: Optimize technology to boost agility and ​efficiency across the value chain.​
  • Data + Analytics: Power insights and accelerate ​underwriting and claims decision-making.​
  • Customer Experience: Ease and personalize experiences ​for policyholders and producers.​

We are trusted by leading technology partners and consistently mentioned by analysts. Discover why we have been trusted by 13 of the 20 largest P&C firms and 11 of the 20 largest annuity carriers. Explore our insurance expertise and contact us to learn more.

Setting the Table for Tomorrow: CX Mastery in 2026

Customer expectations are not just rising; they are skyrocketing. By 2026, the brands that create real customer loyalty won’t be the ones that send clever emails or spruce up their loyalty programs. They’ll be the ones who can almost read their customers’ minds, anticipating needs before customers have to ask. I think of it like this:

If Great CX in 2025 were a skilled short-order cook, it would have expertly filled every customer’s order as it came in. Great CX in 2026 is the master chef who knows your favorite dish before you even sit down, having already prepped the ingredients and timed everything perfectly for your arrival.

This leap from reacting to anticipating is already underway, and it is about to redefine what “Great CX” really means. With this post, I plan to explore what predictive engagement looks like, why it matters for your business, and how you can start cooking up this future today.

 

The Big Shift: From answering questions to reading minds

For ages, customer experience has been about responding. We’re answering questions, fixing hiccups, gently nudging conversions, and recommending next best actions. Predictive engagement changes everything. Instead of waiting for customers to reach out, we’ll use smart data and AI to understand what is coming and act with intention. Imagine a couple of predictive scenarios:

A customer’s delivery is running a bit late. Instead of them anxiously checking, tracking, or calling support, your system detects the delay, automatically offers alternative options, and sends a friendly, proactive update.

A customer is thinking about leaving for another brand. Predictive models can spot those early signals and trigger tailored, helpful offers before customers even consider shopping elsewhere. This proactive approach transforms customer relationships into something far more intuitive and supportive.

 

Four big ideas shaping CX in 2026

Given the ever-shifting nature of customer expectations, I tried to pinpoint what I think will cause the biggest changes for CX teams in 2026.

  1. Hyper-Personalization Gets Real. Today, personalization often still means sending different messages to predefined segments. In 2026, it means real-time, adaptive journeys explicitly tailored to each person. Generative and predictive AI will power "next-best-experience" decisions, making every interaction feel incredibly relevant and perfectly timed.
  2. Conversational AI Becomes Your Best Server. Chatbots are no longer just for FAQs. They are becoming intelligent assistants that can resolve issues, escalate with all the right context, and learn from every chat. The best experiences will combine smart automation with real human empathy, so customers always feel understood.
  3. Service Fixes Itself. Predictive insights will flow into operations, allowing systems to fix problems before they even impact customers. Think of billing errors being corrected automatically or parts being replaced before they fail. This minimizes disruption and keeps things running smoothly.
  4. Trust Becomes Your Brand’s Superpower. As AI takes center stage, transparency and ethics matter more than ever. Customers expect clear explanations of how their data is used and confidence that decisions are fair. Brands that build trust into every experience will truly shine.


Building your smart CX kitchen

Delivering predictive CX is not about adding just one more gadget. It is about creating a unified, event-driven ecosystem. This includes:

  • A solid data foundation that brings together customer profiles and real-time signals.
  • An AI and decisioning layer to predict intent and recommend the most helpful next steps.
  • Experience orchestration to trigger proactive outreach across all your channels.
  • Operational automation for those self-healing processes.
  • Governance and trust controls for transparency and clear explanations.


How to start cooking up the future

  • Map your most important customer journeys. Start by identifying those key signals that hint at intent, risk, or value.
  • Build smart decision-making logic. Set clear boundaries for automation and when to bring in human help.
  • Don’t let perfection stall progress. To get started, pilot a few proactive ideas that clearly move the CX needle, such as helpful delivery updates or churn-prevention offers.
  • Scale your conversational AI to make sure customer context follows them across all channels.
  • Measure what works and keep optimizing and experimenting.
  • Make trust a core ingredient by using transparent notices and carefully overseeing your AI models.


Looking ahead – answering the question before it’s asked

Predictive CX is not just a nice-to-have for customers; it is a game-changer for your business. It boosts retention by catching churn risk early, reduces service costs by preventing avoidable contacts, and drives revenue by surfacing the right offer at the perfect moment. Plus, it makes your operations far more efficient by avoiding downstream issues.

Customer experience is shifting fast. In the next year, anticipation will be a customer expectation. The brands that win will combine data, predictive intelligence, and automation built on trust and transparency. Begin laying the groundwork today so you can deliver experiences that feel effortless and intuitive.

]]>
https://blogs.perficient.com/2025/12/16/setting-the-table-for-tomorrow-cx-mastery-in-2026/feed/ 1 389093
The Visual Revolution Is Here – Drupal Canvas 1.0 https://blogs.perficient.com/2025/12/15/the-visual-revolution-is-here-drupal-canvas-1-0/ https://blogs.perficient.com/2025/12/15/the-visual-revolution-is-here-drupal-canvas-1-0/#respond Mon, 15 Dec 2025 13:00:35 +0000 https://blogs.perficient.com/?p=389066

For years, the Drupal community has been engaged in a delicate balancing act. On one side, we have the “Structured Data Purists”—those who value Drupal for its rigid data modeling, fieldability, and API-first architecture. On the other side, we have the “Marketing Realists”—the content creators and site builders who look at tools like Squarespace, Wix, or Webflow and ask, “Why can’t my enterprise CMS be this easy to use?”

Historically, Drupal’s answer to this was a compromise. We gave you Layout Builder, which was powerful but often clunky. We gave you Paragraphs, which offered structure but lacked true visual editing. The result was often a disjointed experience where the backend power was unmatched, but the frontend authoring experience felt a decade behind.

That era effectively ends. With the release of Drupal Canvas 1.0 (formerly developed under the “Experience Builder” initiative), Drupal has finally delivered the missing piece of the puzzle. This is not just a new module; it is a fundamental reimagining of how we build, theme, and manage Drupal sites.

This guide will take you through exactly what Drupal Canvas is, what is included in the 1.0 release, and why it represents the single biggest leap forward for the platform in the modern web era.


What is Drupal Canvas?

Drupal Canvas is the new, default visual page builder for Drupal, designed to replace the aging Layout Builder and bridge the gap between “No-Code” ease of use and “Pro-Code” structural integrity.

It was born out of the Drupal Starshot initiative, which aims to make Drupal accessible to non-developers right out of the box. However, unlike previous attempts at “easy” site builders, Drupal Canvas does not sacrifice the underlying architecture. It is built on a modern React frontend that communicates seamlessly with Drupal’s core APIs.

The core philosophy of Drupal Canvas is “Visual Control with Architectural Safety.” It allows marketers to drag, drop, and style components in a true WYSIWYG (What You See Is What You Get) environment, while ensuring that every pixel they manipulate is backed by clean, reusable code defined by developers.

The End of the “Blank Screen” Problem

One of the biggest hurdles for new Drupal users has always been the “Blank Screen.” You install Drupal, and you get a plain white page. You have to build content types, configure views, and code a theme before you see anything.

Drupal Canvas flips this. When paired with the new Mercury theme (the default frontend for Drupal CMS v2) and Site Templates, users start with a fully visualized design system. They aren’t building from scratch; they are assembling experiences from a library of polished, brand-compliant components.


What’s Included in Drupal Canvas 1.0?

The 1.0 release is packed with features that target both the content editor’s need for speed and the developer’s need for order. Here is a detailed breakdown of the key components included in this release.

1. The React-Based Visual Editor

The heart of Drupal Canvas is a lightning-fast, React-based interface. Gone are the days of waiting for AJAX throbbers to spin while a block saves.

  • True Drag-and-Drop: You can drag components from a sidebar directly onto the page canvas. The interface is smooth, mimicking the fluidity of SaaS site builders.
  • Live Viewport Preview: The editor includes a native viewport switcher. You can instantly toggle between Desktop, Tablet, and Mobile views to see how your layout responds. This isn’t just a CSS trick; it simulates the rendering to ensure responsive integrity.
  • Instant Property Editing: Click on any component—a button, a header, a card—and a sidebar opens with its properties. Change the text, swap an image, or adjust the alignment, and the canvas updates instantly.

2. Native Integration with Single Directory Components (SDC)

This is the “Architect” feature that makes Drupal Canvas revolutionary. In the past, site builders generated “magic code” that developers hated because it was hard to maintain.

Drupal Canvas is built entirely on Single Directory Components (SDC).

  • One Source of Truth: A component (e.g., a “Hero Banner”) is defined in code with its Twig template, CSS, and JS all in one folder. Drupal Canvas simply discovers this component and exposes it to the UI.
  • No Code Forking: When a developer updates the CSS for the “Hero Banner” in the code repository, every instance of that banner inside Drupal Canvas updates automatically. There is no separation between “Builder Components” and “Code Components.”
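
To make that concrete, a single directory component’s folder looks roughly like the sketch below. The names here are illustrative, and the *.component.yml file is where the props and slots that Canvas discovers are declared:

my_theme/components/hero-banner/
  hero-banner.component.yml   (schema: props, slots, metadata)
  hero-banner.twig            (markup template)
  hero-banner.css             (component styles)
  hero-banner.js              (optional behavior)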

3. Props and Slots Architecture

Drupal Canvas introduces a standardized way to handle data, borrowed from modern frontend frameworks:

  • Props (Properties): These are the settings of a component. For example, a “Card” component might have Props for Image, Title, Description, and Link. In the Canvas UI, these appear as simple form fields.
  • Slots: These are “drop zones” inside a component. A “Grid” component might have four “Slots.” You can drag other components into these slots. This allows for nesting and complex layouts (e.g., putting a “Video Player” inside a “Modal” inside a “Grid”) without breaking the code.
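
If you think in React terms, the mapping is familiar: Props are a component’s configuration, and a Slot behaves like children. The sketch below is an illustrative analogy in JSX, not Drupal’s actual Twig/YAML component definition:

// Illustrative analogy only: a "Card" with four props and one slot.
function Card({ image, title, description, link, children }) {
  return (
    <article className="card">
      <img src={image} alt="" />
      <h3><a href={link}>{title}</a></h3>
      <p>{description}</p>
      {/* "children" plays the role of a slot: a drop zone for nested components */}
      <div className="card__body">{children}</div>
    </article>
  );
}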

4. In-Browser Code Editor

For the “Low-Code” developer, Drupal Canvas 1.0 includes a shocking amount of power directly in the browser.

  • Edit Logic on the Fly: You can actually modify the CSS or lightweight logic of a component directly within the Canvas interface using a built-in code editor.
  • JSON:API Client: The builder includes a client to fetch data dynamically. You can create a component that says “Show the latest 3 blog posts,” and configure the data fetching query right in the browser, bridging the gap between a static design and dynamic Drupal content.
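
As a rough illustration, the query behind “show the latest 3 blog posts” boils down to a request like the following sketch. It assumes Drupal core’s JSON:API module is enabled and an article content type exists; adjust the path and type names to your site:

// Minimal sketch: fetch the three most recent articles via JSON:API.
async function fetchLatestArticles() {
  const url = '/jsonapi/node/article?sort=-created&page[limit]=3';
  const response = await fetch(url, {
    headers: { Accept: 'application/vnd.api+json' },
  });
  const { data } = await response.json();
  return data.map((node) => node.attributes.title);
}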

5. The “Byte” Site Template & Mercury Theme

Drupal Canvas 1.0 doesn’t come empty-handed.

  • The Mercury Theme: Replacing Olivero as the default face of Drupal CMS, Mercury is a modern, component-first theme designed specifically to work with Canvas. It is accessible (WCAG AA) and responsive by default.
  • Byte Template: A pre-configured site template included in the release. It demonstrates the power of Canvas by providing a full “Agency” style website structure (Services, About, Contact, Portfolio) ready to be customized.

6. Multi-Step Undo/Redo & History

It sounds basic, but for enterprise teams, this is critical. Drupal Canvas maintains a robust history stack. If a content editor accidentally deletes a complex section, they can hit “Undo” (Command+Z) to restore it instantly. This safety net encourages experimentation without fear of breaking the site.


How Drupal Canvas Transforms the Workflow

The arrival of Drupal Canvas 1.0 fundamentally changes the lifecycle of a Drupal project. It solves the friction points that have historically slowed down enterprise deployments.

For the Content Marketer: “Independence Day”

The biggest winner here is the marketing team. In the pre-Canvas era, creating a new landing page usually meant filing a ticket with IT/Engineering. “We need a new layout for the Q3 campaign.”

  • The Old Way: The developer builds a custom content type or template. Two weeks later, the marketer gets to enter text.
  • The Canvas Way: The marketer opens Drupal Canvas. They drag a “Hero” component, a “Two-Column Text” component, and a “Sign-Up Form” component onto the page. They tweak the colors to match the campaign branding using the “Design Tokens” exposed in the UI. They hit publish.
  • Result: Time-to-market drops from weeks to minutes.

For the Developer: “Governance Without Grunt Work”

Developers often fear page builders because they produce “spaghetti code” that is impossible to maintain. Drupal Canvas respects the developer’s craft.

  • Focus on Components, Not Pages: Developers stop building “pages.” Instead, they build robust, accessible SDC components. Once a component is built (e.g., a complex “Pricing Calculator”), they push it to the library.
  • Governance: The developer controls the constraints. They can define in the code that a “Testimonial Slider” allows for 3 to 5 slides, but no more. The Canvas UI enforces this. The marketer cannot break the layout because the code forbids it.
  • Frontend Freedom: Because Canvas creates a clean separation via SDC, frontend developers can use modern tools (Tailwind, React, PostCSS) inside their components without worrying about Drupal’s legacy rendering pipeline interfering.

For the Agency/Enterprise: “Scalable Design Systems”

For large organizations managing hundreds of sites (Universities, Government, Multi-brand Corps), Canvas is a design system enabler.

  • Brand Consistency: You can deploy a “Global Design System” module. Every site in the portfolio gets the same set of approved components in their Canvas library.
  • Centralized Updates: If the brand color changes from Blue to Navy, you update the Design Token in the central theme. Every page built with Canvas across the ecosystem updates instantly.


The Strategic Edge – AI and Future Proofing

Drupal Canvas 1.0 is not just catching up to competitors; it is positioning Drupal to leapfrog them via the Starshot AI integration.

The AI Assistant

Demonstrated in the Driesnote and part of the roadmap for Drupal CMS v2, Canvas is designed to host AI agents.

  • Generative Layouts: A user will be able to click a button in Canvas and type: “Build me a pricing page with a comparison table and an FAQ section.” The AI will select the correct components from your SDC library and assemble the page for you.
  • Content Rewriting: Inside the Canvas properties panel, AI can assist in rewriting headlines for SEO or adjusting tone, directly within the visual flow.

Headless Readiness

Unlike many page builders that lock you into a specific rendering engine, the data saved by Drupal Canvas is structured. This means it can be decoupled. You can use Drupal Canvas to visually build a page, and then serve that layout data via JSON:API to a Next.js or React application on the frontend. It bridges the gap between “Visual Editing” and “Headless Architecture.”


Conclusion

For many years, choosing a CMS was a choice between power and ease. You picked WordPress or Squarespace for ease, and you picked Drupal for power.

Drupal Canvas 1.0 eliminates the need to choose.

It provides the slick, intuitive authoring experience that modern web users demand, but it lays it on top of the most secure, scalable, and structured data engine in the world. It creates a workflow where developers can code rigorous standards and marketers can exercise creative freedom, simultaneously and harmoniously.

If you are currently running Drupal with Layout Builder, Paragraphs, or a legacy theme, the release of Drupal Canvas is your signal to start planning your evolution. The future of Drupal is not just about managing code; it’s about crafting experiences, and finally, we have the canvas to do it properly.

]]>
https://blogs.perficient.com/2025/12/15/the-visual-revolution-is-here-drupal-canvas-1-0/feed/ 0 389066
Perficient Named a Major Player in 2 IDC MarketScape Reports https://blogs.perficient.com/2025/12/11/perficient-named-a-major-player-in-2-idc-marketscape-reports/ https://blogs.perficient.com/2025/12/11/perficient-named-a-major-player-in-2-idc-marketscape-reports/#respond Thu, 11 Dec 2025 18:19:34 +0000 https://blogs.perficient.com/?p=389027

Perficient is proud to be named a Major Player in the IDC MarketScape: Worldwide Experience Build Services 2025 Vendor Assessment (Doc #US52973125, October 2025) and IDC MarketScape: Worldwide Experience Design Services 2025 Vendor Assessment (Doc #US52973225, October 2025). These IDC MarketScapes assessed providers, offering a comprehensive framework including product and service offerings, capabilities and strategies, and current/future market success factors.

“We believe being recognized by IDC for Experience Design and Experience Build reinforces the impact we have on behalf of clients creating personalized, seamless interactions that accelerate growth. In today’s experience-driven economy, that’s the competitive advantage that matters,” says Erin Rushman, general manager of digital marketing and experience design operations at Perficient.

What This Inclusion Means for Perficient

Being named a Major Player, we believe, underscores our dedication to transforming customer experiences and empowering businesses through personalized, seamless, and impactful interactions. Perficient combines strategy and research with human-centered design to help organizations craft agile, customer-focused solutions that thrive in dynamic markets. By leveraging data-driven insights, personalization, AI, and more, we deliver end-to-end experiences that deepen engagement and drive measurable business impact.

According to the IDC MarketScape for Experience Design Services, “Perficient has strong capabilities in digital offering design and offers leading-edge experience design services backed by a global innovation network.” The report also notes, “In conversations with Perficient’s reference clients, the three areas where experience design services buyers commended the vendor highly were for the quality of its professionals, for its industry specific capabilities, and differentiation as a vendor.”

The IDC MarketScape for Experience Build Services states, “As an independent digital experience agency, Perficient combines business and technology transformation capabilities, including a robust collection of supporting assets and tools, with a focus on the design and build of customer experiences. Perficient has strong personalization capabilities.”

Additionally, Perficient was named a Major Player in the IDC MarketScape for Customer Experience Strategy Consulting Services 2025 Vendor Assessment (Doc #US52973025, September 2025). We believe this inclusion reflects our commitment to delivering AI-first solutions that transform customer experiences through scalable, high-impact innovations. It establishes Perficient as a trusted partner, driving unmatched success in the experience-driven market of tomorrow.

Read the News Release: Perficient Named a Major Player in Three IDC MarketScapes For AI-First Approach to Customer Experience

What This Inclusion Means for Our Clients

Perficient continues to be a leader in experience strategy and design, helping clients align vision, accelerate innovation, and achieve lasting transformation. We enable businesses to embed AI into processes and deliver personalized customer experiences at scale. By expanding and strengthening alliances with partners, we ensure our solutions remain innovative and leading-edge, empowering clients to stay ahead in a dynamic market.

Exceptional CX is essential for growth and loyalty. Our expertise across platforms and global delivery ensures brands can quickly adapt, innovate, and meet rising customer expectations. Explore our expertise to see how we can be a partner in your experience journey.

]]>
https://blogs.perficient.com/2025/12/11/perficient-named-a-major-player-in-2-idc-marketscape-reports/feed/ 0 389027
Migrating React from version 18 to 19 https://blogs.perficient.com/2025/12/10/migrating-react-from-version-18-to-19/ https://blogs.perficient.com/2025/12/10/migrating-react-from-version-18-to-19/#comments Wed, 10 Dec 2025 07:20:16 +0000 https://blogs.perficient.com/?p=388890

React 19 reached its release candidate in April 2024 and shipped as stable in December 2024, building directly on React 18. The release introduces major improvements and removes several long-deprecated features to improve developer experience and application performance. The migration itself is straightforward, but you need to watch for the removed features. In this blog, I share my experience migrating React to version 19.

Key Features added in React 19

  1. Optimized Concurrent Rendering: React introduced concurrent rendering in version 18; React 19 refines the scheduler further so updates adapt more smoothly to user interactions. There is no special opt-in flag; the improvements arrive automatically through the standard root API:
const root = ReactDOM.createRoot(document.getElementById('root'));
root.render(<App />);
  2. New hooks introduced:
    • useActionState → updates state based on the result of a form submission.
    • useFormStatus → exposes status information about the last form submission.
    • useOptimistic → lets you optimistically update the UI while an async action completes.
  3. Actions API: React 19 introduces Actions to handle mutations and the state updates that follow. An Action is an async function that can be passed straight to a form, as in the sketch below.
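
Here is a minimal sketch of an Action wired up through useActionState (the form and action names are hypothetical):

import { useActionState } from 'react';

// An Action: an async function that receives the previous state and the form data.
async function submitName(previousState, formData) {
  const name = formData.get('name');
  return name ? `Hello, ${name}!` : 'Please enter a name.';
}

function NameForm() {
  // message: the action's last return value; formAction: passed to <form action>;
  // isPending: true while the action is running.
  const [message, formAction, isPending] = useActionState(submitName, null);
  return (
    <form action={formAction}>
      <input name="name" />
      <button type="submit" disabled={isPending}>Submit</button>
      {message && <p>{message}</p>}
    </form>
  );
}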

Deprecations & removals to consider

  • String refs were removed. Use callback refs, createRef, or useRef instead (see the example below).
  • createFactory, ReactDOM.render, ReactDOM.hydrate, and the legacy element.ref access were removed; use createRoot and hydrateRoot in their place.
  • propTypes and defaultProps were removed for function components; use TypeScript or default parameters instead.
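
For class components, the string-ref replacement looks like this minimal sketch (the component name is hypothetical):

import { Component, createRef } from 'react';

class SearchBox extends Component {
  // Replaces the removed pattern: <input ref="input" /> plus this.refs.input
  inputRef = createRef();

  focusInput = () => {
    this.inputRef.current?.focus();
  };

  render() {
    return (
      <div>
        <input ref={this.inputRef} />
        <button onClick={this.focusInput}>Focus</button>
      </div>
    );
  }
}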

Step by step instructions for React 18 to 19 migration

Step 1: Audit packages

Audit your package.json file and confirm that every dependency supports React 19; upgrade any packages that do not. npm outdated lists packages with newer versions available, and npm audit flags known vulnerabilities.

Note: It is highly recommended to keep packages on their latest versions.

npm outdated
npm audit

Step 2: Update the React version

npm install react@^19.0.0 react-dom@^19.0.0
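
If your project uses TypeScript, upgrade the React type packages in the same step so the types match the runtime (these are the standard DefinitelyTyped packages, not part of React itself):

npm install --save-dev @types/react@^19.0.0 @types/react-dom@^19.0.0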

Step 3: Run codemods

npx codemod@latest react/19/migration-recipe

The codemod recipe updates the following:

  • Replaces ReactDOM.render with createRoot and ReactDOM.hydrate with hydrateRoot (see the before/after sketch below).
  • Converts string refs to callback refs or useRef.
  • Removes legacy context APIs (contextTypes, getChildContext).
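
For server-rendered apps, the ReactDOM.hydrate change looks like this before/after sketch (App and its import path are placeholders for your own root component):

// Before (removed in React 19):
// import ReactDOM from 'react-dom';
// ReactDOM.hydrate(<App />, document.getElementById('root'));

// After:
import { hydrateRoot } from 'react-dom/client';
import App from './App'; // hypothetical path to your root component

hydrateRoot(document.getElementById('root'), <App />);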

Step 4: Address breaking changes

  • Our project used Material UI, where the Grid2 component was no longer supported, so we switched from Grid2 to Grid.
  • React removed string refs, so we replaced them with createRef.
  • We moved to the modern Context API (createContext consumed via useContext) in place of the legacy context patterns.

Step 5: Test and validate the application

Run the application and test the end-to-end flows. Watch for regressions in performance, server-side rendering, and lazy loading as well.

Step 6: Post-migration cleanup

Update your lint/ESLint rules for React 19 and remove unused imports and packages. Add developer notes documenting the breaking issues you faced and how you resolved them during the migration.

References:

  1. React 19 Upgrade Guide – React
  2. Built-in React Hooks – React
  3. React 18 to 19 Migration – Codemod.com
]]>
https://blogs.perficient.com/2025/12/10/migrating-react-from-version-18-to-19/feed/ 1 388890
Salesforce Marketing Cloud + AI: Transforming Digital Marketing in 2025 https://blogs.perficient.com/2025/12/05/salesforce-marketing-cloud-ai-transforming-digital-marketing-in-2025/ https://blogs.perficient.com/2025/12/05/salesforce-marketing-cloud-ai-transforming-digital-marketing-in-2025/#respond Fri, 05 Dec 2025 06:48:04 +0000 https://blogs.perficient.com/?p=388389

Salesforce Marketing Cloud + AI is revolutionizing marketing by combining advanced artificial intelligence with marketing automation to create hyper-personalized, data-driven campaigns that adapt in real time to customer behaviors and preferences. This fusion drives engagement, conversions, and revenue growth like never before.

Key AI Features of Salesforce Marketing Cloud

  • Agentforce: An autonomous AI agent that helps marketers create dynamic, scalable campaigns with effortless automation and real-time optimization. It streamlines content creation, segmentation, and journey management through simple prompts and AI insights. Learn more at the Salesforce official site.

  • Einstein AI: Powers predictive analytics, customized content generation, send-time optimization, and smart audience segmentation, ensuring the right message reaches the right customer at the optimal time.

  • Generative AI: Using Einstein GPT, marketers can automatically generate email copy, subject lines, images, and landing pages, enhancing productivity while maintaining brand consistency.

  • Marketing Cloud Personalization: Provides real-time behavioral data and AI-driven recommendations to deliver tailored experiences that boost customer loyalty and conversion rates.

  • Unified Data Cloud Integration: Seamlessly connects live customer data for dynamic segmentation and activation, eliminating data silos.

  • Multi-Channel Orchestration: Integrates deeply with platforms like WhatsApp, Slack, and LinkedIn to deliver personalized campaigns across all customer touchpoints.

Latest Trends & 2025 Updates

  • Agentic AI that independently manages and adjusts campaigns for optimal results, with minimal hands-on tuning.

  • Real-time customer journey adaptations powered by live data.

  • Enhanced collaboration via AI integration with Slack and other platforms.

  • Automated paid media optimization and budget control with minimal manual intervention.

For detailed insights on AI and marketing automation trends, see this industry report.

Benefits of Combining Salesforce Marketing Cloud + AI

  • Increased campaign efficiency and ROI through automation and predictive analytics.

  • Hyper-personalized customer engagement at scale.

  • Reduced manual effort with AI-assisted content and segmentation.

  • Better decision-making powered by unified data and AI-driven insights.

  • Greater marketing agility and responsiveness in a changing landscape.

]]>
https://blogs.perficient.com/2025/12/05/salesforce-marketing-cloud-ai-transforming-digital-marketing-in-2025/feed/ 0 388389