Today in our “We Are Perficient” series, we explore how businesses can take their digital experience to the next level through mobile optimization. In an exclusive conversation with Jonathan Crockett, Managing Director of Go-To-Market, Sales, and Solutions at Perficient, we dive into key strategies to ensure brands deliver seamless, high-impact experiences on mobile devices.
In today’s digital world, user experience is everything. Companies looking to stand out must provide seamless, personalized, and optimized interactions at every touchpoint. In this video, we explore how the combination of Artificial Intelligence, advanced digital experience strategies, and collaboration with technology leaders like Adobe is redefining the way brands connect with their customers.
Today, most digital interactions happen on mobile devices. Without a well-optimized mobile strategy, brands risk losing conversions and engagement. From ultra-fast loading times to intuitive and accessible interfaces, mobile optimization is no longer optional—it’s essential to improving customer retention and conversion rates.
Artificial intelligence is transforming user experiences by enabling real-time personalization based on data. From content recommendations to adaptive interfaces that respond to user behavior, AI helps deliver unique and relevant experiences at every interaction. This not only enhances customer satisfaction but also boosts lifetime value and brand loyalty.
As an Adobe strategic partner, Perficient helps businesses unlock the full potential of Adobe’s cutting-edge solutions. From Adobe Experience Manager to Adobe Sensei, our strategies merge creativity and technology to design immersive, scalable, and highly effective digital experiences.
The future of digital experience lies in personalization, optimization, and continuous innovation. If you’re looking to transform how your customers interact with your brand, Perficient can help.
Contact us today and discover how we can elevate your digital strategy.
Adobe Summit 2025 is officially a wrap, and I have pages and pages of notes to go through with my team to plan out our year! The two terms that appear the most in my notes are Performance and Personalization. Customer expectations continue to rise, and the demand is higher than ever for lightning-fast sites that keep shoppers engaged, with hyper-personalization that lets them know brands care about them, leading to loyalty and customer retention. These themes were consistent across the sessions I attended throughout the week.
ADOBE COMMERCE AS A CLOUD SERVICE
This summer, Adobe Commerce as a Cloud Service will officially be available. This high-speed storefront is a fully SaaS solution that can be provisioned in just a few minutes. It is version-less and maintained by Adobe, eliminating the maintenance costs that come along with patches and upgrades. Catalog Service, Intelligent Merchandising, Payment Services, Product Asset Management, and Developer Tools are added on top of the existing Commerce Foundation.
The results are super-fast experiences delivered from the edge, perfect Lighthouse scores, boosts to search engine rankings, more organic traffic, and higher conversions. Add in generative AI to rapidly build personalized variations of content, alongside analysis of shopper behavior and sales data, for the next-level personalization that keeps customers coming back.
ADOBE COMMERCE OPTIMIZER
Available at the same time, Adobe Commerce Optimizer offers the same features as Cloud Service but allows customers to keep their existing commerce back-end. The experience layer provides the storefront, catalog and merchandising tools, allowing for quick wins with ROI and modernization. Pre-built APIs and Connectors can be utilized and tweaked as needed, to get to launch as quickly as possible. Then, if desired, the commerce back-end can be migrated later, at the merchant’s pace to realize the full set of benefits that come with Cloud Service. Adobe offers migration tools to make the transition smoother.
KEY FEATURES
The new Storefront is powered by Edge Delivery, resulting in four times faster page loads, a fifteen percent or higher increase in organic traffic, improved search engine rankings, and Lighthouse scores of ninety or higher. Separating the front-end from Commerce Foundation in a composable fashion, and loading data in, increases stability and security and decreases total cost of ownership.
Authoring options include drag-and-drop visual editing and document-based authoring via SharePoint or Google. Content creation powered by generative AI helps create personalized experiences from content variations, which can then be validated with built-in A/B testing. This helps merchandisers find the right content to display to the right customer at the right time.
With integrated digital asset management powered by AEM Assets, gone are the days of uploading product images and other media directly into Commerce. Through the power of AI and integration with Firefly and Express, product image variations can be generated in bulk and stored in the DAM. The images are easily linked to SKUs, making them immediately available for product pages once they are published. Enhanced experiences are possible with product variations, 3D models, and augmented reality.
Intelligent Merchandising can increase conversion rates and order values by tailoring search results, category pages, and product recommendations based on customer behavior, account history data, and business goals.
SCALABILITY AND REDUCED TCO
Catalog Service can handle 250 million SKUs with 30K prices each, syndicated across multiple channels and audiences without any duplication. By using a single catalog, 10K products and 50K prices can be ingested per minute from integrated systems.
The cloud-native platform scales dynamically for increased traffic and order volumes. 330 data centers and API orchestration on the edge allow for 99.9% availability and over 10K requests per minute with ease.
Continuous updates managed by Adobe, and instant access to new features turned on via toggle when ready, remove hours and hours of maintenance. Boilerplate themes and drop-in components allow sites to be built rapidly with less development needed. Starter kits, APIs, events, webhooks, and marketplace apps speed up integrations and time to launch. This reduction in operating costs frees up resources to fine-tune strategies for rapid, continued growth.
OMNI-CHANNEL SUPPORT
B2B buyers expect the same personalized experiences at work that they receive through personal interactions with brands in their daily lives. Over 60% of B2B buyers develop selection criteria or finalize a vendor list based solely on content available to them. Adobe makes this achievable by supporting B2C, B2B, and B2B2X all on the same platform. Toggleable features designed for B2B users include a parent-child account model with support for multi-buyer assignments, a storefront context switcher, standing quotes, requisition lists, and contract pricing. All that needs to change is access to them on the storefront, driven by the customer’s account type. The new Storefront has boilerplates for both B2C and B2B.
Experiences on other channels like social media are just as important, providing additional touch points for a brand to interact with customers. Adobe has partnered with Meta (Facebook and Instagram) and TikTok to release apps for seamless integration. Product catalogs, ratings and reviews, and cart rules can be sent from Commerce to the corresponding social platform. Direct integrations with ad accounts allow this data to be used for product promotions. The customer can then use social shops, or link directly to a brand’s website to purchase products, with all orders automatically flowing through Commerce.
The well-known Extension Marketplace is also receiving some updates and becoming the Apps Marketplace. Pre-built apps from third parties can be acquired to enhance native features or provide features that might be specific to only certain industries. These apps are built on API Mesh and App Builder, which can also be used by brands to build their own apps to solve their unique business needs.
ADVANCED ADOBE EXPERIENCE CLOUD FEATURES
Adobe Commerce data, including storefront clicks, back-office fulfillment information, and customer profiles can be shared with AEP for use with other Adobe applications. CDP can be used to analyze data and build audiences for advertisements, merchandising, and abandoned cart campaigns. AJO can be used for personalized journeys and offers. Marketo can be used for automated marketing campaigns, account nurturing, interactive webinars, and dynamic chat. These are just some of the possibilities!
AEM Sites Optimizer uses an AI agent to identify issues and propose resolutions, showing immediate value. The entire funnel is optimized, from acquisition to brand engagement and ultimately conversion. For each issue it identifies, the agent projects lost traffic and revenue and suggests resolutions that can be reviewed and deployed with a few clicks.
It is estimated that 39% of consumers already use AI for online shopping, and over half of Fortune 500 companies will adopt experience agents to match that growing expectation. Adobe Brand Concierge is a virtual assistant that uses first-party data to deliver true personalization. Built on AEM, CJA, AJO, and AEP, it is connected to assets, insights, profiles, and campaign orchestration. Customers can interact with Brand Concierge through welcome or purchase confirmation emails, or once they log into their account again after onboarding. Using account history and profile preferences, Brand Concierge interacts with customers using buttons and text or voice prompts. Curated search results and product recommendations can be shared and easily purchased, making it the ultimate shopper assistant.
WHAT’S NEXT?
Adobe remains committed to their customers and will continue supporting the existing on-premises and PaaS versions of Commerce. The nearly 9 million lines of Commerce Foundation code remain unmodified, and the business logic will continue to be available and supported. If you would like to learn more about any of these solutions, or plan out an incremental approach to implementation, Perficient is here to help. Please contact us to set up time to review some of our success stories and discuss your future customer experience and commerce strategy. Talk to you soon!
In this post, we’ll take a look at a couple of Sitecore SDKs commonly used to build XM Cloud head applications. Specifically, we’ll be looking at how to disable cookies used by these SDKs. This can be useful for data privacy and/or regulatory compliance reasons. These SDKs allow your application to integrate with other composable Sitecore services like analytics, personalization, and search. The cookies these SDKs use need to be considered as part of your application’s overall data protection strategy.
It’s worth noting that, even without any additional SDKs, an XM Cloud head application can issue cookies; see XM Cloud visitor cookies for more information.
The Sitecore Cloud SDK allows developers to integrate with Sitecore’s Digital Experience Platform (DXP) products. These include Sitecore CDP, Sitecore Personalize, etc. You can read the official documentation here. To learn more about the first-party cookies used by this SDK, see Cloud SDK cookies. These cookies include:
Sitecore is actively working on integrating the disparate Sitecore SDKs into the Sitecore Cloud SDK. The latest version, 0.5, was released on January 29, 2025, and added search capabilities (see the XM Cloud changelog entry here). As Sitecore’s Technical Product Manager Christian Hahn put it in this recent Discover Sitecore YouTube video:
“…[the] Cloud SDK is not another Sitecore SDK–it is the Sitecore SDK.”
It’s safe to assume that, eventually, the Sitecore Cloud SDK will be the only Sitecore SDK developers need to include in their head applications to integrate with any other Sitecore DXP offerings (which will be nice).
For the remainder of this post, assume that a pre-0.5 version of the Cloud SDK is in use, say, 0.3.0—any version that doesn’t include search widgets (such that the Search JS SDK for React is still required).
The Search JS SDK for React allows developers to create components such as search bars, search results components, etc. These components interact with search sources defined and indexed in Sitecore Search. You can read the official documentation here. While the latest version of the Cloud SDK includes some search dependencies, for older Next.js applications using older versions of the Cloud SDK, the Search JS SDK for React can still be used to build search interfaces.
The Search JS SDK for React uses a cookie to track events context called __ec (reference). This SDK is historically based on Sitecore Discover whose cookies are similarly documented here, e.g., __rutma.
For the remainder of this post, assume that version 2.5.5 of the Search JS SDK for React is in use.
Let’s say your XM Cloud project leverages JSS for Next.js, including the multisite add-on. This add-on (which is included in the official starter kit by default) allows a single Next.js application to drive multiple headless sites. Next, let’s assume that some of these sites operate outside of the United States and are potentially subject to different data protection and privacy laws. Finally, let’s assume that not all of the sites will use the full feature set from these SDKs. For example, what if a couple of the sites are small and don’t need to integrate with Sitecore Search at all?
How do you disable the cookies written to the browser when the Search SDK’s <WidgetsProvider> component is initialized? Even though the smaller sites aren’t using search widgets on any of their pages, the <WidgetsProvider> component is (usually) included in the Layout.tsx file and is still initialized. We don’t want to remove the component since other sites do use search widgets and require the <WidgetsProvider> component.
Can these SDKs be configured to (conditionally) not create cookies on the client browser?
First and foremost (before we get into how to disable cookies used by these SDKs), know that you must ensure that your application is compliant with any and all data privacy and data protection laws to which it is subject. This includes allowing users to opt-out of all browser cookies. Cookie preferences, their management, third-party solutions, GDPR, CCPA, etc. are all great topics but are well outside the scope of this post. To get started, refer to Sitecore’s documentation on data privacy to understand who is responsible for what when building an XM Cloud application.
With that small disclaimer out of the way, the programmatic hooks discussed in the sections below can be used in conjunction with whatever cookie management solution makes sense for your application. Let’s assume that, for these smaller sites operating in different geographies that require neither CDP nor search, we just want to disable cookies from these SDKs altogether.
The short version: just don’t call the SDK’s init() function. One way this can be done is to add an environment variable and check its value within the .\src\<rendering-host>\src\lib\context\sdk\events.ts file and either return early or throw before the call to Events.init():
import * as Events from '@sitecore-cloudsdk/events/browser';
import { SDK } from '@sitecore-jss/sitecore-jss-nextjs/context';

const sdkModule: SDK<typeof Events> = {
  sdk: Events,
  init: async (props) => {
    // Events module can't be initialized on the server side
    // We also don't want to initialize it in development mode
    if (typeof window === 'undefined')
      throw 'Browser Events SDK is not initialized in server context';
    if (process.env.NODE_ENV === 'development')
      throw 'Browser Events SDK is not initialized in development environment';
    // We don't want to initialize if the application doesn't require it
    if (process.env.DISABLE_CLOUD_SDK === 'true') // <===== HERE
      throw 'Browser Events SDK is not initialized for this site';
    await Events.init({
      siteName: props.siteName,
      sitecoreEdgeUrl: props.sitecoreEdgeUrl,
      sitecoreEdgeContextId: props.sitecoreEdgeContextId,
      // Replace with the top level cookie domain of the website that is being
      // integrated e.g ".example.com" and not "www.example.com"
      cookieDomain: window.location.hostname.replace(/^www\./, ''),
      // Cookie may be created in personalize middleware (server), but if not we should create it here
      enableBrowserCookie: true,
    });
  },
};

export default sdkModule;
By not calling Events.init(), the cookies aren’t written to the browser.
Note that in newer versions of the XM Cloud starter kit using the Cloud SDK, the initialize function may be in the Bootstrap.tsx file; however, the same principle applies—don’t call the initialize() function by either returning early or setting up conditions such that the function is never called.
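Wherever the call happens to live, the guard logic is the same. As a sketch, the conditions above can be factored into a small predicate; note that the DISABLE_CLOUD_SDK variable name is just the convention used in this post, not an SDK setting:

```typescript
// Decide whether the browser Events SDK should be initialized.
// The inputs mirror the checks in events.ts / Bootstrap.tsx.
type SdkEnv = {
  isBrowser: boolean;       // typeof window !== 'undefined'
  nodeEnv?: string;         // process.env.NODE_ENV
  disableCloudSdk?: string; // process.env.DISABLE_CLOUD_SDK (hypothetical variable)
};

function shouldInitCloudSdk(env: SdkEnv): boolean {
  if (!env.isBrowser) return false;                 // never initialize server-side
  if (env.nodeEnv === 'development') return false;  // skip in development mode
  if (env.disableCloudSdk === 'true') return false; // per-site/environment opt-out
  return true;
}
```

In events.ts, this translates to returning early (or throwing) whenever the predicate is false, so Events.init() is never reached and no cookies are written.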
For consistency, assuming your application uses the OOTB CdpPageView.tsx component, you’d probably want to do something similar within that component. By default, page view events are turned off when in development mode. Simply add another condition to ensure that the return value of disabled() is true:
import {
  CdpHelper,
  LayoutServicePageState,
  useSitecoreContext,
} from '@sitecore-jss/sitecore-jss-nextjs';
import { useEffect } from 'react';
import config from 'temp/config';
import { context } from 'lib/context';

/**
 * This is the CDP page view component.
 * It uses the Sitecore Cloud SDK to enable page view events on the client-side.
 * See Sitecore Cloud SDK documentation for details.
 * https://www.npmjs.com/package/@sitecore-cloudsdk/events
 */
const CdpPageView = (): JSX.Element => {
  ...
  /**
   * Determines if the page view events should be turned off.
   * IMPORTANT: You should implement based on your cookie consent management solution of choice.
   * By default it is disabled in development mode
   */
  const disabled = () => {
    return (
      process.env.NODE_ENV === 'development' ||
      process.env.DISABLE_CLOUD_SDK === 'true' // <===== HERE
    );
  };
  ...
  return <></>;
};

export default CdpPageView;
The <WidgetsProvider> component (imported from @sitecore-search/react) includes a property named trackConsent (documented here) and it controls exactly that—whether or not tracking cookies related to visitor actions are created. Setting the value of this property to false disables the various cookies. In the Layout.tsx file, assuming we added another environment variable, the code would look something like this:
/**
 * This Layout is needed for Starter Kit.
 */
import React from 'react';
...
import { Environment, WidgetsProvider } from '@sitecore-search/react';

const Layout = ({ layoutData, headLinks }: LayoutProps): JSX.Element => {
  ...
  return (
    <>
      ...
      <div className="App">
        <WidgetsProvider
          env={process.env.NEXT_CEC_APP_ENV as Environment}
          customerKey={process.env.NEXT_CEC_CUSTOMER_KEY}
          apiKey={process.env.NEXT_CEC_API_KEY}
          publicSuffix={true}
          trackConsent={!(process.env.DISABLE_TRACK_CONSENT === 'true') /* <===== HERE */}
        >
          ...
        </WidgetsProvider>
      </div>
      ...
    </>
  );
};

export default Layout;
If trackConsent is false, then the various __r… cookies are not written to the browser.
It’s worth mentioning that, by default, trackConsent is true. To opt-out of cookies, developers must set the property to false.
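To make the double negative in the JSX above easier to follow, the resolution can be written as a tiny helper. Again, DISABLE_TRACK_CONSENT is an environment variable invented for this post, not part of the Search SDK:

```typescript
// Resolve the boolean handed to <WidgetsProvider trackConsent={...}>.
// trackConsent defaults to true, so only an explicit 'true' opt-out flag disables it.
function resolveTrackConsent(disableFlag?: string): boolean {
  return !(disableFlag === 'true');
}
```

A real implementation would more likely feed this value from your consent management platform’s state rather than a build-time environment variable.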
Whether you control the use of cookies by using environment variables as described in this post or by integrating a more complex cookie preference and consent management system, the onus is on you and your XM Cloud head application to avoid using cookies without a user’s consent.
Thanks for the read!
As I gear up for my presentation at the Adobe Summit this March, I’ve been reflecting on the transformative potential of a well-executed content supply chain—and the hurdles large organizations face in making it a reality. And since it is Summit, I will obviously be referencing tools like Adobe GenStudio, Adobe Workfront, AEM Sites, and AEM Assets, which all aim to streamline content creation, management, and activation. Yet, one pain point consistently rises to the top when implementing this process at scale: siloed teams and disconnected workflows.
In large organizations, it is practically impossible to audit content generation processes because content production often resembles a patchwork quilt rather than a seamless assembly line, with each patch representing a different department or agency and its own processes, which makes stitching them together difficult.
For example, the Marketing team might be crafting campaigns within the agency, while design teams work in isolation on visuals that get handed off to the dev team for assembly within their isolated channel teams. This fragmentation isn’t just a minor inconvenience—it’s the number one barrier to achieving an efficient content supply chain.
This pain point isn’t insurmountable, however. The Adobe Experience Cloud is designed specifically to bridge these process clusters and help orchestrate tasks across teams, ensuring everyone—from copywriters to legal reviewers—is aligned on timelines and deliverables. Adobe has invested significant dollars into creating one holistic solution, now coined GenStudio from Adobe, to streamline content development. The trick is getting everyone on the same page.
Ok, so, yes, it’s the age-old adage of people and processes and not just technology. You probably didn’t need to read this to figure that out, but if it is so obvious, why do so many large organizations struggle to improve? My observation is this: if you don’t have the proper change management and cross-functional training in place, and if you can’t foster a cultural shift toward collaboration, then all the technology in the world won’t help, which makes leadership buy-in critical.
At Adobe Summit, my plan is to review the technology elements but also to dive deeper into how organizations can tackle this pain point head-on, with real-world examples and practical strategies. We have to start by connecting the dots—and the people—behind the content, unlocking the full potential of the organization to scale up content production.
Stay tuned for more insights as I prepare for March, and let me know your thoughts on streamlining content workflows in the comments!
Join us for lunch during Adobe Summit to explore why having a clear, strategic vision is essential before deploying new technologies. We’ll discuss how GenStudio and other tools can fit into your existing content workflow to maximize efficiency and creativity.
We hope to see you there!
Beyond GenStudio: Crafting a Modern Content Supply Chain Vision
Wednesday, March 19 | 11:30 A.M. – 1:30 P.M.
Register
If you’re a UX/UI designer, you’ve probably heard of Figma—the design tool that’s changing the game. Whether you’re crafting a stunning website, a sleek mobile app, or a killer prototype, Figma has everything you need in one place. No heavy downloads, no version mix-ups, just smooth, collaborative, and creative freedom!
So, let’s dive in and see why Figma is a must-have for UX/UI designers and how you can use it to bring your design visions to life.
Designing from Scratch: Creating Web & App Interfaces in Figma
One of the best things about Figma? You can design ANYTHING—from landing pages to full-fledged applications—without needing a separate tool for wireframes, UI elements, or interactive prototypes.
How to Get Started
Create a New File: Open Figma and start a fresh project. Use frames instead of artboards (frames act as your screen sizes).
Use Grids & Layouts: Figma lets you set up grids and columns to keep your designs responsive and well-structured.
Start with Wireframes: Quickly sketch out low-fidelity wireframes before diving into detailed UI elements.
Add UI Components: Drag and drop buttons, icons, and form fields from Figma’s built-in libraries or create your own components for reuse.
Before you know it, you’ll have a polished webpage or app interface ready to go!
Mockups Made Easy: Figma’s Power Tools
Forget about constantly resizing elements or manually adjusting spacing—Figma’s smart features do the heavy lifting for you!
Top Features for Mockups
Auto Layout: Helps elements resize dynamically—perfect for buttons, cards, and responsive designs.
Components & Variants: Create reusable UI elements like buttons with different states (hover, active, disabled, etc.).
Design Systems: Maintain consistency by creating a UI library with colors, typography, and icons.
Plugins & Widgets: Speed up your workflow with plugins like Unsplash for images, Icons8 for icons, and Content Reel for placeholder text.
Want a pixel-perfect design?
Just snap elements into place using Figma’s alignment tools—super easy!
Prototyping: Bringing Your Designs to Life
Figma isn’t just for static screens—it lets you transform designs into interactive prototypes without writing a single line of code!
How to Prototype in Figma
Link Screens Together: Connect buttons to different pages using “Prototype” mode.
Add Micro-Interactions: Use Smart Animate to create smooth transitions between states.
Create Clickable Prototypes: Set up interactions like hover effects, scrolling behavior, and animated overlays.
Test & Iterate: Share a prototype link with your team or clients for feedback—no need to export PDFs or images!
With Figma’s real-time collaboration, anyone can leave comments directly on the prototype, making revisions faster and smoother.
Exploring Different UI & Web Designs in Figma
Figma is versatile, whether you’re designing for:
Websites: Create landing pages, e-commerce stores, or dashboards with responsive layouts.
Mobile Apps: Design intuitive user experiences for iOS and Android.
Dark Mode Interfaces: Easily switch between light and dark themes using components.
Custom UI Kits: Build your own set of buttons, forms, and modals for future projects.
No matter your style, Figma adapts to YOU, not the other way around!
Why UX/UI Designers Love Figma (and You Will Too!)
Cloud-Based: No need to install software—access your designs from anywhere.
Real-Time Collaboration: Work with teammates without version control headaches.
Fast & Lightweight: Unlike heavy design tools, Figma runs smoothly in your browser.
Easy Handoff to Developers: Share designs with devs using Figma’s CSS code inspector.
Cross-Platform: Works on Mac, Windows, and even Chromebooks!
Ready to Master Figma?
Figma is more than just a design tool—it’s your creative playground for designing, prototyping, and collaborating like a pro. So if you haven’t explored it yet, now’s the time to dive in!
Who’s up for a Figma design challenge?
Sitecore Personalize uses goals to track the performance of experiences and experiments. But what happens if your experience is live and you don’t see any executions or goals tracked in the performance tab for the experience? It can be hard to debug, test, and troubleshoot goal tracking, especially if the experience is hidden behind a login, for example. But you can trigger the goal programmatically by making a series of API calls. I’ve created a Postman collection that you can download to make the process easy! Just update the environment with the appropriate values and you’ll be able to trigger your goals in no time!
Create your experience using any variant, page targeting and filtering settings. Expand the goals section and ensure “track performance” is selected, then click Add goals. In this example, I’m using a custom goal. Give your goal a name and set your performance targets. Add the name of your custom event under “track event” and click the save button.
Make sure to preview your experience to ensure it displays properly. But keep in mind that an experience must be live in order for the system to track goals.
In order to trigger a goal in Sitecore Personalize, the experience must be viewed in the same session as the goal event. I’ve set up a Postman collection that simulates a user navigating the site, triggering the experience, and executing the custom event.
In order to track your guest, you need a browser ID (even for API calls). You can obtain a new browser ID by using the “Get Browser Id” call in the Postman collection or by using the existing browser ID from your browser. To get your existing browser ID, open your browser’s dev tools and enter the following command in the console.
engage.getBrowserId()
You can update the environment variables in the Postman collection to use your existing browser ID.
Once you have your browser ID, you can simulate the user journey through the site. I have included five page view calls in the Postman collection along with five page variables. Set the URLs of the pages you want to call, then execute the VIEW requests.
After you have made a few VIEW requests, you’ll want to trigger the experience. You’ll need the friendly ID of the experience, which can be found on the experience page in the grey details box. Add this ID to the environment variables for the Postman collection, then run the “Trigger Experience” request.
Once the experience is triggered, you can trigger your custom event. Add the custom event name and page URL to the environment variables, then execute the “Custom Event” request.
In this example, we have triggered a goal for an experience using a custom event. You can just as easily trigger a page view goal by executing a VIEW request to the proper page after triggering your experience. Similarly, you may want to trigger the identity event if your experience requires the user to log into your site.
It is important to note that the goal will not appear on the report until the browsing session is closed! Your point of sale has a timeout period that will close a browsing session after X minutes of inactivity. When you are testing, you can execute the FORCE CLOSE request to end the session manually.
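If you prefer scripting over Postman, the same session can be sketched in code. The snippet below only builds the ordered list of event descriptors described in this post; the channel and point-of-sale values are assumptions you would replace with your tenant’s actual values, and actually sending each event to the Personalize API is left out:

```typescript
// Event descriptors for one simulated session, in the order the post describes:
// page views, then the custom goal event, then a forced session close.
type PersonalizeEvent = {
  type: string;        // 'VIEW', your custom event name, or 'FORCE_CLOSE'
  channel: string;     // e.g. 'WEB' (assumption)
  pos: string;         // your point of sale
  browser_id: string;  // from engage.getBrowserId() or the "Get Browser Id" call
  page?: string;       // page URL for VIEW events
};

function buildGoalSession(
  browserId: string,
  pos: string,
  pageUrls: string[],
  customEventName: string
): PersonalizeEvent[] {
  const base = { channel: 'WEB', pos, browser_id: browserId };
  return [
    ...pageUrls.map((page) => ({ ...base, type: 'VIEW', page })), // simulate browsing
    { ...base, type: customEventName },                           // fire the goal's custom event
    { ...base, type: 'FORCE_CLOSE' },                             // end the session so the goal reports
  ];
}
```

Triggering the experience itself by friendly ID happens between the VIEW events and the custom event, just as in the Postman collection.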
Once you have triggered the goal and the session has ended, check the performance and operational data of the experience to validate the goal was triggered properly. You will see the successful execution of the experience in one graph and the primary goal in a separate graph.
You can now be confident that your goal is configured correctly and that Sitecore Personalize is tracking it properly.
Have you seen the speed at which the digital landscape is shifting and evolving and thought to yourself, how can I keep up? How can I level up my organization’s digital customer experience and futureproof my website and digital ecosystem to ensure consistent growth for years to come?
The answer might just be a shift to a Composable Digital Experience Platform (DXP) like Sitecore. This is the latest approach to providing digital experiences that offer flexibility, scalability and faster iteration. Sitecore is a true leader in digital experience management and is fully embracing this composable future, while empowering businesses to create personalized experiences for their customers. Let’s take a closer look at what this means for your strategy and how Sitecore can help you navigate this transition.
We are coming from a place where monolithic DXPs were the norm. While these platforms offered convenience, they could be expensive, required regular upgrades, and were difficult to scale, especially with the introduction of AI technologies.
Some of the benefits that migrating to a composable DXP can offer include, but are certainly not limited to:
Sitecore has shifted from a one-size-fits-all platform to a modular ecosystem, where companies can seamlessly integrate custom components, APIs, and third-party platforms. Here are some key areas where Sitecore’s composable DXP is driving results for customers across numerous industries.
As you can see, there are many reasons why a composable DXP makes sense for organizations across all industry verticals, and Sitecore specifically can add a ton of value to Marketing and Technology teams alike in a world of constant change. At Perficient, we have a team of dedicated and experienced folks ready to help you tackle the transformation and transition into the world of Composable DXP. Reach out to us today and see how we can work with you to drive outstanding digital customer experiences for your customers.
Artificial Intelligence (AI) is revolutionizing B2B ecommerce, enabling capabilities such as personalized product recommendations, dynamic pricing, and predictive analytics. However, the effectiveness of these AI-driven solutions depends heavily on the quality of the underlying data. Despite AI’s potential, poor data governance remains a significant challenge in the industry. A recent Statista survey revealed that 25% of B2B ecommerce companies in the United States have fully implemented AI technologies, while 56% are experimenting with them.
As AI adoption grows, B2B companies must address data quality issues to leverage AI’s benefits fully. Anyone who has spent time in the B2B industry will acknowledge that quality data is often a struggle. This article explores the critical importance of clean data in AI applications and offers strategies for improving data governance in the B2B ecommerce sector.
Bad data governance is a pervasive issue in the B2B ecommerce landscape, particularly in industries like manufacturing, where complex supply chains and product catalogs create unique challenges. Here are some of the most common symptoms:
Unlike B2C industries, where streamlined data processes are often a core focus, manufacturing businesses face unique challenges due to their operations’ complexity, reliance on legacy systems, and decentralized structures. Understanding why these problems are so prevalent is key to addressing the underlying causes and fostering long-term improvements.
By recognizing these symptoms and understanding the reasons behind poor data governance, B2B manufacturing companies can take the first steps toward addressing these issues. This foundation is critical for leveraging AI and other technologies to their fullest potential in ecommerce.
AI thrives on data—structured, accurate, and relevant data. For B2B ecommerce, where AI powers everything from dynamic pricing to predictive inventory, clean data isn’t just a nice-to-have; it’s the foundation for success. Without clean data governance, AI systems struggle to provide reliable insights, leading to poor decisions and diminished trust in the technology.
As the B2B commerce world embraces AI, those who recognize and prioritize addressing the industry’s systemic bad-data problem will quickly move to the front of the pack. Garbage in, garbage out: AI tools implemented on top of bad data are doomed to fail because the tools will be ineffective. Meanwhile, those who take the time to build a solid data foundation for AI will overtake the competition. It’s a watershed moment for the B2B industry: those who learn how to get the most value out of AI will pull ahead, while those who refuse to alter their internal workflows because “that’s the way it’s always been done” will see their market share diminish.
Ignoring data governance in the AI era isn’t just a missed opportunity—it’s a liability. Poor data practices lead to inefficient AI models, frustrated customers, and, ultimately, lost revenue. Moreover, as competitors invest in clean data and AI, companies with bad data governance risk falling irreparably behind.
Clean data governance is no longer optional; it’s a strategic imperative in the AI-driven B2B ecommerce landscape. By prioritizing data accuracy and consistency, companies can unlock AI’s full potential and position themselves for long-term success.
Tackling bad data governance is no small feat, but it’s a journey worth undertaking for B2B companies striving to unlock AI’s full potential. The solution involves strategic planning, technological investment, and cultural change. Here are actionable steps businesses can take to clean up their data and ensure it stays that way:
The first step is conducting a thorough data audit—think of it as a spring cleaning for your databases. By identifying gaps, redundancies, and inaccuracies, businesses can reveal the full extent of their data issues. This process isn’t just about finding errors; it’s about creating a baseline understanding of the company’s data health. Regular audits prevent these issues from snowballing into more significant, costly problems.
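As a rough illustration of what a first-pass audit might look like, the sketch below scans records for missing required fields and duplicated SKUs. The field names (`sku`, `name`, `price`) are hypothetical placeholders, not a prescribed schema — any real audit would use your own catalog’s fields.

```python
import csv
from collections import Counter
from io import StringIO

REQUIRED_FIELDS = ["sku", "name", "price"]  # hypothetical schema for illustration

def audit(rows):
    """Return counts of missing required fields and a list of duplicated SKUs."""
    missing = Counter()
    sku_counts = Counter()
    for row in rows:
        for field in REQUIRED_FIELDS:
            if not (row.get(field) or "").strip():
                missing[field] += 1  # blank or absent value
        if row.get("sku"):
            sku_counts[row["sku"]] += 1
    duplicates = [sku for sku, n in sku_counts.items() if n > 1]
    return missing, duplicates

# Tiny in-memory sample standing in for a real product export
sample = StringIO("sku,name,price\nA-1,Widget,9.99\nA-1,Widget,9.99\nA-2,,4.50\n")
missing, dupes = audit(csv.DictReader(sample))
print(dict(missing), dupes)  # → {'name': 1} ['A-1']
```

Even a script this simple establishes the baseline the audit is after: how many records are incomplete, and where duplication is hiding.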
Once the audit is complete, it’s time to set some ground rules. Standardizing data entry processes is critical for ensuring consistency. Clear guidelines for formatting SKUs, recording customer details, and storing supplier information can prevent the chaos of mismatched or incomplete records. Employees should be trained on these standards, and tools like automated forms or validation rules can make compliance seamless.
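A validation rule of the kind described above can be as small as a regular expression plus a normalization step. The SKU format here (three uppercase letters, a dash, four digits) is an invented example of a house standard, not an industry convention.

```python
import re

# Hypothetical house format: "ABC-1234"
SKU_PATTERN = re.compile(r"^[A-Z]{3}-\d{4}$")

def normalize_sku(raw: str) -> str:
    """Normalize common entry variations, then reject anything off-standard."""
    cleaned = raw.strip().upper().replace(" ", "-").replace("_", "-")
    if not SKU_PATTERN.fullmatch(cleaned):
        raise ValueError(f"SKU {raw!r} does not match the standard format")
    return cleaned

print(normalize_sku("abc 1234"))  # → ABC-1234
```

Wiring a check like this into entry forms means mismatched records are caught at the door rather than cleaned up later.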
Of course, even the best data entry standards won’t help if different systems across the organization aren’t communicating. That’s where Master Data Management (MDM) comes in. By centralizing data into a single source of truth, companies ensure that updates in one system are automatically reflected across all others. With MDM in place, teams can work confidently, knowing that their data is accurate and consistent.
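The “single source of truth” idea can be sketched with a toy publish–subscribe pattern: downstream systems register for updates, and every change to the master record is pushed to all of them. Real MDM platforms are vastly more involved; this only illustrates the propagation principle.

```python
class MasterRecord:
    """Toy single source of truth: downstream systems subscribe to master updates."""

    def __init__(self):
        self._data = {}
        self._subscribers = []

    def subscribe(self, callback):
        self._subscribers.append(callback)

    def update(self, key, value):
        self._data[key] = value
        for notify in self._subscribers:
            notify(key, value)  # push the change to every downstream copy

# Two stand-in downstream systems
crm, erp = {}, {}
master = MasterRecord()
master.subscribe(lambda k, v: crm.__setitem__(k, v))
master.subscribe(lambda k, v: erp.__setitem__(k, v))

master.update("customer_email", "buyer@example.com")
print(crm == erp == {"customer_email": "buyer@example.com"})  # → True
```

The design point is that no one edits the CRM or ERP copy directly; every change flows through the master, so the copies can never silently diverge.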
But standardizing and centralizing aren’t enough if you’re already sitting on a mountain of messy data, and cleaning it by hand is prohibitively time-intensive. Enter data cleaning and enrichment tools. AI-powered solutions can quickly identify and correct errors, deduplicate records, and fill in missing fields. These tools don’t just clean up the past; they automate routine processes to keep data clean moving forward.
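One small piece of what such tools do — flagging near-duplicate records — can be approximated with the standard library’s string-similarity matcher. The 0.85 threshold and sample company names are arbitrary; production deduplication uses richer matching across many fields.

```python
from difflib import SequenceMatcher

def near_duplicates(names, threshold=0.85):
    """Flag pairs of record names that are suspiciously similar."""
    pairs = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            # Case-insensitive similarity ratio in [0, 1]
            if SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold:
                pairs.append((a, b))
    return pairs

customers = ["Acme Industrial Supply", "ACME Industrial Supplies", "Baxter Tooling"]
print(near_duplicates(customers))
# → [('Acme Industrial Supply', 'ACME Industrial Supplies')]
```

Pairs flagged this way would typically go to a human reviewer or a merge rule rather than being deleted automatically.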
For many B2B companies, fragmentation is one of the biggest hurdles to clean data. Silos between ERP systems, CRM platforms, and ecommerce tools create inconsistencies that ripple across the business. Breaking down these silos through system integration ensures a unified flow of data, improving collaboration and decision-making across departments. This requires a thoughtful integration strategy, often with the help of IT experts, but the payoff is well worth the effort.
Clean data isn’t just a technical problem—it’s a cultural one. Companies must foster a culture of data ownership, where employees understand the importance of the data they handle and feel accountable for its accuracy. Assigning clear responsibilities, such as appointing a Chief Data Officer (CDO) or similar role, can ensure that data governance remains a priority.
Finally, data governance isn’t a one-and-done project. Continuous improvement is essential. Regular review of data policies and feedback from team members help refine processes over time. Establishing KPIs for data quality can also provide measurable insights into the success of these efforts.
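A data-quality KPI can be as simple as field completeness: the share of required fields that are actually populated. The sketch below is one illustrative metric, assuming the same hypothetical `sku`/`name`/`price` fields as before; real programs track several such measures over time.

```python
def completeness(records, fields):
    """Percentage of required fields populated across all records — a basic data-quality KPI."""
    total = len(records) * len(fields)
    filled = sum(
        1 for record in records for field in fields
        if (record.get(field) or "").strip()
    )
    return round(100 * filled / total, 1) if total else 100.0

records = [
    {"sku": "A-1", "name": "Widget", "price": "9.99"},
    {"sku": "A-2", "name": "", "price": "4.50"},  # missing name
]
print(completeness(records, ["sku", "name", "price"]))  # → 83.3
```

Tracking a number like this release over release turns “our data is getting better” from a feeling into a measurable trend.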
By taking these steps, B2B companies can move from reactive problem-solving to proactive data management. Clean, well-governed data isn’t just the backbone of AI success—it’s a strategic asset that drives better decisions, smoother operations, and stronger customer relationships. In an increasingly data-driven world, those who master their data will lead the way.
In the rapidly evolving landscape of B2B ecommerce, integrating AI technologies offers unprecedented opportunities for growth and efficiency. However, as we’ve explored, the effectiveness of AI is intrinsically linked to the quality of the underlying data. Companies risk undermining their AI initiatives without robust data governance, leading to inaccurate insights and missed opportunities.
Perficient stands at the forefront of addressing these challenges. With extensive experience in implementing comprehensive data governance frameworks, we empower B2B organizations to harness the full potential of their data. Our expertise encompasses:
Investing in clean data governance is not just a technical necessity but a strategic imperative. With Perficient’s expertise, you can transform your data into a powerful asset, driving informed decision-making and sustainable growth in the AI era.
We’re pleased to announce that Perficient has been named a Major Player in the IDC MarketScape: Worldwide Adobe Experience Cloud Professional Services 2024-2025 Vendor Assessment (Doc #US51741024, December 2024). We believe this recognition is a testament to our commitment to excellence and our dedication to delivering top-notch Adobe services to our clients.
Continue reading to learn more about what the IDC MarketScape is, why Perficient is named a Major Player, and what this designation means to our clients.
This IDC MarketScape evaluated Adobe Experience Cloud professional service providers, creating a framework to compare vendors’ capabilities and strategies. Many organizations need help planning and deploying technology, and finding the right vendor is critical.
According to Douglas Hayward, senior research director for CX services and strategies at IDC, “Organizations choosing an Adobe Experience Cloud professional service should look for proof that their vendor has high-quality professionals who have a track record in empowering their clients and delivering the best value for the fairest price.”
This IDC MarketScape study provides a comprehensive vendor assessment of the Adobe Experience Cloud professional services ecosystem. It evaluates both quantitative and qualitative characteristics that contribute to success in this market. The study covers various vendors, assessing them against a rigorous framework that highlights the most influential factors for success in both the short and long term.
We believe being named a Major Player in the IDC MarketScape is a significant achievement for Perficient and underscores our Adobe Experience Cloud capabilities, industry and technical acumen, global delivery center network, and commitment to quality customer service. We further believe the study is evidence of our expertise and continued focus on solving our clients’ business challenges.
Hayward said, “In our evaluation of Perficient for the IDC MarketScape: Worldwide Adobe Experience Cloud Professional Services 2024-2025 Vendor Assessment, it was evident that Perficient has global delivery expertise that combines an experience design heritage with strong capabilities in digital experience transformation.”
The IDC MarketScape also says, “Based on conversations with Perficient’s clients, the vendor’s three main strengths are value creation, people quality, and client empowerment.”
At Perficient, we are committed to maintaining and improving our services and solutions. We continuously strive to innovate and enhance our capabilities and offerings to meet the evolving needs of our clients, further empower them, and drive value.
You can also read our News Release for more details on this recognition, and be sure to follow our Adobe blog for more Adobe platform insights!
Remember the days when robots and artificial intelligence (AI) were confined to the realms of science fiction? Fast forward to today, and AI in healthcare is rapidly transforming how we diagnose, treat, and care for patients. From intelligent algorithms diagnosing diseases faster than the human eye, to virtual health assistants providing round-the-clock support, AI is revolutionizing the healthcare industry. But with this technological revolution comes a host of challenges that must be guided by ethical considerations, data privacy protections, and ongoing evaluation to ensure equitable and safe patient outcomes.
Quick lesson – AI in healthcare refers to using AI technologies and systems to improve various aspects of healthcare delivery, including diagnosis, treatment, patient care, and operational efficiency. AI, by definition, involves the development of computer systems that can perform tasks typically requiring human intelligence. These tasks include learning, reasoning, problem-solving, perception, and language understanding. In the context of healthcare, AI technologies are applied to analyze complex medical data, enhance patient care, streamline operations, and improve decision-making processes for healthcare professionals.
YOU MAY ENJOY: Evolving Healthcare: Generative AI Strategy for Payers and Providers
As I mentioned, AI in healthcare represents a transformative force, offering significant potential to improve diagnostics, treatment personalization, and operational efficiency. Some examples include:
As AI continues to transform healthcare, it brings with it a double-edged scalpel, if you will, capable of making groundbreaking advancements yet full of challenges and considerations. Let me shed some light on the vital considerations necessary to navigate this new frontier responsibly:
The use of AI in healthcare raises ethical issues, such as algorithmic bias that can lead to disparities in treatment based on race, age, gender, and socioeconomic status. Ensuring that AI systems are trained on diverse data sets and are transparent in their decision-making processes, with accountability for errors and outcomes, is crucial for fairness and equity.
Healthcare AI relies on large datasets that include sensitive patient information. Protecting this data from breaches and ensuring compliance with regulations like HIPAA is essential to maintain patient trust and confidentiality. Also, patients may not be fully aware of how their data is being used by AI systems. Therefore, clear communication about data usage, as well as obtaining explicit consent, is critical to maintaining trust.
While AI can support clinical decisions, it should not replace human judgment. Physicians should use AI as a tool to augment their expertise, ensuring that they remain accountable for patient care and can question AI recommendations when necessary. In addition, healthcare professionals need proper training to effectively utilize AI tools and interpret their results.
Imagine building a house without a blueprint. It would be chaos and probably wouldn’t stay standing for very long. Similarly, implementing AI in healthcare requires more than technological prowess. It requires a strategic plan that ensures seamless integration, ethical considerations, and long-term sustainability. As part of this plan, healthcare organizations need:
AI holds the promise of significantly enhancing healthcare by improving diagnostic accuracy, personalizing treatment, and increasing operational efficiency. However, this potential can only be fully realized if AI is implemented with careful consideration of ethical, privacy, and oversight issues. By taking a strategic and inclusive approach, we can harness the power of AI to improve healthcare outcomes while ensuring the technology is used responsibly and equitably.
Success Story: Improving Health Through Innovation and Technology
Perficient combines strategy, industry best practices, and technology expertise to shape the experiences and engagement of healthcare consumers, streamline operations, and improve the cost and quality of care. Contact us to learn more.
The 2024 InsureTech Connect (ITC) conference was truly exhilarating, with key takeaways impacting the insurance industry. Each year, it continues to improve, offering more relevant content, valuable industry connections, and opportunities to delve into emerging technologies.
This year’s event was no exception, showcasing the importance of personalization to the customer, tech-driven relationship management, and AI-driven underwriting processes. The industry is constantly evolving, and ITC displays the alignment of everyone within the insurance industry surrounding the same purpose.
As I reflect on ITC and my experience, it is evident the progression of the industry is remarkable. Here are a few key takeaways from my perspective that will shape our industry roadmap:
We’ve spoken for many years about the need to drive greater personalization across our interactions in our industry. We know that customers engage with companies that demonstrate authentic knowledge of their relationship. This year, we saw great examples of how companies are treating personalization not as an incremental initiative but as something embedded at key moments in the insurance experience, particularly underwriting and claims.
For example, New York Life highlighted how personalization is driving generational loyalty. We’ve been working with industry-leading insurers to help drive personalization across the distribution network: from carriers to agents to the final policyholder.
Success In Action: Our client wanted to integrate better contact center technology to improve internal processes and allow for personalized, proactive messaging to clients. We implemented Twilio Flex and leveraged its outbound notification capabilities to support customized messaging while also integrating their cloud-based outbound dialer and workforce management suite. The insurer now has optimized agent productivity and agent-customer communication, as well as newfound access to real-time application data across the entire contact center.
Insurance has always had a complex distribution network across platforms, partnerships, carriers, agents, producers, and more. Leveraging technology to manage these relationships opens opportunities to gain real-time insights and implement effective strategies, fostering holistic solutions and moving away from point solutions. Managing this complexity and maximizing the value of this network requires a good business and digital transformation strategy.
Our proprietary Envision process has been leading the way to help carriers navigate this complex system with proprietary strategy tools, historical industry data, and best practices.
Not surprisingly, AI permeated many of the presentations and demos across the sessions. AI offers insurers unique decisioning throughout the value chain to create differentiation. It was evident that while we often talk about AI as an overarching technology, the use cases were more point solutions across the insurance value chain. Moreover, AI is not here to replace the human, but rather to assist the human. By automating mundane process activities, mindshare and human capital can be invested in more value-added activity and critical problems to improve customer experience. Because these point solutions are available across many disparate groups, organizational mandates demand the safe and ethical use of AI models.
Our PACE framework provides a holistic approach to responsibly operationalize AI across an organization. It empowers organizations to unlock the benefits of AI while proactively addressing risks.
Our industry continues to evolve in delivering its noble purpose – to protect individuals’ and businesses’ property, liability, and financial obligations. Technology is certainly an enabler of this purpose, but transformation must be managed to be effective.
Want to know the now, new, and next of digital transformation in insurance? Contact us and let us help you meet the challenges of today and seize the opportunities of tomorrow in the insurance industry.
Shoptalk held its first-ever Fall conference in Chicago this past week, and our very own Justin Racine, Principal of Unified Commerce, was present to take it all in. This year’s theme was 007, so Justin was on a reconnaissance mission to gain as much information on retail and commerce trends as possible. Here’s a debrief on his sources and the intel he was able to gather from them during his time at the show.
Before the explosion of social media, consumers gained knowledge of new brands by window shopping and traditional ads, but the reach only went so far. With the prevalence of social media, retail products that are niche and unique now have access to the whole world with platforms like TikTok and Instagram. Rent the Runway is a clothing rental brand that’s currently reveling in the fact that indie brands are controlling the fashion space. Jennifer Hyman, the Founder and CEO of Rent the Runway, claimed that today a brand can skyrocket in value just by a teenager talking about its products on TikTok. Social media has been changing the way consumers shop, and brands need to be able to keep up and provide new and fresh ways to connect and interact. Rent the Runway did that by providing fashion clothing for rent, rather than purchase, thus allowing for an ever-changing wardrobe with plenty of variety. Retail – just like fashion – should take greater risks and be bolder.
Glossier CEO, Kyle Leahy, provided a thoughtful look at three C’s that should be focused on to provoke thoughtful, engaging, and relatable conversations between brands and their consumers. Community, connection, and customers are the three C’s in question here. Glossier prides itself on being a community-based brand, focusing on how their products make people feel. On stage, Kyle spoke about how their consumers come to them because of the community they’ve created, and by actively listening to their consumers.
However, it’s one thing just to listen – it’s another thing entirely to respond. Kyle stated that Glossier responds to every single comment or post on their socials. That’s how they’re building a strong community, like by like, comment by comment. And Glossier doesn’t stop there; they continually strive for a personalized customer experience. Their storefronts are full of local apparel and products, and a fragrance they’ve recently launched comes with an aggressively impressive campaign.
It’s called Glossier You, and the bottle is specifically designed to be activated by the consumer’s thumb. In this way, it gives the feeling of the product being “encoded” to their thumbprint, giving them a personalized experience. To go one step further, they claim the smell is a little different on each person. With this product, they’ve built a conversation piece for their consumers around how it’s unique to them, allowing connection across users for comparison, thus creating a shared experience and further solidifying their community.
Surprise and delight! That’s the name of the game when it comes to creating buzz with consumers. Many companies feel like they have a finger on the pulse, but according to Brendan Witcher, Vice President and Principal Analyst at Forrester, that’s not the case. He challenged attendees’ perspectives, saying that many customers don’t feel that companies are delighting or surprising them. These brands are operating off opinions rather than data, and they should gather real, credible data if they want to get to the heart of their consumers’ emotional responses.
Creative decisions should be based on true data, gained by letting the customer express themselves through their behaviors, actions, and, even more importantly, their inactions. A great example of inaction is those would-be consumers who are visiting your site but not purchasing. Their lack of purchase is a clear pointer to a way to create growth: focus on making those conversions. These customers are already there; all that’s needed is a push in the right direction. Doubling up on Glossier’s three C’s, Brendan revealed that there are six elements of customer data to focus on: characteristics, considerations, curiosities, conditions, context, and conceptions.
For a first-ever Shoptalk Fall, the show undoubtedly inspired and renewed the energy and enthusiasm of those in attendance. Attendees expressed an urge to build deeper connections with their customers through a wide variety of strategies. The speakers encouraged the audience to listen to their customers, stay true to relevant retail trends, and dive deep into the data to curate connections and build communities. Now is the time to be unconventional, unique, and exciting.
Your mission, if you choose to accept it, is to explore Perficient’s industry expertise in retail and commerce.
Read Justin’s full article on CMSWire.