Perficient Blogs: Expert Digital Insights

Why Inter-Plan Collaboration Is the Competitive Edge for Health Insurers
Fri, 05 Dec 2025

A health insurance model built for yesterday won’t meet the demands of today’s consumers. Expectations for seamless, intuitive experiences are accelerating, while fragmented systems continue to drive up costs, create blind spots, and erode trust.

Addressing these challenges takes more than incremental fixes. The path forward requires breaking down silos and creating synergy across plans, while aligning technology, strategy, and teams to deliver human-centered experiences at scale. This is more than operational; it’s strategic. It’s how health insurers build resilience, move with speed and purpose, and stay ahead of evolving demands.

Reflecting on recent industry conversations, we’re proud to have sponsored LeadersIgnite and the 2025 Inter-Plan Solutions Forum. As Hari Madamalla shared:

“When insurers share insights, build solutions together, and scale what works, they can cut costs, streamline prior authorization and pricing, and deliver the experiences members expect.” – Hari Madamalla, Senior Vice President, Healthcare + Life Sciences

To dig deeper into these challenges, we spoke with healthcare leaders Hari Madamalla, senior vice president, and directors Pavan Madhira and Priyal Patel about how health insurers can create a competitive edge by leveraging digital innovation with inter-plan collaboration.

The Complexity Challenge Health Insurers Can’t Ignore

Health insurance faces strain from every angle: slow authorizations, confusing pricing, fragmented data, and widening care gaps. The reality is, manual fixes won’t solve these challenges. Plans need smarter systems that deliver clarity and speed at scale. AI and automation make it possible to turn data into insight, reduce fragmentation, and meet mandates without adding complexity.

“Healthcare has long struggled with inefficiencies and slow tech adoption—but the AI revolution is changing that. We’re at a pivotal moment, similar to the digital shift of the 1990s, where AI is poised to disrupt outdated processes and drive real transformation.” – Pavan Madhira, Director, Healthcare + Life Sciences

But healthcare organizations face unique constraints, including HIPAA, PHI, and PII regulations that limit the utility of plug-and-play AI solutions. To meet these challenges, we apply our PACE framework—Policies, Advocacy, Controls, and Enablement—to ensure AI is not only innovative but also rooted in trust. This approach ensures AI is deployed with purpose, aligned to business goals, and embedded with safeguards that protect consumers and organizations.

Still, technology alone isn’t enough. Staying relevant means designing human-centered experiences that reduce friction and build trust. Perficient’s award-winning Access to Care research study reveals that friction in the care journey directly impacts consumer loyalty and revenue.

More than 45% of consumers aged 18–64 have used digital-first care instead of their regular provider, and 92% of them believe the quality is equal or better.

That’s a signal leaders can’t afford to ignore. It tells us that when experiences fall short, consumers go elsewhere, and they won’t always come back.

For health insurers, that shift creates issues. When members seek care outside your ecosystem, you risk losing visibility into care journeys, creating gaps in data and blind spots in member health management. The result? Higher costs, duplicative services, and missed opportunities for proactive coordination. Fragmented care journeys also undermine efforts to deliver a true 360-degree view of the member.

For leaders, the solution lies in intuitive digital transformation that turns complexity into clarity.

Explore More: Empathy, Resilience, Innovation, and Speed: The Blueprint for Intelligent Healthcare Transformation

Where Inter-Plan Collaboration Creates Real Momentum

When health plans work together, the payoff is significant. Collaboration moves the industry from silos to synergy, enabling human-centered experiences across networks that keep members engaged and revenue intact.

Building resilience is key to that success. Leaders need systems that anticipate member needs and remove barriers before they impact access to care. That means reducing friction in scheduling and follow-up, enabling seamless coordination across networks, and delivering digital experiences that feel as simple and intuitive as consumer platforms like Amazon or Uber. Resilience also means preparing for the unexpected and being able to pivot quickly.

When plans take this approach, the impact is clear:

  • Higher Quality Scores and Star Ratings: Shared strategies for closing gaps and improving provider data can help lift HEDIS scores and Star Ratings, unlocking higher reimbursement and bonus pools.
  • Faster Prior Authorizations: Coordinated rules and automation help reduce delays and meet new regulatory requirements like the CMS Interoperability and Prior Authorization Final Rule (CMS-0057-F).
  • True Price Transparency: Consistent, easy-to-understand cost and quality information across plans helps consumers make confident choices and stay in-network.
  • Stronger Member Loyalty: Unified digital experiences across plans help improve satisfaction and engagement.
  • Lower Administrative Overhead: Cleaner member data means fewer errors, less duplication, and lower compliance risk.

“When plans work together, they can better serve their vulnerable populations, reduce disparities, and really drive to value based care. It’s about building trust, sharing responsibility, and innovating with empathy.” – Priyal Patel, Director, Healthcare + Life Sciences

Resilience and speed go hand in hand. Our experts help health insurers deliver both.

This approach supports the Quintuple Aim: better outcomes, lower costs, improved experiences, clinician well-being, and health equity. It also ensures that innovation is not just fast, but focused, ethical, and sustainable.

You May Also Enjoy: Access to Care is Evolving: What Consumer Insights and Behavior Models Reveal

Accelerating Impact With Digital Innovation and Inter-Plan Collaboration

Beyond these outcomes, collaboration paired with digital innovation unlocks even greater opportunities to build a smarter, more connected future of healthcare. It starts with aligning consumer expectations, digital infrastructure, and data governance to strategic business goals.

Here’s how plans can accelerate impact:

  • Real-Time Data Sharing and Interoperability: Shared learning ensures insights aren’t siloed. By pooling knowledge across plans, leaders can identify patterns, anticipate emerging trends, and act faster on what works. Real-time interoperability, like FHIR-enabled solutions, gives plans the visibility needed for accurate risk adjustment and timely quality reporting. AI enhances this by predicting gaps and surfacing actionable insights, helping plans act faster and reduce costs.
  • Managing Coding Intensity in the AI Era: As provider AI tools capture more diagnoses, insurers can see risk scores and costs rise, creating audit risk and financial exposure. This challenge requires proactive oversight. Collaboration helps by establishing shared standards and applying predictive analytics to detect anomalies early, turning a potential cost driver into a managed risk.
  • Prior Authorization Modernization: Prior authorization delays drive up costs and erode member experience. Aligning on streamlined processes and leveraging intelligent automation can help meet mandates like CMS-0057-F, while predicting approval likelihood, flagging exceptions early, and accelerating turnaround times.
  • Joint Innovation Pilots: Co-development of innovation means plans can shape technology together. This approach balances unique needs with shared goals, creating solutions that cut costs, accelerate time to value, and ensure compliance stays front and center.
  • Engaging Member Experience Frameworks: Scaling proven approaches across plans amplifies impact. When plans collaborate on digital experience standards and successful capabilities are replicated, members enjoy seamless interactions across networks. Building these experiences on solid foundations with purpose-driven AI is key to delivering stronger engagement and loyalty at scale.
  • Shared Governance and Policy Alignment: Joint governance establishes accountability, aligns incentives for value-based care, and reduces compliance risk while protecting revenue.

Success in Action: Empowering Healthcare Consumers and Their Care Ecosystems With Interoperable Data

Make Inter-Plan Collaboration Your Strategic Advantage

Ready to move from insight to impact? Our healthcare expertise equips leaders to modernize, personalize, and scale care. We drive resilient, AI-powered transformation to shape the experiences and engagement of healthcare consumers, streamline operations, and improve the cost, quality, and equity of care.

  • Business Transformation: Activate strategy for transformative outcomes and health experiences.
  • Modernization: Maximize technology to drive health innovation, efficiency, and interoperability.
  • Data + Analytics: Power enterprise agility and accelerate healthcare insights.
  • Consumer Experience: Connect, ease, and elevate impactful health journeys.

We have been trusted by the 10 largest health systems and the 10 largest health insurers in the U.S., and Modern Healthcare consistently ranks us as one of the largest healthcare consulting firms.

Lightning Web Security (LWS) in Salesforce
Fri, 05 Dec 2025

What is Lightning Web Security?

Lightning Web Security (LWS) is Salesforce’s modern client-side security architecture designed to secure Lightning Web Components (LWC) and Aura components. Introduced as an improvement over the older Lightning Locker service, LWS enhances component isolation with better performance and compatibility with modern web standards.

Key Features of LWS

  • Namespace isolation: Each Lightning web component runs in its own JavaScript sandbox, preventing unauthorized access to data or code from other namespaces.

  • API distortion: LWS modifies standard JavaScript APIs dynamically to enforce security policies without breaking developer experience.

  • Supports third-party libraries: Unlike Locker, LWS allows broader use of community and open-source JS libraries.

  • Default in new orgs: Enabled by default for all new Salesforce orgs created from Winter ’23 release onwards.
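
The “API distortion” idea above can be illustrated with a plain-JavaScript sketch. This is not Salesforce’s actual LWS implementation; it is a minimal model, using a hypothetical block list, of how a Proxy can let same-namespace property access pass through while restricted APIs are intercepted:

```javascript
// Illustrative sketch only — NOT Salesforce's LWS implementation.
// A Proxy "distorts" access to a global object: ordinary properties
// pass through, while access to policy-restricted APIs throws.
function makeSandboxedGlobal(namespace, realGlobal) {
  // Hypothetical policy: which global APIs this namespace may not touch.
  const blocked = new Set(['document', 'localStorage']);
  return new Proxy(realGlobal, {
    get(target, prop) {
      if (blocked.has(prop)) {
        throw new Error(
          `Access to ${String(prop)} is restricted for namespace "${namespace}"`
        );
      }
      return target[prop]; // allowed properties behave normally
    },
  });
}

const realGlobal = { appName: 'demo', document: {}, localStorage: {} };
const sandbox = makeSandboxedGlobal('c', realGlobal);

console.log(sandbox.appName); // allowed property passes through: "demo"
try {
  sandbox.document; // restricted API throws under the distortion
} catch (e) {
  console.log(e.message);
}
```

The real LWS engine applies far more nuanced distortions (often returning safe wrappers rather than throwing), but the principle is the same: component code keeps using standard APIs while the sandbox enforces policy underneath.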

Benefits of Using LWS

  • Stronger security: Limits cross-component and cross-namespace vulnerabilities.

  • Improved performance: Reduced overhead compared to Locker’s wrappers, resulting in faster load times for users.

  • Better developer experience: Easier to build robust apps without excessive security workarounds.

  • Compatibility: Uses the latest web standards and works well with modern browsers and tools.

How to Enable LWS in Your Org

  1. Navigate to Setup > Session Settings in Salesforce.

  2. Enable the checkbox for Use Lightning Web Security for Lightning web components and Aura components.

  3. Save settings and clear browser cache to ensure the change takes effect.

  4. Test your Lightning components thoroughly, ideally starting in a sandbox environment before deploying to production.

Best Practices for Working with LWS

  • Test extensively: Some existing components may require minor updates due to stricter isolation.

  • Use the LWS Console: Salesforce provides developer tools to inspect and debug components under LWS.

  • Follow secure coding guidelines: Maintain least privilege principle and avoid direct DOM manipulations.

  • Plan migration: Gradually transition from Lightning Locker to LWS, if upgrading older orgs.

  • Leverage Third-party Libraries Wisely: Confirm compatibility with LWS to avoid runtime errors.

Troubleshooting Common LWS Issues

  • Components failing due to namespace restrictions.

  • Unexpected behavior with third-party libraries.

  • Performance bottlenecks during initial page loading.

Utilize Salesforce’s diagnostic tools, logs, and community forums for support.

Resources for Further Learning

Salesforce Marketing Cloud + AI: Transforming Digital Marketing in 2025
Fri, 05 Dec 2025

Salesforce Marketing Cloud + AI is revolutionizing marketing by combining advanced artificial intelligence with marketing automation to create hyper-personalized, data-driven campaigns that adapt in real time to customer behaviors and preferences. This fusion drives engagement, conversions, and revenue growth like never before.

Key AI Features of Salesforce Marketing Cloud

  • Agentforce: An autonomous AI agent that helps marketers create dynamic, scalable campaigns with effortless automation and real-time optimization. It streamlines content creation, segmentation, and journey management through simple prompts and AI insights. Learn more at the Salesforce official site.

  • Einstein AI: Powers predictive analytics, customized content generation, send-time optimization, and smart audience segmentation, ensuring the right message reaches the right customer at the optimal time.

  • Generative AI: Using Einstein GPT, marketers can automatically generate email copy, subject lines, images, and landing pages, enhancing productivity while maintaining brand consistency.

  • Marketing Cloud Personalization: Provides real-time behavioral data and AI-driven recommendations to deliver tailored experiences that boost customer loyalty and conversion rates.

  • Unified Data Cloud Integration: Seamlessly connects live customer data for dynamic segmentation and activation, eliminating data silos.

  • Multi-Channel Orchestration: Integrates deeply with platforms like WhatsApp, Slack, and LinkedIn to deliver personalized campaigns across all customer touchpoints.

Latest Trends & 2025 Updates

  • With advanced artificial intelligence, marketing teams benefit from systems that independently manage and adjust their campaigns for optimal results.

  • Real-time customer journey adaptations powered by live data.

  • Enhanced collaboration via AI integration with Slack and other platforms.

  • Automated paid media optimization and budget control with minimal manual intervention.

For detailed insights on AI and marketing automation trends, see this industry report.

Benefits of Combining Salesforce Marketing Cloud + AI

  • Increased campaign efficiency and ROI through automation and predictive analytics.

  • Hyper-personalized customer engagement at scale.

  • Reduced manual effort with AI-assisted content and segmentation.

  • Better decision-making powered by unified data and AI-driven insights.

  • Greater marketing agility and responsiveness in a changing landscape.

Salesforce Custom Metadata getInstance vs SOQL: Key Differences & Best Practices
Fri, 05 Dec 2025

Salesforce provides powerful features to handle metadata, allowing you to store and access configuration data in a structured manner. In this blog, we explore Salesforce Custom Metadata getInstance vs SOQL—two key approaches developers use to retrieve custom metadata efficiently. Custom metadata types in Salesforce offer a great way to define reusable and customizable application data without worrying about governor limits that come with other storage solutions, like custom objects. For more details, you can visit the official Salesforce Trailhead Custom Metadata Types module. We will delve into the differences, use cases, and best practices for these two approaches.

What is Custom Metadata in Salesforce?

Custom metadata types are custom objects in Salesforce that store metadata or configuration data. Unlike standard or custom objects, they are intended for storing application configurations that don’t change often. These types are often used for things like:

  • Configuration settings for apps
  • Defining global values (like API keys)
  • Storing environment-specific configurations
  • Reusable data for automation or integrations

Custom metadata records can be easily managed via Setup, the Metadata API, or Apex.

Approach 1: Using getInstance()

getInstance() is a method that allows you to access a single record of a custom metadata type. It works on a “singleton” basis, meaning that it returns a specific instance of the custom metadata record.

How getInstance() Works

The getInstance() method is typically used when you’re looking to retrieve a single record of custom metadata in your code. This method is not intended to query multiple records or create complex filters. Instead, it retrieves a specific record directly, based on the provided developer name.

Example:

// Get a specific custom metadata record by its developer name
My_Custom_Metadata__mdt metadataRecord = My_Custom_Metadata__mdt.getInstance('My_Config_1');

// Access fields of the record
String configValue = metadataRecord.Config_Value__c;

When to Use getInstance()

  • Single Record Lookup: If you know the developer name of the record you’re looking for and expect to access only one record.
  • Performance: Since getInstance() is optimized for retrieving a single metadata record by its developer name, it can offer better performance than querying all records, especially when you only need one record.
  • Static Configuration: Ideal for use cases where the configuration is static, and you are sure that the metadata record will not change often.

Advantages of getInstance()

  • Efficiency: It’s quick and easy to retrieve a single metadata record when you already know the developer name.
  • Less Complex Code: This approach requires fewer lines of code and simplifies the logic, particularly in configuration-heavy applications.

Limitations of getInstance()

  • Single Record: It can only retrieve one record at a time.
  • No Dynamic Querying: It does not support complex filtering or dynamic querying like SOQL.

Approach 2: Using SOQL Queries

SOQL (Salesforce Object Query Language) is the standard way to retrieve multiple records in Salesforce, including custom metadata records. By using SOQL, you can query a custom metadata type much like any other object in Salesforce, providing flexibility in how records are retrieved.

How SOQL Queries Work

With SOQL, you can write queries that return multiple records, filter based on field values, or sort the records as needed. For instance:

// Query for multiple custom metadata records with SOQL
List<My_Custom_Metadata__mdt> metadataRecords = [SELECT MasterLabel, Config_Value__c FROM My_Custom_Metadata__mdt WHERE Active__c = TRUE];

// Loop through records and access their values
for (My_Custom_Metadata__mdt record : metadataRecords) {
    System.debug('Label: ' + record.MasterLabel + ', Value: ' + record.Config_Value__c);
}

When to Use SOQL Queries

  • Multiple Records: If you need to retrieve more than one record or apply filters to the query.
  • Dynamic Queries: When the records you’re querying are dynamic (e.g., based on user input or other logic).
  • Complex Criteria: If you need to use conditions like WHERE, ORDER BY, or join metadata with other objects.

Advantages of SOQL Queries

  • Flexibility: SOQL queries allow you to retrieve multiple records based on complex conditions.
  • Filtering and Sorting: You can easily filter and sort records to get the exact data you need.
  • Dynamic Usage: Ideal for cases where the data or records you’re querying may change, such as pulling all active configuration records.

Limitations of SOQL Queries

  • Governor Limits: SOQL queries are subject to Salesforce’s governor limits (e.g., the number of records returned and the number of queries per transaction).
  • Complexity: Writing and managing SOQL queries might introduce additional complexity in the code, especially when dealing with large datasets.

Key Differences: getInstance() vs. SOQL Queries

Aspect              | getInstance()                                    | SOQL Query
--------------------|--------------------------------------------------|------------------------------------------------------------
Purpose             | Retrieves a single record by developer name      | Retrieves multiple records with flexibility
Performance         | Faster for a single record lookup                | Slower when retrieving many records
Use Case            | Static configuration data, single record lookup  | Dynamic and multiple record retrieval
Complexity          | Simple, minimal code                             | More complex, requires query handling
Filtering & Sorting | None, only by developer name                     | Supports filtering, sorting, and conditions
Governor Limits     | Doesn't count against query limits               | Subject to governor limits (e.g., 50,000 records per query)

Best Practices for Using getInstance() and SOQL

  • Use getInstance() when you need to access one specific metadata record and know the developer name beforehand. It’s efficient and optimized for simple lookups.
  • Use SOQL when you need to filter, sort, or access multiple metadata records. It’s more flexible and ideal for dynamic scenarios, but you should always be aware of governor limits to avoid hitting them.
  • Combine the Two: In some cases, you can use getInstance() for fetching critical single configuration records and SOQL for retrieving a list of configuration settings.

Conclusion

Both getInstance() and SOQL queries have their strengths when it comes to working with custom metadata types in Salesforce. Understanding when to use each will help optimize your code and ensure that your Salesforce applications run efficiently. For simple, static configurations, getInstance() is the way to go. For dynamic, large, or complex datasets, SOQL queries will offer the flexibility you need. By carefully selecting the right approach for your use case, you can harness the full power of Salesforce custom metadata.

Creators in Coding, Copycats in Class: The Double-Edged Sword of Artificial Intelligence
Thu, 04 Dec 2025

“Powerful technologies require equally powerful ethical guidance.” (Bostrom, N. Superintelligence: Paths, Dangers, Strategies. Oxford University Press, 2014).

The ethics of using artificial intelligence depend on how we apply its capabilities—either to enhance learning or to prevent irresponsible practices that may compromise academic integrity. In this blog, I share reflections, experiences, and insights about the impact of AI in our environment, analyzing its role as a creative tool in the hands of developers and as a challenge within the academic context.

Between industry and the classroom

As a Senior Developer, my professional trajectory has led me to delve deeply into the fascinating discipline of software architecture. Currently, I work as a Backend Developer specializing in Microsoft technologies, facing the daily challenges of building robust, scalable, and well-structured systems in the business world.

Alongside my role in the industry, I am privileged to serve as a university professor, teaching four courses. Three of them are fundamental parts of the software development lifecycle: Software Analysis and Design, Software Architecture, and Programming Techniques. This dual perspective—as both a professional and a teacher—has allowed me to observe the rapid changes that technology is generating both in daily development practice and in the formation of future engineers.

Exploring AI as an Accelerator in Software Development

One of the greatest challenges for those studying the software development lifecycle is transforming ideas and diagrams into functional, well-structured projects. I always encourage my students to use Artificial Intelligence as a tool for acceleration, not as a substitute.

For example, in the Software Analysis and Design course, we demonstrate how a BPMN 2.0 process diagram can serve as a starting point for modeling a system. We also work with class diagrams that reflect compositions and various design patterns. AI can intervene in this process in several ways:

  • Code Generation from Models: With AI-based tools, it’s possible to automatically turn a well-built class diagram into the source code foundation needed to start a project, respecting the relationships and patterns defined during modeling.
  • Rapid Project Architecture Setup: Using AI assistants, we can streamline the initial setup of a project by selecting the technology stack, creating folder structures, base files, and configurations according to best practices.
  • Early Validation and Correction: AI can suggest improvements to proposed models, detect inconsistencies, foresee integration issues, and help adapt the design context even before coding begins.

This approach allows students to dedicate more time to understanding the logic behind each component and design principle, instead of spending hours on repetitive setup and basic coding tasks. The conscious and critical use of artificial intelligence strengthens their learning, provides them with more time to innovate, and helps prepare them for real-world industry challenges.

But Not Everything Is Perfect: The Challenges in Programming Techniques

However, not everything is as positive as it seems. In “Programming Techniques,” a course that represents students’ first real contact with application development, the impact of AI is different compared to more advanced subjects. In the past, the repetitive process of writing code—such as creating a simple constructor public Person(), a method public void printFullName(), or practicing encapsulation in Java with methods like public void setName(String name) and public String getName()—kept the fundamental programming concepts fresh and clear while coding.

This repetition was not just mechanical; it reinforced their understanding of concepts like object construction, data encapsulation, and procedural logic. It also played a crucial role in developing a solid foundation that made it easier to understand more complex topics, such as design patterns, in future courses.

Nowadays, with the widespread availability and use of AI-based tools and code generators, students tend to skip these fundamental steps. Instead of internalizing these concepts through practice, they quickly generate code snippets without fully understanding their structure or purpose. As a result, the pillars of programming—such as abstraction, encapsulation, inheritance, and polymorphism—are not deeply absorbed, which can lead to confusion and mistakes later on.

Although AI offers the promise of accelerating development and reducing manual labor, it is important to remember that certain repetition and manual coding are essential for establishing a solid understanding of fundamental principles. Without this foundation, it becomes difficult for students to recognize bad practices, avoid common errors, and truly appreciate the architecture and design of robust software systems.

Reflection and Ethical Challenges in Using AI

Recently, I explained the concept of reflection in microservices to my Software Architecture students. To illustrate this, I used the following example: when implementing the Abstract Factory design pattern within a microservices architecture, the Reflection technique can be used to dynamically instantiate concrete classes at runtime. This allows the factory to decide which object to create based on external parameters, such as a message type or specific configuration received from another service. I consider this concept fundamental if we aim to design an architecture suitable for business models that require this level of flexibility.
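To make the idea concrete, the pattern described above can be sketched in plain JavaScript (hypothetical class names; in Java or C# the lookup would use actual reflection, e.g. Class.forName, rather than a registry object). The factory chooses which concrete class to instantiate at runtime based on an external parameter such as a message type:

```javascript
// Hypothetical sketch of an Abstract Factory with runtime type selection.
// Concrete products that share a common interface (a send method):
class EmailNotifier { send(msg) { return `email: ${msg}`; } }
class SmsNotifier   { send(msg) { return `sms: ${msg}`; } }

// Registry mapping external parameters (e.g., a message type received
// from another service) to concrete classes.
const registry = { email: EmailNotifier, sms: SmsNotifier };

function notifierFactory(messageType) {
  const Concrete = registry[messageType];
  if (!Concrete) throw new Error(`No notifier registered for "${messageType}"`);
  return new Concrete(); // class chosen dynamically at runtime
}

console.log(notifierFactory('email').send('hello')); // "email: hello"
```

The calling microservice never names the concrete class; it only supplies the parameter. That indirection is exactly what makes a missing or mismatched constructor parameter invisible at compile time and fatal at runtime, as in the classroom exercise below.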

However, during a classroom exercise where I provided a base code, I asked the students to correct an error that I had deliberately injected. The error consisted of an additional parameter in a constructor—a detail that did not cause compilation failures, but at runtime, it caused 2 out of 5 microservices that consumed the abstract factory via reflection to fail. From their perspective, this exercise may have seemed unnecessary, which led many to ask AI to fix the error.

As expected, the AI efficiently eliminated the error but overlooked a fundamental acceptance criterion: that parameter was necessary for the correct functioning of the solution. The task was not to remove the parameter but to add it in the Factory classes where it was missing. Out of 36 students, only 3 were able to explain and justify the changes they made. The rest did not even know what modifications the AI had implemented.

This experience highlights the double-edged nature of artificial intelligence in learning: it can provide quick solutions, but if the context or the criteria behind a problem are not understood, the correction can be superficial and jeopardize both the quality and the deep understanding of the code.

I haven’t limited this exercise to architecture examples alone. I have also conducted mock interviews, asking about basic programming concepts. Surprisingly, even among final-year students who are already doing their internships, the success rate is alarmingly low: approximately 65% to 70% of the questions are answered incorrectly, which would automatically disqualify them in a real technical interview.

Conclusion

Artificial intelligence has become increasingly integrated into academia, yet its use does not always reflect a genuine desire to learn. For many students, AI has turned into a tool for simply getting through academic commitments, rather than an ally that fosters knowledge, creativity, and critical thinking. This trend presents clear risks: a loss of deep understanding, unreflective automation of tasks, and a lack of internalization of fundamental concepts—all crucial for professional growth in technological fields.

Various authors have analyzed the impact of AI on educational processes and emphasize the importance of promoting its ethical and constructive use. As Luckin et al. (2016) suggest, the key lies in integrating artificial intelligence as support for skill development rather than as a shortcut to avoid intellectual effort. Similarly, Selwyn (2019) explores the ethical and pedagogical challenges that arise when technology becomes a quick fix instead of a resource for deep learning.

References:

]]>
https://blogs.perficient.com/2025/12/03/creators-in-coding-copycats-in-class-the-double-edged-sword-of-artificial-intelligence/feed/ 0 388808
Agentforce World Tour Chicago: How AI and Data Are Powering Manufacturing’s Next Chapter https://blogs.perficient.com/2025/12/02/agentforce-world-tour-chicago-how-ai-and-data-are-powering-manufacturings-next-chapter/ https://blogs.perficient.com/2025/12/02/agentforce-world-tour-chicago-how-ai-and-data-are-powering-manufacturings-next-chapter/#respond Tue, 02 Dec 2025 23:40:06 +0000 https://blogs.perficient.com/?p=388784

AI is no longer optional for manufacturers. It is the dividing line between industry leaders and those falling behind. Companies that embrace AI and data are setting the pace for efficiency, customer engagement, and growth. Those that delay risk losing relevance in a market that rewards speed, precision, and innovation.

Perficient will join industry leaders at the Agentforce World Tour in Chicago on December 16. Salesforce will showcase its most advanced capabilities, including Agentforce, Slack, and Data 360. These solutions give manufacturers the power to predict demand, automate decisions, and deliver connected experiences that drive measurable results across the entire value chain.

Why Attend Agentforce World Tour Chicago

Chicago is where top industries come together to lead what’s next. At Agentforce World Tour, you will explore sessions and solutions built for the sectors that define this city, including healthcare and life sciences, retail and consumer goods, and manufacturing. You will see real use cases, dive into emerging trends, connect with peers, and gain insights from experts who are shaping the future. You will leave with practical strategies and a roadmap for growth.

New to Salesforce? Start here.
Maybe you attended your first Dreamforce this year and want to get more hands-on. Agentforce World Tour is the perfect next step. This event gives you a closer look at what Salesforce can do for your business. You will learn how the latest agentic and AI innovations drive real results. Hear from customers, explore live demos, and see how Salesforce helps you unlock productivity, accelerate growth, and deliver exceptional customer experiences.

What’s Next for Manufacturing: From Products to Services With Servitization

Perficient is committed to helping manufacturers turn AI and data into measurable results. Here’s where you can connect with us and gain practical strategies for your business:

Transforming Manufacturing Aftermarket: From Products to Services with Servitization
Date: December 4 at 1:00 PM EST
Location: Online | Registration link coming soon
Join these industry experts for our upcoming webinar:

  • Sarah McDowell, Director, Perficient
  • Lester McHargue, Director of Manufacturing, Perficient
  • Pete Niesen, Sr. Director, Business Strategy Consulting, Salesforce

They will explore how servitization and connected digital experiences are transforming the manufacturing and equipment aftermarket. Learn how Salesforce, data, and AI enable new revenue streams, predictive maintenance, and automated support. Walk away with practical strategies to deliver proactive, data-driven services that boost loyalty, satisfaction, and profitability long after the initial sale. Register here.

Ready to take the next step?
If you want to learn more about servitization and how it can transform your aftermarket strategy, download our Manufacturing Servitization Workshop Guide for practical insights and a roadmap to success. And don’t forget to register for Agentforce World Tour Chicago—it’s free, but registration is required. Send us a note here if you would like to connect during World Tour. Secure your spot today and join us for a day of innovation, hands-on learning, and real-world strategies that will help you lead what’s next in manufacturing.

]]>
https://blogs.perficient.com/2025/12/02/agentforce-world-tour-chicago-how-ai-and-data-are-powering-manufacturings-next-chapter/feed/ 0 388784
Join Perficient at Agentforce World Tour New York: Build What’s Next https://blogs.perficient.com/2025/12/02/join-perficient-at-agentforce-world-tour-new-york-build-whats-next/ https://blogs.perficient.com/2025/12/02/join-perficient-at-agentforce-world-tour-new-york-build-whats-next/#respond Tue, 02 Dec 2025 14:11:30 +0000 https://blogs.perficient.com/?p=388771

Close Out the Year. Start Building the Next.

As 2025 winds down, the smartest companies aren’t just looking back; they’re planning. The future of business is agentic, and the time to prepare is now.

Join us at Agentforce World Tour New York on December 10, 2025, and experience the innovation that defined Dreamforce, live in NYC. In just one day, you’ll get:

  • 140+ expert-led sessions, demos, and hands-on trainings
  • A front-row look at Salesforce’s biggest launches, including Agentforce 360, Slack, and Data 360
  • Practical ways companies are increasing productivity, accelerating growth, and modernizing customer experiences

All free. All designed to help you turn future plans into action. Register for World Tour here!

Why Attend Agentforce World Tour NYC?

NYC is where Salesforce is pushing the next wave of agentic AI. If you want a real, unfiltered look at how companies are applying Agentforce and Data 360 to drive revenue, speed, and operational lift, this is the event.

You’ll walk away with:

  • Clear, proven examples of agentic AI driving results
  • Direct access to Salesforce product experts and industry innovators
  • Practical steps you can immediately apply to your own AI roadmap

More Ways to Connect During World Tour Week

Agentforce World Tour NYC is just the start. We’re hosting and joining exclusive experiences throughout the week to help you dive deeper into AI, data, and the future of agentic business.

December 10 – Agentforce Champions Breakfast

Start your World Tour experience with an exclusive breakfast for Agentforce champions and power users across leading industries. Connect with peers, share insights, and engage directly with Salesforce leaders. Perficient’s Allie Vaughan will be on-site to share how we’re helping organizations harness agentic AI for real business impact.

Wednesday, December 10, 2025 | 8:30AM – 10:00AM EST
Onsite at World Tour Javits Center | 429 11th Ave, New York, NY 10001
Register Here → World Tour NYC Agentforce Champions Breakfast

December 10 – Perficient Breakfast at Russ & Daughters

Join us for a relaxed pre-event meetup at one of NYC’s most iconic spots. Enjoy great conversation and connect with Perficient experts and fellow attendees before the main World Tour sessions begin.

Wednesday, December 10 | 8:00AM – 10:30AM EST
Russ & Daughters, NYC | 502 W 34th St., New York, NY 10001
Contact Us for an Invite → Save Your Spot

December 11 – Data 360 + Agentforce Workshops at Salesforce Tower

Take your World Tour experience further with a hands-on workshop designed to help you unlock the full potential of Data 360 and Agentforce. Guided by Perficient’s AI and Data 360 specialists Allie Vaughan and Anu Pandey, you’ll go beyond theory with practical strategies you can apply immediately.

Thursday, December 11 | 10:00AM – 2:00PM EST
Salesforce Tower New York | 1095 6th Ave, New York, NY 10036
Contact Allie and Anu for an Invite

December 12 – Datablazer Mastery Onsite

Wrap up the week with Salesforce’s full-day enablement experience designed for the Datablazer Community. Deepen your expertise in Data 360 and Agentforce with hands-on learning.

December 12, 2025 | 8:30AM – 4:30PM EST
Salesforce Tower New York | 1095 6th Ave, New York, NY 10036
Register Here → Datablazer Mastery Onsite: Agentforce Edition NYC

Your Next Step Toward an Agentic Future

Agentforce World Tour NYC is your chance to see where the Salesforce platform is going and how quickly companies are adapting. From the main event to hands-on workshops, this week offers a complete view of what it takes to operate as an agentic enterprise.

Follow Perficient on LinkedIn for event updates, key takeaways, and our latest insights on Agentforce, Data 360, and the future of AI-driven business.

]]>
https://blogs.perficient.com/2025/12/02/join-perficient-at-agentforce-world-tour-new-york-build-whats-next/feed/ 0 388771
5 Imperatives Financial Leaders Must Act on Now to Win in the Age of AI-Powered Experience https://blogs.perficient.com/2025/12/02/5-imperatives-financial-leaders-must-act-on-now-to-win-in-the-age-of-ai-powered-experience/ https://blogs.perficient.com/2025/12/02/5-imperatives-financial-leaders-must-act-on-now-to-win-in-the-age-of-ai-powered-experience/#respond Tue, 02 Dec 2025 12:29:07 +0000 https://blogs.perficient.com/?p=388106

Financial institutions are at a pivotal moment. As customer expectations evolve and AI reshapes digital engagement, leaders in marketing, CX, and IT must rethink how they deliver value.

Adobe’s report, “State of Customer Experience in Financial Services in an AI-Driven World,” reveals that only 36% of the customer journey is currently personalized, despite 74% of executives acknowledging rising customer expectations. With transformation already underway, financial leaders face five imperatives that demand immediate action to drive relevance, trust, and growth.

1. Make Personalization More Meaningful

Personalization has long been a strategic focus, but today’s consumers expect more than basic segmentation or name-based greetings. They want real-time, omnichannel interactions that align with their financial goals, life stages, and behaviors.

To meet this demand, financial institutions must evolve from reactive personalization to predictive, intent-driven engagement. This means leveraging AI to anticipate needs, orchestrate journeys, and deliver content that resonates with individual context.

Perficient Adobe-consulting principal Ross Monaghan explains, “We are still dealing with disparate data and slow progression into a customer 360 source of truth view to provide effective personalization at scale. What many firms are overlooking is that this isn’t just a data issue. We’re dealing with both a people and process issue where teams need to adjust their operational process of typical campaign waterfall execution to trigger-based and journey personalization.”

His point underscores that personalization challenges go beyond technology. They require cultural and operational shifts to enable real-time, AI-driven engagement.

2. Redesign the Operating Model Around the Customer

Legacy structures often silo marketing, IT, and operations, creating friction in delivering cohesive customer experiences. To compete in a digital-first world, financial institutions must reorient their operating models around the customer, not the org chart.

This shift requires cross-functional collaboration, agile workflows, and shared KPIs that align teams around customer outcomes. It also demands a culture that embraces experimentation and continuous improvement.

Only 3% of financial services firms are structured around the customer journey, even though 19% say that would be the ideal structure.

3. Build Content for AI-Powered Search

As AI-powered search becomes a primary interface for information discovery, the way content is created and structured must change. Traditional SEO strategies are no longer enough.

Customers now expect intelligent, personalized answers over static search results. To stay visible and trusted, financial institutions must create structured, metadata-rich content that performs in AI-powered environments. Content must reflect E-E-A-T principles (experience, expertise, authoritativeness, and trustworthiness) and be both machine-readable and human-relevant. Success depends on building discovery journeys that work across AI interfaces while earning customer confidence in moments that matter.

4. Unify Data and Platforms for Scalable Intelligence

Disconnected data and fragmented platforms limit the ability to generate insights and act on them at scale. To unlock the full potential of AI and automation, financial institutions must unify their data ecosystems.

This means integrating customer, behavioral, transactional, and operational data into a single source of truth that’s accessible across teams and systems. It also involves modernizing MarTech and CX platforms to support real-time decisioning and personalization.

But Ross points out, “Many digital experience and marketing platforms still want to own all data, which is just not realistic, both in reality and cost. The firms that develop their customer source of truth (typically cloud-based data platforms) and signal to other experience or service platforms will be the quickest to marketing execution maturity and success.”

His insight emphasizes that success depends not only on technology integration but also on adopting a federated approach that accelerates marketing execution and operational maturity.

5. Embed Guardrails Into GenAI Execution

As financial institutions explore GenAI use cases, from content generation to customer service automation, governance must be built in from the start. Trust is non-negotiable in financial services, and GenAI introduces new risks around accuracy, bias, and compliance.

Embedding guardrails means establishing clear policies, human-in-the-loop review processes, and robust monitoring systems. It also requires collaboration between legal, compliance, marketing, and IT to ensure responsible innovation.

At Perficient, we use our PACE (Policies, Advocacy, Controls, Enablement) Framework to holistically design tailored operational AI programs that empower business and technical stakeholders to innovate with confidence while mitigating risks and upholding ethical standards.

The Time to Lead is Now

The future of financial services will be defined by how intelligently and responsibly institutions engage in real time. These five imperatives offer a blueprint for action, each one grounded in data, urgency, and opportunity. Leaders who move now will be best positioned to earn trust, drive growth, and lead in the AI-powered era.

Learn About Perficient and Adobe’s Partnership

Are you looking for a partner to help you transform and modernize your technology strategy? Perficient and Adobe bring together deep industry expertise and powerful experience technologies to help financial institutions unify data, orchestrate journeys, and deliver customer-centric experiences that build trust and drive growth.

Get in Touch With Our Experts

]]>
https://blogs.perficient.com/2025/12/02/5-imperatives-financial-leaders-must-act-on-now-to-win-in-the-age-of-ai-powered-experience/feed/ 0 388106
AI and the Future of Financial Services UX https://blogs.perficient.com/2025/12/01/ai-banking-transparency-genai-financial-ux/ https://blogs.perficient.com/2025/12/01/ai-banking-transparency-genai-financial-ux/#comments Mon, 01 Dec 2025 18:00:28 +0000 https://blogs.perficient.com/?p=388706

I think about the early ATMs now and then. No one knew the “right” way to use them. I imagine a customer in the 1970s standing there, card in hand, squinting at this unfamiliar machine and hoping it would give something back; trying to decide if it really dispensed cash…or just ate cards for sport. That quick panic when the machine pulled the card in is an early version of the same confusion customers feel today in digital banking.

People were not afraid of machines. They were afraid of not understanding what the machine was doing with their money.

Banks solved it by teaching people how to trust the process. They added clear instructions, trained staff to guide customers, and repeated the same steps until the unfamiliar felt intuitive. 

However, the stakes and complexity are much higher now, and AI for financial product transparency is becoming essential to an optimized banking UX.

Today’s banking customer must navigate automated underwriting, digital identity checks, algorithmic risk models, hybrid blockchain components, and disclosures written in a language most people never use. Meanwhile, the average person is still struggling with basic money concepts.

FINRA reports that only 37% of U.S. adults can answer four out of five financial literacy questions (FINRA Foundation, 2022).

Pew Research finds that only about half of Americans understand key concepts like inflation and interest (Pew Research Center, 2024).

Financial institutions are starting to realize that clarity is not a content task or a customer service perk. It is structural. It affects conversion, compliance, risk, and trust. It shapes the entire digital experience. And AI is accelerating the pressure to treat clarity as infrastructure.

When customers don’t understand, they don’t convert. When they feel unsure, they abandon the flow. 


How AI is Improving UX in Banking (And Why Institutions Need it Now)

Financial institutions often assume customers will “figure it out.” They will Google a term, reread a disclosure, or call support if something is unclear. In reality, most customers simply exit the flow.

The CFPB shows that lower financial literacy leads to more mistakes, higher confusion, and weaker decision-making (CFPB, 2019). And when that confusion arises during a digital journey, customers quietly leave without resolving their questions.

This means every abandoned application costs money. Every misinterpreted term creates operational drag. Every unclear disclosure becomes a compliance liability. Institutions consistently point to misunderstanding as a major driver of complaints, errors, and churn (Lusardi et al., 2020).

Sometimes it feels like the industry built the digital bank faster than it built the explanation for it.

Where AI Makes the Difference

Many discussions about AI in financial services focus on automation or chatbots, but the real opportunity lies in real-time clarity. Clarity that improves financial product transparency and streamlines customer experience without creating extra steps.

In-context Explanations That Improve Understanding

Research in educational psychology shows people learn best when information appears the moment they need it. Mayer (2019) demonstrates that in-context explanations significantly boost comprehension. Instead of leaving the app to search unfamiliar terms, customers receive a clear, human explanation on the spot.

Consistency Across Channels

Language in banking is surprisingly inconsistent. Apps, websites, advisors, and support teams all use slightly different terms. Capgemini identifies cross-channel inconsistency as a major cause of digital frustration (Capgemini, 2023). A unified AI knowledge layer solves this by standardizing definitions across the system.

Predictive Clarity Powered by Behavioral Insight

Patterns like hesitation, backtracking, rapid clicking, or form abandonment often signal confusion. Behavioral economists note these patterns can predict drop-off before it happens (Loibl et al., 2021). AI can flag these friction points and help institutions fix them.

24/7 Clarity, Not 9–5 Support

Accenture reports that most digital banking interactions now occur outside of business hours (Accenture, 2023). AI allows institutions to provide accurate, transparent explanations anytime, without relying solely on support teams.

At its core, AI doesn’t simplify financial products. It translates them.

What Strong AI-Powered Customer Experience Looks Like

Onboarding that Explains Itself

  • Mortgage flows with one-sentence escrow definitions.
  • Credit card applications with visual explanations of usage.
  • Hybrid products that show exactly what blockchain is doing behind the scenes.

The CFPB shows that simpler, clearer formats directly improve decision quality (CFPB, 2020).

A Unified Dictionary Across Channels

The Federal Reserve emphasizes the importance of consistent terminology to help consumers make informed decisions (Federal Reserve Board, 2021). Some institutions now maintain a centralized term library that powers their entire ecosystem, creating a cohesive experience instead of fragmented messaging.

Personalization Based on User Behavior

Educational nudges, simplified paths, multilingual explanations. Research shows these interventions boost customer confidence (Kozup & Hogarth, 2008). 

Transparent Explanations for Hybrid or Blockchain-backed Products

Customers adopt new technology faster when they understand the mechanics behind it (University of Cambridge, 2021). AI can make complex automation and decentralized components understandable.

The Urgent Responsibilities That Come With This


GenAI can mislead customers without strong data governance and oversight. Poor training data, inconsistent terminology, or unmonitored AI systems create clarity gaps. That’s a problem because those gaps can become compliance issues. The Financial Stability Oversight Council warns that unmanaged AI introduces systemic risk (FSOC, 2023). The CFPB also emphasizes the need for compliant, accurate AI-generated content (CFPB, 2024).

Customers are also increasingly wary of data usage and privacy. Pew Research shows growing fear around how financial institutions use personal data (Pew Research Center, 2023). Trust requires transparency.

Clarity without governance is not clarity. It’s noise.

And institutions cannot afford noise.

What Institutions Should Build Right Now

To make clarity foundational to customer experience, financial institutions need to invest in:

  • Modern data pipelines to improve accuracy
  • Consistent terminology and UX layers across channels
  • Responsible AI frameworks with human oversight
  • Cross-functional collaboration between compliance, design, product, and analytics
  • Scalable architecture for automated and decentralized product components
  • Human-plus-AI support models that enhance, not replace, advisors

When clarity becomes structural, trust becomes scalable.

Why This Moment Matters

I keep coming back to the ATM because it perfectly shows what happens when technology outruns customer understanding. The machine wasn’t the problem. The knowledge gap was. Financial services are reliving that moment today.

Customers cannot trust what they do not understand.

And institutions cannot scale what customers do not trust.

GenAI gives financial organizations a second chance to rebuild the clarity layer the industry has lacked for decades, and not as marketing. Clarity, in this new landscape, truly is infrastructure.

Related Reading

References 

  • Accenture. (2023). Banking top trends 2023. https://www.accenture.com
  • Capgemini. (2023). World retail banking report 2023. https://www.capgemini.com
  • Consumer Financial Protection Bureau. (2019). Financial well-being in America. https://www.consumerfinance.gov
  • Consumer Financial Protection Bureau. (2020). Improving the clarity of mortgage disclosures. https://www.consumerfinance.gov
  • Consumer Financial Protection Bureau. (2024). Supervisory highlights: Issue 30. https://www.consumerfinance.gov
  • Federal Reserve Board. (2021). Consumers and mobile financial services. https://www.federalreserve.gov
  • FINRA Investor Education Foundation. (2022). National financial capability study. https://www.finrafoundation.org
  • Financial Stability Oversight Council. (2023). Annual report. https://home.treasury.gov
  • Kozup, J., & Hogarth, J. (2008). Financial literacy, public policy, and consumers’ self-protection. Journal of Consumer Affairs, 42(2), 263–270.
  • Loibl, C., Grinstein-Weiss, M., & Koeninger, J. (2021). Consumer financial behavior in digital environments. Journal of Economic Psychology, 87, 102438.
  • Lusardi, A., Mitchell, O. S., & Oggero, N. (2020). The changing face of financial literacy. University of Pennsylvania, Wharton School.
  • Mayer, R. (2019). The Cambridge handbook of multimedia learning. Cambridge University Press.
  • Pew Research Center. (2023). Americans and data privacy. https://www.pewresearch.org
  • Pew Research Center. (2024). Americans and financial knowledge. https://www.pewresearch.org
  • University of Cambridge. (2021). Global blockchain benchmarking study. https://www.jbs.cam.ac.uk
]]>
https://blogs.perficient.com/2025/12/01/ai-banking-transparency-genai-financial-ux/feed/ 6 388706
Perficient Hyderabad Cricket Tournament Recap https://blogs.perficient.com/2025/12/01/perficient-hyderabad-cricket-tournament-recap-2025/ https://blogs.perficient.com/2025/12/01/perficient-hyderabad-cricket-tournament-recap-2025/#respond Mon, 01 Dec 2025 13:19:31 +0000 https://blogs.perficient.com/?p=388681

Introduction

The Perficient Hyderabad Cricket Tournament Recap‑2025 highlights a lively and memorable day at our Hyderabad office. The event brought together four enthusiastic teams—Challengers, Risers, Strikers, and Warriors—who displayed strong energy from the very first match. The tournament built excitement, encouraged teamwork, and strengthened our workplace spirit.

A Thrilling Path to Victory in the Perficient Hyderabad Cricket Tournament

The tournament kicked off with fast‑paced matches. Every team played with confidence and intention. The Risers delivered consistent performance and smart strategy throughout the event. Their teamwork helped them secure the championship title. The Warriors finished as the runner‑up team, winning applause for their determined effort until the last over.

You can also revisit our Perficient Hyderabad Cricket Tournament Recap‑2024 to see how last year’s matches unfolded.


Celebration at the Townhall: Recap 2025

The monthly town hall added more excitement to the day. Leaders shared updates, upcoming plans, and new initiatives that inspired the entire team. After the discussions, everyone looked forward to the tournament results. The hall filled with cheers when the Risers received their trophies, marking a proud moment in the Perficient Hyderabad Cricket Tournament Recap‑2025.

Lunch, Laughter, and Team Bonding at Perficient Hyderabad Cricket Tournament

The day continued with a cheerful lunch for all. Conversations flowed, teammates shared fun match moments, and everyone enjoyed the relaxed atmosphere. Teammates discussed strategies, celebrated highlights, and bonded across teams. The combination of cricket, celebration, and connection created an experience that everyone will remember.

This extended camaraderie made the Perficient Hyderabad Cricket Tournament Recap‑2025 more than just a sporting event—it became a symbol of workplace unity.



A Memorable Day for Perficient Hyderabad

The Perficient Hyderabad Cricket Tournament Recap‑2025 delivered more than a winning team. It strengthened team spirit, encouraged friendly competition, and brought everyone closer together. The celebration at the town hall added meaning to the moment and wrapped up the day perfectly.

This event stands as a reminder that teamwork and enthusiasm make every workplace brighter. For more highlights, explore our Perficient Blog Page.

]]>
https://blogs.perficient.com/2025/12/01/perficient-hyderabad-cricket-tournament-recap-2025/feed/ 0 388681
Building with Sitecore APIs: From Authoring to Experience Edge https://blogs.perficient.com/2025/11/28/building-with-sitecore-apis-from-authoring-to-experience-edge/ https://blogs.perficient.com/2025/11/28/building-with-sitecore-apis-from-authoring-to-experience-edge/#respond Fri, 28 Nov 2025 20:31:24 +0000 https://blogs.perficient.com/?p=388662

Sitecore has made a significant shift towards a fully API-first, headless-friendly architecture. This modern approach decouples content management from delivery, giving developers unprecedented flexibility to work with content from virtually anywhere—be it front-end applications, backend systems, integration services, or automated pipelines.

One of the biggest advantages of this shift is that you no longer need server-side access to Sitecore to manipulate content. Instead, the system exposes a robust set of APIs to support these powerful new use cases.

Sitecore provides three key APIs, each designed for a specific purpose: the Experience Edge Delivery API, the Authoring API, and the Management API. Understanding how these APIs relate and differ is crucial for designing robust external integrations, building sync services, and managing your content programmatically.

This blog will provide a practical, end-to-end view of how these APIs fit into modern architectures. We will walk through how any external system can call the Authoring API using GraphQL, and how to execute common GraphQL mutations such as create, update, delete, rename, and move. If you’re building integration services or automation pipelines for SitecoreAI, this will give you a complete picture of what’s possible.

Sitecore’s modern architecture separates content operations into three distinct API layers. This crucial separation is designed to ensure scalability, security, and clear responsibility boundaries across the content lifecycle.

Let’s break down the purpose and typical use case for each API:

1. Experience Edge Delivery API

The Experience Edge Delivery API is Sitecore’s public-facing endpoint, dedicated purely to high-performance content delivery.

  • Primary Use: Used primarily by your front-end applications (e.g., Next.js, React, mobile apps) and kiosks to fetch published content for your presentation layer.

  • Core Function: It is fundamentally read-only and does not support content creation or modification.

  • Interface: Exposes a GraphQL endpoint that allows for querying items, fields, and components efficiently.

  • Authentication: Requires minimal or no complex authentication (often just an API key) when fetching published content, as it is designed for global, low-latency access.

  • Endpoint: https://edge.sitecorecloud.io/api/graphql/v1
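As a concrete illustration, here is a minimal Python sketch that assembles the JSON body for an Edge query. The item path and field selection are hypothetical examples, not part of the Sitecore docs quoted above; shape the query to your own published schema, and note that Edge requests typically carry the API key in an `sc_apikey` header.

```python
import json

# Public delivery endpoint from above; read-only, published content only.
EDGE_ENDPOINT = "https://edge.sitecorecloud.io/api/graphql/v1"

def build_item_query(path: str, language: str = "en") -> dict:
    """Build a GraphQL request body that fetches an item's name and
    field values. The field selection below is a minimal illustration."""
    query = """
    query GetItem($path: String!, $language: String!) {
      item(path: $path, language: $language) {
        name
        fields {
          name
          value
        }
      }
    }
    """
    return {"query": query, "variables": {"path": path, "language": language}}

# POST this body as JSON to EDGE_ENDPOINT with an `sc_apikey` header
# carrying your Edge API key. The path below is a placeholder.
body = build_item_query("/sitecore/content/MySite/Home")
print(json.dumps(body["variables"], sort_keys=True))
# → {"language": "en", "path": "/sitecore/content/MySite/Home"}
```

Because this body is plain JSON over HTTP, any client (front end, integration service, or script) can issue the same request.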

2. Authoring API (GraphQL)

The Authoring API is the control center for all item-level content management operations from external systems.

  • Primary Use: This is the API you use when building integration pipelines, external systems, or third-party applications that need to manipulate content programmatically.

  • Core Functions: It allows external systems to perform the same operations authors execute in the CMS UI, including:

    • Create, update, and delete items.

    • Rename or move items.

    • Manage media assets.

    • Work with workflows and language settings.

  • Interface: Exposed through a dedicated GraphQL endpoint that supports both queries and mutations.

  • Authentication: All calls must be authenticated. The recommended secure approach is using OAuth’s client_credentials flow to obtain a Bearer JWT access token, as detailed in Sitecore’s security documentation.

  • Endpoint Structure: The endpoint is hosted on your Content Management (CM) instance, following a structure like:
    https://your-cm-instance/sitecore/api/authoring/graphql/v1
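To make the mutation side tangible, the Python helper below sketches a `createItem` request body. The mutation shape follows Sitecore’s documented Authoring API pattern, but the GUIDs are placeholders and the exact input fields should be verified against your schema (the GraphQL Playground is handy for this).

```python
def build_create_item_mutation(name: str, template_id: str, parent_id: str,
                               language: str = "en") -> dict:
    """Assemble a createItem mutation body. Arguments are inlined here
    for readability; production code should use GraphQL variables rather
    than string interpolation to avoid malformed or unsafe queries."""
    mutation = f'''
    mutation {{
      createItem(
        input: {{
          name: "{name}"
          templateId: "{template_id}"
          parent: "{parent_id}"
          language: "{language}"
        }}
      ) {{
        item {{ itemId name path }}
      }}
    }}
    '''
    return {"query": mutation}

# Placeholder GUIDs; the body would be POSTed to your CM's
# /sitecore/api/authoring/graphql/v1 endpoint with an
# Authorization: Bearer <token> header.
body = build_create_item_mutation(
    "Hello World",
    "{TEMPLATE-GUID-HERE}",
    "{PARENT-GUID-HERE}",
)
print("createItem" in body["query"])  # → True
```

The other mutations mentioned earlier (update, delete, rename, move) follow the same pattern: a mutation name, an `input` object, and a selection of fields to return.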

3. Management API

The Management API supports all administrative, system, and environment-level capabilities.

  • Primary Use: Often used in CI/CD pipelines, server-side processes, and automated scripts for environment maintenance.

  • Core Functions: These include operations that affect the system state or background jobs, such as:

    • Triggering content publishing jobs.

    • Running index rebuilds.

    • Managing environment metadata and background jobs.

    • Generating access tokens (such as through the client_credentials flow).

  • Interface: It shares the same GraphQL endpoint as the Authoring API.

  • Endpoint: same as Authoring API.
    Note: The distinction between the Authoring and Management API is primarily managed by the OAuth scopes assigned to the access token used for authentication, not by a different URL.
  • Relationship to Authoring: While it doesn’t handle item-level content edits, it works alongside the Authoring API to support a full content lifecycle, such as writing content (Authoring API) followed by publishing it (Management API).
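Because the two APIs share one endpoint and differ only by token scopes, a single request helper can serve both. This stdlib-only Python sketch (the CM hostname and token are placeholders) just constructs the authenticated request object; sending it and handling the response are left to the caller.

```python
import json
import urllib.request

def graphql_request(endpoint: str, token: str, body: dict) -> urllib.request.Request:
    """Build an authenticated POST for the shared Authoring/Management
    GraphQL endpoint. What the call is allowed to do is governed by the
    OAuth scopes baked into the token, not by the URL."""
    return urllib.request.Request(
        endpoint,
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )

req = graphql_request(
    "https://your-cm-instance/sitecore/api/authoring/graphql/v1",  # placeholder host
    "eyJhbGciOi...",  # JWT from the client_credentials flow (truncated placeholder)
    {"query": "query { __typename }"},
)
print(req.method, req.get_header("Content-type"))  # → POST application/json
```

A content sync service would reuse this helper twice in sequence: once with an authoring-scoped body to write items, then once with a management-scoped body to publish them.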

Enabling the Authoring and Management APIs: The Prerequisites

Before we can send our first GraphQL mutation to manage content, we have to handle the setup and security. The prerequisites for enabling the Authoring and Management APIs are slightly different depending on your Sitecore environment, but the end goal is the same: getting a secure access token.

Sitecore XM Cloud / SitecoreAI

If you’re on a cloud-native platform like XM Cloud or SitecoreAI, the GraphQL endpoints are already up and running. You don’t have to fiddle with configuration files. Your main focus is on authorization:

  1. Generate Credentials: You need to use the Sitecore interface (often in the Manage or Connect section) to generate a set of Client Credentials (a Client ID and a Client Secret). These are your secure “keys” to the content.

  2. Define Scopes: When you generate these credentials, you must ensure the associated identity has the appropriate OAuth scopes. For instance, you’ll need scopes like sitecore.authoring and sitecore.management to be included in your token request. This is what tells the system what your application is actually allowed to do (read, write, or publish).

Sitecore XM/XP

For traditional, self-hosted Sitecore XM installations, you have a small administrative step to get the endpoints operational:

  1. Enable the Endpoint: You need to deploy a simple configuration patch file. This patch explicitly enables the API endpoint itself and often the helpful GraphQL Playground IDE (for easy testing). You’ll typically set configuration settings like these in your CM instance:

    <setting name="GraphQL.Enabled" value="true" />
    <setting name="GraphQL.ExposePlayground" value="true" />

  2. Configure Identity Server: Similar to XM Cloud, you then need to register your client application with your Sitecore Identity Server. This involves creating a client record in your IDS configuration that specifies the required allowedGrantTypes (like client_credentials) and the necessary allowedScopes (sitecore.authoring, etc.).

Whether you’re on SitecoreAI/XM Cloud or Sitecore XP/XM, the biggest hurdle is obtaining that secure JWT Bearer token. Every request you send to the Authoring and Management APIs must include this token in the Authorization header. We’ll dive into the client_credentials flow for getting this token in the next section.

For the absolute definitive guide on the steps specific to your environment, always check the official documentation: Sitecore XM Cloud / SitecoreAI and Sitecore XP/XM.

Authoring API – Authentication, Requests, and Query Examples

The Authoring API exposes the full set of content-management capabilities through GraphQL. Because these operations can modify items, media, workflows, and other critical pieces of the content tree, every request must be authenticated. The Authoring API uses OAuth, and the recommended approach is the client_credentials flow.

To authorize a request, you first create a client in the Sitecore Cloud Portal. This client gives you a client_id and client_secret. Once you have them, you request an access token from the token endpoint:

POST https://auth.sitecorecloud.io/oauth/token
Content-Type: application/x-www-form-urlencoded

grant_type=client_credentials
client_id=your_client_id
client_secret=your_client_secret
audience=https://api.sitecorecloud.io

The response contains an access_token and an expiry. This token is then passed in the Authorization header for all subsequent GraphQL calls.
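As a quick sketch, this token exchange can be scripted with nothing but the Python standard library. The endpoint and audience values mirror the request above; the helper names (`build_token_request`, `fetch_token`) are illustrative, not part of any Sitecore SDK:

```python
import json
import urllib.parse
import urllib.request

TOKEN_ENDPOINT = "https://auth.sitecorecloud.io/oauth/token"

def build_token_request(client_id: str, client_secret: str) -> urllib.request.Request:
    """Build the client_credentials token request shown above."""
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "audience": "https://api.sitecorecloud.io",
    }).encode("utf-8")
    return urllib.request.Request(
        TOKEN_ENDPOINT,
        data=body,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
        method="POST",
    )

def fetch_token(client_id: str, client_secret: str) -> str:
    """Send the request and return the access_token from the JSON response."""
    with urllib.request.urlopen(build_token_request(client_id, client_secret)) as resp:
        return json.loads(resp.read())["access_token"]
```

Because the token expires, a long-running service should cache it and request a fresh one once the expiry passes rather than fetching a token per call.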

The Authoring API endpoint accepts only POST requests with a JSON body containing a GraphQL query or mutation. A typical request looks like this:

POST https://your-tenant.sitecorecloud.io/api/authoring/graphql
Authorization: Bearer your_access_token
Content-Type: application/json

However, the real value of the Authoring API comes from the mutations it supports. These mutations allow external systems to take over tasks that were traditionally only possible inside the CMS. They enable you to create new content, update fields, delete obsolete items, restructure information architecture, or even rename and move items. For integrations, sync services, or automated workflows, these mutations become the core building blocks. Below are a few mutations that can be helpful:

1. Create Item:

mutation CreateItem(
  $name: String!
  $parentId: ID!
  $templateId: ID!
  $fields: [FieldValueInput]!
) {
  createItem(
    input: {
      name: $name
      parent: $parentId
      templateId: $templateId
      fields: $fields
    }
  ) {
    item {
      ItemID: itemId
      ItemName: name
    }
  }
}

Input Variables:

{
  "name": "<item-name>",
  "parentId": "<parent-item-id>",
  "templateId": "<template-id>",
  "fields": [
    { "name": "title", "value": "Contact US" },
    { "name": "text", "value": "Contact US Here" }
  ]
}

2. Update Item:

mutation UpdateItem(
  $id: ID!
  $database: String!
  $language: String!
  $fields: [FieldValueInput!]
) {
  updateItem(
    input: {
      itemId: $id
      database: $database
      language: $language
      version: 1
      fields: $fields
    }
  ) {
    item {
      ItemID: itemId
      ItemName: name
    }
  }
}


Input Variables:

{
  "id": "<item-id>",
  "database": "master",
  "language": "en",
  "fields": [
    { "name": "Title", "value": "New Title" },
    { "name": "Content", "value": "New Content" }
  ]
}

3. Rename Item:

mutation RenameItem(
  $id: ID!, 
  $database: String!,
  $newName: String!
) {
  renameItem(
    input: {
      itemId: $id,
      database: $database, 
      newName: $newName 
  }
) {
    item {
      ItemID: itemId
      ItemName: name
    }
  }
}

Input Variables:

{
  "id": "<item-id>",
  "database": "master",
  "newName": "<new-item-name>"
}

4. Move Item:

mutation MoveItem($id: ID!, $targetParentId: ID!) {
  moveItem(input: { itemId: $id, targetParentId: $targetParentId }) {
    item {
      ItemID: itemId
      ItemName: name
    }
  }
}

Input Variables:

{
  "id": "<item-id>",
  "targetParentId": "<target-parent-item-id>"
}

5. Delete Item:

mutation DeleteItem($itemID: ID!) {
  deleteItem(input: { itemId: $itemID, permanently: false }) {
    successful
  }
}

Input Variables:

{
  "itemID": "<item-id>"
}

These mutations are extremely powerful because they give you full authoring control from any external system. You can build automated pipelines, sync content from third-party sources, integrate back-office systems, or maintain content structures without needing direct access to the CMS. The Authoring API essentially opens up the same level of control Sitecore developers traditionally had through server-side APIs, but now in a clean, modern, and fully remote GraphQL form.
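To make that concrete, here is a minimal Python sketch that posts the CreateItem mutation from an external service. The endpoint URL follows the example shown earlier, and the helper functions are assumptions for illustration rather than an official client:

```python
import json
import urllib.request

# Hypothetical endpoint; substitute your own tenant or CM instance URL.
AUTHORING_ENDPOINT = "https://your-tenant.sitecorecloud.io/api/authoring/graphql"

CREATE_ITEM_MUTATION = """
mutation CreateItem($name: String!, $parentId: ID!, $templateId: ID!, $fields: [FieldValueInput]!) {
  createItem(input: { name: $name, parent: $parentId, templateId: $templateId, fields: $fields }) {
    item { itemId name }
  }
}
"""

def build_graphql_payload(query: str, variables: dict) -> bytes:
    """Serialize a GraphQL query/mutation and its variables as a JSON body."""
    return json.dumps({"query": query, "variables": variables}).encode("utf-8")

def post_mutation(token: str, query: str, variables: dict) -> dict:
    """POST a mutation to the Authoring API with the Bearer token."""
    req = urllib.request.Request(
        AUTHORING_ENDPOINT,
        data=build_graphql_payload(query, variables),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

The same `post_mutation` helper works for UpdateItem, RenameItem, MoveItem, and DeleteItem; only the mutation string and variables change.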

Management API – Authentication and Usage

The Management API sits alongside the Authoring API but focuses on administrative and system-level operations. These include running indexing jobs, publishing content, listing background jobs, working with workflows, or inspecting environment metadata. The authentication model is the same: you obtain an access token using the same client_credentials flow and include it in the Authorization header when making requests.

The Management API also uses GraphQL. As noted earlier, it shares its endpoint with the Authoring API; what you can do is determined by the OAuth scopes on your access token, not by a separate URL. The requests still follow the same structure: POST calls with a JSON body containing the GraphQL query or mutation.

A typical request looks like:

POST https://your-tenant.sitecorecloud.io/api/authoring/graphql
Authorization: Bearer your_access_token
Content-Type: application/json

A common example is triggering a publish operation. The mutation for that might look like:

mutation PublishItem($root: ID!) {
  publishItem(
    input: {
      rootItemId: $root
      languages: "en"
      targetDatabases: "experienceedge"
      publishItemMode: FULL
      publishRelatedItems: false
      publishSubItems: true
    }
  ) {
    operationId
  }
}

Input Variables:

{
  "root": "<item-to-publish>"
}

The Management API is often used after content changes are made through the Authoring API. For example, after creating or modifying items, your external service may immediately trigger a publish so that the changes become available through Experience Edge.

The authorization workflow is identical to the Authoring API, which keeps integration straightforward: your service requests one token and can use it for both Authoring and Management operations as long as the client you registered has the appropriate scopes.

Experience Edge Delivery API – Authentication and Query Examples

Experience Edge exposes published content through a globally distributed read-only API. Unlike the Authoring and Management APIs, the Delivery API uses API keys rather than OAuth tokens for content retrieval. However, the API key itself is obtained through an authenticated request that also uses an access token.

To get the Experience Edge API key for a specific environment, you first authenticate using the same client_credentials flow. Once you have your access token, you call the Deploy or Environment API endpoint to generate or retrieve an Edge Access Token or Delivery API key for that specific environment. This token is what your application uses when querying Edge.

Once you have the key, requests to Experience Edge look more like this:

POST https://edge.sitecorecloud.io/api/graphql/v1
X-GQL-Token: your_edge_api_key
Content-Type: application/json

A basic read query might be:

query ItemExists($id: String!, $language: String!) {
  item(path: $id, language: $language) {
    ItemID: id
    ItemName: name
  }
}

Input Variables:

{
  "id": "<item-id>",
  "language": "en"
}

Experience Edge only returns published content. If you have just created or updated an item through the Authoring API, it will not be available in Edge until a publish operation has been performed, either manually or through the Management API.

The workflow for external applications is usually:

  1. Obtain access token
  2. Use token to retrieve or generate the Edge API key
  3. Use the Edge key in all Delivery API requests
  4. Query published content through GraphQL

Because Edge is optimized for front-end delivery, it is highly structured, cached, and tuned for fast reads. It does not support mutations. Anything involving content modification must happen through the Authoring API.
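Putting the pieces together, a small Python sketch of a Delivery API call might look like this. The endpoint matches the one shown above; note that the request carries the X-GQL-Token header rather than a Bearer token, and the helper names are hypothetical:

```python
import json
import urllib.request

EDGE_ENDPOINT = "https://edge.sitecorecloud.io/api/graphql/v1"

ITEM_QUERY = """
query ItemExists($id: String!, $language: String!) {
  item(path: $id, language: $language) {
    id
    name
  }
}
"""

def build_edge_request(edge_api_key: str, item_id: str, language: str = "en") -> urllib.request.Request:
    """Build a read-only Delivery API request authenticated via X-GQL-Token."""
    body = json.dumps({
        "query": ITEM_QUERY,
        "variables": {"id": item_id, "language": language},
    }).encode("utf-8")
    return urllib.request.Request(
        EDGE_ENDPOINT,
        data=body,
        headers={"X-GQL-Token": edge_api_key, "Content-Type": "application/json"},
        method="POST",
    )

def query_edge(edge_api_key: str, item_id: str, language: str = "en") -> dict:
    """Send the request and return the parsed GraphQL response."""
    with urllib.request.urlopen(build_edge_request(edge_api_key, item_id, language)) as resp:
        return json.loads(resp.read())
```

Remember that this only ever sees published content: an item created via the Authoring API will return null here until a publish has completed.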

Making Sense of the Entire Flow

With the combination of Experience Edge for delivery and the Authoring and Management APIs for write and operational tasks, Sitecore has opened up a model where external systems can participate directly in the creation, maintenance, and publication of content without ever touching the CM interface. This gives developers and teams a lot more freedom. You can build sync services that keep Sitecore aligned with external data sources, migrate content with far less friction, or automate repetitive authoring work that used to require manual effort. It also becomes straightforward to push structured data, such as products, locations, events, or practitioner information, into Sitecore from CRMs, commerce engines, or any internal system you rely on. Everything is just an authenticated GraphQL call away.

The separation between these APIs also brings clarity. The Authoring API handles the content changes, the Management API supports the operational steps around them, and Experience Edge takes care of delivering that content efficiently to any front end. Each piece has its own responsibility, and they work together without getting in each other’s way. Authors continue working in the CMS. Front-end applications consume only published content. Integration services run independently using APIs built for automation.

The end result is a content platform that fits naturally into modern technical ecosystems. It’s cloud-friendly, headless from the ground up, and flexible enough to integrate with whatever tools or systems an organization already uses. And because everything runs through secure, well-defined APIs, you get consistency, stability, and a workflow that scales as your requirements grow.

This unified approach, combining external content operations through the Authoring and Management APIs with high-performance delivery through Experience Edge, is what makes the platform genuinely powerful. It lets you build reliable, maintainable, and future-ready content solutions without being tied to the internals of the CMS, and that is a significant shift in how we think about managing content today.

How to Approach Implementing Sitecore Content Hub https://blogs.perficient.com/2025/11/26/how-to-approach-implementing-sitecore-content-hub/ https://blogs.perficient.com/2025/11/26/how-to-approach-implementing-sitecore-content-hub/#respond Wed, 26 Nov 2025 20:38:38 +0000 https://blogs.perficient.com/?p=388649

Content chaos is costing you more than you think

Every disconnected asset, every redundant workflow, every missed opportunity to reuse content adds up, not just in operational inefficiency, but in lost revenue, slower time-to-market, and diminished brand consistency. For many organizations, the content supply chain is broken, and the cracks show up everywhere: marketing campaigns delayed, creative teams overwhelmed, and customers receiving fragmented experiences.

Sitecore Content Hub can help solve this, but here’s the truth: technology alone won’t solve the problem. Success requires a strategic approach that aligns people, processes, and platforms. Over the years, I’ve seen one principle hold true: when you break the process into digestible steps, clarity emerges. Here’s the five-step framework I recommend for leaders who want to turn Content Hub into a competitive advantage. It’s what I wish I had before my first implementation. While Content Hub is extremely powerful for a Digital Asset Management (DAM) platform, and there could be entire books written on each configuration point, my hope in this post is to give someone new to the platform a mindset to have before beginning an implementation.

 

Step 1: Discover and Decode

Transformation starts with visibility. Before you configure anything, take a hard look at your current state. What assets do you have? How do they move through your organization, from creation to approval to archival? Who touches them, and where do bottlenecks occur?

This isn’t just an audit; it’s an opportunity to uncover inefficiencies and align stakeholders. Ask questions like:

  • Are we duplicating content because teams don’t know what already exists?
  • Where are the delays that slow down time-to-market?
  • Which assets drive value and which are digital clutter?

Document these insights in a way that tells a story. When leadership sees the cost of inefficiency and the opportunity for improvement, alignment becomes easier. This step sets the foundation for governance, taxonomy, and integration decisions later. Skip it, and everything else wobbles.

 

Step 2: Design the Blueprint

Once you know where you are, define where you’re going. This is your architectural phase and the moment to design a system that scales.

Start with taxonomy. A well-structured taxonomy makes assets easy to find and reuse, while a poor one creates friction and frustration. Establish naming conventions and metadata standards that support searchability and personalization. Then, build a governance model that enforces consistency without stifling creativity.

Finally, map the flow of content across systems. Where is content coming from? Where does it need to go? These answers determine integration points and connectors. If you skip this step, you risk building silos inside your new system, which is a mistake that undermines the entire investment.

 

Step 3: Deploy the (Content) Hub

See what we did there?! With the blueprint in hand, it’s time to implement. Configure the environment, validate user roles, and migrate assets with care.

Deployment is more than a technical exercise. It’s a change management moment. How you roll out the platform will influence adoption. Consider a phased approach: start with a pilot group, gather feedback, and refine before scaling.

Testing is critical. Validate search functionality, user permissions, and workflows before you go live. A smooth deployment isn’t just about avoiding errors. It’s about building confidence across the organization.

 

Step 4: Drive Intelligent Delivery

Content Hub isn’t just a repository; it’s a strategic engine. This is where you unlock its full potential. Enable AI features to automate tagging and improve personalization. Create renditions and transformations that make omnichannel delivery seamless.

Think beyond efficiency. Intelligent delivery is about elevating the customer experience. When your content is enriched with metadata and optimized for every channel, you’re not just saving time. You’re driving engagement and revenue.

Governance plays a starring role here. Standards aren’t just rules. They’re the guardrails that keep your ecosystem healthy and scalable. Without them, even the smartest technology can devolve into chaos.

 

Step 5: Differentiate

This is where leaders separate themselves from the pack. Implementation is not the finish line—it’s the starting point for continuous improvement.

Differentiation begins with measurement. Build dashboards that show how content performs across channels and campaigns. Which assets drive conversions? Which formats resonate with your audience? These insights allow you to double down on what works and retire what doesn’t.

But don’t stop at performance metrics. Use audits to identify gaps in your content strategy. Are you missing assets for emerging channels? Are you over-investing in content that doesn’t move the needle? This level of visibility turns your content operation into a strategic lever for growth.

Finally, think about innovation. How can you use Content Hub to enable personalization at scale? How can AI-driven insights inform creative decisions? Leaders who embrace this mindset turn Content Hub from a tool into a competitive advantage.

 

Final Thoughts

Your current state may feel daunting, but clarity is within reach. By breaking the process into these five steps, you can transform chaos into a content strategy that drives real business outcomes. Sitecore Content Hub is powerful—but only if you implement it with intention.

Ready to start your journey? Begin with discovery. The rest will follow. If Perficient can help, reach out!
