Strategy and Transformation Articles / Blogs / Perficient
https://blogs.perficient.com/category/services/strategy-and-consulting/

Executing a Sitecore Migration: Development, Performance, and Beyond
https://blogs.perficient.com/2025/10/28/executing-a-sitecore-migration-development/
Tue, 28 Oct 2025 12:23:25 +0000

In the previous blog, we explored the strategic and architectural considerations that set the foundation for a successful Sitecore migration. Once the groundwork is ready, it’s time to move from planning to execution, where the real complexity begins. The development phase of a Sitecore migration demands precision, speed, and scalability. From choosing the right development environment and branching strategy to optimizing templates, caching, and performance, every decision directly impacts the stability and maintainability of your new platform.

This blog dives into the practical side of migration, covering setup best practices, developer tooling (IDE and CI/CD), coding standards, content model alignment, and performance tuning techniques to help ensure that your transition to Sitecore’s modern architecture is both seamless and future-ready.

 

1. Component and Code Standards Over Blind Reuse

In any Sitecore migration, one of the biggest mistakes teams make is lifting and shifting old components into the new environment. While this may feel faster in the short term, it creates long-term problems:

  • Missed product offerings: Old components were often built around constraints of an earlier Sitecore version. Reusing them as-is means you can’t take advantage of new product features like improved personalization, headless capabilities, SaaS integrations, and modern analytics.
  • Outdated standards: Legacy code usually does not meet current coding, security, and performance standards. This can introduce vulnerabilities and inefficiencies into your new platform.
  • Accessibility gaps: Many older components don’t align with WCAG and ADA accessibility standards — missing ARIA roles, semantic HTML, or proper alt text. Reusing them carries accessibility debt into your fresh build.
  • Maintainability issues: Old code often has tight coupling, minimal test coverage, and obsolete dependencies. Keeping it will slow down future upgrades and maintenance.

Best practice: Treat the migration as an opportunity to raise your standards. Audit old components for patterns and ideas, but don’t copy-paste them. Rebuild them using modern frameworks, Sitecore best practices, security guidelines, and accessibility compliance. This ensures the new solution is future-proof and aligned with the latest Sitecore roadmap.

 

2. Template Creation and Best Practices

Templates define the foundation of your content structure, so designing them carefully is critical:

  • Analyze before creating: Study existing data models, pages, and business requirements before building templates.
  • Use base templates: Group common fields (e.g., Meta, SEO, audit info) into base templates and reuse them across multiple content types.
  • Leverage branch templates: Standardize complex structures (like a landing page with modules) by creating branch templates for consistency and speed.
  • Follow naming and hierarchy conventions: Clear naming and logical organization make maintenance much easier.

 

3. Development Practices and Tools

A clean, standards-driven development process ensures the migration is efficient, maintainable, and future-proof. It’s not just about using the right IDEs but also about building code that is consistent, compliant, and friendly for content authors.

  • IDEs & Tools
    • Use Visual Studio or VS Code with Sitecore- and frontend-specific extensions for productivity.
    • Set up linting, code analysis, and formatting tools (e.g., ESLint and Prettier for JSS code, StyleCop for .NET) to enforce consistency.
    • Use AI assistance (GitHub Copilot, Codeium, etc.) to speed up development, but always review outputs for compliance and quality. Many AI tools on the market can even convert designs or prototypes into code in a specified language.
  • Coding Standards & Governance
    • Follow SOLID principles and keep components modular and reusable.
    • Ensure secure coding standards: sanitize inputs, validate data, avoid secrets in code.
    • Write accessible code: semantic HTML, proper ARIA roles, alt text, and keyboard navigation.
    • Document best practices and enforce them with pull request reviews and automated checks.
  • Package & Dependency Management
    • Select npm/.NET packages carefully: prefer well-maintained, community-backed, and security-reviewed ones.
    • Avoid large, unnecessary dependencies that bloat the project.
    • Run dependency scanning tools to catch vulnerabilities.
    • Keep lockfiles for environment consistency.
  • Rendering Variants & Parameters
    • Leverage rendering variants (SXA/headless) to give flexibility without requiring code changes.
    • Add parameters so content authors can adjust layouts, backgrounds, or alignment safely.
    • Always provide sensible defaults to protect design consistency.
  • Content Author Experience

Build with the content author in mind:

    • Use clear, meaningful field names and help text.
    • Avoid unnecessary complexity: fewer, well-designed fields are better.
    • Create modular components that authors can configure and reuse.
    • Validate with content author UAT to ensure the system is intuitive for day-to-day content updates.
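The rendering-parameter idea above can be sketched in code. The following is a minimal, hypothetical example for a headless component: the parameter names (`theme`, `alignment`, `columns`) and the settings shape are illustrative assumptions, not a real Sitecore API. The point is that author-facing parameters are validated and clamped, with sensible defaults protecting design consistency.

```typescript
// Hypothetical rendering-parameter handling for a headless Sitecore component.
// Parameter names (theme, alignment, columns) are illustrative, not a real Sitecore API.

type RenderingParams = Record<string, string | undefined>;

interface HeroSettings {
  theme: "light" | "dark";
  alignment: "left" | "center" | "right";
  columns: number;
}

// Sensible defaults protect design consistency when authors leave parameters unset.
const DEFAULTS: HeroSettings = { theme: "light", alignment: "left", columns: 2 };

function resolveHeroSettings(params: RenderingParams): HeroSettings {
  const theme = params.theme === "dark" ? "dark" : DEFAULTS.theme;
  const alignment =
    params.alignment === "center" || params.alignment === "right"
      ? params.alignment
      : DEFAULTS.alignment;
  const columns = Number(params.columns);
  return {
    theme,
    alignment,
    // Clamp author input to a safe range so a typo cannot break the layout.
    columns:
      Number.isInteger(columns) && columns >= 1 && columns <= 4
        ? columns
        : DEFAULTS.columns,
  };
}
```

An author setting `columns` to an out-of-range value simply falls back to the default, so the page never renders broken.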

Strong development practices not only speed up migration but also set the stage for easier maintenance, happier authors, and a longer-lasting Sitecore solution.

 

4. Data Migration & Validation

Migrating data is not just about “moving items.” It’s about translating old content into a new structure that aligns with modern Sitecore best practices.

  • Migration tools
    Sitecore provides migration tools, such as XM to XM Cloud tooling, to move data. Leverage these tools for content that can be copied as-is.
  • PowerShell for Migration
    • Use Sitecore PowerShell Extensions (SPE) to script the migration of data that cannot be moved as-is: content that must land in different locations or fields in the new system.
    • Automate bulk operations like item creation, field population, media linking, and handling of multiple language versions.
    • PowerShell scripts can be run iteratively, making them ideal as content continues to change during development.
    • Always include logging and reporting so migrated items can be tracked, validated, and corrected if needed.
  • Migration Best Practices
    • Field Mapping First: Analyze old templates and decide what maps directly, what needs transformation, and what should be deprecated.
    • Iterative Migration: Run migration scripts in stages, validate results, and refine before final cutover.
    • Content Cleanup: Remove outdated, duplicate, or unused content instead of carrying it forward.
    • SEO Awareness: Ensure titles, descriptions, alt text, and canonical fields are migrated correctly.
    • Audit & Validation:
      • Use PowerShell reports to check item counts, empty fields, or broken links.
      • Crawl both old and new sites with tools like Screaming Frog to compare URLs, metadata, and page structures.
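The crawl comparison step above can be partially automated. Here is a hedged sketch that diffs two crawl exports (e.g., Screaming Frog CSVs parsed upstream into records); the record shape is an assumption for illustration only.

```typescript
// Illustrative comparison of two crawl exports (e.g., parsed Screaming Frog CSVs).
// The CrawlRecord shape is an assumption for this sketch.

interface CrawlRecord {
  url: string; // normalized path portion
  title: string;
  description: string;
}

interface CrawlDiff {
  missingUrls: string[]; // old-site pages with no new-site counterpart
  metadataMismatches: { url: string; field: "title" | "description" }[];
}

function compareCrawls(oldCrawl: CrawlRecord[], newCrawl: CrawlRecord[]): CrawlDiff {
  const byUrl = new Map(newCrawl.map((r): [string, CrawlRecord] => [r.url, r]));
  const diff: CrawlDiff = { missingUrls: [], metadataMismatches: [] };
  for (const oldPage of oldCrawl) {
    const newPage = byUrl.get(oldPage.url);
    if (!newPage) {
      // Candidate for a 301 redirect or re-migration.
      diff.missingUrls.push(oldPage.url);
      continue;
    }
    if (newPage.title !== oldPage.title) {
      diff.metadataMismatches.push({ url: oldPage.url, field: "title" });
    }
    if (newPage.description !== oldPage.description) {
      diff.metadataMismatches.push({ url: oldPage.url, field: "description" });
    }
  }
  return diff;
}
```

Running this iteratively after each migration pass gives a concrete worklist of missing pages and metadata regressions.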

 

5. SEO Data Handling

SEO is one of the most critical success factors in any migration — if it’s missed, rankings and traffic can drop overnight.

  • Metadata: Preserve titles, descriptions, alt text, and Open Graph tags. Missing these leads to immediate SEO losses.
  • Redirects: Map old URLs with 301 redirects (avoid chains). Broken redirects = lost link equity.
  • Structured Data: Add/update schema (FAQ, Product, Article, VideoObject). This improves visibility in SERPs and AI-generated results.
  • Core Web Vitals: Ensure the new site is fast, stable, and mobile-first. Poor performance = lower rankings.
  • Emerging SEO: Optimize for AI/Answer Engine results, focus on E-E-A-T (author, trust, freshness), and create natural Q&A content for voice/conversational search.
  • Validation: Crawl the site before and after migration with tools like Screaming Frog or Siteimprove to confirm nothing is missed.
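The "avoid chains" advice for redirects can be enforced mechanically. This sketch collapses redirect chains so every old URL points directly at its final destination; the input is a simple old-to-new mapping (real redirect managers vary).

```typescript
// Sketch: collapse redirect chains so each source URL maps straight to its final target.
// Input format (Map of source path -> target path) is an assumption for this example.

function flattenRedirects(redirects: Map<string, string>): Map<string, string> {
  const flat = new Map<string, string>();
  for (const source of redirects.keys()) {
    let target = redirects.get(source)!;
    const seen = new Set([source]);
    // Follow the chain to a terminal URL; the seen-set guards against cycles.
    while (redirects.has(target) && !seen.has(target)) {
      seen.add(target);
      target = redirects.get(target)!;
    }
    flat.set(source, target);
  }
  return flat;
}
```

A chain like `/old-a → /old-b → /new` becomes two direct 301s, preserving link equity without intermediate hops.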

Strong SEO handling ensures the new Sitecore build doesn’t just look modern — it retains rankings, grows traffic, and is ready for AI-powered search.

 

6. Serialization & Item Deployment

Serialization is at the heart of a smooth migration and ongoing Sitecore development. Without the right approach, environments drift, unexpected items get deployed, or critical templates are missed.

  • ✅ Best Practices
    • Choose the Right Tool: Sitecore Content Serialization (SCS), Unicorn, or TDS — select based on your project needs.
    • Scope Carefully: Serialize only what is required (templates, renderings, branches, base content). Avoid unnecessary content items.
    • Organize by Modules: Structure serialization so items are grouped logically (feature, foundation, project layers). This keeps deployments clean and modular.
    • Version Control: Store serialization files in source control (Git, Azure DevOps) to track changes and allow safe rollbacks.
    • Environment Consistency: Automate deployment pipelines so serialized items are promoted consistently from dev → QA → UAT → Prod.
    • Validation: Always test deployments in lower environments first to ensure no accidental overwrites or missing dependencies.
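To make the scoping and module-organization advice concrete, here is a sketch of what a Sitecore Content Serialization (SCS) `module.json` might look like for a single feature module. The namespace, paths, and item names are hypothetical; adapt them to your own Helix layers and solution structure.

```json
{
  "namespace": "Feature.Hero",
  "items": {
    "includes": [
      {
        "name": "templates",
        "path": "/sitecore/templates/Feature/Hero",
        "allowedPushOperations": "CreateUpdateAndDelete"
      },
      {
        "name": "renderings",
        "path": "/sitecore/layout/Renderings/Feature/Hero",
        "scope": "ItemAndDescendants"
      }
    ]
  }
}
```

Scoping each module to its own templates and renderings keeps deployments modular and avoids serializing content items that should stay environment-specific.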

Properly managed serialization ensures clean deployments, consistent environments, and fewer surprises during migration and beyond.

 

7. Forms & Submissions

In Sitecore XM Cloud, forms require careful planning to ensure smooth data capture and migration.

  •  XM Cloud Forms (Webhook-based): Submit form data via webhooks to CRM, backend, or marketing platforms. Configure payloads properly and ensure validation, spam protection, and compliance.
  • Third-Party Forms: HubSpot, Marketo, Salesforce, etc., can be integrated via APIs for advanced workflows, analytics, and CRM connectivity.
  • Create New Forms: Rebuild forms with modern UX, accessibility, and responsive design.
  • Migrate Old Submission Data: Extract and import previous form submissions into the new system or CRM, keeping field mapping and timestamps intact.
  • ✅ Best Practices: Track submissions in analytics, test end-to-end, and make forms configurable for content authors.
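The webhook-based flow above can be sketched as a small validation-plus-forwarding handler. Everything here is illustrative: the field names, the honeypot spam check, and the transport abstraction are assumptions, not an XM Cloud API.

```typescript
// Illustrative validation and forwarding for a webhook-based form submission.
// Field names and the Transport abstraction are assumptions for this sketch.

interface FormSubmission {
  email: string;
  message: string;
  honeypot?: string; // hidden field: bots tend to fill it, humans never see it
}

function validateSubmission(data: FormSubmission): string[] {
  const errors: string[] = [];
  if (!/^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(data.email)) errors.push("invalid email");
  if (!data.message || data.message.trim().length === 0) errors.push("empty message");
  if (data.honeypot) errors.push("spam suspected"); // basic spam protection
  return errors;
}

// The transport would typically wrap fetch/axios and POST to the configured
// CRM or marketing-platform webhook endpoint.
type Transport = (url: string, body: string) => Promise<boolean>;

async function forwardToCrm(data: FormSubmission, send: Transport): Promise<boolean> {
  if (validateSubmission(data).length > 0) return false;
  return send(
    "https://crm.example.com/leads", // hypothetical endpoint
    JSON.stringify({ email: data.email, message: data.message })
  );
}
```

Validation and spam checks run before anything leaves the site, so the CRM only receives clean, compliant payloads.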

This approach ensures new forms work seamlessly while historical data is preserved.

 

8. Personalization & Experimentation

Migrating personalization and experimentation requires careful planning to preserve engagement and insights.

  • Export & Rebuild: Export existing rules, personas, and goals. Review them thoroughly and recreate only what aligns with current business requirements.
  • A/B Testing: Identify active experiments, migrate if relevant, and rerun them in the new environment to validate performance.
  • Sitecore Personalize Implementation:
    • Plan data flow into the CDP and configure event tracking.
    • Implement personalization via Sitecore Personalize Cloud or the Engage SDK for XM Cloud implementations, depending on requirements.

✅ Best Practices:

  • Ensure content authors can manage personalization rules and experiments without developer intervention.
  • Test personalized experiences end-to-end and monitor KPIs post-migration.

A structured approach to personalization ensures targeted experiences, actionable insights, and a smooth transition to the new Sitecore environment.

 

9. Accessibility

Ensuring accessibility is essential for compliance, usability, and SEO.

  • Follow WCAG standards: proper color contrast, semantic HTML, ARIA roles, and keyboard navigation.
  • Validate content with accessibility tools and manual checks before migration cutover.
  • Accessible components improve user experience for all audiences and reduce legal risk.

 

10. Performance, Caching & Lazy Loading

Optimizing performance is critical during a migration to ensure fast page loads, better user experience, and improved SEO.

  • Caching Strategies:
    • Use Sitecore output caching and data caching for frequently accessed components.
    • Implement CDN caching for media assets to reduce server load and improve global performance.
    • Apply cache invalidation rules carefully to avoid stale content.
  • Lazy Loading:
    • Load images, videos, and heavy components only when they enter the viewport.
    • Improves perceived page speed and reduces initial payload.
  • Performance Best Practices:
    • Optimize images and media (WebP/AVIF).
    • Minimize JavaScript and CSS bundle size, and use tree-shaking where possible.
    • Monitor Core Web Vitals (LCP, CLS, INP) post-migration.
    • Test performance across devices and regions before go-live.
  • Content Author Considerations:
    • Ensure caching and lazy loading do not break dynamic components or personalization.
    • Provide guidance to authors on content that might impact performance (e.g., large images or embeds).
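The data-caching and invalidation ideas above can be illustrated with a tiny TTL cache. This is not a Sitecore API — just a sketch of the pattern; the clock is injected so expiry is deterministic and testable.

```typescript
// Minimal TTL cache sketch illustrating data caching with explicit invalidation.
// Not a Sitecore API; the clock is injected for deterministic testing.

class TtlCache<V> {
  private store = new Map<string, { value: V; expiresAt: number }>();

  constructor(private ttlMs: number, private now: () => number = Date.now) {}

  get(key: string): V | undefined {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (this.now() >= entry.expiresAt) {
      this.store.delete(key); // stale: evict on read to avoid serving old content
      return undefined;
    }
    return entry.value;
  }

  set(key: string, value: V): void {
    this.store.set(key, { value, expiresAt: this.now() + this.ttlMs });
  }

  // Explicit invalidation (e.g., after a publish) prevents stale content
  // lingering until its TTL expires.
  invalidate(key: string): void {
    this.store.delete(key);
  }
}
```

Pairing a TTL with publish-triggered invalidation is the usual compromise between freshness and server load.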

Proper caching and lazy loading ensure a fast, responsive, and scalable Sitecore experience, preserving SEO and user satisfaction after migration.

 

11. CI/CD, Monitoring & Automated Testing

A well-defined deployment and monitoring strategy ensures reliability, faster releases, and smooth migrations.

  • CI/CD Pipelines:
    • Set up automated builds and deployments according to your hosting platform: Azure, Vercel, Netlify, or on-premise.
    • Ensure deployments promote items consistently across Dev → QA → UAT → Prod.
    • Include code linting, static analysis, and unit/integration tests in the pipeline.
  • Monitoring & Alerts:
    • Track website uptime, server health, and performance metrics.
    • Configure timely alerts for downtime or abnormal behavior to prevent business impact.
  • Automated Testing:
    • Implement end-to-end, regression, and smoke tests for different environments.
    • Include automated validation for content, forms, personalization, and integrations.
    • Integrate testing into CI/CD pipelines to catch issues early.
  • ✅ Best Practices:
    • Ensure environment consistency to prevent drift.
    • Use logs and dashboards for real-time monitoring.
    • Align testing and deployment strategy with business-critical flows.

A robust CI/CD, monitoring, and automated testing strategy ensures reliable deployments, reduced downtime, and faster feedback cycles across all environments.

 

12. Governance, Licensing & Cutover

A successful migration is not just technical — it requires planning, training, and governance to ensure smooth adoption and compliance.

  • License Validation: Compare the current Sitecore license with what the new setup requires. Ensure coverage for all modules and environments, and validate that users and roles are granted accurate rights.
  • Content Author & Marketer Readiness:
    • Train teams on the new workflows, tools, and interface.
    • Provide documentation, demos, and sandbox environments to accelerate adoption.
  • Backup & Disaster Recovery:
    • Plan regular backups and ensure recovery procedures are tested.
    • Define RTO (Recovery Time Objective) and RPO (Recovery Point Objective) for critical data.
  • Workflow, Roles & Permissions:
    • Recreate workflows, roles, and permissions in the new environment.
    • Implement custom workflows if required.
    • Governance gaps can lead to compliance and security risks — audit thoroughly.
  • Cutover & Post-Go-Live Support:
    • Plan the migration cutover carefully to minimize downtime.
    • Prepare a support plan for immediate issue resolution after go-live.
    • Monitor KPIs, SEO, forms, personalization, and integrations to ensure smooth operation.

Proper governance, training, and cutover planning ensures the new Sitecore environment is compliant, adopted by users, and fully operational from day one.

 

13. Training & Documentation

Proper training ensures smooth adoption and reduces post-migration support issues.

  • Content Authors & Marketers: Train on new workflows, forms, personalization, and content editing.
  • Developers & IT Teams: Provide guidance on deployment processes, CI/CD, and monitoring.
  • Documentation: Maintain runbooks, SOPs, and troubleshooting guides for ongoing operations.
  • Encourage hands-on sessions and sandbox practice to accelerate adoption.

 

Summary:

Sitecore migrations are complex, and success often depends on the small decisions made throughout development, performance tuning, SEO handling, and governance. This blog brings together practical approaches and lessons learned from real-world implementations — aiming to help teams build scalable, accessible, and future-ready Sitecore solutions.

While every project is different, the hope is that these shared practices offer a useful starting point for others navigating similar journeys. The Sitecore ecosystem continues to evolve, and so do the ways we build within it.

 

The Personalization Gap Is Hurting Financial Services, Here’s How to Close It
https://blogs.perficient.com/2025/10/15/the-personalization-gap-is-hurting-financial-services-heres-how-to-close-it/
Wed, 15 Oct 2025 15:22:25 +0000

In today’s financial landscape, personalization is no longer a luxury; it’s a customer expectation. Yet, according to Adobe’s latest State of Customer Experience in Financial Services in an AI-Driven World report, only 36% of the customer journey is currently personalized, despite 74% of financial services executives acknowledging that their customers expect tailored interactions.

This gap isn’t just a missed opportunity; it’s a trust breaker.

Why Personalization Matters More Than Ever

Financial decisions are deeply personal. Whether a customer is exploring mortgage options, planning for retirement, or managing small business finances, they expect advice and experiences that reflect their unique goals and life stage. Generic nudges and one-size-fits-all messaging simply don’t cut it anymore.

Early-stage interactions—like product discovery or financial education—are especially critical. These are high-value moments where relevance builds trust and guides decision-making. Yet many institutions fall short, lacking the orchestration needed to deliver personalized engagement across these initial touchpoints.

What’s Holding Institutions Back?

The report highlights several barriers:

  • Fragmented data systems that prevent a unified view of the customer
  • Legacy operating models that prioritize product silos over customer journeys
  • Compliance concerns that limit personalization efforts, even when customers expect it

These challenges are compounded by the rise of AI-powered experiences, which demand real-time, context-aware personalization across channels.

[Figure: statistic from Adobe’s State of Customer Experience in Financial Services in an AI-Driven World report, 2025]

The Path Forward: Adaptive, Lifecycle Personalization

To close the gap, financial institutions must evolve from episodic personalization to adaptive, full-lifecycle engagement. That means:

  • Investing in unified customer profiles and behavioral insights
  • Building real-time content engines that respond to customer signals
  • Designing personalization strategies that grow with the relationship and not just the transaction

Download the full Adobe report to explore the top 10 insights shaping the future of financial services, and discover how your organization can lead with intelligence, responsibility, and trust.

Learn About Perficient and Adobe’s Partnership

Are you looking for a partner to help you transform and modernize your technology strategy? Perficient and Adobe bring together deep industry expertise and powerful experience technologies to help financial services organizations unify data, orchestrate journeys, and deliver customer-centric experiences that build trust and drive growth.

Get in Touch With Our Experts

Perficient Included in Forrester’s Q4 2025 Organizational Change Management Services Landscape
https://blogs.perficient.com/2025/10/13/forrester-organizational-change-management-services-q4-2025/
Mon, 13 Oct 2025 21:06:56 +0000

Perficient is proud to be included in The Organizational Change Management Services Landscape, Q4 2025 by Forrester. We believe this recognition reflects our continued momentum in helping enterprises navigate complexity, accelerate transformation, and deliver measurable outcomes through strategic change. 

In the report, Perficient is listed as a consultancy with a geographic focus across North America, EMEA, APAC, and LATAM, with an industry focus in financial services, healthcare, and utilities.  

Forrester asked each provider included in the Landscape to select the top business scenarios for which clients choose them, and from those responses identified the extended business scenarios that highlight differentiation among providers. In the report, Perficient selected Declining Performance, Process Improvement/Engineering, and Volatility as top reasons clients work with us. We believe these are areas where change is not optional, but essential.

Built to Lead with Change 

Forrester defines organizational change management (OCM) as: 

“A method that companies use to evolve their capabilities via people, process, and technology changes. OCM’s success rests on the organization’s ability to continuously sense and respond to changing requirements in order to generate the scale of change at strategic, operational, and leadership levels.” (The Organizational Change Management Landscape, Q4 2025, Forrester) 

This aligns with our approach at Perficient. We integrate strategy, execution, and innovation to help clients build adaptive organizations that thrive amid disruption. Our change strategies are designed to scale, sustain, and deliver impact across the enterprise. 

AI Is Reshaping Change—and How It’s Managed 

AI is transforming how work gets done and how change is delivered. At Perficient, we take an AI-first approach to change. We’ve helped clients launch AI-powered research platforms, deploy virtual assistants for over 30,000 employees, and define governance frameworks to accelerate the responsible adoption of AI. Our proprietary platform, Envision, connects strategy to execution, using intelligent tools to assess readiness, identify capability gaps, and prioritize high-impact initiatives.

We also use AI to streamline how we deliver change. From avatar-based training to multilingual narration, our team leverages AI to move faster and reach broader audiences with precision. 

Human-Centered, Outcome-Focused 

Change fatigue is real. So is the cost of poor execution. That’s why our approach is grounded in empathy, data, and measurable outcomes.  

We help leaders prepare their people to lead in an AI-powered world by: 

  • Aligning change strategy with business ambition 
  • Assessing impact and readiness across the organization 
  • Designing enablement programs that drive adoption 
  • Engaging stakeholders with clarity and purpose 
  • Communicating change with precision and relevance 

Whether you’re facing declining performance, reengineering processes, or navigating volatility, Perficient is ready to help you lead through change, with purpose and precision. 

Explore the full report to see how we believe Perficient and other providers are shaping the future of organizational change: The Organizational Change Management Services Landscape, Q4 2025

 

Forrester does not endorse any company, product, brand, or service included in its research publications and does not advise any person to select the products or services of any company or brand based on the ratings included in such publications. Information is based on the best available resources. Opinions reflect judgment at the time and are subject to change. For more information, read about Forrester’s objectivity here.

 

AI-Driven Data Lineage for Financial Services Firms: A Practical Roadmap for CDOs
https://blogs.perficient.com/2025/10/06/ai-driven-data-lineage-for-financial-services-firms-a-practical-roadmap-for-cdos/
Mon, 06 Oct 2025 11:17:05 +0000

Introduction

Imagine: just as you’re sipping your Monday morning coffee and looking forward to a hopefully quiet week in the office, your Outlook dings and you see that your bank’s primary federal regulator is demanding full input-to-regulatory-report lineage for dozens of numbers on both sides of the balance sheet and the income statement in your latest financial report filed with the regulator. The full first-day letter responses are due next Monday, and as your headache starts you remember that the spreadsheet owner is on leave, the ETL developer is debugging a separate pipeline, and your overworked and understaffed reporting team has three different ad hoc diagrams that neither match nor reconcile.

If you can relate to that scenario, or your back starts to tighten in empathy, you’re not alone. Artificial Intelligence (“AI”) driven data lineage for banks is no longer a nice-to-have. We at Perficient, working with our clients in banking, insurance, credit unions, and asset management, find that it’s the practical answer to audit pressure, model risk (remember Lehman Brothers and Bear Stearns), and the brittle manual processes that create blind spots. This blog post explains what AI-driven lineage actually delivers, why it matters for banks today, and a phased roadmap Chief Data Officers (“CDOs”) can use to get from pilot to production.

Why AI-driven data lineage for banks matters today

Regulatory pressure and real-world consequences

Regulators and supervisors emphasize demonstrable lineage, timely reconciliation, and governance evidence. In practice, financial services firms must show not just who touched data, but what data enrichment and/or transformations happened, why decisions used specific fields, and how controls were applied—especially under BCBS 239 guidance and evolving supervisory expectations.

In addition, as a former Risk Manager, the author knows he would have wanted, and has spoken to a plethora of financial services executives who want, to know that the decisions they’re making on liquidity funding, investments, recording P&L, and hedging trades are based on the correct numbers. This is especially challenging at global firms that operate in a transaction-heavy environment amid constantly changing political, interest rate, foreign exchange, and credit risk conditions.

Operational risks that keep CDOs up at night

Manual lineage—spreadsheets, tribal knowledge, and siloed code—creates slow audits, delayed incident response, and fragile model governance. AI-driven lineage automates discovery and keeps lineage living and queryable, turning reactive fire drills into documented, repeatable processes that greatly shorten the time to close QA tickets and reduce compensation costs for misdirected funds. It also provides a scalable foundation for governed data practices without sacrificing traceability.

What AI-driven lineage and controls actually do (written by and for non-tech staff)

At its core, AI-driven data lineage combines automated scanning of code, SQL, ETL jobs, APIs, and metadata with semantic analysis that links technical fields to business concepts. Instead of a static map, executives using AI-driven data lineage get a living graph that shows data provenance at the field level: where a value originated, which transformations touched it, and which reports, models, or downstream services consume it.

AI adds value by surfacing hidden links. Natural language processing reads table descriptions, SQL comments, and even README files (yes they do still exist out there) to suggest business-term mappings that close the business-IT gap. That semantic layer is what turns a technical lineage graph into audit-ready evidence that regulators or auditors can understand.
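To make the "living graph" idea concrete, here is a toy sketch of field-level lineage as a graph with a transitive provenance query. The field names are purely illustrative, not from any real banking system or lineage product.

```typescript
// Toy field-level lineage graph: each field maps to the fields it was derived from.
// Field names are illustrative, not from any real system.

type LineageGraph = Map<string, string[]>; // field -> upstream source fields

function provenance(graph: LineageGraph, field: string): Set<string> {
  const sources = new Set<string>();
  const stack = [field];
  while (stack.length > 0) {
    const current = stack.pop()!;
    for (const upstream of graph.get(current) ?? []) {
      if (!sources.has(upstream)) {
        sources.add(upstream);
        stack.push(upstream); // walk transitively back to the originating systems
      }
    }
  }
  return sources;
}
```

Answering "where did this balance-sheet number come from?" then becomes a graph query rather than a week of spreadsheet archaeology; impact analysis is the same traversal run in the opposite direction.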

How AI fixes the pain points keeping CDOs up at night

  • Faster audits: As a consultant at Perficient, I have seen AI-driven lineage implementations that allowed executives to answer traceability questions in hours rather than weeks. Automated evidence packages (exportable lineage views and transformation logs) provide auditors with a reproducible trail.
  • Root-cause and incident response: When a report or model spikes, impact analysis highlights which datasets and pipelines are involved, clarifying responsibility and accountability, speeding remediation, and limiting downstream impact.
  • Model safety and feature provenance: Lineage that includes training datasets and feature transformations enables validation of model inputs, reproducibility of training data, and enforcement of data controls, supporting explainability and governance requirements. That allows your P&L to be more “R&S” (a client’s slogan meaning rock-solid profit and loss).

Tooling, architecture, and vendor considerations

When evaluating vendors, demand field-level lineage, semantic parsing (NLP across SQL, code, and docs), auditable diagram exports, and policy enforcement hooks that integrate with data protection tools. Deployment choices matter in regulated banking environments; hybrid architectures that keep sensitive metadata on-prem while leveraging cloud analytics often strike a pragmatic balance.

A practical, phased roadmap for CDOs

  • Phase 0 — Align leadership and define success: Engage the CRO, COO, and Head of Model Risk. Define 3–5 KPIs (e.g., lineage coverage, evidence time, mean time to root cause) and what “good” will look like. Perficient often does this during an evidence-gathering phase with clients who are just starting their Artificial Intelligence journey.
  • Phase 1 — Inventory and quick wins: Target a high-risk area such as regulatory reporting, a few production models, or a critical data domain. Validate the inventory manually to establish baseline credibility.
  • Phase 2 — Pilot AI lineage and controls: Run automated discovery, measure accuracy and false positives, and quantify time savings. Expect iterations as the model improves with curated mappings. (Perficient usually runs Phases 1 and 2 with clients as a proof of concept to show that the key feeds into and out of existing technology platforms can be supported.)
  • Phase 3 — Operationalize and scale: Integrate lineage into release workflows, assign lineage stewards, set SLAs, and connect with ticketing and monitoring systems to embed lineage into day-to-day operations.
  • Phase 4 — Measure, refine, expand: Track KPIs, adjust models and rules, and broaden scope to additional reports, pipelines, and models as confidence grows.

Risks, human oversight, and governance guardrails

AI reduces toil but does not remove accountability. Executives, auditors, and regulators require (or should require) deterministic evidence and human-reviewed lineage. Treat AI outputs as recommendations subject to curator approval; doing so avoids a problem many financial services executives are now dealing with, commonly known as AI hallucinations.

Guardrails include exception-processing workflows for disputed outputs and tollgates that ensure security and privacy are baked into the design: DSPM, masking, and appropriate IAM controls should be integral, not afterthoughts.

Conclusion and next steps

AI data lineage for banks is a pragmatic control that directly addresses regulatory expectations, speeds audits, and reduces model and reporting risk. Start small, prove value with a focused pilot, and embed lineage into standard data stewardship processes. If you’re a CDO looking to move quickly with minimal risk, contact Perficient to run a tailored assessment and pilot design that maps directly to your audit and governance priorities. We’ll help translate proof into firm-wide control and confidence.

]]>
https://blogs.perficient.com/2025/10/06/ai-driven-data-lineage-for-financial-services-firms-a-practical-roadmap-for-cdos/feed/ 0 387626
Transform Your Data Workflow: Custom Code for Efficient Batch Processing in Talend-Part 2 https://blogs.perficient.com/2025/10/03/transform-your-data-workflow-custom-code-for-efficient-batch-processing-in-talend-part-2/ https://blogs.perficient.com/2025/10/03/transform-your-data-workflow-custom-code-for-efficient-batch-processing-in-talend-part-2/#comments Fri, 03 Oct 2025 07:25:24 +0000 https://blogs.perficient.com/?p=387517

Introduction:

Custom code in Talend offers a powerful way to enhance batch processing efficiency by allowing developers to implement specialized logic that is not available through Talend’s standard components. This can involve data transformations, use-case-specific custom code, and integration with flat files to meet specific project needs. By leveraging custom code, users can optimize performance, improve data quality, and streamline complex batch workflows within their Talend jobs.

Talend Components:

Key components for batch processing are described below:

  • tDBConnection: Establishes and manages a database connection within a job, allowing a single connection to be configured once and reused throughout the Talend job.
  • tFileInputDelimited: For reading data from flat files.
  • tFileRowCount: Reads file row by row to calculate the number of rows.
  • tLoop: Executes a task automatically, based on a loop size.
  • tHashInput, tHashOutput: For high-speed data transfer and processing within a job. tHashOutput writes data to cache memory, while tHashInput reads from that cached data.
  • tFilterRow: For filtering rows from a dataset based on specified conditions.
  • tMap: Data transformation allows you to map input data with output data and enables you to perform data filtering, complex data manipulation, typecasting, and multiple input source joins.
  • tJavaRow: It can be used as an intermediate component, and we are able to access the input flow and transform the data using custom Java code.
  • tJava: It has no input or output data flow and can be used independently to integrate custom Java code.
  • tPreJob, tPostJob: PreJob start the execution before the job & PostJob at the end of the job.
  • tDBOutput: Supports a wide range of databases and is used to write data to various databases.
  • tDBCommit: Commits the changes applied to a connected database during a Talend job, guaranteeing that data changes are permanently recorded.
  • tDBClose: Explicitly closes a database connection that was opened by a tDBConnection component.
  • tLogCatcher: It is used in error handling within Talend job for adding runtime logging information. It catches all the exceptions and warnings raised by tWarn and tDie components during Talend job execution.
  • tLogRow: It is employed in error handling to display data or keep track of processed data in the run console.
  • tDie: We can stop the job execution explicitly if it fails. In addition, we can create a customized warning message and exit code.

Workflow with example:

To process bulk data in Talend, we can implement batch processing to handle flat-file data efficiently with minimal execution time. We could read the flat file and insert its data into a target MySQL database table without batch processing, but that data flow would take considerably longer to execute. Using batch processing with custom code, the entire source file is written to the MySQL database table in batches of records, keeping execution time to a minimum.

Talend Job Design


Solution:

  • Establish the database connection at the start of the execution so that it can be reused.
  • Read the number of rows in the source flat file using tFileRowCount component.
  • To determine the number of batches, subtract the header count from the total row count, then divide the result by the batch size. Round to a whole number; this indicates the total number of batches (chunks).

    Calculate the batch size from total row count

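The batch-count calculation above can be sketched in plain Java, the same language used inside tJava components. The class wrapper, sample values, and the choice to round up (so a partial final chunk is still processed) are illustrative assumptions, not Talend APIs:

```java
public class BatchCount {
    // Data rows (total minus header) divided by batch size, rounded up
    // so a partial final chunk is still processed.
    public static int totalBatches(int totalRowCount, int headerCount, int batchSize) {
        int dataRows = totalRowCount - headerCount;
        return (int) Math.ceil(dataRows / (double) batchSize);
    }

    public static void main(String[] args) {
        // e.g., 1045 file rows including 1 header row, batches of 100
        System.out.println(totalBatches(1045, 1, 100)); // prints 11
    }
}
```

In a real job, totalRowCount would come from tFileRowCount and the result would be stored in a context variable for tLoop to consume.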

  • Now use tFileInputDelimited component to read the source file content. In the tMap component, utilize the sequence Talend function to generate row numbers for your data mapping and transformation tasks. Then, load all of the data into the tHashOutput component, which stores the data into a cache.
  • Iterate the loop based on the calculated whole number using tLoop.
  • Retrieve all the data from the tHashInput component.
  • Filter the dataset retrieved from the tHashInput component based on the rowNo column in the schema using tFilterRow.

Filter the dataset using tFilterRow


  • If the first iteration is in progress and the batch size is 100, the rowNo range will be 1 to 100.
    If the third iteration is in progress and the batch size is 100, the rowNo range will be 201 to 300.
    For example, if the current iteration is 3, then [(3-1)*100]+1 = 201 and 3*100 = 300, so the final dataset range for the 3rd iteration is 201 to 300.
  • Finally, extract the dataset falling within that rowNo range and write the batch data to the MySQL database table using tDBOutput.
  • The job uses the tLogCatcher component for error management, capturing runtime logging details such as warning and exception messages, and employs tLogRow to display that information in the execution console.
  • Regarding performance tuning, the tMap component maps source data to output data, allows complex data transformations, and offers unique match, first match, and all matches join options for looking up data.
  • The temporary data that the tHashInput and tHashOutput components store in cache memory enhances runtime performance.
  • At the end of the job execution, we commit the database modifications and close the connection to release database resources.
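The per-iteration rowNo range that drives the tFilterRow condition can be expressed in plain Java; the class and method names here are illustrative helpers, not Talend APIs:

```java
public class BatchRange {
    // minRow/maxRow for a given 1-based iteration, mirroring the
    // tFilterRow condition on the rowNo column described above.
    public static int[] range(int iteration, int batchSize) {
        int minRow = (iteration - 1) * batchSize + 1;
        int maxRow = iteration * batchSize;
        return new int[] { minRow, maxRow };
    }

    public static void main(String[] args) {
        int[] r = range(3, 100);
        System.out.println(r[0] + " to " + r[1]); // prints "201 to 300"
    }
}
```

Inside the job, iteration would come from the tLoop current-value variable, and the two bounds become the filter conditions rowNo >= minRow and rowNo <= maxRow.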

Advantages of Batch Processing:

  • Batch processing can efficiently handle large datasets.
  • It takes minimal time to process the data even after data transformation.
  • By grouping records from a large dataset and processing them as a single unit, it can be highly beneficial for improving performance.
  • With the batch processing, it can easily scale to accommodate growing data volumes.
  • It is particularly useful for operations like generating reports, performing data integration, and executing complex transformations on large datasets.

For more details: Get-started-talend-open-studio-data-integration

Note: Efficient Batch Processing in Talend-Part 1

]]>
https://blogs.perficient.com/2025/10/03/transform-your-data-workflow-custom-code-for-efficient-batch-processing-in-talend-part-2/feed/ 3 387517
Trust, Data, and the Human Side of AI: Lessons From a Lifelong Automotive Leader https://blogs.perficient.com/2025/10/02/customer-experience-automotive-wally-burchfield/ https://blogs.perficient.com/2025/10/02/customer-experience-automotive-wally-burchfield/#respond Thu, 02 Oct 2025 17:05:47 +0000 https://blogs.perficient.com/?p=387540

In this episode of “What If? So What?”, Jim Hertzfeld sits down with Wally Burchfield, former senior executive at GM, Nissan, and Nissan United, to explore what’s driving transformation in the automotive industry and beyond. 

 Wally’s perspective is clear: in a world obsessed with automation and data, the companies that win will be the ones that stay human. 

 From “Build and Sell” to “Know and Serve” 

 The old model was simple: build a car, sell a car, repeat. But as Wally explains it, that formula no longer works in a world where customer expectations are shaped by digital platforms and instant personalization. “It’s not just about selling a product,” he said. “It’s about retaining the customer through a high-quality experience, one that feels personal, respectful, and effortless.” Every interaction matters, and every brand is in the experience business. 

 Data Alone Doesn’t Build Loyalty – Trust Does 

 It’s true that organizations have more data than ever before. But as Wally points out, it’s not how much data you have, it’s what you do with it. The real differentiator is how responsibly, transparently, and effectively you use that data to improve the customer experience. 

 “You can have a truckload of data but if it doesn’t help you deliver value or build trust, it’s wasted,” Wally said. 

 When used carelessly, data can feel manipulative. When used well, it creates clarity, relevance, and long-term relationships. 

 AI Should Remove Friction, Not Feeling 

 Wally’s take on AI is refreshingly grounded. He sees it as a tool to reduce friction, not replace human connection. Whether it’s scheduling service appointments via SMS or filtering billions of digital signals, the best AI is invisible, working quietly in the background to make the customer feel understood. 

 Want to Win? Listen Better and Faster 

 At the end of the day, the brands that thrive won’t be the ones with the biggest data sets; they’ll be the ones that move fast, use data responsibly, and never lose sight of the customer at the center. 

🎧 Listen to the full conversation with Wally Burchfield for more on how trust, data, and AI can work together to build lasting customer relationships—and why the best strategies are still the most human. 

Subscribe Where You Listen

Apple | Spotify | Amazon | Overcast | Watch the full video episode on YouTube

Meet our Guest – Wally Burchfield

Wally Burchfield is a veteran automotive executive with deep experience across retail, OEM operations, marketing, aftersales, dealer networks, and HR. 

He spent 20 years at General Motors before joining Nissan, where he held multiple VP roles across regional operations, aftersales, and HR. He later served as COO of Nissan United (TBWA), leading Tier 2/3 advertising and field marketing programs to support dealer and field team performance. Today, Wally runs a successful consulting practice helping OEMs, partners, and dealer groups solve complex challenges and drive results. A true “dealer guy”, he’s passionate about improving customer experience, strengthening OEM-dealer partnerships, and challenging the status quo to unlock growth. 

Follow Wally on LinkedIn  

Learn More about Wally Burchfield

 

Meet our Host

Jim Hertzfeld

Jim Hertzfeld is Area Vice President, Strategy for Perficient.

For over two decades, he has worked with clients to convert market insights into real-world digital products and customer experiences that actually grow their business. More than just a strategist, Jim is a pragmatic rebel known for challenging the conventional and turning grand visions into actionable steps. His candid demeanor, sprinkled with a dose of cynical optimism, shapes a narrative that challenges and inspires listeners.

Connect with Jim:

LinkedIn | Perficient

 

 

]]>
https://blogs.perficient.com/2025/10/02/customer-experience-automotive-wally-burchfield/feed/ 0 387540
Beyond Denial: How AI Concierge Services Can Transform Healthcare from Reactive to Proactive https://blogs.perficient.com/2025/09/24/beyond-denial-how-ai-concierge-services-can-transform-healthcare-from-reactive-to-proactive/ https://blogs.perficient.com/2025/09/24/beyond-denial-how-ai-concierge-services-can-transform-healthcare-from-reactive-to-proactive/#respond Wed, 24 Sep 2025 14:39:32 +0000 https://blogs.perficient.com/?p=387380

The headlines are troubling but predictable. The Trump administration will launch a program next year to find out how much money an artificial intelligence algorithm could save the federal government by denying care to Medicare patients. Meanwhile, a survey of physicians published by the American Medical Association in February found that 61% think AI is “increasing prior authorization denials, exacerbating avoidable patient harms and escalating unnecessary waste now and into the future.”

We’re witnessing the healthcare industry’s narrow vision of AI in action: algorithms designed to say “no” faster and more efficiently than ever before. But what if we’re missing the bigger opportunity?

The Current AI Problem: Built to Deny, Not to Help

The recent expansion of AI-powered prior authorization reveals a fundamental flaw in how we’re approaching healthcare technology. “The more expensive it is, the more likely it is to be denied,” said Jennifer Oliva, a professor at the Maurer School of Law at Indiana University-Bloomington, whose work focuses on AI regulation and health coverage.

This approach creates a vicious cycle: patients don’t understand their benefits, seek inappropriate or unnecessary care, trigger costly prior authorization processes, face denials, appeal those denials, and ultimately either give up or create even more administrative burden for everyone involved.

The human cost is real. Nearly three-quarters of respondents thought prior authorization was a “major” problem in a July poll published by KFF, and we’ve seen how public displeasure with insurance denials dominated the news in December, when the shooting death of UnitedHealthcare’s CEO led many to anoint his alleged killer as a folk hero.

A Better Vision: The AI Concierge Approach

What if instead of using AI to deny care more efficiently, we used it to help patients access the right care more effectively? This is where the AI Concierge concept transforms the entire equation.

An AI Concierge doesn’t wait for a claim to be submitted to make a decision. Instead, it proactively:

  • Educates patients about their benefits before they need care
  • Guides them to appropriate providers within their network
  • Explains coverage limitations in plain language before appointments
  • Suggests preventive alternatives that could avoid more expensive interventions
  • Streamlines pre-authorization by ensuring patients have the right documentation upfront

The Quantified Business Case

The financial argument for AI Concierge services is compelling:

Star Ratings Revenue Impact: A half-star increase in Medicare Star Ratings is valued at approximately $500 per member. For a 75,000-member plan, that translates to $37.5 million in additional funding. An AI Concierge directly improves patient satisfaction scores that drive these ratings.

Operational Efficiency Gains: Healthcare providers implementing AI-powered patient engagement systems report 15-20% boosts in clinic revenue and 10-20% reductions in overall operational costs. Clinics using AI tools see 15-25% increases in patient retention rates.

Cost Avoidance Through Prevention: Utilizing AI to help patients access appropriate care could save up to 50% on treatment costs while improving health outcomes by up to 40%. This happens by preventing more expensive interventions through proper preventive care utilization.

The HEDIS Connection

HEDIS measures provide the perfect framework for demonstrating AI Concierge value. With 235 million people enrolled in plans that report HEDIS results, improving these scores directly impacts revenue through bonus payments and competitive positioning.

An AI Concierge naturally improves HEDIS performance in:

  • Preventive Care Measures: Proactive guidance increases screening and immunization rates
  • Care Gap Closure: Identifies and addresses gaps before they become expensive problems
  • Patient Engagement: Improves medication adherence and chronic disease management

Beyond the Pilot Programs

While government initiatives like the WISeR pilot program focus on “Wasteful and Inappropriate Service Reduction” through AI-powered denials, forward-thinking healthcare organizations have an opportunity to differentiate themselves with AI-powered patient empowerment.

The math is simple: preventing a $50,000 hospitalization through proactive care coordination delivers better ROI than efficiently denying the claim after it’s submitted.

AI Healthcare Concierge Implementation Strategy

For healthcare leaders considering AI Concierge implementation:

  • Phase 1: Deploy AI-powered benefit explanation tools that reduce call center volume and improve patient understanding
  • Phase 2: Integrate predictive analytics to identify patients at risk for expensive interventions and guide them to preventive alternatives
  • Phase 3: Expand to comprehensive care navigation that optimizes both patient outcomes and organizational performance

The Competitive Advantage

While competitors invest in AI to process denials faster, organizations implementing AI Concierge services are investing in:

  • Member satisfaction and retention (15-25% improvement rates)
  • Star rating improvements ($500 per member value per half-star)
  • Operational cost reduction (10-20% typical savings)
  • Revenue protection through better member experience

Conclusion: Choose Your AI Future

The current trajectory of AI in healthcare—focused on denial optimization—represents a massive missed opportunity. As one physician noted about the Medicare pilot: “I will always, always err on the side that doctors know what’s best for their patients.”

AI Healthcare Concierge services align with this principle by empowering both patients and providers with better information, earlier intervention, and more effective care coordination. The technology exists. The business case is proven. The patient need is urgent.

The question isn’t whether AI will transform healthcare—it’s whether we’ll use it to build walls or bridges between patients and the care they need.

The choice is ours. Let’s choose wisely.

]]>
https://blogs.perficient.com/2025/09/24/beyond-denial-how-ai-concierge-services-can-transform-healthcare-from-reactive-to-proactive/feed/ 0 387380
3 Ways Insurers Can Lead in the Age of AI https://blogs.perficient.com/2025/09/16/3-ways-insurers-can-lead-in-the-age-of-ai/ https://blogs.perficient.com/2025/09/16/3-ways-insurers-can-lead-in-the-age-of-ai/#respond Tue, 16 Sep 2025 15:03:43 +0000 https://blogs.perficient.com/?p=387117

For years, insurers have experimented with digital initiatives, but the pace of disruption has accelerated. Legacy models can’t keep up with rising risks, evolving customer expectations, and operational pressures. The question isn’t whether insurers will transform, but rather how fast they can adapt.

Technologies like AI, advanced analytics, and embedded solutions have moved from emerging concepts to essential capabilities for competitive advantage. Earlier this year, we highlighted these opportunities in our Top 5 Digital Trends for Insurance.

As we gear up for the world’s largest event for insurance innovation in October, InsureTech Connect (ITC) Vegas, it’s clear these trends are driving the conversations that matter most. Hear from industry experts Brian Bell and Conall Chabrunn on why this moment is so transformative.

“ITC is a great opportunity to explore the latest innovations shaping the future of insurance and see how insurers are leveraging AI across the value chain—from underwriting to claims and customer engagement.” – Brian Bell, Principal

Here’s a closer look at three AI trends that are leading the way, at ITC and beyond.

1. Make AI Your Growth Engine

Artificial intelligence is a core enabler of insurance innovation. It’s powering efficiency and elevating customer experiences across the value chain. From underwriting to claims, AI enables real-time decisions, sharpens risk modeling, and delivers personalized interactions at scale. Generative AI builds on this foundation by accelerating content creation, enabling smarter agent support, and transforming customer engagement. Together, these capabilities thrive on modern, cloud-native platforms designed for speed and scalability.

Why Leaders Should Act Now:

AI creates value when it’s embedded in workflows. Focus on the high-impact domains that accelerate outcomes: underwriting, claims, and distribution. Research shows early AI adopters are already seeing measurable results:

  • New-agent success and sales conversion rates increased up to 20%
  • Premium growth boosted by as much as 15%
  • New customer onboarding costs reduced up to 40%

“Ironically, AI has been the hottest topic at ITC the last three years. This year, the playing field has truly changed. Perficient’s AI product partners will be on full display, and we are excited to show our customers how we can enhance and optimize them for real world performance.” – Conall Chabrunn, Head of Sales – Insurance

We help clients advance AI capabilities through virtual assistants, generative interfaces, agentic frameworks, and product development, enhancing team velocity by integrating AI team members.

Read More: Empowering the Modern Insurance Agent

2. Personalize Every Moment

Today’s policyholders expect the same level of personalization they receive from other industries like retail and streaming platforms. By leveraging AI and advanced analytics, insurers can move beyond broad segments to anticipate needs, remove friction, and tailor products and pricing in the moments that matter.

Forbes highlights three key pillars of modern personalization critical for insurers aiming to deliver tailored experiences: data, intent signals, and artificial intelligence. At ITC, these principles are front and center as insurers explore how to meet expectations and unlock new revenue streams, without adding complexity.

Why Leaders Should Act Now:

Personalization isn’t just about customer experience—it’s a growth strategy. Research shows over 70% of consumers expect personalized interactions, and more than three-quarters feel frustrated when they don’t get them. Insurers that utilize AI to anticipate needs and simplify choices can earn trust and loyalty faster than those who don’t.

Success In Action: Proving Rapid Value and Creating Better Member Experiences

3. Meet Customers at the Point of Need

Embedded insurance is moving into everyday moments, and research shows it’s on a massive growth trajectory. Global P&C embedded sales are projected to reach as high as $700 billion by 2030, including $70 billion in the U.S. alone. By meeting customers where decisions happen, carriers can create seamless experiences, new revenue streams, and stronger brand visibility—while offering convenience, transparency, and choice.

Insurers that embrace ecosystems will expand their reach and relevance as consumer expectations and engagement continually shift. Agencies will continue to play a critical role in navigating difficult underwriting conditions by tailoring policy coverages and providing transparency, which requires that they have access to modern sales and servicing tools. It’s a prominent theme that’s echoed throughout ITC sessions this year.

Why Leaders Should Act Now:

AI amplifies embedded strategies by enabling real-time pricing, risk assessment, and personalized offers within those touchpoints. What matters most is making the “yes” simple: clear options, plain language, and confidence about what’s covered. Together, embedded ecosystems and AI-driven insights help insurers deliver relevance at scale when and where consumers need it.

“Perficient stands apart in the AI consulting landscape because every decision we make ties back to industry-specific use cases and measurable success criteria. We complement our technology partners by bringing deep industry expertise to ensure solutions deliver real-world impact.” – Conall Chabrunn, Head of Sales – Insurance

You May Also Enjoy: Commerce Experiences and the Rise of Digital-First Insurance

Lead the Insurance Evolution With AI-First Transformation

The insurance industry is entering uncharted territory. Those who act decisively and swiftly to leverage AI, embrace embedded ecosystems, and personalize every moment will lead the curve in the next era of insurance.

As the industry gathers at events like ITC Vegas, these conversations come to life. Expect AI to be the common thread across underwriting, claims, distribution, and customer experience.

“There’s never been a more transformative time in insurance, and ITC is the perfect place to be part of the conversation.” – Brian Bell, Principal

If you’re attending ITC at Mandalay Bay in October, schedule a meeting with our team to explore how we help insurers turn disruption into opportunity.

Carriers and brokers count on us to help modernize, innovate, and win in an increasingly competitive marketplace. Our solutions power personalized omnichannel experiences and optimize performance across the enterprise.

  • Business Transformation: Activate strategy and innovation ​within the insurance ecosystem.​
  • Modernization: Optimize technology to boost agility and ​efficiency across the value chain.​
  • Data + Analytics: Power insights and accelerate ​underwriting and claims decision-making.​
  • Customer Experience: Ease and personalize experiences ​for policyholders and producers.​

We are trusted by leading technology partners and consistently mentioned by analysts. Discover why we have been trusted by 13 of the 20 largest P&C firms and 11 of the 20 largest annuity carriers. Explore our insurance expertise and contact us to learn more.

]]>
https://blogs.perficient.com/2025/09/16/3-ways-insurers-can-lead-in-the-age-of-ai/feed/ 0 387117
Perficient’s “What If? So What?” Podcast Wins Gold Stevie® Award for Technology Podcast https://blogs.perficient.com/2025/09/08/what-if-so-what-podcast-gold-stevie-award/ https://blogs.perficient.com/2025/09/08/what-if-so-what-podcast-gold-stevie-award/#respond Mon, 08 Sep 2025 16:32:32 +0000 https://blogs.perficient.com/?p=386592

We’re proud to share that Perficient’s What If? So What? podcast has been named a Gold Stevie® Award winner in the Technology Podcast category at the 22nd Annual International Business Awards®. These awards are among the world’s top honors for business achievement, celebrating innovation, impact, and excellence across industries.

Winners were selected by more than 250 executives worldwide, whose feedback praised the podcast’s ability to translate complex digital trends into practical, high-impact strategies for business and technology leaders.

Hosted by Jim Hertzfeld, Perficient’s AVP of Strategy, the podcast explores the business impact of digital transformation, AI, and disruption. With guests like Mark Cuban, Neil Hoyne (Google), May Habib (WRITER), Brian Solis (ServiceNow), and Chris Duffey (Adobe), we dive into the possibilities of What If?, the practical impact of So What?, and the actions leaders can take with Now What?

The Stevie judges called out what makes the show stand out:

  • “What If? So What? Podcast invites experts from different industries, which is important to make sure that audiences are listening and gaining valuable information.”
  • “A sharp, forward-thinking podcast that effectively translates complex digital trends into actionable insights.”
  • “With standout guests like Mark Cuban, Brian Solis, and Google’s Neil Hoyne, the podcast demonstrates exceptional reach, relevance, and editorial curation.”

In other words, we’re not just talking about technology for technology’s sake. We’re focused on real business impact, helping leaders make smarter, faster decisions in a rapidly changing digital world.

We’re honored by this recognition and grateful to our listeners, guests, and production team who make each episode possible.

If you haven’t tuned in yet, now’s the perfect time to hear why the judges called What If? So What? a “high-quality, future-forward show that raises the standard for business podcasts.”

🎧 Catch the latest episodes here: What If? So What? Podcast

Subscribe Where You Listen

APPLE PODCASTS | SPOTIFY | AMAZON MUSIC | OTHER PLATFORMS 

Watch Full Video Episodes on YouTube

Meet our Host

Jim Hertzfeld

Jim Hertzfeld is Area Vice President, Strategy for Perficient.

For over two decades, he has worked with clients to convert market insights into real-world digital products and customer experiences that actually grow their business. More than just a strategist, Jim is a pragmatic rebel known for challenging the conventional and turning grand visions into actionable steps. His candid demeanor, sprinkled with a dose of cynical optimism, shapes a narrative that challenges and inspires listeners.

Connect with Jim: LinkedIn | Perficient

 

 

]]>
https://blogs.perficient.com/2025/09/08/what-if-so-what-podcast-gold-stevie-award/feed/ 0 386592
Perficient Included in the IDC Market Glance for Digital Business Professional Services, 3Q25 https://blogs.perficient.com/2025/09/04/perficient-idc-digital-business-services-3q25/ https://blogs.perficient.com/2025/09/04/perficient-idc-digital-business-services-3q25/#respond Thu, 04 Sep 2025 20:40:47 +0000 https://blogs.perficient.com/?p=386622

Perficient is proud to be included in the IDC Market Glance: Digital Business Professional Services, 3Q25 (Doc # US52789825, July 2025). This marks our fourth consecutive year of inclusion.

In the report, Perficient is included in the Technology Transformation Dominant category, where IDC defines participants as: “organizations [that] offer technology consulting advice and services as their primary service line to drive digital transformation.”

Built to Lead with Technology

IDC notes: “Many technology transformation dominant firms also offer business consulting services, and some are expanding their capabilities into design services and product engineering services as well.”

This aligns with our evolution as a consulting partner. At Perficient, we integrate strategy, implementation, and innovation, giving clients the technology foundations and AI readiness to accelerate transformation and achieve tangible outcomes.

An Industry in Motion

The report also highlights a key trend shaping the digital business consulting landscape:

“Acquisitions to boost AI/cloud/digital capabilities and to fill niche areas of expertise.”

Perficient’s strategic investments reflect this shift. We continue to deepen our AI-first capabilities, expand our industry expertise, and deliver consulting services grounded in execution.

Strategy to Execution, Powered by AI

Technology transformation at Perficient is built for scale, speed, and strategy. We help enterprises modernize platforms, streamline architectures, and align technology investments to real business outcomes. Our AI-powered Envision experience connects strategy to execution, combining proven frameworks with intelligent tools that help leaders prioritize, activate, and accelerate transformation. From capability mapping to platform selection, Envision turns insight into action and helps organizations move faster with confidence.

Ready to move from ambition to impact? Let’s define your AI-first strategy and build the foundation to lead what’s next.

 

 

]]>
Planning Sitecore Migration: Things to Consider
https://blogs.perficient.com/2025/08/29/planning-sitecore-migration-things-to-consider/ (Fri, 29 Aug 2025)

Migrating a website or upgrading to a new Sitecore platform is more than a technical lift — it’s a business transformation and an opportunity to align your site and platform with your business goals and take full advantage of Sitecore’s capabilities. A good migration protects functionality, reduces risk, and creates an opportunity to improve user experience, operational efficiency, and measurable business outcomes.

Before jumping to the newest version or the most hyped architecture, pause and assess. Start with a thorough discovery: review current architecture, understand what kind of migration is required, and decide what can realistically be reused versus what should be refactored or rebuilt, along with suitable topology and Sitecore products.

This blog expands on the key considerations to weigh before committing to a Sitecore-specific migration, translating them into detailed, actionable architecture decisions and migration patterns that guide impactful implementation.

 

1) Clarifying client requirements

Before starting any Sitecore migration or implementation, it's crucial to clarify the client's requirements thoroughly. This ensures the solution aligns with actual business needs, not just technical requests, and helps avoid rework or misaligned outcomes.

Scope goes beyond just features: Don’t settle for “migrate this” as the requirement. Ask deeper questions to shape the right migration strategy:

  • Business goals: Is the aim a redesign, conversion uplift, version upgrade, multi-region rollout, or compliance?
  • Functional scope: Are we redesigning the entire site or specific flows like checkout/login, or making back office changes?
  • Non-functional needs: What are the performance SLAs, uptime expectations, compliance (e.g.: PCI/GDPR), and accessibility standards?
  • Timeline: Is a phased rollout preferred, or a big-bang launch?

Requirements can vary widely, from full redesigns using Sitecore MVC or headless (JSS/Next.js), to performance tuning (caching, CDN, media optimization), security enhancements (role-based access, secure publishing), or integrating new business flows into Sitecore workflows.
Sometimes the client may not fully know what's needed; it's up to us to assess the current setup and recommend improvements. Don't assume the ask equals the need. A full rewrite isn't always the best path; a focused pilot or proof of value can deliver better outcomes and helps validate the direction before scaling.

 

2) Architecture of the client’s system

Migration complexity varies significantly based on what the client is currently using. You need to evaluate the current system, how it is used, and what can be reused.

Key Considerations

  • If the client is already on Sitecore, the version matters. Older versions may require reworking the content model, templates, and custom code to align with modern Sitecore architecture (e.g.: SXA, JSS).
  • If the client is not on Sitecore, evaluate their current system, infrastructure, and architecture. Identify what can be reused—such as existing servers (in the case of on-prem), services, or integrations—to reduce effort.
  • Legacy systems often include deprecated APIs, outdated connectors, or unsupported modules, which increase technical risk and require reengineering.
  • Historical content, such as outdated media, excessive versioning, or unused templates, can bloat the migration. It’s important to assess what should be migrated, cleaned, or archived.
  • Map out all customizations, third-party integrations, and deprecated modules to estimate the true scope, effort, and risk involved.
  • Understanding the current system’s age, architecture, and dependencies is essential for planning a realistic and efficient migration path.

 

3) Media Strategy

When planning a Sitecore migration or upgrade, overlooked media handling can lead to major performance issues post-launch. These areas are critical for user experience, scalability, and operational efficiency, so they need attention early in the planning phase. Digital Asset Management (DAM) determines how assets are stored, delivered, and governed.

Key Considerations

  • Inventory: Assess media size, formats, CDN references, metadata, and duplicates. Identify unused assets, and plan to adopt modern formats (e.g., WebP).
  • Storage Decisions: Analyze and decide whether assets stay in the Sitecore Media Library, move to Content Hub, or move to other cloud storage (Azure Blob, S3).
  • Reference Updates: Plan for content reference updates to avoid broken links.
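The inventory step can be sketched as a small audit over exported media records. This is a hedged illustration: the `MediaAsset` shape, the `sha256` field, and duplicate detection by content hash are assumptions of the sketch, not a Sitecore API. In practice the records would come from a Media Library report or a DAM export.

```typescript
// Minimal media-audit sketch. The MediaAsset shape is an assumption;
// real records would come from a Media Library report or DAM export.
interface MediaAsset {
  path: string;
  sizeBytes: number;
  format: string;   // e.g. "jpg", "png", "webp"
  sha256: string;   // content hash, used to spot duplicates
}

// Group assets that share a content hash — candidates for de-duplication.
function findDuplicates(assets: MediaAsset[]): MediaAsset[][] {
  const byHash = new Map<string, MediaAsset[]>();
  for (const a of assets) {
    const group = byHash.get(a.sha256) ?? [];
    group.push(a);
    byHash.set(a.sha256, group);
  }
  return [...byHash.values()].filter((g) => g.length > 1);
}

// Flag assets still in legacy formats that could move to WebP/AVIF.
function legacyFormatAssets(assets: MediaAsset[]): MediaAsset[] {
  const modern = new Set(["webp", "avif", "svg"]);
  return assets.filter((a) => !modern.has(a.format.toLowerCase()));
}
```

An audit like this gives concrete numbers (duplicate groups, legacy-format counts) to decide what to migrate, clean, or archive.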

 

4) Analytics, personalization, A/B testing, and forms

These features often carry stateful data and behavioral dependencies that can easily break during migration if not planned for. Ignoring them can lead to data loss and degraded user experience.

Key Considerations

  • Analytics: Confirm whether xDB, Google Analytics, or other trackers are in use. Decide how historical analytics data will be preserved, validated, and integrated into the new environment.
  • Personalization: Confirm use of Sitecore rules, xConnect collections, or an external personalization engine. Plan to migrate segments, conditions, and audience definitions accurately.
  • A/B Testing & Experiments: Draft a plan to export experiment definitions and results, if any exist.
  • Forms: Identify which forms collect data and how they integrate with CRM or marketing automation.

The considerations above play an important role in choosing the Sitecore topology: heavy use of analytics makes XP a suitable option, and form submission consent flows take a different approach in different topologies.

 

5) Search Strategy

Search is critical for user experience, and a migration is the right time to reassess whether your current search approach still makes sense.

Key Considerations

  • Understand how users interact with the site. Is search a primary navigation tool or a secondary feature? Does it significantly impact conversion or engagement?
  • Identify the current search engine, if any, and assess whether advanced capabilities like AI recommendations, synonyms, or personalization are being used effectively.
  • If the current engine is underutilized, note that maintaining it may add unnecessary cost and complexity. If search is business-critical, ensure feature parity or enhancement in the new architecture.
  • Future Alignment:  Based on requirements, determine whether the roadmap supports:
    • Sitecore Search (SaaS) for composable and cloud-first strategies.
    • Solr for on-prem or PaaS environments.
    • Third-party engines for enterprise-wide search needs.

 

6) Integrations, APIs & Data Flows

Integrations are often the hidden complexity in Sitecore migrations. They connect critical business systems, and any disruption can lead to post-go-live incidents. For small, simple content-based sites with no integrations, migrations tend to be quick and straightforward. However, for more complex environments, it’s essential to analyze all layers of the architecture to understand where and how data flows. This includes:

Key Considerations

  • Integration Inventory: List all synchronous and asynchronous integrations, including APIs, webhooks, and data pipelines. Some integrations may rely on deprecated endpoints or legacy SDKs that need refactoring.
  • Criticality & Dependencies: Identify mission-critical integrations (e.g.: CRM, ERP, payment gateways).
  • Batch & Scheduled Jobs: Audit long-running processes, scheduled exports, and batch jobs. Migration may require re-scheduling or re-platforming these jobs.
  • Security & Compliance: Validate API authentication, token lifecycles, and data encryption. Moving to SaaS or composable may require new security patterns.

 

7) Identify Which Sitecore offerings are in use — and to what extent?

Before migration, it’s essential to document the current Sitecore ecosystem and evaluate what the future state should look like. This determines whether the path is a straight upgrade or a transition to a composable stack.

Key Considerations

  • Current Topology: Is the solution running on XP or XM? Evaluate whether XP features (xDB, personalization) will still be needed if moving to composable.
  • Content Hub: Check if DAM or CMP is in use. If not, consider whether DAM is required for centralized asset management, brand consistency, and omnichannel delivery.
  • Sitecore Personalize & CDP: Assess if personalization is currently rule-based or if advanced testing and segmentation are required.
  • OrderCloud: If commerce capabilities exist today or are planned in the near future.

 

Target Topologies

One of the most critical decisions is choosing the target architecture. This choice impacts infrastructure, licensing, compliance, authoring experience, and long-term scalability. It's not just a technical decision—it's a business decision that shapes your future operating model.

Key Considerations

  • Business Needs & Compliance: Does your organization require on-prem hosting for regulatory reasons, or can you move to SaaS for agility?
  • Authoring Experience: Will content authors need Experience Editor, or is a headless-first approach acceptable?
  • Operational Overhead: How much infrastructure management can the team handle post-migration?
  • Integration Landscape: Are there tight integrations with legacy systems that require full control over infrastructure?

Architecture Options & Assumptions

  • XM (on-prem/PaaS)
    • Best for: CMS-only needs, multilingual content, custom integrations
    • Pros: Visual authoring via Experience Editor; hosting control
    • Cons: Limited marketing features
    • Assumptions: Teams want hosting flexibility and basic CMS capabilities, but analytics is not needed
  • Classic XP (on-prem/PaaS)
    • Best for: Advanced personalization, xDB, marketing automation
    • Pros: Full control; deep analytics; advanced marketing personalization
    • Cons: Complex infrastructure, high resource demand
    • Assumptions: Marketing features are critical; an infra-heavy setup is acceptable
  • XM Cloud (SaaS)
    • Best for: Agility, fast time-to-market, composable DXP
    • Pros: Reduced overhead; automatic updates; headless-ready
    • Cons: Limited low-level customization
    • Assumptions: SaaS regions meet compliance requirements; the team wants easy upgrades
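The trade-offs above can be compressed into a simple decision sketch. The rules below are a deliberate simplification for illustration, not official Sitecore guidance; real topology choices also weigh licensing, team skills, and roadmap.

```typescript
// Simplified topology chooser — a sketch of the considerations above,
// not official Sitecore sizing or licensing guidance.
interface Requirements {
  needsMarketingAutomation: boolean; // xDB, advanced personalization
  needsOnPremHosting: boolean;       // regulatory / data-residency constraints
  wantsManagedSaaS: boolean;         // minimize operational overhead
}

function suggestTopology(r: Requirements): "XP" | "XM" | "XM Cloud" {
  // Regulatory constraints pin the solution on-prem/PaaS.
  if (r.needsMarketingAutomation && r.needsOnPremHosting) return "XP";
  if (r.needsOnPremHosting) return "XM";
  // Otherwise prefer SaaS when the team wants managed upgrades.
  if (r.wantsManagedSaaS) return "XM Cloud";
  return r.needsMarketingAutomation ? "XP" : "XM Cloud";
}
```

Encoding the decision this way also makes a useful workshop artifact: stakeholders can argue about the rules instead of the conclusion.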

 

Along with topology, it's important to consider hosting and the frontend delivery platform. Let's look at the available hosting options with their pros and cons:

  • On-Prem (XM/XP): You can build exactly the type of machines you want.
    • Pros: Maximum control, full compliance for regulated industries, and ability to integrate with legacy systems.
    • Cons: High infrastructure cost, slower innovation, and manual upgrades, difficult to scale.
    • Best For: Organizations with strict data residency, air-gapped environments, or regulatory mandates.
    • Future roadmap may require migration to cloud, so plan for portability.
  • PaaS (Azure App Services, Managed Cloud – XM/XP)
    • Pros: Minimal up-front costs and you do not need to be concerned about the maintenance of the underlying machine.
    • Cons: Limited choice of computing options and functionality.
    • Best For: Organizations expecting to scale vertically and horizontally, often and quickly
  • IaaS (Infrastructure as a service – XM/XP)
    • This is the same as on-premises, but with VMs you can tailor servers to meet your exact requirements.
  • SaaS (XM Cloud)
    • Pros: Zero infrastructure overhead, automatic upgrades, global scalability.
    • Cons: Limited deep customization at infra level.
    • Best For: Organizations aiming for composable DXP and agility.
    • Fully managed by Sitecore (SaaS).

For development, you have different options, for example .NET MVC, .NET Core, Next.js, and React. Depending on the suggested topology, the frontend delivery can be hybrid or headless:

.NET MVC → For traditional, web-only applications.
Headless → For multi-channel, composable, SaaS-first strategy.
.NET Core Rendering → For hybrid modernization with .NET.
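Whichever framework is chosen, the headless pattern is the same: the CMS returns the layout as data, and a component map on the frontend decides what to render. The sketch below illustrates that idea in TypeScript; the JSON shape, component names, and renderers are simplified assumptions, not the exact Sitecore Layout Service contract.

```typescript
// Hedged sketch of the headless pattern: layout-as-data plus a component map.
// The shapes below are simplified, not the real Layout Service schema.
interface RenderingData {
  componentName: string;
  fields: Record<string, string>;
}

type Renderer = (fields: Record<string, string>) => string;

// Frontend registry mapping CMS rendering names to UI components.
const componentMap: Record<string, Renderer> = {
  Hero: (f) => `<h1>${f["title"]}</h1>`,
  RichText: (f) => `<div>${f["html"]}</div>`,
};

// Resolve each rendering to markup; unknown components degrade gracefully
// instead of breaking the page.
function renderPlaceholder(renderings: RenderingData[]): string {
  return renderings
    .map((r) => {
      const render = componentMap[r.componentName];
      return render ? render(r.fields) : `<!-- unknown: ${r.componentName} -->`;
    })
    .join("\n");
}
```

In a real JSS/Next.js app the registry would map to React components rather than strings, but the contract (component name in, rendered output out) is the part that shapes the migration work.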

 

8) Security, Compliance & Data Residency

Security is non-negotiable during any Sitecore migration or upgrade. These factors influence architecture, hosting choices and operational processes.

Key Considerations

  • Authentication & Access: Validate SSO, SAML/OAuth configurations, API security, and secrets management. Assume that identity providers or token lifecycles may need reconfiguration in the new environment.
  • Compliance Requirements: Confirm obligations like PCI, HIPAA, GDPR, accessibility, and regional privacy laws. Assume these will impact data storage and encryption; with AI now in the picture, they may even affect the development workflow.
  • Security Testing: Plan for automated vulnerability scans (decide which tools you will use) and manual penetration testing as part of discovery and pre-go-live validation.

 

9) Performance

A migration is the perfect opportunity to identify and fix structural performance bottlenecks, but only if you know your starting point. Without a baseline, it’s impossible to measure improvement or detect regressions.

Key Considerations

  • Baseline Metrics: Capture current performance indicators like TTFB (Time to First Byte), LCP (Largest Contentful Paint), CLS (Cumulative Layout Shift), throughput, and error rates. These metrics will guide post-migration validation and SLA commitments.
  • Caching & Delivery: Document existing caching strategies, CDN usage, and image delivery methods. Current caching patterns may need reconfiguration in the new architecture.
  • Load & Stress Testing: Define peak traffic scenarios and plan load testing with explicit targets for concurrent users and requests per second.
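One way to make the baseline actionable is to record percentile timings before migration and compare them after cutover. The sketch below is a hedged example rather than a prescribed toolchain: it computes nearest-rank percentiles over sampled timings (e.g., TTFB in milliseconds) and flags a regression against the baseline. The sample data would come from your monitoring or synthetic checks.

```typescript
// Compute a percentile over sampled timings using the nearest-rank method.
// Samples (e.g., TTFB in ms) would come from monitoring or synthetic checks.
function percentile(samples: number[], p: number): number {
  if (samples.length === 0) throw new Error("no samples");
  const sorted = [...samples].sort((a, b) => a - b);
  const rank = Math.ceil((p / 100) * sorted.length); // nearest-rank method
  return sorted[Math.max(0, rank - 1)];
}

// Regression gate: fail if the post-migration p95 exceeds the recorded
// baseline by more than the tolerance (default 10%).
function regressed(baselineP95: number, currentP95: number, tolerancePct = 10): boolean {
  return currentP95 > baselineP95 * (1 + tolerancePct / 100);
}
```

A check like this can run in the release pipeline so post-migration validation is a pass/fail gate against the pre-migration baseline instead of a judgment call.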

 

10) Migration Strategies

Choosing the right migration strategy is critical to balance risk, cost, and business continuity. There's no one-size-fits-all approach—the choice depends on timeline, technical debt, and operational constraints.

Common Approaches

    • Lift & Shift
      Move the existing solution as is with minimal changes.
      Best for low-risk migrations where speed is the priority and the current solution is stable with manageable technical debt.
      However, with this approach existing issues and inefficiencies carry over unchanged, which can be harmful.

 

    • Phased (Module-by-Module)
      Migrate critical areas first (e.g.: product pages, checkout) and roll out iteratively.
      Suited to large, complex sites where risk must be minimized and business continuity is critical.
      With this approach, timelines are longer and dual maintenance is required during the transition.

 

    • Rewrite & Cutover
      Rebuild the solution from scratch and switch over at once.
      Best when the current system doesn't align with the future architecture, or the business wants a clean slate for modernization.

 

 

The options above can be weighed against several factors: Can the business tolerate downtime or dual maintenance? What are the timelines and budget? Is the current solution worth preserving, or is a rewrite inevitable? Does the strategy align with future goals?

 

Final Thoughts

Migrating to Sitecore is a strategic move that can unlock powerful capabilities for content management, personalization, and scalability. However, success lies in the preparation. By carefully evaluating your current architecture, integration needs, and team readiness, you can avoid common pitfalls and ensure a smoother transition. Taking the time to plan thoroughly today will save time, cost, and effort tomorrow, setting the stage for a future-proof digital experience platform.

 

]]>
2025 Modern Healthcare Survey Ranks Perficient Among the 10 Largest Management Consulting Firms
https://blogs.perficient.com/2025/08/28/modern-healthcare-ranks-perficient-among-the-10-largest-management-consulting-firms/ (Thu, 28 Aug 2025)

Modern Healthcare has once again recognized Perficient among the largest healthcare management consulting firms in the U.S., ranking us ninth in its 2025 survey. This honor reflects not only our growth but also our commitment to helping healthcare leaders navigate complexity with clarity, precision, and purpose.

What’s Driving Demand: Innovation with Intent

As provider, payer, and MedTech organizations face mounting pressure to modernize, our work is increasingly focused on connecting digital investments to measurable business and health outcomes. The challenges are real—and so are the opportunities.

Healthcare leaders are engaging our experts to tackle shifts from digital experimentation to enterprise alignment in business-critical areas, including:

  • Digital health transformation that eases access to care.
  • AI and data analytics that accelerate insight, guide clinical decisions, and personalize consumer experiences.
  • Workforce optimization that supports clinicians, streamlines operations, and restores time to focus on patients, members, brokers, and care teams.

These investments represent strategic maturity that reshapes how care is delivered, experienced, and sustained.

Operational Challenges: Strategy Meets Reality

Serving healthcare clients means working inside a system that resists simplicity. Our industry, technical, and change management experts help leaders address three persistent tensions:

  1. Aligning digital strategy with enterprise goals. Innovation often lacks a shared compass. We translate divergent priorities—clinical, operational, financial—into unified programs that drive outcomes.
  2. Controlling costs while preserving agility. Budgets are tight, but the need for speed and competitive relevancy remains. Our approach favors scalable roadmaps and solutions that deliver early wins and can flex as the health care marketplace and consumer expectations evolve.
  3. Preparing the enterprise for AI. Many of our clients have discovered that their AI readiness lags behind ambition. We help build the data foundations, governance frameworks, and workforce capabilities needed to operationalize intelligent systems.

Related Insights: Explore the Digital Trends in Healthcare

Consumer Expectations: Access Is the New Loyalty

Our Access to Care research, based on insights from more than 1,000 U.S. healthcare consumers, reveals a fundamental shift: if your healthcare organization isn’t delivering a seamless, personalized, and convenient experience, consumers will go elsewhere. And they won’t always come back.

Many healthcare leaders still view competition as other hospitals or clinics in their region. But today’s consumer has more options—and they’re exercising them. From digital-first health experiences to hyper-local disruptors and retail-style health providers focused on accessibility and immediacy, the competitive field is rapidly expanding.

  • Digital convenience is now a baseline. More than half of consumers who encountered friction while scheduling care went elsewhere.
  • Caregivers are underserved. One in three respondents manage care for a loved one, yet most digital strategies treat the patient as a single user.
  • Digital-first care is mainstream. 45% of respondents aged 18–64 have already used direct-to-consumer digital care, and 92% of those adopters believe the quality is equal to or better than the care offered by their regular health care system.

These behaviors demand a rethinking of access, engagement, and loyalty. We help clients build experiences that are intuitive, inclusive, and aligned with how people actually live and seek care.

Looking Ahead: Complexity Accelerates

With intensified focus on modernization, data strategy, and responsible AI, healthcare leaders are asking harder questions. We’re helping them find and activate answers that deliver value now and build resilience for what’s next.

Our technology partnerships with Adobe, AWS, Microsoft, Salesforce, and other platform leaders allow us to move quickly, integrate deeply, and co-innovate with confidence. We bring cross-industry expertise from financial services, retail, and manufacturing—sectors where personalization and operational excellence are already table stakes. That perspective helps healthcare clients leapfrog legacy thinking and adopt proven strategies. And our fluency in HIPAA, HITRUST, and healthcare data governance ensures that our digital solutions are compliant, resilient, and future-ready.

Optimized, Agile Strategy and Outcomes for Health Insurers, Providers, and MedTech

Discover why we've been trusted by the 10 largest U.S. health systems, 10 largest U.S. health insurers, and 14 of the 20 largest medical device firms. We are recognized in analyst reports and regularly awarded for our excellence in solution innovation, industry expertise, and being a great place to work.

Contact us to explore how we can help you forge a resilient, impactful future that delivers better experiences for patients, caregivers, and communities.

]]>