SEO Articles / Blogs / Perficient
https://blogs.perficient.com/category/services/digital-marketing/seo/

Executing a Sitecore Migration: Development, Performance, and Beyond
Tue, 28 Oct 2025 | https://blogs.perficient.com/2025/10/28/executing-a-sitecore-migration-development/

In the previous blog, we explored the strategic and architectural considerations that set the foundation for a successful Sitecore migration. Once the groundwork is ready, it’s time to move from planning to execution, where the real complexity begins. The development phase of a Sitecore migration demands precision, speed, and scalability. From choosing the right development environment and branching strategy to optimizing templates, caching, and performance, every decision directly impacts the stability and maintainability of your new platform.

This blog dives into the practical side of migration, covering setup best practices, developer tooling (IDE and CI/CD), coding standards, content model alignment, and performance tuning techniques to help ensure that your transition to Sitecore’s modern architecture is both seamless and future-ready.

 

1. Component and Code Standards Over Blind Reuse

In any Sitecore migration, one of the biggest mistakes teams make is lifting and shifting old components into the new environment. While this may feel faster in the short term, it creates long-term problems:

  • Missed product offerings: Old components were often built around the constraints of an earlier Sitecore version. Reusing them as-is means you can’t take advantage of new product features like improved personalization, headless capabilities, SaaS integrations, and modern analytics.
  • Outdated standards: Legacy code usually does not meet current coding, security, and performance standards. This can introduce vulnerabilities and inefficiencies into your new platform.
  • Accessibility gaps: Many older components don’t align with WCAG and ADA accessibility standards — missing ARIA roles, semantic HTML, or proper alt text. Reusing them carries accessibility debt into your fresh build.
  • Maintainability issues: Old code often has tight coupling, minimal test coverage, and obsolete dependencies. Keeping it will slow down future upgrades and maintenance.

Best practice: Treat the migration as an opportunity to raise your standards. Audit old components for patterns and ideas, but don’t copy-paste them. Rebuild them using modern frameworks, Sitecore best practices, security guidelines, and accessibility compliance. This ensures the new solution is future-proof and aligned with the latest Sitecore roadmap.

 

2. Template Creation and Best Practices

Templates define the foundation of your content structure, so designing them carefully is critical.

  • Analyze before creating: Study existing data models, pages, and business requirements before building templates.
  • Use base templates: Group common fields (e.g., Meta, SEO, audit info) into base templates and reuse them across multiple content types.
  • Leverage branch templates: Standardize complex structures (like a landing page with modules) by creating branch templates for consistency and speed.
  • Follow naming and hierarchy conventions: Clear naming and logical organization make maintenance much easier.
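To make the base-template idea concrete, here is a small sketch (JavaScript only so it can run without a Sitecore instance); the template and field names are invented for illustration, not taken from a real solution.

```javascript
// Base templates group shared fields; page templates inherit them
// instead of redeclaring. All names here are hypothetical.
const baseTemplates = {
  _Seo: ['metaTitle', 'metaDescription', 'canonicalUrl'],
  _Audit: ['createdBy', 'updatedAt'],
};

// A page template lists its base templates plus its own fields.
function resolveFields(template) {
  const inherited = template.bases.flatMap((b) => baseTemplates[b]);
  return [...inherited, ...template.ownFields];
}

const articlePage = { bases: ['_Seo', '_Audit'], ownFields: ['headline', 'body'] };
console.log(resolveFields(articlePage));
// → ['metaTitle', 'metaDescription', 'canonicalUrl', 'createdBy', 'updatedAt', 'headline', 'body']
```

Because shared fields live in one place, adding a field to `_Seo` updates every template that inherits it.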

 

3. Development Practices and Tools

A clean, standards-driven development process ensures the migration is efficient, maintainable, and future-proof. It’s not just about using the right IDEs but also about building code that is consistent, compliant, and friendly for content authors.

  • IDEs & Tools
    • Use Visual Studio or VS Code with Sitecore- and frontend-specific extensions for productivity.
    • Set up linting, code analysis, and formatting tools (ESLint and Prettier for JSS code, StyleCop for .NET) to enforce consistency.
    • Use AI assistance (GitHub Copilot, Codeium, etc.) to speed up development, but always review outputs for compliance and quality. Many AI tools on the market can even turn designs or prototypes into code in a specified language.
  • Coding Standards & Governance
    • Follow SOLID principles and keep components modular and reusable.
    • Ensure secure coding standards: sanitize inputs, validate data, avoid secrets in code.
    • Write accessible code: semantic HTML, proper ARIA roles, alt text, and keyboard navigation.
    • Document best practices and enforce them with pull request reviews and automated checks.
  • Package & Dependency Management
    • Select npm/.NET packages carefully: prefer well-maintained, community-backed, and security-reviewed ones.
    • Avoid large, unnecessary dependencies that bloat the project.
    • Run dependency scanning tools to catch vulnerabilities.
    • Keep lockfiles for environment consistency.
  • Rendering Variants & Parameters
    • Leverage rendering variants (SXA/headless) to give flexibility without requiring code changes.
    • Add parameters so content authors can adjust layouts, backgrounds, or alignment safely.
    • Always provide sensible defaults to protect design consistency.
  • Content Author Experience: build with the content author in mind.

    • Use clear, meaningful field names and help text.
    • Avoid unnecessary complexity: fewer, well-designed fields are better.
    • Create modular components that authors can configure and reuse.
    • Validate with content author UAT to ensure the system is intuitive for day-to-day content updates.

Strong development practices not only speed up migration but also set the stage for easier maintenance, happier authors, and a longer-lasting Sitecore solution.
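As a sketch of the “sensible defaults” point for rendering parameters: a helper that merges author-supplied parameters over safe defaults, ignoring blank or unknown values. The parameter names (`theme`, `columns`, `alignment`) are hypothetical, not from any real component.

```javascript
// Defaults protect design consistency when an author leaves a
// rendering parameter unset or blank.
const DEFAULTS = { theme: 'light', columns: '3', alignment: 'left' };

function getParams(renderingParams = {}) {
  const resolved = { ...DEFAULTS };
  for (const [key, value] of Object.entries(renderingParams)) {
    // Ignore unknown keys and empty values; keep the safe default.
    if (key in DEFAULTS && value !== '' && value != null) {
      resolved[key] = value;
    }
  }
  return resolved;
}

console.log(getParams({ theme: 'dark', columns: '' }));
// → { theme: 'dark', columns: '3', alignment: 'left' }
```

The component then reads only from the resolved object, so a misconfigured parameter can never break the layout.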

 

4. Data Migration & Validation

Migrating data is not just about “moving items.” It’s about translating old content into a new structure that aligns with modern Sitecore best practices.

  • Migration tools
    Sitecore provides migration tools to move data between platforms, such as from XM to XM Cloud. Leverage these tools for data that can be copied as-is.
  • PowerShell for Migration
    • Use Sitecore PowerShell Extensions (SPE) to script the migration of data that cannot be copied as-is: content that must land in different locations or fields in the new system.
    • Automate bulk operations like item creation, field population, media linking, and handling of multiple language versions.
    • PowerShell scripts can be run iteratively, making them ideal as content continues to change during development.
    • Always include logging and reporting so migrated items can be tracked, validated, and corrected if needed.
  • Migration Best Practices
    • Field Mapping First: Analyze old templates and decide what maps directly, what needs transformation, and what should be deprecated.
    • Iterative Migration: Run migration scripts in stages, validate results, and refine before final cutover.
    • Content Cleanup: Remove outdated, duplicate, or unused content instead of carrying it forward.
    • SEO Awareness: Ensure titles, descriptions, alt text, and canonical fields are migrated correctly.
    • Audit & Validation:
      • Use PowerShell reports to check item counts, empty fields, or broken links.
      • Crawl both old and new sites with tools like Screaming Frog to compare URLs, metadata, and page structures.
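The “field mapping first” approach above can be sketched in miniature. This is an illustrative JavaScript model of the mapping logic (a real run would live in an SPE script against Sitecore items); the field names and the transform are made up for the example.

```javascript
// Declare, per old field, whether it copies directly, transforms,
// or is deprecated; then apply the map to an exported item.
const fieldMap = {
  Title: { to: 'headline' },                             // direct copy
  Summary: { to: 'teaser', transform: (v) => v.trim() }, // needs transformation
  LegacyFlag: null,                                      // deprecated: drop
};

function migrateItem(oldItem, log = []) {
  const newItem = {};
  for (const [field, value] of Object.entries(oldItem)) {
    const rule = fieldMap[field];
    if (!rule) {
      log.push(`dropped: ${field}`); // logging keeps each run auditable
      continue;
    }
    newItem[rule.to] = rule.transform ? rule.transform(value) : value;
  }
  return newItem;
}

const log = [];
console.log(migrateItem({ Title: 'Hi', Summary: '  intro  ', LegacyFlag: 'x' }, log));
// → { headline: 'Hi', teaser: 'intro' }
console.log(log); // → ['dropped: LegacyFlag']
```

Because the script is deterministic and logged, it can be rerun iteratively as content keeps changing during development.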

 

5. SEO Data Handling

SEO is one of the most critical success factors in any migration — if it’s missed, rankings and traffic can drop overnight.

  • Metadata: Preserve titles, descriptions, alt text, and Open Graph tags. Missing these leads to immediate SEO losses.
  • Redirects: Map old URLs with 301 redirects (avoid chains). Broken redirects = lost link equity.
  • Structured Data: Add/update schema (FAQ, Product, Article, VideoObject). This improves visibility in SERPs and AI-generated results.
  • Core Web Vitals: Ensure the new site is fast, stable, and mobile-first. Poor performance = lower rankings.
  • Emerging SEO: Optimize for AI/Answer Engine results, focus on E-E-A-T (author, trust, freshness), and create natural Q&A content for voice/conversational search.
  • Validation: Crawl the site before and after migration with tools like Screaming Frog or Siteimprove to confirm nothing is missed.

Strong SEO handling ensures the new Sitecore build doesn’t just look modern — it retains rankings, grows traffic, and is ready for AI-powered search.
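The “avoid chains” rule for redirects can be enforced mechanically: given a raw old-to-new URL map, resolve each source directly to its final destination so every hop is a single 301. A sketch (the URLs are invented):

```javascript
// Flatten redirect chains so no redirect points at another redirect.
function flattenRedirects(map) {
  const flat = {};
  for (const src of Object.keys(map)) {
    let target = map[src];
    const seen = new Set([src]);
    // Follow the chain until the target is final (or a loop is detected).
    while (map[target] !== undefined && !seen.has(target)) {
      seen.add(target);
      target = map[target];
    }
    flat[src] = target;
  }
  return flat;
}

const raw = { '/old-a': '/old-b', '/old-b': '/new-b', '/old-c': '/new-c' };
console.log(flattenRedirects(raw));
// → { '/old-a': '/new-b', '/old-b': '/new-b', '/old-c': '/new-c' }
```

Running this over the crawl export before go-live turns a two-hop chain (`/old-a` → `/old-b` → `/new-b`) into a single redirect, preserving link equity.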

 

6. Serialization & Item Deployment

Serialization is at the heart of a smooth migration and ongoing Sitecore development. Without the right approach, environments drift, unexpected items get deployed, or critical templates are missed.

  • ✅ Best Practices
    • Choose the Right Tool: Sitecore Content Serialization (SCS), Unicorn, or TDS — select based on your project needs.
    • Scope Carefully: Serialize only what is required (templates, renderings, branches, base content). Avoid unnecessary content items.
    • Organize by Modules: Structure serialization so items are grouped logically (feature, foundation, project layers). This keeps deployments clean and modular.
    • Version Control: Store serialization files in source control (Git/Azure DevOps) to track changes and allow safe rollbacks.
    • Environment Consistency: Automate deployment pipelines so serialized items are promoted consistently from dev → QA → UAT → Prod.
    • Validation: Always test deployments in lower environments first to ensure no accidental overwrites or missing dependencies.

Properly managed serialization ensures clean deployments, consistent environments, and fewer surprises during migration and beyond.
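For illustration, a Sitecore Content Serialization module file scoped along these lines might look like the following; the namespace, paths, and names are placeholders, so adapt them to your own solution’s feature/foundation/project layers.

```json
{
  "namespace": "Feature.Articles",
  "items": {
    "includes": [
      {
        "name": "templates",
        "path": "/sitecore/templates/Feature/Articles",
        "allowedPushOperations": "createUpdateAndDelete"
      },
      {
        "name": "renderings",
        "path": "/sitecore/layout/Renderings/Feature/Articles",
        "scope": "itemAndDescendants"
      }
    ]
  }
}
```

Keeping one module file per feature keeps the serialized scope explicit and reviewable in pull requests.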

 

7. Forms & Submissions

In Sitecore XM Cloud, forms require careful planning to ensure smooth data capture and migration.

  • XM Cloud Forms (Webhook-based): Submit form data via webhooks to CRM, backend, or marketing platforms. Configure payloads properly and ensure validation, spam protection, and compliance.
  • Third-Party Forms: HubSpot, Marketo, Salesforce, etc., can be integrated via APIs for advanced workflows, analytics, and CRM connectivity.
  • Create New Forms: Rebuild forms with modern UX, accessibility, and responsive design.
  • Migrate Old Submission Data: Extract and import previous form submissions into the new system or CRM, keeping field mapping and timestamps intact.
  • ✅ Best Practices: Track submissions in analytics, test end-to-end, and make forms configurable for content authors.

This approach ensures new forms work seamlessly while historical data is preserved.
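The validation, spam-protection, and payload points above can be sketched as a small server-side handler that runs before forwarding a submission to a webhook. This is illustrative only (not XM Cloud’s forms API), and the field names, including the `company_website` honeypot, are hypothetical.

```javascript
// Validate a form submission and shape the payload for a CRM webhook.
function prepareSubmission(form) {
  if (form.company_website) {
    // Honeypot field: humans leave it blank, bots tend to fill it.
    return { ok: false, reason: 'spam' };
  }
  const errors = [];
  if (!form.email || !/^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(form.email)) {
    errors.push('email');
  }
  if (!form.message || !form.message.trim()) errors.push('message');
  if (errors.length) return { ok: false, reason: 'invalid', errors };

  return {
    ok: true,
    payload: {
      email: form.email.toLowerCase(),
      message: form.message.trim(),
      submittedAt: new Date().toISOString(), // keep timestamps intact for later migration
    },
  };
}

console.log(prepareSubmission({ email: 'A@b.co', message: ' hi ' }).payload.email); // 'a@b.co'
```

Normalizing and timestamping at capture time is what makes migrating historical submissions later (with field mapping intact) straightforward.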

 

8. Personalization & Experimentation

Migrating personalization and experimentation requires careful planning to preserve engagement and insights.

  • Export & Rebuild: Export existing rules, personas, and goals. Review them thoroughly and recreate only what aligns with current business requirements.
  • A/B Testing: Identify active experiments, migrate if relevant, and rerun them in the new environment to validate performance.
  • Sitecore Personalize Implementation:
    • Plan data flow into the CDP and configure event tracking.
    • Implement personalization via Sitecore Personalize Cloud or the Engage SDK for XM Cloud implementations, depending on requirements.

✅ Best Practices:

  • Ensure content authors can manage personalization rules and experiments without developer intervention.
  • Test personalized experiences end-to-end and monitor KPIs post-migration.

A structured approach to personalization ensures targeted experiences, actionable insights, and a smooth transition to the new Sitecore environment.

 

9. Accessibility

Ensuring accessibility is essential for compliance, usability, and SEO.

  • Follow WCAG standards: proper color contrast, semantic HTML, ARIA roles, and keyboard navigation.
  • Validate content with accessibility tools and manual checks before migration cutover.
  • Accessible components improve user experience for all audiences and reduce legal risk.

 

10. Performance, Caching & Lazy Loading

Optimizing performance is critical during a migration to ensure fast page loads, better user experience, and improved SEO.

  • Caching Strategies:
    • Use Sitecore output caching and data caching for frequently accessed components.
    • Implement CDN caching for media assets to reduce server load and improve global performance.
    • Apply cache invalidation rules carefully to avoid stale content.
  • Lazy Loading:
    • Load images, videos, and heavy components only when they enter the viewport.
    • Improves perceived page speed and reduces initial payload.
  • Performance Best Practices:
    • Optimize images and media (WebP/AVIF).
    • Minimize JavaScript and CSS bundle size, and use tree-shaking where possible.
    • Monitor Core Web Vitals (LCP, CLS, INP) post-migration.
    • Test performance across devices and regions before go-live.
  • Content Author Consideration:
    • Ensure caching and lazy loading do not break dynamic components or personalization.
    • Provide guidance to authors on content that might impact performance (e.g., large images or embeds).

Proper caching and lazy loading ensure a fast, responsive, and scalable Sitecore experience, preserving SEO and user satisfaction after migration.
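The data-caching and invalidation points above can be sketched as a minimal time-based cache. This is an illustrative pattern, not Sitecore’s caching API; the TTL, key, and fetcher are placeholders.

```javascript
// Minimal data cache with TTL-based expiry and explicit invalidation.
function createCache(ttlMs, now = Date.now) {
  const store = new Map();
  return {
    get(key, fetcher) {
      const hit = store.get(key);
      if (hit && now() - hit.at < ttlMs) return hit.value; // fresh: serve cached
      const value = fetcher(key);                          // stale or miss: refetch
      store.set(key, { value, at: now() });
      return value;
    },
    invalidate(key) {
      store.delete(key); // e.g. on publish, so authors never see stale content
    },
  };
}

// Demo with an injected clock so expiry is deterministic.
let calls = 0;
let t = 0;
const cache = createCache(1000, () => t);
cache.get('nav', () => ++calls);
cache.get('nav', () => ++calls); // served from cache
t = 2000;
cache.get('nav', () => ++calls); // TTL expired: refetched
console.log(calls); // 2
```

The explicit `invalidate` hook is the important part: tying it to publish events is what keeps cached components from serving stale content.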

 

11. CI/CD, Monitoring & Automated Testing

A well-defined deployment and monitoring strategy ensures reliability, faster releases, and smooth migrations.

  • CI/CD Pipelines:
    • Set up automated builds and deployments according to your hosting platform: Azure, Vercel, Netlify, or on-premise.
    • Ensure deployments promote items consistently across Dev → QA → UAT → Prod.
    • Include code linting, static analysis, and unit/integration tests in the pipeline.
  • Monitoring & Alerts:
    • Track website uptime, server health, and performance metrics.
    • Configure timely alerts for downtime or abnormal behavior to prevent business impact.
  • Automated Testing:
    • Implement end-to-end, regression, and smoke tests for different environments.
    • Include automated validation for content, forms, personalization, and integrations.
    • Integrate testing into CI/CD pipelines to catch issues early.
  • ✅ Best Practices:
    • Ensure environment consistency to prevent drift.
    • Use logs and dashboards for real-time monitoring.
    • Align testing and deployment strategy with business-critical flows.

A robust CI/CD, monitoring, and automated testing strategy ensures reliable deployments, reduced downtime, and faster feedback cycles across all environments.

 

12. Governance, Licensing & Cutover

A successful migration is not just technical — it requires planning, training, and governance to ensure smooth adoption and compliance.

  • License Validation: Compare the current Sitecore license with what the new setup requires. Ensure coverage for all modules and environments, and validate that users and roles are granted the correct rights.
  • Content Author & Marketer Readiness:
    • Train teams on the new workflows, tools, and interface.
    • Provide documentation, demos, and sandbox environments to accelerate adoption.
  • Backup & Disaster Recovery:
    • Plan regular backups and ensure recovery procedures are tested.
    • Define RTO (Recovery Time Objective) and RPO (Recovery Point Objective) for critical data.
  • Workflow, Roles & Permissions:
    • Recreate workflows, roles, and permissions in the new environment.
    • Implement custom workflows if required.
    • Governance gaps can lead to compliance and security risks — audit thoroughly.
  • Cutover & Post-Go-Live Support:
    • Plan the migration cutover carefully to minimize downtime.
    • Prepare a support plan for immediate issue resolution after go-live.
    • Monitor KPIs, SEO, forms, personalization, and integrations to ensure smooth operation.

Proper governance, training, and cutover planning ensure the new Sitecore environment is compliant, adopted by users, and fully operational from day one.

 

13. Training & Documentation

Proper training ensures smooth adoption and reduces post-migration support issues.

  • Content Authors & Marketers: Train on new workflows, forms, personalization, and content editing.
  • Developers & IT Teams: Provide guidance on deployment processes, CI/CD, and monitoring.
  • Documentation: Maintain runbooks, SOPs, and troubleshooting guides for ongoing operations.
  • Encourage hands-on sessions and sandbox practice to accelerate adoption.

 

Summary:

Sitecore migrations are complex, and success often depends on the small decisions made throughout development, performance tuning, SEO handling, and governance. This blog brings together practical approaches and lessons learned from real-world implementations — aiming to help teams build scalable, accessible, and future-ready Sitecore solutions.

While every project is different, the hope is that these shared practices offer a useful starting point for others navigating similar journeys. The Sitecore ecosystem continues to evolve, and so do the ways we build within it.

 

Top 5 Drupal AI Modules to Transform Your Workflow
Mon, 29 Sep 2025 | https://blogs.perficient.com/2025/09/29/top-5-drupal-ai-modules-to-transform-your-workflow/

The AI Revolution is in Drupal CMS 

The way we create, optimize, and deliver content has fundamentally changed. Artificial Intelligence is no longer a futuristic concept; it’s a practical, indispensable tool for content teams. For years, Drupal has been the gold standard for structured, enterprise-level content management. Now, with the rapid maturation of the community’s Artificial Intelligence Initiative, Drupal is emerging as the premier platform for an Intelligent CMS. 

This post is for every content editor, site builder, and digital marketer who spends too much time on repetitive tasks like writing alt text, crafting meta descriptions, or translating copy. We’re moving the AI power from external tools directly into your Drupal admin screen. 

We will explore five essential Drupal modules that leverage AI to supercharge your content workflow, making your team faster, your content better, and your website more effective. This is about making Drupal work smarter, not just harder. 

The collective effort to bring this intelligence to Drupal is being driven by the community, and you can see the foundational work, including the overview of many related projects, right here at the Drupal Artificial Intelligence Initiative. 

 

  1. AI CKEditor Integration: The Content Co-Pilot

This functionality is typically provided by a suite of modules, with the core framework being the AI (Artificial Intelligence) module and its submodules like AI CKEditor. It integrates large language models (LLMs) like those from OpenAI or Anthropic directly into your content editor. 

Role in the CMS 

This module places an AI assistant directly inside the CKEditor 5 toolbar, the primary rich-text editor in Drupal. It turns the editor from a passive text field into an active, helpful partner. It knows the context of your page and is ready to assist without ever requiring you to leave the edit screen. 

How It’s Useful 

  • For Content Editors: It eliminates the dreaded “blank page syndrome.” Highlight a bulleted list and ask the AI to “turn this into a formal paragraph” or “expand this summary into a 500-word article.” You can instantly check spelling and grammar, adjust the tone of voice (e.g., from professional to friendly), and summarize long blocks of text for teasers or email excerpts. It means spending less time writing the first draft and more time editing and refining the final, human-approved version. 
  • For Site Builders: It reduces the need for editors to jump between Drupal and external AI tools, streamlining the entire content creation workflow and keeping your team focused within the secure environment of the CMS. 

 

  2. AI Image Alt Text: The SEO Automator

AI Image Alt Text is a specialized module that performs one critical task exceptionally well: using computer vision to describe images for accessibility and SEO. 

Role in the CMS 

This module hooks into the Drupal Media Library workflow. The moment an editor uploads a new image, the module sends that image to a Vision AI service (like Google Vision or an equivalent LLM) for analysis. The AI identifies objects, actions, and scenes, and then generates a descriptive text which is automatically populated into the image’s Alternative Text (Alt Text) field. 

How It’s Useful 

  • For Accessibility: Alt text is crucial for WCAG compliance. Screen readers use this text to describe images to visually impaired users. This module ensures that every image, regardless of how busy the editor is, has a meaningful description, making your site more inclusive right from the start. 
  • For SEO & Editors: Alt text is a ranking signal for search engines. It also saves the editor the most tedious part of their job. Instead of manually typing a description like “Woman sitting at a desk typing on a laptop with a cup of coffee,” the AI provides a high-quality, descriptive draft instantly, which the editor can quickly approve or slightly refine. It’s a huge time-saver and compliance booster. 

 

  3. AI Translation: The Multilingual Enabler

This feature is often a submodule within the main AI (Artificial Intelligence) framework, sometimes leveraging a dedicated integration like the AI Translate submodule, or integrating with the Translation Management Tool (TMGMT). 

Role in the CMS 

Drupal is one of the world’s most powerful platforms for building multilingual websites. This module builds upon that strength by injecting AI as a Translation Provider. Instead of waiting for a human translator for the first pass, this module allows content to be translated into dozens of languages with the click of a button. 

How It’s Useful 

  • For Global Content Teams: Imagine launching a product page simultaneously across five markets. This tool performs the initial, high-quality, machine-generated translation and saves it as a draft in the corresponding language node. The local editor then only needs to perform post-editing (reviewing and culturally adapting the text), which is significantly faster and cheaper than translating from scratch. 
  • For Site Owners: It drastically cuts the time-to-market for multilingual content and ensures translation consistency across technical terms. It leverages the AI’s power for speed while retaining the essential human oversight for cultural accuracy. 

 

  4. AI Automators: The Smart Curator

AI Automators (a powerful submodule of the main AI project) allows you to set up rules that automatically populate or modify fields based on content entered in other fields. 

Role in the CMS 

This is where the magic of “smart” content happens. An Automator is a background worker that monitors when a piece of content is saved. You can configure it to perform chained actions using an LLM. For instance, when an editor publishes a new blog post: 

  1. Read the content of the Body field. 
  2. Use a prompt to generate five relevant keywords/topics. 
  3. Automatically populate the Taxonomy/Tags field with those terms. 
  4. Use another prompt to generate a concise post for X (formerly Twitter). 
  5. Populate a new Social Media Post field with that text. 

How It’s Useful 

  • For Content Strategists: It enforces content standards and completeness. Every piece of content is automatically tagged and optimized, reducing the chance of human error and improving content discoverability through precise categorization. It ensures your SEO and content strategy is executed flawlessly on every save. 
  • For Site Builders: It brings the power of Event-Condition-Action (ECA) workflows into the AI space. It’s a no-code way to build complex, intelligent workflows that ensure data integrity and maximize the usefulness of content metadata. 
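The chained save-time actions described above can be sketched in miniature. This is JavaScript purely for illustration (real Automators are configured in Drupal and call a configured LLM provider); the `llm` function below is a stand-in that fakes a model.

```javascript
// Stub for a model call: a real automator would invoke a configured
// LLM provider here. The prompt names are invented.
const llm = (prompt, text) => {
  if (prompt === 'keywords') {
    return (text.toLowerCase().match(/[a-z]{5,}/g) || []).slice(0, 5);
  }
  return text.slice(0, 80) + '…';
};

// Each automator reads the node and populates one field.
const automators = [
  (node) => ({ ...node, tags: llm('keywords', node.body) }),
  (node) => ({ ...node, socialPost: llm('summary', node.body) }),
];

// Chaining the automators on save keeps every piece of content
// tagged and summarized without editor effort.
function onSave(node) {
  return automators.reduce((n, run) => run(n), node);
}

const saved = onSave({ title: 'Post', body: 'Drupal automators populate fields automatically' });
console.log(saved.tags);
// → ['drupal', 'automators', 'populate', 'fields', 'automatically']
```

The value is in the chaining: because each step is a pure node-in, node-out function, new steps (taxonomy checks, summaries, social posts) can be added without touching the others.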

 

  5. AI Agents: The Operational Assistant

AI Agents, typically used in conjunction with the main AI framework, is a powerful new tool that uses natural language to execute administrative and site-building tasks. 

Role in the CMS

An AI Agent is like a virtual assistant for your Drupal back-end. Instead of navigating through multiple complex configuration forms to, say, create a new field on a content type, you simply tell the Agent what you want it to do in plain English. The Agent interprets your request, translates it into the necessary Drupal API calls, and executes the changes. The module comes with various built-in agents (like a Field Type Agent or a Content Type Agent). 

How It’s Useful 

  • For Site Builders and Non-Technical Admins: This is a revolutionary step toward conversational configuration. You can issue a command like: “Please create a new Content Type called ‘Product Review’ and add a new text field named ‘Reviewer Name’.” The agent handles the creation process instantly. This dramatically reduces the learning curve and time needed for common site-building tasks. 
  • For Automation: Agents can be chained together or triggered by other systems to perform complex, multi-step actions on the CMS structure itself. Need to update the taxonomy on 50 terms? A dedicated agent can handle the large-scale configuration change based on a high-level instruction, making system maintenance far more efficient. It turns administrative management into a conversation. 
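To make the request-to-action idea tangible, here is a toy sketch of the shape of the translation step: a natural-language command mapped to a structured action. A real agent uses an LLM and Drupal’s APIs; this regex version only illustrates the interface, and the action names are invented.

```javascript
// Map a plain-English request to a structured site-building action.
function parseCommand(text) {
  const m = text.match(/content type called '(.+?)'.*field named '(.+?)'/i);
  if (!m) return null;
  return {
    action: 'createContentType',
    contentType: m[1],
    fields: [m[2]],
  };
}

console.log(parseCommand(
  "Create a new Content Type called 'Product Review' and add a new text field named 'Reviewer Name'."
));
// → { action: 'createContentType', contentType: 'Product Review', fields: ['Reviewer Name'] }
```

Once the request is in structured form, executing it is ordinary API work; the agent’s job is precisely this translation step, done robustly by a model rather than a regex.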

 

Conclusion:

The integration of AI into Drupal is one of the most exciting developments in the platform’s history. It is a powerful affirmation of Drupal’s strength as a structured content hub. These modules—the AI CKEditor, AI Image Alt Text, AI Translation, AI Automators, and now the transformative AI Agents—are not here to replace your team. They are here to empower them.

By automating the mundane, repetitive, and technical aspects of content management and even site configuration, these tools free up your content creators and site builders to focus on what humans do best: strategy, creativity, and high-level decision-making. The future of content management in Drupal is intelligent, efficient, and, most importantly, human-powered. It’s time to equip your team with these new essentials and watch your digital experiences flourish. 

Drupal 11’s AI Features: What They Actually Mean for Your Team
Thu, 04 Sep 2025 | https://blogs.perficient.com/2025/09/04/drupal-11s-ai-features-what-they-actually-mean-for-your-team/

Drupal 11’s AI Features: What They Actually Mean for Your Team

If you’ve been following the Drupal community lately, you’ve probably heard about the excitement around AI in Drupal 11 and the new Drupal AI Initiative. With over $100,000 in funding and 290+ AI modules already available, this is shaping up to be a game changer.

But here’s the thing: AI in Drupal isn’t about replacing your team. It’s about making everyone more effective at what they already do best. Let’s talk through some of these new capabilities and what they mean for different teams in your organization.

Content Teams: Finally, An Assistant That Actually Helps

Creating quality content quickly has always been a challenge, but Drupal 11’s AI features tackle this head-on. The AI CKEditor integration gives content creators real-time assistance right in the editing interface: spelling corrections, translations, and contextual suggestions as you type.

The AI Content module is where things get interesting. It can automatically adjust your content’s tone for different audiences, summarize long content, and even suggest relevant taxonomy terms. For marketing teams juggling multiple campaigns, this means maintaining brand consistency without the usual back-and-forth reviews.

One feature that’s already saving teams hours is the AI Image Alt Text module. Instead of manually writing alt text for accessibility compliance, it generates descriptions automatically. The AI Translate feature is another game-changer for organizations with global reach—one-click multilingual content creation that actually understands context.

The bottom line? Your content team can focus on strategy and creativity instead of getting bogged down in routine tasks.

Developers: Natural Language Site Building

Here’s where Drupal 11 gets really exciting for a dev team. The AI Agents module introduces something we haven’t seen before: text-to-action capabilities. Developers can now modify Drupal configurations, create content types, and manage taxonomies just by describing what they need in plain English.

Instead of clicking through admin interfaces, you can literally tell Drupal what you want: “Create a content type for product reviews with fields for rating, pros, cons, and reviewer information.” The system understands and executes these commands.

The AI module ecosystem supports over 21 major providers: OpenAI, Claude, AWS Bedrock, Google Vertex, and more. This means you’re not locked into any single AI provider and can choose the best model for specific tasks. The AI Explorer gives you a testing ground to experiment with prompts before pushing anything live.

For complex workflows, AI Automators let you chain multiple AI systems together. Think automated content transformation, field population, and business logic handling with minimal custom code.

Another great aspect of Drupal AI is Drupal’s open-source backbone, which lets your dev team extend, add to, and build upon these agents in any way it sees fit.

Marketing Teams: Data-Driven Campaign Planning

Marketing teams might be the biggest winners here. The AI Content Strategy module analyzes your existing content and provides recommendations for what to create next based on actual data, not guesswork. It identifies gaps in your content strategy and suggests targeted content based on audience behavior and industry trends.

The AI Search functionality means visitors can find content quickly: no more keyword guessing games. The integrated chatbot framework provides intelligent customer service that can access your site’s content to give accurate responses.

For SEO, the AI SEO module generates reports with user recommendations, reviewing content and metadata automatically. This reduces the need for separate SEO tools while giving insights right where you can act on them.

Why This Matters Right Now

The Drupal AI Initiative represents something more than just new features. With dedicated teams from leading agencies and serious funding behind it, this is Drupal positioning itself as the go-to platform for AI-powered content management.

For IT executives evaluating CMS options, Drupal 11’s approach is a great fit. You maintain complete control over your data and AI interactions while getting enterprise-grade governance with approval workflows and audit trails. It’s AI augmentation rather than AI replacement.

The practical benefits are clear: faster campaign launches, consistent brand voice across all content, and teams freed from manual tasks to focus on strategic work. In today’s competitive landscape, that kind of operational efficiency can make the difference between leading your market and playing catch-up.

The Reality Check

We all know, no technology is perfect. The success of these AI features, especially within the open source community, depends heavily on implementation and team adoption. You’ll need to spend time in training and process development to see real benefits. Like any new technology, there will be a learning curve as your team figures out the best ways to leverage these new features.

Based on what we are seeing from groups that adopted the AI features early, teams are realizing a good ROI through improved team efficiency, faster marketing turnaround, and reduced SEO churn.

If you’re considering how Drupal 11’s AI features might fit your organization, it’s worth having a conversation with an experienced implementation partner like Perficient. We can help you navigate the options and develop an AI strategy that makes sense for your specific situation.

]]>
https://blogs.perficient.com/2025/09/04/drupal-11s-ai-features-what-they-actually-mean-for-your-team/feed/ 0 386893
How to Track User Interactions in React with a Custom Event Logger https://blogs.perficient.com/2025/07/28/how-to-track-user-interactions-in-react/ https://blogs.perficient.com/2025/07/28/how-to-track-user-interactions-in-react/#respond Mon, 28 Jul 2025 08:56:12 +0000 https://blogs.perficient.com/?p=385319

In today’s data-driven world, understanding how users interact with your application is no longer optional; it’s essential. Every scroll, click, and form submission tells a story, a story about what your users care about, what they ignore, and where they might be facing friction.

This is where event tracking and analytics come into play.

Traditionally, developers and product teams rely on third-party tools like Google Analytics, LogRocket, or Hotjar to collect and analyze user behavior. These tools are powerful, but they come with trade-offs:

  • Privacy concerns: You may not want to share user data with external services.
  • Cost: Premium analytics platforms can be expensive.
  • Limited customization: You’re often restricted to predefined event types and dashboards.

 What Is Event Tracking?

Event tracking is the process of capturing and analyzing specific user interactions within a website or application. These events help you understand how users engage with your product.

 Common Events to Track:

  • Page Views – When a user visits a page
  • Button Clicks – Interactions with CTAs or navigation
  • Scroll Events – How far users scroll down a page
  • Form Submissions – When users submit data
  • Text Inputs – Typing in search bars or forms
  • Mouse Movements – Hovering or navigating with the cursor

Why Is It Important?

The primary goal of event tracking is to:

  • Understand user behavior
  • Identify friction points in the UI/UX
  • Make data-informed decisions for product improvements
  • Measure feature adoption and conversion rates

Whether you’re a developer, product manager, or designer, having access to this data empowers you to build better, more user-centric applications.

In this blog, I’ll give you a high-level overview of a custom Event Tracker POC built with React.js and Bootstrap—highlighting only the key snippets and how user interactions are tracked.

  1. Reusable Event Tracker Utility:
    const eventTracker = (eventName, eventData = {}) => {
      const key = 'eventCounts';
      // Read the running per-event counts and increment this event's count
      const existing = JSON.parse(localStorage.getItem(key)) || {};
      existing[eventName] = (existing[eventName] || 0) + 1;
      localStorage.setItem(key, JSON.stringify(existing));
      const event = {
        name: eventName,
        data: eventData,
        timestamp: new Date().toISOString(),
      };
      console.log('Tracked Event:', event);
      console.log('Event Counts:', existing);
    };
    

     

  2. Wherever an event happens, call the tracker in the format below (e.g., on form submit):
    eventTracker('Form Submitted', { name, email });

     

  3. To read how many times a given event has occurred, use a helper like the one below:
    export const getEventCount = (eventName) => {
      const counts = JSON.parse(localStorage.getItem('eventCounts')) || {};
      return counts[eventName] || 0;
    };
    
    

     

  4. Usage in dashboard
    import { getEventCount } from '../utils/eventTracker';
    
    const formSubmitCount = getEventCount('Form Submitted');
    const inputChangeCount = getEventCount('Input Changed');
    const pageViewCount = getEventCount('Page Viewed');
    const scrollEventCount = getEventCount('Scroll Event');
    
    

    This allows you to monitor how many times each event has occurred during the user’s session (as long as localStorage is retained).
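Because the tracker is just a read-increment-write cycle against a key-value store, the same counting logic can also run outside the browser (for unit tests or server-side code) by swapping localStorage for an in-memory stand-in. Here is a minimal sketch; the `createEventTracker` factory and `memoryStore` names are illustrative additions, not part of the POC above:

```javascript
// In-memory stand-in for localStorage (same getItem/setItem contract).
const memoryStore = (() => {
  const data = {};
  return {
    getItem: (key) => (key in data ? data[key] : null),
    setItem: (key, value) => { data[key] = value; },
  };
})();

// Factory version of the POC's tracker: the storage backend is injected,
// so the counting logic is testable anywhere JavaScript runs.
const createEventTracker = (store) => {
  const key = 'eventCounts';
  const track = (eventName, eventData = {}) => {
    const existing = JSON.parse(store.getItem(key)) || {};
    existing[eventName] = (existing[eventName] || 0) + 1;
    store.setItem(key, JSON.stringify(existing));
    return { name: eventName, data: eventData, timestamp: new Date().toISOString() };
  };
  const getCount = (eventName) => {
    const counts = JSON.parse(store.getItem(key)) || {};
    return counts[eventName] || 0;
  };
  return { track, getCount };
};

// Usage
const tracker = createEventTracker(memoryStore);
tracker.track('Form Submitted', { name: 'Asha' });
tracker.track('Form Submitted', { name: 'Ravi' });
console.log(tracker.getCount('Form Submitted')); // 2
```

In the browser, passing `window.localStorage` instead of `memoryStore` restores the original behavior.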

Advantages of Custom Event Tracker:

  1. Full Control – Track only what matters, with custom data structure
  2. Data Privacy – No third-party servers, easier GDPR/CCPA compliance
  3. Cost Effective – No subscription, suitable for POCs and internal tools
  4. Custom UI – Fully customizable dashboard with React and Bootstrap
  5. No External Dependencies – Works offline or in secure environments
  6. Easy Debugging – Transparent logic and flexible debugging process

Conclusion:

  1. If your focus is flexibility, cost savings, and data ownership, a custom event tracker built in any framework or library (like this POC) is a powerful choice—especially for MVPs, internal dashboards, and privacy-conscious applications.
  2. However, for quick setup, advanced analytics, and visual insights, third-party tools are better suited—particularly in production-scale apps where speed and insights for non-developers matter most.
  • Use custom tracking when you want control.
  • Use third-party tools when you need speed.
]]>
https://blogs.perficient.com/2025/07/28/how-to-track-user-interactions-in-react/feed/ 0 385319
SEO in the Age of AI: Should Marketers Defend or Discover? https://blogs.perficient.com/2025/07/22/seo-in-the-age-of-ai-should-marketers-defend-or-discover/ https://blogs.perficient.com/2025/07/22/seo-in-the-age-of-ai-should-marketers-defend-or-discover/#respond Tue, 22 Jul 2025 14:04:41 +0000 https://blogs.perficient.com/?p=384571

The rules of digital engagement and SEO are changing fast. As Generative AI platforms and large language models (LLMs) become the go-to tools for information and decision-making, traditional SEO strategies are being upended. Search engines are no longer the only gateway to brand discovery. Increasingly, consumers are turning to AI-powered assistants to ask questions, compare products, and make decisions without ever having to click a link.

For marketing leaders, this shift presents a strategic dilemma: Should we focus on defending the audiences we already own, or invest in optimizing for AI-driven discovery?

The Disruption: Defend What You Own

According to Gartner’s 2025 Marketing Predictions, as AI continues to disrupt traditional search and social channels, CMOs must prioritize owned media and direct customer relationships. The report emphasizes that brands can no longer rely solely on third-party platforms to drive traffic and engagement. Instead, marketers should double down on email, CRM, and loyalty programs to build resilience and maintain control over their customer connections.

This “defend” strategy is about protecting what you’ve already built, a.k.a. your audience, your data, and your brand equity. By investing in personalized content, lifecycle marketing, and community engagement, brands can create meaningful experiences that are not dependent on search engine rankings or social media algorithms.

The Opportunity: LLMs as the New Front Door

On the other side of the spectrum, Adobe recently introduced its LLM Optimizer, a powerful tool designed to help brands surface more effectively in AI-generated responses. Adobe’s perspective is clear: LLMs are not just a threat, they’re a massive opportunity for brand discovery.

As consumers increasingly rely on AI to answer questions like “What’s the best CRM for small businesses?” or “Which skincare brands are sustainable?”, the brands that show up in those answers will win. But showing up requires a new kind of optimization. One that goes beyond keywords and backlinks. It means structuring content in a way that LLMs can understand, contextualize, and trust.

Adobe’s LLM Optimizer helps marketers do just that. By analyzing how your brand is represented across AI platforms and providing actionable insights to improve visibility, it empowers marketing teams to stay ahead of the curve.

The Strategic Dilemma: Defend or Discover?

So, which path should marketing leaders take?

Defend

Focus on retention, loyalty, and lifetime value. Strengthen your email strategy, build community, and personalize every touchpoint.

Discover

Embrace the AI frontier. Optimize your content for LLMs, monitor how your brand appears in AI-generated answers, and adapt your strategy accordingly.

The truth is, this isn’t an either/or decision. It’s a both/and.

The Hybrid Approach: Rethinking SEO for the AI Era

Modern SEO is no longer just about ranking on Google. It’s about being the best answer in an AI-generated conversation. That means:

  • Auditing your content to ensure it’s structured, factual, and authoritative
  • Leveraging tools like Adobe’s LLM Optimizer to understand and improve your brand’s AI visibility
  • Doubling down on owned channels to build lasting relationships with your audience

The New Front Door Is Open

Generative AI isn’t just changing how people search. It’s changing how they discover, evaluate, and engage with brands. As marketing leaders, we must evolve our strategies to meet this moment.

So, ask yourself: Is your brand ready for the new front door?

Let Perficient Be Your Key to the New Front Door

At Perficient, we help brands unlock the full potential of this new era in digital marketing. By combining Adobe’s cutting-edge technology with our strategic marketing expertise, we empower organizations to not only protect their existing audiences but also pioneer new paths to discovery through AI-driven experiences.

Whether you’re looking to reinforce your foundation or step confidently into the future, we’ll help you build a strategy that’s not just future-ready, but future-proof.

]]>
https://blogs.perficient.com/2025/07/22/seo-in-the-age-of-ai-should-marketers-defend-or-discover/feed/ 0 384571
Integrate Coveo Atomic CLI-Based Hosted Search Page into Adobe Experience Manager (AEM) https://blogs.perficient.com/2025/06/18/integrate-coveo-atomic-cli-based-hosted-search-page-into-adobe-experience-manager-aem/ https://blogs.perficient.com/2025/06/18/integrate-coveo-atomic-cli-based-hosted-search-page-into-adobe-experience-manager-aem/#respond Wed, 18 Jun 2025 06:20:24 +0000 https://blogs.perficient.com/?p=382055

Getting Started with Coveo Atomic CLI

This section explains how to install, configure, and deploy a Coveo Atomic project using the Coveo CLI.

Install the CLI

To get started, install the Coveo CLI globally with npm:

npm install -g @coveo/cli

To ensure you’re always using the latest version, update it anytime with:

npm update -g @coveo/cli

Authentication

Once the CLI is installed, you will need to authenticate to your Coveo organization. Use the following command, replacing the placeholders with your specific organization details:

coveo auth:login --environment=prod --organization=<your-organization> --region=<your-region>

For example:

coveo auth:login --environment=prod --organization=blogtestorgiekhkuqk --region=us

Initialize a Coveo Atomic CLI Project

After logging in, initialize a new atomic project by running:

coveo atomic:init <project-name> --type=app

For example:

coveo atomic:init atomicInterface --type=app

Building and Deploying the Project

Once the project is ready, build the application:

npm run build

This command compiles your code and prepares it for deployment. It creates a production-ready build inside the dist/ folder.

Then deploy your interface to Coveo using:

coveo ui:deploy

After deployment, your search interface will be hosted on Coveo’s infrastructure, ready to embed anywhere—like Adobe Experience Manager (AEM).


Using and Initializing Atomic-Hosted-Page

This section guides you through using and initializing the Atomic-Hosted-Page component of your Coveo project.

Use Atomic-Hosted-Page

If you have customized your Atomic search page locally and deployed it to the Coveo infrastructure, then it will be listed in the Custom Deployment tab of the Search Pages (platform-ca | platform-eu | platform-au) page of the Administration Console. You can use the atomic-hosted-page component to consume it from anywhere on the web.

Initialize Atomic-Hosted-Page

Once you have installed the atomic-hosted-page or atomic-hosted-ui web component, you’ll need to add a script like the following to initialize the atomic-hosted-page component:

<head>
  <!-- ... -->
  <script>
    (async () => {
      await customElements.whenDefined('atomic-hosted-ui');
      const atomicHostedUIPage = document.querySelector('atomic-hosted-ui');

      await atomicHostedUIPage.initialize({
        accessToken: '<ACCESS_TOKEN>', 
        organizationId: '<ORGANIZATION_ID>', 
        pageId: '<PAGE_ID>' 
      });
    })();
  </script>
  <!-- ... -->
  <atomic-hosted-ui hosted-type="code"></atomic-hosted-ui> 
  <!-- ... -->
</head>

In this script, replace the placeholders with Coveo-specific details:

<ACCESS_TOKEN> (string) is an API key or platform token that grants the View all access level on the Search Pages domain in the target Coveo organization.
<ORGANIZATION_ID> (string) is the unique identifier of your organization (for example, mycoveoorganizationa1b23c).
<PAGE_ID> (string) is the unique identifier of the hosted page, which you can copy from the Administration Console.

Steps to Embed in Adobe Experience Manager (AEM)

  1. Login to Adobe AEM Author Instance
    Example URL: https://author-555.adobeaemcloud.com/

  2. Navigate to the AEM Sites Console
    Go to: https://author-555.adobeaemcloud.com/sites.html/content/blog/us/en/search-results
    This is the Sites Console in AEM, used to manage your website’s pages and structure.

  3. Create or Select the Page

    • Create new or use an existing page, for example: search-results.

    • Select the page’s checkbox → click Edit (top toolbar).

    • You’ll be redirected to the Page Editor: https://author-555.adobeaemcloud.com/editor.html/content/blog/us/en/search-results.html.

  4. Embed the Coveo Script:
    In the Page Editor, open the Content Tree on the left, select Layout Container, and click the Configure (wrench icon) button.

  5. Choose Embed Type
    Choose Embed → iFrame. Paste your <atomic-hosted-page> script inside the iFrame.

  6. Preview and Publish the Page

    Click the Page Information icon → Publish Page. An alert confirms that the page will be live.

  7. View the Published Page
    Example URL: http://localhost:4502/content/blog/us/en/search-results.html

That’s it—you’ve successfully embedded your Coveo Atomic CLI-based Hosted Search Page inside Adobe Experience Manager!

References:

Use a hosted page in your infrastructure | Coveo Atomic

 

]]>
https://blogs.perficient.com/2025/06/18/integrate-coveo-atomic-cli-based-hosted-search-page-into-adobe-experience-manager-aem/feed/ 0 382055
User Needs Drive SERP Evolution – Google’s Search Central Live NYC 2025 https://blogs.perficient.com/2025/03/24/user-needs-drive-serp-evolution/ https://blogs.perficient.com/2025/03/24/user-needs-drive-serp-evolution/#respond Mon, 24 Mar 2025 17:41:51 +0000 https://blogs.perficient.com/?p=379117

SERPs will change, AI will be a big part of the changes, and SEO remains relevant.

Those were key points made at Google’s Search Central Live in New York City on March 20.

“Search is never a solved problem,” according to Liz Reid (quoted at the event). John Mueller pointed out that “15% of all searches are new every day.” This means Google has to work hard to keep a fresh index and run algorithms able to serve results that meet user needs.

John Mueller presenting at Google Search Central Live NYC 2025

AI Overview

The biggest change in SERPs over the past year is the introduction of what Google calls AI Overview (AIO), a form of featured snippet on ‘roids. Unlike featured snippets, AIO is generated text based on multiple sources and often with no easily found links to other web sites.

The impact of AIO on click-through rates is unclear, but client data I have seen and studies by SEO pros and vendors show a 50-75% decrease in CTR, depending on the methodology. AIO can improve CTR for pages that rank so low that even the meager rates for links cited in AIO beat those of the organic result.

Attendees harshly criticized Google’s refusal to share AIO traffic data in Search Console’s Performance report. A Google rep argued that AIO is still new and changing so much that it doesn’t make sense to share the data at this point. That argument did not seem to go over well.

Spam, Penalties, and Site Reputation Abuse

Site Reputation Abuse is a manual action that also entered the chat last year. It penalizes websites that use third-party content to rank with the help of signals earned by first-party work. (Some SEO practitioners refer to this as “parasite SEO,” which makes little sense since the relationship is symbiotic.)

The quality of the third-party content is irrelevant in the assessment, as is the publisher’s intent.
Google’s Danny Sullivan stressed that the spam team works hard to decide whether a site is guilty of reputation abuse.

Sullivan stressed that Google “does not hate freelancers.” In some freelance and publisher circles, there is a fear that freelance work is third-party content in Google’s eyes.

He also stated that Google penalizes spam practices, not people or sites (although a site would obviously be affected when penalized for using spam practices).

Freelancers who write for a penalized site will not have their work on other sites penalized, nor will those sites be affected simply because they carry content from someone who has also written for a penalized site.

How big is the spam problem for Google?

50% of the documents that Google encounters on the web are spam, said Sullivan, but 99% of search sessions are free of spam. These numbers are, of course, based on Google’s definitions and measurements. You may view it differently.

Brand, ranking, and E-E-A-T

Brand has become a catch-all cure-all term in the SEO industry. It isn’t quite that, but it is important. I define a brand as a widely recognized solution for a specific problem. People prefer brands when looking for a solution to a problem. It takes work to become a brand.

Sullivan said Google’s “systems don’t say brand at all,” but “if you’re recognized as a brand… that correlates with search success.” The signals Google collects tend to line up with brands.

Mueller pointed out that you can’t “add E-E-A-T” to a website; it’s not something you sprinkle on a page. Putting an “About” link on your site isn’t helping you rank.

The acronym E-E-A-T stands for experience, expertise, authoritativeness, and trustworthiness. Google primarily uses these concepts to explain to its search quality raters what to look for when evaluating the quality of pages linked from the search results.

Google’s search quality rater guidelines give many examples of companies with high E-E-A-T. For example, Visa, Discover, and HSBC are mentioned as companies with high E-E-A-T for credit card services. Brand is a more common and sweeping, if vaguer, way to say E-E-A-T.

Visa did not become a brand by having a Ph.D. in financial services write essay-length web pages on the wonders of credit cards. E-E-A-T, like brand, is an outcome, not an action.

Sullivan said Google is looking for ways ”to give more exposure to smaller publishers.” He also noted that many sites just aren’t good. He asked why Google should rank your site if thousands of other sites are just as good or better.

This is where brand becomes the answer. Become a widely recognized solution, and the signals Google uses will work to your advantage (though not always to the extent you want them to work).

Sundry Points Made at the Event

  • Google doesn’t have a notion of toxic links
  • Use the link disavow tool only to disavow links you have purchased
  • Google does not crawl or index every page on every site
  • If a page is not “helpful,” the answer is not simply to make it longer. More unhelpful content does not make the page more helpful.
  • Google Partners does not have an internal connection with Google for SEO help.
  • You don’t need to fill your site with topical content to rank. This is especially true for local SEO (the specific question was about “content marketing,” which is a lot broader than putting content on your site, but Mueller spoke only about on-site content).
  • Duplicate content is more of a technical issue than a quality issue. Duplicate content can have an unwanted impact on crawling, indexing, and ranking.
  • Structured data is much more precise than an LLM. It’s also much cheaper.
  • Structured data isn’t helpful unless a lot of people use it. Google’s structured data documentation tells you which types Google uses and how.
  • Structured data is not a ranking signal but can help you get into SERP features that drive traffic.
  • Google Search does not favor websites that buy AdWords or carry AdSense.
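As an illustration of the structured data points above, here is the general shape of JSON-LD markup for an article page. The field values below are placeholders (the author name is hypothetical), and Google's structured data documentation lists the types and properties it actually consumes:

```javascript
// Illustrative JSON-LD object for an Article page (values are placeholders).
const articleSchema = {
  '@context': 'https://schema.org',
  '@type': 'Article',
  headline: 'User Needs Drive SERP Evolution',
  datePublished: '2025-03-24',
  author: { '@type': 'Person', name: 'Jane Doe' }, // placeholder author
};

// A page would embed this serialized object inside a
// <script type="application/ld+json"> tag in its <head>.
const jsonLd = JSON.stringify(articleSchema);
console.log(jsonLd);
```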

 

The panel at Google Search Central Live NYC 2025 consisted of Daniel Weisberg, Eric Barbera, John Mueller, Ryan Levering, and Danny Sullivan.

]]>
https://blogs.perficient.com/2025/03/24/user-needs-drive-serp-evolution/feed/ 0 379117
The Power of Storytelling https://blogs.perficient.com/2025/02/24/the-power-of-storytelling/ https://blogs.perficient.com/2025/02/24/the-power-of-storytelling/#comments Mon, 24 Feb 2025 17:20:38 +0000 https://blogs.perficient.com/?p=377671

How to Make Every Presentation Unforgettable

Ever sat through a presentation that felt like watching paint dry? Or maybe you’ve delivered one and noticed people checking their phones instead of hanging onto your every word? The secret to captivating any audience isn’t just great slides or polished data—it’s storytelling.

Storytelling transforms dull facts into memorable moments, making your ideas stick and your message resonate. Whether you’re giving a sales pitch, leading a team, or selling yourself in an interview, your ability to tell a compelling story can set you apart. Let’s dive into how storytelling makes you a more engaging, influential, and confident speaker!

Why Storytelling Matters in Presentations & Speeches

Stories evoke emotions, and emotions drive decisions. When you tell a good story:

  • People remember you – Facts are forgettable, emotions aren’t!
  • You hold attention longer – No one zones out during a great story.
  • Your message sticks – A well-told story makes complex ideas easy to grasp.
  • You build stronger connections – People relate to personal and real-life experiences.

Imagine two people pitching the same product:

  • One lists all the features and numbers
  • The other tells a story about how the product changed a customer’s life

Which one would you remember? Exactly.

Becoming the Most Charismatic Version of Yourself

Charisma isn’t something you’re born with—it’s something you develop through practice. Storytelling helps you:

  • Speak with purpose – Every word should serve a goal.
  • Use body language effectively – Gestures, expressions, and voice variations bring your story to life.
  • Master the pause – Silence isn’t awkward; it builds suspense.
  • Engage your audience – Ask questions, invite participation, and make it a two-way experience.

The best speakers aren’t just knowledgeable—they’re magnetic. That’s what storytelling does for you.

How to Make Virtual Sales Presentations Engaging

In the digital world, keeping attention is harder than ever (hello, multitasking!). Here’s how to make sure your virtual sales presentations don’t get lost in the sea of emails and notifications:

  • Start with a compelling hook – A shocking stat, a personal anecdote, or a powerful question.
  • Use visual storytelling – Slides should support your story, not distract from it.
  • Keep it interactive – Polls, questions, and discussions make it engaging.
  • Speak with energy – Your enthusiasm must compensate for the digital barrier.
  • Use the power of silence – Pause before key points to create anticipation.

Remember: People might forget what you say, but they’ll always remember how you made them feel—even through a screen!

How to Be a Purpose-Driven Leader Through Storytelling

The best leaders aren’t just great decision-makers; they inspire action. Storytelling is your tool to:

  • Define your mission: Explain WHY you do what you do.
  • Motivate your team: Share real-life stories that reinforce company values.
  • Make change feel personal: People follow stories, not orders.
  • Humanize leadership: Vulnerability in storytelling builds trust.

Think about it—Martin Luther King Jr. didn’t say, “I have a plan.” He said, “I have a dream.” And that’s why people followed him.

How to Pitch Yourself (And Win Every Time!)

Whether you’re in an interview, networking, or selling an idea, you need a powerful personal pitch. Here’s how to craft one:

  1. Start with a hook – A personal story or a surprising fact.
  2. Highlight what makes you unique – What’s your “superpower”?
  3. Use a success story – Share a real example of how you made an impact.
  4. End with a strong call to action – Make it clear what you want next.

Example: Instead of saying, “I’m a sales professional with 5 years of experience,” try:
“A few years ago, I closed a deal that no one thought was possible. It taught me that persistence and creativity are my strongest assets—and I bring that same mindset to every client I work with.”

Which one sounds more memorable?

How to Influence Audiences Like a Pro

Great speakers don’t just inform—they influence. Here’s how you can do it too:

  • Use emotional appeal – Logic makes people think, but emotions make them act.
  • Frame your message as a story – Even data is more powerful when wrapped in a narrative.
  • Create relatability – “I’ve been in your shoes before…” instantly builds connection.
  • Master the art of contrast – Show “before vs. after” transformations.
  • Close with impact – End on a strong, inspiring note that sticks.

The Secret to Confidence: Believe in Your Own Story

Confidence isn’t just about what you say—it’s about how you feel when you say it. If you believe in your story, others will, too.

  • Prepare, but don’t over-script – Natural storytelling beats robotic speeches.
  • Rehearse with feedback – Record yourself, practice with friends, and tweak your delivery.
  • Visualize success – Imagine the audience hanging onto every word.
  • Embrace nervous energy – It’s just excitement in disguise!

When you step onto that stage—or that Zoom call—own your story. Speak like the expert you are, and watch how you captivate your audience effortlessly.

Final Thoughts: Your Story is Your Superpower

Every great speech, every unforgettable pitch, and every inspiring leader has one thing in common—storytelling. When you learn to craft compelling stories, you don’t just speak—you leave a lasting impression.

So, the next time you step up to present, don’t just share information. Tell a story. Engage, influence, and, most importantly, make it unforgettable.

]]>
https://blogs.perficient.com/2025/02/24/the-power-of-storytelling/feed/ 2 377671
Why Recoil Outperforms Redux and Context API https://blogs.perficient.com/2024/12/18/recoil-the-better-alternative-to-redux-and-context-api/ https://blogs.perficient.com/2024/12/18/recoil-the-better-alternative-to-redux-and-context-api/#respond Wed, 18 Dec 2024 11:24:42 +0000 https://blogs.perficient.com/?p=373872

Why Choose Recoil Over Redux or Context API?

State management is a cornerstone of building dynamic and interactive web applications, and React developers have a plethora of tools at their disposal. Among these, Redux and Context API have long been popular choices, but Recoil is emerging as a modern alternative designed to simplify state management. In this blog, we’ll explore why Recoil may be a better choice for your project compared to Redux or Context API, focusing on its advantages and alignment with the latest techniques in React development.

Understanding Recoil

Recoil is a state management library developed by Facebook, specifically designed to work seamlessly with React. It introduces a new paradigm for managing both local and global state using atoms and selectors, providing flexibility and performance optimizations.

Example: Defining and Using an Atom in Recoil

import { atom, useRecoilState } from 'recoil';

// Define an atom
const countState = atom({
  key: 'countState', // Unique ID
  default: 0, // Default value
});

function Counter() {
  const [count, setCount] = useRecoilState(countState);
  return (
    <div>
      <p>Count: {count}</p>
      <button onClick={() => setCount(count + 1)}>Increment</button>
    </div>
  );
}

Comparison with Redux

Redux Overview

Redux is a mature state management library known for its centralized store, strict unidirectional data flow, and middleware support. While powerful, it can be verbose and complex for managing even simple state scenarios.

Advantages of Recoil Over Redux

  • No Boilerplate: Unlike Redux, which requires actions, reducers, and a store setup, Recoil simplifies state management with minimal setup. You define an atom or selector and start using it immediately.

Example: Comparing Recoil Atom with Redux Reducer

Recoil Approach:

const textState = atom({
  key: 'textState',
  default: '',
});

function TextInput() {
  const [text, setText] = useRecoilState(textState);
  return <input value={text} onChange={(e) => setText(e.target.value)} />;
}

Redux Approach:

import { useSelector, useDispatch } from 'react-redux';

// Action
const setText = (text) => ({
  type: 'SET_TEXT',
  payload: text,
});

// Reducer
function textReducer(state = '', action) {
  switch (action.type) {
    case 'SET_TEXT':
      return action.payload;
    default:
      return state;
  }
}

// Component
function TextInput() {
  const text = useSelector((state) => state.text);
  const dispatch = useDispatch();
  return <input value={text} onChange={(e) => dispatch(setText(e.target.value))} />;
}
  • Decentralized State Management: Recoil allows you to localize state using atoms, which can act as independent pieces of state. This contrasts with Redux’s centralized store, making it easier to manage state in large and modular applications.
  • Built-in Asynchronous Support: Recoil’s selectors can handle asynchronous logic out of the box, eliminating the need for additional middleware like Redux Thunk or Saga.

Example: Handling Async Logic in Recoil

import { selector, useRecoilValue } from 'recoil';

const asyncDataSelector = selector({
  key: 'asyncDataSelector',
  get: async () => {
    const response = await fetch('https://api.example.com/data');
    return response.json();
  },
});

function AsyncDataComponent() {
  const data = useRecoilValue(asyncDataSelector);
  return <div>Data: {JSON.stringify(data)}</div>;
}

When Redux Might Still Be Useful

Redux is a better fit for applications requiring strict control over state changes, such as those with complex business logic or a need for middleware extensibility.

Comparison with Context API

Context API Overview

The Context API is a built-in React feature used for sharing state globally without prop drilling. While simple to use, it is not optimized for frequent state updates.

Advantages of Recoil Over Context API

  • Optimized Performance: Recoil’s granular subscription model ensures that only components relying on specific atoms or selectors re-render. In contrast, Context API re-renders all consuming components when the context value changes.

Example: Comparing Recoil with Context API

Using Context API:

const CountContext = React.createContext();

function CounterProvider({ children }) {
  const [count, setCount] = React.useState(0);
  return (
    <CountContext.Provider value={{ count, setCount }}>
      {children}
    </CountContext.Provider>
  );
}

function Counter() {
  const { count, setCount } = React.useContext(CountContext);
  return <button onClick={() => setCount(count + 1)}>Count: {count}</button>;
}

Using Recoil:

import { atom, useRecoilState } from 'recoil';

const countState = atom({
  key: 'countState',
  default: 0,
});

function Counter() {
  const [count, setCount] = useRecoilState(countState);
  return <button onClick={() => setCount(count + 1)}>Count: {count}</button>;
}
  • Scalability: Managing multiple contexts in a large application can become cumbersome. Recoil’s atom-based structure is inherently modular and scales effortlessly.
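The performance difference comes down to subscription granularity: a context notifies every consumer when its value changes, while an atom keeps its own subscriber list, so only readers of that atom are notified. A framework-free sketch of the idea (not Recoil's actual implementation):

```javascript
// Each atom tracks its own subscribers, mimicking Recoil's granular updates.
function createAtom(defaultValue) {
  let value = defaultValue;
  const subscribers = new Set();
  return {
    get: () => value,
    set(next) {
      value = next;
      subscribers.forEach((fn) => fn(value)); // notify only this atom's readers
    },
    subscribe(fn) {
      subscribers.add(fn);
      return () => subscribers.delete(fn);    // unsubscribe handle
    },
  };
}

const countAtom = createAtom(0);
const textAtom = createAtom('');

let countRenders = 0;
let textRenders = 0;
countAtom.subscribe(() => countRenders++);
textAtom.subscribe(() => textRenders++);

countAtom.set(1); // only the count subscriber "re-renders"
console.log(countRenders, textRenders); // 1 0
```

With a single context holding both values, both subscribers would have fired; per-atom subscriber lists are what keep unrelated components from re-rendering.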

Newest Techniques and Recoil’s Alignment

Concurrent Rendering Compatibility

Recoil is designed to be compatible with React’s concurrent features, such as Suspense and transitions. This ensures a smooth user experience even in complex applications with heavy state updates.

Server-Side Rendering (SSR)

Recoil supports SSR, making it a good choice for applications using frameworks like Next.js. This allows you to hydrate the Recoil state on the server and share it with the client seamlessly.

Composable Architecture

With the rise of component-driven architectures, Recoil’s atom-based model fits naturally into React’s ecosystem, allowing developers to encapsulate state logic alongside UI components.

Key Scenarios Where Recoil Shines

  • Dynamic Forms: Easily manage form states where fields dynamically depend on each other.
  • Real-Time Applications: Handle frequent state updates with minimal re-renders.
  • Large-Scale Applications: Manage modular and distributed state without the complexity of Redux.
  • Asynchronous Data Fetching: Simplify API integrations with built-in async capabilities.
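The dynamic-forms case maps naturally onto derived state: one field's options are computed from another field's value, the way a Recoil selector derives from an atom. A plain-JavaScript sketch of that derivation, using hypothetical country/city data rather than Recoil's API:

```javascript
// Derived state: city options are recomputed from the selected country,
// mirroring how a Recoil selector would derive from a country atom.
const citiesByCountry = {
  US: ['New York', 'Chicago'],
  IN: ['Mumbai', 'Pune'],
};

// A selector is just a pure function of upstream state.
const cityOptionsSelector = (country) => citiesByCountry[country] ?? [];

console.log(cityOptionsSelector('US')); // ['New York', 'Chicago']
console.log(cityOptionsSelector('IN')); // ['Mumbai', 'Pune']
console.log(cityOptionsSelector('FR')); // [] — no data for this country
```

In Recoil this function body would live in a selector's `get`, and any component reading it would re-render automatically when the country atom changes.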

Conclusion

Recoil’s simplicity, performance optimisations, and advanced features make it a compelling alternative to Redux and Context API for state management in React applications. While Redux excels in strict and centralized state control and Context API is perfect for simpler use cases, Recoil strikes a balance that caters to modern React development needs.

If you’re starting a new project or refactoring an existing one, consider trying Recoil to experience its benefits firsthand. With its growing ecosystem and active community, Recoil is poised to become a mainstay in the React state management landscape.

]]>
https://blogs.perficient.com/2024/12/18/recoil-the-better-alternative-to-redux-and-context-api/feed/ 0 373872
React v19: A Game-Changer for React Developers ! https://blogs.perficient.com/2024/12/18/react-v19-a-game-changer-for-react-developers/ https://blogs.perficient.com/2024/12/18/react-v19-a-game-changer-for-react-developers/#respond Wed, 18 Dec 2024 09:20:10 +0000 https://blogs.perficient.com/?p=373851

React 19 has officially been released as a stable version on December 5, 2024. This update introduces amazing features that enhance the developer experience and application performance. In this blog, we’ll explore the most impactful features of React 19 with examples to understand how they change the way we build React applications.

Key Features of React 19

1. Server Components

Server Components improve performance and SEO by moving component rendering to the server. This eliminates the need for heavy computations and library downloads on the client side, enhancing user experience and reducing latency.

Example:

Imagine a component hierarchy:

<CourseWrapper>
  <CourseList />
  <Testimonials />
</CourseWrapper>

Let’s assume all three components—CourseWrapper, CourseList, and Testimonials—are making individual network calls. Traditionally, when API requests are made, each component has to wait for its respective response before rendering. If CourseWrapper takes longer to receive its data, CourseList and Testimonials cannot render until CourseWrapper completes. Even if Testimonials fetches its data faster, it still has to wait for the slowest component, causing a delay in rendering and resulting in poor user experience with visible network waterfalls and high latency.

Now, consider the same scenario with Server Components. The fundamental difference lies in relocating the components to the server. Here, both the component logic and the data they need are collocated on the server. When a client makes a request, the server processes these components together, fetches the necessary data, and sends back the fully rendered components to the client. This ensures that all three components—CourseWrapper, CourseList, and Testimonials—are rendered and delivered simultaneously, minimizing latency and improving user experience.

With Server Components, the UI is effectively “pre-baked” on the server, reducing network overhead and ensuring seamless rendering on the client side.

2. Server Actions

Server Actions enable computations and tasks, like form handling, to be executed on the server. By annotating functions with "use server", React moves these tasks server-side for better performance and security. Let’s understand this with an example.

Traditional Form Handling

In a traditional setup, a form might look like this:

<form onSubmit={performSearch}>
  <input type="text" name="searchTerm" />
  <button type="submit">Search</button>
</form>
function performSearch(event) {
  // Create form data from the current target event
  // Perform the search on the client side
}

Here, performSearch is executed on the client side. It processes the input values, performs a client-side search, or makes an API call to the server. This approach, while common, keeps the logic and data processing on the client.

Server Actions in React 19

With React 19, you can move this logic to the server. Using the action attribute on the form and annotating the corresponding function with the "use server" directive, you can execute these computations on the server side. Here’s how it works:

<form action={performSearch}>
  <input type="text" name="searchTerm" />
  <button type="submit">Search</button>
</form>
"use server";

const performSearch = async (formData) => {
  const term = formData.get("searchTerm");
  // Perform the search on the server side
};

Key Advantages

The "use server" directive indicates that this is a special function executed on the server, making it ideal for tasks like sending emails, downloading PDFs, or performing complex calculations such as invoice generation. By handling such operations on the server, you reduce client-side processing, improve security, and simplify complex tasks.
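A server action like performSearch receives the form's fields as a FormData object, and reading values by name works the same in any modern runtime. A quick standalone check, relying on the FormData global built into Node 18+ and browsers:

```javascript
// FormData is available natively in modern runtimes (Node 18+, browsers).
const formData = new FormData();
formData.append('searchTerm', 'react 19');

// A server action reads submitted fields by their input's name attribute:
const term = formData.get('searchTerm');
console.log(term); // 'react 19'

// Missing fields come back as null, so guard before using them.
console.log(formData.get('missing')); // null
```

This is the same `formData.get("searchTerm")` call shown in the performSearch example, exercised outside React.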

Important Notes

  • "use server" is exclusively for server actions. Don’t confuse it with server components, which handle rendering on the server.
  • Server actions provide seamless integration with the form’s action attribute, offering a clean and efficient way to manage server-side logic.

3. Document Meta

When developing an application, it’s common to include metadata such as the app’s title, description, Open Graph images, and social cards (e.g., for X or LinkedIn). Dynamically setting this metadata often required a lot of boilerplate code.

Traditional Approach

Previously, metadata had to be manually managed using useEffect or third-party libraries like react-helmet. For example:

useEffect(() => {
  document.title = "Blog List";
  const metaDescriptionTag = document.querySelector('meta[name="description"]');
  if (metaDescriptionTag) {
    metaDescriptionTag.setAttribute('content', "Blog description");
  }
}, []);

This approach involves careful handling, especially when server-rendering a React application, to ensure metadata is applied correctly.

Native Metadata Support in React 19

React 19 simplifies this process by introducing native support for rendering document metadata directly in components. Metadata tags like <title>, <meta>, and <link> can now be included within the JSX of a component and will be automatically hoisted to the <head> section of the document during rendering.

Here’s how it works:

const BlogListPage = () => {
  return (
    <>
      <title>Blog List</title>
      <meta name="description" content="Blog description" />
      <div>
        <h1>Products</h1>
        <ul>
          <li>Product 1</li>
          <li>Product 2</li>
          <li>Product 3</li>
        </ul>
      </div>
    </>
  );
};

When React renders the BlogListPage component, it automatically detects the <title>, <meta>, and <link> tags and moves them to the <head> section of the document.

Benefits

  • Less Boilerplate: Eliminates the need for custom logic or external libraries to handle metadata.
  • Seamless Server Rendering: Ensures metadata is correctly applied during server rendering, improving SEO and user experience.
  • Cleaner Code: Embeds metadata directly in the component, making the code more intuitive and easier to maintain.

With React 19, managing document metadata has become straightforward, allowing developers to focus on building great user experiences without getting bogged down by setup complexities.

4. Enhanced Hooks

As React developers, we are already familiar with hooks and how they simplify state and lifecycle management. With React 19, a new set of APIs and hooks has been introduced to further enhance the development experience. These include:

  • use()
  • useFormStatus()
  • useActionState()
  • useOptimistic()

1. use()

The use() API allows you to work with promises. When you pass a promise to use(), it resolves the promise and provides you with the resolved value. Unlike other hooks, use() can be called conditionally, which means you can read context values even after early returns.

const theme = use(ThemeContext);

This makes use() a versatile tool for handling asynchronous data or dynamic context values effectively.

2. useFormStatus()

useFormStatus() reads the status of the parent <form> and provides four states: pending, data, method, and action.

import { useFormStatus } from 'react-dom';

function DesignButton() {
  const { pending } = useFormStatus();
  return (
    <button type="submit" disabled={pending}>
      {pending ? "Submitting…" : "Get Users"}
    </button>
  );
}

With useFormStatus(), you can create more responsive and dynamic UI elements, such as buttons that display submission status without requiring a full UI loader.

3. useActionState()

useActionState() is a powerful hook for managing validations and handling state transitions in actions. It accepts an “Action” function and returns a wrapped Action. When called, useActionState() provides:

  • The last result of the Action as data
  • The pending state of the Action as pending

This composition makes it easier to manage and track the lifecycle of an action.

For detailed usage, refer to the official React documentation on useActionState.

4. useOptimistic()

The useOptimistic() hook simplifies the implementation of optimistic UI updates. Optimistic updates allow changes to appear immediately in the UI, even before server confirmation. If the server request fails, the state can be reverted to its previous value.

const [optimisticState, setOptimisticState] = useOptimistic(initialState, reducer);

By using useOptimistic(), you can enhance user experience with a more responsive interface while maintaining control over error handling and state rollback. This hook makes React applications more robust and user-friendly.

For more information, see the docs for useOptimistic.
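The pattern behind useOptimistic() can be expressed without React: apply the optimistic value immediately, then either keep the server-confirmed value or roll back when the request fails. A framework-free sketch of that rollback logic (not the hook's real implementation):

```javascript
// Optimistic update: apply the new value immediately, revert on failure.
async function optimisticUpdate(state, optimisticValue, request) {
  const previous = state.value;
  state.value = optimisticValue;   // UI reflects the change right away
  try {
    state.value = await request(); // server confirms the final value
  } catch {
    state.value = previous;        // request failed: roll back
  }
  return state.value;
}

// Success: the server echoes the optimistic value.
const likes = { value: 10 };
optimisticUpdate(likes, 11, async () => 11)
  .then((v) => console.log(v)); // 11

// Failure: the state rolls back to the previous value.
const stars = { value: 5 };
optimisticUpdate(stars, 6, async () => { throw new Error('network'); })
  .then((v) => console.log(v)); // 5
```

The hook wires this same apply-then-settle flow into React's render cycle, so the optimistic value shows instantly and the rollback happens automatically.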

5. ref as a Prop

The usage of ref in React has been an essential feature for directly accessing DOM elements, such as focusing an input field or interacting with specific elements. Previously, passing a ref through a component hierarchy required the use of React.forwardRef. While effective, this approach added boilerplate and complexity.

With React 19, working with ref has become significantly more streamlined. You no longer need forwardRef for passing ref into components. Instead, you can directly pass it as a prop, simplifying your code.

Here’s how ref usage worked before React 19:

const FancyButton = React.forwardRef((props, ref) => (
  <button ref={ref}>
    {props.children}
  </button>
));

In React 19, you can directly pass the ref as a prop:

const FancyButton = ({ ref, children }) => (
  <button ref={ref}>
    {children}
  </button>
);

This change reduces the need for additional abstractions and makes handling ref more intuitive, improving developer productivity and code readability.

6. Asset Loading

When rendering a view in React, you might notice that styles load first, followed by fonts and images. This sequential loading can cause flickering in the UI, leading to a subpar user experience. React 19 aims to eliminate this issue with new resource-loading APIs like preload, preinit, prefetchDNS, and preconnect. These APIs provide fine-grained control over asset loading, ensuring that resources are loaded efficiently and in the background, significantly enhancing the user experience.

React 19 introduces these APIs to simplify the process of building seamless and visually consistent applications. By leveraging these features, you can eliminate flickering and ensure smooth asset loading in your application.

Here’s how you can use these APIs in practice:

import { prefetchDNS, preconnect, preload, preinit } from 'react-dom';

function MyComponent() {
  // Eagerly load and execute a script
  preinit('https://example.com/path/to/some/script.js', { as: 'script' });

  // Preload a font to ensure it's available when needed
  preload('https://example.com/path/to/font.woff', { as: 'font' });

  // Preload a stylesheet for faster style application
  preload('https://example.com/path/to/stylesheet.css', { as: 'style' });

  // Prefetch DNS for a host to reduce latency
  prefetchDNS('https://example.com');

  // Preconnect to a host when you anticipate making requests to it
  preconnect('https://example.com');
}

These APIs ensure that resources are loaded proactively and efficiently, reducing the need for manual optimization. By integrating them into your React application, you can deliver exceptional performance and a flicker-free user experience.

7. Improved Error Reporting

React 19 introduces significant improvements to error handling, addressing duplication and providing more robust options for managing caught and uncaught errors. Previously, when an error occurred during rendering and was caught by an Error Boundary, React would:

  1. Throw the error twice—once for the original error and again after failing to recover automatically.
  2. Log the error with console.error, including information about where it occurred.

This process resulted in three separate error reports for a single caught error, causing redundancy and potential confusion.

With React 19, this behavior is streamlined:

  • A single error log consolidates all relevant error information, simplifying debugging and error tracking.

Additionally, React introduces two new root options to complement the existing onRecoverableError callback. These options provide greater control over how errors are managed:

  • onCaughtError: Invoked when React catches an error within an Error Boundary.
  • onUncaughtError: Invoked when an error is thrown and not caught by an Error Boundary.
  • onRecoverableError: (Existing option) Invoked when an error is thrown but automatically recovered by React.

These enhancements ensure more efficient error handling, reduce unnecessary logs, and empower developers with precise tools to handle different types of errors gracefully. Whether you’re building error-resilient components or improving debugging processes, React 19 makes managing errors more intuitive and effective.

Conclusion

React 19 is a transformative leap for developers. With features like Server Components, Server Actions, enhanced hooks, and efficient asset loading, React 19 empowers developers to build faster, more efficient, and highly dynamic applications. If you haven’t already, it’s time to upgrade and unlock the full potential of React 19.

]]>
https://blogs.perficient.com/2024/12/18/react-v19-a-game-changer-for-react-developers/feed/ 0 373851
Optimizing E-commerce SEO: The Role of Product Information Management (PIM) https://blogs.perficient.com/2024/12/17/optimizing-e-commerce-seo-the-role-of-product-information-management-pim/ https://blogs.perficient.com/2024/12/17/optimizing-e-commerce-seo-the-role-of-product-information-management-pim/#comments Tue, 17 Dec 2024 22:02:44 +0000 https://blogs.perficient.com/?p=327689

A strong and successful search engine optimization (SEO) strategy is essential in the extremely competitive world of e-commerce today. You can increase the visibility, draw in more visitors, and raise conversion rates with the correct tools and strategies. Product information management (PIM) is a crucial tool for accomplishing these objectives.

What is PIM?

PIM provides a central repository for product information, ensuring that information is accurate, consistent, and up-to-date. This allows businesses to streamline the management of product data, such as descriptions, images, specifications, and other key information related to their products. Having this organized and easily accessible information can be extremely beneficial to businesses looking to improve their customer service, increase sales, and ultimately enhance their SEO performance.

By using PIM, businesses can save time and resources by reducing manual work, increasing accuracy, and eliminating redundant data entry. A PIM system can also help with managing different versions of product descriptions, images, and other data fields in different languages and currencies. This allows businesses to quickly launch products into new markets and keep them updated across multiple channels.

How can PIM help improve your SEO?

Product Information Management (PIM) systems are designed to help businesses store, manage, and distribute product information in an efficient and organized manner. PIM has become a popular tool for businesses looking to improve their SEO rankings.

PIM can help improve your SEO rankings in several ways:

  1. High-quality Content: PIM can help ensure that product information is accurate, complete, and consistent, which can lead to better on-page optimization and search engine visibility.
  2. Enhanced Product Descriptions: PIM enables the creation of detailed and optimized product descriptions, which can help improve the relevance and quality of content for search engines.
  3. Better Keyword Targeting: PIM can provide insights into which keywords are most relevant for each product, enabling e-commerce websites to better target those keywords in their product pages and other content.
  4. Improved Taxonomy: Taxonomy helps to improve the customer experience by making it easier for customers to find what they are looking for, and to compare products based on relevant attributes. In addition, a well-structured taxonomy can also help to improve search engine optimization (SEO) by increasing the relevance of search results, which can drive more traffic to a company’s website.
  5. Cross-Channel Distribution: PIM systems also make it easy to distribute your product information across multiple channels. This helps increase the visibility of your product pages and will help improve your SEO rankings.
  6. Faster and More Efficient SEO Updates: PIM can also help make SEO updates faster and more efficient. With PIM, you can quickly and easily make changes to your product information, which can then be automatically updated across all of your sales channels. This saves time and reduces the risk of errors, making it easier to optimize your product pages for search engines. With PIM, you can keep your website up-to-date with the latest product information and take advantage of new SEO opportunities as they arise.
  7. Asset Management: Asset management in a Product Information Management (PIM) system refers to the process of organizing and managing digital assets, such as images, videos, and other multimedia files, associated with a product. This includes storing, categorizing, and versioning these assets to ensure that they are easily accessible and up-to-date. Metadata can also be attached to digital assets to help improve search.

This can lead to improved organic search traffic and more conversions for your business. Businesses often ask, however, how they can tell whether the optimization work done in PIM is actually helping. One way to find out is by utilizing digital shelf analytics.

inriver’s digital shelf analytics tool, Evaluate, significantly enhances SEO optimization in several ways:

  1. Content Compliance: Evaluate ensures that your product information is accurate and consistent across all channels, which is crucial for SEO. Accurate data helps search engines understand your products better, improving visibility.
  2. Keyword Optimization: The tool tracks keyword performance and helps you optimize product listings for better search rankings. This includes monitoring keyword search and share-of-shelf.
  3. Real-Time Insights: Evaluate provides real-time insights into how your products are performing on the digital shelf. This includes monitoring product search rankings, competitor pricing, and stock levels, allowing you to make data-driven decisions to improve SEO.
  4. Engagement Intelligence: By analyzing customer interactions and engagement with your product listings, Evaluate helps you understand what works and what doesn’t. This information is vital for refining your SEO strategy to attract more traffic and improve conversions.
  5. Automated Monitoring: The tool uses smart automation to constantly monitor your products, providing actionable insights that help you stay ahead of the competition and ensure your product information is always optimized for search engines.

Using inriver Evaluate, you can take control of your digital shelf, drive revenue growth, and enhance your SEO efforts with precise, actionable data.

By following these recommendations, you can make sure that you get the most out of your PIM system and improve your SEO performance. PIM can help you stay ahead of the competition in the e-commerce space. So if you’re looking to improve your SEO performance and reach more customers, it’s time to invest in PIM. For more information on this, contact our experts today.

]]>
https://blogs.perficient.com/2024/12/17/optimizing-e-commerce-seo-the-role-of-product-information-management-pim/feed/ 1 327689
SEO, GEO, SGE, and Beyond the Horizon: Tips, Tricks, and Takeaways from Sitecore Symposium 2024 https://blogs.perficient.com/2024/10/01/seo-geo-sge-and-beyond-the-horizon-tips-tricks-and-takeaways-from-sitecore-symposium-2024/ https://blogs.perficient.com/2024/10/01/seo-geo-sge-and-beyond-the-horizon-tips-tricks-and-takeaways-from-sitecore-symposium-2024/#respond Tue, 01 Oct 2024 19:26:45 +0000 https://blogs.perficient.com/?p=370014

If you’ve been following the massive changes happening at the intersection of AI and SEO, you’re probably asking yourself: What should we do next? The good news is, while the landscape is evolving, there are clear steps you can take to ensure your SEO strategy thrives in this new era, especially with the rise of AI-powered search experiences like Google’s Search Generative Experience (SGE).

Important Metrics to Track to Help with SEO and SGE

  • Organic Traffic – Monitor how many sessions and visitors are coming to your site via organic traffic.
  • Keyword Performance – Ensure you have a short and long-tail keyword strategy to watch performance. This should include tracking rankings, clicks, impressions and CTR for keyword performance.
  • Direct/Referral Traffic – This will be important for getting a pulse on overall performance. Zero-click studies show organizations may see a decrease in overall organic performance, but direct/referral traffic should remain the same or increase based on efforts to strengthen brand awareness by appearing in AI results.

Checklists
Top 5 things you should do if you have no SEO SMEs on staff

  1. Invest in a subscription to SEO software like SEMRush or Moz and audit your site for on-page errors.
  2. Utilize canonical tags to avoid duplicate content issues.
  3. Prioritize on-page errors, ensuring meta titles, descriptions, H-tags, and image alt text are optimized.
  4. Follow mobile-first best practices during site implementation and build.
  5. Focus on quality content and ensure you keep content up-to-date, relevant and valuable to improve engagement.

Top 10 things you should do if you have a one-person SEO program
In addition to the previous five activities, make sure to do the following:

  1. Utilize SEO software to perform keyword research to identify the most high-value keywords that your audience is searching for; update this list quarterly and make sure your writers and developers are working off the same list.
  2. Ensure you are strategically building internal links to improve site architecture.
  3. Evaluate accessibility scores and fix any potential errors that may be occurring.
  4. Monitor and track Core Web Vitals and page load performance to improve user experience and SEO rankings.
  5. Build high-quality backlinks to boost domain authority and contribute to your E-E-A-T.

Top 20 things you should do if you have an SEO team
With a dedicated SEO team, you can expand your strategy to include the following items:

  1. Build out correct localization of content to improve local search rankings.
  2. Utilize schema markup to help your content appear in AI results, featured snippets, and rich results.
  3. Leverage multimedia content, incorporating images, videos, and interactive content to improve engagement and rankings.
  4. Monitor and analyze metrics such as organic traffic, keyword position, domain authority, and conversion rates.
  5. Research and analyze competitors’ SEO strategies and find opportunities where you can outperform them.
  6. Optimize for image search, which includes descriptive file names, alt text, and captions to capture traffic from image search engines.
  7. Understand audience personas to guide content creation to meet their needs and intent.
  8. Leverage A/B testing and personalization to deliver relevant, high-quality content to users based on intent and needs.
  9. Stay informed on major search engine algorithm updates and adjust strategies to remain compliant with best practices.
  10. Integrate SEO with other marketing channels such as social media, paid advertising, and email marketing to drive organic and referral traffic.

Websites to bookmark to keep up to date on SEO news

Search Engine Land

Semrush

CMSWire

This blog was co-authored by Tiffany Laster, Lead Digital Strategist at Perficient.

]]>
https://blogs.perficient.com/2024/10/01/seo-geo-sge-and-beyond-the-horizon-tips-tricks-and-takeaways-from-sitecore-symposium-2024/feed/ 0 370014