Integrate Coveo Atomic CLI-Based Hosted Search Page into Adobe Experience Manager (AEM)
https://blogs.perficient.com/2025/06/18/integrate-coveo-atomic-cli-based-hosted-search-page-into-adobe-experience-manager-aem/
Wed, 18 Jun 2025 06:20:24 +0000

Getting Started with Coveo Atomic CLI

This section explains how to install, configure, and deploy a Coveo Atomic project using the Coveo CLI.

Install the CLI

To get started, install the Coveo CLI globally with npm:

npm install -g @coveo/cli

To ensure you’re always using the latest version, update it anytime with:

npm update -g @coveo/cli

Authentication

Once the CLI is installed, you will need to authenticate to your Coveo organization. Use the following command, replacing the placeholders with your specific organization details:

coveo auth:login --environment=prod --organization=<your-organization> --region=<your-region>

For example:

coveo auth:login --environment=prod --organization=blogtestorgiekhkuqk --region=us

Initialize a Coveo Atomic CLI Project

After logging in, initialize a new atomic project by running:

coveo atomic:init <project-name> --type=app

For example:

coveo atomic:init atomicInterface --type=app

Building and Deploying the Project

Once the project is ready, build the application:

npm run build

This command compiles your code and prepares it for deployment. It creates a production-ready build inside the dist/ folder.

Then deploy your interface to Coveo using:

coveo ui:deploy

After deployment, your search interface will be hosted on Coveo’s infrastructure, ready to embed anywhere, including Adobe Experience Manager (AEM).


Using and Initializing Atomic-Hosted-Page

This section guides you through using and initializing the Atomic-Hosted-Page component of your Coveo project.

Use Atomic-Hosted-Page

If you have customized your Atomic search page locally and deployed it to the Coveo infrastructure, then it will be listed in the Custom Deployment tab of the Search Pages (platform-ca | platform-eu | platform-au) page of the Administration Console. You can use the atomic-hosted-page component to consume it from anywhere on the web.

Initialize Atomic-Hosted-Page

Once you have installed the atomic-hosted-page or atomic-hosted-ui web component, you’ll need to add a script like the following to initialize the atomic-hosted-page component:

<head>
  <!-- ... -->
  <script>
    (async () => {
      await customElements.whenDefined('atomic-hosted-ui');
      const atomicHostedUIPage = document.querySelector('atomic-hosted-ui');

      await atomicHostedUIPage.initialize({
        accessToken: '<ACCESS_TOKEN>', 
        organizationId: '<ORGANIZATION_ID>', 
        pageId: '<PAGE_ID>' 
      });
    })();
  </script>
  <!-- ... -->
  <atomic-hosted-ui hosted-type="code"></atomic-hosted-ui> 
  <!-- ... -->
</head>

In this script, replace the placeholders with Coveo-specific details:

<ACCESS_TOKEN> (string) is an API key or platform token that grants the View all access level on the Search Pages domain in the target Coveo organization.
<ORGANIZATION_ID> (string) is the unique identifier of your organization (for example, mycoveoorganizationa1b23c).
<PAGE_ID> (string) is the unique identifier of the hosted page, which you can copy from the Administration Console.

Steps to Embed in Adobe Experience Manager (AEM)

  1. Log in to the Adobe AEM Author Instance
    Example URL: https://author-555.adobeaemcloud.com/

  2. Navigate to the AEM Sites Console
    Go to: https://author-555.adobeaemcloud.com/sites.html/content/blog/us/en/search-results
    The Sites Console in AEM is used to manage your website’s pages and structure.

  3. Create or Select the Page

    • Create new or use an existing page, for example: search-results.

    • Select the page’s checkbox → click Edit (top toolbar).

    • You’ll be redirected to the Page Editor: https://author-555.adobeaemcloud.com/editor.html/content/blog/us/en/search-results.html.

  4. Embed the Coveo Script:
    In the Page Editor, open the Content Tree on the left, select the Layout Container, and click the Configure (wrench icon) button.

  5. Choose Embed Type
    Choose Embed → iFrame. Paste your <atomic-hosted-page> script inside the iFrame (a complete example page is sketched below).
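
For reference, here is what a complete, minimal HTML page wrapping the initialization script could look like. This is a sketch, not Coveo’s official embed markup: the access token, organization ID, and page ID are placeholders, and the line that loads the atomic-hosted-page component library is left as a comment because the exact script URL or import depends on how you installed the component (see Coveo’s installation docs).

<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="utf-8" />
    <title>Search Results</title>
    <!-- Load the atomic-hosted-page / atomic-hosted-ui component library here,
         as described in Coveo's installation docs (CDN script or bundled import). -->
  </head>
  <body>
    <atomic-hosted-ui hosted-type="code"></atomic-hosted-ui>

    <script>
      (async () => {
        // Wait until the web component is registered, then initialize it.
        await customElements.whenDefined('atomic-hosted-ui');
        const atomicHostedUIPage = document.querySelector('atomic-hosted-ui');

        await atomicHostedUIPage.initialize({
          accessToken: '<ACCESS_TOKEN>',       // API key with "View all" on Search Pages
          organizationId: '<ORGANIZATION_ID>', // e.g., blogtestorgiekhkuqk
          pageId: '<PAGE_ID>'                  // copied from the Administration Console
        });
      })();
    </script>
  </body>
</html>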

  6. Preview and Publish the Page

    Click the Page Information icon → Publish Page. A confirmation alert indicates that the page will be live.

  7. View the Published Page
    Example URL: http://localhost:4502/content/blog/us/en/search-results.html

That’s it! You’ve successfully embedded your Coveo Atomic CLI-based hosted search page inside Adobe Experience Manager (AEM).

References:

Use a hosted page in your infrastructure | Coveo Atomic

 

Microsoft Copilot for Power Platform
https://blogs.perficient.com/2025/06/17/microsoft-copilot-for-power-platform/
Wed, 18 Jun 2025 03:25:39 +0000

Introduction to Copilot for Power Platform

Microsoft Copilot is a revolutionary AI-powered tool for Power Platform, designed to streamline the development process and enhance the intelligence of your applications. This learning path will take you through the fundamentals of Copilot and its integration with Power Apps, Power Automate, Power Virtual Agents, and AI Builder.

Copilot in Microsoft Power Platform helps app makers quickly solve business problems. A copilot is an AI assistant that can help you perform tasks and obtain information. You interact with a copilot by using a chat experience. Microsoft has added copilots across the different Microsoft products to help users be more productive. Copilots can be generic, such as Microsoft Copilot, and not tied to a specific Microsoft product. Alternatively, a copilot can be context-aware and tailored to the Microsoft product or application that you’re using at the time.


Microsoft Power Platform Copilots & Specializations.

Microsoft Power Platform has several copilots that are available to makers and users.

Microsoft Copilot for Microsoft Power Apps

Use this copilot to help create a canvas app directly from your ideas. Give the copilot a natural language description, such as “I need an app to track my customer feedback.” Afterward, the copilot offers a data structure for you to iterate until it’s exactly what you need, and then it creates pages of a canvas app for you to work with that data. You can edit this information along the way. Additionally, this copilot helps you edit the canvas app after you create it. Power Apps also offers copilot controls for users to interact with Power Apps data, including copilots for canvas apps and model-driven apps.

Microsoft Copilot for Microsoft Power Automate

Use this copilot to create automation that communicates with connectors and improves business outcomes. This copilot can work with cloud flows and desktop flows. Copilot for Power Automate can help you build automation by explaining actions, adding actions, replacing actions, and answering questions.

Microsoft Copilot for Microsoft Power Pages

Use this copilot to describe and create an external-facing website with Microsoft Power Pages. As a result, you have theming options, standard pages to include, and AI-generated stock images and relevant text descriptions for the website that you’re building. You can edit this information as you build your Power Pages website.

How Copilots Work

You can create a copilot by using a language model, which is like a computer program that can understand and generate human-like language. A language model can perform various natural language processing tasks based on a deep-learning algorithm. The massive amounts of data that the language model processes can help the copilot recognize, translate, predict, or generate text and other types of content.

Despite being trained on a massive amount of data, the language model doesn’t contain information about your specific use case, such as the steps in a Power Automate flow that you’re editing. The copilot shares this information for the system to use when it interacts with the language model to answer your questions. This context is commonly referred to as grounding data. Grounding data is use case-specific data that helps the language model perform better for a specific topic. Additionally, grounding data ensures that your data and IP are never part of training the language model.

Accelerate Solution Building with Copilot

Consider the various copilots in Microsoft Power Platform as specialized assistants that can help you become more productive. Copilot can help you accelerate solution building in the following ways:

  • Prototyping
  • Inspiration
  • Help with completing tasks
  • Learning about something

Prototyping

Prototyping is a way of taking an idea that you discussed with others or drew on a whiteboard and building it in a way that helps someone understand the concept better. You can also use prototyping to validate that an idea is possible. For some people, having access to your app or website can help them become a supporter of your vision, even if the app or website doesn’t have all the features that they want.

Inspiration

Building on the prototyping example, you might need inspiration on how to evolve the basic prototype that you initially proposed. You can ask Copilot for inspiration on how to handle the approval of which ideas to prioritize. Therefore, you might ask Copilot, “How could we handle approval?”

Help with Completing Tasks

By using a copilot to assist in your solution building in Microsoft Power Platform, you can complete more complex tasks in less time than if you do them manually. Copilot can also help you complete small, tedious tasks, such as changing the color of all buttons in an app.

Learn about Something

While building an app, flow, or website, you can open a browser and use your favorite search engine to look up something that you’re trying to figure out. With Copilot, you can learn without leaving the designer. For example, your Power Automate flow has a step to List Rows from Dataverse, and you want to find out how to check if rows are retrieved. You could ask Copilot, “How can I check if any rows were returned from the List rows step?”

Knowing the context of your flow, Copilot would respond accordingly.

Design and Plan with Copilot

Copilot can be a powerful way to accelerate your solution-building. However, it’s the maker’s responsibility to know how to interact with it. That interaction includes writing prompts to get the desired results and evaluating the results that Copilot provides.

Consider the Design First

While asking Copilot to “Help me automate my company to run more efficiently” seems ideal, that prompt is unlikely to produce useful results from Microsoft Power Platform Copilots.

Consider the following example, where you want to automate the approval of intake requests. Without significant design thinking, you might use the following prompt with Copilot for Power Automate.

[Screenshot: Copilot in a cloud flow]

“Create an approval flow for intake requests and notify the requestor of the result.”

This prompt produces the following suggested cloud flow.

[Screenshot: the suggested cloud flow generated by Copilot]

While the prompt is an acceptable start, you should consider more details that can help you create a prompt that might get you closer to the desired flow.

A good way to improve your success is to spend a few minutes on a whiteboard or other visual design tool, drawing out the business process.

[Diagram: whiteboard sketch of the intake request approval process]

Include the Correct Ingredients in the Prompt

A prompt should include as much relevant information as possible. Each prompt should include your intended goal, context, source, and outcome.

When you’re starting to build something with Microsoft Power Platform copilots, the first prompt that you use sets up the initial resource. For Power Apps, this first prompt is to build a table and an app. For Power Automate, this first prompt is to set up the trigger and the initial steps. For Power Pages, this first prompt sets up the website.

Consider the previous example and the sequence of steps in the sample drawing. You might modify your initial prompt to be similar to the following example.

“When I receive a response to my Intake Request form, start and wait for a new approval. If approved, notify the requestor saying so and also notify them if the approval is denied.”

Continue the Conversation

You can iterate with your copilot. After you establish the context, Copilot remembers it.

The key to starting to build an idea with Copilot is to consider how much to include with the first prompt and how much to refine and add after you set up the resource. Knowing this key consideration is helpful because you don’t need to get a perfect first prompt, only one that builds the idea. Then, you can refine the idea interactively with Copilot.

6 Unique Copilot Features in Power Platform

  1. Natural Language Power FX Formulas in Power Apps

Copilot enables developers to write Power FX formulas using natural language. For instance, typing /subtract datepicker1 from datepicker2 in a label control prompts Copilot to generate the corresponding formula, such as DateDiff(DatePicker1.SelectedDate, DatePicker2.SelectedDate, Days). This feature simplifies formula creation, especially for those less familiar with coding.

  2. AI-Powered Document Analysis with AI Builder

By integrating Copilot with AI Builder, users can automate the extraction of data from documents, such as invoices or approval forms. For example, Copilot can extract approval justifications and auto-generate emails for swift approvals within Outlook. This process streamlines workflows and reduces manual data entry.

  3. Automated Flow Creation in Power Automate

Copilot assists users in creating automated workflows by interpreting natural language prompts. For example, a user can instruct Copilot to “Create a flow that sends an email when a new item is added to SharePoint,” and Copilot will generate the corresponding flow. This feature accelerates the automation process without requiring extensive coding knowledge.

  4. Conversational App Development in Power Apps Studio

In Power Apps Studio, Copilot allows developers to build and edit apps using natural language commands. For instance, typing “Add a button to my header” or “Change my container to align center” enables Copilot to execute these changes, simplifying the development process and making it more accessible.

  5. Generative Topic Creation in Power Virtual Agents

Copilot facilitates the creation of conversation topics in Power Virtual Agents by generating them from natural language descriptions. For example, describing a topic like “Customer Support” prompts Copilot to create a topic with relevant trigger phrases and nodes, streamlining the bot development process.

  6. AI-Driven Website Creation in Power Pages

Copilot assists in building websites by interpreting natural language descriptions. For example, stating “Create a homepage with a contact form and a product gallery” prompts Copilot to generate the corresponding layout and components, expediting the website development process.

Limitations of Copilot

Each limitation is listed below with a description and an example.

  1. Limited understanding of business context: Copilot doesn’t always understand your specific business rules or logic. Example: You ask Copilot to "generate a travel approval form," but your org requires approval from both the team lead and HR; Copilot might only include one level of approval.
  2. Restricted to available connectors and data: Copilot can only access data sources that are already connected in your app. Example: You ask it to "show top 5 sales regions," but haven’t connected your Sales DB — Copilot can't help unless that connection is preconfigured.
  3. Not fully customizable output: You might not get exactly the layout, formatting, or logic you want — especially for complex logic. Example: Copilot generates a form with 5 input fields, but doesn't group them or align them properly; you still need to fine-tune it manually.
  4. Model hallucination (AI guessing wrong info): Like other LLMs, Copilot may “guess” when unsure — and guess incorrectly. Example: You ask Copilot to create a formula for filtering “Inactive users,” and it writes a filter condition that doesn’t exist in your dataset.
  5. English-only or limited language support: Most effective prompts and results come in English; support for other languages is limited or not optimized. Example: You try to ask Copilot in Hindi, and it misinterprets the logic or doesn't return relevant suggestions.
  6. Requires clean, named data structures: Copilot struggles when your tables/columns aren't clearly named. Example: If you name a field fld001_status instead of Status, Copilot might fail to identify it correctly or generate unreadable code.
  7. Security roles not respected by Copilot: Copilot may suggest features that would break your security model if implemented directly. Example: You generate a data view for all users, but your app is role-based — Copilot won’t automatically apply row-level security filters.
  8. No support for complex logic or multi-step workflows: It’s good at simple flows, but not for things like advanced branching, looping, or nested conditions. Example: You ask Copilot to automate a 3-level approval chain with reminder logic and escalation — it gives a very basic starting point.
  9. Limited offline or disconnected use: Copilot and generated logic assume you’re online. Example: If your app needs to work offline (e.g., for field workers), Copilot-generated logic may not account for offline sync or local caching.
  10. Only works inside Microsoft ecosystem: Copilot doesn’t support 3rd-party AI tools natively. Example: If your company uses Google Cloud or OpenAI directly, Copilot won’t connect unless you build custom connectors or use HTTP calls.

Build Good Prompts

Knowing how to best interact with the copilot can help get your desired results quickly. When you’re communicating with the copilot, make sure that you’re as clear as you can be with your goals. Review the following dos and don’ts to help guide you to a more successful copilot-building experience.

Do’s of Prompt-Building

To have a more successful copilot building experience, do the following:

  • Be clear and specific.
  • Keep it conversational.
  • Give examples.
  • Check for accuracy.
  • Provide contextual details.
  • Be polite.

Don’ts of Prompt-Building

  • Be vague.
  • Give conflicting instructions.
  • Request inappropriate or unethical tasks or information.
  • Interrupt or quickly change topics.
  • Use slang or jargon.

Conclusion

Copilot in Microsoft Power Platform marks a major step forward in making low-code development truly accessible and intelligent. By enabling users to build apps, automate workflows, analyze data, and create bots using natural language, it empowers both technical and non-technical users to turn ideas into solutions faster than ever.

It transforms how people interact with technology by:

  • Accelerating solution creation
  • Lowering technical barriers
  • Enhancing productivity and innovation

With built-in security, compliance with organizational governance, and continuous improvements from Microsoft’s AI advancements, Copilot is not just a tool—it’s a catalyst for transforming how organizations solve problems and deliver value.

As AI continues to evolve, Copilot will play a central role in democratizing software development and helping organizations move faster and smarter with data-driven, automated tools.

Perficient Shares Expertise On Digitally Transforming Extended Enterprises in Manufacturing
https://blogs.perficient.com/2025/06/16/perficient-shares-expertise-on-digitally-transforming-extended-enterprises-in-manufacturing/
Mon, 16 Jun 2025 20:13:31 +0000

For manufacturers, customer experience is no longer a differentiator, but a growth strategy. We’re proud to share that Perficient was selected as a company interviewed for Forrester’s May 2025 report, “Transform Your Manufacturing Extended Enterprise For Maximum Business Impact.”  We believe being interviewed for this report underscores our leadership in helping manufacturers modernize their digital ecosystems to deliver seamless, scalable, and personalized experiences across the value chain. 

Access the report here. 

Partnering With Manufacturing Companies As They Digitally Mature 

At Perficient, we have a proven track record in integrating complex systems like PLM (Product Lifecycle Management), PIM (Product Information Management), and OMS (Order Management Systems) to streamline product content and availability. 

As Forrester notes in the report, “As B2C digital experiences become more convenient and more B2B buying groups include younger members, manufacturers must curate, enrich, and distribute product content that supports compelling customer experiences at every touchpoint in the product buying group’s journey.” 

Our client-centric approach addresses these real-world challenges in manufacturing CX — from abandoned carts due to inaccurate lead times to inconsistent product content across channels. We also regularly publish thought leadership on leveraging AI and generative AI to scale personalization and automate product data management. 

“We’ve seen firsthand how aligning product data, systems, and customer expectations can unlock real business value, and we’re proud to contribute to the broader industry conversation on what it takes to lead in this space.” – Kevin Espinosa, Director of Digital Strategy and Manufacturing Industry Lead 

Perficient’s Vision for the Manufacturing Extended Enterprise 

Key insights the Perficient team provided in their interview for the Forrester report included the critical role content plays in customer experience, the responsible use of AI, strategies for overcoming organizational silos, the importance of inventory accuracy, and the path to digital maturity. 

Perficient’s position is that these changes should be addressed with urgency, and we are well-positioned to offer expert guidance. Below are some brief perspectives on each of these topics. 

Content is the Cornerstone of CX 

Accurate, timely, and localized product content is essential to building trust and driving conversions. This is especially true in B2B and D2C manufacturing, where buying decisions hinge on technical accuracy and availability.  

According to a recent Perficient survey of multiple manufacturers, commercial customers, and consumers of connected products, trust in a manufacturer is critical for end consumers making a buying decision. Providing consistent, accurate, and up-to-date information about products is a crucial first step in establishing that trust.

AI is a Game-Changer — But Only When Used Responsibly 

Generative AI is transforming areas like product design, search, product descriptions and recommendations, virtual agents, agent assistance, coding, content creation, narrative reporting, and process automation, but human oversight remains critical. Our vision for the future of genAI includes agentic architectures where AI agents validate each other’s outputs and flag for human review, ensuring quality and compliance before content reaches the customer. 

System Integration is the Secret to Speed and Scale 

Manufacturers often struggle with siloed systems and disconnected teams. We’ve helped clients overcome these barriers by integrating PLM and PIM systems, establishing clear ownership models and creating enterprise-wide catalogs that reduce friction and accelerate time-to-market. 

Syndication and Availability Drive Revenue 

A number of our clients have experienced abandoned carts and dissatisfied customers when ATP (available-to-promise) data and lead times are missing or inaccurate. Further, if the online ordering experience differs from the information customers receive when calling an agent, the brand’s credibility is at risk. From ATP data to local inventory visibility, we help manufacturers ensure that what’s promised online matches what’s available in reality.

Maturity Matters 

We help manufacturers assess and evolve their digital maturity, from foundational capabilities like centralized content management to advanced practices like AI-powered hyper-personalization and dynamic UI generation based on user personas. 

Invest in End-to-End Digital Transformation for Manufacturing 

We believe that Perficient being interviewed for the Forrester report is a validation of our commitment to helping manufacturers transform their extended enterprise for a differentiated customer experience. Whether you’re looking to modernize your product content strategy, integrate complex systems, or harness the power of AI, Perficient is the partner to help you lead with confidence. 

Explore our manufacturing industry expertise and our portfolio of commerce and contact center capabilities that make an excellent customer experience a reality. 

Jeff Molsen Leads With Knowledge and Empathy
https://blogs.perficient.com/2025/06/16/jeff-molsen-leads-with-knowledge-and-empathy/
Mon, 16 Jun 2025 18:03:27 +0000

Perficient’s innovative edge is fueled by talented colleagues who shatter boundaries and go beyond the expected. Our strength lies in the people who bring passion, expertise, and a deep knowledge and understanding to every project. Jeff Molsen, a senior technical architect, embodies this spirit. From the specialized aspects of his role to leading others through mentorship, Jeff’s journey shows how knowledge and empathy intertwine to drive success. 

[Photo: Jeff Molsen]

Keep reading to see how Jeff’s approach to leadership and community-building makes a difference at Perficient.

When did you join Perficient, and what are your responsibilities in your current role? 

I joined Perficient in March 2021, and I am a senior technical architect in the Adobe Business Unit. I mostly work with Adobe Experience Manager (AEM). I’m also a career counselor and wear many hats, so my day-to-day can vary quite a bit.  

As a senior technical architect, I’m responsible for the technical delivery of a project. I lead the dev team and direct them on the approach for developing solutions. I also work closely with the project leaders, such as a project manager and business analyst, to make sure we understand the client’s needs. I don’t do as much development work anymore, but I work on proofs of concept that are often related to new technologies.

Are you a member of any culture groups at Perficient? 

I joined the Women in Technology (WiT) Employee Resource Group (ERG) within my first month of joining Perficient. I’m part of the career growth committee, and I helped with the mentorship program early on in its development. I used my experience as an architect to lead developers in creating the matching program. My role was to guide them as well as write copy to make the content clear and concise. 

In the WiT Mentorship Program, both the mentor and mentee can learn from each other. You get to know people from across the organization, gain a different perspective, and expand your network. 

Can you tell us more about your experience participating in Perficient’s Leading With Impact Program? What were some of your takeaways? 

It was a very valuable experience, and they broke the program down into lots of manageable sections. We started out with learning how to lead yourself because you need to know how to manage yourself before you can effectively lead others. There was also education around communication and cross-cultural competence. Since Perficient is a global company with colleagues around the world, it’s important to know how to work with people of different backgrounds and communication styles.  

We also talked about how to have difficult conversations and approach others with a growth mindset. This comes down to approaching the situation with empathy and exploring ways to grow together to make positive changes.  

Throughout the program, we met in smaller cohorts consisting of about five or six colleagues. We did learning activities and discussed topics in these small groups. It was really helpful to hear different opinions and learn from each other’s experiences to deepen our understanding together. 

 

What advice would you give to colleagues who are starting their careers with Perficient?


My advice is to get involved with groups at Perficient, whether that’s an ERG or another kind of group. It can be particularly helpful if it’s something outside of your everyday role. This gives you an opportunity to interact with people across the organization and from different cultures. An important part of building your career is building your personal brand. Getting involved and working with all kinds of people helps you develop that. 

READ MORE: Perficient’s Award-Winning Culture Fosters Meaningful Connections

Whether big or small, how do you make a difference for our clients, colleagues, communities, or teams? 

I try to bring empathy and collaboration in everything I do. I do this for our clients by listening to understand what their needs are and then guiding them toward what will be the best long-term solution. As a career counselor, I make a difference through mentoring. I have always been passionate about teaching and advocating for others. 

What are your proudest accomplishments in your work?

I keep a log of my accomplishments, and this is something I recommend to everyone because it helps you keep track of your progress. My proudest accomplishments are a lot of little things, but most recently, I’m proud of an event I helped coordinate with WiT. I helped organize and host an allyship event with a guest speaker who spoke about how to be an ally. It was titled, “The Inclusive Ally and Leader: Identifying Your Role in Advancing Equity.”  

The WiT ERG sponsored it and invited the Cultural Connections and PRISM ERGs to participate in this event as well. It took a couple of months to plan, and I’m very proud of how it turned out. 

READ MORE: 7 Ways to Be a Better LGBTQ+ Ally at Work  


Why did you choose Perficient, and what keeps you here? 

I think that oftentimes your job satisfaction depends mostly on the people you work with and the person you report to. I have a lot of respect for my manager, Jeffrey Brown, and he’s the reason why I joined Perficient. He drives me to be a better career counselor and manager because he’s shown me the difference it makes.  

I feel that in every project I’ve been a part of at Perficient, I have worked with so many intelligent and passionate people. That keeps me happy and motivated. 

READ MORE: People of Perficient 

What are you passionate about outside of work?

Life is too short, so I’m passionate about all kinds of stuff. My goal in life is to grow in both knowledge and empathy. I’m passionate about learning and education, and I will often watch educational videos, read, or take online classes.  

I also do volunteer work, which intersects with a lot of my passions—learning about others’ experiences, growing empathy, as well as building community. Another thing I do just for fun is play board games. I like board games because they flex your mind and also build community. 


SEE MORE PEOPLE OF PERFICIENT  

It’s no secret our success is because of our people. No matter the technology or time zone, our colleagues are committed to delivering innovative, end-to-end digital solutions for the world’s biggest brands, and we bring a collaborative spirit to every interaction. We’re always seeking the best and brightest to work with us. Join our team and experience a culture that challenges, champions, and celebrates our people.  

Visit our Careers page to see career opportunities and more!  

How Inclusive Design Leads to and Creates Solutions for Universal Design
https://blogs.perficient.com/2025/06/16/how-inclusive-design-leading-and-creating-solutions-for-universal-design/
Mon, 16 Jun 2025 15:53:02 +0000

In the world of design, the relationship between Inclusive Design and Universal Design is often misunderstood. While they share the goal of creating usable and accessible experiences, Inclusive Design focuses on offering multiple solutions for diverse needs, while Universal Design refines those solutions into seamless experiences that work for everyone.

Understanding this connection is key to making accessibility a built-in feature rather than an afterthought.

The Role of Inclusive Design: Designing for Diversity

Inclusive Design embraces the idea that people experience the world in different ways—whether due to ability, age, culture, language, or personal preferences. Instead of assuming a single design works for all, Inclusive Design creates multiple pathways to usability.

For example:

  • Offering multiple navigation options in a digital product (mouse, keyboard, voice commands).
  • Designing adjustable-height workstations, allowing both seated and standing users to work comfortably.
  • Providing varied communication formats, such as text, audio, and visual cues, for different learning styles.

By integrating diverse perspectives into the design process, Inclusive Design expands possibilities, making products and environments more adaptable.

How Inclusive Design Contributes to Universal Design

Over time, Inclusive Design solutions prove to be beneficial for all users, leading to Universal Design principles that remove barriers altogether.

Consider these examples:

Inclusive Design Solution → How It Becomes Universal Design

  • Adjustable-height desks for diverse user needs → Workspaces with ergonomic flexibility for everyone
  • Multi-language support in software → Standardized global accessibility features
  • Closed captions for accessibility → Default captions benefiting all users in noisy environments
  • Multiple navigation options for apps → Intuitive interfaces designed for diverse user preferences

Through Inclusive Design, we create options that ensure accessibility for all. As these solutions become widely adopted and standardized, they evolve into Universal Design—meaning they work without requiring adaptations or modifications.

Universal Design: The End Goal

Universal Design ensures that products, spaces, and experiences are naturally usable by everyone, removing the need for accessibility retrofits. It follows seven key principles, including equitable use, flexibility, and intuitive design.

Examples of Universal Design include:

  • Automatic doors, benefiting wheelchair users, parents with strollers, and people carrying bags.
  • Voice-controlled technology, assisting users with disabilities while enhancing convenience for all.
  • Lever-style door handles, which are easier for those with arthritis yet beneficial for everyone.

Without Inclusive Design paving the way, Universal Design wouldn’t exist. The multiple solutions explored through Inclusive Design help shape universally beneficial designs.

Design should never be about accommodations alone—it should be about inclusion from the start. By embracing Inclusive Design, we create a world where accessibility is built-in, not added later. And when these solutions evolve into Universal Design, we achieve a society where everyone benefits, without barriers.

Let’s design for diversity so we can ultimately design for everyone.

Integrating Drupal with Salesforce SSO via SAML and Dynamic User Sync
https://blogs.perficient.com/2025/06/14/integrating-drupal-with-salesforce-sso-via-saml-and-dynamic-user-sync/
Sat, 14 Jun 2025 05:43:30 +0000

Single Sign-On (SSO) is a crucial part of modern web applications, enabling users to authenticate once and access multiple systems securely. If your organization uses Salesforce as an Identity Provider (IdP) and Drupal as a Service Provider (SP), you can establish a secure SSO connection using the SAML protocol.

In this blog, we’ll walk through how to integrate Drupal with Salesforce for SSO using the SAML Authentication module. We’ll also explore how to dynamically sync user data—like first name, last name, company, and roles—from Salesforce into Drupal during login.

Prerequisites

Before starting, ensure you have the following:

  • A working Drupal 9 or 10 site.
  • Access to the Salesforce admin console.
  • The SAML Authentication module installed in Drupal.
  • SSL enabled on your Drupal site (SAML requires HTTPS).

Step 1: Install the SAML Authentication Module in Drupal

You can install the module via Composer:

composer require drupal/saml_auth

Then enable it using Drush or through the Drupal admin interface:

drush en saml_auth

Dependencies (like simplesamlphp) may need to be managed manually or via the simplesamlphp_auth module if you prefer a different approach.

Step 2: Configure Salesforce as an Identity Provider (IdP)

  • Log in to Salesforce, and go to: Setup → Apps → App Manager → New Connected App
  • Fill in the basic details, then under Web App Settings:
    • Enable SAML.
    • Entity ID: Use your Drupal site’s SP Entity ID (e.g., https://example.com/saml/metadata)
    • ACS URL: https://example.com/saml/acs
    • Subject Type: Usually Email or Username.
    • Name ID Format: urn:oasis:names:tc:SAML:1.1:nameid-format:emailAddress
  • Add custom attributes (an example assertion fragment follows this list):
    • FirstName
    • LastName
    • Company
    • Roles
  • Download the IdP metadata or note:
    • IdP SSO URL
    • IdP Entity ID
    • X.509 certificate
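
For illustration, the SAML assertion that Salesforce sends back might carry the custom attributes in an AttributeStatement similar to the fragment below. This is a sketch only: the exact attribute names, namespaces, and NameFormat values depend on how you configure the connected app.

<!-- Illustrative SAML 2.0 attribute fragment; names must match the connected app configuration -->
<saml:AttributeStatement xmlns:saml="urn:oasis:names:tc:SAML:2.0:assertion">
  <saml:Attribute Name="FirstName">
    <saml:AttributeValue>Jane</saml:AttributeValue>
  </saml:Attribute>
  <saml:Attribute Name="LastName">
    <saml:AttributeValue>Doe</saml:AttributeValue>
  </saml:Attribute>
  <saml:Attribute Name="Company">
    <saml:AttributeValue>Example Corp</saml:AttributeValue>
  </saml:Attribute>
  <saml:Attribute Name="Roles">
    <saml:AttributeValue>Sales</saml:AttributeValue>
    <saml:AttributeValue>Marketing</saml:AttributeValue>
  </saml:Attribute>
</saml:AttributeStatement>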

Step 3: Configure the SAML Authentication Module in Drupal

Navigate to: Admin → Configuration → People → SAML Authentication Settings (/admin/config/people/saml)

Fill in the settings:

  • IdP Entity ID and SSO URL: From Salesforce.
  • X.509 Certificate: Paste the public cert here.
  • SP Entity ID: Can be your site URL or a custom value.
  • ACS URL: Must match what you provided to Salesforce.
  • NameID format: Match Salesforce (usually emailAddress).
  • User match field: Set to mail.

Step 4: Dynamic User Synchronization

By default, SAML Authentication handles user login and account creation, but we extended this with custom logic to map additional attributes from Salesforce into the Drupal user profile.

Salesforce sends additional user information in the SAML assertion, including:

  • First name
  • Last name
  • Company
  • Roles

We’ve extended the default SAML authentication behavior with a custom hook or event subscriber to:

  • Create new users in Drupal using the email as the unique identifier.
  • Populate additional profile fields like first name, last name, and company.
  • Assign user roles dynamically based on the roles attribute from Salesforce.

This ensures that user accounts are fully provisioned and kept up to date every time a user logs in through SSO. A sketch of such an event subscriber is shown below.
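
Here is a minimal sketch of what that subscriber could look like. The event name, event accessors, field machine names, and role mapping are all assumptions for illustration; adjust them to the user-sync event and fields your SAML module and site actually provide.

<?php

namespace Drupal\my_sso_sync\EventSubscriber;

use Symfony\Component\EventDispatcher\EventSubscriberInterface;

/**
 * Maps SAML attributes from Salesforce onto the Drupal user account.
 */
class SalesforceUserSyncSubscriber implements EventSubscriberInterface {

  public static function getSubscribedEvents(): array {
    // Assumed event name; replace with the constant your SAML module dispatches.
    return ['samlauth.user_sync' => 'onUserSync'];
  }

  /**
   * Copies SAML attributes into profile fields and assigns roles.
   */
  public function onUserSync($event): void {
    // Assumed accessors; most SAML modules expose the account and the raw attributes.
    $account = $event->getAccount();
    $attributes = $event->getAttributes();

    // Field machine names below are hypothetical; use your site's actual fields.
    if (!empty($attributes['FirstName'][0])) {
      $account->set('field_first_name', $attributes['FirstName'][0]);
    }
    if (!empty($attributes['LastName'][0])) {
      $account->set('field_last_name', $attributes['LastName'][0]);
    }
    if (!empty($attributes['Company'][0])) {
      $account->set('field_company', $attributes['Company'][0]);
    }

    // Illustrative mapping from Salesforce role names to Drupal role IDs.
    $role_map = ['Sales' => 'sales_user', 'Marketing' => 'marketing_user'];
    foreach ($attributes['Roles'] ?? [] as $salesforce_role) {
      if (isset($role_map[$salesforce_role])) {
        $account->addRole($role_map[$salesforce_role]);
      }
    }
  }

}

The subscriber would also need to be registered in your custom module’s *.services.yml file with the event_subscriber tag, and the modified account is typically saved by the SAML module after the sync event completes.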

Step 5: Test the SSO Flow

  • Log out of your Drupal site.
  • Navigate to /saml/login.
  • You’ll be redirected to Salesforce to authenticate.
  • After login, you’ll be redirected back to Drupal and logged in automatically with synced user details.

Check that:

  • A new Drupal user is created if it doesn’t exist.
  • First name, last name, and company fields are populated.
  • Roles are assigned correctly.

If there’s an error, enable debugging logs and inspect the SAML response and assertion for mismatches.

Conclusion

Integrating Salesforce with Drupal using the SAML Authentication module enables a seamless and secure SSO experience. This is particularly useful for organizations using Salesforce as a central identity system. With proper configuration, users can enjoy frictionless access to your Drupal site while benefiting from Salesforce’s authentication infrastructure.

Salesforce Lead-to-Revenue Management
https://blogs.perficient.com/2025/06/12/salesforce-lead-to-revenue-management/
Thu, 12 Jun 2025 18:30:25 +0000

The Cost of Disconnected Sales and Marketing Processes

In many organizations, there’s palpable friction between marketing and sales teams. Should marketing be responsible for MQLs, meetings, pipeline—or all three? Why aren’t sales converting and closing enough deals? Too often, what happens between these teams is messy, manual, and misaligned. Leads fall through the cracks, handoffs lack crucial context, and reporting becomes a guessing game. 

This internal disarray doesn’t just hurt alignment—it damages the customer experience. When prospects encounter inconsistent communication, slow follow-ups, and disconnected touchpoints (like having to repeat themselves), trust erodes. Frustrated buyers turn elsewhere, leading to lost revenue and poor marketing ROI. 

High-Performing Organizations Take a Different Approach

Leading organizations aren’t leaving sales and marketing alignment to chance. They map every step from initial awareness to deal closure and beyond—building a lead-to-revenue engine that’s repeatable, measurable, and scalable. This transformation starts by: 

  • Connecting marketing, sales, and revenue operations teams around shared goals.
  • Unifying data and streamlining workflows.
  • Executing clean, actionable handoffs—not just basic lead routing, but delivering rich, contextual insights throughout the customer journey. 

And it’s all powered by Salesforce. 

How Perficient Accelerates Your Lead-to-Revenue Transformation

At Perficient, we bring strategy, process, and platform into seamless alignment. Our Salesforce Lead-to-Revenue Management solution helps you:  

  • Identify and eliminate friction points.
  • Streamline handoffs between marketing and sales.
  • Design reporting frameworks that are intuitive, actionable, and trusted across teams. 

We’ve helped clients shorten sales cycles, increase conversion rates, and unlock hidden pipeline with a connected, insight-driven approach. 

Core Components of Salesforce Lead-to-Revenue Management

  • Strategy: Centralize marketing performance-management processes around unified revenue goals.
  • Journey Design: Align the buyer journey and lead-to-revenue processes to deliver consistent value at every stage.
  • Lead Management: Streamline lead generation, scoring, nurturing, and routing with dynamic results chains.
  • Platform Foundation: Modernize marketing, sales, and data integration platforms to build a future-ready tech stack.
  • Insight Model: Unlock outcome-based analytics and AI-driven insights to optimize decision-making and measure impact. 

Business Benefits You’ll Gain

When sales and marketing teams are aligned, enterprises gain: 

  • Mature processes and automation that deliver a seamless, end-to-end customer experience.
  • Multichannel engagement strategies that adapt to real-time customer behavior and preferences.
  • Holistic value capture with clear campaign ROI, marketing attribution, and predictive insights powered by AI. 

Ready to Unify Your Path to Revenue?

If your growth engine feels disconnected, or if marketing and sales are speaking different languages, Perficient can help you align, streamline, and accelerate with our Salesforce Lead-to-Revenue Management solution.

Master Data Management: The Key to Improved Analytics Reporting
https://blogs.perficient.com/2025/06/12/master-data-management-the-key-to-improved-analytics-reporting/
Thu, 12 Jun 2025 14:50:34 +0000

In today’s data-driven business environment, organizations rely heavily on analytics to make strategic decisions. However, the effectiveness of analytics reporting depends on the quality, consistency, and reliability of data. This is where Master Data Management (MDM) plays a crucial role. By establishing a single, authoritative source of truth for critical data domains, MDM ensures that analytics reporting is built on a foundation of high-quality, trustworthy information.

The Role of MDM in Accurate Data Insights

  1. Ensuring Data Consistency and Quality

One of the biggest challenges organizations face is inconsistent and poor-quality data. Disparate systems often contain duplicate, outdated, or conflicting records, leading to inaccurate analytics. MDM addresses this by creating a golden record—a unified, clean version of each critical data entity. Through robust data governance and validation processes, MDM ensures that data used for reporting is accurate, consistent, and complete, fostering trust among the user community.

  2. Eliminating Data Silos and Enabling Systems Consolidation

Enterprises often struggle with fragmented data stored across multiple systems. This creates inefficiencies, as teams must reconcile conflicting records manually. MDM plays a pivotal role in systems consolidation by eliminating data silos and harmonizing information across the organization. By integrating data from various sources into a single, authoritative repository, MDM ensures that analytics tools and business intelligence platforms access consistent, up-to-date information.

  3. Acting as a Bridge Between Enterprise Systems

MDM does not operate in isolation—it seamlessly integrates with enterprise systems through APIs and connectors. By syndicating critical data across platforms, MDM acts as a bridge between disparate applications, ensuring a smooth flow of reliable information. This integration enhances operational efficiency and empowers businesses to leverage advanced analytics and AI-driven insights more effectively.

  4. Enhancing Data-Driven Decision-Making

With a reliable MDM framework in place, organizations can confidently use analytics to drive strategic decisions. High-quality data leads to more accurate reporting, allowing businesses to identify trends, optimize processes, and uncover new opportunities. By maintaining clean and governed master data, companies can fully realize the potential of data-driven decision-making.

Why Organizations Should Implement MDM

Organizations that invest in MDM gain a competitive edge by ensuring that their analytics and reporting efforts are based on trustworthy data. Key benefits include:

  • Improved operational efficiency through reduced manual data reconciliation
  • Higher confidence in analytics due to consistent and accurate data
  • Streamlined data integration across enterprise systems
  • Better compliance and governance with regulated data policies

By implementing MDM, businesses create a strong data foundation that fuels accurate analytics, fosters collaboration, and drives informed decision-making. In an era where data is a strategic asset, MDM is not just an option—it’s a necessity for organizations aiming to maximize their analytics potential.

Reference Data Management (RDM) plays a vital role in ensuring that standardized data—such as country codes, product classifications, industry codes, and currency symbols—remains uniform across all systems and applications. Without effective RDM, businesses risk inconsistencies that can lead to reporting errors, compliance issues, and operational inefficiencies. By centralizing the management of reference data, companies can enhance data quality, improve decision-making, and ensure seamless interoperability between different departments and software systems.

Beyond maintaining consistency, RDM is essential for regulatory compliance and risk management. Many industries, such as finance, healthcare, and manufacturing, depend on accurate reference data to meet regulatory requirements and adhere to global standards. Incorrect or outdated reference data can result in compliance violations, financial penalties, or operational disruptions. A well-structured RDM strategy not only helps businesses stay compliant but also enables greater agility by ensuring data integrity across evolving business landscapes. As organizations continue to embrace digital transformation, investing in robust Reference Data Management practices is no longer optional—it’s a necessity for maintaining competitive advantage and operational excellence.

Why AI-Led Experiences Are the Future — And How Sitecore Stream Delivers Them
https://blogs.perficient.com/2025/06/12/why-ai-led-experiences-are-the-future-and-how-sitecore-stream-delivers-them/
Thu, 12 Jun 2025 11:08:30 +0000

In a world that’s moving at lightning speed, customers expect brands to keep up — to understand them instantly, respond to their behavior in real-time, and offer relevant, helpful experiences wherever they are. This is where Artificial Intelligence (AI) has become not just useful, but absolutely essential.

The Growing Power of AI in Today’s World

AI is revolutionizing how businesses operate and how brands engage with customers. What started as automation is now intelligent orchestration:

  • Recommending the right product at the right time
  • Automatically generating content in your brand voice
  • Detecting patterns in real-time behavior
  • Personalizing experiences across every channel

From e-commerce to healthcare, entertainment to education, AI is making every industry smarter, faster, and more responsive. And customer expectations are rising accordingly.

 

Why Sitecore Has Embraced AI

As a leader in digital experience platforms, Sitecore understands that personalization, content delivery, and customer journey orchestration can’t rely on manual processes.

That’s why Sitecore has integrated AI deeply into its product ecosystem — not just to enhance content workflows, but to transform how real-time experiences are built and delivered.

At the heart of this transformation is Sitecore Stream.

 

🌐 Introducing Sitecore Stream

Sitecore Stream brings AI capabilities to Sitecore products. Tailored specifically for marketers, it empowers smarter, faster end-to-end content creation and distribution at scale—unlocking remarkable efficiency gains. Featuring brand-aware AI, intelligent copilots, autonomous agents, and streamlined agentic workflows, Sitecore Stream transforms marketing effectiveness by helping you speed up time-to-market, lower costs, and deliver compelling, consistent digital experiences across all channels.

But Sitecore Stream doesn’t just work fast — it works intelligently and brand-safely.

Sitecore Stream takes all the tasks and deliverables in a marketing workflow and uses AI to make them faster, easier, and more consistent through copilots, agents, content ideation and creation, and ongoing optimization of the customer experience.

Stream is built on the Microsoft Azure OpenAI Service and uses advanced large language model (LLM) technology to help teams of all sizes ideate, create, and refine on-brand content more strategically and securely.

 

What Is an LLM?

A large language model (LLM) is a type of machine learning model designed for natural language processing tasks such as language generation. LLMs are language models with many parameters, and are trained with self-supervised learning on a vast amount of text.

Large Language Models (LLMs) are a cornerstone of generative AI, powering a wide array of natural language processing tasks, including:

  • Searching, translating, and summarizing text
  • Answering questions with contextual understanding
  • Creating new content—ranging from written text and images to music and software code

What truly sets LLMs apart is their ability to synthesize information, analyze complex data, and identify patterns and trends. This enables them to go beyond simple text generation and adapt seamlessly to diverse, specialized use cases across industries.

 

Core Concepts & Capabilities of Sitecore Stream

Sitecore Stream transforms your marketing stack by combining:

  • Brand intelligence
  • AI automation
  • Real-time decisioning

It’s designed to help marketers do more, faster, with less manual effort—while maintaining creative control and brand integrity. Let’s look at how Sitecore Stream makes this possible with the capabilities below:

 

  1. Brand-Aware AI

Unlike generic AI tools, Stream uses RAG to anchor every response in your organization’s brand knowledge.

Retrieval-augmented generation (RAG) grounds AI outputs by pulling brand-specific information directly from documents uploaded by the organization. This ensures that content generation is informed by accurate, contextual brand knowledge.

Brand-aware AI is an advanced capability designed to maintain brand consistency across all Sitecore products. It leverages large language models (LLMs) and retrieves relevant information from brand resources to ensure alignment with the brand’s identity.

When brand documents are uploaded—detailing brand values, messaging, tone, visual identity, and the intended customer experience—brand knowledge is created through a process known as brand ingestion. This process organizes and optimizes the uploaded information, making it readily accessible to AI copilots across the platform.

As a result, AI-powered tools within Sitecore Stream consistently generate or suggest content that reflects the brand’s voice, tone, and guidelines. This enables marketers to scale content creation confidently, knowing all output remains true to the brand.

 

  2. Copilots and Agents

Stream introduces AI copilots that assist marketers in real-time – offering intelligent suggestions for content, layout, targeting, and workflows. Agents go further, autonomously executing tasks like campaign personalization, journey orchestration, or data segmentation.

Copilots provide intelligent guidance to support strategic decisions, while agents handle routine actions autonomously—freeing marketers to focus on high-value strategy and creative execution.

Both copilots and agents understand natural language and predefined prompts, seamlessly assisting throughout the content creation process and minimizing repetitive work. Marketers can effortlessly request content drafts, campaign ideas, or personalized experiences, all with simple, intuitive commands.

Fully integrated into Sitecore products, these tools deliver chat-based interactions, one-click workflows, and autonomous operations, making marketing smarter, faster, and more efficient.

Sitecore Stream currently offers three copilots:

  • Brand Copilot – Helps marketers create content that matches the brand using tools for brand-aware chat, idea generation, and content briefs.
  • Content Copilot – Supports content tasks like writing, refining, translating, generating content with AI in Experience Platform, and optimizing/personalizing content in Sitecore.
  • Experience Copilot – Improves search with features like visual search in Content Hub and Q&A generation in Sitecore Search.

 

  3. Agentic Workflows

What is Agentic AI? Agentic AI refers to artificial intelligence systems that can act autonomously, pursue goals, and make decisions proactively—almost like an “agent” with a mission.

In simple terms:

Agentic AI is AI that doesn’t just respond to commands—it plans, decides, and takes initiative to achieve a goal on its own. AI that doesn’t just assist—it acts.

Stream enables agentic workflows, where AI agents execute actions (e.g., publish content, trigger campaigns) based on real-time customer behavior or campaign goals.

Successful project management starts with a clear, organized plan that prioritizes tasks and involves the right team members at the right time. By setting defined goals, monitoring progress, and fostering collaboration, teams can ensure projects stay aligned and deliver desired outcomes.

Sitecore Stream’s orchestration capability takes project management to the next level by integrating AI-driven automation tailored for marketing teams. Whether managing campaigns, product launches, or digital advertising strategies, this feature helps coordinate efforts seamlessly across teams and Sitecore products.

By introducing early-stage AI agents, orchestration supports smarter task execution and informed decision-making. This paves the way for advanced agentic workflows within Sitecore—where AI systems can autonomously drive actions, make decisions, and dynamically respond to evolving project demands.

 

Conclusion

AI is no longer a future investment — it’s a present necessity. Customers demand relevance, speed, and brand coherence. Sitecore Stream is Sitecore’s answer to that demand: A real-time, AI-powered platform that combines behavioral insight, brand knowledge, and automation to help brands engage customers intelligently and instantly.

This is the future of digital experience. And with Sitecore Stream, it’s already here.

]]>
https://blogs.perficient.com/2025/06/12/why-ai-led-experiences-are-the-future-and-how-sitecore-stream-delivers-them/feed/ 0 382748
YAML files in DBT https://blogs.perficient.com/2025/06/12/yaml-files-in-dbt/ https://blogs.perficient.com/2025/06/12/yaml-files-in-dbt/#respond Thu, 12 Jun 2025 05:18:13 +0000 https://blogs.perficient.com/?p=382730

To streamline project development and maintenance in any programming language, we need the support of metadata, configuration, and documentation. Project configuration is typically handled through configuration files, which are easy to use and give developers a friendly way to interact with a project. One such type of configuration file used in DBT is the YAML file.
In this blog, we will go through the YAML files required in DBT.
Let’s first understand what DBT and YAML are.

DBT (Data Build Tool):
Data transformation is an essential process in modern analytics. DBT is a tool for transforming, cleaning, and aggregating data within a data warehouse. Much of DBT’s power lies in its use of YAML files for both configuration and transformation.
Note:
Please go through the linked DBT documentation for more details on DBT.
What is a YAML file:
YAML originally stood for “Yet Another Markup Language” and now stands for “YAML Ain’t Markup Language.” It is easy to read and understand, and YAML is a superset of JSON.
Common uses of YAML files:
– Configuration Management:
Used to define configuration such as roles and environments.
– CI/CD Pipelines:
Many CI/CD tools rely on YAML files to describe their pipelines.
– Data Serialization:
YAML can represent complex data structures such as lists, arrays, and nested mappings.
– APIs:
YAML can be used to define API contracts and specifications.

A sample YAML file:
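The snippet below is purely illustrative; the keys and values are made up to show YAML’s indentation-based structure.

project:
  name: sales_analytics
  version: 1.0
  owners:
    - data_engineering
    - analytics
  settings:
    environment: dev
    refresh_schedule: "0 6 * * *"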
YAML files are at the core of defining configuration and transformations in DBT. YAML files use the “.yml” extension.

The most important YAML file is
profiles.yml:
This file lives locally on your machine. It contains sensitive credentials used to connect to the target data warehouse.
Purpose:
It holds the connection details DBT uses to reach the data warehouse (Snowflake, Postgres, etc.).
A profile configuration looks like this:
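As a rough illustration only (the profile name dbtdemo, the Snowflake adapter, and all connection values below are placeholders, not values from the original screenshot), a profiles.yml might look like this:

dbtdemo:
  target: dev
  outputs:
    dev:
      type: snowflake
      account: <account_identifier>
      user: <username>
      password: <password>
      role: <role_name>
      database: <database_name>
      warehouse: <warehouse_name>
      schema: <schema_name>
      threads: 4

The profile name at the top must match the profile referenced in dbt_project.yml, and the target key selects which output (for example dev or prod) DBT connects to by default.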
Note:
We should not share the profiles.yml file with anyone because it contains the target data warehouse credentials. This file is used by DBT Core, not by DBT Cloud.
YAML file classification according to DBT component:
Let us go through the different components of DBT and their corresponding YAML files:

1. dbt_project.yml:
This is the core configuration file of a DBT project. It tells DBT which configurations to use for the project. By default, DBT looks for dbt_project.yml in the root of the current working directory.

For Example:

name: string

config-version: 2
version: version

profile: profilename

model-paths: [directorypath]
seed-paths: [directorypath]
test-paths: [directorypath]
analysis-paths: [directorypath]
macro-paths: [directorypath]
snapshot-paths: [directorypath]
docs-paths: [directorypath]
asset-paths: [directorypath]

packages-install-path: directorypath

clean-targets: [directorypath]

query-comment: string

require-dbt-version: version-range | [version-range]

flags:
  <global-configs>

dbt-cloud:
  project-id: project_id # Required
  defer-env-id: environment # Optional

exposures:
  +enabled: true | false

quoting:
  database: true | false
  schema: true | false
  identifier: true | false

metrics:
  <metric-configs>

models:
  <model-configs>

seeds:
  <seed-configs>

semantic-models:
  <semantic-model-configs>

saved-queries:
  <saved-queries-configs>

snapshots:
  <snapshot-configs>

sources:
  <source-configs>
  
tests:
  <test-configs>

vars:
  <variables>

on-run-start: sql-statement | [sql-statement]
on-run-end: sql-statement | [sql-statement]

dispatch:
  - macro_namespace: packagename
    search_order: [packagename]

restrict-access: true | false

 

2. Model:
Models are SQL files that define how your data is transformed. In a model configuration (properties) file, you define the source and target tables and their transformations. It sits under the models directory of the DBT project, and you can name it whatever you like.
Below is an example. This model YAML file is commonly given the name “schema.yml”.
Purpose of the model YML file:
It configures model-level metadata such as tags, materialization, name, and columns, which DBT uses when transforming the data.
It looks like this:

version: 2

models:
  - name: my_first_dbt_model
    description: "A starter dbt model"
    columns:
      - name: id
        description: "The primary key for this table"
        data_tests:
          - unique
          - not_null

  - name: my_second_dbt_model
    description: "A starter dbt model"
    columns:
      - name: id
        description: "The primary key for this table"
        data_tests:
          - unique
          - not_null


3. Seed:
Seeds are used to load CSV files into the data warehouse. This is useful for staging reference data before applying any transformations.
Below is an example.

Purpose of the seeds YAML file:
It describes the CSV files placed under the seeds directory, including their names, descriptions, and column-level settings, so DBT can load them into data warehouse tables.

The configuration file looks like this:

version: 2

seeds:
  - name: <seed name>
    description: Raw data loaded from a CSV file
    config:
      database: <database name>
      schema: <database schema>
      column_types:
        id: integer
        name: varchar(100)

4. Testing:
Testing is a key step in any project. DBT lets you declare generic data tests, such as unique and not_null constraints, in YAML.

Create a file such as dbtTest.yml in your DBT project to hold these test definitions.

Purpose of the test YML file:
It helps check data integrity and quality while keeping those checks separate from the business logic.
It looks like this:

version: 2

models:
  - name: <model name>
    columns:
      - name: order_id
        tests:
          - not_null
          - unique

That covers the different YAML files in DBT and the purpose of each.

Conclusion:
DBT and its YAML files provide a human-readable way to manage data transformations. With DBT, we can easily create, transform, and test data models, making it a valuable tool for data professionals. Together, DBT and YAML empower you to work more efficiently as a data analyst, data engineer, or business analyst.

Thanks for reading.

 

 

 

]]>
https://blogs.perficient.com/2025/06/12/yaml-files-in-dbt/feed/ 0 382730
Developing a Serverless Blogging Platform with AWS Lambda and Python https://blogs.perficient.com/2025/06/11/developing-a-serverless-blogging-platform-with-aws-lambda-and-python/ https://blogs.perficient.com/2025/06/11/developing-a-serverless-blogging-platform-with-aws-lambda-and-python/#respond Thu, 12 Jun 2025 04:55:52 +0000 https://blogs.perficient.com/?p=382159

Introduction

Serverless is changing the game—no need to manage servers anymore. In this blog, we’ll see how to build a serverless blogging platform using AWS Lambda and Python. It’s scalable, efficient, and saves cost—perfect for modern apps.

How It Works

 

(Architecture diagram: API Gateway exposes the endpoints, Lambda handles the logic, DynamoDB stores the blog data, and S3 with CloudFront serves the static front end.)

Prerequisites

Before starting the demo, make sure you have: an AWS account, basic Python knowledge, AWS CLI and Boto3 installed.

Demonstration: Step-by-Step Guide

Step 1: Create a Lambda Function

Open the Lambda service and click “Create function.” Choose “Author from scratch,” name it something like BlogPostHandler, select Python 3.x, and give it a role with access to DynamoDB and S3. Then write your code using Boto3 to handle CRUD operations for blog posts stored in DynamoDB.

Lamda_Function.txt
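The attached Lamda_Function.txt contains the full code. As a rough sketch only (the BlogPosts table name and the proxy-event shape are assumptions based on this walkthrough, not the attached file), a minimal Boto3 handler for create and read operations might look like this:

import json
import uuid

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("BlogPosts")  # table name assumed from Step 3


def lambda_handler(event, context):
    """Route API Gateway proxy requests to basic CRUD operations on blog posts."""
    method = event.get("httpMethod", "GET")

    if method == "POST":
        # Create a new post, generating a postId if the client did not supply one.
        post = json.loads(event.get("body") or "{}")
        post.setdefault("postId", str(uuid.uuid4()))
        table.put_item(Item=post)
        return _respond(201, post)

    if method == "GET":
        # Fetch a single post by postId, or list all posts.
        params = event.get("queryStringParameters") or {}
        if "postId" in params:
            item = table.get_item(Key={"postId": params["postId"]}).get("Item")
            return _respond(200 if item else 404, item or {"message": "Not found"})
        return _respond(200, table.scan().get("Items", []))

    return _respond(405, {"message": f"Unsupported method: {method}"})


def _respond(status_code, body):
    """Build an API Gateway proxy response with a permissive CORS header."""
    return {
        "statusCode": status_code,
        "headers": {"Access-Control-Allow-Origin": "*"},
        "body": json.dumps(body, default=str),
    }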

Step 2: Set Up API Gateway

First, go to REST API and click “Build.” Choose “New API,” name it something like BlogAPI, and select “Edge optimized” for global access. Then create a resource like /posts, add methods like GET or POST, and link them to your Lambda function (e.g. BlogPostHandler) using Lambda Proxy integration. After setting up all methods, deploy it by creating a stage like prod. You’ll get an Invoke URL which you can test using Postman or curl.
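For example, assuming a prod stage and a /posts resource, a quick smoke test might look like the following; replace the API ID and region with the values from your own Invoke URL:

curl https://<api-id>.execute-api.<region>.amazonaws.com/prod/posts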


 

Step 3: Configure DynamoDB

Open DynamoDB and click “Create table.” Name it something like BlogPosts, set postId as the partition key. If needed, add a sort key like category for filtering. Default on-demand capacity is fine—it scales automatically. You can also add extra attributes like timestamp or tags for sorting and categorizing. Once done, hit “Create.”
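If you prefer to script this step instead of using the console, a Boto3 sketch that mirrors the settings above (table name BlogPosts, postId partition key, on-demand billing) might look like this:

import boto3

client = boto3.client("dynamodb")

# Create the BlogPosts table with postId as the partition key and on-demand billing.
client.create_table(
    TableName="BlogPosts",
    AttributeDefinitions=[{"AttributeName": "postId", "AttributeType": "S"}],
    KeySchema=[{"AttributeName": "postId", "KeyType": "HASH"}],
    BillingMode="PAY_PER_REQUEST",
)

# Wait until the table is active before using it.
client.get_waiter("table_exists").wait(TableName="BlogPosts")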


 


Step 4: Deploy Static Content on S3

First, make your front-end files—HTML, CSS, maybe some JavaScript. Then go to AWS S3, create a new bucket with a unique name, and upload your files like index.html. This will host your static website.

Index.html

After uploading, set the bucket policy to allow public read access so anyone can view your site. That’s it—your static website will now be live from S3.

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PublicReadGetObject",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::your-bucket-name/*"
        }
    ]
}

After uploading, don’t forget to replace your-bucket-name in the bucket policy with your actual S3 bucket name. This makes sure the permissions work properly. Now your static site is live—S3 will serve your HTML, CSS, and JS smoothly and reliably.

Step 5: Distribute via CloudFront

Go to CloudFront and create a new Web distribution. Set the origin to your S3 website URL (like your-bucket-name.s3-website.region.amazonaws.com, not the ARN). For Viewer Protocol Policy, choose “Redirect HTTP to HTTPS” for secure access. Leave other settings as-is unless you want to tweak cache settings. Then click “Create Distribution”—your site will now load faster worldwide.


To let your frontend talk to the backend, you need to enable CORS in API Gateway. Just open the console, go to each method (like GET, POST, DELETE), click “Actions,” and select “Enable CORS.” That’s it—your frontend and backend can now communicate properly.


Additionally, make sure your Lambda function responses include the standard CORS headers, such as Access-Control-Allow-Origin, Access-Control-Allow-Methods, and Access-Control-Allow-Headers (we already added these in our Lambda function).

 

Results

That’s it—your serverless blogging platform is ready! API Gateway gives you the endpoints, Lambda handles the logic, DynamoDB stores your blog data, and S3 + CloudFront serve your frontend fast and globally. Fully functional, scalable, and no server headaches!

 


Conclusion

Building a serverless blog with AWS Lambda and Python shows how powerful and flexible serverless really is. It’s low-maintenance, cost-effective, and scales easily, making it perfect for anything from a personal blog to a full content site. A solid setup for modern web apps!

]]>
https://blogs.perficient.com/2025/06/11/developing-a-serverless-blogging-platform-with-aws-lambda-and-python/feed/ 0 382159
An “Inconceivable” Conversation With Dr. Pete Cornwell on Simple vs. Agentic AI https://blogs.perficient.com/2025/06/11/an-inconceivable-conversation-with-dr-pete-cornwell-on-simple-vs-agentic-ai/ https://blogs.perficient.com/2025/06/11/an-inconceivable-conversation-with-dr-pete-cornwell-on-simple-vs-agentic-ai/#respond Wed, 11 Jun 2025 21:22:24 +0000 https://blogs.perficient.com/?p=382721

Dr. Pete Cornwell, Director of Contact Center, offers a fresh perspective on customer care and is sharing his wealth of knowledge at Customer Contact Week in Las Vegas. With over 35 years of experience spanning information systems, design, architecture, and consulting for industry leaders like Terazo and Blue Cross Blue Shield North Carolina, his expertise runs deep. Add to that a decade as a professor and chair of Engineering and Information Sciences at DeVry University, and it’s clear that Dr. Cornwell has plenty to say about the ever-evolving world of digital transformation.

Before heading west for the conference, I sat down with him to glean some insights he’ll be sharing with attendees and partners alike.

Take a Seat, Class is in Session

Our conversation was set to focus on AI and its applications in the contact center, but as I launched into my questions, Dr. Cornwell first asked me to examine a meme.

Princess Bride Meme

If you’re unfamiliar with the 1987 movie The Princess Bride, you’re not only missing out on a cherished piece of nostalgia, but you’ll also need a bit of background to understand his analogy. In the film, the protagonist delivers a famous line to a villain who repeatedly uses the word “inconceivable”, even when things are clearly very conceivable.

Pete followed up the meme by saying that if there’s any term that makes him grind his teeth more than Digital Transformation, it’s Agentic AI. It’s tossed around daily as a flashy, vague placeholder for everything from artificial intelligence and large language models (LLMs) to integrations and machine learning (ML)-driven workflows. This misuse is particularly troubling in the contact center space, where it has become a buzzword applied to almost anything.

Pete is intent on drawing a clear distinction between AI and Agentic AI from a customer contact perspective. Both are critical components of today’s AI-driven customer care, but Agentic AI is poised to unlock a wealth of future opportunities in this space.

“Simple” AI

While LLM-based models are incredibly complex, AI is often used for relatively simple applications in customer service, such as self-service, agent deflection, or assistance. Many companies begin their AI journey by deploying it to deflect calls from human agents, handling straightforward tasks like providing business hours, account balances, or credit card activation.

Additionally, voice and text-based chatbots can support intelligent routing, allowing customers to bypass frustrating IVR menus and connect directly through an Intelligent Virtual Agent (IVA). Yet, despite these capabilities, this is still not Agentic AI: these functions serve as filters between customers and human agents, managing deflection or routing rather than making truly autonomous decisions.

The space continues to evolve as new AI-driven capabilities are added to CCaaS (Contact Center as a Service) offerings each year. AI-powered agent prompting, coaching, and even translation are all part of what’s possible. While Pete admits his skills as a clairvoyant aren’t highly rated, he predicts that these capabilities will become commoditized within five years, standard features in the arsenal of any CCaaS vendor. He invites anyone to call him out if he’s wrong; he’ll be waiting.

What Is Agentic AI?

If you want a simple definition of what agentic means, it’s actually embedded in the term. Agentic AI describes LLM-driven software that can execute workflows complex enough that it could replace an agent. This typically means that, like a human agent, agentic AI needs to make decisions in a highly variable data environment, with inputs sourced from both the customer and back-end integrations.

From an implementation perspective, as with its human equivalent, we want to give each agent type a well-defined set of responsibilities to gain understandability, maintainability, error management, and observability. Similarly, many of the communication and business metrics we use to measure the performance of human agents can apply to their agentic counterparts. For example, a credit card company could conceivably use agentic AI for everything from general service enquiries to fraud reporting, and even customer satisfaction surveys.

What we can draw from this example is that we open the possibility for collaboration with agents coordinating activity to fulfill extensive tasks that would often require multiple human representatives and frustrating delays as the customer is transferred between departments.

Agentic AI in Action

Pete expanded on this concept using a lost credit card scenario, illustrating how Agentic AI can streamline customer service. This process involves four AI components:

  1. A simple AI IVA chatbot that provides voice-based routing for the customer. The remaining components below are agentic (“bots”) and use natural language processing and speech output to converse with the customer.
  2. General Customer Service – an agentic AI designed to handle a range of customer service scenarios, from simple balance inquiries and new card verification to, of course, reporting a lost card and requesting a replacement.
  3. Fraud Handling – a bot designed to establish and open a fraud investigation.
  4. Customer Survey – a bot that will craft an optional customer satisfaction survey based on the workflow delivered to the customer.

These agentic bots, coupled with the aforementioned IVA have the capability to provide a seamless flow of interaction with the customer. Consider the following voice flow:

  1. An anxious customer calls the customer service number, after a prompt, the caller simply says “I’ve lost my card.” The Simple AI IVA routes the call to the General Customer Service Bot.
  2. The General Customer Service bot first validates the customer’s identity and the missing card number [omitted to save trees and your time]. The customer responds with the correct information.
  3. The General Customer Service bot then asks the customer when they believe they lost the card. “Sometime last week, I don’t use it much and when I looked in my wallet I couldn’t find it” the customer answers.
  4. There are recent transactions on the card, so the card account is passed to the Fraud Handling bot. Meanwhile, the General Customer Service bot verifies the address on file with the customer and calls back-end services to print and dispatch a new card.
  5. The Fraud Handling bot then asks the customer to verify a predetermined number of recent transactions from the account.
  6. The customer replies “I’ve never been to Cancún” when presented with a specific transaction involving a Mexican resort and spa. The Fraud Handling bot opens a fraud case, again using a back-end integration.
  7. The General Customer Service bot gives the customer the fraud case number and a delivery time for the new card.
  8. Finally, with assent from the customer (hopefully now less stressed), a Customer Survey bot draws a set of questions from the question banks associated with the bots the customer interacted with. The customer responds, the data is logged, and the call ends.
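As an illustration only (this is not Sitecore’s API or any vendor’s actual implementation, and the handler classes and intent names are hypothetical), the routing behind such a flow could be sketched roughly like this, with specialised agents sharing one conversation session:

class GeneralServiceBot:
    """Hypothetical agent that verifies the customer and reissues the card."""
    def handle(self, session):
        session["replacement_ordered"] = True
        return "A replacement card is on its way to your address on file."


class FraudBot:
    """Hypothetical agent that opens a fraud case for disputed transactions."""
    def handle(self, session):
        session["fraud_case"] = "FR-0001"
        return f"Fraud case {session['fraud_case']} has been opened."


# Map a recognised intent to the agents that collaborate on it.
ROUTES = {"lost_card": [GeneralServiceBot(), FraudBot()]}


def orchestrate(intent, session):
    """Run each specialised agent in turn, sharing a single conversation session."""
    responses = [bot.handle(session) for bot in ROUTES.get(intent, [])]
    return " ".join(responses) or "Sorry, I can't help with that yet."


print(orchestrate("lost_card", {"customer_id": "C-001"}))

The point of the sketch is the shared session: because each agent reads and writes the same conversation state, the customer experiences a single continuous interaction rather than a series of transfers.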

Pete recalled a similar experience that required three different human agents, each with a long wait time between transfers. This AI-driven approach achieves the same outcome but with major advantages:

  • No human agents were required for the workflow.
  • The customer perceives interacting with a single entity throughout the call.
  • Reduced wait times and minimized anxiety about disconnections.

To Recap

Agentic AI has the potential to completely replace an agent by executing complex workflows driven by complex inputs and integrations. Simple AI provides basic self-service for deflection purposes, supports live agents, and can provide natural-language-driven call routing. Finally, both have a critical role to play in building an AI-driven contact center self-service experience.

]]>
https://blogs.perficient.com/2025/06/11/an-inconceivable-conversation-with-dr-pete-cornwell-on-simple-vs-agentic-ai/feed/ 0 382721