Part 2: Implementing Azure Virtual WAN – A Practical Walkthrough (Thu, 21 Aug 2025)

In Part 1, we discussed what Azure Virtual WAN is and why it’s a powerful solution for global networking. Now, let’s get hands-on and walk through the actual implementation, step by step, in a simple, conversational way.

[Figure: Architecture diagram]

1. Creating the Virtual WAN – The Network’s Control Plane

Virtual WAN is the heart of a global network, not just another resource. It replaces isolated per-region VPN gateways, manual ExpressRoute configurations, and complex peering relationships.

Setting it up is easy:

  • Navigate to Azure Portal → Search “Virtual WAN”
  • Click Create and configure.
  • Name: Choose a clear, consistent name (e.g., vwan-global); naming matters in enterprise environments
  • Resource Group: Create new rg-network-global (best practice for lifecycle management)
  • Type: Standard (Basic lacks critical features like ExpressRoute support)

Azure will set up the Virtual WAN in a few seconds. Now, the real fun begins.

2. Setting Up the Virtual WAN Hub – The Heart of the Network

The hub is where all connections converge. It’s like a major airport hub where traffic from different locations meets and gets efficiently routed. Without a hub, you’d need to configure individual gateways for every VPN and ExpressRoute connection, leading to higher costs and management overhead.

  • Navigate to the Virtual WAN resource → Click Hubs → New Hub.
  • Configure the Hub.
  • Region: Choose based on primary user locations and Azure service availability (some regions lack certain services)
  • Address Space: Assign a private IP range (e.g., 10.100.0.0/24).

Wait for deployment; this takes about 30 minutes (Azure is building VPN gateways, ExpressRoute gateways, and more behind the scenes).

Once done, the hub is ready to connect everything: offices, cloud resources, and remote users.
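For teams that prefer scripting over portal clicks, both resources can also be provisioned programmatically. Below is a minimal sketch using the Azure SDK for Python (azure-identity and azure-mgmt-network); the subscription ID, region, and resource names are illustrative assumptions, not values required by the walkthrough.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

# Placeholder subscription ID -- replace with your own
client = NetworkManagementClient(DefaultAzureCredential(), "<subscription-id>")

# 1. Create the Virtual WAN (Standard type, as recommended above)
wan = client.virtual_wans.begin_create_or_update(
    "rg-network-global",
    "vwan-global",
    {"location": "eastus", "type": "Standard"},
).result()

# 2. Create the hub inside the WAN with the 10.100.0.0/24 address space
hub = client.virtual_hubs.begin_create_or_update(
    "rg-network-global",
    "hub-eastus",
    {
        "location": "eastus",
        "address_prefix": "10.100.0.0/24",
        "virtual_wan": {"id": wan.id},
    },
).result()
print(f"Hub {hub.name} provisioned: {hub.provisioning_state}")
```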

3. Connecting Offices via Site-to-Site VPN – Building Secure Tunnels

Branches and data centers need a reliable, encrypted connection to Azure. Site-to-Site VPN provides this over the public internet while keeping data secure. Without VPN tunnels, branch offices would rely on slower, less secure internet connections to access cloud resources, increasing latency and security risks.

  • In the Virtual WAN Hub, go to VPN (Site-to-Site) → Create VPN Site.
  • Name: branch-nyc-01
  • Private Address Space: e.g., 192.168.100.0/24 (must match on-premises network)
  • Link Speed: Set accurately for Azure’s QoS calculations
  • Download VPN Configuration: Azure provides a config file—apply it to the office’s VPN device (like a Cisco or Fortinet firewall).
  • Lastly, connect the VPN Site to the Hub.
  • Navigate to VPN connections → Create connection → Link the office to the hub.

Now, the office and Azure are securely connected.
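The VPN site definition can be scripted as well. Here is a minimal sketch with the Azure SDK for Python; the branch firewall’s public IP and the device vendor are placeholder assumptions.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

client = NetworkManagementClient(DefaultAzureCredential(), "<subscription-id>")
wan = client.virtual_wans.get("rg-network-global", "vwan-global")

site = client.vpn_sites.begin_create_or_update(
    "rg-network-global",
    "branch-nyc-01",
    {
        "location": "eastus",
        "virtual_wan": {"id": wan.id},
        "ip_address": "203.0.113.10",  # branch firewall's public IP (placeholder)
        "address_space": {"address_prefixes": ["192.168.100.0/24"]},
        "device_properties": {
            "device_vendor": "Cisco",   # informational metadata (assumption)
            "link_speed_in_mbps": 100,  # feeds Azure's QoS calculations
        },
    },
).result()
print(f"VPN site {site.name} created")
```

The final step of linking the site to the hub still happens under VPN connections, as described above.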

4. Adding ExpressRoute – The Private Superhighway

For critical applications (like databases or ERP systems), VPNs might not provide enough bandwidth or stability. ExpressRoute gives us a dedicated, high-speed connection that bypasses the public internet. Without ExpressRoute, latency-sensitive applications (like VoIP or real-time analytics) could suffer from internet congestion or unpredictable performance.

  • Order an ExpressRoute Circuit: We can do this via the Azure Portal or through an ISP (like AT&T or Verizon).
  • Authorize the Circuit in Azure
  • Navigate to the Virtual WAN Hub → ExpressRoute → Authorize.
  • Link it to the Hub: Once authorized, connect the ExpressRoute circuit to the hub.

Now, the on-premises network has a dedicated, high-speed connection to Azure—no internet required.

5. Enabling Point-to-Site VPN for Remote Workers – The Digital Commute

Employees working from home need secure access to internal apps without exposing them to the public internet. P2S VPN lets them “dial in” securely from anywhere. Without P2S VPN, remote workers might resort to risky workarounds like exposing RDP or databases to the internet.

  • Configure P2S in The Hub
  • Navigate to VPN (Point-to-Site) → Configure.
  • Set Up Authentication: Choose certificate-based auth (secure and easy to manage) and upload the root/issuer certificates.
  • Assign an IP Pool, e.g., 172.16.10.0/24 (remote users receive their IPs from this range; it must not overlap with on-premises or hub address spaces, so don’t reuse the branch’s 192.168.100.0/24).
  • Download & Distribute the VPN Client

Employees install this on their laptops to connect securely. Now, the team can access Azure resources from anywhere just like they’re in the office.

6. Linking Azure Virtual Networks (VNets) – The Cloud’s Backbone

Applications in one VNet (e.g., frontend servers) often need to talk to another (e.g., databases). Rather than complex peering, the Virtual WAN handles routing automatically. Without VNet integration, you need manual peering and route tables for every connection, creating a management nightmare at scale.

  • Attach the VNets to the hub.
  • Navigate to The Hub → Virtual Network Connections → Add Connection.
  • Select the VNets. e.g., Connect vnet-app (for applications) and vnet-db (for databases).
  • Azure handles the routing: Traffic flows automatically through the hub; no manual route tables needed.

Now, the cloud resources communicate seamlessly.
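For completeness, here is the same attachment expressed with the Azure SDK for Python; the VNet resource ID below is an illustrative assumption.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

client = NetworkManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Resource ID of the VNet to attach (illustrative placeholder)
vnet_id = (
    "/subscriptions/<subscription-id>/resourceGroups/rg-apps"
    "/providers/Microsoft.Network/virtualNetworks/vnet-app"
)

client.hub_virtual_network_connections.begin_create_or_update(
    "rg-network-global",
    "hub-eastus",
    "conn-vnet-app",
    {"remote_virtual_network": {"id": vnet_id}},
).result()
```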

Monitoring & Troubleshooting

Networks aren’t “set and forget.” We need visibility to prevent outages and quickly fix issues. Azure Monitor tracks VPN/ExpressRoute health, like a dashboard showing all trains (data packets) moving smoothly, while Network Watcher helps diagnose why a branch can’t connect.

Common Problems & Fixes

  • When VPN connections fail, the problem is often a mismatched shared key—simply re-enter it on both ends.
  • If ExpressRoute goes down, check with your ISP—circuit issues usually require provider intervention.
  • When VNet traffic gets blocked, verify route tables in the hub—missing routes are a common culprit.
Optimizely Mission Control – Part II (Mon, 18 Aug 2025)

In this part, we focus primarily on generating read-only credentials and using them to connect to the database.

Generate Database Credentials

The Mission Control tool generates read-only database credentials for a targeted instance, which remain active for 30 minutes. These credentials allow users to run select or read-only queries, making it easier to explore data on a cloud instance. This feature is especially helpful for verifying data-related issues without taking a database backup.

Steps to generate database credentials

  1. Log in to Mission Control.

  2. Navigate to the Customers tab.

  3. Select the appropriate Customer.

  4. Choose the Environment for which you need the credentials.

  5. Click the Action dropdown in the left pane.

  6. Select Generate Database Credentials.

  7. A pop-up will appear with a scheduler option.

  8. Click Continue to initiate the process.

  9. After a short time, the temporary read-only credentials will be displayed.

 

Once the temporary read-only credentials are generated, the next step is to connect to the database using those credentials.

To do this:

  1. Download and install Azure Data Studio.

  2. Open Azure Data Studio after installation.

  3. Click “New Connection” or the “Connect” button.

  4. Use the temporary credentials provided by Mission Control to connect:

    • Server Name: Use the server name from the credentials.

    • Authentication Type: SQL Login

    • Username and Password: As provided in the credentials.

  5. Once connected, you can execute SELECT queries to explore or verify data on the cloud instance.
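The same credentials also work from any SQL tooling, not just Azure Data Studio. Below is a minimal sketch using Python’s pyodbc; the server, database, and table names are placeholders, not values that Mission Control actually returns.

```python
import pyodbc

# Temporary read-only credentials from Mission Control (placeholders)
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=<server-name-from-credentials>,1433;"
    "DATABASE=<database-name>;"
    "UID=<username>;PWD=<password>;"
    "Encrypt=yes;"
)

cursor = conn.cursor()
# Read-only exploration only; the table name here is hypothetical
cursor.execute("SELECT TOP 10 * FROM dbo.Customer")
for row in cursor.fetchall():
    print(row)
conn.close()
```

Remember that the credentials expire after 30 minutes, so long-running sessions will need to be re-established with a fresh set.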

 

For more details, refer to the official Optimizely documentation on Generating Database Credentials.

For Part I, visit: Optimizely Mission Control – Part I

Optimizely Mission Control – Part I (Mon, 04 Aug 2025)

Optimizely provides powerful tools that make it easy to build, release, and manage cloud infrastructure efficiently.

Optimizely Mission Control Access

To use this tool, an Opti ID is required. Once you have an Opti ID, request that your organization grants access to your user account. Alternatively, you can raise a ticket with the Optimizely Support team along with approval from your project organization.

Key Actions

This tool provides various essential actions that can be performed for managing your cloud environments effectively. These include:

  • Restart Site

    • Restart the application in a specific environment to apply changes or resolve issues.

  • Database Backup

    • Create a backup of the environment’s database for debugging purposes.

  • Generate Database Credentials

    • Generate secure credentials to connect to the environment’s database.

  • Base Code Deploy

    • Deploy the base application code to the selected environment.

  • Extension Deployment

    • Deploy any custom extension changes.

  • Production User Files Sync

    • Synchronize user-generated files (e.g., media, documents) from the production environment to lower environments.

  • Production Database Sync

    • Sync the production database to another lower environment (such as a sandbox) to refresh its data.

Let’s walk through each of these actions step by step to understand how to perform them.

Restart Site

We can restart the site using the Mission Control tool. This option is handy when a website restart is required due to configuration changes. For example, updates to the storage or search provider often require a restart. Additionally, if an integration job gets stuck for any reason, the ability to restart the site becomes very helpful in restoring normal functionality.

How to restart the website

  1. Log in to Mission Control.
  2. Navigate to the Customers tab.

  3. Select the appropriate Customer.

  4. Choose the Environment where the restart is needed.

  5. Click on the Action dropdown in the left pane.

  6. Select Restart Site from the list.

  7. A pop-up will appear where you can either schedule the restart or click Continue for an immediate restart.

 

Reference: Restart Site – Optimizely Support

Database Backup

This is another useful feature available in Mission Control.

Using this option, we can take a backup from the Sandbox or Production instance and import it into the local environment. This helps us debug issues that occur in Sandbox or Production environments.

The backup file is generated with a .bacpac extension.

Steps to take a backup

  1. Log in to Mission Control.

  2. Navigate to the Customers tab.

  3. Select Database Backup from the list.

  4. A pop-up will appear prompting for a scheduled backup time.

  5. Set Skip Log to False to minimize the backup size.

  6. Click Continue and wait for the process to complete.

  7. Once finished, click on the provided link to download the backup file.

 

Reference: Database Backup – Optimizely Support

Stay tuned for the next blog to explore the remaining actions!

Creating Data Lakehouse using Amazon S3 and Athena (Thu, 31 Jul 2025)

As organizations accumulate massive amounts of structured and unstructured data, the need for flexible, scalable, and cost-effective data architectures becomes more important than ever. Growing data complexity, the demand for real-time insights, and the need for seamless integration across platforms all point in the same direction. This is where the Data Lakehouse, combining the best of data lakes and data warehouses, comes into play. In this blog post, we’ll walk through how to build a serverless, pay-per-query Data Lakehouse using Amazon S3 and Amazon Athena.

What Is a Data Lakehouse?

A Data Lakehouse is a modern architecture that blends the flexibility and scalability of data lakes with the structured querying capabilities and performance of data warehouses.

  • Data Lakes (e.g., Amazon S3) allow storing raw, unstructured, semi-structured, or structured data at scale.
  • Data Warehouses (e.g., Redshift, Snowflake) offer fast SQL-based analytics but can be expensive and rigid.

A lakehouse unifies both, enabling:

  • Schema enforcement and governance
  • Fast SQL querying over raw data
  • Simplified architecture and lower cost

[Figure: Lakehouse data flow]

Tools We’ll Use

  • Amazon S3: For storing structured or semi-structured data (CSV, JSON, Parquet, etc.)
  • Amazon Athena: For querying that data using standard SQL

This setup is perfect for teams that want low cost, fast setup, and minimal maintenance.

Step 1: Organize Your S3 Bucket

Structure your data in S3 in a way that supports performance:

s3://sample-lakehouse/
└── transactions/
    └── year=2024/
        └── month=04/
            └── data.parquet

Best practices:

  • Use columnar formats like Parquet or ORC
  • Partition by date or region for faster filtering
  • In addition, compressing files (e.g., Snappy or GZIP) can help reduce scan costs.
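As a quick sketch of what landing data into this layout looks like, here is a minimal boto3 upload; the bucket name matches the example above, while the local file is an assumption.

```python
import boto3

s3 = boto3.client("s3")

# Upload a Parquet file into the partitioned prefix scheme shown above
s3.upload_file(
    Filename="data.parquet",  # local file, assumed to exist
    Bucket="sample-lakehouse",
    Key="transactions/year=2024/month=04/data.parquet",
)
```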

Step 2: Create a Table in Athena

You can create an Athena table manually via SQL. Athena stores table metadata in a built-in data catalog (the AWS Glue Data Catalog).

CREATE EXTERNAL TABLE IF NOT EXISTS transactions (
    transaction_id STRING,
    customer_id STRING,
    amount DOUBLE,
    transaction_date STRING
)
PARTITIONED BY (year STRING, month STRING)
STORED AS PARQUET
LOCATION 's3://sample-lakehouse/transactions/';

Then run:

MSCK REPAIR TABLE transactions;

This tells Athena to scan the S3 directory and register your partitions.

Step 3: Query the Data

Once the table is created, querying is as simple as:

SELECT year, month, SUM(amount) AS total_sales
FROM transactions
WHERE year = '2024' AND month = '04'
GROUP BY year, month;
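The same query can also be submitted programmatically. Below is a minimal sketch using boto3; the Athena database name and the S3 output location are assumptions to adapt to your account.

```python
import boto3

athena = boto3.client("athena")

# Kick off the aggregation query; results land in the S3 output location
resp = athena.start_query_execution(
    QueryString=(
        "SELECT year, month, SUM(amount) AS total_sales "
        "FROM transactions "
        "WHERE year = '2024' AND month = '04' "
        "GROUP BY year, month"
    ),
    QueryExecutionContext={"Database": "default"},  # database name is an assumption
    ResultConfiguration={"OutputLocation": "s3://sample-lakehouse/athena-results/"},
)
print("Query execution id:", resp["QueryExecutionId"])
```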

Benefits of This Minimal Setup

  • Serverless: No infrastructure to manage
  • Fast Setup: Just create a table and query
  • Cost-effective: Pay only for storage and queries
  • Flexible: Works with various data formats
  • Scalable: Store petabytes in S3 with ease

Building a Data Lakehouse using Amazon S3 and Athena offers a modern, scalable, and cost-effective approach to data analytics. With minimal setup and no server management, you can unlock insights from your data quickly while maintaining flexibility and governance, reducing operational overhead and accelerating time-to-value. Whether you’re a startup or an enterprise, this setup provides a foundation for data-driven decision-making at scale and lets teams focus more on innovation and less on infrastructure.

Configuring Adjustment Period in Data Exchange (Mon, 28 Jul 2025)

An “adjustment period” refers to an accounting period used to adjust balances before the year-end close. Adjustment periods follow the twelfth regular period (“per12”) and are therefore commonly referred to as “per13”. The dates within an adjustment period overlap with regular accounting periods.

In Data Exchange, adjustments are processed in Period Mappings, where the mapping of adjustment periods between source and target applications is defined. When setting up the data load rule, data can be loaded to both regular and adjustment periods or to the adjustment period only, depending on the Options selected for that rule.

Configure the data load rule for an adjustment period in the following steps:

Step 1:

In Data Exchange, select Period Mapping under the Actions tab. In Global Period Mapping, insert the adjustment period using the format below:

Adj-23 as 01-12-2023 to 01-12-2023

[Screenshot: Open Period Mapping]

[Screenshot: Global Mapping]

Step 2:

In Source Mappings, select the source and target applications and click Add. Browse to and select the Source Period Key; when you select it, Data Management populates the Source Period and Source Period Year fields automatically.

Note: Ensure that the Source Period Key matches the source system.

Similarly, browse to and select the Target Period Key. When you select the Target Period Key, Data Management populates the Target Period Name, Target Period Month, and Target Period Year fields automatically.

Save.

[Screenshot: Source Mapping]

Step 3:

In the Integration tab, under Options, the user can view the ‘Period Mapping Type’ and ‘Include Adjustment Periods’ settings.

[Screenshot: Options tab in Edit Integration]

From Include Adjustment Period, select one of the following options for processing the periods:

  • No — Only regular periods will be processed. This is the default setting.
  • Yes — Both regular and adjustment periods are processed. If no adjustment period exists, only the regular period is processed.
  • Yes (Adjustment Only) — Only the adjustment period is processed. If none exists, the regular period is pulled.

Click Save.

Step 4:

Execute the data integration to retrieve data for the adjustment period.

[Screenshot: Data load selection]

After running the data integration, always validate the results to ensure data accuracy. Check the process logs and review the target application to confirm that all expected entries were loaded correctly. Investigate any discrepancies or errors shown in the log for timely resolution.

Helpful read: Multi-Year Multi-Period Data Load / Blogs / Perficient

Boost Cloud Efficiency: AWS Well-Architected Cost Tips (Mon, 09 Jun 2025)

In today’s cloud-first world, building a secure, high-performing, resilient, and efficient infrastructure is more critical than ever. That’s where the AWS Well-Architected Framework comes in: a powerful guide designed to help architects and developers make informed decisions and build better cloud-native solutions.

What is the AWS Well-Architected Framework?

The AWS Well-Architected Framework provides a consistent approach for evaluating and improving your cloud architecture. It’s built around six core pillars that represent key areas of focus for building robust and scalable systems:

  • Operational Excellence – Continuously monitor and improve systems and processes.
  • Security – Protect data, systems, and assets through risk assessments and mitigation strategies.
  • Reliability – Ensure workloads perform as intended and recover quickly from failures.
  • Performance Efficiency – Use resources efficiently and adapt to changing requirements.
  • Cost Optimization – Avoid unnecessary costs and maximize value.
  • Sustainability – Minimize environmental impact by optimizing resource usage and energy consumption.


Explore the AWS Well-Architected Framework here https://aws.amazon.com/architecture/well-architected

AWS Well-Architected Timeline

From time to time, AWS updates the framework and introduces new resources that we can follow to better address our use cases and improve our architectures.


AWS Well-Architected Tool

To help you apply these principles, AWS offers the Well-Architected Tool—a free service that guides you through evaluating your workloads against the six pillars.

How it Works:

  • Select a workload.
  • Answer a series of questions aligned with the framework.
  • Review insights and recommendations.
  • Generate reports and track improvements over time.

Try the AWS Well-Architected Tool here https://aws.amazon.com/well-architected-tool/

Go Deeper with Labs and Lenses

AWS also provides Well-Architected Labs (hands-on exercises for applying the framework) and Lenses that extend the core guidance to specific workload types, such as serverless and SaaS applications.

Deep Dive: Cost Optimization Pillar

Cost Optimization is not just about cutting costs—it’s about maximizing value. It ensures that your cloud investments align with business goals and scale efficiently.

Why It Matters:

  • Understand your spending patterns.
  • Ensure costs support growth, not hinder it.
  • Maintain control as usage scales.

5 Best Practices for Cost Optimization

  1. Practice Cloud Financial Management
    • Build a cost optimization team.
    • Foster collaboration between finance and tech teams.
    • Use budgets and forecasts.
    • Promote cost-aware processes and culture.
    • Quantify business value through automation and lifecycle management.
  2. Expenditure and Usage Awareness
    • Implement governance policies.
    • Monitor usage and costs in real time.
    • Decommission unused or underutilized resources.
  3. Use Cost-Effective Resources
    • Choose the right services and pricing models.
    • Match resource types and sizes to workload needs.
    • Plan for data transfer costs.
  4. Manage Demand and Supply
    • Use auto-scaling, throttling, and buffering to avoid over-provisioning.
    • Align resource supply with actual demand patterns.
  5. Optimize Over Time
    • Regularly review new AWS features and services.
    • Adopt innovations that reduce costs and improve performance.
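As one concrete way to put expenditure and usage awareness into practice, spend can be pulled programmatically. Below is a minimal sketch using boto3’s Cost Explorer client; the date range is an illustrative assumption.

```python
import boto3

ce = boto3.client("ce")  # Cost Explorer

# One month of unblended cost, grouped by service
resp = ce.get_cost_and_usage(
    TimePeriod={"Start": "2025-05-01", "End": "2025-06-01"},  # example range
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
)
for group in resp["ResultsByTime"][0]["Groups"]:
    service = group["Keys"][0]
    amount = float(group["Metrics"]["UnblendedCost"]["Amount"])
    print(f"{service}: ${amount:.2f}")
```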

Conclusion

The AWS Well-Architected Framework is more than a checklist—it’s a mindset. By embracing its principles, especially cost optimization, you can build cloud environments that are not only efficient and scalable but also financially sustainable.

Perficient Achieves Premier Partner Status with Snowflake (Thu, 05 Jun 2025)

We are proud to announce that Perficient has officially achieved Premier Partner status with Snowflake, a recognition that underscores our strategic commitment to delivering transformative data and AI solutions in the cloud. 

This milestone marks a significant step forward in our longstanding partnership with Snowflake. Advancing from Select to Premier Partner is more than a status update—it’s a clear reflection of our proven expertise, consistent delivery of value-driven outcomes, and dedication to helping organizations solve their most complex data challenges. 

“Reaching Premier Partner status with Snowflake is a testament to our team’s relentless focus on innovation and excellence,” said Michael Patterson, Managing Director, Data and Analytics, Perficient. “We are proud to work closely with Snowflake to help our clients accelerate their cloud data strategies, unlock AI-driven insights, and create real business impact. Our partnership is built on a shared commitment to delivering meaningful results for our customers.” 

Driving Value Through Strategic Partnership 

Organizations today are navigating an increasingly data-rich, AI-enabled landscape. They need trusted partners to help them move with speed, scale, and precision. At Perficient, our dedicated Snowflake practice is purpose-built to meet that need. 

We help enterprises modernize their data platforms, adopt real-time analytics, and implement responsible AI—delivering scalable, cloud-native architectures powered by the Snowflake Data Cloud and Snowflake Cortex AI. As a Premier Partner, we bring the right strategy, expertise, and technical depth to ensure our clients can confidently unlock the full potential of their data. 

Our elevation to Premier Partner status affirms the strength of our solutions, the trust of our customers, and the momentum behind our vision to lead in the next generation of data and AI transformation. 

Learn more about Perficient’s Snowflake expertise and how we help businesses design and implement intelligent data solutions that drive innovation and deliver measurable value. 

IOT and API Integration With MuleSoft: The Road to Seamless Connectivity (Wed, 21 May 2025)

In today’s hyper-connected world, the Internet of Things (IoT) is transforming industries, from smart manufacturing to intelligent healthcare. However, the real potential of IoT lies in connecting continuously with enterprise systems, providing real-time insights and automating workflows. This is where MuleSoft’s Anypoint Platform comes in: a disruptor in integrating IoT devices and APIs to create a connected ecosystem. This blog explains how MuleSoft sets the stage for seamless connectivity and provides a strong foundation for IoT and API integration that goes beyond a standalone dashboard to offer scalability, security, and efficiency.

Objective

In this blog, I will show MuleSoft’s ability to integrate IoT devices with enterprise systems through API connectivity, focusing on real-time data processing. I will provide an example of how MuleSoft’s Anypoint Platform connects to an MQTT broker and processes IoT device sensor data. The example highlights MuleSoft’s ability to handle IoT protocols like MQTT and transform data for insights.

How Does MuleSoft Facilitate IoT Integration?

MuleSoft’s Anypoint Platform combines API-led connectivity, native protocol support, and a comprehensive integration framework to handle the complexities of IoT integration. This is how MuleSoft makes IoT integration manageable:

  1. API Connectivity for Scalable Ecosystems

MuleSoft’s API strategy categorizes integrations into System, Process, and Experience APIs, allowing modular connections between IoT devices and enterprise systems. For example, in a smart city, System APIs gather data from traffic sensors, Process APIs aggregate it, and Experience APIs surface insights in a dashboard. This scalability avoids the chaos of point-to-point integrations, a fault in most visualization-focused tools.

  2. Native IoT Protocol Support

IoT devices rely on protocols such as MQTT, AMQP, and CoAP, all of which MuleSoft supports. This enables direct communication between sensors and gateways without extra middleware. For example, MuleSoft can route MQTT data from temperature sensors to a cloud platform such as Azure IoT Hub, where other tools would require custom plugins.

  3. Real-Time Processing and Automation

IoT requires real-time data processing, and MuleSoft’s runtime engine processes data streams in real time while supporting automation. For example, if a factory sensor picks up a fault, MuleSoft can invoke an API to notify maintenance teams and update systems. MuleSoft integrates visualization with actionable workflows.

  4. Pre-Built Connectors for Setup

MuleSoft’s Anypoint Exchange provides connectors for IoT platforms (e.g., AWS IoT) and enterprise systems (e.g., Salesforce). In healthcare, connectors link patient wearables to EHRs, reducing development time. This plug-and-play approach beats custom integrations commonly required by other tools.

  5. Centralized Management and Security

IoT devices manage sensitive information, and MuleSoft maintains security through API encryption and OAuth. Its Management Center provides a dashboard to track device health and data flows, offering centralized control that standalone dashboard applications cannot provide without additional infrastructure.

  6. Hybrid and Scalable Deployments

MuleSoft’s hybrid model supports both on-premises and cloud environments, providing flexibility for IoT deployments. Its scalability handles growing networks, such as fleets of connected vehicles, making it a future-proof solution.

Building a Simple IoT Integration with MuleSoft

To demonstrate MuleSoft’s IoT integration, below I have created a simple flow in Anypoint Studio that connects to a public MQTT broker, processes sensor data, and logs it to the console. The flow uses simulated IoT sensor data published with MQTT Explorer. The following are the steps for the Mule API flow:

[Figure: Mule API flowchart]

Step 1: Setting Up the Mule Flow

In Anypoint Studio, create a new Mule project (e.g., ‘IoT-MQTT-Demo’). Design a flow with an MQTT Connector to connect to the broker, a Transform Message component to process data, and a Logger to output results.


Step 2: Configuring the MQTT Connector

Configure the MQTT Connector properties. In General Settings, point the connector at a public broker (tcp://test.mosquitto.org:1883). Add the topic filter iot/sensor/data and select QoS AT_MOST_ONCE.


Step 3: Transforming the Data

Use DataWeave to parse the incoming JSON payload (e.g., {"temperature": 25.5}) and add a timestamp. The DataWeave code is:

```
%dw 2.0
output application/json
---
{
    sensor: "Temperature",
    value: read(payload, "application/json").temperature default "",
    timestamp: now()
}
```


Step 4: Connect to MQTT

Click on Connections and use the credentials shown below to connect MQTT Explorer to the broker:


Step 5: Simulating IoT Data

Once MQTT Explorer is connected, publish a sample message {"temperature": 28} to the topic iot/sensor/data; the broker delivers it to the Mule flow as shown below.
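If you would rather script the test message than click through MQTT Explorer, the snippet below performs the same publish from Python using the paho-mqtt package (1.x-style constructor; version 2.x additionally requires a callback API version argument). The broker, topic, and payload match the walkthrough.

```python
import json
import time
import paho.mqtt.client as mqtt

client = mqtt.Client()  # paho-mqtt 1.x constructor
client.connect("test.mosquitto.org", 1883)
client.loop_start()  # background network loop so publish() can flush

# Publish the same sample payload used in this step
client.publish("iot/sensor/data", json.dumps({"temperature": 28}), qos=0)

time.sleep(1)  # give the loop a moment to send
client.loop_stop()
client.disconnect()
```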


Step 6: Logging the Output

Run the API and publish the message from MQTT Explorer; the processed data will be logged to the console. An example log entry is shown below:


The above example highlights MuleSoft’s process for connecting IoT devices, processing data, and preparing it for visualization or automation.

Challenges in IoT Integration and MuleSoft’s Solutions

IoT integration faces challenges:

  • Device and Protocol Diversity: IoT ecosystems involve different devices, such as sensors or gateways, using protocols like MQTT or HTTP with different data formats, such as JSON, XML, or binary.
  • Data Volume and Velocity: IoT devices generate high volumes of real-time data, which requires efficient processing to avoid bottlenecks.
  • Security and Authentication: IoT devices are often weakly secured and require encrypted communication (e.g., TLS) and device authentication (e.g., OAuth).
  • Data Transformation and Processing: IoT devices often send binary data, which must be transformed (e.g., from binary to JSON) and enriched before use.

The Future of IoT with MuleSoft

The future of IoT with MuleSoft is promising. The Anypoint Platform addresses the critical integration issues above: it connects diverse IoT devices and protocols, such as MQTT, to keep data flowing between ecosystems; it provides real-time data processing and analytics integration; and it secures communication with TLS and OAuth.

Conclusion

MuleSoft’s Anypoint Platform reshapes IoT and API integration by providing a scalable, secure, real-time solution for connecting devices to enterprise systems. As shown in the example, MuleSoft processes MQTT-based IoT data and transforms it into useful insights without external scripts or custom middleware. By addressing challenges like data volume and security, MuleSoft provides a platform to build IoT ecosystems that deliver automation and insights. As IoT keeps growing, MuleSoft’s API connectivity and native protocol support position it as an innovator, enabling new connectivity in smart cities, healthcare, and beyond. Discover MuleSoft’s Anypoint Platform to unlock the full potential of your IoT projects and set the stage for a connected future.

Strategic Cloud Partner: Key to Business Success, Not Just Tech (Tue, 13 May 2025)

Cloud is easy—until it isn’t.

Perficient’s Edge: A Strategic Cloud Partner Focused on Business Outcomes

Cloud adoption has skyrocketed. Multi-cloud. Hybrid cloud. AI-optimized workloads. Clients are moving fast, but many are moving blindly. The result? High costs, low returns, and strategies that stall before they scale.

That’s why this moment matters. Now, more than ever, your clients need a partner who brings more than just cloud expertise—they need business insight, strategic clarity, and real results.

In our latest We Are Perficient episode, we sat down with Kiran Dandu, Perficient’s Managing Director, to uncover exactly how we’re helping clients not just adopt cloud, but win with it.

If you’re in sales, this conversation is your cheat sheet for leading smarter cloud conversations with confidence.

Key #1: Start with Business Outcomes, Not Infrastructure

Kiran makes one thing clear from the start: “We don’t start with cloud. We start with what our clients want to achieve.”

At Perficient, cloud is a means to a business end. That’s why we begin every engagement by aligning cloud architecture with long-term business objectives—not just technical requirements.

Perficient’s Envision Framework: Aligning Cloud with Business Objectives

Through this framework, we help clients:

  • Define their ideal outcomes
  • Assess their existing workloads
  • Select the right blend of public, private, hybrid, or multi-cloud models
  • Optimize performance and cost every step of the way

This outcome-first mindset isn’t just smarter—it’s what sets Perficient apart from traditional cloud vendors.

Key #2: AI in the Cloud – Delivering Millions in Savings Today

Forget the hype—AI is already transforming how we operate in the cloud. Kiran breaks down the four key areas where Perficient is integrating AI to drive real value:

  • DevOps automation: AI accelerates code testing and deployment, reducing errors and speeding up time-to-market.
  • Performance monitoring: Intelligent tools predict and prevent downtime before it happens.
  • Cost optimization: AI identifies underused resources, helping clients cut waste and invest smarter.
  • Security and compliance: With real-time threat detection and automated incident response, clients stay protected 24/7.

The result? A cloud strategy that’s not just scalable, but self-improving.

Key #3: Beyond Cloud Migration to Continuous Innovation

Moving to the cloud isn’t the end goal—it’s just the beginning.

Kiran emphasizes how Perficient’s global delivery model and agile methodology empower clients to not only migrate, but to evolve and innovate faster. Our teams help organizations:

  • Integrate complex systems seamlessly
  • Continuously improve infrastructure as business needs change
  • Foster agility across every department—not just IT

And it’s not just theory. Our global consultants, including the growing talent across LATAM, are delivering on this promise every day.

“The success of our cloud group is really going to drive the success of the organization.”
Kiran Dandu

Global Talent, Local Impact: The Power of a Diverse Strategic Cloud Partner

While visiting our offices in Medellín, Colombia, Kiran highlighted the value of diversity in driving cloud success:

“This reminds me of India in many ways—there’s talent, warmth, and incredible potential here.”

That’s why Perficient is investing in uniting its global cloud teams. The cross-cultural collaboration between North America, LATAM, Europe, and India isn’t just a feel-good story—it’s the engine behind our delivery speed, technical excellence, and customer success.

Key Takeaways for Sales: Lead Smarter Cloud Conversations

If your client is talking about the cloud—and trust us, they are—this interview is part of your toolkit.
You’ll walk away understanding:

  • Why Perficient doesn’t just build cloud platforms—we build cloud strategies that deliver
  • How AI and automation are creating real-time ROI for our clients
  • What makes our global model the best-kept secret in cloud consulting
  • And how to speak the language of business outcomes, not just cloud buzzwords

Watch the Full Interview: Deep Dive with Kiran Dandu

Want to hear directly from the source? Don’t miss Kiran’s full interview, packed with strategic insights that will elevate your next sales conversation.

Watch now and discover how Perficient is transforming cloud into a competitive advantage.

Choose Perficient: Your Client’s Strategic Cloud Partner for a Competitive Edge

Perficient is not just another cloud partner—we’re your client’s competitive edge. Let’s start leading the cloud conversation like it.

What’s the point of Headless? (Wed, 23 Apr 2025)

In the ever-evolving world of web development, the term “headless” is as popular today as it ever has been. But what does it really mean, and why should you care? Let’s dive into the concept of headless architecture, its benefits, and why Sitecore is leading the charge in this space.

What is Headless?

At its core, a headless CMS is a software design approach that separates the front-end (what users see) from the back-end (where content is managed). Unlike traditional CMS platforms that tightly couple content management with presentation, headless CMSs use APIs to deliver content anywhere: web, mobile app, kiosk, or a smart device. In many ways, the originator of headless architecture is Jamstack, which stands for JavaScript, APIs, and Markup. Instead of relying on traditional monolithic architectures, Jamstack applications decouple the web experience from the back-end, making them more scalable, flexible, and high-performing. JavaScript handles dynamic interactions on the front-end, allowing developers to build fast and modern user experiences. APIs provide a way to pull in content and services from various sources, and the website can also push data to APIs, such as form submissions, custom analytics events, and other user-driven data. Markup refers to pre-built HTML that can be served efficiently, often generated using static site generators or frameworks like Next.js.

Why Go Headless?

You might be wondering, “Why would I build my website entirely in JavaScript when it’s mostly content?” That’s a valid question and I thought the same when Sitecore JSS first came out. Headless though is less about “building your site in JavaScript” and more about the benefits of the architecture.

Flexibility

Headless CMSs allow developers to work with any front-end framework they choose, whether it’s React, Vue, Angular, or whatever your favorite framework might be. This means teams are not locked into the templating system of a traditional CMS or the underlying back-end technology.

Performance

Speed is everything in today’s digital landscape. Studies show that even a slight delay in page load time can significantly impact user engagement and conversion rates. Headless CMSs improve performance by enabling static site generation (SSG) and incremental static regeneration (ISR), both of which ensure lightning-fast load times. Instead of a server processing each request from a user, static content can be served from a global CDN, which is a modern composable architecture. Of course, server-side rendering is also still an option and can be very performant with the right caching strategy.

Omnichannel Delivery

Content today is consumed on more than just websites. Whether it’s a mobile app, smart device, digital kiosk, or even a wearable, headless architecture ensures content can be delivered anywhere through APIs. This makes it easier for brands to maintain a consistent digital experience across multiple platforms without duplicating content.

Security

Traditional CMSs are often vulnerable to security threats because they expose both the content management system and the front-end to potential attacks. In contrast, headless CMSs separate these layers, reducing the attack surface. With content served via APIs and front-end files hosted on secure CDNs, businesses benefit from enhanced security and fewer maintenance headaches.

Scalability

Handling high traffic volumes is a challenge for traditional CMS platforms, especially during peak times. Since headless solutions rely on cloud-based infrastructure, they can scale dynamically without requiring expensive hardware upgrades. Whether you’re serving thousands or millions of users, headless architecture ensures your site remains stable and responsive.

Why Sitecore for Headless?

There are plenty of options in the headless CMS market, but Sitecore offers a unique blend of features that make it stand out. With XM Cloud, Sitecore provides a fully SaaS-based solution—no more infrastructure headaches, no more costly upgrades, and uptime and reliability are now handled by Sitecore.

Sitecore’s hybrid headless approach allows organizations to transition at their own pace, leveraging the benefits of headless while maintaining familiar content management workflows. Hybrid headless gives content authors complete freedom and flexibility to build content however they’d like – where most purely headless content management systems are more rigid on how pages are built.

As digital experiences become more dynamic and user expectations continue to rise, headless CMS solutions offer the agility businesses need. If you’re looking to modernize your digital strategy, now is the time to embrace headless.

Meet Perficient at Data Summit 2025 (Tue, 22 Apr 2025)

Data Summit 2025 is just around the corner, and we’re excited to connect, learn, and share ideas with fellow leaders in the data and AI space. As the pace of innovation accelerates, events like this offer a unique opportunity to engage with peers, discover groundbreaking solutions, and discuss the future of data-driven transformation. 

We caught up with Jerry Locke, a data solutions expert at Perficient, who’s not only attending the event but also taking the stage as a speaker. Here’s what he had to say about this year’s conference and why it matters: 

Why is this event important for the data industry? 

“Anytime you can meet outside of the screen is always a good thing. For me, it’s all about learning, networking, and inspiration. The world of data is expanding at an unprecedented pace. Global data volume is projected to reach over 180 zettabytes (or 180 trillion gigabytes) by 2025—tripling from just 64 zettabytes in 2020. That’s a massive jump. The question we need to ask is: What are modern organizations doing to not only secure all this data but also use it to unlock new business opportunities? That’s what I’m looking to explore at this summit.” 

What topics do you think will be top-of-mind for attendees this year? 

“I’m especially interested in the intersection of data engineering and AI. I’ve been lucky to work on modern data teams where we’ve adopted CI/CD pipelines and scalable architectures. AI has completely transformed how we manage data pipelines—mostly for the better. The conversation this year will likely revolve around how to continue that momentum while solving real-world challenges.” 

Are there any sessions you’re particularly excited to attend? 

“My plan is to soak in as many sessions on data and AI as possible. I’m especially curious about the use cases being shared, how organizations are applying these technologies today, and more importantly, how they plan to evolve them over the next few years.” 

What makes this event special for you, personally? 

“I’ve never been to this event before, but several of my peers have, and they spoke highly of the experience. Beyond the networking, I’m really looking forward to being inspired by the incredible work others are doing. As a speaker, I’m honored to be presenting on serverless engineering in today’s cloud-first world. I’m hoping to not only share insights but also get thoughtful feedback from the audience and my peers. Ultimately, I want to learn just as much from the people in the room as they might learn from me.” 

What’s one thing you hope listeners take away from your presentation? 

“My main takeaway is simple: start. If your data isn’t on the cloud yet, start that journey. If your engineering isn’t modernized, begin that process. Serverless is a key part of modern data engineering, but the real goal is enabling fast, informed decision-making through your data. It won’t always be easy—but it will be worth it.

I also hope that listeners understand the importance of composable data systems. If you’re building or working with data systems, composability gives you agility, scalability, and future-proofing. So instead of a big, all-in-one data platform (monolith), you get a flexible architecture where you can plug in best-in-class tools for each part of your data stack. Composable data systems let you choose the best tool for each job, swap out or upgrade parts without rewriting everything, and scale or customize workflows as your needs evolve.” 

Don’t miss Perficient at Data Summit 2025. A global digital consultancy, Perficient is committed to partnering with clients to tackle complex business challenges and accelerate transformative growth. 

Personalization in Sitecore XM Cloud: What’s New, What’s Different, and What It’s Built On! (Fri, 28 Feb 2025)

Personalization has always been one of the most sought-after features for brands, allowing them to deliver tailored digital experiences. Sitecore’s CMS products XP and XM have built-in personalization capabilities, ensuring that the right content reaches the right contacts, for example, by showing, hiding, or adjusting content.

In recent years, the shift towards headless solutions has gained significant momentum, enabling greater flexibility, omnichannel delivery, and seamless integrations with modern digital ecosystems, making Sitecore XM Cloud the obvious next step in a brand’s digital transformation. However, with the shift from Sitecore XP to Sitecore XM Cloud, customers will notice a significant change in how personalization works behind the scenes. While XM Cloud provides a cloud-first, headless, and scalable solution, it does come with certain limitations in personalization compared to XP. This blog will explore how personalization in Sitecore XM Cloud differs from XP, its strengths, and how marketers and developers can still leverage its capabilities effectively.

Personalization in Sitecore XM Cloud

Personalization in XM Cloud, also known as Embedded Personalization in XM Cloud pages, offers a predefined set of built-in, page-based personalization features. Users can easily define audience segments and serve them customized page variations. The embedded personalization capabilities include real-time data stream collection, predefined conditions, and comprehensive site and page analytics.

The Differences

For those migrating from Sitecore XP, the most significant change is the absence of a provision to create custom conditions for rule-based personalization. Sitecore XP provided extensive flexibility, allowing users to define custom conditions and personalize based on deep visitor analytics. However, in XM Cloud, personalization is streamlined and focused on a more structured OOTB (Out-of-the-Box) approach.

Here’s what changes:

  • No Custom Conditions: XM Cloud does not support creating custom personalization rules, unlike XP.
  • Page Builder Integration: Personalization is managed within the Personalize tab inside the Page Builder.
  • Page Variants Instead of Component-Level Personalization: XM Cloud primarily offers page-level personalization, meaning different page variants can be displayed based on predefined conditions.
  • Simplified OOTB Conditions: The personalization engine in XM Cloud allows authors to define conditions using a pre-defined set of rules but with limited extensibility.

Elements of Personalization in Sitecore XM Cloud

The personalization model in XM Cloud is straightforward and designed for ease of use. Here’s how it functions:

  • Using the Personalize Tab: Within the Page Builder, authors can access the Personalize tab to define different user experiences.
  • Page Variants: Instead of component-level changes, XM Cloud enables authors to create and manage different page versions to be served based on specific conditions.
  • Defining Conditions: While the conditions are OOTB and non-extensible, they still allow for basic personalization, such as user location, visit, point of sale, device type, etc.
  • Analyze and Optimize: The Analyze section helps track performance, providing insights into which variants perform better.

Personalization in Sitecore XM Cloud takes a different approach than XP. Despite its limitations, personalization in XM Cloud still offers benefits: it favors simplicity and scalability over deep customization. While some brands may feel the impact of losing advanced personalization rules, the benefits of speed, cloud-native architecture, and ease of use make XM Cloud an attractive option. Combining XM Cloud with Sitecore Personalize can be a game-changer for those needing deeper personalization.

In the next few blogs, we’ll explore the details of embedded personalization and how Sitecore Personalize can enhance personalization in XM Cloud.
