Setting Up Virtual WAN (VWAN) in Azure Cloud: A Comprehensive Guide – I

As businesses expand their global footprint, the need for a flexible, scalable, and secure networking solution becomes paramount. Enter Azure Virtual WAN (VWAN), a cloud-based offering designed to simplify and centralize network management while ensuring top-notch performance. Let’s dive into what Azure VWAN offers and how to set it up effectively.

What is Azure Virtual WAN (VWAN)?

Azure Virtual WAN, or VWAN, is a cloud-based network solution that provides secure, seamless, and optimized connectivity across hybrid and multi-cloud environments.

It provides:

I. Flexibility for Dynamic Network Requirements

  • Adaptable Connectivity: Azure VWAN supports various connectivity options, including ExpressRoute, Site-to-Site VPN, and Point-to-Site VPN, ensuring compatibility with diverse environments like on-premises data centers, branch offices, and remote workers.
  • Scale On-Demand: As network requirements grow or change, Azure VWAN allows you to dynamically add or remove connections, integrate new virtual networks (VNets), or scale bandwidth based on traffic needs.
  • Global Reach: Azure VWAN enables connectivity across regions and countries using Microsoft’s extensive global network, ensuring that organizations with distributed operations stay connected.
  • Hybrid and Multi-Cloud Integration: Azure VWAN supports hybrid setups (on-premises + cloud) and integration with other public cloud providers, providing the flexibility to align with business strategies.

II. Improved Management with Centralized Controls

  • Unified Control Plane: Azure VWAN provides a centralized dashboard within the Azure Portal to manage all networking components, such as VNets, branches, VPNs, and ExpressRoute circuits.
  • Simplified Configuration: Automated setup and policy management make deploying new network segments, traffic routing, and security configurations easy.
  • Network Insights: Built-in monitoring and diagnostic tools offer deep visibility into network performance, allowing administrators to quickly identify and resolve issues.
  • Policy Enforcement: Azure VWAN enables consistent policy enforcement across regions and resources, improving governance and compliance with organizational security standards.

III. High Performance Leveraging Microsoft’s Global Backbone Infrastructure

  • Low Latency and High Throughput: Azure VWAN utilizes Microsoft’s global backbone network, known for its reliability and speed, to provide high-performance connectivity across regions and to Azure services.
  • Optimized Traffic Routing: Intelligent routing ensures that traffic takes the most efficient path across the network, reducing latency for applications and end users.
  • Built-in Resilience: Microsoft’s backbone infrastructure includes redundant pathways and fault-tolerant systems, ensuring high availability and minimizing the risk of network downtime.
  • Proximity to End Users: With a global footprint of Azure regions and points of presence (PoPs), Azure VWAN ensures proximity to end users, improving application responsiveness and user experience.

High-level architecture of VWAN

This diagram depicts a high-level architecture of Azure Virtual WAN and its connectivity components.

 

[Image: High-level architecture of Azure Virtual WAN]

 

  • HQ/DC (Headquarters/Data Centre): Represents the organization’s primary data center or headquarters hosting critical IT infrastructure and services. Acts as a centralized hub for the organization’s on-premises infrastructure. Typically includes servers, storage systems, and applications that need to communicate with resources in Azure.
  • Branches: Represents the organization’s regional or local office locations. Serves as local hubs for smaller, decentralized operations. Each branch connects to Azure to access cloud-hosted resources, applications, and services and communicates with other branches or HQ/DC. The HQ/DC and branches communicate with each other and Azure resources through the Azure Virtual WAN.
  • Virtual WAN Hub: At the heart of Azure VWAN is the Virtual WAN Hub, a central node that simplifies traffic management between connected networks. This hub acts as the control point for routing and ensures efficient data flow.
  • ExpressRoute: Establishes a private connection between the on-premises network and Azure, bypassing the public internet. It uses BGP for route exchange, ensuring secure and efficient connectivity.
  • VNet Peering: Links Azure Virtual Networks directly, enabling low-latency, high-bandwidth communication.
    • Intra-Region Peering: Connects VNets within the same region.
    • Global Peering: Bridges VNets across different regions.
  • Point-to-Site (P2S) VPN: Ideal for individual users or small teams, this allows devices to securely connect to Azure resources over the internet.
  • Site-to-Site (S2S) VPN: Connects the on-premises network to Azure, enabling secure data exchange between systems.

Benefits of VWAN

  • Scalability: Expand the network effortlessly as the business grows.
  • Cost-Efficiency: Reduce hardware expenses by leveraging cloud-based solutions.
  • Global Reach: Easily connect offices and resources worldwide.
  • Enhanced Performance: Optimize data transfer paths for better reliability and speed.

Setting Up VWAN in Azure

Follow these steps to configure Azure VWAN:

Step 1: Create a Virtual WAN Resource

  • Log in to the Azure Portal and create a Virtual WAN resource. This serves as the foundation of the network architecture.

Step 2: Configure a Virtual WAN Hub

  • Create the Virtual WAN Hub to act as the central traffic manager, and configure it to meet the company’s needs.

Step 3: Establish Connections

  • Configure VPN Gateways for secure, encrypted connections.
  • Use ExpressRoute for private, high-performance connectivity.

Step 4: Link VNets

  • Create Azure Virtual Networks and link them to the WAN Hub. This integration ensures seamless interaction between resources; a scripted sketch of the overall setup follows below.
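For teams that prefer scripting to the portal, the same setup can be sketched with the Azure CLI. This is a minimal, illustrative sequence rather than a production script: it assumes the virtual-wan CLI extension is available, and the names MyRG, MyVWAN, MyHub, and MyVNet (plus the eastus region and hub address space) are placeholders for your own values. Step 3’s VPN and ExpressRoute gateways are omitted for brevity.

# One-time: add the Virtual WAN extension to the Azure CLI
az extension add --name virtual-wan

# Step 1: Create the Virtual WAN resource
az network vwan create --name MyVWAN --resource-group MyRG --location eastus

# Step 2: Create the Virtual WAN hub with its own address space
az network vhub create --name MyHub --resource-group MyRG --vwan MyVWAN \
    --address-prefix 10.10.0.0/24 --location eastus

# Step 4: Connect an existing VNet to the hub
az network vhub connection create --name MyVNetConn --resource-group MyRG \
    --vhub-name MyHub --remote-vnet MyVNet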

Monitoring and Troubleshooting VWAN

Azure Monitor

Azure Monitor tracks performance, availability, and network health in real time and provides insights into traffic patterns, latency, and resource usage.

Network Watcher

Diagnose network issues with tools like packet capture and connection troubleshooting. Quickly identify and resolve any bottlenecks or disruptions.

Alerts and Logs

Set up alerts for critical issues such as connectivity drops or security breaches. Use detailed logs to analyze network events and maintain robust auditing.

Final Thoughts

Azure VWAN is a powerful tool for businesses looking to unify and optimize their global networking strategy. Organizations can ensure secure, scalable, and efficient connectivity by leveraging features like ExpressRoute, VNet Peering, and VPN Gateways. With the correct setup and monitoring tools, managing complex networks becomes a seamless experience.

Power Apps and Components: Understanding Components and Their Role in App Development

In this blog, we’ll explore Power Apps, the concept of components, and how they enhance app efficiency and scalability.

What is Power Apps?

Microsoft Power Apps is a low-code/no-code platform that allows users and businesses to create custom applications with minimal coding. It enables app development for web browsers, mobile devices, and tablets. It seamlessly integrates with Microsoft services like SharePoint, Office 365, Teams, and third-party data sources such as Salesforce and SQL databases.

Now that we have a brief introduction to Power Apps, let’s explore its key aspect: Components. These powerful building blocks can take your Power Apps development to the next level!

What Are Components, and How Do They Strengthen Power Apps?

Power Apps components are reusable elements that help create consistent, efficient app designs. They encapsulate functionality, styling, and logic into a single unit, making them easy to use across multiple screens and apps.

Why Should We Use Components in Power Apps?

1) Reusability: Reduce duplication and maintain consistency across different apps.

2) Efficiency: Speed up development by streamlining app creation and updates.

3) Maintainability: Simplify updates with centralized component management.

4) Customization: Tailor components to specific business needs while ensuring uniformity.

5) Collaboration: Enable multiple developers to work on different components simultaneously.

Building a Menu Component in Power Apps

Now that we understand the components of Power Apps, let’s build one from scratch. In this blog, we will create a Menu component, not just any regular menu bar. This menu bar is easy to customize, allows adding new items using collections, and leverages the custom properties of components (which will be explained later in the blog).

Demo for the component before building

For this component, we will use Collections—like data arrays—along with the component’s custom properties to seamlessly pass data from the screen to the element. This allows users to easily add new menu items by simply updating the collection, and the changes will automatically be reflected in the component. Users only need to adjust the height and width as required.

Step 1: Setting Up the Initial Component and Menu Data

Before we get into the actual logic of the component, let’s add the Menu component and the data for the menu items.

1. Menu Items Data

Below is a preview of how the collection should be initialized in the app’s OnStart property. This serves as a demo of the expected data structure, but you can customize it with any menu items based on your needs.

The Menu items collection

In this collection shown above, we have:

  • MenuGalItems: The collection name that can be referenced throughout the app.
  • ScreenName: The page name to navigate to.
  • Screen: The actual screen object you want to navigate to.
  • Icon: The icon to display to the user in the menu.
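Written out as Power Fx, the OnStart initialization might look like the following sketch (the screen names and icons are placeholders; substitute the screens and icons of your own app):

ClearCollect(
    MenuGalItems,
    { ScreenName: "Home", Screen: HomeScreen, Icon: Icon.Home },
    { ScreenName: "Settings", Screen: SettingsScreen, Icon: Icon.Settings },
    { ScreenName: "Documents", Screen: DocumentsScreen, Icon: Icon.Document }
)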

2. Menu Component

To add the component, navigate to the Components tab in the Power Apps tree view. You should see an option called New component; click it, and a new component will be created.

Add Component option in PowerApps Component screen

Once clicked, you will see the screen below with your newly created component.

New Component added to the application

The above image shows that it looks similar to the screen tab, with a properties pane, a tree view, and the ability to add controls to the component just like you would on a screen. The only difference is that you cannot test the component directly; you need to add it to a screen to test it.

If you notice in the properties tab, there is a section called Custom Properties at the bottom. These properties are game changers for components as they function as input and output variables—input allows data to flow from the screen to the component, and output allows data to pass from the component back to the screen.

In this case, we only need Input properties since we do not need to pass any data back to the screen; we just need the menu items to be displayed.

When creating a custom property, you will see a new tab where you can add a name, choose the property type (Input or Output), and select the data type (such as Number, Date, Table, Boolean, and so on…).

Custom Properties Component Power Apps

The picture (above) shows the properties you can add when creating a custom property. You also have the option to Raise the OnReset when the property’s value changes.

So, we would be creating two input properties:

  1. ComponentSize (Boolean): To control the opening and closing of the menu
  2. MenuCollection (Table): A table used to display the menu items created previously

Custom Input Properties

This is the basic setup you need before starting to create the component.

Step 2: Adding the Necessary Controls to the Component

First, let’s add the menu icon that will expand and collapse the menu when clicked, along with the gallery where we will display the menu items:

Component Controls

In this case, I am using a Hamburger icon as my menu icon, but you can choose any icon. Once you’ve added the icon and gallery, they should look like the image below.

Current Component state

In this case, I have adjusted the width and height of my component to fit my needs (Width: 165, Height: 640), but you can set them to whatever works best for you.

To display the menu items, we will add the following controls within the gallery:

  • Icon control to show the icon defined in the menu items collection.
  • Label to display the menu item name.
  • Button for styling (to create a rounded box design).

Added control to component

The component should now look like this:

Added Controls to component

 

You can see that the component is starting to take shape. Now that we have added the required controls, let’s build the logic that brings the component to life.

Step 3: Adding Logic to the Component

First, let’s set up the custom property of the Menu Items. Navigate to the Components property tab (highlighted in red in the image), select the MenuCollection custom property, and check what it displays.

Custom Input Property before edit

If you look at the table above, it isn’t in the same format as the data we defined earlier, so let’s fix that. Update the property names and their corresponding values to match the correct format. These are just base values to help the component understand what to expect. The final table should look something like this:

Custom Input component after edit

Let’s set up a variable that will control the gallery’s visibility. On the Hamburger icon’s OnSelect property, create a variable called MenuClicked and set it as shown below:

Visibility Variable for gallery
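In text form, the icon’s OnSelect is likely a simple toggle of that variable (a sketch; the screenshot above carries the original formula):

// OnSelect of the Hamburger icon: flip the menu between open and closed
Set(MenuClicked, !MenuClicked)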

Add this variable to the Gallery’s Visible property. This will allow the gallery to toggle open and close when you click the Hamburger icon (as shown in the demo below):

Visible Logic for component demo

If you notice, when the gallery closes, the white component space remains visible. To fix this, we can use another Component Property to dynamically adjust the width and height of the component. This ensures it doesn’t interfere with other controls on the screen.

Before proceeding to the next part, enable the Access app scope setting (shown below). This setting is essential because it allows you to use the variables created on the component side within the component itself. You will find this setting in the main component’s property pane on the right.

Access Scope image

 

Just like we did for the MenuCollection property, go to the component’s property dropdown and set the value of the ComponentSize property to the Boolean variable (MenuClicked) we created, as shown below:

Component Size

Now, we can use this property to adjust the width and height of the component when it is clicked.

Width

Custom Width


Height

Custom Height
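In text form, the two property formulas likely resemble this sketch. The pixel values are the ones used in this demo and the collapsed size of 50 is an assumption; replace MenuComponent with your component’s actual name:

// Width: full width when the menu is open, icon-only when closed
If(MenuComponent.ComponentSize, 165, 50)

// Height: full height when the menu is open, icon-only when closed
If(MenuComponent.ComponentSize, 640, 50)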

 

This means that when you click the menu to open it, the component size will change, and the same will happen when you close the menu.

Dynamic Heightxwidth

Now, all that’s left is to add the MenuCollection property to the gallery as the items and map the values to the gallery controls.

Gallery Items

Gallery Items

Label Text

Label Text

Icon value

Button 1

I have added an extra button that sits on top of all the controls in the gallery. This gives the user a pointer cursor when hovering over a menu item, and the button also carries the Navigate function that jumps to the corresponding screen.

Button 2

Make sure the overlapping button’s colors are transparent so that it does not obscure the controls beneath it when you hover over them.

Overlapping Button OnSelect logic

Button 2 Logic
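The overlay button’s OnSelect likely boils down to a single Navigate call like this sketch (the transition effect is an assumption):

// Navigate to the screen stored in the current menu item
Navigate(ThisItem.Screen, ScreenTransition.Fade)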

The component is ready! You might wonder where all the menu items we declared before are. We still need to add that part, but it comes from the screen level. For now, we’re done with the component level.

Integrating the Component into Our Application

To insert the component into the screen, you should see a new dropdown in the Insert tab called Custom. This will contain all the elements you’ve created. Once you find the component you made, simply drag and drop it into place, just like you would with a regular control.

Custom Tab

Once that’s done, you’ll see the component on the screen. You might still wonder why the menu items you created aren’t showing up. Remember the MenuCollection custom property we made as an input property? The collection we set up in the app’s OnStart property will be passed to this input property, which in turn sends the data to the component side, displaying the created menu items.

App OnStart

The red highlight is the menu collection we created

Menu Items App Onstart

MenuCollection (Custom component property)

Input Property From Screen

Just follow the previous steps to add the menu component to other screens so you can navigate back and forth, and you’ll be all set!

Final Component Output

 

Deploying the Menu Component in Another Application

Another helpful feature of Power Apps is the ability to export and import components across different apps, eliminating the need to recreate them from scratch. Let’s explore how to export and use this menu component in a new app.

1. Go back to the Components screen. Next to the ‘Add New Component’ option, you’ll see two buttons. The one highlighted in the red box is the export button, while the other is the import button. Click the export button.

Export Step 1.1

After clicking the button, a popup will appear, confirming that the file is ready for download. Simply click the download button to save the components as a .msapp package, which can be used in another application.

Export Step 1,2

 

Export Step 1.3

2. Once the file is downloaded, open the new application where you want to add your component. Then, navigate to the components screen and click the Import button.

Export Step 2

After clicking the Import button, a new tab will appear, allowing you to either upload a file or select a component from other Power Apps applications. In our case, we will upload the file downloaded in the previous step.

  1. Click the Upload button, locate the downloaded file, and upload it. Once uploaded, all the components from the original application will be automatically added to the new application.

Export Demo

As shown in the video above, once you upload the file, the components and their settings are automatically added to the new application. The only limitation is that all elements from the original app will be imported, so you may need to delete any that are not required.

Pros and Cons of Components in Power Apps

Having explored how components function in Power Apps and created one, let’s now go over the pros and cons:

Pros:

  • Components can be reused across multiple screens and apps, minimizing redundant work.
  • Save development time by building once and reusing components wherever needed.
  • Updating a single component automatically applies changes across all instances (especially effective when using a component library).
  • Enables multiple developers to work on different components simultaneously, boosting team productivity.

Cons:

  • Learning how to effectively create and manage components may take time for beginners.
  • Unlike regular screens, components cannot be tested in the component screen; they must first be added to a screen.
  • Managing dependencies can become complex if a component relies on external data sources or variables.

Conclusion

In this guide, we explored Power Apps’ components and how they can be used to create reusable building blocks for your app. We then walked through creating a simple, customizable menu component that can be used across multiple screens in your app. This approach helps maintain consistency in design, saves time, and makes updates easier. By using this menu component, you can ensure a smooth and uniform user experience throughout your app while keeping the development process efficient and flexible.

Customizing Data Exports: Dynamic Excel Updates with Power Apps, Power Automate, and Office Scripts

Modern business workflows often require flexible and efficient ways to export, transform, and share data. By combining the capabilities of Power Apps, Power Automate, and Office Scripts, you can create a seamless process to dynamically customize and update Excel files with minimal effort.

This guide demonstrates how to dynamically export data from Power Apps, process it with Power Automate, format it in Excel using Office Scripts, and send the updated file via email. Let’s dive into the details.

This blog demonstrates a practical solution for automating data exports and dynamic reporting in Excel, tailored to users who expect dynamic column selection for report headers. Manual data preparation and formatting can be time-consuming and error-prone in many projects, especially those involving custom reporting.

With the process outlined in this blog, you can:

  • Dynamically select and modify column headers based on user input.
  • Automate the transformation of raw data into a formatted Excel file.
  • Share the final output effortlessly via email.

This solution integrates Power Apps, Power Automate, and Office Scripts to ensure that your reporting process is faster, error-free, and adaptable to changing requirements, saving you significant time and effort.

Exporting Data from Power Apps

Creating a Collection in Power Apps

A collection in Power Apps serves as a temporary data storage container that holds the records you want to process. Here’s how to set it up:

Step 1: Define the DATA Collection

  • Open your Power App and navigate to the screen displaying or managing your data.
  • Use the Collect or ClearCollect function in Power Apps to create a collection named ExportData that holds the required data columns.
  • You can dynamically populate this collection based on user interaction or pre-existing data from a connected source. For example:

Picture1

  • Here, the ExportData collection is populated with a static table of records. You can replace this static data with actual data retrieved from your app’s sources.
  • Tip: Use data connectors like SharePoint, SQL Server, or Dataverse to fetch real-time data and add it to the collection.
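The screenshot above carries the original formula; a minimal sketch of what it likely looks like, using the same three columns referenced later in this guide (the records themselves are placeholders):

ClearCollect(
    ExportData,
    { Name: "Alice", Age: 30, Country: "USA" },
    { Name: "Bob", Age: 25, Country: "UK" },
    { Name: "Chen", Age: 41, Country: "India" }
)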

Step 2: Define a Table HeaderName for Column Names

  • To ensure the exported Excel file includes the correct column headers, define a variable named HeaderName that holds the names of the columns to be included.
Set(HeaderName, ["Name", "Age", "Country"])

This variable specifies the column headers that appear in the exported Excel file.

Picture2

Pass Data to Power Automate

Once the ExportData collection and HeaderName are set up, pass them as inputs to the Power Automate flow.

Step 1: Add the Flow to Power Apps

  1. Navigate to the Power Automate tab in Power Apps.
  2. Click on + Add Flow and select the flow you created for exporting data to Excel.

Step 2: Trigger the Flow and Send the Data

    • Use the following formula to trigger the flow and pass the data:
CustomizingDataExports.Run(JSON(ExportData), JSON(HeaderName))

Picture3

  • CustomizingDataExports is the Power Automate flow.
  • JSON(ExportData) converts the collection to a JSON object that Power Automate can process.
  • JSON(HeaderName) converts the collection to a JSON object that passes the column headers for use in the Excel export.

Processing Data with Power Automate

Power Automate bridges Power Apps and Excel, enabling seamless data processing, transformation, and sharing. Follow these steps to configure your flow:

1. Receive Inputs

  • Trigger Action: Use the Power Apps trigger to accept two input variables:
    • ExportData: The dataset.
    • HeaderName: The column headers.
  • Add input parameters:
    • Navigate to the trigger action.
    • Click Add an input, select Text type for both variables and label them.

2. Prepare Data

Add two Compose actions to process inputs.

  • Use these expressions:

For ExportData:

json(triggerBody()?['text'])

For HeaderName:

json(triggerBody()?['text_1'])

Add a Parse JSON action to structure the HeaderName input:

Content:

outputs('Compose_-_HeaderName')

Schema:

{
    "type": "array",
    "items": {
        "type": "object",
        "properties": {
            "Value": {
                "type": "string"
            }
        },
        "required": [
            "Value"
        ]
    }
}

Use a Select action to extract the values:

From:

body('Parse_JSON')

Map:

item()['Value']

Picture4

3. Setup Excel Template

Add a Get file content action to fetch a pre-defined Excel template from storage (e.g., SharePoint or OneDrive).

Use a Create file action to save the template as a new file:

Dynamic File Name (the guid() expression combined with the .xlsx extension gives each export a unique name):

guid().xlsx

Convert the ExportData to a CSV format:

  • Add a Create CSV Table action:

From:

outputs('Compose_-_ExportData')

Picture5

Formatting Data with Office Scripts

Office Scripts are used to dynamically process and format data in Excel. Here’s how you implement it:

Set up the script

Open Excel and navigate to the “Automate” tab.

Create a new Office Script and paste the following code:

function main(workbook: ExcelScript.Workbook, headersArray: string[], csvData: string) {
  let activeWorksheet = workbook.getWorksheet("Sheet1");
  let csvRows = csvData.split('\n');
  csvRows = csvRows.map(row => row.replace(/\r$/, ''));
  let headerRow = csvRows[0].split(',');
  // Create a mapping of column headers to their indices
  let columnIndexMap: { [key: string]: number } = {};
  for (let i = 0; i < headerRow.length; i++) {
    let header = headerRow[i];
    if (headersArray.includes(header)) {
      columnIndexMap[header] = i;
    }
  }
  // Write the selected headers into the first worksheet row
  let range = activeWorksheet.getRangeByIndexes(0, 0, 1, headersArray.length);
  range.setValues([headersArray]);
  // Insert data into Excel in batches to keep the script fast on large datasets
  const batchSize = 500;
  let batchData: string[][] = [];
  let dataRowCount = 0;
  // Loop through the CSV data rows (skipping the header) and keep only the selected columns
  for (let j = 1; j < csvRows.length; j++) {
    let rowData = parseCSVRow(csvRows[j]);
    let filteredRowData: string[] = [];
    for (let k = 0; k < headersArray.length; k++) {
      let header = headersArray[k];
      let columnIndex = columnIndexMap[header];
      filteredRowData.push(rowData[columnIndex]);
    }
    batchData.push(filteredRowData);
    // Flush the batch when it is full or when the last row is reached
    if (batchData.length === batchSize || j === csvRows.length - 1) {
      let startRowIndex = j - batchData.length + 1; // Row 0 holds the headers
      let startColIndex = 0;
      let newRowRange = activeWorksheet.getRangeByIndexes(startRowIndex, startColIndex, batchData.length, batchData[0].length);
      newRowRange.setValues(batchData);
      batchData = [];
    }
    dataRowCount = j;
  }
  // Create a table over the header row plus all data rows and apply a built-in style
  workbook.addTable(activeWorksheet.getRangeByIndexes(0, 0, dataRowCount + 1, headersArray.length), true).setPredefinedTableStyle("TableStyleLight8");
  activeWorksheet.getRangeByIndexes(0, 0, dataRowCount + 1, headersArray.length).getFormat().autofitColumns();

  // Release the lock on the workbook
  activeWorksheet.exitActiveNamedSheetView();
}
// Custom CSV parsing function to handle commas within double quotes
function parseCSVRow(row: string): string[] {
  let columns: string[] = [];
  let currentColumn = '';
  let withinQuotes = false;
  for (let i = 0; i < row.length; i++) {
    let char = row[i];
    if (char === '"') {
      withinQuotes = !withinQuotes;
    } else if (char === ',' && !withinQuotes) {
      columns.push(currentColumn);
      currentColumn = '';
    } else {
      currentColumn += char;
    }
  }
  columns.push(currentColumn); // Add the last column
  return columns;
}

Picture6

Integrate with Power Automate

Use the Run script action in Power Automate to execute the Office Script.

Pass the header array and CSV data as parameters.

Picture7

Send the Updated File via Email

Once the Excel file is updated with Office Scripts, you can send it to recipients via Outlook email.

1. Retrieve the Updated File:

  • Add a Get file content action to fetch the updated file.

Use the file path or identifier from the Create file action.

outputs('Create_file')?['body/Id']

Picture8

2. Send an Email (V2):

  • Add the Send an email (V2) action from the Outlook connector.
  • Configure the email:
    • To: Add the recipient’s email dynamically or enter it manually.
    • Subject: Provide a meaningful subject, such as “Custom Data Export File”
    • Body: Add a custom message, including details about the file or process.
    • Attachments:
      • Name: Use a dynamic value
outputs('Create_file')?['body/Name']
        • Content: Pass the output from the Get file content action.
body('Get_file_content_-_Created_File')

Picture9

Integrating the Workflow

  1. Test the entire integration from Power Apps to Power Automate and Office Scripts.
  2. Verify the final Excel file includes the correct headers and data formatting.
  3. Confirm that the updated Excel file is attached to the email and sent to the specified recipients.

Result:

Excel

Picture10

Email

Picture11

How This Solution Saves Time

This approach is tailored for scenarios where users require a dynamic selection of column headers for custom reporting. Instead of spending hours manually formatting data and preparing reports, this solution automates the process end-to-end, ensuring:

  • Accurate data formatting without manual intervention.
  • Quick adaptation to changing requirements (e.g., selecting different report headers).
  • Seamless sharing of reports via email in just a few clicks.

This workflow minimizes errors, accelerates the reporting process, and enhances overall project efficiency by automating repetitive tasks.

Conclusion

You can create robust, dynamic workflows for exporting and transforming data by combining Power Apps, Power Automate, and Office Scripts. This approach saves time, reduces manual effort, and ensures process consistency. Adding email functionality ensures the updated file reaches stakeholders without manual intervention. Whether you’re managing simple data exports or complex transformations, this solution provides a scalable and efficient way to handle Excel data.

Protecting and Securing Your VBA Projects: A Comprehensive Guide

Visual Basic for Applications (VBA) projects are integral to Microsoft Office automation. From automating repetitive tasks in Excel to creating powerful macros for Word or Excel, VBA can significantly enhance productivity. However, protecting and securing your VBA projects is essential to safeguard your intellectual property, maintain data integrity, and prevent unauthorized access.

This blog will explore effective methods to protect your VBA projects from potential threats while ensuring compliance with best practices.

Why Protect Your VBA Projects?

  1. Prevent Unauthorized Access: Protecting your code ensures unauthorized users cannot access or modify your work.
  2. Safeguard Intellectual Property: Your VBA project may contain unique algorithms, business logic, or confidential data that need protection.
  3. Avoid Accidental Modifications: Securing your project prevents accidental changes that could break its functionality.
  4. Enhance Professionalism: A secure project demonstrates your commitment to quality and professionalism.

How to Protect Your VBA Projects

1. Password Protecting Your VBA Project

Microsoft Office allows you to lock VBA projects with a password. Here’s how:

  1. Open the VBA editor (Alt + F11).
  2. In the Project Explorer, right-click your project and select Properties.
  3. Navigate to the Protection tab.
  4. Check the Lock project for viewing and enter a strong password.
  5. Click OK and save your document.

Refer to the below screenshot:

“Protection” tab in VBA project properties.

2. Obfuscating Your Code

Code obfuscation maintains the functionality of your VBA code while making it challenging to read or comprehend. Although VBA doesn’t have built-in obfuscation tools, third-party tools like VBA Compiler for Excel or Smart Indenter can help achieve this.

3. Disabling Macro Settings for Unauthorized Users

Adjusting the macro security settings allows you to limit who can run macros:

  1. Go to File > Options > Trust Center > Trust Center Settings.
  2. Select Macro Settings and choose options like Disable all macros except digitally signed macros.

Sample Code: Enforcing macro security programmatically:

Enhancing macro security programmatically ensures that only authorized macros run in your environment. The code below checks macro security settings and prompts users to adjust if insecure settings are detected.

Sub CheckMacroSecurity()
    ' Warn the user if the application is not forcing macros to be disabled
    If Application.AutomationSecurity <> msoAutomationSecurityForceDisable Then
        MsgBox "Macros are not secure. Adjust your settings.", vbCritical
    End If
End Sub

4. Digitally Signing Your VBA Code

Digitally signing your VBA projects protects your code and assures users of its authenticity. To digitally sign a VBA project:

  1. Open the VBA editor and your project.
  2. Go to Tools > Digital Signature.
  3. Select a certificate or create a self-signed certificate.

Note: Use trusted certificates from reputable authorities for enhanced security.

5. Storing Sensitive Data Securely

Avoid hardcoding sensitive information like passwords or API keys directly in your VBA code. Instead:

  • Use environment variables.
  • Store data in an encrypted external file.
  • Use Windows Credential Manager.
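For the first of these options, here is a minimal VBA sketch that reads a secret from an environment variable using the built-in Environ function (the variable name MYAPP_API_KEY is hypothetical):

Sub ReadSecretFromEnvironment()
    Dim apiKey As String
    ' Read the secret from an environment variable instead of hardcoding it
    apiKey = Environ("MYAPP_API_KEY")
    If Len(apiKey) = 0 Then
        MsgBox "MYAPP_API_KEY is not set on this machine.", vbExclamation
    End If
End Sub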

Sample Code: Reading data from an encrypted file:

Reading data from an encrypted file ensures that sensitive information is kept secure from unauthorized access. Combining encryption with secure storage methods effectively safeguards critical data.

Sub ReadEncryptedData()
    Dim filePath As String, fileData As String
    filePath = "C:\secure\data.txt"
    Open filePath For Input As #1
    Input #1, fileData
    MsgBox "Decrypted Data: " & Decrypt(fileData)
    Close #1
End Sub

Function Decrypt(data As String) As String
    ' Custom decryption logic here
    Decrypt = StrReverse(data) ' Example: reversing string
End Function

6. Regular Backups and Version Control

Accidents happen. Ensure you maintain:

  • Regular Backups: Save copies of your projects on secure, remote storage.
  • Version Control: Use tools like Git to track changes and collaborate effectively.

Final Thoughts

Protecting and securing your VBA projects is not just about locking your code; it’s about adopting a comprehensive approach to safeguarding your intellectual property, maintaining functionality, and ensuring trustworthiness. By implementing the steps outlined above, you can significantly enhance the security and reliability of your VBA solutions.

Have tips or experiences with VBA project security? Share them in the comments below. Let’s secure our projects together!

Take Action to Secure Your VBA Projects 

Start protecting your VBA projects today by setting up password protection, implementing digital signatures, or securing sensitive data. Explore the resources above for more advanced security techniques and strengthen your projects against potential risks. 

Do you have insights or experiences with securing VBA projects? Share them in the comments below, and let’s work together to create safer, more reliable solutions! 

Understanding the Difference Between Optimizely Configured Commerce SDK and Cloud

The Optimizely Configured Commerce SDK and Optimizely Configured Commerce Cloud serve different but complementary purposes within the Optimizely ecosystem. Below is a breakdown of their differences to help clarify their roles:

Optimizely Configured Commerce SDK (Software Development Kit)

The SDK is a toolkit developers use to build, extend, and customize Optimizely Configured Commerce solutions.

Key Features

  • Custom Development: Enables developers to create tailored functionality or modify existing features.
  • Extensibility: Allows the integration of third-party tools, systems, or APIs into the platform.
  • Local Development: Provides resources for developers to work offline or in a local development environment.
  • Code Control: Gives developers greater flexibility to build unique features that align with business-specific workflows or industry requirements.

Use Cases

  • Businesses that need highly customized solutions going beyond the standard capabilities of Optimizely Configured Commerce.
  • Developers who want to test and implement features locally before deploying them to the live environment.
  • Integrations of Optimizely with complex systems such as legacy ERPs, custom CRMs, or bespoke tools.

Optimizely Configured Commerce Cloud

This is the fully managed, cloud-hosted environment where the Configured Commerce platform operates. It delivers scalability, security, and reliability while offloading the burden of infrastructure management from businesses.

Key Features

  • Cloud Hosting: Hosted on Optimizely’s infrastructure, ensuring uptime and reliability.
  • Scalability: Automatically adjusts to handle increased traffic or load.
  • Maintenance-Free: Optimizely manages software updates, patches, and performance optimizations.
  • Global Availability: Designed for businesses operating in multiple regions with global infrastructure support.
  • Security: Includes enterprise-grade security measures, including compliance with data protection standards.

Use Cases

  • Businesses that want to focus on business operations rather than maintaining infrastructure.
  • Companies expecting fluctuations in traffic and requiring a scalable solution.
  • Organizations that need high uptime and reliability for their eCommerce operations.

Important Distinctions Between Cloud and SDK

Factor        | SDK                                                          | Cloud
------------- | ------------------------------------------------------------ | ------
Purpose       | Toolkit for building and customizing functionality.          | Fully managed, hosted environment for the platform.
Target Team   | Developers and technical teams.                               | —
Customization | High flexibility for custom features and integrations.       | More limited; the Cloud version still supports extending the platform to a large extent, with certain limitations, and most customization happens through configuration.
Management    | Requires development resources to build and deploy changes.  | Managed entirely by Optimizely, including updates and maintenance.
Hosting       | Local or self-hosted, for both development and production.   | Hosted by Optimizely with global availability.

By leveraging Optimizely Cloud capabilities, you can achieve robust, scalable, and tailored eCommerce experiences with minimal operational complexity. With the SDK version, you get a more controlled, customizable website, along with control over infrastructure, upgrades, and deployments.

Is it really DeepSeek FTW?

So, DeepSeek just dropped their latest AI models, and while it’s exciting, there are some cautions to consider. Because of the US export controls around advanced hardware, DeepSeek has been operating under a set of unique constraints that have forced them to get creative in their approach. This creativity seems to have yielded real progress in reducing the amount of hardware required for training high-end models in reasonable timeframes and for inferencing off those same models. If reality bears out the claims, this could be a sea change in the monetary and environmental costs of training and hosting LLMs.

In addition to the increased efficiency, DeepSeek’s R1 model is continuing to swell the innovation curve around reasoning models. Models that follow this emerging chain of thought paradigm in their responses, providing an explanation of their thinking first and then summarizing into an answer, are providing a step change in response quality. Especially when paired with RAG and a library of tools or actions in an agentic framework, baking this emerging pattern into the models instead of including it in the prompt is a serious innovation. We’re going to see even more open-source model vendors follow OpenAI and DeepSeek in this.

Key Considerations

One of the key factors in considering the adoption of DeepSeek models will be data residency requirements for your business. For now, self-managed private hosting is the only option for maintaining full US, EU, or UK data residency with these new DeepSeek models (the most common needs for our clients). The same export restrictions limiting the hardware available to DeepSeek have also prevented OpenAI from offering their full services with comprehensive Chinese data residency. This makes DeepSeek a compelling offering for businesses needing an option within China. It’s yet to be seen whether the hyperscalers or other providers will offer DeepSeek models on their platforms (before I managed to get this published, Microsoft made a move and is offering DeepSeek-R1 in Azure AI Foundry). The good news is that the models are highly efficient, and self-hosting is feasible and not overly expensive for inferencing with these models. The downside is managing provisioned capacity when workloads can be uneven, which is why pay-per-token models are often the most cost efficient.

We are expecting that these new models and the reduced prices associated with them will have serious downward pressure on per-token costs for other models hosted by the hyperscalers. We’ll be paying specific attention to Microsoft as they are continuing to diversify their offerings beyond OpenAI, especially with their decision to make DeepSeek-R1 available. We also expect to see US-based firms replicate DeepSeek’s successes, especially given that Hugging Face has already started work within their Open R1 project to take the research behind DeepSeek’s announcements and make it fully open source.

What to Do Now

This is a definite leap forward and progress in the direction of what we have long said is the destination—more and smaller models targeted at specific use cases. For now, when looking at our clients, we advise a healthy dose of “wait and see.” As has been the case for the last three years, this technology is evolving rapidly, and we expect there to be further developments in the near future from other vendors. Our perpetual reminder to our clients is that security and privacy always outweigh marginal cost savings in the long run.

The comprehensive FAQ from Stratechery is a great resource for more information.

Triggering File Creation and Auto-Download in PowerApps Using Power Automate

Automation is essential for increasing productivity and simplifying work. Downloading files is one such task, particularly when handling a large number of downloads, and the lack of native file download functionality in PowerApps makes it difficult. In this blog post, we’ll go over how to use Power Automate to automate file downloads from PowerApps by converting sample data into an Excel sheet and downloading it to the local computer.

PowerApps Collection

  1. A PowerApps collection can store the data that the user wants to download as a file. Here are the steps:
  • Go to the OnVisible property of the screen.
  • Create a collection using the Collect() or ClearCollect() function, assign the required data to it, and give it a name.

Screenshot 2025 01 29 101709

  2. Next, pass the data to the Power Automate flow by using the function below.

Screenshot 2025 01 29 102756

  3. Afterward, use the following actions to convert the collection data into an Excel file and store it in the SharePoint library.

Screenshot 2025 01 29 102901

Screenshot 2025 01 29 102916

  4. Various parameters are generated once the Create file action is run.

  5. Among them, eTag is a parameter that contains the UniqueID of the file created in SharePoint, which looks like the one in the picture below.

Screenshot 2025 01 29 103029

 

  6. The string between the curly braces is the UniqueID that we have to capture.

 

  7. To capture the string, use the Respond to a PowerApp action to send the eTag back to PowerApps.

Screenshot 2025 01 29 103055

 

8. In the flow run statement, you can observe that we have set the flow run output to a variable response, which will contain the eTag. 

9. The Split statement splits the string based on the provided parameter, in this case "}". After splitting the eTag, there will be two strings, of which we require the first. So we use the First function to capture that first string.

10. The second Split statement splits the obtained string again, this time using "{" as the parameter, and now we require the second piece. So we use the Last function to capture the unique ID string.

Screenshot 2025 01 29 102743

11. Now, insert the extracted unique ID into the URL below (taken from the SharePoint site), pass the resulting URL to the Download function, and attach this function to a download icon.

Screenshot 2025 01 29 103123
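Put together, the icon’s OnSelect likely resembles this sketch. The tenant, site, and variable names (varResponse, varFileId) are placeholders, and the parsing mirrors the Split/First/Last steps described above:

// Hypothetical: extract the file's unique ID from the eTag returned by the flow
Set(varFileId,
    Last(Split(First(Split(varResponse.etag, "}")).Value, "{")).Value
);
// Hypothetical SharePoint URL built from the unique ID; replace tenant and site with your own
Download(
    "https://yourtenant.sharepoint.com/sites/YourSite/_layouts/15/download.aspx?UniqueId=" & varFileId
)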

12. Upon clicking the icon containing the above download function, the file will be successfully downloaded to your local machine.

Screenshot 2025 01 29 102122

13. At the end of the flow, we can add a Teams action to notify specific users within a group about the file download.

Pros: 

  • ETag (Entity Tag) provides a way to identify the specific version of a file. This ensures that the file being downloaded is the latest version or matches the user’s request. 
  • By using the ETag, the system avoids unnecessary file processing and data transfers, reducing server load and improving app responsiveness. 
  • Many cloud storage solutions (e.g., SharePoint, OneDrive) and REST APIs support ETags for file identification. Leveraging ETags aligns with these standards, making integration seamless. 

Cons: 

  • PowerApps alone cannot directly handle ETags or complex HTTP headers. Integration with Power Automate or custom connectors is mandatory, which adds complexity to the solution. 
  • If the ETag mismatches (e.g., due to a file being updated during a request), users may receive errors or need to retry the download, which can cause confusion without proper error messages. 
  • The concept of ETags may not be directly visible to users. Without clear feedback or messaging, users might not understand why a download failed (e.g., due to a mismatched ETag). 

Conclusion: 

This method streamlines automated file downloads through PowerApps, not only for Excel but for various file types, making it a versatile alternative to the manual download process.

Part 2: Capture and Monitor Critical Power Automate Workflows History through Child Flow

Introduction:

In this part, we’ll show the process of integrating the child flow created in Part 1 into a parent flow – the critical flow whose run history will be tracked.

“If you missed Part 1, be sure to check it out for a complete guide on creating the child flow in Power Automate.”

Step-by-Step Guide:

Integrate child flow to your Parent Flow:

We’ll continue with the remaining steps to incorporate the child flow and fully track the history of each run.

Step 1: Open Your Parent Flow and Add Essential Variables

To begin, open the existing flow you want to track. In this example, we’ll use an approval flow as the base for the parent flow.

Add the Following Variables:

  1. Start Time Variable:
    To track the start time of the flow, create a variable called Start Time. Set its value to the current time using the utcNow() function.

    • This will capture the moment when the parent flow begins executing. It’s useful for tracking the duration of the flow later, especially when monitoring child flows.
      17flowhistory
  2. End Time Variable:
    • Similarly, create a new variable called End Time. You will set this later to capture the moment when the parent flow finishes. This will help you calculate the total runtime of the flow, including any child flows.
      18flowhistory
  3. Workflow Status Variable:
    Add another variable called Workflow Status. Initialize this with a default value of “Success”

    • This will track the overall status of the flow.
      19flowhistory
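For reference, the three Initialize variable actions likely carry values along these lines (a sketch of the configuration rather than exact screenshots):

Start Time      (String) : utcNow()
End Time        (String) : left empty here; set with utcNow() at the end of the flow
Workflow Status (String) : Success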

Step 2: Put Your Parent Flow Actions Inside the Scope Block

Once you’ve set up your variables, the next step is to organize your parent flow’s actions within a Scope block.

A Scope in Power Automate allows you to group multiple actions together. This makes your flow more organized and provides a clear structure for managing the flow’s execution. In this example, we’re putting all the parent flow actions inside the Scope block to track its execution clearly.

Steps to Implement:

  1. Add a Scope Action:
    Search for and add a Scope action to your parent flow. This will act as a container for the actions that you want to track.
  2. Include Flow Actions Inside the Scope:
    Drag and drop all the necessary actions (like sending approvals, updates, or other tasks) inside the Scope action.

    • This will help maintain a clean and structured flow, making it easier to manage and track.
      20flowhistory

Step 3: Create 3 Parallel Branches for Error Handling and Workflow Control

Now, we’ll introduce parallel branches to handle errors, timeouts, and other potential issues within the flow. This will give you greater control over the flow execution, especially when it involves critical operations like approvals.

  1. Add the First Parallel Branch:
    • Click the “+” sign after the Scope action to add a Parallel Branch.
    • In this branch, add the “Set Variable” action to change the Workflow Status to “Failed”. This will help capture and track if the flow encounters any failure.
    • Action to Add:
      • Set Workflow Status Variable to Failed.
        21flowhistory
        22flowhistory
  1. Configure the Run After Settings for the Set Variable Action:
    • In this step, configure the Run After settings for the Set Variable action.
    • Set the Run After condition to trigger if the Scope action fails or encounters an error. This will ensure that the Workflow Status is updated to “Failed” if any issues occur within the Scope.
    • Example Configuration:
      • Run After: Select has failed.
        23flowhistory
        24flowhistory
  2. Repeat the Previous Step for Timeout Handling:
    • In the second parallel branch, add another Set Variable action to update the Workflow Status to “Failed” in case of a timeout.
    • Set the Run After condition for this action to trigger if the Scope action times out.
    • Example Configuration:
      • Run After: Select has timed out.
        25flowhistory
        26flowhistory
  3. Add a Third Parallel Branch with a Dummy Flag:
    • Create another Parallel Branch, this time adding a Compose action.
    • Use this Compose action to set a dummy flag or placeholder. This flag can be useful for tracking or debugging the flow.
    • Action to Add:
      • Compose: Use a simple string like “Dummy Flag” to ensure this branch runs successfully.
    • Example Configuration:
      • Run After: Select has Successful and Skipped.
        27flowhistory
        28flowhistory

Step 4: Add a Scope Block Next to the Dummy Flag Compose Action

In this step, we will add a new Scope block next to the dummy flag compose action to ensure that the workflow history is captured and tracked.

  1. Add a Scope Action:
    • Add a new Scope action next to the dummy flag Compose action.
      29flowhistory
  2. Configure Run After Settings:
    • Set the Run After configuration for the new Scope action to trigger after the Compose (dummy flag) action completes. This ensures the scope runs after the previous actions have completed.
      30flowhistory
  3. Add Workflow History Block:
    • Inside the newly added Scope, add a Compose action. Name this “Workflow History Block” and set its value to the expression workflow().
    • The workflow() function will return the details of the flow execution, including its start and end times, status, and other key properties.
    • Example Action:
      • Compose: Set value to workflow().
        31flowhistory
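For context, workflow() returns a JSON object describing the current flow and run. An abridged, illustrative shape is shown below; the values are placeholders, and the exact set of tags varies by environment:

{
  "id": "/workflows/<workflow-guid>",
  "name": "<workflow-guid>",
  "type": "Microsoft.Logic/workflows",
  "run": {
    "id": "/workflows/<workflow-guid>/runs/<run-id>",
    "name": "<run-id>",
    "type": "Microsoft.Logic/workflows/runs"
  },
  "tags": {
    "flowDisplayName": "My Parent Flow",
    "environmentName": "<environment-guid>"
  }
}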

Step 5: Add the Child Flow Inside the Scope

Now, we’ll add the Child Flow inside the Scope to ensure that the parent flow properly runs and tracks the results of the child flow.

  1. Add the Run a Child Flow Action:
    • Search for and add the Run a Child Flow action inside the newly added Scope.
    • After adding the Run a Child Flow action, select the child flow you want to include in the parent flow. Upon selection, it will automatically generate the configured inputs for the child flow.
      32flowhistory
  2. Map the Workflow History Block:
    • In the Workflow obj input for the child flow, map the value to string(outputs('Workflow_History_Block')).
    • This step will pass the workflow history from the parent flow to the child flow for tracking purposes.
      33flowhistory
  3. Map Remaining Inputs:
    • Map any remaining values to the child flow inputs from the dynamic content provided by the parent flow. This ensures that all necessary information is passed down to the child flow.
      34flowhistory

Accessing Child Flow in Different Environments:

For Same Environment:

  • If you want to use the child flow in a different solution within the same environment, simply follow the same steps as in “Integrate child flow to your Parent Flow”.
  • You will be able to see the child flow name in the “Run a Child Flow” action.

For Different Environment:

  • To use the child flow in a different environment, you can export the entire solution as a .zip file. This file can then be uploaded to another environment or shared directly via the Power Automate Platform.
  • After importing the solution into the new environment, ensure the connections are properly configured.
  • Once the connections are verified, you can follow the same steps to “Integrate child flow to your Parent Flow”, and the child flow name will appear in the “Run a Child Flow” action.

Conclusion:

By following these steps, you’ve successfully created a Flow in Power Automate that tracks the history of its execution, incorporates child flows, and manages error handling. This setup provides transparency and ensures that all flow statuses are logged into your SharePoint or Dataverse, whether they succeed or fail.

 

Part 1: Capture and Monitor Critical Power Automate Workflows History through Child Flow https://blogs.perficient.com/2025/01/29/part-1-capture-and-monitor-critical-power-automate-workflows-history-through-child-flow/ Wed, 29 Jan 2025 11:13:56 +0000

Introduction:

In today’s digital landscape, organizations increasingly rely on automated workflows to enhance efficiency and streamline processes. However, as these workflows grow in complexity, tracking their execution history becomes essential for ensuring reliability and facilitating troubleshooting.

This blog explores a powerful approach to logging workflow running history using child flows in Power Automate. By leveraging child flows, users can capture critical data such as flow names, statuses, and execution URLs, storing this information in SharePoint or Dataverse tables for easy access and analysis.

This blog series covers these topics in two parts:

  • Part 1: Step-by-Step Guide — Creating a child flow to capture the necessary data for tracking your workflow execution history
  • Part 2: Step-by-Step Guide — Integrating Child flow to the Parent Flow & How to Access this child flow as a reusable template across the different projects or Teams.

This part introduces the concept of creating a child flow to capture the necessary data.

Prerequisites:

    • SharePoint or Dataverse: Ensure you have access to either SharePoint or Dataverse to store and manage your data.
    • Power Automate: A valid account to access Power Automate for creating and managing your workflows.

Step-by-Step Guide:

Now, let’s dive into the detailed process for creating flows in Power Automate.

Setting Up in SharePoint or Dataverse

  1. Create a List or Table named “Log Workflow Histories.”
  2. Add the Following Columns:
Column Name        Data Type
Flow Name          Single Line of Text
Flow Run URL       Hyperlink or Picture
Flow Start Time    Date and Time
Flow End Time      Date and Time
Flow Status        Single Line of Text


Creating a Child Flow:

  1. Access Power Automate at https://make.powerautomate.com.
  2. Select “Solutions” from the left navigation pane.
  3. Open Your Existing Solution or create a new one by clicking “New Solution.”
  4. Create a Child Flow by navigating to:
    • All > New > Automation > Cloud Flow > Instant.

Step 1: Provide a Flow Name and Select the Manual Trigger

Choose a descriptive name for your flow that clearly indicates its purpose. This helps in identifying the flow later.

Step 2: Set Inputs in the Manual Trigger

Input Source: The inputs for this flow will be provided by the parent flow. Ensure that the following inputs are configured to receive data from the parent flow:

  • Input 1: Workflow Object (Type: Text)
  • Input 2: Start Date and Time (Type: Text)
  • Input 3: End Date and Time (Type: Text)
  • Input 4: Workflow Status (Type: Text)


Step 3: Add a Compose Action to Convert the Workflow Object to JSON Format

The Workflow Object we receive from the parent flow is in JSON format but is treated as a string in the manual trigger input. By converting it into a JSON object, we can easily extract specific information and manipulate the data as needed in subsequent steps of the flow.

  • Add a Compose Action:
    • Click on “+ New Step.”
    • Search for and select “Compose.”
  • Configure the Compose Action:
    • In the Inputs field of the Compose action, enter the following expression to convert the Workflow Object to JSON format: json(triggerBody()?['Workflow Object'])
  • Name the Compose Action (Optional but Best Practice):
    • You can rename the action to something descriptive, like “Convert Workflow Obj to JSON.” In this example, I have named it “Convert to Obj”.

Step 4: Add a Compose Action to Extract the Workflow Link

This Compose action allows us to isolate and extract the workflow link from the previously converted JSON object. This makes it easier to use the link in further actions within the flow.

  • Add Another Compose Action:
    • Click on “+ New Step.”
    • Search for and select “Compose.”
  • Configure the Compose Action:
    • In the Inputs field of this Compose action, enter the expression to extract the workflow link.
    • The expression would be: concat('https://make.powerautomate.com/environments/',outputs('Convert_to_Obj')?['tags']?['environmentName'],'/flows/',outputs('Convert_to_Obj')?['name'],'/runs/',outputs('Convert_to_Obj')?['run']?['name'])
      Replace ‘Convert_to_Obj’ with the name of your previous Compose action.
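This expression stitches the environment name, flow name, and run name from the converted object into a direct link to the run. With placeholder GUIDs, the output has this shape (values are illustrative):

https://make.powerautomate.com/environments/<environment-guid>/flows/<flow-guid>/runs/<run-id>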

Step 5: Add another Compose Action to Extract the Flow Name

Repeat Step 4 with another Compose action, this time using the expression below, and rename the action to “Workflow Name.”

The expression is: outputs('Convert_to_Obj')?['tags']?['flowDisplayName']

The extraction from the workflow object is now complete. Next, it’s time to add the item to either a SharePoint list or a Dataverse table.

Step 6: Add an Item to SharePoint List

  • Add a New Step:
    • Click on “+ New Step.”
    • Search for and select “SharePoint” from the list of connectors.
  • Choose Action:
    • Select the action “Create Item.”
  • Configure the Create Item Action:
    • Site Address: Choose the SharePoint site where your list is located.
    • List Name: Select the “Log Workflow Histories” list you created earlier.
  • Map the Fields:
    Fill in the fields using the outputs from previous steps:

    • Workflow Name: Use the output from the “Workflow Name” Compose action.
    • Start Date and Time: Map this to the corresponding input from the manual trigger.
    • End Date and Time: Map this to the corresponding input from the manual trigger.
    • Workflow Status: Map this to the corresponding input from the manual trigger.
    • Workflow Link: Use the output from the Compose action that extracted the workflow link.

You can follow either Step 6 or Step 7 based on your chosen data source—SharePoint or Dataverse. If you’re using SharePoint, refer to Step 6 for adding an item to the SharePoint list. If you’re using Dataverse, follow Step 7 to add a row to the Dataverse table.

Step 7: Add an Item to Dataverse Table

  • Add a New Step:
    • Click on “+ New Step.”
    • Search for and select “Dataverse” from the list of connectors.
  • Choose Action:
    • Select the action “Add a Row.”
  • Configure the Add a Row Action:
    • Table Name: Select the Dataverse table where you want to add the item (e.g., “Log Workflow Histories”).
  • Map the Fields:
    Fill in the fields using outputs from previous steps:

    • Workflow Name: Use the output from the “Workflow Name” Compose action.
    • Start Date and Time: Map this to the corresponding input from the manual trigger.
    • End Date and Time: Map this to the corresponding input from the manual trigger.
    • Workflow Status: Map this to the corresponding input from the manual trigger.
    • Workflow Link: Use the output from the Compose action that extracted the workflow link.

Conclusion:

In the first part, we explored how to create the child flow. In the next part of this series, we’ll show you how to integrate this child flow into a parent flow and start capturing the history of your critical flows.

Creating Custom Outlook Forms and Reports with VBA https://blogs.perficient.com/2025/01/28/custom-outlook-forms-reports-vba-productivity/ Tue, 28 Jan 2025 08:39:36 +0000

Microsoft Outlook is a powerful tool for organizing emails, calendars, contacts, and tasks. While it offers an array of built-in features, its full potential is unlocked when you customize it to suit your needs. One of the most effective ways to achieve this is through Visual Basic for Applications (VBA). 

VBA allows you to automate repetitive tasks, create custom forms for data collection, and generate tailored reports seamlessly. Its integration capabilities enable Outlook to work harmoniously with other Microsoft Office applications, such as Excel, Word, and external systems. This makes VBA an invaluable tool for streamlining workflows, improving productivity, and reducing manual effort. 

In this post, we’ll explore using VBA to create custom forms that gather the necessary information and automated reports that simplify complex processes. With these skills, you can transform Outlook into a productivity powerhouse tailored to your unique requirements. 

Why Use Custom Forms and Reports in Outlook?

Outlook comes with a standard set of forms for composing emails, scheduling appointments, and managing contacts, but sometimes, you need something more personalized. Custom forms in Outlook allow you to create specialized templates to collect or display information in a way that fits your workflow. Automated reports help you summarize data or track communication, saving time on manual compilation.

With VBA, you can automate and tailor these processes to integrate perfectly with your daily tasks. Let’s dive into using VBA to create custom forms and reports in Outlook.

1. Creating a Custom Outlook Form with VBA

Outlook provides several types of forms, including messages, appointments, and contact forms. While you can create custom forms directly within Outlook using the built-in form designer, VBA allows you to further automate and personalize these forms to suit your needs.

Example: Custom Email Form for Gathering Information

Suppose you need a custom email form to collect specific information from recipients. Instead of sending a regular email in plain text format, you can design a custom form with fields for users to fill out.

Here’s how you can create a simple custom form to send to recipients and collect information:

  • Create a Custom Form:

    • Go to Outlook’s Developer tab and select Design a Form.
    • Select Message as the form type, then click Open.
    • Customize the form by adding text fields, checkboxes, or combo boxes as needed.


  • Write VBA Code to Open the Custom Form:

You can automate the process of opening this custom form and sending it via VBA.

Sub SendCustomForm()
    Dim outlookApp As Object
    Dim mailItem As Object

    'Make a fresh instance of the Outlook application.
    Set outlookApp = CreateObject("Outlook.Application")

    ' Create a new mail item from the custom form template (.oft file)
    Set mailItem = outlookApp.CreateItemFromTemplate("C:\Path\to\your\custom_form.oft")

    ' Set up the recipient and subject
    mailItem.To = "recipient@example.com"
    mailItem.Subject = "Please Fill Out This Custom Form"

    ' Send the form
    mailItem.Send
End Sub

 

This script opens the custom form (.oft file) and sends it as an email to the specified recipient. The recipient can then fill in the custom fields within the form and reply with the required information.
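If you later need to read what recipients entered, the custom fields come back on the reply as user properties, which you can access through the UserProperties collection. A minimal sketch, assuming the form defines a field named “Department” (the field name is illustrative):

Sub ReadCustomFieldFromReplies()
    Dim inbox As Object
    Dim mailItem As Object
    Dim prop As Object

    ' Get the default Inbox folder (6 = olFolderInbox)
    Set inbox = Application.GetNamespace("MAPI").GetDefaultFolder(6)

    For Each mailItem In inbox.Items
        ' Look up the custom field defined on the form, if present
        Set prop = mailItem.UserProperties.Find("Department")
        If Not prop Is Nothing Then
            Debug.Print mailItem.Subject & " -> Department: " & prop.Value
        End If
    Next
End Sub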


2. Automating Reports with VBA in Outlook

Generating reports from your emails, calendar events, or tasks in Outlook can be time-consuming if done manually. However, VBA can automate the generation of these reports, pulling data from Outlook and formatting it into a readable report.

Example: Generate a Report of Unread Emails

Let’s say you want to create a report summarizing all unread emails in your inbox. This report can be sent to you automatically or saved for future reference.

Here’s an example of how to generate a simple report of unread emails:

Sub GenerateUnreadEmailReport()
    Dim inbox As Object
    Dim mailItem As Object
    Dim emailreport As String
    Dim i As Integer

    ' Get the Inbox folder
    Set inbox = Application.GetNamespace("MAPI").GetDefaultFolder(6) ' 6 represents the Inbox folder

    ' Initialize the report string
    emailreport = "Unread Email Report" & vbCrLf
    emailreport = emailreport & "====================" & vbCrLf & vbCrLf

    ' Iterate through every email in the inbox
    i = 1
    For Each mailItem In inbox.Items
        If mailItem.UnRead = True Then
            emailreport = emailreport & "Email " & i & ":" & vbCrLf
            emailreport = emailreport & "Subject: " & mailItem.Subject & vbCrLf
            emailreport = emailreport & "From: " & mailItem.SenderName & vbCrLf
            emailreport = emailreport & "Received: " & mailItem.ReceivedTime & vbCrLf
            emailreport = emailreport & "------------------------" & vbCrLf
            i = i + 1
        End If
    Next

    ' Display the report in a message box
    MsgBox emailreport, vbInformation, "Unread Email Report"
End Sub

This script will loop through all emails in your inbox and create a report containing each unread email’s subject, sender, and received time. After that, the report appears in a message box.

You can extend this functionality to include more complex reports, such as calendar appointments, task due dates, or emails based on specific criteria.
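For instance, instead of testing each item yourself, you can let Outlook pre-filter unread messages with Items.Restrict, which is usually faster on large mailboxes. A minimal sketch of that variation:

Sub GenerateUnreadEmailReportFast()
    Dim inbox As Object
    Dim unreadItems As Object
    Dim mailItem As Object
    Dim emailreport As String

    ' Get the Inbox and ask Outlook to filter unread items up front
    Set inbox = Application.GetNamespace("MAPI").GetDefaultFolder(6) ' 6 = Inbox
    Set unreadItems = inbox.Items.Restrict("[UnRead] = True")

    emailreport = "Unread Email Report" & vbCrLf & vbCrLf
    For Each mailItem In unreadItems
        emailreport = emailreport & mailItem.Subject & " (" & mailItem.SenderName & ")" & vbCrLf
    Next

    MsgBox emailreport, vbInformation, "Unread Email Report"
End Sub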

Conclusion

Creating custom forms and automating reports in Outlook with VBA can significantly streamline your workflow. Custom forms allow you to gather information in a structured way, while reports enable you to track important email and calendar data automatically. Whether you need a simple custom email form, a report of unread emails, or more complex data analysis, VBA in Outlook provides a powerful solution for automating tasks and improving productivity. Integrating VBA into your Outlook routine allows you to work smarter and focus on more critical tasks, while automation handles repetitive tasks.

Streamlining Success: A Guide to the Optimizely Configured Commerce Implementation Timeline https://blogs.perficient.com/2025/01/24/streamlining-success-a-guide-to-the-configured-commerce-implementation-timeline/ Fri, 24 Jan 2025 07:05:57 +0000

Implementing an Optimizely Configured Commerce platform is a significant milestone for any business looking to scale its digital operations. A well-structured timeline ensures a seamless transition from planning to execution, ultimately delivering a robust eCommerce solution tailored to your needs.

The implementation involves four key phases: Prepare, Build and Verify, Go Live, and Post-Go Live. Let’s examine each phase to understand its importance and components.


Image Source from https://support.optimizely.com/hc/en-us/articles/4413199673229-Configured-Commerce-implementation-timeline

Prepare: Creating the Foundation for Success

The journey begins with thorough preparation. This phase ensures all stakeholders align on the project goals, requirements, and expectations.

  • Client Workshop: This critical step involves defining requirements through collaboration with the client. It sets the stage for clearly understanding business objectives and the roadmap.
  • Project Setup: Once the requirements are precise, the focus shifts to creating a sandbox environment for testing and development. This step involves preparing data and configuring ERP and third-party systems for seamless integration with the new platform.

Businesses can minimize risks and ensure a smoother development process by investing time in preparation.

Build & Verify: Realizing the Goal

The Build & Verify phase actively constructs the platform and ensures that all functional and technical requirements are met.

  • Initial Development: Core elements like customer and product data are established, forming the system’s backbone.
  • Ongoing Development and Build: This stage covers integration, site configuration, and the functional requirements outlined earlier. Key tasks include:
    • Loading and structuring content
    • Designing themes to reflect the brand identity
    • Setting up integrations for payment systems, shipping, and more

This phase involves rigorous testing to verify that the platform meets business needs and performs as intended.

Go Live: Launching with Confidence

With development and testing completed, the project transitions to the Go Live phase, where the production environment becomes operational.

  • Create Production Site: A production site is configured to integrate all data and functionalities.
  • Production Prep: This includes loading production data, finalizing integration setups, and ensuring smooth and comprehensive user onboarding. At this stage, internal teams focus on training and ensuring they can manage the platform post-launch.

The platform officially launches, marking the culmination of months of collaboration and hard work.

Post-Go Live: Continuous Optimization

The implementation process doesn’t end with the platform launch. The Post-Go Live phase ensures that businesses continuously monitor and optimize the production site for performance, scalability, and user experience. Regular maintenance and updates are vital to ensure that the platform remains robust and adaptive to evolving business needs.

Why a Structured Timeline Matters

A well-planned implementation time frame keeps the project on track and provides flexibility to address unexpected obstacles. Businesses can focus on delivering an efficient and effective commerce solution by breaking the process into distinct, manageable phases.

With this phased approach, implementing the Optimizely Configured Commerce platform becomes manageable. It provides a path to a scalable, high-performing, and user-friendly eCommerce experience. Proper planning, collaboration, and execution are the keys to success in this transformative journey.

Reference URL – https://support.optimizely.com/hc/en-us/articles/4413199673229-Configured-Commerce-implementation-timeline

 

Debugging and Error Handling in VBA for Excel https://blogs.perficient.com/2025/01/11/debugging-error-handling-vba-excel-macros/ Sat, 11 Jan 2025 06:56:46 +0000

Debugging and Error Handling in VBA

After setting up VBA in Excel, you can start automating tasks and creating your macros. This blog will guide you through what comes next after the setup process—writing, running, and debugging VBA code in Excel.

Debugging and error handling are crucial for writing effective and reliable VBA (Visual Basic for Applications) code: they help you identify issues, ensure your macros run as intended, and handle unexpected scenarios gracefully. In this blog, we’ll explore tools for debugging VBA code effectively and techniques for robust error handling, with practical examples to make the concepts relatable and actionable.

Tools for Debugging VBA Code Effectively

1. Breakpoints: The First Line of Defense

Breakpoints allow you to pause code execution at specific lines, enabling you to inspect variable values and program flow. To set a breakpoint, click in the margin next to the code line or press F9. When the code execution stops, you can analyze what’s happening.


Tip: Combine breakpoints with the Step-Into (F8) feature to execute the code line by line.

2. Immediate Window: Real-Time Debugging

The Immediate Window is a versatile tool where you can print variable values and test code snippets without running the entire program. Use Debug.Print to output values or messages to the Immediate Window.

Example:

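A minimal sketch (names are illustrative); run it and the values appear in the Immediate Window, which you can open with Ctrl+G:

Sub ImmediateWindowDemo()
    Dim i As Integer
    Dim total As Long

    For i = 1 To 5
        total = total + i
        ' Each pass writes the loop counter and running total below
        Debug.Print "i ="; i, "total ="; total
    Next i
End Sub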

3. Locals Window and Watch Window: Inspect Variables

  • Locals Window: Displays all variables in the current scope and their values.
  • Watch Window: Allows you to monitor specific variables or expressions.

4. Error Highlighting and Debugging Features

VBA highlights syntax errors in red and runtime errors with a debug prompt. Clicking “Debug” during runtime errors highlights the problematic line for further inspection.

Example Error: Dividing by zero triggers a runtime error.

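A minimal sketch that reproduces it; clicking Debug in the error prompt highlights the division line:

Sub RuntimeErrorDemo()
    Dim divisor As Long
    Dim result As Double

    divisor = 0
    ' Raises runtime error 11 (Division by zero) at the next line
    result = 10 / divisor
End Sub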

Writing Robust Code with Error Handling Techniques

1. ‘On Error Resume Next’: Ignore and Proceed

This statement instructs VBA to ignore the error and move to the next line of code. Use it sparingly for non-critical errors.

Example:

Sub IgnoreError()
    On Error Resume Next
    Dim num As Integer
    num = 10 / 0    ' Error ignored
    MsgBox "Code continues despite the error."
End Sub
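For critical sections, a labeled error handler is generally the more robust pattern, because it lets you report or log the error and exit cleanly instead of silently skipping it. A minimal sketch:

Sub HandleError()
    On Error GoTo ErrHandler

    Dim num As Integer
    num = 10 / 0    ' Triggers the handler below

    Exit Sub    ' Success path skips the handler

ErrHandler:
    MsgBox "Error " & Err.Number & ": " & Err.Description, vbExclamation
End Sub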

You can explore more on error handling in VBA by reviewing the Microsoft VBA API Overview, which provides a comprehensive guide to error handling and other VBA concepts.

Conclusion

Once you’ve set up Excel VBA, you can start writing, debugging, and optimizing your macros. The next steps after setup are crucial for mastering VBA and making your Excel workflows more efficient. Keep practicing, and as you gain more experience, you’ll unlock the full potential of Excel automation.
