Excel Articles / Blogs / Perficient (Expert Digital Insights)
https://blogs.perficient.com/tag/excel/

Customizing Data Exports: Dynamic Excel Updates with Power Apps, Power Automate, and Office Scripts
https://blogs.perficient.com/2025/02/05/customizing-data-exports-dynamic-excel-updates-with-power-apps-power-automate-and-office-scripts/
Wed, 05 Feb 2025 06:16:02 +0000

Modern business workflows often require flexible and efficient ways to export, transform, and share data. By combining the capabilities of Power Apps, Power Automate, and Office Scripts, you can create a seamless process to dynamically customize and update Excel files with minimal effort.

This guide demonstrates how to dynamically export data from Power Apps, process it with Power Automate, format it in Excel using Office Scripts, and send the updated file via email. Let’s dive into the details.

This blog demonstrates a practical solution for automating data exports and dynamic reporting in Excel, aimed at users who need dynamic column selection for report headers. In many projects, especially those involving custom reporting, manual data preparation and formatting are time-consuming and error-prone.

With the process outlined in this blog, you can:

  • Dynamically select and modify column headers based on user input.
  • Automate the transformation of raw data into a formatted Excel file.
  • Share the final output effortlessly via email.

This solution integrates Power Apps, Power Automate, and Office Scripts to ensure that your reporting process is faster, error-free, and adaptable to changing requirements, saving you significant time and effort.

Exporting Data from Power Apps

Creating a Collection in Power Apps

A collection in Power Apps serves as a temporary data storage container that holds the records you want to process. Here’s how to set it up:

Step 1: Define the DATA Collection

  • Open your Power App and navigate to the screen displaying or managing your data.
  • Use the Collect or ClearCollect function in Power Apps to create a collection named ExportData that holds the required data columns.
  • You can dynamically populate this collection based on user interaction or pre-existing data from a connected source. For example:

Picture1

  • Here, the ExportData collection is populated with a static table of records. You can replace this static data with actual data retrieved from your app’s sources.
  • Tip: Use data connectors like SharePoint, SQL Server, or Dataverse to fetch real-time data and add it to the collection.
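As a rough Power Fx sketch of Step 1 (the sample records are hypothetical, chosen to match the Name, Age, and Country headers used later in this post):

```powerfx
ClearCollect(
    ExportData,
    Table(
        { Name: "Asha", Age: 30, Country: "India" },
        { Name: "Ben", Age: 25, Country: "USA" }
    )
)
```

In a real app, the static `Table(...)` would be replaced by a filtered data source or gallery selection.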

Step 2: Define a HeaderName Variable for Column Names

  • To ensure the exported Excel file includes the correct column headers, define a variable named HeaderName that holds the names of the columns to be included:
Set(HeaderName, ["Name", "Age", "Country"])

This variable specifies the column headers that appear in the exported Excel file.

Picture2

Pass Data to Power Automate

Once the ExportData collection and HeaderName are set up, pass them as inputs to the Power Automate flow.

Step 1: Add the Flow to Power Apps

  1. Navigate to the Power Automate tab in Power Apps.
  2. Click on + Add Flow and select the flow you created for exporting data to Excel.

Step 2: Trigger the Flow and Send the Data

    • Use the following formula to trigger the flow and pass the data:
CustomizingDataExports.Run(JSON(ExportData), JSON(HeaderName))

Picture3

  • CustomizingDataExports is the Power Automate flow.
  • JSON(ExportData) converts the collection to a JSON string that Power Automate can process.
  • JSON(HeaderName) converts the HeaderName variable to a JSON string that passes the column headers for use in the Excel export.

Processing Data with Power Automate

Power Automate bridges Power Apps and Excel, enabling seamless data processing, transformation, and sharing. Follow these steps to configure your flow:

1. Receive Inputs

  • Trigger Action: Use the Power Apps trigger to accept two input variables:
    • ExportData: The dataset.
    • HeaderName: The column headers.
  • Add input parameters:
    • Navigate to the trigger action.
    • Click Add an input, select Text type for both variables and label them.

2. Prepare Data

Add two Compose actions to process inputs.

  • Use these expressions:

For ExportData:

json(triggerBody()?['text'])

For HeaderName:

json(triggerBody()?['text_1'])

Add a Parse JSON action to structure the HeaderName input:

Content:

outputs('Compose_-_HeaderName')

Schema:

{
    "type": "array",
    "items": {
        "type": "object",
        "properties": {
            "Value": {
                "type": "string"
            }
        },
        "required": [
            "Value"
        ]
    }
}

Use a Select action to extract the values:

From:

body('Parse_JSON')

Map:

item()['Value']

Picture4
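Conceptually, the Parse JSON and Select actions above reduce the header records to a plain array of strings. A rough TypeScript sketch of the same transformation, outside Power Automate (values are illustrative):

```typescript
// After Parse JSON, HeaderName is an array of { Value: ... } records,
// because Power Apps wraps single-column table values in a "Value" field.
const parsedHeaders: { Value: string }[] = [
  { Value: 'Name' },
  { Value: 'Age' },
  { Value: 'Country' },
];

// Equivalent of the Select action: From = body('Parse_JSON'), Map = item()['Value']
const headers: string[] = parsedHeaders.map(item => item['Value']);
```

The resulting string array is what later gets handed to the Office Script as its header parameter.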

3. Setup Excel Template

Add a Get file content action to fetch a pre-defined Excel template from storage (e.g., SharePoint or OneDrive).

Use a Create file action to save the template as a new file:

Dynamic File Name (a guid() expression with the .xlsx extension appended):

concat(guid(), '.xlsx')

Convert the ExportData to a CSV format:

  • Add a Create CSV Table action:

From:

outputs('Compose_-_ExportData')

Picture5

Formatting Data with Office Scripts

Office Scripts are used to dynamically process and format data in Excel. Here’s how you implement it:

Set up the script

Open Excel and navigate to the “Automate” tab.

Create a new Office Script and paste the following code:

function main(workbook: ExcelScript.Workbook, headersArray: string[], csvData: string) {
  let activeWorksheet = workbook.getWorksheet("Sheet1");
  let csvRows = csvData.split('\n');
  csvRows = csvRows.map(row => row.replace(/\r$/, ''));
  let headerRow = csvRows[0].split(',');
  // Create a mapping of column headers to their indices
  let columnIndexMap: { [key: string]: number } = {};
  for (let i = 0; i < headerRow.length; i++) {
    let header = headerRow[i];
    if (headersArray.includes(header)) {
      columnIndexMap[header] = i;
    }
  }
  // Write the selected headers into the first row of the worksheet
  let range = activeWorksheet.getRangeByIndexes(0, 0, 1, headersArray.length);
  range.setValues([headersArray]);
  // Batch size for inserting data into Excel
  const batchSize = 500;
  let batchData: string[][] = [];
  let columncount = 0;
  // Loop through CSV data and filter/select desired columns
  for (let j = 1; j < csvRows.length; j++) {
    let rowData = parseCSVRow(csvRows[j]);
    let filteredRowData: string[] = [];
    for (let k = 0; k < headersArray.length; k++) {
      let header = headersArray[k];
      let columnIndex = columnIndexMap[header];
      filteredRowData.push(rowData[columnIndex]);
    }
    batchData.push(filteredRowData);
    // Insert data into Excel in batches
    if (batchData.length === batchSize || j === csvRows.length - 1) {
      let startRowIndex = j - batchData.length + 1; // Row 0 holds the headers
      let startColIndex = 0;
      let newRowRange = activeWorksheet.getRangeByIndexes(startRowIndex, startColIndex, batchData.length, batchData[0].length);
      newRowRange.setValues(batchData);
      batchData = [];
    }
    columncount = j; // Track the index of the last data row written
  }
  workbook.addTable(activeWorksheet.getRangeByIndexes(0, 0, columncount, headersArray.length), true).setPredefinedTableStyle("TableStyleLight8");
  activeWorksheet.getRangeByIndexes(0, 0, columncount, headersArray.length).getFormat().autofitColumns();

  // Exit any active named sheet view so the default view shows the update
  activeWorksheet.exitActiveNamedSheetView();
}
// Custom CSV parsing function to handle commas within double quotes
function parseCSVRow(row: string): string[] {
  let columns: string[] = [];
  let currentColumn = '';
  let withinQuotes = false;
  for (let i = 0; i < row.length; i++) {
    let char = row[i];
    if (char === '"') {
      withinQuotes = !withinQuotes;
    } else if (char === ',' && !withinQuotes) {
      columns.push(currentColumn);
      currentColumn = '';
    } else {
      currentColumn += char;
    }
  }
  columns.push(currentColumn); // Add the last column
  return columns;
}
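Because parseCSVRow is plain TypeScript, its quoted-comma handling is easy to verify outside Excel. The sketch below copies the function so it runs standalone (the sample row is illustrative):

```typescript
// Copy of the article's parseCSVRow: splits a CSV row on commas,
// ignoring commas inside double-quoted fields and stripping the quotes.
function parseCSVRow(row: string): string[] {
  let columns: string[] = [];
  let currentColumn = '';
  let withinQuotes = false;
  for (let i = 0; i < row.length; i++) {
    let char = row[i];
    if (char === '"') {
      withinQuotes = !withinQuotes;
    } else if (char === ',' && !withinQuotes) {
      columns.push(currentColumn);
      currentColumn = '';
    } else {
      currentColumn += char;
    }
  }
  columns.push(currentColumn); // Add the last column
  return columns;
}

// "Acme, Inc." stays a single field despite its embedded comma.
const parsed = parseCSVRow('101,"Acme, Inc.",India');
```

Note that the surrounding quotes are removed in the output, so the value lands in the worksheet unquoted.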

Picture6

Integrate with Power Automate

Use the Run script action in Power Automate to execute the Office Script.

Pass the header array and CSV data as parameters.

Picture7

Send the Updated File via Email

Once the Excel file is updated with Office Scripts, you can send it to recipients via Outlook email.

1. Retrieve the Updated File:

  • Add a Get file content action to fetch the updated file.

Use the file path or identifier from the Create file action.

outputs('Create_file')?['body/Id']

Picture8

2. Send an Email (V2):

  • Add the Send an email (V2) action from the Outlook connector.
  • Configure the email:
    • To: Add the recipient’s email dynamically or enter it manually.
    • Subject: Provide a meaningful subject, such as “Custom Data Export File”
    • Body: Add a custom message, including details about the file or process.
    • Attachments:
      • Name: Use a dynamic value
outputs('Create_file')?['body/Name']
        • Content: Pass the output from the Get file content action.
body('Get_file_content_-_Created_File')

Picture9

Integrating the Workflow

  1. Test the entire integration from Power Apps to Power Automate and Office Scripts.
  2. Verify the final Excel file includes the correct headers and data formatting.
  3. Confirm that the updated Excel file is attached to the email and sent to the specified recipients.

Result:

Excel

Picture10

Email

Picture11

How This Solution Saves Time

This approach is tailored for scenarios where users require a dynamic selection of column headers for custom reporting. Instead of spending hours manually formatting data and preparing reports, this solution automates the process end-to-end, ensuring:

  • Accurate data formatting without manual intervention.
  • Quick adaptation to changing requirements (e.g., selecting different report headers).
  • Seamless sharing of reports via email in just a few clicks.

This workflow minimizes errors, accelerates the reporting process, and enhances overall project efficiency by automating repetitive tasks.

Conclusion

You can create robust, dynamic workflows for exporting and transforming data by combining Power Apps, Power Automate, and Office Scripts. This approach saves time, reduces manual effort, and ensures process consistency. Adding email functionality ensures the updated file reaches stakeholders without manual intervention. Whether you’re managing simple data exports or complex transformations, this solution provides a scalable and efficient way to handle Excel data.

Getting Started with VBA Programming: Types of VBA Macros
https://blogs.perficient.com/2025/01/06/types-of-vba-macros/
Mon, 06 Jan 2025 14:53:57 +0000

What is VBA?

Visual Basic for Applications (VBA) is a programming language developed by Microsoft. It is used primarily in Microsoft Office applications like Excel, Word, and Access to automate repetitive tasks.

Types of VBA Macros

VBA macros are custom scripts created to automate tasks and improve efficiency within Microsoft Office applications. VBA categorizes macros by their functionality and the events that trigger them, ranging from simple recorded macros to complex event-driven scripts. Here's a breakdown of the most commonly used types:

A visually appealing infographic showcasing VBA (Visual Basic for Applications) and its different types of macros.

 

1. Recorded Macros

  • Description: A sequence of actions carried out within an Office application is recorded to create these macros. VBA translates these actions into code automatically.
  • Use Case: Great for automating repetitive tasks without manually writing code.
  • Example: Automatically applying consistent formatting to a set of worksheets in Excel.

Learn more about how to record macros in Excel.

2. Custom-Coded Macros

  • Description: These are manually written scripts that perform specific tasks. They offer more flexibility and functionality than recorded macros.
  • Use Case: Useful for complex tasks that require conditional logic, loops, or interaction between multiple Office applications.
  • Example: Generating customized reports and automating email notifications from Outlook based on Excel data.

3. Event-Driven Macros

  • Description: These macros run automatically in response to specific events, such as opening a document, saving a file, or clicking a button.
  • Use Case: Used for automating tasks that should happen automatically when a certain event occurs.
  • Example: Automatically updating a timestamp in a cell every time a worksheet is modified.

4. User-Defined Functions (UDFs)

  • Description: These are custom functions created using VBA that can be used just like built-in functions in Excel formulas.
  • Use Case: Ideal for creating reusable calculations or functions unavailable in Excel by default.
  • Example: Creating a custom function to calculate a specific financial metric.
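As a hedged sketch of such a UDF (the metric, names, and signature are illustrative, not from the original post), a compound annual growth rate function might look like this:

```vba
' Custom worksheet function: compound annual growth rate (CAGR).
' Usage in a cell: =CAGR(100, 150, 3)
Public Function CAGR(StartValue As Double, EndValue As Double, Years As Double) As Variant
    If StartValue <= 0 Or Years <= 0 Then
        CAGR = CVErr(xlErrNum) ' Return #NUM! for invalid input
    Else
        CAGR = (EndValue / StartValue) ^ (1 / Years) - 1
    End If
End Function
```

The Variant return type lets the function return either a number or a worksheet error value.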

5. Macro Modules

  • Description: A module is a container for VBA code, which can include multiple macros, functions, and subroutines. Related macros can be grouped together and organized using these.
  • Use Case: Useful for keeping code organized, especially in large projects.
  • Example: Group all macros related to data processing in one module and all macros associated with reporting in another.

Each type of macro serves a distinct function and suits specific tasks, depending on the requirements. Use these macros actively based on your needs to achieve the best results.

Conclusion

VBA allows you to automate operations and increase productivity in Microsoft Office programs. Understanding the various types of macros helps you select the best approach for your requirements, whether you are recording actions, writing custom scripts, or creating event-driven automation. Choosing the right type of macro can significantly improve your productivity and streamline your workflow. Begin learning VBA to achieve new levels of efficiency in your workflows.

Happy reading and automating!

Power BI & Excel Connectivity: Scenarios Which Can Break Dashboard
https://blogs.perficient.com/2023/09/18/power-bi-excel-connectivity-scenarios-which-can-break-dashboard/
Mon, 18 Sep 2023 07:35:18 +0000

Background

Excel is the most widely used spreadsheet software today, used at every level of an organization. A huge amount of unorganized data is maintained in Excel workbooks, owing to the ease of quickly creating, storing, and sharing Excel files compared to a database. As a result, many Power BI reports and dashboards use Excel as a data source.

Excel's design enables it to act as a quasi-DBMS, with individual worksheets acting as tables and the workbook as a database. But Excel, being spreadsheet software, lacks enforcement of constraints, making it vulnerable to breaking an entire Power BI report in certain scenarios. This blog showcases a few scenarios that developers need to watch for when using Excel as a data source for a Power BI report.

 

Scenario: Data Type mismatch

Excel supports data types like text, number, date, time, and logical. Unfortunately, it does not strongly enforce data types in their respective columns. For example, users are free to type text into date columns, and so on. Excel's Data Validation rules can enforce this, but those rules can easily be deactivated or deleted in a few seconds.

Excel Invalid Datatype

As shown in the above screenshot, the data contained valid dates when the initial Power BI report was prepared. But one day, a user entered question marks (???) in the Date column because the transaction date was unknown at data-entry time, intending to fill in that information once it became available. Such placeholder values generate errors, and Power BI attempts to skip these rows.

Powerbi Invalid Datatype

Power BI Desktop will surface the error as shown in the above screenshot. But Power BI Service might not show the error up front, silently skipping those rows while loading the remaining data. This can affect reporting, since amounts on those rows are never added because the rows were not imported into the data model.

 

Scenario: Summary Rows at the bottom

Many people have a habit of calculating grand totals at the bottom of data in Excel (refer below screenshot).

Summary Total Habit

This can ruin reporting in Power BI, as this row also gets incorporated into the data, thereby inflating sum totals. Below are comparative images of a section of the Power BI report, with summary cards showing different figures before and after the summary row.

Total Before Summary Row

Total After Summary Row

Proper care needs to be exercised when such Excel data is intended to be used as a data source for Power BI. End users of the Excel workbook need to be informed of this, and Excel summarization should be done in separate worksheets to prevent it.

 

Scenario: Gaps in Data

Sometimes users insert gaps in data rows (typically observed when adjusting the print preview range for printing purposes).

Gaps In Rows Excel

Power BI imports data including blank rows. The majority of calculations are not affected, except for a few DAX functions that include blank rows in their calculations.

Dax Countrows Function

Above is the result using the COUNTROWS ( ) function, which also includes blank rows in the calculation result.

Dax Count Function

Calculation results differ a bit with a function like COUNT ( ), since it excludes blank cells while counting.

Some developers prefer the COUNTROWS ( ) function, as it yields results faster (it simply returns the row count of the table), whereas COUNT ( ) is relatively slow since it evaluates the value of each cell while calculating. A Power BI report developer needs to account for these scenarios and develop measures accordingly.
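The difference between the two functions can be captured as two measures (a sketch with illustrative table and column names):

```dax
// Counts every imported row, including fully blank ones
Rows Including Blanks = COUNTROWS ( Sales )

// Counts only rows where Sales[Amount] is not blank
Rows With Amount = COUNT ( Sales[Amount] )
```

On data with gaps, the first measure returns a larger number than the second, which is exactly the discrepancy shown in the screenshots.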

Gaps Blank Option Slicer Value

Gaps also create blank options in the slicer dropdown, which does not appear professional.

This mess can be avoided by adding an extra Power Query step that removes empty rows (refer to the image below).

Removing Row Gaps

 

Scenario: Renaming of Column(s)

End users often inadvertently change column titles for better understanding or readability. Some business users might not like the technical name of a column and rename it before creating pivot tables or charts. In the example below, the Excel workbook user changed the column name Amount to Amount (in Rs): the organization has multi-currency reporting, so the user wants the column title to convey that the amounts are in Indian Rupees.

Excel Column Title Rename

Renaming causes the dataset refresh to fail for Power BI reports, because the column was titled Amount when the report was originally developed. Power Query stores the column names derived from Excel in the M script that imports the Excel data.

Below is the error displayed when the Power BI report is opened in Power BI Desktop.

Column Rename Error Desktop

Report viewers need to be a bit vigilant in monitoring refresh errors, since they show up only as a small error icon, as in the image below.

Column Renaming Service Error Icon

On clicking the error icon, a message like the one below is shown, which explains the error in detail.

Column Renaming Service Error Message

Report users should have their email IDs added to the refresh failure notification triggers. Otherwise, Power BI keeps displaying data from the last successful refresh, which is even more disastrous.

Powerbi Service Dataset Failure Notification Configuration

 

Scenario: Cell Errors

Sometimes Excel formulas break because cells the formula referred to were deleted, or for other miscellaneous reasons. This results in a cell error (as shown in the screenshot below).

Excel Cell Error

Just like the data type mismatch discussed above, when the report is refreshed from Power BI Desktop it displays the count of rows with errors, but in Power BI Service these errors are silent. Power Query can perform basic handling for these errors, such as substituting another value, but since the error originates at the source, fixing it in the source is more sensible than handling it in Power BI.

 

Scenario: Renaming / Moving of Excel file

Power BI stores an absolute path when referencing a source file (refer to the screenshot below).

Excel File Drive Reference

 

 

So, if the file is moved to another folder, or renamed, then the path needs to be updated in the Power BI report too.

The same applies to Excel files referenced from SharePoint (refer to the screenshot below).

Excel File Sharepoint Reference

 

 

Renaming the file or moving it to a different folder changes the SharePoint URL, which then needs to be updated.

Report developers can introduce parameters and link the file path or URL to them; parameters are easy to update from Power BI Service without having to download, modify, and re-publish the Power BI report. It is not a solution, just an easy workaround.

Google Sheets has an advantage in this scenario compared to Microsoft Excel, as links to Google Sheets do not change when a file is renamed or moved. Google Sheets assigns a unique identifier to each file that is independent of its name or location. Power BI supports Google Sheets as a data source, and one can leverage this if renaming or moving files is unavoidable and happens frequently as a normal business scenario.

 

Conclusion

Excel might be a preferred choice of data source, but one needs to think from a broader perspective when using it for analytics and reporting. Moving some of the Excel-based data entry into Power Apps would be a strong solution, as forms can validate data before storing it. Power Apps uses Dataverse as a backend, to which Power BI can connect easily. At an organizational level, this approach provides stronger reporting capability compared to Excel.

Accessing Power BI confidential data in Excel for internal organization users
https://blogs.perficient.com/2022/09/06/accessing-power-bi-confidential-data-in-excel-for-internal-organization-users/
Tue, 06 Sep 2022 10:28:53 +0000

Background

Microsoft Excel is a popular and preferred spreadsheet solution for quick day-to-day reporting in the majority of corporations and businesses in the world. Corporate users often need access to the organization's data in Excel for further development of MIS reports. Power Query is a powerful tool embedded in Excel that can connect to internal database servers, online CRM/ERP services, Excel files, and more, hosted within an organization's network or in the cloud. Apart from that, users also heavily hyperlink to other Excel files, reusing existing source data already prepared by other people.

 

Challenges / Issues

Numerous challenges or issues occur when a user attempts to refer to data stored in another Excel file (via formulas) or Database Server / ERP (via Power Query) as below:

  • There is a high probability that the original source Excel file being referenced gets deleted by its owner, or its structure gets modified, eventually breaking cell references. This leads to errors in reports.
  • Data imported via Power Query arrives in the form of a table. Tables are generally not suitable for reporting, as various reporting authorities prescribe a fixed, pre-defined format in which to prepare and submit a report.
  • Power Query is a bit technical, and not every user is able to grasp it.
  • A database connection requires a server IP address, credentials, etc. Exposing such critical information can lead to security lapses, allowing easy access to the organization's internal data.

 

Solutions

Microsoft offers two ways to connect to Power BI published data from Excel:

  1. Direct connectivity to a Power BI published dataset via Power Pivot, which also supports creating DAX measures.
Power Bi Published Datasets
  2. Access to designated tables of Power BI datasets, which can be used as Organizational Data Types.
Power Bi Organizational Datasets

 

Power PIVOT

A dataset is essentially a collection of tables related using Power BI's relationship feature. Excel lacks an easy and direct way of analyzing relational data; one needs functions like VLOOKUP, XLOOKUP, MATCH, and INDEX to correlate data. Power Pivot provides an easy way of reusing a Power BI published dataset, with its relationships already defined, to design reports as pivot tables or pivot charts directly in Excel.

Steps to analyze a Power BI published dataset using Power Pivot:

  1. In Excel, navigate to Data tab > Get Data > From Power BI <organization_name>
    Pv Step 01
  2. A right-hand pane will appear. Select the target dataset from the list.
    Pv Step 02
  3. Excel will create a new worksheet with a blank pivot table designer. Drag and drop fields into the pivot just like a regular pivot table.
Pv Step 03

 

Organizational Data Types

Excel supports 3 types of data types natively:

  1. Text
  2. Number (includes Date-Time)
  3. Logical (True/False)

A formula is not a data type but evaluates to one of these three data types. The Office 365 version of Excel introduced support for a new data type called the Linked Data Type, which is of type record. A linked data type holds a reference to a record (or row) containing multiple fields, so it is effectively a cell that can hold multiple values internally (refer to the screenshot below).

Screenshot Linked Datatype Cell

(linked data types have an icon as prefix in cell value)

The value of a linked data type cell can be extracted into another cell by referencing it with the formula =cell_reference followed by a dot, which then enumerates the field names in that record-type cell (refer to the screenshot below).

Screenshot Linked Datatype Formula Reference

(evaluated screenshot below)

Screenshot Linked Datatype Formula Evaluated

Excel includes a few built-in linked data type sources such as Stock Market, Currency, Geography, and so on. Apart from that, any Power BI dataset table can be promoted into a custom linked data type, available only to Excel users of that organization. Such data types are available as organizational data types.

Organizational Linked Data Types can be created using Power BI by setting a table as a featured table and then publishing it (screenshot below).

Screenshot Powerbi Featured Table

 

Advantages of Organization Linked Data Types approach:

  • This approach prevents exposure of the source Excel file or Database to the end user, thereby enforcing privacy.
  • Power BI Service supports an elegant system of access control by designating workspace access to a specified user group, which also gets applied to Organizational Data Types.
  • The user does not need to re-import source data into a separate worksheet to set up VLOOKUP to fetch values of other fields. The resulting Excel file is lightweight, with a smaller size and fewer formulas in it (explained in the case study).
  • Organizational Data Types work seamlessly in Excel Online (browser-based Excel). The user does not even need to be on the organization's VPN to access source data. Power Query or external Excel file references require the user to be on the organization's LAN, which is a downside.

 

Case Study: VLOOKUP vs Organization Data Types

Scenario:

The HR of an organization needs to prepare an Excel file with some analysis for individual employees. He exports an Employee Master Excel file from the ERP and then manually copies and pastes the Employee Master data from that file every month. Currently he references and links to this master data using the VLOOKUP function (as per the screenshot below). He maintains this Master worksheet in many Excel files and has to update it manually.

Screenshot Method Vlookup

In the above approach, if HR fails to update the Employee Master sheet, it can lead to incorrect reporting and decision-making. Also, if any column gets added to or removed from the Employee Master in the future, the VLOOKUP function's column references will need to be modified manually.

As a BI consultant, what solution can you offer?

Solution:

Power BI supports connectivity to popular ERPs, databases, Excel files, etc. We will simply create a dataset in Power BI, extracting this data from the ERP (via Power Query transformations if required). Then, without creating any visualization, we will publish the dataset to the desired workspace of HR, setting the featured table in the modeling window of Power BI. This enables HR to view the tables of Power BI datasets shared with him. Afterwards, we simply remove VLOOKUP and replace it with cell references, as demonstrated below:

Screenshot Method Organization Data Type

 

Compatibility

Everything explained and demonstrated in this blog is compatible with the Office 365 version of Excel (desktop and web). The user needs to be on an Office 365 Business or Enterprise subscription. Office 365 Personal/Home subscriptions and perpetual editions of Office (2013, 2016, 2019, 2021, etc.) do not support all these features, as they require an associated organizational domain, which is missing in those editions.

 

 

Sitecore Content Hub Tips and Tricks: Enabling Excel Uploads
https://blogs.perficient.com/2022/08/11/sitecore-content-hub-tips-and-tricks-enabling-excel-uploads/
Thu, 11 Aug 2022 16:04:12 +0000

You signed into Sitecore Content Hub and you have your Excel sheet ready with assets to upload. You go to the Create page and … wait! Where is the option to upload from Excel? Don’t worry! If you have the right privileges or can grab the nearest Super User, you can enable the option!

Out of the box, Content Hub will only provide the basic Upload option, where you can browse or drag and drop files.

Enabling Excel uploads is as easy as following these simple steps:

Click the Manage icon in the upper right hand corner. It looks like a gear wheel.

Blog Sitecore Hubspot Icon

On the Manage dashboard, choose Pages.

Blog Sitecore Hubspot Pages

Once in Pages, navigate from Home to its User Import child page.

Blog Sitecore Hubspot Pages Home

 

Blog Sitecore Hubspot Pages Home User Import

Once you select User Import, you’ll need to access the Creation component. It’ll be in the layout to the right. Click the three dots and choose Edit.

Blog Sitecore Hubspot Pages Home User Import Creation

You’ve made it! Inside the Creation component, toggle Import Excel to on (green). You can now import from an Excel file.

Blog Sitecore Hubspot Pages Home User Import Creation Excel Toggle

When you return to the Create page, the Upload button will now say Add. Click it and you’ll see the new option Import Excel.

Blog Sitecore Hubspot Download Options

Check back for the next Sitecore Content Hub post, where I’ll show you how to prepare your Excel sheet for a bulk upload.

A tour of PowerQuery’s M language https://blogs.perficient.com/2022/04/22/a-tour-of-powerquerys-m-language/ https://blogs.perficient.com/2022/04/22/a-tour-of-powerquerys-m-language/#respond Fri, 22 Apr 2022 14:43:02 +0000 https://blogs.perficient.com/?p=308564

In a previous post, I introduced PowerQuery and demonstrated how to perform various SQL-like operations. This article gives a tour of PowerQuery’s M language that underlies each query.

let and in

If you select a query and click on the “Advanced Editor” button in the Home tab, you’ll see something like this:

Image 20220421150214466

This is the M language code that constitutes our query. We’ll soon come back to the above code, but for now, let’s gain a basic understanding of how M works.

The first thing to know about M is that most M scripts are of the form let ... in .... In such a script, intermediate computations happen inside the let statement, and the content after in is the script’s return value.

For example, when the M code

let
     x = 3,
     y = x + 5
in
     y

is the script underlying a query, then that query appears as follows in the GUI:

Image 20220421103907452

Interestingly enough, it is not actually necessary for a script to contain the keywords let and in, so long as the content of the script evaluates to a value. For instance,

x = 5

is a perfectly valid M script!

So, it is more accurate to say that

  • The contents of every M script must evaluate to a value.

  • let ... in ... evaluates the content after in. Therefore, since let ... in ... evaluates to a value, any script may be of the form let ... in ... .

We should also note that one can place the code of the form x = let ... in ... within any existing let block, and then make use of x!
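For example, here is a minimal sketch (variable names arbitrary) of a let ... in ... expression nested inside another let block:

```
let
    // x is bound to the value of an inner let ... in ... expression
    x = let a = 2, b = 3 in a + b,
    y = x * 10
in
    y    // the query evaluates to 50
```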

let ... in ... vs. select ... from ...

In my opinion, the let ... in ... syntax doesn’t really make much sense. I think the M language would make much more sense if there were no let or in, and every script simply returned the value of its last line.

It seems to me that let ... in ... is supposed to evoke connotations with SQL’s select ... from .... Comparisons between let ... in ... and select ... from ... quickly break down, though:

  • The data source in a SQL query is specified in the from clause, while the data source of a let ... in ... statement typically appears in the let clause.

  • The result set of a SQL query is determined primarily from the select clause, while the result of a let ... in ... statement is whatever comes after in.

 

 

Autogenerated M code

Now that we have some knowledge about let ... in ..., we can look at some sample M code that is autogenerated after using the GUI to create a query:

let
     Source = Excel.CurrentWorkbook(){[Name="Table1"]}[Content],
     #"Changed Type" = Table.TransformColumnTypes(Source,{{"col1", Int64.Type}, {"col2", type text}, {"col3", type text}}),
     #"Filtered Rows" = Table.SelectRows(#"Changed Type", each [col1] = 1 or [col2] = "b")
in
     #"Filtered Rows"

Looking closely at the above code teaches us two important facts about the M language:

  1. Variable identifiers can be of the form #"{string}", where {string} is any string of characters.

  2. The autogenerated M code corresponding to each “step” in a PowerQuery query references the previous step. (E.g., when computing #"Changed Type", we pass Source to Table.TransformColumnTypes()).

If we consult the M documentation for any of the functions (Excel.CurrentWorkbook(), Table.TransformColumnTypes(), Table.SelectRows()) in the above, we also see that

  1. The objects that represent each “step” in a query are of type table.

M data types

  • The Microsoft documentation describes M as having the following primitive types: binary, date, datetime, datetimezone, duration, list, logical, null, number, record, text, time, type.

  • There are also “abstract types”: function, table, any, and none.

  • Types in M can be declared as nullable.

  • Some types represent types ( type number and type text are such types).

Lists and records

In M, the basic collection types are lists and records. Lists are 0-indexed.

Lists are essentially “arrays”, and records map string-valued “keys” to “values.” (So records are essentially “dictionaries”/”hashmaps”).

To initialize a list, use code such as lst = {1, "a", 2, false}. To initialize a record, use code such as rec = [key1 = 1, key2 = "blah"]. To access the ith element of a list, use lst{i}. To get the value associated with a key such as key1 in a record rec, use rec[key1].
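Putting the list and record syntax together in one small sketch (values arbitrary):

```
let
    lst = {1, "a", 2, false},
    rec = [key1 = 1, key2 = "blah"]
in
    // lst{1} is the second element ("a"); rec[key2] is "blah"
    {lst{1}, rec[key2]}
```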

M uses functional programming

In M, we use functional programming constructs in the place of looping constructs. The go-to functional programming construct is the function List.Transform(). Given a list lst and a function fn, List.Transform(lst, fn) returns the list that is the result of applying fn to each element of lst.
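For instance, a sketch of List.Transform() applied with an explicit anonymous function:

```
// Applies (x) => x * 10 to each element, producing {10, 20, 30}
List.Transform({1, 2, 3}, (x) => x * 10)
```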

The function List.Generate() can also be handy. Whenever you can’t think of a good way to solve your problem by using List.Transform(), and it is actually best to essentially implement a for loop, use this code to do so:

List.Generate(() => 0, each _ < n, each _ + 1, {statement})

It will evaluate {statement} n times, collecting the results into a list.
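As a concrete sketch, the first three arguments of List.Generate() alone already implement a counting loop:

```
// Start at 0, continue while less than 5, increment by 1: yields {0, 1, 2, 3, 4}
List.Generate(() => 0, each _ < 5, each _ + 1)
```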

User-defined functions

Writing user-defined functions in M can prove very useful. In my work, I found that I needed to repeat a certain sequence of steps many times. If I were to manually rewrite these steps with the PowerQuery GUI repeatedly, I would drive myself insane and end up with way too many PowerQuery steps. But since I created a user-defined function to perform that sequence, I was able to collapse those steps into a single step!

The syntax for defining a custom function uses anonymous function syntax.

fn = (x) => x * x

(If you were to evaluate fn(x) elsewhere in the script, that invocation fn(x) would return x * x).

The query whose M script is the above looks like this in the GUI:

Image 20220421120442467

Global variables and global functions

When a variable or function is used multiple times in multiple scripts, it is best practice to separate the definition of the variable or function from all of the scripts that use the variable or function. To define a global variable with a value of, say, 5, use the Advanced Editor* to make a query’s M code

5

Then, change the name of the query to be the desired identifier for the variable.

Since functions are variables of type function, the process for defining a global function is the same. For example, to declare a global function named fn that sends x to x * x, create a query whose name is fn, and edit the query’s M code with the Advanced Editor* so that it is

(x) => x * x

* If you use the smaller editor instead of the Advanced Editor, you will have to prepend an equals = to the beginning of your code to avoid errors.

Accessing the “current” table row

Recall that the function call that implements the equivalent of a general where clause looks something like

Table.SelectRows(#"Changed Type", each [col1] = 1)

There are a several concepts at play here we glossed over before that deserve explanation.

  • Rows of tables are represented as records. If row is a record that represents some row of a table, the value in column col of that row is row[col].

  • The second argument of Table.SelectRows() is a function whose input is a record that represents the “current row” of the table and whose output is a logical (i.e. a boolean) that indicates whether or not to include the current row in the result set.

  • _ is a valid variable name in M, and so the function (_) => fn(_) is the same as the function (x) => fn(x) . For example, the function (_) => _ * _ is the same as the function (x) => x * x.

  • The each keyword is shorthand for the syntax (_) =>.

  • Whenever a variable var appears in square brackets to the right of an each, M interprets [var] as meaning _[var]. Therefore, an expression such as each [var] is the same as (_) => _[var].

Knowing all of these things, we see that the above code translates to

Table.SelectRows(#"Changed Type", (_) => _[col1] = 1)

Since you might be uncomfortable with using _ as a variable, let’s consider another equivalent function call:

Table.SelectRows(#"Changed Type", (row) => row[col1] = 1)

Here, we understand (row) => row[col1] = 1 to be the function that takes in a record representing the current row, looks at the value in this record associated with the key col1, and returns true whenever that value is equal to 1. Thus, the above code selects the rows from the table that have a value in column col1 of 1.

Data exploration with PowerQuery https://blogs.perficient.com/2022/04/22/data-exploration-with-powerquery/ https://blogs.perficient.com/2022/04/22/data-exploration-with-powerquery/#respond Fri, 22 Apr 2022 14:29:53 +0000 https://blogs.perficient.com/?p=308553

Microsoft’s PowerQuery is a neat tool that allows one to perform SQL-like operations on Excel tables.

When investigating a database, I actually prefer using PowerQuery over raw SQL for a couple reasons:

  • PowerQuery displays result sets that are much easier to look at than a typical SQL plaintext result set.

  • It’s easy to immediately interact with PowerQuery result sets by using the graphical user interface.

  • Most importantly, you write PowerQuery queries one step at a time and can therefore easily sanity check a query as you write it. (It’s tedious to do so in raw SQL).

If you frequently use SQL to investigate databases, I highly recommend that you try out PowerQuery.

To try PowerQuery out on some test data, just create an Excel Table*, then select any cell within that Table, go to the Data tab at the top of the screen, and click “From Table/Range”. (* To create an Excel Table: enter some random data into a rectangular range of cells, then select any cell within that range, go to the Insert tab at the top of the screen, and click “Table”).

Here’s what happens if I have the following Excel Table:

Image 20220421090512615

After I select a cell from the above table, and click “From Table/Range”, the PowerQuery editor pops up:

We can see that PowerQuery has represented my Excel Table as a query. We can also see the graphical user interface that allows us to interactively add steps to said query.

PowerQuery equivalents to SQL constructs

It’s instructive to think about how we can accomplish various SQL constructs within PowerQuery.

  • To do the equivalent of a select statement, and select a subset of columns from the result set, we would click on the “Choose Columns” button (visible above).

  • To do a select distinct, we use “Choose Columns” to execute the desired select, and then, in the following result set, select all columns, right click, and select “Remove Duplicates”.

  • Accomplishing the equivalent of a where clause (selecting the subset of rows from the result set for which a certain condition is true) is a bit hacky in general. (We describe how to do this later.) When the condition involves only one column, though, we can do everything in a non-hacky way. If we want to filter the above result set for rows with col1 = 1, we would click the downwards arrow inside the col1 header, and use either the “Number Filters” option or the checkbox next to “1” in the following menu:

    Image 20220421091644894

  • To do a group by, we go to the Transform tab at the top of the screen, and click “Group By”.

  • To do a join (whether inner, left, right, full outer, etc.), we click “Merge Queries” from within the Home tab. To do a union, we click “Append Queries” from within the Home Tab.

    • To increase encapsulation, one can use the “Merge Queries as New” or “Append Queries as New” options to produce a table that is the result of joining or unioning two existing tables.

      Image 20220421093022520

General where clauses

Above, we noted that accomplishing a where clause that involves more than one column is a bit hacky. We describe how to write such a where clause here. It’s really not that bad: first, just click the downwards arrow inside any column’s header, and filter for anything you like. I’ve done so, and filtered the above data for rows with col1 = 1:

Image 20220422081254808

Notice the code that appears in the bar that runs horizontally over the top of the table:

= Table.SelectRows(#"Changed Type", each [col1] = 1)

This code provides a more low-level description of what the “Filtered Rows” step of the query is doing. You can probably guess how we accomplish a general filter (one that involves columns other than col1). If we wanted to change the filtering condition to, say, col1 = 1 or col2 = "b", then what we do is edit said code to be

= Table.SelectRows(#"Changed Type", each [col1] = 1 or [col2] = "b")

It works! We get

Image 20220422081358076

In general, any column of the table can be referenced in an “each statement” such as the above by enclosing the column name in square brackets. Soon, we’ll learn more about what this square bracket notation actually means, and why it must come after the keyword each.
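For instance, a filter over several hypothetical columns, edited directly in the formula bar:

```
= Table.SelectRows(#"Changed Type", each [col1] = 1 and [col2] = "b" and [col3] <> null)
```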

Clarifying Excel’s lookup functions https://blogs.perficient.com/2022/02/09/clarifying-excels-lookup-functions/ https://blogs.perficient.com/2022/02/09/clarifying-excels-lookup-functions/#respond Wed, 09 Feb 2022 20:44:43 +0000 https://blogs.perficient.com/?p=304592

I’ve decided to write some of my own documentation for common use cases of the Excel functions LOOKUP, VLOOKUP, HLOOKUP and XLOOKUP because the official documentation is pretty confusing. It uses “lookup value” as a synonym for “key”, when one would conventionally expect a “lookup value” to be a synonym for “value”! (After all, in the typical key-value terminology, “values” are obtained as the result of looking up “keys”!)

Before jumping in, here’s a quick overview. All four lookup functions essentially return the result of the pseudocode values[keys.indexOf(key)], where, given arrays of “keys” and “values” named keys and values, respectively, keys.indexOf(key) is the index of key in the array keys. Additionally,

  • LOOKUP is the simplest of the four functions: it pretty much looks up “values” from “keys” as you would expect.

  • The “V” and “H” in VLOOKUP and HLOOKUP stand for “vertical” and “horizontal”, respectively; in VLOOKUP, the provided 1D ranges must be columns, and in HLOOKUP they must be rows.

  • XLOOKUP combines the functionality of VLOOKUP and HLOOKUP, and allows the provided 1D ranges to be either rows or columns. (If you have access to XLOOKUP, you should prefer it over VLOOKUP and HLOOKUP. At the time of writing, though, you need a Microsoft 365 subscription to use XLOOKUP).

Without further ado, here is my documentation.

LOOKUP

Syntax: LOOKUP(key, keys, values).

Returns the result of the pseudocode values[keys.indexOf(key)], where keys.indexOf(key) is the index of the key in keys, when keys is treated as an array.

key – a value that exists in keys

keys – a 1D range of “keys”

values – a 1D range of “values”

Notes:

  • The official documentation mentions an “array form” of a LOOKUP invocation. I don’t cover that here (the above summarizes the “vector form”) because VLOOKUP, HLOOKUP, and XLOOKUP accomplish the same thing as the “array form”.
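As a quick sketch of the vector form, assuming keys live in A2:A4 and values in B2:B4 (layout hypothetical):

```excel
=LOOKUP("b", A2:A4, B2:B4)
```

Note that LOOKUP performs an approximate match and expects keys to be sorted in ascending order, or results may be wrong.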

 

VLOOKUP

Syntax: VLOOKUP(key, table, valuesIndex, fuzzyMatch).

Returns the result of the pseudocode values[keys.indexOf(key)], where keys is the column of “keys”, “values” is the column of “values”, and where keys.indexOf(key) is the index of the key in keys, when keys is treated as an array.

key – a value that exists in keys

table – a 2D range that contains the column of “keys” and the column of “values” OR a table that contains the column of “keys” and the column of “values”

valuesIndex – the column index (into table) of the column of “values”

fuzzyMatch – whether or not to fuzzily match key with values in the column of “keys” (you almost always want to use fuzzyMatch = FALSE)

Notes:

  • To create a table that you would use for the table argument, select the 2D range that is to be registered as a table. Then, go to the Insert tab, click Table, and then click OK.

  • You might ask: “Why would we want to specify a table that the “key” and “value” columns reside in? Why not just specify the ‘key’ and ‘value’ columns?” The reason it’s advantageous to have this table parameter is that, if we are calling VLOOKUP multiple times and varying valuesIndex between calls, we will get an error message whenever valuesIndex ventures outside the bounds of table. This error message can prevent us from making erroneous computations.
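A sketch, assuming the “keys” are in the first column and the “values” in the second column of the range A2:B10 (layout hypothetical):

```excel
=VLOOKUP("b", A2:B10, 2, FALSE)
```

Here valuesIndex is 2 because the “values” are the second column of table, and fuzzyMatch = FALSE requests an exact match.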

HLOOKUP

HLOOKUP works in the same way as VLOOKUP, with the only difference being that the “keys” and “values” must be stored in rows instead of columns.

XLOOKUP

Syntax: XLOOKUP(key, keys, values).

Returns the result of the pseudocode values[keys.indexOf(key)], where keys.indexOf(key) is the index of the key in keys, when keys is treated as an array.

key – a value that exists in keys

keys – a 1D range of “keys”

values – a 1D range of “values”

Notes:

  • A 1D range can be either a row or a column.
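A sketch; unlike VLOOKUP, the keys and values ranges are passed directly, and either may be a row or a column (layouts hypothetical):

```excel
=XLOOKUP("b", A2:A10, B2:B10)
=XLOOKUP("b", B1:J1, B2:J2)
```

The first call stores keys and values in columns; the second stores them in rows.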

OneStream – Importing Non-Default Descriptions https://blogs.perficient.com/2021/05/27/onestream-importing-non-default-descriptions/ https://blogs.perficient.com/2021/05/27/onestream-importing-non-default-descriptions/#respond Thu, 27 May 2021 13:55:35 +0000 https://blogs.perficient.com/?p=292161

Does your organization need to produce reporting with descriptions other than the Default Description specified on a member?  If yes, this blog post is for you as I will demonstrate how to leverage Excel and a metadata import to update member descriptions that are not the Default Description.  The reason for a custom solution is that currently, the Metadata Builder does not support descriptions other than the Default Description.  This blog will demonstrate the process by first creating the French description and then developing the XML for the metadata import utilizing Excel and Notepad++.

Before I begin reviewing the custom solution and to level set the discussion, a OneStream application with a default install and configuration has Culture Codes for English (United States) “en-US” and French (France) “fr-FR” which are displayed in the next image in addition to the Default Description.  In this blog, French (France) will be utilized as an example of a Non Default Description.

Blog2021 05 01

As mentioned previously, the reason for a custom solution is that currently, the Metadata Builder does not support descriptions other than the Default Description.  To begin the process, launch Excel and log into the appropriate OneStream application which for this blog will be the Golfstream application.

After you have logged into the Golfstream application, create a Quick View which will return the members that require a description update.  In this example, I am creating a list of Balance Sheet base members after selecting CorpAccounts as the dimension.  The Member Filter is displayed in the first image and the Quick View is displayed in the second image.  Note, this presentation was created by selecting the Annotation view member as data is not needed and the Row Header Text Type is set to Name.

Blog2021 05 02

Blog2021 05 03

With the Quick View created and the members displayed, select the option “Convert to XFGetCells” from the OneStream XF Ribbon.

Blog2021 05 04

Select “OK” when the Extensible Finance dialog box renders.

Blog2021 05 05

Next, the description of the member will be retrieved by replacing the XFGetCell represented by “#REFRESH” displayed in column B and initially in cell B2 of the first image with the following formula: =XFGetMemberPropertyVolatile("Account",A2,"Description","","",""). With the formula added to cell B2, copy the formula and replace the XFGetCell formulas. After the selection of “Refresh Sheet”, the member and member description are displayed in the second image.

Blog2021 05 06

Blog2021 05 07

With the member in column A and the member’s Default Description in Column B, the French description will be added to column C starting in cell C2.  To begin the process, click cell “B2” and then select “Review -> Translate”.

Blog2021 05 08a

The Translator pane will open and the From dialog box detects the language as well as places the selected value in both the From and To text box.

Blog2021 05 09

Update the To language from “English” to “French” and select the “Tab” key to cause the translation to occur which is displayed in the second image.

Blog2021 05 09a

Blog2021 05 11

Select the French translation of the member description and then copy/paste the translated value to cell C2.

Blog2021 05 12

Repeat this process for the other descriptions by selecting the next cell “C3” and the description will automatically translate if the Translator pane was not closed.  Copy/paste the translated value to cell C3 and repeat this process for the other rows.  Note, the selection of multiple cells will Translate all of the selected cells; however, a copy/paste will result in one row instead of multiple rows.  By adding a delimiter to the description, the one row can be parsed to multiple columns using “Text to Columns” and then when the copy/paste occurs the transpose option can convert the multiple columns to multiple rows.

Blog2021 05 13

Before we transition to creating the XML to import into OneStream, the following formula =XFGetMemberProperty("Account",A2,"DisplayMemberGroup","","","") will be added to cell D2 to return the security member specified for the DisplayMemberGroup property. Once this formula is added to cell D2, copy the formula to the member rows.

Blog2021 05 14

Having leveraged the Translate function of Excel to convert the English description to the French description, the next several steps will be to develop an Excel formula to create the XML for the metadata import.  To begin, the format of the XML can be determined by extracting the metadata of a member with a French (France) description and the next image displays the XML needed to import one member.  Note, the extract which occurred was modified to enable the presentation in the image.

Blog2021 05 15

Add the values listed starting with cell E1 and concluding with cell Q1. Note, the text for cell L1, O1, and P1 will include a space after the visible characters which were included for formatting purposes.

Cell | Text
E1   | <member name="
F1   | " description="
G1   | "
H1   | displayMemberGroup="
I1   | ">
J1   | <descriptions>
K1   | <description culture="
L1   | fr-FR"
M1   | description="
N1   | "
O1   | />
P1   | </descriptions>
Q1   | </member>
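Concatenated in order with a member name, the descriptions, and the security group, these fragments yield XML of the following shape for each member (the member values shown here are hypothetical):

```xml
<member name="10000" description="Cash &amp; Equivalents" displayMemberGroup="Everyone">
  <descriptions>
    <description culture="fr-FR" description="Trésorerie" />
  </descriptions>
</member>
```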

In cell E2, add the following formula:

=CONCATENATE($E$1,A2,$F$1,SUBSTITUTE(B2,"&","&amp;",1),$G$1,$H$1,D2,$I$1,$J$1,$K$1,$L$1,$M$1,C2,$N$1,$O$1,$P$1,$Q$1)

This concatenates the text from cells E1 to Q1 with the text in columns A to D of the corresponding row to create the XML.  Note, because the ampersand “&” is a reserved character in XML, the formula substitutes it with the escape “&amp;”.  Once this is done, copy the formula in cell E2 to the appropriate rows.

Blog2021 05 16

With the XML created utilizing an Excel formula, copy and paste the XML generated to a text editor which in this circumstance is Notepad++.

Blog2021 05 17

After the XML is copied and pasted to a text editor, leverage the example from the first image to include the rows before and after “<member name” which are rows 1 to 6 and 8 to 12 as displayed.

Blog2021 05 18

Before uploading, save the file.  Two images are included for reference: the first displays rows 1 to 6 from the example, and the second has the <members> element collapsed to show rows 1 to 6 and 81 to 84, both of which originated from the example file.

Blog2021 05 27

Blog2021 05 28

If you have not already done so, log into the OneStream application.  Once logged into the application, select and expand “Application > Tools > Load/Extract”.

Navigate to and select the XML file previously created.  Once this is done, select the “Load” icon.  The load will complete without error, which is confirmed by the second image sourced from the Activity Log.

Blog2021 05 21

Blog2021 05 23

To see the results using Excel, navigate to “System -> Security” and select your user account.  With the account selected, update the Culture from “English (United States)” to  “French (France)”.  Note, administrator privileges are assumed for this step.

Blog2021 05 24

Blog2021 05 25

To display the uploaded account descriptions without changing the Windows Display Language setting, create a Cube View that duplicates the display of the original Quick View.  Once the Cube View is created, select the “Open Data Explorer” icon to display the uploaded descriptions.

Blog2021 05 26

I hope this solution is helpful.  Should you have any questions, either leave a comment or email me at terry.ledet@perficient.com.

Smart View Health Check Simplifying Registry Settings https://blogs.perficient.com/2019/03/22/smart-view-health-check-simplifying-registry-settings/ https://blogs.perficient.com/2019/03/22/smart-view-health-check-simplifying-registry-settings/#respond Fri, 22 Mar 2019 13:00:57 +0000 https://blogs.perficient.com/?p=237754

Have you ever been working in Excel with the Smart View add-in when, on refresh, Excel crashes or your retrieval stops?  Well, one contributor could be your internet registry settings.  I can hear you saying: “What?  Internet settings cannot be the issue… I’m working in Excel!”  However, I have found in many instances that is precisely the cause; so much so that Oracle addresses the issue in multiple documents.  This blog will highlight a topic that I find exciting: updating registry settings.  Exciting, right?…  Absolutely!

Registry Settings

Recent releases of Smart View show Oracle responding to requests to integrate some fantastic features.  These features are making Smart View easier to use while placing more control with the user and their machine.  Oracle rolled out a new feature called Health Check with release 11.1.2.5.800.

Checking and updating these registry settings matters because they determine how long the internet connection stays open while a large query retrieves data.  The default keep-alive setting in the registry is only 60 seconds, which is just not long enough for large queries.  Prior to release 11.1.2.5.800, Smart View users had to manually add three new keys to the registry.

History of Oracle Help Documents
In the past, updating registry settings required running the registry editor, navigating to internet settings and adding the following three keys: KeepAliveTimeout, ReceiveTimeout and ServerInfoTimeout.  Oracle addressed the issue over the years and provided instructions in the following documents.

  • Smart View v9.3x-v11.1.1.x Error: “The request timed out. Contact your administrator to increase netRetrycount and netRetryInterval” (Doc ID 744559.1).
  • Long Query In Smart View Fails: Request Timed Out; Extend Internet Explorer Time Out Settings (Doc ID 1378803.1).
  • Smart View Error: “The request timed out. Contact your Smart View client system administrator to extend your Windows Internet Explorer time out settings (ReceiveTimeout, KeepAliveTimeout, and ServerInfoTimeout)” (Doc ID 1953754.1).

Microsoft also addresses timeout values with similar instructions.
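The manual edit those documents describe can be sketched as a .reg file like the following. This is an illustrative assumption, not Oracle’s literal instructions: the key path is the standard Internet Settings location these timeout values live under, and the DWORD value (900,000 ms, i.e. 15 minutes) is an arbitrary example — consult the Doc IDs above for recommended values.

```reg
Windows Registry Editor Version 5.00

; Hypothetical sketch: extends the internet keep-alive/receive timeouts.
; Values are in milliseconds; 0x000dbba0 = 900000 ms = 15 minutes.
[HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Internet Settings]
"KeepAliveTimeout"=dword:000dbba0
"ReceiveTimeout"=dword:000dbba0
"ServerInfoTimeout"=dword:000dbba0
```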

Those who have been completing these steps for years know how difficult updating the registry can be.  Admins had the compounded challenge of assisting users less familiar with the process.  However, Oracle listened to our requests and revolutionized the process with the release of Smart View 11.1.2.5.800, adding the new Health Check feature.  Users can now navigate within the Excel Smart View ribbon and update their registry settings in a few simple clicks.  The update is easy and improves retrieval performance.  Additionally, this aids admins by providing a straightforward process for training teams on how to update registry settings.

Golden Ticket Solution
Troubleshooting Smart View Error: “The request timed out. Contact your administrator to increase netRetrycount and netRetryInterval” or “to extend your Windows Internet Explorer time out settings (ReceiveTimeout, KeepAliveTimeout, and ServerInfoTimeout)” (Doc ID 1995401.1).

Let’s review the new process.  The following steps can be found in Document ID 1995401.1.  Please note you must be an administrator of your machine to complete these steps.

1  Open Excel and navigate to Smart View -> Help -> Health Check

2  Scroll down to Registry Information section.
3  Select ‘Click here to update settings’ to change the settings and make the changes.
4  Select ‘Update Settings’ to commit your changes and then ‘Close’.

Document ID 1995401.1 applies to the following application versions.

  • Hyperion Planning – Version 9.3.1.0.00 and later
  • Hyperion Essbase – Version 9.3.1.0.00 and later
  • Hyperion Financial Management – Version 9.3.1.0.00 and later
  • Oracle Planning and Budgeting Cloud Service
  • Oracle Enterprise Planning and Budgeting Cloud Service

Health Check
As of version 11.1.2.5.810, Oracle ships a higher default timeout value.  Check it out!  If your Smart View ad hoc queries and retrievals are failing, perform a health check and update the timeout values for your machine.  As a next step, I encourage you to take a few minutes to read about “Performing a Health Check“.  The knowledge gained will provide insight into this new feature and increase your understanding and productivity with Smart View.

I am excited to find and share knowledge that makes systems better, freeing time that users can allocate to core responsibilities.  My hope in writing this blog is to bring greater awareness of the Health Check feature, highlight Oracle’s improvements to Smart View, and maybe spark some interest in other features.  Happy learning!

Lightning Flow, Data Security, Summer ’18 Enhancements & More https://blogs.perficient.com/2018/04/30/lightning-flow-data-security-summer-18-enhancements-more/ https://blogs.perficient.com/2018/04/30/lightning-flow-data-security-summer-18-enhancements-more/#respond Mon, 30 Apr 2018 19:53:46 +0000 https://blogs.perficient.com/?p=206307

Top Salesforce Updates You Need to Know

  • Einstein, Lightning & More: What’s New in Salesforce Sales Cloud in Summer ’18
  • Top 5 Jaw-Dropping Salesforce Communities Summer ’18 Enhancements
  • Why is Data Security Important to Me?
  • Running an Integrated ABM Campaign in 5 Steps
  • Connecting to Salesforce in Outlook
  • Perficient’s Award-Winning Partnership with Salesforce
  • What Organizations Can Learn From Spreadsheet Debacles
  • Adding Clicks not Code Extensibility to your Apex with Lightning Flow


It’s impossible to keep up with all the developments and breaking news that surround Salesforce. Luckily, you’ve got us. Here are some of the top stories of the past week.

Einstein, Lightning & More: What’s New in Salesforce Sales Cloud in Summer ’18

The Salesforce Summer ’18 Release promised a focus on delivering “more personalized and integrated customer journeys while enabling your team to work smarter with new levels of data insights across your Salesforce Org.” For the Salesforce Sales Cloud, this means the introduction of many new and exciting features that streamline sales reps’ processes while enabling them to work with enhanced insights and improved forecasts. While there are many Sales Cloud updates in this release, we’ll focus on the enhancements we anticipate will make the biggest impact. Keep reading

Top 5 Jaw-Dropping Salesforce Communities Summer ’18 Enhancements

Things are heating up in the Salesforce Summer ’18 release, especially in Communities! Here are some highlights to look out for in this release, including SEO, security, and notification enhancements, gamification, Chatter updates, and news on how the Partner Community gets even better. Keep reading

Why is Data Security Important to Me?

There is a lot of information out there telling you that you should protect your data. But why is data security important? More data exists online now than at any other point in time, and the quantity is only expected to keep growing. It’s important to protect yourself, and your customers, by using the Salesforce platform securely and staying in the know about data security best practices. Keep reading

Running an Integrated ABM Campaign in 5 Steps

Even though most B2B marketers are bought into Account-Based Marketing (ABM), very few of them are actually using the strategy to its greatest potential. While running a one-off ABM initiative is a good start, the magic really happens when you work across your organization to develop integrated campaigns that address the needs of your target accounts. But what do those campaigns look like? And how do you actually get them off the ground? In this article, we’ll share five simple but effective steps to help you build and refine your integrated ABM campaigns. Keep reading

Connecting to Salesforce in Outlook

Bring together the two platforms you use every day: Salesforce and Microsoft Outlook. View and work with Salesforce data directly in your Outlook inbox, create Salesforce records, and log emails back to Salesforce. Your teams stay in sync and you don’t have to do manual data entry.

Perficient’s Award-Winning Partnership with Salesforce

We have implemented more than 3,000 Salesforce solutions that empower businesses to become more responsive, efficient, and relevant. Whether you’re looking to drive efficiency, insight, reliability, simplicity, scale, or collaboration, we’ll unlock the right Salesforce solution for you. Keep reading

What Organizations Can Learn From Spreadsheet Debacles

Excel is flexible, powerful, and ubiquitous. Organizations across the globe use the spreadsheet software for just about everything – from external financial reporting to simple bar graphs. While some of these activities are inherently riskier than others, they are not outside the capabilities of Excel when used properly. Therein lies the problem: Most organizations are not good at managing spreadsheet risks – or teaching employees how to properly use Microsoft Excel. Here are four straightforward ways to help reduce the risk of spreadsheet errors. Keep reading

Adding Clicks not Code Extensibility to your Apex with Lightning Flow

Building solutions on the Lightning Platform is a highly collaborative process, due to its unique ability to allow Trailblazers in a team to operate in no code, low code and/or code environments. Lightning Flow is a Salesforce native tool for no code automation and Apex is the native programming language of the platform — the code! Keep reading

What Organizations Can Learn From Spreadsheet Debacles https://blogs.perficient.com/2018/04/30/what-organizations-can-learn-from-spreadsheet-debacles/ https://blogs.perficient.com/2018/04/30/what-organizations-can-learn-from-spreadsheet-debacles/#respond Mon, 30 Apr 2018 13:12:15 +0000 https://blogs.perficient.com/?p=193052

In January 2010, Harvard economists Carmen Reinhart and Kenneth Rogoff published a controversial and influential paper entitled “Growth in a Time of Debt.” The paper studied economic growth at varying levels of debt across both advanced and emerging markets and concluded that countries experienced significantly lower growth when debt levels exceeded 90 percent of gross domestic product.

Over the last several years, that relatively simple finding has been reported extensively by notable publications and has been cited by policymakers across the globe to justify strict austerity programs.

Three years (and countless protests) later, that conclusion was all but invalidated when a student at the University of Massachusetts Amherst attempted to replicate the findings of the study. After trying to independently recalculate the results of the paper, the student was granted access to the original Microsoft Excel spreadsheets used by Reinhart and Rogoff and quickly discovered that a simple spreadsheet error had substantially distorted the analysis.

This is not the first time that a simple spreadsheet error has been the cause of major problems.

In 2012, an Excel formula error contributed to more than $6.2 billion in trading losses for JP Morgan Chase & Co. in what has come to be known as the “London Whale” incident. Surprisingly, the error wasn’t related to the use of a complex financial formula with multiple variables. Quite the opposite: The spreadsheet in question incorrectly divided the difference of two rates by their sum instead of their average.
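A minimal Python sketch (with made-up numbers, not the actual trade data) shows why that particular mistake is so insidious — dividing by the sum instead of the average silently halves the result:

```python
# The London Whale spreadsheet divided the difference of two rates by
# their SUM instead of their AVERAGE. Since sum = 2 * average, the
# buggy formula returns exactly half the intended value, understating
# volatility -- and therefore risk.

def rate_ratio_correct(a: float, b: float) -> float:
    return (a - b) / ((a + b) / 2)

def rate_ratio_buggy(a: float, b: float) -> float:
    return (a - b) / (a + b)

old_rate, new_rate = 6.0, 2.0  # illustrative rates, in percent
print(rate_ratio_correct(old_rate, new_rate))  # 1.0
print(rate_ratio_buggy(old_rate, new_rate))    # 0.5 -- half the true value
```

Because the output is plausible rather than obviously broken, no error message or warning ever appears — exactly the kind of mistake that slips past a visual review.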

Nor are spreadsheet errors uncommon. Raymond Panko, a professor at the University of Hawaii, published a paper on the topic of spreadsheet errors – compiling the results of multiple independent studies. The results of the meta-analysis indicate that spreadsheet error rates are as high as 88 percent.

Excel is flexible, powerful, and ubiquitous. Organizations across the globe use the spreadsheet software for just about everything – from external financial reporting to simple bar graphs. While some of these activities are inherently riskier than others, they are not outside the capabilities of Excel when used properly.

Therein lies the problem: Most organizations are not good at managing spreadsheet risks – or teaching employees how to properly use Microsoft Excel. Here are four straightforward ways to help reduce the risk of spreadsheet errors:

  • Assess Your Exposure to Spreadsheet Risk: Create an inventory of your organization’s key spreadsheets and categorize based on the frequency of use and inherent risk. Consider all of the potential consequences of an error and the magnitude of errors to understand your exposure.
  • Document the Use of Spreadsheets in Process Documentation: Documenting the use of spreadsheets in process flows across the organization can help shed light on the specific areas where spreadsheets are relied upon – and provide a guide to bolster controls and reduce the risk of process escapements due to spreadsheet errors.
  • Avoid “Hardcoding” Spreadsheet Data: Hardcoding data – the practice of removing formulas and replacing them with static values – may make data easier to manipulate, but it eliminates transparency and makes it harder to validate that the spreadsheet operated appropriately.
  • Teach Employees Practical Tips to Reduce Spreadsheet Risk: While there may not be a “catch-all” solution to prevent spreadsheet errors, there are several simple data validation formulas and tools that can significantly improve the accuracy and reliability of spreadsheets. These tools and techniques should be a prominent topic in any Excel training courses.
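The article doesn’t name specific validation formulas, but one of the simplest and most widely used controls is a “tie-out” check: independently recompute a total and flag any mismatch. A minimal Python sketch of the idea (the Excel formula in the comment is a common pattern, not one taken from the original post):

```python
# A classic spreadsheet control: a "tie-out" cell that independently
# recomputes a total and flags mismatches. In Excel this is often
# written as =IF(ABS(SUM(B2:B10)-B11)>0.01,"CHECK","OK").

def tie_out(detail_rows, reported_total, tolerance=0.01):
    """Return 'OK' if the detail rows sum to the reported total."""
    return "OK" if abs(sum(detail_rows) - reported_total) <= tolerance else "CHECK"

print(tie_out([100.00, 250.50, 49.50], 400.00))  # OK
print(tie_out([100.00, 250.50, 49.50], 410.00))  # CHECK
```

A tolerance is used rather than exact equality so that ordinary rounding differences don’t trigger false alarms.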

In addition to the above, there should be a renewed effort to improve Excel-based instruction in the workplace. As stated previously, Excel is a powerful and flexible tool. By managing spreadsheet risks and deepening our understanding of the program, we’ll hopefully see fewer incidents like Reinhart-Rogoff in the future.
