In the past several years, I’ve seen both successful and failed insights initiatives. Perhaps the first sign of failure is how teams name and frame these efforts:
- Reports
- Executive Reports
- Reporting
- A focus on the number of reports
- A heavy focus on specific fields in reports, to the exclusion of other elements
When I look at successful insights projects, I see the following elements:
- Names that avoid the term reports entirely
- Focus on the user’s needs
- Understanding of the users’ daily activities that would drive insights
- Treating the end result, not the report itself, as the purpose
- Allowing action to occur once insight is gained
Example: Risk Portal
I won’t name the client, but I will give you enough information about the context. Risk Analysts in this particular department relied on getting data and then doing somewhat complex analysis to determine which firms carried enough risk to warrant action. We came to work with them after an unsuccessful first attempt, which failed on both technical and experience grounds.
Technical issues: the data could not be refreshed and delivered to the analysts fast enough.
Experience issues: the project focused on delivering reports defined by a few end users, on the assumption that those reports were sufficient for true insight.
While we had a lot of work to do on the technical side, the user experience side required equal focus. We had to do the following:
- Understand their business and what they did. This was the fun part as it involved some pretty interesting stories including some spectacular failures.
- Review in depth what their day-to-day work involved
- Thoroughly understand what they did to get insights, even if those insights were delayed and somewhat opaque
- Iterate on the development of the insights experience
- Combine visualizations with more common reporting formats to create the experience
- Be willing to break out of the “use my out-of-the-box reporting portal” mold
When we finished, we had one of our most successful business intelligence or analytics projects ever. For those who want the technical details, what we built was fairly straightforward. We surfaced a variety of reports using visualizations specific to the industry (custom, not out of the box) alongside out-of-the-box visualization types for more general scenarios. We surfaced those reports in a separate portal site and integrated it with a workflow app that allowed analysts to kick off tasks, approvals, and eventually calls to their customers for specific actions. The flexible portal also allowed them to pull in third-party feeds and to customize their own dashboards based on these reports.
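To make the report-to-action handoff concrete, here is a minimal sketch of the kind of wiring involved, assuming a hypothetical workflow client. The type and function names are illustrative and do not reflect the actual portal or vendor APIs.

```typescript
// Hypothetical sketch: wiring a surfaced report to a workflow action.
// RiskReportRow, WorkflowClient, and createAction are illustrative names,
// not the actual portal or vendor APIs.

interface RiskReportRow {
  firmId: string;
  firmName: string;
  riskScore: number; // output of the analysts' risk analysis
}

type ActionType = "task" | "approval" | "customer-call";

interface WorkflowClient {
  // Assumed integration point with the workflow app
  createAction(type: ActionType, payload: Record<string, unknown>): Promise<string>;
}

// When an analyst flags a firm from a dashboard visualization,
// kick off the appropriate downstream action instead of stopping at the report.
async function actOnInsight(
  row: RiskReportRow,
  threshold: number,
  workflow: WorkflowClient
): Promise<string | null> {
  if (row.riskScore < threshold) {
    return null; // no action needed; the report alone was the old endpoint
  }
  const type: ActionType = row.riskScore > threshold * 1.5 ? "customer-call" : "task";
  return workflow.createAction(type, {
    firmId: row.firmId,
    firmName: row.firmName,
    riskScore: row.riskScore,
  });
}
```

The point of the sketch is not the code itself but where it ends: in an action, not a report.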
If you think it wasn’t that big of a deal from a technical standpoint, you would be right. But remember that the previous failure started from the assumption that all you needed to do was have some smart boss define a few reports and then create those reports in the out-of-the-box reporting portal provided by the vendor.
Best Practices
Which leads me to this series. When you think of data as experience, you arrive at a set of best practices that can be summarized as follows:
- Look at the general user roles (basic, power, data scientist) that will define how you treat users; a rough sketch of how this might look appears after this list.
- Understand your users and what they really need.
- Look at where the user works and in what context. Are they mobile? Are they online? Do they have a 46-inch monitor or just a laptop?
- Get to the point where you can trust the data coming to you. This incorporates a lot of back-end best practices, but from an experience standpoint I’ll summarize it as technical elements and data governance.
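As a rough illustration of the first and third bullets above, here is a minimal sketch of how role and working context might shape what a dashboard serves up. The role names come from the list; every other field name and value is an assumption made purely for illustration.

```typescript
// Hypothetical sketch of role- and context-driven experience configuration.
// Role names mirror the list above; the rest is illustrative.

type UserRole = "basic" | "power" | "data-scientist";

interface UserContext {
  role: UserRole;
  isMobile: boolean;          // working from a phone or tablet?
  isOnline: boolean;          // connected, or reviewing cached data?
  screenWidthInches: number;  // 46-inch wall monitor vs. a laptop
}

interface ExperienceConfig {
  layout: "single-column" | "multi-panel";
  allowCustomDashboards: boolean;
  exposeRawData: boolean;     // data scientists want the underlying data, not just visuals
}

function configureExperience(ctx: UserContext): ExperienceConfig {
  return {
    layout: ctx.isMobile || ctx.screenWidthInches < 20 ? "single-column" : "multi-panel",
    allowCustomDashboards: ctx.role !== "basic",
    exposeRawData: ctx.role === "data-scientist",
  };
}
```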
With these best practices defined, look forward to a range of posts about their individual elements.
Next Post: The Basic User