Here is your data & analytics blueprint
Now comes the actual delivery and build
You get the idea – enjoy the journey, or contact us and we will join you for the ride.
A hub is a logical or physical space where everyone comes to search content, decipher information, and look for lineage and other viable information. Microsoft is enabling a similar concept in Teams. Teams is where we have all begun to collaborate via virtual meetings, notes, lists, apps, etc.
The future vision and big picture is to treat Teams as a single pane of glass. Data is the fuel; insights and KPIs drive decision making. The idea is that everyone in an enterprise with appropriate rights and security should have access to insights from their enterprise data for effective decision making. This drives monumental adoption across the board, improves inter- and intra-team collaboration, and brings digital consistency across the enterprise.
The core of Teams in the backend is SharePoint Online. Teams also integrates with OneDrive for Business. If OneDrive for Business is an approved service in your organization, then when you set up departmental or business unit Teams, each will be backed by a OneDrive for Business content repository where users can save files along with other content.
When we talk about an Analytics Hub, we are enabling analytics products on Teams. So how do we go about doing this? What are great candidates for Teams? Think departments, subject areas, an executive consumption space, etc. Every Team we create can be secured with Azure Active Directory groups and appropriate roles such as Owner and Member.
An individual Team can have multiple channels, which can be further secured to a subset of users belonging to that Team if necessary.
Enable Analytics Content (True Hub Experience)
Teams has released a new feature that adds Power BI as an app on the left navigation rail.
This exposes the whole Power BI portal inside Teams (a great rollout feature) – users do not have to navigate to a separate portal and can consume all their secured Power BI content right in Teams.
Enable Analytics Content (In Each Channel)
Within a channel, Teams gives you the capability to add a tab and associate a universally shared Power BI report with that particular channel. Think of this as a place where your Premium-capacity, read-only users can consume one or more shared Power BI reports.
Power BI Model Change/Revision Control is built into Microsoft Teams
Another great utility of Teams is the built-in source control for Power BI (.pbix) files. Power BI files are binary files that carry both metadata and actual data. In the traditional DevOps world we can always send incremental changes through the code pipeline, but with a binary file that is not possible. With that in mind, this capability was enabled to establish good change/revision management for Power BI models.
The ideal way to do this is to create a Revision/SDLC channel in each Team users belong to, and then let them sync content from their local OneDrive for Business to that Team's OneDrive for Business.
Here is a flow for leveraging Teams + OneDrive for Business to streamline Power BI change management (simplified and recommended for ease of end-user adoption):
In Teams, for a particular Team or Team channel, under General > Files, create a Power BI Models folder and load the Power BI model (.pbix) files.
Then, in the Power BI service, navigate to the workspace for that Power BI model and connect the model by selecting Get Data – this configures the model to pull in changes made at the file level.
Once this is established, navigate to the Power BI model settings in the workspace and set it up to refresh from the OneDrive file changes.
Now in Teams you are set up for change/revision management – you can check out the file, make changes, and save it.
We can also open the folder in SharePoint and set up sync to our local OneDrive for Business drive.
That's it! Once you check out a file, no one else can overwrite it. If you need to revert to an earlier version, you also have the capability to look at Version History.
The Version History also carries comments accounting for the changes.
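Both ends of this flow can also be scripted. Below is a rough Python sketch (standard library only) of two helpers: one that triggers an on-demand dataset refresh via the Power BI REST API, and one that lists a .pbix file's version history via Microsoft Graph. The workspace, dataset, drive, and item IDs are placeholders, and you are assumed to already hold valid Azure AD access tokens.

```python
import json
import urllib.request

POWER_BI_API = "https://api.powerbi.com/v1.0/myorg"
GRAPH_API = "https://graph.microsoft.com/v1.0"

def refresh_url(group_id: str, dataset_id: str) -> str:
    """Power BI 'Refresh Dataset In Group' endpoint."""
    return f"{POWER_BI_API}/groups/{group_id}/datasets/{dataset_id}/refreshes"

def versions_url(drive_id: str, item_id: str) -> str:
    """Microsoft Graph endpoint listing a drive item's versions."""
    return f"{GRAPH_API}/drives/{drive_id}/items/{item_id}/versions"

def trigger_refresh(group_id: str, dataset_id: str, token: str) -> int:
    """Queue an on-demand refresh; a 202 response means it was accepted."""
    req = urllib.request.Request(
        refresh_url(group_id, dataset_id),
        data=json.dumps({"notifyOption": "MailOnFailure"}).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

def list_pbix_versions(drive_id: str, item_id: str, token: str) -> list:
    """Return (version id, last modified) pairs for a checked-in .pbix file."""
    req = urllib.request.Request(
        versions_url(drive_id, item_id),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        payload = json.load(resp)
    return [(v["id"], v["lastModifiedDateTime"]) for v in payload["value"]]
```

A scheduled job could call `list_pbix_versions` to audit changes and `trigger_refresh` right after a new .pbix version lands, instead of waiting for the service's own OneDrive polling interval.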
Last but not least – you can also enable content from other BI tools in Teams, as long as SSO is established on the Office 365 tenant.
Teams is very versatile, and we can help you use it to its full potential, with appropriate content organization and integration with the rest of the Microsoft suite of products and other third-party BI tools.
Various enterprise clients across verticals like Healthcare, High Tech, Financial Services, Retail, Manufacturing, and Supply Chain can leverage the Modern Data Platform approach. Marry in people, process, and technology, and you have producers and consumers with great value added. This digital transformation approach will help enterprises looking to re-platform, helping various lines of business embark on the journey toward self-service analytics, modern fully managed data platform services, and more. The explosion of data and advances in digital technologies have completely disrupted our industry as service and solution providers.
Many of our existing, new, and prospective clients still face challenges around exponential data growth, siloed data, and the way they budget for projects (the cloud changed the spend model from CAPEX to OPEX), as well as driving KPIs and insights in a fluid business landscape. Factor in organizational politics and the various data regulations mandated around privacy, and there is a lot of work and consideration that goes into creating an agnostic data and analytics platform as a service. The problem is significant because enterprises are still struggling to drive actionable insights in an automated, scalable manner. It affects decision making on an everyday basis, which keeps these enterprises from providing value to their customers efficiently.
So how do you embark on this journey? We at Perficient leverage Customer Experience Mapping the most.
As a services provider, we do two things as part of our engagement offering:
Prospect Qualification and Idea Mapping – a hand-in-hand exercise between the Sales and Delivery teams
This is where we have pre-sales calls and understand the initial pain points (you can call them insights into their world). These can be integration challenges, dirty data, redundant data, over-engineering, maintenance issues, scalability issues, people problems, etc. We ask a few questions and formalize the conversation where we address Customer Experience Mapping. Then we make sure there is executive sponsorship and introduce the idea in a follow-up meeting. We have a few standard templates, which can be customized to the end client and their challenges.
What is Phase 0 in terms of Customer Experience Mapping for our customers?
Enablement with Workshops
We take the above and produce a Customer Experience Map and then call out essential considerations.
That Customer Experience Map is then leveraged to propose a Future state – that will either augment or improve the client’s experience keeping in mind people and processes.
The deliverable coming out of those workshops is a Customer Experience Map that addresses:
Response Based Segmentation for our clients
Customer Journey Mapping then formulates the core of the Modern Data Platform and its capabilities
Our target personas are various resources at our Clients
Delivery and Implementation
There will be Calls to Action (CTAs) at the end of these journey maps, and then we put the delivery in motion in an agile manner, in sprints, with relevant delivery squads.
Let us now walk through the above with a marketing use case.
In my last blog, I mentioned how we leverage Snowflake to tell this story for our clients. The other technologies we are working with include Azure Synapse, Redshift, etc. With some good technical thought in place, the modern platform idea and implementation will thrive in an organization, supporting capabilities for various personas. Add analytics via Power BI, Domo, Tableau, Cognos Analytics, MicroStrategy, Amazon QuickSight, ThoughtSpot, etc., and we have a bring-your-own-tool approach to the Modern Data Platform as a service. If this blog addresses some of the things you are considering, please feel free to contact us – we would love to have conversations across the board. We do on-platform proofs of concept and technology proofs of value to put the picture together.
As times have evolved, so have our data challenges – and we have been trying to solve them for ages. Even after giving them dedicated resources, robust hardware, etc., we still end up missing SLAs in our traditional on-premises and cloud-hosted worlds. Add to that the constant dependency on personnel to manage the data loads, data ops, the size of the database, and on and on. When you look at a modern data platform, you are basically looking at something that will scale to your pipeline needs, be agile, be fully managed, impose no data shape restrictions, provide dependable performance at scale, and be easy to query in SQL and via APIs! Enter Snowflake – it is as if the founders sat down around a table, put all the traditional problems on paper, and literally solved them. Their motto: move data once and access it multiple times, in various ways, at various scales and capacities.
That leads to many personas, each with their own dedicated compute, surrounding the data on the cloud.
They truly separated compute and storage and automated the management of the service. That reduces the code and scripts you have to handwrite to make these various computes (Snowflake calls them Virtual Warehouses) dance to the needs of more and more data.
Their storage sits in the center, surrounded by computes of various sizes catering to each of the personas listed above. Add to that the fact that the service literally shuts down and does not cost you a penny when nothing is being queried by the compute virtual warehouses. You are billed based on a credit matrix that aligns with the size of the compute. Storage costs are flat – they took the guessing game out for enterprises across the board. Snowflake has an ecosystem that supports many third-party products and ETL tools.
Bring data of any shape and store it in Snowflake – you can basically lake and mart on the same service and provide access to a multitude of analytical sandboxes for your various end users across the board.
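To make the "lake and mart on one service" point concrete, here is an illustrative pair of Snowflake SQL statements (all table, view, and field names are made up for this sketch): raw JSON lands untouched in a VARIANT column, and a curated view reaches into it with Snowflake's path notation.

```python
# Landing zone ("lake" side): semi-structured JSON stored as-is in a VARIANT column.
create_raw_table = """
CREATE TABLE raw_events (
    event     VARIANT,
    loaded_at TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()
);
"""

# Curated view ("mart" side): path notation extracts typed columns from the JSON,
# so no up-front schema or separate mart platform is needed.
create_mart_view = """
CREATE VIEW mart_orders AS
SELECT
    event:order_id::STRING      AS order_id,
    event:customer.name::STRING AS customer_name,
    event:amount::NUMBER(12, 2) AS amount
FROM raw_events
WHERE event:type::STRING = 'order';
"""
```

Analysts query `mart_orders` like any relational view, while the raw events remain available in full for new use cases.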
You can preset a baseline number of Virtual Warehouses and set up an option to add scale when required. When queries start to queue up, a new warehouse cluster kicks into gear without disrupting existing queries and finishes the data load. This is one of the best parts of Snowflake: it can ingest billions of rows of data without performance issues.
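That scale-out behavior is configured when the warehouse is created. A sketch of the DDL (the warehouse name and limits here are arbitrary):

```python
# Multi-cluster virtual warehouse: extra clusters spin up only when queries queue,
# and the whole warehouse suspends when idle so no credits are burned.
create_warehouse = """
CREATE WAREHOUSE load_wh
  WAREHOUSE_SIZE    = 'MEDIUM'
  MIN_CLUSTER_COUNT = 1          -- baseline
  MAX_CLUSTER_COUNT = 3          -- ceiling for scale-out under queueing
  SCALING_POLICY    = 'STANDARD'
  AUTO_SUSPEND      = 60         -- suspend after 60 idle seconds
  AUTO_RESUME       = TRUE;
"""
```

The combination of `AUTO_SUSPEND` and `AUTO_RESUME` is what makes the "costs nothing when idle" claim work in practice.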
Add time travel (retention of your data states) and the capability to clone data at no cost for dev and test instances, and it really changes the game when it comes to serving organizations of various sizes and needs.
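Both features are essentially one-liners in SQL. Illustrative examples (the object names are hypothetical):

```python
# Time travel: query a table as it existed an hour ago
# (as long as that point is within the retention window).
time_travel_query = """
SELECT * FROM orders AT (OFFSET => -3600);
"""

# Zero-copy clone: a dev copy of production that shares storage
# with the original until the data diverges.
clone_for_dev = """
CREATE DATABASE dev_db CLONE prod_db;
"""
```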
All in all, a Shared data architecture that scales to the needs and manages itself without much overhead.
Analytics Workloads via Connectors and Virtual Warehouses addressed at scale –
Snowflake has a federated connector to Domo that can be leveraged to query real-time data and support mobile analytics needs in apps. With Snowflake, you can fire up live queries from Tableau and Power BI and likely get away from loading data into Tableau extracts or Power BI datasets when the need is for real-time dashboards, helping you reduce the size of your analytical models in BI tools. Snowflake can be provisioned for HIPAA and PII regulatory needs as well, on Azure, AWS, and Google Cloud. Add metadata management, and you get a well-rounded managed data service for modern times.
We at Perficient are working with many existing and potential clients across various verticals, with Snowflake as a partner, to tell the story of data and reduce age-old overheads – please feel free to reach out to us. We have worked closely with Informatica and Talend on enterprise implementations to leverage Snowflake across the board. In modern times, Snowflake is a real disrupter in the industry.
In uncertain times like these, it is essential that we consider the disruption that is happening and take a nimble approach to help organizations, including my own, by driving analytics and insights across the various lines of business.
How can we take all the public data that is available from Johns Hopkins, the CDC, the WHO, and various state-commissioned sources and layer it into an organization's business data to tell a story, associate impact, and deliver a message around pandemic readiness and response? And do that with the speed and agility needed to address the following:
I am putting down a few scenarios to drive home the idea that every organization – its lines of business, operations, margins, supply chain, hiring, credit usage, and workforce – is disrupted and looking to track essential KPIs (existing and new) to make sure it is responding to the need of the hour and to its customers.
For healthcare systems to effectively manage a pandemic response and proactively monitor ground reality, they will need to bring in public data and blend it with their respective EHRs and other siloed databases.
This blended data will provide visibility into how ready their provider pools are to tackle the surge of digital visits due to social distancing, whether they have enough supply of physicians, and whether they have visibility into their PPE supply, bed utilization, infrastructure readiness for remote work, workflows in place to track digital adoption, etc.
For a Bank – Blend Public and Enterprise Data to track
For a Retail business – Blend Public and Enterprise Data to
To accomplish this, the following essential entities and capabilities need to be in place:
Perficient's Data Solutions team can address all of the above for your organization in these times of uncertainty and make sure data and insights are being delivered at a pace in tune with the disruption and changing ground reality for your enterprise. If you have implementations or analytics products supporting your lines of business in the cloud or on premises, we can blend in with your strategic and tactical teams and augment every aspect of your analytics journey in these testing times and really drive value. We understand data and insights and how they impact social outreach in these testing times. Feel free to contact us and we will be happy to engage with your teams on the ground.
Along with countless others in the IT consulting space, I’ve written about the challenges facing businesses as they undergo their various digital transformation initiatives. I find the topic endlessly fascinating, as it not only speaks to today’s business opportunities but explores what will be the foundation for the next century of innovation.
Discussing how IoT, AI, advanced analytics, and other technologies will transform business is what I imagine it must have been like to discuss the transformational impact of the light bulb, transcontinental railroads, the telegraph, and other industrial age innovations. Just as Edison couldn’t have predicted today’s LED lights, we’re likely only scratching the surface of what digital transformation will ultimately bring.
Because many of us come from tech backgrounds and are speaking to other tech leaders, it’s easy to think of digital transformation in terms like migrating from legacy, on-premises technology to modern, cloud-based technologies. In fact, many of the case studies I see are focused on how tech companies have become even smarter and data-driven. However, we need to remember that even the most traditional, analog companies can reap massive benefits from digital transformation.
One interesting example is Sinar Mas, a palm oil company in Indonesia. The company’s general manager Hong Zhou Wong recently shared his experience in transforming to a data-driven organization at Domopalooza 2019, the annual user conference for cloud-based data platform Domo.
Sinar Mas is responsible for 500,000 hectares of plantation–4.5x the size of Los Angeles. The company has more than 172,000 employees and sells its product across 70 countries for numerous markets ranging from cooking oil to baby food to cosmetics to pharmaceuticals. As a “seed to shelf” producer, Sinar Mas must excel at wildly different functions–grower, harvester, producer, commodity trader, marketer, and seller.
For Sinar Mas, the company needed a way to use data to help assure its “operational rhythm.” With so many moving parts, along with the challenges of working with a product that’s produced by and is at the mercy of Mother Nature, the company needed to be smarter about using its data to eliminate any lag that could spoil its product and create wastage.
To do so, Sinar Mas uses Domo to capture and analyze data spanning across its 500,000 hectares of activities. From the day’s work of a single harvester in the field to the going market rate of palm oil across its global markets, the company has been able to bring all its data together in a single pane of glass to work as one.
In an organization the size and scope of Sinar Mas, you can’t have a one-size-fits-all way of using data. Wong spoke about how the company uses both a bottom-up and top-down approach: for upstream activities, the company takes a top-down approach to ensure that the entire business is operating in rhythm and without disruption, while downstream activities like day-to-day energy consumption and production efficiency are analyzed using a bottom-up approach. The two perspectives are then combined to give company leaders and managers a holistic view of the company.
Something I found interesting about Sinar Mas’ story is that their digital transformation isn’t just about going from no data to data, or stale data to real-time data. It’s about sharing data. In his presentation, Wong told the story about how managers would have to share data using WhatsApp or via paper before they added Domo. This made it impossible for anyone who was not on the text chat or on the other end of the memo to know what was going on at different parts of the company–a significant concern in such an interconnected enterprise.
Now, Sinar Mas is able to collect, share, and collaborate on data all within a single platform so that everyone has access to the same information in real-time. This not only keeps the business moving but allows different refineries and parts of the business to learn from each other and share best practices.
An important thing to remember is that digital transformation isn’t about just collecting or sharing data. It’s about people. Wong shared that people in his organization were initially fearful of the power of Domo. After all, what company leaders would see as the benefit of real-time data and complete transparency of information could be seen as a loss of control to a department manager who used to have time to “smooth” over her department’s data and prepare her version of the story before going into a meeting. Wong said that it was essential to recognize the fear that can come with digital transformation, to take the time to bring staff along, to have a vision for what you’re trying to accomplish, and to build a culture capable of supporting it. I couldn’t agree more.
This same story is being played out around the world every day across every industry and size of business. It no longer matters if your company is a food grower, a manufacturer, a trader, a retailer, or in the case of Sinar Mas, all of the above. Every company is now a tech company.
The more you can incorporate data into your business and your company culture, the more you can reimagine the way your business can and should be operated. The result will make today’s most revolutionary promises of digital transformation merely the foundation for what’s to come.
Context – Bring data together from various web, cloud, and on-premises data sources and rapidly drive insights. The biggest challenge business analysts and BI developers have is the need to ingest and process medium to large data sets on a regular basis. They spend most of their time gathering data rather than analyzing it.
Power BI Dataflows, together with Azure Data Lake Storage Gen2, make this a very intuitive, results-based exercise. Prior to Power BI Dataflows, data prep was restricted to Power BI Desktop.
Power BI Desktop
Power BI Desktop users use Power Query to connect to data, ingest it, and transform it before it lands in the dataset.
There are some limitations to Power BI Desktop data prep:
The diagram below depicts how Dataflows aid business analysts when they onboard data into Azure Data Lake Storage Gen2 and can then leverage all the other services they have access to. How is this possible? The short answer is the Common Data Model.
Common Data Model
The Common Data Model (CDM) provides a shared data language for business and analytical applications to use. The CDM metadata system enables consistency of data and its meaning across applications and business processes (such as PowerApps, Power BI, Dynamics 365, and Azure), which store data in conformance with the CDM.
The diagram above shows a bi-directional approach to enterprise analytics on the cloud.
Low Code approach
The left-hand side shows the low-code approach, where data analysts from different lines of business can access, prep, and curate datasets for their analytics needs and create analytics content.
Power BI Dataflows allow you to ingest data into CDM form from a variety of sources such as Dynamics 365, Salesforce, Azure SQL Database, Excel, or SharePoint. Once connected, prepare the data and load it as a custom entity in CDM form in Azure Data Lake Storage Gen2.
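Under the hood, each dataflow writes its entities to a CDM folder in the lake: CSV partition files plus a model.json that describes them. Here is a hand-written sketch of what that metadata can look like (the dataflow, entity, attribute, and storage-account names are all hypothetical):

```python
import json

# Minimal shape of a CDM folder's model.json, as a dataflow might emit it.
model = {
    "name": "SalesDataflow",
    "version": "1.0",
    "entities": [
        {
            "$type": "LocalEntity",
            "name": "Opportunity",
            "attributes": [
                {"name": "OpportunityId", "dataType": "string"},
                {"name": "Amount", "dataType": "double"},
                {"name": "CloseDate", "dataType": "dateTime"},
            ],
            # The CSV partitions that hold the actual entity rows in ADLS Gen2
            "partitions": [
                {
                    "name": "Part001",
                    "location": "https://mylake.dfs.core.windows.net/powerbi/SalesDataflow/Opportunity/Part001.csv",
                }
            ],
        }
    ],
}

# Any CDM-aware service (Azure Data Factory, Databricks, ML tooling)
# can read this metadata to discover the entities and their schemas.
print(json.dumps(model, indent=2))
```

It is this shared metadata file that lets the low-code side and the enterprise data engineering side of the diagram work against the same entities.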
Enterprise Data Workloads approach
The right-hand side shows the various Azure services that enterprise data engineers can potentially leverage to drive analytical workloads.
Conclusion and Insight
The details outlined in this blog enable rapid insight into enterprise data. This approach drives adoption and establishes standards that drive sustainable analytics projects leveraging Power BI. It also enables access to curated and cataloged data. The semantic standardization enhances ML and pushes AI forward on the platform.
The Gartner Magic Quadrant for Analytics and BI released in February 2019 named Microsoft Power BI the leader in the analytics quadrant. One of the driving factors listed is Microsoft's comprehensive product vision. It draws attention to the common and open data model, AutoML, and Cognitive Services like text and sentiment analytics via Power BI. I have tried to draw attention to the same in this blog and simplify it.
Power Up your Analytics with Power BI
Power BI is a hosted BI solution on the Microsoft cloud. Its biggest appeal is the ecosystem it comes with. Across its various releases, Microsoft has created a very intuitive product on a robust platform. It's a one-stop shop for all analytics needs.
Identifying use cases for Power BI
This section establishes the relevance of Power BI. It is an immediate value add to any organization, seamlessly enabling teams that are willing to collaborate across the board to drive decision making. Let us now lay down a few scenarios.
Scenario 1 – A company's sales team tracks opportunities in Salesforce and is traditionally pretty happy with what Salesforce has to offer. The team wants to collaborate extensively and drive analytics to improve pipeline activity across regions, managers, and sales reps.
Scenario 2 – A company has an established data lake. Various analysts want the flexibility to tap into the lake data and blend it with attribute data that does not reside in the enterprise data lake.
Scenario 3 – IT-driven analytics, where IT creates the content leveraging data from an enterprise data warehouse.
Scenario 4 – A team in an organization has been living off Excel and macros. There is a huge maintenance overhead, and most of their time is spent accessing and gathering data.
Let us now put some perspective into what Power BI enables.
Now that we have discussed Power BI's advantages, let us put it to work to its true potential. For that to happen, we will have to depend on the technical IT team, lines of business, data owners, and stakeholders.
Simply put, it's where people, process, and platform come together.
I have taken a broad swing at how we can leverage the Power BI platform's true potential by establishing processes and synergy between people. A good understanding of the various licenses and supported end-user capabilities will go a long way in establishing a sustainable solution.
Couple Power BI's capabilities with established processes and clearly laid-out responsibilities to address the scenarios outlined at the beginning of this blog.
Scenario 1 – Analyze Salesforce data in Power BI
Scenario 2 – Blend Attribute data with Data Lake data
Scenario 3 – Enable Self Service Analytics for the Organization on curated data supported by the technical IT team for the enterprise.
Scenario 4 – Decouple from Excel completely, house the business logic and establish a solid Semantic Layer in Power BI.
As you can see, Power BI lends a lot of flexibility across various scenarios, engages various teams, and brings a lot of firepower to analytics as an offering.
In my next blog, I will dive deeper into the essential components of Power BI and various platform provisioning approaches.