Abhisek Majumdar, Author at Perficient Blogs
https://blogs.perficient.com/author/amajumdar/

Data and Analytics blueprint – successful delivery
https://blogs.perficient.com/2020/12/16/data-and-analytics-blueprint-successful-delivery/
December 16, 2020

Here is your data & analytics blueprint

  • Clear vision and measurable outcomes
  • Establish sponsor buy-in and level-set expectations
  • Crisp agile delivery with timely deliverables called out
  • Open communication and cadence
  • Establish a RACI!

Now comes the actual delivery and build

  • Data should be the center and focus of everything (anyone who tells you otherwise is not looking at it holistically!)
  • Your analytics project will be successful if that data layer is solid
    • Data that is moved multiple times will create a lot of heartache
    • So collect, store, transform, curate and standardize your data
      • If that does not exist – conduct a few POCs to draw out the picture – it should not take more than 3 weeks
      • Apply some Data Quality rules
    • Encourage analysis on your lakes! Leverage the object stores and open them up for analysis by the various stakeholders (serverless options are plentiful and cheap nowadays – see the sketch after this list)
    • Fail Fast and iterate
  • Get creative around enabling data access and democratization in the organization – do not enforce rigid top-down controls – business users are repelled by that
  • Encourage the data engineers and analytics crew to get creative with access to content while applying the right level of governance
    • Agile data pipelines
    • Self Service analytics
    • Clear Lineage for anything and everything
    • Cloud services are very flexible, and various controls/queues can be established
    • Address data, analytics, SaaS, and IaaS services through the lens of a capabilities-and-persona matrix – who has access to what and what they can accomplish
    • Future-proof the vision – Descriptive to Predictive to AI/ML
    • Push user enablement, adoption and education
  • Now we can talk tech stack 🙂
    • Address Scale, provisioning, security, automation etc.
    • Snowflake, Azure Synapse, Databricks, Redshift, Object stores etc.
    • Glue, Informatica, Azure Data Factory, Matillion, Talend etc.
    • Power BI, Tableau, Domo, Qlik, Cognos Analytics etc.
    • Power platform, Workato, Mulesoft, Airflow etc.
    • Containers
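
To make the serverless-analysis point concrete, here is a minimal sketch of querying files sitting in an object store with AWS Athena via boto3 – no cluster to provision, you pay per query. The database, table, and bucket names are illustrative assumptions, not a reference implementation:

```python
import time

import boto3  # AWS SDK for Python

athena = boto3.client("athena", region_name="us-east-1")

# Kick off a serverless SQL query directly against files sitting in S3.
# "curated.sales" and the results bucket are illustrative names.
resp = athena.start_query_execution(
    QueryString="SELECT region, COUNT(*) AS orders FROM curated.sales GROUP BY region",
    QueryExecutionContext={"Database": "curated"},
    ResultConfiguration={"OutputLocation": "s3://my-analytics-results/athena/"},
)
query_id = resp["QueryExecutionId"]

# Poll until the query finishes; billing is per data scanned, not per server.
while True:
    status = athena.get_query_execution(QueryExecutionId=query_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

print("Query finished with state:", state)
```

The same pattern applies to other serverless engines (Azure Synapse serverless SQL pools, BigQuery, etc.): point the engine at the lake, grant the stakeholders access, and let them explore.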

You get the idea – enjoy the journey, or contact us to ride along with you. 🙂

Microsoft Teams and Power BI – Analytics Hub and Content Management
https://blogs.perficient.com/2020/10/08/microsoft-teams-and-power-bi-analytics-hub-and-content-management/
October 8, 2020

A hub is a logical/physical space where everyone comes to search content, decipher information, and look for viable insights, lineage, etc. Microsoft is enabling a similar concept in Teams. Teams is where we have all begun to collaborate via virtual meetings, notes, lists, apps, etc.

The future vision and big picture is to treat Teams as a single pane of glass. Data is the fuel; insights and KPIs drive decision making. The idea is that everyone in an enterprise, with appropriate rights and security, should have access to insights from their enterprise data for effective decision making. This drives monumental adoption across the board, improves inter- and intra-team collaboration, and brings digital consistency across the enterprise.

The core of Teams on the backend is SharePoint Online. Teams also integrates with OneDrive for Business. If OneDrive for Business is an approved service in your organization, then when you set up departmental or business unit Teams, each will be backed by a OneDrive for Business content repository where users can save files along with other content.

When we talk about an Analytics Hub, we are enabling analytics products on Teams. So how do we go about doing this? What are great candidates for Teams? Think departments, subject areas, an executive consumption space, etc. Every Team we create can be secured by Azure Active Directory groups with appropriate roles like Administrator, Member, etc.

An individual Team can have multiple channels, which can be further secured to a subset of users belonging to that Team if necessary.


Enable Analytics Content (True Hub Experience)

Teams has now released a new feature for adding Power BI as an app on the left navigation ribbon.


This will expose the whole Powerbi.com portal in Teams (a great rollout feature) – this way users do not have to navigate to a separate portal and can consume all their secured Power BI content in Teams.

Enable Analytics Content (In Each Channel)

Within a channel, Teams gives you the capability to add a tab and associate a universally shared Power BI report with that particular channel. Think of this as a place where most of your Premium-capacity read-only users can consume one or more shared Power BI reports.


Power BI Model Change/Revision Control is built into Microsoft Teams

Another great utility of Teams is the source control feature for Power BI (.pbix) files that is built into it. Power BI files are binary files which carry both metadata and actual data. In the traditional DevOps world we can always send incremental changes through the code pipeline, but with a binary file that is not possible. With that in mind, this capability was enabled to establish good change/revision management for Power BI models.

The ideal way to do this is to create a Revision/SDLC channel in each Team users belong to and then let them sync content from their local OneDrive for Business to that Team’s OneDrive for Business.


Here is a flow of how to go about leveraging Teams + OneDrive for Business to streamline Power BI change management (simplified and recommended for ease of end-user adoption):


In Teams, for a particular Team or Team channel, under General > Files, create a Power BI Models folder and load the Power BI model (.pbix) files.


Then, in the Power BI service, navigate to the workspace for that Power BI model and connect the model by selecting Get data – this configures the model to pull in changes made at the file level into the Power BI service.


Once this is established, navigate to the Power BI model settings in the workspace and set it up to refresh from the OneDrive file changes.
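
If you also want to trigger or monitor these refreshes programmatically, the Power BI REST API exposes a refreshes endpoint per dataset. Here is a minimal sketch, assuming you already have an Azure AD access token with dataset permissions; the workspace and dataset IDs below are placeholders:

```python
import requests

ACCESS_TOKEN = "<azure-ad-access-token>"  # acquired via MSAL / Azure AD
GROUP_ID = "<workspace-id>"               # the Power BI workspace (group)
DATASET_ID = "<dataset-id>"               # the model synced from OneDrive

headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}
base = f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}/datasets/{DATASET_ID}"

# Trigger an on-demand data refresh of the dataset.
resp = requests.post(f"{base}/refreshes", headers=headers)
resp.raise_for_status()  # the service answers 202 Accepted on success

# Review the five most recent refresh outcomes.
history = requests.get(f"{base}/refreshes?$top=5", headers=headers).json()
for refresh in history.get("value", []):
    print(refresh["status"], refresh.get("startTime"))
```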


Now in Teams you are basically set up for change/revision management – you can check out the file, make changes, and save it.


Now we can use Open in SharePoint and set up the sync with our local OneDrive for Business drive.


That’s it! Once you check out a file, no one else will be able to overwrite it. If you have to revert to an earlier version, you also have the capability to look at Version History.


Version History also retains comments accounting for each change.

Last but not least – you can also enable content from other BI tools in Teams as long as SSO is established on the O365 tenant.

Teams is very versatile, and we can help you use it to its full potential, along with appropriate content organization and integration with the rest of the Microsoft suite of products and other 3rd-party BI tools.

Perficient’s Cloud Modern Data Platform Approach – Customer Journey & Capabilities
https://blogs.perficient.com/2020/07/15/perficients-cloud-modern-data-platform-approach-customer-journey-capabilities/
July 15, 2020

Enterprise clients across different verticals like Healthcare, High Tech, Financial Services, Retail, Manufacturing, and Supply Chain can leverage the Modern Data Platform approach. Marry in people, process, and technology, and you have producers and consumers with great added value. This digital transformation approach will help enterprises looking to re-platform, helping various lines of business embark on the journey toward self-service analytics, modern fully managed data platform services, etc. The explosion of data and advances in digital technologies have completely disrupted our industry as service/solution providers.

Many of our existing, new, and prospective clients still face challenges around the exponential growth of data, siloed data, the way they budget for projects (the cloud changed the spend model from CAPEX to OPEX), and driving KPIs and actionable insights in a fluid business landscape. Factor in organizational politics and the various data regulations mandated around privacy, and there is a lot of work and consideration that goes into creating an agnostic data and analytics platform as a service. The problem is significant because enterprises are still struggling to drive actionable insights in an automated, scalable manner. It affects decision making on an everyday basis, which keeps these enterprises from providing value to their customers efficiently.

So how do you embark on this journey? We at Perficient leverage Customer Experience Mapping the most.

 


 

As a services provider, we do two things as part of our engagement offering.

Prospect Qualification and Idea Mapping – a hand-in-hand sales and delivery teams exercise

This is where we have pre-sales calls and understand the initial pain points (you can call them insights into their world). These can be integration challenges, dirty-data challenges, redundant data, over-engineering, maintenance issues, scalability issues, people problems, etc. We ask a few questions and formalize the conversation where we address the Customer Experience Mapping. Then we make sure there is executive sponsorship involved and introduce the idea in a follow-up meeting. We have a few standard templates, which can be customized to the end client to address their challenges.

What is Phase 0 in terms of Customer Experience Mapping for our customers?

Enablement with Workshops

  • Functional workshops – interviews and whiteboard sessions
  • Technical workshops – interviews, whiteboard sessions & reference architecture discussions
  • Information Gathering – Triage Priorities, Budget considerations, Gaps and Capabilities, Needs and Wants, Delivery Timeline
  • Use Cases facing challenges – Discussions, Improvement considerations

 

We take the above and produce a Customer Experience Map and then call out essential considerations.

That Customer Experience Map is then leveraged to propose a Future state – that will either augment or improve the client’s experience keeping in mind people and processes.

 

The deliverable coming out of those workshops is a Customer Experience Map that addresses

  • Improved Reference Architecture
  • MVP Prototype – ideation and flow
  • Gaps called out and future proofing
  • Future road map with delivery timeline, with assumptions & difficulties called out
  • Immediate mitigation plan to address hot button issues

 

Response Based Segmentation for our clients

  • We generally do a Phase 0 with our clients
  • The phase 0 has various markers that are addressed in the workshops
  • The Audience
    • Technical vs Functional
  • We garner responses from technical folks, group them into the following, and score them in a matrix
    • Platform needs
    • Sustainability
    • Code protocols
    • Capabilities
  • We garner responses from functional folks, group them into the following, and score them in a matrix
    • Ease of Use
    • Adoption
    • Training and Enablement
    • Platform / Product Communication
  • The workshops take into consideration a blend of internal and external aspects of that particular customer
    • Internal factors are everything that was mentioned above across the tech and functional segments
    • External factors are things we specialize in
      • Best Practices
      • Delivery Velocity & capacity
      • Architectural patterns in the industry
      • Where the industry is heading – like the cloud
      • Digital Transformation
    • We then present a Functional, Capability and technical Matrix
      • Align them to Scores

Customer Journey Mapping then formulates the core of the Modern Data Platform and its capabilities

Our target personas are various resources at our Clients

  • Functional – their journey entails shoring up data for the C-suite, who can make better decisions toward product adoption, improved sales, forecasting, outreach, and better marketing campaigns. Their outcomes depend on the industry vertical; for example, a healthcare client wants to reduce cost while improving the quality of care.
    • Business Analysts
    • Decision makers
    • Business SMEs
    • Business Sponsor
    • Office of the CIO, CMO etc.
    • Functional Personas go through the journey of understanding business -> Measuring Business -> Improving efficiency -> Improving Product -> Increasing Profits
  • Technical – their journey entails supporting the business by shoring up data in a sustainable manner, improving product code, improving the product experience, and producing technical blogs/documentation along with platform support.
    • Solutions Architects
    • Platform Architects
    • Security Architects
    • Developers
    • Testers
    • Support tier
    • Office of the CTO, CDO etc.
    • Technical Personas go through the journey of understanding business -> Ideation -> POCs-> Supporting Business -> Improving processes-> Improving Code -> SDLC -> Improving product and platform

 

  • The Desired outcomes at each stage of the customer journey
    • All of the following are intent based depending upon the personas and the outcomes they want to drive in their journeys. End goal is to find harmony across both the personas to drive revenue, innovation and product development improving brand recognition.
      • Functional Personas go through the journey of understanding business -> Measuring Business -> Improving efficiency -> Improving Product -> Increasing Profits
        • The desired outcome of each stage is a better understanding of the business and improved collaboration with product development and delivery teams to drive revenue and profits
      • Pain points Functional personas experience are
        • No access to real-time data, impeding business-value decision making
        • No single version of the truth to support better business outcomes
        • Integration challenges with mergers and acquisitions
        • Growing costs
      • Technical Personas go through the journey of understanding business -> Ideation -> POCs-> Supporting Business -> Improving processes-> Improving Code -> SDLC -> Improving product and platform
        • The desired outcome of each stage is to support the business in the most efficient and cost-effective manner, with an eye toward innovation and evolution
      • Pain points Technical personas experience are
        • Legacy infrastructure
        • Rapid growth of business
        • Innovation pains – build vs buy decisions
        • Technology debt
        • Scale and Automation

 

Delivery and Implementation

There will be calls to action (CTAs) at the end of these journey maps, and then we put the delivery in motion in an agile manner, in sprints with relevant delivery squads.

Let us now walk through the above with a marketing use case

  • The Modern Data Platform is supposed to digitally transform our potential client’s capabilities and enable them to support their businesses via automation, outreach, and better access to data. Marketing teams engage with customers predominantly via online media (digital channels and socials). Our marketing customers are mostly looking to improve their platforms, their scale, and sustainability. This is how it layers into the overarching strategy of being active on various social and digital channels.
    • Webinars will have Online Surveys uniquely identifying prospects
    • Socials will have Industry specific nuggets about the platform and its various capabilities – with download brochure or online forms
    • The only offline media outreach will be via tech conferences and booth representation, where we can scan the badges of participants stopping by the booth, or via QR code scans at certain speaking engagements showcasing our platform and its capabilities.
  • Metrics for success of our initiative are below
    • This will eventually lead to more prospects for our Sales/Portfolio Specialists
    • This will drive Pipeline Utilization for our Services
    • Increase traffic to our company’s website and blogosphere
  • Data triaging and ingestion into the platform – we will gather the data from various channels and feed it into a data lake in their organization, tagging it to specific Customer IDs
    • We will track the following in an agile manner and then tweak our content placement on the website, Vertical Expertise plays, Vendor specific Customer testimonials, Vendor Awards testimonials etc.
      • Clicks on Customer testimonials
      • Best practices Brochure Downloads
      • Blog reads
      • Tech con QR Code scan inquiries etc.
    • The ROI of the above iterative agile captures will be tracked, and we will provide insights and feedback to our Marketing and Sales teams via dashboards and models telling the story of our prospective customers’ journey
      • Relay that into the sales funnel
      • Relay that into front line Lead Development Reps activities
      • Relay that into our Services and Delivery teams for repeat business and upsell

In my last blog, I mentioned how we leverage Snowflake to tell this story for our clients. The other technologies we are working with are Azure Synapse, Redshift, etc. With some good technical thought in place, the modern platform idea and implementation will thrive in an organization, supporting capabilities for various personas. Add analytics via Power BI, Domo, Tableau, Cognos Analytics, MicroStrategy, AWS QuickSight, ThoughtSpot, etc., and we now have a bring-your-own-tool approach to the Modern Data Platform as a service. If this blog addresses some of the things you are considering, please feel free to contact us – we would love to have conversations across the board. We do on-platform proofs of concept and technology proofs of value to put the picture together.

Snowflake for the Modern Data Platform
https://blogs.perficient.com/2020/07/06/snowflake-for-the-modern-data-platform/
July 6, 2020

As times have evolved, so have our data challenges – and we have been trying to solve them for ages. Even after giving them dedicated resources, robust hardware, etc., we still end up missing SLAs in our traditional on-premises & cloud-hosted worlds. Add to that the constant dependency on personnel to manage the data loads, data ops, the size of the database, and on and on. When you look for a modern data platform, you are basically looking for something that will scale to your pipeline needs, be agile in the works, be fully managed, not impose data-shape restrictions, provide dependable performance at scale, and be easy to query in SQL and via APIs! Enter Snowflake – it is as if the founders sat down around a round table, put all the traditional problems on paper, and literally solved them. Their motto: move data once and access it multiple times in various ways at various scale and capacity.

That leads to many personas, each with their own explicit compute, gathered around the data on the cloud:

  • The data analysts
  • The traditional data pipelines
  • The Data Science teams
  • The Application teams
  • The BI and Analytics teams

They truly separated compute and storage and automated the management of the service. That reduces the code/scripts you would have to handwrite to make these various computes (Snowflake calls them virtual warehouses) dance to the needs of more and more data.

Their storage sits in the center, surrounded by computes of various sizes catering to each of the personas listed above. Add the fact that the service literally shuts down and does not cost you a penny when nothing is being queried by the compute virtual warehouses. You are billed based on a credit matrix that aligns with the size of the compute, and storage costs are flat – they took the guessing game out for enterprises across the board. Snowflake also has an ecosystem that supports many third-party products and ETL tools.

 


Bring data of any shape and store it in Snowflake – you can basically lake and mart on the same service and provide a multitude of analytical sandboxes to your various end users across the board.

You can preset and baseline the number of virtual warehouses you need and set up an option to add scale when required. When queries start to queue up, a new virtual warehouse kicks into gear without disrupting existing queries and finishes the data load. This is the best part of Snowflake: it can ingest billions of rows of data without performance issues.

Add time travel (retention of your data states) and the capability to clone data at no cost for dev and test instances, and it really changes the game when it comes to serving organizations of various sizes and needs.
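
All of this is plain SQL, so it scripts easily. Here is a minimal sketch using the Snowflake Python connector – the account, credentials, and object names are illustrative assumptions, not a reference implementation:

```python
import snowflake.connector  # pip install snowflake-connector-python

# Illustrative connection; in practice prefer key-pair auth or SSO.
conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...", role="SYSADMIN"
)
cur = conn.cursor()

# A right-sized compute for one persona. AUTO_SUSPEND stops the credit
# meter 60 seconds after the last query; AUTO_RESUME restarts on demand.
cur.execute("""
    CREATE WAREHOUSE IF NOT EXISTS analyst_wh
      WITH WAREHOUSE_SIZE = 'XSMALL'
      AUTO_SUSPEND = 60
      AUTO_RESUME = TRUE
""")

# Zero-copy clone: a full dev copy of prod at no extra storage cost.
cur.execute("CREATE DATABASE IF NOT EXISTS dev_db CLONE prod_db")

# Time travel: query the table as it looked one hour ago.
cur.execute("SELECT COUNT(*) FROM prod_db.sales.orders AT(OFFSET => -3600)")
print(cur.fetchone())
```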

All in all, it is a shared-data architecture that scales to your needs and manages itself without much overhead.

Analytics Workloads Addressed at Scale via Connectors and Virtual Warehouses

Snowflake has a federated connector for Domo that can be leveraged to query real-time data and support mobile analytics needs in apps. With Snowflake, you can literally fire up live queries from Tableau and Power BI and probably get away from loading data into Tableau extracts or Power BI datasets when the need is for real-time dashboards, helping you reduce the size of your analytical models in BI tools. Snowflake can also be provisioned for HIPAA and PII regulatory needs on Azure, AWS, and Google. Add metadata management, and you get a well-rounded managed data service for modern times.

We at Perficient are working with many existing and potential clients across various verticals, with Snowflake as a partner, to tell the story of data and reduce age-old overheads – please feel free to reach out to us. We have worked closely with Informatica and Talend on enterprise implementations that leverage Snowflake across the board. In modern times, Snowflake is a real disrupter in the industry.

Agile BI & Analytics is the Need of the Hour – Drive insights during Pandemic Uncertainty
https://blogs.perficient.com/2020/04/04/agile-bi-analytics-is-the-need-of-the-hour-rapidly-drive-insights-during-pandemic-uncertainty/
April 4, 2020

In uncertain times like these, it is essential that we consider the disruption that is happening and take a nimble approach to help organizations, including my own, by driving analytics and insights across the various lines of business.

How can we take all the public data that is available from Johns Hopkins, the CDC, the WHO, and various state-commissioned sources and layer it onto an organization’s business data to tell a story, associate impact, and deliver a message around pandemic readiness and response? And how do we do that with speed and agility so we can address the following?

  • Sanitation of the data while blending it with Organization data
  • Data freshness
  • Data Ops
  • Build trust around the data
  • Drive adoption and use with executive buy-in
  • Blending into the existing framework of analytics and data delivery while adding value in an iterative manner
  • An agile mindset to surface insights in collaboration with various teams – operating in a pod and flex capacity

I am putting down a few scenarios to drive home the idea that every organization – its lines of business, operations, margins, supply chain, hiring, credit usage, and workforce – is disrupted and looking to track essential KPIs (existing and new) to make sure it is responding to the need of the hour and to its customers.

For healthcare systems to effectively manage a pandemic response and proactively monitor ground reality, they will need to bring in public data and blend it with their respective EHRs and other siloed databases.

This blended data will provide visibility into how ready their provider pools are to tackle the surge of digital visits due to social distancing: do they have enough supply of physicians, visibility into their PPE supply, bed utilization, infrastructure readiness for remote work, workflows in place to track digital adoption, etc.? (A minimal sketch of this public/enterprise blend follows the list below.)

  • Public sanitized COVID-19 data from Johns Hopkins, the WHO, the CDC, and other state-level health care sources
    • Bed Utilization, Testing Data, ICU Utilization, State Entry Points, Air Travel Data, Medical Supplies (Resourcing)
  • Patient-level REaL GAP data – Race, Ethnicity, Age, Language, Gender, Payor, Sexual Orientation
  • Encounter Level activity and touchpoints
    • Virtual Visits – Telephonic, Video Visits, eVisits, On-Demand Video Visits
    • MyChart App engagement
    • Diagnosis
    • Test Administered
    • Emergency and Clinic traffic
    • Procurement data – Vendors supplying safety equipment
    • HR data – Provider pool
  • Department – Facility – Market Roll up Metrics
  • Marketing Outreach data – Pandemic related outreach
    • Social Media Outreach
    • Email Outreach
    • FAQs
    • Chats
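
Here is a minimal pandas sketch of that public/enterprise blend. It assumes the Johns Hopkins CSSE GitHub repository layout as of this writing, and the internal encounters extract (file name and columns) is purely illustrative:

```python
import pandas as pd

# Johns Hopkins CSSE publishes US time-series CSVs on GitHub; this path
# reflects the repository layout at the time of writing.
JHU_URL = (
    "https://raw.githubusercontent.com/CSSEGISandData/COVID-19/master/"
    "csse_covid_19_data/csse_covid_19_time_series/"
    "time_series_covid19_confirmed_US.csv"
)
public = pd.read_csv(JHU_URL)

# Collapse county rows into one confirmed-case count per state for the
# most recent day (the rightmost column is the newest date).
latest = public.columns[-1]
by_state = (
    public.groupby("Province_State", as_index=False)[latest]
    .sum()
    .rename(columns={latest: "confirmed", "Province_State": "state"})
)

# Illustrative internal extract pulled from the EHR / data mart.
encounters = pd.read_csv("encounters_by_state.csv")
# expected columns: state, virtual_visits, bed_utilization, ppe_on_hand

# Blend public and enterprise data on the state key to feed the dashboards.
blended = by_state.merge(encounters, on="state")
print(blended.head())
```

Refresh this on a schedule and the command-center dashboards described below stay in tune with ground reality.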

For a Bank – Blend Public and Enterprise Data to track

  • Credit loss
  • Customer-level REaL GAP data – Race, Ethnicity, Age, Language, Gender, Payor, Sexual Orientation
  • Credit Stats and Finance Behavior data
  • Marketing Outreach
  • Survey Data

For a Retail business – Blend Public and Enterprise Data to

  • Improve logistics
  • Customer-level REaL GAP data – Race, Ethnicity, Age, Language, Gender, Payor, Sexual Orientation
  • Shopping Behavioral data
  • Virtual Shopping metrics around Shoppers, Market Basket analysis
  • Virtual payment analytics – the surge of Apple Pay etc. to keep folks from using touchpads at checkout
  • Marketing Outreach
  • Survey Data

To accomplish the above, the following essential entities and capabilities need to be in place:

  • Immediate tactical workshops – immediately engage with the lines of business to
    • Discuss Immediate pain points
    • Gaps they need to address
    • The scale the Line of Business needs to operate at
    • The goal here is to have a concrete understanding of the immediate needs and be ready to engage in a very agile manner addressing disruption, daily operational needs and an essential need to deliver insights with new data points at a healthy pace
  • Capability to On-board quickly is very essential to various business use cases
  • Provisioning Activities
    • Extend existing implementations or Stand up new service ( if needed )
  • Data Layer ( Extend existing or create a new Consolidated Staging Area and a Data Mart for Analytics)
    • An ingestion framework to quickly ingest the data
    • Agile Data Pipelines to create the data repository
    • Automation and Orchestration to keep data and Insight Fresh
    • Common Data Model to support Scale and growth
    • Framework in place to Organically on-board new data sets
    • Secure Framework for PII or Highly confidential data
  • Analytics and Insights Layer ( Leverage the Existing BI Platform or Standing up a new one on the Cloud for Agility)
    • Drive insights using Data Modeling & Blending
    • Tactical Command Center KPI Dashboards to Monitor the data – Dashboards and Reports
    • Standard Metrics – One version of truth with Glossary support – Business Layer
    • Mobile capability to consume the insights ( depending upon the tool/platform )
      • Threshold Notifications
      • Embed insights and drive adoption
      • Drive Actions via a Collaboration framework
    • ML capabilities to browse large amounts of data and drive insights ( depending upon the tool/platform )
    • Augmented Analytics to drive NLP insights
    • Enablement workshops to support the Line of Businesses and IT
  • Ongoing Support Framework

Perficient’s Data Solutions team can address all of the above for your organization in these times of uncertainty and make sure data and insights are being delivered at a pace in tune with the disruption and changing ground reality for your enterprise. If you have implementations or analytics products supporting your lines of business on the cloud or on premises, we can blend in with your strategic and tactical teams, augment every aspect of your analytics journey in these testing times, and really drive value. We understand data and insights and how they impact social outreach right now. Feel free to contact us and we will be happy to engage with your teams on the ground.

Why Data Now Makes Every Company a Tech Company
https://blogs.perficient.com/2019/06/20/data-makes-every-company-a-tech-company/
June 20, 2019

Along with countless others in the IT consulting space, I’ve written about the challenges facing businesses as they undergo their various digital transformation initiatives. I find the topic endlessly fascinating, as it not only speaks to today’s business opportunities but explores what will be the foundation for the next century of innovation.

Discussing how IoT, AI, advanced analytics, and other technologies will transform business is what I imagine it must have been like to discuss the transformational impact of the light bulb, transcontinental railroads, the telegraph, and other industrial age innovations. Just as Edison couldn’t have predicted today’s LED lights, we’re likely only scratching the surface of what digital transformation will ultimately bring.

Because many of us come from tech backgrounds and are speaking to other tech leaders, it’s easy to think of digital transformation in terms like migrating from legacy, on-premises technology to modern, cloud-based technologies. In fact, many of the case studies I see are focused on how tech companies have become even smarter and data-driven. However, we need to remember that even the most traditional, analog companies can reap massive benefits from digital transformation.

One interesting example is Sinar Mas, a palm oil company in Indonesia. The company’s general manager Hong Zhou Wong recently shared his experience in transforming to a data-driven organization at Domopalooza 2019, the annual user conference for cloud-based data platform Domo.

Data to Assure “Operational Rhythm”

Sinar Mas is responsible for 500,000 hectares of plantation – 4.5x the size of Los Angeles. The company has more than 172,000 employees and sells its product across 70 countries for numerous markets ranging from cooking oil to baby food to cosmetics to pharmaceuticals. As a “seed to shelf” producer, Sinar Mas must excel at wildly different functions – grower, harvester, producer, commodity trader, marketer, and seller.

For Sinar Mas, the company needed a way to use data to help assure its “operational rhythm.” With so many moving parts, along with the challenges of working with a product that’s produced by and is at the mercy of Mother Nature, the company needed to be smarter about using its data to eliminate any lag that could spoil its product and create wastage.

To do so, Sinar Mas uses Domo to capture and analyze data spanning across its 500,000 hectares of activities. From the day’s work of a single harvester in the field to the going market rate of palm oil across its global markets, the company has been able to bring all its data together in a single pane of glass to work as one.

In an organization the size and scope of Sinar Mas, you can’t have a one-size-fits-all way of using data. Wong spoke about how the company uses both a bottom-up and top-down approach: for upstream activities, the company takes a top-down approach to ensure that the entire business is operating in rhythm and without disruption, while downstream activities like day-to-day energy consumption and production efficiency are analyzed using a bottom-up approach. The two perspectives are then combined to give company leaders and managers a holistic view of the company.

Collect, Share, and Collaborate

Something I found interesting about Sinar Mas’ story is that their digital transformation isn’t just about going from no data to data, or stale data to real-time data. It’s about sharing data. In his presentation, Wong told the story about how managers would have to share data using WhatsApp or via paper before they added Domo. This made it impossible for anyone who was not on the text chat or on the other end of the memo to know what was going on at different parts of the company–a significant concern in such an interconnected enterprise.

Now, Sinar Mas is able to collect, share, and collaborate on data all within a single platform so that everyone has access to the same information in real time. This not only keeps the business moving but allows different refineries and parts of the business to learn from each other and share best practices.

An important thing to remember is that digital transformation isn’t about just collecting or sharing data. It’s about people. Wong shared that people in his organization were initially fearful of the power of Domo. After all, what company leaders would see as the benefit of real-time data and complete transparency of information could be seen as a loss of control to a department manager who used to have time to “smooth” over her department’s data and prepare her version of the story before going into a meeting. Wong said that it was essential to recognize the fear that can come with digital transformation, to take the time to bring staff along, to have a vision for what you’re trying to accomplish, and to build a culture capable of supporting it. I couldn’t agree more.

Digital Transformation Transcends All Boundaries

This same story is being played out around the world every day across every industry and size of business. It no longer matters if your company is a food grower, a manufacturer, a trader, a retailer, or in the case of Sinar Mas, all of the above. Every company is now a tech company.

The more you can incorporate data into your business and your company culture, the more you can reimagine the way your business can and should be operated. The result will make today’s most revolutionary promises of digital transformation merely the foundation for what’s to come.

Power BI + Azure Data Lake = Velocity & Scale to Your Analytics
https://blogs.perficient.com/2019/02/17/power-bi-azure-data-lake-velocity-scale-analytics/
February 17, 2019

Context – bring data together from various web, cloud, and on-premises data sources and rapidly drive insights. The biggest challenge business analysts and BI developers have is the need to ingest and process medium-to-large data sets on a regular basis. They spend more time gathering the data than analyzing it.

Power BI Dataflows, together with Azure Data Lake Storage Gen2, make this a very intuitive, results-based exercise. Prior to Power BI Dataflows, data prep was restricted to Power BI Desktop.

Power BI Desktop

Power BI Desktop users use Power Query to connect to data, ingest it, and transform it before it lands in the dataset.

There are some limitations to Power BI Desktop data prep:

  • The relevant data is self-contained in individual datasets.
  • Difficulty reusing data transformation logic – it is restricted to the dataset being prepared.
  • Inability to set up transformations for incremental data loads – they just transform the data.
  • Inability to do data loads at scale.
  • It lacks the open data approach accomplished by moving the data to the tenant’s storage account.
    • The open data approach gives access to multiple Azure-based services that help data analysts and scientists.

The diagram below depicts how dataflows aid business analysts when they on-board data into Azure Data Lake Storage Gen2 and can then leverage all the other services they have access to. How is this possible? The short answer is the Common Data Model.

Common Data Model

The Common Data Model (CDM) provides a shared data language for business and analytical applications to use. The CDM metadata system enables consistency of data and its meaning across applications and business processes (such as PowerApps, Power BI, Dynamics 365, and Azure), which store data in conformance with the CDM.

The diagram above shows a bi-directional approach to enterprise analytics on the cloud.

Low Code approach

The left-hand side shows the low-code approach, where data analysts from different lines of business can access, prep, and curate datasets for their analytics needs – creating analytics content.

  • They can also leverage content that has been loaded into the Data Lake by the Enterprise Data Engineers in their Power BI Reports
  • With this approach there will be a need for data governance and guardrails
    • Data analysts will have a dedicated CDM folder they have access to, mapped to the workspace.
    • Data analysts will not write to a CDM folder that data engineers are writing to, but will have read access.
    • Limit the use of Dataflows to data preparation – only to analysts who use the data to create datasets to support enterprise analytics.
  • To allow easy data exchange, Power BI stores the data in ADLSg2 in a standard format (CSV files) and with a rich metadata file describing the data structures and semantics.

Power BI Dataflows allow you to ingest data into CDM form from a variety of sources such as Dynamics 365, Salesforce, Azure SQL Database, Excel, or SharePoint. Once connected, prepare the data and load it as a custom entity in CDM form in Azure Data Lake Storage Gen2.
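
To make the CDM folder idea concrete, here is a sketch of the kind of model.json descriptor that sits next to the CSV partitions in the lake. The entity, attributes, and storage-account path are illustrative, and the fields shown are a simplified subset of the CDM folder metadata format:

```python
import json

# Skeletal CDM folder descriptor: one entity, its typed attributes, and
# the CSV partition(s) holding the data. Names here are hypothetical.
model = {
    "name": "SalesDataflow",
    "version": "1.0",
    "entities": [
        {
            "$type": "LocalEntity",
            "name": "Opportunity",
            "attributes": [
                {"name": "OpportunityId", "dataType": "string"},
                {"name": "Amount", "dataType": "decimal"},
                {"name": "CloseDate", "dataType": "dateTime"},
            ],
            "partitions": [
                {
                    "name": "Opportunity-part00",
                    "location": (
                        "https://<storage-account>.dfs.core.windows.net/"
                        "powerbi/SalesDataflow/Opportunity/part00.csv"
                    ),
                }
            ],
        }
    ],
}

with open("model.json", "w") as f:
    json.dump(model, f, indent=2)
```

Because the metadata travels with the data, any CDM-aware service (Azure Data Factory, Databricks, ML pipelines) can pick up the entity without re-deriving its schema.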

Enterprise Data Workloads approach

The right-hand side shows the various Azure services that can be leveraged by enterprise data engineers to drive analytical workloads.

Conclusion and Insight

The details outlined in this blog enable rapid insight into enterprise data. This approach drives adoption and establishes standards that drive sustainable analytics projects leveraging Power BI. It also enables access to curated and cataloged data. The semantic standardization enhances ML and pushes AI forward on the platform.

The Gartner Magic Quadrant for Analytics and BI released in February 2019 named Microsoft Power BI the leader in the analytics quadrant. One of the driving factors listed is Microsoft’s comprehensive product vision, which draws attention to the common and open data model, AutoML, and Cognitive Services like text and sentiment analytics via Power BI. I tried to draw attention to the same in this blog and simplify it.

Power BI: Power up your Analytics and Empower your Teams
https://blogs.perficient.com/2019/02/06/power-bi-power-up-your-analytics-empower-your-teams/
February 6, 2019

Power Up your Analytics with Power BI 

Power BI is a hosted BI solution on the Microsoft cloud. Its biggest appeal is the ecosystem it comes with. Across its various releases, Microsoft has created a very intuitive product on a robust platform. It is a one-stop shop for all analytics needs.

Identifying use cases for Power BI

This section will establish the relevance of Power BI. It is an immediate value add to any organization, seamlessly enabling teams that are willing to collaborate across the board to drive decision making. Let us now lay down a few scenarios.

Scenario 1 – A company has a sales team whose data is in Salesforce for tracking opportunities, and they are traditionally pretty happy with what Salesforce has to offer. The team wants to collaborate extensively and drive analytics to improve pipeline activity across various regions, managers, and sales reps.

Scenario 2 – A company has a data lake established. Various analysts want the flexibility to tap into the lake data and blend it with attribute data that does not reside in the enterprise data lake.

Scenario 3 – IT-driven analytics, where IT creates the content leveraging data from an enterprise data warehouse.

Scenario 4 – A team in an organization has been living off Excel and macros. There is a huge maintenance overhead, and most of their time is spent accessing and gathering data.

Let us now put some perspective into what Power BI enables.

  • Prototyping
    • This is a big plus and drives immediate adoption
  • Easy access to data
    • Analyze On Premise and Cloud Data sources at ease
  • Data Prepping and blending resulting in a strong Semantic Layer
    • It lets you wrangle data and apply transformations to it
  • Enables Rapid Content Creation
    • It adds velocity to analytics projects
  • Secure Implementations leveraging Azure Active Directory
    • Secure access
  • Data level Security
  • Easy Collaboration
    • Channel integration into Microsoft Teams
  • Improves coordination between IT and Line of Business
  • Promotes Self Service

Now that we have discussed Power BI’s advantages, let us put it to work to its true potential. For that to happen, we will have to depend on the technical IT team, lines of business, data owners, and stakeholders.

Simply put, it is where people, process, and platform come together.

  • Provisioning – Technical IT Team
    • License Purchases
    • Tenant Administration
    • Security and Access grants
    • Data Source(s)
    • Architectural decisions
  • Content Creation  – Both Technical IT Team and Line of Business
    • Prototyping will be driven by the Line of Business
    • Technical IT team – for Curated Data Sources
    • Identification of Use Cases and Prioritization
    • On-boarding Metrics
  • Self Service for Power Users
    • Establish a Standard Semantic layer – Technical IT Team
  • Team Size Transitions
    • Small to Large to Enterprise
  • Platform
    • Microsoft Power BI’s road map and future feature releases
  • Organization or Team’s skills in place
    • How well the resident in house skills can support Power BI

I took a broad swipe at how we can leverage the Power BI platform’s true potential by establishing processes and synergy between people. A good understanding of the various licenses and supported end-user capabilities will go a long way toward establishing a sustainable solution.

Couple Power BI’s capabilities with the processes established and the responsibilities laid out to address the scenarios from the beginning of this blog.

Scenario 1 – Analyze Salesforce data in Power BI

  • Power BI has connectors for Salesforce
  • Use Power BI Desktop to query the data and Create content
  • Publish the content out to the Power BI service and let the Sales Teams collaborate among themselves
  • To drive more value surface the Power BI content to a channel in Microsoft Teams
  • Secure the Power BI content staging area

Scenario 2 – Blend Attribute data with Data Lake data

  • Use Power BI to query the Data Lake
  • Use Power BI to ingest attribute data from Excel
  • Blend and transform the above
  • Create content
  • Share and collaborate

Scenario 3 – Enable Self Service Analytics for the Organization on curated data supported by the technical IT team for the enterprise.

Scenario 4 – Decouple from Excel completely, house the business logic and establish a solid Semantic Layer in Power BI.

As you can see, Power BI lends a lot of flexibility across various scenarios, engages various teams, and brings a lot of firepower to analytics as an offering.

In my next blog i will dive deeper into the essential components of Power BI and various platform provisioning approaches.
