Perficient Enterprise Information Solutions Blog


SSRS – Have you used it yet?

While there are several BI technologies on the market, with more entering the fray every day, SSRS has remained a key player in this space for quite some time. One of the biggest advantages of SSRS reporting is that it involves the participation of the end user and is very intuitive to use.

Let’s go back a few years, when Excel was the go-to tool for dashboarding. Every time a director or VP wanted a report, they would ask their developers to extract information from the database so they could build dashboards for their meetings. The end user had to rely on the developers to extract the data and then spend several minutes, if not hours, assembling a dashboard. This works well enough when the meeting is scheduled for a specific day of the week or month, but in reality most meetings happen impromptu. In those cases, there is not enough time to extract the data and turn it into graphs.

This is where SSRS came in as a key player. Built on a strong Microsoft foundation, SSRS brought some of the best and most needed features:

  • Easy connection to databases
  • A user-friendly interface that lets users design reports and make changes on the fly
  • Report generation at the click of a button
  • Subscription-based delivery that sends reports on a specific day and time of the month
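The button-click and export features above can be driven programmatically as well: reports deployed to a report server can be rendered through SSRS URL access by appending `rs:`-prefixed arguments to the report path. A minimal sketch (the server, report, and parameter names here are hypothetical):

```python
from urllib.parse import urlencode

def ssrs_export_url(server, report_path, fmt="PDF", **report_params):
    """Build an SSRS URL-access request that renders a deployed report.

    rs:-prefixed arguments drive the report server itself; any other
    keyword arguments are passed through as report parameters.
    """
    query = {"rs:Command": "Render", "rs:Format": fmt}
    query.update(report_params)
    # keep ':' literal so the rs: prefix survives URL encoding
    return f"http://{server}/ReportServer?{report_path}&{urlencode(query, safe=':')}"

# Hypothetical server, report, and parameter names:
url = ssrs_export_url("reports.example.com", "/Sales/MonthlyDashboard",
                      fmt="EXCEL", Region="West")
```

Dropping a URL like this into a browser or a scheduled job returns the rendered file, which is essentially what SSRS subscriptions automate for you.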


While these features may not look groundbreaking at first glance, they bring in a lot of value. They save a great deal of time, and in business that time translates directly into revenue. Developers can design dashboards once and deploy them to a server; the VP or director can then press a button to get those reports on their machine. Furthermore, the reports can be exported in several formats. What I really like about the reports, though, is the look and feel. Microsoft retained the aesthetics of Excel reports: a pie chart in Excel and one in SSRS look exactly the same. This is a great feature for the audience, since most people do not like to see the look of their reports change over time. Another great feature is that SSRS has strong security options, so one can implement role-based reporting.

In summary, SSRS is a power-packed tool, and you should reap the benefits of the great features that come with it.

For information on Microsoft’s future BI roadmap and self-service BI options, check out this post over on our Microsoft blog.

Realizing Agile Data Management …

Years of work went into building the elusive single version of the truth. Despite all attempts by IT and the business, Excel reporting and Access databases proved impossible to eliminate. Excel is the number one BI tool in the industry for good reasons: accessibility, speed, and familiarity. Almost all BI tools export data to Excel for those same reasons. The business will produce the insight it needs as soon as the data is available, manually or otherwise. It is time to come to terms with the fact that change is inevitable and there is no such thing as perfect data, only data that is good enough for the business. As the saying goes:

‘Perfect is the enemy of Good!’

Waiting for all the business rules and perfect data before producing the report or analytics is too late for the business. Speed is of the essence: when the data is available, the business wants it; stale data is as good as no data at all.


In the changing paradigm of data management, agile ideas and tools are in play. Waiting months, weeks, or even a day to analyze data from the data warehouse is a problem. Data discovery through agile BI tools that double as ETL offers a significant reduction in the time to data availability. Data virtualization provides access to data in real time, along with its metadata, for quicker insights. In-memory data appliances produce analytics in a fraction of the time required by a traditional data warehouse/BI stack.

For data access and analytical insights, we are moving from gourmet sit-down dining to fast food. Both have their place, their benefits, and their shortcomings, and they complement each other in the value they bring to the business. In the following series, let’s look at this new set of tools and how they help agile data management throughout the life cycle.

  1. Tools in play:
    1. Data Virtualization
    2. In-Memory Database (appliances)
    3. Data Life Cycle Management
    4. Data Visualization
    5. Cloud BI
    6. Big Data (Data Lake & Data Discovery)
    7. Cloud Integration (on-prem and off-prem)
    8. Information Governance (Data Quality, Metadata, Master Data)
  2. Architectural changes: traditional vs. agile
  3. Data Management Impacts
    1. Data Governance
    2. Data Security & Compliance
    3. Cloud Application Management

DevOps Considerations for Big Data

Big Data is on everyone’s mind these days. Creating an analytical environment involving Big Data technologies is exciting and complex: new technology, and new ways of looking at data that otherwise remained dark or unavailable. The hard part of implementing a Big Data solution is making it production-ready.

Once the enterprise comes to rely on the solution, dealing with typical production issues is a must. Expanding the data lakes and having multiple applications accessing data, changing it, and deploying new statistical learning solutions can hurt overall platform performance. End-user experience and trust will become an issue if the environment is not managed properly. Models that used to run in minutes may take hours or days as data and algorithm changes are deployed. Having the right DevOps process framework is important to the success of Big Data solutions.

In many organizations the data scientist reports to the business, not to IT. Knowing both the business and technological requirements and setting up the DevOps process accordingly is key to making these solutions production-ready.

Key DevOps Measures for Big Data environment:

  • Data acquisition performance (ingestion to creating a useful data set)
  • Model execution performance (Analytics creation)
  • Modeling platform / Tool performance
  • Software change impacts (upgrades and patches)
  • Development-to-production deployment performance (application changes)
  • Service SLA Performance (incidents, outages)
  • Security robustness / compliance
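Most of these measures boil down to tracking a few timestamps per run. A minimal sketch of the first measure, data acquisition performance, with made-up pipeline names and an assumed four-hour SLA:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class PipelineRun:
    name: str
    ingested_at: datetime    # raw data landed in the lake
    available_at: datetime   # curated data set ready for analysts

def sla_breaches(runs, max_lag=timedelta(hours=4)):
    """Return names of runs whose ingestion-to-availability lag exceeds the SLA."""
    return [r.name for r in runs if r.available_at - r.ingested_at > max_lag]

# Hypothetical pipeline names and timings:
runs = [
    PipelineRun("clickstream", datetime(2014, 6, 1, 1, 0), datetime(2014, 6, 1, 2, 30)),
    PipelineRun("pos_sales",   datetime(2014, 6, 1, 1, 0), datetime(2014, 6, 1, 9, 0)),
]
late = sla_breaches(runs)   # only "pos_sales" exceeds the 4-hour lag
```

The same pattern (capture timestamps, compute lag, alert on breaches) applies to model execution and deployment performance as well.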

One of the top issues is Big Data security. How secure is the data, and who has access to and oversight of it? Putting together a governance framework to manage the data is vital for the overall health and compliance of Big Data solutions. Big Data is just gaining traction, and many of the best practices for Big Data DevOps scenarios have yet to mature.

Virtualization – THE WHY?

The speed at which we receive information from multiple devices, and the ever-changing customer interactions that create new kinds of customer experience, generate DATA! Any company that knows how to harness that data and produce actionable information is going to make a big difference to its bottom line. So why virtualization? The simple answer is business agility.

As we build the new information infrastructure and the tools for modern enterprise information management, one has to adapt and change. Over the last 15 years, the enterprise data warehouse has matured to the point where proper ETL frameworks and dimension models are well established.

With the ‘Internet of Things’ (IoT), a lot more data is created and consumed from external sources. Cloud applications create data that may not be readily available for analysis, and not having that data will greatly limit the critical insights produced.

Major Benefits of Virtualization


Additional considerations

  • Address the performance impact of virtualization on the underlying applications, and manage overall refresh delays appropriately
  • It is not a replacement for data integration (ETL), but it is a quicker way to gain data access in a controlled way
  • It may not include all the business rules, which means data quality issues may remain
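The "quicker access without ETL" point can be illustrated with a toy federated query. This is plain SQLite standing in for a virtualization layer, not any particular product: two independently owned stores are joined at query time, so no physical copy of the data is ever built.

```python
import sqlite3

conn = sqlite3.connect(":memory:")                 # stand-in for the warehouse
conn.execute("ATTACH DATABASE ':memory:' AS crm")  # stand-in for a cloud app's data

conn.execute("CREATE TABLE orders (cust_id INTEGER, amount REAL)")
conn.execute("CREATE TABLE crm.customers (cust_id INTEGER, region TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 100.0), (2, 250.0)])
conn.executemany("INSERT INTO crm.customers VALUES (?, ?)",
                 [(1, "West"), (2, "East")])

# The consumer sees one logical model; the join happens at query time,
# against current data in both sources, with no staged extract.
rows = conn.execute("""
    SELECT c.region, SUM(o.amount)
    FROM orders o JOIN crm.customers c ON o.cust_id = c.cust_id
    GROUP BY c.region ORDER BY c.region""").fetchall()
```

Real virtualization tools add caching, security, and metadata on top of this idea, which is exactly where the refresh-delay and business-rule caveats above come from.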

In conclusion, having a virtualization tool in the enterprise data management portfolio adds agility to data management. However, use virtualization appropriately, to solve the right kind of problem, and not as a replacement for traditional ETL.

Cloud BI use cases

Cloud BI comes in different forms and shapes, ranging from visualization alone to a full-blown EDW combined with visualization and predictive analytics. The truth of the matter is that every niche vendor offers some unique feature that other product suites do not. In most cases you need more than one BI suite to meet all the needs of the enterprise.

Decentralization definitely helps the business achieve agility and respond to market challenges quickly. By the same token, it is also how companies end up with silos of information across the enterprise.

Let us look at some scenarios where a cloud BI solution is very attractive for departmental use.

Time to Market

Getting the business case built and approved for big CapEx projects is a time-consuming proposition. Wait times for hardware, software, and IT involvement mean much longer delays in scheduling the project, not to mention the push-back to use existing reports or to wait for the next release, which is allegedly forever around the corner.

Deployment Delays

Business users have an immediate need for analysis and decision-making. The typical turnaround for IT to onboard new sources of data is anywhere between 90 and 180 days. This is an absolute killer for a business that wants the data now. Spreadsheets remain the top BI tool for just this reason. With cloud BI (not just the tool), business users get not only the visualization and other product features but also data that is not otherwise available. Customer analytics with social media analysis, for example, is available as a third-party BI solution. In the case of such value-added analytics, there is a solid business reason to go with these solutions.

Tool Capabilities

Power users need ways to slice and dice the data, and they need integration of non-traditional sources (Excel, departmental cloud applications) to produce a combined analysis. Many BI tools come with lightweight integration (mostly push integration) that makes this a reality without much of an IT bottleneck.

So if we can add new capability without much delay and within a departmental budget, where is the rub?

The issue is not looking at enterprise information in a holistic way. Though speed is critical, it is equally important to engage governance and IT to secure the information and share it appropriately, so it can be integrated into the enterprise data asset.

As we move into the future of cloud-based solutions, we will be able to solve many of these bottlenecks, but we will also have to deal with the security, compliance, and risk-mitigation implications of leaving data in the cloud. Forging a strategy to meet the various BI demands of the enterprise, with proper governance, will yield the optimum mix of resources and solutions.

KScope14 Session: Empower Mobile Restaurant Operations Analytics

Perficient is exhibiting and presenting this week at KScope14 in Seattle, WA. On Monday, June 23, I presented my retail-focused solution offering, built upon the success of Perficient’s Retail Pathways but using the Oracle suite of products. To fit the discussion within a one-hour window, I chose restaurant operations to represent the solution.

Here is the abstract for my presentation.

Multi-unit, multi-concept restaurant companies face challenging reporting requirements. How should they compare promotion, holiday, and labor performance data across concepts? How should they maximize fraud detection capabilities? How should they arm restaurant operators with the data they need to react to changes affecting day-to-day operations as well as over-time goals? An industry-leading data model, integrated metadata, and prebuilt reports and dashboards deliver the answers to these questions and more. Deliver relevant, actionable mobile analytics for the restaurant industry with an integrated solution of Oracle Business Intelligence and Oracle Endeca Information Discovery.

We have tentatively chosen to brand this offering as Crave – Designed by Perficient. Powered by Oracle. This way we can differentiate this new Oracle-based offering from the current Retail Pathways offering.



SAP HANA – A ‘Big Data’ Enabler

Some interesting facts and figures for your consideration:

  • 90% – of stored data in the world today was created in the past 2 years
  • 50% – annual data growth rate
  • 34,000 – tweets sent each minute
  • 9,000,000 – daily Amazon orders
  • 7,000,000,000 – daily Google Page Views
  • 2.5 Exabyte – amount of data created every day (an Exabyte is 1,000,000,000,000,000,000 B = 1000 petabytes = 1 million terabytes = 1 billion gigabytes)
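A quick sanity check on the unit conversions in the last bullet, in plain Python arithmetic:

```python
# 1 EB = 1000 PB = 1,000,000 TB = 1,000,000,000 GB (decimal/SI units)
GB = 10**9
TB = 10**12
PB = 10**15
EB = 10**18

assert EB == 1000 * PB == 10**6 * TB == 10**9 * GB

daily_bytes = 2.5 * EB               # data created every day
per_second = daily_bytes / 86_400    # seconds in a day
# per_second / TB is roughly 29, i.e. ~29 TB of new data every second
```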

Looking at these numbers it is easy to see why more and more technology vendors want to provide solutions to ‘Big Data’ problems.

In my previous blog, I mentioned how we’ll soon get to a place where it will be more expensive for a company not to store data than to store data – some pundits claim that we’ve already reached this pivotal point.

Either way, it would be greatly beneficial to come to terms with at least some of those technologies that have made a substantial investment in the Big Data space.

One such technology is SAP HANA – a Big Data enabler. I am sure that some of you have heard this name before… but what is SAP HANA exactly?

The acronym HANA stands for High-performance ANalytical Appliance. If I went beyond the name and described SAP HANA in one sentence, I would say that SAP HANA is a database on steroids, perfectly capable of handling Big Data in-memory, and one of the few in-memory computing technologies that can be used as an enabler of Big Data solutions.

Dr. Berg and Ms. Silvia – both SAP HANA gurus – provide a comprehensive and accurate definition of SAP HANA:

“SAP HANA is a flexible, data-source-agnostic toolset (meaning it does not care where the data comes from) that allows you to hold and analyze massive volumes of data in real time, without the need to aggregate or create highly complex physical data models. The SAP HANA in-memory database solution is a combination of hardware and software that optimizes row-based, column-based, and object-based database technologies to exploit parallel processing capabilities. We want to say the key part again: SAP HANA is a database. The overall solution requires special hardware and includes software and applications – but at its heart, SAP HANA is a database”.

Or as I put it, SAP HANA is a database on steroids… but with no side-effects, of course. Most importantly though, SAP HANA is a ‘Big Data Enabler’, capable of:

  • Conducting massively parallel processing (MPP), handling up to 100 TB of data in-memory
  • Providing a 360 degree view of any organization
  • Safeguarding the integrity of the data by reducing, or eliminating data migrations, transformations, and extracts across a variety of environments
  • Ensuring overall governance of key system points, measures and metrics
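The row-based vs. column-based distinction in the definition above is worth a toy illustration. This is plain Python, not HANA itself: an aggregate over one attribute has to touch every field of every record in a row layout, but only one contiguous array in a column layout, which is also what makes columnar data compress well and scan in parallel.

```python
from array import array

N = 1000

# Row store: each record carries all of its fields together.
row_store = [{"id": i, "region": "West", "amount": float(i)} for i in range(N)]
total_rows = sum(rec["amount"] for rec in row_store)   # walks whole records

# Column store: each field lives in its own densely packed array.
col_amount = array("d", (float(i) for i in range(N)))
total_cols = sum(col_amount)                           # walks one array

assert total_rows == total_cols
```

In-memory columnar engines build on exactly this layout, adding compression and parallel scans across cores.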

All with very large amounts of data, in-memory and in real time… could this be a good fit for your company? Or, if you are already using SAP HANA, I’d love to hear from you and see how you have implemented this great technology and what benefits you’ve seen working with it.

My next blog post will focus on SAP HANA’s harmonious, or almost harmonious, co-existence with Hadoop…

QlikView… QlikTech… Qlik…

Several years ago, when I started using QlikView (QlikTech’s flagship product), I had a strong preference for more traditional BI tools and platforms, mostly because I thought QlikView was just a visualization tool. But after some first-hand experience with the tool, any bias I had quickly dissipated, and I have been a QlikView fan, fulfilling the role of Senior QlikView Architect on full-lifecycle projects, for a while now.

Today, Qlik Technologies (also known as QlikTech or simply Qlik) is the 3rd fastest-growing tech company in the US (according to a Forbes article), but my personal journey with QlikView, and probably QlikTech’s journey as well, has not always been easy; a paradigm shift in the way we look at BI is required. Most importantly, I understood, along with many others, that this isn’t a matter of QlikView or SAP BI, of QlikView’s agile approach to BI or traditional BI. It is NOT a matter of ORs, but rather a matter of ANDs.

It is a matter of striking the right balance with the right technology mix and doing what is best for your organization, setting aside personal preferences. At times QlikView may be all that is needed; in other cases, a broader technology mix is a must. At times ‘self-service’ and ‘agile’ BI is the answer… and at times it isn’t. Ultimately, it all revolves around the real needs of your organization and creating the right partnerships.

So far, QlikTech has been able to create a healthy ecosystem with many technology partners, from a wide variety of industries and with a global reach. QlikTech has evolved over time and has continued to understand, act on, and metabolize the needs of the market, along with the needs of end users and IT. I wonder what’s next.

That’s one of the reasons why Qlik has been able to trail-blaze a new approach to BI: user-driven BI, i.e., Business Discovery. According to Gartner, ‘Qlik’s QlikView product has become a market leader with its capabilities in data discovery, a segment of the BI platform market that it pioneered.’

Gartner defines QlikView as ‘a self-contained BI platform, based on an in-memory associative search engine and a growing set of information access and query connectors, with a set of tightly integrated BI capabilities’. This is a great definition that highlights a few key points of this tool.

In coming blogs, we’ll explore some additional traits of QlikTech and its flagship product QlikView, such as:

  • An ecosystem of partnerships – QlikTech has been able to create partnerships with several technology partners and has set in place a worldwide community of devotees and gurus
  • Mobility – QlikView was recently named a ‘Hot Vendor’ for mobile business intelligence and ranks highest in customer assurance (see WSJ article here), with one of the best TCO and ROI figures
  • Cloud – QlikView has been selected as a cloud-based solution by several companies, and it has created strong partnerships with leading Cloud Computing technologies such as Amazon EC2 and Microsoft Azure
  • Security – provided at the document, row, and field levels, as well as at the system level, utilizing industry-standard technologies such as encryption, access control mechanisms, and authentication methods
  • Social Business Discovery – co-create, co-author, and share apps in real time; share analysis with bookmarks; discuss and record observations in context
  • Big Data – Qlik has established partnerships with Cloudera and Hortonworks. In addition, according to the Wall Street Journal, QlikView ranks number one among BI and analytics offerings in healthcare (see WSJ article here), mostly in connection with healthcare providers seeking “alternatives to traditional software solutions that take too long to solve their Big Data problems”

In future posts, I am going to examine and dissect each of these traits and more! I am also going to make sure we have some reality checks in place, in order to draw the line between fact and fiction.

What other agile BI or visualization topics would you like to read about or what questions do you have? Please leave comments and we’ll get started.

Three Attributes of an Agile BI System

In an earlier blog post I wrote that Agile BI is much more than just applying agile SDLC processes to traditional BI systems. That is, Agile BI systems need to support business agility. To do so, BI systems should address three main attributes:

  1. Usable and Extensible – In a recent TDWI webinar on business enablement, Claudia Imhoff said, “Nothing is more agile than a business user creating their own report.” I could not agree more with Ms. Imhoff’s comments. Actually, I would go further. Today’s BI tools allow users to create and publish all types of BI content, such as dashboards and scorecards. They allow power users to conduct analysis and then storyboard, annotate, and interpret the results. Agile BI systems allow power users to publish content to portals, web browsers, and mobile devices. Finally, Agile BI systems do not confine users to data published in a data warehouse, but allow users to augment IT-published data with “user” data contained in spreadsheets and text files.
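That last point, augmenting IT-published data with user data, can be sketched in a few lines of stdlib Python. The file layout and column names here are hypothetical: a governed warehouse extract, plus a user-maintained CSV of quota targets that IT has not modeled yet.

```python
import csv
import io

# Governed extract from the warehouse (would normally come from a query).
warehouse = [{"region": "East", "sales": 250.0},
             {"region": "West", "sales": 100.0}]

# User-maintained spreadsheet export (hypothetical columns).
user_csv = io.StringIO("region,target\nEast,300\nWest,90\n")
targets = {row["region"]: float(row["target"]) for row in csv.DictReader(user_csv)}

# Augment the published data set with the user's numbers on the fly.
report = [{**rec,
           "target": targets[rec["region"]],
           "met": rec["sales"] >= targets[rec["region"]]}
          for rec in warehouse]
# report[0] → {'region': 'East', 'sales': 250.0, 'target': 300.0, 'met': False}
```

Agile BI tools package exactly this kind of blend behind a drag-and-drop interface, which is what lets business users move without waiting on the warehouse release cycle.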

Top 5 Best Practices for an Actionable BI Strategy

In an earlier blog post, I pointed out that many companies complete a BI Strategy only to shelve it shortly after its completion. One main reason is that companies let their BI Strategy atrophy by not maintaining it; the other main cause is that the strategy was not actionable. That is, the BI Strategy did not result in a roadmap that would be funded and supported by the organization. As you formulate your BI Strategy, there are five best practices that will help make it actionable and supported by your business stakeholders. These best practices are:

1. Address the Elephants in the Room – Many times, if management consultants are brought in to help with a BI Strategy, their objectivity is needed to resolve one or more disagreements within the organization. For example, the disagreement could be over a DW platform selection, the architectural approach for data integration, or the determination of business priorities for the BI program. The BI Strategy needs to resolve these issues, or they will continue to fester within the organization, eventually undermining support for the BI Strategy.