Arunkalyan Nagarajan, Author at Perficient Blogs
https://blogs.perficient.com/author/anagarajan/

Personal Analytics for better "self-awareness" (Sun, 24 Mar 2013)
https://blogs.perficient.com/2013/03/24/personnel-analytics-for-better-self-awareness/

BI technologies these days are not limited to the business world; they have seen widespread adoption in the consumer space as well, and people are finding real value in such analytic tools. I would like to discuss two examples of personal analytics that caught my eye in the recent past.

Example 1: Personal Analytics from Stephen Wolfram

I read an article a year ago by one of the great minds of our time, Stephen Wolfram, about his data collection habits and his use of analytics to gain insights into his personal life. You can read the entire blog here. He started collecting his daily activity data in the 1990s and has nearly 20 years of his life digitized, capturing data from his emails, phone call records, sleeping patterns and walking steps. Stephen's motivation for personal analytics may be his interest in self-awareness, but the same techniques can be used for better time management and for handling our daily routines more effectively.
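As a toy illustration of this kind of analysis, the sketch below tallies timestamped events (sent emails, say) by hour of day to reveal a daily activity pattern, much like the plots in Wolfram's post. It is a minimal example of my own, not Wolfram's toolchain; the activity_log.csv file and its ISO-format timestamp column are assumptions.

```python
import csv
from collections import Counter
from datetime import datetime

# Count events (e.g., sent emails) per hour of day from an exported log.
# Assumes a CSV with a header row and an ISO-8601 "timestamp" column.
hour_counts = Counter()
with open("activity_log.csv", newline="") as f:
    for row in csv.DictReader(f):
        ts = datetime.fromisoformat(row["timestamp"])
        hour_counts[ts.hour] += 1

# Print a crude text histogram of daily activity.
for hour in range(24):
    print("%02d:00  %s" % (hour, "#" * hour_counts[hour]))
```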



Example 2: Healthcare Analytics

It is no surprise that the next wave of gadgets is wearable computers. Apple is rumored to be working on a watch, Google is working on Google Glass, and there are companies focused solely on healthcare gadgets as well. These are devices, typically wristbands, that you wear all day long and that keep track of your daily life: calories burned, steps walked, sleeping patterns and vital signs. Fitbit, Nike Fuelband and Jawbone Up are some of the consumer products that answer questions about your personal life. The next wave of these devices may track more health-related stats, which could help regularize eating and sleeping habits for a healthier lifestyle.



FITBIT: http://www.wired.com/reviews/2009/10/pr_fitbit/

FUELBAND: http://www.wired.com/playbook/2012/01/nike-fuelband/

JAWBONE UP: https://jawbone.com/up

I believe the need for personal analytics will increase over time, and we may see more such BI technologies used in the consumer world. But I am skeptical whether the existing BI companies will even play a role in developing and providing the platform for such analysis.

Change Data Capture: Capabilities, Use Cases and Offerings: Part 2 (Sat, 16 Mar 2013)
https://blogs.perficient.com/2013/03/16/change-data-capture-capabilities-usecases-and-offerings-part-2/

This post is the second in a series covering the capabilities, use cases and product offerings of change data capture (CDC) technology.

The critical capabilities expected in a CDC tool:

  • Selective data replication and synchronization is the ability to synchronize data between multiple databases, usually in high-volume, mission-critical scenarios. Typical use cases include creating redundant copies of mission-critical data and keeping all the operational systems in sync (a minimal capture sketch follows this list).
  • Volume data movement involves high-volume data extraction and delivery. This capability is required to support business intelligence and data warehousing, as well as an organization's data migration efforts.
  • Message-oriented middleware (MOM) is infrastructure that supports sending and receiving messages between distributed systems. MOM allows application modules to be distributed over heterogeneous platforms and reduces the complexity of developing applications that span multiple operating systems and network protocols. The data is encapsulated into messages that different applications can exchange in real time.
  • Data federation is a technology that gives an organization the ability to aggregate data from disparate sources in a virtual database so it can be used for business intelligence (BI) or other analysis. The virtual database created by data federation technology doesn't contain the data itself; it contains information about the actual data and its location, and the actual data is left untouched.
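To make the replication and movement capabilities above concrete, here is a minimal sketch of timestamp-based change capture, one of the simpler techniques in this family (commercial tools more often read the database transaction log). This is my own illustration rather than anything from the products discussed here; the orders table, its last_modified column, and the use of Python's built-in sqlite3 module as a stand-in source database are all assumptions.

```python
import sqlite3

def capture_changes(conn, since):
    """Return rows changed since the given ISO timestamp (timestamp-based CDC)."""
    cur = conn.execute(
        "SELECT id, status, last_modified FROM orders WHERE last_modified > ?",
        (since,),
    )
    return cur.fetchall()

# A real CDC tool would persist this watermark durably between runs.
watermark = "1970-01-01T00:00:00"
conn = sqlite3.connect("source.db")
changes = capture_changes(conn, watermark)
for row in changes:
    print("changed row:", row)  # hand off to the delivery side here
if changes:
    # ISO-8601 strings sort chronologically, so max() advances the watermark.
    watermark = max(row[2] for row in changes)
```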

Use cases

There are five major use cases for CDC technology; they are detailed below.

  • BI and Data Warehousing. The most valuable use case for CDC is in the data warehouse and BI world: data is extracted from multiple operational source systems to deliver an integrated data structure that provides analytics for the whole company.
  • MDM solutions. CDC supports master data management, which removes duplicates, standardizes data and applies rules to keep incorrect data from entering the system, in order to create an authoritative source of master data. Master data are the products, accounts and parties against which business transactions are completed. CDC helps maintain data consistency and helps MDM tools provide a unified data structure for core corporate entities. Data replication and synchronization functions are increasingly required to support MDM.
  • Data Migration. Many organizations run multiple data migration projects and face large-scale migration efforts at any given time. The usual practice of custom coding is dying away given the more abstract capabilities these CDC tools provide. Many legacy application changes and consolidation efforts are addressed through data migration as well.
  • Data Consistency across applications. CDC is used to maintain consistency and provide redundancy for mission-critical application data, keeping data consistent between application systems that rely on different database solutions. For example, the details of a new purchase order need to be reflected in the billing and inventory systems; doing so avoids confusion and yields more reliable, consistent data across databases and applications (a delivery sketch follows this list).
  • Data Sharing with Vendors, Partners and Customers. There are requirements and standards companies must meet to run a business, and these days companies are expected to open up their data platforms to vendors and partners to give them insight and visibility into key operational data. Data integration tools can help in these scenarios; they often consist of the same data access, transformation and movement components found in the other common use cases.
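Continuing the sketch above, the delivery half of the purchase-order example might look like the following: each captured change is upserted into both the billing and inventory databases, one transaction per target. Again, this is my own minimal illustration with hypothetical database and table names; real CDC products add ordering guarantees, conflict resolution and failure recovery on top of this.

```python
import sqlite3

def apply_changes(targets, changes):
    """Upsert captured order changes into each downstream database."""
    for conn in targets:
        with conn:  # one transaction per target; rolls back on error
            for order_id, status, last_modified in changes:
                # Assumes id is the primary key of orders in each target.
                conn.execute(
                    "INSERT INTO orders (id, status, last_modified) "
                    "VALUES (?, ?, ?) "
                    "ON CONFLICT(id) DO UPDATE SET status = excluded.status, "
                    "last_modified = excluded.last_modified",
                    (order_id, status, last_modified),
                )

billing = sqlite3.connect("billing.db")
inventory = sqlite3.connect("inventory.db")
apply_changes([billing, inventory], changes)  # 'changes' from the capture sketch
```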


Current Product Offerings

The product rating from Gartner features all the prominent products and market leaders, based on scores calculated from the features and capabilities these tools offer. The chart gives a quick glance at the ratings.



Source: Gartner

Product Offerings and Key Players

  • IBM – Information Server
  • iWay Software – DataMigrator, Data Hub, Service Manager
  • Informatica – Informatica Platform
  • Microsoft – SQL Server 2012 Integration Services
  • Oracle – GoldenGate and Oracle Data Integrator
  • SAP – Data Services
  • SAS – DataFlux Data Management Platform
  • Talend – Talend Integration Suite

Conclusion

We see that CDC has widespread use cases and capabilities across an organization's IT efforts. IT departments would be wise to adopt CDC technologies, recognize the capabilities they provide, and establish enterprise-wide guidelines for their use.

Changes to Information Management Reference Architecture – Part 1 (Tue, 26 Feb 2013)
https://blogs.perficient.com/2013/02/26/changes-to-oracle-information-management-reference-architecture-part-1/

Oracle’s last data warehouse reference architecture was released in 2010, and the industry has seen a lot of change in how data is handled since then. This is a two-part series covering the changes made to the data warehouse reference architecture to incorporate Big Data needs.

As Oracle puts it:

What has changed in the last few years is the emergence of “Big Data”, both as a means of managing the vast volumes of unstructured and semi-structured data stored but not exploited in many organizations, as well as the potential to tap into new sources of insight such as social-media web sites to gain a market edge.

This post covers the background on Information Management and looks at the new demands on DW and BI solutions to exploit new information sources (social media, sensors, logs, etc.) for better competitive advantage.

The earlier information management reference architecture dealt with data that was readily available and easily analysed using standard BI tools. The earlier versions of the IM reference were defined around the technical and commercial limitations of 2008-09. Some, if not most, of those limitations have since disappeared with advances in technologies such as Hadoop and NoSQL and with hardware improvements in Oracle Exadata. This allows flexibility in designing IM solutions without being constrained by hardware capabilities or limitations.

Increasing the scope of the Information Management Reference Architecture

The reference architecture paper argues against treating Big Data in isolation from the existing IM solution: Big Data is not fundamentally different from other aspects of Information Management, and the common questions apply to it as well.

How can the new data or analysis scope enhance your existing set of capabilities?

What additional opportunities for intervention or process optimization does it present?

Information Management Reference Architecture

The difference between the old reference architecture and the new one is noticeable. The new version of the IM architecture includes knowledge discovery and the improved analytics tools used by data scientists.

Old Reference Architecture

New Reference Architecture

Below is brief information on the different layers in the architecture.

Extract from Oracle’s Information Management White Paper:

Staging Data Layer. Abstracts the rate at which data is received onto the platform from the rate at which it is prepared and then made available to the general community. It facilitates a ‘right-time’ flow of information through the system.

Foundation Data Layer. Abstracts the atomic data from the business process. For relational technologies the data is represented in close to third normal form and in a business process neutral fashion to make it resilient to change over time. For non-relational data this layer contains the original pool of invariant data.

Access and Performance Layer. Facilitates access and navigation of the data, allowing for the current business view to be represented in the data. For relational technologies data may be logical or physically structured in simple relational, longitudinal, dimensional or OLAP forms. For nonrelational data this layer contains one or more pools of data, optimised for a specific analytical task or the output from an analytical process. e.g., In Hadoop it may contain the data resulting from a series of Map-Reduce jobs which will be consumed by a further analysis process.

Knowledge Discovery Layer. Facilitates the addition of new reporting areas through agile development approaches and data exploration (strongly and weakly typed data) through advanced analysis and Data Science tools (e.g. Data Mining).

BI Abstraction & Query Federation. Abstracts the logical business definition from the location of the data, presenting the logical view of the data to the consumers of BI. This abstraction facilitates Rapid Application Development (RAD), migration to the target architecture and the provision of a single reporting layer from multiple federated sources.
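As a simplified illustration of the Map-Reduce output mentioned in the Access and Performance Layer description above, here is a word-count-style aggregation written as a Hadoop Streaming mapper/reducer pair in Python. This is my own minimal sketch, not part of Oracle's paper; it assumes tab-separated input with an event type in the first field, and relies on Hadoop Streaming's guarantee that reducer input arrives sorted by key.

```python
#!/usr/bin/env python
# aggregate.py -- run as "aggregate.py map" or "aggregate.py reduce"
import sys

def mapper():
    # Emit (event_type, 1) for each input record.
    for line in sys.stdin:
        event_type = line.split("\t")[0].strip()
        if event_type:
            print("%s\t1" % event_type)

def reducer():
    # Input arrives sorted by key, so a running total per key suffices.
    current_key, count = None, 0
    for line in sys.stdin:
        key, value = line.rstrip("\n").split("\t")
        if key != current_key:
            if current_key is not None:
                print("%s\t%d" % (current_key, count))
            current_key, count = key, 0
        count += int(value)
    if current_key is not None:
        print("%s\t%d" % (current_key, count))

if __name__ == "__main__":
    mapper() if sys.argv[1] == "map" else reducer()
```

It would be launched with something like the standard Hadoop Streaming invocation (hadoop jar hadoop-streaming.jar -input ... -output ... -mapper "aggregate.py map" -reducer "aggregate.py reduce"), and the per-event-type totals it writes are exactly the kind of pre-aggregated pool of data this layer would expose to a further analysis process.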

The next post in the series will discuss the implementation methodologies for Big Data in the context of the Information Management framework.

Amazon Redshift: BI in the Cloud (Fri, 15 Feb 2013)
https://blogs.perficient.com/2013/02/15/amazon-redshift-bi-in-the-cloud/

BI and DW in the cloud have been avoided by companies for a while now. Several reasons are cited for the slow start:

  1. Companies prefer a wait-and-watch approach when it comes to moving BI to the cloud

  2. The offered solutions were neither straightforward nor particularly cost-effective

  3. Employees need to be retrained when a new analytics application is forced on them

But with Amazon's new Redshift service, a DW in the cloud can be integrated with existing business intelligence tools. This is a big step in the right direction. There are significant advantages to using Redshift over a traditional data warehouse:

  1. Scaling the service is a breeze and you can expand at will (flexible)

  2. Predicted to be one-tenth the cost of maintaining a traditional data warehouse (cost-effective)

  3. Integrates with existing business intelligence tools (ease of use)

  4. No maintenance or deployment issues

From Amazon's blog:

Amazon Redshift is a fast, fully managed, petabyte-scale data warehouse service that makes it simple and cost-effective to efficiently analyze all your data using your existing business intelligence tools. It is optimized for datasets ranging from a few hundred gigabytes to a petabyte or more and costs less than $1,000 per terabyte per year, a tenth the cost of most traditional data warehousing solutions.
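Because Redshift speaks the PostgreSQL wire protocol, existing tools, or a few lines of code, can query it the way they would any PostgreSQL database. Below is a minimal sketch using the psycopg2 driver; the cluster endpoint, database name, table and credentials are all hypothetical placeholders, not a real deployment.

```python
import psycopg2  # standard PostgreSQL driver; Redshift uses the same protocol

# Hypothetical endpoint and credentials -- substitute your own cluster details.
conn = psycopg2.connect(
    host="examplecluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439,  # Redshift's default port
    dbname="analytics",
    user="awsuser",
    password="...",
)

with conn.cursor() as cur:
    # An ordinary SQL aggregate -- existing BI tools issue queries just like this.
    cur.execute("SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY 2 DESC")
    for region, total in cur.fetchall():
        print(region, total)

conn.close()
```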

Introductory video about Amazon Redshift

I don’t see a better narrative, or a better time, to start moving to a cloud-based BI solution than now. And Redshift is Amazon's way of saying "Your move, Teradata."

Useful Links

Redshift: http://aws.amazon.com/redshift/
Pricing: http://aws.typepad.com/aws/2013/02/amazon-redshift-now-broadly-available.html
MicroStrategy Integration: https://aws.amazon.com/marketplace/pp/B00A41ASWK/ref=srh_res_product_title?ie=UTF8&sr=0-4&qid=1360808044401

BigML – Predictive Modelling made easy (Mon, 11 Feb 2013)
https://blogs.perficient.com/2013/02/11/bigml-predictive-modelling-made-easy/

I have been pleasantly surprised by the new startups that have emerged around predictive analytics. This year, I will be covering a wide range of predictive analytical tools and techniques for using them. We no longer need PhDs to understand statistics and start using predictive models; we have to know how to use the tools, not how the models work internally. As the world moves toward more abstraction, it is only a matter of time before such powerful tools are used by business analysts.


One such tool I recently learnt about is BigML. I believe BigML has taken the first step in making predictive analytical tools user friendly and able to produce results in minutes.

The tool is currently limited to tree visualization and is based on decision trees as the way to gain insights from predictive models. But it has huge potential to deliver fast, accessible models.

It is a four-step process to get to the insights (a sketch using BigML's Python bindings follows the steps):

1. Setting up Data Source

2. Creating the data set

3. Create Models

4. Generate Predictions
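For a sense of how the four steps look in practice, here is a rough sketch using BigML's Python bindings. The call names (create_source, create_dataset, create_model, create_prediction) are from memory and should be checked against the current API documentation; the iris.csv file and the input fields are placeholders.

```python
from bigml.api import BigML

# Credentials come from the BIGML_USERNAME / BIGML_API_KEY environment
# variables, or can be passed explicitly: BigML("username", "api_key").
api = BigML()

source = api.create_source("iris.csv")        # 1. set up the data source
dataset = api.create_dataset(source)          # 2. create the dataset
model = api.create_model(dataset)             # 3. create the model
prediction = api.create_prediction(           # 4. generate a prediction
    model, {"petal length": 4.2, "petal width": 1.3}
)
api.pprint(prediction)  # pretty-print the predicted value
```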

You can also play around with the interactive models in the gallery section of the site.

Notes from the last Oracle OpenWorld 2012 (Fri, 18 Jan 2013)
https://blogs.perficient.com/2013/01/18/notes-from-the-last-oracle-openworld-2012/

Link to my post on the trends in Oracle BI

https://blogs.perficient.com/businessintelligence/2013/01/17/excerpts-from-oracle-openworld-2012/


Excerpts from Oracle OpenWorld 2012 (Thu, 17 Jan 2013)
https://blogs.perficient.com/2013/01/17/excerpts-from-oracle-openworld-2012/

There has been a lot going on in the BI world, and Oracle had a bevy of announcements at the last Oracle OpenWorld 2012. It has been three months since the event, but the trends and announcements will have a lasting impact on the BI world. The BI-specific topics covered at the event included data warehousing, OBIEE, Endeca, Essbase integration, EPM and Oracle Analytics. Below is a brief overview of some of the trends from the event.

Stronger OBIEE 11g and Essbase integration

OBIEE has made steady progress in strengthening the integration between Essbase and OBIEE 11g, and the latest iteration improves it even further. Essbase as a data source did not play well with earlier releases, and a lot of complex manual mapping was needed to get the required functionality. Essbase can now be used as a data source for the OBIEE repository, with Oracle Answers as the front end for reporting and dashboarding needs from HFM.

The new features allow bringing in all the hierarchies (parent-child, ragged and skip-level) from Essbase, which was not possible earlier. The integration process turns the multidimensional view into a relational view and introduces new OLAP query capabilities. Now all the financial forecasting performed in Essbase can be integrated with other, non-multidimensional data sources, with Oracle Answers providing the front end.

Growth of Endeca Information Discovery.

Endeca was a very good acquisition by Oracle, as it strengthens its ERP and e-commerce offerings. Endeca was well known for its "faceted search", which searches across various attributes to deliver a unique web experience in information discovery. Endeca Latitude was a BI platform with a "search first" philosophy at its core. The product has now been rebranded as Endeca Information Discovery.

The EID tool's backend is supported by the Oracle Endeca Server and its database, a unique technology focused on delivering search and analytics, with data mapped as attributes stored as key/value pairs. The technology stack differs from the traditional BI approach, delivering a unique data exploration method that combines search, contextual navigation and visual analysis.

Oracle Exalytics, Exadata and In-Memory Analytics

These high-capacity machines now run 10 times faster than the previous iterations and are mainly used by companies with real-time processing needs.

Exalytics as the Exa-Machine for BI

– A BI layer on steroids, with multiple cores and a 1TB server

– In-memory analytics used to accelerate BI capabilities

– Works on the 80:20 rule: Exadata addresses 80 percent of the needs and Exalytics serves the other 20

– Faster response times and higher concurrency

Oracle states: "The Exadata Database Machine X3-8 is designed for database deployments that require very large amounts of data, delivering extreme performance and scalability for all applications including Online Transaction Processing (OLTP), Data Warehousing (DW) and consolidation of mixed workloads. It comes complete with two 8-socket database servers, 14 Oracle Exadata Storage Servers, InfiniBand switches and more than 22 terabytes of Exadata Smart Flash Cache to support extremely fast transaction response times and high throughput."

(Visual) Data Discovery (Beauty or a Beast?) (Mon, 31 Dec 2012)
https://blogs.perficient.com/2012/12/31/visual-data-discovery-beauty-or-a-beast/

Search technology has been around for more than a decade now, but it has mostly been used to index web files and help users find what they are looking for. When used in the enterprise BI context, search brings new capabilities to BI. First, search analyses data behind the scenes that traditional BI cannot, including unstructured content like documents, social media updates, RSS/Twitter feeds and other highly diverse data. This is information that is hard to get into the warehouse and takes huge effort to structure. Data discovery can be thought of as a search engine combined with the unstructured and structured content in the enterprise. Data discovery tools speed the time to insight through the use of visualizations, best practices in visual perception, and easy exploration.

This is a three-part series providing an overview of data discovery, current data discovery tools, and how data discovery can complement BI.

Overview – The Need for data discovery

BI answers the "what" of information and gives us insight into what happened, but it does not answer "why" something happened, like "why did sales decrease" or "why did insurance claims increase". Data discovery is the technology that helps users answer the questions beginning with "why". Discovery, as you are aware, is the action or process of discovering or being discovered; data discovery is true to that meaning, an unscripted exploration or quest on the data to find the truth.

Data discovery tools and technologies are not replacements for traditional BI modules, but they help fill gaps that traditional BI cannot address. We are always aiming for the top of the knowledge hierarchy, using newer technologies and paradigms to make sense of the data. Data discovery is another step toward the ultimate objective of gaining wisdom; it brings us a little closer to the goal.

As a BI analyst, my first reaction to the need for such a tool was pessimism.

Why do I need a new tool when business users can use ad hoc query/reporting tools to answer their questions?

Ad hoc query answers only part of the problem of exploring data, and only when the data is structured; it does not work on the ever-growing unstructured data from which most businesses are unable to gain valuable insights.

To answer the "why", we need to follow new trails and identify new patterns and trends that are not immediately apparent with traditional BI. BI is highly focused, needs structured data to begin with, and requires data warehouse experts to maintain and streamline the process. Since business models change constantly, businesses cannot rely entirely on data warehouses to present every answer they need; they need exploratory tools and technologies that address their needs faster and in real time. Data discovery tools help organisations index and tap into the vast information sources available in the data warehouse, CRM, ERP, third-party application data, web data, documents and Excel files.

In my next post in the series, I will discuss specific use cases where data discovery tools might be useful, current data discovery tools, and future trends.

My experience with an Enterprise BI strategy change (Fri, 28 Dec 2012)
https://blogs.perficient.com/2012/12/27/my-experience-with-an-enterprise-bi-strategy-change/

I recently had the opportunity to work with a client facing some trouble with their overall BI vision: the data warehouse solution was not mature in the organization, and the company's new third-party applications for call center and CRM software brought integrated BI stacks with them.

The Problem

Because of the introduction of the third-party applications (call center and CRM), the enterprise-wide DW was not getting the traction it needed, and a lot of users preferred the analytics that came with these CRM tools. This meant the DW was not being used as the single source of truth for reporting and analysis.

The BI team's mission and vision was to drive all reporting needs from the DW. Unfortunately, with the introduction of Salesforce for CRM and some third-party applications that provided analytics capabilities of their own, the users of these systems were comfortable using the out-of-the-box solutions these applications offered.

It became increasingly difficult to drive operational reporting from the DW. Though the company adopted agile BI methodologies for the DW solution and was fairly quick to respond to changes, it could not cope with the ever-increasing demand for an integrated approach to the DW. Each department in the enterprise had its own BI technology stack and wanted faster delivery.

Speed vs Fragmentation

The major hurdle the company faced was that each department had its own BI stack, which created too many fragmented data sets. This in turn makes it very difficult to identify the system of record and to keep track of changes to the data in the different systems while bringing them into the DW.

It turns out that speed of delivery is inversely proportional to fragmentation of data sets: as speed of delivery decreases, data fragmentation in the organization increases.

Solution – Adapting the BI vision

They wanted to turn the DW into a consumption layer and increase usage through the analytics tool. After all the analysis was completed, the enterprise BI team came to the realization that the DW might never be the true consumption layer from which all analytics would be consumed. They decided to let the CRM and third-party applications handle all the operational reports and to push all the historical and cross-functional analytics to the DW. The enterprise BI team's focus thus narrowed to cross-functional reporting capabilities rather than day-to-day operational reporting. After all, the DW's strength lies in cross-functional analytics, not in serving day-to-day operational reports.

Conclusion

I had the notion that goals are to be met at any cost. But when change is the only thing that is constant, changing goals and visions is not a bad thing. The BI team adapted to the changing needs and realities of the organization while still delivering on the promises made and providing cross-functional data for broader enterprise-wide insights, all by aligning to a more realistic BI strategy.

The Need for better use of Algorithms in BI (Thu, 20 Dec 2012)
https://blogs.perficient.com/2012/12/20/the-need-for-better-use-of-algorithms-in-bi/

I recently came across a couple of articles on AllThingsD where two C-level executives put forth their points of view on what is more critical to business: more data or better algorithms. Since this is a BI blog, we might as a group be more inclined toward the more-data argument than the better-algorithms argument. I would like to put forth my views on why better algorithms would do a world of good for business.

It is easy to say we need both more data and better algorithms. But very few companies have the resources and the skill to manage and deliver on both counts, so businesses may have to choose between them.

I am summarizing both sides of the argument below, and will finally make my case for the better use of algorithms in BI.

The argument for more data

http://allthingsd.com/20120907/more-data-beats-better-algorithms-or-does-it/?refcat=voices

The argument put forth by BlueKai CEO Omar Tawakol is that disparate data points are not useful to any business, but when those disparate data points are connected and given context, the data becomes richer and more meaningful than ever.

He argues that the degree of connectedness of the data laid beneath the analytical layer determines the relevancy and correlations that can be obtained from it. Finally, he points out that an isolated behavior in the data, when evaluated and connected, can produce unexpected value.

The argument for better algorithms

http://allthingsd.com/20121128/better-algorithms-beat-more-data-and-heres-why/

The argument that Mark Torrance, CTO of Rocket Fuel, makes is that data may be rich and highly augmented, but it mostly represents static information at a point in time and is less useful without good algorithms to sift through it and find anomalies and business opportunities.

He believes that algorithms extend the usefulness of data assets and help create significant, measurable improvements that cannot be obtained from more data alone. He accepts that more data can give better insights, but only marginal gains compared to what better algorithms can deliver.

The Case for better use of algorithms in BI

In BI we mostly structure the data in a manner that lets the business answer its questions. But we have probably ignored the use of better algorithms to help the business gain useful insights.

There are several advantages that algorithms offer that more data simply cannot:

  • Algorithms can identify anomalies in the data faster and interrelate data to produce better results (see the sketch after this list).

  • Unless the data is highly connected, it is near impossible to identify causation and correlation in it. Algorithms can help a business identify better connections between disparate data points even if the data is not rich.

  • Algorithms are highly useful in predicting the behavior of events and help in taking action on that behavior.

  • Algorithms can react to changing business needs faster than data can; since data usually represents static points or events in time, it falls short in addressing the changing nature of business models.

  • With the high growth and volume of unstructured data in the business and consumer worlds, businesses would be better equipped if they focused on better algorithms rather than forcing structure onto the data.
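As a small, concrete example of the first point above, the sketch below flags anomalies in a daily sales metric using a simple z-score over a trailing window. It is a minimal illustration of my own (the metric, window and threshold are arbitrary); production anomaly detection would also account for seasonality, trend and more robust statistics.

```python
from statistics import mean, stdev

def find_anomalies(values, window=7, threshold=3.0):
    """Flag points more than `threshold` standard deviations away from
    the trailing-window mean -- a crude but fast anomaly check."""
    anomalies = []
    for i in range(window, len(values)):
        recent = values[i - window : i]
        mu, sigma = mean(recent), stdev(recent)
        if sigma > 0 and abs(values[i] - mu) / sigma > threshold:
            anomalies.append((i, values[i]))
    return anomalies

daily_sales = [100, 102, 98, 101, 99, 103, 97, 100, 240, 101]  # day 8 spikes
print(find_anomalies(daily_sales))  # -> [(8, 240)]
```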

When will businesses invest time, energy and resources in better algorithms?

I believe many businesses understand the need for better algorithms but lack the resources and the will to make full use of their data. There will come a point when they realize they have extracted all the insights there are from the structured data assembled in their data warehouses. That may compel them to look at the connectedness of the data and to use algorithms to gain performance and identify new business opportunities.

Mobile BI – Part 3 – Current Landscape & Key Product Offerings (Thu, 13 Dec 2012)
https://blogs.perficient.com/2012/12/13/mobile-bi-part-3-current-landscape-key-product-offerings/

This last post in the series discusses the mobile BI landscape and key product offerings from the major players.

There are two types of mobile BI players.

1. Major BI players

These are the big guns, who are typically slow in bringing their solutions to market. They are focused on tightly integrating their offerings with their existing BI capabilities.

2. Mobile BI vendors (integrators)

These new players are bringing consumer-focused products to market faster than the established players; mobile BI has given new startups an opportunity to take a stab at the market. The new players, of course, don't provide a complete BI platform but a publishing platform that can integrate with the other major BI solutions. These startups are integrators, and they are effective in providing a compelling reason to look at a new solution.

Let's look at what the big players are offering now.

Key Features

I have done my best to capture the most accurate feature sets of these different platforms.

SQL Server 2012 not included: As of now, Microsoft is still playing catch-up in terms of offering a true mobile BI solution. Though you can access some reports and dashboards in iOS Safari, performance and speed are usually poor, and there is no native iOS or Android app for SQL Server 2012 Business Intelligence. It is rumored that Microsoft is building a mobile BI solution for the different platforms.

New kids on the block

The opportunity was there for small startups to become true integrators for mobile BI solutions, and products like RoamBI and YellowFin have grabbed the opportunity with both hands.

RoamBI

I recently had the opportunity to evaluate RoamBI. In the process, I integrated an OBIEE server with RoamBI, which involved setting up security on the BI server so that VPN access was successful. Though limited in its capabilities during my evaluation, the product has grown rapidly and has struck a chord with enterprises. RoamBI is successful because it can integrate with any of the top BI vendors and offers interesting visualizations and a solid implementation.

YellowFin

This is another promising mobile BI solution. The company provides a solid offering and is known for its visualization and collaborative BI efforts.

http://www.youtube.com/watch?v=Kv13VX6ABuc&list=UUY9QucpwrnTweq90PZ4jKog&index=2

Native Apps and Supported Devices

Companies deploying a mobile BI solution need to be cognizant of the devices a vendor supports. The list below gives a gist of the supported devices for the various platforms. Although all the devices can support web-based solutions, native applications are recognized to be much faster and more reliable.

Best use case for the different device types (screen sizes)

Finally

The market is so dynamic at this time that product offerings and key differentiators change constantly. But most mobile BI products share almost the same vision for the future of mobile BI.

Mobile BI – Part 2 – Trends (Wed, 05 Dec 2012)
https://blogs.perficient.com/2012/12/05/mobile-bi-part-2-trends/

This is part 2 of the three-part series on mobile BI trends and implementation strategy. The last post discussed mobile implementation strategies for organizations; this one shifts focus to current trends in mobile BI. The next post will focus on the current landscape and key product capabilities.

The diagram below describes the hype cycle in business intelligence (2011). As you can see, mobile BI was at its peak of inflated expectations last year and is now going through the "trough of disillusionment". Mobile BI also has 2-5 years to go before mainstream adoption.

Trough of Disillusionment, as Gartner defines it: interest wanes as experiments and implementations fail to deliver. Producers of the technology shake out or fail. Investments continue only if the surviving providers improve their products to the satisfaction of early adopters.

Mobile BI investments are going strong, with successful implementations in multiple organizations. It is expected that by 2014, 33% of business intelligence will be consumed through mobile devices, and this adoption and utilization will not be limited to C-level executives.

There are four trends in particular I would like to discuss.

Insight to Action

As the initial period waned, there has been a clear shift from porting existing reports and dashboards to building actionable intelligence. Vendors have also started to develop specific domain-related activities and tasks. Actionable analytics will be further propelled by integrating existing CRM and SOA-based applications into the mobile BI stack.

Social Collaboration

Social collaboration facilitates information sharing by enabling you to share insight, discuss your data and use your collective knowledge to make better decisions with ease. The mobile BI platform will increase the sharing of insights and analytics. Social collaboration may not be the answer to every problem, but it does help in collectively deciding the best course of action for specific problems.

Integrated solution

MicroStrategy and other BI vendors have started to package their mobile BI solutions with their current BI offerings. We may see all the vendors adopting a similar approach, i.e. an integrated BI offering with both a mobile BI and a traditional BI solution. If an organization has already invested in a particular BI vendor, it may be a good idea to look into that vendor's mobile offerings, which could greatly reduce integration time and cost.

Increased real-time data needs

Growing real-time data needs for mobile BI are pushing the boundaries of data warehousing. As mobile BI usage increases, the need to implement real-time data flows in the organization increases. Mobile BI will ultimately drive data warehousing experts to look into real-time data needs if their data warehouse does not currently perform real-time data processing.

Conclusion: Mobile BI is a means to increase an organization's uptake of BI, to improve visibility and to enhance productivity. These trends are evolutionary in nature, a natural progression of where business intelligence needs to be. Mobile BI adoption done right will lead to better ROI, and being aware of these trends gives companies a competitive advantage.
