Perficient Enterprise Information Solutions Blog


Archive for June, 2012

Where is Oracle BI with Mobility?

It’s safe to concede that the “appification” of the mobile space (thanks to Apple’s world of apps!) is driving trends and innovations in the BI marketplace.  Per a Gartner survey of 1,364 organizations that use BI tools, here is a summary of the key Mobile BI statistics.  These numbers have the potential to dramatically change the experience of every existing BI consumer, and to diversify existing business models by creating organizations with highly informed, interested and empowered BI users.

  1. 8% are actively using Mobile BI
  2. 13% are currently running Mobile BI pilots
  3. 33% plan to deploy Mobile BI within 12 months

The trends increasingly determining where the Mobile BI market is heading are:

  1. Native BI Apps vs. Multi-purpose Apps
  2. Consumption vs. Creation
  3. Mobile Touch UI vs. Desktop Point-and-Click UI
  4. Outside Firewall vs. VPN Tunnel
  5. Free vs. Licensed

Available Solutions for Oracle BI:

  1. Device Native Browser (e.g., Safari for the iPad)
  2. Build Mobile Website for OBIEE
  3. Native App-Oracle
  4. Native App-3rd Party (Roambi, SurfBI)
  5. Native App-Custom

With the release of OBIEE 11.1.1.6, options 1, 3 and 4 have emerged as the primary solutions for Mobile BI on the Oracle platform.

1) Device Native Browser

What Works

  • Table/Pivot Table views
  • Hierarchical columns
  • Column/View selectors
  • BIP reports
  • Map views
  • Print to PDF
  • Master-Detail linking
  • Action Framework
  • Drop-down prompts
  • Scrolling

What DOES NOT Work

  • Objects rendered by Flash (iPad Safari does not support Flash)
  • Objects that require a hover
  • Objects that require a right click
  • Exporting
  • Pivoting on dashboards

With the latest release of the Oracle BI Mobile app, 11.1.1.6.2 BP1, you can:

  • Easily connect to an OBIEE environment
  • View reports, dashboards, alerts and scorecards
  • Display charts with no special configuration required
  • Hover over charts and perform chart drill-downs
  • Leverage the Touch UI for dashboard prompts
  • View content formatted to fit the iPad (iOS support only – iPad and iPhone)
  • Apply additional customization to a dashboard from an iPad

Thanks to Roambi and SurfBI, there has been a ton of innovation around providing richer “views” on the iPad to enhance the overall Mobile BI experience.  Oracle takes advantage of this innovation by allowing integration with these 3rd-party products.

I will provide an overview of Roambi and SurfBI in my next blog.


More on SAP HANA vs. Oracle Exalytics

Clearly, there is a lot of hype surrounding SAP HANA and Oracle Exalytics.  Both are in-memory products, and although Exalytics is somewhat newer, HANA has only been generally available for about a year itself.  So we don’t know what either might eventually become, or what newer releases might later bring, and neither has a track record long enough to take its vendor’s claims at face value.

Both products have shown the ability to speed up analytical queries.  But for transactional applications, there are some important distinctions that would seem to give the edge to HANA.  In their recently released internal benchmarks, SAP has shown some impressive numbers (though unverified) with regard to HANA’s scalability. Per the details released to InformationWeek, SAP claims that “100 billion records of relational data, representing five years’ worth of SAP sales and distribution data, were loaded into HANA, which compressed the original 100 terabytes of disk data down to 3.8 terabytes of RAM. The Hana system was running on a commercially available IBM X5 16-node cluster, which is four times as large as the platform used for SAP’s previous HANA scalability test”.  That works out to roughly a 26:1 compression ratio.  For purposes of comparison, it would be very useful to have similar benchmarks from Oracle.

SAP has said that HANA is designed to eventually run and do all processing with just one database (although, this has not been demonstrated so far).  But, Oracle’s solution clearly requires multiple databases. “Where SAP says a single HANA database will eventually run both BW and core transactional applications, Oracle Exalytics is an add-on product (for all but the smallest data marts with less than a terabyte of data). That means an SAP customer running Oracle will still need one database license for the transactional database, a separate license for the data warehouse database, and a third database license (either TimesTen, Essbase, or, in some cases, both) for Oracle Exalytics.”

The goal, Oracle says, is to get more out of existing databases, not to replace them.

Oracle does offer the benefit of being able to execute your existing applications at a higher level of performance, without any modifications.  Both companies have said that they will be creating new apps, although HANA is further along on this path. In the world of Exalytics, transactional data is copied from the application layer to the data warehouse, and yet again to Exalytics for executing queries. That implies three layers of servers, storage, integration, and administration.

“Not only do you end up with three copies of the data, you have to add the latency required to move data from one copy to the other,” says Gartner analyst Don Feinberg. “The real promise of HANA is that when the transaction is done, the data is instantly in the data warehouse and your analytics become real time.”

The contrast is that Oracle moves the data to the appropriate database type and applies its expertise and proven technology for that particular type of processing, while HANA brings the processing to where the data already is and performs whatever type of processing is needed right there.  Both approaches will work; they simply work in different ways.  SAP believes that, with less movement of the data, it can perform these functions in a more instantaneous, real-time fashion.

I would also like to take the opportunity to clarify some of the misconceptions about HANA.  A key capability is HANA’s ability to optimize for reads or writes, depending on what is most critical for processing.  And there are also business functions built into the database.

It supports unstructured data analysis, with both in-database columnar text processing and text-analysis functionality.

It supports SQL and MDX, and can execute queries in parallel and at high volume.
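To make the SQL support concrete, here is a minimal sketch of querying HANA from Python using SAP’s hdbcli client.  The host, port, credentials, and table are hypothetical placeholders, and the client library’s availability depends on your HANA installation.

```python
from hdbcli import dbapi  # SAP's Python client for HANA

# Hypothetical connection details; 30015 is a conventional HANA SQL port.
conn = dbapi.connect(address="hana-host", port=30015,
                     user="BI_USER", password="secret")
cursor = conn.cursor()

# An ordinary analytical query; HANA parallelizes execution internally.
cursor.execute(
    'SELECT region, SUM(revenue) FROM "SALES"."ORDERS" GROUP BY region'
)
for region, total in cursor.fetchall():
    print(region, total)

conn.close()
```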

My previous blogs on this topic have led to some lively debate on the competitive differences.  David Hull, for example, has provided commentary that he believes refutes many of the claims I cited in the previous blog on the Oracle point of view, and Puneet Suppal has added some perspective as well.  It is not my intention to pick a winner here, but to bring some of the latest information to light to provoke thought and discussion.  Like any technology, the best way to settle this is to test the two products side by side, using your own benchmarks and criteria.

This will continue to be an ongoing discussion, and I look forward to more of your feedback.


“What is MDM?” – Video Overviews

Today we are going to look at a widely used acronym in business intelligence: MDM, or Master Data Management. Sure, you have heard it before, but it is often tossed in with a series of other acronyms without much context. So what is MDM in a practical context?

Read the rest of this post »

Oracle’s Perspective on Exalytics vs. SAP HANA

There will naturally be judgments and comparisons made between Oracle’s Exalytics and SAP HANA. This blog presents Oracle’s claimed advantages for their Exalytics solution.  Keep in mind that the points put forth here are purely from Oracle’s perspective; I will be producing a follow-up blog that takes a more critical look at this comparison.  But for now, let’s take a look at Oracle’s case:

According to Oracle, TimesTen (their in-memory database) is significantly better than SAP HANA, based on its comparative strengths in the following features:

  • In-Memory Data Caching: TimesTen & SAP HANA
  • In-Memory Columnar Storage: TimesTen & SAP HANA
  • In-Memory Row & Column Compression: TimesTen only (SAP HANA has column compression only)
  • In-Memory Indexes: TimesTen only
  • In-Memory Query Optimizer (Predictability): TimesTen only
  • In-Memory NUMA Support (Scale Up): TimesTen only
  • In-Memory Parallel Query (Scale Out): TimesTen only
  • In-Memory Aggregates & Result Sets: TimesTen only
  • In-Memory Analytic Functions: TimesTen only
  • In-Memory Unstructured Data: TimesTen only
  • High-Performance Writes/Updates: TimesTen only
  • Data Persistence on Disk: TimesTen & SAP HANA
  • Transactional Integrity/Correctness: TimesTen (SAP HANA unknown)
  • Multi-Version Concurrency: TimesTen (SAP HANA unknown)

 

Oracle also claims that, compared to Exalytics, SAP HANA has the following limitations:

Operational Reporting

  • Limited data sources with Sybase Replication Server
  • Limited support of 3rd normal form within Business Objects

Data Mart

  • No Parallel Query (scale-out) or NUMA (scale-up) support; limited and non-standard SQL

Data Warehouse

  • Theoretically possible, but practically far too expensive above 2-4 TB

Multi-dimensional OLAP

  • Limited write performance when updating aggregates, due to compressed in-memory columnar storage

Planning & Budgeting

  • Layers of aggregates in SAP BW impact planning on BW-on-HANA; limited write performance with columnar storage

Unstructured Discovery

  • No unstructured data support in HANA; no discovery capabilities across unstructured and structured data

Packaged Apps & BI Tools

  • All packaged Oracle Analytic Applications, packaged EPM Applications, and any BI tool work with Exalytics; HANA only works with SAP tools

 

In addition to the above, Oracle also says that Exalytics is less expensive than SAP HANA, as follows:

Their claim is that their In-Memory Data Mart, with 512 GB of compressed data and 1 TB of memory, would cost $825,000, while the equivalent capabilities of an SAP HANA and IBM hardware solution would cost $4,100,000.

Oracle also says that their package of In-Memory Analytics for an Enterprise DW, with 20 TB of compressed data and 40 TB of memory, including 1 Exalytics + 1 Exadata, would have a total cost of $2,500,000.

These same capabilities using SAP HANA and IBM hardware, meanwhile, would require 40 Size L servers to hold all data in-memory, resulting in a total cost of $126,500,000.
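Taken at face value, those quoted figures imply a very large per-terabyte gap.  Here is a minimal sketch that simply restates Oracle’s numbers as cost per compressed terabyte; the figures are Oracle’s claims, not independently verified.

```python
# Cost per compressed terabyte, using only the figures quoted above.
configs = [
    ("Exalytics in-memory data mart",     825_000,     0.5),   # 512 GB ~= 0.5 TB
    ("SAP HANA + IBM data mart",          4_100_000,   0.5),
    ("Exalytics + Exadata enterprise DW", 2_500_000,   20.0),  # 20 TB compressed
    ("SAP HANA + IBM DW (40 L servers)",  126_500_000, 20.0),
]

for name, cost, terabytes in configs:
    print(f"{name}: ${cost / terabytes:,.0f} per compressed TB")
```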

Oracle also states that their maintenance and upgrade fees for the above would be substantially lower.

These are the key points of Oracle’s case in this comparison.  My follow-up blog will discuss how this viewpoint holds up.



And now, it’s “In-Memory” with Oracle Exalytics

The Information Management world is abuzz with talk of Oracle’s TimesTen-based “engineered system”: Exalytics! It is a high-performance, in-memory appliance that delivers fast, actionable, and insightful interactive analytics by integrating optimized hardware and software components into a complete analytical solution – combining modeling, planning and reporting all on a single box!

So, what does that mean?  Let’s get a little under the hood and check out Exalytics’ foundational building blocks.

The Exalytics Eco-system

  • Runs the BI layer on a multi-core 1 TB server
  • An in-memory cache (Oracle TimesTen 11.2.2.2) is used to accelerate the BI part of the stack

Key Hardware Components:

  • Sun Fire X4470 M2 Server
  • 1 TB RAM, 40 cores (four Intel Xeon® E7-4800 series processors), 3.6 TB HDD

Key Networking Components:

  • InfiniBand: Two quad-data-rate (QDR) 40 Gb/s InfiniBand ports are available with each machine, expressly for Oracle Exadata connectivity. When connected to Exadata, Exalytics becomes an integral part of the Exadata private InfiniBand network and has high-speed, low-latency access to the database servers. When multiple Exalytics machines are clustered together, the InfiniBand fabric also serves as the high-speed cluster interconnect.
  • 10 Gb Ethernet: Two 10 Gb/s Ethernet ports are available for connecting to enterprise data sources and for client access.
  • 1 Gb Ethernet: Four 1 Gb/s Ethernet ports are available for client access.
  • Dedicated Integrated Lights Out Management (ILOM): Ethernet port for remote management.

Key Software Elements:

  • Optimized OBIEE 11.1.1.6
  • Oracle Essbase 11.1.2 with enhancements
  • Oracle TimesTen for Exalytics (supports columnar compression)
  • Runs on 64-bit Oracle Linux
  • OBIEE and Essbase are licensed as Oracle BI Foundation and their Exalytics-specific functionality and features can only be used in conjunction with the Exalytics Hardware.

A typical Exalytics Prototype Environment

Exalytics Architecture: Developer tools are used for defining and maintaining aggregate definitions, client tools for OBIEE reporting, TimesTen and Oracle DB; the Exalytics layer contains OBIEE, TimesTen and Essbase and the DB tier contains FMW DB repository, usage tracking and summary stats.

In addition to an Exalytics machine and an external DB, the following components go on a Developer Workstation: Oracle BI Admin Tool, Summary Advisor Wizard, TimesTen Client, SQL Developer, Oracle DB 11g Client.

Exalytics includes two in-memory analytics engines: Oracle TimesTen In-Memory Database for Exalytics, and Oracle Essbase with in-memory optimizations for Exalytics. These two data management engines are leveraged through the following four techniques to deliver high-performance in-memory analytics and query optimization:

  • In-memory Data Replication
  • In-memory Adaptive Data Mart
  • In-memory Intelligent Result Cache
  • In-memory Cubes

Some of these are Oracle’s genuine “secret sauce” (new algorithms and functionality) and others are simply descriptions of standard existing BI/DW strategies extended to take advantage of the in-memory concepts.
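Since TimesTen speaks standard ODBC, client tools can query the in-memory layer directly.  Here is a minimal sketch using Python’s pyodbc; the DSN name, credentials, and the aggregate table are hypothetical, and an installed TimesTen ODBC client is assumed.

```python
import pyodbc

# "exalytics_tt" is a hypothetical client DSN pointing at the TimesTen
# instance on the Exalytics machine.
conn = pyodbc.connect("DSN=exalytics_tt;UID=bi_user;PWD=secret")
cursor = conn.cursor()

# Query a hypothetical in-memory aggregate table of the kind the
# Summary Advisor recommends building in TimesTen.
cursor.execute("SELECT region, SUM(revenue) FROM sales_agg GROUP BY region")
for region, revenue in cursor.fetchall():
    print(region, revenue)

conn.close()
```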

My next blog will make comparisons between SAP HANA and Exalytics.


Financial Services Sees Big Value In Big Data: Top 10 Trends

SunGard has identified ten primary trends that have been shaping the financial services industry’s use of big data in 2012. These trends cover wide-ranging drivers such as predictive analytics, compliance, mobile and globalization. To accompany the list, Neil Palmer and Michael Versace (global research director at IDC Financial Insights) discuss these trends in more detail via webcast. Below is SunGard’s list of the key 2012 trends that are shaping big data:

Read the rest of this post »

Rational BI Implementation

As the number of organizations that understand the value of business intelligence grows, the need to adopt a BI strategy and governance of BI tools and methods grows along with it.

In a recent blog, I referenced the tremendous impact of mobile devices, and the tablet in particular, on the BI space.  Having immediate access to dashboards, analytics, and other critical information can be a big advantage in dealing with problems before they can cause major damage.  However, this benefit does not override the need to still analyze the best platform for a given BI solution.

Similarly, I spoke of the benefits of using the tablet for social collaboration.  But more important than whatever tool or technology is used is the promulgation of a collaborative culture within the organization, which will best inculcate the attitudes of sharing and openness to discussion and debate that cooperative efforts need in order to succeed.

One factor that can undermine these efforts is the trend away from centralized, IT-generated solutions and control toward more decentralized and independent solutions arrived at within each end-user department.  For a department or group to have the autonomy to select its own tools and support to meet its unique needs can obviously have a lot of appeal.  But this autonomous approach can suffer if there is a lack of BI expertise and seasoned judgment within the group in choosing and implementing the best tools and solutions.  And, from an organizational standpoint, it can lead to a proliferation of redundant tools as each group goes its own way.  A continuing collaboration with IT and others can head off some of these issues.

The temptation and promise of predictive analytics can lure an organization to give short shrift to the foundational first step of fully understanding what has gone on in the past.  It is important to identify from real world historical data what conditions led to certain outcomes, whether good or bad.  And, of course, even when properly used, predictive analysis can only show a probable outcome, not a certain one.

Ideally, organizations will want to find ways to leverage their BI solutions, not just for the intelligence or even just for better decision making, but also to make those solutions part of their actual business execution.  Streams of real-time data can be analyzed and compared to predetermined standards based on historical data.  Any deviations or abnormalities can be instantly reported, and the appropriate actions taken, as the sketch below illustrates.
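A toy sketch of that pattern: incoming metrics are checked against a band derived from historical data, and deviations are flagged.  The threshold and the metric values are made-up illustrations.

```python
# Compare a stream of readings against a historically derived norm and
# flag anything outside the tolerance band.
HISTORICAL_MEAN = 120.0   # e.g., average hourly order volume from past data
TOLERANCE = 0.25          # flag anything more than 25% off the norm

def check(value):
    deviation = abs(value - HISTORICAL_MEAN) / HISTORICAL_MEAN
    if deviation > TOLERANCE:
        print(f"ALERT: {value} deviates {deviation:.0%} from the historical norm")

for reading in [118, 131, 178, 64, 122]:   # stand-in for a real-time feed
    check(reading)
```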

To execute an effective BI strategy and implementation, it is necessary to create a foundation that can be extended to meet the growing business needs of the organization on a continuing basis.  It is in this way, rather than through an ad hoc or piecemeal approach, that the organizational BI challenges can be successfully met.  This requires a holistic end-to-end information management solution that will leverage leading functionality, processes, and best practices, using the best BI products currently available.

It is important to establish priorities in terms of reporting and dashboarding needs.  Then a reference architecture, data and organizational governance, BI programs and analytics, and rationalized requirements gathering that engages super-users on a regular basis will all serve to create the foundation for the continued building of newer BI applications based on true business needs.  The result should be an ongoing loop between strategy, methodology and sustained success.


Perspectives on BI Mobility

iPads (and other tablets) have become ubiquitous in our world today.  Executives and other leaders can be seen carrying them to meetings, using them in their offices, displaying nifty data revelations to critical customers, or even keeping tabs on key company performance metrics while on a fancy vacation!  The iPad started out as a nice complement to the desktop or laptop, but for increasing numbers of users it is becoming their primary, or even only, device, even at work.  Tablets have come into common use as a tool for presentations and demonstrations, and have proven very effective for social collaboration with colleagues and customers.

The greater use of tablets has spurred further software development for the iPad, thereby increasing the usage and confidence in relying on the tablet alone.   For example, there are news and information services like Pulse and Flipboard, among others, that allow executives to stay current on industry and competitor activities and developments.  There are also apps that permit the user to set customized filters and perform targeted searches on the latest news in business, technology, industry trends, or whatever else they choose to focus on.

The smartphone, of course, will also continue to grow (especially globally) as the other non-traditional mobile device that is augmenting or supplanting the laptop in many quarters.  Its prevalence and advancing capabilities parallel the progression of the tablet, and near convergence of the two someday can’t be ruled out.  (Neither can a smaller, more mobile laptop/notebook, for that matter.)   With a bigger and better display for browsing, email, social collaboration, greater “cloud-enabled” memory, and a more extensive library of intuitive apps, the tablet is being increasingly favored by many executives, but the progress in smartphones is really lessening the capability differences between the two.  (Of course, the raw computing power of a laptop cannot be completely replaced, either.)

Security is always an issue, whichever device you’re talking about, and storing sensitive data on the hard drive of a desktop, laptop, iPad, or smartphone, or even a thumb drive that you carry around is always a risk.  That threat will have to be weighed against the inconvenience of retrieving information each time it is needed from a server or the cloud.

But, let’s take a look at tablets with regard to BI specifically.  BI visionaries and leaders are looking to their iPads for capabilities in monitoring their businesses, including getting all of the relevant metrics for analysis and decision making.  BI Vendors and consultants must address this reality and be prepared to respond to it (while accepting that some others will still prefer their data on a laptop, or even in hard copy).

One concern has been that specific apps are often needed for BI software to function on tablets and smartphones.  A major step toward a solution will be the greater use of HTML5.  This will allow BI web applications to run on mobile devices without the requirement of an installed app on the device.  For analytics and other types of BI analysis, this will be especially helpful on tablets and will accelerate their use.

But even now, statistics show that BI has, in fact, been labeled a driving force for iPad utilization (not just the other way around).  The improved visual impact of standard reporting, easy access to snazzy data-exploration scenarios, the ability to slice and dice critical data via deep-dive analytics on the fly, and the greater ease-of-use and “fun” factors of the iPad have made many leaders more open to recognizing the true value of mobile BI.  So it is really a synergistic situation:  iPads drive more BI opportunities, and BI increases the attraction of the iPad!  Either way, it is a win-win for all, and will lead to ever greater acceptance and use of BI.


EzAPI for SSIS 2012

EzAPI is a set of classes for programmatically creating and manipulating your SSIS packages.  This makes building those dynamic, metadata-driven packages a walk in the park. Or maybe the foothills.  Check it out on Matt Masson’s blog here.  Now for 2012!
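EzAPI itself is a .NET library built on the SSIS object model, so real usage is C# code.  Purely as an illustration of the “manipulate packages programmatically” idea, here is a hedged Python sketch that walks a .dtsx file (SSIS packages are XML in the DTS namespace) and renames executables; the file name is hypothetical, and for production work you would use EzAPI or the managed SSIS API instead.

```python
import xml.etree.ElementTree as ET

# SSIS .dtsx packages are XML; the package and its tasks live in the
# DTS namespace.
DTS = "www.microsoft.com/SqlServer/Dts"
ET.register_namespace("DTS", DTS)

tree = ET.parse("LoadWarehouse.dtsx")  # hypothetical package file
root = tree.getroot()

# List every executable (package, tasks, containers) by its DTS:ObjectName
# attribute, and prefix the names for a hypothetical cloned dev environment.
for node in root.iter("{%s}Executable" % DTS):
    name = node.get("{%s}ObjectName" % DTS)
    if name:
        print("Found executable:", name)
        node.set("{%s}ObjectName" % DTS, "DEV_" + name)

tree.write("LoadWarehouse_dev.dtsx", xml_declaration=True, encoding="utf-8")
```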

Successful Strategic Planning for Big Data

Prior to Perficient, I worked at HP/Knightsbridge Consulting, where I helped create the Knightsbridge/HP BI/MDM methodology. One of my many colleagues on that large endeavor was Mike Mansur, who is now the Worldwide Competency Lead for HP’s Global Methods for EIS. Mike recently offered his views on the importance of a comprehensive BI strategy in an interview recorded on the TDWI.org website.

Companies today have often fallen short of the ROI they expected from BI, due in large measure to their inability to put together an effective information strategy. This lack of a BI strategy has resulted in siloed, department-level BI initiatives, instead of BI being driven, as it should be, by effective business participation.

This failure is not all due to recent developments. In fact, organizations have had challenges with information and BI for years, including incomplete data, master data inconsistency, poor data governance, and more. Organizations had traditionally simply chosen not to put in the requisite time, money and effort needed to address these problems.

Big data has brought these issues to the fore, where they can no longer be ignored. The BI approaches of prior times have proven insufficient to provide “the speed and agility required to integrate the various data types we are dealing with today, analyze data in real time, and generate the intelligence required by today’s fast-paced, rapidly changing business environment.”

There is a level of complexity to the business and IT challenges that big data presents that requires mastery of multiple competencies and technologies. But the first challenge is to define priorities, based on a sound BI strategy. The business value must be identified, along with an understanding of the cost of implementation, to build a successful BI roadmap.

All key stakeholders need to be engaged, and ownership needs to be shared by both business and IT. The big data social media component, for example, requires marketing’s input on how to leverage the information. Both short- and long-term value can only be delivered through a scalable and adaptable architecture.

This focus generates pressure on IT to keep up with the technology alternatives to meet the demands of increasing data volumes, variety, and complexity. The pressure on the business side is to define the value justification for the needed investments.

There is increased competitive pressure, as well, because companies that are not able to extract customer information and insights from the many social media sources available today will lose out to competitors who do.

The other challenges are integrating this data with traditional data types, providing context to the data, and feeding insight back into business processes, where it really starts to make a difference.

Mansur offers the following guidelines for organizations to achieve a comprehensive and business value-driven business intelligence program:

A) Connect and exploit previously untapped or inaccessible information.

B) Realize greater return on IT investments by realigning and leveraging siloed, uncoordinated BI activities.

C) Increase business agility with a cohesive, aligned BI strategy, to be better positioned to adapt to changing business needs, customer demands, and capitalize on emerging market opportunities.

D) Institute a business-driven vision for BI; incorporate a thorough business vision for BI rather than only IT’s perspective of what the business wants.

E) Ensure that the previous disconnects between the business and IT that hampered success are not repeated.

For more detailed information on all of the above, also visit the TDWI.org website.
