J-P Contreras, Author at Perficient Blogs (Expert Digital Insights)
https://blogs.perficient.com/author/jpcontreras/

3 Trends for Data & Analytics in 2016 (Fri, 12 Feb 2016)
https://blogs.perficient.com/2016/02/12/3-trends-for-data-analytics-in-2016/

Data, data, data! IBM estimates that 90% of the data in the world today has been created in the last two years. When we use our computers, tablets, and smartphones, immense volumes of data are generated. Companies are tapping into this data as a potential source of competitive advantage. Once the data is harnessed, predictive analytics provides a means to create new insights so that decision makers throughout the organization can make more informed judgments. As data becomes more accessible, several trends will transform the way organizations use data and analytics:

  1. New Generation of Analytics Tools

Traditionally, one of the first questions asked when undertaking an analytics project has been what kind of data is being analyzed: structured or unstructured? Conventional organizational data is of the structured variety and lives in databases supporting ERP, CRM, and similar systems. However, with the rapid creation of new data, an estimated 80% of data is now unstructured. Examples of unstructured data include social media posts (tweets, blog posts, comments, etc.), human-created notes, images, and open-ended responses to survey questions. Today's trends indicate that structured and unstructured data are increasingly being presented together in the same analysis. The dilemma is that legacy tools often can't easily analyze the two together, which has driven demand for more powerful analytics tools.

  2. Time Required to Deliver Analytics Value Will Decrease

The days of 60-90-day projects to show value are over. The timeline to demonstrate analytics value will keep compressing: thanks to technical advances, analytics projects increasingly take 30-45 days, and before long we will be talking in terms of days rather than weeks.

  3. Larger Computing Trends Will Manifest in Analytics

Apple’s Siri is an example of an early-stage natural language processing application. Analytics tools such as IBM’s Watson technology platform use natural language processing and machine learning to reveal insights from large amounts of unstructured data. As computing power grows, the ability to run learning algorithms in near real time will surface in more and more applications. Eric Schmidt, Google’s executive chairman, has said that the algorithms have always been there; it is the computing power that has caught up. I couldn’t agree more.

How widely is predictive analytics being used? The short answer: across all organizational sectors, because of its potential to change competitive dynamics. Some common uses of predictive analytics include:

  • Fraud detection and cybersecurity. Health insurers, insurance companies and credit card companies all use predictive analytics to quickly identify abnormalities.
  • Managing operations. Airlines determine how many tickets at particular price points to sell for specific flights. Hotels try to maximize daily occupancy rates by adjusting prices.  Credit scoring organizations try to assess the likelihood of default.
  • Marketing. Companies are using predictive analytics to attract and retain the most profitable customer segments.
  • Sports. Teams are using predictive analytics to maximize revenue, to scout players and even to make game time decisions.

These examples are just a few of the ways predictive analytics is being used. Manufacturing companies, retailers, media and entertainment companies, public sector organizations, and utilities, among others, are all using predictive analytics to increase revenues and improve performance within their organizations.
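To make the fraud-detection use case above concrete, here is a deliberately tiny sketch of the core predictive idea: score new observations against historical data and flag the outliers. Real fraud models use far richer features and algorithms; the function name, data, and threshold here are all illustrative.

```python
# Toy illustration of predictive anomaly flagging: a z-score test of
# candidate transaction amounts against historical amounts. This is a
# teaching sketch, not a production fraud-detection technique.

def zscore_flags(history, candidates, threshold=3.0):
    """Flag candidate amounts that deviate strongly from historical ones."""
    n = len(history)
    mean = sum(history) / n
    var = sum((x - mean) ** 2 for x in history) / n
    std = var ** 0.5
    return [abs(x - mean) / std > threshold for x in candidates]

history = [25.0, 40.0, 31.0, 28.0, 35.0, 30.0, 27.0, 33.0]
print(zscore_flags(history, [32.0, 500.0]))  # → [False, True]
```

A $32 charge looks like the customer's history and passes; a $500 charge is over a hundred standard deviations away and gets flagged, which is the "quickly identify abnormalities" pattern in miniature.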

————————–

If you’re in the Chicago area and interested in continuing this discussion, you can join J-P Contreras in DePaul University’s Business Analytics Certificate Program where he will introduce you to the field of Big Data and Analytics. You will gain insights into what is possible with analytics, how you can benefit and how you can take advantage of the data that is now available to you. The next program begins Friday, February 26.  For more information, go to http://cpe.depaul.edu/businessanalytics.

Thanks Godfrey Sullivan, First Ballot Hall of Famer (Fri, 20 Nov 2015)
https://blogs.perficient.com/2015/11/20/thanks-godfrey-sullivan-first-ballot-hall-of-famer/

“Godfrey, I think history is going to judge you as one of the truly iconic Silicon Valley CEOs.” –Greg McDowell, JMP Securities Analyst (11/19/2015)

Alongside Splunk’s Q3 earnings release came the announcement that Godfrey Sullivan would be handing the CEO reins to Doug Merritt.

I don’t know Silicon Valley history well enough to confirm or deny the statement above, but I would offer my own twist and rewrite Mr. McDowell’s statement:

Godfrey, I think history is going to judge you as one of the truly iconic Analytics CEOs.

  • Godfrey built Hyperion Solutions and sold it to Oracle in 2007
  • He was on the board at Informatica for 5 years
  • He has been on the board of Citrix for 10 years
  • He joined Splunk in 2008, took the company public, and grew it from $40 million in revenue to a $600 million run rate and an $8 billion market capitalization.

Godfrey has created value for shareholders, customers, employees and partners using a revolutionary way to get customers to use and value software from Splunk.

When people ask me why I am excited about Splunk, I mention the fundamentally different technology built on the schema-on-read paradigm, and I talk about the value customers can get.  I also talk about Godfrey.  Proven, Fun, Visionary… he is certainly a reason I have been so excited about Splunk, its culture and what it can be.

There are a variety of ways history has recognized the contributions of people. If Godfrey were a baseball player, he would be a shoo-in for the Hall of Fame. If there were a Mount Rushmore for analytics, he would be on it.

The good news about this inevitable transition, as confirmed on the Q3 earnings call, is that it was a calculated plan: Godfrey essentially hand-picked his successor and trained him in “the Godfrey way”. So, like the rest of his track record, Godfrey goes out the right way too. The company couldn’t be better positioned for the future. We look forward to the next phase of the journey.

Cyber Security Awareness Month brings plenty of awareness (Wed, 07 Oct 2015)
https://blogs.perficient.com/2015/10/07/cyber-security-awareness-month-brings-plenty-of-awareness/

October 1st marked the beginning of National Cyber Security Awareness Month in the United States. Three days into the month, awareness is exactly what we have.

Last week, Experian and T-Mobile announced a significant data breach. What’s worse is that the hackers were able to maintain access for over two years. T-Mobile CEO John Legere, never one to mince words, said he is “incredibly angry” about the breach of 15 million customer records and will conduct a thorough review of T-Mobile’s relationship with Experian.

While it is difficult to learn that the breach occurred, what’s worse is that it went on for over two years. How many breaches are still ongoing in the world that we don’t know about?

The very next day, Scottrade disclosed that it was the victim of a data breach exposing data for over 4 million customers during late 2013 and early 2014. Interestingly, in this case Scottrade didn’t even know the breach had occurred; the FBI came in to tell them.

When hackers look to steal this data, they are not interested in the easy stuff. They go after encrypted Social Security numbers and other personal information, and based on the Experian report, they got what they were after.

Based on recent events, we can conclude that many information systems are not prepared for what is happening in this day and age. What is your company doing about security and threat detection?

Our certified team of Splunk experts is actively working with some of the world’s largest companies to implement Splunk’s Enterprise Security application. This application, coupled with the Splunk platform, was recently named a leader in Gartner’s Magic Quadrant. If you have not yet considered the Splunk Enterprise Security application but are ready to do so, please contact us.

IBM Watson Analytics: The Time is Now (Mon, 20 Jul 2015)
https://blogs.perficient.com/2015/07/20/ibm-watson-analytics-the-time-is-now/

In mid-May, at the IBM Vision conference, IBM announced a deal for a free year of IBM Watson Analytics licenses for existing Cognos BI and TM1 customers. If you missed that offer, IBM has recently introduced another offer for free licenses:

  • Watson Analytics Professional Edition
  • Up to 60 users
  • 6 months of use
  • Offer available through 10/31/2015

Professional:  http://www.ibm.com/web/portal/analytics/analyticszone/wagoproforfree

If you are an existing Cognos customer, now is the time to start learning Watson Analytics.

Perficient has been working with Watson Analytics since it was first released in beta near the end of 2014. Since our team has worked with Cognos for 15 years, and more recently with SPSS and predictive analytics, we feel well positioned to help customers capitalize on this new technology.

Throughout the year, we have expanded our use of the Watson Analytics solution and have recently started engaging with customers to explore Watson Analytics at their organizations.  Most recently we are partnering with a Healthcare provider to create Watson Analytics dashboards to analyze Medicare claim data and enhance patient readmission models previously built on SPSS.

[Watson Analytics screenshot]

If you’re an existing Cognos BI or TM1 customer, the first question is: how do we engage? The answer: quickly.

It starts with a use case and data. Since Watson Analytics runs in the cloud, we can get started quickly with data.  Some questions to ask yourself:

  • What do you wish you could do with Cognos today and can’t?
  • How could you enrich existing dashboards and data sets with predictive KPIs?
  • Is there value in moving to a Cloud model for Analytics?

At this point, early in Watson Analytics’ life, we are not advocating it as a wholesale replacement; we are advocating an enhancement to your current approach. Getting data into Watson Analytics and exploring what it can do for you is time well spent.

As the Watson Analytics product matures over the next 9-12 months, Perficient will be on the forefront working with customers to absorb and extract value out of this technology.

If you’d like to get started, or to learn what other companies are doing with the Watson brand of technologies, please reach out using the form below.

Insights From Reliable Asset World (Fri, 05 Jun 2015)
https://blogs.perficient.com/2015/06/05/perficient-at-reliable-asset-world-conference/

Perficient was a Gold Sponsor of this year’s conference, which brought together experts in asset maintenance related processes, training, tools and machinery.

The keynote speaker at the conference was Perficient’s David Reiber. As the former owner of General Motors’ global Maximo deployment, David knows a thing (or a thousand) about asset maintenance and reliability.

[Photos from the Reliable Asset World conference]

What I found in speaking with attendees, and know from working with David, is that there has not been investment in harnessing the machine data already being gathered to predict failure in a meaningful, cross-process way. Don’t get me wrong; it is being done by very smart people, but with narrow views of the data: nothing that spans process steps and captures data from the entire manufacturing process.

As experts in helping companies harness data and unlock its true value, Perficient is working in a variety of ways to help companies harness machine and asset data.

As a leading IBM Maximo, IBM Integration, and IBM Predictive Analytics partner, we’re excited to partner on the IBM Predictive Maintenance and Quality product. This product can change the game by addressing organizations’ asset reliability needs in meaningful ways.

Please contact us to learn how to get started with this cloud or on-premises solution. The opportunities for customers are measurable, meaningful, and achievable. We’d like to help.

 

Big Data and the Skies (Wed, 22 Apr 2015)
https://blogs.perficient.com/2015/04/22/big-data-and-the-skies/

Earlier this week, I read a news article about the use of Twitter, or more accurately, the use of data collected from Twitter to prohibit a passenger from boarding a United Airlines flight.

Strangely enough, the person banned from the flight was probably among the people who should know the most about cyber-security and perceived threats.  Chris Roberts is the owner of One World Labs, which is reported to analyze cyber-security risks.

Mr. Roberts posted messages on Twitter suggesting he could hack into an airplane’s on-board computer systems, and as a result, United perceived him to be a threat. Upon trying to board a flight over the weekend to, ironically enough, the RSA security conference in San Francisco, Mr. Roberts was denied access to the plane.

The purpose of this blog post, as with my other posts, is not to judge. Whether you agree or disagree with Mr. Roberts’ actions or United Airlines’ response, what you can agree with is that tweets are being tracked, and if there appears to be a threat, airlines and other organizations will take action.

In the last 12-18 months, I heard Meg Whitman of HP say that one impact of technology is that it makes people more authentic. I couldn’t agree more. No longer can someone write anonymously on a piece of paper, board, or wall. In this day and age, your words and actions are known. Whether comments are made in jest or not, they can be found and used against you.

As for Mr. Roberts, he was able to purchase a ticket from a different carrier and make it to San Francisco. I wonder whether that second airline knew to whom it was selling and what had been said on Twitter.

As an individual, it’s important to understand the lasting and potentially serious impact that social media communications can have on our lives. As a business, it’s critical to monitor and analyze that social media data, along with other existing customer data, to create a complete picture of marketplace sentiment, revenue opportunities and perceived threats.

There are plenty of tools available to companies looking to extract some insights from social media platforms. For example, Perficient partners IBM and Splunk offer social media analysis platforms with the following capabilities:

  • Assess the current impact of your organization on social media channels
  • Extract meaning from vast amounts of social data and interactions through sentiment analysis and natural language processing
  • Segment offerings through increased access to customer data
  • Discover relationships within networks that are driving social and purchasing behavior
  • Adjust social strategies to take advantage of findings
The New National Emergency: Cyber Threats (Tue, 07 Apr 2015)
https://blogs.perficient.com/2015/04/07/the-new-national-emergency-cyber-threats/

In a blog post earlier this year, I discussed the increasing role data security is playing in the world today.  As mentioned in the article, 2014 was dubbed the “Year of the Data Breach”.  Around the time of the post, Sony had just been hacked.  Subsequently we learned there was credible evidence the cyber-attack was launched from outside the United States.

This past week, the United States took extraordinary action to address the issue by declaring a national emergency.

Without much fanfare, on April 1 an Executive Order appeared on the whitehouse.gov website, declaring malicious cyber-enabled activities originating from outside the United States to “constitute an unusual and extraordinary threat to the national security, foreign policy, and economy of the United States”. If you live in the United States, don’t be alarmed; we have been under some state of national emergency for over 30 years. A national emergency, sadly, is nothing new.

Whether you praise the action or not, what is undeniable is that cyber security and dealing with malicious IT threats is on everybody’s radar, including the POTUS.

Splunk is a technology we implement that deals with the issue at hand. The Splunk App for Enterprise Security provides an analytics-based approach to enterprise data security and event management. Features include incident tracking dashboards and reports, remediation workflows, and security analytics.

I recommend looking at these Splunk Enterprise Security capabilities to improve your defenses and your tracking of breaches, both as a standalone data security solution and in conjunction with your IBM data environments and storage devices. You may need proof of an attack one day to support legal action.

How does Currency Volatility Affect Your Business? (Tue, 10 Mar 2015)
https://blogs.perficient.com/2015/03/10/how-does-currency-volatility-affect-your-business/

In a blog post earlier this year, I discussed the precipitous drop in energy prices. From roughly mid-2014 through early 2015, the price of crude oil was cut in half, and the related dominoes started to fall. Companies were affected directly or indirectly by the new price of oil, and public and private entities across the globe had to adjust plans and forecasts quickly.

Like crude oil, other markets have seen a great deal of change in the last 6-9 months. Currency conversion is one area where we have spent a lot of time working with our customers.

[Chart: EUR/USD exchange rate over the trailing six months]

The chart above shows six months of the Euro-US Dollar exchange rate. Six months ago, 1 Euro was worth nearly 1.30 US Dollars; six months later, that same Euro was worth 1.08 Dollars. That is roughly a 17% decline in the Euro, and a corresponding strengthening of the US Dollar. Many forecast the trend to continue due to quantitative easing in the Euro area.

A move of that size in the Euro-US Dollar exchange rate over six months is a meaningful change that must be accounted for, both in practice and in analytics.
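The size of that move is easy to verify. A quick calculation, using the approximate endpoints from the chart (the exact percentage depends on the precise dates chosen):

```python
# Percent change in EUR/USD over the period described above.
# Endpoints are approximate readings from the six-month chart.

def pct_change(old, new):
    """Percentage change from old to new."""
    return (new - old) / old * 100

eur_usd_start = 1.30  # dollars per euro, roughly mid-2014
eur_usd_end = 1.08    # roughly early 2015
print(round(pct_change(eur_usd_start, eur_usd_end), 1))  # → -16.9 (% drop in the euro)
```

The same helper applied to a further move from 1.08 to parity (1.00) gives roughly -7.4%, which is the "another 8%" scenario raised below.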

Regardless of what strategies change as a result of foreign currency volatility, companies need to plan and perform analytics excluding its effects. IBM’s Q4 2014 earnings release, for example, described revenue, as IBM always does, in constant currency. For other examples of the impact of exchange-rate volatility, see the Q4 earnings reports of the large multinational consumer products companies and how currency affected them.

How will your business be affected if the Euro moves another 8% to parity with the US Dollar?

Perficient’s IBM Business Analytics practice has delivered budgeting and forecasting solutions to the world’s largest companies.  We have direct experience dealing with the topics covered in this post and we can bring these and other proven and repeatable best practices to new customers.  Please contact us to learn more.

Is your Analytics Data Safer in the Cloud? (Fri, 09 Jan 2015)
https://blogs.perficient.com/2015/01/09/is-your-analytics-data-safer-in-the-cloud/

Leaders in information technology have traditionally been concerned with their company’s sensitive financial data residing in the cloud.  The concern is that with sensitive data in the cloud, it is at risk of being accessed by the wrong people.

After 2014, or, as I have seen it called, the year of the data breach, one has to ask the question: is your analytical data really safer in your own data center than in the cloud?

In 2014, the alleged perpetrators of these hacking crimes varied from a rogue developer to, in one of the most recent cases, a sovereign government. The motives varied as well. In some cases it was one or two thieves with a financial agenda using a technical “back door” to steal credit card numbers. In others, the attackers had a social agenda. Most recently, the stakes were raised with the finger pointed at another country. And what about your employees? Can they cause a “breach”?

When data control and security are breached, the costs are enormous, with an executive ouster often not far behind.

To address business analytics specifically, executives I talk to have typically gotten squeamish when asked about their financial forecasts, financial performance, and operational detail data residing in the cloud.

The most important way to protect your data is to have the right controls and processes in place. Its location is of less concern in this day and age. Arguably, a company that specializes in data centers is more capable of securing data than a widget manufacturer’s in-house IT staff.

Your data is safest where the process and control is appropriate.

IBM has spent many millions of dollars ensuring it has the right cloud platform for you to move forward with. We sold IBM Business Analytics in the cloud in 2014, and I expect that to be the first such year, not the last.

The “analytics in the cloud” trend is strengthening – and the year of the data breach is one reason.  So to come back to the original question – Is your analytics data safer in the cloud?  Based on 2014, it just might be.

 

How do Lower Energy Prices Affect Your Business? (Mon, 05 Jan 2015)
https://blogs.perficient.com/2015/01/05/how-do-lower-energy-prices-affect-your-business/

Business leaders around the world are fully aware of the precipitous drop in energy prices. Oil prices are reported to be at 5-year lows, with the absolute price per barrel cut roughly in half in 2014. This weekend in Chicago, a gas station near which I am fortunate to live was selling unleaded gas at $1.99 per gallon; 10-20 cars were waiting in 15-degree weather. Last but certainly not least, natural gas is under 3 dollars per million BTU, which will have a significant impact in the northern U.S. as the cost to heat homes and businesses falls, among other things.

These are by no means insignificant prices, and neither is the rate of change. Indeed, governments and their finance ministries around the world spent much of late 2014 reworking budgets as the price of oil went from 100 to 90 to 80 to 70 to 60.

This led me to thinking about our customers.

  • How does the volatility in energy prices affect your business?
  • Are the impacts direct or indirect? Both?
  • How quickly can your budgets and forecasts adjust for this significant amount of volatility in such a short period of time?

In reality, the effect of lower energy prices will be felt in many places in a company’s P&L report.

  • Will your customers have more or less purchasing power as a result of this change in energy prices?
  • Will your distribution costs go down?
  • How does the effect of $80 per barrel oil compare to $60?
  • What are the ultimate effects of this on profitability?
  • Can you re-price your service or product based on lower energy prices? (for example, Airlines and other transportation companies)

If your business creates its budgets with spreadsheets and other manual processes, creating and updating those budgets is inordinately slow. If you are looking to implement solutions that make recalculating budgets and creating budget versions easier, please reach out to us.
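The "$80 vs. $60 per barrel" question above is, at its core, a driver-based scenario recalculation. A minimal sketch of the idea (all figures and names here are hypothetical; real planning tools model many more drivers):

```python
# Minimal sketch of scenario-based budget recalculation: the same driver
# model re-run under several oil-price assumptions. Figures are invented
# for illustration only.

def freight_budget(shipments, fuel_per_shipment_bbl, oil_price):
    """Fuel portion of a freight budget for one period, in dollars."""
    return shipments * fuel_per_shipment_bbl * oil_price

scenarios = {"high": 100, "base": 80, "low": 60}  # USD per barrel
for name, price in scenarios.items():
    cost = freight_budget(shipments=1_000, fuel_per_shipment_bbl=2.5, oil_price=price)
    print(f"{name}: ${cost:,.0f}")
```

In a spreadsheet, each scenario is a manual copy-and-edit exercise; in a driver-based model, producing a new budget version is just re-running the model with a different assumption.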

 

Cognos TM1 TurboIntegrator – Run-time or Read-time? (Wed, 15 Jan 2014)
https://blogs.perficient.com/2014/01/15/cognos-tm1-turbointegrator-run-time-or-read-time/

In this blog post I want to take some time to describe certain “behaviors” of Cognos TM1 TurboIntegrator processes.

Access to Processes

As of version 10.2 (as of this writing), the TM1 Server lists all processes, in alphabetical order, under the “Processes” consolidation. Visibility of processes can be controlled by implementing TM1 security: TM1 groups can be granted READ, WRITE, or NONE/BLANK access to individual processes. Security must be set for each group and for each process individually. Process security (by group) is loaded and maintained in the TM1 control cube “}ProcessSecurity”.
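The group-by-process grid held in }ProcessSecurity can be pictured as a simple lookup table. The following is an illustrative Python model of that lookup only, not the TM1 API; in a live server you would read and write the control cube through TM1's own interfaces, and the group and process names here are hypothetical.

```python
# Illustrative model of the }ProcessSecurity control cube: rows are
# security groups, columns are processes, and each cell holds READ,
# WRITE, or blank (no access). Teaching sketch only, not TM1 itself.

process_security = {
    ("Planners", "Load_Actuals"): "READ",
    ("Admins", "Load_Actuals"): "WRITE",
    # ("Viewers", "Load_Actuals") left unset: NONE/BLANK, no visibility
}

def can_see(group, process):
    """READ or WRITE access grants visibility in Server Explorer."""
    return process_security.get((group, process), "") in ("READ", "WRITE")

def can_edit(group, process):
    """Only WRITE access allows double-clicking into edit (read-time) mode."""
    return process_security.get((group, process), "") == "WRITE"

print(can_see("Planners", "Load_Actuals"), can_edit("Planners", "Load_Actuals"))  # → True False
print(can_see("Viewers", "Load_Actuals"))  # → False
```

The same two-function shape applies to chores via the }ChoreSecurity cube, which is described later in the post.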

Groups with NONE/BLANK access will not have visibility to the process.

Groups with READ access will have visibility and have the ability to right-click on the process and select Run. This executes the process.

Groups with WRITE access will have visibility and the ability to either:

  • Right-click on the process and select Run (to execute the process), or
  • Double-click on the process to edit/run it. Double-clicking a process puts the client into “read-time” (edit mode) for that process.

All TM1 Administrators have WRITE access to all processes by default, giving them full edit and execution rights over every process.

Processes and Chore Execution

As a (best practice) rule, all processes that exist within a production environment should be executed as part of a TM1 Chore.

Access to chores follows the same approach as processes: groups are assigned READ, WRITE, or NONE/BLANK access. Chore access is loaded and maintained in the TM1 control cube “}ChoreSecurity”.

Groups with NONE/BLANK access will not have visibility to the chore.

Groups with READ access will have visibility and have the ability to right-click on the chore and select Run. This executes the chore.

Groups with WRITE access will have visibility and the ability to either:

  • Right-click on the chore and select Run (to execute the chore), or
  • Double-click on the chore to edit/run it (using the Chore Setup Wizard).

All TM1 Administrators have WRITE access to all chores by default, giving them full edit and execution rights over every chore.

Individual Processes

At times it may be necessary to execute (run) individual processes without running them through an associated TM1 chore. To do this, it is recommended to use the “select and right-click” method rather than double-clicking into edit mode. The correct steps are:

  1. From TM1 Server explorer, expand “Processes”.
  2. Locate the process to be executed and select/highlight it (by (single) clicking on it).
  3. Right-click on the process name and select Run (see below):

 

[Screenshot: selecting Run from the process context menu]

If the selected process includes run-time parameters, TM1 will prompt you for the required values via the “Parameters” dialog (shown below). Multiple parameters are listed vertically in the same dialog.

[Screenshot: the Parameters dialog]

Typically, the prompt displays a question (text) describing each parameter and a space for you to enter a value for that parameter (labeled Default Value). A process may define default values for some or all parameters; if so, those values appear in the Default Value space and may be changed by typing over them.
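The prompt behavior above boils down to "declared default unless the user types something else". A small sketch of that merge logic (the parameter names are hypothetical; real TI parameters are declared on the process itself, and TM1 handles the prompting):

```python
# Sketch of run-time parameter resolution: each parameter has an optional
# default, and values entered at the Parameters dialog override it.
# Illustrative stand-in logic, not TM1 internals.

def resolve_parameters(defaults, entered):
    """Merge prompt entries over declared defaults; None means 'left as-is'."""
    resolved = dict(defaults)
    resolved.update({k: v for k, v in entered.items() if v is not None})
    return resolved

defaults = {"pYear": "2014", "pVersion": "Budget"}
print(resolve_parameters(defaults, {"pYear": "2013", "pVersion": None}))
# → {'pYear': '2013', 'pVersion': 'Budget'}
```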

Processing and Data

TM1 TurboIntegrator processes include a “Data Source” tab where the Datasource Type option is set (ODBC, Text, ODBO, IBM Cognos TM1, SAP, IBM Cognos Package, or None). With any option other than “None”, the process will attempt to connect to the specified data source and read its data. If the data source is inaccessible for any reason (it does not exist, or access is restricted by security), a “compilation” error will occur.

Run-Time or Read-Time

When working with TurboIntegrator processes, you need to understand the distinction between run-time and read-time.

Run-time is defined as the time when TM1 is running (executing) a process. During run-time, the client session that initiated the process (clients may, in some environments, be set up with multiple active sessions) is locked by the TM1 server until that process completes. Processing or run-time errors may occur during execution. Best practice is to include scripting logic in the process to prevent the most common run-time errors (and to perform the appropriate remediation steps when they occur), and to leverage TM1 message logging appropriately.

TM1 writes run-time errors to a log file. By default, up to 1,000 minor errors are written to the TM1ProcessError.log file during process execution (this value can be changed for an individual process).

When a process error log file is generated, TM1 assigns a unique name that lets you readily identify which TurboIntegrator process generated the file and when it was created. File names follow the convention TM1ProcessError_time stamp_process name.log, where the time stamp is the time (expressed as yyyymmddhhmmss, GMT) at which the file was generated and process name is the name of the TurboIntegrator process that caused the errors.

There may be multiple TM1ProcessError.log files associated with a given server session. All TM1ProcessError.log files are stored in the server data directory.

Read-time is defined as the time when a TM1 client with WRITE access to a process is "reading" the scripting within that process. In other words, the client has double-clicked a selected process name and has the process "open" on the desktop. The intention of the client may be:

  • Perform a review of the scripting within the process
  • Modify the scripting or option settings within the process
  • Run the process (although best practice recommends that processes NOT be run/executed from within edit or "read-time" mode)

If changes are made to a process during read-time, those changes are not automatically saved. To save the changes, one of the following must occur:

  • The client clicks the Save icon or clicks File, then Save or Save As on the TurboIntegrator process menu
  • The client closes the TurboIntegrator process. TM1 prompts the client with the "Save Changes to Process Definition?" dialog, and the client clicks "Yes".
  • The client clicks the Run icon or the File then Run menu. Before TM1 runs a process it saves the latest version of the process (which would include the changes). Note: When TM1 saves a process it performs a syntax check (or compilation) of the settings/options of the process as well as the scripting in the process. If there are errors, TM1 will prompt the client and not run the process.

Compilation Errors

During read-time, compilation errors may occur. Compilation errors are errors generated by TM1 when it determines that it cannot execute a process to completion successfully. These errors are generated when TM1 compiles a process, that is, checks the process scripting and the settings/options of the process for syntax errors.

TM1 performs a compilation of the process when:

  • A client attempts to save the process
  • A client attempts to run a process
  • A client begins editing or enters read-time for a process (double-clicks on the process name and “opens” the process as described above).

The most common compilation error encountered during read-time of a TurboIntegrator process is the "cannot find" error. When the Datasource Type option (described above) is set to any option other than None, TM1 attempts to verify the datasource named in the Datasource Name field on the Data Source tab when the process is opened. This means that if you have set the value to an ODBC datasource, TM1 attempts to connect to the named datasource; if the setting is Text, TM1 validates the fully qualified filename; and so on. If TM1 cannot connect to the source, an error occurs. This error can almost always be treated as a warning, because best practice recommends that specific datasource values be set within the script and therefore validated during execution of the Prolog. Scripting should include logic to verify the datasource and perform remediation if the datasource cannot be verified. Typically, "Cannot find" errors that occur upon opening a process are meaningless.
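Setting datasource values within the script relies on TurboIntegrator's reserved datasource variables, assigned in the Prolog so they override whatever is saved on the Data Source tab. A sketch, with the cube and view names purely illustrative:

```
# Prolog: point the process at its datasource explicitly at run-time
# rather than relying on the value saved on the Data Source tab.
DatasourceType = 'VIEW';
DatasourceNameForServer = 'Forecasting';    # cube name (illustrative)
DatasourceCubeview = 'Forecasting All';     # view name (illustrative)
```

Because these assignments execute before the Metadata and Data tabs read any records, the source the process actually reads is always the one the script just verified or created.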

An Example

A large percentage of processes will work with data already in TM1. Therefore the Datasource Type would be set to IBM Cognos TM1 and the Data Source Name field will then show the explicit cube view or dimension subset to be read.

In this example, a process uses a specific view of a cube named Forecasting. The view is specific to the requirements of the process in that it filters the cube data by certain parameters (for example, version, time, or region). During development of the process, the developer started with a temporary view named "Forecasting All", which included all data for the version named FORECAST in the current time period. At run-time the process filters the view by region (provided by a process parameter). Using this view, the developer was able to preview a sample of the cube data and set all of the Variables tab parameters easily. Once these values were set, script was added to the Prolog section of the process, based upon best practice recommendations, to verify and filter the view as required. This script checks for the existence of the view and, if it does not exist, creates it. If the view does exist, the script filters it (using subsets) to meet the run-time requirements of the process.

Note: after the first time the process is run, the required view will exist.
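The Prolog logic this example describes can be sketched with TM1's view and subset functions. All object names below follow the example and are illustrative, and pRegion is a hypothetical process parameter:

```
vCube = 'Forecasting';
vView = 'Forecasting All';

# Create the view if it does not already exist
IF ( ViewExists ( vCube, vView ) = 0 );
   ViewCreate ( vCube, vView );
ENDIF;

# Filter the view by the region supplied as a process parameter
vSub = 'Forecasting All - Region';
IF ( SubsetExists ( 'Region', vSub ) = 1 );
   SubsetDeleteAllElements ( 'Region', vSub );
ELSE;
   SubsetCreate ( 'Region', vSub );
ENDIF;
SubsetElementInsert ( 'Region', vSub, pRegion, 1 );
ViewSubsetAssign ( vCube, vView, 'Region', vSub );
```

Because the script creates or re-filters the view on every run, the process no longer depends on any view existing before execution.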

Once the process is created, compiled, and tested, the developer would delete the temporary view from the Forecasting cube, as that view is no longer needed: the process (via scripting) creates and/or modifies its own view. Once the temporary view is removed, each time the TI process is opened for editing, a "Cannot find" warning occurs (shown below).

[Screenshot: the "Cannot find" warning dialog]
To resolve this warning, simply click OK and proceed to review, edit, or run the process. You may be tempted to resolve the issue by changing the value in the Datasource Name field to a view name that will always exist within TM1; however, views can be modified or removed in a variety of ways, including by other processes, manually by TM1 clients with appropriate access, or through unanticipated events. Best practice recommends that processes use scripting to set datasource options at run-time to ensure successful execution of the process.

Cheers!

Using Splunk with Cognos TM1
https://blogs.perficient.com/2014/01/14/using-splunk-with-cognos-tm1/
Tue, 14 Jan 2014

IBM Cognos TM1 offers powerful enterprise planning, forecasting, and analysis capabilities. These benefits are best realized by extending the solution across departments for a consolidated, integrated planning process, leveraging business-specific financial models that mirror business strategy.

Our years of experience implementing TM1 have shown an increasing demand on support personnel to coordinate data transfer and loading activities without errors. Monitoring and troubleshooting these errors within TM1 is cumbersome, with administrators relying on multiple production log files and the error files referenced in those logs. As a company's TM1 server footprint grows, this adds complexity and decreases the support group's effectiveness in helping end users during critical month-end data updates. The MTTR (mean time to resolve) and MTTI (mean time to identify) for root causes of data issues and environment-related problems often run to hours.

Introducing Splunk Plus for TM1

To address these challenges, we offer Splunk Plus for TM1, a solution that captures disparate sets of IBM TM1 log files in real time and provides visibility into troublesome issues that cost TM1 administrators time and resources. By ingesting log file data and quickly extracting valuable information, Splunk Plus for TM1 provides a level of insight and visibility into the health of your TM1 environment that simply is not possible using tools provided by TM1 out-of-the-box.

With Splunk Plus for TM1, administrators gain insight into the following questions not easily answered natively by TM1:

  • What is causing poor health in the TM1 environment?
  • How many TurboIntegrator processes have failed in the last 15 minutes with rejected records, and who ran them?
  • What percentage of TM1 processes have completed successfully?
  • What TM1 processes are taking an abnormally long time to complete?

Solution Features

Splunk Plus for TM1 ingests and visualizes machine data for TM1 instances. The application is highly instrumented and allows administrators to interact with TM1 application and error logs. Users can pivot on time ranges, environment, TM1 applications/instances, TI (Turbo Integrator) type, and process results to tailor a view from TM1 metrics into reports, alerts, and dashboards that matter.

Dashboards provide both executives and operations stakeholders meaningful and actionable historical and real-time insights into the operational health of TM1. The application facilitates troubleshooting and faster root cause analysis for conditions that could lead to outages before they happen.

The following dashboards are included with Splunk Plus for TM1:

  • Executive Overview via Rolled up KPIs
  • TI Investigation
  • TI Status
  • TI Failures
  • TI Visualizer
  • TI Notables

[Screenshot: Splunk Plus for TM1 dashboards]

Solution Benefits

While the value of TM1 is undeniable, additional capabilities are required to optimize application support. By offering these capabilities, Splunk Plus for TM1 provides considerable benefits to TM1 administrators through unparalleled visibility into the TM1 environment. By improving TM1 administration, support teams can more effectively perform tasks and ultimately meet service level agreements.

This solution is available as a free app on Splunkbase: https://splunkbase.splunk.com/app/3139/
