Predictive Modeling Articles / Blogs / Perficient – Expert Digital Insights
https://blogs.perficient.com/tag/predictive-modeling/

Perficient Recognized in Forrester’s AI Services Landscape
https://blogs.perficient.com/2024/01/04/perficient-recognized-in-forresters-ai-services-landscape/ | Thu, 04 Jan 2024

As we step into 2024, the transformative impact of Artificial Intelligence (AI) and generative AI on enterprise-level organizations has reshaped the business landscape in profound ways. The continual evolution of these technologies has empowered businesses to leverage advanced algorithms, predictive modeling, and generative capabilities, driving unprecedented innovation and efficiency. From enhancing decision-making processes to revolutionizing customer interactions, the influence of AI has positioned enterprises at the forefront of a dynamic digital era, where adaptability and integration of cutting-edge technologies are paramount for sustained success.

However, adopting AI at an enterprise level is not without its challenges, given the complexities of integration and optimization within existing frameworks. Navigating this landscape is made significantly more manageable with the assistance of a specialized consulting partner well-versed in AI. Such expertise ensures a strategic and tailored approach, enabling organizations to overcome obstacles, unlock AI’s full potential, and thrive in the dynamic business environment of 2024.

The AI Services Landscape, Q1 2024

The recently published AI Services Landscape, Q1 2024 report by Forrester states that “Generative AI (genAI) created an AI awakening among business stakeholders and workers.” The business value of implementing AI initiatives in partnership with AI service providers can be significant, and that’s why we’re proud to be recognized in the Forrester AI Services Landscape as a service provider with an industry focus in the sectors of Pharmaceutical and medical equipment, Transportation, and Financial services, and a geographic focus in North America, Asia Pacific, and Latin America.

Our AI Capabilities

Forrester carefully researched each service provider in the report through a set of comprehensive questions and identified three core business scenarios – customer experience, process automation, and edge intelligence. According to Forrester, “These are the business scenarios that buyers most frequently seek and expect AI services providers to address.” As a leading digital consultancy that helps clients imagine, create, engineer, and run digital transformation and AI solutions, we believe we in turn help organizations drive growth, efficiency, and differentiation in these three areas.

One of the ways we help organizations optimize their customer journeys is through our CX AI Jumpstart, a five-week approach to identifying and implementing AI opportunities. Built on our accelerated modeling process, CX AI focuses on developing an interactive model that demonstrates how your organization can leverage machine learning, natural language processing, and cognitive computing to jump-start AI adoption.

In addition to the core business scenarios in this report, Forrester also references seven extended business scenarios that are important to certain buyers looking for artificial intelligence implementation services.

Forrester asked each vendor included in the Landscape to select the top business scenarios for which clients select them, and from there determined which extended business scenarios highlight differentiation among the vendors. The report shows that we selected Product and service development, Knowledge management, and Content creation as some of the top reasons clients work with us out of the listed extended business scenarios.

Learn More About Perficient’s AI Services

Whether your business is just starting its AI journey or seeking to enhance its current efforts, partnering with the right service provider can make all the difference. With a team of more than 300 AI professionals, including data scientists, data engineers, AI architects, and AI developers, Perficient has extensive knowledge and skills in various AI domains. Contact us now to discover how our expertise can take your business to new heights.

Download the Forrester report, The AI Services Landscape, Q1 2024, to learn more (available to Forrester subscribers and for purchase).

 

Perficient Wins IBM Award for Patient Readmissions Solution
https://blogs.perficient.com/2017/02/17/perficient-wins-ibm-award-for-patient-readmissions-solution/ | Fri, 17 Feb 2017

Perficient was recently awarded a 2017 IBM Beacon Award for Outstanding Watson Solution. The award was presented this week during IBM’s PartnerWorld Leadership Conference.

The Beacon Awards recognize select business partners who deliver exceptional IBM-based solutions that drive business value and transform the way clients and industries do business in the cognitive era. This year’s awards honor achievement across 19 solution areas including analytics, collaboration, cloud, commerce, cognitive, the Internet of Things, security, and Watson. Winners were selected by a panel of IBM executives, industry analysts, and media members.

Perficient received the Beacon Award in recognition of its Watson-based predictive modeling solution for patient readmissions. This solution is designed to better identify patients in need of interventions and ultimately reduce readmission rates. The solution extracts key information from previously untapped sources of unstructured data, such as doctors’ notes and psycho-social information, and has helped our client, a large health system in Ohio, improve its readmission prediction accuracy from 46% to 93%.

“Many of our healthcare customers have been challenged to make demonstrable, evidence-based decisions to improve patient care and engagement while dealing with the growing amount of data they have available to them,” said Kevin Nunnally, Perficient’s National Partner Executive for IBM. “Perficient has followed the technology for several years and started out investing in our expertise and educating our clients on what Watson is and what it can do. This award is tremendous recognition for our success in helping our clients and our partners work with Watson and understand the scope of its capabilities.”

This is Perficient’s third consecutive Beacon Award and all of them have been in the healthcare industry. In 2015, we received the Outstanding Information Management Solution award for our Health Analytics Gateway, an information management framework that streamlines data processing for healthcare organizations. In 2016, we received the Outstanding Enterprise Cloud Solution award for our Immersion solution, a migration-as-a-service for the IBM Bluemix cloud platform that boosts in-cloud application development efficiency nearly 40 percent over standard migration practices.

“By delivering innovative solutions to drive business value, Beacon Award winners help transform the way their clients and industries do business,” said Marc Dupaquier, general manager, IBM Global Business Partners. “We’d like to congratulate Perficient on winning the Beacon Award for Outstanding Watson Solution and delivering exceptional client satisfaction and results.”

Learn more about how Perficient is leveraging IBM Watson in healthcare and life sciences.

Perficient Wins Cognitive Analytics Award for IBM Watson Solution
https://blogs.perficient.com/2017/02/16/perficient-wins-cognitive-analytics-award-for-ibm-watson-solution/ | Thu, 16 Feb 2017

Earlier this week, Perficient won the 2017 IBM Beacon Award for an Outstanding Watson Solution. These awards are selected by a panel of IBM executives, industry analysts, and media members, to recognize partners that are delivering exceptional IBM-based solutions and transforming the way clients and industries do business in the cognitive era. This year’s award recognizes our Watson-based predictive modeling solution for patient readmissions, designed to better identify patients in need of interventions and ultimately reduce readmission rates.

Need for Predicting & Reducing Readmissions

Per the Center for Healthcare Quality & Payment Reform, “one of the best ways for communities to reduce healthcare costs quickly and improve patient care in the process is to implement initiatives to reduce hospital readmissions. Research studies and quality reporting initiatives around the country show that 15-25% of people who are discharged from the hospital will be readmitted to the hospital within 30 days or less, and that many of these readmissions are preventable.” They go on to predict that billions of dollars of savings would be achieved by reducing these unnecessary hospital readmissions. The U.S. Department of Health & Human Services estimates that avoidable hospital readmissions account for more than $17 billion in Medicare expenses.

Enhanced Readmissions Modeling

Perficient’s Readmissions Modeling solution is unique in that, rather than focusing on a single disease or condition, it predicts readmissions across all diseases and conditions. Another unique aspect of this solution is that it focuses on the psycho-social needs of the patient and the family rather than the traditional clinical approach, which is the focus of most of the industry. By incorporating this data with a wealth of unstructured and structured healthcare data, healthcare providers can significantly increase the accuracy of their readmissions modeling and predictions. In fact, 80% of healthcare data is typically invisible to current systems because it is unstructured, so uncovering insights in that data is invaluable when predicting triggers for readmissions.

The ultimate goal of the solution is to reduce hospital readmissions across the entire patient population by educating patients and their families to better manage their conditions while coordinating various services within the community. Hospitals and healthcare providers want to identify the patients most in need of interventions, determine which interventions are most appropriate, and begin the intervention process as quickly as possible; traditionally, interventions occurred at the end of a patient’s stay. Early intervention allows more time for clinicians to provide instruction, increases patient and family member understanding, and allows time for community services to be arranged so they are ready when the patient is discharged. The solution also helps optimize hospital resources by directing interventions to those who stand to benefit the most.

The predictive modeling process derives, for each day a person is in the hospital, a prediction of whether that person is at risk of being readmitted within 30 days. These predictions are assigned a probability, which is used to prioritize daily patient interventions. Nursing assessment data is also segmented into different risk and intervention profiles to identify health and lifestyle behaviors that clinicians can use to develop appropriate interventions and plans of care.
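To make the prioritization step concrete, here is a minimal sketch of how daily risk probabilities might drive an intervention work list. This is not the production solution (which, as described below, was built with SPSS Modeler); the patient records, field names, and capacity limit are hypothetical:

# Hypothetical sketch: rank in-hospital patients by their predicted
# 30-day readmission probability to prioritize daily interventions.
from dataclasses import dataclass

@dataclass
class PatientDay:
    patient_id: str
    day_in_hospital: int
    readmit_probability: float  # produced daily by the predictive model

def daily_intervention_worklist(patients, capacity=10):
    """Return the highest-risk patients first, up to the care team's capacity."""
    ranked = sorted(patients, key=lambda p: p.readmit_probability, reverse=True)
    return ranked[:capacity]

today = [
    PatientDay("A-101", day_in_hospital=2, readmit_probability=0.81),
    PatientDay("B-202", day_in_hospital=5, readmit_probability=0.34),
    PatientDay("C-303", day_in_hospital=1, readmit_probability=0.67),
]
for p in daily_intervention_worklist(today, capacity=2):
    print(p.patient_id, p.readmit_probability)  # A-101 first, then C-303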

Solution Components

The solution leverages IBM Watson to uncover new evidence in predicting readmission propensity, ushering in a new era of evidence-based analytics. Watson Explorer Advanced Edition and the healthcare annotators are used to analyze physician notes from the EMR to extract relevant, contextual data and transform that information into structured data points. The healthcare annotators are “reading” this unstructured information, tuned appropriately to best interpret and transform TriHealth’s data. The analysis result is exported as structured data into a data warehouse, which is then consumed by SPSS Modeler along with other existing structured data to develop the predictive readmissions model. The resulting readmission risk indicator is then incorporated into the EMR system, indicating a patient’s likelihood of readmission at the point of care.
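As a rough illustration of that pattern – extract structured data points from free text, then feed them to the model alongside existing structured data – consider the simplified sketch below. The keyword lists are invented for illustration only; the real healthcare annotators are far more sophisticated (handling negation, context, and medical ontologies):

# Simplified illustration: turn free-text clinical notes into structured
# flags that can join the model's other inputs.
# The cue lists here are invented for illustration.
PSYCHOSOCIAL_CUES = {
    "lives_alone": ["lives alone", "no caregiver"],
    "transport_barrier": ["no transportation", "missed appointments"],
}

def annotate_note(note_text: str) -> dict:
    text = note_text.lower()
    return {
        flag: any(cue in text for cue in cues)
        for flag, cues in PSYCHOSOCIAL_CUES.items()
    }

note = "Patient lives alone and reports no transportation to follow-ups."
print(annotate_note(note))
# {'lives_alone': True, 'transport_barrier': True}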

 

An Architectural Approach to Cognos TM1 Design
https://blogs.perficient.com/2014/08/28/an-architectural-approach-to-cognos-tm1-design/ | Thu, 28 Aug 2014

Over time, I’ve written about keeping your TM1 model design “architecturally pure”. What this means is that you should strive to keep a model’s “areas of functionality” distinct within your design.

Common Components

I believe that all TM1 applications, for example, are made of only 4 distinct “areas of functionality”: absorption (of key information from external data sources), configuration (of assumptions about the absorbed data), calculation (where the specific “magic” happens; i.e., business logic is applied to the source data using the set assumptions) and consumption (of the information processed by the application, now ready to be reported on).
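As an illustration of the separation (sketched here as plain Python pseudocode rather than actual TM1 cubes and processes), each area consumes only the output of the previous one:

# Illustrative only: the four functional areas as distinct components.
def absorb(source):
    """Absorption: pull key information from an external data source."""
    return list(source)

def configure(assumptions=None):
    """Configuration: assumptions about the absorbed data."""
    return assumptions or {"growth_rate": 0.05}

def calculate(data, assumptions):
    """Calculation: apply business logic to the source data using the assumptions."""
    return [x * (1 + assumptions["growth_rate"]) for x in data]

def consume(results):
    """Consumption: expose processed information, ready for reporting."""
    return {"report_rows": results}

print(consume(calculate(absorb([100, 200]), configure())))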

Some Advantages

Keeping functional areas distinct has many advantages:

  • Reduces complexity and increases sustainability within components
  • Reduces the possibility of one component negatively affecting another
  • Increases the likelihood of reuse of the particular (distinct) components
  • Promotes a technology independent design; meaning components can be built using the technology that best fits their particular objective
  • Allows components to be designed, developed and supported by independent groups
  • Diminishes duplication of code, logic, data, etc.
  • Etc.

Resist the Urge

There is always a tendency to “jump in” and “do it all” using a single tool or technology or, in the case of Cognos TM1, a few enormous cubes. And today, with every release of software, there are new “package connectors” that allow you to directly connect (even external) system components. In addition, you may “understand the mechanics” of how a certain technology works, which will allow you to “build” something; but without comprehensive knowledge of architectural concepts, you may end up with something that does not scale, has unacceptable performance, or is costly to sustain.

Final Thoughts

Some final thoughts:

  • Try white boarding the functional areas before writing any code
  • Once you have your “like areas” defined, search for already existing components that may meet your requirements
  • If you do decide to “build new”, try to find other potential users for the new functionality. Could you partner and co-produce (and thus share the costs) a component that you both can use?
  • Before building a new component, “try out” different technologies. Which best serves these components’ objectives? (A rule of thumb: if you can find more than 3 other technologies or tools that better fit your requirements than the technology you planned to use, you’re in trouble!)

And finally:

Always remember, just because you “can” doesn’t mean you “should”.

A Practice Vision
https://blogs.perficient.com/2014/08/27/a-practice-vision/ | Wed, 27 Aug 2014

Vision

Most organizations today have had successes implementing technology and they are happy to tell you about it. From a tactical perspective, they understand how to install, configure and use whatever software you are interested in. They are “practitioners”. But how many can bring a “strategic vision” to a project or to your organization in general?

An “enterprise” or “strategic” vision is based upon an “evolutionary roadmap” that starts with the initial “evaluation and implementation” (of a technology or tool), continues with “building and using” and finally (hopefully) arrives at the organization, optimization and management of all of the earned knowledge (with the tool or technology). You should expect that whoever you partner with can explain their practice vision or methodology or, at least, speak to the “phases” of the evolution process:

Evaluation and Implementation

The discovery and evaluation that takes place with any new tool or technology is the first phase of a practice’s evolution. A practice should be able to explain how testing is accomplished and what it covers. How did they determine whether the tool/technology to be used will meet or exceed your organization’s needs? Once a decision is made, are they practiced at the installation, configuration and everything else that may be involved in deploying the new tool or technology for use?

Build, Use, Repeat

Once deployed, “building and using” components with that tool or technology begins. The efficiency with which these components are developed, as well as their level of quality, will depend upon the level of experience (with the technology) that a practice possesses. Typically, “building and using” is repeated with each successful “build”, so how many times has the practice successfully used this technology? By human nature, once a solution is “built” and seems correct and valuable, it will be saved and used again. Hopefully, this solution will have been shared as a “knowledge object” across the practice. Although most practices may actually reach this phase, it is not uncommon to find:

  • Objects with similar or duplicate functionality (they reinvented the wheel over and over).
  • Poor naming and filing of objects (no one but the creator knows it exists or perhaps what it does)
  • Objects not shared (objects visible only to specific groups or individuals, not the entire practice)
  • Objects that are obsolete or do not work properly or optimally are being used.
  • Etc.

Manage & Optimization

At some point, usually after a certain number of solutions have been developed, a practice will “mature its development or delivery process” to the point that it begins investing time, and perhaps dedicating resources, to organize, manage and optimize its developed components (i.e. “organizational knowledge management”, sometimes known as IP or intellectual property).

You should expect a practice to have a recognized practice leader and a “governing committee” to help identify and manage knowledge developed by the practice and:

  • inventory and evaluate all known (and future) knowledge objects
  • establish appropriate naming standards and styles
  • establish appropriate development and delivery standards
  • create, implement and enforce a formal testing strategy
  • continually develop “the vision” for the practice (and perhaps the industry)

 

More

As I’ve mentioned, a practice needs to take a strategic or enterprise approach to how it develops and delivers, and to do this it must develop its “vision”. A vision will ensure that the practice is leveraging its resources (and methodologies) to achieve the highest rate of success today and over time. This is not simply “administering the environment” or “managing the projects”; it involves structured thought, best practices and a continued commitment to evolved improvement. What is your vision?

IBM OpenPages GRC Platform – Modular Methodology
https://blogs.perficient.com/2014/08/14/ibm-openpages-grc-platform-modular-methodology/ | Thu, 14 Aug 2014

The OpenPages GRC platform includes 5 main “operational modules”. These modules are each designed to address specific organizational needs around Governance, Risk, and Compliance.

Operational Risk Management module “ORM”

The Operational Risk Management module is a document and process management tool which includes a monitoring and decision support system enabling an organization to analyze, manage, and mitigate risk simply and efficiently. The module automates the process of identifying, measuring, and monitoring operational risk by combining all risk data (such as risk and control self-assessments, loss events, scenario analysis, external losses, and key risk indicators (KRI)) into a single place.

Financial Controls Management module “FCM”

The Financial Controls Management module reduces the time and resource costs associated with compliance for financial reporting regulations. This module combines document and process management with awesome interactive reporting capabilities in a flexible, adaptable, easy-to-use environment, enabling users to easily perform all the necessary activities for complying with financial reporting regulations.

Policy and Compliance Management module “PCM”

The Policy and Compliance Management module is an enterprise-level compliance management solution that reduces the cost and complexity of compliance with multiple regulatory mandates and corporate policies. This module enables companies to manage and monitor compliance activities through a full set of integrated functionality:

  • Regulatory Libraries & Change Management
  • Risk & Control Assessments
  • Policy Management, including Policy Creation, Review & Approval and Policy Awareness
  • Control Testing & Issue Remediation
  • Regulator Interaction Management
  • Incident Tracking
  • Key Performance Indicators
  • Reporting, monitoring, and analytics

IBM OpenPages IT Governance module “ITG”

This module aligns IT services, risks, and policies with corporate business initiatives, strategies, and operational standards, allowing internal IT controls and risks to be managed according to the business processes they support. In addition, this module unites “silos” of IT risk and compliance, delivering visibility, better decision support, and ultimately enhanced performance.

IBM OpenPages Internal Audit Management module “IAM”

This module provides internal auditors with a view into an organization’s governance, risk, and compliance, affording the chance to supplement and coexist with broader risk and compliance management activities throughout the organization.

One Solution

The IBM OpenPages GRC Platform modules (“ORM”, “FCM”, “PCM”, “ITG” and “IAM”) interactively deliver a superior solution for Governance, Risk, and Compliance. More to come!

The Installation Process – IBM OpenPages GRC Platform
https://blogs.perficient.com/2014/08/13/the-installation-process-ibm-openpages-grc-platform/ | Wed, 13 Aug 2014

When preparing to deploy the OpenPages platform, you’ll need to follow these steps:

  1. Determine which server environment you will deploy to – Windows or AIX.
  2. Determine your topology – how many servers will you include as part of the environment? Multiple application servers? 1 or more reporting servers?
  3. Perform the installation of the OpenPages prerequisite software for the chosen environment – and for each server’s designated purpose (database, application or reporting).
  4. Perform the OpenPages installation, being conscious of the software that is installed as part of that process.

Topology

Depending upon your needs, you may find that you’ll want to use separate servers for your application, database and reporting servers. In addition, you may want to add additional application or reporting servers to your topology.

 

 

(Diagram: example OpenPages topology options)
After the topology is determined, you can use the following information to prepare your environment. I recommend clean installs – meaning starting with fresh or new machines – and VMs are just fine (“The VMware performance on a virtualized system is comparable to native hardware. You can use the OpenPages hardware requirements for sizing VM environments” – IBM).

(Note – this is if you’ve chosen to go Oracle rather than DB2):

MS Windows Servers

All servers that will be part of the OpenPages environment must have the following installed before proceeding:

  • Microsoft Windows Server 2008 R2 and later Service Packs (64-bit operating system)
  • Microsoft Internet Explorer 7.0 (or 8.0 in Compatibility View mode)
  • A file compression utility, such as WinZip
  • A PDF reader (such as Adobe Acrobat)

The Database Server

In addition to the above “all servers” software, your database server will require the following software:

  • Oracle 11gR2 (11.2.0.1) and any higher Patch Set – the minimum requirement is Oracle 11.2.0.1 October 2010 Critical Patch Update.

Application Server(s)

Again, in addition to the above “all servers” software, the server that hosts the OpenPages application modules should have the following software installed:

  • JDK 1.6 or greater, 64-bit. (Note: this is a prerequisite only if your OpenPages product does not include WebLogic Server.)
  • Application Server Software (one of the following two options):
      o IBM WebSphere Application Server ND 7.0.0.13 and any higher Fix Pack. (Note: minimum requirement is WebSphere 7.0.0.13.)
      o Oracle WebLogic Server 10.3.2 and any higher Patch Set. (Note: minimum requirement is Oracle WebLogic Server 10.3.2. This is a prerequisite only if your OpenPages product does not include Oracle WebLogic Server.)
  • Oracle Database Client 11gR2 (11.2.0.1) and any higher Patch Set

Reporting Server(s)

The server that you intend to host the OpenPages CommandCenter must have the following software installed (in addition to the above “all servers” software):

  • Microsoft Internet Information Services (IIS) 7.0 or Apache HTTP Server 2.2.14 or greater
  • Oracle Database Client 11g R2 (11.2.0.1) and any higher Patch Set

During the OpenPages Installation Process

As part of the OpenPages installation, the following is installed automatically:

 

For Oracle WebLogic Server & IBM WebSphere Application Server environments:

  • The OpenPages application
  • Fujitsu Interstage Business Process Manager (BPM) 10.1
  • IBM Cognos 10.2
  • OpenPages CommandCenter
  • JRE 1.6 or greater

If your OpenPages product includes the Oracle WebLogic Server:

  • Oracle WebLogic Server 10.3.2

If your OpenPages product includes the Oracle Database:

  • Oracle Database Server Oracle 11G Release 2 (11.2.0.1) Standard Edition with October 2010 CPU Patch (on a database server system)
  • Oracle Database Client 11g Release 2 (11.2.0.1) with October 2010 CPU Patch applied 64-bit (on an application server system)
  • Oracle Database Client 11g Release 2 (11.2.0.1) with October 2010 CPU Patch applied 32-bit (on a reporting server system)

 Thanks!

IBM OpenPages Start-up
https://blogs.perficient.com/2014/08/12/ibm-openpages-start-up/ | Tue, 12 Aug 2014

In the beginning…

OpenPages was a company “born” in Massachusetts, providing Governance, Risk, and Compliance software and services to customers. Founded in 1996, OpenPages had more than 200 customers worldwide including Barclays, Duke Energy, and TIAA-CREF. On October 21, 2010, OpenPages was officially acquired by IBM:

http://www-03.ibm.com/press/us/en/pressrelease/32808.wss

What is it?

OpenPages provides a technology driven way of understanding the full scope of risk an organization faces. In most cases, there is extreme fragmentation of a company’s risk information – like data collected and maintained in numerous disparate spreadsheets – making aggregation of the risks faced by a company extremely difficult and unmanageable.

Key Features

IBM’s OpenPages GRC Platform can help by providing many capabilities to simplify and centralize compliance and risk management activities. The key features include:

  • Provides a shared content repository that can (logically) present the processes, risks and controls in many-to-many and shared relationships.
  • Supports the import of corporate data and maintains an audit trail ensuring consistent regulatory enforcement and monitoring across multiple regulations.
  • Supports dynamic decision making with its CommandCenter interface, which provides interactive, real-time executive dashboards and reports with drill-down.
  • Is simple to configure and localize with detailed user-specific tasks and actions accessible from a personal browser based home page.
  • Provides for Automation of Workflow for management assessment, process design reviews, control testing, issue remediation and sign-offs and certifications.
  • Utilizes Web Services for integration. OpenPages uses the OpenAccess API to interoperate with leading third-party applications, enhancing policies and procedures with actual business data.

Understanding the Topology

The OpenPages GRC Platform consists of the following 3 components:

  • 1 database server
  • 1 or more application servers
  • 1 or more reporting servers

Database Server

The database is the centralized repository for metadata, (versions of) application data, and access control. OpenPages requires a set of database users and a tablespace (referred to as the “OpenPages database schema”). These database components install automatically during the OpenPages application installation, configuring all of the required elements. You can use either Oracle or DB2 for your OpenPages GRC Platform repository.

 Application Server(s)

The application server is required to host the OpenPages applications. The application server runs the application modules, and includes the definition and administration of business metadata, UI views, user profiles, and user authorization.

 Reporting Server

The OpenPages CommandCenter is installed on the same computer as IBM Cognos BI and acts as the reporting server.

Next Steps

An excellent next step would be to visit the IBM site and review the available slides and whitepapers. After that, stay tuned to this blog!

Configuring Cognos TM1 Web with Cognos Security
https://blogs.perficient.com/2014/08/07/configuring-cognos-tm1-web-with-cognos-security/ | Thu, 07 Aug 2014

Recently I completed upgrading a client’s IBM Cognos environment – both TM1 and BI. It was a “jump” from Cognos 8 to version 10.2, and TM1 9.5 to version 10.2.2. In this environment, we had multiple virtual servers (Cognos lives on one, TM1 on one and the third is the gateway/webserver).

Once the software was all installed and configured (using IBM Cognos Configuration – and, yes, you still need to edit the TM1 configuration cfg file), we started the services and (it appeared) everything looked good. I spun through the desktop applications (Perspectives, Architect, etc.) and then went to the Web browser, first to test TM1Web:

http://stingryweb:9510/tm1web/

The familiar page loads:

(Screenshot: the TM1Web login page)
But when I enter my credentials, I get the following:

 

(Screenshot: the authentication error message)
Go to Google

Since an installation and configuration is not something you do every day, off to Google – which reveals that there are evidently 2 files that the installation placed on the web server that belong on the Cognos BI server. These files need to be located, edited and then copied to the correct location for TM1Web to use IBM Cognos authentication security.

What files?

There are 2 files: an XML file (variables_TM1.xml.sample) and an HTML file (tm1web.html). These can be found on the server where you installed TM1Web – or can they? It turns out they are not found individually but are included in zip files:

Tm1web_app.zip (that is where you’ll find the xml file) and tm1web_gateway.zip (and that is where you will find tm1web.html):

(Screenshot: the zip files in the bi_files folder)
I found mine in:

Program Files\ibm\cognos\tm1_64\webapps\tm1web\bi_files

Make them your own

Once you unzip (the files) you need to rename the xml file (to drop the “.sample”) and place it onto the Cognos BI server in:

Program Files\ibm\cognos\c10_64\templates\ps\portal.

Next, edit the file (even though it’s an XML file, it’s small, so you can use Notepad). What you need to do is modify the URLs within the <urls> tags (the “localhost” string should be replaced with the name of the server running TM1Web). You’ll find three (one for TM1WebLogin.aspx, one for TM1WebLoginHandler.aspx and one for TM1WebMain.aspx).
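For reference, the edited section should end up looking something like the sketch below. The exact elements surrounding the <urls> block come from the sample file shipped with your version, so treat this as an approximation (the server name and port are the ones used in this example environment):

<!-- approximate shape of the edited section; verify against your variables_TM1.xml -->
<urls>
  <url>http://stingryweb:9510/tm1web/TM1WebLogin.aspx</url>
  <url>http://stingryweb:9510/tm1web/TM1WebLoginHandler.aspx</url>
  <url>http://stingryweb:9510/tm1web/TM1WebMain.aspx</url>
</urls>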

Now, copy your tm1web.html file to (on the Cognos BI server)

Program Files\ibm\cognos\c10_64\webcontent\tm1\web and edit it (again, you can use Notepad). One more thing: the folder “tm1” may need to be created manually.

The HTML file update is straightforward (you need to point to where Cognos TM1 Web is running) and there is only a single line in the file. You change:

var tm1webServices = ["http://localhost:8080"];

To:

var tm1webServices = ["http://stingryweb:9510"];

 

Now, after stopping and starting the server’s web services:

 

(Screenshot: TM1Web loads after successful Cognos authentication)
The above steps are simple; you just need to be aware of these extra, very manual steps.
Perficient Takes Cognos TM1 to the Cloud
https://blogs.perficient.com/2014/07/01/perficient-takes-cognos-tm1-to-the-cloud/ | Tue, 01 Jul 2014

IBM Cognos TM1 is well-known as the planning, analysis, and forecasting software that delivers flexible solutions to address requirements across an enterprise, as well as provide real-time analytics, reporting, and what-if scenario modeling – and Perficient is well-known for delivering expertly designed TM1-based solutions.

Analytic Projects

Perhaps phase zero of a typical analytics project would involve our topology experts determining the exact server environment required to support the implementation of a number of TM1 servers (based upon not only industry proven practices, but our own breadth of practical “in the field” experiences). Next would be the procurement and configuration of said environment (and prerequisite software) and finally the installation of Cognos TM1.

It doesn’t stop there

As TM1 development begins, our engineers work closely with internal staff not only to outline processes for the (application and performance) testing and deployment of developed TM1 models, but also to establish a maintainable support structure for after the “go live” date. “Support” includes not only the administration of the developed TM1 application but the “road map” to assign responsibilities such as:

  • Hardware monitoring and administration
  • Software upgrades
  • Expansion or reconfiguration based upon additional requirements (i.e. data or user base changes or additional functionality or enhancements to deployed models)
  • And so on…

Teaming Up

Earlier this year the Perficient analytics team teamed up with the IBM Cloud team to offer an interesting alternative to the “typical”: Cognos TM1 as a service in the cloud.

Using our internal TM1 models and colleagues literally all over the country, we evaluated and tested the viability of a fully cloud based TM1 solution.

What we found is that it works, and works well, offering unique advantages to our customers:

  • Lowers the “cost of entry” (getting TM1 deployed)
  • Lowers the total cost of ownership (ongoing “care and feeding”)
  • Reduces the level of capital expenditures (doesn’t require the procurement of internal hardware)
  • Reduces IT involvement (and therefore expense)
  • Removes the need to plan for, manage and execute upgrades when newer releases are available (new features are available sooner)
  • (Licensed) users anywhere in the world have access from day 1 (regardless of internal constraints)
  • Provides for the availability of auxiliary environments for development and testing (without additional procurement and support)

In the field

Once we were intimate with all of the “ins and outs” of TM1 10.2 on a cloud platform, we were able to work directly with IBM to demonstrate how a cloud based solution would address the specific needs of one of our larger customers. After that, the Perficient team “on the ground” developed and deployed a “proof of concept” using real customer data, and partnered with the customer for the “hands on” evaluation and testing. Once the results were in, it was unanimous: “full speed ahead!”

A Versatile platform

During the project life-cycle, the cloud environment was seamless, allowing Perficient developers to work (at the client site or remotely) and complete all necessary tasks without issue. The IBM cloud team was available (24/7) to analyze any perceived bottlenecks and, when required, to “tweak” things per the Perficient team’s suggestions, ensuring an accurately configured cloud and a successful, on-time solution delivery.

Bottom Line

Built upon our internal team’s experience and IBM’s support, our delivered cloud based solution is robust, cutting edge, and infinitely scalable.

Major takeaways

Even given everyone’s extremely high expectations, the project team was delighted and reported back the following major takeaways from the experience:

  • There is no “hardware administration” to worry about
  • No software installation headaches to hold things up!
  • The cloud provided an accurately configured VM – including dedicated RAM and CPU – based exactly upon the needs of the solution.
  • The application was easily accessible, yet also very secure.
  • Everything was “powerfully fast” – we did not experience any “WAN effects”.
  • 24/7 support provided by the IBM cloud team was “stellar”
  • The managed RAM and “no limits” CPUs set things up to take full advantage of features like TM1’s MTQ.
  • The users could choose a complete web based experience or install CAFÉ on their machines.

In addition, IBM Concert (provided as part of the cloud experience) is a (quote) “wonderful tool for our user community to combine both TM1 & BI to create intuitive workflows and custom dashboards”.

More to Come

To be sure, you’ll be hearing much more about Concert & Cognos in the cloud and when you do, you can count on the Perficient team for expert delivery.

Is IT Ready for Innovation in Information Management?
https://blogs.perficient.com/2014/06/20/4644/ | Fri, 20 Jun 2014

Information Technology (IT) has come a long way from being a delivery organization to one that is part of the business innovation strategy, though a lot still has to change in the coming years. Depending on the industry and the company culture, most IT organizations fall on the operational end of the spectrum, and a lot of progressive ones are gravitating towards innovation. Typically, IT may be consulted on executing the strategic vision. It is not IT’s role to lead the business strategy, but data and information are another story. IT is uniquely positioned to lead innovation in Information Management because of its knowledge of the data; if IT doesn’t take up that challenge, the business will look for outside innovation. Today’s marketplace offers tools and technologies directly to business users, and they are bypassing IT organizations that are not ready for the information challenge. A good example is business users trying out third-party (cloud) services and self-service BI tools for slicing and dicing data, cutting down the development cycle. The only way IT can play the strategic game is to get into the game.

It is almost impossible for IT not to pay attention to data and just bury its head in keeping-the-lights-on projects. So I took a stab at the types of products and technologies that have been maturing over the last 5 years in the Data Management space. This is by no means the complete list, but it captures the essence.

(Diagram: the Data Management products and technologies landscape)

An interesting phenomenon is that many companies traditionally late to adopt a data-driven approach are now using analytical tools, as those tools have become visually appealing and affordably priced. Cloud adoption is another trend, making technology deployment and management possible without a huge IT bottleneck.

The question every IT organization, irrespective of company size, should ask is: are we ready to take on the strategic role in the enterprise? How well can they co-lead the business solution, rather than just implement an application after the fact? Data Management is one area where IT needs to take the lead in educating and leading innovation to solve business problems. Predictive analytics and Big Data are right on top, with all the necessary supporting platforms including Data Quality, Master Data Management and Governance.

It will be interesting to know how many IT organizations leverage the Information Management opportunity.

(Image: detailed list of Data Management products and technologies)

 

Exercising IBM Cognos Framework Manager
https://blogs.perficient.com/2014/06/16/exercising-ibm-cognos-framework-manager/ | Mon, 16 Jun 2014

In Framework Manager, an expression is any combination of operators, constants, functions, and other components that evaluates to a single value. You can build expressions to create calculation and filter definitions. A calculation is an expression that you use to create a new value from existing values contained within a data item. A filter is an expression that you use to retrieve a specific subset of records. Let’s walk through a few simple examples.

Using a Session Parameter

I’ve talked about session parameters in Framework Manager in a previous post (a session parameter is a variable that IBM Cognos Framework Manager associates with a session – for example, user ID and preferred language – and you can also create your own).

It doesn’t matter if you use a default session parameter or one you’ve created, it’s easy to include a session parameter in your Framework Manager Meta Model.

Here is an example.

In a Query Subject (a set of query items that have a relationship and are used to optimize the data being received for reporting), you can click on the Calculations tab and then click Add.

Framework Manager shows the Calculation Definition dialog where you can view and select from the Available Components to create a new Calculation. The Components are separated into 3 types – Model, Functions and Parameters.

I clicked on Parameters and then expanded Session Parameters. Here FM lists all of the default parameters, and any I’ve created as well. I selected current_timestamp to add as my Expression definition (note: FM wraps the expression with the # character to indicate that it’s a macro that will be resolved at runtime).
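The generated expression looks roughly like the lines below – the first is the plain session parameter macro, the second nests the sq function (mentioned in the list that follows) to wrap the returned value in single quotes. The exact macro text is approximated here, so verify it against what the dialog produces:

#$current_timestamp#
#sq($current_timestamp)#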

During some additional experimentation I found:

  • You can add a reasonable name for your calculation
  • You may have to (or want to) nest functions within the expression statement (e.g. I’ve added the function “sq”, which wraps the returned value in single quotes). Hint: the more functions you nest, the slower the performance, so think it through.
  • If you’ve got the expression correct (the syntax, anyway), the blue Run arrow lights up and you can test the expression and view the results in the lower right-hand pane of the dialog. Tips will show you errors; Results will show the runtime result of your expression.
  • Finally, you can click OK to save your calculation expression with your Query Subject.

(Screenshot: the Calculation Definition dialog)
Filtering

Filtering works the same way as calculations. In my example I’m dealing with parts and inventories. If I’d like to create a query subject that perhaps lists only part numbers with a current inventory count of 5 or less, I can set a filter by clicking on the Filter tab and then Add (just like we just did for the calculation).

This time I can select the column InventoryCount from the Model tab and add it as my Expression definition. From there I can grab the “less than or equal to” operator (you can type it directly or select it from the Function list).
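The finished filter expression is a single comparison along these lines (the qualifying path depends on how your model names the query subject; this one is assumed):

[PartInventory].[InventoryCount] <= 5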

(Screenshot: the Filter Definition dialog)
Filter works the same as Calculation as far as syntax and tips (but it does not give you a chance to preview your result or the effect of your filter).

Click OK to save your filter.

JOIN ME

Finally, my inventory report is based upon the SQL table named PartInventory which only provides a part number and an inventory count. I’d like to add part descriptions (which are in a table named simply “Part”) to my report so I click on the SQL tab and create a simple join query (joining the tables using PartNo):
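The screenshot of the query didn’t survive here, but the join would look something like the sketch below (the part-description column name is assumed; the text only tells us that Part holds the descriptions and that both tables share PartNo):

SELECT pi.PartNo,
       pi.InventoryCount,
       p.PartName        -- assumed name of the description column
FROM   PartInventory pi
JOIN   Part p
  ON   p.PartNo = pi.PartNo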

To make sure everything looks right, I can click on the tab named Test and then click Test Sample.

You can see that there is a part name for each part number, the session parameter time stamp is displayed for each record, and only those parts in the database where the inventory count is 5 or less are returned:

(Screenshot: Test Sample results)

By the way, back on the SQL tab, you can:

  • Clear everything (and start over)
  • Enter or Modify SQL directly (remember to click the Validate button to test your code)
  • Insert an additional data source into your Query subject to include data from another source, perhaps an entirely different SQL database.
  • Insert a macro. For example, you can add inline macro functions to your SQL query.

Here is an example:

#$Corvette_Year_Grouping{$CarYear}#

Notice the # character to indicate the code within is a function to be resolved within the SQL query.

This code uses a parameter map (I’ve blogged about parameter maps in the past) to convert a session parameter (set to a particular vehicle model year) to the name of a particular SQL table column (and include that column of information in my query subject result). In other words, the database table column included in the query result is decided at run time.

(Screenshot: the SQL tab with the inline macro)
And our result:

(Screenshot: the query result)
You can see that these are simple but thought-provoking examples of the power of IBM Cognos Framework Manager.

Framework Manager is a metadata modeling tool that drives query generation for Cognos BI reporting. Every reporting project should begin with a solid meta model to ensure success. More to come…
