

SPSS Collaboration and Deployment Services

Last time I mentioned IBM SPSS Collaboration and Deployment Services and promised to talk more about it, so here we go:

Analytical Assets

Organizations positioning themselves to take full advantage of analytics will look to separate the effort of developing analytical assets from the effort of actually using them, dividing the work between “creators” and “consumers”. Generally speaking, an individual (or team) can focus on the modeling process (creating), while key decision makers focus on making decisions based upon analytical results (consuming).

The Collaboration and Deployment Services product provided by IBM does more than offer a “virtual bridge” between analytical creation and consumption; it provides a “robust repository” where all of the parts that make up an analytical asset can be accessed and consumed. Additionally, the package acts as a centralized, searchable “enterprise analytical warehouse” (EAW) designed to enable the secure sharing and reuse of all of an organization’s analytical assets. This allows identified users within an organization to explore and consume the analytical information available, which in turn increases the effectiveness of decision making at every level of the enterprise.
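
To make that concrete, the repository can be scripted as well as browsed. Below is a minimal sketch assuming the product’s “Essentials for Python” scripting API; the host, credentials, and repository paths are invented for the example, and the method names should be verified against the documentation for your release:

```python
# A minimal sketch, assuming the C&DS "Essentials for Python" scripting API
# (pes.api.PESImpl). Host, credentials, and repository paths are invented,
# and the method names should be checked against your release's docs.
from pes.api.PESImpl import PESImpl

# Connect to the repository -- the "enterprise analytical warehouse".
pes = PESImpl("admin", "secret", "cads.example.com", "8080")

# A "creator" publishes a Modeler stream for others to find and reuse.
pes.uploadFile(source="C:/work/churn_model.str",
               target="/Analytical Assets/Churn")

# A "consumer" (or a downstream script) pulls the asset back down.
pes.downloadFile(source="/Analytical Assets/Churn/churn_model.str",
                 target="C:/scoring")
```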

The Bridge

The IBM documentation states that within Collaboration and Deployment Services, a user can:

  • Store analytical assets in a central, searchable repository, enabling the standardization and reuse of models to improve efficiency and results
  • Develop custom interfaces that run analytical processes, giving analysts and others greater control over how they access and use analytics
  • Enable others to generate their own analytical output through a browser-based interface
  • Operationalize analytical processes by initiating specific jobs (such as scores or reports) on demand, at a scheduled time, or when triggered by other events, and by orchestrating complex jobs across multiple systems and applications (a pattern sketch follows this list)
  • Govern the environment in which the analytical processes occur and increase confidence in results
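
That “operationalize” bullet is worth a quick illustration. Because the exact job-execution interface varies by release, the sketch below uses a plainly hypothetical wrapper, run_repository_job(), to show the three invocation patterns (on demand, scheduled, and event-triggered) rather than a specific C&DS call:

```python
# A pattern sketch only: run_repository_job() is a hypothetical stand-in for
# whatever job-execution interface (scripting API or web service) your C&DS
# release exposes; it is not a real C&DS function.
import sched
import time

def run_repository_job(job_path: str) -> None:
    # Placeholder: call the repository's job-execution service here.
    print(f"executing repository job: {job_path}")

# 1. On demand: invoke the job directly.
run_repository_job("/Jobs/score_customers")

# 2. At a scheduled time: the stdlib scheduler stands in for the
#    repository's own scheduling facilities.
scheduler = sched.scheduler(time.time, time.sleep)
scheduler.enter(3600, 1, run_repository_job, ("/Jobs/nightly_scores",))
# scheduler.run()  # blocks until the scheduled job fires

# 3. Triggered by another event: orchestrate a report that should run
#    only after scoring completes successfully.
def on_scoring_complete() -> None:
    run_repository_job("/Jobs/score_summary_report")
```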

In an earlier post, I broke down basic data analysis into three steps:

  1. Identifying (and preparing) the data,
  2. Selecting an analysis and summation method, and
  3. Presenting the results

To that point, an intermediate step might be added here: “deployment”, which would involve saving all of the artifacts of the first two steps to the organization’s “enterprise analytical warehouse” using Collaboration and Deployment Services.

System Architecture


In general, Collaboration and Deployment Services consists of a single, centralized “analytical warehouse” (the Collaboration and Deployment Services Repository) serving a variety of clients, which rely on execution servers to process and consume the available analytical assets.

An organization’s functional analytical architecture would (or should) include the following components, sketched as a simple data structure after the list:

  • Thin clients (portals) and thick clients for CDS management and report design
  • (Optionally) thick clients running “product collaboration”, which allows direct access to the repository, and to file artifacts, from within their native product
  • A server (or server farm) acting as the analytical warehouse
  • A database server, and
  • Multiple execution servers
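
Here is that same topology expressed as a simple data structure, purely for illustration; the host names, counts, and product choices are invented, not installation defaults:

```python
# Illustrative topology only: host names, counts, and product choices are
# invented for the example, not installation defaults.
cads_topology = {
    "analytical_warehouse": {                      # the C&DS repository
        "servers": ["cads-app-01", "cads-app-02"],     # single server or a farm
        "application_server": "WebSphere",             # or another supported app server
    },
    "database_server": {"host": "cads-db-01", "type": "DB2"},
    "execution_servers": ["modeler-exec-01", "stats-exec-01"],
    "clients": {
        "thin": ["browser-based portal"],
        "thick": ["CDS management console", "report designer",
                  "Modeler with product collaboration"],
    },
}

# e.g., enumerate where analytical jobs can actually run:
for host in cads_topology["execution_servers"]:
    print(f"execution server: {host}")
```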

 

Configuring the Environment

A typical installation and deployment is too detailed to explore fully in a blog post, of course, but the following notes are important:

  • It is advisable to engage an experienced resource to ensure a successful installation and deployment
  • Specific minimum hardware and software requirements must be met or (ideally) exceeded
  • Prior to the install, verify that the necessary application server, database configuration, hardware, software, and permissions requirements have been met
  • The installing user must have the appropriate file system permissions
  • Before attempting any installation, all required application servers and databases must be running and accessible (a simple reachability check is sketched after this list)
  • Virtualization can be utilized, but it adds complexity and requirements to the installation
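
Since an install fails fast when a prerequisite server is down, a small pre-flight script can save time. The sketch below checks only TCP reachability using the Python standard library; the hosts and ports are placeholders for your own application server and repository database, and reachability does not validate versions or permissions:

```python
# Pre-install connectivity check. Hosts and ports are placeholders for your
# environment; a successful connection proves reachability only, not that
# version, configuration, or permission requirements are met.
import socket

PREREQUISITES = {
    "application server": ("appserver.example.com", 9080),
    "repository database": ("dbserver.example.com", 50000),  # e.g., a DB2 port
}

def reachable(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for name, (host, port) in PREREQUISITES.items():
    status = "OK" if reachable(host, port) else "NOT reachable"
    print(f"{name} ({host}:{port}): {status}")
```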

 

Scaling to the Future

The future of the analytical architecture is fully supported by the use and optimization of the following:

  • Migration tools,
  • Optional add-on components,
  • Clustering schemas,
  • Logging services, and
  • An import tool

Conclusion

The IBM SPSS family fully supports advanced analytics at the enterprise level with Collaboration and Deployment Services. Other important members of this family include Data Collection, Modeler, Decision Management and (of course) Statistics.

One by one, I am going to expose them all!

______________________________________________________________

Thor: [walking into a pet shop] I need a horse!
Pet Store Clerk: We don’t have horses. Just dogs, cats, birds.
Thor: Then give me one of those large enough to ride.

