Perficient Business Intelligence Solutions Blog

Posts Tagged ‘bi’

Perficient takes Cognos TM1 to the Cloud

IBM Cognos TM1 is well-known as the planning, analysis, and forecasting software that delivers flexible solutions to address requirements across an enterprise, while providing real-time analytics, reporting, and what-if scenario modeling. Perficient, in turn, is well-known for delivering expertly designed TM1-based solutions.

Analytic Projects

Perhaps phase zero of a typical analytics project would involve our topology experts determining the exact server environment required to support the implementation of a number of TM1 servers (based not only upon industry-proven practices, but also upon our own breadth of practical “in the field” experience). Next would come the procurement and configuration of that environment (and prerequisite software), and finally the installation of Cognos TM1.

It doesn’t stop there

As TM1 development begins, our engineers work closely with internal staff not only to outline processes for application and performance testing and for deployment of the developed TM1 models, but also to establish a maintainable support structure for after the “go live” date. “Support” includes not only the administration of the developed TM1 application but also a “road map” assigning responsibilities such as:

  • Hardware monitoring and administration
  • Software upgrades
  • Expansion or reconfiguration based upon additional requirements (e.g. data or user-base changes, additional functionality, or enhancements to deployed models)
  • And so on…

Teaming Up

Earlier this year the Perficient analytics team teamed up with the IBM Cloud team to offer an interesting alternative to the “typical”: Cognos TM1 as a service in the cloud.

Using our internal TM1 models and colleagues literally all over the country, we evaluated and tested the viability of a fully cloud based TM1 solution.

What we found is that it works, and works well, offering unique advantages to our customers:

  • Lowers the “cost of entry” (getting TM1 deployed)
  • Lowers the total cost of ownership (ongoing “care and feeding”)
  • Reduces the level of capital expenditures (doesn’t require the procurement of internal hardware)
  • Reduces IT involvement (and therefore expense)
  • Removes the need to plan for, manage and execute upgrades when newer releases are available (new features are available sooner)
  • (Licensed) users anywhere in the world have access from day 1 (regardless of internal constraints)
  • Provides for the availability of auxiliary environments for development and testing (without additional procurement and support)

In the field

Once we were intimate with all of the “ins and outs” of TM1 10.2 on a cloud platform, we were able to work directly with IBM to demonstrate how a cloud based solution would address the specific needs of one of our larger customers. After that, the Perficient team “on the ground” developed and deployed a “proof of concept” using real customer data, and partnered with the customer for the “hands on” evaluation and testing. Once the results were in, it was unanimous: “full speed ahead!”

A Versatile platform

During the project life-cycle, the cloud environment was seamless, allowing Perficient developers to work (at the client site or remotely) and complete all necessary tasks without issue. The IBM cloud team was available 24/7 to analyze any perceived bottlenecks and, when required, to “tweak” things per the Perficient team’s suggestions, ensuring an accurately configured cloud and a successful, on-time solution delivery.

Bottom Line

Built upon our internal team’s experience and IBM’s support, our delivered cloud based solution is robust, cutting edge, and highly scalable.

Major takeaways

Even given everyone’s extremely high expectations, the project team was delighted and reported back the following major takeaways from the experience:

  • There is no “hardware administration” to worry about
  • No software installation headaches to hold things up!
  • The cloud provided an accurately configured VM, including dedicated RAM and CPU sized exactly to the needs of the solution.
  • The application was easily accessible, yet also very secure.
  • Everything was “powerfully fast”; we did not experience any “WAN effects”.
  • 24/7 support provided by the IBM cloud team was “stellar”
  • The managed RAM and “no limits” CPUs set things up to take full advantage of features like TM1’s multi-threaded querying (MTQ).
  • The users could choose a completely web-based experience or install CAFE (Cognos Analysis for Excel) on their machines.

In addition, IBM Concert (provided as part of the cloud experience) is a (quote) “wonderful tool for our user community to combine both TM1 & BI to create intuitive workflows and custom dashboards”.

More to Come

To be sure, you’ll be hearing much more about Concert & Cognos in the cloud and when you do, you can count on the Perficient team for expert delivery.

How to Report on Employee Utilization in OBIEE?

One of the common HR reporting needs is to determine the Utilization and Availability of employees. These metrics may also be studied at a higher level. For example, checking Workforce Utilization Percentages across a company’s different organizations provides insight into how overstaffed or understaffed each organization is. This blog describes an OBIEE design methodology to support such reporting requirements.

A quick functional overview of how Utilization is calculated

While Utilization % tells how much actual work an employee has completed compared to their overall capacity, Availability indicates the remainder of the time, where an employee has been inactive or non-utilizable. For example, if someone’s Utilization is 80%, their Availability is 20% (100 – 80).

Utilization is defined as the ratio of Hours Worked over Capacity. Hours Worked is a function of the actual hours entered on a timecard throughout an employee’s workweek. There may be several variations of what defines Hours Worked, depending on the organization’s specific definition of the types of timecard hours that are utilizable. For instance, a consulting firm may count hours billable to a client as utilizable, but not hours spent on non-billable categories such as bench time and vacations. Capacity is typically a standard number of hours an employee is expected to work, irrespective of what gets entered on timesheets. For example, an employee who works 8-hour workdays has a capacity of 40 hours a week, whereas a part-time employee who works 3 days a week has a capacity of 24. Capacity usually excludes standard holiday hours, as such hours are not expected to be utilizable in the first place.

Following is a summary of the key metrics:

Utilization % = 100 x Hours Worked / Capacity

Availability % = 100 – Utilization %

Hours Worked: Timecard Hours that are considered utilizable

Capacity: Standard Work Schedule Hours – Standard Holiday Hours
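The key metrics above can be expressed directly in code. The following is an illustrative sketch only (it is not taken from any of the systems discussed here), and the weekly numbers are hypothetical:

```python
# Minimal sketch of the Utilization and Availability formulas above.
# The sample numbers below (32 worked hours, 40-hour capacity) are made up.

def utilization_pct(hours_worked, capacity):
    """Utilization % = 100 x Hours Worked / Capacity."""
    return 100.0 * hours_worked / capacity

def availability_pct(utilization):
    """Availability % = 100 - Utilization %."""
    return 100.0 - utilization

# Full-time employee: 40-hour capacity, 32 utilizable (e.g. billable) hours.
util = utilization_pct(32, 40)    # 80.0
avail = availability_pct(util)    # 20.0
```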

 

Data Model

No matter what transactional system your data is sourced from, Hours Worked and Capacity are most likely stored in different tables in that system. For example, in Oracle E-Business Suite, Hours Worked is sourced from the Oracle Time and Labor timecard tables, whereas Capacity is sourced from the HR assignment tables that associate employees with their corresponding work schedules and holiday calendars.

In my data warehouse model supporting Utilization calculations, I use 2 facts: a Timecard Fact and a Capacity Fact. Not all the dimensions in the two star schemas are conforming. For example, the Timecard Fact has dimensions that describe the type of hours: whether they are billable or not, vacation hours or project hours, work performed onsite or remotely, etc. Such attributes of a timecard are not relevant when we talk about capacity facts. For this reason, if we were to store both metrics (Hours Worked and Capacity Hours) in the same fact table, we would end up with an incorrect capacity, as capacity does not relate to all the timecard dimensions. Following is my schema for both stars, where Project, Task and Time Entry Status are non-conforming dimensions:

[Screenshot: Capture1]
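A small sketch (with made-up rows) of why the two metrics cannot share one fact table: if weekly capacity is repeated on every timecard row, any aggregation along the timecard dimensions double-counts it.

```python
# Hypothetical timecard rows for one employee and one week. If capacity is
# stored on each timecard row (one row per project), summing it is wrong.
timecard_rows = [
    {"employee": "E1", "project": "P1", "hours_worked": 24, "capacity": 40},
    {"employee": "E1", "project": "P2", "hours_worked": 8,  "capacity": 40},
]

hours = sum(r["hours_worked"] for r in timecard_rows)      # 32: correct
bad_capacity = sum(r["capacity"] for r in timecard_rows)   # 80: double-counted

# A separate Capacity fact holds one row per employee per week:
capacity_rows = [{"employee": "E1", "capacity": 40}]
good_capacity = sum(r["capacity"] for r in capacity_rows)  # 40: correct

bad_util = 100 * hours / bad_capacity    # 40.0 (understated)
good_util = 100 * hours / good_capacity  # 80.0
```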

OBIEE Design

In the RPD business layer, I built 3 logical facts and the same facts are made available in the Presentation layer:

  1. Timecard Fact: Sourced from the timecard OLAP fact table
  2. Capacity Fact: Sourced from the capacity OLAP fact table
  3. Utilization Fact: This fact has no physical data sources as all the metrics are based on the other 2 logical facts.
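For illustration, a metric in the logical Utilization Fact might be defined purely from the other two logical facts. The qualified names below are assumptions for the sketch, not taken from the actual RPD:

```
100 * "Timecard Fact"."Hours Worked" / "Capacity Fact"."Capacity"
```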

[Screenshot: Capture2]

I am now able to build a simple trend report that shows utilization broken down by Organization. Such a report is straightforward to build since both the Time and Organization dimensions are conforming between both facts: Timecard and Capacity.

[Screenshot: Capture3]

A more advanced reporting requirement may ask for utilization to be dynamically re-calculated in the report based on additional prompts on dimensions like Time Entry Status, Project or Task. These dimensions are not conforming and therefore cannot be added as prompts in the typical way. If interested in adding dynamic prompting on timecard-specific dimensions, you can see an example of how that is possible by referring to my other blog: OBIEE Prompting on Non-Conforming Dimensions.

Exercising IBM Cognos Framework Manager

In Framework Manager, an expression is any combination of operators, constants, functions, and other components that evaluates to a single value. You can build expressions to create calculation and filter definitions. A calculation is an expression that you use to create a new value from existing values contained within a data item. A filter is an expression that you use to retrieve a specific subset of records. Let’s walk through a few simple examples:

Using a Session Parameter

I’ve talked about session parameters in Framework Manager in a previous post (a session parameter is a variable that IBM Cognos Framework Manager associates with a session, such as user ID or preferred language, and you can also create your own).

It doesn’t matter if you use a default session parameter or one you’ve created, it’s easy to include a session parameter in your Framework Manager Meta Model.

Here is an example.

In a Query Subject (a set of query items that have an inherent relationship, used to optimize the data being retrieved for reporting), you can click on the Calculations tab and then click Add.

Framework Manager shows the Calculation Definition dialog where you can view and select from the Available Components to create a new Calculation. The Components are separated into 3 types – Model, Functions and Parameters.

I clicked on Parameters and then expanded Session Parameters. Here FM lists all of the default parameters, as well as any I’ve created. I selected current_timestamp to add as my Expression definition (note: FM wraps the expression with the # character to indicate that it’s a macro that will be resolved at runtime).

During some additional experimentation I found:

  • You can add a reasonable name for your calculation
  • You may have to (or want to) nest functions within the expression statement. For example, I’ve added the function “sq”, which wraps the returned value in single quotes. Hint: the more functions you nest, the slower the performance, so think it through.
  • If you’ve got the expression syntax correct, the blue Run arrow lights up and you can test the expression and view the results in the lower right-hand pane of the dialog. Tips will show you errors; Results will show the runtime result of your expression.
  • Finally, you can click OK to save your calculation expression with your Query Subject.
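Putting those pieces together, the finished expression definition would look something like the following, with sq nested around the session parameter as described above:

```
# sq ( $current_timestamp ) #
```

At runtime the macro resolves the session parameter, and sq wraps the returned value in single quotes.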

[Screenshot: june1]

Filtering

Filtering works the same way as calculations. In my example I’m dealing with parts and inventories. If I’d like to create a query subject that lists only part numbers with a current inventory count of 5 or less, I can set a filter by clicking on the Filter tab and then Add (just as we did for the calculation).

This time I select the column InventoryCount from the Model tab and add it as my Expression definition. From there I can grab the “less than or equal to” operator (you can type it directly or select it from the Function list).
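The resulting filter expression would read along these lines; the exact qualified name is an assumption, since it depends on your namespace and query subject names:

```
[PartInventory].[InventoryCount] <= 5
```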

[Screenshot: june2]

Filter works the same as Calculation as far as syntax and tips, but it does not give you a chance to preview the result or the effect of your filter.

Click OK to save your filter.

JOIN ME

Finally, my inventory report is based upon the SQL table named PartInventory which only provides a part number and an inventory count. I’d like to add part descriptions (which are in a table named simply “Part”) to my report so I click on the SQL tab and create a simple join query (joining the tables using PartNo):
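A join along those lines might look like the following SQL sketch (column names such as PartName are assumptions based on the report description, not taken from the actual schema):

```sql
SELECT i.PartNo,
       p.PartName,
       i.InventoryCount
FROM   PartInventory i
JOIN   Part p
  ON   p.PartNo = i.PartNo
```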

[Screenshot: june3]

To make sure everything looks right, I can click on the tab named Test and then click Test Sample.

You can see that there is a part name for each part number, that the session parameter timestamp is displayed for each record, and that only those parts in the database with an inventory count of 5 or less are returned:

[Screenshot: june4]

By the way, back on the SQL tab, you can:

  • Clear everything (and start over)
  • Enter or Modify SQL directly (remember to click the Validate button to test your code)
  • Insert an additional data source into your Query subject to include data from another source, perhaps an entirely different SQL database.
  • Insert a macro. For example, you can add inline macro functions to your SQL query.

Here is an example:

#$Corvette_Year_Grouping{$CarYear}#

Notice the # character to indicate the code within is a function to be resolved within the SQL query.

This code uses a parameter map (I’ve blogged about parameter maps in the past) to convert a session parameter (set to a particular vehicle model year) into the name of a particular SQL table column (and include that column of information in my query subject result). In other words, the database table column included in the query result will be decided at run time.
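As a rough analogue (in Python, with made-up keys and column names), a parameter map behaves like a look-up table consulted while the query text is being built:

```python
# Hypothetical stand-in for the Corvette_Year_Grouping parameter map:
# it resolves a session parameter (car model year) to a column name.
corvette_year_grouping = {
    "1963": "C2PartGroup",
    "1984": "C4PartGroup",
}

def resolve(param_map, key, default="StandardPartGroup"):
    """Look up a key, falling back to the map's default value."""
    return param_map.get(key, default)

car_year = "1963"  # pretend this came from the session
column = resolve(corvette_year_grouping, car_year)
query = "SELECT PartNo, {} FROM Part".format(column)
```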

[Screenshot: june5]

And our result:

[Screenshot: june6]

You can see that these are simple but thought-provoking examples of the power of IBM Cognos Framework Manager.

Framework Manager is a metadata modeling tool that drives query generation for Cognos BI reporting. Every reporting project should begin with a solid meta model to ensure success. More to come…

OBIEE Prompting on Non-Conformed Dimensions

A report that uses multiple facts may be prompted on dimensions that do not conform to all of the facts. At first, one may think such functionality is not valid. This posting demonstrates how such reporting requirements are common, and how they are achievable in OBIEE, though not in a very straightforward manner.

It is a basic OBIEE reporting concept that a report using metrics from more than one fact requires all the dimensional columns to be conformed across the facts used in the report. In other words, it makes no sense to look at a side-by-side comparison of revenue and cost by product if the cost information is not available by product to start with. However, it is a valid question to ask how revenue generated from certain products compares to the overall cost. Requirements like this usually leave us facing the problem of developing a report that sources data from two facts: a revenue fact supporting a product dimension, and a cost fact that does not support the product dimension. At first, one may be tempted to respond that a report like this is not possible, since we are dealing with a “multiple facts and a non-conforming dimension” situation. But a closer look reveals that such requirements are completely valid from a functional perspective and therefore should be doable. The problem that remains is that prompting a report on a non-conforming dimension leaves OBIEE at a loss as to how to aggregate a metric along a dimension it is not linked to.

Framework Manager – Creating a Parameter Map

A session parameter is a variable that IBM Cognos Framework Manager associates with a particular session. Examples include the current user name, the current active language, and the current date and time. Parameter maps are a method for substituting values based on keys.

A parameter map can be thought of as simple data “look-up table”.

Each parameter map has two columns:

  • a key column and
  • a value column (holding the value that the key represents).

In Cognos TM1, Lookup (or mapping) cubes (and dimensions) are common (and I’ve blogged on them before).

So let’s create a simple Framework Manager Parameter Map:

Well, to construct your map, you can:

  • enter the keys and values (for your map) manually,
  • import them from an external file, or
  • base them on query items in your Meta model

– it all depends upon the size and/or complexity of the parameter map you need to build.

Some helpful hints:

  • All parameter map keys must be unique so that Framework Manager can reliably obtain the correct value!
  • The value of one parameter can be the value of another parameter; in that case, enclose the entire value in number signs (#).
  • There is a limit of five levels when nesting parameters in this way.
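A sketch of how such nested resolution behaves (a Python analogue; the map contents and helper function are made up for illustration and are not Framework Manager internals):

```python
import re

# A value wrapped in number signs (#) refers to another parameter.
params = {
    "region": "#default_region#",
    "default_region": "NA",
}

def resolve(key, depth=0):
    """Resolve a parameter, following #...# references up to five levels."""
    if depth > 5:
        raise ValueError("parameter nesting exceeds five levels")
    value = params[key]
    match = re.fullmatch(r"#(.+)#", value)
    return resolve(match.group(1), depth + 1) if match else value
```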

So let’s look at an example exercise. I chose to use the “source file” method to create my map.

In Framework Manager, right-click on the Parameter Maps icon, then select Create and then Parameter Map:

[Screenshot: TPM1]

From there, you can enter a name for your parameter map.

Since I am converting (or mapping) (Corvette) part numbers into part descriptions, I’m naming my new parameter map:

“Keen Corvette Restoration Parts”,

and then selecting the option “Manually enter the parameter keys and/or import them from a file”:

[Screenshot: tpm2]

On the Create Parameter Map Wizard dialog, I entered a default value (a value to be used if a key doesn’t have a value in your map) and then clicked on Import File…

[Screenshot: tpm3]

I navigated to and selected my source file (to use a .txt file for import, the values must be tab-separated and the file must be saved in UTF-8 or the Unicode format; ANSI text files are not supported):

[Screenshot: tpm4]

I clicked OK and Framework Manager created my parameter map. It looked good (it did!), so I clicked Finish:

[Screenshot: tpm5]

And you can see my new map now exists in my project:

[Screenshot: tpm6]

Done!

If you double-click on your map, the Parameter Map dialog opens again, where you can clear your map, import a new source file (to overlay or add to your map), add new keys, export your map, or edit it directly.

Next time I will illustrate how to use the new parameter map!

Creating Business Logic – in Cognos Framework Manager

One of the goals of the Cognos Framework Manager modeler is to build a model that makes report authoring easier. To accomplish that, you “build in” business logic (a loosely defined term here) into your meta model.

This business logic can be simple (like simply renaming or hiding a database table column) or quite complex (like modifying SQL commands to return only specific data from a datasource).

Some of the basics you can add to your model are:

  • Stipulating attributes
  • Renaming or hiding database columns
  • Adding prompts
  • Applying filters
  • Creating calculations
  • Adding formatting to data items
  • Using folders and namespaces for grouping information
  • Using shortcuts to include the same information in different places

Let’s explore a few of these.

Stipulating Attributes

A simple example is changing the attribute type of an imported database column. This is done for various reasons, but mainly to dictate how Report Studio deals with the data in a report (for example, Report Studio may attempt to sum numeric fields, which may, depending upon the database column, produce a meaningless number).

An example (from my demo project) would be the ProductID field in the Products table (imported from SQL Server). It is imported as a FACT field (since it is defined as a numeric in the database table). To change this field’s attribute, I:

Select the database field (by clicking on it) and, in Framework Manager’s Properties pane (hint: if you can’t find the Properties pane, go to View on the FM menu and click Properties to make it visible), locate the property “Usage”. There, I can change the attribute:

[Screenshot: nan1]

Renaming or hiding database columns

This is even easier. If you’d like to rename a database column (to something more “user friendly”) or perhaps hide it (for example, we don’t want to display Salary information to the report authors), you can just:

  • right-click on the database column in the Framework Manager Explorer column and select (from the popup menu) “rename” or
  • back in the properties pane, change the “Is Hidden” property to TRUE (to hide the column)

Applying filters

Filters are neat. For example, my demo includes a database table named Employee. Right now, the meta model includes all records (all employees) in the table. That would include both active and inactive employees. I want to make sure that reports only use active employees, so I can alter the FM query (to apply a query filter) to accomplish this:

  1. Double-click on the table name Employee (in the FM explorer pane).
  2. FM shows me the “Query Subject Definition” dialog.
  3. Edit the SQL (hint: use the Validate button).
  4. Click OK.
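The edited query subject SQL might end up along these lines; the column and status values are assumptions for illustration, since the actual Employee schema isn’t shown:

```sql
SELECT EmployeeID,
       EmployeeName,
       Salary
FROM   Employee
WHERE  Status = 'Active'
```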

[Screenshot: nan2]

And so on. You can see that Cognos Framework Manager makes it easy to build your business logic into your meta model, but to be sure, the real work is to identify what logic to include and to obtain consensus from your business stakeholders on the correctness of that logic.

Until next time…

Cognos Framework Manager, Transformer & Metric Designer

During a recent project of mine, I jumped “head first” into IBM Cognos Framework Manager.

Yes, it is an awesome tool, but where exactly does it “fit” into the Cognos environment? Are there similar or alternate tools that can be used?

I just had to have a quick look!

The Cognos Components

[Screenshot: comp]

Okay, so the components of the Cognos 10 BI environment include:

  • Business Insight & Insight Advanced
  • Cognos Connection
  • Cognos Viewer
  • Query Studio
  • Report Studio
  • Analysis Studio
  • Event Studio
  • Metric Studio
  • Admin
  • Framework Manager
  • Transformer
  • Metric Designer

Of all these components, our “modeling tool options” are Framework Manager, Transformer and Metric Designer. Here is some background on each:

Framework Manager

This is a tool for creating business related metadata for use in IBM Cognos BI analysis and reporting. This metadata is then published for use by reporting tools as “packages”, which provide an integrated view of your data sources.

Framework Manager can be used to:

  1. model for predictable results (star schema)
  2. model for OLAP-style queries (model dimensionally)
  3. create one or more business views
  4. add calculations
  5. create and apply filters
  6. add prompts
  7. set row level security

Transformer

This is a tool for modeling dimensional hierarchies and levels for PowerCubes. You can use Transformer to create a “business organization” of the information in your data sources.

You add dimensional metadata, specify measures, apply custom views, and then create PowerCubes (based on this model). You can then deploy these cubes to support OLAP reporting and analysis.

Metric Designer

This is a modeling tool used to create extracts for use in IBM Cognos scorecarding applications. You can use these extracts to map and transfer information from existing metadata sources (like Framework Manager!). Metric Designer can be used to create scorecard content such as:

  1. Watch lists
  2. Scorecards
  3. Strategies
  4. Metric types
  5. Individual metrics

Conclusion

My conclusion is that it is imperative for every Cognos developer to understand what modeling tools are available for use and where they “fit” in the overall environment. Of course, this is a general “rule of thumb” that is true in any profession – “know your tools”.

 

IBM Cognos Framework Manager – Proven Practice

A proven practice for Framework Manager meta modeling is to divide your model into a series of layers, each layer having a specific purpose. (Originally, IBM recommended the use of two layers, data and modeling, but later added a presentation layer and, optionally, a separate dimensional layer.)

The layers should be:

  • The top layer – or the “presentation layer”,
  • The middle layer – or the “logical layer” and
  • The lowest level – or the “data layer”.

(The dimensional layer can be used to replace the presentation layer, or can be used in addition to the presentation layer).

The Data layer

Contains the data source or “query subjects”.

The Logical layer

Where most of the modeling is done, providing business context to the data layer. For example, you can join fields from multiple tables, rename fields, assign aliases, organize by folders, etc.

The Presentation layer

This is what the report author sees when you publish a package for report creation.

The Dimensional layer

The dimensional layer is required only for models that include Dimensionally Modeled Relational (DMR) data rather than only relational data. Specifically, it is for creating regular and measure dimensional query subjects.

By Default…

In my last post, I created a new FM project named “corvette” and added a simple relational datasource (a SQL database).

[Screenshot: boy1]

You can see that by default, FM created a “namespace” named “FelxibleModel”.

In order to adhere to our “layers” proven practice, we need to make some changes to our project (before we do any real modeling). To do that, you can simply right-click on the namespace and select Create and then Namespace:

[Screenshot: boy2]

From there, rename the namespace to “Data Layer” (right-click on the namespace and select Rename). Next, I moved my 2 database tables (Product and Sales) under it (click, drag and drop):

[Screenshot: boy3]

Now I can create my 2 other namespaces (Logical and Presentation):

[Screenshot: boy4]

Remember to save the project. One note: the names for the “layers” can be anything that makes sense within your organization, but give some thought to a convention, as object names must be unique identifiers.

Now we are ready to do some real “meta modeling”. See you next post!

Flexible Meta Models and IBM Cognos Framework Manager

A metadata model is “a gathering of Meta information that includes both physical information and business information for one or more datasources and is the foundation for both future modelling and report development within an organization”.

Meta-model flexibility can be defined as the ability of a (Meta) model to:

  • Easily expand and grow (to support other reporting needs) and
  • Be extremely easy to use (when generating query or report requests).

In a business reporting environment, IBM Cognos Framework Manager is “the” metadata modeling tool that solves for this. Framework Manager enables performance management on normalized and de-normalized relational data sources and a variety of OLAP data sources.

With only a little foresight and planning, a “framework manager model developer” can deliver large gains by creating models that grow with users and are easy to use, support and maintain.

So let’s “kick the tires”.

In Framework Manager, you work in the context of a “project”, which holds all of the “objects” that you organize for your users to use. To create a project:

From the Welcome page, click Create a new project.

[Screenshot: fm1]

In the New Project page, specify a name and location for your project (mine is the “Stingray” project), and click OK.

 

[Screenshot: fm2]

Hint: if this is your first project with Framework Manager, clear the Use Dynamic Query Mode checkbox! You’ll then be asked to select the design language for the project; click OK. The Metadata Wizard displays next, where you can select an existing datasource, create a new one, or skip this step entirely. A data source connection supplies the information that IBM Cognos BI needs to connect to a database. You can customize data source connections to meet the needs of users.

Let’s set up a new datasource. Select “Data Sources” and click Next, then select New… You’ll see the “Welcome – New Data Source” wizard (again, click Next); then specify a name and description for your datasource.

[Screenshot: fm3]

Click Next, then specify your connection information (mine is a SQL Server 2012 database, so I selected “SQL 2012 Native Client”), then click Next:

[Screenshot: fm4]

You’ll then provide the server name, the database name (mine is named “Surfer”), and a user name and password that will be used to access the database:

[Screenshot: fm5]

[Screenshot: fm6]

Before proceeding, always test your connection! After you’ve verified a “successful connection”, you can set the DB commands you want to allow and click Finish!

[Screenshot: fm7]

Success! We’ve got a datasource for our project:

 

[Screenshot: fm8]

Next, select the database objects you want to include in your project:

[Screenshot: fm9]

Click Next. The Generate Relationships dialog will be presented; for my project, I left the default selections (you can check the product documentation for specifics on each setting). Click Import:

[Screenshot: fm0]

Framework Manager will tell you when it completes the import processing and you can then click Finish.

We now have a new project (Stingray) which includes a relational datasource. All projects are displayed in the project page, where you design, package, and publish project metadata:

[Screenshot: fm11]

If you’d like, you can add additional datasources to your project or just begin your modeling. Before starting, though, you should understand proven practices around naming and namespaces, and be familiar with the new Framework Manager feature, object validation; but you’ll have to wait for the next post for that (I’ll walk through the basic Framework Manager operational features as well, and tell you all about my latest project involving IBM Cognos Framework Manager!).

Thanks for reading.

Splunk, it’s a setup

In Splunk Web, if you click on Apps, you should see a link for Manage Apps. From there, Splunk web will display the Apps page where you can see all of the apps currently installed in your Splunk instance.

From this page (sometimes referred to as “the manager”), you can enable or disable an app, set permissions, and edit, change, or view an app’s properties.

One more thing you may notice is an “Action” named “Set up” (not all apps have this). If an app does offer this link, you can use it to set the app’s default properties (whatever properties are configurable for that app). That means you don’t have to edit Splunk’s configuration files to change the app’s properties (pretty helpful, I think).

So, as a Splunk app developer, it would be nice if the apps you develop included a setup screen. How do you do that? Well…

It’s easy to create an app setup screen for your app:

  1. Create a setup.xml file and “drop” it into your app’s default directory:

$SPLUNK_HOME/etc/apps/<AppName>/default/setup.xml

  2. Edit that setup.xml file to provide values for the fields in your app’s configuration files (the setup screen will use these values to populate its input fields).

So here is a simple illustration.

My custom Splunk app is set up to do some very “extreme” searching on TM1 logs generated by my organization’s Cognos TM1 Server (I’ve named it “Extremely Searchable”).

To create a simple setup screen for this app, I created the following setup.xml (using Windows Notepad) and copied it into

$SPLUNK_HOME/etc/apps/extreme/default/setup.xml

[Screenshot: set1]

A little explanation: I used 3 “blocks”. The first provides a title and description, the second provides an on/off checkbox to set the check_for_updates field for my app, and the third gives two input text boxes for adding a new “app user” username and password. (If you fill this screen out and click Save, Splunk will update the appropriate configuration files for you.)

No Splunk restart is required. So back into Splunk Web, from Apps, Manage Apps, I clicked on Set up and my setup screen is displayed:

[Screenshot: set2]

XML

If you are somewhat familiar with basic XML, then creating a Splunk app setup.xml will be easy. The general format observes XML conventions and the following are the XML tags you’ll use:

<setup> This is the file “base element” for your setup screen.

<block> The block element defines the UI for the app setup screen.

<text> This is an optional element that provides descriptive text for the app setup screen.

<input> The input element collects input from the user.

<label> The label is the description of the input field which is displayed on the setup screen.

<type> This specifies the UI control for capturing user input. Allowed values for the type element are bool, text, password, and list.
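Pulling those tags together, a minimal setup.xml along the lines of the three-block example described earlier might look like this. It is a simplified sketch: a real file also specifies, via block attributes, which configuration endpoint and entity each input writes to, and the field names here (other than check_for_updates) are assumptions:

```xml
<setup>
  <block title="Extremely Searchable setup">
    <text>Configure the app before first use.</text>
  </block>
  <block title="Updates">
    <input field="check_for_updates">
      <label>Check for app updates</label>
      <type>bool</type>
    </input>
  </block>
  <block title="App user">
    <input field="username">
      <label>User name</label>
      <type>text</type>
    </input>
    <input field="password">
      <label>Password</label>
      <type>password</type>
    </input>
  </block>
</setup>
```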

 

Adding a setup screen to your apps makes them more suitable for sharing with the Splunk community, and it doesn’t require hours of extra effort. Do it!