
How to Export and Purge ODI Log Files Using OdiPurgeLog Utility

Oracle Data Integrator (ODI) log files should be purged regularly to reduce the size of the ODI work repository and improve the performance of ODI Studio. Before purging, we can export the log files and archive them; if we ever need to analyze the log history, we can re-import the logs from the archive.

Exporting Log Files Manually:
To export log files manually, log in to ODI Studio, navigate to the Operator tab, click the ‘Connect Navigator’ icon, and click ‘Export’.

Select the ‘Export the Log’ option and click ‘OK’.

On the ‘Export the Log’ screen, specify the ‘Export to directory’ path, select the object type to export and the export criteria, then click ‘OK’.


Purging Log Files Manually:
To purge log files manually, log in to ODI Studio, navigate to the Operator tab, click the ‘Connect Navigator’ icon, and click ‘Purge Log’.
On the ‘Purge Log’ screen, select the object type to purge and the purge criteria, then click ‘OK’.

Archiving and Purging Logs Using the OdiPurgeLog Utility:
We can archive and purge the ODI log using the ‘OdiPurgeLog’ utility.
To purge all logs older than 10 days, create a variable ‘END_DATE_FOR_PURGE’ and refresh its value with the following query:

SELECT SYSTIMESTAMP-10 FROM DUAL
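If this variable is later passed to OdiPurgeLog’s date parameters, it can be convenient to refresh it as a formatted string instead. The date mask below is an assumption; verify the format your ODI version expects:

SELECT TO_CHAR(SYSDATE - 10, 'YYYY/MM/DD HH24:MI:SS') FROM DUAL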

Create a new package, then drag and drop the ‘OdiPurgeLog’ utility from the ‘All’ or ‘Utilities’ folder under the ‘Toolbox’.

Select the ‘OdiPurgeLog’ step in the package and enter the parameters. Log files will be archived to the path specified in the ‘Target Directory’ parameter.
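As a rough sketch, the package step above corresponds to an OdiPurgeLog tool call along these lines; the parameter names shown and the /archive/odi_logs path are illustrative, so confirm them against the ODI Tools Reference for your version:

OdiPurgeLog -TODATE=#END_DATE_FOR_PURGE -ARCHIVE=Yes -TODIR=/archive/odi_logs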

Save the package and generate a scenario.

Schedule this scenario to run daily or weekly, or add it as the last step of the daily incremental load plan.
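Alternatively, the generated scenario can be launched from the agent’s command line, which makes it easy to schedule with an external scheduler such as cron. The scenario name, version, and agent path below are hypothetical:

./startscen.sh PURGE_ODI_LOG 001 GLOBAL

A weekly cron entry (Sundays at 2 AM) might then look like:

0 2 * * 0 /odi/agent/bin/startscen.sh PURGE_ODI_LOG 001 GLOBAL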


Ashwin Pittampally
