
Journey of CDC Configuration in Oracle Data Warehouse Admin Console

For most requirements, developers only need to modify the existing Informatica code repository and the corresponding DAC objects. However, when a requirement is brand new and no out-of-the-box (OOTB) entity is available, a new set of mappings must be developed and a DAC execution plan needs to be set up.

CDC stands for Change Data Capture, a technique for capturing changes that happen in a transactional source system and bringing the changed data (inserted, updated, and deleted rows) into the data warehouse. CDC can be implemented through various approaches, such as Oracle GoldenGate, custom scripts, or database triggers.
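As a minimal illustration of the trigger-based approach (the table, column, and trigger names below are hypothetical and not part of OBIA), a database trigger can log the keys of changed rows into a change table that a downstream ETL process later reads:

CREATE OR REPLACE TRIGGER orders_cdc_trg
AFTER INSERT OR UPDATE OR DELETE ON orders
FOR EACH ROW
BEGIN
  -- Record the key of the changed row and the type of change;
  -- a downstream ETL job reads orders_change_log to pick up the deltas.
  INSERT INTO orders_change_log (order_id, change_type, change_dt)
  VALUES (NVL(:NEW.order_id, :OLD.order_id),
          CASE WHEN INSERTING THEN 'I' WHEN UPDATING THEN 'U' ELSE 'D' END,
          SYSDATE);
END;
/

Approaches such as GoldenGate or the OBIA image-table mechanism described below capture changes without putting per-row triggers on the source tables.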

Documentation that guides developers through a brand-new configuration in the DAC tool is rare. I recently went through the journey of configuring DAC for an Oracle data warehouse CDC strategy, and the lessons I learned may help newcomers.

DAC CDC Behavior

  • By default, the CDC mechanism is provided in Oracle Business Intelligence Applications (OBIA) and is implemented along with the Informatica SDE and SIL mappings. Generally speaking, DAC takes a "build image" approach during SDE extract processing, while the SIL process loads all records from the staging tables. This means most of the CDC configuration work is on the SDE side.

 

  • DAC will pick up the latest version of the Informatica code as long as everything is checked in. If you modified the code but did not check it in, DAC will not execute the latest logic.

 

  • There is a Unit Test button under Execution Plan -> Ordered Tasks, but it does not perform the complete CDC steps, such as inserting into the image table or building the view automatically.

 

  • Check the system property "Drop and Create Change Capture Views Always" and make sure it is set to TRUE so that DAC creates the change capture views automatically. For example, the view name for S_SRV_REQ is V_SRV_REQ. During a task run (not the unit test process), DAC executes the script that creates this view. For an incremental load the view joins to S_ETL_I_IMG_35; for a full load it does not. A hand-written sketch of such a view appears after this list.

 

  • All image tables, including S_ETL_I_IMG_*, S_ETL_R_IMG_*, and S_ETL_D_IMG_*, have to be created manually. Pick an unused number from the image table list and use it in the table names (a DDL sketch follows right after this list).
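Regarding the last bullet, a minimal sketch of the manual image table creation is below. The suffix 38 is just an example of an unused number, and the columns shown are only the typical ones, so copy the exact structure from an existing S_ETL_*_IMG_* table in your warehouse rather than from this sketch:

-- Hypothetical image tables for a new entity; verify the column layout
-- against an existing image table in your schema before creating them.
CREATE TABLE S_ETL_I_IMG_38 (
  ROW_ID            VARCHAR2(15) NOT NULL,
  MODIFICATION_NUM  NUMBER(10),
  LAST_UPD          DATE
);
CREATE TABLE S_ETL_R_IMG_38 AS SELECT * FROM S_ETL_I_IMG_38 WHERE 1 = 0;
CREATE TABLE S_ETL_D_IMG_38 AS SELECT * FROM S_ETL_I_IMG_38 WHERE 1 = 0;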
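And for the change capture view described a couple of bullets up, the view DAC generates for S_SRV_REQ during an incremental run looks roughly like the following. This is a hand-written approximation (the join on ROW_ID is an assumption, and the real script lists explicit columns), not the exact statement DAC emits:

-- Incremental load: restrict the source table to the rows recorded in the image table
CREATE OR REPLACE VIEW V_SRV_REQ AS
SELECT SRQ.*
FROM   S_SRV_REQ SRQ,
       S_ETL_I_IMG_35 IMG
WHERE  SRQ.ROW_ID = IMG.ROW_ID;

-- Full load: the same view is created without the join to the image table, e.g.
-- CREATE OR REPLACE VIEW V_SRV_REQ AS SELECT SRQ.* FROM S_SRV_REQ SRQ;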

 

Steps to Configure DAC for a New Entity

Scenario: We received a new requirement for an Invoice Charge fact and are going to create the relevant code, mappings, and configuration to load rows into the data warehouse.

1. Design the mapping and create the *_FS (staging) and *_F (fact) tables in the Oracle warehouse (a DDL sketch appears after these steps)

2. Create the SDE mapping/workflow to extract data from the specified source container

3. Create the SIL mapping/workflow to load the staging data into the Oracle data warehouse

4. In DAC, import the tables, columns, and indexes to synchronize the DAC metadata with the physical objects in the database

5. Create a task in the Design tab, set the property values in the Edit subtab, and make sure the Build Image option is checked for the extract task

6. Add the source/target tables for each task; note that the source types include Primary, Auxiliary, and Lookup

7. Create a new Subject Area for testing, or find an existing one, and associate the created tasks with it

8. A complete star schema needs a fact table, so add the fact tables to the associated or newly created Subject Area and click the Assemble button

9. Create a new Execution Plan for testing, or find an existing one, and add the entries

10. In the Execution Plan's Parameters subtab, click Generate; it will create the parameters automatically

11. Click the Build button to build out the ordered tasks

12. If you are running the tasks for the first time, go to Setup -> Physical Data Sources and make sure the Refresh Dates for the source tables are NULL; otherwise DAC will perform an incremental load

13. Go back to the Execution Plan and click Run Now; the entry will appear in the Current Run tab

Check the log of each task to find out what is happening in the backend.
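As referenced in step 1, the staging (*_FS) and fact (*_F) tables must exist in the Oracle warehouse before they can be imported into DAC. A minimal, hypothetical sketch for the Invoice Charge example is below; the table and column names are illustrative only and should follow your project's naming standards:

-- Hypothetical staging table for the Invoice Charge fact
CREATE TABLE WC_INVOICE_CHARGE_FS (
  INTEGRATION_ID     VARCHAR2(80)  NOT NULL,
  DATASOURCE_NUM_ID  NUMBER(10)    NOT NULL,
  INVOICE_NUM        VARCHAR2(30),
  CHARGE_AMT         NUMBER(28,10),
  CHARGE_DT          DATE
);

-- Hypothetical target fact table
CREATE TABLE WC_INVOICE_CHARGE_F (
  ROW_WID            NUMBER(10)    NOT NULL,
  INTEGRATION_ID     VARCHAR2(80)  NOT NULL,
  DATASOURCE_NUM_ID  NUMBER(10)    NOT NULL,
  INVOICE_NUM        VARCHAR2(30),
  CHARGE_AMT         NUMBER(28,10),
  CHARGE_DT_WID      NUMBER(10),
  ETL_PROC_WID       NUMBER(10)
);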

Finally, I want to mention that checking the detailed DAC task logs is a good way to learn how the DAC tool organizes and processes its ordered tasks. Many pre-built scripts and processes run internally, and becoming familiar with them will help your learning and development.

 
