

Data Conversion Strategies In Oracle E-Business Suite Implementation
Sunday, 04 May 2008 15:28 | Prasad Bhogle

Executive Summary

When one talks about an Oracle E-Business Suite implementation, the following questions about data arise:

How will the existing legacy system data be made available in the new system?
Will there be a complete move from the OLD system to the NEW system, or will both systems co-exist for some time before the complete cutover?
Will any communication be needed between the OLD and NEW systems?
Most important: HOW is all of this achieved?

While discussing these questions, a few terms get used interchangeably, viz. Data Migration, Data Conversion, Interfaces etc. All of them can be brought under one umbrella category called Data Movement.

This white paper talks about the various strategies and technologies that can be used for data communication between a legacy system and Oracle E-Business Suite. It gives an overview of the various technical components involved in taking care of that most important word: "DATA".

This white paper tries to give direction for answering the above questions. To make legacy data available to the new system, one has to perform data entry manually or use tools like Data Loader or WinRunner, which simulate keyboard/mouse actions and make data entry faster. If the data volume is huge, a programmer/developer's help is needed to build programs that automate the data loading. If the two systems are going to coexist for some time and communicate with each other, then programs which run at regular intervals to move the data are needed.

http://apps2fusion.com/at/50-pb/255-data-conversion-strategies-in-oracle-e-business-suite-... 8/18/2010

Defining Various Terminologies

Data Migration is the process of moving required (and most often very large) volumes of data from a client's existing systems to new systems. The existing systems can be anything from custom-built IT infrastructures to spreadsheets and standalone databases.

Data conversion can be defined as a process of converting data from one structural form to another to suit the requirements of
the system to which it is migrated.

An Interface is a set of programs connecting two systems in order to synchronize their data. Interfaces can be Manual, Batch or Real-Time. They are used repeatedly and should therefore be designed and constructed in the most efficient manner possible.

Though Data Migration, Conversion and Interfaces are treated differently, technical folks should not differentiate between them. All such programs should be designed the same way as interfaces, because in most implementations data needs to be moved multiple times, and for some period both the legacy system and Oracle Apps 11i happen to be active. Bottom line: if data needs to be moved programmatically, i.e. without using tools like Data Loader or manual data entry, then it is better to design the program so that it can be executed multiple times as a concurrent program.

Data Movement Entities


Generally, data conversions/interfaces include the following entities:

1. Legacy System (Oracle/Other database e.g. DB2/File System)


2. Oracle Apps custom staging tables
3. Shell scripts to read data files and load them in custom staging tables using SQL*Loader utility
4. Oracle Apps custom data validation programs
5. Oracle seeded open interface tables (e.g. MTL_TRANSACTIONS_INTERFACE)
6. Oracle seeded APIs (e.g. Process Order API)
7. Oracle Apps Seeded Import Program (e.g. Item Import, BOM Import)

Legacy System

Legacy systems come in various types: they can be based on databases like Oracle or DB2, or on file systems. Generally it is better to use files extracted from the legacy system while developing a data conversion program, as the data volume is very high. DB links can be used to connect to other databases while developing interfaces.

1. ASCII files :- As a practice, most implementations have ASCII files as the input to Oracle. These files can be comma/pipe/tab delimited files with a defined file format. If Excel files are received, they need to be converted to CSV files manually.
2. XML files :- Nowadays XML (Extensible Markup Language) is becoming the worldwide standard; most systems accept and understand the XML format. The source system can generate XML files based on a format defined by a DTD (Document Type Definition) or XSD (XML Schema Definition). The DTD/XSD defines the structure/tags present in the XML document.
3. Legacy outbound staging tables :- A legacy or home-grown system can be based on databases like Oracle, DB2 etc. Nowadays there are drivers available in Oracle to connect to non-Oracle databases using DB links. If the data volume is low, it is better to use DB links than ASCII files, so that we can get rid of FTP/SFTP failure issues. It is also easier to update database tables directly to record the current status of a record (Success/Error etc.) instead of exchanging files.
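For illustration, a pipe-delimited extract might look like the hypothetical file below (the file name, item values and column layout are assumptions, not from any specific legacy system); a quick awk check confirms each record carries the expected number of fields before the file is handed to SQL*Loader:

```shell
# Create a hypothetical 3-column pipe-delimited item extract
# (item code | description | unit of measure).
cat > Item_OnHand.dat <<'EOF'
AS54888|Sentinel Standard Desktop|Ea
CM13139|Plastic Stock|Lb
EOF

# Count records that do NOT have exactly 3 fields; anything non-zero
# should be investigated before loading.
awk -F'|' 'NF != 3' Item_OnHand.dat | wc -l
```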

Oracle Apps 11i E-Business Suite Components

Oracle Apps Custom staging tables :- As a thumb rule, the structure of a custom staging table should be the same as the source data file or source legacy database table. Columns like Interface_ID, Batch_ID, Process_Flag, Error_Message, WHO columns and Extended WHO columns should be present in the table structure.
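As an illustrative sketch (the table and data column names below are hypothetical, not seeded objects), such a staging table might be defined as:

```sql
-- Hypothetical inbound staging table for an item conversion;
-- columns mirror the source file plus the control columns described above.
CREATE TABLE xxinv_item_stg
( interface_id            NUMBER,            -- surrogate key for each record
  batch_id                NUMBER,            -- load batch identifier
  segment1                VARCHAR2(40),      -- item code from the legacy file
  description             VARCHAR2(240),
  uom_code                VARCHAR2(3),
  process_flag            NUMBER DEFAULT 1,  -- 1 = PENDING
  error_message           VARCHAR2(4000),
  -- WHO columns
  created_by              NUMBER,
  creation_date           DATE DEFAULT SYSDATE,
  last_updated_by         NUMBER,
  last_update_date        DATE DEFAULT SYSDATE,
  -- Extended WHO columns
  request_id              NUMBER,
  program_application_id  NUMBER,
  program_id              NUMBER,
  program_update_date     DATE
);
```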

Shell Script with SQL*Loader :- A shell script (Host) type of concurrent program should be used to load data files into the custom staging tables, instead of the SQL*Loader executable type. The shell script internally uses the SQL*Loader command line utility to load the data and then archives the data files. With a shell script the data file path is not hard coded, and archiving and monitoring the bad file are easily achievable, which is not the case with the SQL*Loader executable type. As a rule there should be no hard coding in shell scripts, e.g. directory paths, username, password etc. Inbound data files can be placed in directories like $XXINV_TOP/datafiles/in.

The shell script should read these files, load them into the custom tables and archive the data files in $XXINV_TOP/datafiles/in/archive with a date timestamp appended to the file name. SQL*Loader log files and bad files can reside in the $XXINV_TOP/datafiles/in/log and $XXINV_TOP/datafiles/in/bad folders.

This shell script should also read the bad files generated by SQL*Loader and complete the concurrent program in Success/Warning/Error status. A sample shell script is given in Annexure I.
This shell script has the following sections:

Extract the Apps username and password
Initialize the file names (data, log, bad, archive files)
Check the existence of the files
Run SQL*Loader

Archive the data files
Check the bad files

Shell Script with Java-based XML Parser :- A shell script (Host) type of concurrent program should be used to call a Java-based or PL/SQL XML parser. The directory structure, archiving and logging rules remain the same as mentioned above. In this case we have to develop a custom Java- or PL/SQL-based XML parser which can parse the input XML files and load the custom staging table. PL/SQL-based parsers use the UTL_FILE package to perform file read operations. Building a Java-based XML parser is a topic in itself and can be considered out of scope for this white paper.

PLSQL Package for reading legacy outbound tables :- This can be a PL/SQL package which reads the legacy database tables using database links and writes to the inbound custom staging tables in the Oracle Apps 11i instance.
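A minimal sketch of such a read, assuming a database link named legacy_db_link and hypothetical legacy/staging table, column and sequence names:

```sql
-- Pull unprocessed legacy rows across the DB link into the custom
-- staging table, marking them PENDING for the validation program.
INSERT INTO xxinv_item_stg
  (interface_id, batch_id, segment1, description, uom_code, process_flag)
SELECT xxinv_item_stg_s.NEXTVAL,     -- hypothetical surrogate-key sequence
       :p_batch_id,
       item_code, item_desc, uom,
       1                             -- 1 = PENDING
FROM   legacy_items@legacy_db_link
WHERE  extract_flag = 'N';
```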

Oracle Apps Custom Lookup Table :- This table should be used for storing lookup values, i.e. mapping legacy system values to Oracle values. It generally stores information like Unit of Measure etc. which can differ between Oracle and the legacy system. The table structure should have columns like Lookup_Type, Language, Legacy_System, Legacy_System_Value, Oracle_Value, Effectivity_Date, End_Date and WHO columns.
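A sketch of such a lookup table (names and sizes illustrative):

```sql
CREATE TABLE xxinv_legacy_lookup
( lookup_type          VARCHAR2(30),  -- e.g. 'UOM'
  language             VARCHAR2(4),
  legacy_system        VARCHAR2(30),
  legacy_system_value  VARCHAR2(80),  -- e.g. 'EACH'
  oracle_value         VARCHAR2(80),  -- e.g. 'Ea'
  effectivity_date     DATE,
  end_date             DATE,
  -- WHO columns
  created_by           NUMBER,
  creation_date        DATE,
  last_updated_by      NUMBER,
  last_update_date     DATE
);
```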

PLSQL package for data validation and mapping :- This PL/SQL package should perform the following activities:

Validate the data in the custom tables
Perform the data mapping activities using the lookup table
Update the custom staging table's Process_Flag column with the following values:

1 - PENDING
2 - PICKED FOR VALIDATION
3 - ERROR
5 - VALIDATION SUCCESS
7 - PROCESSED/SUCCESS
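The validation program typically moves records through these states with simple updates, for example (staging table name hypothetical):

```sql
-- Pick up newly loaded records for validation
UPDATE xxinv_item_stg
SET    process_flag = 2            -- PICKED FOR VALIDATION
WHERE  process_flag = 1;           -- PENDING

-- Flag a failing record with the reason, so it can be corrected via the
-- custom staging-table form and re-submitted
UPDATE xxinv_item_stg
SET    process_flag  = 3,          -- ERROR
       error_message = 'Unit of measure not mapped in lookup table'
WHERE  interface_id  = :p_interface_id;
```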

PLSQL package to write data into Oracle Open Interface tables :- This PL/SQL package should read all valid records (Process_Flag = 5), insert the data into the Oracle seeded interface tables and mark Process_Flag = 7.
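As a sketch only (the column list here is abbreviated, the staging table is hypothetical, and the transaction type id is illustrative; the full set of required columns should be taken from the Oracle Open Interfaces manual / eTRM), an insert into MTL_TRANSACTIONS_INTERFACE might look like:

```sql
INSERT INTO mtl_transactions_interface
  (transaction_interface_id, source_code, source_line_id, source_header_id,
   process_flag, transaction_mode,
   inventory_item_id, organization_id,
   transaction_quantity, transaction_uom,
   transaction_date, transaction_type_id)
SELECT mtl_material_transactions_s.NEXTVAL,
       'LEGACY CONVERSION',              -- identifies this interface
       s.interface_id, s.batch_id,
       1,                                -- 1 = yet to be processed
       3,                                -- 3 = background processing mode
       s.inventory_item_id, s.organization_id,
       s.quantity, s.uom_code,
       SYSDATE,
       42                                -- e.g. Miscellaneous Receipt (assumed)
FROM   xxinv_item_stg s                  -- hypothetical staging table
WHERE  s.process_flag = 5;               -- VALIDATION SUCCESS

UPDATE xxinv_item_stg
SET    process_flag = 7                  -- PROCESSED/SUCCESS
WHERE  process_flag = 5;
```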

Oracle Open Interface tables :- These are the inbound interface tables provided by Oracle for the various base tables in Oracle Applications.

Oracle Open Interface Concurrent Programs :- These are Oracle-supplied import programs which read the Oracle open interface tables, validate the data and load the Oracle Apps base tables.

Oracle Supplied Public APIs :- Generally, for online processing (when batch processing is not acceptable), public APIs are used to load the Oracle base tables. In this form of interface, open interface tables are not used. Public PL/SQL APIs accept PL/SQL table type input parameters, and the APIs perform the validation and updates of the Oracle base tables.

Other important components:-

Lookups and Profiles :- Custom profile options can be used for the following purposes:

Enable/disable debug mode
Store the email addresses used by monitoring routines
Store hard-coded values needed by interfaces
Store file paths in the case of outbound interfaces

FND_LOOKUPS (Common Lookups) can be used for the following purposes:

Storing mappings if the custom lookup table is not utilized


Storing hard-coded values

Custom Lookup Table :- If mappings between the Oracle and legacy systems are needed, a custom lookup table can be designed along with a custom form to help with data entry for the mappings.

Custom Form to update custom staging tables :- This form is used for performing UPDATE/DELETE operations on the custom staging tables. It is used to correct any data errors in the custom staging tables and re-submit the data for processing, which helps avoid manual correction of data files. This form can be a simple data entry screen which selects the custom table from a list of values, displays the data in tabular format on a stacked canvas and allows data updates.

Monitoring :- Monitoring and error reporting are an important aspect of data conversion and interface development. There should be a notification system to notify users about issues and problems. Monitoring can be achieved in multiple ways, such as Alerts, Workflow and error reporting via UNIX emails.

Alert-based monitoring :- This can be developed using a custom table with the columns generally required for an email, e.g. SUBJECT, TO_EMAIL, EMAIL_MESSAGE, PROCESS_STATUS, INSTANCE and WHO columns. This custom table can have an event alert (on INSERT) which reads new records, sends out an email and updates the PROCESS_STATUS column value of the current record to 'SENT'.

Error Reporting using UNIX Emails :- To build this functionality, additional effort is needed to populate an error table giving more details of the problem, build simple reports on it, submit these report concurrent programs at the end of the data conversion routines/interfaces and email the output file to the respective users.
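A minimal sketch of this pattern, with hypothetical file names and the error-table extract simulated by a flat file (in a real interface the extract would come from a report on the custom error table, and mailx or an equivalent mailer is assumed to be available):

```shell
ERR_FILE=conversion_errors.txt
SUMMARY=error_summary.txt

# Simulated extract from a custom error table: id|program|error message
cat > "$ERR_FILE" <<'EOF'
1001|ITEM_IMPORT|Invalid unit of measure: EACH
1002|ITEM_IMPORT|Duplicate item code AS54888
EOF

# Build the email body: a count followed by the error details
{
  echo "Data conversion errors on $(date +%Y%m%d)"
  echo "Total errors: $(wc -l < "$ERR_FILE")"
  cat "$ERR_FILE"
} > "$SUMMARY"

# Mail the summary to the support team (commented out here; the address
# and the mailx utility are assumptions)
# mailx -s "Data conversion errors" support@example.com < "$SUMMARY"
```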

Summary

To summarize the data conversion and interface development strategies: one should always have components like custom staging tables, shell scripts, SQL*Loader control files, validation programs and monitoring in the design. There can be short cuts in data conversion, like using the SQL*Loader executable type directly in concurrent programs instead of shell scripts, but these should definitely be avoided, because such short cuts involve a lot of hard coding, and maintenance and reusability of the components cannot be achieved. Monitoring and error reporting are generally ignored as overhead, but they help in the long run with debugging and maintenance.


Annexure I - Sample Shell Script

# XXINV2402_ITEM_OnHand
# Author :
# Date Written : 09/20/2007
# Purpose : Load data into the staging table for Master Items

#######################################################################
######## Get the Apps user id / password #######
#######################################################################
for i in $*
do
case $i in
FCP_LOGIN=*)
# The concurrent manager passes the login as FCP_LOGIN="user/password"
UID_PWD=`echo $i | sed 's/FCP_LOGIN=//g'`
UID_PWD=`echo $UID_PWD | sed 's/"//g'`;;
esac
done

##########################################################################
####### Initialize the variables #######
##########################################################################

TODAY_DAT=`date +%Y%m%d%H%M`
CTL_FILE=$XXINV_TOP/bin/XXINV2402_01.ctl
DATA_FILE=$XXINV_TOP/datafiles/in/Item_OnHand.csv
BAK_FILE=$XXINV_TOP/datafiles/in/archive/Item_OnHand$TODAY_DAT.bak
LOG_FILE=$XXINV_TOP/datafiles/in/log/Item_OnHand$TODAY_DAT.log
BAD_FILE=$XXINV_TOP/datafiles/in/bad/Item_OnHand$TODAY_DAT.bad

##########################################################################
####### Check file existence #######
##########################################################################

fileChk()
{
# Exit the program if the given file does not exist
if [ -f "$1" ]
then
echo "$1 File Exists"
else
echo "$1 File Not Found"
exit 1
fi
}

# Check that the data file and the control file exist
fileChk "$DATA_FILE"
fileChk "$CTL_FILE"

# Strip any characters outside the expected printable set before loading
tr -cd '[a-z],[A-Z],[0-9],\n,`,!,@,#,$,%,^,&,*,(,),_,-,+,=,[,{,],},|,\,/,?,<,>, ,:,;,",.' < "$DATA_FILE" > "$DATA_FILE".01

mv "$DATA_FILE".01 "$DATA_FILE"

# Load the data file into the staging table
sqlldr $UID_PWD control="$CTL_FILE" data="$DATA_FILE" log="$LOG_FILE" bad="$BAD_FILE"

retcode=$?

echo "SQL*Loader return code: $retcode"

# 0 = success, 2 = warning (some records rejected), 1/3 = error
if [ "$retcode" -eq 0 ] || [ "$retcode" -eq 2 ]
then
# Archive the data file with a timestamp when the load ran
mv "$DATA_FILE" "$BAK_FILE"
fi

# Prompt the user to check the bad file if one was generated
if [ -f "$BAD_FILE" ]
then
echo "Please check the bad file: $BAD_FILE"
fi

# Complete the concurrent program in Success/Warning/Error status
exit $retcode

Annexure II - Monitoring Alert


Table Name: XXINV_2149_TRX_MONITOR

TRANSACTION_ID           NUMBER,
INSTANCE                 VARCHAR2(100),
SUBJECT                  VARCHAR2(1000),
ALERT_MESSAGE            VARCHAR2(4000),
EMAIL_ADDRESSES          VARCHAR2(4000),
EMAIL_STATUS             VARCHAR2(30),
CREATION_DATE            DATE DEFAULT SYSDATE,
CREATED_BY               NUMBER,
LAST_UPDATED_BY          NUMBER,
LAST_UPDATED_DATE        DATE DEFAULT SYSDATE,
REQUEST_ID               NUMBER,
PROGRAM_APPLICATION_ID   NUMBER,
PROGRAM_ID               NUMBER,
PROGRAM_UPDATE_DATE      DATE DEFAULT SYSDATE,
ALERT_TYPE               VARCHAR2(30),
PROGRAM_NAME             VARCHAR2(1000)

An event alert (on INSERT) on this table sends out the email and updates EMAIL_STATUS from 'PENDING' to 'SENT'.


Comments (3)

Excellent Article
written by Dinesh Chauhan, May 20, 2008
Very good article indeed. While creating a flat file, or asking a third party to provide one, what do you think is the best approach: a comma/pipe delimited file, or one without any delimiter?


Excellent Article
written by R Mohanty, August 31, 2009
Excellent Article

Excellent article and a good source on Data Migration
written by Boyapatisireesha, November 18, 2009
I am new to shell scripting, but the sample shell script in the above article has given me more confidence to learn scripting.

Could you please add the commands used to execute the script in UNIX?

Thanks and regards,
Sireesha.



