1. Create a worksheet.
4 Informatica 7.x vs 8.x
Informatica 8.1 adds new transformations and supports different kinds of unstructured data.
Features introduced:
1. SQL transformation
2. Java transformation
3. Support for unstructured data such as emails, Word documents, and PDFs.
4. In the Custom transformation, the transformation can be built using Java or VC++.
5. The concept of flat file updating is also introduced in 8.x.
Object Permissions
Effective in version 8.1.1, you can assign object permissions to users when
you add a user account, set user permissions, or edit an object.
Effective in version 8.1, you configure the gateway node and location for
log event files on the Properties tab for the domain. Log events describe
operations and error messages for core and application services, workflows
and sessions.
Unicode compliance
You may notice an increase in memory and CPU resource usage on machines
running PowerCenter Services.
License Usage
High Availability
2.SOURCE QUALIFIER
The data from the source is passed to the Source Qualifier, where the source datatypes are converted into Informatica datatypes. It has output ports to pass data to the respective transformation. Here, the values are passed into the Lookup transformation.
3.LOOKUP TRANSFORMATION (LKP_GetData)
Analyse the rows that do not update in any case (preferably the primary or composite keys). These values are to be defined again in new ports with new names. In the example above, EMPNO, ENAME, and HIREDATE are defined as IN_EMPNO, IN_ENAME, and IN_HIREDATE. Map EMPNO, ENAME, and HIREDATE from the Source Qualifier to IN_EMPNO, IN_ENAME, and IN_HIREDATE in the lookup (these are the only input ports). A unique key is created in addition to the lookup values; here, PM_PRIMARYKEY is created.
a) In the Properties tab, define the condition. Here we need to define:
EMPNO=IN_EMPNO, ENAME=IN_ENAME,HIREDATE=IN_HIREDATE
b) Check the other required properties, and also check the connectivity if it is a relational data source.
a) Pass values (except the input values defined in the lookup) from the Source Qualifier into the Expression transformation (EXP_DetectChanges). Create two output ports, CHANGEDFLAG (integer) and NEWFLAG (integer). Pass the unique key (PM_PrimaryKey) into the Expression transformation and map it. Create the comparison columns for all the values that can change (COMM, DEPTNO, SAL, MGR), and map the lookup values to these comparison keys.
c) Make sure CHANGEDFLAG and NEWFLAG are the only output ports here.
d) Logic for the changed flag: if any row is changed or updated, the updated row is returned in this port.
IIF (NOT ISNULL (PM_PRIMARYKEY)
AND
(DECODE (COMM,PM_PREV_COMM,1,0) = 0
OR
DECODE (DEPTNO,PM_PREV_DEPTNO,1,0) = 0
OR
DECODE (JOB,PM_PREV_JOB,1,0) = 0
OR
DECODE (MGR,PM_PREV_MGR,1,0) = 0
OR
DECODE (SAL,PM_PREV_SAL,1,0) = 0
),TRUE,FALSE)
First, it checks that the primary key is not null (meaning the row must already exist). Next, it compares the existing columns with the new columns; DECODE() is used to compare the columns here.
DECODE:
Searches a port for a value you specify. If the function finds the value, it returns a result value, which you define. You
can build an unlimited number of searches within a DECODE function.
If you use DECODE to search for a value in a string port, you can either trim trailing blanks with the RTRIM function or
include the blanks in your search string.
Syntax
DECODE( value, first_search, first_result [, second_search, second_result]...[,default] )
Here DECODE compares two values and returns 0 when they differ, so a 0 from any comparison means that column has changed. The updated rows are returned in the CHANGEDFLAG port.
IIF ( ISNULL (PM_PRIMARYKEY),TRUE,FALSE)
When a new record is inserted, no PM_PRIMARYKEY is found for it in the lookup, so the value is null. Therefore the logic returns TRUE when the PM_PRIMARYKEY field is null, meaning a new record has been created.
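As a hedged illustration, the two flag expressions above can be sketched in Python (this is a stand-in, not Informatica code; DECODE is mimicked by a small helper, and the port names follow the mapping):

```python
# Hypothetical Python sketch of the EXP_DetectChanges logic.

def decode(value, search, match_result, default):
    """Mimics Informatica DECODE(value, search, match_result, default)."""
    return match_result if value == search else default

def changed_flag(row):
    """TRUE when the row already exists (lookup returned a key)
    and at least one tracked column differs from its previous value."""
    if row["PM_PRIMARYKEY"] is None:
        return False
    tracked = ["COMM", "DEPTNO", "JOB", "MGR", "SAL"]
    return any(decode(row[c], row["PM_PREV_" + c], 1, 0) == 0 for c in tracked)

def new_flag(row):
    """TRUE when the lookup found no match, i.e. a brand-new record."""
    return row["PM_PRIMARYKEY"] is None
```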
Pass values from the Source Qualifier. Add a new port named 'NewFlag' (integer). Map 'NewFlag' in EXP_DetectChanges to 'NewFlag' in FIL_NewRecord and apply the filter condition on 'NewFlag'; it filters the newly inserted records.
TYPE1 : KEEP MOST RECENT VALUES IN TARGET
1.SOURCE
THE SOURCE IS IMPORTED FROM THE DATABASE. HERE THE EMP TABLE (SAY) IS TAKEN AS THE SOURCE.
2.SOURCE QUALIFIER
The data from the source is passed to the Source Qualifier, where the source datatypes are converted into Informatica datatypes. It has output ports to pass data to the respective transformation. Here, the values are passed into the Lookup transformation.
3.LOOKUP TRANSFORMATION (LKP_GetData)
Analyse the rows that do not update in any case (preferably the primary or composite keys). These values are to be defined again in new ports with new names. In the example above, EMPNO, ENAME, and HIREDATE are defined as IN_EMPNO, IN_ENAME, and IN_HIREDATE. Map EMPNO, ENAME, and HIREDATE from the Source Qualifier to IN_EMPNO, IN_ENAME, and IN_HIREDATE in the lookup (these are the only input ports). A unique key is created in addition to the lookup values; here, PM_PRIMARYKEY is created.
a) In the Properties tab, define the condition. Here we need to define:
EMPNO=IN_EMPNO, ENAME=IN_ENAME,HIREDATE=IN_HIREDATE
b) Check the other required properties, and also check the connectivity if it is a relational data source.
a) Pass values (except the input values defined in the lookup) from the Source Qualifier into the Expression transformation (EXP_DetectChanges). Create two output ports, CHANGEDFLAG (integer) and NEWFLAG (integer). Pass the unique key (PM_PrimaryKey) into the Expression transformation and map it. Create the comparison columns for all the values that can change (COMM, DEPTNO, SAL, MGR), and map the lookup values to these comparison keys.
c) Make sure CHANGEDFLAG and NEWFLAG are the only output ports here.
d) Logic for the changed flag: if any row is changed or updated, the updated row is returned in this port.
IIF (NOT ISNULL (PM_PRIMARYKEY)
AND(
DECODE (COMM,PM_PREV_COMM,1,0) = 0
OR
DECODE (DEPTNO,PM_PREV_DEPTNO,1,0) = 0
OR
DECODE (JOB,PM_PREV_JOB,1,0) = 0
OR
DECODE (MGR,PM_PREV_MGR,1,0) = 0
OR
DECODE (SAL,PM_PREV_SAL,1,0) = 0
),TRUE,FALSE)
First, it checks that the primary key is not null (meaning the row must already exist). Next, it compares the existing columns with the new columns; DECODE() is used to compare the columns here.
DECODE:
Searches a port for a value you specify. If the function finds the value, it returns a result value, which you define. You
can build an unlimited number of searches within a DECODE function.
If you use DECODE to search for a value in a string port, you can either trim trailing blanks with the RTRIM function or
include the blanks in your search string.
Syntax
DECODE( value, first_search, first_result [, second_search, second_result]...[,default] )
IIF ( ISNULL (PM_PRIMARYKEY),TRUE,FALSE)
When a new record is inserted, no PM_PRIMARYKEY is found for it in the lookup, so the value is null. Therefore the logic returns TRUE when the PM_PRIMARYKEY field is null, meaning a new record has been created.
5.1.1 FILTER TRANSFORMATION (FIL_NewRecord)
Pass values from the Source Qualifier. Add a new port named 'NewFlag' (integer). Map 'NewFlag' in EXP_DetectChanges to 'NewFlag' in FIL_NewRecord and apply the filter condition on 'NewFlag'; it filters the newly inserted records.
Pass the values from the Filter transformation (FIL_NewRecord) to the Update Strategy (UPD_ForceInserts).
DD_Insert:
Flags records for insertion in an update strategy expression. DD_INSERT is equivalent to the integer literal 0.
Note: The DD_INSERT constant is designed for use in the Update Strategy transformation only. Informatica
recommends using DD_INSERT instead of the integer literal 0. It is easier to troubleshoot complex numeric expressions
if you use DD_INSERT.
When you run a workflow, select the data-driven update strategy to write records to a target based on this flag.
The Sequence Generator creates the unique key for the newly inserted records. Its default ports are NEXTVAL and CURRVAL.
NEXTVAL:
Use the NEXTVAL port to generate a sequence of numbers by connecting it to a transformation or target. You connect
the NEXTVAL port to a downstream transformation to generate the sequence based on the Current Value and Increment
By properties.
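As a rough illustration (not Informatica's implementation), a NEXTVAL port driven by the Current Value and Increment By properties behaves like this small Python class:

```python
# Hypothetical sketch of a Sequence Generator's NEXTVAL behaviour.

class SequenceGenerator:
    def __init__(self, current_value=1, increment_by=1):
        self._next = current_value          # Current Value property
        self._increment = increment_by      # Increment By property

    def nextval(self):
        """Return the next value in the sequence, then advance it."""
        value = self._next
        self._next += self._increment
        return value
```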
FLAG
1.SOURCE
2.SOURCE QUALIFIER
The data from the source is passed to the Source Qualifier, where the source datatypes are converted into Informatica datatypes. It has output ports to pass data to the respective transformation. Here, the values are passed into the Lookup transformation.
3.LOOKUP TRANSFORMATION (LKP_GetData)
Analyse the rows that do not update in any case (preferably the primary or composite keys). These values are to be defined again in new ports with new names. In the example above, EMPNO, ENAME, and HIREDATE are defined as IN_EMPNO, IN_ENAME, and IN_HIREDATE. Map EMPNO, ENAME, and HIREDATE from the Source Qualifier to IN_EMPNO, IN_ENAME, and IN_HIREDATE in the lookup (these are the only input ports). A unique key is created in addition to the lookup values; here, PM_PRIMARYKEY is created.
a) In the Properties tab, define the condition. Here we need to define:
EMPNO=IN_EMPNO, ENAME=IN_ENAME,HIREDATE=IN_HIREDATE
b) Check the other required properties, and also check the connectivity if it is a relational data source.
a) Pass values (except the input values defined in the lookup) from the Source Qualifier into the Expression transformation (EXP_DetectChanges). Create two output ports, CHANGEDFLAG (integer) and NEWFLAG (integer). Pass the unique key (PM_PrimaryKey) into the Expression transformation and map it. Create the comparison columns for all the values that can change (COMM, DEPTNO, SAL, MGR), and map the lookup values to these comparison keys.
c) Make sure CHANGEDFLAG and NEWFLAG are the only output ports here.
d) Logic for the changed flag: if any row is changed or updated, the updated row is returned in this port.
IIF (NOT ISNULL (PM_PRIMARYKEY)
AND
(
DECODE (COMM,PM_PREV_COMM,1,0) = 0
OR
DECODE (DEPTNO,PM_PREV_DEPTNO,1,0) = 0
OR
DECODE (JOB,PM_PREV_JOB,1,0) = 0
OR
DECODE (MGR,PM_PREV_MGR,1,0) = 0
OR
DECODE (SAL,PM_PREV_SAL,1,0) = 0
),TRUE,FALSE)
First, it checks that the primary key is not null (meaning the row must already exist). Next, it compares the existing columns with the new columns; DECODE() is used to compare the columns here.
DECODE:
Searches a port for a value you specify. If the function finds the value, it returns a result value, which you define. You
can build an unlimited number of searches within a DECODE function.
If you use DECODE to search for a value in a string port, you can either trim trailing blanks with the RTRIM function or
include the blanks in your search string.
Syntax
DECODE( value, first_search, first_result [, second_search, second_result]...[,default] )
Pass the values (other than the primary keys and the values that can change) from the Source Qualifier to the filter. Also pass PM_PrimaryKey from the Expression transformation (EXP_DetectChanges). Create a new port 'ChangedFlag' and map the 'ChangedFlag' value from EXP_DetectChanges to 'ChangedFlag' in FIL_InsertChangedRecord.
Pass the values, including PM_PrimaryKey, from the Filter transformation (FIL_InsertChangedRecord) to the Update Strategy (UPD_ChangedInserts).
Pass PM_PrimaryKey values from UPD_ChangedInserts to EXP_KeyProcessing_InsertChanged. Edit the transformation with new ports NEW_PM_PrimaryKey and PM_CURRENT_FLAG, and assign the values:
NEW_PM_PrimaryKey = PM_PRIMARYKEY + 1
PM_CURRENT_FLAG = 1
5.2.4 TARGET (NEWTARGET2)
Pass the values from the Update Strategy (UPD_ChangedInserts) to the target. Also pass the PM_PrimaryKey and PM_CURRENT_FLAG values to the PM_PrimaryKey and PM_CURRENT_FLAG columns in the target.
Create a filter with PM_PrimaryKey and ChangedFlag ports; pass the PM_PrimaryKey value from the lookup and the ChangedFlag value from the Expression transformation (EXP_DetectChanges), and filter on ChangedFlag.
5.3.4.Expression Transformation(EXP_KeyProcessing_UpdateChanged)
Pass the PM_PrimaryKey value into the transformation. In addition, create PM_CURRENT_FLAG and pass '0' to it.
5.3.4 TARGET
8. Go to the Workflow Manager, create a Session task, and attach the respective mapping to the session. Edit the mapping properties with valid source and target definitions. Change the "Target Load Type" to "normal" for the target and uncheck the "truncate table" option, to preserve the history.
DATE
1.SOURCE
2.SOURCE QUALIFIER
The data from the source is passed to the Source Qualifier, where the source datatypes are converted into Informatica datatypes. It has output ports to pass data to the respective transformation. Here, the values are passed into the Lookup transformation.
3.LOOKUP TRANSFORMATION (LKP_GetData)
Analyse the rows that do not update in any case (preferably the primary or composite keys). These values are to be defined again in new ports with new names. In the example above, EMPNO, ENAME, and HIREDATE are defined as IN_EMPNO, IN_ENAME, and IN_HIREDATE. Map EMPNO, ENAME, and HIREDATE from the Source Qualifier to IN_EMPNO, IN_ENAME, and IN_HIREDATE in the lookup (these are the only input ports). A unique key is created in addition to the lookup values; here, PM_PRIMARYKEY is created.
a) In the Properties tab, define the condition. Here we need to define:
EMPNO=IN_EMPNO, ENAME=IN_ENAME,HIREDATE=IN_HIREDATE
b) Check the other required properties, and also check the connectivity if it is a relational data source.
a) Pass values (except the input values defined in the lookup) from the Source Qualifier into the Expression transformation (EXP_DetectChanges). Create two output ports, CHANGEDFLAG (integer) and NEWFLAG (integer). Pass the unique key (PM_PrimaryKey) into the Expression transformation and map it. Create the comparison columns for all the values that can change (COMM, DEPTNO, SAL, MGR), and map the lookup values to these comparison keys.
c) Make sure CHANGEDFLAG and NEWFLAG are the only output ports here.
d) Logic for the changed flag: if any row is changed or updated, the updated row is returned in this port.
IIF (NOT ISNULL (PM_PRIMARYKEY)
AND
(DECODE (COMM,PM_PREV_COMM,1,0) = 0
OR
DECODE (DEPTNO,PM_PREV_DEPTNO,1,0) = 0
OR
DECODE (JOB,PM_PREV_JOB,1,0) = 0
OR
DECODE (MGR,PM_PREV_MGR,1,0) = 0
OR
DECODE (SAL,PM_PREV_SAL,1,0) = 0
),TRUE,FALSE)
First, it checks that the primary key is not null (meaning the row must already exist). Next, it compares the existing columns with the new columns; DECODE() is used to compare the columns here.
DECODE:
Searches a port for a value you specify. If the function finds the value, it returns a result value, which you define. You
can build an unlimited number of searches within a DECODE function.
If you use DECODE to search for a value in a string port, you can either trim trailing blanks with the RTRIM function or
include the blanks in your search string.
Syntax
DECODE( value, first_search, first_result [, second_search, second_result]...[,default] )
When a new record is inserted, no PM_PRIMARYKEY is found for it in the lookup, so the value is null. Therefore the logic returns TRUE when the PM_PRIMARYKEY field is null, meaning a new record has been created.
5.1.1 FILTER TRANSFORMATION (FIL_InsertNewRecord)
Pass values from the Source Qualifier. Add a new port named 'NewFlag' (integer). Map 'NewFlag' in EXP_DetectChanges to 'NewFlag' in FIL_InsertNewRecord and apply the filter condition on 'NewFlag'; it filters the newly inserted records.
Pass the values from the Filter transformation (FIL_InsertNewRecord) to the Update Strategy (UPD_ForceInserts). Change the "Update Strategy Information" to "DD_INSERT".
DD_Insert:
Flags records for insertion in an update strategy expression. DD_INSERT is equivalent to the integer literal 0.
Note: The DD_INSERT constant is designed for use in the Update Strategy transformation only. Informatica
recommends using DD_INSERT instead of the integer literal 0. It is easier to troubleshoot complex numeric expressions
if you use DD_INSERT.
When you run a workflow, select the data-driven update strategy to write records to a target based on this flag.
The Sequence Generator creates the unique key for the newly inserted records. Its default ports are NEXTVAL and CURRVAL.
NEXTVAL:
Use the NEXTVAL port to generate a sequence of numbers by connecting it to a transformation or target. You connect
the NEXTVAL port to a downstream transformation to generate the sequence based on the Current Value and Increment
By properties.
Create a NEXTVAL (input) column in the Expression transformation and pass the NEXTVAL value from the Sequence Generator to it. Create one output port PM_BEGIN_DATE and pass the SYSDATE value to it. Also create PM_PrimaryKey as an output port and pass NEXTVAL to this port.
Pass the values from the Update Strategy transformation (UPD_ForceInserts) into the target. Pass the NEXTVAL, PM_BEGIN_DATE, and PM_PrimaryKey values to the respective ports in the target. PM_BEGIN_DATE records the system date when the new record is inserted.
Create NEXTVAL, NEW_PM_PrimaryKey, and PM_BEGIN_DATE here. Get NEXTVAL from the Sequence Generator and pass it into NEW_PM_PrimaryKey; pass SYSDATE to PM_BEGIN_DATE.
Pass the values from the source qualifier to the filter. Also, pass 'changed flag' from the EXP_DetectChanges to the filter.
Pass the values, including PM_PrimaryKey, from the Filter transformation (FIL_UpdateChangedRecord) to the Update Strategy (UPD_ChangedUpdate).
Pass the values from the Update Strategy (UPD_ChangedInserts) and EXP_KeyProcessing_InsertChanged to the target.
Get the PM_PrimaryKey value from the lookup and the ChangedFlag value from the Expression transformation, and filter records on ChangedFlag.
5.3.2.UPDATESTRATEGY (UPD_ChangedUpdate)
Pass PM_PrimaryKey into the Update Strategy and change the "Update Strategy Information" to "DD_UPDATE".
5.3.3.TARGET
8. Go to the Workflow Manager, create a Session task, and attach the respective mapping to the session. Edit the mapping properties with valid source and target definitions. Change the "Target Load Type" to "normal" for the target and uncheck the "truncate table" option, to preserve the history.
VERSION
1.SOURCE
THE SOURCE IS IMPORTED FROM THE DATABASE. HERE THE EMP TABLE (SAY) IS TAKEN AS THE SOURCE.
2.SOURCE QUALIFIER
The data from the source is passed to the Source Qualifier, where the source datatypes are converted into Informatica datatypes. It has output ports to pass data to the respective transformation. Here, the values are passed into the Lookup transformation.
3.LOOKUP TRANSFORMATION (LKP_GetData)
Analyse the rows that do not update in any case (preferably the primary or composite keys). These values are to be defined again in new ports with new names. In the example above, EMPNO, ENAME, and HIREDATE are defined as IN_EMPNO, IN_ENAME, and IN_HIREDATE. Map EMPNO, ENAME, and HIREDATE from the Source Qualifier to IN_EMPNO, IN_ENAME, and IN_HIREDATE in the lookup (these are the only input ports). A unique key is created in addition to the lookup values; here, PM_PRIMARYKEY is created.
a) In the Properties tab, define the condition. Here we need to define:
EMPNO=IN_EMPNO, ENAME=IN_ENAME,HIREDATE=IN_HIREDATE
b) Check the other required properties, and also check the connectivity if it is a relational data source.
a) Pass values (except the input values defined in the lookup) from the Source Qualifier into the Expression transformation (EXP_DetectChanges). Create two output ports, CHANGEDFLAG (integer) and NEWFLAG (integer). Pass the unique key (PM_PrimaryKey) into the Expression transformation and map it. Create the comparison columns for all the values that can change (COMM, DEPTNO, SAL, MGR), and map the lookup values to these comparison keys.
c) Make sure CHANGEDFLAG and NEWFLAG are the only output ports here.
d) Logic for the changed flag: if any row is changed or updated, the updated row is returned in this port.
IIF (NOT ISNULL (PM_PRIMARYKEY)
AND
(DECODE (COMM,PM_PREV_COMM,1,0) = 0
OR
DECODE (DEPTNO,PM_PREV_DEPTNO,1,0) = 0
OR
DECODE (JOB,PM_PREV_JOB,1,0) = 0
OR
DECODE (MGR,PM_PREV_MGR,1,0) = 0
OR
DECODE (SAL,PM_PREV_SAL,1,0) = 0
),TRUE,FALSE)
First, it checks that the primary key is not null (meaning the row must already exist). Next, it compares the existing columns with the new columns; DECODE() is used to compare the columns here.
DECODE:
Searches a port for a value you specify. If the function finds the value, it returns a result value, which you define. You
can build an unlimited number of searches within a DECODE function.
If you use DECODE to search for a value in a string port, you can either trim trailing blanks with the RTRIM function or
include the blanks in your search string.
Syntax
DECODE( value, first_search, first_result [, second_search, second_result]...[,default] )
IIF ( ISNULL (PM_PRIMARYKEY),TRUE,FALSE)
When a new record is inserted, no PM_PRIMARYKEY is found for it in the lookup, so the value is null. Therefore the logic returns TRUE when the PM_PRIMARYKEY field is null, meaning a new record has been created.
Pass values from the Source Qualifier. Add a new port named 'NewFlag' (integer). Map 'NewFlag' in EXP_DetectChanges to 'NewFlag' in FIL_NewRecord and apply the filter condition on 'NewFlag'; it filters the newly inserted records.
Pass the values from the Filter transformation (FIL_NewRecord) to the Update Strategy (UPD_ForceInserts).
Flags records for insertion in an update strategy expression. DD_INSERT is equivalent to the integer literal 0.
Note: The DD_INSERT constant is designed for use in the Update Strategy transformation only. Informatica
recommends using DD_INSERT instead of the integer literal 0. It is easier to troubleshoot complex numeric expressions
if you use DD_INSERT.
When you run a workflow, select the data-driven update strategy to write records to a target based on this flag.
The Sequence Generator creates the unique key for the newly inserted records. Its default ports are NEXTVAL and CURRVAL.
NEXTVAL:
Use the NEXTVAL port to generate a sequence of numbers by connecting it to a transformation or target. You connect
the NEXTVAL port to a downstream transformation to generate the sequence based on the Current Value and Increment
By properties.
Create a NEXTVAL column in the Expression transformation and pass the NEXTVAL value from the Sequence Generator to it. Also create a PM_VERSION_NUMBER port and pass an initial version value into it.
Pass the values from the Update Strategy transformation (UPD_ForceInserts) into the target. Also pass NEXTVAL and PM_VERSION_NUMBER from the Expression transformation (EXP_KeyProcessing_InsertNew) into the PM_PRIMARYKEY and PM_VERSION_NUMBER columns defined in the target.
Pass the values (other than the primary keys and the values that can change) from the Source Qualifier to the filter. Also pass PM_PrimaryKey from the Expression transformation (EXP_DetectChanges). Create a new port 'ChangedFlag' and map the 'ChangedFlag' value from EXP_DetectChanges to 'ChangedFlag' in FIL_InsertChangedRecord.
Pass the values, including PM_PrimaryKey, from the Filter transformation (FIL_InsertChangedRecord) to the Update Strategy (UPD_ChangedInserts).
Pass PM_PrimaryKey values from UPD_ChangedInserts to EXP_KeyProcessing_InsertChanged. Edit the transformation with new ports NEW_PM_PrimaryKey and NEW_PM_VERSION_NUMBER, and assign the values:
NEW_PM_PrimaryKey = PM_PRIMARYKEY + 1
NEW_PM_VERSION_NUMBER = PM_PRIMARYKEY + 1
This is to create the sequential keys for the updated records below the respective record .
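This key-processing step can be sketched in Python (a hypothetical stand-in; in the mapping these are simple port expressions on the existing PM_PRIMARYKEY):

```python
# Sketch of EXP_KeyProcessing_InsertChanged for the version approach:
# the changed row is re-inserted with a key and version derived from
# the existing PM_PRIMARYKEY, placing it right below the original record.

def key_processing_insert_changed(pm_primarykey):
    new_key = pm_primarykey + 1       # NEW_PM_PrimaryKey
    new_version = pm_primarykey + 1   # NEW_PM_VERSION_NUMBER
    return new_key, new_version
```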
Pass the values from the Update Strategy (UPD_ChangedInserts) to the target. Also pass the PM_PrimaryKey and PM_VERSION_NUMBER values to the PM_PRIMARYKEY and PM_VERSION_NUM columns in the target.
8. Go to the Workflow Manager, create a Session task, and attach the respective mapping to the session. Edit the mapping properties with valid source and target definitions. Change the "Target Load Type" to "normal" for the target and uncheck the "truncate table" option, to preserve the history.
Mapping : How to find the number of success, rejected, and bad records in the same mapping.
Explanation : In this mapping we will see how to find the number of success, rejected, and bad records in one mapping.
The source file is a flat file in .csv format. The table appears as shown below:
EMPNO  NAME    HIREDATE   SEX
100    RAJ     21-APR     M
101    JOHN    21-APR-08  M
102    MAON    08-APR     M
103            22-APR-08  M
105    SANTA   22-APR-08  F
104    SMITHA  22-APR-08  F
106                       M
The table above shows that a few values are missing, and that the date format of a few records is improper. These must be considered invalid records and loaded into the Bad_records table (a relational target table).
Other than records 2, 5, and 6 (empnos 101, 105, and 104), all are invalid records because of NULL values, an improper DATE format, or both.
INVALID & VALID RECORDS ::
First we separate this data using an Expression transformation, which flags each row as 1 or 0. The condition is as follows:
IIF(NOT IS_DATE(HIREDATE,'DD-MON-YY') OR ISNULL(EMPNO) OR ISNULL(NAME) OR
ISNULL(HIREDATE) OR ISNULL(SEX) ,1,0)
FLAG = 1 is considered invalid data and FLAG = 0 valid data. This data is routed into the next transformation using a Router transformation. Here we add two user groups: one with FLAG=1 for invalid data and the other with FLAG=0 for valid data.
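The flagging expression can be sketched in Python as follows (a hedged stand-in: strptime approximates IS_DATE with the 'DD-MON-YY' pattern, and the field names follow the table above):

```python
from datetime import datetime

def is_date(value, fmt="%d-%b-%y"):
    """Stands in for IS_DATE(value, 'DD-MON-YY')."""
    if value is None:
        return False
    try:
        datetime.strptime(value, fmt)
        return True
    except ValueError:
        return False

def flag(row):
    """1 for an invalid row (bad date or any NULL field), 0 for a valid row."""
    if not is_date(row["HIREDATE"]):
        return 1
    if any(row[f] is None for f in ("EMPNO", "NAME", "HIREDATE", "SEX")):
        return 1
    return 0
```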
The FLAG=1 data is forwarded to an Expression transformation. Here we take one variable port and two output ports: one for the increment and the other to flag the row.
INVALID RECORDS (with increment) ::
EMPNO  NAME  HIREDATE      SEX  COUNT
100    RAJ   INVALID DATE  M    1
102    MAON  INVALID DATE  M    2
103    NULL  22-APR-08     M    3
106    NULL  NULL          M    4
VALID RECORDS ::
These are the valid records. But here we do not want employees who are 'F' (female), so our goal is to load only the male employees' info into the SUCCESS_RECORDS target table.
For this we use a Router transformation and declare the user group as follows:
IIF( sex='M',TRUE,FALSE)
The default group will capture the rejected records, which are simply the female employees.
This data is passed to the reusable Expression transformation, where the incremental logic is applied to get the count of the success and rejected records passing through it, and it is then loaded into the target table.
Look at the below tables :::
SUCCESS_RECORDS::
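The routing and counting described above can be sketched in Python (a hypothetical stand-in for the Router plus the reusable incrementing Expression transformation):

```python
# Male rows go to SUCCESS_RECORDS, the rest to the reject group;
# each group carries its own running COUNT, mirroring the variable-port
# increment logic in the reusable Expression transformation.

def route_and_count(rows):
    success, rejected = [], []
    for row in rows:
        if row["SEX"] == "M":
            success.append({**row, "COUNT": len(success) + 1})
        else:
            rejected.append({**row, "COUNT": len(rejected) + 1})
    return success, rejected
```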
We use incremental loading to speed up data loading and reduce the amount of data actually loaded. One of those methods is writing a SQL override using a mapping variable.
Explanation of the mapping: First we have to create one mapping variable of type Date/Time. In the Expression, assign SYSDATE to the mapping variable; from the next load onward, the mapping will pick only the records newer than the last load date, i.e. it accepts only recent records.
Output Port :: INCRE_LOAD
SETVARIABLE($$v_incre_load,sysdate)
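A hedged Python sketch of this incremental-load idea (the variable name $$v_incre_load follows the expression above; persistence here is just a dict, not the Informatica repository):

```python
from datetime import date

def incremental_filter(rows, last_load):
    """Keep only rows newer than the persisted variable value,
    like a SQL override with WHERE load_date > $$v_incre_load."""
    return [r for r in rows if r["LOAD_DATE"] > last_load]

def set_variable(state, today=None):
    """Mimics SETVARIABLE($$v_incre_load, SYSDATE): advance the
    persisted variable so the next run picks up where this one ended."""
    state["$$v_incre_load"] = today or date.today()
    return state["$$v_incre_load"]
```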
NAME   AGE
JOHN   23
SMITH
       34
LUCKY  24
In the table above there are 4 records, but only two are valid; the other two are invalid because in the 2nd record the AGE value is NULL and in the 3rd record the NAME is NULL. So it is essential to filter the 2nd and 3rd records before loading into the target table. The table below shows how the target table appears with valid data:
NAME AGE
JOHN 23
LUCKY 24
The 2nd and 3rd records can be filtered using the ISNULL() function, declared in the Filter transformation condition.
The condition appears like this: IIF(NOT ISNULL(NAME) AND NOT ISNULL(AGE), TRUE, FALSE)
The Filter transformation passes only the TRUE records to the target and drops the FALSE records, which are the NULL-value records.
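This NULL filter can be sketched in Python (a stand-in for the Filter transformation, assuming the NAME/AGE rows above):

```python
# Only rows where neither NAME nor AGE is NULL pass through,
# matching IIF(NOT ISNULL(...), TRUE, FALSE).

def not_null_filter(rows):
    return [r for r in rows if r["NAME"] is not None and r["AGE"] is not None]
```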
DATE
20-Apr
11-Mar-2008
12-Feb-2008
Feb-2008
If you look at the 1st and 4th records of the table above, the DATE value format is clearly incorrect: in the 1st record the year is missing, and in the 4th the day is missing. In this case the invalid data must be removed before it is loaded into the target, as shown in the target table below:
DATE
11-Mar-2008
12-Feb-2008
The above can be achieved using the IS_DATE() function, declared in the Filter transformation condition.
The condition appears like this: IIF(IS_DATE(DATE,'DD-MON-YYYY'),TRUE,FALSE)
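A Python sketch of this date filter (strptime stands in for IS_DATE, and the DD-MON-YYYY pattern matches the sample dates above):

```python
from datetime import datetime

def is_valid_date(value, fmt="%d-%b-%Y"):
    """Approximates IS_DATE(value, 'DD-MON-YYYY')."""
    try:
        datetime.strptime(value, fmt)
        return True
    except (TypeError, ValueError):
        return False

def date_filter(rows):
    """Pass only rows whose DATE string parses correctly."""
    return [r for r in rows if is_valid_date(r["DATE"])]
```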
When you take a flat file as the source and it contains a field NAME, the following table is the source table data::
NAME
   Ravi
 Ramesh
    Swathi
Jack
In the table above you can clearly see that the NAME values are not aligned properly (they carry leading spaces); to align the data evenly, we use the TRIM functions. In our mapping we load the above data into the target table, where it appears as shown below:
NAME
Ravi
Ramesh
Swathi
Jack
The above can be achieved using the LTRIM() function in the expression editor. This is declared in an Expression transformation output port.
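The LTRIM step can be sketched in Python (str.lstrip stands in for LTRIM):

```python
# Remove leading spaces from NAME, as the LTRIM() output port would.

def ltrim_names(rows):
    return [{**r, "NAME": r["NAME"].lstrip()} for r in rows]
```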
Mapping : How to convert a flat-file date field (String datatype) into a relational Date datatype (Date)?
Explanation : In this mapping we will understand how to use the TO_DATE( ) function. It is mainly used to convert the String datatype to the Date datatype.
When you take a flat file as the source, it contains a DATE field whose datatype is generally String. The following table is the source table ::
DATE (String)
If you want to load the above source data into a relational target, then it is mandatory to match the datatypes. In relational databases, the date datatype is generally Date. The following table is the target table ::
To match the datatypes of the ports, we use the TO_DATE( ) function in the Expression transformation editor. The function appears like this: TO_DATE(DATE,'DD-MON-YY')
Mapping : Design a mapping to load the valid source records into warehouse.
Solution : Source : Flatfile
Target : Relational
Informatica version 7.1.1
Database : Oracle
Tip : This is implemented using a mapping variable port. The same can be done using a dynamic lookup or an Aggregator transformation.
Mapping : first half to one target and second half to other target.
Solution : Source : Flatfile
Target : Relational
Database : Oracle