
How to configure ODBC Connection for EXCEL

1. Create a worksheet.

1.1 Select the required rows to be read into PowerCenter.


1.2 Choose Insert | Name | Define and give the range a name then click OK.
1.3 Save the worksheet.

2. Create the ODBC connection.

2.1 System DSN | Microsoft Excel Driver (*.xls).


2.2 Configure and select workbook.

3. Import into Designer.

3.1 Sources | Import From Database.


3.2 ODBC data source must match (2.) above.
3.3 Leave username, password and ownername blank.
3.4 Click Connect.
3.5 Expand the worksheet name and select the range created in (1.) above.

4. Create ODBC connection in Workflow Manager

4.1 In Workflow Manager go to Connections | Relational | New... | ODBC


4.2 Enter a name for the connection.
4.3 Username=pmnulluser
4.4 Password=pmnullpasswd
4.5 Connect string = the ODBC DSN name created in (2.) above.
4.6 Use this connection in session mapping for source.
5. Please note that sheet names with spaces can be problematic.

 
Informatica 7.x vs 8.x

Informatica 8.x adds new transformations and support for different kinds of unstructured data. Introduced in 8.x:
1. SQL transformation
2. Java transformation
3. Support for unstructured data such as emails, Word documents and PDFs
4. Custom transformations can be built using Java or VC++
5. The concept of flat-file update is also introduced in 8.x

Object Permissions

Effective in version 8.1.1, you can assign object permissions to users when 
you add a user account, set user permissions, or edit an object. 

Gateway and Logging Configuration

Effective in version 8.1, you configure the gateway node and location for 
log event files on the Properties tab for the domain. Log events describe 
operations and error messages for core and application services, workflows 
and sessions. 

Log Manager runs on the master gateway node in a domain. 


We can configure the maximum size of logs for automatic purge, in megabytes. PowerCenter 8.1 also provides enhancements to the Log Viewer and log event formatting.

Unicode compliance

Effective in version 8.1, all fields in the Administration Console accept Unicode characters. You can choose the UTF-8 character set as the repository code page to store multiple languages.

Memory and CPU Resource Usage 

You may notice an increase in memory and CPU resource usage on machines 
running PowerCenter Services. 

Domain Configuration Database

PowerCenter stores the domain configuration in a database. 

License Usage

Effective in version 8.0, the Service Manager registers license information. 

High Availability

High availability is the PowerCenter option that eliminates a single point of failure in the PowerCenter environment and provides minimal service interruption in the event of failure. High availability provides the following functionality:

Resilience: the ability of services to tolerate transient failures, such as loss of connectivity to the database or a network failure.

TYPE3 : KEEP CURRENT AND PREVIOUS VALUES IN TARGET


1. SOURCE

The source is imported from the database. Here the EMP table is taken as the source.

2. SOURCE QUALIFIER

Data from the source is passed to the Source Qualifier, where the source datatypes are converted into Informatica datatypes. Its output ports pass the data on to the next transformation; here, the values are passed into the Lookup transformation.

3. LOOKUP TRANSFORMATION (LKP_Getdata)

Identify the columns that never change (preferably the primary or composite key columns). These values must be defined again in new ports with new names. In this example, EMPNO, ENAME and HIREDATE are defined as IN_EMPNO, IN_ENAME and IN_HIREDATE. Map EMPNO, ENAME and HIREDATE from the Source Qualifier to IN_EMPNO, IN_ENAME and IN_HIREDATE in the lookup (these are the only input ports). A unique key is created in addition to the lookup values; here PM_PrimaryKey is created.

a) In the Properties tab, define the lookup condition. Here we need to define:
EMPNO = IN_EMPNO, ENAME = IN_ENAME, HIREDATE = IN_HIREDATE

b) Check the other required properties, and also check the connectivity if it is a relational data source.

4. EXPRESSION TRANSFORMATION (EXP_DetectChanges)

a) Pass the values (except the input values defined in the lookup) from the Source Qualifier into the Expression transformation (EXP_DetectChanges). Create two output ports, CHANGEDFLAG (integer) and NEWFLAG (integer). Pass the unique key (PM_PrimaryKey) into the Expression transformation and map it. Create a comparison port for each of the values that can change (COMM, DEPTNO, JOB, MGR, SAL) and map the lookup values to these comparison ports.

b) Pass the lookup PM_PrimaryKey to the Expression PM_PrimaryKey.

c) Make sure CHANGEDFLAG and NEWFLAG are the only output ports here.

d) Logic for CHANGEDFLAG: if an existing row has changed or been updated, this port flags it.

IIF (NOT ISNULL (PM_PRIMARYKEY)
AND
(DECODE (COMM,PM_PREV_COMM,1,0) = 0
OR
DECODE (DEPTNO,PM_PREV_DEPTNO,1,0) = 0
OR
DECODE (JOB,PM_PREV_JOB,1,0) = 0
OR
DECODE (MGR,PM_PREV_MGR,1,0) = 0
OR
DECODE (SAL,PM_PREV_SAL,1,0) = 0
),TRUE,FALSE)

First, it checks that the primary key is not null (meaning the row must be an existing one). Next, it compares the existing columns with the new columns; DECODE() is used to compare the columns here.

DECODE:

Searches a port for a value you specify. If the function finds the value, it returns a result value, which you define. You
can build an unlimited number of searches within a DECODE function.

If you use DECODE to search for a value in a string port, you can either trim trailing blanks with the RTRIM function or
include the blanks in your search string.

Syntax
DECODE( value, first_search, first_result [, second_search, second_result]...[,default] )

Source :: Informatica Help Guide

DECODE compares the two values and returns '0' when they differ, so a '0' from any of the comparisons means that column has changed. The changed rows are returned through the CHANGEDFLAG port.
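
For example, a minimal sketch of how one of the comparisons behaves (the values are illustrative):

DECODE (SAL, PM_PREV_SAL, 1, 0)
-- SAL = 3000, PM_PREV_SAL = 3000  -> returns 1 (the column is unchanged)
-- SAL = 3500, PM_PREV_SAL = 3000  -> returns 0 (the column changed, so the IIF above flags the row)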

e) Logic for NEWFLAG: for new records

IIF ( ISNULL (PM_PRIMARYKEY),TRUE,FALSE)

When a record is new, the lookup finds no PM_PrimaryKey for it, so the value is null. The logic therefore returns TRUE when the PM_PRIMARYKEY field is null, meaning a new record has arrived.

5.1.1 FILTER TRANSFORMATION (FIL_InsertNewRecord)

Pass the values from the Source Qualifier. Add a new port named 'NewFlag' (integer) and map the 'NewFlag' in EXP_DetectChanges to 'NewFlag' in FIL_InsertNewRecord. Apply the filter condition on 'NewFlag', as sketched below; it passes only the newly inserted records.
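
A sketch of the filter condition (assuming NEWFLAG is the integer port fed from EXP_DetectChanges):

NEWFLAG = 1
-- equivalently just NEWFLAG: the Filter transformation passes a row when the condition evaluates to TRUE (non-zero)
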
TYPE1 : KEEP MOST RECENT VALUES IN TARGET

1. SOURCE

The source is imported from the database. Here the EMP table (say) is taken as the source.

2. SOURCE QUALIFIER

Data from the source is passed to the Source Qualifier, where the source datatypes are converted into Informatica datatypes. Its output ports pass the data on to the next transformation; here, the values are passed into the Lookup transformation.

3. LOOKUP TRANSFORMATION (LKP_Getdata)

Identify the columns that never change (preferably the primary or composite key columns). These values must be defined again in new ports with new names. In this example, EMPNO, ENAME and HIREDATE are defined as IN_EMPNO, IN_ENAME and IN_HIREDATE. Map EMPNO, ENAME and HIREDATE from the Source Qualifier to IN_EMPNO, IN_ENAME and IN_HIREDATE in the lookup (these are the only input ports). A unique key is created in addition to the lookup values; here PM_PrimaryKey is created.

a) In the Properties tab, define the lookup condition. Here we need to define:
EMPNO = IN_EMPNO, ENAME = IN_ENAME, HIREDATE = IN_HIREDATE

b) Check the other required properties, and also check the connectivity if it is a relational data source.

4. EXPRESSION TRANSFORMATION (EXP_DetectChanges)

a) Pass the values (except the input values defined in the lookup) from the Source Qualifier into the Expression transformation (EXP_DetectChanges). Create two output ports, CHANGEDFLAG (integer) and NEWFLAG (integer). Pass the unique key (PM_PrimaryKey) into the Expression transformation and map it. Create a comparison port for each of the values that can change (COMM, DEPTNO, JOB, MGR, SAL) and map the lookup values to these comparison ports.

b) Pass the lookup PM_PrimaryKey to the Expression PM_PrimaryKey.

c) Make sure CHANGEDFLAG and NEWFLAG are the only output ports here.

d) Logic for CHANGEDFLAG: if an existing row has changed or been updated, this port flags it.

IIF (NOT ISNULL (PM_PRIMARYKEY)
AND(
DECODE (COMM,PM_PREV_COMM,1,0) = 0
OR
DECODE (DEPTNO,PM_PREV_DEPTNO,1,0) = 0
OR
DECODE (JOB,PM_PREV_JOB,1,0) = 0
OR
DECODE (MGR,PM_PREV_MGR,1,0) = 0
OR
DECODE (SAL,PM_PREV_SAL,1,0) = 0
),TRUE,FALSE)

First, it checks that the primary key is not null (meaning the row must be an existing one). Next, it compares the existing columns with the new columns; DECODE() is used to compare the columns here.

DECODE:
Searches a port for a value you specify. If the function finds the value, it returns a result value, which you define. You
can build an unlimited number of searches within a DECODE function.

If you use DECODE to search for a value in a string port, you can either trim trailing blanks with the RTRIM function or
include the blanks in your search string.

Syntax
DECODE( value, first_search, first_result [, second_search, second_result]...[,default] )

Source :: Informatica Help Guide


DECODE compares the two values and returns '0' when they differ, so a '0' from any of the comparisons means that column has changed.

The changed rows are returned through the CHANGEDFLAG port.

e) Logic for NEWFLAG: for new records

IIF ( ISNULL (PM_PRIMARYKEY),TRUE,FALSE)

When a record is new, the lookup finds no PM_PrimaryKey for it, so the value is null. The logic therefore returns TRUE when the PM_PRIMARYKEY field is null, meaning a new record has arrived.

5.1.1 FILTER TRANSFORMATION (FIL_NewRecord)

Pass the values from the Source Qualifier. Add a new port named 'NewFlag' (integer) and map the 'NewFlag' in EXP_DetectChanges to 'NewFlag' in FIL_NewRecord. Apply the filter condition on 'NewFlag'; it passes only the newly inserted records.

5.1.2 UPDATE STRATEGY TRANSFORMATION (UPD_ForceInserts)

Pass the values from the Filter transformation (FIL_NewRecord) to the Update Strategy (UPD_ForceInserts).

Change the update strategy expression to DD_INSERT.
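
As a sketch, the property holds an expression, so a constant is enough for this branch; a conditional, data-driven form (hypothetical here) would also be valid:

DD_INSERT
-- or, for example: IIF(NEWFLAG = 1, DD_INSERT, DD_REJECT)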

DD_Insert:

Flags records for insertion in an update strategy expression. DD_INSERT is equivalent to the integer literal 0.

Note: The DD_INSERT constant is designed for use in the Update Strategy transformation only. Informatica
recommends using DD_INSERT instead of the integer literal 0. It is easier to troubleshoot complex numeric expressions
if you use DD_INSERT.

When you run a workflow, select the data-driven update strategy to write records to a target based on this flag.

Source :: Informatica Help Guide

5.1.3. SEQUENCE GENERATOR TRANSFORMATION (SEQ_GenerateKeys)

This creates the unique key for the newly inserted records. The default ports in the Sequence Generator are NEXTVAL and CURRVAL.

NEXTVAL:
Use the NEXTVAL port to generate a sequence of numbers by connecting it to a transformation or target. You connect the NEXTVAL port to a downstream transformation to generate the sequence based on the Current Value and Increment By properties.

Source :: Informatica Help Guide

TYPE2 : KEEP FULL HISTORY OF CHANGES IN TARGET


FLAG | DATE | VERSION

FLAG

1. SOURCE

The source is imported from the database. Here the EMP table is taken as the source.

2. SOURCE QUALIFIER

Data from the source is passed to the Source Qualifier, where the source datatypes are converted into Informatica datatypes. Its output ports pass the data on to the next transformation; here, the values are passed into the Lookup transformation.

3. LOOKUP TRANSFORMATION (LKP_Getdata)

Identify the columns that never change (preferably the primary or composite key columns). These values must be defined again in new ports with new names. In this example, EMPNO, ENAME and HIREDATE are defined as IN_EMPNO, IN_ENAME and IN_HIREDATE. Map EMPNO, ENAME and HIREDATE from the Source Qualifier to IN_EMPNO, IN_ENAME and IN_HIREDATE in the lookup (these are the only input ports). A unique key is created in addition to the lookup values; here PM_PrimaryKey is created.

a) In the Properties tab, define the lookup condition. Here we need to define:
EMPNO = IN_EMPNO, ENAME = IN_ENAME, HIREDATE = IN_HIREDATE

b) Check the other required properties, and also check the connectivity if it is a relational data source.

4. EXPRESSION TRANSFORMATION (EXP_DetectChanges)

a) Pass the values (except the input values defined in the lookup) from the Source Qualifier into the Expression transformation (EXP_DetectChanges). Create two output ports, CHANGEDFLAG (integer) and NEWFLAG (integer). Pass the unique key (PM_PrimaryKey) into the Expression transformation and map it. Create a comparison port for each of the values that can change (COMM, DEPTNO, JOB, MGR, SAL) and map the lookup values to these comparison ports.

b) Pass the lookup PM_PrimaryKey to the Expression PM_PrimaryKey.

c) Make sure CHANGEDFLAG and NEWFLAG are the only output ports here.

d) Logic for CHANGEDFLAG: if an existing row has changed or been updated, this port flags it.

IIF (NOT ISNULL (PM_PRIMARYKEY)
AND
(
DECODE (COMM,PM_PREV_COMM,1,0) = 0
OR
DECODE (DEPTNO,PM_PREV_DEPTNO,1,0) = 0
OR
DECODE (JOB,PM_PREV_JOB,1,0) = 0
OR
DECODE (MGR,PM_PREV_MGR,1,0) = 0
OR
DECODE (SAL,PM_PREV_SAL,1,0) = 0
),TRUE,FALSE)

First, it checks that the primary key is not null (meaning the row must be an existing one). Next, it compares the existing columns with the new columns; DECODE() is used to compare the columns here.

DECODE:
Searches a port for a value you specify. If the function finds the value, it returns a result value, which you define. You
can build an unlimited number of searches within a DECODE function.

If you use DECODE to search for a value in a string port, you can either trim trailing blanks with the RTRIM function or
include the blanks in your search string.

Syntax
DECODE( value, first_search, first_result [, second_search, second_result]...[,default] )

Source :: Informatica Help Guide


DECODE compares the two values and returns '0' when they differ, so a '0' from any of the comparisons means that column has changed.

The changed rows are returned through the CHANGEDFLAG port.

5.2.1. FILTER TRANSFORMATION (FIL_InsertChangedRecord)

Pass the values (other than the primary keys, i.e. the values that can change) from the Source Qualifier to the filter. Also pass PM_PrimaryKey from the Expression transformation (EXP_DetectChanges). Create a new port 'ChangedFlag' and map the 'ChangedFlag' value from EXP_DetectChanges to 'ChangedFlag' in FIL_InsertChangedRecord.

Filter the records on 'ChangedFlag'.

5.2.2 UPDATE STRATEGY TRANSFORMATION (UPD_ChangedInserts)

Pass the values, including 'PM_PrimaryKey', from the Filter transformation (FIL_InsertChangedRecord) to the Update Strategy (UPD_ChangedInserts).

Change the update strategy expression to DD_INSERT.

5.2.3 EXPRESSION TRANSFORMATION (EXP_KeyProcessing_InsertChanged)

Pass the PM_PrimaryKey values from UPD_ChangedInserts to EXP_KeyProcessing_InsertChanged. Edit the transformation with two new ports, NEW_PM_PrimaryKey and PM_CURRENT_FLAG, and pass the values:

NEW_PM_PrimaryKey = PM_PRIMARYKEY + 1

PM_CURRENT_FLAG = 1

5.2.4 TARGET (NEWTARGET2)

Pass the values from the Update Strategy (UPD_ChangedInserts) to the target. Also pass the NEW_PM_PrimaryKey and PM_CURRENT_FLAG values to the PM_PrimaryKey and PM_CURRENT_FLAG columns in the target.

5.3.1 FILTER TRANSFORMATION (FIL_UpdateChangedRecord)

Create a filter with PM_PrimaryKey and ChangedFlag ports; pass the PM_PrimaryKey value from the lookup and the ChangedFlag value from the Expression (EXP_DetectChanges). Filter on ChangedFlag.

5.3.2 UPDATE STRATEGY (UPD_ChangedUpdate)

Pass the PM_PrimaryKey value into this transformation and change the update strategy expression to DD_UPDATE.

5.3.3 EXPRESSION TRANSFORMATION (EXP_KeyProcessing_UpdateChanged)

Pass the PM_PrimaryKey value into the transformation. In addition, create PM_CURRENT_FLAG and pass '0' to it.

5.3.4 TARGET

Pass PM_PrimaryKey from UPD_ChangedUpdate and PM_CURRENT_FLAG from EXP_KeyProcessing_UpdateChanged to the target.

6. Validate the mappings.

7. Save the mappings.

8. Go to Workflow Manager, create a session task and attach the respective mapping to the session. Edit the mapping properties with the valid source and target definitions. Change the "Target Load Type" to "Normal" for the target and uncheck the "Truncate target table" option so the history is preserved.

9. Save and start the task.

TYPE2 : KEEP FULL HISTORY OF CHANGES IN TARGET


FLAG | DATE | VERSION

DATE

1. SOURCE

The source is imported from the database. Here the EMP table is taken as the source.

2. SOURCE QUALIFIER

Data from the source is passed to the Source Qualifier, where the source datatypes are converted into Informatica datatypes. Its output ports pass the data on to the next transformation; here, the values are passed into the Lookup transformation.

3. LOOKUP TRANSFORMATION (LKP_Getdata)

Identify the columns that never change (preferably the primary or composite key columns). These values must be defined again in new ports with new names. In this example, EMPNO, ENAME and HIREDATE are defined as IN_EMPNO, IN_ENAME and IN_HIREDATE. Map EMPNO, ENAME and HIREDATE from the Source Qualifier to IN_EMPNO, IN_ENAME and IN_HIREDATE in the lookup (these are the only input ports). A unique key is created in addition to the lookup values; here PM_PrimaryKey is created.

a) In the Properties tab, define the lookup condition. Here we need to define:
EMPNO = IN_EMPNO, ENAME = IN_ENAME, HIREDATE = IN_HIREDATE

b) Check the other required properties, and also check the connectivity if it is a relational data source.

4. EXPRESSION TRANSFORMATION (EXP_DetectChanges)

a) Pass the values (except the input values defined in the lookup) from the Source Qualifier into the Expression transformation (EXP_DetectChanges). Create two output ports, CHANGEDFLAG (integer) and NEWFLAG (integer). Pass the unique key (PM_PrimaryKey) into the Expression transformation and map it. Create a comparison port for each of the values that can change (COMM, DEPTNO, JOB, MGR, SAL) and map the lookup values to these comparison ports.

b) Pass the lookup PM_PrimaryKey to the Expression PM_PrimaryKey.

c) Make sure CHANGEDFLAG and NEWFLAG are the only output ports here.

d) Logic for CHANGEDFLAG: if an existing row has changed or been updated, this port flags it.

IIF (NOT ISNULL (PM_PRIMARYKEY)
AND
(DECODE (COMM,PM_PREV_COMM,1,0) = 0
OR
DECODE (DEPTNO,PM_PREV_DEPTNO,1,0) = 0
OR
DECODE (JOB,PM_PREV_JOB,1,0) = 0
OR
DECODE (MGR,PM_PREV_MGR,1,0) = 0
OR
DECODE (SAL,PM_PREV_SAL,1,0) = 0
),TRUE,FALSE)

First, it checks that the primary key is not null (meaning the row must be an existing one). Next, it compares the existing columns with the new columns; DECODE() is used to compare the columns here.

DECODE:
Searches a port for a value you specify. If the function finds the value, it returns a result value, which you define. You
can build an unlimited number of searches within a DECODE function.

If you use DECODE to search for a value in a string port, you can either trim trailing blanks with the RTRIM function or
include the blanks in your search string.

Syntax
DECODE( value, first_search, first_result [, second_search, second_result]...[,default] )

Source :: Informatica Help Guide


DECODE compares the two values and returns '0' when they differ, so a '0' from any of the comparisons means that column has changed.

The changed rows are returned through the CHANGEDFLAG port.

e) Logic for NEWFLAG: for new records

IIF ( ISNULL (PM_PRIMARYKEY),TRUE,FALSE)

When a record is new, the lookup finds no PM_PrimaryKey for it, so the value is null. The logic therefore returns TRUE when the PM_PRIMARYKEY field is null, meaning a new record has arrived.

5.1.1 FILTER TRANSFORMATION (FIL_InsertNewRecord)

Pass the values from the Source Qualifier. Add a new port named 'NewFlag' (integer) and map the 'NewFlag' in EXP_DetectChanges to 'NewFlag' in FIL_InsertNewRecord. Apply the filter condition on 'NewFlag'; it passes only the newly inserted records.

5.1.2 UPDATE STRATEGY TRANSFORMATION (UPD_ForceInserts)

Pass the values from the Filter transformation (FIL_InsertNewRecord) to the Update Strategy (UPD_ForceInserts). Change the update strategy expression to DD_INSERT.

DD_Insert:

Flags records for insertion in an update strategy expression. DD_INSERT is equivalent to the integer literal 0.

Note: The DD_INSERT constant is designed for use in the Update Strategy transformation only. Informatica
recommends using DD_INSERT instead of the integer literal 0. It is easier to troubleshoot complex numeric expressions
if you use DD_INSERT.

When you run a workflow, select the data-driven update strategy to write records to a target based on this flag.

Source :: Informatica Help Guide


5.1.3. SEQUENCE GENERATOR TRANSFORMATION (SEQ_GenerateKeys)

This creates the unique key for the newly inserted records. The default ports in the Sequence Generator are NEXTVAL and CURRVAL.

NEXTVAL:
Use the NEXTVAL port to generate a sequence of numbers by connecting it to a transformation or target. You connect
the NEXTVAL port to a downstream transformation to generate the sequence based on the Current Value and Increment
By properties.

Source :: Informatica Help Guide


5.1.4 EXPRESSION TRANSFORMATION (EXP_Keyprocessing_InsertNew)

Create a NEXTVAL (input) port in the Expression and connect the NEXTVAL from the Sequence Generator to it. Create one output port PM_BEGIN_DATE and pass the SYSDATE value to it. Also create PM_PrimaryKey as an output port and pass NEXTVAL to this port, as sketched below.
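
A sketch of the resulting ports in EXP_Keyprocessing_InsertNew (NEXTVAL is the input port fed by the Sequence Generator):

PM_PRIMARYKEY = NEXTVAL    -- output port: the surrogate key for the new row
PM_BEGIN_DATE = SYSDATE    -- output port: records when the row was inserted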

5.1.5 TARGET (EMP_TGT_SCD2)

Pass the values from the Update Strategy transformation (UPD_ForceInserts) into the target. Pass the PM_PrimaryKey and PM_BEGIN_DATE values from EXP_Keyprocessing_InsertNew to the respective ports in the target. PM_BEGIN_DATE records the system date at which the new record was inserted.

5.2.1 EXPRESSION TRANSFORMATION (EXP_Keyprocessing_InsertChanged)

Create NEXTVAL, NEW_PM_PrimaryKey and PM_BEGIN_DATE here. Connect the NEXTVAL from the Sequence Generator, pass NEXTVAL into NEW_PM_PrimaryKey, and pass SYSDATE to PM_BEGIN_DATE.

5.2.2. FILTER TRANSFORMATION (FIL_InsertChangedRecord)

Pass the values from the Source Qualifier to the filter. Also pass 'ChangedFlag' from EXP_DetectChanges to the filter.

Filter the records on 'ChangedFlag'.

5.2.3 UPDATE STRATEGY TRANSFORMATION (UPD_ChangedInserts)

Pass the values, including 'PM_PrimaryKey', from the Filter transformation (FIL_InsertChangedRecord) to the Update Strategy (UPD_ChangedInserts).

Change the update strategy expression to DD_INSERT.

5.2.4 TARGET (NEWTARGET2)

Pass the values from the Update Strategy (UPD_ChangedInserts) and from EXP_Keyprocessing_InsertChanged to the target.

5.3.1 FILTER TRANSFORMATION (FIL_UpdateChangedRecord)

Get the PM_PrimaryKey value from the lookup and the ChangedFlag from the Expression, and filter the records on ChangedFlag.

5.3.2. UPDATE STRATEGY (UPD_ChangedUpdate)

Pass PM_PrimaryKey into the Update Strategy and change the update strategy expression to DD_UPDATE.

5.3.3.TARGET

Pass the PM_PrimaryKey from the Update Strategy to the target.

6. Validate the mappings.

7. Save the mappings.

8. Go to Workflow Manager, create a session task and attach the respective mapping to the session. Edit the mapping properties with the valid source and target definitions. Change the "Target Load Type" to "Normal" for the target and uncheck the "Truncate target table" option so the history is preserved.

9. Save and start the task.

TYPE2 : KEEP FULL HISTORY OF CHANGES IN TARGET


FLAG | DATE | VERSION

VERSION

1. SOURCE

The source is imported from the database. Here the EMP table (say) is taken as the source.

2. SOURCE QUALIFIER

Data from the source is passed to the Source Qualifier, where the source datatypes are converted into Informatica datatypes. Its output ports pass the data on to the next transformation; here, the values are passed into the Lookup transformation.

3. LOOKUP TRANSFORMATION (LKP_Getdata)

Identify the columns that never change (preferably the primary or composite key columns). These values must be defined again in new ports with new names. In this example, EMPNO, ENAME and HIREDATE are defined as IN_EMPNO, IN_ENAME and IN_HIREDATE. Map EMPNO, ENAME and HIREDATE from the Source Qualifier to IN_EMPNO, IN_ENAME and IN_HIREDATE in the lookup (these are the only input ports). A unique key is created in addition to the lookup values; here PM_PrimaryKey is created.

a) In the Properties tab, define the lookup condition. Here we need to define:

EMPNO = IN_EMPNO, ENAME = IN_ENAME, HIREDATE = IN_HIREDATE

b) Check the other required properties, and also check the connectivity if it is a relational data source.

4. EXPRESSION TRANSFORMATION (EXP_DetectChanges)

a) Pass the values (except the input values defined in the lookup) from the Source Qualifier into the Expression transformation (EXP_DetectChanges). Create two output ports, CHANGEDFLAG (integer) and NEWFLAG (integer). Pass the unique key (PM_PrimaryKey) into the Expression transformation and map it. Create a comparison port for each of the values that can change (COMM, DEPTNO, JOB, MGR, SAL) and map the lookup values to these comparison ports.

b) Pass the lookup PM_PrimaryKey to the Expression PM_PrimaryKey.

c) Make sure CHANGEDFLAG and NEWFLAG are the only output ports here.

d) Logic for CHANGEDFLAG: if an existing row has changed or been updated, this port flags it.

IIF (NOT ISNULL (PM_PRIMARYKEY)
AND
(DECODE (COMM,PM_PREV_COMM,1,0) = 0
OR
DECODE (DEPTNO,PM_PREV_DEPTNO,1,0) = 0
OR
DECODE (JOB,PM_PREV_JOB,1,0) = 0
OR
DECODE (MGR,PM_PREV_MGR,1,0) = 0
OR
DECODE (SAL,PM_PREV_SAL,1,0) = 0
),TRUE,FALSE)

First, it checks that the primary key is not null (meaning the row must be an existing one). Next, it compares the existing columns with the new columns; DECODE() is used to compare the columns here.

DECODE:
Searches a port for a value you specify. If the function finds the value, it returns a result value, which you define. You
can build an unlimited number of searches within a DECODE function.

If you use DECODE to search for a value in a string port, you can either trim trailing blanks with the RTRIM function or
include the blanks in your search string.

Syntax
DECODE( value, first_search, first_result [, second_search, second_result]...[,default] )

Source :: Informatica Help Guide


DECODE compares the two values and returns '0' when they differ, so a '0' from any of the comparisons means that column has changed.

The changed rows are returned through the CHANGEDFLAG port.

e) Logic for NEWFLAG: for new records

IIF ( ISNULL (PM_PRIMARYKEY),TRUE,FALSE)

When a record is new, the lookup finds no PM_PrimaryKey for it, so the value is null. The logic therefore returns TRUE when the PM_PRIMARYKEY field is null, meaning a new record has arrived.

5.1.1 FILTER TRANSFORMATION (FIL_NewRecord)

Pass the values from the Source Qualifier. Add a new port named 'NewFlag' (integer) and map the 'NewFlag' in EXP_DetectChanges to 'NewFlag' in FIL_NewRecord. Apply the filter condition on 'NewFlag'; it passes only the newly inserted records.

5.1.2 UPDATE STRATEGY TRANSFORMATION (UPD_ForceInserts)

Pass the values from the Filter transformation (FIL_NewRecord) to the Update Strategy (UPD_ForceInserts).

Change the update strategy expression to DD_INSERT.


DD_Insert:

Flags records for insertion in an update strategy expression. DD_INSERT is equivalent to the integer literal 0.

Note: The DD_INSERT constant is designed for use in the Update Strategy transformation only. Informatica
recommends using DD_INSERT instead of the integer literal 0. It is easier to troubleshoot complex numeric expressions
if you use DD_INSERT.

When you run a workflow, select the data-driven update strategy to write records to a target based on this flag.

Source :: Informatica Help Guide


5.1.3. SEQUENCE GENERATOR TRANSFORMATION (SEQ_GenerateKeys)

This creates the unique key for the newly inserted records. The default ports in the Sequence Generator are NEXTVAL and CURRVAL.

NEXTVAL:
Use the NEXTVAL port to generate a sequence of numbers by connecting it to a transformation or target. You connect
the NEXTVAL port to a downstream transformation to generate the sequence based on the Current Value and Increment
By properties.

Source :: Informatica Help Guide


5.1.4 EXPRESSION TRANSFORMATION (EXP_KeyProcessing_InsertNew)

Create a NEXTVAL port in the Expression and connect the NEXTVAL from the Sequence Generator to it. Also create a PM_VERSION_NUMBER port and pass an initial version value (for example, 0) into it.

5.1.5 TARGET (EMP_TGT_SCD2)

Pass the values from the Update Strategy transformation (UPD_ForceInserts) into the target. Also pass the NEXTVAL and PM_VERSION_NUMBER values from the Expression transformation (EXP_KeyProcessing_InsertNew) into the PM_PrimaryKey and PM_VERSION_NUMBER columns defined in the target.

5.2.1 FILTER TRANSFORMATION (FIL_InsertChangedRecord)

Pass the values (other than the primary keys, i.e. the values that can change) from the Source Qualifier to the filter. Also pass PM_PrimaryKey from the Expression transformation (EXP_DetectChanges). Create a new port 'ChangedFlag' and map the 'ChangedFlag' value from EXP_DetectChanges to 'ChangedFlag' in FIL_InsertChangedRecord.

Filter the records on 'ChangedFlag'.

5.2.2 UPDATE STRATEGY TRANSFORMATION (UPD_ChangedInserts)

Pass the values, including 'PM_PrimaryKey', from the Filter transformation (FIL_InsertChangedRecord) to the Update Strategy (UPD_ChangedInserts).

Change the update strategy expression to DD_INSERT.

5.2.3 EXPRESSION TRANSFORMATION (EXP_KeyProcessing_InsertChanged)

Pass the PM_PrimaryKey values from UPD_ChangedInserts to EXP_KeyProcessing_InsertChanged. Edit the transformation with two new ports, NEW_PM_PrimaryKey and NEW_PM_VERSION_NUMBER, and pass the values:

NEW_PM_PrimaryKey = PM_PRIMARYKEY + 1

NEW_PM_VERSION_NUMBER = (PM_PRIMARYKEY + 1)

This creates the sequential keys for the updated records, placing each below its respective record.

5.2.4 TARGET (EMP_TGT_SCD21)

Pass the values from the Update Strategy (UPD_ChangedInserts) to the target. Also pass the NEW_PM_PrimaryKey and NEW_PM_VERSION_NUMBER values to the PM_PrimaryKey and PM_VERSION_NUMBER columns in the target.

6. Validate the mappings.

7. Save the mappings.

8. Go to Workflow Manager, create a session task and attach the respective mapping to the session. Edit the mapping properties with the valid source and target definitions. Change the "Target Load Type" to "Normal" for the target and uncheck the "Truncate target table" option so the history is preserved.

9. Save and start the task.

Mapping : How to find the number of success, rejected and bad records in the same mapping.
Explanation : In this mapping we will see how to find the number of success, rejected and bad records in one mapping.

 The source file is a flat file in .csv format. The table appears as shown below:

EMPNO  NAME    HIREDATE   SEX
100    RAJ     21-APR     M
101    JOHN    21-APR-08  M
102    MAON    08-APR     M
103            22-APR-08  M
105    SANTA   22-APR-08  F
104    SMITHA  22-APR-08  F
106                       M
 In the above table, a few values are missing, and the date format of a few records is improper. These must be treated as invalid records and loaded into the BAD_RECORDS table (a relational target).
 Other than records 2, 5 and 6, all are invalid records because of NULL values, an improper DATE format, or both.
 INVALID & VALID RECORDS ::
 First we separate this data using an Expression transformation, which flags each row with 1 or 0. The condition is as follows:
 IIF(NOT IS_DATE(HIREDATE,'DD-MON-YY') OR ISNULL(EMPNO) OR ISNULL(NAME) OR ISNULL(HIREDATE) OR ISNULL(SEX), 1, 0)
 FLAG = 1 is considered invalid data and FLAG = 0 is considered valid data. This data is routed to the next transformation using a Router transformation, with two user-defined groups: FLAG = 1 for invalid data and FLAG = 0 for valid data.
 The FLAG = 1 data is forwarded to an Expression transformation. Here we take one variable port and two output ports: one for the increment counter and the other to flag the row.
 INVALID RECORDS
 INCREMENT ::

PORT           EDIT EXPRESSION
COUNT_INVALID  V_PORT (output port)
V_PORT         V_PORT + 1 (variable port)

 INVALID DATE ::

PORT          EDIT EXPRESSION
INVALID_DATE  IIF(IS_DATE(O_HIREDATE,'DD-MON-YY'), O_HIREDATE, 'INVALID DATE')
 This data is moved to the BAD_RECORDS table, shown below:

EMPNO  NAME  HIREDATE      SEX  COUNT
100    RAJ   INVALID DATE  M    1
102    MAON  INVALID DATE  M    2
103    NULL  22-APR-08     M    3
106    NULL  NULL          M    4
 VALID RECORDS ::
 This branch carries the valid records. But here we do not want the employees who are 'F' (Female), so our goal is to load the MALE employee info into the SUCCESS_RECORDS target table.
 For this we use a Router transformation and declare the user group as follows:
 IIF( SEX='M',TRUE,FALSE)
 The default group captures the rejected records, which are the FEMALE employees.
 This data is passed to the REUSABLE Expression transformation, where the increment logic is applied to get the count of success and rejected records passing through, and the data is loaded into the target tables.
 The tables are shown below:
 SUCCESS_RECORDS ::

EMPNO  NAME  HIREDATE   SEX  COUNT
101    JOHN  22-APR-08  M    1

 REJECTED_RECORDS ::

EMPNO  NAME    HIREDATE     SEX  COUNT
105    SANTA   22-APR-2008  F    1
104    SMITHA  22-APR-2008  F    2

Mapping : Incremental data load using SQL override logic


In this mapping we will see how to implement incremental loading.

We go for incremental loading to speed up data loading by reducing the amount of data actually loaded.

There are different ways to implement incremental loads; one of them is writing a SQL override in the Source Qualifier using a mapping variable.

Explanation of the mapping: first, create one mapping variable of type date.

In the Source Qualifier, write the SQL override as follows:


SELECT SRC_INCRE_LOAD.EMPNO, SRC_INCRE_LOAD.ENAME, SRC_INCRE_LOAD.HIREDATE FROM
SRC_INCRE_LOAD WHERE SRC_INCRE_LOAD.HIREDATE > '$$V_LOAD_DATE'

In an Expression transformation, assign SYSDATE to the mapping variable, so that the next load picks up only the records newer than today's date:

Output port :: INCRE_LOAD
SETVARIABLE($$V_LOAD_DATE, SYSDATE)
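
For reference, a sketch of the mapping-variable definition this relies on (the name comes from the override above; the initial value is illustrative):

Name          : $$V_LOAD_DATE
Datatype      : date/time
Aggregation   : Max
Initial value : 01/01/1900 00:00:00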

Mapping : How to filter NULL-value data?


Explanation : In this mapping we will understand how to filter out improper NULL-value data.

To detect the NULL values in a table, we use the ISNULL() function.

 In general, when you take a flat-file or relational source, you find a few fields with NULL values. If, in your business process, this null data is invalid data, then it is necessary to filter out that data. The table below shows the source:

NAME    AGE
JOHN    23
SMITH   (NULL)
(NULL)  34
LUCKY   24
 In the above table there are 4 records, but only two are valid; the other two are invalid because in the 2nd record the AGE value is NULL and in the 3rd record the NAME is NULL. So it is essential to filter out the 2nd and 3rd records before loading into the target table. The table below shows how the target table appears with valid data:

NAME   AGE
JOHN   23
LUCKY  24

 The 2nd and 3rd records can be filtered out using the ISNULL() function, declared in the Filter transformation condition.
 The function appears like this: IIF(NOT ISNULL(NAME) AND NOT ISNULL(AGE), TRUE, FALSE)
 The Filter transformation passes only the TRUE records to the target and drops the FALSE records, which are the NULL-value records.

Mapping : How to filter the improper Date format data?


Explanation : In this mapping we will understand how to filter out improper date-format data.

To check the date format, we use the IS_DATE() function.

When you take a flat file as the source, there are chances of seeing an invalid DATE format, as shown in the source table below:

DATE
20-Apr
11-Mar-2008
12-Feb-2008
Feb-2008
If you look at the 1st and 4th records of the above table, you can see that the DATE value format is not correct: in the 1st record the year is missing and in the 4th the day is missing. In this case the invalid data must be removed before it is loaded into the target, as shown in the target table below:

DATE
11-Mar-2008
12-Feb-2008
 The above can be achieved by using the IS_DATE() function, declared in the Filter transformation condition.
 The function appears like this: IIF(IS_DATE(DATE, 'DD-MON-YYYY'),TRUE,FALSE)
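
Applied to the sample rows above, a sketch of how the function evaluates:

IS_DATE('20-Apr', 'DD-MON-YYYY')       -- FALSE: the year is missing
IS_DATE('11-Mar-2008', 'DD-MON-YYYY')  -- TRUE: matches the format
IS_DATE('Feb-2008', 'DD-MON-YYYY')     -- FALSE: the day is missing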



Mapping : How to correct data values with unwanted spaces?
Explanation : In this mapping we will understand how to remove spaces in the data. It is important that the data values are aligned equally; there should not be any stray spaces.

To trim these spaces, we use the LTRIM() and RTRIM() functions.

When you take a flat file as the source with a field named NAME, the table below shows the source data (note the leading spaces):

NAME

   Ravi

   Ramesh

   Swathi
Jack

In the above table you can clearly see that the NAME values are not aligned properly; to align the data equally we have to use the TRIM concept. In our mapping we load the above data into the target table, where it appears as shown below:

NAME

Ravi

Ramesh

Swathi

Jack
The above can be achieved by using the LTRIM() and RTRIM() functions in the expression editor, declared in an Expression transformation output port.

The function appears like this: LTRIM(RTRIM(NAME))

Mapping : How to convert a flat-file date datatype (String) into a relational date datatype (Date)?
Explanation : In this mapping we will understand how to use the TO_DATE() function. It is mainly used to convert the String datatype to the Date datatype.

When you take a flat file as the source with a field named DATE, its datatype is generally a String. The table below shows the source field:

Field Name  Datatype

DATE        String

If you want to load the above source data into a relational target, it is mandatory to match the datatypes. In a relational database the date datatype is generally Date. The table below shows the target field:

Field Name  Datatype

DATE        Date

To match the datatypes of the ports, we use the TO_DATE() function in the Expression transformation editor. The function appears like this: TO_DATE(DATE,'DD-MON-YY')
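
A worked sketch with a sample value (how a two-digit year resolves to a century follows the expression language's YY/RR handling):

TO_DATE('21-APR-08', 'DD-MON-YY')   -- returns the Date value 21 April 2008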

Mapping : How to use the "REPLACECHR" function in a mapping.

Explanation : Replaces characters in a string with a single character or no character. REPLACECHR searches the input string for the characters you specify and replaces all occurrences of all characters with the new character you specify.

Syntax

REPLACECHR( CaseFlag , InputString , OldCharSet , NewChar ) 

Example

REPLACECHR(0 , IN_PHONE , 'abcdefghijklmnopqrstuvwxyz!@#$%^&*(1234567890' , NULL)

CaseFlag = 0 makes the search case insensitive; 1 makes it case sensitive.
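
A worked sketch with a hypothetical input value:

REPLACECHR(0, '(040) 123-4567', '()- ', NULL)   -- returns '0401234567'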

Mapping : How to use the "SUBSTR" function in a mapping.


Explanation : Returns a portion of a string. SUBSTR counts all characters, including blanks, starting at the
beginning of the string.

Syntax

SUBSTR( string , start [, length ] )

Example

SUBSTR(IN_PHONE, 1, 3)
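
A worked sketch with a hypothetical value:

SUBSTR('5551234567', 1, 3)   -- returns '555'
SUBSTR('5551234567', 4)      -- returns '1234567' (with length omitted, the rest of the string)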

 
Mapping : Design a mapping to load the valid source records into warehouse.
Solution : Source : Flatfile
Target : Relational
Informatica version 7.1.1
Database : Oracle

Logic :: Use four conditions in the Router transformation, driven by a Sequence Generator's NEXTVAL, so that rows are distributed cyclically across four groups:

FIRST  :: NEXTVAL % 4 = 1
SECOND :: NEXTVAL % 4 = 2
THIRD  :: NEXTVAL % 4 = 3
FOURTH :: NEXTVAL % 4 = 0

Mapping : Design a mapping that generates a sequence of numbers without using a Sequence Generator.
Solution : Source : Flatfile
Target : Relational
Database : Oracle

Note : this uses the SETMAXVARIABLE() function and mapping variables, as sketched below.
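
A minimal sketch of the Expression-transformation ports (the persistent mapping variable $$V_MAX_SEQ is a hypothetical name; it holds the highest key generated in the previous run):

V_CNT = V_CNT + 1                            -- variable port: numeric variable ports start at 0, so this counts 1, 2, 3, ...
V_NEW = $$V_MAX_SEQ + V_CNT                  -- variable port: offsets the counter by the last run's high-water mark
V_SET = SETMAXVARIABLE($$V_MAX_SEQ, V_NEW)   -- variable port: the new maximum is saved to the repository at session end
O_SEQ = V_NEW                                -- output port: connect this to the target key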

Mapping : Design a mapping to separate out or remove duplicate records.

Solution : Source : Flatfile
Target : Relational
Database : Oracle

Tip : this is implemented with a mapping variable port, as sketched below. The same can be done using a dynamic lookup or an Aggregator transformation.
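
A minimal sketch of the variable-port approach, assuming the input is sorted on the key column (EMPNO is illustrative):

V_IS_DUP     = IIF(EMPNO = V_PREV_EMPNO, 1, 0)   -- compares with the previous row's key (evaluated before the port below refreshes it)
V_PREV_EMPNO = EMPNO                             -- variable port: remembers this row's key for the next row
O_IS_DUP     = V_IS_DUP                          -- output port: filter downstream on O_IS_DUP = 0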

Mapping : Load the first half of the source to one target and the second half to the other target.
Solution : Source : Flatfile
Target : Relational
Database : Oracle

Tip : use a stored procedure to count the records, then split on a running row number, as sketched below.
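
A sketch of the split, assuming the stored procedure's row count reaches the Expression as a port named TOTAL_CNT (hypothetical):

V_ROWNUM = V_ROWNUM + 1                            -- variable port: running row number
O_GROUP  = IIF(V_ROWNUM <= TOTAL_CNT / 2, 1, 2)    -- Router groups: 1 = first target, 2 = second target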

