
ODI Series Part 1

Now that HAL has really reached the end of its shelf life, what do people turn to? Well, I see a lot of people
assuming that EPMA must be the answer and ending up wishing they had never taken that route. If
you load hierarchies from files or manage them straight from a repository then ODI (Oracle Data
Integrator) could be the way you want to go; it offers all this functionality and so much more.
ODI sits in the Oracle Fusion Middleware suite of products and is basically an Extract, Load and
Transform (E-LT) tool; it used to be owned by Sunopsis until it was swallowed by the hungry jaws of
Oracle. It has built-in connectivity to all the major databases and now has the added bonus of
Essbase and Planning adaptors which can be plugged straight into it.
ODI includes components known as Knowledge Modules (KMs), which contain the
information ODI needs to perform a specific set of tasks against a specific technology such as
Planning.
There are 6 different types of KMs and some have already been created for use with planning
and essbase.
The types of KMs you are likely to encounter are :
RKM (Reverse Knowledge Modules) are used to perform a customized reverse-engineering of
data models for a specific technology.
LKM (Loading Knowledge Modules) are used to extract data from the source database
tables and other systems (files, middleware, mainframe, etc.).
IKM (Integration Knowledge Modules) are used to integrate (load) data to the target tables.

Enough of the blurb; I am going to try and cover how you can use ODI with planning and
essbase, though as it is quite an in-depth topic I am going to break it down into sections,
starting today with installation and part of the configuration.
First of all you will need to download the ODI installation package from edelivery, (Oracle Data
Integrator, Oracle Data Profiling, and Oracle Data Quality for Data Integrator 10g (10.1.3.4.0)
for Windows Platforms (CD) 450M)
The Planning and Essbase ODI adaptors from edelivery
Oracle Data Integrator Adapter for Hyperion Planning Release 9.3.1.1 Installation files
Oracle Data Integrator Knowledge Module for Essbase Release 9.3.1.1 Installation files
Updated 26/10/08 - I suggest installing ODI Version 10.1.3.4.5 or above, this will have all
the latest updated KMs and includes all necessary drivers; if you install the patch you will
not need the Planning/Essbase ODI installation files.
Download Version 10.1.3.5.0 for all the latest KMs

I have set up ODI against a 9.3.1 environment before but this time I am going to put it against an
11 environment; the setup shouldn't be any different, though I am hoping the adapter files will still
work as I have not seen any for 11 yet.
If you are using SQL Server, as I am, for the database repository then you will also need JDBC
drivers; I am going to use the 2005 drivers, which can easily be downloaded from Microsoft.
You will need JRE 1.4+ (ODI has JRE 1.4.2 included) or JDK 1.5+ if you are going to use the
web services of ODI. I intend to demonstrate the web service functionality to run ODI jobs so I
also downloaded Tomcat (version 5.5 or 6 is fine) for the web server and Axis2 as the web services
container. (It has to be Axis2 version 1.2 and it is easiest to get the war version)
The installation of ODI is very simple: just double click the setup.bat inside the \setup\Windows\
directory; you will be warned if your machine does not pass the pre-requisites. Once in the
product selection choose Oracle Data Integrator 10.1.3.4.0, choose Complete, which just means
the client and the server will be installed, select the ODI home and that's pretty much it.
Next step is to copy over the drivers into the odi file structure (\OraHome_1\oracledi\drivers)
If you are using SQL server copy over sqljdbc.jar
For essbase copy over all the jar files from the drivers directory of Oracle Data Integrator
Knowledge Module for Essbase Release 9.3.1.1 Installation files
Repeat for Planning, some of the files will already exist so just keep the latest ones.
So you should have something like this

To use ODI you will need to create two database repositories these are known as the Master
Repository and the Work Repository.
The Master holds the structure of the different types of technologies, security and version
management of projects and models
The Work holds all the information on data models, projects and how they are used.
If you are using Oracle, grant the schema user connect and resource; for SQL Server I just made them
dbo.

Once the databases have been created you can create the ODI master repository.

Now because I am using SQL Server 2005 I had to manually enter the driver and URL string, as ODI
only includes preset values for SQL Server 2000. If you are using Oracle you can easily select the
driver and just update the URL.
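For reference, the values I ended up entering were along these lines (the server, port and database name are placeholders for my environment, so substitute your own):

Driver :- com.microsoft.sqlserver.jdbc.SQLServerDriver
URL :- jdbc:sqlserver://myserver:1433;databaseName=ODI_MASTER;selectMethod=cursor

The selectMethod=cursor property is not mandatory but is often suggested when pointing ODI at SQL Server.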

We now have our Master repository created so we can move on into the topology manager. (I
don't think I will ever get used to that name)
The topology manager lets you manage all your technologies such as essbase and planning, so
say for planning it is where you set up the connection details to your planning app (or in ODI
speak, a Data Server).
When you start up the topology manager you have to connect to a Master Repository;
unfortunately this is more manual work entering all the JDBC details again.
Click the new button when the security repository connection window opens.
You need to give a login name, which can be anything, and a username and password; the default
is SUPERVISOR/SUNOPSIS

OK, we are in, and the first thing we need to set up is the work repository and, you guessed it, yet
more entering of the connection details. There is a tab at the bottom of the topology manager
with AB on it; that means repositories.

Insert a new work repository and enter all the details in the definition and JDBC tabbed window.

When you click test you will be asked to pick an agent; you can just choose Local (No Agent) for
now, details on agents shortly. You will then be asked for a name for the work repository and an
id; the id can be any numeric value. I have always left the Type as Development, the other option
is Execution.
In the topology manager the Hyperion Essbase technology has already been included but
Planning needs adding in. Go to the first tab and choose to import topology.

Locate the impexp folder of the planning installation files.

All being well Hyperion Planning should then appear in the technologies list.
Well that is it for this session; as I said I want to try and break it up into lots of bite-size chunks
instead of one endless blog.
Next session will be setting up of the agent and then configuring some of the technologies such
as planning.

ODI Series Part 2 - The Agent


Ok, so ODI is installed, so we are now going to look at setting up an agent.
An agent is a Java service that acts as a listener on a designated TCP/IP port; the agent allows
execution of sessions such as model reverses, packages, scenarios and interfaces, you will learn
more about these later.
You can set an agent up to run as a Windows service or from a batch script; I am going to show
both methods.
First of all you need to locate and edit the file odiparams.bat that resides in the ODI bin directory.

The section you need to update is after


rem
rem Repository Connection Information
rem

Update the Driver and URL to match your ODI master repository; you should be used to doing this by
now. The username and password are the account details for the Master repository database. You will
notice you have to enter an encoded password; there is a batch utility in the bin directory that will
encode passwords for you.

Update the ODI_SECU_WORK_REP to match the name of your work repository that you set up
in the topology manager.
You can leave the ODI_USER and pass alone if you have not changed the security for ODI.
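To give a rough idea, after editing, the repository section of my odiparams.bat looked something like the sketch below; the driver, URL, user and encoded password are specific to my environment and the encoded value shown is just a placeholder (on my install the password encoder could be run from the bin directory with agent ENCODE <password>):

set ODI_SECU_DRIVER=com.microsoft.sqlserver.jdbc.SQLServerDriver
set ODI_SECU_URL=jdbc:sqlserver://myserver:1433;databaseName=ODI_MASTER;selectMethod=cursor
set ODI_SECU_USER=odi_master
set ODI_SECU_ENCODED_PASS=a1bC2dE3fG4h
set ODI_SECU_WORK_REP=WORKREP
rem ODI_USER and ODI_ENCODED_PASS were left at the defaults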

To start the agent you will just need to create a simple batch script.
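As a rough sketch (the ODI home, port number and agent name are just the values I used, so change them to suit), the start script only needs to change to the bin directory and call agent.bat with a port and a name:

cd /d E:\OraHome_1\oracledi\bin
agent "-PORT=20910" "-NAME=localagent"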

The port can be set to anything you like as long as you know it won't conflict with anything else.
Next add another script to stop the agent
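Again as a sketch with my values, the stop script just calls agentstop.bat with the same port:

cd /d E:\OraHome_1\oracledi\bin
agentstop "-PORT=20910"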

Before you can start the agent a few more configuration steps are required in the topology
manager.
First you will need to insert an agent and give it a name, enter the host and the port you defined
in the batch script.

You can't actually test yet as the agent is not running.


Next I am going to create a context; a context is basically a way of grouping different
components together, so you can have, say, a development context which will bring together
certain technologies, databases, applications and agents.
There is already a global context available but I always like to set extra ones up for flexibility;
the context section can be accessed from the second tab.

Finally a logical agent needs to be created which defines the link between the agent and the
context.

Once this is completed you should be able to start up the agent.

Just to be sure the agent is actually working you can go back and test it from the topology
manager.

Well that's all well and fine but some people like to have everything running as a Windows
service; as long as you have updated the odiparams.bat then it is simple.

Just make sure you create the service with the same physical agent name and the same port as
you set up in the topology manager.
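I won't paste the screenshot, but as a rough idea the service is created with the agentservice utility in the bin directory; on my install the command was along the lines of the following, where -i installs the service and -a creates a listener agent (run agentservice with no arguments to check the exact usage on your version):

agentservice -i -a localagent 20910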
You can check the service is running by opening the agentservice.log in the bin directory.
One thing to watch out for is if your ODI_JAVA_HOME is set to a directory with spaces; if
you are just using the default install then you will be fine, otherwise you will need to add quotes
to the Java config in the odiparams.bat file, changing
set ODI_JAVA_EXE=%ODI_JAVA_HOME%\bin\java.exe
set ODI_JAVAW_EXE=%ODI_JAVA_HOME%\bin\javaw.exe
set ODI_JAVAC_EXE=%ODI_JAVA_HOME%\bin\javac.exe
To
set ODI_JAVA_EXE="%ODI_JAVA_HOME%\bin\java.exe"
set ODI_JAVAW_EXE="%ODI_JAVA_HOME%\bin\javaw.exe"
set ODI_JAVAC_EXE="%ODI_JAVA_HOME%\bin\javac.exe"

Once the service has been installed remove the quotes


Well that completes another session; next time I will be looking at configuring the topology
manager to point to a planning application, essbase database, file directory and SQL database
tables before we move on to the designer.

ODI Series Part 3 - The configuration continues


Part 3 has arrived and I can go through the remaining configuration elements inside the topology
manager. The goal is to be able to load/extract hierarchy details from flat files, CSV files and
database tables into Planning and essbase; to do this we need to set up connection details to
essbase, planning, a SQL database and a file store. It is going to be screenshot heaven today as
sometimes a picture explains much more than words can.
Fire up the topology manager and go straight into the Physical architecture tab. I will go through
each technology in alphabetical order, so we start with the file set up.
If you are going to write to or load from files in ODI you need to define a physical location; for this
exercise the files are going to be in E:\FileStore
If you expand File you will see that there is already a Data Server created called FILE_GENERIC;
I am going to ignore that and set up a new one.
Right click and Insert Data Server
(A physical component that stores and returns data is defined as a Data Server . A data server is
always linked to a single technology.)

Give the Data Server a name

Added 28/09/08. For the JDBC tab you need to have the following driver settings :-
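For the record, the values to enter are ODI's built-in flat file driver, the same values used by the default FILE_GENERIC data server:

JDBC Driver :- com.sunopsis.jdbc.driver.file.FileDriver
JDBC URL :- jdbc:snps:dbfile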

Once you hit apply the physical schema window will appear, this is where you enter details for
the location of the files and set up the context details
An ODI Physical Schema corresponds to a pair of schemas:
A (Data) Schema, in which Oracle Data Integrator will look for the source and target data
structures for the interfaces.
A Work Schema in which Oracle Data Integrator can create and manipulate temporary work
data structures associated to the sources and targets contained in the Data Schema.
You need to manually enter the location into the Directory (Schema) and the Directory (Work
Schema). Most screens in ODI are generic default screens, so no matter what technology you are using
you will be presented with the same options, so don't worry if you keep asking the question
Why?
I have not found a requirement yet to use different values between the data and work schema.
There is no such functionality as browsing to the file location so make sure you enter the details
correctly.

Next you need to point it to a context, setting up a context was covered in the last part. Add a
new context, you will notice that the context that was set up earlier is populated as we made it
default, if you have a number of contexts you can manually select a different one. Finally we
need to provide a schema name where information will be stored in the repository.

Onto Essbase, I am going to set up a connection to Sample/Basic, it is pretty much the same as
before only when you insert the data server you will have to enter the essbase server connection
details.

One thing to be aware of is not to hit the test button; this is only for use with JDBC connections so it
will fail for products like essbase/planning.
Filling in the physical schema is all manual, so don't expect the drop-downs to populate with
applications and databases. This is one of the problems with ODI: you never really know if you
have got things right until later on because you can't test and there is no auto-populating.

Setting up the planning details is pretty much similar but you must put the RMI port next to the
server details; the default port is 11333. I am setting up a connection to my sample application
(plansamp).

The next technology to set up is a connection to a database so I can access and write to tables; for
me it will be the Microsoft SQL Server technology but it is very similar for an Oracle setup. I have
set up a database called HypData. Once again it is pretty much the same sort of setup but this
time you enter the JDBC connection details, and yes, you can even hit the test button.

As this time we are using a JDBC connection, when you set up the physical schema you don't
have to manually enter information; the drop-down boxes are auto-populated, the only difference
is you have to select the database name and the owner.

Create a name for the logical schema

Ok, the final technology we have to make changes to is the Sunopsis Engine; this is the
technology that can be used when you are transforming data, so for example mapping fields from a
flat file to planning fields.
The technology is already set up and you just have to point the context to it.

Well, the configuration is done for what we are trying to achieve; depending on how you are going to
use ODI you will no doubt increase the number of technologies you use. It is a pretty powerful
tool and we are just scratching the surface, which may suffice for most users.
That concludes today's session; next we will move on to the real centre of the product, the
designer, where we create the interfaces to move the data to and from planning/essbase.

ODI Series 4 - Enter the Designer


Another blog and yes more ODI, today the designer comes in to play. The designer is where you
perform the main integration work e.g. loading hierarchies from flat files or database tables into
planning or transforming data before it is loaded into essbase. The designer stores all this
information in the work repository.
The main elements to the designer are models and projects; a model is a set of datastores
corresponding to data structures contained in a physical schema. In a model you will perform the
reverse engineering for instance reverse-engineering the dimensions and properties from a
planning application into a readable column format. The RKM (Reverse Knowledge Module) for
planning should have already been downloaded (see the first part of the ODI series to find out
what you need)
A project is a group of objects developed using ODI; the components of a project are:
Packages :- A package is a sequence of steps organized into a flow diagram, so say you wanted to
wait for a file to arrive, load it into a planning dimension, push it to essbase and then send a
success or failure email, then the package is the place for this.
Interface :- An interface is really a set of rules that define the loading of a target Datastore from one or
more source Datastores; an example of an interface is loading members into a planning dimension.
Procedure :- A procedure is a reusable component that groups operations that do not fit into the
interface category, for example sending a batch of files through FTP.
Variable :- This is a value which is stored in ODI, this value can be set or evaluated to create
conditions in packages, it is pretty much what you would expect a variable to be.
Sequence :- A sequence is a variable automatically incremented when used.
Knowledge Modules :- I described these in Part 1 of this series so if you need a refresher just
have a quick look back.
You will get a much clearer understanding of the different components as we start to use them.
The first time you start the designer you will have to set up the connection to the master
repository and work repository; you should be pretty used to this by now and I promise it is the
last time you will have to do it. It is the same as when you first set up the Topology manager.

The first objective I have set myself to achieve in the designer is to load dimension information
from a flat file into the sample planning application, if you have been following then you will
have already set up connection information to planning and to the flat file directory in the
topology manager.
Before we start, make sure you have extracted the KM files (the files starting KM_) from the impexp
directory of the downloaded installation files
Oracle Data Integrator Adapter for Hyperion Planning Release 9.3.1.1 Installation files
KM_IKM SQL to Hyperion Planning.xml
KM_RKM Hyperion Planning.xml
Oracle Data Integrator Knowledge Module for Essbase Release 9.3.1.1 Installation files
KM_IKM SQL to Hyperion Essbase (DATA).xml
KM_IKM SQL to Hyperion Essbase (METADATA).xml
KM_LKM Hyperion Essbase DATA to SQL.xml
KM_LKM Hyperion Essbase METADATA to SQL.xml
KM_RKM Hyperion Essbase.xml
Extract them to %ODI_HOME%\oracledi\impexp
This is just so all the knowledge modules are stored in one place and simpler for importing.
Once you are in the designer, the first thing to do is create a new project.

You will notice most of the components I described earlier have been created.
At the moment all we are going to do is import the required KMs, right click the project name
(ODI_DEMO_PROJECT) and select Import Knowledge Modules. Select the folder
%ODI_HOME%\oracledi\impexp and a full list of the KMs available should be displayed.
As the first objective is just to use a flat file and planning we will require the following KMs
RKM Hyperion Planning (reverse engineer the planning application)
LKM File to SQL (load data from a file to a temp area using SQL, the temp area is controlled by
the Sunopsis memory manager)
IKM SQL to Hyperion Planning (move the data from temp area and load it into the planning
repository using SQL)

You may have been wondering what the format of your flat file needs to be and this is where the
RKM for planning comes in, setting up the RKM is done in the models area.
First you need to create a models folder; it is nothing more than a folder and lets you maintain
your models. Click the Models tab then Insert Model Folder (first button) and finally give it a
name.

Next step is to insert a model. The first model to set up is a datastore to hold all the reverse
engineering of the planning app dimensions; when you insert the model you have to name it and
choose the technology, in this case it will be Hyperion Planning, and on selection the logical schema
will be populated with the schema name you applied in the topology manager.

Now on to the reverse tab, this is where the clever KM will be used to reverse engineer the
planning dimensions.
Select Customized; choose the context and Logical agent (all of these have been set up from the
previous ODI blog)
Select RKM Hyperion Planning from the KM dropdown; it will have the project name appended
to it.

Apply the changes, click Reverse and let the KM do its magic.
How do you know it has completed? Well, a couple of ways: just watch task manager and wait for
the CPU process to complete, or officially look in the Operator. Now you shout, what the hell is
the Operator? The Operator is basically a GUI that displays the status of executions from ODI that
are carried out through the use of the Agent.
I am not really going to go into any more detail about the Operator until later in the series when
most of the integrations are complete, but for now you can access it from the start menu or
directly from each ODI application, in the designer Windows > Open Module > Operator.

So it looks like our reverse has completed successfully.

If you expand the model you will see each of the available dimensions has been reversed and placed
in a datastore with the same name as the dimension; there is also an extra store for UDAs.
At present you cannot reverse engineer or load to Year/Period/Version/Scenario, which is just like
HAL.
Just to point out, I have had a couple of occasions where the reverse says it has completed but not all
the dimensions in the datastore have been populated; if this is the case just run the reverse again.
Expanding a datastore such as Account and then columns, the fields that can be populated will be
displayed.

This is where it should start to click in your mind what you are trying to achieve: from a flat file
you want to populate some of the above fields, which will then be loaded into your planning app.
Which fields you populate is up to you, but obviously Account (member)/Parent are required; if
you don't populate the other fields then defaults will be used.
Most of the fields are self-explanatory and relate straight to planning, if you have ever used HAL
they will be very familiar. If you have never used HAL then the one that stands out the most is
the Operation field.
Operation can have the following values.
Update - This is the default and is used if not populated; it adds, updates or moves the member
being loaded.
Delete Level0 - Deletes the member being loaded if it has no children.
Delete Idescendants - Deletes the member being loaded and all of its descendants.
Delete Descendants - Deletes the descendants of the member being loaded, but does not delete the
member itself.
Just a slight deviation, but I am going to demonstrate what happens when you have not set up
the connection to the planning app correctly in the topology manager; I just renamed the planning app
from plansamp to plansamp1 in the topology manager and ran the reverse again.

I have found with ODI that sometimes the error messages are pretty obscure; this one is one of the
better ones, "The application plansamp1 is invalid" gives you a rough idea, though this is the
first opportunity you have had to check that you set up the connection correctly.
So now we have reversed our dimensions, a csv hierarchy load file can be created. If you like you
can just create a file with headings that relate to all the reversed fields.

For example, say I was going to add a new member as a child of Account.

The csv file would look something like the example below. From this you should be able to build up a
file to match your planning application pretty easily; once you have completed the file make sure you
save it in the file store directory you set up in the topology manager, in my case E:\FileStore\AccLoad.csv
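As a rough illustration, with a header row matching the reversed column names and one member row (the member name, alias and choice of property columns here are just examples, use whichever reversed fields you need):

Parent,Account,Default_Alias,Data Storage,Operation
Account,MusicSales,Music Sales,Store,Update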
Now we have to create a model and datastore that will pick up the file.

Choose File as the technology and choose the logical schema you created originally in the
topology manager.

In the reverse tab select your context.

Insert a new DataStore and give it a name, and if you click the browse for file button it should open
up in the correct directory and you can select the csv file.

In the Files tab, as we are using a CSV file, choose File Format as Delimited; the file has a
header row so that was set to 1, and the field separator was set to Other and as a comma (,).
If you have been following this guide then you may have to go back into the topology manager
as I forgot to set the JDBC options for the File set up; I have corrected the previous blog but this
is what you need to make sure you have set.

Back to the designer, click the Columns tab then the Reverse button and all being well the file
structure should be retrieved.

One thing to watch out for if you are using formulas: the physical and logical length is set to 50
and formulas can easily surpass that, so you will need to increase the length here and also in the
reversed dimension column setup, where it defaults to 80.

Ok, we have created the source and target elements so now we have to bring them together, and
this is done by creating an interface; back to the Projects tab and insert a new
interface.

Give it a name and make sure the context is selected, next check Staging Area Different From
Target and choose SUNOPSIS_MEMORY_ENGINE.
This allows any transformations between the source and target to be handled by the memory
engine, an example being if you have a column header named TB in the source file and in the
planning target it goes to Time Balance then the memory engine will handle this mapping.

On to the Diagram tab; this is where you drag and drop your source and target datastores. It's
certainly not the best graphically designed section and I remember thinking it was hideous the
first time I saw it.
First drag your Account flat file datastore from the models window into the source window and
then drag the reversed Account dimension datastore into the target window, selecting yes for the
automatic mapping. What this will do is automatically map the column headings of your file to
the target planning dimension; if you create the file with exactly the same headings as the
dimension properties then you will have less work to do.
Anyway you should end up with this lovely looking organised window.

Just stretch the windows to make it more appealing and understandable.


Now we have to map the remaining source and targets; the ones that have been mapped will
show as red in the source. By the way, the S in the source/target windows stands for String, as all the
fields we are using are String based. In the target window the members that have been
automatically mapped will be displayed with an icon that stands for Source, basically meaning
the source is the same as the target.
To map, click on a name in the target window; this should display the mapping window below.
Tick the Active Mapping box and in the implementation window enter the three-letter alias of the
source datastore and the column name, so for the target of Alias: Default the implementation window
will have ACC.Default_Alias. Finally click the Staging Area button, as we need to use the
Sunopsis memory engine to perform the mapping.

If you like you don't have to manually enter the name; you can click the Launch Expression
Editor button and select from the window.

The expression editor has a whole raft of functions that can be used, but you probably won't use
them for basic mappings.
The process needs to be repeated for the remaining source properties.

You can also force the value in the target mapping to be a fixed value, so say everything was
going to be put against a Source Plan Type of "Consol", you wouldn't need that in your source
file and you can just enter the value into the mapping like so :-

If you are going to do this make sure the Active Mapping box is ticked and execute on the
staging area is selected.
Once you have completed all the mappings click on the Flow tab and a diagram should be
displayed outlining the process, if you click on the Target there are a number of options available
like setting up error logging and the ability to refresh to essbase after the load has completed.

After applying changes you can execute the interface and the data in the file should be loaded
into planning, you can check in the Operator to see how the process went.

You should also check the log files that were generated; you may see this in the log
2008-09-28 13:15:04,109 DEBUG [DwgCmdExecutionThread]: Number of columns in the source result set does
not match the number of planning target columns.

I wouldn't worry about the message, as it is not really an error.

In planning your member(s) should have been loaded.

So there we go, we have completed the objective of loading information from a flat file into a
planning dimension; from here you should be able to do the same for other dimensions.
I am going to wrap it up there for today; hopefully it all makes sense and you can go off and
create your own load interfaces.
Next time I will look into loading data from a SQL table to planning and look at what can be
achieved with ODI and essbase.

ODI Series Part 5 - SQL to Planning


The ODI series continues; we are already up to part 5 and still lots to cover. In the last part we
looked at loading hierarchies into planning from flat files, but there are many occasions when the
metadata will be sourced from database warehouses. Today I am going to cover loading metadata
directly from db tables into planning.
In part 3 I set up a connection to SQL Server and to a database named hypdata; if your warehouse
is Oracle then it is pretty much the same setup just using different JDBC drivers.
For this exercise I have set up two simple db tables, one containing the parent/child entity
information and the other containing the member properties.

Now there are a few methods of getting them loaded into planning; the first and simplest is to
create a view in SQL Server or Oracle to join the tables on the child/member fields.
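As a sketch of the sort of view I mean, joining on the child/member fields (the property columns shown are just examples, use whatever columns your properties table holds):

CREATE VIEW HIER_VIEW AS
SELECT m.PARENT,
       m.CHILD,
       m.DIMENSION,
       p.DEFAULT_ALIAS,
       p.DATA_STORAGE
FROM HIER_MEMBERS m
INNER JOIN HIER_PROPERTIES p ON p.MEMBER = m.CHILD;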

I need to insert a new model in the ODI designer pointing to the SQL Server technology.

You have the option of manually inserting the datastores to point to the db tables/views or you
can reverse everything from the Model Reverse tab.

For some reason it didn't reverse the view, so I just checked View, unchecked Table and reversed
again; this time it reversed the view.

Ok, nice and easy, we just need to create an interface to load the data from the view into the
planning entity dimension.
Just before we do this we will need an extra Knowledge Module, an LKM (Loading
KM); we previously used LKM File to SQL but this time we are loading using SQL.
Import the KM from the Project window in the Designer and choose LKM SQL to SQL

And it should appear in your LKM list in the designer.


Insert a new interface

Name it; set the context and the staging area, which once again is going to be the
Sunopsis memory engine.

In the Diagram drag the View as the source and the planning entity dimension as the target, map
the fields and you should end up with something like the image above.
In the HIER_MEMBERS table and HIER_VIEW view there is a dimension field; currently there
is only Entity information in there, but say we started adding extra dimension member
information, then the above interface would try and load data that is not required for entities.
This can be easily resolved without having to go back and change the view, and to be honest
changing the view wouldn't be a good idea because we would end up with a view for each
dimension.
This is where filters come into play: in the diagram source window drag the dimension field off
the window and this will create a filter, then you just need to set the filter to a value, in our case it
will be entities.
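The filter itself is just a SQL condition; in my case the expression in the implementation window ended up along the lines of the following (the alias and the literal will depend on your datastore and on the values actually held in the dimension column):

HIER_VIEW.DIMENSION = 'Entity'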

To check the filter is correct you can click the "Check the expression in the DBMS" button that
is on the right hand side of the implementation window. You can also create filters on the view in
the model tab.
Your flow diagram should look like this.

I manually added information for the log details in the Options window; this time I have also set
REFRESH_DATABASE to Yes, so after the metadata has been loaded into planning it will
push it down to essbase.

Executing the interface produced no errors in the log, and opening the outline in EAS displayed
the new members, so the refresh worked fine.
As I said earlier, there were a few methods of loading from two tables, one being with the use of a
view. Now say you didn't have access to the SQL repository to create a view, or for some reason it
wasn't deemed the right solution, then you can achieve the same results by creating the join from
within ODI.
To achieve this create a new interface and fill in the definition tab in the same way as before, in
the diagram tab drag and drop both datastores HIER_MEMBERS & HIER_PROPERTIES into
the source window.
Now in the source window drag CHILD from HIER_MEMBERS onto MEMBER in
HIER_PROPERTIES; this will create a join between the datastores. You can then just drag the
planning entity dimension into the target and map all the fields.
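The join that gets generated is a simple equality between the two columns, i.e. something like:

HIER_MEMBERS.CHILD = HIER_PROPERTIES.MEMBER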

And the final flow diagram should look like this.

I know this is quite a simple example but hopefully it does start to show what can be done with
ODI and generate a few ideas on how you can incorporate it into your own solutions.
I am going to leave it there for Planning; next time I will move on to using ODI with Essbase, and
I will start to look at reverse engineering a database outline and outputting to different formats.

ODI Series Part 6 - Data load to planning


Just as I thought I would be leaving planning and ODI alone for a while, a question has popped
up on the Oracle forums that is probably easier to answer through a blog.
Basically the question is how it is possible to load data into planning against a member with a type of
Text. I did skip a blog about loading data into planning through ODI because I felt it is
probably more appropriate to load data directly into essbase using the essbase KMs; no matter
which method you use it all ends up in essbase anyway.
Seeing as the question has been asked, I will show how data can be loaded using the planning
KM and then move on to the Text problem.
As usual I am going to demonstrate on the planning sample application so it can be easily
replicated. When you load data into planning you first need to set what the data load dimension
will be, plus you will need a driver dimension and the members of it you want to load data for.
I am going to set Account as the Data Load Dimension and use Segment as the Driver dimension
against the member MP3.
To do this choose Administration > Data Load Administration from planning web.

If you are interested in what happens when you hit save, then basically it writes information by
object id into the planning app db table HSP_DRIVER_MEMBER

So with a quick bit of SQL you can output the information you entered into planning web.
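I'm not going to document the table structures, but as a rough sketch the ids held in HSP_DRIVER_MEMBER can be resolved to names through HSP_OBJECT; for example (50501 is just a made-up id, take the ids from the first query):

SELECT * FROM HSP_DRIVER_MEMBER;
SELECT OBJECT_ID, OBJECT_NAME FROM HSP_OBJECT WHERE OBJECT_ID = 50501;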

Anyway, once you have set these details you can follow the reverse engineering process for a
planning application (if you need a refresher look HERE)


I created a new Model to generate the reverse for.

After running the Reverse you will have more columns in the Account DataStore than if you
had not set up the data load settings in planning.

The extra columns are :- Data Load Cube Name, which defines which essbase cube you will
load the data to; MP3, which is the Segment member we set in the driver dimension set up;
and Point-of-View, which is used to hold the remaining dimension combinations for the data
load.

I added three account members, MusicSales1 to 3, all with a data type of Text. (You don't have
to set the members as text, this is just ready for the next blog on loading text data)

Also I made sure that Account was selected in the Evaluation Order, as otherwise you will not be
able to use the text functionality in web forms.

I created a quick form to display data against the new members and for Segment MP3, now I can
produce the flat file.

As I picked Account as the data load dimension the data needed to be by account; column B
is the Data Load Cube Name, which in this case is the Consol database, and the POV was set to the
remaining dimension members, each separated with a comma:
Currency :- Local
Entity :- NY
Period :- Jan
Scenario :- Actual
Version :- FY08
The final column contains the data and is against MP3, as we set that as the driver dimension
member.
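Putting that together, a row of my load file looked something like this (the column headings come from the reversed Account datastore and the data value is just an example):

Account,Data Load Cube Name,Point-of-View,MP3
MusicSales1,Consol,"Local,NY,Jan,Actual,FY08",100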
Ok, back to the ODI designer and set up a new DataStore

I created this as a child of a flat file Model and pointed the Resource Name to the CSV file.

File Format of Delimited, the file has a heading row so that was set to 1, the Field Separator was set
as a comma (,) and the Text Delimiter as a quotation mark (") because the CSV file has commas in
it to separate the POV members and Excel places quotation marks around the field.
The reverse generated the columns from the csv file.

A new interface was created, applied the context and set the staging area as the Sunopsis
Memory Engine

On the Diagram tab I dragged the DataStore for the csv data load file into the source window and
dragged the planning app DataStore for Account onto the target.
Each column in the source was mapped to the target, using the staging area for columns that have
a different name between the source and the target.

The flow diagram should resemble the above.

Once I had executed the interface and checked the output log for any errors, I ran the form again
to check the data had been populated correctly.
So there we go, a quick overview of how to load data into planning using ODI. I can now move
on to tackle the problem of loading text data, stay tuned!