
Inventory Data Loading

This document explains, step by step, how to load inventory data into the 0IC_C03 InfoCube.


Inventory Scenario

Is it required to load data using the 2LIS_03_BX DataSource to initialize the stock opening balances?
2LIS_03_BX --> This structure is used to extract the stock data from MM Inventory Management for initialization to a BW system.
2LIS_03_BF --> This structure is used to extract the material movement data from MM Inventory Management (MM-IM) consistently to a BW system.
2LIS_03_UM --> This structure is used to extract the revaluation data from MM Inventory Management (MM-IM) consistently to a BW system. It contains only value changes, no quantity changes.
Before you can extract revaluation data to a BW system, you must ensure that the transaction/event key is active.
For this, see the following SAP Notes:
353042: How To: Activate transaction key (PROCESSKEY)
315880: Missing event control after PI-A installation
Whether the BX DataSource is needed depends on the business requirement.
Requirement 1: Data is pulled from an R/3 system that contains 10 years of history, but the business wants to see inventory/stock data for the last 2 years only.
In this case, the opening balances can only be obtained through the BX DataSource, compressed WITH marker update. After that, load only the last 2 years of historic data using the BF and UM DataSources and compress WITHOUT marker update.
Requirement 2: Data is pulled from an R/3 system that contains only 2 years of history, and the entire data set is needed for reporting.
In this case, we can use BX to pull the opening stock balances and compress WITH marker update, then pull 2 years of historic data with BF and UM and compress WITHOUT marker update, the same as above.
OR
Just pull the entire history of stock movements (2 years) using the BF and UM DataSources and compress WITH marker update; this creates the opening balances. In this case there is no need to load the BX DataSource.
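The marker logic behind the two requirements above can be sketched as a toy calculation (plain Python, purely illustrative; the numbers and variable names are made up, and this is not SAP code):

```python
# Toy illustration of the 0IC_C03 "marker" (reference point) logic.
# All names and numbers are made up; this is not SAP code.

historical_movements = [+100, -30, +50]   # movements already reflected in current stock
delta_movements      = [+20, -10]         # movements posted after initialization

# BX opening balance, compressed WITH marker update: sets the reference point.
marker = sum(historical_movements)        # opening stock = 120

# Historical BF/UM requests are compressed WITHOUT marker update, because the
# BX opening balance already contains them; compressing them WITH marker
# update would double-count the history:
wrong_marker = marker + sum(historical_movements)   # 240 -- incorrect

# Delta requests are compressed WITH marker update to keep the reference
# point current:
marker += sum(delta_movements)

print(marker)        # 130 -- correct current stock
print(wrong_marker)  # 240 -- double-counted history
```

This is why the direction of the "No Marker Update" checkbox flips between the historical loads and the delta loads in the steps below.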

How To Handle Inventory Management Scenarios in BW:
https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/f83be790-0201-0

1) First, lock all users.
2) For the init load, maintain the given serialization: first 2LIS_03_BX, then 2LIS_03_BF, and then 2LIS_03_UM, with the marker-update settings given below.
3) For the delta load, load 2LIS_03_BF first, then 2LIS_03_UM.
4) Every delta load should be compressed WITH marker update.

Inventory data loading step by step procedure


1) 2LIS_03_BX (Stock Initialization for Inventory Management)
Delete the setup tables using Tcode LBWG: select the Inventory Controlling application component,

then click the Execute button.

Setup tables: the place from which BW picks up the init data load.
To fill the setup tables of 2LIS_03_BX, use Tcode MCNB: on the transfer structure tab select 2LIS_03_BX,
change the time of termination, and click the Execute button.

After filling the setup tables, go to Tcode RSA3, enter the DataSource name, execute, and check whether data got filled
into the setup tables. If RSA3 shows data, your setup tables were filled.

Log in on the BI side, select and expand the tree of 0IC_C03 (2LIS_03_BX), trigger the InfoPackage to load the data
up to the PSA, and then trigger the DTP to load the data from the PSA to 0IC_C03.
Make sure the DTP contains the following settings.

Check that the data loaded into the cube successfully. Go to the Manage screen, click the Collapse tab, and make sure
"No Marker Update" is NOT checked; enter the request ID and click Release. The data is now compressed WITH
marker update.

2LIS_03_BF: Goods Movements From Inventory Management


To fill the setup tables of 2LIS_03_BF, use Tcode OLI1BW: enter the name of the run, change the termination time, and
click the Execute button.

After filling the setup tables, go to Tcode RSA3, enter the DataSource name, execute, and check whether data got filled
into the setup tables. If RSA3 shows data, your setup tables were filled.

Log in on the BI side, select and expand the tree of 0IC_C03, select the InfoPackage of 2LIS_03_BF, trigger the
InfoPackage to load the data up to the PSA, and then trigger the DTP to load the data from the PSA to 0IC_C03.

Check that the data loaded into the cube successfully. Go to the Manage screen, click the Collapse tab, and make sure
"No Marker Update" IS checked; enter the request ID and click the Release button. The data is now compressed WITHOUT
marker update.

Revaluations: 2LIS_03_UM
To fill the setup tables of 2LIS_03_UM, use Tcode OLIZBW: give the name of the run and the company code, change the
termination time, and click the Execute button.

After filling the setup tables, go to Tcode RSA3, enter the DataSource name, execute, and check whether data got filled
into the setup tables. If RSA3 shows data, your setup tables were filled.

Log in on the BI side, select and expand the tree of 0IC_C03, select the init InfoPackage of 2LIS_03_UM, trigger the
InfoPackage to load the data up to the PSA, and then trigger the DTP to load the data from the PSA to 0IC_C03.

Check that the data loaded into the cube successfully. Go to the Manage screen, click the Collapse tab, and make sure
"No Marker Update" IS checked; enter the request ID and click the Release button. The data is now compressed WITHOUT
marker update.

Inventory Delta Load: 2LIS_03_BF


Go to Tcode LBWE, then click the Job Control tab.

1) Click on Start Date; on the following screen, enter the date and time to trigger the delta load from LBWQ to RSA7.

Then click on Print Parameters.

After entering the print parameters, click on Schedule Job.

Log in on the BI side, select and expand the tree of 0IC_C03, select the delta InfoPackage of 2LIS_03_BF, trigger the
InfoPackage to load the data up to the PSA, and then trigger the DTP to load the data from the PSA to 0IC_C03.

Check that the data loaded into the cube successfully. Go to the Manage screen, click the Collapse tab, and make sure
"No Marker Update" is NOT checked; enter the request ID and click the Release button. The data is now compressed WITH
marker update.

Inventory Delta Load: 2LIS_03_UM


Select and expand the tree of 0IC_C03, select the delta InfoPackage of 2LIS_03_UM, trigger the InfoPackage
to load the data up to the PSA, and then trigger the DTP to load the data from the PSA to 0IC_C03.

Check that the data loaded into the cube successfully. Go to the Manage screen, click the Collapse tab, and make sure
"No Marker Update" is NOT checked; enter the request ID and click the Release button. The data is now compressed WITH
marker update.

BI Data Loading Basics


InfoPackage load
While an InfoPackage load is running, check the following:
Use Tcode RSMON > click on Load Data > check the status. Use "Refresh all" to refresh the
data being loaded. Click on the DataSource, hit the Manage PSA button, and determine
the load status. Check the PSA table for the DataSource and determine the data load.
Use Tcode SM50 to check the background (BGD) processes that are running.
Use Tcode SM37 to determine whether the job is being scheduled, and check the job details
and the job logs to determine the status.
DSO load
While a DSO load is running, check the following:
Use the monitor for the DTP and check the status of the data load. Check the Error Stack
button and determine whether there are any errors. Use Tcode RSMON > click on Monitors > DataStore
Objects > search for the DSO.
Check the two fields "last loading request ID" and "activated up to request ID". If
there are no numbers in the "activated up to request ID" column, hit the Activate button
at the right.
When you activate the DSO (i.e. fill the active table), you can check three places to
determine the status of the flow. First, note the job that is displayed the moment you hit
the Start button while activating the DSO, go to SM37, and check the job status; keep
refreshing until it says FINISHED. At the same time, check SM50 and watch the
BGD processes running under your name; keep refreshing until the BGD process stops
running. Then you know that the data has been completely loaded.
When a DTP is used to load data to the DSO, you can always check the DTP monitor and
determine the status of the load.

Data load in PSA check


Determine that the data is completely loaded in the Persistent Staging Area (PSA).
Before anything else, make sure you have no data hiding in the Initialization Options. Follow
the one step below.

If you find any data record as shown below before loading, select it and delete it.

Hit the monitor button at the top.

Now click on the step-by-step analysis button

Hit the analysis button at the bottom

Now hit the detail button.


Now you know that data has been loaded into your PSA.

Find the PSA database table and data


Find your datasource

Click on Technical Attributes

Go to Tcode SE16 and enter the PSA table name, e.g. /BIC/B0001997000.

There you go: a simple way to find what data gets loaded into your Persistent Staging Area.

BI Metadata
Launching Web Application Designer for the first time downloads all the metadata stored on the BI
server.

The metadata is basically pulled from the Metadata Repository, as shown below:

Viewing reports on the web requires a repository manager in Knowledge Management,

which can pull all the metadata to display the reports.

Error DTP handling


When a DTP is created, the screen below shows the standard options available:

1. Valid Records Update, No Reporting (Request Red)

Here the incorrect data will NOT be available in the target InfoProvider, and the error records will be
shown in red. However, the correct data will continue to load into the target InfoProvider.
2. Valid Records Update, Reporting Possible (Request Green)
Here the data will be available for reporting in the target InfoProvider, but the error records will not be
shown in red. You have to go to the error DTP and catch the incorrect data.
3. No Update, No Reporting
This is the standard (default) option for a DTP.
Below is a screenshot showing the error data in the DTP run.

Go to the error DTP and click on the error-stack ("cushion") button to get the screen below.

Notice that the req del date is missing.

Correct the missing req del date.

Go back and run the error DTP again; below is the result:

Now go back to the actual DTP and run it again; you might see this message:

If you choose Repair, it will not delete the existing records and will continue loading the non-error records
into the target InfoProvider.
If you choose Delete, it will delete the existing records in the target InfoProvider and start a fresh load.
If you don't want to take the above approach, go to the Maintain link of the target InfoProvider, click
on the monitor, and change the red status to green; that should solve the problem, as shown below.

If you see yellow, it means that data loading is still in progress, since you have chosen
1. Valid Records Update, No Reporting (Request Red)
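The Repair/Delete choice above can be sketched in a few lines of plain Python (purely illustrative; the function and request names are made up and are not SAP APIs):

```python
# Illustrative sketch of re-running the main DTP after an error-DTP run.
# "repair" keeps the requests already loaded and appends only the corrected
# records; "delete" drops the existing requests and starts a fresh load.
# All names are made up; this is not SAP code.

def rerun_dtp(mode, existing_requests, corrected_request):
    if mode == "delete":
        existing_requests = []            # fresh load: existing requests removed
    return existing_requests + [corrected_request]

loaded = ["REQ1: valid records"]
print(rerun_dtp("repair", loaded, "REQ2: corrected records"))
# ['REQ1: valid records', 'REQ2: corrected records']
print(rerun_dtp("delete", loaded, "REQ2: corrected records"))
# ['REQ2: corrected records']
```

In short: Repair is additive, Delete is a restart.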

How to re-construct a Load From DSO

This guide describes reconstructing a load in the Opportunities Header DSO, but the process is
similar for any other object and can serve as a reference for any DSO where we need to fix data
without taking new data from the source system (CRM or R/3).
1. Delete the existing load from the DSO.
2. Go to the Reconstruction tab, select the request we want to reconstruct, and click the
Reconstruction/Insert button.
3. Come back to the Request tab and monitor the progress of the load by clicking the Refresh
button; keep monitoring until the request reaches green status.
4. Although the load completes, the request will still be inactive (there is no value in the
"ID of Request" column).
5. To activate it, choose the request from the Manage screen and click on the Activate button.
6. A new window with the list of available requests in the DSO is shown; most of the time there
will be only one, in the first row of the list. Select it and click on the Start button.
7. A new window appears asking on which specific server to run the process; accept what is
proposed by default and click the green button to allow the process to continue.
8. Once the activation is done, the request in the Manage screen will show with all the
information populated, and the data will be ready to be moved to the cubes or accessed by
routines.

How to check the setting of flat file in BW


To check the settings of your flat file (i.e. what format of flat file can be uploaded on the BW side),
go to SE11 and open RSADMINCV1. This is a maintenance view with the text "BW: Settings for Flat Files".
Simply execute it and you will get the settings; follow the screenshot below.

Delta Update Methods in Logistics


There are three delta update methods in Logistics.

Direct Delta: With this update mode, the extraction data is transferred with each document posting
directly into the BW delta queue. Each document posting with delta extraction is posted as
exactly one LUW in the respective BW delta queue.
Queued Delta: With this update mode, the extraction data for the affected application is collected
in an extraction queue and can be transferred as usual with the V3 update, by means of an
updating collective run, into the BW delta queue. Up to 10,000 delta extractions of
documents are compressed into one LUW per DataSource in the BW delta queue, depending on the
application.
Un-serialized V3 Update: With this update mode, the extraction data for the application in question is
written, as before, into the update tables with the help of a V3 update module. It is kept there
until the data is selected and processed by an updating collective run. However, in contrast to the
serialized V3 update, the data in the updating collective run is read from the update tables
without regard to sequence and transferred to the BW delta queue.
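As a rough mental model, the difference between direct and queued delta can be sketched like this (plain Python, purely illustrative; the posting texts and variable names are made up, and this is not SAP code):

```python
# Toy model of direct vs. queued delta. In direct delta, every document
# posting becomes its own LUW in the BW delta queue; in queued delta, the
# postings first accumulate in an extraction queue and a collective run
# bundles them into far fewer LUWs. Names are made up; not SAP code.
from collections import deque

postings = ["GR 100 pc", "GI 30 pc", "GR 50 pc"]

# Direct delta: one LUW per posting.
delta_queue_direct = [[p] for p in postings]

# Queued delta: the collective run drains the extraction queue in batches
# (up to 10,000 documents per LUW in the real system).
extraction_queue = deque(postings)
batch = []
while extraction_queue:
    batch.append(extraction_queue.popleft())
delta_queue_queued = [batch]

print(len(delta_queue_direct))   # 3 LUWs
print(len(delta_queue_queued))   # 1 LUW
```

This batching is why queued delta is usually preferred for high-volume document postings.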

Data Source - Real Time Scenarios


Q: I have a question: I have to copy DataSources from one client to another client. Can anyone
let me know the procedure?
Sol:
I don't think we can copy a DataSource from one client to another (once you copy the client, you are
supposed to get all the DataSources).
But if you still want to do it, there is another option: transport from one client to another client
through RFC (the regular transport model).
The best way is to do a client copy, which will automatically copy your DataSources to the target client. There
is no specific method to copy DataSources from one client to another.
Alternatively, you can transport the DataSource to the target system.
DataSources are moved in the following way in a BI landscape:
transport from R/3 Dev to R/3 Quality to R/3 Production, and then they are replicated from R/3 Dev to BW
Dev, and so on.
In this case, if you want to move any particular DataSource in the R/3 landscape, you will have to do it through
a normal transport.
This can be done only by transporting the objects from the source client to the target client. For that, make sure
a transport path exists between source and target, then release the transport from the source
system and import the request in the target system.

If you are creating a new client from an existing client, then Basis can help you create the new client and
copy all the objects from the old client to the new one.

Data Source - Real Time Scenarios


Q: A generic DataSource does not appear in RSA7.
We created a generic DataSource with delta; however, it does not appear in
RSA7.
What setting is needed so that the generic delta DataSource appears in RSA7?
Sol:
Have you initialized the DataSource on the BW side with an init InfoPackage?
If the InfoPackage has been initialized, you will normally be able to see the entry
in RSA7.

