Contents
PDF for Printing This Document
Purpose
Understanding Centralized vs. Distributed Installation
Understanding the Data Collections Process
    3 Key Stages to the Data Collections Process
        Refresh Snapshots Process
        Planning Data Pull
        ODS Load
    Complete, Net Change and Targeted Refresh
    Setup Requests for Data Collections
    Purge Staging Tables - A Special Note
    ATP Data Collections vs. Standard Data Collections
Fresh Installation or Upgrade - Basic Steps
    For 12.1
    For 12.0.6
    For 11.5.10.2
Upgrading EBS Applications - Special Considerations
DBA Checklist
    VCP Applications Patch Requirements
    RDBMS Patches and Setups for Known Issues
    Other RDBMS Setups / Maintenance
    RAC Database - Special Requirements
    System Administrator Setups
Applications Super User Checklist
    VCP Applications Patch Requirements
    Fresh Install/Upgrade Setup Steps
    Running Data Collections for the First Time in a New Instance
        Parameters for First Run of Data Collections
Preventing Data Collections Performance Issues
    How Can I Check on Performance When Data Collections is Running?
Cloning and Setup after Cloning for Data Collections
    I Have a Centralized Installation - What Should I Do?
    Steps to Consider after Cloning when Using Distributed Installation
    What Happens when I Setup a New Instance in the Instances Form to Connect an APS Destination to the EBS Source?
    Key Points
I Am Going to Use ATP Data Collections for Order Management ATP, What Do I Need to Do?
    Upgrade and Patching
    DBAs Information
    Applications Super Users Information
Appendix A
    List of Setup Requests
References
Revisions
Purpose
This document offers recommendations for customers using Value Chain Planning - VCP (also known as Advanced Planning and Scheduling - APS) who are performing:
1. A fresh installation of EBS Applications
2. An upgrade of EBS Applications, OR
3. Cloning of the EBS Source instance and/or the APS Destination instance.
We explain the difference between Centralized and Distributed installations and other terminology used in this document. We help you ensure that the patching and setups are correct so you can launch and complete Data Collections successfully. We address this functionality for 11.5.10 and above, with a concentration on R12.1.
Patching for a Centralized Installation - All patches are applied to the single instance. You can ignore references in the patch readme to centralized vs. distributed installation and apply all patches to your single instance. Most patch readmes will state that, for a Centralized installation, you simply apply the patch to your one instance.
Distributed Installation - Two (or more) databases located on different machines. One is the EBS Source instance, where all OLTP transactions and setups are performed. The other is the APS Destination, where the Data Collections process is launched; it uses database links to communicate with the source and move the data into the MSC tables on the APS Destination.

Why do many customers use a Distributed installation?
1. The processing performed for Planning in ASCP, IO, DRP, etc. can consume huge amounts of memory and disk IO, which can slow down OLTP processing while the plan processes are running. It is not unusual for large plans to consume anywhere from 5-20+ GB of memory while running and to move millions of rows into and out of tables to complete the planning process.
2. It is possible to have the APS Destination on a different release. The APS Destination could be 12.1 with the EBS Source running 11.5.10 or 12.0.6.
   a. Note that you may NOT have an APS Destination on a lower version than the EBS Source (for example, an APS Destination on 12.0 or 11.5 with an EBS Source on 12.1).
3. It is possible to have more than one EBS Source connected to the APS Destination. This requires careful consideration, as patching for all instances must be maintained. Example: 2 EBS Source instances connected to 1 APS Destination instance, all on the same EBS applications release. For VCP patches that affect Collections, all three instances must be patched at the same time in order to prevent breaking this integration. Or a customer may have 2 EBS Sources on different releases: an APS Destination on 12.1 with one EBS Source on 12.1 and another on 11.5.10.
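In a Distributed installation, a quick sanity check of the database link from the APS Destination to the EBS Source can save troubleshooting time. This is an illustrative sketch only: the link name EBS_SOURCE_LINK is a placeholder, not a standard name - substitute the link created for your environment (see Note 813231.1 for the supported link setup).

```sql
-- Placeholder link name; substitute the database link defined for your
-- APS Destination -> EBS Source connection (per Note 813231.1).
-- A successful fetch confirms the link resolves and the source is reachable.
SELECT sysdate FROM dual@EBS_SOURCE_LINK;
```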
Patching for a Distributed Installation - This requires careful attention to the patch readme to make sure each patch, along with its prerequisite and post-requisite patches, is applied to the correct instance. The patch readme should be clear on where the patch is to be applied. Usually the readme will refer to the EBS (or ERP) Source (or 'source instance') and to the APS Destination (or 'destination instance') when stating where the patch is to be applied.
12.1.x - For our 12.1 CU (cumulative) patches, we release only a single patch, and that patch is applied to both the EBS and APS instances. Demantra Integration patches will have a separate EBS Source side patch requirement.
12.0.x and 11.5.10 - For these releases we have separate patches for different parts of the VCP applications. ASCP Engine/UI patches are applied only to the APS Destination. There is also a Source Side UI patch to be applied to the EBS Source only - see the readme. Data Collections patches are applied to BOTH the EBS and APS instances; the readme may list specific patches to be applied to the EBS Source for certain functionality. GOP (or ATP) and Collaborative Planning patches are applied to BOTH the EBS and APS instances. For Demantra, the main patch is applied to the APS Destination instance, and there is usually a Source side patch listed in the readme.
ODS Load
1. The ODS Load launches to perform Key Transformations of critical data into unique keys for the VCP applications. For instance, since we can collect from multiple instances, the INVENTORY_ITEM_ID in MSC_SYSTEM_ITEMS must be given a unique value. This happens for several entities. We store the Source instance key values in SR_ columns, like SR_INVENTORY_ITEM_ID in MSC_SYSTEM_ITEMS.
2. The ODS Load also launches workers to handle the many different load tasks that move data from the MSC_ST staging tables to the base MSC tables.
3. During a Complete refresh we create TEMP tables, copy data from the MSC_ST staging tables into them, then use Exchange Partition technology to flip each temp table into the partitioned table.
   a. For instance, if your instance code is TST, you can see in the log file where we create the temp table SYSTEM_ITEMS_TST.
   b. Then we move data into this table from MSC_ST_SYSTEM_ITEMS.
   c. Then we exchange this table with the partition of MSC_SYSTEM_ITEMS used for collected data (for example, if the instance_id is 2021, the partition name is SYSTEM_ITEMS__2021).
4. Lastly, the ODS Load launches Planning Data Collections - Purge Staging Tables to remove data from the MSC_ST staging tables, and if Collaborative Planning is set up, we launch the Collaboration ODS Load.
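As an illustration of step 3, the exchange is conceptually equivalent to the statement below. This is a sketch only - it reuses the TST / 2021 example names above, assumes the MSC schema owns both objects, and is not something to run manually on a live system (the ODS Load performs it for you):

```sql
-- Conceptual sketch only: how the ODS Load flips the loaded temp table into
-- the collected-data partition of MSC_SYSTEM_ITEMS.
-- Names reuse the example above (instance code TST, instance_id 2021).
ALTER TABLE MSC.MSC_SYSTEM_ITEMS
  EXCHANGE PARTITION SYSTEM_ITEMS__2021
  WITH TABLE MSC.SYSTEM_ITEMS_TST
  INCLUDING INDEXES
  WITHOUT VALIDATION;
```

The design point is that the exchange is a data-dictionary swap rather than a row-by-row copy, which is why a Complete refresh can replace an entire partition's contents quickly.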
data, or a quick collection for setting up a test case. Examples:
   a. Business purpose - A customer runs an ASCP Plan intraday at 1 pm; after running Net Change collections, they run a Targeted Refresh of Customers, Suppliers, and Orgs to get all new customers created that morning.
   b. We run Net Change Collections during the week and Complete Collections on weekends, but during the week we need to collect new Sourcing Rules and Customers daily, so we run a Targeted Refresh each night.
   c. I need to change Sourcing Rules, set up a Supplier capacity calendar, and run a test case. You can collect Approved Supplier Lists, Calendars, and Sourcing Rules to get these changes quickly into the MSC tables for testing without running a Complete refresh of all entities.
5. OPM customers cannot use Targeted Refresh. They must use Complete Refresh to collect OPM data.
   a. Centralized installation.
   b. Only using ATP for Order Management and NOT using ANY other Value Chain Planning applications.
   c. If you run ATP Data Collections in Complete Refresh mode when you are using Value Chain Planning applications, you will corrupt the data in the MSC tables. You will have to run Standard Data Collections with a Complete Refresh to recover from this mistake.
2. The ATP Data Collections section in the Note 179522.1 XLS attachment includes information on how to run Standard Data Collections to mimic ATP Data Collections.
   a. This can be useful if there is a problem with ATP Data Collections; you can try Standard Collections to see if this overcomes the issue.
   b. Understanding how to run a Targeted Refresh using Standard Data Collections can be useful to quickly collect changes for only certain entities.
   c. Running a Targeted Refresh of just Sales Orders can be useful to sync changes if it appears that sales order changes are not being collected properly.
   d. A Targeted Refresh is like Complete, but only collects changes for entities set to Yes. CAUTION: if you make a mistake and run Complete Refresh instead of Targeted Refresh with the wrong entities set to No, then data will be lost and errors will occur. A Complete Refresh would then be required to recover from this mistake.
3. Note 436771.1 is an ATP FAQ which will be helpful to any customer using GOP/ATP.
   a. To understand when ATP will not be available during Data Collections, see Q: Why do we get the message 'Try ATP again later'?
For 12.1
Customers must install the latest release of EBS Applications, and if a new release of EBS becomes available during the implementation cycle, plan to install that release (example: you installed EBS 12.1.2 and three months later EBS 12.1.3 becomes available; it is then very important to plan a move to this latest release). For the applications install, review Note 806593.1 Oracle E-Business Suite Release 12.1 - Info Center. For VCP Applications the following critical notes are required:
1. Note 746824.1 12.1 - Latest Patches and Installation Requirements for Value Chain Planning. Use this note to reference:
   a. The latest installation notes.
   b. The latest available VCP patch for your release, which must be applied after the installation.
   c. The patching requirements if you have a separate EBS Source instance.
      i. This includes an R12.1 EBS Source,
      ii. OR if you intend to run EBS Source applications on R12.0.6 or 11.5.10.2.
   d. Links to other critical notes about 12.1 VCP Applications.
   e. The list of limitations on new functionality if using R12.0.6 or 11.5.10.2 for the EBS Source instance.
2. Note 552415.1 INSTALL ALERT - Setting Up The APS Partitions and Data Collections Instance in R12.
3. The latest user guides, which are in Note 118086.1.
4. Note 763631.1 Getting Started With R12.1 Value Chain Planning (aka Advanced Planning and Scheduling - APS) - Support Enhanced Release Content Document (RCD).
   a. If you also need to review new features for R12.0, then also see Note 414332.1.
5. If you are installing to use Rapid Planning, see Note 964316.1 - Oracle Rapid Planning Documentation Library.
For 12.0.6
Customers should NOT be planning to use this release, but should move to 12.1. Note 401740.1 - R12 Info Center is available for customers who must perform this installation. For VCP Applications, the critical notes are:
1. Note 421097.1 R12 - Latest Patches and Critical Information for VCP - Value Chain Planning (aka APS - Advanced Planning and Scheduling)
   a. You must apply the latest CU patches available after the installation and before releasing the instance to users.
   b. If you are using only ATP Data Collections, then apply the GOP patch and the Data Collections patches.
2. Note 412702.1 Getting Started with R12 - Advanced Planning and Scheduling Suite FAQ
3. Note 414332.1 Getting Started With R12 - Advanced Planning Support Enhanced RCD
4. Note 552415.1 INSTALL ALERT - Setting Up The APS Partitions and Data Collections Instance in R12
For 11.5.10.2
In the rare case that an upgrade to 11.5.10.2 is being considered, AND it is not an intermediate step on the way to R12.1, it is critical that the latest patches be applied. Review Note 223026.1 and apply the latest rollup patches, at minimum:
1. Using ASCP - apply the ASCP Engine/UI and Data Collections rollups.
2. Using any other applications - also apply those patches.
3. Using ATP Data Collections only - apply the GOP patch and the Data Collections patch.
Also review Note 883202.1 - Minimum Baseline Patch Requirements for Extended Support on Oracle E-Business Suite 11.5.10.
DBA Checklist
VCP Applications Patch Requirements
1. The latest VCP patches applied for your release, as noted above in Fresh Installation or Upgrade - Basic Steps.
2. If Demantra integration is required, then review Note 470574.1 List Of High Priority Patches For Oracle Demantra Including EBS Integration.
Note: After patching, there may be invalid or missing objects used for Data Collections. The patch should set profile MSC: Source Setup Required = Yes (or Y in 11.5.10). Then, with the first run of Data Collections, the Setup Requests will be launched and build all the required objects for Data Collections, which will resolve any invalid or missing objects used for Data Collections. As noted above, you can find more information on the Data Collections objects, the Setup Requests, etc. in Note 179522.1.
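To see whether any invalid objects remain after patching (before the Setup Requests rebuild them), a DBA can run a standard dictionary query. This is a generic sketch against DBA_OBJECTS, not an Oracle-prescribed check; the schema list is an assumption covering the planning-related owners mentioned in this document:

```sql
-- Illustrative post-patch check: invalid objects in planning-related schemas.
-- Schema list is an assumption; adjust for your environment.
SELECT owner, object_type, object_name
FROM   dba_objects
WHERE  status = 'INVALID'
AND    owner IN ('MSC', 'MRP', 'APPS')
ORDER  BY owner, object_type, object_name;
```

Invalid Data Collections objects listed here are normally rebuilt by the Setup Requests on the next run, so this query is for visibility, not manual repair.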
   a. Fixed in version 10.2.0.5, 11.1.0.7, and 11g R2.
2. The RDBMS bug 6044413 fix should be applied.
   a. Fixed in version 10.2.0.4 and 11g R1.
   b. Final fix for 'Table Prefetching causes intermittent Wrong Results in 9iR2, 10gR1 and 10gR2' - Note 406966.1.
3. The RDBMS bug 5097836 and 6617866 fixes should be applied.
   a. Both relate to queries run in parallel not fetching all rows.
   b. 5097836 fixed in version 10.2.0.4 and 11g R1.
   c. 6617866 fixed in version 10.2.0.5 and 11g R2.
   d. As a workaround, you can also set all snapshots for Data Collections to NOPARALLEL to prevent missing data.
   e. Run this SQL:
      select 'alter table '|| owner||'.'|| table_name|| ' NOPARALLEL;'
      from all_tables where table_name like '%_SN' and degree > 1;
   f. If any rows are returned, run the output and it will set DEGREE = 1. Example: alter table MRP.MRP_FORECAST_DATES_SN NOPARALLEL;
4. Set init.ora parameter _mv_refresh_use_stats = FALSE.
   a. This is part of our performance tuning for MLOGs per Note 1063953.1.
   b. Review the note to understand and manage the MLOGs for Data Collections and the related applications that share the same MLOGs we use for Data Collections.
   c. When many data changes happen to the base tables, we get large MLOGs that must be managed, AND/OR
   d. If the ENI and OZF snapshots mentioned in the note are on the system, they must be manually refreshed to keep the MLOG tables maintained and prevent performance issues.
   e. You may see the initial run of Refresh Snapshots taking a long time. During the setup phases, this will usually settle down after the first 2-3 runs of Data Collections, unless there is a lot of data processing happening between runs of Data Collections and the MLOGs grow rapidly.
      i. Review the note and use the steps in the section I Want to Check My System and Prevent a Performance Problem (ANSWER #1) to prepare the system to avoid performance problems.
      ii. These steps are usually best accomplished after the first 2-3 runs of Data Collections have been completed, but can be executed earlier if required.
5. PLEASE NOTE that we expect Data Collections to take significant time during the initial run. This is because we could be moving millions of rows of data from the OLTP tables to empty MSC tables.
   a. Once Data Collections has completed for the first time and Gather Schema Statistics is run for the MSC schema (and the MSD schema if using Demantra or Oracle Demand Planning), the process should settle down and run faster.
   b. It is not unusual to have to set the Timeout parameter for Data Collections to 900-1600 for the first run to complete without a timeout error.
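Before the MLOG growth described above becomes a performance problem, a DBA can spot oversized snapshot logs with a simple segment-size query. This is an illustrative sketch (the 100 MB threshold is an arbitrary example, not a documented limit); Note 1063953.1 remains the authoritative guide for what to do about large MLOGs:

```sql
-- Illustrative sketch: find MLOG$ snapshot-log segments over an example
-- 100 MB threshold. The threshold is arbitrary; tune it for your system.
SELECT owner, segment_name, ROUND(bytes/1024/1024) AS mb
FROM   dba_segments
WHERE  segment_name LIKE 'MLOG$%'
AND    bytes > 100 * 1024 * 1024
ORDER  BY bytes DESC;
```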
   b. More complete information on the Instances form is in Note 137293.1 - Section IX; download the Viewlet to review along with the steps in the note.
11. Organizations setup in the Instances form:
   a. You MUST include the Master Organization.
   b. You should collect from all the Orgs entered for the first run of Data Collections, so do NOT enter any extra Orgs that you plan to collect later in the implementation.
   c. You can set up Collection Groups; just do NOT use a Collection Group for the first run of Data Collections.
5. If you have a lot of data to collect, the Refresh Snapshots process could take several hours for this initial run.
6. Planning Data Pull parameters - special considerations:
   a. Set the Timeout parameter very high - suggest 900-1600 for the first run.
   b. Set the Sales Orders parameter explicitly to Yes if using the Order Management application.
      i. When running Complete Refresh, this parameter defaults to No and runs a Net Change collection of Sales Orders for performance reasons.
      ii. This is normal and recommended during normal operations.
   c. All other parameters should be left at their defaults for the first run of collections.
   d. If other parameters will need to be Yes for your implementation, plan to set those to Yes after the initial run of collections has completed successfully.
7. ODS Load parameters - special considerations:
   a. Set the Timeout parameter very high - suggest 900-1600 for the first run.
   b. Set all other parameters to No for this first run.
   c. If you need to set these to Yes, do so later, after the first run has completed successfully and timing for Data Collections is stable.
8. When the first run is completed, run Gather Schema Statistics for the MSC schema.
9. If a Setup Request fails, run Planning Data Collections - Purge Staging Tables with parameter Validation = No, then launch again. We have seen many times that initial failures in the Setup Requests are resolved by running 2 or 3 times.
10. If Planning Data Pull or ODS Load fails, check for a timeout error or other specific error in the log file of the main request and in the log files of all the Workers as well. You must run Planning Data Collections - Purge Staging Tables with parameter Validation = No before you launch Data Collections again. If you fail to run this request, Data Collections will error with the message text seen in Note 730037.1.
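After a failed run and a Purge Staging Tables request, a quick spot-check that the staging tables are actually empty can confirm you are clear to relaunch. This is an illustrative sketch using one staging table named earlier in this document; any MSC_ST% table can be substituted:

```sql
-- Illustrative spot-check: rows remaining in one MSC staging table after
-- Planning Data Collections - Purge Staging Tables has completed.
-- A non-zero count suggests the purge did not finish cleanly.
SELECT COUNT(*) AS remaining_rows
FROM   msc.msc_st_system_items;
```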
1. The profiles in Note 179522.1 should be set up correctly for your instance. Below are some suggestions, BUT you must review all the profiles for proper settings. Notes on the profile setups:
   a. MSC: Collection Window for Trading Partner Changes (Days) - must be NULL or 0 (zero) for the first run, but should then be set to a smaller number. Generally 7 days should be sufficient; as long as you run a Complete Refresh at least once per 7 days, this will collect the changes.
   b. Make sure the staging tables profile is set to Yes if you are collecting from only 1 EBS Source instance - MSC: Purge Staging and Entity Key Translation Tables / MSC: Purge Stg Tbl Cntrl.
      i. If you are collecting from multiple instances, then you must set this profile to No.
      ii. In this case, the MSC_ST staging tables may grow very large in dba_segments, and the high water mark will not be reset when we delete from these tables.
      iii. If you must set the profile to No, we suggest that the DBA periodically schedule a truncate of the MSC_ST% tables when Data Collections is NOT running, to recover this space and help performance.
   c. MSC: Refresh Site to Region Mappings Event - this profile is relevant when setting up Regions and Zones in the Shipping application. When new Regions or Zones are defined, the mapping that occurs during Refresh Collection Snapshots could take a long time. Otherwise, we just collect new supplier information, which is not performance intensive.
   d. Check to make sure Debug profiles are not set except where required for investigation of an issue.
   e. IF you are using Projects and Tasks and have many records, and the profile MSC: Project Task Collection Window Days is available, it can be set to help performance.
2. MLOG management is the key to performance of Refresh Collection Snapshots. Note 1063953.1 discusses the important steps for managing MLOGs to keep the program tuned to perform as fast as possible.
   a. For customers actively using Data Collections, these performance problems occur for two primary reasons:
      i. MLOG tables are shared between VCP applications and other applications - usually Daily Business Intelligence applications or custom reporting.
      ii. Large source data changes via data loads and/or large transactional data volumes.
   b. Once the initial collection has been performed and most data collected, use the steps in the section I Want to Check My System and Prevent a Performance Problem - ANSWER #1: Yes, we are using Data Collections now.
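When reviewing the MSC profile settings discussed above, it can help to list their current site-level values in one pass rather than opening each profile in the form. This is an illustrative sketch against the standard EBS FND profile tables; run it as the APPS user, and treat it as a starting point only - internal names and levels vary by release:

```sql
-- Illustrative sketch (standard EBS FND tables): site-level values of the
-- MSC: profile options. level_id = 10001 is the Site level.
SELECT t.user_profile_option_name,
       v.profile_option_value
FROM   fnd_profile_options o,
       fnd_profile_options_tl t,
       fnd_profile_option_values v
WHERE  o.profile_option_name = t.profile_option_name
AND    t.language = 'US'
AND    o.profile_option_id = v.profile_option_id
AND    v.level_id = 10001
AND    t.user_profile_option_name LIKE 'MSC:%'
ORDER  BY t.user_profile_option_name;
```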
   c. When you execute these steps, you will be setting the MLOGs to provide the best performance.
   d. You must also be aware of any MLOGs that are shared with other applications. In that case, the DBA will need to actively manage those other snapshots, refreshing them manually - a cron job may be the best method.
3. If you have very large BOM tables and plan to use Collection Groups to collect information only for certain Orgs, be aware that this can consume very significant TEMP space during the ODS Load. We have seen reports where All Orgs takes only 6-12 GB of TEMP space, but when using a Collection Group, TEMP space requirements exceed 30 GB.
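While the ODS Load is running, TEMP consumption like that described above can be watched per session. This is an illustrative monitoring sketch; it assumes an 8 KB block size for the MB arithmetic, so adjust for your database:

```sql
-- Illustrative sketch: current TEMP usage by session, largest first.
-- Assumes an 8 KB block size; adjust blocks*8/1024 for your block size.
SELECT s.sid,
       s.username,
       u.tablespace,
       ROUND(u.blocks * 8 / 1024) AS mb_used
FROM   v$session       s,
       v$tempseg_usage u
WHERE  s.saddr = u.session_addr
ORDER  BY u.blocks DESC;
```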
a new instance. If you just change the instance code in the form, then this will cause data corruption.
What Happens when I Setup a New Instance in the Instances Form to Connect an APS Destination to the EBS Source?
When you create the line in the Instances form on the APS Destination (NAV: Advanced Planning Administrator / Admin / Instances), you are:
1. Using an instance_id from MSC_INST_PARTITIONS where FREE_FLAG = 1 to store the information on the setups you define in the form.
2. Inserting a line into MSC_APPS_INSTANCES on the APS Destination instance with the Instance Code and Instance_id.
3. Inserting a line into the MRP_AP_APPS_INSTANCES_ALL table on the EBS Source instance.
4. Then, when creating Orgs in the Organizations region of the Instances form, you are inserting rows into the APS Destination table MSC_APPS_INSTANCE_ORGS for that instance_id.
5. In MSC_INST_PARTITIONS, FREE_FLAG = 1 means that the instance_id is not yet used. Once the instance_id is used, FREE_FLAG = 2.
6. You should NOT manually change the FREE_FLAG using SQL. You should create a new instance partition using the Create APS Partition request with parameters Instance partition = 1 and Plan partition = 0. We do not recommend creating extra instance partitions, since this creates many objects in the database that are not required.
7. If the previously collected data and plan data from the old instance are not required, they should be removed from the system.
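The FREE_FLAG states described above can be checked with a read-only query on the APS Destination (remember: inspect only - do not update FREE_FLAG with SQL). This is an illustrative sketch; the MSC schema prefix is an assumption, as an APPS synonym typically exists:

```sql
-- Illustrative read-only check on the APS Destination:
-- FREE_FLAG = 1 means the instance partition is free, 2 means in use.
SELECT instance_id, free_flag
FROM   msc.msc_inst_partitions
ORDER  BY instance_id;
```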
Key Points
1. You will have to MANUALLY manipulate the EBS Source instance table MRP_AP_APPS_INSTANCES_ALL.
2. You cannot have TWO instances with the same INSTANCE_ID or INSTANCE_CODE in the EBS Source table MRP_AP_APPS_INSTANCES_ALL.
3. You cannot have TWO instances with the same INSTANCE_ID or INSTANCE_CODE in the APS Destination table MSC_APPS_INSTANCES.
4. You can always use the steps in Note 137293.1 - Section X to clean up the data in the APS Destination, reset the EBS Source, and collect all data fresh and new into a new instance.
5. Once you have a VERY CLEAR and intimate understanding of Note 137293.1 and the ways to manipulate Instance Codes and Instance IDs, you can use Section XV of Note 137293.1 to come up with a custom flow for manipulation and setup after cloning your instances.
6. AGAIN, if you have problems, you can always reset using Section X and collect all data fresh into a new instance.
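The duplicate conditions warned about in the Key Points can be checked with simple GROUP BY queries on each side. This is an illustrative sketch; the MSC and MRP schema prefixes are assumptions (APPS synonyms typically exist), and the same pattern applies to INSTANCE_ID:

```sql
-- Run on the APS Destination: any row returned means a duplicate
-- INSTANCE_CODE exists, which must be corrected before collecting.
SELECT instance_code, COUNT(*)
FROM   msc.msc_apps_instances
GROUP  BY instance_code
HAVING COUNT(*) > 1;

-- Run on the EBS Source: same duplicate check on the source-side table.
SELECT instance_code, COUNT(*)
FROM   mrp.mrp_ap_apps_instances_all
GROUP  BY instance_code
HAVING COUNT(*) > 1;
```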
I Am Going to Use ATP Data Collections for Order Management ATP, What Do I Need to Do?
For ATP Data Collections, the instance must be Centralized installation on a single database. You cannot run ATP Data Collections for a distributed installation. You cannot run ATP Data Collections if you are using any Value Chain Applications besides ATP for Order Management.
DBAs Information
All steps in the DBA Checklist apply, EXCEPT that no database links are required, and Note 137293.1 will not be as important unless you move to a Distributed installation. Patching is simpler, since all patches are applied to the same instance.
Appendix A
Note 179522.1 should be the primary resource for information about the requests and other information on data collections objects, parameters, profiles, etc.
References
Note 179522.1 - Data Collection Use and Parameters
Note 137293.1 - How to Manage Partitions in the MSC Schema
Note 552415.1 - INSTALL ALERT - Setting Up The APS Partitions and Data Collections Instance in R12
Note 1063953.1 - Refresh Collection Snapshots Performance - Managing MLOG$ Tables and Snapshots for Data Collections
Note 813231.1 - Understanding DB Links Setup for APS Applications - ASCP and ATP Functionality
Note 790125.1 - When Does The Planning Manager Need To Be Running?
Note 279156.1 - RAC Configuration Setup For Running MRP Planning, APS Planning, and Data Collection Processes
Note 266125.1 - RAC for GOP - Setups for Global Order Promising (GOP) When Using a Real Application Clusters (RAC) Environment
Note 396009.1 - Database Initialization Parameters for Oracle Applications Release 12
Note 216205.1 - Database Initialization Parameters for Oracle Applications Release 11i
Note 470574.1 - List Of High Priority Patches For Oracle Demantra Including EBS Integration
Note 806593.1 - Oracle E-Business Suite Release 12.1 - Info Center
Note 401740.1 - R12 Info Center
Note 746824.1 - 12.1 - Latest Patches and Installation Requirements for Value Chain Planning
Note 118086.1 - Documentation for Value Chain Planning
Note 763631.1 - Getting Started With R12.1 Value Chain Planning (aka Advanced Planning and Scheduling APS) - Support Enhanced Release Content Document (RCD)
Note 964316.1 - Oracle Rapid Planning Documentation Library
Note 421097.1 - R12 - Latest Patches and Critical Information for VCP - Value Chain Planning (aka APS - Advanced Planning and Scheduling)
Note 412702.1 - Getting Started with R12 - Advanced Planning and Scheduling Suite FAQ
Note 414332.1 - Getting Started With R12 - Advanced Planning Support Enhanced RCD
Note 223026.1 - List of High Priority Patches for the Value Chain Planning (aka APS - Advanced Planning & Scheduling) Applications
Note 883202.1 - Minimum Baseline Patch Requirements for Extended Support on Oracle E-Business Suite 11.5.10
Note 436771.1 - FAQ - Understanding the ATP Results and the Availability Window of the Sales Order Form
Note 186472.1 - Diagnostic Scripts: 11i - Hanging SQL - Find the Statement Causing Process to Hang
Note 280295.1 - REQUESTS.sql Script for Parent/Child Request IDs and Trace File Ids
Note 245974.1 - FAQ - How to Use Debug Tools and Scripts for the APS Suite
Note 730037.1 - Planning Data Pull Fails With Message - Another Planning Data Pull Process Is Running
Revisions
Author: david.goddard@oracle.com
Rev 2.0 - April 2010 - Complete revision; replaces Note 145419.1.