Oracle BI Applications 11g: Implementation Using ODI
Student Guide
D82531GC10
Edition 1.0
December 2013
D84830
Technical Contributors and Reviewers
Archana Singh
Scott Silbernick
Patrick Block
Mizuru Asada
Jayant Mahto
Edward James
Sireesha Mandava
Deviprasad Kolli
Aju Kumar
Anwesha Ray

Graphic Designer
Rajiv Chandrabhanu

Publishers
Nita Brozowski
Sujatha Nagendra

This document is protected by copyright and other intellectual property laws. You may copy and print this document solely for your own use in an Oracle training course. The document may not be modified or altered in any way. Except where your use constitutes "fair use" under copyright law, you may not use, share, download, upload, copy, print, display, perform, reproduce, publish, license, post, transmit, or distribute this document in whole or in part without the express authorization of Oracle.

The information contained in this document is subject to change without notice. If you find any problems in the document, please report them in writing to: Oracle University, 500 Oracle Parkway, Redwood Shores, California 94065 USA. This document is not warranted to be error-free.

Restricted Rights Notice

If this documentation is delivered to the United States Government or anyone using

Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their respective owners.
Contents
These eKit materials are to be used ONLY by you for the express purpose of SELF STUDY. SHARING THE FILE IS STRICTLY PROHIBITED.
1 Course Introduction
Lesson Agenda 1-2
Instructor and Class Participants 1-3
Training Site Information 1-4
Course Audience 1-5
Analytic Workflows 2-24
Speeds Time to Value and Lowers TCO 2-26
Selected Star Schemas of the OBAW 4-7
OBAW Naming Conventions 4-8
12. Copy Source Files. 5-22
Post-Installation System Setup Tasks 5-23
Load Plans and Scenarios 6-24
Full ETL and Incremental ETL 6-25
Warehouse Domains 8-8
Warehouse Domains: Warehouse Members 8-9
Quiz 9-25
Summary 9-30
Practices 9-31
9. Execute the DDL Procedure. 11-15
10. Modify the Scenario Naming Convention. 11-16
13 Security
Objectives 13-2
Security Overview 13-3
Tools to Configure OBIA Security 13-4
Security Levels 13-5
About Authentication 13-6
About Authorization 13-7
About Application Roles 13-8
Advantage of Using Application Roles 13-9
14 Managing Performance
Objectives 14-2
Optimizing Performance 14-3
Common Performance Bottlenecks 14-4
Performance Tuning Recommendations 14-5
Tuning Underlying Systems 14-6
Guidelines for Oracle Business Analytics Warehouse Databases 14-7
Using a Separate Database for the Oracle Business Analytics Warehouse 14-9
Course Introduction
Recommended:
Experience in business intelligence, data warehouse
design, dimensional modeling, and database design
Basic knowledge of SQL
Student Guide
All slides presented during lecture
Student notes
Activity Guide
Hands-on practices and solutions
Day One:
Lesson 1: Course Introduction
Lesson 2: Oracle Business Intelligence Applications Overview
Lesson 3: Oracle Business Intelligence Applications Architecture
Day Two:
Lesson 6: Understanding the ETL Process
Lesson 7: Functional Configuration for Oracle Business Intelligence Applications
Lesson 8: Administering and Maintaining Functional Configuration Data
Day Three:
Lesson 11: Building a Category 1 Customization
Lesson 12: Building a Category 2 Customization
Lesson 13: Security
Lesson 14: Managing Performance
Oracle Business Intelligence Applications Overview
Oracle BI Applications allows your organization to realize the benefits of a prebuilt, packaged BI
application, such as rapid deployment, lower total cost of ownership, and built-in best practices.
Oracle BI Applications offers:
A prebuilt data warehouse schema, Oracle Business Analytics Warehouse, with
associated extract, transform, and load (ETL) metadata and data-movement
infrastructure to support aggregation and transformation for the analysis of data from all
transactional sources
A prebuilt Oracle BI Applications metadata repository to support analysis of the data in
the Oracle Business Analytics Warehouse
A suite of metrics pertaining to how organizations measure performance that your
company can select from and apply to your particular line of business
Application and industry-specific, role-based analytics and dashboards that are
organized to maximize both industry and domain knowledge, leveraging an
understanding of the Oracle Business Analytics Warehouse and its underlying source
transactional schemas for historical analysis
The Oracle BI Applications offerings are covered in more detail in the slides that follow.
This slide lists the applications offered in Enterprise Resource Planning Analytics. Each
application is covered in detail in the slides that follow.
Through complete end-to-end insight into the manufacturing operations and visibility across
the plants and business units, organizations can significantly reduce costs, enhance
profitability, increase customer satisfaction, and gain competitive advantage by identifying and
eliminating low value-added processes without compromising quality.
The solution is also suitably integrated with other applications in the Oracle Business
Intelligence Applications family to deliver supply chain information across the value chain. For
example, Oracle Supply Chain and Order Management Analytics enables better
understanding of problem areas in fulfilling certain products and helps identify unrealistic
levels of sales order fulfillment backlog. When coupled with Manufacturing Analytics,
organizations are able to analyze supply and demand in tandem to identify potential supply
shortages.
Oracle Marketing Analytics provides timely, fact-based insight into the marketing activities of
the entire organization. It provides new levels of information richness, usability, and reach to
marketing professionals throughout the enterprise. All users, from marketing executives to
marketing analysts, get complete and in-context marketing insight: insight that is
personalized, relevant, and actionable. The benefits are faster and more informed decisions
that help the marketing organization optimize resources, reduce costs, and improve the
effectiveness of marketing activities.
Oracle Marketing Analytics provides information on which products customers are likely to
buy and insight into which products may make effective bundles. The nuances of customer
behavior information can be gleaned from transaction history and correlated with customer
lifetime value, churn risk, and customer behavioral attributes to gain insight into customer
clusters and better inform treatment strategies. The ability to aggregate information from
various data sources also allows marketers to calculate, monitor, and build customer
investment strategies based on critical metrics such as customer profitability.
This layer provides a complete set of information delivery and access capabilities, which are
designed to address the needs of different types of users in an organization. These
capabilities include interactive dashboards for executives and managers, query and analysis
tools for power users, scorecard and Microsoft Office interfaces for finance users, and
pixel-perfect reports and mobile support for casual users. It also supports embedded analytics
for business processes.
BI Foundation
The EPM and BI applications are integrated on a BI foundation that includes the following:
Common enterprise information model
Oracle BI Applications is built on the Oracle Business Intelligence Foundation Suite. At the
heart of the Oracle Business Intelligence Foundation Suite is a key technology differentiator
for Oracle: the Common Enterprise Information Model. This is a unified metadata model,
which is accessed by all the end-user tools, so every end user and every department across
the enterprise has the same consistent view of information, tailored to their role. As a result,
organizations no longer need to maintain multiple metadata environments for different types
of users. Oracle provides the ability to model once, deploy everywhere.
The metadata model consists of three tiers:
The physical layer enables you to import the table structures of your existing data
sources.
The semantic layer enables you to create a simplified representation of multiple data
sources, creating a logical model of your business in the ways users think about it:
dimensions, hierarchies, and metrics.
The presentation layer further simplifies this model, making the data appear to end users
as a single data source with a single table structure of dimensions, measures, and
derived measures.
This common enterprise information model enables you to define key metrics and calculations
in one place, assuring that everyone has a consistent view of information (tailored to their
role) and assuring alignment across departments.
Business objectives/issues: Maximize Cash Flow
Is DSO on target? Is DPO on target?
Analytic workflows are built around standard paths of discovery for business issues. In the
slide example, a Director of Credits and Collections in the Receivables function of Finance
and Accounting is monitoring the Maximize Cash Flow objective. This objective is composed
of several key questions and KPIs around Days Sales Outstanding (DSO), Days Payable
Outstanding (DPO), and others. Each one of these subsequently leads to more questions
about the core components of the KPI. For example, DSO being on target requires overdue
balances to be on target, customers to be paying in line with their terms, and so on. These
workflows are supported in Oracle BI Applications as standard exploration paths.
Business objectives/issues: Maximize Cash Flow
Is DSO on target?
Continuing with the example from the previous slide, following one branch of the Maximize
Cash Flow analytic workflow, each part of the flow is supported by prebuilt reports and
navigation that allow users to easily drill down to further levels of detail as required. Because
the application and the supporting data warehouse model and ETL are built to capture
information at the transaction-line level, users can easily drill down from the summary
information to the most atomic level of information. Ultimately this allows the user to not only
monitor progress on an objective, but also to easily navigate to the right information, so that,
in the end, any required corrective action can be proactively taken. Notice, for example, that in
the Take Action area of the workflow, the user is drilling down from the BI Applications
transactional-invoice level report back to the originating transactional application to take
action in the operational system.
Oracle BI Applications solutions approach: Define Metrics and Dashboards; Training/Rollout
Faster time to value
Lower TCO
Assured business value
Compared with a traditional business intelligence deployment, which entails using ETL and BI
platforms to build, load, and report on a custom data warehouse schema, Oracle BI
Applications can provide a significant benefit in the value and total cost of ownership (TCO).
The prebuilt nature of the applications, including the data warehouse data model and ETL, BI
metadata, and reports and dashboards, allows significant savings in deployment time,
creating business value in the significantly reduced time frames. Built-in best practices, KPIs,
metrics, and workflows reduce time, ensure successful business analysis, and reduce TCO.
From: a custom DW schema fed by sources such as PeopleSoft, Siebel, EBS, IVR/ACD/CTI, MS Excel, and other data sources.
To: Oracle BI Applications with packaged ETL maps and universal adaptors, delivering interactive dashboards, reporting and publishing, ad hoc analysis, proactive detection and alerts, scorecard, search, Office, embedded, and mobile capabilities.
Answer: c
Answer: c
Answer: c
Answer: a, b, c, d, e
Practice
Oracle Business Intelligence Applications Architecture
Source systems
Oracle Business Intelligence Enterprise Edition
Oracle Business Analytics Warehouse
Oracle BI Applications components repository
Oracle BI Applications Oracle Data Integrator repository
The data extracted from source systems becomes the underlying data for your Oracle BI
analyses and dashboards, and includes information about customers, inventory, sales,
marketing, accounts, and other types of data that you collect about your business.
The Oracle Business Analytics Warehouse is a modular enterprise-wide data warehouse data
model with conformed dimensions for data integrated from multiple sources. Oracle Business
Analytics Warehouse provides code standardization, stores transaction data in the most
granular fashion, and tracks historical changes. It also supports multiple currencies and
languages.
The Oracle Business Analytics Warehouse is supported only on the Oracle database.
Oracle BI Applications Configuration Manager and Functional Setup Manager are described
in more detail in the next two slides.
Configuration Manager contains the setup objects for Oracle BI Applications. It provides
administrative graphical user interfaces for setup and configuration. It is the recommended
product for on-going administration and maintenance of functional setup within Oracle BI
Applications.
It also provides a quick review of Oracle BI Applications setup values, and is the
recommended tool for monitoring and troubleshooting load plan executions. Configuration
Manager works in conjunction with Functional Setup Manager to provide guided tasks to
configure Oracle BI Applications offerings and functional areas.
Oracle BI Applications can optionally leverage Oracle GoldenGate. Oracle GoldenGate is not
covered in detail in this course. For more information, refer to Administering Oracle
GoldenGate and Source Dependent Schemas in the Oracle Fusion Middleware
Administrator's Guide for Oracle Business Intelligence Applications.
The graphic in the slide illustrates how all of the Oracle BI Applications architecture
components work together. Oracle BI Applications use all of the components depicted in the
graphic to deliver information specific to your business needs. Your data, from human
resources to sales totals, is extracted from your transactional application database, or source
system. This data is loaded into the Oracle Business Analytics Warehouse database and it is
transformed using the mappings that are stored in the Oracle BI Applications Repository
database. This database system contains the repository information for the Oracle BI
Applications ODI repository and the Oracle BI Applications components repository.
Each layer is discussed in more detail in the slides that follow.
The transactional application database contains transactional application schemas and can
optionally be hooked up to the Oracle Business Analytics Warehouse with Oracle
GoldenGate. The Oracle Business Analytics Warehouse database contains the optional BI
Applications source-dependent data store (SDS) schema and the Oracle Business Analytics
Warehouse schema. The Oracle BI Applications Repository Database system includes the
Oracle BI Applications Oracle Data Integrator (ODI) repository and the Oracle BI Applications
components repository.
ETL components:
ODI Console and ODI Agent
Configuration Manager
Functional Setup Manager
Load plan generator
The Oracle Business Intelligence system logical architecture comprises a single integrated
set of manageable components called the Oracle BI domain. This layer includes the ODI
Console and ODI Agent, Oracle BI Applications Configuration Manager, Functional Setup
Manager, and the load plan generator, all of which are hosted on the Oracle WebLogic
Server.
ODI Console is a web application that enables you to view objects in the ODI Repository and
control and monitor ETL processes.
ODI Agent is a Java EE agent, which handles schedules and orchestrates ETL sessions.
Load plan generator (not shown here) is a set of jar files used to create load plans in the ODI
repository. A load plan is an executable object that comprises and organizes the child objects
(referred to as steps) that carry out the ETL process.
Oracle BI Presentation Services is part of the Oracle BI EE platform. It provides processing to
visualize information for end-user consumption, such as analyses and dashboards. It uses a
presentation catalog to store saved content. The BI Applications presentation catalog is
prebuilt with role-based interactive dashboards and analyses to support the deployed Oracle
BI Applications product offerings.
The Oracle BI Applications repository (.rpd) file is a prebuilt Oracle BI repository containing all
the metadata required for the deployed applications. It contains a logical business model
mapped to physical data sources, key performance indicators and metrics definitions, and a
presentation layer that exposes the business model to business users. Oracle BI Server
receives its processing information from this repository.
Oracle BI Applications users are assigned to application roles, which define a set of
permissions granted to a user or group. Application roles are discussed in more detail in
Lesson 13, Security.
Users assigned to business user application roles have access to web applications that allow
them to access the prebuilt Oracle BI Applications dashboards and analyses, and to build
new dashboards and analyses when the proper privileges are granted.
Users assigned to Oracle BI Applications ETL administration and functional setup application
roles have access to Configuration Manager and Functional Setup Manager, which are web
applications that allow administrative users to perform configuration tasks.
Users assigned to Oracle BI Applications development application roles have access to ODI
Console, which is a web application that provides the ability to view objects in the ODI
repository and to control and monitor ETL processes.
Users with development application roles can also access ODI Studio, which is used for
administering the ODI infrastructure, reverse engineering the metadata, developing projects,
scheduling, operating and monitoring executions, and customization.
Users with development application roles may also use the BI Administration Tool, which is a
Windows user interface for building, managing, and customizing the Oracle BI EE repository.
You learn more about the installation process in Lesson 5, Installing Oracle Business
Intelligence Applications.
The slide provides a partial list of configuration tasks that you can perform using the Oracle BI
Applications Configuration Manager and Functional Setup Manager web applications.
Functional configuration is covered in more detail in Lesson 7, Functional Configuration for
Oracle Business Intelligence Applications and Lesson 8, Administering and Maintaining
Functional Configuration Data.
After you have configured your load plan and your functional
configuration data, you perform a full load of your transactional
data into the Oracle Business Analytics Warehouse.
Creating, generating, executing, and monitoring load plans is covered in more detail in
Lesson 9, Managing Load Plans.
The ETL process is covered in more detail in Lesson 6, Understanding the ETL Process.
Each of these topics is covered in more detail in the remaining lessons for this course.
Answer: b
Answer: b
The Oracle BI Applications Oracle Data Integrator (ODI) Repository contains the BI
Applications-specific prebuilt ETL logic.
Which of the following are tasks that you perform using Oracle
BI Applications Configuration Manager?
a. Configure offerings, which are the products you have
purchased.
b. Configure functional areas, which are the component parts
of the offering.
Answer: a, b, d, f
Answer: c
Oracle Business Analytics Warehouse Content
The Oracle Business Analytics Warehouse is a modular enterprise-wide data warehouse data
model with conformed dimensions for data integrated from multiple sources. Oracle Business
Analytics Warehouse provides code standardization, stores transaction data in the most
granular fashion, and tracks historical changes. It also supports multiple currencies and
languages.
The Oracle Business Analytics Warehouse is supported only on the Oracle database.
Because complex queries run slowly on transactional databases, the database requirements
for OBAW are different from those of transactional applications. In OBAW, you modify data
much less frequently than in transactional systems, but you need quick results when viewing
dashboards, analyses, and while drilling down to detailed charts and graphs. Star schemas
are optimized for these uses.
The star schema is the simplest data warehouse schema. It is called a star schema because
the diagram of a star schema resembles a star, with points radiating from a center. The center
of the star consists of one or more fact tables and the points of the star are the dimension
tables. A star schema is characterized by one or more very large fact tables that contain the
primary information in the data warehouse and a number of much smaller dimension tables
(or lookup tables), each of which contains information about the entries for a particular
attribute in the fact table.
A typical fact table contains keys and measures. For example, a simple fact table might
contain the measure Sales, and keys Time, Product, and Market. In this case, there would be
corresponding dimension tables for Time, Product, and Market. The Product dimension table,
for example, would typically contain information about each product number that appears in
the fact table. A measure is typically a numeric or character column, and can be taken from
one column in one table or derived from two columns in one table or two columns in more
than one table.
A star join is a primary-key to foreign-key join of the dimension tables to a fact table. The main
advantages of star schemas are that they provide a direct and intuitive mapping between the
business entities being analyzed by end users and the schema design, and provide highly
optimized performance for typical data warehouse queries.
In this example, W_GL_REVN_F is the fact table in the data warehouse star schema.
Examples of dimension tables are W_PRODUCT_D, W_USER_D, and so on.
Oracle BI Applications offers a breadth of analysis, spanning sales, service, and marketing to
back-office functions, and includes several verticalized areas. To meet the analytical
requirements, the OBAW contains a number of horizontal as well as vertical star schemas.
For example, in the pharmaceutical industry, specific star schemas are included to store
information about prescriptions and industry-syndicated market data. The star schemas can
be deployed in the OBAW depending on the applications that are selected, and the use of
conforming dimensions allows consistent analysis across different subject areas, applications,
and areas of analysis. Conforming dimensions are discussed in detail later in the lesson.
The table in the slide shows naming convention examples for some of the table types in
OBAW. Not all table types are listed here. For more information, refer to Naming
Conventions for Oracle Business Analytics Warehouse Tables in the Oracle Fusion
Middleware Administrator's Guide for Oracle Business Intelligence Applications.
Using W_GL_REVN_F as an example, W_ is the prefix indicating that it is a warehouse table,
GL_REVN is the table name, and _F is the suffix indicating that it is a fact table.
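As a rough illustration of the convention, a small helper can decompose a warehouse table name into its prefix, base name, and type suffix. The suffix map below is a partial, assumed subset based only on the types mentioned in this lesson:

```python
# Assumed subset of OBAW suffixes; not an exhaustive or authoritative list.
SUFFIX_TYPES = {
    "_F": "fact", "_D": "dimension", "_DH": "dimension hierarchy",
    "_FS": "fact staging", "_DS": "dimension staging",
    "_A": "aggregate", "_G": "internal",
}

def classify(table_name):
    """Split a W_* table name into (base name, table type), or None."""
    if not table_name.startswith("W_"):
        return None  # not a warehouse table
    # Try longer suffixes first so "_DH" matches before "_D"
    for suffix in sorted(SUFFIX_TYPES, key=len, reverse=True):
        if table_name.endswith(suffix):
            base = table_name[len("W_"):-len(suffix)]
            return base, SUFFIX_TYPES[suffix]
    return None

print(classify("W_GL_REVN_F"))    # ('GL_REVN', 'fact')
print(classify("W_PRODUCT_DH"))   # ('PRODUCT', 'dimension hierarchy')
```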
Fact
Dimension
Minidimension
Helper
Hierarchy
This slide lists a subset of the table types used in the OBAW. Each table type listed here is
discussed in detail in the slides that follow.
In a star schema, the central table is a fact table. Fact tables contain the metrics being
analyzed by dimensions. A fact table usually contains numeric measurements and has
multiple joins to dimension tables surrounding it. A fact table typically has two types of
columns: those that contain numeric facts (often called measures) and those that are foreign
keys to dimension tables. Dimension tables are related to the fact table by a single join. A fact
table contains either detail-level facts or facts that have been aggregated. Fact tables that
contain aggregated facts are often called summary tables. A fact table usually contains facts
with the same level of aggregation.
The dimension tables in a star schema store descriptions of the characteristics of a business.
A dimension is descriptive information that qualifies a fact. For example, each record in a
product dimension represents a specific product. Typically, dimension tables are wide and
short because they contain fewer records with many columns. The columns of a dimension
table are also called attributes of the dimension. Each dimension table in a star schema has a
single primary key joined to the fact table.
The unique numeric key (ROW_WID) for each dimension table is generated during the load
process. This key is used to join each dimension table with its corresponding fact table or
tables. It is also used to join the dimension with any associated hierarchy table or extension
table. The ROW_WID columns in the Oracle Business Analytics Warehouse tables are
numeric. In every dimension table, the ROW_WID value of zero is reserved for Unspecified. If
one or more dimensions for a given record in a fact table are unspecified, the corresponding
key fields in that record are set to zero.
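The reserved ROW_WID = 0 convention can be sketched as follows; the table names are illustrative, not the real OBAW definitions. Because the Unspecified row exists, fact rows whose dimension is unknown still survive an inner join instead of being silently dropped:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE w_product_d (row_wid INTEGER PRIMARY KEY, product_name TEXT);
    CREATE TABLE w_sales_f   (product_wid INTEGER, sales_amt REAL);

    -- Reserved 'Unspecified' row (key 0) plus one real product
    INSERT INTO w_product_d VALUES (0, 'Unspecified'), (1, 'Widget');

    -- The second fact row has no known product, so its key was set to 0
    INSERT INTO w_sales_f VALUES (1, 100.0), (0, 25.0);
""")

rows = con.execute("""
    SELECT d.product_name, SUM(f.sales_amt)
    FROM w_sales_f f
    JOIN w_product_d d ON f.product_wid = d.row_wid
    GROUP BY d.product_name
    ORDER BY d.product_name
""").fetchall()
# Both fact rows are kept; the unknown one rolls up under 'Unspecified'
print(rows)  # [('Unspecified', 25.0), ('Widget', 100.0)]
```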
Helper tables are used by the Oracle Business Analytics Warehouse to solve complex
problems that cannot be resolved by simple dimensional schemas. In a typical dimensional
schema, fact records join to dimension records with a many-to-one relationship. To support a
many-to-many relationship between fact and dimension records, a helper table is inserted
between the fact and dimension tables. The helper table can have multiple records for each
fact and dimension key combination. This allows queries to retrieve facts for any given
dimension value. It should be noted that any aggregation of fact records over a set of
dimension values might contain overlaps (due to a many-to-many relationship) and can result
in double counting.
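The double-counting caveat can be demonstrated with a toy bridge (helper) table; all names here are hypothetical:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE sales_f    (sale_wid INTEGER PRIMARY KEY, amount REAL);
    CREATE TABLE rep_d      (rep_wid INTEGER PRIMARY KEY, rep_name TEXT);
    -- Helper table: one row per (fact, dimension) pairing
    CREATE TABLE sale_rep_h (sale_wid INTEGER, rep_wid INTEGER);

    INSERT INTO sales_f VALUES (1, 100.0);
    INSERT INTO rep_d   VALUES (10, 'Ann'), (11, 'Bob');
    -- The single sale is credited to two reps (many-to-many)
    INSERT INTO sale_rep_h VALUES (1, 10), (1, 11);
""")

# Per-rep totals are correct for any given dimension value...
per_rep = con.execute("""
    SELECT r.rep_name, SUM(f.amount)
    FROM sales_f f
    JOIN sale_rep_h h ON f.sale_wid = h.sale_wid
    JOIN rep_d r      ON h.rep_wid  = r.rep_wid
    GROUP BY r.rep_name ORDER BY r.rep_name
""").fetchall()

# ...but aggregating across all reps counts the shared sale twice.
grand = con.execute("""
    SELECT SUM(f.amount)
    FROM sales_f f JOIN sale_rep_h h ON f.sale_wid = h.sale_wid
""").fetchone()[0]

print(per_rep)  # [('Ann', 100.0), ('Bob', 100.0)]
print(grand)    # 200.0, not 100.0
```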
Hierarchies stored in transactional systems are flattened in hierarchy tables in the data
warehouse. For example, W_ORG_DH stores the hierarchy relationships for the organization
dimension, W_PRODUCT_DH stores hierarchy relationships for the product dimension, and so
on. Hierarchy tables are rebuilt with each ETL run. Examples of hierarchy tables in the data
warehouse include:
Industry (W_INDUSTRY_DH)
Organization (W_ORG_DH)
Internal Organization (W_INT_ORG_DH)
Employee Positions (W_POSITION_DH)
Product (W_PRODUCT_DH)
The screenshot shows a partial view of W_INT_ORG_DH, which stores the flattened hierarchy
of internal organizations. When one organization rolls up into multiple hierarchies, the multiple
hierarchies are stored in this table. Each hierarchy is differentiated by a hierarchy number and
name.
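The flattening idea can be sketched in a few lines: walk each member's parent-child chain and emit one fixed-width row of level columns, as a W_*_DH-style table does. The data and the column layout here are invented for illustration:

```python
# Hypothetical parent-child hierarchy, as a transactional system might store it
parent_of = {
    "West Region": "US Sales",
    "East Region": "US Sales",
    "US Sales": None,  # top of the hierarchy
}

def flatten(org, max_levels=3):
    """Return the ancestor chain from the top level down to `org`,
    padded with None so every row has the same number of level columns."""
    chain = [org]
    while parent_of.get(chain[-1]) is not None:
        chain.append(parent_of[chain[-1]])
    chain.reverse()  # top level first
    return tuple(chain + [None] * (max_levels - len(chain)))

print(flatten("West Region"))  # ('US Sales', 'West Region', None)
print(flatten("US Sales"))     # ('US Sales', None, None)
```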
Staging tables are used primarily to stage incremental data from the transactional database.
When the ETL process runs, staging tables are truncated before they are populated with
change capture data. During the initial full ETL load, these staging tables hold the entire
source data set for a defined period of history, but they hold only a much smaller volume
during subsequent refresh ETL runs. The staging table structure is independent of source
data structures and resembles the structure of data warehouse tables. This resemblance
allows staging tables to also be used as interface tables between the transactional database
sources and data warehouse target tables.
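The truncate-then-load behavior described above can be sketched as follows; the table name and columns are illustrative, not the real staging schema:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE w_sales_fs (sale_id INTEGER, amount REAL)")

def load_staging(changed_rows):
    """Truncate the staging table, then load only the captured rows."""
    con.execute("DELETE FROM w_sales_fs")  # truncated before each run
    con.executemany("INSERT INTO w_sales_fs VALUES (?, ?)", changed_rows)

load_staging([(1, 100.0), (2, 50.0)])  # initial full load: entire source set
load_staging([(2, 55.0)])              # refresh run: only the changed row
print(con.execute("SELECT * FROM w_sales_fs").fetchall())  # [(2, 55.0)]
```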
One of the main uses of a data warehouse is to sum fact data with respect to a given
dimension (for example, by date or by sales region). Performing this summation on demand is
resource-intensive and slows down response time. The Oracle Business Analytics
Warehouse precalculates some of these sums and stores the information in aggregate tables
to speed up response time. In the Oracle Business Analytics Warehouse, the aggregate
tables have been suffixed with _A.
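A minimal sketch of the idea, with invented table names: the sum is computed once into an _A table during ETL, and dashboard queries then read the small precomputed table instead of re-summing the detail facts:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE w_sales_f (region TEXT, amount REAL);
    INSERT INTO w_sales_f VALUES ('West', 100.0), ('West', 50.0), ('East', 30.0);

    -- Precalculate the sum once, during ETL, into an aggregate (_A) table
    CREATE TABLE w_sales_a AS
        SELECT region, SUM(amount) AS total_amount
        FROM w_sales_f GROUP BY region;
""")

# Queries then read the aggregate directly
print(con.execute("SELECT * FROM w_sales_a ORDER BY region").fetchall())
# [('East', 30.0), ('West', 150.0)]
```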
Internal tables in the OBAW are those tables that cannot be classified as staging, fact,
dimension, hierarchy, extension, or dimensional map tables. These tables store important
information that is used during the ETL process, are rebuilt during each ETL process, and are
not used by end-user query tools. Internal tables include the following:
W_COSTLST_G: The cost list information used by the ETL process. It is mirrored in the
Siebel transactional database by S_ETL_COSTLST.
W_DUAL_G: The table used by the ETL process to generate calculated values. It is
similar to S_DUAL in the Siebel transactional database.
W_EXCH_RATE_G: The exchange rate information used by the ETL process. It is
mirrored in the transactional database by S_ETL_EXCH_RATE.
W_DIM_TABLES_G: A list of data warehouse tables that are map-enabled
W_LOV_EXCPT_G: An intermediate table for finding exceptions in a list of values
W_LST_OF_VAL_G: A list of values used in the ETL process
Conforming dimension tables are shared by multiple fact tables, allowing consistent analysis
across multiple star schemas.
The table in the slide shows examples of column suffixes in OBAW. Not all column suffixes
are listed here. For more information, refer to Standard Column Suffixes in Oracle Business
Analytics Warehouse in the Oracle Fusion Middleware Administrator's Guide for Oracle
Business Intelligence Applications.
The table in the slide shows examples of system columns in OBAW. Not all system columns
are listed here. For more information, refer to System Columns in Oracle Business Analytics
Warehouse Tables in the Oracle Fusion Middleware Administrator's Guide for Oracle
Business Intelligence Applications.
Currency: Description
Contract currency: The currency used to define the contract amount. This currency is used only in Project Analytics.
CRM currency: The CRM corporate currency as defined in the Fusion CRM application. This currency is used only in CRM Analytics applications.
Document currency: The currency in which the transaction was done and the
Answer: b
Answer: b
Answer: b
Conforming dimension tables are shared by multiple fact tables, allowing consistent analysis
across multiple star schemas.
Answer: d
Answer: d
Answer: c, d
Installing Oracle Business Intelligence Applications
This slide lists the preinstallation and deployment requirements for Oracle BI Applications.
Each requirement is discussed in detail in the slides that follow.
Although it is technically possible to put the OBAW in the same database as the transactional
database, it is not recommended for performance reasons. The transactional database is
structured as an online transaction processing (OLTP) database, whereas the Oracle
Business Analytics Warehouse is structured as an online analytical processing (OLAP)
database, each optimized for its own purpose. The slide lists the reasons for not combining
the two databases. More detail is provided here:
ETL is configured to maximize hardware resources; therefore, the warehouse should not
share resources with any other projects.
The analytical queries interfere with normal use of the transactional database, which is
entering and managing individual transactions.
The data in a transactional database is normalized for update efficiency. Transactional
queries join several normalized tables and will be slow (as opposed to prejoined,
denormalized analytical tables).
Historical data cannot be purged from a transactional database, even when it is not required
for current transaction processing, because you need it for analysis. (By contrast, the
analytical database is the warehouse for historical as well as current data.) This further
slows down the transactional database.
The parameter template file provides parameter guidelines based on the cost-based optimizer
for Oracle 11g R2. Use these guidelines as a starting point. You will need to make changes
based on your specific database sizes, data shape, server size (CPU and memory), and type
of storage.
Copy the appropriate template file into your <ORACLE_HOME>/dbs directory. Then review the
recommendations in the template file, and make the changes based on your specific
database configuration. The database administrator should make changes to the settings
based on performance monitoring and tuning considerations.
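As a rough illustration, a tuned parameter file might contain entries of the following kind. The parameter names below are standard Oracle initialization parameters, but the values are invented placeholders, not Oracle's recommendations; the actual guideline values come from the template file shipped with Oracle BI Applications.

```
# Hypothetical excerpt -- illustrative values only
processes=500
open_cursors=1000
sga_target=4G
pga_aggregate_target=2G
star_transformation_enabled=TRUE
```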
This slide and the next list the high-level installation tasks for Oracle Business Intelligence
Applications. Each task is discussed in detail in the slides that follow.
Before you install Oracle Business Intelligence Applications, you must install the Oracle
Business Intelligence Enterprise Edition infrastructure, and before you install the Oracle
Business Intelligence Enterprise Edition infrastructure, you must use the Fusion Middleware
Repository Creation Utility (RCU) to create the Metadata Services (MDS) and Business
Intelligence Platform (BIPLATFORM) schemas in your database.
The Repository Creation Utility (RCU) is a graphical tool for creating and managing Oracle
Fusion Middleware database schemas in your database.
The screenshot shows the Summary page when creating the MDS and BIPLATFORM
schemas using the Fusion Middleware RCU in a Windows environment.
When you run the Oracle Business Intelligence Enterprise Edition (Oracle BI EE) 11g
installer, you can select either the Software Only Install or Enterprise Install option.
If you select the Software Only Install option, then you must install WebLogic Server 10.3.6
before running the Oracle BI EE installer.
If you select the Enterprise Install option, WebLogic Server 10.3.5 is automatically installed.
You must then upgrade to WebLogic Server 10.3.6 after the Oracle BI Applications installation
is complete.
The screenshot shows the Complete page for an Oracle BI EE installation using the
Enterprise Install option in a Windows environment. The remaining slides in this lesson are
based on an Oracle BI EE installation using the Enterprise Install option in a Windows
environment.
Before performing installation steps, please refer to Notice for Install of Oracle BI
Applications OBIA 11.1.1.7.1 (Doc ID 1558419.1) on My Oracle Support for additional
information about installing Oracle Business Intelligence Applications version 11.1.1.7.1 with
WebLogic 10.3.6 and Oracle BI EE in Software Only Install mode.
The screenshot shows the Installation Completed page for Oracle Data Integrator (ODI)
installed in a Windows environment. The ODI installer must be run on the same machine
where the Oracle BI EE installer was run. You cannot use an ODI installation outside of the
BI Domain with Oracle BI Applications.
When you install ODI, specify a directory inside the Oracle Middleware Home. You cannot
install in just any Oracle Middleware Home directory. You must install ODI to the same Oracle
Middleware Home directory where Oracle BI EE has been installed. In the screenshot this
directory is D:\Oracle\Middleware. Oracle Home Location is the root directory where the
ODI products are installed. In the screenshot this is
D:\Oracle\Middleware\Oracle_ODI1.
The WebLogic Server option (not shown here) is selected by default if the installer detects a
Middleware Home with Oracle WebLogic Server installed. This will always be the case with
the ODI install for Oracle BI Applications, because you must install ODI to the same Oracle
Middleware Home directory where Oracle BI EE has been installed. Oracle BI EE for Oracle
BI Applications is supported only on Oracle WebLogic Server.
The screenshot shows the Summary page for the Business Analytics Applications Suite RCU
installed in a Windows environment. You need to run the Business Analytics Applications
Suite RCU to create schemas for the following components:
Oracle Business Analytics Warehouse
Oracle Business Intelligence Applications Components Repository
ODI Master and Work Repository
Before you run the Business Analytics Applications Suite RCU in a Windows environment,
make sure that you unzip the downloaded RCU zip file into a directory that does not have
spaces in the directory path.
The RCU uses .dmp files to create the required schemas. You must copy the .dmp files for
each schema to a directory with global write access on the appropriate database server
machines. RCU writes log files to this directory.
The .dmp files are located in BIA_RCU_HOME/rcu/integration/biapps/schema.
The screenshot shows the Complete page for an Oracle BI Applications installation in a
Windows environment. All files are installed to disk in the Oracle Home for BI directory. During
installation, you must specify the path to the directory for an existing Oracle Middleware Home
where Oracle BI EE has been installed. The Oracle Home directory you specify must be the
Oracle Home for Oracle BI EE. You cannot choose another Oracle Home or create a new
one.
Note that you will perform post-installation steps to configure Oracle BI Applications in a later
procedure.
The screenshot shows a completed patching report run in a Windows environment. You can
download Oracle Fusion Middleware Platform Patches for Oracle Business Analytics
Applications Suite from the Oracle Business Intelligence Applications 11.1.1.7.1 media pack
on Oracle Software Delivery Cloud. Download the following three parts from the Oracle
Business Intelligence Applications 11.1.1.7.1 Media Pack:
Oracle Fusion Middleware Platform Patches for Oracle Business Intelligence
Applications (Part 1 of 2)
Oracle Fusion Middleware Platform Patches for Oracle Business Intelligence
Applications (Part 2 of 2)
Oracle Fusion Middleware Platform Patches for Oracle Business Intelligence
Applications for <OS>
Extract the contents of the three downloaded zip files containing the patches into the same
directory. The patches are contained in folders: biappsshiphome, odi, soa, weblogic, and
oracle_common. You do not have to unzip the individual patches within the folders.
You run a script to apply the patches. The script is a Perl script and is available in
<BI_Oracle_Home>/biapps/tools/bin/APPLY_PATCHES.pl.
The screenshot shows the Complete page for Oracle BI Applications configuration in a
Windows environment. During this phase, the following key configurations occur:
Oracle BI Applications Configuration Manager, Functional Setup Manager, ODI Java EE
Agent, ODI Console, and Load Plan Generator are deployed into WebLogic Server.
Component wiring is performed.
A BI Applications Administrator User (with full access to Configuration Manager and
access to ODI with the Supervisor role) is created in WebLogic Server embedded
LDAP.
The ODI repository for BI Applications is configured and set to use external
authentication (authentication against the WebLogic Server embedded LDAP).
You use the WebLogic Upgrade Installer to upgrade to WebLogic Server 10.3.6. You can
download WebLogic Upgrade Installer from My Oracle Support. The screenshot shows the
directory confirmation page to upgrade WebLogic Server to 10.3.6 in a Windows environment.
ODI Studio is a desktop client that enables you to design and manage the ODI repository.
ODI Studio is installed during the ODI installation if you select ODI Studio as an option during
installation. Typically, ODI Studio will not be installed to the BI Domain, but instead will be
installed on developer machines. The supported operating systems for ODI Studio are
Windows (32-bit and 64-bit) and Linux (32-bit).
The ODI Repository is configured for external authentication against WebLogic Server's
embedded LDAP server. ODI Studio must be configured to use the appropriate security files
for authentication. You must perform the steps on the slide for all installations of ODI Studio.
Note: You must perform these steps even if ODI Studio has been installed on the machine
where Oracle Home for BI resides. If you do not successfully complete these steps, you will
receive the following error message when attempting to sign in to ODI Studio: ODI-10188:
Error while login from OPSS../jps-config.xml (No such file or directory).
The binaries for ATGLite Patch 16239380 are applied when the Fusion Middleware platform
patches are applied. However, the schema update and seed data updates are not applied and
must be performed by running the patch script.
This slide lists the post-installation system setup tasks. Each task is covered in detail in the
slides that follow.
The slide shows the workflow: Start ODI Studio and select Connect to Repository; provide
connection information for the OBIA work repository; the BI Apps Project is then visible in
ODI Studio.
When you start ODI Studio for the first time after installation, you are prompted to connect to
the OBIA ODI Work Repository. You can enter any login name, but the credentials must be for
the administrative user that you defined during installation. After you click OK to initialize the
ODI connection, the BI Apps Project mapping folders are visible in ODI Studio.
You use the Topology tab in ODI Studio to set the connection properties in the ODI repository
of the physical schema associated with the BIAPPS_DW_FILE physical server. In the
navigation pane, select the Topology tab.
In the left pane, expand Technologies > File > BIAPPS_DW_FILE and then double-click
BIAPPS_DW_FILE <path> to edit the BIAPPS_DW_FILE physical schema in the right pane.
BIAPPS_DW_FILE <path> points to the original location of the source files. Recall that during
the installation process these files are copied to a location that ODI can access, but that is
outside of the Oracle BI Home directory. This is done to prevent these files from being
overwritten when an Oracle BI Applications environment is upgraded or patched.
In the Definition pane, for the Directory (Schema) and Directory (Work Schema) properties,
specify the directory where you copied the source files and include the subfolders. In this
example, the location is D:\etl\data_files\src_files\BIA_11.
Use Oracle BI Applications Configuration Manager to register the source system. Launch
Oracle BI Applications Configuration Manager and sign in as the BI Applications
Administrator.
In the navigation pane, select System Setups > Define Business Intelligence Applications
Instance. The Source Systems tab is displayed (not shown here). Click the Add icon to
display the Register Source dialog box.
To register the source in Configuration Manager, use the Register Source in Configuration
Manager page to specify the following properties:
Product Line
Product Line Version
Source Instance Name
Description
Data Source Number
Click Next to open the Register Source in Oracle Data Integrator Topology page, which is
discussed in the next slide.
From the Tasks bar, under System Setups, click Manage Business Intelligence Applications
to display the Manage Business Intelligence Applications dialog box. Select the Business
Intelligence Application Offerings tab.
Select the Enabled check box next to the desired offering(s). In this example, Oracle Financial
Analytics and its associated functional areas are selected. Enabling an offering makes the
setup data associated with that offering available in Configuration Manager.
Oracle Business Intelligence is installed with a set of preferred currencies with preconfigured
preferred currency names and preferred currency codes. Preferred currency names are used
in Oracle Business Intelligence dashboards in the Currency drop-down in the My Account
dialog box and on the Preferences tab for a user logged into Oracle Business Intelligence.
You can use the Manage Preferred Currencies dialog box in Configuration Manager to edit
the default currency display names. You edit preferred currency name values to change the
currency labels that are displayed in all modules associated with BI dashboards. For example,
you might want to change the label for the local currency from Ledger Currency to Local
Currency.
The steps for running a domain-only load plan are covered in detail in Lesson 9, Managing
Load Plans.
Upon installation, the Oracle BI Applications system is configured to use WebLogic Server
embedded LDAP for authentication. Access to Configuration Manager and Functional Setup
Manager is controlled through the following application (duty) roles:
BI Applications Administrator Duty
BI Applications Functional Developer Duty
BI Applications Implementation Manager Duty
Load Plan Operator Duty
Load Plan Administrator Duty
During Oracle BI Applications installation, an Oracle BI Applications Administrator user is
created. This user has full access to Configuration Manager and ODI. This user can perform
all Oracle BI Applications functions, which includes performing system setups in Configuration
Manager, performing functional configurations in Functional Setup Manager, and running and
monitoring ETL. Additional users can be created and assigned appropriate roles based on the
tasks these users will perform.
Security implementation and administration is discussed in detail in Lesson 13, Security.
You can trim the OBIA repository (RPD) so that it includes only
the projects that are relevant to your deployment.
OBIA release 11.1.1.7.1 delivers a full RPD file, deployed
to the BI Server, with projects for all the OBIA modules.
Although optional, trimming the RPD makes the BI Server
startup process faster and also makes patching quicker.
The steps for trimming the RPD depend on the status of your deployment, as follows:
If the RPD has not been customized for your deployment: Extract the projects for the
products that your organization has purchased. You do not need to perform a merge.
If the RPD has been customized for your deployment: Extract the applicable projects
from the full (delivered) RPD for release 11.1.1.7.1, and, additionally, merge that RPD
with your customized release 11.1.1.7.1 RPD.
For more detailed information, refer to the Trimming the RPD section in the Oracle Fusion
Middleware Installation Guide for Oracle Business Intelligence Applications.
Which are valid reasons not to install the OBAW in the same
database as the transactional database?
a. ETL is configured to maximize hardware resources.
b. Analytical queries interfere with normal use of the
transactional database.
c. The data in a transactional database is normalized for
Answer: a, b, c, d, e
Answer: a
Answer: a, d
Answer: a, c, d
Answer: d
Configuration Manager is a web application that you use to manage and monitor load plans,
which execute ETL processes. It acts as a console for Oracle Data Integrator, which is the
embedded data integration platform and ETL processing engine. Configuration Manager
provides:
A user interface for managing load plans
Access to ODI Console, which is a web application that enables you to manage and
monitor ETL processes
Many instances of the ODI repository can coexist in the IT infrastructure (for example,
development, quality assurance, user acceptance, and production). The architecture of the
repository is designed to allow several separated environments that exchange metadata and
scenarios (for example, development, test, maintenance, and production environments).
There is usually only one ODI master repository that stores the
following information:
Security information including users, profiles, and rights for
the ODI platform
Topology information including technologies, server
definitions, schemas, contexts, languages, and so on
The work repository contains actual developed objects. Several work repositories may coexist
in the same ODI installation (for example, to have separate environments or to match a
particular versioning life cycle). When the work repository contains only the execution
information (typically for production purposes), it is then called an execution repository.
Oracle BI Applications is shipped with a prebuilt ODI work repository.
The objects referenced in this slide are covered in more detail later in this lesson.
The graphic in the slide summarizes the interaction of the components presented in the
preceding slides.
Project
Adaptor
Interface
Package
Scenario
The slide lists the prebuilt ODI metadata objects that are used in ETL processing for Oracle BI
Applications. Each of these objects is discussed in detail in the slides that follow.
The objective here is to explore some of the prebuilt metadata in ODI Studio to gain a
high-level understanding of some of the key elements, components, and naming conventions
related to ETL processing.
At this point, do not concern yourself with specific details about the metadata, the ODI Studio
tools, or Configuration Manager, which are all covered in more detail in subsequent lessons.
Rather, focus on the metadata objects and their general use in the ETL process.
Components involved in a project include components contained in the project and global
components referenced by the project. In addition, a project also uses the components
defined in the models and topology.
The slide shows the BI Apps Project, which is a prebuilt ODI project that ships with Oracle BI
Applications. The BI Apps Project folder contains various subfolders and objects used to
manage the ETL process for Oracle BI Applications.
Folders are components that help organize the work in a project. Subfolders can be inserted
into folders.
The example in the slide shows the project expanded to Components > DW > Oracle >
Generate DW DDL. This folder contains the procedures used to generate and optionally
execute DDL scripts to synchronize the Oracle Business Analytics Warehouse schema with
the Oracle BI Applications model in ODI. You learn how to generate DDL scripts in the lesson
titled Building a Category 1 Customization.
Expand BI Apps Project > Mappings to view the subfolders for the prebuilt adaptors for
various Oracle source systems from which data is extracted.
For example, SDE_ORA11510_Adaptor is a subfolder that contains the prebuilt SDE task
folders for the 11.5.10 version of Oracle E-Business Suite. SDE_PSFT_90_Adaptor contains
the prebuilt SDE task folders for the 9.0 version of PeopleSoft, and so on.
The screenshot shows SDE_ORA11510_Adaptor expanded to the
SDE_ORA_APAgingBucketsDimension task folder. Notice that the task folders comprise
packages, interfaces, and procedures.
In the example in the slide, notice there are two interfaces for the
SDE_ORA_GeographyDimension_HZLocations task folder.
The slide also shows the mapping diagram in the ODI Designer editor for the
W_GEO_DS_SQ_HZ_LOCATIONS interface. This is a temporary interface, which is used to
extract data for the staging table W_GEO_DS. In this interface, data is extracted from
columns in the HZ_LOCATIONS dimension datastore in the source system and mapped to
columns in a temporary target named SQ_HZ_LOCATIONS, where the data is transformed
for some mappings.
The second interface, W_GEO_DS, is the main interface. It loads the data into the
W_GEO_DS staging table in the Oracle Business Analytics Warehouse. This interface uses
the SQ_HZ_LOCATIONS temporary target in the W_GEO_DS_SQ_HZ_LOCATIONS
interface as its source, and the W_GEO_DS dimension staging table as its target.
In the example in the slide, notice there are three steps for the SDE_ORA_GLRevenueFact
package in the SDE_ORA_GLRevenueFact task folder.
Refresh IS_INCREMENTAL: This is a variable step, which declares, sets, refreshes, or
evaluates the value of a variable. In this case, this is a refresh variable step, which
refreshes the variable by running the query specified in the variable definition. More
specifically, this variable is used to determine whether this package should be run in full
or incremental mode.
Refresh LAST_EXTRACT_DATE: This is also a refresh variable step that determines
the date on which data was last extracted for this package. This variable is used to
determine if this package should execute a full or incremental load.
Run SDE_ORA_GLRevenueFact.W_GL_REVN_FS: This is a flow step, which
executes an interface. In this example, it executes the W_GL_REVN_FS interface,
which loads the W_GL_REVN_FS fact staging table in the data warehouse.
You can also view and administer package steps in an execution diagram in the ODI
Designer editor.
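As a rough sketch (plain Python, not actual ODI variable code), the full-versus-incremental decision implemented by the IS_INCREMENTAL and LAST_EXTRACT_DATE refresh-variable steps can be thought of as follows. The function and variable names are invented for illustration.

```python
# Illustrative sketch only: models the decision made by the package's
# refresh-variable steps before the flow step executes the interface.
from datetime import date

def is_incremental(last_extract_date):
    # IS_INCREMENTAL: run incrementally only if a previous extract is recorded.
    return last_extract_date is not None

def plan_run(last_extract_date):
    # Decide how the flow step (the interface execution) should run.
    if is_incremental(last_extract_date):
        return f"incremental load since {last_extract_date:%Y-%m-%d}"
    return "full load"

first_run = plan_run(None)             # no prior extract: full load
next_run = plan_run(date(2024, 1, 1))  # prior extract recorded: incremental
```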
When a component is finished and tested, you can generate the scenario corresponding to its
actual state. Generating a scenario for an object compiles the code for this object for
deployment and execution in a production environment. This operation can be completed in
ODI Studio Designer. The scenario code (the language generated) is frozen, and all
subsequent modifications of the components, which contributed to creating it, will not change
it in any way.
It is possible to generate scenarios for packages, procedures, interfaces, or variables.
Scenarios generated for procedures, interfaces, or variables are single-step scenarios that
execute the procedure or interface, or refresh the variable. After the scenario is generated, it
is stored in the work repository.
Scenarios appear in a development environment under the source component in the Projects
tree of Designer Navigator, and appear (for both development and production environments)
in the Scenarios tree of Operator Navigator.
The screenshot in the slide shows a prebuilt scenario generated for the
SDE_ORA_GLRevenueFact package with the naming convention FOLDER NAME_SCENARIO
NAME. You learn more about generating and using scenarios in the lesson titled Building a
Category 1 Customization.
The BI Apps Projects > Variables folder contains the prebuilt variables for BI Apps Project. A
variable can be created as a global variable or in a project. This defines the variable scope.
Global variables can be used in all projects, whereas project variables can be used only
within the project in which they are defined.
A model:
Is the description of a set of datastores
Corresponds to a group of tabular data structures stored in
a data server
A model is based on a logical schema defined in the ODI topology. In a given context, this
logical schema is mapped to a physical schema. The data schema of this physical schema
contains the physical data structure: tables, files, JMS messages, and elements from an XML
file that are represented as datastores. Models, as well as all their components, are based on
the relational paradigm (table, columns, keys, and so on). The models in Data Integrator
contain only metadata, that is, the description of the data structures. They do not contain a
copy of the actual data.
The screenshot in the slide shows the prebuilt model for the Oracle BI Applications data
warehouse. Notice that the model is organized into submodels that represent the various
components of the data warehouse: aggregate, dimension, dimension staging, and so on.
The screenshot in the slide shows the prebuilt W_GEO_DS dimension staging datastore and its
associated columns. Recall that this is the target table of the interface presented earlier in this
lesson.
Because datastores are based on the relational paradigm, it is also possible to associate the
following elements to a datastore:
Keys: A key is a set of columns with a specific role in the relational paradigm. Primary
and alternate keys identify each record uniquely. Non-unique indexes enable optimized
record access.
References: A reference is a functional link between two datastores. It corresponds to a
foreign key in a relational model. For example, the INVOICE datastore references the
CUSTOMER datastore through the customer number.
Conditions and Filters: Conditions and filters are WHERE-type SQL expressions
attached to a datastore. They are used to validate or filter the data in the datastore.
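These relational elements can be illustrated with ordinary DDL. The sketch below uses the CUSTOMER/INVOICE example from the text, with hypothetical column names and SQLite standing in for the data server; it is not OBAW or ODI code.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")

# Key: the primary key identifies each CUSTOMER record uniquely.
conn.execute("""
    CREATE TABLE customer (
        customer_num INTEGER PRIMARY KEY,
        name         TEXT NOT NULL
    )""")

# Reference: INVOICE references CUSTOMER through the customer number.
# Condition: a WHERE-type check validates the data in the datastore.
conn.execute("""
    CREATE TABLE invoice (
        invoice_id   INTEGER PRIMARY KEY,
        customer_num INTEGER REFERENCES customer (customer_num),
        amount       REAL CHECK (amount >= 0)
    )""")

conn.execute("INSERT INTO customer VALUES (1, 'Acme')")
conn.execute("INSERT INTO invoice VALUES (10, 1, 250.0)")

# A row that violates the reference is rejected.
try:
    conn.execute("INSERT INTO invoice VALUES (11, 99, 50.0)")  # unknown customer
    rejected = False
except sqlite3.IntegrityError:
    rejected = True

invoice_count = conn.execute("SELECT COUNT(*) FROM invoice").fetchone()[0]
```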
Recall that a scenario is designed to put a source component (interface, package, procedure,
or variable) into production. A load plan is the largest executable object in Oracle Data
Integrator. It uses scenarios in its steps. When an executable object is used in a load plan, it
is automatically converted into a scenario. For example, a package is used in the form of a
scenario in load plans.
The screenshots in the slide show the prebuilt scenario for the
SDE_ORA_GLREVENUEFACT package that was discussed earlier in this lesson.
The Load Plans and Scenarios panel also displays any load plans that have been generated
in ODI Studio or Configuration Manager. After a load plan is generated, you can update the
scenarios in the load plan. You learn more about generating load plans in the lesson titled
Managing Load Plans and more about updating load plan scenarios in the lesson titled
Building a Category 1 Customization.
Full ETL
Initially, a full load is performed to extract all the required
data and load all the tables in the Oracle Business Analytics
Warehouse.
Incremental ETL
Subsequently, the data warehouse is updated incrementally.
Incremental ETL loads eliminate redundant loads of data. This is significant, because the load
process is time-consuming and resource-intensive.
The Initial Extract Date defines a cut-off so that not all records are loaded into the data
warehouse. You set the Initial Extract Date value for each data source in Oracle BI
Applications Configuration Manager.
An ETL process can extract a record from a single table or from multiple tables. When a
record is the result of joining multiple tables, one of these tables is identified as the base
table, which defines the granularity of the record. When extracting fact records, Oracle BI
Applications compares the Created Date of the base table only with the Initial Extract Date.
Last Extract Date is a value that is calculated based on the last time data was extracted from
that table, less a Prune Days value. Records can be missed in an ETL process when a record
is updated while the ETL process is running but is not committed until after the ETL
completes.
You set the Prune Days parameter value in Oracle BI Applications Configuration Manager.
Setting a small value means the ETL will extract fewer records, thus improving performance;
however, this increases the chances that records are not detected. Setting a large number is
useful if ETL runs are infrequent, but this increases the number of records that are extracted
and updated in the data warehouse. Therefore, you should not set the Prune Days value to a
very large number. A large Prune Days number can also be used to trigger re-extracting
records that were previously processed but have not changed. The value for Prune Days
should never be set to 0.
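The interplay of Initial Extract Date, Last Extract Date, and Prune Days described above can be sketched as follows. This is plain Python with invented record fields, not the actual Oracle BI Applications logic.

```python
from datetime import date, timedelta

def last_extract_cutoff(last_run_date, prune_days):
    # Last Extract Date = date of the previous extraction minus Prune Days,
    # so late-committed records from the previous run are still picked up.
    return last_run_date - timedelta(days=prune_days)

def select_for_extract(records, initial_extract_date, last_run_date, prune_days):
    if last_run_date is None:
        # Full load: the Initial Extract Date cut-off is applied to the base table.
        return [r for r in records if r["created"] >= initial_extract_date]
    # Incremental load: re-extract anything changed since the cut-off.
    cutoff = last_extract_cutoff(last_run_date, prune_days)
    return [r for r in records if r["updated"] >= cutoff]

records = [
    {"created": date(2023, 1, 1), "updated": date(2024, 1, 10)},
    {"created": date(2024, 1, 1), "updated": date(2024, 1, 1)},
]
full = select_for_extract(records, date(2023, 6, 1), None, 5)
incremental = select_for_extract(records, date(2023, 6, 1), date(2024, 1, 8), 5)
```

Note how a larger Prune Days value moves the cut-off back, widening the window of re-extracted records.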
Traditional ETL tools operate by first extracting the data from various sources, transforming
the data in a proprietary, middle-tier ETL engine that is used as the staging area, and then
loading the transformed data into the target data warehouse or integration server. Therefore,
the term ETL represents both the names and the order of the operations performed. The data
transformation step of the ETL process is by far the most compute-intensive, and is performed
entirely by the proprietary ETL engine on a dedicated server. The ETL engine performs data
transformations (and sometimes data quality checks) on a row-by-row basis, and therefore,
can easily become the bottleneck in the overall process. In addition, the data must be moved
over the network twice: once between the sources and the ETL server, and again between
the ETL server and the target data warehouse. Moreover, if you want to ensure referential
integrity by comparing data flow references against values from the target data warehouse,
the referenced data must be downloaded from the target to the engine, thus further increasing
network traffic and download time, and leading to additional performance issues.
In response to the issues raised by ETL architectures, a new architecture has emerged, which
in many ways, incorporates the best aspects of manual coding and automated
code-generation approaches. Known as ELT, this new approach changes where and how
data transformation takes place, and leverages existing developer skills, RDBMS engines,
and server hardware to the greatest extent possible.
The ELT architecture is illustrated in the slide.
Oracle Data Integrator supports both ETL and ELT data integration.
Note: For the sake of continuity, ETL is used throughout this course.
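The architectural difference can be sketched in a few lines, using SQLite as a stand-in target database (the tables and the transformation are invented for illustration): in the ETL style every row passes through the engine, whereas in the ELT style one set-based statement runs inside the RDBMS.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE src (id INTEGER, amount REAL)")
db.execute("CREATE TABLE tgt (id INTEGER, amount_rounded INTEGER)")
db.executemany("INSERT INTO src VALUES (?, ?)", [(1, 10.4), (2, 20.6)])

# ETL style: rows are pulled out, transformed one by one in the engine,
# and pushed back -- two trips over the network in a real deployment.
for row_id, amount in db.execute("SELECT id, amount FROM src").fetchall():
    db.execute("INSERT INTO tgt VALUES (?, ?)", (row_id, round(amount)))
etl_rows = db.execute("SELECT id, amount_rounded FROM tgt ORDER BY id").fetchall()

db.execute("DELETE FROM tgt")

# ELT style: a single set-based statement; the transformation is executed
# by the target database itself, with no middle-tier engine.
db.execute("INSERT INTO tgt SELECT id, CAST(ROUND(amount) AS INTEGER) FROM src")
elt_rows = db.execute("SELECT id, amount_rounded FROM tgt ORDER BY id").fetchall()
```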
SDE routines extract data from sources and load data into
OBAW staging tables.
SIL routines transform data and load data into OBAW fact
and dimension tables.
The figure in the slide shows Source Dependent Extract routines performing full and refresh
extractions from Source 1, Source 2, and Source 3 into OBAW staging tables, followed by
Source Independent Loading into the OBAW.
This slide provides a summary of how the SDE and SIL workflows are used during the ETL
process.
The example in the slide illustrates the ETL architecture. Typically, the extract load transform
process has two main steps. The first step is the extract and stage load step, and the second
step is the load transform step. The extract and stage load step is generated from a
combination of the main interface and the nested temporary interface. The load transform step
is generated as a result of the integration knowledge module (IKM).
In this example, step 1 issues a SQL statement on the source that joins the GL_SET_OF_
BOOKS table with the HR_ORGANIZATION_INFORMATION table. The join is executed on
the source database, and the resulting data is staged. Then, a second join occurs at the load
transform stage between the W_DOMAIN_MAP_G table and the temporary stage table,
which results in the loading of the W_INV_ORG_DS dimension staging table.
Oracle Data Integrator employs a powerful declarative design approach to data integration,
which separates the declarative rules from the implementation details. ODI is also based on a
unique ELT (Extract - Load Transform) architecture, which eliminates the need for a
stand-alone ETL server and proprietary engine, and instead leverages the inherent power of
your RDBMS engines. This combination provides the greatest productivity for both
development and maintenance, and the highest performance for the execution of data
transformation and validation processes.
Data Connectivity
ODI supports all RDBMSs, including all leading data
warehousing platforms (such as Oracle, Exadata, Teradata,
IBM DB2, Netezza, Sybase IQ, and so on) and numerous
other technologies (such as flat files, ERPs, LDAP, XML, and
so on).
Answer: a, b, d
Answer: a, c, d
Answer: c
Answer: c, d
Source instance
The transactional system that serves as the source of data
for the Oracle Business Analytics Warehouse (OBAW)
Offering
A BI Application product that you have purchased
For example, Oracle Financial Analytics, or Oracle Sales
Each of these tools is discussed in detail in the slides that follow. Please note that you cannot
perform functional configuration using either tool until you have run a domain-only load plan.
You learn more about domain-only load plans in Lesson 9, "Managing Load Plans."
Configuration Manager contains the setup objects for OBIA. It provides administrative
graphical user interfaces for setup and configuration. It is the recommended product for
ongoing administration and maintenance of functional setup within OBIA. It also provides a
quick review of OBIA setup values, and is the recommended tool for monitoring and
troubleshooting load plan executions. Configuration Manager works in conjunction with
Functional Setup Manager to provide guided tasks to configure OBIA offerings and functional
areas.
The screenshot shows the main Tasks bar for Configuration Manager and the functional
configuration work area for Manage Data Load Parameters.
The Configuration Manager work area includes:
Tasks bar, which provides links to Configuration Manager options
Work panel, which displays the currently selected option
Perform Functional Configurations option, for starting FSM
Collapse Tasks bar arrow. Use the Collapse Tasks bar arrow to hide the Tasks bar and
maximize the screen area for displaying the setup pages.
Resize bar for the Tasks bar
Expand/Collapse Contextual Pane arrow (for Data Load Parameters and Reporting
Parameters only). Please note: Some pages (for example, the Manage Data Load
Parameters page) have an additional Contextual pane on the right side that can be
expanded (and resized) or collapsed.
FSM is installed and deployed as part of OBIA. In FSM, you select the OBIA offering and
functional areas that you wish to deploy. FSM generates a list of configuration tasks specific
to the offering and functional areas that were selected. These tasks can be assigned to
different functional developers and the status of the OBIA implementation project can be
monitored in FSM. Setup user interfaces guide functional developers through the performance
of each task. The key point here is you should use the guidance presented in Functional
Setup Manager's list of tasks to perform your initial configurations.
This slide lists some of the important functional configuration tasks you perform using
Functional Setup Manager. Each task is discussed in detail in the slides that follow.
In the Tasks bar in Oracle BI Applications Configuration Manager, select the Perform
Functional Configurations link to launch FSM. In FSM, in the Tasks bar, under
Implementations, select the Configure Offerings link to display the Configure Offerings page.
In the example in the slide, you select the Enable for Implementation check box next to
Oracle Financial Analytics. You then expand Oracle Financial Analytics, select the Enable for
Implementation check box next to Oracle E-Business Suite-Oracle Financial Analytics, and
then select the Enable for Implementation check boxes for all the desired functional areas
under Oracle E-Business Suite-Oracle Financial Analytics.
If you do not enable an offering for implementation, you will not be able to configure that
offering using FSM.
The steps to create an implementation project are not displayed in the slide. To create an
implementation project, in the Tasks bar, select Implementations > Manage Implementation
Projects to display the Manage Implementation Projects page. Then choose Actions > Create
to enter a name for the project and select the offering to implement. To make offerings easier
to manage, Oracle recommends that you deploy one offering per project. In other words, if
you are deploying three offerings, then create three implementation projects.
In this example, you have installed Oracle Financial Analytics and you create an
implementation project to configure the ETL for Oracle Financial Analytics. To configure ETL
for Oracle Financial Analytics, you must create at least one implementation project. When you
create an implementation project, you select the offering to deploy as part of that project.
Once you create an implementation project, FSM generates the tasks required to configure
the specified offering. By default, the tasks are assigned to the BI Administrator user. If
required, you can optionally assign tasks to functional developers, who will then perform the
tasks. Use the Go to Task column to complete functional configuration tasks.
The example in the slide shows an implementation project named IP_Financial_Analytics,
configured for the Oracle Financial Analytics offering, with its associated tasks.
In the example in the slide, select the Configure Initial Extract Date task, and then click the
Go to Task icon to display the Task: Configure Initial Extract Date configuration page, which
enables you to complete the task.
When you click Go to Task for an informational task, you display a list of steps that you must
perform externally to FSM. For example, you might need to use Oracle BI EE Administration
Tool to configure a value in the BI metadata repository.
When you have completed the steps listed in the informational task, you must manually set
the status of the task. You can edit the status by clicking the Status icon for the task, or
selecting the task and clicking the Edit Status button on the toolbar (not shown here).
By default, tasks are assigned to the BI Applications Administrator user (weblogic, in this
example). You can assign tasks to functional developers so that functional developers can
configure OBIA offerings. When functional developers log in and display the Assigned
Implementation Tasks tab, they would only see the tasks that have been assigned to them.
When BI Administrators log in and display the Implementation Projects tab, they see all tasks.
In a small deployment project, a single person with BI Applications Administrator privileges
might perform all of the setup and functional configuration tasks for Oracle BI Applications.
Notice that you could also add notes and assign a due date for a task.
In the Tasks bar select Implementation Objects > Manage Offerings and Functional Areas.
From here, you can manage the various offerings and their associated functional areas. For
example, you can drill on an offering to view detailed information about the offering and the
associated functional areas. You can also search by name, offerings, functional area, or
product.
Answer: d
Answer: a, b, c, d
Answer: c
Answer: d
Configuration Data
Domain mappings
Domains are typically located in the source system. For example, in Financial Analytics,
domains store information about the General Ledger accounts. If domains are not available in
a source system, then they can be sourced from a flat file.
The screenshot shows the Domain Mappings tab. You access this tab by selecting Tasks >
Domains Administration > Manage Domain Mappings and Hierarchies in Configuration
Manager.
This tab shows how data fields in the source system map to data fields in the Oracle Business
Analytics Warehouse (OBAW). The domain mappings specify how data in a source system is
extracted and loaded into the OBAW.
For example, data in the source domain Account Employee Size (ACCNT_EMP_SIZE) is extracted and loaded into the target domain Customer Employee Size Category (W_ACCNT_EMPLOYEE_SIZE).
Notice that you can search for domain mappings by source instance, offering, fact group,
dimension group, domain name, and so on.
The screenshot shows the Manage Source Domains page. To navigate to this page, select
Tasks > Domains Administration > Manage Source Domains in Configuration Manager.
Source domains displayed on the Source Domains tab are read-only.
On the Source Domains page, you can select a source domain to view its members in the
lower pane. In the example in the slide, you select the Source AP Transaction Source domain
to view its domain members. Domain members are the permitted values for a source domain.
Notice that you do not have the ability to edit and add domain members in this pane. To
maintain data integrity in Oracle Business Intelligence Applications, some domains have been
designed as non-extensible, and are, therefore, read-only.
The screenshot shows the Manage Warehouse Domains page. To navigate to this page,
select Tasks > Domains Administration > Manage Warehouse Domains in Configuration
Manager.
In the example in the slide, you select the Country warehouse domain to view its warehouse
members. Warehouse members are the permitted values for a warehouse domain.
Notice that, unlike when working with source domain members, you have the ability to edit
and add warehouse members in this pane.
Domain hierarchies are displayed in inverted format. In the example in the slide,
W_COUNTRY is the parent of child W_REGION.
Select a domain mapping in the hierarchy to display domain member mappings in the lower
pane.
Use the field next to Source Domain Members to display mapped, unmapped, or all source
domain members.
Oracle Business Intelligence Applications ships with default domain value mappings that map
the seeded BI Application domain values to the seeded configuration data. If you want to use
these default categories, you do not need to make any changes to these mappings before you
start your ETL processes. The example in the slide shows the member mappings for Account
Employee Size.
If you want to edit a domain member mapping, click the Edit icon to display the Edit Domain
Member Mappings dialog box. Use this dialog box if you want to make changes to default
domain-mapping values. For example, you could change the range values or click Add Range
Domain Member Mapping to create a new range.
You can set up a target domain by using the Batch Edit option
to update multiple target domain members with the same value.
Multi-select one or more rows in the table. Select a value from the Batch Edit drop-down list.
Using batch edit is useful for large domains with many member mappings that require the
same value. In the Tasks bar, click Manage Domains and Mappings, display the Domain
mappings tab, select the Domain that you want to edit, and then click the Edit Domain
Member Mappings icon in the Domain Member Mappings pane to display the Edit Domain
Member Mappings dialog box.
To use batch edit, select one or more rows in the table, select a value from the Batch Edit
drop-down list, and then click Change to apply the value selected in the Batch Edit drop-down
list to all specified members.
In some scenarios, you might not know what target domain member values should be when
you deploy Oracle BI Applications. For example, in Order Management or Supply Chain
Analytics, UOM (Unit of Measurement) is typically not known until deployment time.
You can set up a non-ranged target domain using the Sync To Source option to automatically
synchronize a target domain with values from the source domain. This process inserts new
target members from the source domain and automatically generates 1:1 mappings. This is
useful for large domains with many member mappings that might otherwise take a long time
to set up. Sync To Source is only available for extensible non-ranged domains.
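The Sync To Source behavior described above (insert missing target members, then generate 1:1 mappings without disturbing existing ones) can be sketched as follows. The function name and the UOM sample data are invented for illustration; Configuration Manager performs this against the domain tables, not Python lists:

```python
# Hedged sketch of "Sync To Source" for a non-ranged domain: every
# source member is inserted into the target domain if missing, and a
# 1:1 mapping is generated. Existing mappings are left untouched.

def sync_to_source(source_members, target_members, mappings):
    """Mutates target_members and mappings in place."""
    for member in source_members:
        if member not in target_members:
            target_members.append(member)    # insert new target member
        mappings.setdefault(member, member)  # 1:1 mapping, keep existing

# e.g. UOM codes that are typically unknown until deployment time
source = ["EA", "KG", "LB"]
target = ["EA"]
mapping = {"EA": "EA"}

sync_to_source(source, target, mapping)
print(target)   # all source members now exist as target members
print(mapping)  # each maps 1:1 to itself
```

This is why the option is restricted to extensible non-ranged domains: a ranged domain has no natural 1:1 mapping from a source value to a target bucket.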
To view and manage data load parameters, select Tasks > Data Load Parameters
Administration > Manage Data Load Parameters to open the Manage Data Load Parameters
page in Configuration Manager. Use the Search section to specify source instance, offering,
fact group, and so on. In the example in the slide, data load parameters are shown for the
Oracle Financial Analytics offering.
Data load parameters can be either global or application-specific.
Global parameters apply to all applications and are indicated by the (ab) and globe icon.
Global data load parameters can also be associated with specific fact groups or dimension
groups.
Application-specific parameters apply to specific applications and are indicated by the (ab)
icon. Application-specific data load parameters are always associated with one or more fact
groups or dimension groups.
Some parameters have a warning icon that indicates that this parameter value must be set
before running a full load.
Some parameters have a read-only icon that indicates that this parameter value cannot be
edited.
In the example in the slide, the Slowly Changing Dimension Flag parameter is selected. This
parameter indicates whether the slowly changing dimension type 2 flag is set for dimensions
or dimension groups.
Notice that this is a global parameter and the Global Parameter Value is set to No. If the
Global Parameter that you edit is associated with fact groups or dimension groups, then a
warning message is displayed to verify that you want to update the value for all associated
fact groups and dimension groups. If you click Yes at the warning message, then the values
of all occurrences of the parameter at the group level will be updated to the new value.
In the example in the slide, the Slowly Changing Dimension Flag data load parameter is
selected, and the Group Specific Parameter Values pane is visible. This pane shows the
value of the Slowly Changing Dimension Flag parameter for specific dimension groups within
the Oracle Financial Analytics offering.
In this example, the Business Location Dimension group-specific parameter value is selected.
To edit the group-specific parameter value, click the Edit icon on the toolbar to open the Edit dialog box. Notice that you can change the parameter value by selecting from a list of
values: Yes or No in this example. The fields that are displayed in this dialog box are different,
depending on the type of parameter being edited. For example, the parameter data type might
be Boolean, date, multi-value select list of values, number, single-value select list of values,
string, and so on. It is also possible to edit more than one group-specific parameter value by
using the Edit All icon (two pencils).
In the Tasks bar, select Manage Reporting Parameters to view or edit reporting parameters.
In the example in the slide, the Global tab is selected. Global parameters apply to all
applications. Application-specific parameters apply to specific applications. To edit a reporting
parameter, select the parameter in the parameter list, and then either click the Edit icon or
click the value in the Parameter Value column.
You can export and import setup data for Oracle BI Applications Configuration Manager to:
Make a backup of your configuration settings for security purposes. For example, you
might keep a record of the configuration changes that you have made.
Migrate the Setup Data for Oracle BI Applications Configuration Manager from one
environment to another environment. For example, you might move the configuration
changes that you have made from a Test environment to a Production environment.
In the Tasks bar, select Export Setup Data and then click the Export icon to display the Export
dialog box. Name the export file and use the Export dialog box to specify the setup objects
that you want to export. When you export setup data, you can export only the changes that
you have made to the values of the following objects: data load parameters, domains and
mappings, reporting parameters, and system setups. Unchanged configuration values are not
exported. For example, if you only change the value of DEFAULT_CURRENCY from USD to
Euro and then you export your data, then the export zip file that is produced will contain only
columns for DEFAULT_CURRENCY=Euro. The Export Details pane (in the Export Setup
Data pane) displays the details of the selected export file.
To import setup data, select Tasks > Import Setup Data and follow a process similar to
importing an exported file. When you import setup data from a zip file, you import whatever
configuration changes were exported to that zip file.
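The export rule described above (only values that differ from the shipped defaults are written to the export file) can be sketched as a simple diff. The function name, parameter names, and default values here are fabricated; the real export produces a zip file of setup objects:

```python
# Sketch of changed-values-only export: compare current setup values
# against shipped defaults and keep only the differences.

def export_changes(defaults, current):
    """Return only the settings whose value differs from the default."""
    return {k: v for k, v in current.items() if defaults.get(k) != v}

defaults = {"DEFAULT_CURRENCY": "USD", "GLOBAL_CALENDAR": "Gregorian"}
current  = {"DEFAULT_CURRENCY": "Euro", "GLOBAL_CALENDAR": "Gregorian"}

print(export_changes(defaults, current))  # only DEFAULT_CURRENCY survives
```

Importing such a file then applies exactly those deltas, which is what makes the export safe to move from a Test environment to Production.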
You can use the Reports panes on the Overview page to monitor setup data. For example:
Use the System Setups list to monitor which Offerings have been enabled for
deployment.
Use the Parameters By Offerings report to monitor visually the number of parameters
that have been configured.
Use the Load Plan Executions report to monitor load plans.
Use the Domain Mappings by Offerings report to monitor domain mappings.
You can drill into each report for more detailed information. For example, drilling on the
Parameters bar graph in the Parameters By Offerings report will take you to a list of
parameters by offering. Drilling on the Parameters with no values bar in the graph will open
a page where you can view and edit parameters with unassigned values.
Answer: b
Answer: b, c, d
Answer: b
Batch edit is useful for large domains with many member mappings that require the same
value.
Answer: a, b, c, d
Answer: b, d
A load plan is an executable object that comprises and organizes the child objects (referred to
as steps) that carry out the ETL process. A load plan is made up of a sequence of several
types of steps. Each step can contain several child steps. Depending on the step type, the
steps can be executed conditionally, in parallel or sequentially.
You define a load plan in Oracle BI Applications Configuration Manager by selecting a data
source and one or more fact groups. This selection determines which steps need to be
performed during the ETL process. Each fact group belongs to a specific functional area or
areas that are associated with one or more offerings, which, in turn, are related to a data
server. A transactional data source is associated with one or more data servers.
After you define the load plan, you then generate it to build it in the ODI repository. You then
execute the load plan to perform the ETL process.
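The structure described above (a load plan as a tree of steps, each step serial or parallel, with scenario steps as leaves) can be sketched as a small data structure. The step names below are invented and the `mode` field is a simplification; this is not ODI's internal representation:

```python
# Minimal sketch of a load plan as a tree of steps. Each step runs its
# children serially or in parallel; leaves stand in for scenario steps.

from dataclasses import dataclass, field

@dataclass
class Step:
    name: str
    mode: str = "serial"              # "serial" or "parallel"
    children: list = field(default_factory=list)

plan = Step("Load Plan", "serial", [
    Step("Dimension Group", "parallel", [
        Step("Load GL Account Dimension"),
        Step("Load Employee Dimension"),
    ]),
    Step("Fact Group", "serial", [
        Step("Load AP Transaction Fact"),
    ]),
])

def count_scenario_steps(step):
    """Count the leaves (scenario steps) under a step."""
    if not step.children:
        return 1
    return sum(count_scenario_steps(c) for c in step.children)

print(count_scenario_steps(plan))
```

Selecting a data source and fact groups in Configuration Manager effectively decides which subtrees appear in this hierarchy before the plan is generated into the ODI repository.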
When you generate a load plan, the load plan is built in the ODI repository. A load plan must
be generated successfully before it can be executed. Note: Load plans must be generated
serially or the process will fail. Do not launch a second load plan generation if one is already
underway. You must wait until the first generation process completes before you launch the
next generation process.
To generate a load plan:
1. In the Load Plans master list, select the load plan that you want to generate.
2. In the Load Plans toolbar, click the Generate icon.
3. Use the Generation Status field to monitor progress. Click the Refresh icon to refresh
the display.
4. When the generation process completes, the Succeeded icon is displayed in the
Generation Status field.
You can execute a load plan or schedule it for execution after it has been successfully
generated.
Select the load plan that you want to execute. Click the Execute icon to display the Execute Load Plan dialog box.
You can only execute a load plan if it was successfully generated. You can have separate
load plans for each source, but load plans should not run in parallel.
To execute a load plan:
1. In the Load Plans list, select the load plan that you want to execute.
2. On the Load Plans toolbar, click the Execute icon to display the load plan dialog box.
3. Specify the following information in the load plan dialog box:
- Context: The ODI context to be used when the load plan is run (Note that Global is
the only supported context.)
- Local Agent: The ODI local agent to be used when the load plan is run.
- ODI Work Repository: The name of the ODI Work Repository
4. Use the Execution Status field to monitor execution progress.
Select the load plan that you want to monitor. Click the Show Execution Status Details icon to open ODI Console.
You can monitor a load plan run by viewing the execution status information on the Load Plan
Execution Details page of Oracle BI Applications Configuration Manager.
To view load plan execution details:
1. In the Load Plans master list, select the load plan whose run you want to view.
2. On the Load Plans toolbar, click the Show Execution Status Details icon. The Oracle
Data Integrator Console login screen is displayed (not shown in the slide).
3. Log in to Oracle Data Integrator Console by entering an appropriate user ID and
password.
4. ODI Console is displayed.
Within ODI Console, the navigation pane is displayed in the left pane and the Load Plan
Execution page for the selected load plan is displayed in the right pane. The Load Plan
Execution page displays the load plan execution name and load plan details. You can use the
Load Plan Execution page to view detailed information about the definition and execution
status of the load plan.
Copying a load plan enables you to define a new load plan with
the same fact groups as the selected load plan definition, but
with a different name and identifier.
1. In Configuration Manager, in the Load Plans list, select the load plan that you want to
copy.
2. On the Load Plans toolbar, click the Copy icon to display the Copy Load Plan page.
3. On the first page of the Copy Load Plan series, modify the load plan information.
4. On the second page of the Copy Load Plan series, verify that the same fact groups are
selected.
5. Save the copied load plan.
You can stop a load plan run in ODI Console or ODI Studio.
You can stop a load plan run from the Load Plan Execution page in Configuration Manager
(click Show Execution Status Details on the toolbar) or from ODI Studio. To stop a load plan
run from ODI Studio:
1. In Operator Navigator, select the running or waiting load plan run to stop from the Load
Plan Executions accordion.
2. Right-click the load plan and select Stop Normal or Stop Immediate.
- Stop Normal: In normal stop mode, the agent in charge of stopping the load plan
sends a Stop Normal signal to each agent running a session for this load plan.
Each agent will wait for the completion of the current task of the session and then
end the session in error. Exception steps will not be executed by the load plan
and, once all exceptions are finished, the load plan is moved to an error state.
- Stop Immediate: In immediate stop mode, the agent in charge of stopping the
load plan sends a Stop Immediate signal to each agent running a session for this
load plan. Each agent will immediately end the session in error and not wait for the
completion of the current task of the session. Exception steps will not be executed
by the load plan and, once all exceptions are finished, the load plan is moved to an
error state.
3. In the Stop Load Plan dialog box (not shown here), select an agent to stop the load plan
and click OK.
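The difference between the two stop modes can be sketched as follows. The session model here is invented for illustration (real agents coordinate the stop through the repository); what matters is whether the task currently running is allowed to finish:

```python
# Sketch contrasting Stop Normal and Stop Immediate for one session.
# In both modes the session ends in error; they differ only in whether
# the current task completes first.

def tasks_completed_at_stop(total_tasks, tasks_done, mode):
    """Return how many tasks finish before the session ends in error."""
    if mode == "normal":
        # wait for completion of the current task, then end the session
        return min(tasks_done + 1, total_tasks)
    if mode == "immediate":
        # end immediately; the current task is abandoned
        return tasks_done
    raise ValueError(f"unknown stop mode: {mode}")

# A session with 3 tasks, currently running task 2:
print(tasks_completed_at_stop(3, 1, "normal"))     # current task finishes
print(tasks_completed_at_stop(3, 1, "immediate"))  # current task abandoned
```

Either way the load plan moves to an error state once exception handling finishes, which is why the choice of mode only affects how abruptly in-flight work is cut off.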
When you run ETL to load data from a source system into the Oracle Business Analytics
Warehouse (OBAW), you may need to restart the ETL load after a failure. Examples of
circumstances and reasons for load plan failure include:
Problem with access either to the source or target database due to network failure or
expired or otherwise incorrect usernames and passwords
Failure of the ODI agent
Problem with space or storage (For example, you are able to connect to the source or
target database, but the query fails to run due to lack of temp space, disk space, and so
on. For files, it could be due to inadequate space where the file needs to be placed.)
Problem with data (for example, incorrect data with lengths larger than the target column
can hold, or null values in Not Null columns)
After such a failure, rather than restarting the entire load plan (which would require inefficient re-runs of all ETL tasks), you must restart the load from the same point in its execution once the cause of the failure has been diagnosed and resolved.
When you restart a load plan after a failure, you may not
be able to restart from the exact point of failure.
To maintain data integrity in the case of restart, the grain
will vary depending on:
The location in the step hierarchy of the failed step
The Restart setting for the step
When you restart a load plan after a failure, you may not be able to restart from the exact
point of failure, depending on where it occurred and on dependencies between load plan
steps. The goal of restartability is that the result of the load plan execution is the same
regardless of any load plan failure.
To maintain data integrity in the case of restart, the grain varies depending on the location in
the step hierarchy of the failed step and on the restart setting for the step.
Within the Steps Hierarchy, you can view the restart setting of a step in the Restart column.
The default settings for different steps in the hierarchy support data integrity in restarts:
Root steps are set to Restart from Failure if serial and Restart from failed Children if
parallel.
Substeps are set to Restart from Failure if serial and Restart from failed Children if
parallel.
Scenario steps are set to Restart from Failed Step.
Serial Load Plan Step
When running a serial step to load a dimension group with multiple serial substeps loading individual dimensions, the load plan, on restart, starts from the individual substep that failed. Any successfully completed serial substeps are not run again.
Parallel Load Plan Step
Parallel steps are represented by a horizontal icon in the load plan steps hierarchy and, by
default, have a restart setting of Restart from Failed Children. In a typical run, a parallel step
with five parallel substeps under it has all five substeps executed in parallel, subject to free
sessions being available. If two of those five substeps complete and then the load plan fails, all the substeps that did not complete or that failed are started again when the load plan is restarted.
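The two restart grains just described can be sketched side by side. The status values and function are invented for illustration; ODI tracks step state in the repository, not in a list:

```python
# Sketch of restart grains: a serial step resumes at the first substep
# that is not done and continues from there; a parallel step re-runs
# only the substeps that did not succeed, wherever they sit.

def substeps_to_run(statuses, mode):
    """statuses: per-substep state, 'done', 'failed', or 'not_run'."""
    if mode == "serial":
        for i, status in enumerate(statuses):
            if status != "done":
                return list(range(i, len(statuses)))
        return []
    if mode == "parallel":
        return [i for i, s in enumerate(statuses) if s != "done"]
    raise ValueError(f"unknown mode: {mode}")

print(substeps_to_run(["done", "failed", "not_run"], "serial"))
print(substeps_to_run(["done", "failed", "done", "not_run"], "parallel"))
```

Note the asymmetry: in the parallel case a completed substep sitting between two failed ones is skipped, whereas a serial step never needs that distinction because failure halts everything after it.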
You can use ODI Studio or ODI Console to restart a load plan.
The slide shows how to restart a load plan using ODI Console.
1. In ODI Console, navigate to Runtime > Sessions/Load Plan Executions and select the
load plan execution that has failed.
2. Click the Restart button. The Restart button is displayed, and the restart option enabled, only for the most recent run of the load plan. A load plan can be restarted any number of times, and each time it progresses from the last failure.
3. A new instance of the load plan is generated.
4. Monitor the load plan.
It is also possible to restart a session. To avoid restarting the entire load plan after a failure,
which would require inefficient re-runs of all ETL tasks, you can restart the load from the same
point in its execution once the cause of failure has been diagnosed and resolved.
You can use ODI Studio or ODI Console to restart a session. The slide shows how to restart a
session using ODI Console.
1. In ODI Console, navigate to Runtime > Sessions/Load Plan Executions and select the
session that has failed.
2. Click the Restart button. The Restart button is displayed only when the selected session
has failed.
A load plan must be restarted when it has stopped with an error. An alternate case where
restart may be required is when a load plan is not doing anything at all (for example, when a
load plan is executed and nothing has changed after 30 minutes). Use the following checklist
to assist in troubleshooting a nonresponsive load plan:
1. Check the maximum number of sessions set to run against the agent. In ODI Operator,
verify that the number of sessions running is equal to the maximum. If so, then the other
sessions are waiting for the running sessions to complete. Proceed to the next step.
2. Clean out stale sessions. Stale sessions are sessions that are incorrectly left in a
running state after an agent or repository crash. If an agent crashes or loses its
connection to the repository after it has started a session, it is not able to update the
status of the session in the repository, and such a session becomes stale. Until the stale
session is cleaned, it shows up as running in the repository but actually is not.
3. Check whether the agent is alive. To test the agent to see whether it is running and still
has a connection to the repository, open it in the Topology Navigator in ODI Studio and
select the Test tab. If the agent test fails, restart the agent after fixing the issue.
4. Verify that the ODI repository and the server hosting it are running and have not
experienced a failure.
5. If your load plan is in error and you have verified all of the above, restart the load plan.
In most cases, the load plan restart method described earlier in this lesson is the
recommended approach. This approach ensures data integrity and leaves no scope for
manual error. However, at times you may want to run a load plan step manually. For example,
if a step is inserting duplicate records that are causing failure, rerunning the step would still
insert duplicates. In such a case, you may need to manually correct the data outside of the
load plan and then skip that step when you restart the load plan.
For this kind of situation, you can use the Mark as Complete option. When you mark a load
plan step as complete, it ensures that when the load plan is restarted, the marked step is not
executed. It is then the responsibility of the person making this setting to ensure that the load
for that step is carried out outside the load plan.
To mark a step as complete, right-click the step and select Mark as Complete. This can be
done at the scenario step or at any step higher than that. Marking a step complete at a higher
level in the step hierarchy means that none of the child steps under that parent step is
executed upon load plan restart, even if it is otherwise eligible. For this reason, marking a step
as complete should be treated as an advanced task and must be done only with a full
understanding of its impact. There is no single recommendation that pertains in all cases, so
the setting must be done carefully and only on a case-by-case basis.
You can fix a failed scenario step and run it individually outside
the load plan to complete the load.
When you are monitoring a load plan, you may not know how to completely fix a scenario-step failure, but may wish to use the Mark as Complete option for the failed scenario step
instead of waiting for complete resolution. This prevents a step failure from precluding an
entire load plan from completing, while allowing you to inform the ETL team about the failed
scenario step so that they can work on a solution. The ETL team might then fix the scenario
and want to run it stand-alone outside the load plan to complete the load. You can use ODI
Studio or ODI Console to run a stand-alone scenario. The slide shows how to run a stand-alone scenario by using ODI Console.
1. In ODI Console, navigate to Runtime > Scenarios/Load Plan Executions and select the scenario.
2. Click the Execute button. You can also right-click the scenario and select Execute.
As in marking a step as complete, running a stand-alone scenario should be treated as an
advanced task and the person running the scenario must be aware of the following:
A scenario run outside of a load plan by itself invokes the Table Maintenance process.
This could, depending on the setting, truncate the table before the load.
A scenario step could have many variable values set, either dynamically in the case of a
refresh variable or explicitly by overriding its value at that scenario step in the load plan.
When running a scenario outside the load plan, all the scenario variables would have
only their default values. For this reason, care should be taken to set the variables
appropriately before calling a scenario from outside the load plan.
Managing Load Plans: Fact Groups Tab
Use this tab to view the fact groups associated with a load plan
selected in the Load Plans list.
Use this tab to view the fact groups associated with a load plan selected in the Load Plans
list. The fact groups displayed may belong to a hierarchy of fact groups. You can expand the
fact group node to view the hierarchy. If a fact group is a child of another fact group in a
hierarchy, it appears twice in the tree table, because it is associated with both the functional
area and the parent fact group.
Use this tab to view and edit the data load parameters associated with a load plan selected in
the Load Plans list. The Data Load Parameters list includes both application-specific and
global parameters. Application-specific parameters are associated with one or more fact
groups included in the load plan definition. Global parameters apply to all applications and
can also be associated with specific fact groups. Key points to note about the Data Load
Parameters tab:
If a listed parameter requires a value but a value has not been assigned, the respective
row in the table is tagged with an error icon. Parameters that do not require a value
(value can be null) are not tagged even if no value has been assigned.
You can filter the list of parameters to display only the data load parameters that have
no value by using the Show drop-down list in the toolbar.
You can export and save content displayed in the table to a Microsoft Excel formatted
file by clicking the Export icon on the toolbar.
You can change a parameter value by selecting the parameter in the list, and then
clicking the Edit icon on the toolbar. The Edit Parameter Value dialog box is displayed.
To change a parameter value, the user must have been assigned a role that has the
appropriate privilege.
Use the Domains and Mappings tab to view and edit domains and mappings related to a load
plan selected in the Load Plan list. The domains and mappings are associated with the fact
group included in the load plan definition. Key points to note about the Domains and
Mappings tab:
If a source domain in the list contains members that have not been mapped to an
appropriate warehouse domain member, the row in the table is tagged with an error
icon. Some source domain members are not applicable, and, therefore, are not tagged
even if they are unmapped.
You can filter the list of mappings to display only the domains that have unmapped
source members using the Show drop-down list in the toolbar.
You can export and save content displayed in the table to a Microsoft Excel formatted
file by clicking the Export icon on the toolbar.
You can change a domain mapping by selecting the mapping in the list and then clicking
the Edit icon on the toolbar. The Edit Domain Member Mappings dialog box is displayed.
To change a domain member mapping, the user must have been assigned a role that
has the appropriate privilege.
Use the Schedules tab to view, create, edit, and delete schedules for the execution of a load
plan. A load plan schedule includes the following required properties:
Context: The ODI context to be used when the load plan is run (Note that Global is the
only supported context.)
Logical Agent: The ODI Agent to be used when the load plan is run
Recurrence: The frequency of occurrence
Status: The status of the schedule
Scheduled Time: The date and time the load plan is to be executed
Select Actions > Execute Reset Data Warehouse Scenario. This command resets the data
warehouse by truncating the W_ETL_LOAD_DATES table. This ensures that the subsequent
load will truncate all target tables and do a fresh full load.
In the Execute Reset Data Warehouse Scenario Dialog, set the Context, Agent, and ODI
Work Repository and click OK.
Answer: c
Answer: a
When you restart a load plan after a failure, you must restart
from the exact point of failure.
a. True
b. False
Answer: b
When you restart a load plan after a failure, you may not be able to restart from the exact
point of failure, depending on where it occurred and on dependencies between load plan
steps. The goal of restartability is that the result of the load plan execution is the same
regardless of any load plan failure.
Answer: a, b, d, e
Answer: c, d
[Figure: customization categories. Packaged data sources (for example, Oracle E-Business
Suite) and nonpackaged data sources feed the Analytics Warehouse; Category 1 adds
additional columns, Category 2 adds additional tables, and Category 3 adds additional rows
from nonpackaged data.]
The type of data source that you have determines the type of customization that you can do.
Data sources can be one of the following types:
Packaged applications (for example, Oracle E-Business Suite), which use prepackaged
adapters
Nonpackaged data sources, which use the Universal adapter
Customizations are grouped into the following categories:
Category 1: In a Category 1 customization, you add additional columns from source
systems that have prepackaged adapters and load the data into existing Oracle
Business Analytics Warehouse tables.
Category 2: In a Category 2 customization, you use prepackaged adapters to add new
fact or dimension tables to the Oracle Business Analytics Warehouse. Category 2
customizations normally require that you build new source-dependent extract (SDE) and
source-independent load (SIL) mappings.
Category 3: In a Category 3 customization, you use the Universal adapter to load data
from sources that do not have prepackaged adapters.
This lesson and the next two lessons focus on Category 1 and Category 2 customizations.
For more information about Category 3 customizations, refer to the Oracle Fusion Middleware
Administrator's Guide for Oracle Business Intelligence Applications.
Category 1 customizations involve extracting additional columns from source systems for
which prepackaged adapters are included (for example, Oracle E-Business Suite) and loading
the data into existing Oracle Business Analytics Warehouse tables.
For Category 1 customizations, data can also come from nonpackaged sources, but this
assumes that the sources have already been mapped with a Universal adapter and only need
to be extended to capture additional columns.
In the example in the slide, you would use the DESCRIPTION column in the HZ_LOCATIONS
table in the source system to capture data related to locations. You would then run a load plan
to load the data into a custom column, X_DESCRIPTION, in the W_GEO_DS dimension staging
table in the data warehouse, and ultimately into a custom column, X_DESCRIPTION, in the
W_GEO_D dimension table in the data warehouse.
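The Category 1 flow above can be sketched as two tiny transformation steps. This is a hedged illustration only (the row dicts stand in for table rows; real SDE/SIL mappings are ODI interfaces, not Python), showing how the source column is carried through staging into the warehouse dimension.

```python
# Illustrative model of the Category 1 data flow: source column ->
# staging custom column -> dimension custom column. Not ODI-generated code.

def sde_extract(source_row: dict) -> dict:
    """SDE step: HZ_LOCATIONS source row -> W_GEO_DS staging row."""
    return {"X_DESCRIPTION": source_row["DESCRIPTION"]}

def sil_load(staging_row: dict) -> dict:
    """SIL step: W_GEO_DS staging row -> W_GEO_D dimension row."""
    return {"X_DESCRIPTION": staging_row["X_DESCRIPTION"]}
```

The point of the sketch is that the custom column must exist, and be mapped, at every hop; skipping the staging mapping breaks the chain.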
In order to see additional columns in the Oracle Business Analytics Warehouse, the columns
must first be passed through the ETL process. The existing mappings and tables are
extensible. Oracle Business Intelligence Applications provides a methodology to extend
preconfigured mappings to include these additional columns and load the data into existing
tables.
Oracle Business Intelligence Applications recognizes two types of customization: extension
and modification. The supported extension logic allows you to add to existing objects. For
example, you can extract additional columns from a source, pass them through existing
mappings, and populate new columns added to an existing table.
Generally, Oracle Business Intelligence Applications does not allow you to modify existing
logic or columns. You should copy existing logic to custom folders, and then modify it. You
should not change existing calculations to use different columns, and you should not remap
existing columns to be loaded from different sources.
Customizations run
parallel to existing logic.
Most datastores have a single placeholder column named X_CUSTOM. Each ETL task has
mapping expressions to populate this column, which marks a safe path through the ETL task.
These serve as templates for customizing ODI datastores and interfaces. When creating new
custom columns, follow the naming convention of including the X_ prefix to help distinguish
custom columns.
In the figure in the slide, the preconfigured logic is shaded in gray. You should not modify
anything contained within these objects. You should add customizations to existing objects
rather than creating new packages and interfaces, which allows them to run parallel to the
existing logic.
The most common reason for extending the Oracle Business Analytics Warehouse is to
extract existing columns from a source system and map them to an existing Oracle Business
Analytics Warehouse table (either fact or dimension). This type of change typically requires
you to extend the interfaces within an SIL package. If the data is coming from a packaged
source, then you will also need to extend the interfaces within an appropriate SDE adapter
package. If the data is coming from a nonpackaged source, then you must use a Universal
adapter package. If an appropriate package does not already exist, you will need to create a
Universal adapter package with interfaces.
This slide and the next slide list the typical steps needed for a Category 1 customization. The
steps are provided here to give you a high-level overview of the Category 1 customization
process. Do not be concerned with the step details at this point. These steps are covered in
more detail in Lesson 11: Building a Category 1 Customization.
In a Category 2 customization, you use prepackaged adapters to add new fact or dimension
tables to the Oracle Business Analytics Warehouse. Category 2 customizations normally
require that you build new SDE and SIL mappings.
A typical Category 2 customization involves building entirely new tables that will be loaded
with data from a source table that is not already extracted from. For example, you might want
to create a new PARTNER dimension table. In this case, you create new dimension and
staging tables as well as new extract and load ETL mappings.
In the example in the slide, you would use the PARTNER table in the source system to capture
data related to partners. You would then load the data into a new dimension staging table and
subsequently a new dimension table in the data warehouse.
When you create a new dimension or fact table, use the required system columns that are
part of each of the Oracle Business Analytics Warehouse tables to maintain consistency and
enable you to reference existing table structures.
For staging tables, the following columns are required:
INTEGRATION_ID: Stores the primary key or the unique identifier of a record as in the
source table
DATASOURCE_NUM_ID: Stores the data source from which the data is extracted
For dimension and fact tables, the required columns are the INTEGRATION_ID and
DATASOURCE_NUM_ID columns as well as the following:
ROW_WID: A sequence number generated during the ETL process, which is used as a
unique identifier for the Oracle Business Analytics Warehouse
ETL_PROC_WID: Stores the ID of the ETL process information
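The required system columns listed above can be captured in a small checklist helper. This is an illustrative sketch (the function and table-type names are hypothetical); it simply validates a proposed column list against the requirements stated in the text.

```python
# Required OBAW system columns per table type, as listed above.
REQUIRED_COLUMNS = {
    "staging":   ["INTEGRATION_ID", "DATASOURCE_NUM_ID"],
    "dimension": ["INTEGRATION_ID", "DATASOURCE_NUM_ID", "ROW_WID", "ETL_PROC_WID"],
    "fact":      ["INTEGRATION_ID", "DATASOURCE_NUM_ID", "ROW_WID", "ETL_PROC_WID"],
}

def missing_required(table_type: str, columns: list) -> list:
    """Return the required system columns absent from a proposed table."""
    return [c for c in REQUIRED_COLUMNS[table_type] if c not in columns]
```

For example, a staging table defined with only INTEGRATION_ID would be flagged as missing DATASOURCE_NUM_ID.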
This slide lists the typical steps needed for a Category 2 customization where a new
dimension table is added to the OBAW. The steps are provided here to give you a high-level
overview of the Category 2 customization process. Do not be concerned with the step details
at this point. These steps are covered in more detail in Lesson 12: Building a Category 2
Customization.
In some cases, you must reapply a customization to an object that has been patched. For
example, if you install an Oracle Business Intelligence Applications patch that modifies the
Supply Chain and Order Management application, you might need to manually reapply
customizations that you have made to the Supply Chain and Order Management application.
As part of customizing an ETL task (including the interfaces and packages under a specific task
folder), you copy the task folder to be customized, version the original, and version the copy.
Any patches are applied to the current version of the original task. Leverage ODI's version-
compare utility to identify the changes introduced by the patch.
The copy is also versioned so that any changes introduced can be isolated. After a patch,
compare any changes with those introduced by the patch and verify that there is no conflict,
and then manually apply the same changes introduced by the patch to the customized ETL
tasks.
A patch only installs changed repository objects, not the entire ODI Work Repository.
Therefore, you only need to reapply customizations to mappings that have been changed by
the patch.
For example, if a patch only modifies the Supply Chain and Order Management application,
you only need to manually reapply customizations that you have made to the Supply Chain
and Order Management application. Customizations in other applications would not be
affected by the patch.
Custom folder
for SIL objects
You use ODI Studio to customize ETL objects. If you want to make changes to preconfigured
ODI objects, create a custom folder and make the changes in it. Do not change objects in any
of the preconfigured folders unless explicitly directed by Oracle. This is because
preconfigured folders and the objects within them may be overwritten in future upgrades.
Using custom folders is not required, but it is strongly recommended and considered a best
practice to make identifying customized content easier.
The preconfigured ODI repository does not include any custom folders. You must create your
own. You should create a custom folder for each prepackaged SDE Adaptor folder that you
have deployed that will have customizations. In the example in the slide, a custom folder is
created for the SDE_ORA11510_Adaptor folder.
You should also create a separate custom folder for customizations that you want to make to
objects in the SILOS folder. Do not store customized SDE and SIL objects in the same folder.
In the example in the slide, a CUSTOM_SILOS folder is created to hold custom SIL objects.
For loading new fact and dimension tables, design a custom process on the source side to
detect the new and modified records. The SDE process should be designed to load only the
changed data (new and modified). If the data is loaded without the incremental process, the
data that was previously loaded will be erroneously updated again.
For example, the logic in the preconfigured SIL mappings looks up the destination tables
based on the INTEGRATION_ID and DATASOURCE_NUM_ID and returns the ROW_WID if
the combination exists, in which case it updates the record. If the lookup returns NULL, it
inserts the record instead. In some cases, the last update dates stored in the target tables are
also compared, in addition to the columns specified above, to determine whether to insert or
update. Look at similar mappings in the preconfigured folder for more details.
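The insert-or-update decision described above can be modeled as follows. This is a minimal sketch, not the actual ODI-generated code: the target table is modeled as a dict keyed by (INTEGRATION_ID, DATASOURCE_NUM_ID), and the "lookup" is a key test.

```python
# Hedged model of the preconfigured SIL lookup logic: if the
# (INTEGRATION_ID, DATASOURCE_NUM_ID) combination already exists in the
# target, the record is updated; if the lookup returns nothing, a new
# ROW_WID is generated and the record is inserted.

def apply_record(target: dict, record: dict, next_wid: list) -> str:
    key = (record["INTEGRATION_ID"], record["DATASOURCE_NUM_ID"])
    if key in target:
        return "update"          # lookup found a ROW_WID
    next_wid[0] += 1             # generate the next surrogate key
    target[key] = next_wid[0]
    return "insert"              # lookup returned NULL
```

Running the same source record twice yields one insert followed by one update, which is exactly why loading without the incremental SDE process would redundantly update previously loaded data.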
Name all the newly created tables with the prefix WC_.
This helps to visually isolate the new tables from the shipped
tables.
Keep good documentation of the customizations done.
This helps when upgrading your data warehouse.
Answer: c
Answer: c
Answer: c
Answer: a, c
Practice
A Category 1 customization is one of the three customization types that were initially
presented in the previous lesson titled Customizing the Oracle Business Analytics
Warehouse. Customizations are categorized based on the data source (packaged or
nonpackaged) and the desired Oracle Business Analytics Warehouse (OBAW) modification
(additional columns, tables, or rows). Category 1 customizations involve extracting additional
columns from source systems that are already mapped and loading the data into existing data
warehouse tables.
This slide presents the scenario for a Category 1 customization used in this lesson. Data is
extracted from a table in a source transactional database and loaded into a custom column in
a dimension table in the Oracle Business Analytics Warehouse. This scenario is used
throughout the lesson and associated practices.
This slide and the next list the steps to perform a Category 1 customization. Each step is
presented in detail in the slides that follow.
Custom folder
for SIL objects
If you want to make changes to preconfigured ODI objects, you must create a custom folder
and make the changes in it. Do not change objects in any of the preconfigured folders unless
explicitly directed by Oracle. This is because preconfigured folders and the objects within
them may be overwritten in future upgrades.
The preconfigured ODI repository does not include any custom folders. You must create your
own. You should create a custom folder for each prepackaged SDE Adaptor folder you have
deployed that will have customizations. In the example in the slide, a custom folder is created
for the SDE_ORA11510_Adaptor folder.
You should also create a separate custom folder for customizations that you want to make to
objects in the SILOS folder. Do not store customized SDE and SIL objects in the same folder.
The customization steps in the slides that follow use SDE objects as examples. The steps
apply to SIL objects as well.
Before you begin customization, enable versioning for the preconfigured task folder to be
customized. The version comment should indicate that this is the base (original) version of the
task. Subsequent patches applied to this task in the future would require increasing the
version in the comment so that it can be compared to the original task to identify any changes.
To create a version, right-click the task folder and select Version > Create Version to open the
Version dialog box. In the example in the slide, a version number and description have been
enabled for the SDE_ORA_GeographyDimension_HZLocations task folder.
Duplicate the task folder to be customized by copying it. Paste the copied task folder to the
custom folder, and rename it by removing the 'Copy of' prefix.
In this example in the slide, you copy the preconfigured task folder,
SDE_ORA_GeographyDimension_HZLocations, in the preconfigured adaptor folder,
SDE_ORA11510_Adaptor, and then paste the copied task folder to the
CUSTOM_SDE_ORA11510_Adaptor folder. This creates a new task folder named Copy of
SDE_ORA_GeographyDimension_HZLocations, which you rename to
SDE_ORA_GeographyDimension_HZLocations.
Before you begin customization, enable versioning of the copied task folder to be customized.
The version comment should indicate that this is the original version. This versioning enables
comparison of the customized task to a copy of the original version to determine all changes
that have been introduced.
Create another version of the copied task so that it can be compared to the original
task to identify any changes. The version comment should indicate that this is the customized
version. To enable versioning, right-click the task folder and select Version > Create Version
to open the Version dialog box.
In the example in the slide, a version number and description have been enabled for both the
original version and the customized version of the
SDE_ORA_GeographyDimension_HZLocations task folder in the
CUSTOM_SDE_ORA11510_Adaptor folder.
Version the model in which the datastores to be customized exist. Submodels and datastores
cannot be versioned. The version comment should indicate that this is the base or original
version.
Create another version of the model, with a version comment indicating that this is where
customizations are introduced. The models can now be compared to show differences. If the
model ever needs to be patched, the model should be versioned again so that the patched
version can be compared to the custom and original version.
The example in the slide shows navigation to ODI Designer > Models > Oracle BI
Applications (folder) > Oracle BI Applications (model). A version number and description have
been enabled for both the original version and the customized version of the Oracle BI
Applications model. This model has the datastores that you will customize in a later step in
this lesson.
Before you apply customizations to a task, edit the target datastores to include the required
column. In the example in the slide, you navigate to ODI Designer > Models > Oracle BI
Applications (folder) > Oracle BI Applications (model) > Dimension Stage and then edit the
W_GEO_DS dimension staging table to include the custom column, X_DESCRIPTION.
Some task folders have multiple interfaces, each of which must be customized with new
mappings. The example in the slide shows how to map the custom column,
X_DESCRIPTION, in the W_GEO_DS interface for the
SDE_ORA_GeographyDimension_HZLocations task folder. This assumes that a similar
mapping has already been completed for the W_GEO_DS_SQ_HZ_LOCATIONS interface,
which contains the source, SQ_HZ_LOCATIONS, for the W_GEO_DS interface.
To map the custom column in the W_GEO_DS interface, perform the following steps:
1. Expand Projects > BI Apps Project > Mappings > CUSTOM_SDE_ORA11510_Adaptor
> SDE_ORA_GeographyDimension_HZLocations > Interfaces.
2. Double-click the W_GEO_DS interface to open it in the editor.
3. Click the Mapping tab at the bottom of the editor.
4. Drag the DESCRIPTION column from the HZ_LOCATIONS source datastore to the
custom column, X_DESCRIPTION, in the W_GEO_DS target datastore.
Notice that the mapping indicates both the table and the column from which it comes:
SQ_HZ_LOCATIONS.DESCRIPTION.
Procedure
Navigate to the location in Designer where the DDL procedure is stored and execute the
procedure. Use the Operator tab to monitor the procedure and verify that it completes
successfully. Use a SQL query tool to confirm that the physical tables are modified as
expected. In the example in the slide, the X_DESCRIPTION custom column has been added
to both the W_GEO_DS dimension staging table and the W_GEO_D dimension table in the data
warehouse.
Prior to generating scenarios, ensure that the Scenario Naming Convention user parameter
has a value of %FOLDER_NAME(2)%_%OBJECT_NAME%. This ensures that generated
scenarios are easily identified as custom scenarios because the custom folder is included in
the naming convention.
1. In ODI Studio, select ODI > User Parameters.
2. Scroll to locate the Scenario Naming Convention parameter.
3. Change the value to %FOLDER_NAME(2)%_%OBJECT_NAME% from the default value
%OBJECT_NAME%.
Newly generated scenarios now will be named according to the modified Scenario Naming
Convention user parameter.
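The effect of the naming-convention change can be sketched as a simple name builder. This is illustrative only; the exact substitution semantics of ODI's %FOLDER_NAME(2)% token (the parenthesized argument selects which ancestor folder is used) are defined by ODI, not by this sketch.

```python
# Hypothetical model of the Scenario Naming Convention change: with the
# default %OBJECT_NAME%, only the object name is used; with
# %FOLDER_NAME(2)%_%OBJECT_NAME%, the folder name is prefixed, making
# custom scenarios easy to identify. Scenario names are uppercased.

def scenario_name(folder: str, obj: str, include_folder: bool) -> str:
    name = f"{folder}_{obj}" if include_folder else obj
    return name.upper()
```

With the folder prefix enabled, a scenario generated in a CUSTOM_ folder carries that folder name, so custom scenarios stand out in the scenario list.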
When a component is finished and tested, you can generate the scenario corresponding to its
actual state. Generating a scenario for an object compiles the code for this object for
deployment and execution in a production environment. When a set of packages, interfaces,
procedures, and variables grouped under a project or folder is finished and tested, you can
generate a group of scenarios.
Generate scenarios for any new custom adaptors, using the option to generate the scenario
as if all underlying objects were materialized. The scenario will be generated reflecting the
custom adaptor name. In the future, if you make changes to any of the interfaces or the
package, you can either regenerate the existing scenario or generate a new scenario.
The example in the slide shows how to create a group of scenarios for the objects in the
SDE_ORA_GeographyDimension_HZLocations custom task folder.
In the Scenario Generation dialog box, select the Creation generation mode. This creates for
each object a new scenario with the same name as the last scenario version and with an
automatically incremented version number. If no scenario exists for an object, a scenario
named after the object with version number 001 is created. Select Generate scenario as if all
underlying objects are materialized. In the Objects to Generate section, select Packages,
Interfaces, and Procedures. Scenarios are generated for all underlying objects with the
naming convention set in User Parameters.
Use the techniques you learned in Lesson 9, Managing Load Plans, to create and generate
a load plan. In the example in the slide, the GL Revenue SDE Custom load plan has been
generated successfully.
Open the generated load plan in the Designer editor and use the search field to locate the
step that you want to update. In this example, navigate to Designer > Load Plans and
Scenarios > Generated Load Plans and open the GL Revenue SDE Custom load plan. Use
the Search field to locate the step
SDE_ORA11510_ADAPTOR_SDE_ORA_GEOGRAPHYDIMENSION_HZLOCATIONS.
Use the techniques you learned in Lesson 9, Managing Load Plans, to execute and monitor
the load plan. In the example in the slide, the GL Revenue SDE Custom load plan has been
executed successfully.
Answer: b
Answer: a, b, c
Answer: b
You should create a separate custom folder for customizations that you want to make to
objects in the SILOS folder. Do not store customized SDE and SIL objects in the same folder.
Answer: c
A Category 2 customization is one of the three customization types that were initially
presented in Lesson 10, Customizing the Oracle Business Analytics Warehouse.
Customizations are categorized based on the data source (packaged or nonpackaged) and
the desired Oracle Business Analytics Warehouse (OBAW) modification (additional columns,
tables, or rows). Category 2 customizations involve using prepackaged adapters to add new
fact or dimension tables to the Oracle Business Analytics Warehouse.
This slide presents a scenario and example for the Category 2 customization used in this
lesson. Data is extracted from a table in the source transactional database, loaded into a new
dimension staging table, and ultimately loaded into a new dimension table in the Oracle
Business Analytics Warehouse. This example is used throughout the lesson and associated
practices.
This slide lists the steps to perform a Category 2 customization. Each step is presented in
detail in the slides that follow.
In this example, you run a DDL script to manually create a new dimension table,
WC_PARTNER_D, and a new dimension staging table, WC_PARTNER_DS, in the data
warehouse based on the standard data warehouse structure with the required system
columns.
When creating a new custom table, use the prefix WC_ to help distinguish custom tables from
tables provided by Oracle as well as to avoid naming conflicts in case Oracle later releases a
table with a similar name.
Notice in the example in the slide that the dimension staging table contains the required
columns: DATASOURCE_NUM_ID and INTEGRATION_ID. The dimension table contains
these two required columns as well as the required columns ETL_PROC_WID and ROW_WID.
INTEGRATION_ID stores the primary key or the unique identifier of a record in the
source table.
DATASOURCE_NUM_ID stores the data source from which the data is extracted.
ETL_PROC_WID stores the ID of the ETL process information.
ROW_WID is a sequence number generated during the ETL process, which is used as a
unique identifier for the Oracle Business Analytics Warehouse.
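A hedged sketch of assembling the DDL for the new custom tables follows. The business columns shown (for example, PARTNER_NAME) are hypothetical, and real OBAW tables carry additional system columns beyond those listed; the sketch only demonstrates the WC_ prefix and required system columns described above.

```python
# Illustrative DDL assembly for the custom staging table. Column lists
# are assumptions for the sketch, not the actual WC_PARTNER_DS layout.

def make_ddl(table: str, business_cols: dict, system_cols: dict) -> str:
    """Combine business and required system columns into a CREATE TABLE."""
    cols = {**business_cols, **system_cols}
    body = ",\n  ".join(f"{name} {dtype}" for name, dtype in cols.items())
    return f"CREATE TABLE {table} (\n  {body}\n)"

staging_ddl = make_ddl(
    "WC_PARTNER_DS",
    {"PARTNER_NAME": "VARCHAR2(100)"},          # hypothetical business column
    {"INTEGRATION_ID": "VARCHAR2(80)",          # required system columns
     "DATASOURCE_NUM_ID": "NUMBER(10)"},
)
```

The same pattern extends to WC_PARTNER_D by adding ROW_WID and ETL_PROC_WID to the system columns.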
5. Click the Reverse Engineer button to start a session that imports the tables into ODI.
In the set of practices for the previous lesson, Building a Category 1 Customization, you
manually created columns in ODI and then generated DDL scripts to define the columns in
the data warehouse. In this step, you use a different technique. You manually defined the
tables in the database in the previous step, and now you import the table definitions into ODI
using reverse engineering.
When you use the reverse engineering technique, the imported tables are automatically
placed in the Other submodel and must be moved into the appropriate submodels. In the
example in the slide, the WC_PARTNER_DS dimension staging table is moved from the Other
submodel to the Dimension Stage submodel in the Oracle BI Applications model.
Note: The specific submodel that a table belongs to drives the table maintenance behavior.
For example, tables in the Dimension Stage submodel will always be truncated at each ETL
run, while tables in the Dimension submodel are truncated only during a Full ETL run. Do not
create a Custom submodel to place your datastores, because table maintenance will not be
implemented properly for tables in such a submodel.
Open the new datastores in ODI Studio Designer and set the
OLAP type.
Open the new datastores in the ODI Studio Designer editor and set the OLAP type. In the
example in the slide, the OLAP type is set to Dimension for the WC_PARTNER_DS datastore.
Other OLAP types include Slowly Changing Dimension and Fact Table.
In this example, set the sequence name to WC_PARTNER_D_SEQ. Generally, the Native
sequence name should match the ODI name unless this causes the name length to exceed
30 characters, in which case you can shorten the name to meet this limit. This is the name of
the database trigger created to populate the ROW_WID column.
You also need to create the sequence in the data warehouse. You can do that manually by
running SQL to create the sequence in the database, or use Generate DDL in ODI to
synchronize the ODI model with the data warehouse.
Create custom SDE and SIL tasks in the custom SDE and SIL adaptor folders to populate the
new dimension staging table and dimension table in the data warehouse. Creating custom
SDE and SIL tasks includes creating new task folders, interfaces, mappings, and packages.
You can use the SDE_<Product Line Code>_SampleDimension and SIL_SampleDimension
tasks as a template. These sample tasks include the logic required to populate the system
columns.
In the example in the slide, the SDE_ORA_PartnerDimension task folder includes an
SDE_ORA_PartnerDimension package and an
SDE_ORA_PartnerDimension.WC_PARTNER_DS interface. These objects are used to
extract data from the PARTNER source table and populate the WC_PARTNER_DS dimension
staging table in the data warehouse.
The SIL_PartnerDimension task folder includes an SIL_PartnerDimension package and an
SIL_PartnerDimension.WC_PARTNER_D interface. These objects are used to extract data
from the WC_PARTNER_DS dimension staging table in the data warehouse and load it into the
WC_PARTNER_D dimension table in the data warehouse.
The fact-related datastores and tasks must be extended to reflect the new dimension. In the
example for this lesson, both the W_GL_REVN_FS fact staging datastore and the
W_GL_REVN_F fact datastore must be extended. The Oracle BI Applications Model should
already be versioned.
The example in the slide shows how to extend the W_GL_REVN_FS fact staging datastore by
adding an ID column that follows the naming convention X_<name>_ID with data type
VARCHAR2(80). In this example, the new column is X_PARTNER_ID. The next slide shows
how to extend the W_GL_REVN_F fact datastore.
Extend the GL Revenue (W_GL_REVN_F) fact datastore by adding a _WID column that
follows the naming convention X_<name>_WID with data type NUMBER(10). In this example,
the new column is X_PARTNER_WID.
Add a foreign key constraint to the fact table that refers to the
custom dimension table created previously.
Add a foreign key constraint to the W_GL_REVN_F fact table that refers to the WC_PARTNER_D
custom dimension table created previously. The naming convention is FK_<Fact
Table>_<Dimension Table>.
In this example, the new constraint is named FK_W_GL_REVN_F_WC_PARTNER_D. The
foreign key constraint ensures that the custom SIL task is included in the generated load plan.
The custom SDE task is included in the generated load plan because it populates the staging
table that is used as a source for the custom SIL task.
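The naming conventions quoted above are mechanical enough to express as tiny helpers. This is an illustrative sketch of the conventions only (the function names are made up); the real constraint is created in the ODI datastore editor.

```python
# Sketch of the extension naming conventions from the text:
#   foreign key:     FK_<Fact Table>_<Dimension Table>
#   fact WID column: X_<name>_WID

def fk_name(fact: str, dim: str) -> str:
    return f"FK_{fact}_{dim}"

def wid_column(name: str) -> str:
    return f"X_{name.upper()}_WID"
```

Applying these to the lesson's example yields FK_W_GL_REVN_F_WC_PARTNER_D and X_PARTNER_WID, matching the names used in the slides.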
Add X_PARTNER_WID.
Add a non-unique bitmap index on the X_PARTNER_WID column. The naming convention is
<Fact Table>_F<n>. In this example, the index is named W_GL_REVN_F_F99. Use the
Description tab to enter the name.
On the Columns subtab, add the X_PARTNER_WID column by using the shuttle button.
On the Control subtab, check the Defined in the Database and Active check boxes.
On the Flexfields subtab, deselect Default for OBI Index Type, and change the value from
ETL to QUERY. Confirm that the OBI Bitmap Index value is set to Y.
In the example for this lesson, you modify a copy of the preconfigured
SDE_ORA_GLRevenueFact task folder to pass the ROW_WID value from the PARTNER
dimension source table to the custom X_PARTNER_ID column in the W_GL_REVN_FS fact
staging table. It is assumed that the preconfigured SDE_ORA_GLRevenueFact task folder
has been versioned and copied to the custom folder in ODI Designer.
The first step is to add a lookup in the W_GL_REVN_FS_SQ_GL_REVENUE_EXTRACT
interface to retrieve the ROW_ID from the PARTNER source dimension table.
The next step is to create the mapping for the X_PARTNER_ID column in the
W_GL_REVN_FS_SQ_GL_REVENUE_EXTRACT interface. The mapping is to
PARTNER.ROW_ID via the lookup. Modify the mapping to convert the data type to
VARCHAR2: TO_CHAR(LKP_PARTNER.ROW_ID).
The last step is to map the X_PARTNER_ID column from the SQ_GL_REVENUE_EXTRACT
source to X_PARTNER_ID in the W_GL_REVN_FS target datastore in the W_GL_REVN_FS
interface.
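The extract logic described above can be sketched as simplified SQL. This is a hedged illustration only: the source table name GL_REVENUE, the column GL_ACCOUNT_ID, and the join condition on PARTNER_ID are hypothetical; only PARTNER.ROW_ID and X_PARTNER_ID come from the example.

```sql
-- Sketch: the lookup retrieves ROW_ID from the PARTNER source table, and the
-- mapping converts it to VARCHAR2 for the X_PARTNER_ID staging column.
SELECT rev.GL_ACCOUNT_ID,                -- hypothetical source column
       TO_CHAR(lkp.ROW_ID) AS X_PARTNER_ID
FROM   GL_REVENUE rev                    -- hypothetical source table
LEFT OUTER JOIN PARTNER lkp              -- lookup source dimension table
       ON rev.PARTNER_ID = lkp.ROW_ID;   -- assumed join condition
```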
Modify the preconfigured SIL_GLRevenueFact fact SIL task by adding logic to retrieve the
ROW_WID value from the WC_PARTNER_D custom dimension.
The first step is to add a new column to the SQ_W_GL_REVN_FS datastore in the
W_GL_REVN_F_SQ_W_GL_REVN_FS interface. The step for adding the column is not
shown in the slide.
The next step is to add the WC_PARTNER_D dimension as a source to the interface and define
the mapping on the X_PARTNER_WID column: WC_PARTNER_D.ROW_WID.
Next, create a join on the fact table's ID column and the dimension table's INTEGRATION_ID column, and on the fact and dimension DATASOURCE_NUM_ID columns.
Finally, create the mapping in the main interface. Modify the expression to include a function
that defaults NULL values to 0: COALESCE(SQ_W_GL_REVN_FS.X_PARTNER_WID,0).
Use known techniques presented in previous lessons in this course to complete the remaining
steps to load the data into the data warehouse.
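The SIL join and default logic described above can be sketched as simplified SQL (a hedged illustration of the generated logic, not the actual interface code):

```sql
-- Sketch: resolve the custom warehouse key by joining the staging row to the
-- custom dimension on INTEGRATION_ID and DATASOURCE_NUM_ID, defaulting to 0
-- when no dimension row matches.
SELECT COALESCE(dim.ROW_WID, 0) AS X_PARTNER_WID
FROM   W_GL_REVN_FS stg
LEFT OUTER JOIN WC_PARTNER_D dim
       ON  stg.X_PARTNER_ID      = dim.INTEGRATION_ID
       AND stg.DATASOURCE_NUM_ID = dim.DATASOURCE_NUM_ID;
```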
Answer: c, d
Answer: a, b
Answer: b
Do not create a Custom submodel to place your datastores, because table maintenance will
not be implemented properly for tables in such a submodel.
Answer: c
Security
Oracle Business Intelligence Applications (OBIA) security is tightly integrated with the Oracle
Fusion Middleware security architecture and delegates core security functionality to
components of that architecture.
Specifically, any OBIA installation makes use of the following types of security providers:
An authentication provider that knows how to access information about the users and
groups accessible to OBIA and is responsible for authenticating users
A policy store provider that provides access to application roles and application policies, which form a core part of the security policy and determine what users can and cannot see and do in OBIA
A credential store provider that is responsible for storing and providing access to
credentials required by OBIA
To configure security in Oracle Business Intelligence, you use the following tools:
Oracle WebLogic Server Administration Console: Use the Oracle Administration
Console to manage users and groups in the embedded Oracle WebLogic Server LDAP.
You can also use the Administration Console to manage security realms and to
configure alternative authentication providers.
Oracle Enterprise Manager Fusion Middleware Control: Use Fusion Middleware
Control to manage the policy store, application roles, and permissions for determining
functional access. You can grant permissions to users, groups, and other application
roles.
Oracle BI Administration Tool: Use the Oracle BI Administration Tool to manage
security for Oracle BI repository objects. You can perform tasks such as setting
permissions for business models, tables, columns, and subject areas; specifying filters
to limit data accessibility; and setting authentication options.
Oracle BI Presentation Services Administration: Use Oracle BI Presentation
Services Administration to perform tasks such as setting permissions and managing
privileges for presentation catalog objects, including dashboards and dashboard pages.
Oracle BI Applications Functional Setup Manager: Use Oracle BI Applications
Functional Setup Manager informational tasks to set up security for OBIA offerings and
modules.
Security in OBIA can be classified broadly into the following three levels:
Object-level security: Object-level security controls the visibility to business logical
objects based on a user's role. You can set up object-level security for metadata
repository objects, such as business models and subject areas, and for web objects,
such as dashboards and dashboard pages, which are defined in the Presentation
Catalog.
Data-level security: Data-level security controls the visibility of data (content rendered
in subject areas, dashboards, Oracle BI Answers, and so on) based on the user's
association to data in the transactional system.
User-level security (authentication of users): User-level security refers to
authentication and confirmation of the identity of a user based on the credentials
provided.
An OBIA installation is configured with a default authentication provider that uses the
embedded Oracle WebLogic LDAP server for user and group information. The default OBIA
policy store provider and credential store provider store default credentials, application roles,
and application policies in files in the domain.
After installing OBIA, you can reconfigure the domain to use alternative security providers, if
desired. For example, you might want to reconfigure your installation to use an Oracle Internet
Directory, Oracle Virtual Directory, Microsoft Active Directory, or another LDAP server for
authentication.
Application roles define a set of permissions granted to a user or group. Object-level and
data-level security are implemented in Oracle BI Applications using application roles.
Application roles are also known as duty roles.
Application roles are mapped to directory server groups and users. For example, the
application roles BIAdministrator, BIConsumer, and BIAuthor are default application roles that
are defined as part of an OBIA installation. Application roles represent a functional role that a
user or group has, which gives that user or group the privileges required to perform that role.
For example, users or groups with the BIAdministrator application role have the administrative
permissions necessary to configure and manage the Oracle Business Intelligence installation
and create and edit content for others to consume. Any member of the BIAdministrators group
is explicitly granted this role and implicitly granted the BIAuthor and BIConsumer application
roles.
An application role is a logical role that can be used within the application to secure content in
a way that is independent of any particular authentication provider and the users and groups
within that provider. Security rules are built using application roles. If the underlying
authentication provider changes, the security rules persist. In a different authentication
provider, where group or usernames might be different, you could remap the application roles
to different groups or users, and the BI security structure (built with application roles) would
not be affected.
When operating in a development or test environment, you may find it convenient to use the
default security model because it comes preconfigured. You then add user definitions and
credentials that are specific to your business as well as customize the default application
roles and permission grants that your business security policies require.
After the identity, policy, and credential stores are fully configured and populated with data
that is specific to your business, they provide all user, policy, and credential information
needed by the OBIA components during authentication and authorization.
The slides that follow cover the key components of the default security model.
Please note that the default WebLogic authentication provider is not recommended for
production environments. Configuring an alternative authentication provider is beyond the
scope of this course. Please refer to the "Configuring Oracle BI to use Oracle Internet Directory" section in the Oracle Fusion Middleware Security Guide for Oracle Business Intelligence Enterprise Edition.
The default security realm is installed as part of the Oracle Business Intelligence (OBI)
Enterprise Edition installation. To view the default security realm, log in to the WebLogic
Server Administration Console. This console is used to manage users and groups for the
embedded LDAP server that serves as the prebuilt, default identity store. You also use the
WebLogic Server Administration Console to manage security realms, and to configure
alternative authentication providers.
On the left side of the console, under Domain Structure, notice that there is a single WebLogic
domain named bifoundation_domain into which all the OBI applications are deployed.
Notice that there is a single default security realm named myrealm. The OBI installer installs a
single domain with a single security realm in it. A security realm is a container for the
mechanisms that are used to protect WebLogic resources. This includes users, groups,
security roles, security policies, and security providers. Although multiple security realms can be defined for the OBI domain, only one can be active at any given time; the active realm is designated as the default realm.
Drill down on myrealm to view settings for the default security realm. Use the Providers tab to
access the default embedded WebLogic Authentication Provider. An authentication provider
establishes the identity of users and system processes, transmits identity information, and
serves as a repository for identity information from which components can retrieve it.
Oracle Business Intelligence is configured to use the directory server embedded in Oracle
WebLogic Server as the default security provider. Alternate security providers can be used if
desired and managed in the Oracle WebLogic Administration Console, but the WebLogic
authentication provider is used by default.
Notice that there is also a default WebLogic Identity Assertion Provider, which is used primarily for single sign-on.
To view users in the WebLogic Administration Console, click the Users and Groups tab and
then the Users subtab. This page displays information about each user that has been
preconfigured in this security realm. The default identity store is pre-seeded with usernames
specific to Oracle Business Intelligence. These default usernames are provided as a
convenience so that you can begin using the Oracle Business Intelligence software
immediately after installation, but you are not required to maintain the default names in your
deployment. The default users are Administrator, BIAppsSystemUser, BISystemUser,
OracleSystemUser, and SADMIN.
Oracle Business Intelligence system components establish a connection to each other as
BISystemUser instead of as the Administrator user, the latter being the practice in
earlier releases. Using a trusted system account, such as BISystemUser, to secure
communication between Oracle BI components enables you to change the password of your
deployment's system administrator account without affecting communication between these
components. The name of this user is the default, and it can be changed or a different user
can be created for the purpose of inter-process communication. This is a highly privileged
user whose credentials should be protected from non-administrative users.
Notice the weblogic user. This is the administrative user created during the installation
process for this training environment. This user is discussed in more detail in the next slide.
A single administrative user is shared by Oracle Business Intelligence and Oracle WebLogic
Server.
This username is created during installation, can be any desired name, and, therefore, does
not have to be Administrator. The password is likewise provided during installation.
In the default security configuration, an administrative user is a member of the
BIAdministrators group and has all rights granted to the Oracle Business Intelligence
Administrator user in earlier releases, with the exception of impersonation. An administrative
user cannot impersonate other users.
An administrative user is also a member of the Oracle WebLogic Server default
Administrators group, which enables this user to perform all its administration tasks, including
the ability to manage Oracle WebLogic Server's embedded directory server and policy store.
Creating groups of users who have similar system resource access needs enables easier
security management. Managing a group is more efficient than managing a large number of
users individually. Oracle recommends that you organize your users into groups for easier
maintenance. Groups are then mapped to application roles in order to grant rights.
In the screenshot, the Groups table is filtered to display the three default groups specific to
Oracle BI: BIAdministrators, BIAuthors, and BIConsumers. These default groups are provided
as a convenience so you can begin using the Oracle Business Intelligence software
immediately after installation, but you are not required to maintain the default names in your
deployment.
Members of the BIAdministrators group have permissions equivalent to those of the
Administrator user of earlier releases. Members of the BIAuthors group have the
permissions necessary to create content for others to consume. Members of the
BIConsumers group have the permissions necessary to consume content created by others.
Groups are nested in a hierarchy. Members of the BIAdministrators group are, by default,
members of both other groups. Members of BIAuthors are members of BIConsumers.
Oracle BI Applications also provides a sample set of default groups (also called enterprise
roles).
In the screenshot, the Groups table is filtered to display the Fixed Asset Accounting Manager
E-Business Suite (EBS) group, which is a member of the Fixed Asset Accounting Manager EBS
role. You can associate this group to your BI users. Any users made members of the Fixed
Asset Accounting Manager EBS group will automatically inherit any security rights associated
with the Fixed Asset Accounting Manager EBS role, and will have the correct security for
Fixed Assets Accounting reporting for EBS. Please note there are many other default OBIA
groups. The screenshot shows only one example.
You use Enterprise Manager Fusion Middleware Control to manage the policy store,
application roles, and permissions for determining functional access. Default BI application
roles include:
BIAdministrator: Grants administrative permissions necessary to configure and
manage the Oracle Business Intelligence installation. Any member of the
BIAdministrators group is explicitly granted this role and implicitly granted the BIAuthor
and BIConsumer roles.
BIAuthor: Grants permissions necessary to create and edit content for others to
consume. Any member of the BIAuthors group is explicitly granted this role and implicitly
granted the BIConsumer role.
BIConsumer: Grants permissions necessary to consume content created by others. Any member of the BIConsumers group is explicitly granted this role.
BISystem: Grants the permissions necessary to impersonate other users. This role is
required by Oracle Business Intelligence system components for inter-component
communication.
The default application roles are mapped to default groups in the default WebLogic LDAP.
The groups are listed in the Members pane (not shown here). If you moved to a different
LDAP server, rather than the default WebLogic LDAP server, you could map these roles to
groups in the new LDAP server. Application roles are in the policy store, whereas groups are
in the identity store.
The example in the slide shows the Fixed Asset Accounting Manager EBS application role
and membership for the role. Notice that the Fixed Asset Accounting Manager EBS group is a
member of the Fixed Asset Accounting Manager EBS role. Any users associated with the
Fixed Asset Accounting Manager EBS group will have all the security privileges and
permissions assigned to the Fixed Asset Accounting Manager EBS role. Again, there are
many other default OBIA roles and groups. The screenshot shows only one example.
This example is provided to illustrate the relationships among users, groups, application roles,
and permissions. In Oracle BI, the members of a default application role include both groups
and other application roles. The result is a hierarchical role structure in which permissions can
be inherited in addition to being explicitly granted. A group that is a member of a role is
granted both the permissions of the role and the permissions for all roles descended from that
role. When you construct a role hierarchy, you should not introduce circular dependencies.
The graphic in the slide shows these relationships among the default application roles and the
ways in which permissions are granted to members. The result is that, because of the role
hierarchy, a user who is a member of a particular group is granted both explicit permissions
and any additional inherited permissions. Note that, by themselves, groups and group
hierarchies do not allow a privilege to perform an action in an application. Those privileges
are conveyed through the application roles and their corresponding permission grants.
The table shows the role and permissions granted to all group members (users). The default
BIAdministrator role is a member of the BIAuthor role, and the BIAuthor role is a member of the
BIConsumer role. The result is that members of the BIAdministrators group are granted all the
permissions of the BIAdministrator role, the BIAuthor role, and the BIConsumer role. Only one
of the permissions granted by each role is used in this example.
The screenshot in the slide displays the Oracle BIAdministrator application role and a partial
view of the default policies associated with the role. The default file-based policy store is pre-seeded with the Oracle BI-specific permissions. All Oracle Business Intelligence permissions
are provided and you cannot create additional permissions. These permissions are granted
by the default application roles in the default security configuration. The default application
role hierarchy and permission grants can be changed as needed.
Please note that these permissions are not the same as those used to define access to BI
objects (metadata, dashboards, reports, and so on). Policy store permissions are only used to
define what BI functionality the assigned roles can access.
You use the Oracle BI Administration Tool to modify security settings for the Oracle BI
repository. You can perform tasks such as setting permissions for business models, tables,
columns, and subject areas; specifying filters to limit data accessibility; and setting
authentication options.
The screenshot shows the Identity Manager utility in the repository. On the Users tab, notice
that you can see the same set of users as those listed in the WebLogic Server Administration
Console. The key point is that users are no longer in the repository as in previous OBI product
releases. They are in the WebLogic LDAP or whatever identity store your system is
configured with. You must add a new user in the identity store, not in the repository.
On the Application Roles tab, notice that you can see the same set of application roles as
those listed in Fusion Middleware Control. You can now use the application roles to set
access control permissions for repository objects. The recommended practice is to use
application roles, not individual users, to set access control permissions.
Open the repository in online mode to see the default security settings. Repository security
should be managed in online mode.
The screenshot shows the Object Permissions tab for the Fixed Asset Accounting Manager
EBS application role, which displays the repository objects to which the Fixed Asset
Accounting Manager EBS application role has been granted access.
For example, Financials Asset Balance is a Presentation layer subject area. This allows
users assigned to the Fixed Asset Accounting Manager EBS role to access the objects in this
subject area for analyses and dashboards.
Notice that the Fixed Asset Accounting Manager EBS role has been granted Read access for
all of the objects in the list. However, an Administrator could check out the objects for
editing and grant Read/Write or No Access to the objects.
This is another method for assigning permissions in the repository. The example in the slide
shows the permissions for the Financials Asset Balance subject area. Notice that, as
expected, the Fixed Asset Accounting Manager EBS application role has Read access for this
object.
Users can have explicitly granted permissions. They can also have permissions granted
through membership in application roles, which in turn can have permissions granted through
membership in other application roles, and so on.
Permissions granted explicitly to a user take precedence over permissions granted through
application roles, and permissions granted explicitly to the application role take precedence
over any permissions granted through other application roles.
If there are multiple application roles acting on a user or application role at the same level with
conflicting security attributes, the user or application role is granted the least-restrictive
security attribute.
Any explicit permissions acting on a user take precedence over any permissions on the same
objects granted to that user through application roles.
Filter definitions, however, are always inherited. For example, if User1 is a member of Role1
and Role2, and Role1 includes a filter definition but Role2 does not, the user inherits the filter
definition defined in Role1.
User1 is a direct member of Role1 and Role2, and is an indirect member of Role3,
Role4, and Role5.
Because Role5 is at a lower level of precedence than Role2, its denial of access to
TableA is overridden by the READ permission granted through Role2. The result is that
Role2 provides READ permission on TableA.
The resultant permissions from Role1 are NO ACCESS for TableA, READ for TableB,
and READ for TableC.
Because Role1 and Role2 have the same level of precedence, and because the permissions in each cancel the other out (Role1 denies access to TableA, Role2 allows access to TableA), the less-restrictive level is inherited by User1. That is, User1 has READ access to TableA.
The total permissions granted to User1 are READ access for TableA, TableB, and
TableC.
Data filters provide a way to enforce row-level security rules in the repository. Data filters are
set up in the repository by using the Administration Tool and can be set for objects in both the
Business Model and Mapping layer and the Presentation layer. Applying a filter on a logical
object will affect all Presentation layer objects that use the object. If you set a filter on a
Presentation layer object, it is applied in addition to any filters that might be set on the
underlying logical objects. It is a best practice to set up data filters for particular application
roles rather than for individual users.
In the example in the slide, you set a filter on the Customer presentation table in the Supplier
Sales subject area for the SalesSupervisorsRole application role, so that customer data is
visible for only those records in which Jose Cruz or his direct reports are the sales
representatives.
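A data filter of this kind is entered as a logical expression on the presentation or logical column. The following is a hedged sketch only; the column names "Sales Rep" and "Sales Rep Supervisor" are hypothetical, not the shipped repository names:

```sql
-- Sketch: restrict Customer rows to those where Jose Cruz is the sales
-- representative or the representative's supervisor.
"Supplier Sales"."Customer"."Sales Rep" = 'Jose Cruz'
OR "Supplier Sales"."Customer"."Sales Rep Supervisor" = 'Jose Cruz'
```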
You can manage the query environment by setting query limits (governors) in the repository
for users or application roles. It is a best practice to set query limits for particular application
roles rather than for individual users.
Notice that you manage the query environment by preventing queries from consuming too
many resources by limiting how long a query can run or how many rows a query can retrieve.
You also may want to regulate when individual users can query databases to prevent users
from querying when system resources are tied up with batch reporting, table updates, or other
production tasks.
You can use the Restrict field to restrict access to a database during a certain time frame. You
can regulate when users can query databases to prevent users from querying while system
resources are tied up with batch reporting, table updates, or other production tasks. To restrict
access to a database during particular time periods, click the ellipsis (...) button in the Restrict
column to open the Restrictions dialog box. Then select a time period and click the start time
and drag it to the end time. To explicitly allow access, click Allow. To explicitly disallow
access, click Disallow.
You can use the Catalog Manager to view or set presentation folder and presentation object
permissions for users and roles. It is advised that you use roles to manage permissions for
ease of management.
This page allows you to view and administer privileges associated with various components of
Oracle Business Intelligence. For example, the BI Consumer Role has the privilege to access
Oracle BI dashboards.
Creating a catalog group for the first time will automatically create a shared folder of the same
name for the group.
You can use My Account in BI Answers to view the application roles and catalog groups
assigned to a user. In this example, the weblogic user has logged in to Oracle BI and
selected My Account. All of the application roles to which weblogic is assigned are visible
on the Roles and Catalog Groups tab.
The ODI security model is built around three concepts: objects, methods, and profiles.
A method is an action that can be performed on an object. Each object has a predefined set
of methods.
An example of a method is Create. In the example in the slide, the Create method is
associated with the Load Plan object.
Profiles are associated with methods. For example, the BIA_ADMINISTRATOR profile is
associated with the Create method for the Load Plan object. A profile is a set of privileges.
Profiles are discussed in the next slide.
Notice the three profiles that begin with BIA; these are prebuilt profiles specific to Oracle BI
Applications. The remaining profiles (CONNECT, DESIGNER, NG_DESIGNER, and so on)
are built-in ODI profiles that the security administrator can assign to the users.
Notice the objects associated with each profile. In the screenshot, the Action object is
expanded for the BIA_ADMINISTRATOR profile. Notice the methods assigned to each object
within a profile; these are referred to as profile methods. A profile method is an
authorization granted to a profile on a method of an object type. Each granted method allows
a user with this profile to perform an action (edit, delete, and so on) on an instance of an
object type (project, model, datastore, and so on). Methods granted to a profile appear under
this profile in the Profiles accordion of the Security Navigator. When a method does not
appear for a given profile, this profile does not have access to this method.
Notice that the BIA_ADMINISTRATOR profile is assigned to the weblogic user. This means
that the weblogic user inherits all the privileges granted to this profile. For example,
weblogic inherits the ability to create load plans.
Notice that it is also possible to grant privileges on specific objects for a user, although there
are no prebuilt examples of this for the weblogic user. A user method is a privilege granted
to a user on a method of an object type. Each granted method allows the user to perform an
action (edit, delete, and so on) on instances of an object type (project, model, datastore, and
so on). These methods are similar to the profile methods, but applied to users.
Notice that it is also possible to grant privileges on specific instances for a user, although
there are no prebuilt examples of this for the weblogic user. It is possible to grant users privileges on instances in the specific work repositories where these instances exist. For
example, you may grant a developer user the edit privilege on the
LOAD_DATAWAREHOUSE scenario in a DEVELOPMENT repository but not on a
PRODUCTION repository.
Answer: b
Answer: a, c, e
Answer: a, b, c, d
Answer: a
Managing Performance
The graphic in the slide shows a performance triangle, which illustrates that performance
tuning is a broad topic that requires a balancing act involving hardware, the transactional
schema, and the OBAW schema. All three points of the performance triangle can be
bottlenecks, and performance enhancement in one area can have an impact on the
performance of another area. This lesson provides high-level recommendations for improving
performance related to ETL and queries in Oracle BI Applications. Real-world usage patterns
and ETL scheduling may demand further testing and tuning.
This is a high-level overview of the three areas where bottlenecks occur. Bottlenecks can
include transactional or OBAW schema issues (for example, index usage) or hardware issues
(for example, number of processors, degree of parallelism, or I/O subsystems).
Performance tuning recommendations for Oracle BI Applications, ETL, and the Oracle
Business Analytics Warehouse include the topics listed in this slide. Each topic is covered in
detail in the slides that follow.
Hardware and I/O subsystems can have a major effect on performance. The slide provides a
list of the system components that can be optimized before making major changes to the
OBAW or the transactional schema and should be considered while assessing ETL
performance.
For more information about block size and Oracle databases, see the Oracle Database 11g
Documentation Library on Oracle Technology Network.
Although it is technically possible to put the Oracle Business Analytics Warehouse in the
same database as the transactional database, it is not recommended for performance
reasons. The transactional database is structured as an online transaction processing (OLTP)
database, whereas the Oracle Business Analytics Warehouse is structured as an online analytical processing (OLAP) database, each optimized for its own purpose.
myhost_orcl.world =
  (DESCRIPTION = (SDU=16384)(TDU=16384))
The parameter template file provides parameter guidelines based on the cost-based optimizer
for Oracle Database 11gR2. Use these guidelines as a starting point. You will need to make
changes based on your specific database sizes, data shape, server size (CPU and memory),
and type of storage.
Copy the appropriate template file into your <ORACLE_HOME>/dbs directory. Then, review
the recommendations in the template file, and make the changes based on your specific
database configuration. The database administrator should make changes to the settings
based on performance monitoring and tuning considerations.
The NLS_LENGTH_SEMANTICS parameter enables you to define byte- or character-length
semantics. Oracle BI Applications supports BYTE and CHAR values for this parameter. If you
are using MLS characters, you can add this parameter to the parameter template file
(init<DB version>.ora) for your database version.
You should install only the languages that you expect to use,
because each installed language can significantly increase the
number of records stored in the data warehouse and can affect
overall database performance.
After installing Oracle BI Applications, you use the Oracle BI Applications Configuration
Manager to configure which languages you want to support in the Oracle Business Analytics
Warehouse. You must configure one "base" language, and you can also configure any number of "installed" languages. Typically, the base language specified for the data
warehouse should match the base language of the source system.
The installed languages that you specify for the data warehouse do not have to match the
languages that are installed in the source system. The data warehouse can have more, fewer,
or completely different installed languages compared to the source system.
Note that for languages that match between the transactional system and the data
warehouse, the corresponding record is extracted from the transactional system; languages
that do not match will have a pseudo-translated record generated.
Mini-Dimension Table    Corresponding Dimension Table
W_RESPONSE_MD           W_RESPONSE_D
W_ASSET_MD              W_ASSET_D
W_OPTY_MD               W_OPTY_D
W_ORDER_MD              W_ORDER_D
W_QUOTE_MD              W_QUOTE_D
W_SRVREQ_MD             W_SRVREQ_D
One of the main uses of a data warehouse is to sum fact data with respect to a given
dimension, for example, by date or by sales region. Performing this summation on demand is
resource-intensive and slows down response time. The Oracle Business Analytics
Warehouse precalculates some of these sums and stores the results in aggregate tables
to speed up response time. In the Oracle Business Analytics Warehouse, aggregate
tables are suffixed with _A.
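The idea can be illustrated with a small SQLite example in Python. The table names sales_fact and sales_agg_a are invented for this sketch and are not actual Oracle Business Analytics Warehouse tables:

```python
import sqlite3

# Sketch of the aggregate-table idea using SQLite. Table and column
# names are illustrative only, not actual OBAW objects.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE sales_fact (region TEXT, amount REAL)")
cur.executemany("INSERT INTO sales_fact VALUES (?, ?)",
                [("West", 100.0), ("West", 50.0), ("East", 75.0)])

# Precompute the sum per region once, mimicking an _A aggregate table.
cur.execute("""CREATE TABLE sales_agg_a AS
               SELECT region, SUM(amount) AS total_amount
               FROM sales_fact GROUP BY region""")

# Queries now read the small aggregate instead of re-summing the fact table.
cur.execute("SELECT total_amount FROM sales_agg_a WHERE region = 'West'")
print(cur.fetchone()[0])  # 150.0
```

In the warehouse itself, the trade-off is the same: the aggregate costs extra storage and load time, but repeated summary queries avoid scanning the full fact table.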
The Prune Days parameter extends the window of the ETL extract beyond the last time the
ETL actually ran, ensuring that records that may have been missed by an earlier ETL process
are picked up in the next ETL.
The Last Extract Date is calculated as the last time data was extracted from the table,
minus the Prune Days value. Records can be missed in an ETL process when a record is
updated while the ETL process is running but is not committed until after the ETL
completes.
You set the Prune Days parameter value in Oracle BI Applications Configuration Manager.
Setting a small value means that the ETL extracts fewer records, which improves performance;
however, it increases the chance that late-committed records are not detected. Setting a
large value is useful if ETL runs are infrequent, but it increases the number of records
that are extracted and updated in the data warehouse. Therefore, you should not set the
Prune Days value to a very large number. A large Prune Days value can also be used to
trigger re-extraction of records that were previously processed but have not changed. The
Prune Days value should never be set to 0.
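The extract-window arithmetic described above can be sketched as follows. This is a simplified illustration; the function name is invented, and the actual ETL derives its dates from warehouse metadata rather than a hard-coded value:

```python
from datetime import datetime, timedelta

# Sketch of the incremental-extract window: the next ETL run extracts
# records changed since (last extract date - Prune Days), so rows that
# committed after the previous run finished are still picked up.

def extract_window_start(last_extract_date, prune_days):
    if prune_days <= 0:
        raise ValueError("Prune Days should never be set to 0 or less")
    return last_extract_date - timedelta(days=prune_days)

last_run = datetime(2013, 12, 10, 2, 0)   # illustrative timestamp
start = extract_window_start(last_run, prune_days=5)
# Records changed on or after 2013-12-05 02:00 are re-extracted.
```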
Answer: b
Which of the following are reasons for not putting the Oracle
Business Analytics Warehouse in the same database as the
transactional database?
a. The analytical queries interfere with normal use of the
transactional database.
b. The data in a transactional database is normalized for
Answer: a, b, d
Answer: d
Practice