
Performing Data Pump Exports and Imports

The Data Pump Export utility loads row data from database tables, as well as object metadata, into dump file sets in a proprietary format that only the Data Pump Import utility can read. The dump file sets, which are operating system files, contain data, metadata, and control information. A dump file set usually consists of a single file, such as the default export dump file expdat.dmp.
$ expdp system/manager DIRECTORY=dpump_dir1 DUMPFILE=expdat1.dmp
Using a Parameter File
Rather than specifying the export parameters on the command line, you can put them in a parameter file. You then simply invoke the parameter file during the actual export. Here's an example of a parameter file:
SCHEMAS=HR
DIRECTORY=dpump_dir1
DUMPFILE=system1.dmp
Once you create the parameter file, all you need to do in order to export the HR schema is invoke expdp with just the PARFILE parameter:
$ expdp PARFILE=myfile.txt
Using OEM
Step 1: Start Database Control.
Step 2: Select Maintenance.
Step 3: Choose Utilities.
You can then see the various choices for exporting and importing data.
Schema mode is the default mode for Data Pump Export and Import jobs. If you log in as follows, for example, Data Pump will automatically perform a full export of all of SYSTEM's objects:
$ expdp system/sammyy1
If you are the SYSTEM user, you can export another schema's objects by explicitly using the SCHEMAS parameter, as shown in Listing 14-3.
A Data Pump Export Using the Schema Mode
$ expdp system/sammyy1 DUMPFILE=scott.dmp SCHEMAS=SCOTT
File- and Directory-Related Parameters
You can specify several file- and directory-related parameters during a Data Pump Export job. These include the DIRECTORY, DUMPFILE, FILESIZE, PARFILE, LOGFILE, NOLOGFILE, and COMPRESSION parameters.
LOGFILE and NOLOGFILE
You can use the LOGFILE parameter to specify a log file for your export jobs. Here's what you need to remember regarding this parameter:
If you specify the LOGFILE parameter without a directory component, Oracle automatically creates the log file in the location you specified with the DIRECTORY parameter.
If you don't specify this parameter, Oracle creates a log file named export.log.
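For example, the following command (a minimal sketch; the log file name hr_export.log is arbitrary) writes the export log to the dpump_dir1 location:
$ expdp hr DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp LOGFILE=hr_export.log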
If you specify the parameter NOLOGFILE, Oracle does not create its log file (export.log). You still see the progress of the export job on the screen, but Oracle suppresses the writing of a separate log file for the job.
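An illustrative command (a minimal sketch):
$ expdp hr DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp NOLOGFILE=Y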
REUSE_DUMPFILES
The REUSE_DUMPFILES parameter lets you overwrite a preexisting dump file:
$ expdp hr DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp
TABLES=employees REUSE_DUMPFILES=y
Of course, you must make sure that you don't need the contents of the preexisting dump file, hr.dmp.

COMPRESSION
The COMPRESSION parameter enables the user to specify which data to compress before writing the export data to a dump file. By default, all metadata is compressed before it's written out to an export dump file. You can disable compression by specifying a value of NONE for the COMPRESSION parameter, as shown here:
$ expdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=hr_comp.dmp COMPRESSION=NONE
EXCLUDE and INCLUDE
The EXCLUDE and INCLUDE parameters are two mutually exclusive parameters that you can use to perform what is known as metadata filtering. Metadata filtering enables you to selectively leave out or include certain types of objects during a Data Pump Export or Import job. In the old export utility, you used the CONSTRAINTS, GRANTS, and INDEXES parameters to specify whether you wanted to export those objects. Using the EXCLUDE and INCLUDE parameters, you now can include or exclude many other kinds of objects besides the four objects you could filter previously. For example, if you don't wish to export any packages during the export, you can specify this with the help of the EXCLUDE parameter, as shown below.
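An illustrative fragment (a minimal sketch suitable for the command line or a parameter file) that omits all packages:
EXCLUDE=PACKAGE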
Note: If you use the CONTENT=DATA_ONLY option (same as the old ROWS=Y parameter), you aren't exporting any objects, just table row data. Naturally, in this case, you can't use either the EXCLUDE or INCLUDE parameter.
Simply put, the EXCLUDE parameter helps you omit specific database object types from an export or import operation. The INCLUDE parameter, on the other hand, enables you to include only a specific set of objects. Following is the format of the EXCLUDE and INCLUDE parameters:
EXCLUDE=object_type[:name_clause]
INCLUDE=object_type[:name_clause]
For both the EXCLUDE and INCLUDE parameters, the name clause is optional. As you know, several objects in a database, such as tables, indexes, packages, and procedures, have names. Other objects, such as grants, don't. The name clause in an EXCLUDE or an INCLUDE parameter lets you apply a SQL function to filter named objects.
Here's a simple example that excludes all tables that start with EMP:
EXCLUDE=TABLE:"LIKE 'EMP%'"
In this example, "LIKE 'EMP%'" is the name clause.
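Similarly, here's an illustrative INCLUDE fragment (a minimal sketch in parameter-file style; the table names are examples) that exports only the named tables:
INCLUDE=TABLE:"IN ('EMPLOYEES', 'DEPARTMENTS')"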
The following example shows how you must use backslashes (\) to escape the double quotation marks on the command line:
$ expdp scott/tiger DUMPFILE=dumphere:file%U.dmp SCHEMAS=SCOTT EXCLUDE=TABLE:\"='EMP'\" EXCLUDE=FUNCTION:\"='MY_FUNCTION'\"
The following example shows how to apply remapping functions to two columns in a table:
$ expdp hr DIRECTORY=dpump_dir1 DUMPFILE=remap1.dmp TABLES=employees
REMAP_DATA=hr.employees.employee_id:hr.remap.minus10
REMAP_DATA=hr.employees.first_name:hr.remap.plusx
DATA_OPTIONS
The DATA_OPTIONS parameter lets you specify options for handling specific types of data during an export. You can only specify the value XML_CLOBS for this parameter (DATA_OPTIONS=XML_CLOBS).
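Here's an illustrative command (a minimal sketch; the table name xml_tab is hypothetical and assumed to contain XMLType columns):
$ expdp hr DIRECTORY=dpump_dir1 DUMPFILE=xml_exp.dmp TABLES=xml_tab DATA_OPTIONS=XML_CLOBS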
QUERY
The QUERY parameter serves the same function as it does in the traditional export utility: it lets you selectively export table row data with the help of a SQL statement. The QUERY parameter permits you to qualify the SQL statement with a table name, so that it applies only to a particular table. Here's an example:
QUERY=OE.ORDERS: "WHERE order_id > 100000"
In this example, only those rows in the orders table (owned by user OE) where the order_id is greater than 100,000 are exported.
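Because the WHERE clause contains spaces and quotation marks, it's often easiest to place the QUERY parameter in a parameter file, where no shell escaping is needed. A minimal sketch (the file name exp_orders.par is arbitrary):
DIRECTORY=dpump_dir1
DUMPFILE=orders.dmp
TABLES=oe.orders
QUERY=oe.orders:"WHERE order_id > 100000"
$ expdp system/manager PARFILE=exp_orders.par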
SAMPLE
Using the SAMPLE parameter, which was brand new in Oracle Database 10g Release 2, you have the capability to export only a subset of data from a table. The SAMPLE parameter lets you specify a percentage value ranging from .000001 to 100. This parameter has the following syntax:
SAMPLE=[[schema_name.]table_name:]sample_percent
Here's an example:
SAMPLE="HR"."EMPLOYEES":50

You specify the sample size by providing a value for the sample_percent clause. The schema name and table name are optional. If you don't provide the schema name, the current schema is assumed. You must provide a table name if you do provide a schema name. Otherwise, the sample percent value will be used for all the tables in the export job. In the following example, the sample size is 70 percent for all tables in the export job because it doesn't specify a table name:
$ expdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=sample.dmp SAMPLE=70
TRANSPORTABLE
The TRANSPORTABLE parameter enables you to specify whether you want the database to export the metadata for specific tables (and partitions and subpartitions) when doing a table mode export. You can specify either ALWAYS or NEVER as values for the TRANSPORTABLE parameter. Here's an example:
$ expdp sh DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp
TABLES=employees TRANSPORTABLE=always
The default value for this parameter is NEVER.
ENCRYPTION
Use the ENCRYPTION parameter to specify whether or not to encrypt data before it's written to a dump file. You can assign the following values to the ENCRYPTION parameter:
ALL: Encrypts all data and metadata
DATA_ONLY: Encrypts only data written to the dump file set
ENCRYPTED_COLUMNS_ONLY: Specifies encryption for only encrypted columns using the TDE feature
METADATA_ONLY: Specifies the encryption of just the metadata
NONE: Specifies that no data will be encrypted
If you don't specify the ENCRYPTION or the ENCRYPTION_PASSWORD parameter, the ENCRYPTION parameter defaults to NULL. If you omit the ENCRYPTION parameter but specify the ENCRYPTION_PASSWORD parameter, the ENCRYPTION parameter defaults to ALL.
The following example shows how to encrypt just the data and nothing else:
$ expdp hr DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp JOB_NAME=test1
ENCRYPTION=data_only ENCRYPTION_PASSWORD=foobar
ENCRYPTION_ALGORITHM
The ENCRYPTION_ALGORITHM parameter specifies the encryption algorithm to use in the encryption of data. The default value is AES128, and you can also specify AES192 and AES256. The following example shows how to specify this parameter:
$ expdp hr DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp
ENCRYPTION_PASSWORD=foobar ENCRYPTION_ALGORITHM=AES128
The related ENCRYPTION_MODE parameter specifies the type of security to use; its possible values are DUAL, PASSWORD, and TRANSPARENT. Here's an example that shows how to specify the DUAL value for the ENCRYPTION_MODE parameter:
$ expdp hr DIRECTORY=dpump_dir1 DUMPFILE=hr.dmp
ENCRYPTION=all ENCRYPTION_PASSWORD=encrypt_pwd
ENCRYPTION_ALGORITHM=AES256 ENCRYPTION_MODE=dual
ENCRYPTION_PASSWORD

You can use the ENCRYPTION_PASSWORD parameter to encrypt table data or metadata in the export dump file to prevent unauthorized users from reading data from the dump set. Note that if you specify ENCRYPTION_PASSWORD and omit the ENCRYPTION parameter, the database encrypts all data written to the export dump set.
Note: If you export a table with encrypted columns but don't specify the ENCRYPTION_PASSWORD parameter, the database stores the encrypted table columns as clear text in the export dump file and issues a warning when this happens.
The following example shows how to pass a value of testpass for the ENCRYPTION_PASSWORD parameter:
$ expdp hr TABLES=employee_s_encrypt DIRECTORY=dpump_dir
DUMPFILE=hr.dmp ENCRYPTION=ENCRYPTED_COLUMNS_ONLY
ENCRYPTION_PASSWORD=testpass
The dump file for the export will encrypt the encrypted columns in the employee_s_encrypt table.
ESTIMATE_ONLY
The ESTIMATE_ONLY parameter lets you estimate the space an export job would consume, without actually performing the export:
$ expdp system/sammyy1 ESTIMATE_ONLY=y
NETWORK_LINK
The NETWORK_LINK parameter lets you export data from a remote database over a database link (here, a link named finance):
$ expdp hr/hr DIRECTORY=dpump_dir1 NETWORK_LINK=finance DUMPFILE=network_export.dmp LOGFILE=network_export.log
STATUS
The value you supply for the STATUS parameter (here, 60) is the interval, in seconds, at which the job status is displayed:
$ expdp system/manager STATUS=60 . . .
The STATUS parameter shows the overall percentage of the job that is completed, the status of the worker processes, and the status of the current data objects being processed. Note that the Data Pump log file will show the completion status of the job, whereas the STATUS parameter gives you the status of an ongoing Data Pump job.
FLASHBACK_TIME
The FLASHBACK_TIME parameter is similar to the FLASHBACK_SCN parameter. The only difference is that here you use a time, instead of an SCN, to limit the export. Oracle finds the SCN that most closely matches the time you specify, and uses this SCN to enable the Flashback utility. The Data Pump Export operation will be consistent as of this SCN. Here's an example:
$ expdp system/sammyy1 DIRECTORY=dpump_dir1 DUMPFILE=hr_time.dmp
FLASHBACK_TIME="TO_TIMESTAMP('25-05-2008 17:22:00', 'DD-MM-YYYY HH24:MI:SS')"
Note: FLASHBACK_SCN and FLASHBACK_TIME are mutually exclusive.
The following example shows how you can export the user HR's schema up to the SCN 150222:
$ expdp hr/hr DIRECTORY=dpump_dir1 DUMPFILE=hr_exp.dmp FLASHBACK_SCN=150222
PARALLEL
The PARALLEL parameter lets the export job use multiple worker processes; the %U substitution variable generates a unique suffix for each dump file:
$ expdp system/manager DIRECTORY=dpump_dir1 DUMPFILE=par_exp%U.dmp PARALLEL=3
Note that you don't need to use the %U substitution variable to generate multiple dump files when you choose a value greater than 1 for the PARALLEL parameter. You could simply use a comma-separated list of values, as follows:
$ expdp system/manager DIRECTORY=dpump_dir1
DUMPFILE=(par_exp01.dmp,par_exp02.dmp,par_exp03.dmp)
ATTACH
The ATTACH parameter attaches your Data Pump client session to a running job and places you in an interactive mode. You can use this parameter only in conjunction with the username/password combination; you can't use any other export parameters along with it. Here's an example:
$ expdp hr/hr ATTACH=hr.export_job
The ATTACH parameter is very important, as it's one of the two ways to open an interactive Data Pump job, as explained in the following section.

The COMPRESSION parameter can take any of the following four values:
ALL: Enables compression for the entire operation.
DATA_ONLY: Specifies that all data should be written to the dump file in a compressed format.
METADATA_ONLY: Specifies all metadata be written to the dump file in a compressed format. This is the default value.
NONE: Disables compression of all types.
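For example, a minimal sketch that compresses both data and metadata (an assumption worth verifying for your release: values other than METADATA_ONLY and NONE require the Oracle Advanced Compression option):
$ expdp hr DIRECTORY=dpump_dir1 DUMPFILE=hr_comp.dmp COMPRESSION=ALL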
CONTENT
By using the CONTENT parameter, you can filter what goes into the export dump file. The CONTENT parameter can take three values:
ALL exports both table data and table and other object definitions (metadata).
DATA_ONLY exports only table rows.
METADATA_ONLY exports only metadata.
Here's an example:
$ expdp system/manager DUMPFILE=expdat1.dmp CONTENT=DATA_ONLY

Refer to pages 700-704 for attaching to jobs and interactive mode.

By default, the Data Pump Export utility will run the export in schema mode.
Here's an interesting Data Pump Export example, showing how to use the PARALLEL, FILESIZE, and JOB_NAME parameters. It also illustrates the use of the DUMPFILE parameter when there are multiple dump files:
$ expdp hr/hr FULL=Y DUMPFILE=dpump_dir1:full1%U.dmp,dpump_dir2:full2%U.dmp
FILESIZE=2G PARALLEL=3 LOGFILE=dpump_dir1:expfull.log JOB_NAME=expfull


REMAP_TABLE
The REMAP_TABLE parameter lets you rename a table during a Data Pump Import operation:
$ impdp hr/HR DIRECTORY=dpump_dir1 DUMPFILE=newdump.dmp
TABLES=hr.employees REMAP_TABLE=hr.employees:emp
REMAP_SCHEMA
Using the REMAP_SCHEMA parameter, you can move objects from one schema to another. You need to specify this parameter in the following manner:
$ impdp system/manager DUMPFILE=newdump.dmp REMAP_SCHEMA=hr:oe
REMAP_DATAFILE
When you are moving databases between two different platforms, each with a separate file-naming convention, the REMAP_DATAFILE parameter comes in handy to change file system names. The following is an example that shows how you can change the file system from the old Windows platform to the new UNIX platform. Whenever there is any reference to the Windows file system in the export dump file, the Import utility will automatically remap the filename to the UNIX file system.
$ impdp hr/hr FULL=Y DIRECTORY=dpump_dir1 DUMPFILE=db_full.dmp \
REMAP_DATAFILE='DB1$:[HRDATA.PAYROLL]tbs6.f':'/db1/hrdata/payroll/tbs6.f'

REMAP_TABLESPACE
The REMAP_TABLESPACE parameter lets you move objects from one tablespace into another, as in this example, which remaps example_tbs to new_tbs:
$ impdp hr/hr REMAP_TABLESPACE='example_tbs':'new_tbs' DIRECTORY=dpump_dir1 \
PARALLEL=2 JOB_NAME=TESTJOB_01 DUMPFILE=employees.dmp NOLOGFILE=Y
Here's an example that shows how to specify the REMAP_DATA parameter during an import:
$ impdp hr DIRECTORY=dpump_dir1 DUMPFILE=expschema.dmp
TABLES=hr.employees REMAP_DATA=hr.employees.first_name:hr.remap.plusx
The PLUSX function from the REMAP package remaps the FIRST_NAME column in this example.
DATA_OPTIONS
The DATA_OPTIONS parameter is the counterpart to the DATA_OPTIONS parameter during export operations. You can specify only the SKIP_CONSTRAINT_ERRORS value for this parameter during an import (DATA_OPTIONS=SKIP_CONSTRAINT_ERRORS). The SKIP_CONSTRAINT_ERRORS option lets the import operation continue even if the database encounters any nondeferred constraint violations.
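Here's an illustrative command (a minimal sketch reusing the dump file name from the earlier REMAP_DATA example):
$ impdp hr DIRECTORY=dpump_dir1 DUMPFILE=expschema.dmp TABLES=hr.employees DATA_OPTIONS=SKIP_CONSTRAINT_ERRORS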
Perform the network import from the database named remote, using the following Data Pump Import command:
[local] $ impdp system/sammyy1 SCHEMAS=scott NETWORK_LINK=remote