1) Data Pump operates on a group of files called a dump file set, whereas traditional export operates on a single file.
2) Data Pump accesses files on the server (using Oracle directory objects); traditional export can access files on both the client and the server (without using Oracle directories). See the example after this list.
3) Original export (exp/imp) represents database metadata as DDL in the dump file, whereas Data Pump represents it in an XML document format.
4) Data Pump supports parallel execution; exp/imp runs as a single stream.
5) Data Pump does not support sequential media such as tapes, but traditional export does.
6) Data Pump Export and Import operations are processed in the database as a Data Pump job, which is much more efficient than the client-side execution of original Export and Import.
7) Data Pump operations can take advantage of the server's parallel processes to read or write multiple data streams simultaneously.
8) Data Pump differs from original Export and Import in that all jobs run primarily on the server using server processes. These server processes access files for the Data Pump jobs using directory objects that identify the location of the files.
9) Data Pump has a very powerful interactive command-line mode which allows the user to monitor and control Data Pump Export and Import jobs (an ATTACH example appears at the end of this note).
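
As a minimal sketch of points 2, 4 and 8 (the path and file names are illustrative; dpump_dir1 is the directory name reused in the examples below), a DBA first creates a directory object and grants access on it, after which the export can run in parallel into a dump file set, with %U generating one file per parallel stream:

SQL> CREATE DIRECTORY dpump_dir1 AS '/u01/app/oracle/dpdump';
SQL> GRANT READ, WRITE ON DIRECTORY dpump_dir1 TO hr;

expdp hr SCHEMAS=hr DIRECTORY=dpump_dir1 DUMPFILE=hr_%U.dmp LOGFILE=hr_exp.log PARALLEL=4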
COMPRESSION parameter in expdp
One of the big issues with Data Pump was that the dump file couldn't be compressed while being created. In Oracle Database 11g, Data Pump can compress the dump files while creating them, via the COMPRESSION parameter on the expdp command line. The parameter has four options:
METADATA_ONLY - only the metadata is compressed.
DATA_ONLY - only the data is compressed; the metadata is left alone.
ALL - both the metadata and data are compressed.
NONE - no compression is performed (METADATA_ONLY is the default).
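
A hedged example (the schema and file names are illustrative, reusing the dpump_dir1 directory from above):

expdp hr SCHEMAS=hr DIRECTORY=dpump_dir1 DUMPFILE=hr_comp.dmp COMPRESSION=ALL

Note that compressing table data (DATA_ONLY or ALL) requires the Advanced Compression option; METADATA_ONLY does not.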
Encryption
The dump file can be encrypted while being created. The encryption uses the same technology as TDE (Transparent Data Encryption) and uses the wallet to store the master key. This encryption applies to the entire dump file, not just to encrypted columns as was the case in Oracle Database 10g.
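
A sketch, assuming a TDE wallet is already configured and open on the server (file names are illustrative):

expdp hr SCHEMAS=hr DIRECTORY=dpump_dir1 DUMPFILE=hr_enc.dmp ENCRYPTION=ALL ENCRYPTION_MODE=TRANSPARENT

ENCRYPTION_MODE=PASSWORD is an alternative when the dump file must be imported on a system that has no access to the same wallet.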
Data Masking
When you import data from production to QA, you may want to make sure sensitive data are altered in such a way that they are not identifiable. Data Pump in Oracle Database 11g enables you to do that by creating a masking function and then using it during import, as sketched below.
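
A minimal sketch of such a masking function; the ssn column on hr.employees is hypothetical, as are the package and function names. The function is handed to impdp through the REMAP_DATA parameter:

CREATE OR REPLACE PACKAGE mask_pkg AS
  FUNCTION mask_ssn(p_in VARCHAR2) RETURN VARCHAR2;
END mask_pkg;
/
CREATE OR REPLACE PACKAGE BODY mask_pkg AS
  FUNCTION mask_ssn(p_in VARCHAR2) RETURN VARCHAR2 IS
  BEGIN
    -- keep only the last four characters; blank out the rest
    RETURN 'XXX-XX-' || SUBSTR(p_in, -4);
  END mask_ssn;
END mask_pkg;
/

impdp hr DIRECTORY=dpump_dir1 DUMPFILE=expschema.dmp TABLES=hr.employees REMAP_DATA=hr.employees.ssn:hr.mask_pkg.mask_ssn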
REMAP_TABLE
Allows you to rename tables during an import operation.
Example

The following is an example of using the REMAP_TABLE parameter to rename the employees table to a new name of emps:
impdp hr DIRECTORY=dpump_dir1 DUMPFILE=expschema.dmp TABLES=hr.employees REMAP_TABLE=hr.employees:emps

10) Data Pump allows you to disconnect from and reconnect to a session. Because Data Pump jobs run entirely on the server, you can start an export or import job, detach from it, and later reconnect to the job to monitor its progress (see the ATTACH sketch below).
11) Data Pump gives you the ability to pass data between two databases over a network (via a database link), without creating a dump file on disk (see the NETWORK_LINK sketch below).
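
A hedged sketch of point 10: assuming the export job got the default name SYS_EXPORT_SCHEMA_01 (the actual name is printed when the job starts and can be looked up in DBA_DATAPUMP_JOBS), you can reattach and drop into the interactive mode mentioned in point 9:

expdp hr ATTACH=SYS_EXPORT_SCHEMA_01

Export> STATUS
Export> STOP_JOB=IMMEDIATE

EXIT_CLIENT detaches again without stopping the job; START_JOB resumes a stopped one.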
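And a sketch of point 11, assuming a database link named source_db already exists in the target database and points at the source (the link name is illustrative). No DUMPFILE is given; DIRECTORY is still needed for the log file:

impdp hr DIRECTORY=dpump_dir1 NETWORK_LINK=source_db SCHEMAS=hr LOGFILE=net_imp.log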
