2.10g Data Pump
**********************************
■Data Pump Architecture:
A Data Pump job involves a number of processes, two queues, a number of files, and one table.
・When a Data Pump job is launched, at least two processes are started: a Data Pump Master process (the DMnn) and one or more worker processes (named DWnn).
・Two queues are created for each Data Pump job: a control queue and a status queue.
・The files generated by Data Pump come in three forms: SQL files, dump files, and log files.
・Finally, there is the master table (reported as "Master table" in the job logs below). Created for you by the DMnn when you launch a job, it is used both to record the job's progress and to describe it, and it is included in the dump file as the final item of the job.
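While a job is running, both the job and its master table are visible in the data dictionary. A minimal check (the view names are standard; the job name matches the expdp log below):
select owner_name, job_name, operation, job_mode, state
from dba_datapump_jobs;
-- the master table is an ordinary table in the job owner's schema, named after the job:
select table_name from user_tables where table_name = 'SYS_EXPORT_TABLE_01';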
*********************
Data Pump has two methods for loading and unloading data: the direct path and the external table path.
What determines whether Data Pump uses the direct path or the external table path? You as DBA have no control: Data Pump itself makes the decision according to the complexity of the objects.
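The external table path relies on the ORACLE_DATAPUMP access driver, which can also be used directly to unload a table. A minimal sketch, assuming the GAU001_DIR directory object created below (test_unload and test_unload.dmp are placeholder names):
create table test_unload
  organization external (
    type ORACLE_DATAPUMP
    default directory GAU001_DIR
    location ('test_unload.dmp')
  )
as select * from test;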
*********************
Sample:
expdp -?
impdp -?
■expdp
create or replace directory GAU001_DIR as '/tmp';
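If the directory object is created by a privileged user rather than by gau001 itself (an assumption here), the exporting user also needs access to it:
grant read, write on directory GAU001_DIR to gau001;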
expdp gau001/gaudi@gaudi100 TABLES=test DUMPFILE=table.dmpe DIRECTORY=GAU001_DIR
eg:********************************
C:\Users\LuKegui>expdp gau001/gaudi@gaudi100 TABLES=test DUMPFILE=table.dmpe directory =GAU001_DIR
Export: Release 10.2.0.3.0 - Production on Friday, 14 August, 2009 11:04:26
Copyright (c) 2003, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
Starting "GAU001"."SYS_EXPORT_TABLE_01": gau001/********@gaudi100 TABLES=test DUMPFILE=table.dmpe directory =GAU001_DIR
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 13 MB
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
. . exported "GAU001"."TEST" 8.568 MB 1000000 rows
Master table "GAU001"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
Dump file set for GAU001.SYS_EXPORT_TABLE_01 is:
/tmp/table.dmpe
Job "GAU001"."SYS_EXPORT_TABLE_01" successfully completed at 02:16:13
■impdp
impdp gau001/gaudi@gaudi100 TABLES=test DUMPFILE=table.dmpe DIRECTORY=GAU001_DIR
eg:********************************
C:\Users\LuKegui>impdp gau001/gaudi@gaudi100 TABLES=test DUMPFILE=table.dmpe directory =GAU001_DIR
Import: Release 10.2.0.3.0 - Production on Friday, 14 August, 2009 11:17:08
Copyright (c) 2003, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
Master table "GAU001"."SYS_IMPORT_TABLE_01" successfully loaded/unloaded
Starting "GAU001"."SYS_IMPORT_TABLE_01": gau001/********@gaudi100 TABLES=test DUMPFILE=table.dmpe directory
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
. . imported "GAU001"."TEST" 8.568 MB 1000000 rows
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Job "GAU001"."SYS_IMPORT_TABLE_01" successfully completed at 02:28:06
SQL> show user
USER is "GAU001"
SQL> select count(*) from test;
COUNT(*)
----------
1000000
SQL>
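If the target table already exists, the import skips it by default; the TABLE_EXISTS_ACTION parameter changes this (for example, REPLACE drops and re-creates the table, APPEND adds the rows):
impdp gau001/gaudi@gaudi100 TABLES=test DUMPFILE=table.dmpe DIRECTORY=GAU001_DIR TABLE_EXISTS_ACTION=REPLACE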
*************************************
■Using Data Pump in Network Mode
It is possible to use Data Pump to transfer data from one database to another, without staging the data on disk at all. Consider this example of launching Data Pump from the command-line utility impdp:
impdp userid=scott/tiger tables=dept,emp network_link=L1 directory=dp_dir
This command will launch a Data Pump import against the local database, logging on as user SCOTT with password TIGER. The job will read the two tables, DEPT and EMP, from the SCOTT schema in the source database identified by the database link L1, transfer them across the network, and insert the data into the local database. The export (running against the remote database through the link) and the import run concurrently, the one feeding the other.
Note that even though the data is never written out as an operating system file, it is still necessary to specify a directory, in order that Data Pump can write out logging information.
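The link named in NETWORK_LINK must already exist in the database that impdp connects to. A sketch of creating it (the TNS alias source_db is a placeholder):
create database link L1 connect to scott identified by tiger using 'source_db';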
***********************************
Transactions against the affected tables during a long-running Data Pump export job will not be included in the export file: the job is protected from any changes.
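To pin the export explicitly to a single point in time, the FLASHBACK_TIME (or FLASHBACK_SCN) parameter can be used. A sketch using a parameter file to sidestep shell quoting (file name and timestamp are placeholders):
# flashback.par
DIRECTORY=GAU001_DIR
DUMPFILE=flashback.dmp
TABLES=test
FLASHBACK_TIME="TO_TIMESTAMP('2009-08-14 11:00:00','YYYY-MM-DD HH24:MI:SS')"

expdp gau001/gaudi@gaudi100 PARFILE=flashback.par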