
Category: Oracle

2009-08-13 16:38:05

1. 9i Exp/Imp
・Table Mode:
exp hr/hr TABLES=employees,departments rows=y file=exp1.dmp
・User Mode:
exp system/manager OWNER=hr direct=y file=expdat.dmp
・Tablespace Mode:
exp \'username/password AS SYSDBA\' TRANSPORT_TABLESPACE=y TABLESPACES=ts_emp log=ts_emp.log
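The matching imp invocations mirror these modes. A minimal sketch, reusing the dump files from above (the FROMUSER/TOUSER mapping and the datafile path are assumptions):
・Table Mode:
imp hr/hr TABLES=employees,departments file=exp1.dmp
・User Mode:
imp system/manager FROMUSER=hr TOUSER=hr file=expdat.dmp
・Tablespace Mode:
imp \'username/password AS SYSDBA\' TRANSPORT_TABLESPACE=y DATAFILES=/u01/ts_emp.dbf TABLESPACES=ts_emp file=expdat.dmp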

■■Globalization Support Considerations
$ export NLS_LANG=american_america.we8iso8859p15
on Unix, or
C:\> set NLS_LANG=american_america.we8iso8859p15
on Windows.
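The setting should match the character set of the database you export from; a quick way to check it, using the standard dictionary view:
SQL> SELECT value FROM nls_database_parameters WHERE parameter = 'NLS_CHARACTERSET';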

■■Privileges
You need to invoke Import as SYSDBA under the following conditions:
– At the request of Oracle technical support
– When importing a transportable tablespace set
To use Export, you must have the CREATE SESSION privilege on an Oracle database. To export tables owned by another user, you must have the EXP_FULL_DATABASE role enabled. This role is granted to all DBAs. If you do not have the system privileges contained in the EXP_FULL_DATABASE role, you cannot export objects contained in another user’s schema.
To use Import, you need the CREATE SESSION privilege to log on to the Oracle database server. This privilege belongs to the CONNECT role established during database creation. You can do an import even if you did not create the export file. However, if the export file was created by someone other than you, you can import that file only if you have the IMP_FULL_DATABASE role.
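Granting the two roles is plain SQL; a minimal sketch, assuming a DBA session and the HR account as grantee:
SQL> GRANT EXP_FULL_DATABASE TO hr;
SQL> GRANT IMP_FULL_DATABASE TO hr;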

■■Import Process Sequence
1. New tables are created
2. Data is imported
3. Indexes are built
4. Triggers are imported
5. Integrity constraints are enabled on the new tables
6. Any bitmap, functional, and/or domain indexes are built
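Because index creation comes after the data load, imports of large tables are sometimes split so the indexes can be built separately. A hedged sketch using the documented INDEXFILE parameter (file names are assumptions):
imp hr/hr FILE=expdat.dmp FULL=y INDEXFILE=create_idx.sql   (writes index DDL to a script, imports no data)
imp hr/hr FILE=expdat.dmp FULL=y INDEXES=n                  (imports the data without building indexes)
SQL> @create_idx.sql                                        (builds the indexes afterwards)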

■■Direct-Path Export
exp gau001/XXX@gau00t file=e:\test.dmp tables=(XXX) direct=y recordlength=64000
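RECORDLENGTH sizes the I/O buffer for a direct-path export; a conventional-path export ignores it and is tuned with BUFFER instead. A hedged conventional-path sketch of the same command (buffer size is an assumption):
exp gau001/XXX@gau00t file=e:\test.dmp tables=(XXX) buffer=1048576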

Conditional (query-based) export with exp:
exp gau001/XXXX@gau00t file=e:\query.dmp  recordlength=64000 tables=(XXX) query=\"where rownum=1\"
Note: an exp command that uses the QUERY parameter cannot also specify DIRECT=y.



###############################################################################
2. 10g Data Pump

**********************************
■Data Pump Architecture:
A Data Pump job involves a number of processes, two queues, a number of files, and one table.
・When a Data Pump job is launched, at least two processes are started: a Data Pump Master process (the DMnn) and one or more worker processes (named DWnn).
・Two queues are created for each Data Pump job: a control queue and a status queue.
・The files generated by Data Pump come in three forms: SQL files, dump files, and log files.
・Finally, there is the control table (the "master table" seen in the logs below). Created for you by the DMnn when you launch a job, it is used both to record the job's progress and to describe it. It is included in the dump file as the final item of the job.
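These pieces are visible in the data dictionary while a job runs; a quick sketch using the standard 10g views:
SQL> SELECT owner_name, job_name, operation, job_mode, state FROM dba_datapump_jobs;
SQL> SELECT owner_name, job_name, saddr FROM dba_datapump_sessions;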

*********************
Data Pump has two methods for loading and unloading data: the direct path and the external table path.
What determines whether Data Pump uses the direct path or the external table path? You as DBA have no control: Data Pump itself makes the decision according to the complexity of the objects.

*********************
Samples:

expdp -?
impdp -?

■expdp
create or replace directory GAU001_DIR as '/tmp';
expdp gau001/gaudi@gaudi100 TABLES=test DUMPFILE=table.dmpe DIRECTORY=GAU001_DIR
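The directory object itself is created once (here by a privileged user); for an ordinary account such as GAU001 to use it, read/write must also be granted. A minimal sketch:
SQL> GRANT READ, WRITE ON DIRECTORY GAU001_DIR TO gau001;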

eg:********************************

C:\Users\LuKegui>expdp gau001/gaudi@gaudi100 TABLES=test DUMPFILE=table.dmpe directory =GAU001_DIR

Export: Release 10.2.0.3.0 - Production on Friday, 14 August, 2009 11:04:26

Copyright (c) 2003, 2005, Oracle.  All rights reserved.

Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
Starting "GAU001"."SYS_EXPORT_TABLE_01":  gau001/********@gaudi100 TABLES=test DUMPFILE=table.dmpe directory =GAU001_DIR
Estimate in progress using BLOCKS method...
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 13 MB
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
. . exported "GAU001"."TEST"                             8.568 MB 1000000 rows
Master table "GAU001"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
Dump file set for GAU001.SYS_EXPORT_TABLE_01 is:
  /tmp/table.dmpe
Job "GAU001"."SYS_EXPORT_TABLE_01" successfully completed at 02:16:13

■impdp 
impdp gau001/gaudi@gaudi100 TABLES=test DUMPFILE=table.dmpe DIRECTORY=GAU001_DIR
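impdp can also remap the data on the way in; a hedged sketch using the documented REMAP_SCHEMA parameter (the SYSTEM password and the target user gau002 are assumptions):
impdp system/manager@gaudi100 TABLES=gau001.test DUMPFILE=table.dmpe DIRECTORY=GAU001_DIR REMAP_SCHEMA=gau001:gau002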

eg:********************************

C:\Users\LuKegui>impdp gau001/gaudi@gaudi100 TABLES=test DUMPFILE=table.dmpe directory =GAU001_DIR

Import: Release 10.2.0.3.0 - Production on Friday, 14 August, 2009 11:17:08

Copyright (c) 2003, 2005, Oracle.  All rights reserved.

Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
Master table "GAU001"."SYS_IMPORT_TABLE_01" successfully loaded/unloaded
Starting "GAU001"."SYS_IMPORT_TABLE_01":  gau001/********@gaudi100 TABLES=test DUMPFILE=table.dmpe directory
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
. . imported "GAU001"."TEST"                             8.568 MB 1000000 rows
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Job "GAU001"."SYS_IMPORT_TABLE_01" successfully completed at 02:28:06

SQL> show user
USER is "GAU001"
SQL> select count(*) from test;

  COUNT(*)
----------
   1000000

SQL>
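While a job is still running you can reattach to it from another session and control it interactively; a hedged sketch against the export job above (job name taken from the log):
expdp gau001/gaudi@gaudi100 ATTACH=SYS_EXPORT_TABLE_01
Export> STATUS
Export> STOP_JOB=IMMEDIATE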

*************************************
■Using Data Pump in Network Mode

It is possible to use Data Pump to transfer data from one database to another, without staging the data on disk at all. Consider this example of launching Data Pump from the command-line utility impdp:
impdp userid=scott/tiger tables=dept,emp network_link=L1 directory=dp_dir
This command will launch a Data Pump export against the local database, logging on as user SCOTT with password TIGER. The Data Pump job will then read two tables, DEPT and EMP, in SCOTT's schema and transfer them across the network to the database identified by the database link L1, where a Data Pump import job will insert the data. The export and the import processes run concurrently, the one feeding the other.
Note that even though the data is never written out as an operating system file, it is still necessary to specify a directory so that Data Pump can write out logging information.
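For the example to work, the database link L1 must already exist in the local database; a minimal sketch (the TNS alias remote_db is an assumption):
SQL> CREATE DATABASE LINK L1 CONNECT TO scott IDENTIFIED BY tiger USING 'remote_db';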
***********************************

Transactions against the affected tables during a long-running Data Pump export job will not be included in the export file: the job is protected from any changes.
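If instead you need the dump to be consistent as of one explicit point in time, 10g expdp accepts a flashback point; a hedged sketch using the documented FLASHBACK_TIME parameter (the timestamp is an assumption, and shell quoting may vary by platform):
expdp gau001/gaudi@gaudi100 TABLES=test DUMPFILE=test_fb.dmp DIRECTORY=GAU001_DIR FLASHBACK_TIME=\"TO_TIMESTAMP('2009-08-14 11:00:00','YYYY-MM-DD HH24:MI:SS')\"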