Oracle Data Pump


Since we are all familiar with Oracle's original export (exp) utility, and in my opinion Data Pump will be replacing exp soon, I thought it would be good to start getting familiar with this utility through some relatively simple Data Pump exports (expdp) that mirror the way we have used exp in the past, in particular the FULL export.
As a word of caution, the dump files produced by Data Pump Export (expdp) are not compatible with exp. So as you go forth and play with this utility, please at least name the exports something that signifies the dump file was created by expdp so that you won't get confused. Also, since this utility is not backward compatible, if you have any databases prior to 10g and you are using exp, you may want to hold off on implementing the new expdp utility, as you will not be able to import its dump files into any pre-10g database.
Where are my dmp files?
Unlike the original export utility in Oracle, Data Pump runs only on the server side. You may initiate the export from a client, but the job(s) themselves run inside an Oracle server. No dump files (expdat.dmp) or log files are created on your local machine when you initiate a Data Pump Export (expdp) from there. But, as we will see through a couple of examples, if you have an Oracle server on your local machine you can get Oracle to produce the dump files there.
Oracle creates dump and log files through DIRECTORY objects, so before you can use Data Pump you must create a DIRECTORY object. There are a few "default" mechanisms by which Oracle can determine an appropriate DIRECTORY to use, mostly through environment variables and a default directory name that Oracle looks for. But as we all should know, we should not leave this to chance; instead we should explicitly create and use a directory object name of our choice. As soon as you rely on a default object, a DIRECTORY in this case, you open yourself up to security breaches, and the practice should be avoided. So here I have logged into my S1 database and will create a DIRECTORY named datapump.
SQL-S1> CREATE DIRECTORY datapump AS 'C:\user\datafile\datapump';
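If you want to verify the object and the path it points to, a quick query against DBA_DIRECTORIES (assuming you have the privileges to see that view) does the job:
SQL-S1> SELECT directory_name, directory_path FROM dba_directories;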
Then, as you use Data Pump, you can reference this DIRECTORY as a parameter for the export to say where the dump and log files should end up. It is good to note how these files are handled: log files of the same name are overwritten, but dump files of the same name raise an error condition and cause the Data Pump job to fail. This was not the case with Oracle's original export utility (exp), where subsequent exports simply overwrote all files. With Data Pump this is a nice safeguard, but it can also create problems for those of us who did nightly exports to the same location and file names; now we have to think about cleanup routines. A small price to pay for a safeguard that could save your life one day when a cleanup routine fails.
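As a minimal sketch of such a cleanup routine, assuming the DIRECTORY created above and the expdata.dmp dump file name used in the first example later in this article, you could remove the previous night's dump file through UTL_FILE just before the next export runs (FREMOVE raises an error if the file is not there, so a real routine would want to handle that case):
SQL-S1> EXEC UTL_FILE.FREMOVE('DATAPUMP', 'expdata.dmp');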
Just like the original exp utility, Data Pump requires some authorization before users can export. Here I am granting EXP_FULL_DATABASE to a user JKOOP on database S1, which allows the user to perform a full database export; without it, the JKOOP user could only export its own schema. I also need to grant READ and WRITE privileges on the recently created DIRECTORY, again on database S1.
SQL-S1> GRANT EXP_FULL_DATABASE to jkoop;
SQL-S1> GRANT READ, WRITE ON DIRECTORY datapump to jkoop;
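If you want to double-check what JKOOP ended up with before kicking off a job, something along these lines (assuming you can see the DBA views; note the directory object name is stored in uppercase) will confirm the role and the directory grants:
SQL-S1> SELECT granted_role FROM dba_role_privs WHERE grantee = 'JKOOP';
SQL-S1> SELECT grantee, privilege FROM dba_tab_privs WHERE table_name = 'DATAPUMP';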
Now for the Examples
We are all familiar with the FULL database export. Data Pump easily performs this with the following command line. Notice there are just a few name changes, and instead of specifying the directory path as part of the file locations, the additional DIRECTORY parameter is supplied. This command line assumes you are on the database server and environment variables are properly set for a direct connection.
E:> expdp jkoop/pwd FULL=y DIRECTORY=datapump DUMPFILE=expdata.dmp LOGFILE=expdata.log
We have also used the exp utility to connect through a TNS entry to perform an export of a remote database. Data Pump can easily do this as well by adding a connection identifier to the user/password parameter, exactly the same way it is done in exp.
E:> expdp jkoop/pwd@S1 FULL=y DIRECTORY=datapump DUMPFILE=byTNS.dmp LOGFILE=byTNS.log
Now for a few export tricks. These next two examples assume an additional database named S2 and rely on a database link from it to the target database we want to export. So the first thing to do is create a database link.
SQL-S2> CREATE DATABASE LINK S1 CONNECT TO JKOOP IDENTIFIED BY PWD USING 'S1';
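Before going any further, it is worth a quick sanity check that the link actually resolves and logs in; a trivial query through the link from the S2 side, assuming the S1 TNS entry is in place on that server, tells you right away:
SQL-S2> SELECT * FROM dual@S1;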
The key item to remember with Data Pump and where files end up is that wherever the Data Pump job runs, it requires a DIRECTORY in which to place the dump and log files. Since we will be connecting to the S2 database, a DIRECTORY is required there for placing these files. Here I create a new DIRECTORY named mydump on database S2.
SQL-S2> CREATE DIRECTORY mydump AS 'D:\mydump';
Now for the command line options. Here we are running on the server where database S2 resides and will be producing a full dump of database S1 through the NETWORK_LINK, but placing the dump and log files on the server where database S2 resides. This was great news for me, as when I first read the documentation I thought all dumps would have to reside on the server of the database being exported. Now I can almost produce an environment where a single database is a 'gateway' for my database exports if needed.
E:> expdp jkoop/pwd FULL=y DIRECTORY=mydump NETWORK_LINK=S1 DUMPFILE=byDB.dmp LOGFILE=byDB.log
Ok, suppose we do produce that gateway for exports. Do we need to execute all commands from that server? No! With Data Pump we need only connect to the S2 database through a TNS entry and then supply the appropriate NETWORK_LINK to the database we want to export.
E:> expdp jkoop/pwd@S2 FULL=y DIRECTORY=mydump NETWORK_LINK=S1 DUMPFILE=TNSDB.dmp LOGFILE=TNSDB.log
Introduction to Monitoring Data Pump
A simple way to gain insight into the status of a Data Pump job is to look at a few views maintained within the Oracle instance where the Data Pump job is running. These views are DBA_DATAPUMP_JOBS, DBA_DATAPUMP_SESSIONS, and V$SESSION_LONGOPS. They are critical in monitoring your export jobs so that, as we will see in a later article, you can attach to a Data Pump job and modify its execution.
DBA_DATAPUMP_JOBS
This view will show the active Data Pump jobs, their state, degree of parallelism, and the number of sessions attached.
SQL> select * from dba_datapump_jobs
OWNER_NAME JOB_NAME               OPERATION  JOB_MODE   STATE         DEGREE    ATTACHED_SESSIONS
---------- ---------------------- ---------- ---------- ------------- --------- -----------------
JKOOP      SYS_EXPORT_FULL_01     EXPORT     FULL       EXECUTING             1                 1
JKOOP      SYS_EXPORT_SCHEMA_01   EXPORT     SCHEMA     EXECUTING             1                 1

DBA_DATAPUMP_SESSIONS
This view gives the SADDR that assists in determining why a Data Pump session may be having problems. Join it to the V$SESSION view for further information.
SQL> SELECT * FROM DBA_DATAPUMP_SESSIONS
OWNER_NAME JOB_NAME                       SADDR
---------- ------------------------------ --------
JKOOPMANN  SYS_EXPORT_FULL_01             225BDEDC
JKOOPMANN  SYS_EXPORT_SCHEMA_01           225B2B7C
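As a rough example of that join (the V$SESSION columns you pull back are up to you), the SADDR column is what ties the two views together:
SQL> select s.sid, s.serial#, s.status, d.job_name from v$session s, dba_datapump_sessions d where s.saddr = d.saddr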
V$SESSION_LONGOPS
This view helps determine how well a Data Pump export is doing. Basically, it gives you a progress indicator through the MESSAGE column.
SQL> select username,opname,target_desc,sofar,totalwork,message from V$SESSION_LONGOPS
USERNAME OPNAME               TARGET_DES SOFAR TOTALWORK  MESSAGE
-------- -------------------- ---------- ----- ---------- ------------------------------------------------
JKOOP    SYS_EXPORT_FULL_01   EXPORT       132        132 SYS_EXPORT_FULL_01:EXPORT:132 out of 132 MB done
JKOOP    SYS_EXPORT_FULL_01   EXPORT        90        132 SYS_EXPORT_FULL_01:EXPORT:90 out of 132 MB done
JKOOP    SYS_EXPORT_SCHEMA_01 EXPORT        17         17 SYS_EXPORT_SCHEMA_01:EXPORT:17 out of 17 MB done
JKOOP    SYS_EXPORT_SCHEMA_01 EXPORT        19         19 SYS_EXPORT_SCHEMA_01:EXPORT:19 out of 19 MB done

The original export utility (exp) may or may not be going away soon. The documentation clearly states that Data Pump will handle data types that exp will not, and that we should begin our migration to this new utility, except for those instances where you must export between 10g and pre-10g databases. This article stepped through the process of performing FULL exports, as these are typical in Oracle environments. If you are doing schema or table exports the change is simple, and we will visit those in subsequent parts of this series.