By William Sescu
Nowadays, as the cloud becomes more and more important, the PL/SQL APIs become more and more important too. Fortunately, Oracle has quite a lot of them. For example, how do you run a Data Pump export if you have no ssh connectivity to the server? You could use the old exp tool, which is still available even with Oracle 12.2, or you can use DBMS_DATAPUMP. The Data Pump API is quite well documented in the following books:
Database Utilities
https://docs.oracle.com/database/122/SUTIL/using-ORACLE_DATAPUMP-api.htm#SUTIL600
Database PL/SQL Packages and Types Reference
https://docs.oracle.com/database/122/ARPLS/DBMS_DATAPUMP.htm#ARPLS66050
But you might find some useful stuff in $ORACLE_HOME/rdbms/admin/dbmsdp.sql as well.
In this little how-to, I would like to show how to create a consistent full database export (parallel 8) with the Data Pump API.
There are only a few steps involved to get the job done.
1. Create a directory and grant the necessary privileges to user HR
2. Grant the DATAPUMP_EXP_FULL_DATABASE role to user HR
3. Execute the Data Pump job with DBMS_DATAPUMP
4. Monitor the Data Pump job
5. Optionally, do some cleanup
1.) Let’s start with the directory.
SQL> CREATE OR REPLACE DIRECTORY DATAPUMP_DIR AS '/u01/app/oracle/admin/DBIT122/dpdump';

Directory created.

SQL> GRANT READ, WRITE ON DIRECTORY DATAPUMP_DIR TO HR;

Grant succeeded.
2.) Now we grant the DATAPUMP_EXP_FULL_DATABASE role to the HR user.

SQL> GRANT DATAPUMP_EXP_FULL_DATABASE TO HR;

Grant succeeded.
Please be aware that the DATAPUMP_EXP_FULL_DATABASE role affects only export operations. It allows the user HR to run these operations:
- Perform operations outside the scope of their own schema
- Monitor jobs that were initiated by another user
- Export objects (for example, TABLESPACE definitions) that unprivileged users cannot reference
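To verify that the grant is in place, you can query DBA_ROLE_PRIVS. This is just a quick sanity check, assuming you are connected as a suitably privileged user:

```sql
-- Check who holds the Data Pump export role
SELECT grantee, granted_role
  FROM dba_role_privs
 WHERE granted_role = 'DATAPUMP_EXP_FULL_DATABASE';
```

The HR user should show up in the result once the grant has been issued.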
Without this role, you might run into the following error when doing a full export:
ERROR at line 1:
ORA-31631: privileges are required
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
ORA-06512: at "SYS.DBMS_DATAPUMP", line 4932
ORA-06512: at "SYS.DBMS_DATAPUMP", line 6844
ORA-06512: at line 6
3.) Now it’s time to run the Data Pump job. Be aware that for a consistent export, you need to specify either FLASHBACK_TIME or FLASHBACK_SCN. In my case, I use FLASHBACK_TIME and set it to the current SYSTIMESTAMP.
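If you prefer to pin the export to an SCN instead of a timestamp, a minimal sketch could look like the following. Here l_datapump_handle stands for the handle returned by dbms_datapump.open, and I assume the EXECUTE privilege on DBMS_FLASHBACK:

```sql
DECLARE
  l_scn NUMBER;
BEGIN
  -- Grab the current SCN ...
  l_scn := dbms_flashback.get_system_change_number;
  -- ... and use FLASHBACK_SCN instead of FLASHBACK_TIME
  dbms_datapump.set_parameter(l_datapump_handle, 'FLASHBACK_SCN', l_scn);
END;
/
```

Both parameters are mutually exclusive; pick one or the other for a consistent export.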
Ok. Let’s give it a try.
SQL> connect hr/hr
Connected.
SQL> @exp_datapump.sql
SQL> declare
  2      l_datapump_handle    NUMBER;                          -- Data Pump job handle
  3      l_datapump_dir       VARCHAR2(20) := 'DATAPUMP_DIR';  -- Data Pump directory
  4      l_status             varchar2(200);                   -- Data Pump status
  5  BEGIN
  6      l_datapump_handle := dbms_datapump.open(operation => 'EXPORT', -- operation = EXPORT, IMPORT, SQL_FILE
  7                                              job_mode  => 'FULL',   -- job_mode = FULL, SCHEMA, TABLE, TABLESPACE, TRANSPORTABLE
  8                                              job_name  => 'DBIT122 EXPORT JOB RUN 003', -- job_name = NULL (default) or: job name (max 30 chars)
  9                                              version   => '12');    -- version = COMPATIBLE (default), LATEST (dbversion), a value (11.0.0 or 12)
 10
 11      dbms_datapump.add_file(handle    => l_datapump_handle,
 12                             filename  => 'exp_DBIT122_%U.dmp',
 13                             directory => l_datapump_dir);
 14
 15      dbms_datapump.add_file(handle    => l_datapump_handle,
 16                             filename  => 'exp_DBIT122.log',
 17                             directory => l_datapump_dir,
 18                             filetype  => DBMS_DATAPUMP.ku$_file_type_log_file);
 19
 20      dbms_datapump.set_parameter(l_datapump_handle,'CLIENT_COMMAND','Full Consistent Data Pump Export of DBIT122 with PARALLEL 8');
 21
 22      dbms_datapump.set_parameter(l_datapump_handle,'FLASHBACK_TIME','SYSTIMESTAMP');
 23
 24      dbms_datapump.set_parallel(l_datapump_handle,8);
 25
 26      dbms_datapump.start_job(handle => l_datapump_handle);
 27
 28      dbms_datapump.wait_for_job(handle    => l_datapump_handle,
 29                                 job_state => l_status);
 30
 31      dbms_output.put_line(l_status);
 32
 33  end;
 34  /

PL/SQL procedure successfully completed.

SQL>
4.) In another window, you might want to monitor the status of your export job.
SQL> r
  1  select owner_name, job_name, rtrim(operation) "OPERATION",
  2         rtrim(job_mode) "JOB_MODE", state, attached_sessions
  3    from dba_datapump_jobs
  4   where job_name not like 'BIN$%'
  5* order by 1,2

OWNER_NAME JOB_NAME                         OPERATION    JOB_MODE     STATE        ATTACHED_SESSIONS
---------- -------------------------------- ------------ ------------ ------------ -----------------
HR         DBIT122 EXPORT JOB RUN 003       EXPORT       FULL         EXECUTING                    1
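Instead of querying DBA_DATAPUMP_JOBS, you can also monitor the job through the API itself with DBMS_DATAPUMP.GET_STATUS. A minimal sketch, assuming the job from above is still running and you attach to it by name and owner:

```sql
SET SERVEROUTPUT ON
DECLARE
  l_handle    NUMBER;
  l_job_state VARCHAR2(30);
  l_status    ku$_Status;
BEGIN
  -- Attach to the running job by name and owner
  l_handle := dbms_datapump.attach(job_name  => 'DBIT122 EXPORT JOB RUN 003',
                                   job_owner => 'HR');
  -- Ask for the basic job status (non-blocking, timeout 0)
  dbms_datapump.get_status(handle    => l_handle,
                           mask      => dbms_datapump.ku$_status_job_status,
                           timeout   => 0,
                           job_state => l_job_state,
                           status    => l_status);
  dbms_output.put_line('Job state: ' || l_job_state);
  -- Detach from the job without stopping it
  dbms_datapump.detach(l_handle);
END;
/
```

DETACH only disconnects your session from the master table; the export itself keeps running.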
Cool. Once the job has finished successfully, you will see eight dump files, because we specified exp_DBIT122_%U.dmp as the file name template, plus one log file.
oracle@dbidg01:/u01/app/oracle/admin/DBIT122/dpdump/ [DBIT122] ls -l
total 4752
-rw-r----- 1 oracle oinstall  630784 Jun 13 10:29 exp_DBIT122_01.dmp
-rw-r----- 1 oracle oinstall 3321856 Jun 13 10:29 exp_DBIT122_02.dmp
-rw-r----- 1 oracle oinstall  180224 Jun 13 10:29 exp_DBIT122_03.dmp
-rw-r----- 1 oracle oinstall   57344 Jun 13 10:29 exp_DBIT122_04.dmp
-rw-r----- 1 oracle oinstall  430080 Jun 13 10:28 exp_DBIT122_05.dmp
-rw-r----- 1 oracle oinstall   20480 Jun 13 10:29 exp_DBIT122_06.dmp
-rw-r----- 1 oracle oinstall   28672 Jun 13 10:29 exp_DBIT122_07.dmp
-rw-r----- 1 oracle oinstall  176128 Jun 13 10:28 exp_DBIT122_08.dmp
-rw-r--r-- 1 oracle oinstall   11966 Jun 13 10:29 exp_DBIT122.log
5.) Finally, you might want to do some cleanup in case you don’t need the dump files and the log file anymore. Alternatively, you can start your export job with the REUSE_DUMPFILES=YES option, which overwrites the destination dump files if they already exist. In case you want to do the cleanup manually, you can use the UTL_FILE package.
SQL> exec utl_file.fremove( 'DATAPUMP_DIR', 'exp_DBIT122.log' );

PL/SQL procedure successfully completed.

SQL> exec utl_file.fremove( 'DATAPUMP_DIR', 'exp_DBIT122_01.dmp' );

PL/SQL procedure successfully completed.

SQL> exec utl_file.fremove( 'DATAPUMP_DIR', 'exp_DBIT122_02.dmp' );

PL/SQL procedure successfully completed.

SQL> exec utl_file.fremove( 'DATAPUMP_DIR', 'exp_DBIT122_03.dmp' );

PL/SQL procedure successfully completed.

SQL> exec utl_file.fremove( 'DATAPUMP_DIR', 'exp_DBIT122_04.dmp' );

PL/SQL procedure successfully completed.

SQL> exec utl_file.fremove( 'DATAPUMP_DIR', 'exp_DBIT122_05.dmp' );

PL/SQL procedure successfully completed.

SQL> exec utl_file.fremove( 'DATAPUMP_DIR', 'exp_DBIT122_06.dmp' );

PL/SQL procedure successfully completed.

SQL> exec utl_file.fremove( 'DATAPUMP_DIR', 'exp_DBIT122_07.dmp' );

PL/SQL procedure successfully completed.

SQL> exec utl_file.fremove( 'DATAPUMP_DIR', 'exp_DBIT122_08.dmp' );

PL/SQL procedure successfully completed.

SQL>
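If you go the REUSE_DUMPFILES route instead, the equivalent Data Pump API call is DBMS_DATAPUMP.SET_PARAMETER with a nonzero numeric value. A hypothetical fragment, where l_datapump_handle is the handle returned by dbms_datapump.open as in the export job above:

```sql
-- Overwrite pre-existing dump files instead of failing on them
dbms_datapump.set_parameter(handle => l_datapump_handle,
                            name   => 'REUSE_DUMPFILES',
                            value  => 1);  -- nonzero = reuse/overwrite existing dump files
```

This has to be set before start_job is called, alongside the other set_parameter calls.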
Conclusion
PL/SQL APIs become more and more important, especially in cloud environments. From my point of view, it makes a lot of sense to take a closer look at one or the other. DBMS_DATAPUMP in particular is an important one for moving data around.