I have to import specific tables using the DBMS_DATAPUMP package. How do I exclude the unwanted tables?


Question and Answer

Connor McDonald

Thanks for the question, Kumar.

Asked: March 02, 2017 - 7:47 am UTC

Last updated: March 22, 2017 - 6:41 am UTC

Version: 11.2.0.4

Viewed 10K+ times

You Asked

Hi Tom,

I have used the import parameter file below to exclude tables whose names contain $:

DIRECTORY=FEB09_E
DUMPFILE=MXPRD_01feb09_%U.dmp
LOGFILE=IMP_EDRPRE.log
REMAP_SCHEMA=MXPRD:MAXIMO
remap_tablespace=MXPRD_DATA:MAXIMO_DATA,MXPRD_INDEX:MAXIMO_INDEX
PARALLEL=8
exclude=TABLE:"LIKE '%$%'"

I have tried the same with DBMS_DATAPUMP, but it is excluding all the tables:

h :=DBMS_DATAPUMP.open('IMPORT','SCHEMA',null,'PRE_REFRESH');
DBMS_DATAPUMP.add_file(h,filename => 'exp_mxprd%u.dmp',directory => exp_dir);
DBMS_DATAPUMP.add_file(h,filename => 'imp_edr_maximo.log',directory=> exp_dir ,filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
DBMS_DATAPUMP.metadata_remap (h,'REMAP_SCHEMA','MXPRD','MAXIMO');
DBMS_DATAPUMP.metadata_remap (h,'REMAP_TABLESPACE','MXPRD_DATA','MAXIMO_DATA');
DBMS_DATAPUMP.metadata_remap (h,'REMAP_TABLESPACE','MXPRD_INDEX','MAXIMO_INDEX');
DBMS_DATAPUMP.metadata_filter (h,'NAME_EXPR', value => 'NOT LIKE ''%$%''', object_type => 'TABLE');
DBMS_DATAPUMP.set_parallel(h,degree => 8);
DBMS_DATAPUMP.start_job(h);
DBMS_DATAPUMP.WAIT_FOR_JOB (h, status);


Please help me find the equivalent of exclude=TABLE:"LIKE '%$%'" in DBMS_DATAPUMP.

and Connor said...

Sorry, I can't replicate your problem

SQL> create table scott.emp$ as select * from scott.emp;

Table created.

SQL> select table_name from all_tables
  2  where owner = 'SCOTT'
  3  order by 1;

TABLE_NAME
------------------------------
BONUS
DEPT
EMP
EMP$
SALGRADE

SQL> set serveroutput on size 1000000
SQL> declare
  2    l_jobid       number;
  3  begin
  4    l_jobid := dbms_datapump.open(
  5      operation   => 'EXPORT',
  6      job_mode    => 'SCHEMA',
  7      job_name    => 'SCOTT_DP'
  8      );
  9
 10    dbms_datapump.add_file(
 11      handle    => l_jobid,
 12      filename  => 'scott.dmp',
 13      directory => 'MY_DIR');
 14
 15    dbms_datapump.add_file(
 16      handle    => l_jobid,
 17      filename  => 'scott.log',
 18      directory => 'MY_DIR',
 19      filetype  => dbms_datapump.ku$_file_type_log_file);
 20
 21    dbms_datapump.metadata_filter(
 22      handle => l_jobid,
 23      name   => 'SCHEMA_EXPR',
 24      value  => '= ''SCOTT''');
 25
 26    dbms_datapump.metadata_filter(
 27      handle      => l_jobid,
 28      name        => 'NAME_EXPR',
 29      value       => 'NOT LIKE ''%$%''',
 30      object_type => 'TABLE');
 31
 32    dbms_datapump.start_job(l_jobid);
 33
 34    dbms_datapump.detach(l_jobid);
 35  end;
 36  /

PL/SQL procedure successfully completed.

---
---  scott.log
---
FLASHBACK automatically enabled to preserve database integrity.
Starting "MCDONAC"."SCOTT_DP":  
Estimate in progress using BLOCKS method...
...
...
Processing object type SCHEMA_EXPORT/POST_SCHEMA/PROCACT_SCHEMA
. . exported "SCOTT"."DEPT"                              6.031 KB       4 rows
. . exported "SCOTT"."EMP"                               8.781 KB      14 rows
. . exported "SCOTT"."SALGRADE"                          5.960 KB       5 rows
. . exported "SCOTT"."BONUS"                                 0 KB       0 rows
Master table "MCDONAC"."SCOTT_DP" successfully loaded/unloaded
******************************************************************************
Dump file set for MCDONAC.SCOTT_DP is:
  C:\TEMP\SCOTT.DMP
Job "MCDONAC"."SCOTT_DP" successfully completed at Tue Mar 7 15:16:07 2017 elapsed 0 00:00:27



It might be version specific, so take an example like mine and build a test case and get in touch with Support.


Comments

Export is working and import is not working

Kumar, March 09, 2017 - 8:59 am UTC

Thanks, Connor, for your answer. Export is working with the code below, but import is not working.


declare
exp_dir varchar2(32) :='TESTMNP';
table_list varchar2(3000);
status varchar2(200);
h number;
begin
h :=DBMS_DATAPUMP.open('EXPORT','SCHEMA',null,'PRE_REFRESH');
DBMS_DATAPUMP.metadata_filter(h, name => 'SCHEMA_EXPR', value => 'IN(''KUMAR'')');
DBMS_DATAPUMP.add_file(h,filename => 'exp1_mxprd01.dmp',directory => exp_dir);
DBMS_DATAPUMP.add_file(h,filename => 'exp1_edr_maximo.log',directory=> exp_dir ,filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
DBMS_DATAPUMP.metadata_filter (h,'NAME_EXPR', value => 'NOT LIKE ''%$%''', object_type => 'TABLE');
DBMS_DATAPUMP.set_parallel(h,degree => 8);
DBMS_DATAPUMP.start_job(h);
DBMS_DATAPUMP.WAIT_FOR_JOB (h, status);
end;
/

With this code I am able to export the expected tables.

The export log file is:

Starting "SYS"."PRE_REFRESH":
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 256 KB
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TABLE/TABLE
. . exported "KUMAR"."HR_TERMS_BK20170102" 91.15 KB 2518 rows
. . exported "KUMAR"."POWERFUL_USERS_SMP" 5.562 KB 40 rows
. . exported "KUMAR"."USERS_EFIMPRD" 16.23 KB 117 rows
Master table "SYS"."PRE_REFRESH" successfully loaded/unloaded
******************************************************************************
Dump file set for SYS.PRE_REFRESH is:
/dbashare/MAXIMO/refresh/TEST/exp1_mxprd01.dmp
Job "SYS"."PRE_REFRESH" successfully completed at Thu Mar 9 02:42:41 2017 elapsed 0 00:01:13



When I try to import, all the tables are getting excluded:

declare
exp_dir varchar2(32) :='TESTMNP';
table_list varchar2(3000);
status varchar2(200);
h number;
begin
h :=DBMS_DATAPUMP.open('IMPORT','SCHEMA',null,'PRE_REFRESH11');
DBMS_DATAPUMP.add_file(h,filename => 'exp_mxprd01.dmp',directory => exp_dir);
DBMS_DATAPUMP.add_file(h,filename => 'imp_edr_maximo.log',directory=> exp_dir ,filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
DBMS_DATAPUMP.metadata_remap (h,'REMAP_SCHEMA','TEST11','KUMAR');
DBMS_DATAPUMP.metadata_remap (h,'REMAP_TABLESPACE','TESTTBS','KUMARTBS');
DBMS_DATAPUMP.metadata_filter (h,'NAME_EXPR', value => 'NOT LIKE ''%$%''', object_type => 'TABLE');
DBMS_DATAPUMP.set_parallel(h,degree => 8);
DBMS_DATAPUMP.start_job(h);
DBMS_DATAPUMP.WAIT_FOR_JOB (h, status);
end;
/

The import log file is:

Master table "SYS"."PRE_REFRESH11" successfully loaded/unloaded
Starting "SYS"."PRE_REFRESH11":
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Job "SYS"."PRE_REFRESH11" successfully completed at Thu Mar 9 02:47:00 2017 elapsed 0 00:00:02

Chris Saxon
March 09, 2017 - 2:07 pm UTC

If you excluded the $ tables in your export, why do you need to exclude them again in your import? Surely this parameter is unnecessary?

What an answer

A reader, March 09, 2017 - 5:50 pm UTC

The point is not whether the export includes or excludes them; the main problem is being missed.

I'll try to ask it differently.

Say I export all tables, including emp$, and there are lots of T$ tables.

Q: How do I exclude these from the import?

Connor McDonald
March 13, 2017 - 3:12 am UTC

It works on import too, as you'd expect, e.g.

C:\tmp>expdp directory=tmp dumpfile=scott.dmp schemas=scott

Export: Release 12.1.0.2.0 - Production on Mon Mar 13 11:06:07 2017

Copyright (c) 1982, 2016, Oracle and/or its affiliates.  All rights reserved.

Username: mcdonac/********

Connected to: Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options
FLASHBACK automatically enabled to preserve database integrity.
Starting "MCDONAC"."SYS_EXPORT_SCHEMA_01":  mcdonac/******** directory=tmp dumpfile=scott.dmp schemas=scott
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 256 KB
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type SCHEMA_EXPORT/STATISTICS/MARKER
. . exported "SCOTT"."DEPT"                              6.023 KB       4 rows
. . exported "SCOTT"."EMP"                               8.773 KB      14 rows
. . exported "SCOTT"."EMP$"                              8.789 KB      14 rows
. . exported "SCOTT"."SALGRADE"                          5.953 KB       5 rows
. . exported "SCOTT"."BONUS"                                 0 KB       0 rows
Master table "MCDONAC"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
******************************************************************************
Dump file set for MCDONAC.SYS_EXPORT_SCHEMA_01 is:
  C:\TMP\SCOTT.DMP
Job "MCDONAC"."SYS_EXPORT_SCHEMA_01" successfully completed at Mon Mar 13 11:06:34 2017 elapsed 0 00:00:24


SQL> declare
  2    l_jobid       number;
  3  begin
  4    l_jobid := dbms_datapump.open(
  5      operation   => 'IMPORT',
  6      job_mode    => 'SCHEMA',
  7      job_name    => 'SCOTT_DP'
  8      );
  9
 10    dbms_datapump.add_file(
 11      handle    => l_jobid,
 12      filename  => 'scott.dmp',
 13      directory => 'TMP');
 14
 15    dbms_datapump.add_file(
 16      handle    => l_jobid,
 17      filename  => 'scott.log',
 18      directory => 'TMP',
 19      filetype  => dbms_datapump.ku$_file_type_log_file);
 20
 21    dbms_datapump.metadata_filter(
 22      handle => l_jobid,
 23      name   => 'SCHEMA_EXPR',
 24      value  => '= ''SCOTT''');
 25
 26    dbms_datapump.metadata_remap(l_jobid,
 27                                 'REMAP_SCHEMA',
 28                                 'SCOTT',
 29                                 'SCOTT2');
 30
 31    dbms_datapump.metadata_filter(
 32      handle      => l_jobid,
 33      name        => 'NAME_EXPR',
 34      value       => 'NOT LIKE ''%$%''',
 35      object_type => 'TABLE');
 36
 37    dbms_datapump.start_job(l_jobid);
 38
 39    dbms_datapump.detach(l_jobid);
 40  end;
 41  /

PL/SQL procedure successfully completed.

C:\tmp>cat scott.log
Master table "MCDONAC"."SCOTT_DP" successfully loaded/unloaded
Starting "MCDONAC"."SCOTT_DP":  
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
. . imported "SCOTT2"."DEPT"                             6.023 KB       4 rows
. . imported "SCOTT2"."EMP"                              8.773 KB      14 rows
. . imported "SCOTT2"."SALGRADE"                         5.953 KB       5 rows
. . imported "SCOTT2"."BONUS"                                0 KB       0 rows
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type SCHEMA_EXPORT/STATISTICS/MARKER
Job "MCDONAC"."SCOTT_DP" successfully completed at Mon Mar 13 11:11:02 2017 elapsed 0 00:00:13





Kumar AV, March 13, 2017 - 3:47 am UTC

Hi,

The issue is fixed by using the object_path parameter instead of object_type:

DBMS_DATAPUMP.metadata_filter (h,'NAME_EXPR', value => 'NOT LIKE ''%$%''', object_path => 'SCHEMA_EXPORT\TABLE');

I am exporting data from one database and importing it into 3 databases, and I am excluding the tables in only one of the 3 databases.
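
For reference, a minimal sketch of what the corrected import block looks like with that change applied: it reuses the directory, dump file, job name and remap values from the earlier post, and only the filter call is switched to object_path as reported. It has not been re-run here, so treat it as illustrative rather than verified.

declare
  exp_dir varchar2(32) := 'TESTMNP';
  status  varchar2(200);
  h       number;
begin
  h := DBMS_DATAPUMP.open('IMPORT','SCHEMA',null,'PRE_REFRESH11');
  DBMS_DATAPUMP.add_file(h, filename => 'exp_mxprd01.dmp', directory => exp_dir);
  DBMS_DATAPUMP.add_file(h, filename => 'imp_edr_maximo.log', directory => exp_dir,
                         filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
  DBMS_DATAPUMP.metadata_remap(h, 'REMAP_SCHEMA', 'TEST11', 'KUMAR');
  DBMS_DATAPUMP.metadata_remap(h, 'REMAP_TABLESPACE', 'TESTTBS', 'KUMARTBS');
  -- the reported fix: scope the NAME_EXPR filter via object_path
  -- rather than object_type => 'TABLE'
  DBMS_DATAPUMP.metadata_filter(h, 'NAME_EXPR', value => 'NOT LIKE ''%$%''',
                                object_path => 'SCHEMA_EXPORT\TABLE');
  DBMS_DATAPUMP.set_parallel(h, degree => 8);
  DBMS_DATAPUMP.start_job(h);
  DBMS_DATAPUMP.WAIT_FOR_JOB(h, status);
end;
/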

any special privilege required

Rajeshwaran, Jeyabal, March 13, 2017 - 9:27 am UTC

Team - I was able to do the export successfully, but the import failed. (I am on 11.2.0.4 in a Windows environment.)

D:\app\Vnameit\product\11.2.0\dbhome_1\BIN>expdp scott/tiger@lt035221/ORA11g directory=TMP logfile=expdplog.txt schemas=SCOTT

Export: Release 11.2.0.4.0 - Production on Mon Mar 13 12:08:15 2017

Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "SCOTT"."SYS_EXPORT_SCHEMA_01":  scott/********@lt035221/ORA11g directory=TMP logfile=expdplog.txt schemas=SCOTT
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 256 KB
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
Processing object type SCHEMA_EXPORT/TABLE/COMMENT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
. . exported "SCOTT"."DEPT"                              5.937 KB       4 rows
. . exported "SCOTT"."EMP"                               8.570 KB      14 rows
. . exported "SCOTT"."EMP$"                              8.570 KB      14 rows
. . exported "SCOTT"."SALGRADE"                          5.867 KB       5 rows
. . exported "SCOTT"."BONUS"                                 0 KB       0 rows
Master table "SCOTT"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
******************************************************************************
Dump file set for SCOTT.SYS_EXPORT_SCHEMA_01 is:
  D:\EXPDAT.DMP
Job "SCOTT"."SYS_EXPORT_SCHEMA_01" successfully completed at Mon Mar 13 12:08:49 2017 elapsed 0 00:00:29


D:\app\Vnameit\product\11.2.0\dbhome_1\BIN>

While importing using DBMS_DATAPUMP I got this error:
demo@ORA11G> declare
  2     l_job int;
  3  begin
  4     l_job := dbms_datapump.open(
  5             operation=>'IMPORT',
  6             job_mode=>'SCHEMA',
  7             job_name=>'DEMO_IMPORT');
  8
  9     dbms_datapump.add_file(
 10             handle=>l_job,
 11             filename=>'EXPDAT.DMP',
 12             directory=>'TMP');
 13
 14     dbms_datapump.add_file(
 15             handle=>l_job,
 16             filename=>'implog.txt',
 17             directory=>'TMP',
 18             filetype=>dbms_datapump.KU$_FILE_TYPE_LOG_FILE);
 19
 20     dbms_datapump.metadata_remap(
 21             handle=>l_job,
 22             name=>'REMAP_SCHEMA',
 23             old_value=>'SCOTT',
 24             value=>'DEMO');
 25
 26     dbms_datapump.metadata_filter(
 27             handle=>l_job,
 28             name=> 'NAME_EXPR',
 29             value=> q'| not like '%$%' |',
 30             object_type=>'TABLE');
 31
 32     dbms_datapump.start_job(l_job);
 33
 34     dbms_datapump.detach(l_job);
 35  end;
 36  /
declare
*
ERROR at line 1:
ORA-31626: job does not exist
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
ORA-06512: at "SYS.DBMS_DATAPUMP", line 1137
ORA-06512: at "SYS.DBMS_DATAPUMP", line 5285
ORA-06512: at line 4


demo@ORA11G>

I tried debugging from SQL Developer and found that it was due to the "open" call in the dbms_datapump package.
demo@ORA11G> declare
  2     l_job int;
  3  begin
  4     l_job := dbms_datapump.open(
  5             operation=>'IMPORT',
  6             job_mode=>'SCHEMA',
  7             job_name=>'DEMO_IMPORT');
  8     dbms_output.put_line( 'l_job = '|| l_job );
  9  end;
 10  /
declare
*
ERROR at line 1:
ORA-31626: job does not exist
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
ORA-06512: at "SYS.DBMS_DATAPUMP", line 1137
ORA-06512: at "SYS.DBMS_DATAPUMP", line 5285
ORA-06512: at line 4


demo@ORA11G>

Do I need any special privilege to complete this import?
Connor McDonald
March 14, 2017 - 3:36 am UTC

Someone answered this for you :-)

Does the user have correct privileges

Stephen, March 13, 2017 - 10:44 pm UTC

@Jeyabal, does the demo user have the CREATE TABLE privilege directly granted? (See "DataPump API Fails With Error ORA-31626 Job Does Not Exist", Doc ID 315488.1.)

grant create session, create table, create procedure, exp_full_database, imp_full_database to demo;
grant read, write on directory my_dump_dir to demo;


IHTH,
Stephen
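
A quick way to verify this (an aside, not from the MOS note itself): USER_SYS_PRIVS lists only the system privileges granted directly to the current user, while SESSION_PRIVS also shows privileges that arrive through roles, so CREATE TABLE should show up in both queries below.

-- privileges granted directly to the logged-in user
select privilege from user_sys_privs order by 1;

-- privileges currently in effect, including those obtained via roles
select privilege from session_privs order by 1;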

any special privilege required

Rajeshwaran, March 14, 2017 - 2:45 am UTC

Stephen - thanks for pointing that out.

The user DEMO has the default tablespace TS_DATA, while the exported objects were in the USERS tablespace. On import, the user DEMO tried to create the objects in the USERS tablespace, and that failed because DEMO doesn't have any quota on USERS.

After introducing the REMAP_TABLESPACE parameter, invoking DBMS_DATAPUMP produces output like this:

rajesh@ORA11G> conn demo/demo@ora11g
Connected.
demo@ORA11G> declare
  2     l_job int;
  3  begin
  4     l_job := dbms_datapump.open(
  5             operation=>'IMPORT',
  6             job_mode=>'SCHEMA',
  7             job_name=>'DEMO_IMPORT');
  8
  9     dbms_datapump.add_file(
 10             handle=>l_job,
 11             filename=>'EXPDAT.DMP',
 12             directory=>'TMP');
 13
 14     dbms_datapump.add_file(
 15             handle=>l_job,
 16             filename=>'implog2.txt',
 17             directory=>'TMP',
 18             filetype=>dbms_datapump.KU$_FILE_TYPE_LOG_FILE);
 19
 20     dbms_datapump.metadata_remap(
 21             handle=>l_job,
 22             name=>'REMAP_SCHEMA',
 23             old_value=>'SCOTT',
 24             value=>'DEMO');
 25
 26     dbms_datapump.metadata_remap(
 27             handle=>l_job,
 28             name=>'REMAP_TABLESPACE',
 29             old_value=>'USERS',
 30             value=>'TS_DATA');
 31
 32     dbms_datapump.metadata_filter(
 33             handle=>l_job,
 34             name=> 'NAME_EXPR',
 35             value=> 'NOT LIKE ''%$%''',
 36             object_type=>'TABLE');
 37
 38     dbms_datapump.start_job(l_job);
 39
 40     dbms_datapump.detach(l_job);
 41  end;
 42  /

PL/SQL procedure successfully completed.

demo@ORA11G> $type d:\implog2.txt
Master table "DEMO"."DEMO_IMPORT" successfully loaded/unloaded
Starting "DEMO"."DEMO_IMPORT":
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Job "DEMO"."DEMO_IMPORT" successfully completed at Tue Mar 14 08:03:58 2017 elapsed 0 00:00:02

demo@ORA11G> select table_name from user_tables;

TABLE_NAME
------------------------------
BIG_TABLE

demo@ORA11G>


I tried to invoke IMPDP from the command line and got this:

D:\app\Vnameit\product\11.2.0\dbhome_1\BIN>impdp demo/demo@LT035221/ORA11g directory=TMP dumpfile=EXPDAT.dmp remap_schema=scott:demo schemas=scott remap_tablespace=USERS:TS_DATA logfile=implog.txt

Import: Release 11.2.0.4.0 - Production on Tue Mar 14 08:05:21 2017

Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Master table "DEMO"."SYS_IMPORT_SCHEMA_01" successfully loaded/unloaded
Starting "DEMO"."SYS_IMPORT_SCHEMA_01":  demo/********@LT035221/ORA11g directory=TMP dumpfile=EXPDAT.dmp remap_schema=scott:demo schemas=scott remap_tablespace=USERS:TS_DA
TA logfile=implog.txt
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
. . imported "DEMO"."DEPT"                               5.937 KB       4 rows
. . imported "DEMO"."EMP"                                8.570 KB      14 rows
. . imported "DEMO"."EMP$"                               8.570 KB      14 rows
. . imported "DEMO"."SALGRADE"                           5.867 KB       5 rows
. . imported "DEMO"."BONUS"                                  0 KB       0 rows
Processing object type SCHEMA_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Job "DEMO"."SYS_IMPORT_SCHEMA_01" successfully completed at Tue Mar 14 08:05:26 2017 elapsed 0 00:00:04


When I included the EXCLUDE parameter it went like this:

D:\app\Vnameit\product\11.2.0\dbhome_1\BIN>impdp demo/demo@LT035221/ORA11g directory=TMP dumpfile=EXPDAT.dmp remap_schema=scott:demo schemas=scott remap_tablespace=USERS:TS_DATA logfile=implog.txt EXCLUDE=TABLE:" like '%$%' "

Import: Release 11.2.0.4.0 - Production on Tue Mar 14 08:05:43 2017

Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
UDI-00014: invalid value for parameter, 'exclude'


D:\app\Vnameit\product\11.2.0\dbhome_1\BIN>


Help me understand these questions.

Questions:

1) Why doesn't DBMS_DATAPUMP import the objects, while IMPDP does? Am I missing something with DBMS_DATAPUMP here?

2) Why did the EXCLUDE parameter for IMPDP result in an error?

Connor McDonald
March 22, 2017 - 6:41 am UTC

Add a SCHEMA_EXPR as well

SQL> create user demo identified by demo;

User created.

SQL> grant connect, resource to demo;

Grant succeeded.

SQL> alter user demo quota unlimited on demo;

User altered.

SQL> alter user demo quota unlimited on users;

User altered.

SQL>
SQL> declare
  2         l_job int;
  3      begin
  4         l_job := dbms_datapump.open(
  5                 operation=>'IMPORT',
  6                 job_mode=>'SCHEMA',
  7                 job_name=>'DEMO_IMPORT');
  8
  9         dbms_datapump.add_file(
 10                handle=>l_job,
 11                filename=>'scott.dmp',
 12                directory=>'TEMP');
 13
 14        dbms_datapump.add_file(
 15                handle=>l_job,
 16                filename=>'implog'||to_char(sysdate,'HH24MISS')||'.txt',
 17                directory=>'TEMP',
 18                filetype=>dbms_datapump.KU$_FILE_TYPE_LOG_FILE);
 19
 20        dbms_datapump.metadata_filter(
 21          handle => l_job,
 22          name   => 'SCHEMA_EXPR',
 23           value  => '= ''SCOTT''');
 24
 25        dbms_datapump.metadata_remap(
 26                handle=>l_job,
 27                name=>'REMAP_SCHEMA',
 28                old_value=>'SCOTT',
 29                value=>'DEMO');
 30
 31        dbms_datapump.metadata_remap(
 32                handle=>l_job,
 33                name=>'REMAP_TABLESPACE',
 34                old_value=>'USERS',
 35                value=>'DEMO');
 36
 37        dbms_datapump.metadata_filter(
 38          handle      => l_job,
 39          name        => 'NAME_EXPR',
 40          value       => 'NOT LIKE ''%$%''',
 41        object_type => 'TABLE');
 42
 43        dbms_datapump.start_job(l_job);
 44
 45        dbms_datapump.detach(l_job);
 46     end;
 47     /

PL/SQL procedure successfully completed.

SQL>
SQL> host cat c:\temp\imp143412.txt

Master table "MCDONAC"."DEMO_IMPORT" successfully loaded/unloaded
Starting "MCDONAC"."DEMO_IMPORT":  
Processing object type SCHEMA_EXPORT/USER
ORA-31684: Object type USER:"DEMO" already exists
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
Processing object type SCHEMA_EXPORT/ROLE_GRANT
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/TABLESPACE_QUOTA
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/SYNONYM/SYNONYM
Processing object type SCHEMA_EXPORT/PROCEDURE/PROCEDURE
Processing object type SCHEMA_EXPORT/PROCEDURE/ALTER_PROCEDURE
Processing object type SCHEMA_EXPORT/VIEW/VIEW
Processing object type SCHEMA_EXPORT/STATISTICS/MARKER
Job "MCDONAC"."DEMO_IMPORT" completed with 1 error(s) at Wed Mar 22 14:34:12 2017 elapsed 0 00:00:03

SQL>
SQL>    select segment_name, tablespace_name
  2  from dba_segments
  3  where owner = 'DEMO';

SEGMENT_NAME                   TABLESPACE_NAME
------------------------------ ------------------------
EMP                            DEMO
DEPT                           DEMO
SALGRADE                       DEMO
EMP_PK                         DEMO
DEPT_PK                        DEMO


any special privilege required

Rajeshwaran, March 15, 2017 - 5:35 pm UTC

I tried with a parameter file and it went fine. Not sure why the EXCLUDE parameter doesn't work with values provided on the command line.

D:\app\Vnameit\product\11.2.0\dbhome_1\BIN>impdp demo/demo@LT035221/ORA11g parfile=d:\pat.txt

Import: Release 11.2.0.4.0 - Production on Wed Mar 15 22:59:54 2017

Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Master table "DEMO"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
Starting "DEMO"."SYS_IMPORT_FULL_01":  demo/********@LT035221/ORA11g parfile=d:\pat.txt
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
. . imported "DEMO"."DEPT"                               5.937 KB       4 rows
. . imported "DEMO"."EMP"                                8.570 KB      14 rows
. . imported "DEMO"."SALGRADE"                           5.867 KB       5 rows
. . imported "DEMO"."BONUS"                                  0 KB       0 rows
Processing object type SCHEMA_EXPORT/TABLE/GRANT/OWNER_GRANT/OBJECT_GRANT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/INDEX
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type SCHEMA_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Job "DEMO"."SYS_IMPORT_FULL_01" successfully completed at Wed Mar 15 23:00:26 2017 elapsed 0 00:00:19


D:\app\Vnameit\product\11.2.0\dbhome_1\BIN>type d:\pat.txt
directory=TMP dumpfile=EXPDAT.dmp logfile=implog.txt remap_schema=scott:demo remap_tablespace=USERS:TS_DATA EXCLUDE=TABLE:"LIKE '%$'"
D:\app\Vnameit\product\11.2.0\dbhome_1\BIN>

Connor McDonald
March 16, 2017 - 1:24 am UTC

Generally it's when the OS or terminal interpreter "messes" with the quotes.

Often you end up having to type things like:

EXCLUDE=TABLE:\"LIKE \'%$\'\"

etc. on Unix, and similar escape sequences on Windows.

Which is why I'm a fan of parameter files.

I'm still looking at your first question ("Why doesn't DBMS_DATAPUMP import the objects, while IMPDP does?") - we haven't forgotten it :-)

More to Explore

Data Pump

All of the database utilities, including Data Pump, are explained in the Utilities guide.