Question and Answer

Connor McDonald

Thanks for the question, Oracle.

Asked: July 24, 2016 - 4:27 pm UTC

Last updated: August 11, 2016 - 6:56 am UTC

Version: 11.2.0.2

Viewed 10K+ times

You Asked

Hi Tom,

I have a schema export with expdp with this -

expdp user/password@test directory=dump_dir dumpfile=user.dmp logfile=user.log schemas=user

The user has 100 tables, 50 procedures, and triggers. Now I want to import this into another database. How do I import it if the user is already there with tables and procedures? Before the import, do we have to drop the existing schema and then import, or can we simply import like this:

impdp user/password@test directory=dump_dir dumpfile=user.dmp logfile=user.log schemas=user

Please suggest.

Thanks

and Connor said...

If you want to *replace* everything that is there with what is in the dumpfile, then yes, it would be safest to drop everything first. Otherwise if you have (say) object "A" already existing, and that object is *not* present in the dumpfile, then after importing, you would still have "A" present.
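
For example, a minimal sketch of the drop-first approach (SCOTT is an illustrative schema name):

SQL> drop user scott cascade;

User dropped.

The CASCADE clause drops all of the schema's objects along with the user, so the subsequent import starts from a clean slate.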

But also check out the TABLE_EXISTS_ACTION parameter for data pump import.
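
For example, a sketch reusing the illustrative names from the question; TABLE_EXISTS_ACTION=REPLACE drops and recreates each matching table from the dumpfile (SKIP, APPEND and TRUNCATE are the other options):

impdp user/password@test directory=dump_dir dumpfile=user.dmp logfile=user.log schemas=user table_exists_action=replace

Note that this only affects tables present in the dumpfile; it will not remove objects that exist only in the target schema, which is why dropping first is the safest way to get an exact copy.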


Comments

A reader, July 25, 2016 - 4:57 am UTC

Thanks for your quick reply.

I have submitted two or three more questions regarding clusters and RMAN restoration, but they are not showing on your site.

Thanks
Connor McDonald
July 25, 2016 - 6:58 pm UTC

Questions only appear once they are answered by us.

You should have received an email with the link to them.

A reader, August 06, 2016 - 5:04 pm UTC

Hi,

Thanks for the reply. One more doubt:

I have a database from which I want to export a user schema and import it into another database. For that, do we need to drop the existing user and import directly? Or do we need to create the user and its grants before importing the schema? Or, if we drop the user, will importing the exported schema create the user?

Thanks
Connor McDonald
August 08, 2016 - 7:12 am UTC

Datapump will create the user, eg

SQL> drop user demo cascade;

User dropped.

SQL> create user demo identified by demo;

User created.

SQL> alter user demo quota 100m on users;

User altered.

SQL> create table demo.t (x int);

Table created.



C:\Users\comcdona>expdp directory=temp dumpfile=user.dmp logfile=user.log schemas=demo

Export: Release 11.2.0.4.0 - Production on Mon Aug 8 15:10:59 2016

Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.

Username: / as sysdba

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "SYS"."SYS_EXPORT_SCHEMA_01":  /******** AS SYSDBA directory=temp dumpfile=user.dmp logfile=user.log schemas=demo
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 0 KB
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/TABLESPACE_QUOTA
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TABLE/TABLE
. . exported "DEMO"."T"                                      0 KB       0 rows
Master table "SYS"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
******************************************************************************
Dump file set for SYS.SYS_EXPORT_SCHEMA_01 is:
  C:\TEMP\USER.DMP
Job "SYS"."SYS_EXPORT_SCHEMA_01" successfully completed at Mon Aug 8 15:11:26 2016 elapsed 0 00:00:14


--
-- and now onto my other database (which has no DEMO user)
--

C:\Users\comcdona>impdp directory=temp dumpfile=user.dmp logfile=user.log schemas=demo

Import: Release 12.1.0.2.0 - Production on Mon Aug 8 15:12:00 2016

Copyright (c) 1982, 2016, Oracle and/or its affiliates.  All rights reserved.

Username: / as sysdba

Connected to: Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
With the Partitioning, Oracle Label Security, OLAP, Advanced Analytics
and Real Application Testing options
Master table "SYS"."SYS_IMPORT_SCHEMA_01" successfully loaded/unloaded
import done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
export done in AL32UTF8 character set and AL16UTF16 NCHAR character set
WARNING: possible data loss in character set conversions
Starting "SYS"."SYS_IMPORT_SCHEMA_01":  /******** AS SYSDBA directory=temp dumpfile=user.dmp logfile=user.log schemas=demo
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/TABLESPACE_QUOTA
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
. . imported "DEMO"."T"                                      0 KB       0 rows
Job "SYS"."SYS_IMPORT_SCHEMA_01" successfully completed at Mon Aug 8 15:12:15 2016 elapsed 0 00:00:04






A reader, August 09, 2016 - 4:05 am UTC

Thanks for giving suggestions with an example.

So finally we can say that a schema export also exports the user creation, so there is no need to create the user first.


Thanks
Connor McDonald
August 10, 2016 - 1:25 am UTC

For datapump yes.
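
Note that for Data Pump to create the user during a schema import, the import has to run with sufficient privileges (hence the "/ as sysdba" login in the demo above); an ordinary user cannot create other users on import. A minimal sketch, with an illustrative connect string:

impdp system@db2 directory=temp dumpfile=user.dmp logfile=user.log schemas=demo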

A reader, August 09, 2016 - 1:11 pm UTC

Hi team,

Thanks for reply.

Now I have a normal traditional exp (export) of schemas, and I want to import only the table data into an existing table.

I truncated the table before the import, so the table has 0 records.

Now I try to import into the existing table from the exported schemas, like this:

imp user@db file=/u01/app/oracle/dbbackup/...2015-09-05-21-00.dmp log=/u01/app/oracle/dbbackup/...2015-09-05-21-00.log tables=data_com ignore=y

imp user@db file=/u01/app/oracle/dbbackup/...2015-09-12-21-00.dmp log=/u01/app/oracle/dbbackup/...2015-09-12-21-00.log tables=data_com ignore=y

imp user@db file=/u01/app/oracle/dbbackup/...2015-09-19-21-00.dmp log=/u01/app/oracle/dbbackup/...2015-09-19-21-00.log tables=data_com ignore=y

imp user@db file=/u01/app/oracle/dbbackup/...2015-09-26-10-49.dmp log=/u01/app/oracle/dbbackup/...2015-09-26-10-49.log tables=data_com ignore=y

After checking the import, it is showing duplicate data. So please suggest how to append the data with the traditional imp method.

Thanks
Chris Saxon
August 09, 2016 - 3:48 pm UTC

Any particular reason you want to use imp instead of impdp?

It doesn't have a parameter to truncate before loading. So you need to do this yourself.
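
A minimal sketch of doing that yourself, using the table name from your commands (the dump file name is illustrative):

SQL> truncate table data_com;

Table truncated.

imp user@db file=weekly.dmp log=weekly.log tables=data_com ignore=y

Here ignore=y only suppresses the "table already exists" error so the rows load into the existing (now empty) table; it does not delete anything itself.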

Chris

oracle oracle, August 10, 2016 - 6:30 am UTC

Thanks for your reply.

Then how do I append with the imp command, like impdp table_exists_action=append?

Can this be done with the imp command? If not, is there any other way to append, or can we not append with imp at all?

Thanks
Connor McDonald
August 10, 2016 - 9:56 am UTC

With imp ignore=y, Oracle will try to load the data even if the table already exists. So this is like impdp table_exists_action=append.

What precisely is the problem you're having? And why can't you use data pump?

Chris

A reader, August 11, 2016 - 4:01 am UTC

Thanks for your reply.

We are not using impdp because the file was exported in 2015, and at that time they were not using expdp.

The problem is that we archive the data weekly. My table now holds 2016 data, and a user wants one month of 2015 data. So I truncated the table and tried to import with imp ignore=y. Since the dump files are weekly, I need to import 4 files to get one month of data into the table.

With ignore=y specified, I imported the 4 dump files and, on checking, the table shows duplicates. That's why I am asking why it produced duplicates, and whether there are any other options for this activity.

Thanks
Connor McDonald
August 11, 2016 - 6:56 am UTC

ignore=y will simply append to what is there, eg

SQL> create table t as select 1 x from dual;

Table created.


C:\Users\comcdona>exp userid=mcdonac tables=t

Export: Release 12.1.0.2.0 - Production on Thu Aug 11 14:53:48 2016

Copyright (c) 1982, 2016, Oracle and/or its affiliates.  All rights reserved.

Password:

Connected to: Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
With the Partitioning, Oracle Label Security, OLAP, Advanced Analytics
and Real Application Testing options
Export done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set

About to export specified tables via Conventional Path ...
. . exporting table                              T          1 rows exported
Export terminated successfully without warnings.


C:\Users\comcdona>imp userid=mcdonac tables=t ignore=y

Import: Release 12.1.0.2.0 - Production on Thu Aug 11 14:54:28 2016

Copyright (c) 1982, 2016, Oracle and/or its affiliates.  All rights reserved.

Password:

Connected to: Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
With the Partitioning, Oracle Label Security, OLAP, Advanced Analytics
and Real Application Testing options

Export file created by EXPORT:V12.01.00 via conventional path
import done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
. importing MCDONAC's objects into MCDONAC
. importing MCDONAC's objects into MCDONAC
. . importing table                            "T"          1 rows imported
Import terminated successfully without warnings.

C:\Users\comcdona>imp userid=mcdonac tables=t ignore=y

Import: Release 12.1.0.2.0 - Production on Thu Aug 11 14:54:31 2016

Copyright (c) 1982, 2016, Oracle and/or its affiliates.  All rights reserved.

Password:

Connected to: Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
With the Partitioning, Oracle Label Security, OLAP, Advanced Analytics
and Real Application Testing options

Export file created by EXPORT:V12.01.00 via conventional path
import done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
. importing MCDONAC's objects into MCDONAC
. importing MCDONAC's objects into MCDONAC
. . importing table                            "T"          1 rows imported
Import terminated successfully without warnings.


SQL> select * from t;

         X
----------
         1
         1
         1





Is that what you are after?

Connor

More to Explore

Data Pump

All of the database utilities including Data Pump are explained in the Utilities guide.