Question and Answer

Tom Kyte

Thanks for the question.

Asked: February 02, 2009 - 10:56 am UTC

Answered by: Tom Kyte - Last updated: October 15, 2020 - 4:08 pm UTC

Category: Database - Version: 9208

Viewed 100K+ times!

You Asked

Hello Tom,
How do I do a full database export and import?

I run:
exp system/manager@xyz FULL=Y FILE=FULL.DMP

Then I want to do a full import into a freshly created database that has only the default schemas (SYS, SYSTEM, etc.).

1) Please correct me:
I would need to pre-create all the schemas and tablespaces needed for the users being imported, and then do a full import:

imp system/manager@abc FULL=Y FILE=FULL.DMP


2) Can't I just run

imp system/manager@abc FULL=Y FILE=FULL.DMP

and have the users and tablespaces created automatically because of the FULL option?


3) Is the use of FULL exp/imp just to gather all the schemas into one big file?


4) Suppose I need to export/import only 10 schemas. What is better performance-wise: a FROMUSER/TOUSER import with a comma-separated list of owners, or 10 individual exp/imp runs? Are smaller individual files easier for the Oracle exp/imp utilities to handle and read?


Thanks

and we said...

1) you would not need to pre-create anything if you don't want to.

2) yes

3) it is, well, used to get a full export. It gets not only schemas but "public" things too (e.g. it gets public synonyms, which a schema- or object-level export would not)

4) each time you run IMP, it reads the dmp file from start to finish. Hence if you run it 10 times, you will read that dmp file 10 times.

Either run IMP once OR export the 10 schemas to 10 separate files, and imp the 10 separate files. You might do the latter in order to "parallel process" with exp/imp
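A sketch of that second, parallel approach (the schema names, passwords, and connect strings below are placeholders; adjust for your environment):

```shell
# Export each schema to its own dump file, with all exports running in parallel.
for s in SCOTT HR OE; do
  exp system/manager@xyz OWNER=$s FILE=$s.dmp LOG=${s}_exp.log &
done
wait

# Import the private, perfectly sized files concurrently on the target.
for s in SCOTT HR OE; do
  imp system/manager@abc FROMUSER=$s TOUSER=$s FILE=$s.dmp LOG=${s}_imp.log &
done
wait
```

Each imp process reads only its own file, so none of them does more I/O than it needs.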

and you rated our response

  (40 ratings)


Reviews

about point 4

February 02, 2009 - 9:26 pm UTC

Reviewer: A reader

Regarding
4) each time you run IMP, it reads the dmp file from start to finish. Hence if you run it 10 times, you will read that dmp file 10 times.

Either run IMP once OR export the 10 schemas to 10 separate files, and imp the 10 separate files. You might do the latter in order to "parallel process" with exp/imp


Question:
Will the following be faster than point 4 above?
Do a single-file export of the 10 schemas, so we get just one file.
But run 10 imp commands against that big single file, each importing just the required schema. So we parallelize, but with one big fat file containing 10 schemas, of which only one schema is used in each imp run.

This way I do a single export but 10 imports.

Thanx

Tom Kyte

Followup  

February 03, 2009 - 10:08 am UTC

do the math. think about it. what do you think would be faster:

10 processes concurrently reading, from start to finish, a COMMON file that is 10 times larger than each of them needs (so each does 10 times the I/O it should)

versus

10 processes reading concurrently from start to finish a PRIVATE file that is perfectly sized - has not one extra byte of data in it?


I don't like import full !

February 03, 2009 - 5:23 am UTC

Reviewer: BLANCHARD from Bretagne, FRANCE

Hello,

I never do a full import; I prefer to import only the users I need.

Is a full import safe? On a fresh database (with the SYS, SYSTEM, DBSNMP users, etc.), I'm afraid that a full import will add to or modify important data.

What about a full import into a 9.2.0.8 database from an export file made in 9.2.0.4, for example?

FB
Tom Kyte

Followup  

February 03, 2009 - 10:21 am UTC

if you start with a 'fresh' database, a full import is 'safe' and has been done by many thousands of people - in particular when they moved across platforms with old releases (before cross-platform transportable tablespaces)

full exp/imp between OS

April 07, 2010 - 8:49 am UTC

Reviewer: Nadee

Hi Tom,

you are the best and thanks a lot for sharing your knowledge.

I'm trying to import a full dump, exported on a UNIX platform, into a Windows platform. I have created all the tablespaces needed, but when I run the imp command the following error comes up for every tablespace:

IMP-00017: following statement failed with ORACLE error 12913:
"CREATE TABLESPACE "CONFIG_TABLES01" BLOCKSIZE 4096 DATAFILE '/export/home/"
"pa08/dbase/CONFIG_DATA/CONFIG_DATA01_pa08.dbf' SIZE 167874560 AUTOEXT"
"END ON NEXT 53248 MAXSIZE 4000M EXTENT MANAGEMENT DICTIONARY DEFAULT NOCO"
"MPRESS STORAGE(INITIAL 102400 NEXT 102400 MINEXTENTS 1 MAXEXTENTS 256 PCTI"
"NCREASE 5) ONLINE PERMANENT "
IMP-00003: ORACLE error 12913 encountered
ORA-12913: Cannot create dictionary managed tablespace

1. Since my SYSTEM tablespace is locally managed, I'm unable to create a dictionary-managed tablespace.
2. It seems to be trying to create the tablespaces in the same location as on the source database.
I can do a fromuser/touser import and it works fine with my current tablespaces, but I need to import the entire database.

Can you please advise whether there are any limitations when doing an imp full=y, and elaborate on them?

Thanks in advance,
Nadee
Tom Kyte

Followup  

April 13, 2010 - 7:26 am UTC

1) correct
2) yes, that is what it is programmed to do (impdp - datapump - is more 'configurable' that way)

You will have to precreate your tablespaces where you want them, using the options you want to use.

Or, provide us a database instance that supports dictionary managed tablespaces and a file system that corresponds to the system you are coming from.

I'd go with the former option myself: use imp indexfile=foo.sql and you'll get a script with all of the CREATE statements. Find the tablespaces, edit them as you see fit, run them, and then import.
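That indexfile workflow might look like this (file names below are placeholders):

```shell
# 1. Generate a script of the CREATE statements; no data is imported.
imp system/manager FULL=Y FILE=full.dmp INDEXFILE=foo.sql

# 2. Edit foo.sql: fix the CREATE TABLESPACE datafile paths (and, if
#    needed, change dictionary-managed clauses to locally managed)
#    to suit the target, then run the edited script in SQL*Plus.

# 3. Run the actual import; IGNORE=Y suppresses "already exists"
#    errors for the tablespaces and objects you pre-created.
imp system/manager FULL=Y FILE=full.dmp IGNORE=Y
```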

Could you please clarify imp/export across different OS?

May 12, 2010 - 12:48 pm UTC

Reviewer: John from Alexandria, VA USA

Please clarify. I hope to upgrade a 9i database from 32-bit Windows to a 64-bit OS on a faster new machine, targeting a 10g database (that is all the app has been tested for).

I have a known-good full export:
set oracle_sid=ORCL
C:\oracle\ora92\bin\exp SYSTEM/***** full=Y FILE=E:\EXPORTS\EXPORT.dmp log=E:\EXPORTS\export.log CONSISTENT=Y

I really do not want to precreate all users with rights to each table if I don't have to.


I read I could pre-create the database on new system in 10g.

My question regards the 10g import parameters and DESTROY.
If I do an import, what parameters are best, and should I use DESTROY if the 10g tablespaces have the same names as the 9i tablespaces?

I read in an Oracle PowerPoint presentation that in 11g I should pre-create and fully patch the database, then do an import with FULL=Y. Does that work in 10g? Advice much appreciated.

How do you use expdp/impdp to copy a 'full' database?

April 20, 2011 - 4:44 pm UTC

Reviewer: Robert from Memphis, USA

Hi Tom,

We need to use Data Pump to copy a full database from a different platform (the endianness is different, and the tablespaces are disorganized, so we cannot use transportable tablespaces).

Would you please explain the best way to do expdp/impdp into a 'fresh' database?
Of course we only need the 'application' type schemas, but as you mentioned above, we need all the public 'SYS' type stuff (public synonyms, roles, etc.).

What is the best way to do this?
Especially without getting lots of errors saying object already exists (e.g. for SYS tables, etc.).

Thanks,

Robert.

character set..

May 07, 2012 - 5:13 am UTC

Reviewer: A Reader

Tom,
When we do a full import into a newly created DB, does the source character set (the one the dump was exported from) overwrite the target DB's character set?

regards

..contd .. character set..

May 07, 2012 - 5:33 am UTC

Reviewer: A Reader

Tom,
further to the above post:
- The target database (newly created) would already have the SYS and SYSTEM schemas.
- When we do IMP full=y, it would attempt to create those schemas/objects.
- At the same time, we would like the source database dictionary to be imported into the target.

How do we deal with such a situation?


regards



Tom Kyte

Followup  

May 07, 2012 - 5:50 am UTC

if the export is in character set X

and the database is built with character set Y

then upon import, X will be mapped into Y, converted to Y

that means that data will be altered, changed.

http://docs.oracle.com/cd/E11882_01/server.112/e10729/ch2charset.htm#i1006853

..contd .. character set..

May 08, 2012 - 1:41 am UTC

Reviewer: A Reader

Thanks Tom

Further, I have a silly doubt regarding the full import.

I have the scenario :

Source database: has schema A along with SYS, SYSTEM, and the other default schemas.

Target database (newly created): has SYS, SYSTEM, and the other default schemas.

Now
a) In the target, I have pre-created all the tablespaces as in the source.
b) In the target, IMP full=Y ...

In step (b) above, what would happen to the objects (owned by SYS, SYSTEM, ...) already existing in the target database?
Will it replace them, skip them, or append to the underlying tables of the SYS and SYSTEM schemas?

We chose FULL=y because we wanted to import schema A from source to target along with roles, privileges, etc., and we want to make sure we won't miss anything in the target.

regards








Tom Kyte

Followup  

May 08, 2012 - 11:20 am UTC

we don't export those things in a full export.

..contd. charset

May 09, 2012 - 5:26 am UTC

Reviewer: A Reader

thanks Tom.

But when I tested the same..

[oracle@hostA ~]$ exp system/<pwd> full=Y file=full.dmp log=full.log


I can see ..

Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Export done in US7ASCII character set and AL16UTF16 NCHAR character set
server uses WE8ISO8859P1 character set (possible charset conversion)

About to export the entire database ...
. exporting tablespace definitions

...
..snip..
...
. about to export SYSTEM's tables via Conventional Path ...
. . exporting table DEF$_AQCALL 0 rows exported
. . exporting table DEF$_AQERROR 0 rows exported
. . exporting table DEF$_CALLDEST 0 rows exported
.....
...

SYSTEM's tables are exported.

regards

Tom Kyte

Followup  

May 10, 2012 - 3:14 am UTC

only the data that needs to be. SYS is not exported. Things it needs to pick up (so you get jobs exported and the like) are done correctly for you.

it does the right things.


watch that character set conversion - set your NLS_LANG before exporting - otherwise you've just lost all of your characters above ASCII 127
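For the session shown above (a WE8ISO8859P1 database being exported in US7ASCII), that means something like the following sketch (the language/territory parts of NLS_LANG are examples):

```shell
# Make the client character set match the database character set so exp
# performs no lossy conversion during the export.
export NLS_LANG=AMERICAN_AMERICA.WE8ISO8859P1
exp system/<pwd> full=Y file=full.dmp log=full.log
```

The export log should then no longer show the "possible charset conversion" warning.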

..contd. charset

May 11, 2012 - 12:28 am UTC

Reviewer: A reader

Many thanks Tom.

How do I import a table into the SYS schema?

June 15, 2012 - 4:59 am UTC

Reviewer: Khushal from New York

I exported the SYS.AUD$ table using the SYSTEM user. But when I tried to import it using the SYSTEM user, I got the error:
Insufficient privileges.

Please help with this. I don't want to use the SYS user while importing tables into the SYS schema.
Tom Kyte

Followup  

June 15, 2012 - 7:56 am UTC

you cannot do that. You cannot write to SYS owned tables. You cannot write into SYS.AUD$

my big problem

December 26, 2012 - 1:23 am UTC

Reviewer: farima from iran

I want to do a full export of my DB, but I don't need some of the data in the tables. What should I do?
Tom Kyte

Followup  

January 04, 2013 - 10:57 am UTC

then obviously - you do not want to export full from your database!


you don't give a version; in 10g and above you have Data Pump, and Data Pump can easily use the EXCLUDE and INCLUDE parameters to filter which objects to skip or include.


don't use exp/imp anymore, use data pump.
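As a hedged sketch of that filtering (the directory object and table names below are made up; a parameter file avoids shell-quoting problems with the EXCLUDE clause):

```shell
# Full Data Pump export that skips the row data of two hypothetical
# large tables; their definitions are still exported.
cat > full_filtered.par <<'EOF'
FULL=Y
DIRECTORY=dpump_dir
DUMPFILE=full.dmp
LOGFILE=full_exp.log
EXCLUDE=TABLE_DATA:"IN ('BIG_LOG_TABLE','AUDIT_HISTORY')"
EOF

expdp system/<pwd> PARFILE=full_filtered.par
```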

Export Import with schema change

September 19, 2013 - 12:02 pm UTC

Reviewer: Prashant from India

Hi Tom,

I need to export and import the database. In the new database, I have just changed the block size and increased the length of some columns from the existing schema. The reason for changing the block size is to allow a larger index key, so that I get a minimum of 8000 bytes for the index/key.

I have to take a backup of all indexes, schemas, and relationships. Can you please provide the steps?


Can't import

February 20, 2014 - 10:15 am UTC

Reviewer: Krish from Chennai,Tamilnadu,India

C:\Documents and Settings\Administrator>imp PAY/PAYPH2@ARCH file='D:\Dump\Tables1.dmp' ignore=y fromuser=PAYM touser=PAY;

Import: Release 10.2.0.1.0 - Production on Thu Feb 20 12:17:04 2014
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options

Export file created by EXPORT:V10.02.01 via conventional path
import done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
. importing PAYMGRPH2's objects into PAYMGRPH2;
"ALTER SESSION SET CURRENT_SCHEMA= "PAYMGRPH2;""
IMP-00003: ORACLE error 1435 encountered
ORA-01435: user does not exist
IMP-00000: Import terminated unsuccessfully

import full db

May 06, 2014 - 4:02 am UTC

Reviewer: tushar from india

Hi. I have created a user with SYSDBA privileges, but with this schema I can't import the full DB. It throws lots of errors and also changes the SYS and SYSTEM passwords automatically. Could you please help with this?

Cleanup a database for full import

November 26, 2014 - 10:44 am UTC

Reviewer: Ricardo PatrocĂ­nio from Portugal

Hi Tom,

I want to cleanup a database for a full import. What schemas should I leave in the database?

Thank you for your help!
Tom Kyte

Followup  

November 26, 2014 - 9:12 pm UTC

create new empty database with features you need. You don't want to "clean up" an existing database for something like this.

See
https://docs.oracle.com/cd/E11882_01/server.112/e23633/expimp.htm#UPGRD12564

to get started.

full database import from 10.2.0.1.0 to 12.0.1

December 22, 2014 - 10:50 pm UTC

Reviewer: A reader from 08831

We are in the process of upgrading 10g databases to 12c.
I took the dumpfile (full =y ) from 10g and did import as system to 12c.

I ran utlrp to compile all the SYS objects.

I currently have 440 objects in SYSMAN, OLAPSYS, and PUBLIC.
I'm not sure where I went wrong.

a) Should I do the full import as SYS?
b) Is a full import from 10g to 12c supported?
c) Do we have to run any cleanup script?

data pump

June 19, 2015 - 10:22 am UTC

Reviewer: sai krishna from india

Why is Data Pump faster than traditional export and import?
I'd like an in-depth explanation of what is happening.

Oracle 11g SE exp utility

September 27, 2016 - 4:31 am UTC

Reviewer: A reader


I used the exp command to export a database, and I have 100 tables; say the table names are table_one, table_two, ... and table_hundred.

Case 1:
Tables table_one through table_fifty have been exported, and table_fiftyOne is now being exported; it contains 1000 rows, of which 500 have already been exported.
My question: suppose a user now changes some of table_fiftyOne's already-exported rows. Will Oracle include those changes in the export or not?

Case 2:
Suppose table_one through table_fifty have been exported.
In between, I drop table_ten. Will the table that was already exported also be dropped from the export?

If yes, how does the Oracle database manage such operations performed during an active export session?

Connor McDonald

Followup  

September 28, 2016 - 7:22 am UTC

Case 1: By default, yes; but check out the CONSISTENT keyword, which gives you consistency across all the tables.

Case 2: If table_ten has already been completed by the exp command, a subsequent drop will not be picked up; that is, table_ten will still be present in the export file.
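For instance, to get a single read-consistent picture of all tables as of the moment the export begins (connect string and file names are placeholders):

```shell
# CONSISTENT=Y makes the whole export see the database as of the time
# the export started, so DML against already-read tables during the
# export is not picked up.
exp system/manager FULL=Y FILE=full.dmp LOG=full.log CONSISTENT=Y
```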

full import doesn't work

July 23, 2017 - 2:16 pm UTC

Reviewer: Sohail from Riyadh

I have a source Oracle 11g database (release 11.2.0.3.0) and a destination 11g (release 11.2.0.1.0). I am using an Oracle 12c client (rel 12.10.2.0) to export and import. The export was successful, but the import always fails. The last few lines of the log are as follows:

IMP-00015: following statement failed because the object already exists:
"CREATE SEQUENCE "WWV_FLOW_SESSION_SEQ" MINVALUE 1 MAXVALUE 9999999999999999"
"999999999999 INCREMENT BY 1 START WITH 1 CACHE 20 NOORDER NOCYCLE"
. importing REPORT_QUEUE's objects into REPORT_QUEUE
"ALTER SESSION SET CURRENT_SCHEMA= "REPORT_QUEUE""
IMP-00003: ORACLE error 1435 encountered
ORA-01435: user does not exist
IMP-00000: Import terminated unsuccessfully

Connor McDonald

Followup  

July 24, 2017 - 1:21 am UTC

Why would you

a) use a 12c client to do the export of an 11g database (if the target is not going to be 12c)?

b) import from 11.2.0.3 into 11.2.0.1?

reason for using oracle 12 client

July 24, 2017 - 5:04 am UTC

Reviewer: Sohail from Riyadh

1. I would get the following error if I used an 11g client to export; one internet post suggested using the 12c client, and with it the export was successful:

Export: Release 11.2.0.1.0 - Production on Mon Jul 24 07:56:34 2017

Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.


Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit
Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Enter array fetch buffer size: 4096 >

Export file: EXPDAT.DMP >

(1)E(ntire database), (2)U(sers), or (3)T(ables): (2)U > e

Export grants (yes/no): yes >

Export table data (yes/no): yes >

Compress extents (yes/no): yes >

Export done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
server uses AR8ISO8859P6 character set (possible charset conversion)

About to export the entire database ...
. exporting tablespace definitions
. exporting profiles
. exporting user definitions
. exporting roles
. exporting resource costs
. exporting rollback segment definitions
. exporting database links
EXP-00106: Invalid Database Link Passwords
EXP-00000: Export terminated unsuccessfully

2. Both, 11.2.0.3 and 11.2.0.1 are production databases. I can't do anything to change them.
Connor McDonald

Followup  

July 25, 2017 - 7:17 am UTC

Use the 11.2.0.4 export.

But then again - really, you should be using DataPump, ie,

expdp
impdp

exp/imp are old tools

Are expdp and impdp server technology?

July 26, 2017 - 1:56 pm UTC

Reviewer: Sohail from Riyadh

The reason for not using expdp and impdp is that they are server technology: the dump file gets created on the server. I need to create the dump file on my local machine. As I mentioned before, the source database is a production one and I don't have access to its directory structure.
Connor McDonald

Followup  

July 27, 2017 - 1:17 am UTC

So... instead you are pulling production data across a network and storing it on a local machine.

Yeah... that sounds secure :-)

Anyway, your options are:

a) use the right version of export (i.e., that of the source database)
b) use Data Pump and transfer the file from the server to your target destination after the operation completes

How to import some schemas from a full db export?

September 06, 2017 - 4:18 pm UTC

Reviewer: curious_george from NYC

I was actually trying to ask a question. I apologize if I am doing something incorrect here.

Someone "handed over" to me a full export done by another party. I don't even know the original tablespace names. Perhaps there is a way to get them, similar to getting the DDL with the SQLFILE option? But my main question is:

How do I import ONLY 1 or 2 schemas from the "full export" they have given me? Is this a correct and safe command:

impdp dba1/dba1 SCHEMAS=TEST1,TEST2 DIRECTORY=dpump_dir1 LOGFILE=partial_import.log DUMPFILE=full_db_export.dmp

Naturally, their export dump file includes SYS, SYSMAN, etc., and I am only interested in SCHEMA1, SCHEMA2, etc. My intent is to import into an existing DB, as opposed to setting up a new DB instance just to accommodate the full export I have been given.

Since I am not "pre-creating" any tablespaces to match the original source, would this command still succeed? I often do exports/imports within my company, generally one schema at a time. I am not in a DBA role and I have never imported from a full DB export before. That is why I am asking here first instead of taking the risk of firing the command directly.

Connor McDonald

Followup  

September 07, 2017 - 2:44 am UTC

Yes, you can use a full export to import just certain schemas, eg

C:\>expdp userid=system/***** directory=TEMP dumpfile=full.dmp full=y

Export: Release 11.2.0.4.0 - Production on Thu Sep 7 10:28:23 2017

Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "SYSTEM"."SYS_EXPORT_FULL_01":  userid=system/******** directory=TEMP dumpfile=full.dmp full=y
Estimate in progress using BLOCKS method...
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 694.1 MB
Processing object type DATABASE_EXPORT/TABLESPACE
Processing object type DATABASE_EXPORT/PROFILE
Processing object type DATABASE_EXPORT/SYS_USER/USER
Processing object type DATABASE_EXPORT/SCHEMA/USER
Processing object type DATABASE_EXPORT/ROLE
...



SQL> drop user scott cascade;

User dropped.


C:\>impdp userid=system/***** directory=TEMP dumpfile=full.dmp schemas=scott

Import: Release 11.2.0.4.0 - Production on Thu Sep 7 10:42:05 2017

Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Master table "SYSTEM"."SYS_IMPORT_SCHEMA_01" successfully loaded/unloaded
Starting "SYSTEM"."SYS_IMPORT_SCHEMA_01":  userid=system/******** directory=TEMP dumpfile=full.dmp schemas=scott
Processing object type DATABASE_EXPORT/SCHEMA/USER
Processing object type DATABASE_EXPORT/SCHEMA/GRANT/SYSTEM_GRANT
Processing object type DATABASE_EXPORT/SCHEMA/ROLE_GRANT
Processing object type DATABASE_EXPORT/SCHEMA/DEFAULT_ROLE
Processing object type DATABASE_EXPORT/SCHEMA/PROCACT_SCHEMA
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE_DATA
. . imported "SCOTT"."DEPT"                              5.929 KB       4 rows
. . imported "SCOTT"."EMP"                               8.562 KB      14 rows
. . imported "SCOTT"."SALGRADE"                          5.859 KB       5 rows
. . imported "SCOTT"."BONUS"                                 0 KB       0 rows
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/INDEX/INDEX
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/CONSTRAINT/CONSTRAINT
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/CONSTRAINT/REF_CONSTRAINT
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/STATISTICS/TABLE_STATISTICS
Job "SYSTEM"."SYS_IMPORT_SCHEMA_01" successfully completed at Thu Sep 7 10:42:09 2017 elapsed 0 00:00:03




Re: How to import some schemas from a full db export?

September 07, 2017 - 5:10 pm UTC

Reviewer: curious_george from New York

Thanks Connor!

I have been peeking at the AskTom site for several years, but this was the first time I asked a question; ironically, after Dear Tom has already "retired" :-) I was lucky enough to meet him at OpenWorld in 2014. I am listing further details of my solution here so that hopefully someone else can benefit in the future.

Using the "full" DB dump given to me by the 3rd party, I used the SCHEMAS=TEST1 option to load just the schema I wanted, and it worked as desired. I first ran with just the SQLFILE option to see which schemas and tablespaces the original source had. (My version is 11g.)

The actual command used just to see the DDL was:
impdp dba1@my_db DIRECTORY=TEST1 DUMPFILE=third_party_db.dmp SCHEMAS=TARGET1 SQLFILE=TEST1:test_ddl1.sql

A good thing (or a bad thing, in the pure sense?) was that all tables of the schema I was interested in were under the USERS tablespace in the original source, so I was able to use REMAP_TABLESPACE. The actual command used to load the table data, for just one schema from the original dump file, was:

impdp dba1@my_db DIRECTORY=TEST1 DUMPFILE=other_party_db.dmp SCHEMAS=their_schema REMAP_TABLESPACE=USERS:SOME_EXISTING_TS

("DBA1" was my schema that had sufficient DBA privileges for the impdp operation. I was not authorised to use SYS or SYSTEM, so I used that one.)
Connor McDonald

Followup  

September 09, 2017 - 12:58 am UTC

Glad it all worked out for you, and thanks for the additional info.

Import error

September 24, 2019 - 7:53 am UTC

Reviewer: Haroon from pak

Hi!
When I import the full DB, I get this error: "FRS_OPPS.......... _pk", even though the DB is newly installed. Please help.
Chris Saxon

Followup  

September 24, 2019 - 1:10 pm UTC

What exactly is the full error?

Import error

September 25, 2019 - 4:43 pm UTC

Reviewer: Haroon from pak

I am getting an error while importing a .dmp file into a newly installed database:
"FRS_OPSS.CT_PK
FRS_OPSS.JPS_ATTRS_PK"

I asked a friend and he said to delete the repository and then import, and to do that twice, because some schemas are already in the DB.
But when I removed the repository, an error occurred on FRS_OPSS (Oracle Platform Security).
Please help with this.
Chris Saxon

Followup  

September 26, 2019 - 10:26 am UTC

Those are just object names.

We need to see the error message! Something starting ORA-xxxxx.

Paste the entire contents of the log.

Import error

September 25, 2019 - 4:57 pm UTC

Reviewer: Haroon from pak

And at the start of the import:
"Export file created by EXPORT:V12.0.01.00 via conventional path
import done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
import server uses AL32UTF8 character set"

Is that OK?

Import error

September 26, 2019 - 3:16 pm UTC

Reviewer: Haroon from pak

I am getting these errors:
IMP-00019: row rejected due to ORACLE error 1
IMP-00003: ORACLE error 1 encountered
ORA-00001: unique constraint (FRS_OPSS........_PK) violated

Chris Saxon

Followup  

September 26, 2019 - 3:41 pm UTC

Rows with the primary key value you're importing already exist in the table.

You can use the table_exists_action parameter to define what happens here:

* SKIP leaves the table as is and moves on to the next object. This option is not valid when the CONTENT parameter is set to DATA_ONLY.

* APPEND loads rows from the source and leaves existing rows unchanged.

* TRUNCATE deletes existing rows and then loads rows from the source.

* REPLACE drops the existing table, and then creates and loads it from the source. This option is not valid when the CONTENT parameter is set to DATA_ONLY.


https://docs.oracle.com/en/database/oracle/oracle-database/19/sutil/datapump-import-utility.html#GUID-C9664F8C-19C5-4177-AC20-5682AEABA07F
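For example, to re-run the import keeping the rows that already exist (a sketch; the directory object and file names are placeholders):

```shell
# APPEND keeps existing rows and adds those from the dump; rows that
# collide on the primary key are still rejected, so TRUNCATE or REPLACE
# may be more appropriate if the dump's data should win.
impdp system/<pwd> DIRECTORY=dpump_dir DUMPFILE=full.dmp \
      TABLE_EXISTS_ACTION=APPEND LOGFILE=reimport.log
```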

ORA-31685

December 12, 2019 - 4:43 pm UTC

Reviewer: SARVESH GUPTA from United States

Hi Tom ,

I am doing a full export of our on-prem database and importing it into AWS RDS in the cloud with FULL=Y. Everything goes well, but I see that granted roles are not being imported and are not assigned to users. I expected everything to be imported because it's a full import. I get the error below.

ORA-31685: Object type ROLE_GRANT:"GUPTAS1" failed due to insufficient privileges. Failing sql is:
GRANT "VI_ROLE" TO "GUPTAS1"
Connor McDonald

Followup  

December 13, 2019 - 3:19 am UTC

You'd need to take that up with the people that look after RDS. They may have imposed restrictions and/or limitations there.

Could you clarify whether the system objects will be imported if full=y

February 10, 2020 - 12:27 pm UTC

Reviewer: Joe

Hi,

Earlier in the thread it was mentioned that SYS objects will not be exported or imported. What about SYSTEM objects? I expected the same, but the import errors I am getting suggest otherwise:

. . imported "SYSTEM"."FRM40__MNUGRP" 6.296 KB 9 rows
. . imported "SYSTEM"."FRM40__MNUUSER" 5.968 KB 55 rows
. . imported "SYSTEM"."FRM45__BUFFER" 6.492 KB 1 rows
. . imported "SYSTEM"."FRM45__OBJECT" 16.03 KB 22 rows
. . imported "SYSTEM"."PLAN_TABLE" 16.74 KB 59 rows
. . imported "SYSTEM"."REPCAT$_OBJECT_TYPES" 6.515 KB 28 rows
. . imported "SYSTEM"."REPCAT$_TEMPLATE_STATUS" 5.296 KB 3 rows
. . imported "SYSTEM"."REPCAT$_TEMPLATE_TYPES" 5.921 KB 2 rows
. . imported "SYSTEM"."ROSSEQUENCES" 7.765 KB 2 rows
. . imported "SYSTEM"."ROSSTRINGS" 7.460 KB 6 rows
. . imported "SYSTEM"."TOOL__ACCESS" 6.148 KB 9 rows
. . imported "SYSTEM"."TOOL__MODULE" 10.65 KB 1 rows
. . imported "SYSTEM"."TOOL__PLSQL" 7.414 KB 2 rows
. . imported "SYSTEM"."VG__COLOR" 18.71 KB 228 rows
. . imported "SYSTEM"."VG__DRAWING" 5.882 KB 1 rows
.......

ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "SYSTEM"."REPCAT$_FLAVORS_GNAME" does not exist or insufficient privileges

Failing sql is:
DECLARE I_N VARCHAR2(60); I_O VARCHAR2(60); NV VARCHAR2(1); c DBMS_METADATA.T_VAR_COLL; df varchar2(21) := 'YYYY-MM-DD:HH24:MI:SS'; stmt varchar2(300) := ' INSERT INTO "SYS"."IMPDP_STATS" (type,version,flags,c1,c2,c3,c5,n1,n2,n3,n4,n5,n6,n7,n8,n9,n10,n11,n12,d1,cl1) VALUES (''I'',6,:1,:2,:3,:4,:5,:6,:7,:8,:9,:10,:11,:12,:13,NULL,:14,:15,NULL,:16,:17)';BEGIN DELETE FROM "SYS"."IMPDP_STATS"; i_n := 'REPCAT$_FLAVORS_GNAME'; i_o := 'SYSTEM'; EXECUTE IMMEDIATE stmt USING 2,I_N,NV,NV,I_O,0,0,0,0,0,0,0,0,NV,NV,TO_DATE('2007-02-10 10:18:43',df),NV; DBMS_STATS.IMPORT_INDEX_STATS('"' || i_o || '"','"' || i_n || '"',NULL,'"IMPDP_STATS"',NULL,'"SYS"'); DELETE FROM "SYS"."IMPDP_STATS"; END;


ORA-39083: Object type INDEX_STATISTICS failed to create with error:
ORA-20000: INDEX "SYSTEM"."I_REPCAT$_SNAPGROUP1" does not exist or insufficient privileges




Connor McDonald

Followup  

February 11, 2020 - 3:52 am UTC

SYSTEM *will* be imported.


Should the system schema be excluded?

February 11, 2020 - 10:22 am UTC

Reviewer: Joe

Thanks for the reply.

In that case, should the SYSTEM schema be excluded? There shouldn't be any reason to import the SYSTEM schema, should there?


Connor McDonald

Followup  

February 17, 2020 - 1:35 am UTC

If you are doing a full import, then you actually need the SYSTEM schema to re-populate various metadata stuff.

(Similarly with the other internal schemas)

12c database export using 12c client

February 23, 2020 - 7:14 pm UTC

Reviewer: Sohail Siddiqui from Saudi Arabia

While exporting a 12c database installed in a Linux environment, using a 12c 64-bit client on Windows 7 64-bit, I get the following error:

EXP-00008: ORACLE error 1406 encountered
ORA-01406: fetched column value was truncated
EXP-00000: Export terminated unsuccessfully

What could be the reason? I've also tried with a 12c 32-bit client on Windows 7 32-bit; almost 75% of the database export was successful, then I got the above error.

Thanks,
Connor McDonald

Followup  

February 25, 2020 - 8:19 am UTC

Is this export or data pump?

If it's export, I'd strongly suggest you use Data Pump.

12c database export using 12c client

February 25, 2020 - 7:01 pm UTC

Reviewer: A reader

As you mentioned earlier, Data Pump is a server-side technology, and I don't have enough authority to do anything on the server. I'll try again this weekend to take a full database backup with export. As the database consists of only 4-5 users, I was taking the export by listing those 4-5 users in the export wizard.
Thanks,

12c database export using 12c client

February 27, 2020 - 9:42 pm UTC

Reviewer: Sohail Siddiqui from Saudi Arabia

I tried to export the full database today (12c on a Linux server) using the 12c 64-bit Oracle client on Windows 7 64-bit. I got the same error, ORA-01406.
Chris Saxon

Followup  

February 28, 2020 - 10:25 am UTC

The old export utility was desupported in 11g. If you're encountering errors in 12c, you'll need to use a different method, such as Data Pump export.

12c database export using 12c client

August 25, 2020 - 9:01 am UTC

Reviewer: A reader from Germany

Hi all,
in the last few days I tried to do a dump of a DB (Oracle 12c), but noticed errors like:
EXP-00008: ORACLE error 1406 encountered

Now I have found the reason:
the dump ends with that error if a table name is longer than 30 characters. Shortening the table name was the solution.

regards
Georg
Chris Saxon

Followup  

August 25, 2020 - 2:43 pm UTC

Using data pump seems like an easier solution to me!

September 01, 2020 - 2:12 pm UTC

Reviewer: A reader

Hi Chris,
certainly you are right, but to my knowledge you need special access rights to use expdp? At least that is the information I was given.
Connor McDonald

Followup  

September 02, 2020 - 12:14 am UTC

No more rights than exp/imp

September 03, 2020 - 1:25 pm UTC

Reviewer: A reader

According to the description at
https://oracle-base.com/articles/10g/oracle-data-pump-10g
I need the following:
the database user performing the export needs the DATAPUMP_EXP_FULL_DATABASE role, and the user performing the import needs the DATAPUMP_IMP_FULL_DATABASE role.
Connor McDonald

Followup  

September 04, 2020 - 12:53 am UTC

And exp/imp have similar requirements

EXP_FULL_DATABASE
IMP_FULL_DATABASE


ORA-31684 ORA-39151

October 14, 2020 - 11:51 am UTC

Reviewer: Franck from France

I'm trying out a restore of a database from a FULL export to a new server.

I recreated an empty database and ran catalog.sql/catproc.sql/catdbsyn.sql/pupbld.sql (following vendor scripts for installation).

When importing I get lots of errors on existing objects (e.g. USER:OUTLN, ROLE:AQ_USER_ROLE, SEQUENCE:MVIEW$_ADVSEQ_ID, SYNONYM:SYSTEM.TAB, etc.).
Which is normal I guess since I created the catalog.
I also get:
ORA-39151: Table "SYSTEM"."REPCAT$_SITES_NEW" exists. All dependent metadata and data will be skipped due to table_exists_action of skip
So I compared with the source database, and only "SYSTEM"."DEF$_DESTINATION" contained data that would not be imported.

Did I do something wrong? Should table_exists_action be something different than "skip"?

There's also the error on Streams Capture pointing to note 1380295.1.

I would very much appreciate your help on this issue. The usual responses always say: create an empty database and do a full import, but nobody explains how empty it should be.

Best regards,
Frank

Connor McDonald

Followup  

October 15, 2020 - 4:41 am UTC

You've done the right thing

Errors are expected because of the "chicken and egg" concept, ie, you need a database to run the import against, but you want to import a full database.

Due to my years of paranoia :-) I normally do something like

select owner, object_type, count(*)
from dba_objects
group by owner, object_type

on source and target as a verification step

In the end all is well

October 15, 2020 - 7:19 am UTC

Reviewer: Franck from France

Thanks a lot for your quick reply!

Even the MOS notes tell you to count rows before and after.

BTW, on my first import I got lots of errors due to defaults on DATE columns of quite a few tables: I had to redo the import with another NLS_DATE_FORMAT to be able to create those tables.

These things are what makes me dubious whenever I inherit instances where the only "backup" is a full import.

FYI, at the end of the import, and in spite of the message stating it would skip table "SYSTEM"."DEF$_DESTINATION", Oracle did put back the single entry from the source.
So under the hood there is a lot of stuff going on (e.g. deactivation of replication-related constraints, etc.).

Thanks again!
Chris Saxon

Followup  

October 15, 2020 - 4:08 pm UTC

These things are what makes me dubious whenever I inherit instances where the only "backup" is a full import.

Yep, that's a bad sign!

Glad we could help.