Hmmm
Russell Smith, February 04, 2002 - 9:14 am UTC
It was OK, but I need to do the same thing between 9i and 8i... is it possible?
February 04, 2002 - 10:22 am UTC
yes. try it.
$ exp userid=scott/tiger@ora9i owner=scott
Export: Release 8.1.7.2.0 - Production on Mon Feb 4 10:19:16 2002
(c) Copyright 2000 Oracle Corporation. All rights reserved.
Connected to: Oracle9i Enterprise Edition Release 9.0.1.0.0 - Production
With the Partitioning option
JServer Release 9.0.1.0.0 - Production
Export done in US7ASCII character set and UTF8 NCHAR character set
server uses WE8ISO8859P1 character set (possible charset conversion)
About to export specified users ...
. exporting pre-schema procedural objects and actions
. exporting foreign function library names for user SCOTT
. exporting object type definitions for user SCOTT
About to export SCOTT's objects ...
. exporting database links
. exporting sequence numbers
. exporting cluster definitions
. about to export SCOTT's tables via Conventional Path ...
. . exporting table BONUS 0 rows exported
. . exporting table C 1 rows exported
. . exporting table DEPT 4 rows exported
. . exporting table DUMMY 1 rows exported
. . exporting table EMP 14 rows exported
. . exporting table SALGRADE 5 rows exported
. . exporting table T 0 rows exported
. . exporting table TTT 14 rows exported
. exporting synonyms
. exporting views
. exporting stored procedures
. exporting operators
. exporting referential integrity constraints
. exporting triggers
. exporting indextypes
. exporting bitmap, functional and extensible indexes
. exporting posttables actions
. exporting snapshots
. exporting snapshot logs
. exporting job queues
. exporting refresh groups and children
. exporting dimensions
. exporting post-schema procedural objects and actions
. exporting statistics
Export terminated successfully without warnings.
$
Does platform difference cause a problem?
Will, February 05, 2002 - 11:00 am UTC
In my case I am going from 8.1.7.1.0 to 8.1.7.2.1 and get a "cannot read header" error (IMP-00010). The other factor I can identify is that the source runs on a Compaq Alpha 4100 and the targets tried are either Windows 2000 or Solaris 8 builds.
February 05, 2002 - 11:53 am UTC
you most likely forgot to set "binary" mode in the ftp -- check the size of the DMP file and compare it to the size of the original DMP on the source system. If they are different -- the file was corrupted.
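for example, a transfer forced into binary mode plus a size check might look like this (the host and file names here are made up):

$ ftp sourcehost
ftp> binary
ftp> get expdat.dmp
ftp> bye
$ ls -l expdat.dmp     # compare this byte count to "ls -l" on the source system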
IMP ERROR
atul, October 17, 2003 - 12:22 am UTC
Hi TOM,
I am importing a file into 8.0.4 which I got from 8.0.5.
I'm getting this error:
Connected to: Oracle8 Enterprise Edition Release 8.0.4.3.1 - Production
With the Partitioning option
PL/SQL Release 8.0.4.3.1 - Production
IMP-00010: not a valid export file, header failed verification
IMP-00000: Import terminated unsuccessfully
=========================
I have correctly FTP'ed the file in binary format; I also checked
the sizes of the files and they are the same.
What should i do now?
Thanks.
atul
October 17, 2003 - 9:48 am UTC
read the original answer.
"To move data DOWN a version(s), you need to export using that lower version's EXP
tool and IMP using that lower version's tool."
8.0.4 < 8.0.5, hence 8.0.4 cannot read an 8.0.5 dmp file.
Thanks
atul, October 21, 2003 - 9:08 am UTC
Thanks Tom,
We are generating the exp file through the Designer front end (8.0.5),
and we have an 8.0.4 client.
So what could be the solution?
How can we get an 8.0.4 exp dump file?
Thanks.
Atul
October 21, 2003 - 5:01 pm UTC
use the 804 exp -- not 805.
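something along these lines, run from the machine that has the 8.0.4 software installed (the tns alias and file names here are made up):

$ # 8.0.4 exp connecting over Net8 to the 8.0.5 database -- the dump gets an 8.0.4 header
$ exp userid=scott/tiger@db805 owner=scott file=scott.dmp log=scott_exp.log
$ # which the 8.0.4 imp can then read
$ imp userid=scott/tiger file=scott.dmp full=y log=scott_imp.log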
how about cross platform export?
Eugene, November 24, 2003 - 7:17 pm UTC
Hi Tom: I think your answer to the export issue is very helpful indeed. An issue that I encountered recently is doing exports across platforms. Our database server is a Unix machine and I use an Oracle client on Windows to manage data. I had no problem exporting the database for a while, and then this morning the export process hung when it reached "exporting cluster definitions..". Is there partial corruption of the database, or is it just not a good idea to do a cross-OS export?
Thank you for your answer.
Eugene
November 25, 2003 - 7:38 am UTC
export works just fine across platforms.
there is some other cause.
check out the v$ tables to see what the session was waiting on, what query it may have been running, why it appeared "stuck"
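for instance, something like this (replace 1234 with the SID of the export session from v$session):

select sid, event, seconds_in_wait, state
  from v$session_wait              -- what is the session waiting on right now?
 where sid = 1234;

select sql_text
  from v$sqltext
 where address = ( select sql_address from v$session where sid = 1234 )
 order by piece;                   -- the statement it is currently executing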
A reader, November 25, 2003 - 12:45 pm UTC
It is a good answer indeed. I am checking the session details and they now show the current state of the exp application and the statistics for each object being exported. I can use this to debug the issue.
Thanks,
Eugene
EXP/IMP Moving database between different OSes
Abhi, April 22, 2004 - 12:38 pm UTC
hi,
o Is there any other option [ except exp/imp ]
to move a big database to a different OS?
I have an 80GB database on a Windows box.
I want to move it to a Linux box.
Can/should we use exp/imp, or is there any other
option?
o I have read the chapter on imp/exp in your book.
As you mentioned in that chapter
[ the qualified-name-in-trigger example ], there are some
problems with exp/imp. Is there any solution to this
problem? I want to move this 80GB database under
another schema.
Thankx,
Abhi
April 23, 2004 - 8:32 am UTC
a) with 10g, yes, cross platform transportable tablespaces -- or the new data pump. so one option is to position this as an upgrade/migration (upgrade the windows box, install 10g on linux and you can basically move the data files)
else you can
o use exp/imp (export in parallel, imp in parallel, 80 gig is pretty smallish, do it schema by schema in parallel)
o sqlplus copy command can work for you (see the sketch after this answer).
o dblinks even
b) if the schema names are the same, there would be no issues. it is with fromuser/touser that you run into issues.
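the sqlplus copy option from (a) could be a simple script like this (the connect strings and table name are made up, and it assumes the table already exists on the target):

-- copy_emp.sql: run from sqlplus on either machine
set arraysize 1000
set copycommit 100
copy from scott/tiger@windb to scott/tiger@linuxdb -
insert emp using select * from emp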
Edit Index File
Abhi, May 03, 2004 - 4:48 am UTC
hello Sir,
Thanks for your Answer.
I changed my index file to map all the different
tablespaces to one tablespace and ran it in the database.
But when I tried to imp with ignore=y it gave me
ORA-02291: integrity constraint (TEST.FK) violated - parent key not found
Oracle had enabled all the foreign keys, so I could not
import the data.
Is there any solution?
Thanks,
Abhi
May 03, 2004 - 7:30 am UTC
well, you must have done "rows=n" and "constraints=y" then, so the data was not imported but the constraints were turned on.
either do
indexes=n constraints=Y rows=Y
or
indexes=N constraints=N rows=N
and import the data before/at the same time as the constraints.
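in legacy imp terms the two approaches might look like this (the userid and file name are made up; rows= is the switch that controls the row data):

$ # data and constraints together, no index creation (you already ran your edited index file)
$ imp userid=test/test file=expdat.dmp full=y ignore=y rows=y constraints=y indexes=n

$ # or structure only first, then rows and constraints together in a second pass
$ imp userid=test/test file=expdat.dmp full=y ignore=y rows=n constraints=n indexes=n
$ imp userid=test/test file=expdat.dmp full=y ignore=y rows=y constraints=y indexes=n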
Thanks Sir .. :)
Abhi, May 05, 2004 - 6:25 am UTC
Transportable tablespace across versions
abhi, June 04, 2004 - 5:44 am UTC
hello sir .. thanks for your reply..
I have a 230GB database on 816.
I want to move it to 817 on the same machine.
exp/imp will take too much time.
Can we use transportable tablespaces from 816 and plug them
into 817? Is there any issue/problem?
Thanks,
Abhi
June 04, 2004 - 8:25 am UTC
why not just upgrade it?
Re:
Abhi Malhotra, June 05, 2004 - 2:56 am UTC
We want to check my application's dependencies.
So, first we want to maintain two versions, then we will
drop the older database.
Is it possible to plug an 816 tablespace into 817?
Thanks,
Abhi
June 05, 2004 - 8:52 am UTC
yes, it is. however, restore a backup of your 816 and upgrade it. it'll be faster and easier and you can still have your duplicate.
export/import
sang2004, June 10, 2004 - 7:42 am UTC
these reviews were somewhat but not very useful to me,
as I am a developer and need to migrate from an 8.0.5 to an 8.1.7 server, but I don't know how to begin from my PC -- whether I should install 8.1.7 on my machine. 8.1.7 is already installed on the server machine, which is at a different location.
June 10, 2004 - 9:35 am UTC
if you are the developer, why are you doing the migration -- typically a DBA job. so, you must be a "DBA/Developer".
You could
a) export from 805 using the 805 export tool against the 805 database and then import into 817 using the 817 import tool after ftp'ing the file (in binary mode -- the windows ftp client defaults to ASCII and will corrupt it). see the sketch after this answer.
b) upgrade the 805 database using the documented upgrade techniques (as found in the migration guide -- available on otn.oracle.com as part of the standard documentation set)
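option (a) sketched out (the tns aliases, account and file names here are made up):

$ # on a machine with the 8.0.5 software installed
$ exp userid=system/manager@db805 owner=app_owner file=app_owner.dmp log=exp.log

$ # ftp app_owner.dmp to the 8.1.7 server in BINARY mode, then on that server:
$ imp userid=system/manager@db817 file=app_owner.dmp fromuser=app_owner touser=app_owner log=imp.log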
EXP/IMP on cross platform
atul, July 01, 2004 - 7:02 am UTC
Hi,
I need your help,
I have Oracle 7.2 installed on HP-UX (10.0.1B).
We need to migrate this Oracle 7.2 to Linux or SCO.
Can I do that using the following steps?
1) Full export of the database
2) Install 7.2 on SCO/Linux
3) Full import from the export file
Will that do? I am not able to find any info on "metalink",
so please help.
Thank,
Atul
July 01, 2004 - 11:12 am UTC
7.2? not going to run on linux, might run on a really really old SCO release.
that is if you can even get the software anymore -- that release was desupported many many years ago.
Using Exp80 with 10g
Arvind, April 08, 2005 - 2:04 am UTC
Dear Tom,
Hello,
I am facing a problem. I have multiple versions of Oracle installed at my organisation, from Oracle 8.0 to Oracle 10g. Now I want to transfer data from Oracle 10g to Oracle 8.0. I have tried to export the 10g data using exp80, but it is giving error EXP-00003. Can you tell me the solution for this?
Thanx
April 08, 2005 - 7:02 am UTC
not sure if there is one, only supported versions of the database would have been tested.
I'd probably recommend "unload" and "reload". You can try some of the tools:
http://asktom.oracle.com/~tkyte/flat/index.html
to facilitate that with sqlldr.
Or, you can try the sqlplus copy command to copy data.
Or, you can try database links to copy data.
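the database link approach, sketched (the link name, tns alias and table are made up -- and whether an 8.0 database can even open a link to a 10g database is a question for support):

-- run in the 8.0 target database
create database link src10g connect to scott identified by tiger using 'ORA10G';

-- pull the rows across the link, table by table
create table emp as select * from emp@src10g;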
BLOB export problem from 10g using 9i export EXP-00003
Jelena, April 21, 2005 - 9:52 am UTC
Hi Tom,
we have the same issue: we want to export from 10g (10.1.0.3) and import the data into 9i (9.2.0.3). Almost all tables got exported successfully except for one table.
What is special about that one table is that it has 3 BLOB columns and they come as the first 3 columns, but even after removing those columns the export still doesn't work.
I tried creating the table with CREATE TABLE AS SELECT with a different column order, dropping the BLOB columns, and it's the same...
This is the error :
Export: Release 9.2.0.3.0 - Production on Thu Apr 21 15:12:52 2005
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Connected to: Oracle Database 10g Release 10.1.0.3.0 - Production
With the Real Application Clusters option
Export done in WE8ISO8859P15 character set and AL16UTF16 NCHAR character set
server uses WE8ISO8859P1 character set (possible charset conversion)
About to export specified tables via Conventional Path ...
. . exporting table FOLDERX_NEW
EXP-00003: no storage definition found for segment(5, 44307)
Export terminated successfully with warnings.
April 21, 2005 - 12:01 pm UTC
what does the table look like
table desc...
Jelena, April 21, 2005 - 12:55 pm UTC
This is the table which cannot be exported from the 10g database with the 9i exp utility:
DEFAULTCATEGORIESX BLOB
DEFAULTNOTIFICATIONSX BLOB
DEFAULTPERMISSIONSX BLOB
JDOIDX NOT NULL NUMBER
DEFAULTBUSINESSTYPEX NUMBER
DEFAULTCATEGORIESFLAGX NUMBER(1)
exp from 10g works fine on the same table.
When I drop those BLOB columns, exp from 9i also works fine.
I also tried the reverse order of columns (NUMBER, NUMBER, NUMBER, BLOB, BLOB, BLOB); that also doesn't work.
Thanks for your help
April 22, 2005 - 9:14 am UTC
no, the create table statement -- the one you used.
dbms_metadata.get_ddl can get it if you lost it.
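for example (run as the table owner; get_ddl defaults to the current schema):

set long 100000
select dbms_metadata.get_ddl( 'TABLE', 'FOLDERX_NEW' ) from dual;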
Partial import
Thiru, April 26, 2005 - 3:17 pm UTC
Tom,
Is there a way to do a partial import of a particular table from a full table export? I know we can do a partial export using the query parameter, but if I have a full table dump, is there any workaround?
Thanks for the time.
April 26, 2005 - 3:39 pm UTC
not really, not practically.
I mean, you could set up a view that has the name of the table, import into the view and, using an instead-of trigger on the view, only really insert the rows you wanted, ignoring the others.
it would be a "hack"
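a minimal sketch of that hack, assuming the dump contains a table T and you only want the rows with X > 100 (all of the names here are made up; run imp with ignore=y so the failed "create table" is simply skipped):

create table t_real ( x int, y varchar2(30) );

-- a view with the exported table's name; imp will insert into it
create or replace view t as select * from t_real;

create or replace trigger t_filter
instead of insert on t
for each row
begin
    if ( :new.x > 100 )       -- keep only the rows you actually want
    then
        insert into t_real ( x, y ) values ( :new.x, :new.y );
    end if;
end;
/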
Export
Rajagopal R, May 10, 2005 - 2:14 am UTC
Dear Tom
If it is possible to export selected tables only, please advise me.
Thanks and Regards
Rajagopal R
May 10, 2005 - 8:14 am UTC
$ exp help=y
displays all of exp's parameters.
the documentation tells you more.
exp tables=(t1,t2,t3,....)
move database,
sns, May 12, 2005 - 3:31 pm UTC
We are planning to retire certain SUN boxes and move their databases to a new LINUX box.
The database is around 800Gig and has numerous tables and indexes.
Right now our team is thinking of using SQL statements to move the data over DB links. I think it would take several hours to achieve this.
I suggested using export/import. The system engineers are not providing disks to store the exported data (the most they can provide is 150Gig).
In this scenario, do you have better ideas for relocating the entire database to a new server? The version of the new database is 10g and the old one is 9.2.0.5.
thanks
May 12, 2005 - 3:48 pm UTC
it could take more than several hours.
can 10g be considered here at all?
follow up,
sns, May 12, 2005 - 3:54 pm UTC
Let me be clear in my question. I double checked with my manager.
The old database is 8.1.7 and the new database we are moving into is 10.0.1.3.
We are moving from a SUN box that has Oracle 8.1.7 to a LINUX box that has Oracle 10.0.1.3.
Thanks,
May 12, 2005 - 9:09 pm UTC
you could upgrade the 817 instance to 10g and transport the tablespaces to the new instance (no dump, no unload, no reload).
great,
sns, May 13, 2005 - 10:36 am UTC
that was my thought too. I just want to ensure that TRANSPORT TABLESPACE can be used to move objects from one database to another (across different OS platforms).
Approximately how long would it take to transport around 800Gig of objects?
thanks,
May 13, 2005 - 10:50 am UTC
it would be a matter of how fast rman can read the source, filter it, and plop it down onto the target (byte order needs to be fixed).
and then how long it takes to import the metadata, which is more a function of the number of objects than of size -- you would need to benchmark this (since you will test this 100 times before actually cutting over, you'll have lots of opportunities to time it)
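roughly, for a single tablespace, the 10g cross platform transport looks like this (the file names, directory object and source platform string are made up -- query v$transportable_platform for the real platform names):

SQL> alter tablespace users read only;

$ expdp system/manager directory=dp_dir dumpfile=users_meta.dmp transport_tablespaces=users

$ # copy users01.dbf and users_meta.dmp to the linux box, fix the byte order there with rman:
RMAN> convert datafile '/tmp/users01.dbf' from platform 'Solaris[tm] OE (64-bit)' format '/u01/oradata/l10g/users01.dbf';

$ impdp system/manager directory=dp_dir dumpfile=users_meta.dmp transport_datafiles=/u01/oradata/l10g/users01.dbf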
Getting Error EXP-00003: no storage definition found for segment(4, 2611)
Nags, May 20, 2005 - 10:54 am UTC
I am trying to export using 9.2 from a 10.1 database so that I can import it back into a 9.2 database. I am getting the error EXP-00003: no storage definition found for segment(4, 2611)
Export output :
Export: Release 9.2.0.1.0 - Production on Fri May 20 10:30:08 2005
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Connected to: Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Production
With the Partitioning, OLAP and Data Mining options
Export done in AL32UTF8 character set and AL16UTF16 NCHAR character set
. . exporting table SCHEMA_HISTORY
EXP-00003: no storage definition found for segment(4, 4987)
The structure of the table is below
DBMS_METADATA.GET_DDL('TABLE','SCHEMA_HISTORY','N1PT')
--------------------------------------------------------------------------------
CREATE TABLE "N1PT"."SCHEMA_HISTORY"
( "BUILD_NUMBER" NUMBER(10,3) NOT NULL ENABLE,
"CHANGE_NUMBER" NUMBER NOT NULL ENABLE,
"SCRIPT" CLOB NOT NULL ENABLE,
"UPD_USER" NVARCHAR2(20),
"UPD_DATE" DATE,
"UPD_VERSION" NUMBER
) PCTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 NOCOMPRESS LOGGING
STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT)
TABLESPACE "USERS"
LOB ("SCRIPT") STORE AS (
TABLESPACE "USERS" ENABLE STORAGE IN ROW CHUNK 8192 PCTVERSION 10
NOCACHE
STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645
PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT))
Where am I going wrong?
May 20, 2005 - 6:27 pm UTC
I don't see anything obvious -- I'm on the road, please utilize support for this one.
Transport TBS (Version 9.2)
A reader, May 20, 2005 - 8:08 pm UTC
Hi Tom,
If the block sizes of the two systems are different, what is your recommended approach for copying data from one database to another (maybe use the multiple block size support feature in the target database)?
Regards
May 21, 2005 - 8:45 am UTC
you can use that to transport the tablespace, but I would recommend getting it into the default blocksize so you can get back to a single buffer cache if this copy is "long term".
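for example, if the transported tablespace uses 16k blocks and the target default is 8k (the cache size and segment names are made up):

-- give the instance a buffer cache for the 16k blocks before plugging the tablespace in
alter system set db_16k_cache_size = 64m;

-- later, to get back to a single cache, move the segments into a default-block-size tablespace
alter table big_t move tablespace users;
alter index big_t_pk rebuild tablespace users;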
to Nags from Boston, USA
Alexey, June 29, 2005 - 9:50 am UTC
There is a workaround for this issue.
See Note:274076.1 "EXP-00003 When Exporting From 9.2.0.5.0 or any Higher Release with a Pre-9.2.0.5.0 Export Client".
This sent me in the correct direction
Jim Buskirk, February 27, 2006 - 2:50 pm UTC
import 10g dump file to 8i
Pratheep, March 28, 2007 - 3:34 am UTC
Hi,
I want to import a 10g exported dump file into an 8i database. I tried once and got a "header file mismatch" sort of error. Can anyone explain whether this is possible and, if possible, please provide the steps to do it.
n.pratheep@gmail.com
Thank you very much in advance..
March 28, 2007 - 11:39 am UTC
you need to use the 8i export tool against the 10g database in order to create a file that 8i import can read.
Imp Hangs at About to enable constraints
Santosh Dash, May 15, 2009 - 6:17 am UTC
Dear Sir,
We have two servers, one running RHEL4 and Oracle 10g R1 and another running RHEL5 and Oracle 10g R2, and all the other workstations have Windows XP and Oracle 9i installed. I am trying to export data from the RHEL4/Oracle 10g R1 server via Windows XP/Oracle 9i and restore it to the RHEL5/Oracle 10g R2 server. All tables imported successfully, but imp hangs saying "About to enable constraints...". Please help.
May 15, 2009 - 1:46 pm UTC
have you verified it is stopped.
or are you just thinking constraints should be turned on infinitely fast - you do realize that there will be tons of scanning going on?
About to enable constraints......
Santosh Dash, May 16, 2009 - 3:23 am UTC
Dear Sir,
I had tested it before. Even when I left the import running for the whole night, it did not finish.
May 23, 2009 - 10:49 am UTC
so, what if it took a night and 5 minutes to finish?
I'm right back where I started -
"have you verified it is stopped." no, you did not, at least you do not say otherwise
"or are you just thinking constraints should be turned on infinitely fast - you do realize that there will be tons of scanning going on?" this is probably what was happening
had you checked the v$ tables, you could have ascertained what if anything was happening.
Equivalent of impdp's remap_schema in imp
A reader, July 09, 2009 - 11:16 am UTC
I have a dmp file (from the legacy exp) containing user A and user B.
I would like to import it into a different database as user Y and user Z respectively.
What is the equivalent of impdp's remap_schema in imp?
Thanks a lot
July 14, 2009 - 2:52 pm UTC
fromuser touser
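for the example in the question, something like this (the dump file name is made up; the lists are matched positionally and the target users must already exist):

$ imp userid=system/manager file=expdat.dmp fromuser=a,b touser=y,z log=imp.log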
A question
Kishore, September 16, 2009 - 5:30 pm UTC
Hi Tom,
I am sorry to ask you a question here!
I need to export a schema from Oracle 7.3, which is hosted on an OpenVMS box, and import it into Oracle 8i on an AIX 4.3 box.
Is that possible with the normal exp/imp commands, or does it need any extra effort?
September 16, 2009 - 5:48 pm UTC
exp/imp are cross platform
you would exp using the 7.3 export tool
you would import using 8i import on the platform you are importing on
you would ensure you do a BINARY ftp of the file (make sure the file does not change size as you move it)
exp-imp parameters on 8i and 9i
Deepak Vij, November 04, 2009 - 1:53 am UTC
hi Tom,
I need your help to find out what the parameters of export/import are on Oracle 8i, 9i, 10g and 11g.
November 09, 2009 - 3:15 pm UTC
well, I usually consult the documentation for stuff like that.
how do you usually do this with software you purchase and use?
Data Migration between different platforms
Ranajit, March 06, 2010 - 9:36 am UTC
Dear Tom,
I had a question, but I probably got the answer from all your replies on the issue. My organisation wants to audit a 10g database while we have only 9i. In some of your replies I found that export/import of data from an upper to a lower version is possible, or at least the SQL copy command could do it. But in one reply, I find that it is not possible.
In any case, thank you very much for providing such a wonderful platform. Our organisation (IAAD) is a customer of Oracle. If I am correct, please respond; I have to tell my subordinates to act accordingly. The auditee unit is saying that it would not be possible to have a dump of their 10g database in our 9i. I tend to differ. Thanks again.
March 09, 2010 - 10:45 am UTC
My organisation wants to audit a 10g database while we have only 9i.
huh?
you obviously have 10g, if your organization has 10g - so do you.
Full Import different versions
Andre, December 14, 2010 - 5:53 am UTC
Hello guys,
Is it right to perform a full import when the source and target databases are different versions? For example, do a full export of Oracle 10g R1 and a full import into Oracle 10g R2. Will I have to execute an upgrade operation on the target database after the full import?
Thanks,
Andre.
December 14, 2010 - 7:28 am UTC
the database is already upgraded - you do not get sys/system things during the export.
you can export from version X and import into version Y given that X <= Y, yes.
Andre, December 14, 2010 - 2:56 pm UTC
Thanks for the answer. I just would like to confirm that a full import will not prejudice the data dictionary or anything else.
December 15, 2010 - 12:33 am UTC
it certainly 'affects' the data dictionary - as it will create lots of objects. Not sure what you mean by "prejudice", it will not "corrupt or break" it.
Best Practices
Andre, December 15, 2010 - 6:04 am UTC
So, what would be considered a best practice for migrating a database to a different server and a different major Oracle version, in a clean way? I don't know how bad it is to create those extra objects that wouldn't be used by the database, for example.
Usually, I generate scripts to create the tablespaces, users and system privileges, and then perform an import of just the non-system owners. I was wondering if I could use Data Pump to improve this procedure. I do this a lot, and I'm always trying to find a better way of doing these kinds of tasks.
Thanks.
December 15, 2010 - 11:24 am UTC
I would not call it a best practice at all. It would be a painfully slow way to accomplish that. I'd rather just either
o backup and restore if the server platforms are the same and then upgrade using the normal upgrade process
o full database transport (just like backing up and restoring) if the server platforms are different but the byte orders are the same and then upgrade normally
o use expDP to get the code (plsql, views, synonyms, etc) over to a new database and then cross platform transport the data and then upgrade normally (a sketch of the expDP piece follows below)
before I'd consider doing a full database export to disk and then a full database import.
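the expDP piece of that last option might look like this (the directory object, schema and file names are made up):

$ cat code_only.par
directory=dp_dir
dumpfile=code_only.dmp
schemas=app_owner
include=VIEW
include=SYNONYM
include=PROCEDURE
include=FUNCTION
include=PACKAGE
include=TRIGGER

$ expdp system/manager parfile=code_only.par

$ # then on the new database:
$ impdp system/manager directory=dp_dir dumpfile=code_only.dmp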