export and import taking a lot of time

Question and Answer

Connor McDonald

Thanks for the question, Ruth.

Asked: February 26, 2019 - 9:27 am UTC

Last updated: March 01, 2019 - 1:09 am UTC

Version: 10g

Viewed 1000+ times

You Asked

Hi,

I need to export and import a table with 2 million rows in it, using the exp/imp commands.
I tried the commands below, but it took a lot of time to export and then import the data. Please help.

Commands:

exp test/test full=y file=REL.dmp

imp test/test file=REL.dmp full=y ignore=y indexes=n constraints=n
imp test/test file=REL.dmp full=y ignore=y rows=n


Thanks

and Connor said...

You should probably look at using Data Pump instead of exp/imp.
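For a single table, the Data Pump calls would look something like this (a sketch only: it assumes a directory object such as DATA_PUMP_DIR exists, and reuses the test credentials and the REL table from your commands):

expdp test/test tables=REL directory=DATA_PUMP_DIR dumpfile=REL.dmp logfile=REL_exp.log
impdp test/test tables=REL directory=DATA_PUMP_DIR dumpfile=REL.dmp logfile=REL_imp.log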

In the interim, some things you can do:

Do your exports in direct mode

exp ... direct=y recordlength=65535

The same recordlength parameter might help for your imports as well. Alternatively, you can use import to generate an indexfile script, and then build the indexes in nologging fashion (remembering to back up your database afterwards).
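As a sketch (the index and column names below are placeholders):

imp test/test file=REL.dmp indexfile=REL_indexes.sql full=y

then edit REL_indexes.sql so each statement builds its index nologging, for example:

CREATE INDEX REL_IX1 ON REL (COL1) NOLOGGING;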

Finally, if you are copying an entire database, then check out the DUPLICATE command in RMAN. It copies the datafiles rather than unloading/reloading data and can be orders of magnitude faster.
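A minimal sketch of the syntax (the connect strings and duplicate database name are placeholders, and the auxiliary instance must already be started with access to the source backups):

rman target sys@prod auxiliary sys@dup
RMAN> duplicate target database to dup;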

Comments

Exp/imp commands

Ruth Kumar, February 28, 2019 - 6:05 am UTC

Thank you for the response.

I would like to add that while doing the imp I used the command below to generate a script with the CREATE INDEX statements.

imp u/p file=expdat.dmp indexfile=REL.sql IGNORE=N ROWS=N COMMIT=N tables=REL

and then imported the data.

exp u/p full=y direct=y recordlength=65535 file=REL.dmp log=REL.log

and then ran the index script in nologging fashion. Could you please explain how nologging works and how it helps here?

Also, is the above correct?

Thanks,
Ruth
Connor McDonald
March 01, 2019 - 1:09 am UTC

There is a redo cost to doing...well...anything, so you can get some benefits from using nologging when creating an index (at the cost of needing to back up the affected datafiles after the operation).

Similarly, you could explore parallel options, so with your index file you can do something like:

sed 's/logging/nologging parallel/g'

or similar to convert your index file into one that creates the indexes nologging and in parallel. (Don't forget to reset them to parallel 1 after the operation.)
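For example, something like this (the file names are placeholders; adjust the pattern's case to match the DDL in your file):

sed 's/logging/nologging parallel/g' REL_indexes.sql > REL_indexes_fast.sql

and once the builds are done, reset each index, e.g.

ALTER INDEX REL_IX1 NOPARALLEL;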

Ruth Kumar, February 28, 2019 - 6:07 am UTC

Sorry, I imported the data as:
imp u/p file=REL.dmp INDEXES=N

More to Explore

Utilities

All of the database utilities are explained in the Utilities guide.