
Question and Answer

Tom Kyte

Thanks for the question, Beverley.

Asked: June 16, 2005 - 2:42 pm UTC

Last updated: June 16, 2005 - 3:48 pm UTC

Version: 8.1.7.2

Viewed 1000+ times

You Asked

I have read through your posts on export/import of large tables, so thank you for all your info.
I have one 12 GB table with a LONG column that I am getting ready to export/import to reclaim space. Do you know of any way to calculate how long the import will take? I currently export the whole database (about 80 GB) through a compressed pipe, and this takes one hour 14 minutes. The 12 GB table is the largest in this database. Thanks so much.

and Tom said...

I would guess "1/4 of that" to export.

It'll be the import that kills you. You had better be getting a ton of space back (I'm dubious; any space gained would be purely temporary, since space is used and reused within the segment pretty efficiently).

The import will load a row at a time, so it won't be "fast". If you are dead serious about doing this, you might consider "do it yourself parallelism": export the table in many exp sessions, using "query=" to get different, non-overlapping slices of the table, so you can then import the slices in parallel with commit=n to get it loaded more quickly.
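A minimal sketch of that "do it yourself parallelism" with two slices might look like the following. The schema (scott/tiger), table name (bigtab), and the mod(id,2) slicing predicate are illustrative assumptions, not from the question; you would pick predicates that partition your own table evenly, and you can use as many slices as you have CPU and I/O to spare.

```shell
# Export two non-overlapping slices of the table concurrently.
# query= restricts each exp session to its own slice (conventional path only).
exp scott/tiger tables=bigtab file=slice1.dmp query=\"where mod(id,2)=0\" &
exp scott/tiger tables=bigtab file=slice2.dmp query=\"where mod(id,2)=1\" &
wait

# Import the slices in parallel into the pre-created table.
# ignore=y skips the "table already exists" error for all but the first session;
# commit=n (the default) commits once per table instead of per buffer.
imp scott/tiger file=slice1.dmp ignore=y commit=n &
imp scott/tiger file=slice2.dmp ignore=y commit=n &
wait
```

Quoting the query= string on the command line is shell-dependent and fiddly; putting each session's parameters in a parfile (exp parfile=slice1.par) avoids the escaping entirely.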
