Copy table data From One DB to Another DB

Question and Answer

Connor McDonald

Thanks for the question, Nikhil.

Asked: November 07, 2016 - 11:33 am UTC

Last updated: November 26, 2018 - 11:32 am UTC

Version: 11.2.0

Viewed 100K+ times!

You Asked

Hi Team,

I need to Copy table data From One DB to Another DB.

One approach I can recall from a conversation on asktom.oracle.com is to create a database link and simply execute:
insert into local_table select * from table@database_link;

Will this approach work efficiently if we need to copy a table with around 99,000,000,000 rows?

Or

Could you suggest any alternative?

Thanks!!

and Connor said...

For a one-off copy, you can do a direct mode insert:

insert /*+ APPEND */ into local_table select * from table@database_link;

with, of course, no indexes, constraints etc. applied until after the population.

Similarly, these operations can be parallelised, but typically the limitations here are more resource based (ie, network throughput, or IO/CPU consumption).
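As a rough sketch of what such a parallel direct-path load over the link might look like (the object names are placeholders based on the question, and the degree of parallelism of 8 is only an illustrative value to be sized against your own network/IO/CPU headroom):

alter session enable parallel dml;

-- local_table and remote_table@database_link stand in for the real objects
insert /*+ APPEND PARALLEL(local_table, 8) */ into local_table
select /*+ PARALLEL(t, 8) */ *
from   remote_table@database_link t;

-- the loaded rows are not readable, even by this session, until committed
commit;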

Rating: (7 ratings)


Comments

Excellent

Nikhil, November 07, 2016 - 12:07 pm UTC

Hi Connor,

Thanks a lot for your quick reply.
If the data volume is in the billions, will there be any issue with the HWM (high water mark)?

I read about this issue in some articles; I'm asking just for my knowledge.
Connor McDonald
November 09, 2016 - 1:47 am UTC

direct mode insert adds *after* the hwm, but if you're adding billions that is probably what you want

o1ga, November 07, 2016 - 12:46 pm UTC

And what about SQL*Plus COPY command?
Connor McDonald
November 09, 2016 - 1:47 am UTC

COPY is fine too, but is basically no different to fetch/insert
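For illustration only, a COPY run in SQL*Plus might look like the following; the connect strings and table names are placeholders, and ARRAYSIZE/COPYCOMMIT are the settings that control the fetch/insert batching:

set arraysize 5000      -- rows fetched per round trip
set copycommit 10       -- commit after every 10 batches (arraysize * copycommit rows)
set long 2000000        -- needed if LONG columns are being copied

copy from scott@remote_db to scott@local_db -
  insert local_table using select * from remote_table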

don't forget about sqlcl

A reader, November 07, 2016 - 3:07 pm UTC

sqlcl's copy command is ideal for things like this too. no need for a database link.

SQL> help copy
COPY
----

Non sqlplus additions are explicit BYTE options (default is CHAR
i.e. varchar2(1000 CHAR)) and LOB support.

Copies data from a query to a table in the same or another
database. COPY supports CHAR, DATE, LONG, NUMBER, LOB and VARCHAR2.

COPY {FROM database | TO database | FROM database TO database}
{APPEND|CREATE|INSERT|REPLACE|APPEND_BYTE|CREATE_BYTE|REPLACE_BYTE}
destination_table
[(column, column, column, ...)] USING query

where database has the following syntax:
username[/password]@connect_identifier

SQL>
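As a hypothetical example following the syntax shown above (the connect identifier and table names are placeholders), copying a remote table into the current connection could look like:

SQL> copy from scott@remote_db create local_copy using select * from big_table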

Need Clarification on some points

A reader, November 18, 2016 - 6:18 am UTC

Hi Connor,

Thanks for your reply.
But will there be any problem with the High Water Mark if we use the APPEND hint?

Are there any steps required if we go for INSERT with APPEND?
Connor McDonald
November 19, 2016 - 1:54 am UTC

Check the docs for restrictions regarding triggers/constraints etc.

But if you are doing large scale inserts (and you can have the table locked, ie, not accessible by others), then APPEND is the way to go
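A rough sketch of the usual sequence for a large one-off APPEND load, tying the points above together (all object names are placeholders, and whether you mark indexes unusable or drop and recreate them is a judgment call for your own system):

-- 1) Minimise index maintenance: mark (non-unique) indexes unusable or drop them
alter index local_table_ix unusable;

-- 2) Direct-path load; note that triggers and some constraint types silently
--    disable direct path, so check the restrictions in the docs first
insert /*+ APPEND */ into local_table
select * from remote_table@database_link;

commit;   -- the table cannot be re-queried in this session until the commit

-- 3) Rebuild indexes afterwards (optionally in parallel)
alter index local_table_ix rebuild parallel 8;

-- 4) Refresh optimizer statistics
exec dbms_stats.gather_table_stats(user, 'LOCAL_TABLE')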

Need clarification on a few points

Nikhil, November 18, 2016 - 6:23 am UTC

Hi Connor,
Thanks for your reply.

APPEND will insert records above the High Water Mark, so will there be any issues after inserting a huge amount of data using APPEND?

Are there any steps post INSERT with APPEND to avoid any issues?

Need to run it from local database

Salaam Yitbarek, November 23, 2018 - 4:59 pm UTC

insert /*+ APPEND */ into local_table select * from table@database_link;

is good, but I have a job that runs from the local database. It does:

1) insert /*+ APPEND */ into local_staging select * from local_source
2) partition exchange local_staging with local_destination
3) calls a procedure on remote database to:
3a) insert /*+ APPEND */ into remote_staging select * from local_source@local_link
3b) partition exchange remote_staging with remote_destination

Of course, 3a does not do a direct-path insert since it's distributed (initiated by a call over a db link). How can I get around this?

source_table contains potentially one billion rows and my goal is speed.
Connor McDonald
November 26, 2018 - 11:32 am UTC

How about something along these lines:

3) calls a procedure on the remote database which submits job "X"

"X" does (3a) and (3b).

In that way, the process is launched locally and you will get direct load.
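A minimal sketch of that idea, assuming the remote-side procedure submits a one-off DBMS_SCHEDULER job so that steps 3a/3b run as a locally initiated transaction on the remote database (the procedure, job, partition and object names are all placeholders):

-- runs on the remote database; the job body performs 3a and 3b locally,
-- so the insert is no longer distributed and can use direct path
create or replace procedure launch_remote_load as
begin
  dbms_scheduler.create_job(
    job_name   => 'LOAD_REMOTE_STAGING',
    job_type   => 'PLSQL_BLOCK',
    job_action => q'[
      begin
        execute immediate
          'insert /*+ APPEND */ into remote_staging
           select * from local_source@local_link';
        commit;
        execute immediate
          'alter table remote_destination
             exchange partition p1 with table remote_staging';
      end;]',
    enabled    => true,     -- run immediately
    auto_drop  => true);    -- drop the job definition once it completes
end;
/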
