Excellent
Nikhil, November 07, 2016 - 12:07 pm UTC
Hi Connor,
Thanks a lot for your quick reply.
If the data volume is in the billions, will there be any issue with the HWM (high water mark)?
I read about this issue in some articles, which is why I'm asking, just for my knowledge.
November 09, 2016 - 1:47 am UTC
direct mode insert adds *after* the HWM, but if you're adding billions of rows, that is probably what you want
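As an aside, you can see where the HWM sits for a segment with the DBMS_SPACE package. A minimal sketch (the table name T_LOCAL is a placeholder):

```sql
-- Report how many blocks sit below the HWM for a table.
-- T_LOCAL is a placeholder name; run with SERVEROUTPUT ON.
declare
  l_total_blocks  number;  l_total_bytes  number;
  l_unused_blocks number;  l_unused_bytes number;
  l_file_id number; l_block_id number; l_last_block number;
begin
  dbms_space.unused_space(
    segment_owner             => user,
    segment_name              => 'T_LOCAL',
    segment_type              => 'TABLE',
    total_blocks              => l_total_blocks,
    total_bytes               => l_total_bytes,
    unused_blocks             => l_unused_blocks,
    unused_bytes              => l_unused_bytes,
    last_used_extent_file_id  => l_file_id,
    last_used_extent_block_id => l_block_id,
    last_used_block           => l_last_block);
  dbms_output.put_line('Blocks below HWM: ' || (l_total_blocks - l_unused_blocks));
end;
/
```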
o1ga, November 07, 2016 - 12:46 pm UTC
And what about SQL*Plus COPY command?
November 09, 2016 - 1:47 am UTC
COPY is fine too, but it is basically no different from fetch/insert
don't forget about sqlcl
A reader, November 07, 2016 - 3:07 pm UTC
sqlcl's copy command is ideal for things like this too. no need for a database link.
SQL> help copy
COPY
----
Non sqlplus additions are explicit BYTE options (default is CHAR
i.e. varchar2(1000 CHAR)) and LOB support.
Copies data from a query to a table in the same or another
database. COPY supports CHAR, DATE, LONG, NUMBER, LOB and VARCHAR2.
COPY {FROM database | TO database | FROM database TO database}
{APPEND|CREATE|INSERT|REPLACE|APPEND_BYTE|CREATE_BYTE|REPLACE_BYTE}
destination_table
[(column, column, column, ...)] USING query
where database has the following syntax:
username[/password]@connect_identifier
SQL>
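Following the help text above, a copy between two databases might look like this (users, connect identifiers, and table names are placeholders; you will be prompted for passwords if they are omitted):

```sql
SQL> copy from scott@db1 to scott@db2 -
> create emp_copy -
> using select * from emp
```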
A reader, November 09, 2016 - 2:43 am UTC
Need Clarification on some points
A reader, November 18, 2016 - 6:18 am UTC
Hi Connor,
Thanks for your reply.
But will there be any problem with the High Water Mark if we use the APPEND hint?
Are there any steps required if we go for INSERT with APPEND?
November 19, 2016 - 1:54 am UTC
Check the docs for restrictions regarding triggers/constraints etc.
But if you are doing large-scale inserts (and you can have the table locked, ie, not accessible by others), then APPEND is the way to go
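One practical restriction worth a sketch (table names are placeholders): after a direct-path insert, the same session cannot read the table again until it commits, so the commit must come before any follow-up query.

```sql
-- Direct-path load; T and S are placeholder table names.
insert /*+ APPEND */ into t select * from s;

-- select count(*) from t;   -- would raise ORA-12838 at this point

commit;                      -- direct-path changes must be committed first

select count(*) from t;      -- fine after the commit
```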
Need clarification on few points
Nikhil, November 18, 2016 - 6:23 am UTC
Hi Connor,
Thanks for your reply.
APPEND inserts records above the High Water Mark, so will there be any issues after inserting a huge amount of data using APPEND?
Are there any steps needed after an INSERT with APPEND to avoid any issues?
Need to run it from local database
Salaam Yitbarek, November 23, 2018 - 4:59 pm UTC
insert /*+ APPEND */ into local_table select * from table@database_link;
is good, but I have a job that runs from the local database. It does:
1) insert /*+ APPEND */ into local_staging select * from local_source
2) partition exchange local_staging with local_destination
3) calls a procedure on remote database to:
3a) insert /*+ APPEND */ into remote_staging select * from local_source@local_link
3b) partition exchange remote_staging with remote_destination
Of course, 3a does not do a direct path since it's distributed (initiated by a call over a db link). How can I get around this?
The source table contains potentially one billion rows, and my goal is speed.
November 26, 2018 - 11:32 am UTC
How about something along these lines:
3) call a procedure on the remote database which submits job "X"
"X" does (3a) and (3b).
That way, the process is launched locally and you will get a direct load.
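A sketch of that idea using DBMS_SCHEDULER (the procedure, job, table, and link names are all placeholders; the staging/destination objects are from the question above):

```sql
-- On the remote database: wrap steps 3a/3b in a procedure...
create or replace procedure load_and_exchange as
begin
  execute immediate
    'insert /*+ APPEND */ into remote_staging
       select * from local_source@local_link';
  commit;   -- required after a direct-path insert
  execute immediate
    'alter table remote_destination
       exchange partition p1 with table remote_staging';
end;
/

-- ...and have the procedure called over the db link submit it as a
-- one-off job, so the insert is initiated locally on the remote side.
begin
  dbms_scheduler.create_job(
    job_name   => 'X',
    job_type   => 'STORED_PROCEDURE',
    job_action => 'LOAD_AND_EXCHANGE',
    enabled    => true,    -- start immediately
    auto_drop  => true);   -- clean up after it completes
end;
/
```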