Inserting data into destination table from source if data is not present in destination table, taking more time due to large volume of data


Question and Answer

Chris Saxon

Thanks for the question, Tarak.

Asked: July 10, 2020 - 5:33 am UTC

Last updated: July 10, 2020 - 2:41 pm UTC

Version: database version 11g release 2

Viewed 1000+ times

You Asked

sample example:
/* This procedure takes approx. 20-25 minutes to complete because it selects a
   large volume of data. Is there any way to reduce the execution time? */
procedure sample ( a_in in varchar2 )
is
  v_row  number;
  v1_row number;
  v2_row number;

  cursor c1 is
    select a_value, b_value, ... from source_table <where condition>;
  /* cursor c1 selects 46 million records, but only a few rows are inserted into
     the two destination tables below based on conditions;
     source_table is a FORCE view */
begin
  for i in c1
  loop
    v_row := 0;

    select count(1) into v_row
    from   table_item
    where  item = i.a_value||'_'||a_in;

    if v_row > 0 then

      select count(1) into v1_row
      from   destination_table1
      where  item1 = (i.b_value||'_'||a_in);

      if v1_row = 0 then
        insert into destination_table1 (a_value, b_value)
        values (i.a_value, i.b_value);
        commit;
      end if;

      if i.b_value is not null then
        v2_row := 0;

        select count(1) into v2_row
        from   destination_table2
        where  item2 = (i.a_value||'_'||a_in)
        and    item3 = (i.b_value||'_'||a_in);

        if v2_row = 0 then
          insert into destination_table2 (item2, item3)
          values (i.a_value||'_'||a_in, i.b_value||'_'||a_in);
          commit;
        end if;
      end if;
    end if;
  end loop;
end sample;
/* this procedure takes approx. 20 minutes to complete */

and Chris said...

cursor c1 selecting 46 million records

So you're looping through 46 million rows?!

No wonder this is slow!

Putting SQL inside a loop => SLOOOOOOOW

If you want fast SQL, ideally you want one statement that changes all the rows.

Committing inside a loop is another cardinal sin. This slows the process down and destroys transactional integrity.

It may be possible to replace the loop with a single multitable insert:

insert all
  when ... then into t1 ...
  when ... then into t2 ...
select <the 46 million row query>
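
For example, here's a sketch under the assumption that the existence checks can move out of the loop and into the query itself. The WHEN clauses of a multitable insert can only reference columns of the subquery, so the NOT EXISTS checks (which replace the per-row COUNT lookups) are computed as flag columns. All table and column names are taken from the question; :a_in stands in for the procedure's parameter:

```sql
insert all
  when dest1_missing = 1 then
    into destination_table1 (a_value, b_value) values (a_value, b_value)
  when b_value is not null and dest2_missing = 1 then
    into destination_table2 (item2, item3) values (a_key, b_key)
select s.a_value,
       s.b_value,
       s.a_value || '_' || :a_in as a_key,
       s.b_value || '_' || :a_in as b_key,
       -- flag: row not yet in destination_table1
       case when not exists (
              select null from destination_table1 d1
              where  d1.item1 = s.b_value || '_' || :a_in
            ) then 1 else 0 end as dest1_missing,
       -- flag: row not yet in destination_table2
       case when not exists (
              select null from destination_table2 d2
              where  d2.item2 = s.a_value || '_' || :a_in
              and    d2.item3 = s.b_value || '_' || :a_in
            ) then 1 else 0 end as dest2_missing
from   source_table s
where  <where condition>
and    exists (   -- replaces the table_item count check
         select null from table_item ti
         where  ti.item = s.a_value || '_' || :a_in
       );
```

One statement, one commit at the end: the database does all the row-by-row work internally, which is far faster than 46 million round trips through PL/SQL.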


If not, then replacing this with bulk processing (BULK COLLECT and FORALL) should give you big performance gains.

https://blogs.oracle.com/oraclemagazine/on-bulk-collect
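
A minimal sketch of the bulk approach, shown for destination_table2 only and assuming the existence checks can be folded into the cursor query. Names come from the question; the LIMIT of 1000 is an arbitrary batch size you'd tune:

```sql
declare
  cursor c1 is
    select s.a_value || '_' || :a_in as item2,
           s.b_value || '_' || :a_in as item3
    from   source_table s
    where  <where condition>
    and    s.b_value is not null
    and    exists (select null from table_item ti
                   where  ti.item = s.a_value || '_' || :a_in)
    and    not exists (select null from destination_table2 d2
                       where  d2.item2 = s.a_value || '_' || :a_in
                       and    d2.item3 = s.b_value || '_' || :a_in);

  type t_rows is table of c1%rowtype;
  l_rows t_rows;
begin
  open c1;
  loop
    -- fetch a batch of rows at a time instead of one by one
    fetch c1 bulk collect into l_rows limit 1000;
    exit when l_rows.count = 0;

    -- one context switch per batch instead of per row
    forall j in 1 .. l_rows.count
      insert into destination_table2 (item2, item3)
      values (l_rows(j).item2, l_rows(j).item3);
  end loop;
  close c1;

  commit;  -- one commit at the end, not inside the loop
end;
```

This keeps the loop but cuts the SQL-to-PL/SQL context switches by a factor of the batch size, and moves the commit out of the loop so the run is a single transaction.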

