
Question and Answer

Tom Kyte

Thanks for the question, Dharmatha.

Asked: July 02, 2012 - 7:31 am UTC

Last updated: July 03, 2012 - 10:49 am UTC

Version: 11.2.0

Viewed 1000+ times

You Asked

Hi Tom,

My question is about loading data into a table which has 150 columns. To this we have to add 145 more columns. Currently there is no data in this table.

The primary key for this table is the 'id' column. The number of records ranges from 100,000 to 150,000 (it is not extremely huge).

The first set of 150 columns will be loaded with data from one single source table. The second set of 145 columns is to be loaded with data extracted from one huge CLOB column; the extraction of individual fields from this CLOB column is done using a function (arguments: table name, id, and field name) already written by someone else.

My question is: which is the best way to load the second set of 145 columns into the table using this function?

The complete load process will be automated at some point.

Thanks,
JD


and Tom said...

That is going to be slow.


If you have to call that function 145 times to get 145 columns, I presume it will read the CLOB 145 times and parse it 145 times for each and every row.
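
For illustration, a minimal sketch of what that per-column approach would look like, assuming a hypothetical get_field( table_name, id, field_name ) function and placeholder table/column names:

update big_table t
   set col_151 = get_field( 'BIG_TABLE', t.id, 'FIELD_151' ),
       col_152 = get_field( 'BIG_TABLE', t.id, 'FIELD_152' ),
       -- ... 143 more calls, each one re-reading and re-parsing the same CLOB
       col_295 = get_field( 'BIG_TABLE', t.id, 'FIELD_295' );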

I would suggest the function be rewritten to return all 145 attributes in an object type.


create type myType as object ( c1, c2, c3, .... );


Then you could write a pipelined PL/SQL function that takes a cursor as input and use that to merge into the original table.

For example:

ops$tkyte%ORA11GR2> create table t
  2  ( id number primary key,
  3    c1  number,
  4    c2  number,
  5    c3  number,
  6    c4  number
  7  )
  8  /

Table created.

ops$tkyte%ORA11GR2> 
ops$tkyte%ORA11GR2> insert into t (id, c1, c2 )
  2  select rownum, rownum*10, rownum*100
  3    from all_users
  4   where rownum <= 5;

5 rows created.

ops$tkyte%ORA11GR2> commit;

Commit complete.

ops$tkyte%ORA11GR2> 
ops$tkyte%ORA11GR2> 
ops$tkyte%ORA11GR2> create or replace type myScalarType as object
  2  ( id number,
  3    c3 number,
  4    c4 number
  5  )
  6  /

Type created.

ops$tkyte%ORA11GR2> 
ops$tkyte%ORA11GR2> create or replace type myTableType as table of myScalarType
  2  /

Type created.

ops$tkyte%ORA11GR2> 
ops$tkyte%ORA11GR2> create or replace function your_parsing_function( p_id in number ) return myScalarType
  2  as
  3  begin
  4      return myScalarType( p_id, p_id*1000, p_id*10000 );
  5  end;
  6  /

Function created.

ops$tkyte%ORA11GR2> 
ops$tkyte%ORA11GR2> 
ops$tkyte%ORA11GR2> create or replace function foo return myTableType
  2  PIPELINED
  3  as
  4      pragma autonomous_transaction;
  5  begin
  6      for x in ( select id from t )
  7      loop
  8          pipe row( your_parsing_function( x.id ) );
  9      end loop;
 10      return;
 11  end;
 12  /

Function created.

ops$tkyte%ORA11GR2> 
ops$tkyte%ORA11GR2> merge into t
  2  using ( select * from TABLE( foo() ) )x
  3  on (t.id = x.id)
  4  when matched then update set t.c3 = x.c3, t.c4 = x.c4;

5 rows merged.


ops$tkyte%ORA11GR2> select * from t;

        ID         C1         C2         C3         C4
---------- ---------- ---------- ---------- ----------
         1         10        100       1000      10000
         2         20        200       2000      20000
         3         30        300       3000      30000
         4         40        400       4000      40000
         5         50        500       5000      50000



You only need the autonomous transaction if you are reading the table inside the pipelined function to get the list of ids to update.
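
For completeness, a minimal sketch of the cursor-as-input variant mentioned above (the name foo_cur is hypothetical; it reuses the types and parsing function from the example, and needs no autonomous transaction since it no longer queries T itself):

create or replace function foo_cur( p_ids in sys_refcursor )
return myTableType
PIPELINED
as
    l_id number;
begin
    loop
        -- the caller decides what to feed in; we just pipe out parsed rows
        fetch p_ids into l_id;
        exit when p_ids%notfound;
        pipe row( your_parsing_function( l_id ) );
    end loop;
    close p_ids;
    return;
end;
/

merge into t
using ( select * from TABLE( foo_cur( CURSOR( select id from t ) ) ) ) x
on (t.id = x.id)
when matched then update set t.c3 = x.c3, t.c4 = x.c4;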

