Question and Answer

Tom Kyte

Thanks for the question, Goran.

Asked: July 21, 2000 - 3:25 am UTC

Last updated: June 01, 2011 - 2:48 pm UTC

Version: 2000

Viewed 10K+ times

You Asked

How can I use PL/SQL code to import (load) data into an Oracle table from a delimited text file (in a Win95 directory)?

and Tom said...

If the "Win95" directory is on the "Server" -- we can do this.

If the "Win95" directory is some arbitrary directory on some arbitrary client -- we can do this with PLSQL in Forms (not via a stored procedure but via a client side plsql routine) but not with a stored procedure.


PLSQL runs either in the server (this is normal) or in the client, as client side PLSQL, using a runtime engine such as Forms or Reports. When PLSQL is running in the server, it has access only to files on the server -- it can read and write only what is on the server and has no ability to touch the client machine. When PLSQL is running on the client under Forms -- the reverse is true -- that PLSQL can only read/write files on the client, not on the server.

Here is an example of SERVER side PLSQL (replace UTL_FILE with calls to TEXT_IO if you want client side PLSQL, you would probably want to use EXEC_SQL instead of DBMS_SQL as well). The following is a "mini sqlldr" in plsql. It loads delimited data into any table/view from a file.

ops$tkyte@8i> create table badlog( errm varchar2(4000), data varchar2(4000) );

Table created.

ops$tkyte@8i> column errm format a25
ops$tkyte@8i> column data format a25
ops$tkyte@8i> create or replace
  2  function load_data( p_table     in varchar2,
  3                      p_cnames    in varchar2,
  4                      p_dir       in varchar2,
  5                      p_filename  in varchar2,
  6                      p_delimiter in varchar2 default '|' )
  7  return number
  8  is
  9      l_input      utl_file.file_type;
 10      l_theCursor  integer default dbms_sql.open_cursor;
 11      l_buffer     varchar2(4000);
 12      l_lastLine   varchar2(4000);
 13      l_status     integer;
 14      l_colCnt     number default 0;
 15      l_cnt        number default 0;
 16      l_sep        char(1) default NULL;
 17      l_errmsg     varchar2(4000);
 18  begin
 19      l_input := utl_file.fopen( p_dir, p_filename, 'r' );
 20
 21      l_buffer := 'insert into ' || p_table || ' values ( ';
 22      l_colCnt := length(p_cnames) -
 23                  length(replace(p_cnames,',','')) + 1;
 24
 25      for i in 1 .. l_colCnt
 26      loop
 27          l_buffer := l_buffer || l_sep || ':b' || i;
 28          l_sep    := ',';
 29      end loop;
 30      l_buffer := l_buffer || ')';
 31
 32      dbms_sql.parse( l_theCursor, l_buffer, dbms_sql.native );
 33
 34      loop
 35          begin
 36              utl_file.get_line( l_input, l_lastLine );
 37          exception
 38              when NO_DATA_FOUND then
 39                  exit;
 40          end;
 41          l_buffer := l_lastLine || p_delimiter;
 42
 43
 44          for i in 1 .. l_colCnt
 45          loop
 46              dbms_sql.bind_variable
 47              ( l_theCursor, ':b' || i,
 48                substr( l_buffer, 1,
 49                        instr( l_buffer, p_delimiter ) - 1 ) );
 50              l_buffer := substr( l_buffer,
 51                                  instr( l_buffer, p_delimiter ) + 1 );
 52          end loop;
 53
 54          begin
 55              l_status := dbms_sql.execute( l_theCursor );
 56              l_cnt := l_cnt + 1;
 57          exception
 58              when others then
 59                  l_errmsg := sqlerrm;
 60                  insert into badlog ( errm, data )
 61                  values ( l_errmsg, l_lastLine );
 62          end;
 63      end loop;
 64
 65      dbms_sql.close_cursor( l_theCursor );
 66      utl_file.fclose( l_input );
 67      commit;
 68
 69      return l_cnt;
 70  end load_data;
 71  /

Function created.

ops$tkyte@8i>
ops$tkyte@8i> drop table t1;

Table dropped.

ops$tkyte@8i> create table t1 ( x int, y int, z int );

Table created.

ops$tkyte@8i>
ops$tkyte@8i> host echo 1,2,3 > /tmp/t1.dat

ops$tkyte@8i> host echo 4,5,6 >> /tmp/t1.dat

ops$tkyte@8i> host echo 7,8,9 >> /tmp/t1.dat

ops$tkyte@8i> host echo 7,NotANumber,9 >> /tmp/t1.dat

ops$tkyte@8i>
ops$tkyte@8i>
ops$tkyte@8i> begin
  2      dbms_output.put_line(
  3          load_data( 'T1',
  4                     'x,y,z',
  5                     '/tmp',
  6                     't1.dat',
  7                     ',' ) || ' rows loaded' );
  8  end;
  9  /
3 rows loaded

PL/SQL procedure successfully completed.

ops$tkyte@8i> select * from badlog;

ERRM                      DATA
------------------------- -------------------------
ORA-01722: invalid number 7,NotANumber,9

ops$tkyte@8i> select * from t1;

         X          Y          Z
---------- ---------- ----------
         1          2          3
         4          5          6
         7          8          9

ops$tkyte@8i>




Comments

Pauline, May 03, 2001 - 9:29 pm UTC


Very useful

Reader, September 27, 2001 - 10:08 am UTC

Hi,

What if the number of columns to be loaded is only a subset of the columns in the base table, and we also want to load some constant values into the table for every row, e.g. SYSDATE, USER, etc.?

Will this affect the general approach of this procedure?

Thanks

Tom Kyte
September 27, 2001 - 10:24 am UTC

Well, if you wanted to load constant values -- I would use DEFAULTS on the columns.

the little, simple, trivial utility below does not accommodate defaults or skipping columns.

In Oracle9i we have a feature called an EXTERNAL table which is sort of a create table with a sqlldr control file -- this does support all of that. Instead of running sqlldr, you'll "create table" and now you just use SQL to process the input file.

What if database size is in GBs

Tony, August 02, 2002 - 11:12 am UTC

Hi Tom,

I need to migrate around 4 GB of data from an ASCII file into an Oracle database. I have two options:

1. Load the data from the file into Oracle using the loader, then write scripts to migrate the data into the live tables.
2. Write a script using UTL_FILE to read every record, do some complex processing, and load each record directly into the table.

Which will be faster, given the size of the data to be loaded?

Regards,
Tony

Tom Kyte
August 05, 2002 - 11:20 am UTC

4 gig is tiny today.


1) would most likely be faster, especially if you can use bulk operations on the data (mass inserts/updates). Use the direct path loader; a full scan will be much faster than utl_file.

If you have 9i, just use an EXTERNAL table.



External Table

Vish, August 06, 2002 - 5:26 am UTC

I'm building a data warehouse. I have to read the files via an external table, assign new keys, and load the fact table. Will it cause performance problems if I use external tables? (I can't create indexes on external tables, right?)


Vish

Tom Kyte
August 07, 2002 - 9:00 am UTC

To load -- no, no worries there. You'll be full scanning these tables anyway - you want to process every record right? No indexes would be used anyway.

error !!

a, August 09, 2002 - 4:44 am UTC

at this point, i got an error :

SQL> ed
Wrote file afiedt.buf

  1  begin
  2       dbms_output.put_line(
  3           load_data( 'T1',
  4                      'x,y,z',
  5                      '/tmp',
  6                      't1.dat',
  7                      ',' ) || ' rows loaded' );
  8*   end;
SQL> /
begin
*
ERROR at line 1:
ORA-06510: PL/SQL: unhandled user-defined exception
ORA-06512: at "SYS.UTL_FILE", line 98
ORA-06512: at "SYS.UTL_FILE", line 157
ORA-06512: at "QAAR.LOAD_DATA", line 18
ORA-06512: at line 2 

Tom Kyte
August 09, 2002 - 8:56 am UTC

uh-huh.

you didn't set the utl_file_dir init.ora parameter, or that file does not exist.
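To illustrate what "setting utl_file_dir" means, here is a minimal sketch (the directory path and grantee are placeholders): before 9iR2, UTL_FILE can only open directories listed in the utl_file_dir init.ora parameter, which requires a bounce; from 9iR2 on, a directory object is the better approach.

```sql
-- pre-9iR2: init.ora entry; requires an instance restart
utl_file_dir = /tmp

-- 9iR2 and later: a directory object, no restart needed
create directory data_dir as '/tmp';
grant read, write on directory data_dir to some_user;
```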

Mapping of external_id's to internal_id's

saj, January 27, 2003 - 11:25 am UTC

Thanks for all your examples and explanations on various topics. I am a regular visitor to your site and enjoy reading the marvellous solutions you give!

I have one question and it is related to design and performance! We are on Oracle version 9i Release 2.

We have data coming in from various sites and countries and they have their own ID's or key values and we keep our own ID's or primary key.

Say for example the external table/file comes in as
ext_id, type_name, language
1001 Traditional English

and I want to map it into our database table
as
internal_id, external_id, type_id, lang_id
9999 1001 1 2

where type, language etc are code/ref tables

What would be the best way to massage the data in external tables and map it to the real tables?

Looking forward to your reply,

Tom Kyte
January 27, 2003 - 12:24 pm UTC

do you have a table of type_name to type_id somewhere that has all possible values? same with language to lang_id?

mapping of external id's to internal id's

saj, January 27, 2003 - 1:32 pm UTC

Thanks for your prompt response...

Yes, I have two tables, content_type and language, which have id and name as the columns (a typical reference table).

Tom Kyte
January 27, 2003 - 4:14 pm UTC

and how do you map their keys to your keys.

Mapping

saj, January 28, 2003 - 10:13 am UTC

Are you driving at something obvious?

My question is what is the best way to do this? A design issue?

-- Insert into another staging table, resolve all ids, and merge?
-- Cursor through the external table and insert/update the real table?
-- Or is there any other elegant solution?



Tom Kyte
January 28, 2003 - 1:24 pm UTC

No, I'm asking "do you have a table that maps all of their keys to yours?"

If not, you will be writing procedural row by row code (unless you have 9i, in which case we may be able to merge).

If yes, we can do a simple insert with selects of selects to convert the strings to codes.

Excellent thank you

Patrick, January 28, 2003 - 1:40 pm UTC

Bing-O, great script

Mapping

saj, January 28, 2003 - 2:53 pm UTC

Thanks again for your prompt followup. I'm sorry! I was totally thrown off guard with your reply "and how do you map their keys to your keys."

I had mentioned in the previous queries that I am on 9iv2
and also I have tables to map the external data to our internal data.

We have a table:content_type with
type_id type_name
1 Traditional
2 Contemporary

and table:language
lang_id language_name
2 English
3 Spanish

and this is what I want to design

example the external table/file comes in as

ext_id, type_name, language
1001 Traditional English

and I want to map it into our database table eg: tbl_content
as
internal_id, external_id, type_id, lang_id
9999 1001 1 2

to map external_id to internal_id, I use the sequence object if it does not exist in tbl_content, if it exists just update the row where external_id = 1001

I hope this is enough information..

looking forward to a very elegant solution....




Tom Kyte
January 28, 2003 - 3:19 pm UTC

Ok, 
ops$tkyte@ORA920> create or replace directory data_dir as '/tmp/'
  2  /

Directory created.

ops$tkyte@ORA920>
ops$tkyte@ORA920> create table external_table
  2  (ext_id number,
  3   type_name varchar2(20),
  4   language varchar2(30)
  5  )
  6  ORGANIZATION EXTERNAL
  7  ( type oracle_loader
  8    default directory data_dir
  9    access parameters
 10    ( fields terminated by ',' )
 11    location ('test.dat')
 12  )
 13  /

Table created.

ops$tkyte@ORA920> !echo 1001,traditional,english > /tmp/test.dat

ops$tkyte@ORA920>
ops$tkyte@ORA920> select * from external_table;

    EXT_ID TYPE_NAME            LANGUAGE
---------- -------------------- ------------------------------
      1001 traditional          english

ops$tkyte@ORA920> create sequence s;

Sequence created.

ops$tkyte@ORA920>
ops$tkyte@ORA920> create table content_type ( type_id number, type_name varchar2(20) );

Table created.

ops$tkyte@ORA920> insert into content_type values ( 1, 'traditional' );

1 row created.

ops$tkyte@ORA920> insert into content_type values ( 2, 'contemporary' );

1 row created.

ops$tkyte@ORA920>
ops$tkyte@ORA920> create table language( lang_id number, language varchar2(20) );

Table created.

ops$tkyte@ORA920> insert into language values ( 2, 'english' );

1 row created.

ops$tkyte@ORA920> insert into language values ( 3, 'spanish' );

1 row created.

ops$tkyte@ORA920>
ops$tkyte@ORA920>
ops$tkyte@ORA920> create table tbl_content
  2  ( internal_id number, external_id number, type_id number, lang_id number );

Table created.

ops$tkyte@ORA920>
ops$tkyte@ORA920> merge into tbl_content
  2  using EXTERNAL_TABLE
  3  on ( external_table.ext_id = tbl_content.external_id )
  4  when matched then
  5          update set type_id = (select type_id from content_type where type_name = external_table.type_name),
  6                     lang_id = (select lang_id from language where language = external_table.language)
  7  when not matched then
  8          insert ( internal_id, external_id, type_id, lang_id )
  9          values ( s.nextval, external_table.ext_id,
 10                   (select type_id from content_type where type_name = external_table.type_name),
 11                   (select lang_id from language where language = external_table.language))
 12  /

1 row merged.

ops$tkyte@ORA920> commit;

Commit complete.

ops$tkyte@ORA920>
ops$tkyte@ORA920> select * from tbl_content;

INTERNAL_ID EXTERNAL_ID    TYPE_ID    LANG_ID
----------- ----------- ---------- ----------
          1        1001          1          2

ops$tkyte@ORA920>
ops$tkyte@ORA920> !echo 1001,contemporary,english > /tmp/test.dat

ops$tkyte@ORA920> !echo 1002,traditional,spanish >> /tmp/test.dat

ops$tkyte@ORA920>
ops$tkyte@ORA920> merge into tbl_content
  2  using EXTERNAL_TABLE
  3  on ( external_table.ext_id = tbl_content.external_id )
  4  when matched then
  5          update set type_id = (select type_id from content_type where type_name = external_table.type_name),
  6                     lang_id = (select lang_id from language where language = external_table.language)
  7  when not matched then
  8          insert ( internal_id, external_id, type_id, lang_id )
  9          values ( s.nextval, external_table.ext_id,
 10                   (select type_id from content_type where type_name = external_table.type_name),
 11                   (select lang_id from language where language = external_table.language))
 12  /

2 rows merged.

ops$tkyte@ORA920> commit;

Commit complete.

ops$tkyte@ORA920>
ops$tkyte@ORA920> select * from tbl_content;

INTERNAL_ID EXTERNAL_ID    TYPE_ID    LANG_ID
----------- ----------- ---------- ----------
          1        1001          2          2
          3        1002          1          3

ops$tkyte@ORA920>

is one way.... 

A million Thanks!

saj, January 28, 2003 - 3:52 pm UTC

You are the very best!

column names

A reader, May 23, 2003 - 1:16 pm UTC

Tom

so if I have 100 column names in table T, then will I have to hard code all the 100 column names to the load_data routine?

Is there an easier way we can modify this routine, to read all the column names of a table.

Thx

Tom Kyte
May 24, 2003 - 9:40 am UTC

why hard code them? just loop over user_tab_columns and build it dynamically if you like.


for x in ( select column_name
             from user_tab_columns
            where table_name = 'X' )   -- dictionary stores names in UPPERCASE
loop
    l_cnames := l_cnames || ',' || x.column_name;
end loop;
l_cnames := ltrim( l_cnames, ',' );




just got to your site for the first time...!!! great!!!

EKENE KINGS, June 26, 2003 - 9:09 am UTC

Hi Tom,
I'm more than impressed with the questions raised on several topics and, more importantly, your comprehensive and prompt responses to them...
regards ekene...

I get an error when using the exception

Srinivas Pothuganti, November 12, 2003 - 12:23 pm UTC

I am trying to create a procedure to send an email, the message of which is to be read from an ASCII file. I am using the utl_file and utl_smtp packages for this. I am stuck at a point where I am trying to get all the lines from the ASCII file. It gives me an ORA-01403 (no data found) if I don't use an exception, but when I use the exception the same way you used it in your procedure above, I get a different error.
Here is my pl/sql and the errors.
SQL> Create or Replace Procedure Notify_utlfile(subject_line varchar2) As
  2    win_filetype utl_file.file_type;
  3    message varchar2(1000);
  4    con utl_smtp.connection;
  5    the_host varchar2(30);
  6    ip_addr varchar2(30);
  7  
  8   PROCEDURE send_header(name IN VARCHAR2, header IN VARCHAR2) AS
  9    BEGIN
 10      utl_smtp.write_data(con, name || ': ' || header || utl_tcp.CRLF);
 11    END;
 12   
 13  BEGIN
 14    ip_addr := utl_inaddr.get_host_address;
 15    the_host := utl_inaddr.get_host_name(ip_addr); 
 16    con := utl_smtp.open_connection('ghlngwy1');
 17    utl_smtp.helo(con, 'ghlngwy1');
 18    utl_smtp.mail(con, 'raos1@mail.modot.state.mo.us');
 19    utl_smtp.rcpt(con, 'raos1@mail.modot.state.mo.us');
 20    utl_smtp.open_data(con);
 21    send_header('From',    the_host|| '<raos1@mail.modot.state.mo.us>');
 22    send_header('To',      '"DBMS" <raos1@mail.modot.state.mo.us>');
 23    send_header('Subject', subject_line);
 24    win_filetype := utl_file.fopen('/tmp/orafiles','HQT2_raos1_exp.log','r');
 25  loop
 26    utl_file.get_line(win_filetype,message);
 27    EXCEPTION
 28              when no_data_found then 
 29              exit;
 30    utl_smtp.write_data(con, utl_tcp.CRLF || message);
 31  end loop;
 32    utl_file.fclose(win_filetype);
 33    utl_smtp.close_data(con);
 34    utl_smtp.quit(con);
 35  EXCEPTION
 36    WHEN utl_file.read_error THEN
 37     BEGIN
 38     utl_file.fclose(win_filetype);
 39     END;
 40    WHEN utl_smtp.transient_error OR utl_smtp.permanent_error THEN
 41      BEGIN
 42        utl_smtp.quit(con);
 43      EXCEPTION
 44        WHEN utl_smtp.transient_error OR utl_smtp.permanent_error THEN
 45          NULL; -- When the SMTP server is down or unavailable, we don't have
 46                -- a connection to the server. The quit call will raise an
 47                -- exception that we can ignore.
 48      END;
 49      raise_application_error(-20000,
 50        'Failed to send mail due to the following error: ' || sqlerrm);
 51  --  WHEN OTHERS THEN
 52  --   utl_file.fclose(win_filetype);
 53  END;
 54  /

Warning: Procedure created with compilation errors.

SQL> sho err
Errors for PROCEDURE NOTIFY_UTLFILE:

LINE/COL ERROR
-------- -----------------------------------------------------------------
27/3     PLS-00103: Encountered the symbol "EXCEPTION" when expecting one
         of the following:
         begin case declare end exit for goto if loop mod null pragma
         raise return select update while with <an identifier>
         <a double-quoted delimited-identifier> <a bind variable> <<
         close current delete fetch lock insert open rollback
         savepoint set sql execute commit forall merge
         <a single-quoted SQL string> pipe

32/3     PLS-00103: Encountered the symbol "UTL_FILE" when expecting one
         of the following:

LINE/COL ERROR
-------- -----------------------------------------------------------------
         end not pragma final instantiable order overriding static
         member constructor map

32/32    PLS-00103: Encountered the symbol ";" when expecting one of the
         following:
         . ( , * % & - + / at mod rem <an identifier>
         <a double-quoted delimited-identifier> <an exponent (**)> as
         from into || bulk
         The symbol ". was inserted before ";" to continue.

33/27    PLS-00103: Encountered the symbol ";" when expecting one of the

LINE/COL ERROR
-------- -----------------------------------------------------------------
         following:
         . ( , * % & - + / at mod rem <an identifier>
         <a double-quoted delimited-identifier> <an exponent (**)> as
         from into || bulk
         The symbol ". was inserted before ";" to continue.

34/21    PLS-00103: Encountered the symbol ";" when expecting one of the
         following:
         . ( , * % & - + / at mod rem <an identifier>
         <a double-quoted delimited-identifier> <an exponent (**)> as
         from into || bulk

LINE/COL ERROR
-------- -----------------------------------------------------------------

 

Tom Kyte
November 12, 2003 - 4:37 pm UTC

The get_line call needs its own begin/end block inside the loop; the exception section must belong to that inner block, not sit bare in the loop:

loop
   begin
      utl_file.get_line( win_filetype, message );
   exception
      when no_data_found then
         exit;
   end;
   utl_smtp.write_data( con, utl_tcp.CRLF || message );
end loop;


Got it right this time

Srinivas Pothuganti, November 12, 2003 - 4:16 pm UTC

Tom,
There were some syntactical errors in my procedure. I corrected them and the procedure works like a charm now.


Thanks,
Srinivas

another add up

Natasha, January 03, 2004 - 4:28 am UTC

Tom,

wishing you a happy new year....

I am having a serious problem and am confused about whether to use SQL*Loader or the UTL_FILE utility.

Before putting my question... please clarify two points:
1) If I use the UTL_FILE utility, then the UTL_FILE_DIR parameter has to be changed. Can it point to a directory on Win-NT or Win-XP instead of the Unix OS?

2) Please let us know how to call SQL*Loader from a Java stored procedure (better if you give an example).

My question:
------------
My CSV file looks like this... (assume the letters shown are the data)

START
1
Table1,A,B,C,D,E,F,G,,,,,,,,,,H~
Table2,I,J,,K,L,M,,N,O,P~
R,M,T,M,D,M,F,M,H,M,J,,U,I,U~
K,Q,K,J,D,K,A,L,S,J,F,K,S,A,D,J,F~
Table3 W,Q,E,O,R,U,W,E,K,L,W,E,R,K,J,W,E,Q,R,I,O,W,E,Q,R~
L,E,3,4,3,4,3,4,P,L,3,4,K,0,X,C,K,L,K,3,0,9,4~

2
Table1,A,B,C,D,E,F,G,,,,,,,,,,H~
Table2,I,J,,K,L,M,,N,O,P~
R,M,T,M,D,M,F,M,H,M,J,,U,I,U~
K,Q,K,J,D,K,A,L,S,J,F,K,S,A,D,J,F~
Table3 W,Q,E,O,R,U,W,E,K,L,W,E,R,K,J,W,E,Q,R,I,O,W,E,Q,R~
L,E,3,4,3,4,3,4,P,L,3,4,K,0,X,C,K,L,K,3,0,9,4~

3
Table1,A,B,C,D,E,F,G,,,,,,,,,,H~
Table2,I,J,,K,L,M,,N,O,P~
R,M,T,M,D,M,F,M,H,M,J,,U,I,U~
K,Q,K,J,D,K,A,L,S,J,F,K,S,A,D,J,F~
Table3 W,Q,E,O,R,U,W,E,K,L,W,E,R,K,J,W,E,Q,R,I,O,W,E,Q,R~
L,E,3,4,3,4,3,4,P,L,3,4,K,0,X,C,K,L,K,3,0,9,4~
END

Now... how can I import this data into the Oracle database using SQL*Loader? And how could I achieve the same task with the UTL_FILE utility?

Thxs in advance
Natasha


Tom Kyte
January 03, 2004 - 9:21 am UTC

1) utl_file_dir only needs to be set in 9iR1 and before. In 9iR2 you should use create directory (no bounce needed) instead.

yes, the mapped drive can be from win-nt/xp AS LONG AS YOU CAN SEE THE MOUNTED DIRECTORY IN UNIX of course. So, if you samba mount or whatever the remote drive onto a unix mount point, and can "ls" it -- you can utl_file it.

2) that is a simple "host" command. You do not "call" sqlldr any more than you would "call" uptime. It is a command.

In 9i, you would use external tables (search for that on this site, lots of examples). then you can load a file with "create table t as select * from flat_file" or "insert into t select * from flat_file"

In a java store procedure, you would just be using normal old java to run a host command. see
http://asktom.oracle.com/pls/asktom/f?p=100:11:::::P11_QUESTION_ID:952229840241



which way is faster, Big cursor loop or join tables?

AH, January 06, 2004 - 8:46 pm UTC

Tom:

My application loads 8M records into a temp table (T), then opens a cursor loop to compare the incoming rows against the existing 200M-row partitioned table (P) to determine I/D/U or ignore the new data. I saw my database perform some physical reads and tons of logical reads during this big batch job.

Is this cursor loop efficient? It reads key values from the input data, retrieves all rows from T (one to thousands of rows, even using indexes), then compares values to I/D/U or discard.

My concern is the logical reads. Oracle will read the same blocks to retrieve row(s) for another key on the block. If I join T and P using three SQL statements to perform I/D/U separately, will this approach be faster than the cursor loop?

Will MERGE be the fastest solution? What happens if I have a "big" where clause?

Thanks.

Tom Kyte
January 06, 2004 - 8:53 pm UTC

it's all about merge -- if not merge, three single MASSIVE statements.

slow by slow processing (i keep doing that, meant row by row) is not a "good thing"

Try again

AH, January 08, 2004 - 7:48 pm UTC

Tom:

Assume MERGE cannot be used for a reason. Between cursor loops and table joins, which one provides better performance? We process 8M to 20M rows daily against a 200M-row partitioned table.

create table t1 (col1, col2, col3, ..., col20);
create table t2 (f1, f2, f3, ... f15);
create index I1 on t1(col1, col2, col3);
create index I2 on t2(f1, f2, f3);
t2 changes everyday and the goal is to use information in t2 to I/D/U t1 daily.

Opt1: cursor loop
First cursor loop select all rows from t2;
Second cursor loops select all rows from t1 where col1 = f1 and col2 = f2 and col3 = f3.
If fields in t1 matches rule U, then update.
If fields in t1 matches rule I, then insert.
end of cursor 2; commit;
end of cursor 1; commit;

Opt2: join
update t1.col4, t1.col5, ...
where col1 = t2.f1 and col2 = t2.f2 and col3 = t2.f3 and col4 = 'X' and col5 != 'Y';
commit;
insert into t1 (...)
select t2.*
where col1 = t2.f1 and col2 = t2.f2 and col3 = t2.f3 and col4 = 'A' and col5 = 'B';
commit;

Opt1 will use an index to fetch data from t1. Am I correct that Oracle will perform a loop in Opt2 (join) if the plan shows the index is used? Should I use Oracle's loop or my own loop? Is there any performance difference? Which one will be faster, considering the number of rows to be processed?

Thanks



Tom Kyte
January 08, 2004 - 8:32 pm UTC

SET based processing is ALWAYS my first choice

slow by slow processing is as bad as it sounds (gotta stop doing that. meant "row by row")
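For concreteness, the set-based version of the reader's Opt2 might look like the following sketch. The column names f4/f5 and the extra predicates are hypothetical, since the original question elides them:

```sql
-- correlated update of matching rows (column names hypothetical)
update t1
   set (col4, col5) = ( select f4, f5
                          from t2
                         where t2.f1 = t1.col1
                           and t2.f2 = t1.col2
                           and t2.f3 = t1.col3 )
 where exists ( select null
                  from t2
                 where t2.f1 = t1.col1
                   and t2.f2 = t1.col2
                   and t2.f3 = t1.col3 );

-- insert only the rows that do not exist yet
insert into t1 ( col1, col2, col3 )
select f1, f2, f3
  from t2
 where not exists ( select null
                      from t1
                     where t1.col1 = t2.f1
                       and t1.col2 = t2.f2
                       and t1.col3 = t2.f3 );
```

Two big statements let the optimizer use hash joins over whole sets instead of millions of indexed single-row lookups.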

A bulk version of the load_data function from start of thread

Matt, January 21, 2004 - 8:47 pm UTC

At the start of the thread you use a function to load CSV data from a flat file capturing all the errors raised along the way. This appears to be row by row. Can the rows be bulk loaded and still catch and log each of the errors?

I'm looking at loading 5 or so million CSV records and validating them. SQLLDR loads the data for me but does not allow me to easily classify the BAD records in terms of the errors raised.

I was thinking of using an external table and bulk collecting the data using your load_data function into a table with constraints/triggers, and then logging any errors, the failed data record, and the line number from the original CSV file. I could then have a further process to pull out the data that had errors and classify them by error into separate files (for easy resolution).

One problem I have is that when a record fails, it will fail for one reason; when this error is fixed, there may be additional errors hiding behind it. I want to catch ALL the errors so that from my resulting error files I am able to solve ALL data problems and obtain a clean next run.

I have looked at SQLLDR and it does 90% of what I need (and is efficient and tunable). I really need something that handles BAD records and their errors better though (for ease of resolution).

Can you briefly comment on the suggested approach? Do you expect it to be more efficient than say SQLLDR and a routine to parse the BAD and LOG files?

Best Regards.

Tom Kyte
January 22, 2004 - 6:31 am UTC

You can use plsql collections/table types and FORALL bulk processing

search for

"save exceptions"

on this site for examples.
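A sketch of what such a bulk load with error capture might look like. The table t1 is borrowed from the first example; t1_external is a hypothetical external table with the same shape as t1, and this assumes at least 9iR2, where BULK COLLECT into a collection of records is supported:

```sql
declare
    type rowtab is table of t1%rowtype index by binary_integer;
    l_rows       rowtab;
    bulk_errors  exception;
    pragma exception_init( bulk_errors, -24381 );  -- ORA-24381
begin
    select * bulk collect into l_rows from t1_external;

    -- SAVE EXCEPTIONS keeps the FORALL going past bad rows
    forall i in 1 .. l_rows.count save exceptions
        insert into t1 values l_rows(i);
exception
    when bulk_errors then
        for i in 1 .. sql%bulk_exceptions.count
        loop
            dbms_output.put_line(
                'row ' || sql%bulk_exceptions(i).error_index ||
                ' failed: ' ||
                sqlerrm( -sql%bulk_exceptions(i).error_code ) );
        end loop;
end;
/
```

Each failure is recorded in sql%bulk_exceptions with the array index and error code, so you can log both the error and the offending row.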

DBMS_SQL & Bulk/Array & SAVE EXCEPTIONS

Erick, February 18, 2004 - 12:33 pm UTC

Thanks for your previous help concerning SDO.

A related question: I need to load high volumes of data 24/7 and allow user queries. The data is parsed and loaded from a high volume of flat files. The data is being inserted into a table which contains a rolling 72 hours of data and is range partitioned based on 4 hour intervals. As a new partition is created every 4 hours, the oldest partition is truncated.
I had wanted to use array processing with static SQL to bulk load the data for the best performance. Well, truncating the table partition requires recompilation of the load PL/SQL code, since they are dependent on each other. So I am left with stopping the loading of data, performing the truncate partition, and resuming data loading, OR breaking the dependency between the load PL/SQL code and the table being loaded. I then utilized 'EXECUTE IMMEDIATE', 'SAVE EXCEPTIONS', and your trick of using stored procedure global variables to bulk insert and collect any errors that occurred.
My concern is since the table to be loaded does not change, should I not be using DBMS_SQL to create the INSERT statement once (1 hard parse) and execute it a zillion times? If so, then how can I use 'SAVE EXCEPTIONS' with it (or something like it) so as to efficiently bulk load data and collect errors as they occur? I have been unable to locate any examples of using DBMS_SQL with array processing and SAVE EXCEPTIONS.

Thanks.

Tom Kyte
February 18, 2004 - 9:28 pm UTC

version?

flat file on network drive

Chuy, March 09, 2004 - 10:17 pm UTC

Tom

we have a process wherein a flat file is loaded at the end of every day into the 'Positions' table.

Currently, the file is loaded from the local host of one of the users.

Obviously, we have text_io doing the upload for us. Now with external tables coming into the picture, we want to use external tables to accomplish this task.

Now the question is: when we create an external table and the flat file is on a network drive, is it possible to access the file for the external table?

1. Can you give me an example of creating a directory referring one of the folders on your network drive. We are on Windows 2000.

2.While using external tables, can we load bad rows into an exceptions table ?

Thanks in Advance



Tom Kyte
March 09, 2004 - 10:50 pm UTC

1) http://asktom.oracle.com/pls/asktom/f?p=100:11:::::P11_QUESTION_ID:241814624807

2) yes, you'll just map that bad file as an external table itself with one column (varchar2(4000) ). then you can insert into bad_table select * from bad_table_external;
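A hypothetical sketch of that mapping (all names are invented; BAD_DIR must be a directory object pointing at wherever the bad file lands):

```sql
create table bad_rows_ext
( text varchar2(4000) )
organization external
( type oracle_loader
  default directory bad_dir
  access parameters
  ( records delimited by newline
    fields ( text char(4000) ) )
  location ( 'external_table.bad' )
)
/

insert into bad_table select * from bad_rows_ext;
```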


A reader, March 10, 2004 - 9:32 pm UTC

Tom

Does your routine take care of duplicate rows? That is, when you are loading your flat file rows into the table and you encounter duplicate rows (we can either update them or discard them), how can we tackle that issue?

Please explain..

Tom Kyte
March 11, 2004 - 8:13 am UTC

as long as you have constraints on these tables identifying what "duplicate" means, they will be written into the "bad" table (the dups)

A reader, March 11, 2004 - 8:50 am UTC

Writing to the bad table is fine, but I want to update my table with the values, some of the columns might have to be updated for duplicate rows ?

Tom Kyte
March 11, 2004 - 1:26 pm UTC

You have the code; you can make it do whatever you want.

If you have 9i, it sounds like you want to use an external table and MERGE (search for

"external table" merge

for examples).

Bulk Load

Erick, April 12, 2004 - 12:39 pm UTC

Previously, I asked the question about bulk loading using DBMS_SQL and a 'SAVE EXCEPTIONS' like functionality. You requested the Oracle database version:

9.2.0.4

Tom Kyte
April 12, 2004 - 2:55 pm UTC

actually, now that I re-read this:

<quote>
My concern is since the table to be loaded does not change, should I not be
using DBMS_SQL to create the INSERT statement once (1 hard parse) and execute it
a zillion times?
</quote>

not a 100% valid concern, the truncate will invalidate the cursor, causing the hard parse anyway.

but to minimize the soft parsing overall -- you could dynamically parse a block of code that accesses arrays that are "global" (in a package spec) easily enough.



So, if you took your existing procedure, put it into a string as an anonymous block (making it access plsql tables that are found in a package spec), and dbms_sql.parsed it -- you could just fill the arrays in the spec, execute it, refill them, execute it, and so on...
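A sketch of that idea (the package, table, and column names are invented for illustration):

```sql
create or replace package load_pkg
as
    type numtab is table of number index by binary_integer;
    g_x numtab;
    g_y numtab;
end;
/

declare
    l_cursor integer := dbms_sql.open_cursor;
    l_status integer;
begin
    -- hard parse happens ONCE; the block reaches the arrays
    -- through the package spec, so no binds are needed
    dbms_sql.parse( l_cursor,
        'begin
             forall i in 1 .. load_pkg.g_x.count
                 insert into t ( x, y )
                 values ( load_pkg.g_x(i), load_pkg.g_y(i) );
         end;', dbms_sql.native );

    loop
        -- ... fill load_pkg.g_x / load_pkg.g_y with the next batch ...
        exit when load_pkg.g_x.count = 0;
        l_status := dbms_sql.execute( l_cursor );
    end loop;

    dbms_sql.close_cursor( l_cursor );
end;
/
```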

is it possible

A reader, May 20, 2004 - 5:58 pm UTC

Tom

is it possible to read a .xls file and load it into an oracle table.



Tom Kyte
May 20, 2004 - 8:44 pm UTC

there are various windows tools that could do that, macros that run in xls files.

I'm sure someone will follow up with some, I don't use windows or xls files myself.

A reader, May 20, 2004 - 7:14 pm UTC

Further, is there a way we can convert .xls file into a tab file from the command line.

Tom Kyte
May 20, 2004 - 8:49 pm UTC

time for askbill.microsoft.com -- that would be their product?

A reader, May 20, 2004 - 9:24 pm UTC

Tom

1. Please give a direct answer.

Using the usual Oracle tools (that you know of -- database, Forms -- utl_file, text_io, etc.), is it possible to read a .xls file into an Oracle table?

2.What are the windows tools you are talking about ..



Tom Kyte
May 21, 2004 - 9:37 am UTC

1) direct, hmm, let me think -- I guess if I knew of one I would have answered with it, wouldn't I? I mean, what else would I do?

XLS is a proprietary binary file format. utl_file, text_io, forms, et al. are not going to read that proprietary format -- utl_file and text_io are both TEXT only (not binary). Forms could load the xls into the database as a blob, but I seriously doubt that is what you want. We could even index it and turn it into an html page using Oracle Text from that blob, but again, it sounds like you want to save the cells into rows and columns.

2) not being a windows user, not being a person who does spreadsheets or works with them -- I don't really have any. I've heard of things like "oraxcel" or something like that but as I said -- perhaps someone who does that would know and will followup.

otn.oracle.com <<<=== that is probably where you might want to cruise over to

Alternatively use Java

Paul, May 21, 2004 - 10:47 am UTC

Take a look at http://jakarta.apache.org/poi/hssf/index.html
It is a Java API for reading/writing XLS files.
Pretty neat -- if you have a simple spreadsheet you might be able to use this. I have not read files with it, but we create them with it.



You could use ODBC . . .

Tak Tang, May 21, 2004 - 3:29 pm UTC

If the xls file is on the server, you can use Oracle Heterogeneous Services, which is available in Oracle 8.1.6 and above, possibly prior to 8.1.6, but definitely not in 7. Heterogeneous Services allows you to set up non-Oracle data sources and access them from within Oracle over a 'database link'. Use the generic flavour, which includes support for ODBC, and is license free (my favourite flavour of everything, thank you Larry).

Read more on Metalink

Heterogeneous Services
http://metalink.oracle.com/metalink/plsql/ml2_documents.showDocument?p_database_id=NOT&p_id=132692.1

How to setup generic connectivity (Heterogeneous Services) for Windows NT
http://metalink.oracle.com/metalink/plsql/ml2_documents.showDocument?p_database_id=NOT&p_id=109730.1

QUICK START GUIDE: WIN NT - Generic Connectivity using ODBC
http://metalink.oracle.com/metalink/plsql/showdoc?db=NOT&id=114820.1


Read more in the 'Heterogeneous Connectivity Administrator’s Guide' on OTN
[HTML] 
http://otn.oracle.com/pls/db92/db92.to_toc?pathname=server.920%2Fa96544%2Ftoc.htm&remark=docindex
[PDF]  
http://otn.oracle.com/pls/db92/db92.to_pdf?pathname=server.920%2Fa96544.pdf&remark=docindex


Note - all the docs say you do not need to specify an account/password when creating the database link.  I get a null password error when I try this, but if I specify a Windows 2000 account and password, it works.  Example, on my machine named "HOME", I have a user account "tangt", with the password "taktangspassword" (OK, perhaps I am fibbing about the password).

  create database link mydblink
  connect to "home\tangt" identified by taktangspassword
  using '<tnsservice>';

Perhaps I would not need to specify an account if I ran the listener service under a custom account instead of using LocalSystem . . . which we all do, of course, as a security measure to protect ourselves from listener attacks . . . and which I do not actually know how to do . . . because it is a problem for our DBAs ;-)

Note 2 - the PROGRAM in your listener.ora must be 'hsodbc', NOT 'hsagent'.


If you have an .xls file on the client, you could either transport it to the server, or if you happen to be using Forms 6i, you could use Oracle Open Client Adapter for ODBC.
You can find a working example of this called 'ODBC Data Importer' under 'Contents/Tools' on
http://www.geocities.com/oranails

Beware, Forms 6i is old technology, possibly already desupported.

If I were using Forms 9i, I would take a look at Java and JDBC, but please do not ask me for an example, because I am not using Forms 9i, and I am a Java neophyte.

Takmeister


Tak Tang

A reader, May 24, 2004 - 8:51 pm UTC

Tak Tang, you give a whole lot of information, and in the end request us not to ask for any specific information as you did not use 9i Forms. Funny.

The question was specific to Forms 9i.

XLS is a proprietary binary file format

A reader, May 24, 2004 - 9:05 pm UTC

Please explain what do you mean by 'XLS is a proprietary binary file format'.

Tom Kyte
May 24, 2004 - 11:58 pm UTC

hmmm, don't know how to say it differently?

xls is binary
xls is proprietary to MS
xls is a file format

they look like this:


ÐÏ^Qࡱ^Zá^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@^@>^@^C^@þÿ ^@^F^@^@^@^@^@^@^@^@^@^@^@^A^@^@^@1^@^@^@^@^@^@^@^@^P^@^@þÿÿÿ^@^@^@^@þÿÿÿ^@^@^@^@0^@^@^@ÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿÿ ^H^P^@^@^F^E^@¯^XÍ^GY^@^A^@^F^A^@^@á^@^B^@°^DÁ^@^B^@^@^@â^@^@^@\^@p^@^K^@^@

they are a binary file whose format is owned by MS

You could try . . .

Tak Tang, May 26, 2004 - 3:32 am UTC

Dear 'A Reader'

My previous post was neither addressed to anyone, nor was it a response to a specific question. If you re-read the thread carefully, you will also discover that although there are prior references to forms (and oracle SERVER 9i), I am in fact the first person to mention Forms 9i.

I raised the possibility of using the generic connectivity adaptor for Heterogeneous Services, and the Open Client Adaptor for Forms 6i, because neither had yet been mentioned on this, or a related thread 'Automate loading of data from Excel spreadsheets into database',
http://asktom.oracle.com/pls/asktom/f?p=100:11:::::P11_QUESTION_ID:243814703172,
and felt that someone responding to this thread's title 'from file into table' might find it solves their problem.

Furthermore, I am quite willing to share my experiences in both these technologies. My request not to ask for an example was limited to Forms 9i, for which I have no direct experience, but nevertheless offered my thoughts on how to tackle the problem.

As an oracle/unix/windows dba/developer, I do not see the problem as 'how do I load data into oracle', which is amply addressed here and elsewhere on this site, but rather 'how do I get data out of excel'.

Another anonymous reader hit the (metaphorical) nail on the head when they asked 'is there a way we can convert .xls file into a tab file from the command line?'. A quick search on google would have produced, oh, scores of tools that do just this, and not one of them mentions Oracle. Why? Because Microsoft Excel is not an Oracle product. Microsoft Excel is a Microsoft product.

You could try asking Microsoft how to convert excel to csv. And failing that, google.

Tak Tang


the right place

Paola, June 23, 2004 - 12:26 pm UTC

Tom is the man.. :) thanks a lot!

Vamsi, November 10, 2004 - 12:19 am UTC

Excellent site, has been most useful. The layout with the code is very good and helps me understand better.

Tom keep up the great work!!!!!

data load from word file into database

Venkat, December 13, 2004 - 6:44 am UTC

Tom,

Our application picks up some huge word documents from the host system and puts the same in our system (unix server) at a particular location. We need to store this information in the oracle database and retrieve the same as and when needed. Kindly let us know the method of achieving this. (os. sun solaris 2.8 and oracle 9.2.0.4).

Regards,
Venkat

Tom Kyte
December 13, 2004 - 10:18 am UTC

search for

load blob

on this site, you have

a) dbms_lob.loadfromfile
b) sqlldr
c) custom program

to choose from, at least.

Thank you very much Tom!

A reader, January 07, 2005 - 10:01 am UTC

I'm a regular visitor of your fantastic site and I'd like to thank you for the right answers you have given on lots of questions over the past years. All the best for 2005! Happy New Year and may the source be with you ;-)

Data Load Issues

Rashid Khan, January 10, 2005 - 3:27 pm UTC

Very good site and extremely helpful.
Thanks Tom for your dedication, focused support and patience.

Loading in a CR

Melissa, January 18, 2006 - 2:16 pm UTC

When i load in a row like :

W7771C|01-OCT-2004|2011|FTS|0|0|0|0|1|0|ASIP

using | as the delimiter,
the last field inserts as 'ASIP
'
with a carriage return at the end. How do I get rid of this?

Tom Kyte
January 19, 2006 - 12:23 pm UTC

I'm assuming you are using sqlldr - how about sharing your ctl file with us.

The only site with the MOST useful information.

trevor borthwick, February 21, 2006 - 9:47 pm UTC

I am new to Oracle and have found solutions to all my questions already answered by you. I enjoy the code samples the most. Thanks a lot. Please keep this site alive forever!

Does SQL*Loader need to run on server-side ?

Apsolu, March 14, 2006 - 12:08 pm UTC

Hello Tom,

Your answers are great and I'd like to have a confirmation of something you said at the very beginning of this post: does SQLLDR need to run on the same machine as the one on which the DB instance is running (in other words, server-side)? Or would it be possible that, on a client machine, SQLLDR connects to the database and loads data from the client host?
If the second one is possible, does it require some special, tricky parameters?

Thank you very much in advance,
Apsolu

Tom Kyte
March 15, 2006 - 9:04 am UTC

sqlldr can work over the network, yes.
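For instance, nothing stops you from running sqlldr on a client machine and pointing it at a remote database through a TNS alias; the data file stays local to the client and the rows travel over SQL*Net (the user, alias, and file names here are hypothetical):

```
sqlldr userid=scott/tiger@remote_db control=load.ctl data=t1.dat log=load.log
```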

generating primary key using a field from text data

Imam Hossain, April 27, 2006 - 6:17 am UTC

Hi Tom !
I am Imam Hossain from Dhaka Stock Exchange Ltd.

We are using Oracle 10g and Linux cluster server.
we have to upload data from text file to oracle table.
presently we are using a java application to upload data, which is very slow. That's why now we want to upload
data using SQL*Loader.
Table Information:
CREATE TABLE TEMPTRADEDATA(
ObligationId NUMBER,
TransactionId NUMBER,
HowlaNo NUMBER,
InstrumentCode VARCHAR2(20),
MarketType CHAR(1),
TradeDate DATE,
TradeTime VARCHAR2(10),
TradePrice FLOAT,
TradeVolume NUMBER,
BuyerCode VARCHAR2(20),
SellerCode VARCHAR2(20),
TradeCrossing CHAR(1),
BuyerTransId NUMBER,
SellerTransId NUMBER,
HowlaType CHAR(1),
HowlaCharge FLOAT,
LagaCharge FLOAT,
IsException CHAR(1),
PayIn DATE,
PayOut DATE,
GroupCode CHAR(1),
Netting CHAR(1),
IsDeleted CHAR(1),
IsInvalid CHAR(1),
IsDematted CHAR(1),
Tax FLOAT,
constraint TEMPTRADEDATA_PK PRIMARY KEY(ObligationId),
foreign key (InstrumentCode) REFERENCES INSTRUMENT(Code),
foreign key (BuyerCode) REFERENCES MEMBER(Code),
foreign key (SellerCode) REFERENCES MEMBER(Code),
foreign key (MarketType) REFERENCES MarketType(Code),
foreign key (HowlaType) REFERENCES HOWLATYPE(Code)
);

and Data Information:
200611601, 200001,ABBANK ,P,2006-04-26,10:14:56, 282.000, 40,DSEJAM ,DSEDBL ,N,109306176 ,104260013 ,N, 3.000, 2.820
----- in data file, data starts from TransactionId that means 2nd field and ended to LagaCharge.
during insertion, obligationId will be a 12-digit value generated on a daily basis: 260406000001, 260406000002 and so on. The first 6 digits of obligationId come from the tradedate field of the data file and the last 6 digits are a sequence number starting from 1 each day. Please give me a proper solution.

Tom Kyte
April 27, 2006 - 3:02 pm UTC

solution is "write a control file and run sqlldr" ???
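For what it is worth, a hedged sketch of such a control file, derived from the sample row above (the data file name, the daily sequence obligation_seq, and the exact date masks are assumptions -- they would need to match your actual feed):

```
LOAD DATA
INFILE 'trades.dat'
APPEND
INTO TABLE temptradedata
FIELDS TERMINATED BY ',' TRAILING NULLCOLS
( transactionid,
  howlano,
  instrumentcode,
  markettype,
  tradedate     DATE "YYYY-MM-DD",
  tradetime,
  tradeprice,
  tradevolume,
  buyercode,
  sellercode,
  tradecrossing,
  buyertransid,
  sellertransid,
  howlatype,
  howlacharge,
  lagacharge,
  -- first 6 digits from the trade date (DDMMYY), last 6 from a daily sequence
  obligationid  EXPRESSION "to_char(to_date(:tradedate,'YYYY-MM-DD'),'DDMMYY') || lpad(obligation_seq.nextval, 6, '0')"
)
```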

Load data from client side file

Nikhilesh, May 22, 2006 - 12:31 am UTC

Dear Tom,
I have a csv on client side and want to load it in a table.
I'm not using Forms, so I'm working with mod_plsql.

1)So is it ok to upload this file using mod-plsql in a table i.e. DOCUMENTS and download it on server and use it as a source for external table. Using this external table we can load data to required table

2) Or should I use FTP through pl-sql. Does ftp work in mod-plsql.

Please comment.........Thanks in advance.....

Tom Kyte
May 22, 2006 - 7:40 am UTC

1) if you have it (the csv) in a table already, why not just process it from there?

2) you can ftp in plsql, but I doubt you really want to, seems you already have the data there and the data cannot be "enormous", you had the patience to upload it via a webform.

reg utl_file

Balakrishna., December 29, 2006 - 4:40 am UTC

Dear Tom,

I have a procedure in which I am reading text from a file using utl_file and then displaying it:

create or replace procedure test is
    c1          utl_file.file_type;
    read_string varchar2(200);
begin
    c1 := utl_file.fopen('/csttstdb/oracle/CSTTST/920/utl/out','emp.txt','R');
    begin
        loop
            utl_file.get_line(c1, read_string);
            dbms_output.put_line(read_string);
        end loop;
    exception
        when no_data_found then null;
    end;
    utl_file.fclose(c1);
end;

My requirement is something like this: I want to insert the data into a table after reading it. How is this possible from the above procedure? Can you please help me out with this?

Regards

Balakrishna.
Tom Kyte
December 29, 2006 - 9:48 am UTC

add an insert statement inside the loop???



reg utl_file

Balu, December 29, 2006 - 11:32 pm UTC

Dear Tom,

I am really happy with your prompt reply. As you said, I can include an insert statement inside the loop. What happens here is that I am reading the emp.txt file from the OS, which consists of two columns, empno and ename, and I am reading each line into one string, i.e.
utl_file.get_line(c1, read_string). How do I split this string into the two columns and include them in the insert statement? Can you please help me out with this?

create or replace procedure test is
    c1          utl_file.file_type;
    read_string varchar2(200);
begin
    c1 := utl_file.fopen('/csttstdb/oracle/CSTTST/920/utl/out','emp.txt','R');
    begin
        loop
            utl_file.get_line(c1, read_string);
            dbms_output.put_line(read_string);
        end loop;
    exception
        when no_data_found then null;
    end;
    utl_file.fclose(c1);
end;

Tom Kyte
December 30, 2006 - 9:43 am UTC

parse it - that is, write code, manipulate the string, use substring, instr, maybe the regex functions - whatever
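A hedged sketch of what that parsing could look like, assuming emp.txt holds comma-delimited "empno,ename" lines (the comma delimiter and the target EMP table are assumptions; the file location is taken from the procedure above):

```sql
create or replace procedure load_emp is
    c1          utl_file.file_type;
    read_string varchar2(200);
    l_comma     pls_integer;
begin
    c1 := utl_file.fopen('/csttstdb/oracle/CSTTST/920/utl/out','emp.txt','R');
    begin
        loop
            utl_file.get_line(c1, read_string);
            -- split "empno,ename" at the first comma
            l_comma := instr(read_string, ',');
            insert into emp (empno, ename)
            values ( to_number(substr(read_string, 1, l_comma - 1)),
                     substr(read_string, l_comma + 1) );
        end loop;
    exception
        when no_data_found then null;   -- end of file reached
    end;
    utl_file.fclose(c1);
    commit;
end;
/
```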

reg utl_file

Balu, January 01, 2007 - 5:23 am UTC

Many thanks.

Regards

Balu.

Thanks..

Anto Joy, May 25, 2007 - 11:29 am UTC

Now I have a question from this.
I am extracting from a table to a file. I need the total number of records as a header in that file (without executing the query at the beginning just for the count, and without any additional files).

After writing all the records using UTL_FILE.PUT_LINE, I just want to move the file pointer to the beginning and write the count in the first line.

Please provide an answer.
Tom Kyte
May 26, 2007 - 12:10 pm UTC

read the file upside down, pretend the bottom is the top - that way, you can print the number of records at the end of the file.

utl_file supports fseek, but only for reading. You'd want to use java or something more sophisticated in its file reading to do this

but really - you shouldn't need to - programs should just be able to read the output file to the end to find out how many lines there are.

A reader, May 28, 2007 - 8:41 am UTC

Thanks for your answer, but my requirement needs it in the first line itself.
Tom Kyte
May 30, 2007 - 10:11 am UTC

2nd paragraph then.

csv

A reader, January 22, 2008 - 5:40 pm UTC

Tom:

Can you write dump_csv with dbms_output so we can run it in client SQL*Plus and see if we can spool to a client file?

We could not get it to work.

thanks
Tom Kyte
January 22, 2008 - 6:59 pm UTC

you can certainly do that - why would you expect me to? it is a rather trivial, simple change.

but see
http://asktom.oracle.com/tkyte/flat/index.html

maybe all you want is a sqlplus script.

Reading external table from a client side directory

Line, February 06, 2008 - 3:24 pm UTC

Hi Tom!

To the first question, regarding how to load data from a flat file into Oracle, you mentioned it was possible to upload data from a client side directory, with a client form...

I can't seem to find how to do it. Do we have to create a directory locally also (that would be dynamic of course)?
Like this?
CREATE OR REPLACE DIRECTORY LOAD_TEST AS '/dcxxdb13/applacxx/acxxappl/xx/11.5.0/admin';


Thanks !


Tom Kyte
February 07, 2008 - 7:31 am UTC

I said no such thing.

I said:

... If the "Win95" directory is some arbitrary directory on some arbitrary client
-- we can do this with PLSQL in Forms (not via a stored procedure but via a
client side plsql routine) but not with a stored procedure. ....


You can write a bit of code to open files, read records, insert them into the database using a client application.

but there is nothing the SERVER can do to touch that file (it would be a virus like thing if it could!)

Unless and until the file is available to the server machine via the servers file system - that file doesn't exist for the server.

Using forms, you would use something like the text_io package (part of FORMS) to open and read the file, processing each record you found.
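A hedged sketch of what such client-side Forms PL/SQL might look like (the client path, the target table T, and its single-column layout are assumptions):

```sql
declare
    l_file text_io.file_type;
    l_line varchar2(4000);
begin
    -- TEXT_IO opens the file on the CLIENT machine running Forms
    l_file := text_io.fopen('C:\data\load.txt', 'R');
    begin
        loop
            text_io.get_line(l_file, l_line);
            -- the insert travels over the Forms connection to the server
            insert into t (data) values (l_line);
        end loop;
    exception
        when no_data_found then null;   -- end of file
    end;
    text_io.fclose(l_file);
    commit;
end;
```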

from excel file into table

vinothraman, April 02, 2008 - 7:32 pm UTC

I want to write a stored procedure to load data from an Excel file into a table.
Please help me in this regard.

Khaja Moin, April 20, 2008 - 3:07 pm UTC

Tom,

I have executed the function which is trying to load data from flat file with delimiters to oracle table. My problem is i have multiple data files with different names and when execute the function it is accepting only 1 file. I changed the code but it dint worked out can u review the below code and let me know:

create or replace
function load_data( p_table in varchar2,
p_cnames in varchar2,
p_dir in varchar2,
p_filename1 in varchar2,
p_filename2 in varchar2,
p_delimiter in varchar2 default '|' )
return number
is
l_input utl_file.file_type;
l_theCursor integer default dbms_sql.open_cursor;
l_buffer varchar2(4000);
l_lastLine varchar2(4000);
l_status integer;
l_colCnt number default 0;
l_cnt number default 0;
l_sep char(1) default NULL;
l_errmsg varchar2(4000);
begin
l_input := utl_file.fopen( 'DBDIR','ORA.txt','r' );
l_input := utl_file.fopen( 'DBDIR','QAD1.txt','r' )
l_buffer := 'insert into ' || p_table || ' values ( ';l_colCnt := length(p_cnames)-length(replace(p_cnames,',',''))+1;


for i in 1 .. l_colCnt
loop
l_buffer := l_buffer || l_sep || ':b'||i;
l_sep := ',';
end loop;
l_buffer := l_buffer || ')';

dbms_sql.parse(l_theCursor, l_buffer, dbms_sql.native);
loop
begin
utl_file.get_line( l_input, l_lastLine );
exception
when NO_DATA_FOUND then
exit;
end;
l_buffer := l_lastLine || p_delimiter;

for i in 1 .. l_colCnt loop
dbms_sql.bind_variable
( l_theCursor, ':b'||i,
substr( l_buffer, 1,
instr(l_buffer,p_delimiter)-1 ) ) ;
l_buffer := substr( l_buffer,
instr(l_buffer,p_delimiter)+1 );
end loop;
begin
l_status := dbms_sql.execute(l_theCursor);
l_cnt := l_cnt + 1;
exception
when others then
l_errmsg := sqlerrm;
insert into badlog ( errm, data )
values ( l_errmsg, l_lastLine );
end;
end loop;

dbms_sql.close_cursor(l_theCursor);
utl_file.fclose( l_input );
commit;

return l_cnt;
end load_data;
/
-------------------------------------------------------------
begin
dbms_output.put_line(
load_data('qad',
'source,gl_ref,amount',
'DBDIR',
'QAD1.txt',
'ORA.txt',
'|') || 'rows loaded');
end;
/


Tom Kyte
April 23, 2008 - 4:22 pm UTC

"u" isn't available. sorry, No idea when "u" will return - or actually appear for the first time. They must be really busy - with all of the requests for their time.

now, in looking at your code, I very quickly see the parameters p_filename1, p_filename2 are not used.

I see you open two files, putting the handles from each into the same variable. Hence the first file you open is never accessible to you.

This is some "pretty basic stuff", please review your code first - lots of obvious things in there.

Oracle Directory cant be used in design.

A reader, May 01, 2008 - 7:31 am UTC

Hi Tom,

We have implemented our design using external tables for reading data from flat files and it was working fine until some security issue turned up and I now need to revamp my design to use Java routines instead of External Table approach.

Security Issue :

We have 2 sites in LIVE i.e. LIVE-A and LIVE-B

            LIVE-A            |            LIVE-B

  Java writing to NASA;       |  Java writing to NASB         Acc
  if NASA down, then to NASB  |  if NASB down, then to NASA
  ----------------------------|------------------------------
                              |
            NASA              |            NASB               App
                              |
            DB1 ------------ RAC ------------ DB2             DB
                              |

At any time,both Java routines on both sites would be writing records to NAS location and Database is supposed to pick up data from both locations.However,Only one of the database would be active at any instance and other would be in standby mode.The 2 database are then synchronized using data-guard.However DB2 cant access NASA and DB1 can access NASB because in External tables we specify the path and that means it directly acceses the Application Layer;due to this DB2 cant cross 2 firewalls and read files from NASA and DB1 cant cross 2 firewalls and read files from NASB.

Question1) Am I missing something out here?Can we access NAS(Network Attached storage) locations via RAC?

Planning to implement a new design by writing a Java process in the application layer which reads the records from the file and inserts them into the database. However, we are expecting around 0.5 million records in the file. Planning to do a batch insert into the database from the Java process, and to commit only when all the records in the file have been inserted into the table, so that we don't have duplicate records in the event of the server going down while the Java routine is inserting records into the database.

Question2) Do you think this would cause a performance problem?

Your views on above situation would highly be appreciated.

Regards
Tom Kyte
May 01, 2008 - 10:04 am UTC

... until some security issue ...

what 'security' issue please.


this sentence was not able to be parsed by my brain:

However DB2 cant access NASA and DB1 can access NASB because in
External tables we specify the path and that means it directly acceses the
Application Layer;due to this DB2 cant cross 2 firewalls and read files from
NASA and DB1 cant cross 2 firewalls and read files from NASB.



I don't see how external tables prevents a database from accessing network attached storage.


Oracle Directory cant be used in design

A reader, May 01, 2008 - 4:20 pm UTC

Thanks Tom for quick response.

Apologies if my language was too lengthy and confusing.

The security issue is as follows:
Referring to the diagram, if DB2 has to access NASA it has to look diagonally, meaning it needs to cross 2 firewalls, which is a security issue. One site can access another site by crossing only one firewall.
Now there are 2 ways to reach NASA from DB2, but both need crossing 2 firewalls:
1) DB2 accesses NASB on the App layer and then accesses NASA on the same layer.
2) DB2 accesses DB1 on the DB layer and then accesses NASA on the same site.

Our problem:

Since we are using NAS, we have mounted the NAS path on the DB server. An Oracle directory is created on this mounted path. This Oracle directory is then used for the external table. Therefore the Oracle directory created on the path for NASA will have to cross 2 sites when accessed from DB2, which is a security issue.

Thanks in advance for looking into the Issue.

Regards
Tom Kyte
May 01, 2008 - 9:38 pm UTC

so, why don't you fix the 'security issue', get the files to a disk that both can access

rather than write scads of code that will never ever approach the performance you already have, with the small amount of code you already have.

I don't like to write code.

I like to find ways to not write code.

so, rather than say "we cannot", find a way to "do it" - find a way to get the files where they need to be.

Oracle Directories cant be used

A reader, May 02, 2008 - 6:12 am UTC

Thanks Tom for looking into the Issue.

I did escalate the concern regarding performance, but I think they are more interested in facing the problem later rather than recognizing it beforehand and resolving it.

About a separate location: they just have 2 sites, and hence they can move all data either to site1 or site2.
There are also scenarios when NASB goes down and NASA comes into active mode.

Regards,
Tom Kyte
May 02, 2008 - 7:28 am UTC

well, I'd suggest not writing code, and getting the file system access setup as you need for your implementation - nothing much else I can say as far as "how to" here.

Oracle Directories cannot be used

A reader, May 02, 2008 - 11:28 pm UTC

Thanks Tom for responding to the Issue.

loading data into table from flat file

prabhakar, June 18, 2008 - 3:39 am UTC

We are using the utl_file package to read the data from a flat file; after reading the values it stores them in variables and loads the data into the tables.
Problem :It is not loading the some of the records.
The procedure is not giving any error.
When I am putting only the missing records in the flat file and running the function, then they get loaded.
What could be the reason that it is missing some records?
What changes should I make in the function to load all the records?
Can anyone suggest the necessary changes to my function?

Tom Kyte
June 18, 2008 - 1:09 pm UTC

...
Problem :It is not loading the some of the records.
The procedure is not giving any error.
....

answer: you have a when others error handler in there. remove it. You will start seeing why the procedure is failing, but right now failing silently - ignoring errors (eg: your developed code has a huge bug in it)
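As an illustration of the failure mode (the table and value are hypothetical):

```sql
begin
    insert into t values ( 'some row' );   -- suppose this raises an error
exception
    when others then
        null;   -- the error vanishes here; the row is silently NOT loaded
end;
```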





How much do people want to bet I am right.



short of that, you might, well, WANT TO ACTUALLY POST THE SMALLEST BIT OF CODE you can that demonstrates the issue.... so we can debug your code (you have an error in logic in there, we are not telepathic, we'd need to see your code)

Adding of headings in the flat file

jagdish, June 25, 2008 - 2:31 am UTC

Hi TOm,
I have used ur function to extract data to a flat file..but unable to add the query that u have given for xtracting the headings also..please write the full code for xtracting the data toa flat file with heading..
Tom Kyte
June 25, 2008 - 8:46 am UTC

if you are using a function written by someone named "UR" why are you here asking about it?

Especially if the modifications were made by someone named "U"?

and on a serious note.

If you, as a paid coder, cannot figure out how to get the column names to print out when you have 99% of the code already - then we have a problem.

http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:68212348056#4943605302984

that is a page that actually discusses UNLOADING data (whereas this page is about, well, loading data). That gets you 99.5% of the way there.

Using UTL File to merge multiple files

Nishant, June 27, 2008 - 7:33 am UTC

Hi tom,
I am having huge data, around 12 GB, to be dumped every day into an ascii file. At present we are using bulk operations to select the data and write it into the file using utl_file. It takes a lot of time. I am planning to create multiple files using multiple threads and then merge them into one.
Is it possible to merge multiple files (12 files of around 1 GB each) into one file? I am not interested in line by line reading; instead, reading a block of data and dumping it into the target file.
Tom Kyte
June 27, 2008 - 9:23 am UTC

you do not want to use plsql/utl_file to do this

(and how would writing lots of files - IO, and then reading them all again - IO, and rewriting a single large file from them all - IO be, well, better????)


why are you dumping 12gb of data into ascii files everyday, what is your ultimate goal with this data. When we know what you need to ACCOMPLISH (the end goal) we can suggest efficient ways to do it.

Don't tell us what you think you need to do to accomplish your goal (currently that is 'create 12gb flat file'), tell us what your ultimate goal is and we might well tell you a completely different way to achieve it.

external table

A reader, June 27, 2008 - 12:42 pm UTC

excellent

reading from Excelsheet to database(table)using Pl/SQl

Naga lakshmi, August 25, 2008 - 6:11 am UTC

Hi Tom,
1. How can I retrieve data from an Excel sheet into a database table using a PL/SQL procedure, row wise and column wise? Can you please help me with this?
2. Can we use parameters in array format in a stored procedure?
3. Can we skip middle values while inserting data into a table? If possible, how?
insert into table_name values('naga',,'hyd'); --like this

Thanks & regards,
Nagalakshmi
Tom Kyte
August 26, 2008 - 8:31 pm UTC

1) http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:4406709207206#18830681837358

2) yes.

ops$tkyte%ORA10GR2> create or replace type mySQLtype as table of number
  2  /

Type created.

ops$tkyte%ORA10GR2>
ops$tkyte%ORA10GR2> create or replace package foo
  2  as
  3      type numArray is table of number;
  4
  5      procedure p( a1 in mySQLtype, a2 in numArray );
  6  end;
  7  /

Package created.

ops$tkyte%ORA10GR2>
ops$tkyte%ORA10GR2> create or replace package body foo
  2  as
  3      procedure p( a1 in mySQLtype, a2 in numArray )
  4      is
  5      begin
  6          for i in 1 .. a1.count loop dbms_output.put_line( 'a1('||i||') = ' ||a1(i) ); end loop;
  7          for i in 1 .. a2.count loop dbms_output.put_line( 'a2('||i||') = ' ||a2(i) ); end loop;
  8      end;
  9  end;
 10  /

Package body created.

ops$tkyte%ORA10GR2>
ops$tkyte%ORA10GR2> declare
  2      data1 mySQLtype := mySQLtype(1,2,3);
  3      data2 foo.numArray := foo.numArray(4,5);
  4  begin
  5      foo.p(data1,data2);
  6  end;
  7  /
a1(1) = 1
a1(2) = 2
a1(3) = 3
a2(1) = 4
a2(2) = 5

PL/SQL procedure successfully completed.




3) you NEVER will ever code an insert without a column list again

you will ALWAYS code:

insert into t (a,c) values ( 'value for a', 'value for c, skipping b' );


Deepthy, May 22, 2009 - 7:21 am UTC

TOM:
I got a text file like this
22-Mar-2009 07:39:46 > Data Migration Application Version 5.1.002
22-Mar-2009 07:39:46 > Processing file D:/RioDM/5NC/Import/01_5NC_AlternativeID.csv
22-Mar-2009 07:39:46 > Processing Alternative IDs from D:/RioDM/5NC/Import/01_5NC_AlternativeID.csv
22-Mar-2009 07:42:29 > Alternative ID Processing Completed Successfully.
All i wanted is to read the text file using UTL_FILE package in to 3 tables( Date, Time, Description.)
Each containing one column.
Please help me get it working.
Tom Kyte
May 23, 2009 - 10:31 pm UTC

it is called writing code.

the code you would write in plsql would be similar to the code you would write in "language X" (where X is any language you want)

eg:

open file
read a record
while more data
  parse record
  process record
  read a record 
end while
close file


That pseudo code is as old as programming. Hopefully you do not need help writing such code, as that code is something all programmers should be able to craft in their sleep???



That said, I only see TWO fields in there

a) a date
b) a string


and I fail to see how or why that would be split into separate tables. If you are storing the dd-mon-yyyy data in one field and the hh24:mi:ss data in another - you are absolutely (no questions asked) doing it wrong.

If you are storing those two columns in two separate tables - you are doing it *wrong* as well - what use would it be in two different tables without any way to put it back together??? and if you did craft a way to put it back together - why would you split it up in the first place????????


sqlldr can load that data into a single table easily without code
external tables can do the same

do not write code, use the tools.
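To make the "no code" point concrete, a hedged sketch of an external table over that log file -- two columns, one table (the directory object MY_DIR and the file name migration.log are assumptions; the positions come from the fixed-width "DD-Mon-YYYY HH24:MI:SS > " prefix in the sample lines):

```sql
create table log_ext (
    log_ts  date,
    msg     varchar2(4000)
)
organization external (
    type oracle_loader
    default directory my_dir
    access parameters (
        records delimited by newline
        fields (
            -- characters 1..20 hold the timestamp, 24 onward hold the message
            log_ts  position(1:20)   char date_format date mask "DD-Mon-YYYY HH24:MI:SS",
            msg     position(24:4000) char(3977)
        )
    )
    location ('migration.log')
);
```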

A reader, May 24, 2009 - 9:47 pm UTC


FILE NEEDED

NIRAJ, September 29, 2010 - 6:34 am UTC

CAN you please show me the file THAT YOU JUST READ USING UTL PACKAGE.???
JUST WANT TO SEE THE FORMAT AND WHAT DE-LIMITERS YOU ARE USING.

THANKS IN ADVANCE.

Tom Kyte
September 29, 2010 - 9:07 am UTC

THE FILE IS RIGHT THERE IN THE EXAMPLE (why are you shouting at me?)

ops$tkyte@8i> host echo 1,2,3 > /tmp/t1.dat

ops$tkyte@8i> host echo 4,5,6 >> /tmp/t1.dat

ops$tkyte@8i> host echo 7,8,9 >> /tmp/t1.dat

ops$tkyte@8i> host echo 7,NotANumber,9 >> /tmp/t1.dat




that and the delimiter is actually a formal parameter (named "p_delimiter" of all things) that defaults to '|' but may be overridden - as again shown in the example:

ops$tkyte@8i> begin
  2     dbms_output.put_line(
  3         load_data( 'T1',
  4                    'x,y,z',
  5                    '/tmp',
  6                    't1.dat',
  7                    ',' ) || ' rows loaded' );
  8  end;
  9  /
3 rows loaded



see line 7, I sent in a ',' instead.


did you read the example?

hello tom, i'm using the below data ....

niraj, September 29, 2010 - 6:48 am UTC

"abc"|"BP1000001"|"abc"|"abcd"|" INSTITUN"|"GBLCSTDN"|"abcd"|"abcd"|"abcd"|"ab"
"abc"|"BP1000002"|"abc"|"abcd"|" INSTITUN"|"GBLCSTDN"|"abcd"|"abcd"|"abcd"|"ab"
"abc"|"BP1000003"|"abc"|"abcd"|" INSTITUN"|"GBLCSTDN"|"abcd"|"abcd"|"abcd"|"ab"
"abc"|"BP1000004"|"abc"|"abcd"|" INSTITUN"|"GBLCSTDN"|"abcd"|"abcd"|"abcd"|"ab"
"abc"|"BP1000005"|"abc"|"abcd"|" INSTITUN"|"GBLCSTDN"|"abcd"|"abcd"|"abcd"|"ab"
"abc"|"BP1000006"|"abc"|"abcd"|" INSTITUN"|"GBLCSTDN"|"abcd"|"abcd"|"abcd"|"ab"
"abc"|"BP1000007"|"abc"|"abcd"|" INSTITUN"|"GBLCSTDN"|"abcd"|"abcd"|"abcd"|"ab"
"abc"|"BP1000008"|"abc"|"abcd"|" INSTITUN"|"GBLCSTDN"|"abcd"|"abcd"|"abcd"|"ab"
"abc"|"BP1000009"|"abc"|"abcd"|" INSTITUN"|"GBLCSTDN"|"abcd"|"abcd"|"abcd"|"ab"


what changes do i need to make?



thanks in advance..
keep up the good work..
Tom Kyte
September 29, 2010 - 9:08 am UTC

see above...


as a developer, you should be able to read the code, understand the code, and know that the code basically would not need a change for you.

Easy way to create external tables

A reader, December 30, 2010 - 10:02 pm UTC

I have files whose data needs to update (rather, merge into) a normal table. I would like to convert these files to external tables.

As the number of columns is large, I thought there might be an easy way, like below:


1. CTAS statement: create table ext_table as select * from actual_table where 1=0.
2. Step 1 creates a normal table, which I would simply alter to organization external and give the directory and file name details.





Tom Kyte
January 03, 2011 - 8:27 am UTC

You cannot update external tables. You can only

a) query from them
b) create them (with data)

You cannot use insert/update/delete against them.


ops$tkyte%ORA11GR2> create or replace directory tmp as '/tmp'
  2  /

Directory created.

ops$tkyte%ORA11GR2> exec utl_file.FREMOVE( 'TMP', 'test.dat' );

PL/SQL procedure successfully completed.

ops$tkyte%ORA11GR2> 
ops$tkyte%ORA11GR2> !ls -l /tmp/test.dat
ls: /tmp/test.dat: No such file or directory

ops$tkyte%ORA11GR2> create table test
  2  organization external
  3  ( type oracle_datapump
  4    default directory TMP
  5    location( 'test.dat' )
  6  )
  7  as
  8  select username, user_id, created from all_users
  9  /

Table created.

ops$tkyte%ORA11GR2> !ls -l /tmp/test.dat
-rw-rw---- 1 ora11gr2 ora11gr2 12288 Jan  4 00:07 /tmp/test.dat

ops$tkyte%ORA11GR2> 
ops$tkyte%ORA11GR2> insert into test (username, user_id, created) values ( 'x', 100, sysdate );
insert into test (username, user_id, created) values ( 'x', 100, sysdate )
            *
ERROR at line 1:
ORA-30657: operation not supported on external organized table


ops$tkyte%ORA11GR2> delete from test;
delete from test
            *
ERROR at line 1:
ORA-30657: operation not supported on external organized table


ops$tkyte%ORA11GR2> update test set username = lower(username);
update test set username = lower(username)
       *
ERROR at line 1:
ORA-30657: operation not supported on external organized table

what is this error?

John, January 03, 2011 - 3:00 am UTC

create or replace directory data_dir as 'C:\tmp';

create table external_table
(student_id varchar2(20),
enroll_status varchar2(20),
major varchar2(30)
)
ORGANIZATION EXTERNAL
( type oracle_loader
default directory data_dir
access parameters
( fields terminated by ',' )
location ('test.txt')
)
/

host echo 1001,traditional,english > /tmp/test.txt

select * from external_table;

ERROR at line 1:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04063: unable to open log file EXTERNAL_TABLE_1640_3660.log
OS error The system cannot find the file specified.
ORA-06512: at "SYS.ORACLE_LOADER", line 19

what is this error Tom?
Tom Kyte
January 03, 2011 - 8:59 am UTC

We were unable to open the file EXTERNAL_TABLE_1640_3660.log for writing on the DATABASE SERVER in the c:\tmp directory.


Make sure

a) you understand that all IO will (has to when you think about it) take place on the DATABASE SERVER - not your PC.

b) that the c:\tmp directory already exists on the DATABASE SERVER

c) that the account your database is running as has the ability to read and write that directory.

How to set init.ora

selva, May 27, 2011 - 8:53 am UTC

How to set the utl_file_dir init.ora parameter? Can you please help me with this?
Tom Kyte
May 27, 2011 - 10:55 am UTC

do not use that - use directory objects...


create or replace directory my_dir as '/.....';

grant read, write on directory my_dir to whomever...



and in utl_file.fopen, reference 'MY_DIR' as the directory name.


utl_file_dir requires a database restart to change and has no ability to control who can read/write which files. DO NOT use it - directory objects have been available since 9i!
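Putting those steps together, a complete sketch might look like this. The path, directory name, grantee, and file name are all placeholder assumptions for illustration.

```sql
-- as a suitably privileged user (path and names are examples):
create or replace directory my_dir as '/u01/app/loads';

grant read, write on directory my_dir to scott;

-- then, in the grantee's PL/SQL, reference the directory object BY NAME:
declare
    l_file utl_file.file_type;
begin
    l_file := utl_file.fopen( 'MY_DIR', 'data.txt', 'r' );
    -- ... read with utl_file.get_line ...
    utl_file.fclose( l_file );
end;
/
```

No restart, and access is controlled by ordinary grants on the directory object.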

validation column in a oracle database

Perumal, June 01, 2011 - 11:33 am UTC

Hi Tom,
Your help required on this.

I have a table named EMP. The table contains two columns,
'emp_name' and 'emp_id'; both are varchar.

Values are
'XXX','12'
'yyy','13'
'zzz','S12'

I need to store this data in another table named EMP_VALID. It contains two columns: empname varchar and empid integer.

I need to convert the empid varchar value to integer and check whether it is numeric. If it is numeric, I need to store it in the emp_valid table columns; otherwise discard the value.


Do some help for me in this...


Tom Kyte
June 01, 2011 - 2:48 pm UTC

if a valid number to you is an integer with just 0-9 in it then:

replace( translate( emp_id, '0123456789', '0' ), '0', '' ) is null

will find the 'valid' ones.

You can also just use dml error logging - with that, you can just insert into the new table, select * from the old. any data that does not convert - will be logged into a third table for you.

http://asktom.oracle.com/Misc/how-cool-is-this.html
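Both approaches might look like this against the tables described above (EMP and EMP_VALID, with the column names the reader gave; the error-log table name follows the default ERR$_ convention created by DBMS_ERRLOG):

```sql
-- approach 1: keep only rows whose emp_id contains nothing but digits 0-9
insert into emp_valid( empname, empid )
select emp_name, to_number( emp_id )
  from emp
 where replace( translate( emp_id, '0123456789', '0' ), '0', '' ) is null;

-- approach 2: DML error logging - rows that fail conversion land in the
-- error table instead of failing the statement
-- exec dbms_errlog.create_error_log( 'EMP_VALID' );
insert into emp_valid( empname, empid )
select emp_name, emp_id
  from emp
   log errors into err$_emp_valid reject limit unlimited;
```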
