Question and Answer

Tom Kyte

Thanks for the question, James.

Asked: April 11, 2001 - 7:15 pm UTC

Last updated: October 20, 2004 - 4:34 pm UTC

Version: 816

Viewed 1000+ times

You Asked

I want to input data from a file using SQL*Loader. I have read through the available samples on the site, but my file seems quite a challenging one.
The data is in the following format:

A,HJ00034,Thu Dec 7 17:39:18 2000,,,,,,,,,
B,john,smith,,1 beacon House,Staltan Street,London,(null),~GB,SE5 0XS,,
X,message,Hello Hello, this is a message for you,mailchoice,Yes,,,,,,
S,Express delivery shipping,My other name,My second name,123 House Number,IN a City,(null),~DZ,SE4 9TY,,,

A, B, P, X, S, I, and T are record-type identifiers and are not required in the destination tables.
The A record contains the invoice number and the date of the order.
The B record contains customer name and bill to address information.
The P record contains payment information. There are actually two types of P records
depending on whether the customer used a check or credit card as a payment method
The X record contains any comments.
The S record contains customer name and ship to address information.
The I record contains information about the product(s) being ordered. There will be
as many I records as there are items ordered for each order.
The T record contains information about the order’s total monetary value.

Every 'A' record indicates the beginning of a transaction, and likewise the matching 'T' record indicates the end of that same unique transaction. I will need to
put the results in tables as follows:
A - Customers table (HJ00034 will act as the primary key) and Orders table (HJ00034 will be a foreign key)
B - Customers table
P - Payments table
X - Payments table
S - Shipping table
I - Invoice table
T - Invoice table
I have tried using positional fields, but that isn't workable because I need to skip records and move on to the next row. A pointer in the right direction would be appreciated.


and Tom said...

You won't be able to do that as a transaction in sqlldr. sqlldr can load this file, but the A...T records will NOT be considered "together". It might insert the A record, commit, then insert the B and P records, fail on the X, succeed on the S, I, I, and T records, and then commit -- placing the X record into a BAD file.

If transactional integrity is meaningful to you across these records, you'll NOT be using SQLLDR.

Suggest you create a table like:

create table temp_load
( seq          int primary key,
  record_type  char(1),
  c1           varchar2(4000),
  c2           varchar2(4000),
  cN           varchar2(4000)
)
organization index;
and use a ctl file like:

into table temp_load
fields terminated by ","
trailing NULLCOLS
( seq recnum,
  record_type,
  c1, c2, ..., cN
)

and then post process the loaded data using a plsql loop:

begin
    for x in ( select * from temp_load order by seq )
    loop
        if ( x.record_type = 'A' ) then
            commit;  -- an 'A' record starts a new transaction
        end if;

        if ( x.record_type = 'A' ) then
            insert into table_a values ( x.c1, x.c2, ... x.c? );
        elsif ( x.record_type = 'B' ) then
            insert into table_b values ( x.c1, x.c2, ... x.c? );
        end if;
    end loop;
    commit;  -- the last transaction
end;
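If the grouping is done outside the database instead, the same "an 'A' starts a group, a 'T' ends it" rule can be sketched in a small script. This is a hypothetical Python sketch (not part of Tom's answer): it only groups the parsed records, and each yielded group would then be inserted and committed as one database transaction by whatever client you use.

```python
def group_transactions(lines):
    """Yield one list of parsed records per order: an 'A' record starts
    a new group, and the group is complete when its 'T' record arrives.
    Comma-separated records, as in the sample file, are assumed."""
    group = []
    for line in lines:
        fields = line.rstrip("\n").split(",")
        record_type = fields[0]
        if record_type == "A" and group:
            # An 'A' before the previous 'T' means a truncated order;
            # yield what we have so the caller can decide what to do.
            yield group
            group = []
        group.append(fields)
        if record_type == "T":
            yield group
            group = []
    if group:  # trailing, unterminated group
        yield group
```

Each group can then be loaded atomically: insert every record of the group, commit once, and roll back the whole group on any error -- the transactional behavior sqlldr cannot give you.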




Loading Rows Based On Already-Loaded Data

news0reader, October 18, 2004 - 5:50 pm UTC

Hi Tom, assume I have a data file such as the following, with newline-separated records and comma-separated cols; no header:


I want to load only the first-seen rows of each group of rows grouped by the value of the second column. Thus from the above I want to load only:


Is there a straightforward way to do this using only sql loader? My thoughts thus far have been:

1) preprocess the data file and remove "duplicate" rows before running SQL Loader.
2) somehow use CONTINUEIF to test the column value of the "next" row to build one long logical row... then load columns only from the first physical row that went into creating the logical row. The thing is, since these rows are delimited, I don't know how to refer to specific columns of "next" rows.

Please advise.


Tom Kyte
October 19, 2004 - 8:50 am UTC

not with sqlldr -- with an external table it is trivial.

I would not use sqlldr for this, either use an external table or just load up the entire file into a scratch table and pretend it was an external table.
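For completeness, option (1) from the question -- preprocess the file before running sqlldr -- is simple enough to sketch. This is a hypothetical Python sketch (not from Tom's answer); the comma-separated layout and the 0-based key column index are assumptions:

```python
def first_per_group(lines, key_col=1):
    """Keep only the first line seen for each distinct value of column
    key_col (0-based). Comma-separated lines are assumed."""
    seen = set()
    for line in lines:
        key = line.rstrip("\n").split(",")[key_col]
        if key not in seen:
            seen.add(key)
            yield line
```

The filtered output is then handed to sqlldr as the data file. Note this keeps the first row per key over the whole file, not just per contiguous run; if keys can recur in later groups, track only the previous key instead of a set.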

Is SQLLoader Good For Insert/Update/Delete?

news0reader, October 20, 2004 - 12:27 pm UTC

Hi Tom, thanks for the note above. Yes, external tables would make this trivial (the DBAs don't like loading via external tables -- don't ask why); scratch tables would also ease things (again the DBAs veto it; they don't want "scratch tables" in production databases).

Anyway, my follow-up question is this: I'm new to SQL Loader, but know it's good for rapidly loading data into tables. In general is it also a good choice for loading up tables which have inter-table referential integrity based on foreign keys? How about when a single data file has some rows to be inserted, others to be updated/deleted?

My thoughts, using SQL Loader, would be to first load master tables, then to load detail tables, with the detail's control file using an in-situ select statement to determine the foreign key to insert. For example:

create table _master ( id number primary key, name varchar2(100) );
create table _detail ( id number primary key, mid number references _master(id), name varchar2(100) );


This is inserted into _master as tuples ((1,apple), (2,battle), (3,cattle)). For whatever reason I choose not to use the natural key.


What would the field specification part of the control file for _detail.txt look like?

( skip_mid filler
, name
, id   "detailsequence.nextval"
, mid  "select id from _master where ..." -- ?
)

Failing this, I suppose I could parse the data to be loaded and construct insert statements (or stored procedure calls) for each row, and run them via SQL*Plus. Is that inadvisable?

Please advise.



Tom Kyte
October 20, 2004 - 4:34 pm UTC

you don't have DBAs, you apparently have "roadblocks".

sqlldr does one thing -- insert. That is it.

I'd go to management (honestly -- I would) and explain how the DBAs are turning a five-minute problem into $50k of code to develop, debug, and maintain.

don't try to get too fancy with sqlldr -- it gets ugly, and fast.

(you can use a scalar subquery to load up "lookup" values -- but there are no updates, there are no deletes)
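To illustrate that parenthetical, the lookup can be done in the field list with a scalar subquery -- but only as a sketch, with caveats: BOUNDFILLER requires Oracle 9i or later (plain FILLER fields cannot be referenced in SQL string expressions, and the question's version is 8.1.6), and the join condition `name = :skip_mid` is an assumption about what the skipped column holds.

```
( skip_mid  boundfiller
, name
, id        "detailsequence.nextval"
, mid       "(select id from _master where name = :skip_mid)"
)
```

Inserts only, as Tom says -- there is still no update or delete path through sqlldr.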