
Tom Kyte

Thanks for the question, S..

Asked: March 06, 2002 - 10:54 pm UTC

Last updated: November 15, 2011 - 8:23 am UTC

Version: 8.1.7


You Asked

Dear Tom,
Hope you are fine.
I am facing a problem using SQL*Loader.
I have a text file and I have to load these values into a table.
I also have a table consisting of 2 fields: name and details.
Here is some content of that text file:
LANGUILLI
Come on, girl-san. Tell me how you say that
one more time.

BIENSTOCK
Hey, why don't you ask her how to say
'jailbreak'...in Vietnamese?


and so many lines like these...
Here I want to insert this text into that table.
For example, LANGUILLI will be inserted into the 'name' field, and
the remaining text, which is not of fixed length, will be inserted into the 'details' field.

There is no terminator symbol here except 'Enter' and 'Double Enter'. How will I set the record terminator in SQL*Loader?
This text file is very large; I don't want to do it manually.
I want to use SQL*Loader.
What can I do?
Is there any solution?
Please give me a solution.

with best regards,

-S. Rahman, Bangladesh.

and Tom said...

You will load into a "temp" staging table and use a procedure to reformat the data.

Suppose you have:

drop table t;
create table t ( name varchar2(30), text clob );

drop table temp;
create table temp ( seqno int primary key, text varchar2(4000) )
organization index overflow tablespace users;

T is where you ultimately want the data. You can write a simple procedure:

create or replace procedure reformat
as
    l_name  varchar2(80);
    l_text  varchar2(32000);
begin
    for x in ( select * from temp order by seqno )
    loop
        if ( x.text = chr(10) )
        then
            null; -- skip it
        elsif ( substr( x.text, 1, 1 ) <> ' ' )
        then
            if ( l_name is not null )
            then
                insert into t values ( l_name, l_text );
            end if;
            l_name := x.text;
            l_text := null;
        else
            l_text := l_text || x.text || chr(10);
        end if;
    end loop;
    insert into t values ( l_name, l_text );
end reformat;
/


And then given an input file like T.DAT (dashes are not part of the file):

--------------------------------------------------------
NAME
text text text text text text text
text text text text text text text
text text text text text text text

NAME2
more text more text more text more text
more text more text more text more text
more text more text more text more text
more text more text more text more text
-----------------------------------------------------------

and a control file like:

LOAD DATA
INFILE t.dat
INTO TABLE temp
replace
( seqno recnum,
text position(1:1024)
)

you can load the file up, run reformat and then truncate temp.
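
For example, a rough sketch of the whole sequence (scott/tiger is just a placeholder for your own credentials, and the control file is assumed to be saved as t.ctl):

$ sqlldr userid=scott/tiger control=t.ctl

SQL> exec reformat
SQL> commit;
SQL> truncate table temp;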


Comments

SQL Loader

S. Rahman, March 08, 2002 - 11:05 pm UTC

Thank you.

SQL Loader

S. Rahman, March 09, 2002 - 3:47 am UTC

Thank you for your excellent solution.
But would you please explain some points?
I cannot understand why the following line is used in the control file:
text position(1:1024).
Why do you use 'position' here?

I did as per your instructions.
But I faced some problems.
When I ran SQL*Loader, data was inserted from the text file into the TEMP table.
Suppose 40 rows were inserted into the TEMP table.
But when I executed the REFORMAT procedure, only one row was inserted into
the T table from the TEMP table.

I have understood the reason.
Actually the text file is exactly as follows:
=========================================
LANGUILLI
Come on, girl-san. Tell me how you say that
one more time.

BIENSTOCK
Hey, why don't you ask her how to say
'jailbreak'...in Vietnamese?

BELETKY
Look at him man, who does he think he's
kidding over there?
=============================================
Here I give the sample for 3 rows.

Would you please give me solution?

Tom Kyte
March 10, 2002 - 6:22 pm UTC

I used position because I was not loading delimited data -- I wanted to load an entire line. I assume your lines of text are less than or equal to 1k in length.


Well, after looking at what gets loaded in TEMP, I simply rewrote reformat to accommodate what your input loads:

set define off

create or replace procedure reformat
as
    l_name  varchar2(80);
    l_text  varchar2(32000);
begin
    for x in ( select * from temp order by seqno )
    loop
        if ( x.text is null )
        then
            if ( l_name is not null )
            then
                insert into t values ( l_name, l_text );
            end if;
            l_name := null;
            l_text := null;
        elsif ( l_name is null )
        then
            l_name := rtrim( ltrim( x.text ) );
        else
            l_text := l_text || x.text || chr(10);
        end if;
    end loop;
    insert into t values ( l_name, l_text );
end reformat;
/
set define on


should do it (did for that file). Straightforward "programming", you can adjust as necessary.


But problem when Data loading

S. Rahman, March 09, 2002 - 9:59 pm UTC

Thank you for your good solution.
Could you please explain some points about the procedure and control file?
Why do you use text position(1:1024) in the LOAD statement of the control file?

I could load data into the TEMP table, but when I executed the
REFORMAT procedure, only one row was inserted into the T table.
The text file is exactly like the following:
(= is not the content of text file)
===============================================
LANGUILLI
Come on, girl-san. Tell me how you say that one more time.
BIENSTOCK
Hey, why don't you ask her how to say 'jailbreak'...in Vietnamese?
BELETKY
Look at him man, who does he think he's kidding over there?
=====================================================
The content for the 2nd record starts after the 'carriage
return' of the 1st record, and so on.

Please give me solution.

With best regards,

clob insertion using sqlloader from comma delimited file

A reader, April 02, 2003 - 12:23 pm UTC

I have a comma-delimited data file like:
1,"abc def efg",3,4
2,"fff degf efg",7,2
3,"xcx def yu",3,5

The fields are comma-separated. The second field is to be inserted as a CLOB into the table. How do I do it using SQL*Loader?



Tom Kyte
April 02, 2003 - 12:57 pm UTC

ops$tkyte@ORA920> drop table t;

Table dropped.

ops$tkyte@ORA920> create table t ( a int, b clob, c int, d int );

Table created.

ops$tkyte@ORA920> !cat t.ctl
load data
INFILE *
INTO TABLE t
APPEND
FIELDS TERMINATED BY ','
(a,b,c,d)
begindata
1,"abc def efg",3,4
2,"fff degf efg",7,2
3,"xcx def yu",3,5

ops$tkyte@ORA920> !sqlldr / t

SQL*Loader: Release 9.2.0.3.0 - Production on Wed Apr 2 12:58:06 2003

Copyright (c) 1982, 2002, Oracle Corporation.  All rights reserved.

Commit point reached - logical record count 3

ops$tkyte@ORA920> select * from t;

         A B                                                                                         C          D
---------- -------------------------------------------------------------------------------- ---------- ----------
         1 "abc def efg"                                                                             3          4
         2 "fff degf efg"                                                                            7          2
         3 "xcx def yu"                                                                              3          5 

Assign row numbers in a detail record that will recycle for every master record?

Fred Shimaya, May 27, 2003 - 5:02 pm UTC

Tom,

Using SQL*Loader, is there a way to assign row numbers in a detail record that will recycle for every master record? For example:

Input file for detail table contains:

1,TEST1
1,TESTTEST
1,MYTEST
1,YOURTEST
1,EVERYONESTEST
2,HISTEST
2,HERTEST
3,HETEST
3,SHETEST

where the first column represents a unid and the second column represents a varchar value.

And the tables are:

create table MASTER_TABLE
(MASTER_UNID NUMBER NOT NULL,
ACCOUNT_NAME VARCHAR2(30))
/

create table DETAIL_TABLE
(MASTER_UNID NUMBER NOT NULL,
LINE_NUMBER NUMBER NOT NULL,
SOMETHING VARCHAR2(10))
/

So, using the input file for the detail table, I want SQL*Loader to load the data with the result looking like:

1,1,TEST1
1,2,TESTTEST
1,3,MYTEST
1,4,YOURTEST
1,5,EVERYONESTEST
2,1,HISTEST
2,2,HERTEST
3,1,HETEST
3,2,SHETEST

We are currently using 8i, so I hope there is a way to do this in 9i.

Thanks.

Fred

Tom Kyte
May 27, 2003 - 5:52 pm UTC

Nope and the concept is sort of "foreign" to me. Seems you just want to be able to "order them". the concept of 1, 2, 3, 4 -- just doesn't sit properly in a relational database.

(but in 9i, with an external table and using INSERT instead of the command line tool sqlldr, we would be able to do this with analytics and ROW_NUMBER(), but then again, I would just use ROW_NUMBER() upon output always -- and just use an increasing sequence)
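
For illustration only: assuming DETAIL_EXT is an external table created over the input file, and assuming it exposes some column (the hypothetical SEQ_IN_FILE here) that reflects the order of the lines in the file, the 9i insert could look roughly like:

insert into detail_table ( master_unid, line_number, something )
select master_unid,
       row_number() over ( partition by master_unid order by seq_in_file ),
       something
  from detail_ext;

row_number() restarts at 1 for each master_unid, which is the "recycling" numbering asked about.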

Ok...

Fred Shimaya, May 27, 2003 - 6:46 pm UTC

Thanks. I thought about using EXTERNAL TABLES as an alternative. I just wanted to ask you that question hoping that there is a way in SQL*Loader to do that. The reason why it appears "foreign" is that we are migrating data from a network (CODASYL) database to Oracle 9i. To perform the unload, we are using DBO/UNLOAD, which does not offer much in creating the sequences we need to establish order. In an RDBMS, the order in the input file is moot when loading data into tables. The problem is that in CODASYL, the records are all tied internally via "sets". You can have no keys identified to establish the order. The order is internal. So when the records are loaded into the RDBMS, my concern was how we can ensure that the order is maintained. The only way to do that, it seems, is to provide sequencing. That's why I asked.

Your suggestion of using EXTERNAL TABLE helped reaffirm my thoughts, though. Thanks!

Fred

Tom Kyte
May 27, 2003 - 7:59 pm UTC

you only need a sequence then - you need not have it "reset to one" for each detail group -- you just need it to increase with each insert.

A sequence is all you need.

row_number() -- that implies SORTING to assign the number and you probably don't have a sort key (else this would be moot)



True...

Fred Shimaya, May 27, 2003 - 10:48 pm UTC

I thought about that too. Using just sequence in the SQL*Loader. But we have millions of detail records. I am wondering if a sequence reaching over several million is a waste. What impact, if any, does that have on performance? Sorry to keep bugging you about this.

Fred

Tom Kyte
May 28, 2003 - 6:52 am UTC

sequences have 27 digits.

It would take millions of millions of years to exhaust one at thousands of retrievals per second.

they are the correct thing to use. when you create it -- do this


create sequence s cache 10000;


do your load and then set the cache back down to say 100 or so. It'll be the fastest way to generate a highly concurrent -- yet unique -- increasing number for you.
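
For example, after the load completes, something like this would set it back (assuming the sequence is named S as above):

alter sequence s cache 100;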

Thank You!

Fred Shimaya, May 28, 2003 - 3:31 pm UTC

Thanks. Your responses really help me out a lot.

concatenate and continueif

Reader, July 03, 2003 - 5:19 pm UTC

Tom, if time permits, could you please help me understand the usage of CONCATENATE and CONTINUEIF when using sqlldr. The doc discusses 'logical record-forming' strategies. I am having a hard time understanding how this can be done using CONCATENATE and CONTINUEIF. I would appreciate a demo, if possible. Thanks. Happy July 4th.

Tom Kyte
July 03, 2003 - 8:15 pm UTC

concatenate is useful to take N records in a file and make them one:

suppose you have:

this is
all the
same record
this is
all the
same record

concatenate 3 would make us see

this is all the same record
this is all the same record

continueif is when you have a marked up file, like this:

this is
+one record that spans
+three lines
this is
+another record
+that spans
+four lines

here, continueif the first position = '+' would result in

this is one record that spans three lines
this is another record that spans four lines


Thanks

Reader, July 03, 2003 - 9:47 pm UTC


referring to a non-targetted input column in the control file

david, August 12, 2003 - 3:27 pm UTC

hi, Tom.

we need to take the first column of a comma-delimited input file and perform a calculation involving another column in the input file, but this second column is not actually being loaded into a target column. From everything I've read (and I've been reading all day), I haven't been able to find a way to refer to this input column in a delimited file unless it has a target column (ie. :<target column>).

other than asking the user to give me a fixed width input file, or worse yet, asking them to do the calculation themselves, or making two passes on the input, do you know of a way to do this?

as always, thanks.

Tom Kyte
August 13, 2003 - 9:50 am UTC

see http://asktom.oracle.com/~tkyte/SkipCols/index.html and the delimited package for an idea.
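
For illustration only (the linked page and the delimited package are the real reference; the column name here is made up), the idea is to map the entire input line to one field with POSITION and then pull the individual comma-delimited words out in a SQL expression, so you can do math involving fields that have no target column of their own:

( result_col position(1:4096)
    "to_number(delimited.word(:result_col,1,chr(34),chr(44))) *
     to_number(delimited.word(:result_col,5,chr(34),chr(44)))"
)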

I think I get it

david, August 13, 2003 - 10:02 am UTC

Very, very nice.

had to modify it a little to do what i needed

david, August 13, 2003 - 11:12 am UTC

combined with your 'strip_bad' and changed word to accept 2 column arguments so that I could do the math.

Thanks so much!

type vcArray is table of varchar2(2000) index by binary_integer;
g_words vcArray;
g_empty vcArray;
g_last_string varchar2(4096);
g_bad_chars varchar2(256);
g_a_bad_char varchar2(256);

function de_quote( p_str in varchar2, p_enc_by in varchar2 ) return varchar2
is
begin
return replace( ltrim( rtrim( p_str, p_enc_by ), p_enc_by ),
p_enc_by||p_enc_by, p_enc_by );
end de_quote;

procedure parse( p_str in varchar2,
p_delim in varchar2,
p_sep in varchar2 )
is
l_n number default 1;
l_in_quote boolean default FALSE;
l_ch char(1);
l_len number default nvl(length( p_str ),0);
begin
if ( l_len = 0 ) then
return;
end if;
g_words := g_empty;
g_words(1) := NULL;
for i in 1 .. l_len loop
l_ch := substr( p_str, i, 1 );
if ( l_ch = p_delim ) then
l_in_quote := NOT l_in_quote;
end if;
if ( l_ch = p_sep AND NOT l_in_quote ) then
l_n := l_n + 1;
g_words(l_n) := NULL;
else
g_words(l_n) := g_words(l_n)||l_ch;
end if;
end loop;
for i in 1 .. l_n loop
g_words(i) := de_quote( g_words(i), p_delim );
end loop;
end parse;

function word( p_str in varchar2,
p_n1 in varchar2,
p_n2 in varchar2,
p_enclosed_by in varchar2 default '''',
p_separated_by in varchar2 default ',' ) return number
is
begin
g_last_string := p_str;
parse( p_str, p_enclosed_by, p_separated_by );
--begin changes **********
if length(g_words(p_n1)) = length(delimited.strip_bad(g_words(p_n1))) then
if length(g_words(p_n2)) = length(delimited.strip_bad(g_words(p_n2))) then
return (to_number(g_words(p_n1)) * 10000) + to_number(g_words(p_n2));
else
return 0;
end if;
else
return 0;
end if;
-- end changes ********
exception
when no_data_found then return NULL;
end;

function strip_bad( p_string in varchar2 ) return varchar2
is
begin
return replace(
translate( p_string,
g_bad_chars,
g_a_bad_char ),
substr( g_a_bad_char, 1, 1 ),
'' );
end strip_bad;

begin
for i in 0..255 loop
if ( i not between ascii('0') and ascii('9') )
then
g_bad_chars := g_bad_chars || chr(i);
end if;
end loop;
g_a_bad_char := rpad(
substr(g_bad_chars,1,1),
length(g_bad_chars),
substr(g_bad_chars,1,1));
end;

package vs. package body in 'delimited'

David, August 15, 2003 - 8:41 am UTC

How is it that the package body has functions that aren't declared in the package spec? I didn't realize that was valid, but what's the reason for doing it this way?

Thanks as always.

Tom Kyte
August 15, 2003 - 10:13 am UTC

just like everyone has deep dark secrets they keep bottled up, packages are designed to let you "expose" that which you want exposed, whilst keeping private that which is not designed to be "exposed"


(delimited btw is at http://asktom.oracle.com/~tkyte/SkipCols/delimited.sql )


Now in that spec, you see only:

a) a function "word"

that is the external interface, the only thing the rest of the world needs to know about.

In the body, you see lots of stuff:


create or replace package body delimited
as
type vcArray is table of varchar2(2000) index by binary_integer;

g_words vcArray;
g_empty vcArray;
g_last_string varchar2(4096);


there is a private type. I wanted an array. No one else on the planet needs or wants to know about this internal datatype. Additionally -- I have state (so we can parse a line of text once and use it over and over and over). This state is private as well -- not exposed. No one but my package body can read/write this state...



function de_quote( p_str in varchar2, p_enc_by in varchar2 ) return varchar2
is
begin
return replace( ltrim( rtrim( p_str, p_enc_by ), p_enc_by ),
p_enc_by||p_enc_by, p_enc_by );
end de_quote;



there is a private function -- de_quote. I use it internally. I chose NOT to expose it. I don't have to document it, I don't have to support it for other usages. It does what I want it to do, period. Could it have been in a support "utility" package -- sure. But it is small and for self containment, I put it here.


Then there is procedure parse. That is really specific to this package. It is private code that no one else need know about or call. They could only get in trouble if they did. Hence it is hidden, visible ONLY to code in the body itself.


It is a method of encapsulation. I put lots of private functions/procedures in my body so my code remains modular (term dates me doesn't it...), easy to understand, bite sized. I don't clutter up the interface, the spec, with stuff that is specific to my package.
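
A minimal sketch of that idea (the parameter names here are only illustrative; the real spec is in delimited.sql at the URL above):

create or replace package delimited
as
    -- the one public entry point; the type, globals, de_quote and parse
    -- live only in the body and stay invisible to callers
    function word( p_str          in varchar2,
                   p_n            in number,
                   p_enclosed_by  in varchar2 default '"',
                   p_separated_by in varchar2 default ',' ) return varchar2;
end delimited;
/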

How To load following data

Vikas Sharma, September 15, 2003 - 3:02 pm UTC

Hi Tom,

I would like to load the following data from the txt file.

=========================

first rec >>HILLAMD 980321 8/2/2002 00:00:00 {Resign
due change in roster oc:Richard Thomas King

Roster was changed a fortnight ago, which resulted in his being unable to meet early turn start.}

next rec >> ALLEND 7755553 8/2/2002 00:00:00 {RAIL EUROPE - WEST MALLING!! (oc:Heather Stoate)
Heather was going to be
How good is that?!

cc: SCOTTK, ALLEND}
third_record starts here >>
.....
===============

The braces are just there to show that it is one column, and this column contains carriage returns as well.

I can use fixed positions for this, but the problem is how I can tell the loader where the end of the record is.

Besides the last column, I have three more columns in the data file.

Is there any way to load data stored in this format?

Many thanks in advance.

Regards,
Vikas Sharma


Tom Kyte
September 15, 2003 - 4:15 pm UTC

show me the "real" file -- i'm sure there isn't a "first rec >>" / "next rec >>"

if the braces are there -- that would be good.

tell me, how would YOU know the end of record was found?

RE- How to...

Vikas Sharma, September 15, 2003 - 7:18 pm UTC

Hi Tom,

You are right. There is no "first rec >>"; I had put it there to make it clear to you. Likewise there are no braces; I put those in to make things clear. Sorry for the confusion.

I don't know where the end of the record is, as there is no record terminator. Can I take the maximum width of one line (NOT record), which is 774 characters?

I am including the first record from the data file I have. It may not be clear, but I have put it there anyway. If you want, I can email it to you (some part, 10 KB or so, as it is large) if you give me the email address.

Here is the first record and the second rec =============.

SMITHJ MMG 19/6/1998 00:00:00 14:15 1 0 membership clean up Someone emailed me at 1pm today re the above but although it appears in my callender I get an error when I try to read it so I don't know who it was from or what it said (obviously).

Do you know who you are ?

Oh and by the way to anyone that doesn't work on the third floor and hasnt herd this a thousand times already - I am on holiday for the next two weeks.

Bye Bye

cc: SMITHJ, +RECRUIT, ALLEND, ALLENJ, ANDERSON, ASHFORDD, CAMEROND, CANAVANP, CHALKLEY, CHAPMANP, CHESHERA, DAVIESN, DICKR, DOHERTY /K-A]$%%E9E= 2SMITHJ 19/6/1998 00:00:00 14:09SMITHJ 19/6/1998 00:00:00 14:1864VJ28Z(>0<+= 2

WARDF MMG 22/6/1998 00:00:00 12:39 1 0 STAFF REPRESENTATIVES MEETING Can you please advise when you are available on the following days so that I can arrange a meeting:-

week commencing 27 July - 31 July
3 August
4 August
5 August
10 August
11 August
13 August
14 August

Proposed agenda is as follows:-

1. IiP - progress report
1.1 Progress report (Management Side)
1.2 Personal development reviews (questions and comments) concerning the procedure from Staff Side
2. Business Plan (Management Side)
3. Media Monitoring (Management Side) (Nancy Platts to be /Z3H-S%D]MY-71ORMESJ 22/6/1998 00:00:00 12:05WARDF 22/6/1998 00:00:00 12:396532GYG&&MZX-71



==============
Every record starts with a name in uppercase; as above, the first record starts with SMITHJ and the next record with WARDF.
Hope I am clear this time.

Thanks & Regards

Vikas Sharma

Tom Kyte
September 15, 2003 - 8:15 pm UTC

are the records separated by anything a COMPUTER can understand? like "two newlines" or something? anything -- anything at all?



RE - How To

Vikas Sharma, September 15, 2003 - 8:49 pm UTC

Hi Tom,

The records are not separated by anything. I think it may not be possible to make this clear here. Maybe it would be possible if I could mail you some part of the data file.

Sorry to trouble you again.

Thanks,

Vikas

Tom Kyte
September 16, 2003 - 7:45 am UTC

lets put it this way -- if you could not describe in text how to parse the file definitively, we cannot codify it.

it looks like free form text with no way to tell a computer "this is record 1", "this is record 2"

how to trim the spaces from flat file.

A reader, September 16, 2003 - 2:34 am UTC

Tom,

I wanted to know this from you,

I have a variable-length delimited file, delimiter (|). I am able to read all the fields except the last one; it is failing at the last field in the flat file. The last field should be only 10 characters (the table column is a date, YYYY-MM-DD), but because of trailing spaces SQL*Loader reads it as 10 + spaces characters, and it fails with the following error message:

Record 45: Rejected - Error on table TB_MWP020_CPMIDS_STAGE, column ORDER_CREATION_DT.
Field in data file exceeds maximum length

CTL file..

LOAD DATA
APPEND
INTO TABLE TB_MWP020_CPMIDS_STAGE
FIELDS TERMINATED BY "|"
trailing NULLCOLS
(
PROJECT_ID CHAR(30) "LTRIM(RTRIM(:PROJECT_ID))",
REQUISITION_NUM_DSC CHAR(8) "LTRIM(RTRIM(:PROPOSAL_NUM_DSC))",
.
.
.
.
.
ORDER_CREATION_DT DATE 'YYYY-MM-DD'"LTRIM(RTRIM(:ORDER_CREATION_DT))"
)

Please help me in resolving this ...

Thanks for you time and consideration




Tom Kyte
September 16, 2003 - 8:14 am UTC

you'll have to give me a "for example" set of data.

i cannot reproduce the issue as described (with 9iR2 anyway).

I can suggest you try

ORDER_CREATION_DT "to_date( LTRIM(RTRIM(:ORDER_CREATION_DT)), 'yyyy-mm-dd' )"

that is, don't have sqlldr do the date, have sql do the date.

Sample data and ctl file

A reader, September 16, 2003 - 8:40 am UTC

BIG HORN|TD100334|10601751000|3420701|GEORGIA POWER|REQUEST DATE TEST1|120.00|2002-07-25|N|34207TD|103013733|103013809|103013733||351|106|A|GEORGIA POWER|DISBURSEMENT ACCTG|PO BOX 3115|ATLANTA GA||ATLANTA|GA|30302|US|PP|FOB|GEORGIA POWER|DISBURSEMENT ACCTG|PO BOX 3115|ATLANTA GA||ATLANTA|GA|30302|US|09||17|2002-07-25
BIG HORN|TD100335|10601751000|3420701|GEORGIA POWER|REQUEST DATE TEST2|240.00|2002-07-25|N|34207TD|103013733|103013809|103013733||351|106|A|GEORGIA POWER|DISBURSEMENT ACCTG|PO BOX 3115|ATLANTA GA||ATLANTA|GA|30302|US|PP|FOB|PACIFICORP ELEC|213 EAST 3RD STREET||LOVELL WY||LOVELL|WY|82431|US||||2002-07-25


ctl file:

LOAD DATA
APPEND
INTO TABLE TB_MWP020_CPMIDS_STAGE
FIELDS TERMINATED BY "|"
trailing NULLCOLS
(
PROJECT_ID CHAR(30) "LTRIM(RTRIM(:PROJECT_ID))",
REQUISITION_NUM_DSC CHAR(8) "LTRIM(RTRIM(:REQUISITION_NUM_DSC))",
PROPOSAL_NUM_DSC CHAR(12) "LTRIM(RTRIM(:PROPOSAL_NUM_DSC))",
CUSTOMER_INDEX_NUM_DSC CHAR(9) "LTRIM(RTRIM(:CUSTOMER_INDEX_NUM_DSC))",
CUSTOMER_NAM CHAR(15) "LTRIM(RTRIM(:CUSTOMER_NAM))",
CUSTOMER_ORDER_ID_DSC CHAR(30) "LTRIM(RTRIM(:CUSTOMER_ORDER_ID_DSC))",
TOTAL_ORDER_VAL_AMT integer external,
CUST_ORDER_DT DATE 'YYYY-MM-DD'"LTRIM(RTRIM(:CUST_ORDER_DT))",
PROGRESS_PAY_IND_CD CHAR(1) "LTRIM(RTRIM(:PROGRESS_PAY_IND_CD))",
CHARGE_TO_NUM_DSC CHAR(9) "LTRIM(RTRIM(:CHARGE_TO_NUM_DSC))",
PROJECT_LEAD_NAM CHAR(9) "LTRIM(RTRIM(:PROJECT_LEAD_NAM))",
SALES_ENGG_DSC CHAR(9) "LTRIM(RTRIM(:SALES_ENGG_DSC))",
TD_SPECIALIST_DSC CHAR(9) "LTRIM(RTRIM(:TD_SPECIALIST_DSC))",
SE_USER_NUM_DSC CHAR(9) "LTRIM(RTRIM(:SE_USER_NUM_DSC))",
OFFICE_TAKING_CD CHAR(10) "LTRIM(RTRIM(:OFFICE_TAKING_CD))",
DISTRICT_CD CHAR(3) "LTRIM(RTRIM(:DISTRICT_CD))",
PAYMENT_TERMS_CD CHAR(2) "LTRIM(RTRIM(:PAYMENT_TERMS_CD))",
BILL_TO_CUST_NAM CHAR(32) "LTRIM(RTRIM(:BILL_TO_CUST_NAM))",
BILL_TO_ADDRESS_LN_1 CHAR(32) "LTRIM(RTRIM(:BILL_TO_ADDRESS_LN_1))",
BILL_TO_ADDRESS_LN_2 CHAR(32) "LTRIM(RTRIM(:BILL_TO_ADDRESS_LN_2))",
BILL_TO_ADDRESS_LN_3 CHAR(32) "LTRIM(RTRIM(:BILL_TO_ADDRESS_LN_3))",
BILL_TO_ADDRESS_LN_4 CHAR(32) "LTRIM(RTRIM(:BILL_TO_ADDRESS_LN_4))",
BILL_TO_CITY_NAM CHAR(28) "LTRIM(RTRIM(:BILL_TO_CITY_NAM))",
BILL_TO_STATE_NAM CHAR(2) "LTRIM(RTRIM(:BILL_TO_STATE_NAM))",
BILL_TO_ZIP_CD CHAR(9) "LTRIM(RTRIM(:BILL_TO_ZIP_CD))",
BILL_TO_COUNTRY_NAM CHAR(2) "LTRIM(RTRIM(:BILL_TO_COUNTRY_NAM))",
SHP_PMT_METH_CD CHAR(2) "LTRIM(RTRIM(:SHP_PMT_METH_CD))",
TRANS_TERM_CD CHAR(3) "LTRIM(RTRIM(:TRANS_TERM_CD))",
SHIP_TO_CUST_NAM CHAR(32) "LTRIM(RTRIM(:SHIP_TO_CUST_NAM))",
SHIP_TO_ADDRESS_LN_1 CHAR(32) "LTRIM(RTRIM(:SHIP_TO_ADDRESS_LN_1))",
SHIP_TO_ADDRESS_LN_2 CHAR(32) "LTRIM(RTRIM(:SHIP_TO_ADDRESS_LN_2))",
SHIP_TO_ADDRESS_LN_3 CHAR(32) "LTRIM(RTRIM(:SHIP_TO_ADDRESS_LN_3))",
SHIP_TO_ADDRESS_LN_4 CHAR(32) "LTRIM(RTRIM(:SHIP_TO_ADDRESS_LN_4))",
SHIP_TO_CITY_NAM CHAR(28) "LTRIM(RTRIM(:SHIP_TO_CITY_NAM))",
SHIP_TO_STATE_NAM CHAR(2) "LTRIM(RTRIM(:SHIP_TO_STATE_NAM))",
SHIP_TO_ZIP_CD CHAR(9) "LTRIM(RTRIM(:SHIP_TO_ZIP_CD))",
SHIP_TO_COUNTRY_NAM CHAR(2) "LTRIM(RTRIM(:SHIP_TO_COUNTRY_NAM))",
DESTINATION_CD CHAR(10) "LTRIM(RTRIM(:DESTINATION_CD))",
TAX_EXEMPT_FLAG_DSC CHAR(1) "LTRIM(RTRIM(:TAX_EXEMPT_FLAG_DSC))",
TAX_CD CHAR(3) "LTRIM(RTRIM(:TAX_CD))",
ORDER_CREATION_DT "to_date( LTRIM(RTRIM(:ORDER_CREATION_DT)), 'yyyy-mm-dd' )"

)


Please see, the version of oracle is 8.1.7.

I have the same problem even with other datatypes, I mean not only date but also character or number.


Thank you so much for your prompt reply

Tom Kyte
September 16, 2003 - 8:51 am UTC

ok, try


ORDER_CREATION_DT char(4000) "to_date( LTRIM(RTRIM(:ORDER_CREATION_DT)),
'yyyy-mm-dd' )"


looks like the whitespace could exceed 255 characters -- the default character buffer size in sqlldr. just make the default buffer "much larger"

A reader, September 16, 2003 - 9:03 am UTC

Tom,
thanks for your prompt replies..

Sorry, even this time it throws the error.

Also, how do I set the buffer size for sqlldr?

pls let me know
thanks

Tom Kyte
September 16, 2003 - 9:14 am UTC

when I did it without char(4000), it failed.

when I did it with char(4000), it succeeded.

using:

TAX_EXEMPT_FLAG_DSC CHAR(1) "LTRIM(RTRIM(:TAX_EXEMPT_FLAG_DSC))",
TAX_CD CHAR(3) "LTRIM(RTRIM(:TAX_CD))",
ORDER_CREATION_DT char(4000) "to_date( LTRIM(RTRIM(:ORDER_CREATION_DT)), 'yyyy-mm-dd' )"
)


corrects it. verify that you are doing that (and that 4000 is in fact "long enough" given your data).

Great solution .. Thanks Thanks Thanks ....

A reader, September 16, 2003 - 10:32 am UTC

Tom,

Thank you so much. I have no words to express my satisfaction.

You are simply great.

Thanks

Import from Access

Vivek, September 19, 2003 - 7:24 am UTC

Tom,

I need to import table data from Access. I have converted the data into a text file and written a ctl file to load it. The problem I am facing is that one of the fields is of a "free text" nature and has carriage returns and line feeds in it.
When I run the loader on the text file with CR and LF, I get strange results.
MY sample text file is
---------------
2,"Aberdaron","03405","7714","529316",,"Weather Station (10m lattice tower), Garreg Fawr, Aberdaron, Pwllehli, Gwynedd",0,1,1,0,0,1,0,0,0,0,0,0,0,0,0,"1993",,"Lattice ( unknown make)",10,"Unknown",95,"UK","SH","2","152","3","248",52,47,21,"N",4,44,31,"W","Gwilym Roberts","01758 760357",,"01758 760679",,,,"South

Ht 94mtr
3m SW of Aberdaron

(SAWS Line PW71408)

Distance from Aberporth 140m. 4.0 Hours

Directions:
From Pwllheli on A497, A499.
At Llanbedrog turn R onto B4413, on edge of Aberdaron take R turn by telephone box, towards Uwchmynydd.
Past crossroads then into valley and over bridge - uphill to staggered crossroads - straight on (not left towards Aberdaron)
After 300 mr turn L & after 1km take second of two close L turns
In 300 mr turn R to farmyard - L through gate up steep track to the site.

Site is immediately W of Garreg Fawr Farm about 4km W of Aberdaron.

Alternativly turn R after narrow bridge in Aberdaron, 1m over hill turn L at crossroads, after 300m turn L etc.

Anemometer is on a 10m climbable mast.
There are two signal junction boxes in the TB10 position (Pressure sensor box)

The BT wiring is taken from its in/out box via the second box to the SAWS.","14/04/106","ON100F",,1161,0

-------2ND RECORD----------------

38,"Barra","03035","0510","738179","EGPR","Barra Airport, Isle of Barra, Western Isles, HS9",0,1,0,1,1,0,0,0,0,0,0,0,0,0,0,"1987",,"WEC",10,"Client/Others",3,"UK","NF","0","694","8","060",57,1,59,"N",7,27,0,"W","Mrs Janet MacLean","01871-890-283",,,,,,"Scotland

NOTE: These notes refer to the old Barra Aux station before the SAMOS installation. It has been assumed that the SAMOS is in the same location and that the AUX station is now discontinued.

Contact (LoganAir) for free tickets
0141-889-1311 Xtn 237

Grid Ref NF 694 060 57 02'N 7 27'W Station Ht 3m/10ft
Mag Variation 9.5 W in 1993
Contact Capt Hauzzard 041-889-1311 x237 To Authorise Free Flight Travel for 6 Monthly Routines.
Depending on timetables it may not be possible to make a day visit. There are 2 hotels in Castlebay. A taxi fare if no lift available is approx £8 single. If the weather proves unsuitable on the day LoganAir are usually quite happy to rebook you for another day.

Wind Mast is a sectional guyed pole which requires the assistance of Aiport Fire Service to lower. Advance courtesy call of ETA helps to prepare them! Serco report this mast to be a 10m WEC tower in Jan 2003.

Loganair usually supply you with cups of tea but minor beverages and biscuits etc are available in the small departure hall in the same building.

Set of wind dials are on the office wall. The 50V XFMR and substitution box are in the porch nearest to the mast.
MAGNETIC VARIATION +9.5 DEGREE IN 1993 from aviation map.
Calculated at 9.0 in 1994 for compass use.

Barometer 3 mtr Runway 0 mtr!",,"ON100F",,82,0
--------------------------------------------------
and log for this is

Record 1: Rejected - Error on table SITE, column NOTES.
second enclosure string not present
Record 2: Discarded - all columns null.
Record 5: Discarded - all columns null.
Record 7: Discarded - all columns null.
Record 4: Rejected - Error on table SITE.
ORA-01401: inserted value too large for column

Record 6: Rejected - Error on table SITE.
ORA-01401: inserted value too large for column
...
Record 48: Rejected - Error on table SITE.
ORA-01401: inserted value too large for column


Table SITE:
2 Rows successfully loaded.
29 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
17 Rows not loaded because all fields were null.
----------------------------------------------
But when i run without that field its fine
and log is like
----------
Table SITE:
957 Rows successfully loaded.
0 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
-------------------

How can I handle this field's data within SQL*Loader?

Thanks

Vivek


Tom Kyte
September 20, 2003 - 5:24 pm UTC

well, first, you don't need to export it at all, you can just:

insert into oracle_table select * from access_table@access_database;


http://asktom.oracle.com/pls/asktom/f?p=100:11:::::P11_QUESTION_ID:4406709207206



but unless you can have access put an END OF RECORD marker, there is no way to tell when the "end of record" is encountered. so the default end of record marker is a newline -- hence it won't work.

if you could make each line end with some special character, you can use the STR directive

search for


str sqlldr

on this site for details.
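
As a rough sketch of what that looks like (the terminator character, file and table names are made up; X'7E0D0A' is a '~' followed by carriage return/linefeed):

LOAD DATA
INFILE 'site.dat' "str X'7E0D0A'"
INTO TABLE site
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
( ... )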

just to know how a column can be auto increase using sequence number

vipin, December 19, 2003 - 5:29 am UTC

I am looking for information on how we can use a sequence's nextval for a column in the control file itself...

any help.........

Tom Kyte
December 19, 2003 - 7:23 am UTC

ops$tkyte@ORA9IR2> create table t ( id int, a int, b varchar2(15), c int, d int );
 
Table created.
 
ops$tkyte@ORA9IR2> create sequence S;
 
Sequence created.
 


ops$tkyte@ORA9IR2> !cat t.ctl
load data
INFILE *
INTO TABLE t
APPEND
FIELDS TERMINATED BY ',' trailing nullcols
(a,b,c,d, ID "s.nextval")
begindata
1,abc def efg,3,4
2,fff degf efg,7,2
 
ops$tkyte@ORA9IR2>
ops$tkyte@ORA9IR2> !sqlldr / t
 
SQL*Loader: Release 9.2.0.4.0 - Production on Fri Dec 19 07:24:18 2003
 
Copyright (c) 1982, 2002, Oracle Corporation.  All rights reserved.
 
Commit point reached - logical record count 2
 
ops$tkyte@ORA9IR2> select * from t;
 
        ID          A B                        C          D
---------- ---------- --------------- ---------- ----------
         1          1 abc def efg              3          4
         2          2 fff degf efg             7          2
 

just to know how a column can be auto increase using sequence number

vipin, February 10, 2004 - 1:34 am UTC

Really a great solution

error

Ak, March 30, 2004 - 1:57 am UTC

Hi,
I have a couple of problems while using SQL*Loader to convert data from an Access DB to an Oracle DB. This is what I did:
1) Took an export of the Access table in .txt format, which has one field of memo datatype.
2) In Oracle, used the LONG datatype (and tried CLOB as well) for the equivalent column. These two datatypes don't seem to be working... here is the error:

Column Name Position Len Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
OVERFLOWID FIRST * , O(") CHARACTER
OVERFLOWTEXT NEXT * , O(") CHARACTER

Record 1: Rejected - Error on table RQOVERFLOWTEXT, column OVERFLOWTEXT.
Field in data file exceeds maximum length
Record 2: Rejected - Error on table RQOVERFLOWTEXT, column OVERFLOWTEXT.
Field in data file exceeds maximum length

3) Another problem I have, for another table: when I export the Access DB in txt format, for one table the second-to-last field spills onto the next line, because of which SQL*Loader fails. For example, my txt file will look like
1,2,3,"this is
testing",4

Is there any way in Oracle with which I can tell SQL*Loader to read the next line? I can't change it manually because this file is very big.

Thanks

Cheers
Ak




Tom Kyte
March 30, 2004 - 7:57 am UTC

in the control file, specify the maximum length -- else it is char(255)


c1,
c2,
c3 char(100000),
c4,
c5,
.....


in order to deal with that field that goes over lines, you would have to use a record terminator. If you can get the file to be generated like this:

1,2,3,"this is
testing",4^


(eg: end of line is ^\r\n -- not just \r\n) we can work with it.


If not, you might consider just setting up the heterogeneous gateway so you can load from access using SQL:


insert into oracle_table select * from access_table;

http://docs.oracle.com/docs/cd/A87860_01/doc/server.817/a76960/hs_genco.htm#173




Delimited word package problems in 9i

Mark Kiel, June 15, 2004 - 12:26 pm UTC

Tom,

Thanks in advance for your help!

We are currently migrating from 8.1.7 to 9i and have had problems with sql*loader in the new environment. Specifically, when loading data using SQL*Loader and the delimited word package, which worked great in 8.1.7, we are seeing numbers transposed across lines.

The following load file:

9703,13
9704,1
3709,37
6571,10
6574,5
3810,1

Loads correctly in 8.1.7 as

col1 col2
19703 13
19704 1
13709 37
16571 10
16574 5
13810 1

but incorrectly in 9i as

col1 col2
19703 1
19704 13
13709 3
16571 1
16574 50
13810 16

The following control file is used during the load:
options (skip = 1)
Load Data
append into table pre_statload
( DS constant "BKCD REF",
YEAR_S constant 2004,
ACCUM_TYPE constant D,
CONSOLIDAT constant 100,
FINANC_ID constant 10000,
ORG_ID position(1:4096) "10000 + to_number(delimited.word(:ORG_ID,1,chr(34),chr(44)))",
GL_ACCT_ID constant 900316,
CMN_COA_ID constant 0,
ONE_MONTH_AMT position(1:1) "to_number(delimited.word(:ORG_ID,2,chr(34),chr(44)))",
TPCOA_ID constant 0,
TMCOA_ID constant 0,
PRCOA_ID constant 0,
ORG_UNIT_T position(1:1) "10000 + to_number(delimited.word(:ORG_ID,1,chr(34),chr(44)))",
BPCOA_ID constant 0)


For the following line:
ORG_ID position(1:4096) "10000 + to_number(delimited.word(:ORG_ID,1,chr(34),chr(44)))",

if we change position(1:4096) to position(1:1000) , the load works! Any idea why changing this number makes a difference??


Tom Kyte
June 15, 2004 - 5:29 pm UTC

can you help me reproduce?


ops$tkyte@ORA9IR2> !cat t.ctl
options (skip = 1)
Load Data
infile *
replace into table t
( x position(1:4000) "to_number(delimited.word(:x,1,chr(34),chr(44)))",
  y position(1:4000) "to_number(delimited.word(:y,2,chr(34),chr(44)))"
)
begindata
9703,13
9704,1
3709,37
6571,10
6574,5
3810,1
 
ops$tkyte@ORA9IR2> !sqlldr / t
 
SQL*Loader: Release 9.2.0.4.0 - Production on Tue Jun 15 17:11:08 2004
 
Copyright (c) 1982, 2002, Oracle Corporation.  All rights reserved.
 
Commit point reached - logical record count 5
 
ops$tkyte@ORA9IR2> select * from t;
 
         X          Y
---------- ----------
      9704          1
      3709         37
      6571         10
      6574          5
      3810          1
 

Skk

sundeep, August 04, 2004 - 3:53 pm UTC

HI TOM,
I really like all the solutions from you... I need some small help with the reformat procedure.
I have to insert the entire file content into the CLOB column; I can't do it using DBMS_LOB and BFILE due to some architecture issue between the database and the middle tier.
So I am planning to:
1) Load through SQL*Loader -- can you please let me know how this can be done? We are on 8.0...
2) Use the above reformat procedure...
But the issue I have with reformat is that, since I have data that exceeds 32000 characters, the procedure errors out. Can you provide a solution for how we can handle this, as the temp table has 7000 rows?
TIA
Regards,
S

Tom Kyte
August 05, 2004 - 8:41 am UTC

your sample data looks structured. no idea why you want to load as a clob and then run a procedure against it to "reformat". load it structured.

SKK

sundeep, August 04, 2004 - 3:56 pm UTC

The sample data to load is
203590 AVL 3485751 05-0695-03 OP PAS,MUX/DEMUX,BP,C/L/OSC,25dB/10dB ISOL MUX/DEMUX,SC/E2000/NO CONN,141x100x16mm 5731 JDS UNIPHASE, INC. BWDMY3TYCIP03-005 25-Jun-2003 I
282745 AVL 3341285 05-0886-01 OP PAS,WDM,1560.61-1558.17nm,100GHz,3.6dB IL,MPX,172x72x22mm 10967 AVANEX CORPORATION PEF3004XCCS09PX US 11-Jun-2003 A
282747 AVL 3341286 05-0887-01 OP PAS,WDM,1550.12-1552.52nm,100GHz,3.6dB IL,MPX,172x72x22mm 10967 AVANEX CORPORATION PEF3004XCCS11PX US 11-Jun-2003 A
282749 AVL 3341287 05-0888-01 OP PAS,WDM,1546.12-1548.51nm,100GHz,3.6dB IL,MPX,172x72x22mm 10967 AVANEX CORPORATION PEF3004XCCS12PX US 11-Jun-2003 A
282751 AVL 3341288 05-0889-01 OP PAS,WDM,1542.14-1544.53nm,100GHz,3.6dB IL,MPX,172x72x22mm 10967 AVANEX CORPORATION PEF3004XCCS13PX US 11-Jun-2003 A
282753 AVL 3341289 05-0890-01 OP PAS,WDM,1538.19-1540.56nm,100GHz,3.6dB IL,MPX,172x72x22mm 10967 AVANEX CORPORATION PEF3004XCCS14PX US 11-Jun-2003 A
282755 AVL 3341264 05-0891-01 OP PAS,WDM,1534.25-1536.61nm,100GHz,3.6dB IL,MPX,172x72x22mm 10967 AVANEX CORPORATION PEF3004XCCS15PX US 11-Jun-2003 A
282757 AVL 3341265 05-0892-01 OP PAS,WDM,1530.33-1532.68nm,100GHz,3.6dB IL,MPX,172x72x22mm 10967 AVANEX CORPORATION PEF3004XCCS16PX US 11-Jun-2003 A
282759 AVL 3341266 05-0893-01 OP PAS,WDM,1546.12-1562.23nm,16+1Chs,100GHz,6.2dB IL,MPX/MU 10967 AVANEX CORPORATION PMC3017XCCS01PX US 11-Jun-2003 A
282760 AVL 3341267 05-0894-01 OP PAS,WDM,1530.33-1544.53nm,16Chs,100GHz,4.0dB IL,MPX/MU 10967 AVANEX CORPORATION PFC4016XCCS01PX US 11-Jun-2003 A
282761 AVL 3345858 05-0895-01 OP PAS,WDM,1556.55-1554.13nm,100GHz,3.6dB IL,MPX,172x72x22mm 10967 AVANEX CORPORATION PEF3004XCCS10PX US 11-Jun-2003 A
282546 AVL 3828536 05-1009-01 OP PAS,INTRLVR,1529.16-1561.42nm,50GHz,100GHz,3.0dB IL,E2000,121x91x15mm 10967 AVANEX CORPORATION PMP2100DCCS01PX US 11-Jun-2003 A
282547 AVL 3828541 05-1010-01 OP PAS,INTRLVR,1577.86-1591.68nm,50GHz,100GHz,3.0dB IL,E2000,121x91x15mm 10967 AVANEX CORPORATION PMP2100DLCS01PX US 11-Jun-2003 A
282568 AVL 3994160 05-1022-01 OP PAS,SPLTR,1x2,50/50,1550,3.7dB IL,SM,NO CONN,3.6x70mm 10947 TYCO ELECTRONICS POWER SYSTEMS 0-107774-1 IN 16-Jun-2003 A
282484 AVL 2128997 07-1016-01 1X603, CPLR, 2.5GH, SM 6208 ANAREN MICROWAVE 1X603 US 10-Jun-2003 A
282500 AVL 1301743 08-0004-04 IC,GARRAY,D-PATH,RENO4,CQFP240 322 LSI LOGIC L2A1231 MY 30-Jun-2003 A
282501 AVL 2050153 08-0004-05 IC,GARRAY,D-PATH,RENO5,CQFP240 322 LSI LOGIC L2A1611 MY 30-Jun-2003 A
282503 AVL 10691 08-0006-01 IC,HDWR,TRIPOVT2 ,PQFP160 638 XILINX XC3390-PQ160C5086 PH,KR,JP,TW 11-Jul-2003 A
282504 AVL 10692 08-0007-01 IC,HDWR,C7000,DBUSCT,PQFP100 638 XILINX XC3342-PQ100C5087 PH,KR,JP,TW 11-Jul-2003 A
282512 AVL 10697 08-0012-01 IC,GARRY,DRAM CTRL LOGP,QFP240 322 LSI LOGIC L2A2005 KR 30-Jun-2003 A
282513 AVL 10698 08-0013-01 IC,GARRY,DATA PATH UNIT,QFP240 322 LSI LOGIC L2A1999 KR 30-Jun-2003 A
282516 AVL 10699 08-0014-01 IC,GARRY,QA,RSP/4,QFP208 322 LSI LOGIC L2A2000 KR 30-Jun-2003 A
282416 AVL 10703 08-0018-01 IC,GARRY,BUS INTF,REV 4 322 LSI LOGIC L1A9374 MY 30-Jun-2003 A
282436 AVL 62081 08-0030-02 IC,GARRAY,MD,RSP/4,TBGA342 322 LSI LOGIC L1A9733 CA 30-Jun-2003 A
282437 AVL 228290 08-0030-03 IC,GARRAY,MD,RSP/4,TBGA342 322 LSI LOGIC L2A0154 CA 30-Jun-2003 A
282438 AVL 10716 08-0031-01 IC,GARRAY,MC,RSP/2,TBGA342 322 LSI LOGIC L1A9614 CA 30-Jun-2003 A
282440 AVL 327665 08-0031-03 IC,GARRAY,MC,RSP/4,TBGA432 322 LSI LOGIC L2A0432 CA 30-Jun-2003 A
282448 AVL 71732 08-0037-01 IC,HDWR,MBRI MBA,PFQP160 638 XILINX XC3490APQ160C5180 PH,KR,JP,TW 11-Jul-2003 A
282449 AVL 71733 08-0038-01 IC,HDWR,MBRI MCN,PQFP160 638 XILINX XC3490APQ160C5179 PH,KR,JP,TW 11-Jul-2003 A
282450 AVL 30683 08-0040-01 IC,GARRAY,SAGE3,PQFP240 322 LSI LOGIC L1A9672 KR 30-Jun-2003 A
282367 AVL 123680 08-0073-01 IC,HDWR,PCI,DATAPTH,GNS,QFP240 638 XILINX XC4413-PQ240C6075 KR,JP,TW 11-Jul-2003 A
282368 AVL 260532 08-0073-02 IC,HDWR,PCI,DATAPTH,GNS,QFP240 638 XILINX XC4413-PQ240C6105 KR,JP,TW 11-Jul-2003 A
282369 AVL 123687 08-0074-01 IC,HDWR,PCI,CTLBUS,GNS,QFP160 638 XILINX XC4305-PQ160C PH,KR,JP,TW 11-Jul-2003 A
282375 AVL 606099 08-0078-02 IC,GARRAY,TIGER,BGA503 322 LSI LOGIC L2A0767 KR 30-Jun-2003 A
282377 AVL 1448822 08-0078-04 IC,GARRAY,TIGER,BGA503 322 LSI LOGIC L2A1367 KR,PH 30-Jun-2003 A
282378 AVL 1725709 08-0078-05 IC,GARRAY,TIGER,BGA503 322 LSI LOGIC L2A1495 KR,PH 30-Jun-2003 A
282268 AVL 246336 08-0102-01 IC,GARRAY,BODEGA STAT,PQFP160 322 LSI LOGIC L2A0322 MY 30-Jun-2003 A
282269 AVL 246348 08-0103-01 IC,GARRAY,QUADMAC,PQFP208 322 LSI LOGIC L2A0320 KR,MY,PH 30-Jun-2003 A
282270 AVL 246349 08-0104-01 IC,GARRAY,BODEGA,PBGA352 322 LSI LOGIC L2A0317 KR,MY 30-Jun-2003 A
282274 AVL 505452 08-0107-02 IC,STDCEL,SLI,BFR,E-PBGA313 322 LSI LOGIC L2B0395 KR,MY 30-Jun-2003 A
282276 AVL 1921545 08-0107-03 IC,STDCEL,SLI,BFR,E-PBGA313 322 LSI LOGIC L2A1515 MY 30-Jun-2003 A
282277 AVL 276239 08-0108-01 IC,ASIC,XBAR,BFR,E-PBGA503 322 LSI LOGIC L2A0410 MY,PH 30-Jun-2003

Wizard

A reader, November 10, 2004 - 11:16 am UTC

Does Oracle have a GUI tool to graphically build the SQLLDR control file? Sort of like the wizards in MS Access/Excel that walk you through the steps of loading a file? Define columns, drag and drop, etc.?

Thanks

Tom Kyte
November 10, 2004 - 12:23 pm UTC

not sure, not the biggest "gui" guy personally. i know Htmldb definitely does -- might be something in EM (probably is).

my_seq.nextval versus sequence(max,1)

John Gilmore, November 17, 2004 - 4:33 am UTC

Hi Tom,

With respect to running sqlloader concurrently, will two concurrent invocations of a script which uses "sequence(max,1)" to populate a column ever try to insert the same value?

It's not clear to me whether it simply looks at the current (uncommitted) maximum value of the column or whether it is actually implemented using an Oracle sequence.


Tom Kyte
November 17, 2004 - 10:24 am UTC

it will just be doing:


SELECT /*+NESTED_TABLE_GET_REFS+*/ NVL(MAX(DEPTNO),0)
FROM
DEPT


at the beginning, so there would be "conflicts" in a concurrent situation. you want to use a sequence.nextval, the sequence(max,1) is very much "pre-sequence" stuff.

(sql_trace=true tells all -- the above was a cut and paste from a traced sqlldr session)

Loading data with Sql Loader

Ashok k Rathi, October 12, 2005 - 9:27 am UTC

Tom ,
Is there any way I can load data using SQL*Loader
only if the record length is 200, and otherwise discard it to the discard file?
I tried FILLER, but it does not allow a SQL function like "length".

Thanks
Ashok k rathi

Tom Kyte
October 12, 2005 - 1:53 pm UTC

give me an example layout for the data - perhaps by leaving off trailing nullcols - making those records fail, we could.

Loading filename

raja, October 13, 2005 - 7:52 am UTC

Hi Tom
I need to load the name of the data file into the target table. For example, if the data file name is xx123.dat then it should be loaded as part of the data.
I have thought of options like generating the control file on the "fly" and using it in sqlldr, but there are issues of concurrency.
Please suggest.

use sqlldr or trigger to insert the data

karthikeyan, January 18, 2006 - 7:19 am UTC

My data file has only 7 fields but my table has 8 fields. Is it possible to load the data, or should I write a trigger to insert into the 8th column? When inserting via a trigger using a temporary table, I am not able to delete the data in the temp one.

CREATE OR REPLACE TRIGGER SYS_REC_UPD_TRG
AFTER INSERT ON TR_TEMP_INS 
FOR EACH ROW
DECLARE
PRAGMA AUTONOMOUS_TRANSACTION;
        cr_mtimespace varchar2(45);
BEGIN
cr_mtimespace := ROUND(TO_NUMBER(TO_DATE(:new.load_date||:new.load_time,'DDMMYYHH24MISS')-to_date('010170053000','DDMMRRHH24MISS'))*86400);
INSERT INTO MTNL_03 (timestampa,callingnoa,callednoa,durationa,dup_marka,load_date,load_time) VALUES (cr_mtimespace,:new.callingnoa,:new.callednoa,:new.durationa,:new.dup_marka,:new.load_date,:new.load_time);
delete from TR_TEMP_INS ;
commit;
END;

see a sample below
LOAD DATA
INFILE *
REPLACE INTO TABLE MTNL_03 FIELDS TERMINATED BY ' ' TRAILING NULLCOLS
(CALLEDNOA,CALLINGNOA,DURATIONA,LOAD_DATE,LOAD_TIME,DUP_MARKA,TIMESTAMPA "ROUND(TO_NUMBER(TO_DATE(:LOAD_DATE||:LOAD_TIME,'DDMMYYHH24MISS')-to_date('010170053000','DDMMRRHH24MISS'))*86400)")
BEGINDATA
1129633139 1129633139 035 181005 162859 N
1130338499 1129633139 035 261005 202459 N
1128225779 1129633139 035 021005 093259 N
1128225839 1129633139 035 021005 093359 N
1128393719 1129633139 035 041005 081159 N

SQL> desc mtnl_03
 Name                    Null?    Type
 ----------------------- -------- ---------------
 CALLEDNOA                        VARCHAR2(30)
 CALLINGNOA                       VARCHAR2(30)
 DURATIONA                        NUMBER
 DUP_MARKA                        VARCHAR2(1)
 LOAD_DATE                        VARCHAR2(20)
 LOAD_TIME                        VARCHAR2(20)
 TIMESTAMPA                       VARCHAR2(40)
Note: the actual data exceeds 10 lakh (1 million) rows. Which should I use: the trigger, or is sqlldr enough?

Tom Kyte
January 18, 2006 - 8:22 am UTC

it would only seem to make sense to do it in sqlldr since

a) you can
b) the trigger is in general "not something you want to have"




Re

Karthikeyan, January 18, 2006 - 9:31 am UTC

Yes, we can do it well in sqlldr, but:

the last column here is null (not present); in such a case it is going to be null or blank in the table.

In the previous example, I have 6 columns separated by whitespace, and the final 7th column is not in the data file but is to be inserted into the table.

When I insert, that column alone comes out null in the table.

Any suggestions to trap the right termination field in the control file?

The same expression works when I place the timestamp column anywhere else in the column list, where the last column is omitted.

Any suggestions?

Thanks. :-)

Tom Kyte
January 19, 2006 - 7:44 am UTC

it won't be null if you apply a function to it???

not sure what you mean.

Trigger

Kamesh, January 18, 2006 - 10:00 pm UTC

First of all, a trigger cannot have a commit statement. DDL and DCL statements are not allowed in a trigger. In my view, the trigger above should give compilation errors. Kindly correct me if I am wrong.

Tom Kyte
January 19, 2006 - 12:32 pm UTC

triggers can commit. (autonomous transactions)....

ddl and dcl (we really just call them ddl) can be (but SHOULD NEVER EVER BE) done in triggers using autonomous transactions.



re kamesh:

karthikeyan, January 18, 2006 - 11:22 pm UTC

No, I had posted what I had done successfully;
this trigger works.

A commit alone might give errors, but I have included the autonomous transaction pragma, so the trigger is working fine. But it requires a temp table.

I tried to use a global temporary table, but SQL*Loader cannot load data into the temporary table; it says a temporary table doesn't support bitmap indexes, which I can't understand.

Here I would rather not create a temp table at all, as I have to delete from it after the load; I want to keep this as simple as possible.

The sqlldr approach works fine, but how do I skip or omit a column in sqlldr?

In particular, how do I make sqlldr recognize that the last column is blank and still populate the computed value? That is enough for me.

Tom Kyte
January 19, 2006 - 12:33 pm UTC

Skipping columns in SQL*loader

Kamesh, January 19, 2006 - 10:22 am UTC

oops I seem to have overlooked the autonomous transaction pragma. Regarding skipping a column, yes it is possible. From Oracle 8i one can specify FILLER columns. FILLER columns are used to skip columns/fields in the load file, ignoring fields that one does not want. Look at this example:

LOAD DATA
TRUNCATE INTO TABLE T1
FIELDS TERMINATED BY ','
( field1,
field2 FILLER,
field3
)

thank you

karthikeyan, January 20, 2006 - 1:02 am UTC

LOAD DATA
INFILE *
INSERT INTO TABLE MTNL_03 FIELDS TERMINATED BY ' ' TRAILING NULLCOLS
(
(1)CALLEDNOA,
(2)CALLINGNOA,
(3)DURATIONA,
(4)LOAD_DATE,
(5)LOAD_TIME,
(6)DUP_MARKA,
(7)TIMESTAMPA "ROUND(TO_NUMBER(TO_DATE(:LOAD_DATE||:LOAD_TIME,'DDMMYYHH24MISS')-
to_date('010170053000','DDMMRRHH24MISS'))*86400)"
)
BEGINDATA
(1) (2) (3) (4) (5) (6) (7-nill)
1129633139 1129633139 035 181005 162859 N
1130338499 1129633139 035 261005 202459 N
1128225779 1129633139 035 021005 093259 N
1128225839 1129633139 035 021005 093359 N
1128393719 1129633139 035 041005 081159 N

i am really sorry to disturb you tom...

In the above situation, is it possible to insert the 7th column into the table, even though it is not in the data file?

SAMPLE OUTPUT:

CALLEDNOA CALLINGNOA DURATIONA LOAD_DATE LOAD_TIME DUPMARKA TIMESTAMPA
------------ ----------- ---------- --------- --------- --------- --------
1129633139 1129633139 35 181005 162859 N

thanks a lot..




Tom Kyte
January 20, 2006 - 10:17 am UTC

ops$tkyte@ORA10GR2> !cat t.ctl
LOAD DATA
INFILE *
INSERT INTO TABLE MTNL_03 FIELDS TERMINATED BY ' ' TRAILING NULLCOLS
(
CALLEDNOA,
CALLINGNOA,
DURATIONA,
LOAD_DATE,
LOAD_TIME,
DUP_MARKA,
TIMESTAMPA "ROUND(TO_NUMBER(TO_DATE(:LOAD_DATE||:LOAD_TIME,'DDMMYYHH24MISS')-to_date('010170053000','DDMMRRHH24MISS'))*86400)"
)
BEGINDATA
1129633139 1129633139 035 181005 162859 N
1130338499 1129633139 035 261005 202459 N
1128225779 1129633139 035 021005 093259 N
1128225839 1129633139 035 021005 093359 N
1128393719 1129633139 035 041005 081159 N
 
ops$tkyte@ORA10GR2> !sqlldr / t
 
SQL*Loader: Release 10.2.0.1.0 - Production on Fri Jan 20 15:09:00 2006
 
Copyright (c) 1982, 2005, Oracle.  All rights reserved.
 
Commit point reached - logical record count 5
 
ops$tkyte@ORA10GR2> select timestampa from MTNL_03;
 
TIMESTAMPA
--------------------
1129633139
1130338499
1128225779
1128225839
1128393719
 

HI-TOM

KARTHIKEYAN, January 20, 2006 - 2:17 am UTC

LOAD DATA
INFILE *
REPLACE INTO TABLE MTNL_03 FIELDS TERMINATED BY ' ' TRAILING NULLCOLS
(CALLEDNOA,
CALLINGNOA,
DURATIONA,
LOAD_DATE,
LOAD_TIME,
DUP_MARKA,
TIMESTAMPA "ROUND(TO_NUMBER(TO_DATE(:LOAD_DATE||:LOAD_TIME,'DDMMYYHH24MISS')-to_date('010170053000','DDMMRRHH24MISS'))*86400)"
)
BEGINDATA
1129633139 1129633139 035 181005 162859 N
1130338499 1129633139 035 261005 202459 N
1128225779 1129633139 035 021005 093259 N
1128225839 1129633139 035 021005 093359 N
1128393719 1129633139 035 041005 081159 N


THIS THING WORKS OUT

BUT IT WORKS IN CONVENTIONAL PATH LOAD ONLY
NOT FOR DIRECT PATH LOAD..

IS THERE ANY OPTION TO INCREASE THE SPEED OF LOAD IN CONVENTIONAL PATH LOAD..

THANKS ALOT


Tom Kyte
January 20, 2006 - 10:18 am UTC

In software written this century, it can use direct path:

ops$tkyte@ORA10GR2> !sqlldr / t direct=y
 
SQL*Loader: Release 10.2.0.1.0 - Production on Fri Jan 20 15:10:25 2006
 
Copyright (c) 1982, 2005, Oracle.  All rights reserved.
 
 
Load completed - logical record count 5.
 
ops$tkyte@ORA10GR2> !cat t.log | grep -i direct
Path used:      Direct
Bind array size not used in direct path.



therefore, I must guess you are using really old stuff? 

re old stuff....

Karthikeyan, January 24, 2006 - 12:10 am UTC

This is the output I get with the direct=true option: the load runs, but the values are not inserted for TIMESTAMPA, the blank column.

With direct=true I get this output:

C:\Documents and Settings\skarthikeyan\Desktop\etcertra\sqlloadr>sqlldr userid=recon/recon@testdb control=ts.ctl direct=true

SQL*Loader: Release 9.2.0.1.0 - Production on Tue Jan 24 10:28:08 2006

Copyright (c) 1982, 2002, Oracle Corporation.  All rights reserved.

Load completed - logical record count 5.

SQL> select * from mtnl_03;

CALLEDNOA   CALLINGNOA   DURATIONA D LOAD_DATE  LOAD_TIME  TIMESTAMPA
----------- ----------- ---------- - ---------- ---------- -----------
1129633139  1129633139          35 N 181005     162859
1130338499  1129633139          35 N 261005     202459
1128225779  1129633139          35 N 021005     093259
1128225839  1129633139          35 N 021005     093359
1128393719  1129633139          35 N 041005     081159



When I don't specify direct, I get this output:

C:\Documents and Settings\skarthikeyan\Desktop\etcertra\sqlloadr>sqlldr userid=recon/recon@testdb control=ts.ctl

SQL*Loader: Release 9.2.0.1.0 - Production on Tue Jan 24 10:33:31 2006

Copyright (c) 1982, 2002, Oracle Corporation.  All rights reserved.

Commit point reached - logical record count 5

SQL> truncate table mtnl_03;

Table truncated.

SQL> select * from mtnl_03;

CALLEDNOA    CALLINGNOA   DURATIONA D LOAD_DATE  LOAD_TIME  TIMESTAMPA
------------ ----------- ---------- - ---------- ---------- -----------
1129633139   1129633139          35 N 181005     162859     1129633139
1130338499   1129633139          35 N 261005     202459     1130338499
1128225779   1129633139          35 N 021005     093259     1128225779
1128225839   1129633139          35 N 021005     093359     1128225839
1128393719   1129633139          35 N 041005     081159     1128393719

My ctl file:
LOAD DATA
INFILE *
INSERT INTO TABLE MTNL_03 FIELDS TERMINATED BY ' ' TRAILING NULLCOLS
(
CALLEDNOA,
CALLINGNOA,
DURATIONA,
LOAD_DATE,
LOAD_TIME,
DUP_MARKA,
TIMESTAMPA "ROUND(TO_NUMBER(TO_DATE(:LOAD_DATE||:LOAD_TIME,'DDMMYYHH24MISS')-to_date('010170053000','DDMMRRHH24MISS'))*86400)"
)
BEGINDATA
1129633139 1129633139 035 181005 162859 N
1130338499 1129633139 035 261005 202459 N
1128225779 1129633139 035 021005 093259 N
1128225839 1129633139 035 021005 093359 N
1128393719 1129633139 035 041005 081159 N

What is the mistake I am making here?
 

Tom Kyte
January 24, 2006 - 8:05 am UTC

that would be a "product issue" if that is happening, I notice you are using very much "not patched" software - any chance of getting up to 9.2.0.7?

RE PATCHED VERSION

Karthikeyan, January 27, 2006 - 2:38 am UTC

Sorry Tom...
I had to do this in version 8.1.7;
there is no problem when testing with 9.2.0.7.

Thanks for your suggestions, Tom.

Tom Kyte
January 27, 2006 - 8:37 am UTC

8i did not have the ability to do sql in direct path - 9i did.

Load all contents of file using sqlldr

David Pujol, February 09, 2006 - 8:02 am UTC

Hi Tom, I need to load one XML file. It is a requirement to use SQL*Loader. I'd like to know how to load the entire contents of the file into one record; I have a column of CLOB type.

My table is:

SQL> desc yqta_xmls
 Name                                      Null?    Type
 ----------------------------------------- -------- ----------------------------
 FICHERO                                            VARCHAR2(500)
 CONTENIDO                                          CLOB

I need one load --> one record (the name of the file and the XML content).

How can I build the control file?

Thanks in advance
David Pujol

 

Tom Kyte
February 10, 2006 - 11:07 am UTC

extract from Expert Oracle: Database Architecture on data loading/unloading:

<quote>
Loading LOB Data That Is Out of Line

A common scenario is to have a data file that contains the names of files to load into the LOBs, instead of having the LOB data mixed in with the structured data. This offers a greater degree of flexibility, as the data file given to SQLLDR does not have to use one of the four methods to get around having embedded newlines in the input data, as would frequently happen with large amounts of text or binary data. SQLLDR calls this type of additional data file a LOBFILE.

SQLLDR can also support the loading of a structured data file that points to another, single data file. We can tell SQLLDR how to parse LOB data from this other file, so that each row in the structured data gets loaded with a piece of it. I find this mode to be of limited use (I've never found a use for it myself to date), and I will not discuss it here. SQLLDR refers to these externally referenced files as complex secondary data files.

LOBFILES are relatively simple data files aimed at facilitating LOB loading. The attribute that distinguishes LOBFILEs from the main data files is that in LOBFILEs, there is no concept of a record, hence newlines never get in the way. In LOBFILEs, the data is in any of the following formats:

    *    Fixed-length fields (e.g., load bytes 100 through 1000 from the LOBFILE)
    *    Delimited fields (terminated by something or enclosed by something)
    *    Length/value pairs, a variable-length field

The most common of these types is the delimited fields-ones that are terminated by an end-of-file (EOF), in fact. Typically, you have a directory full of files you would like to load into LOB columns, and each file in its entirety will go into a BLOB. The LOBFILE statement with TERMINATED BY EOF is what you will use.
So, let's say we have a directory full of files we would like to load into the database. We would like to load the OWNER of the file, the TIME_STAMP of the file, the NAME of the file, and the file itself. Our table we would load into would be as follows:

ops$tkyte@ORA10G> create table lob_demo
  2  ( owner      varchar2(255),
  3    time_stamp date,
  4    filename   varchar2(255),
  5    data       blob
  6  )
  7  /
Table created.

Using a simple ls -l on UNIX, and dir /q /n on Windows, and capturing that output, we can generate our input file and load it using a control file such as this on UNIX:

LOAD DATA
INFILE *
REPLACE
INTO TABLE LOB_DEMO
( owner       position(17:25),
  time_stamp  position(44:55) date "Mon DD HH24:MI",
  filename    position(57:100),
  data        LOBFILE(filename) TERMINATED BY EOF
)
BEGINDATA
-rw-r--r--    1 tkyte    tkyte     1220342 Jun 17 15:26 classes12.zip
-rw-rw-r--    1 tkyte    tkyte          10 Jul 16 16:38 foo.sql
-rw-rw-r--    1 tkyte    tkyte         751 Jul 16 16:36 t.ctl
-rw-rw-r--    1 tkyte    tkyte         491 Jul 16 16:38 testa.sql
-rw-rw-r--    1 tkyte    tkyte         283 Jul 16 16:38 testb.sql
-rw-rw-r--    1 tkyte    tkyte         231 Jul 16 16:38 test.sh
-rw-rw-r--    1 tkyte    tkyte         235 Apr 28 18:03 test.sql
-rw-rw-r--    1 tkyte    tkyte        1649 Jul 16 16:36 t.log
-rw-rw-r--    1 tkyte    tkyte        1292 Jul 16 16:38 uselast.sql
-rw-rw-r--    1 tkyte    tkyte         909 Jul 16 16:38 userbs.sql

Now, if we inspect the contents of the LOB_DEMO table after running SQLLDR, we will discover the following:

ops$tkyte@ORA10G> select owner, time_stamp, filename, dbms_lob.getlength(data)
  2  from lob_demo
  3  /
 
OWNER    TIME_STAM FILENAME       DBMS_LOB.GETLENGTH(DATA)
-------- --------- -------------- ------------------------
tkyte    17-JUN-05 classes12.zip                   1220342
tkyte    16-JUL-05 foo.sql                              10
tkyte    16-JUL-05 t.ctl                               875
tkyte    16-JUL-05 testa.sql                           491
tkyte    16-JUL-05 testb.sql                           283
tkyte    16-JUL-05 test.sh                             231
tkyte    28-APR-05 test.sql                            235
tkyte    16-JUL-05 t.log                                 0
tkyte    16-JUL-05 uselast.sql                        1292
tkyte    16-JUL-05 userbs.sql                          909
 
10 rows selected.

This works with CLOBs as well as BLOBs. Loading a directory of text files using SQLLDR in this fashion is easy.

</quote> 

sqlloader in clob

David Pujol, February 13, 2006 - 4:13 pm UTC

A lot of thanks !!

CLOB And SQL Loader

Mital, March 13, 2006 - 1:04 pm UTC

Hi Tom,
I have similar question regarding CLOB datatype.
my table is as :-

CREATE TABLE TBL_GSMAC (
COUNTER NUMBER (10) NOT NULL,
SOURCE VARCHAR2 (1) NOT NULL,
REQNO CHAR (5),
RECONDATE DATE,
IDDATE DATE,
DATELASTINPROOF DATE,
DATELASTINPROOFALT DATE,
FUND CHAR (4),
FUND_EXT CHAR (4),
REGION CHAR (4),
ASSETID CHAR (9) DEFAULT '?????????',
ASSETIDALT CHAR (9) DEFAULT '?????????',
RATE FLOAT,
MATURITYDATE DATE,
LOC CHAR (3),
SECACCT CHAR (4),
SECTYPECODE CHAR (4),
INVTYPE CHAR (2),
INVDESCRIPTION CHAR (20),
TRADECURRENCY CHAR (3),
TRADECOUNTRY CHAR (2),
SECTYPECODEALT CHAR (4),
SETTLELOCATIONNAME VARCHAR2 (40),
SETTLECOUNTRY VARCHAR2 (30),
SUBCUSTACCTNO VARCHAR2 (34),
ASSETNUMSYSTEM CHAR (4),
ISIN CHAR (35) DEFAULT '???????????????????????????????????',
SUBSECDESC1 VARCHAR2 (50),
SUBSECDESC2 VARCHAR2 (50),
SUBSECDESC3 VARCHAR2 (35),
SUBSECDESC4 VARCHAR2 (35),
SPECIAL_INSTRUCTIONS VARCHAR2 (255),
SSBSECDESC1 VARCHAR2 (50),
SSBSECDESC2 VARCHAR2 (50),
SSBSECDESC3 VARCHAR2 (30),
SSBSECDESC4 VARCHAR2 (30),
SSBSECURITIESACCOUNT CHAR (4),
REGISTRACKED CHAR (1),
SHORTTERMSW CHAR (1),
RERUNSW CHAR (1),
SUB FLOAT NOT NULL,
GSMAC FLOAT NOT NULL,
VARIANCE_ FLOAT NOT NULL,
VARABS FLOAT NOT NULL,
EXCEPTIONCODE CHAR (2),
REGIST_FIELD1_DESC CHAR (30),
REGIST_FIELD1_GSMAC_POS FLOAT NOT NULL,
REGIST_FIELD1_SUB_POS FLOAT NOT NULL,
REGIST_FIELD2_DESC CHAR (30),
REGIST_FIELD2_GSMAC_POS FLOAT NOT NULL,
REGIST_FIELD2_SUB_POS FLOAT NOT NULL,
REGIST_FIELD3_DESC CHAR (30),
REGIST_FIELD3_GSMAC_POS FLOAT NOT NULL,
REGIST_FIELD3_SUB_POS FLOAT NOT NULL,
REGIST_FIELD4_DESC CHAR (30),
REGIST_FIELD4_GSMAC_POS FLOAT NOT NULL,
REGIST_FIELD4_SUB_POS FLOAT NOT NULL,
GSMACBLOCKED FLOAT NOT NULL,
GSMACSEGREGATED FLOAT NOT NULL,
SUBBLOCKED FLOAT NOT NULL,
SUBSEGREGATED FLOAT NOT NULL,
STATUS VARCHAR2 (4),
USERID VARCHAR2 (50),
CLEARRESP VARCHAR2 (12),
ISOCOMMENTS VARCHAR2 (255),
EXCEPTIONTYPE VARCHAR2 (4),
RESOLUTIONDATE DATE,
ITEMCOUNT FLOAT,
SAFEKEEPINGEXCEPTION VARCHAR2 (2),
AUDITDATE DATE,
AUDITTYPE VARCHAR2 (4),
DATEPRINTED DATE,
SWIFTMSGID VARCHAR2 (50),
LASTRECONDATE DATE,
SAFEKEEPING NUMBER (1) NOT NULL,
REGISTRATION NUMBER (1),
SEGREGATED NUMBER (1),
BLOCKED NUMBER (1),
MANSTATUS CHAR (1),
REGCMMNT VARCHAR2 (255),
REGCMMNTBY VARCHAR2 (50),
REGCMMNTDTE DATE,
SAFCMMNT VARCHAR2 (255),
SAFCMMNTBY VARCHAR2 (50),
SAFCMMNTDTE DATE,
SEGCMMNT VARCHAR2 (255),
SEGCMMNTBY VARCHAR2 (50),
SEGCMMNTDTE DATE,
BLCCMMNT VARCHAR2 (255),
BLCCMMNTBY VARCHAR2 (50),
BLCCMMNTDTE DATE,
CLOSED NUMBER (10),
CONSTRAINT TBL_GSMAC_PK
PRIMARY KEY ( COUNTER ) ) ;

The data file looks like:--

3608804,"A","00002",1998-12-07 00:00:00,1998-12-08 10:32:15.743000000,1998-12-04 00:00:00,1992-01-01 00:00:00,"OM01"," ","NADC","6839949N9"," ",15.38,2007-05-01 00:00:00,"CAS","SOPS"," ","30","INV TYPE NOT FOUND ","CAD","CA"," ","CANADA-CANADA SELFCUSTODY-SSS","CANADA","T770"," ","************ "," "," "," "," "," ","ONTARIO DEBENTURE"," "," "," ","OM01","N","N","N",0,128000000,128000000,-128000000,"10"," ",0,0," ",0,0," ",0,0," ",0,0,0,0,0,0,"CLSD","EXCT dan smith 11-07-2005","CONV UNIT","","DRIP",,,"10",1998-12-08 10:22:46.210000000,"RVCL",2002-06-07 00:00:00,"98Q%LE24337",2003-10-31 00:00:00,True,False,False,False,"","","",,"Clear Resp changed to: CONV UNIT from: Canada, Exc_type changed to: DRIP from: BVLT dan smith at 11/7/2005 12:01:12 PM
Refers to physical holdings in NYC location, previously held in Boston *Gerardo Teodoro 3/4/2003 2:49:39 PM
Exception originally coded Canada-Bvlt was erroneouly changed to GCAD-Bvlt *peter dela pena 6/14/02 1:07:03 PM
Exception type changed from ECRO to BVLT per Peter Pena ref tkt 3517","Gerardo Teodoro",2003-03-04 00:00:00,"","",,"","",,


This is one row of data.

I am getting error messages in the log file as follows:

Rejected - Error on table TBL_GSMAC, column COUNTER.
Field in data file exceeds maximum length

Rejected - Error on table TBL_GSMAC, column SAFCMMNT.
second enclosure string not present.

Can you please help me in solving this?

Thanks!!

Tom Kyte
March 13, 2006 - 9:31 pm UTC

hmm, everything but... the control file.

CLOB and SQL Loader!!

Mital, March 14, 2006 - 10:18 am UTC

Tom,
sorry for the blunder. Here is the control file!!


load data
infile 'c:\scripts\datadump\tbl_gsmac.txt'
into table tbl_gsmac
fields terminated by "," optionally enclosed by '"'
trailing nullcols
(COUNTER CHAR(20),
SOURCE,
REQNO,
RECONDATE "to_date(:recondate,'YYYY-MM-DD HH24:MI:SS')",
IDDATE "to_timestamp(:iddate,'yyyy-mm-dd hh24:mi:ssxff')",
DATELASTINPROOF "to_date(:datelastinproof,'YYYY-MM-DD HH24:MI:SS')",
DATELASTINPROOFALT "to_date(:datelastinproofalt,'YYYY-MM-DD HH24:MI:SS')",
FUND,
FUND_EXT,
REGION,
ASSETID,
ASSETIDALT,
RATE,
MATURITYDATE "to_date(:maturitydate,'YYYY-MM-DD HH24:MI:SS')",
LOC,
SECACCT,
SECTYPECODE,
INVTYPE,
INVDESCRIPTION,
TRADECURRENCY,
TRADECOUNTRY,
SECTYPECODEALT,
SETTLELOCATIONNAME,
SETTLECOUNTRY,
SUBCUSTACCTNO,
ASSETNUMSYSTEM,
ISIN,
SUBSECDESC1,
SUBSECDESC2,
SUBSECDESC3,
SUBSECDESC4,
SPECIAL_INSTRUCTIONS CHAR(10000) ,
SSBSECDESC1,
SSBSECDESC2,
SSBSECDESC3,
SSBSECDESC4,
SSBSECURITIESACCOUNT,
REGISTRACKED,
SHORTTERMSW,
RERUNSW,
SUB,
GSMAC,
VARIANCE_,
VARABS,
EXCEPTIONCODE,
REGIST_FIELD1_DESC,
REGIST_FIELD1_GSMAC_POS,
REGIST_FIELD1_SUB_POS,
REGIST_FIELD2_DESC,
REGIST_FIELD2_GSMAC_POS,
REGIST_FIELD2_SUB_POS,
REGIST_FIELD3_DESC,
REGIST_FIELD3_GSMAC_POS,
REGIST_FIELD3_SUB_POS,
REGIST_FIELD4_DESC,
REGIST_FIELD4_GSMAC_POS,
REGIST_FIELD4_SUB_POS,
GSMACBLOCKED,
GSMACSEGREGATED,
SUBBLOCKED,
SUBSEGREGATED,
STATUS,
USERID,
CLEARRESP,
ISOCOMMENTS,
EXCEPTIONTYPE,
RESOLUTIONDATE "to_date(:resolutiondate,'YYYY-MM-DD HH24:MI:SS')",
ITEMCOUNT,
SAFEKEEPINGEXCEPTION,
AUDITDATE "to_timestamp(:auditdate,'yyyy-mm-dd hh24:mi:ssxff')",
AUDITTYPE,
DATEPRINTED "to_date(:dateprinted,'YYYY-MM-DD HH24:MI:SS')",
SWIFTMSGID,
LASTRECONDATE "to_date(:lastrecondate,'YYYY-MM-DD HH24:MI:SS')",
SAFEKEEPING,
REGISTRATION,
SEGREGATED,
BLOCKED,
MANSTATUS,
REGCMMNT CHAR(10000),
REGCMMNTBY,
REGCMMNTDTE "to_date(:regcmmntdte,'YYYY-MM-DD HH24:MI:SS')",
SAFCMMNT CHAR(10000),
SAFCMMNTBY,
SAFCMMNTDTE "to_date(:safcmmntdte,'YYYY-MM-DD HH24:MI:SS')",
SEGCMMNT CHAR(10000),
SEGCMMNTBY,
SEGCMMNTDTE "to_date(:segcmmntdte,'YYYY-MM-DD HH24:MI:SS')",
BLCCMMNT CHAR(10000),
BLCCMMNTBY,
BLCCMMNTDTE "to_date(:blccmmntdte,'YYYY-MM-DD HH24:MI:SS')",
CLOSED)


The columns SAFCMMNT, SEGCMMNT, BLCCMMNT, REGCMMNT and SPECIAL_INSTRUCTIONS are LONG columns in SQL Server (that's where the original data resides)!! Another team has converted the SQL Server data into .txt files and now it's my job to load it into Oracle 9i.

These LONG columns are VARCHAR2(255) in the Oracle database. The problem with these columns is that when the data spans to the next line (contains a newline), SQL*Loader looks for the second enclosure string (") and reports the error that the second enclosure string was not found!!

How can I eliminate the newline character?

Your kind help is needed on the same.

Thanks for the quickest reply!!




Tom Kyte
March 15, 2006 - 8:56 am UTC

well, I ran your create, I used your record.  

It did not fail on the columns you stated, however, the strings true, false, false, false did not like being loaded into numbers, so I changed them into varchars.  It got mad when you tried to load 406 characters into safcmmnt which you defined as 255.

Basically, you have bad data - suggestion:

recreate the table with all varchar2(4000) columns.
load it.
then using sql, inspect what you actually have - you do not have the inputs you described in the control file at all.


ops$tkyte@ORA9IR2> select safekeeping, registration, segregated, blocked, length(safcmmnt) from TBL_GSMAC;

SAFEKEEPIN REGISTRATI SEGREGATED BLOCKED    LENGTH(SAFCMMNT)
---------- ---------- ---------- ---------- ----------------
True       False      False      False                   406

 

CLOB and SQL Loader

Mital, March 17, 2006 - 9:41 am UTC

Tom, thanks for the insight. I apologize for replying late.
I have made the changes you proposed and still I am getting an error message:

SQL*Loader-510: Physical record in data file (c:\scripts\datadump\tbl_gsmac_test.txt) is longer than the maximum(1048576)
SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.

I changed the datatype of varchar2(255) to varchar2(4000). But looking at this error, I think that even varchar2(4000) is not enough for the particular column SAFCMMNT.

Will a CLOB work in place of varchar2(4000)? If yes, can you please provide me with the declaration syntax, as I am not familiar with it?

Also, I was thinking of using external tables instead of SQL*Loader. Is external table a good decision for this scenario?

Your input is highly appreciated.

Thanking in advance,
Mital


Tom Kyte
March 17, 2006 - 5:46 pm UTC

check out that physical record - what do you SEE.

I mean, this is your data. Do you know what the specification for this data is? What should be the maximum length of this field? I refuse to believe you are supposed to "guess" this stuff.


You would just make it a clob type in the table, and use char(N) in the control file (where N is at least big enough for the biggest one) but way way way before you do that, you need to find out what this should really be.
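
A minimal sketch of that approach, purely for illustration (tbl_gsmac_stage is a made-up name and CHAR(100000) is only a guess at an upper bound; the real maximum has to come from the data specification):

create table tbl_gsmac_stage
( counter               number(10),
  -- ... all the other columns ...
  special_instructions  clob,
  regcmmnt              clob,
  safcmmnt              clob,
  segcmmnt              clob,
  blccmmnt              clob
);

and in the control file, size those input fields well above the biggest expected value:

SPECIAL_INSTRUCTIONS CHAR(100000),
...
SAFCMMNT CHAR(100000),

Since the log complained about a physical record longer than 1048576 bytes, the sqlldr READSIZE command-line parameter (the read buffer size) may also need to be raised.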

TRUNCATE

A reader, March 19, 2006 - 10:49 pm UTC

Tom,
In your book you mentioned that "TRUNCATE" option in the control file would silently use "REUSE STORAGE" in sqlldr chapter.

In our system, we do full refresh of many tables every day. We truncate the tables and load the tables using "INSERT" Option in the control file using sqlldr.

After reading this chapter, I am thinking of taking advantage of "REUSE STORAGE", similar to what SQLLDR does, since the number of records we load every day is generally greater than or equal to the number of records we loaded the previous day.

With the REUSE STORAGE option, I expect the performance of the loading jobs to be better. Is there anything else I have to consider before suggesting a change to the loading jobs?
Thanks Tom.

Tom Kyte
March 20, 2006 - 7:09 am UTC

I would not expect this to make a huge difference - small to none at best (unless you have 10's or 100's of thousands of extents - in which case we need to fix that first - we'd want to adjust your uniform size so we didn't have to allocate so many).
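
For reference, if you do want to mimic what the control file's TRUNCATE option does, the statement is simply (the table name is a placeholder):

truncate table your_big_table reuse storage;

The only difference from a plain TRUNCATE is that the already-allocated extents are kept instead of being released, so the next load does not have to allocate them again.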



Question

Harison, April 01, 2006 - 11:05 pm UTC

I have a question about sqlldr.
I will come straight to my requirement, and I would appreciate your help as early as possible.
For example: I have a number of files in .csv format (e.g. abc.csv), stored in a directory (folder). Using sqlldr and a single ctl file I can load these .csv files into a single table. Next week I am going to download new .csv files and store them in the same folder under different names, like xyz.csv, and I will delete the previous .csv files from that folder. This time I want to load the .csv files without modifying the sqlldr command.
This is the sqlldr command that I am using now.

sqlldr control=ctlINX_Search_Conversions_Raw.ctl, data=dataSearch , log=loglog_INX_Search_Conversions_Raw_Mar1925.log, bad=logad_INX_Search_Conversions_Raw_Mar1925.log, discard=logdiscard_INX_Search_Conversions_Raw_Mar1925.log, userid=INX/INX

Next time I will store the .csv as INX_Search_Conversions_Raw_Mar2631.csv in the folder, but I do not want to change the sqlldr command every week.

So I need a loop, so that whatever .csv files I store in that particular folder, under whatever names, will be loaded into the table without losing any of the data previously loaded, appending by date.
I am expecting the answer as early as possible, as I need it now; it should help me get a good name in this company, "connexiona".
Thanks
Harry



Tom Kyte
April 02, 2006 - 11:12 am UTC

well, this is really an OS scripting question.

so, hence, the answer would depend on the OS.

there are "for" commands that can iterate over files in a directory available from the command line in various OS's. That is what you will be using.

You'll be scripting this - if you cannot figure out how to script it using the scripting language of your preferred operating system - let us know what OS that is and maybe we can point out how it would be done.
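
As a sketch only, assuming a UNIX-like shell (the Windows cmd FOR command plays the same role) and reusing the data= style from the command above; /data/incoming is a placeholder for the folder the weekly .csv files land in:

for f in /data/incoming/*.csv
do
    sqlldr userid=INX/INX control=INX_Search_Conversions_Raw.ctl \
           data="$f" log="${f%.csv}.log" bad="${f%.csv}.bad"
done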

Can we forget about sqlloader?

Sreenivasa Rao, April 03, 2006 - 1:59 am UTC

The external table concept looks good from 9i onwards. Can we totally forget about SQL*Loader, or does SQL*Loader still have its own advantages?
Please clarify.
Regards,
Sreenivas.

Tom Kyte
April 03, 2006 - 8:07 am UTC

<quote src=Expert Oracle database architecture>

What many DBAs don’t realize is that their knowledge of SQLLDR control files is very readily transferable to the use of external tables. You’ll discover, as we work through the examples in this part of the chapter, that external tables incorporate much of the SQLLDR syntax and many of the techniques.

SQLLDR should be chosen over external tables in the following three situations:

* You have to load data over a network—in other words, when the input file is not on the database server itself. One of the restrictions of external tables is that the input file must be accessible on the database server.
* Multiple users must concurrently work with the same external table processing different input files.
* You have to work with LOB types. External tables do not support LOBs.

With those three exceptions in mind, in general I strongly recommend using external tables for their extended capabilities. SQLLDR is a fairly simple tool that generates an INSERT statement and loads data. Its ability to use SQL is limited to calling SQL functions on a row-by-row basis. External tables open up the entire SQL set of functionality to data loading. Some of the key functionality features that external tables have over SQLLDR in my experience are as follows:
</quote>

Embedded Record Terminator w/SQL Loader

Dan, April 24, 2006 - 7:46 pm UTC

Hi Tom,
Follow-on to one of your earlier comments. If I have CSV data that I'm loading in stream mode using

FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'

it seems that I can't have embedded record terminators (newlines) even in quoted fields. You suggested using a different record terminator like "^\n". My question is, what if I can't guarantee that my data will not contain any particular character combination like that? Is there any other way to allow embedded newlines?

Thanks,

Tom Kyte
April 25, 2006 - 12:29 am UTC

If you have access to expert one on one Oracle - I cover all of the possible ways - which are basically

o using a string to terminate the record instead of just a newline, as discussed.
o using a "var" type record where the leading bytes represent the record length.

sqlloader header

jhon edison b.c, May 05, 2006 - 4:07 pm UTC

Hello everyone. My English is not good, but I hope you can understand me a bit. I have the following problem: I want to load data using SQL*Loader, but I do not know how, before the load, to add the header of the record (in this case the date) to each line. The file is of the following form:


clamasiva8.log; C:\clasmas; 30; Clasificacion ; 172976; 182510; 2; 8490073; ; 1; 0; 24; AAA;
clamasiva8.log; C:\clasmas; 30; Clasificacion ; 68901; 73461; 2; 8490075; ; 1; 0; 24; AAA;
30-01-2006;
clamasiva3.log; C:\clasmas; 28; Clasificacion ; 2523; 2616; 2; 8849387; ; 1; 0; 1; AAA;
clamasiva3.log; C:\clasmas; 28; Clasificacion ; 9923; 10510; 2; 8861563; A; 1; 0; 4; AAA;
clamasiva3.log; C:\clasmas; 28; Clasificacion ; 36988; 39030; 2; 8845339; ; 1; 0; 1; AAA;
26-01-2006;

It must be this way:

clamasiva8.log; C:\clasmas; 30-01-2006; 30; Clasificacion ; 172976; 182510; 2; 8490073; ; 1; 0; 24; AAA;
clamasiva8.log; C:\clasmas; 30-01-2006; 30; Clasificacion ; 68901; 73461; 2; 8490075; ; 1; 0; 24; AAA;
clamasiva3.log; C:\clasmas; 26-01-2006; 28; Clasificacion ; 2523; 2616; 2; 8849387; ; 1; 0; 1; AAA;
clamasiva3.log; C:\clasmas; 26-01-2006; 28; Clasificacion ; 9923; 10510; 2; 8861563; A; 1; 0; 4; AAA;
clamasiva3.log; C:\clasmas; 26-01-2006; 28; Clasificacion ; 36988; 39030; 2; 8845339; ; 1; 0; 1; AAA;

Thank you:



Tom Kyte
May 05, 2006 - 5:01 pm UTC

look in the other place you put this very ambiguous question.

sql loader date header

jhon edison b.c, May 08, 2006 - 4:48 pm UTC

Hi Tom, thanks for your attention.
I need to create a .ctl file so that the date appears in that position in the output table, using SQL*Loader.


Input file:

clamasiva8.log; C:\clasmas; 30;
clamasiva8.log; C:\clasmas; 30;
30-01-2006;

clamasiva3.log; C:\clasmas; 28;
clamasiva3.log; C:\clasmas; 28;
clamasiva3.log; C:\clasmas; 28;
26-01-2006;

clamasiva7.log; C:\clasmas; 28;
clamasiva7.log; C:\clasmas; 28;
27-03-2006;




Output file (this is how it has to end up in the table):


clamasiva8.log; C:\clasmas; 30-01-2006; 30;
clamasiva8.log; C:\clasmas; 30-01-2006; 30;

clamasiva3.log; C:\clasmas; 26-01-2006; 28;
clamasiva3.log; C:\clasmas; 26-01-2006; 28;
clamasiva3.log; C:\clasmas; 26-01-2006; 28;

clamasiva7.log; C:\clasmas; 27-03-2006; 28;
clamasiva7.log; C:\clasmas; 27-03-2006; 28;


create table file(id_file integer not null,
name_file varchar2(40) not null,
path_file varchar2(35),
fecha date, num_file number(3),
primary key (id_file));


create sequence tk; //(tk.nextval, 1)


Tom Kyte
May 08, 2006 - 8:19 pm UTC

I don't understand. "file of input"? "file of exit"?

using sql loader

jhon edison b.c, May 09, 2006 - 8:57 am UTC

Look, the input file is as it is today, and the output file shows how it should end up so that it can be loaded into the table using sqlloader. Thank you.

Tom Kyte
May 09, 2006 - 9:52 am UTC

ok, so load it? I'm not sure what you are asking.


bearing in mind that sqlldr is going to read the file sequentially from top to bottom and it is very likely you will have to perform some pre-processing of your files in order to get that last line "first" and have it available if each record needs to be timestamped with it

using sql loader

jhon edison b.c, May 09, 2006 - 10:09 am UTC

Look Tom, the file being processed is this way:
But here it appears the hour, ¿I believe that it does not important?

30-01-2006 08:01:23;
clamasiva8.log; C:\clasmas; 30;
clamasiva8.log; C:\clasmas; 30;
26-01-2006 09:37:25;
clamasiva3.log; C:\clasmas; 28;
clamasiva3.log; C:\clasmas; 28;
clamasiva3.log; C:\clasmas; 28;
27-03-2006 10:12:01;
clamasiva7.log; C:\clasmas; 28;
clamasiva7.log; C:\clasmas; 28;

thank you.

Tom Kyte
May 09, 2006 - 11:03 am UTC

sorry, but I don't know what you mean by "appears the hour, I believe it is not important"

A reader, May 09, 2006 - 12:45 pm UTC

Look Tom, the file being processed is this way:

30-01-2006;
clamasiva8.log; C:\clasmas; 30;
clamasiva8.log; C:\clasmas; 30;
26-01-2006;
clamasiva3.log; C:\clasmas; 28;
clamasiva3.log; C:\clasmas; 28;
clamasiva3.log; C:\clasmas; 28;
27-03-2006;
clamasiva7.log; C:\clasmas; 28;
clamasiva7.log; C:\clasmas; 28;

thank you. Using sql loader way:


clamasiva8.log; C:\clasmas; 30-01-2006; 30;
clamasiva8.log; C:\clasmas; 30-01-2006; 30;
clamasiva3.log; C:\clasmas; 26-01-2006; 28;
clamasiva3.log; C:\clasmas; 26-01-2006; 28;
clamasiva3.log; C:\clasmas; 26-01-2006; 28;
clamasiva7.log; C:\clasmas; 27-03-2006; 28;
clamasiva7.log; C:\clasmas; 27-03-2006; 28;

thank you for help.

Tom Kyte
May 09, 2006 - 1:23 pm UTC

look - I don't "get it".

You can thank me all you want, but until you actually say what you "need", I'll keep saying "don't get it, doesn't make sense"


You can keep typing the same thing over and over - but that won't change the fundemental fact that you haven't said what you need.

using sql loader

jhon edison b.c, May 09, 2006 - 2:59 pm UTC

What I need is to load, using sqlloader, a file that has the date as a header, where the header delimits the content that follows it, into a table with the characteristics I already wrote. You already did something similar in a past article with a single header, but there the header did not change, and in this case the header is a date and it appears many times. Look at the following question that you answered:

"The difference is that the header date appears many times"


Hi, I have a question regarding SQL*Loader. I am parsing a datafile that has a
header record that contains information that I need to store on each record
being inserted into the database. My data file looks something like:

BOF - 09/01/2003
ABCD 132.43 DJI
ABCD 2344.54 SJY
ABCD 50.51 IRJ


I would like to store the data in the table like:

---------- ---- ------- ---
09/01/2003 ABCD 132.43 DJI
09/01/2003 ABCD 2344.54 SJY
09/01/2003 ABCD 50.51 IRJ

So basically, I want to read the date from the first line and include it in each
of the following records that get loaded into the DB.

I've looked at using 'FILLER' datatypes a bit, but I'm not sure that this is the
correct solution.

Any help would be greatly appreciated.

Thanks, Bob.

***************************

It would be okay if I had to run SQL loader twice to get this to work.

Tom Kyte
May 09, 2006 - 4:48 pm UTC

you could but it requires fantastic tricks using triggers and I'm not going to demonstrate that at all (unfortunately, the documentation does - on loading a report, but it is all horribly fragile).

I'd suggest

a) loading the first record into a table (sqlldr ... load=1)
b) using a scalar subquery in the 2nd control file to retrieve the max date from this table you just loaded into and using skip=1 to ignore the header


I'd probably really suggest "maybe not sqlldr in this case" - but it could be done.
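
A rough sketch of that two-pass idea, using Bob's sample data above and made-up table names (HDR for the single header row, PRICES for the detail rows); whether a scalar subquery and the EXPRESSION keyword are accepted depends on your release, so treat this as an outline only:

-- pass 1: header.ctl, run with  sqlldr userid=... control=header.ctl load=1
LOAD DATA
INFILE 'prices.dat'
TRUNCATE
INTO TABLE hdr
( load_date position(7:16) date "MM/DD/YYYY" )

-- pass 2: detail.ctl, run with  sqlldr userid=... control=detail.ctl skip=1
LOAD DATA
INFILE 'prices.dat'
APPEND
INTO TABLE prices
FIELDS TERMINATED BY WHITESPACE
( symbol,
  amount,
  exch,
  load_date EXPRESSION "(select max(load_date) from hdr)"
)

The EXPRESSION keyword (it appears in a later follow-up on this page) keeps LOAD_DATE from consuming a field out of the input record, and the scalar subquery pulls back the date loaded in pass 1.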

using sql loader

jhon edison b.c, May 10, 2006 - 8:56 am UTC

Hi Tom,

Could you help me by showing me how, with an example? Thank you.

Max number of columns

Smita, May 17, 2006 - 1:48 am UTC

Hi Tom,

I want sqlloader to throw an error if the data/text file has more columns than a specified number. Is there any keyword in the control file which can do this?

Tom Kyte
May 17, 2006 - 6:49 am UTC

no, not really.

load xml file using sql*loader

Ray, July 08, 2006 - 3:03 pm UTC

Hi Tom,

Sorry for bothering by asking for a simple example:

I know that it is possible to load xml files using sql*loader. Could you please give a small example as to how to load the following xml into a table

<?xml version="1.0"?>
<SERIES-CONTAINER>
<SERIES>
<CONTRACT_NO>40449</CONTRACT_NO>
<SYNOPSIS>Series 7 of Big Brother</SYNOPSIS>
</SERIES>
<SERIES>
<CONTRACT_NO>40449</CONTRACT_NO>
<SYNOPSIS>Series 7 of Big Brother</SYNOPSIS>
</SERIES>
<SERIES>
<CONTRACT_NO>40449</CONTRACT_NO>
<SYNOPSIS>Series 7 of Big Brother</SYNOPSIS>
</SERIES>
<SERIES>
<CONTRACT_NO>40449</CONTRACT_NO>
<SYNOPSIS>Series 7 of Big Brother</SYNOPSIS>
</SERIES>
</SERIES-CONTAINER>

"SYNOPSIS" would be a clob in a table (often this is large for varchar2).

If it is possible through SQL*Loader, we could use an external table as well. But the issue is that external tables do not support CLOBs. Are there any workarounds using external tables?

Thanks & regards
Ray

Tom Kyte
July 08, 2006 - 9:24 pm UTC

well, what you really need is a "filename", if you have that then dbms_lob.loadfromfile is what you need from "SQL" - not an external table (the external table might be something that contains a list of filenames though...)

Here is a quote from Expert Oracle Database Architecture on this subject:

<quote>

How do I load LOBs?

We will now consider some methods for loading into LOBs. This is not a LONG or LONG RAW field, but rather the preferred datatypes of BLOB and CLOB. These datatypes were introduced in Oracle 8.0 and later, and they support a much richer interface/set of functionality than the legacy LONG and LONG RAW types, as discussed in Chapter 12.

We will investigate two methods for loading these fields: SQLLDR and PL/SQL. Others exist, such as Java streams, Pro*C, and OCI. We will begin working with the PL/SQL method of loading LOBs, and then we’ll look at using SQLLDR to load them as well.

Loading a LOB via PL/SQL

The DBMS_LOB package has entry points called LoadFromFile, LoadBLOBFromFile, and LoadCLOBFromFile. These procedures allow us to use a BFILE (which can be used to read operating system files) to populate a BLOB or CLOB in the database. There is not a significant difference between the LoadFromFile and LoadBLOBFromFile routines, other than the latter returns OUT parameters that indicate how far into the BLOB column we have loaded data. The LoadCLOBFromFile routine, however, provides a significant feature: character set conversion. If you recall, in Chapter 12 we discussed some of the National Language Support (NLS) features of the Oracle database and the importance of character sets. LoadCLOBFromFile allows us to tell the database that the file it is about to load is in a character set different from the one the database is using, and that it should perform the required character set conversion. For example, you may have a UTF8-compatible database, but the files received to be loaded are encoded in the WE8ISO8859P1 character set, or vice versa. This function allows you to successfully load these files.

Note    For complete details on the procedures available in the DBMS_LOB package and their full set of inputs and outputs, please refer to the Oracle9i Oracle Supplied Packages Guide and the Oracle 10g Oracle PL/SQL Packages and Types Reference.

To use these procedures, we will need to create a DIRECTORY object in the database. This object will allow us to create BFILES (and open them) that point to a file existing on the file system that the database server has access to. This last phrase, “that the database server has access to,” is a key point when using PL/SQL to load LOBs. The DBMS_LOB package executes entirely in the server. It can see only the file systems the server can see. It cannot, in particular, see your local file system if you are accessing Oracle over the network. 
So we need to begin by creating a DIRECTORY object in the database. This is a straightforward process. We will create two directories for this example (note that these examples are executed in a UNIX environment; you will use the syntax for referring to directories that is appropriate for your operating system):

ops$tkyte@ORA10G> create or replace directory dir1   as '/tmp/';
Directory created.
 
ops$tkyte@ORA10G> create or replace directory "dir2" as '/tmp/';
Directory created.

Note    Oracle DIRECTORY objects are logical directories, meaning they are pointers to existing, physical directories in your operating system. The CREATE DIRECTORY command does not actually create a directory in the file system—you must perform that operation separately.

The user who performs this operation needs to have the CREATE ANY DIRECTORY privilege. The reason we create two directories is to demonstrate a common case-related (“case” as in uppercase versus lowercase characters) issue with regard to DIRECTORY objects. When Oracle created the first directory DIR1, it stored the object name in uppercase as it is the default. In the second example with dir2, it will have created the DIRECTORY object preserving the case we used in the name. The importance of this will be demonstrated shortly when we use the BFILE object.

Now, we want to load some data into either a BLOB or a CLOB. The method for doing so is rather easy, for example:

ops$tkyte@ORA10G> create table demo
  2  ( id        int primary key,
  3    theClob   clob
  4  )
  5  /
Table created.
 
ops$tkyte@ORA10G> host echo 'Hello World!' > /tmp/test.txt
 
ops$tkyte@ORA10G> declare
  2      l_clob    clob;
  3      l_bfile   bfile;
  4  begin
  5      insert into demo values ( 1, empty_clob() )
  6       returning theclob into l_clob;
  7
  8      l_bfile := bfilename( 'DIR1', 'test.txt' );
  9      dbms_lob.fileopen( l_bfile );
 10
 11      dbms_lob.loadfromfile( l_clob, l_bfile,
 12                             dbms_lob.getlength( l_bfile ) );
 13
 14      dbms_lob.fileclose( l_bfile );
 15  end;
 16   /
PL/SQL procedure successfully completed.
 
ops$tkyte@ORA10G> select dbms_lob.getlength(theClob), theClob from demo
  2  /
DBMS_LOB.GETLENGTH(THECLOB) THECLOB
--------------------------- ---------------
                         13 Hello World!

Walking through the preceding code we see
    *    On lines 5 and 6, we create a row in our table, set the CLOB to an EMPTY_CLOB(), and retrieve its value in one call. With the exception of temporary LOBs, LOBs “live” in the database—we cannot write to a LOB variable without having a pointer to either a temporary LOB or a LOB that is already in the database. An EMPTY_CLOB() is not a NULL CLOB; it is a valid non-NULL pointer to an empty structure. The other thing this did for us was to get a LOB locator, which points to data in a row that is locked. If we were to have selected this value out without locking the underlying row, our attempts to write to it would fail because LOBs must be locked prior to writing (unlike other structured data). By inserting the row, we have, of course, locked the row. If we were modifying an existing row instead of inserting,  we would have used SELECT FOR UPDATE to retrieve and lock row.
    *    On line 8, we create a BFILE object. Note how we use DIR1 in uppercase—this is key, as we will see in a moment. This is because we are passing to BFILENAME() the name of an object, not the object itself. Therefore, we must ensure the name matches the case Oracle has stored for this object.
    *    On line 9, we open the LOB. This will allow us to read it.
    *    On lines 11 and 12, we load the entire contents of the operating system file /tmp/test.txt into the LOB locator we just inserted. We use DBMS_LOB.GETLENGTH() to tell the LOADFROMFILE() routine how many bytes of the BFILE to load (all of them).
    *    Lastly, on line 14, we close the BFILE we opened, and the CLOB is loaded.

If we had attempted to use dir1 instead of DIR1 in the preceding example, we would have encountered the following error:

ops$tkyte@ORA10G> declare
…
  6       returning theclob into l_clob;
  7
  8      l_bfile := bfilename( 'dir1', 'test.txt' );
  9      dbms_lob.fileopen( l_bfile );
…
 15  end;
 16  /
declare
*
ERROR at line 1:
ORA-22285: non-existent directory or file for FILEOPEN operation
ORA-06512: at "SYS.DBMS_LOB", line 523
ORA-06512: at line 9
This is because the directory dir1 does not exist—DIR1 does. If you prefer to use directory names in mixed case, you should use quoted identifiers when creating them as we did for dir2. This will allow you to write code as shown here:
ops$tkyte@ORA10G> declare
  2      l_clob    clob;
  3      l_bfile   bfile;
  4  begin
  5      insert into demo values ( 1, empty_clob() )
  6       returning theclob into l_clob;
  7
  8      l_bfile := bfilename( 'dir2', 'test.txt' );
  9      dbms_lob.fileopen( l_bfile );
 10
 11      dbms_lob.loadfromfile( l_clob, l_bfile,
 12                             dbms_lob.getlength( l_bfile ) );
 13
 14      dbms_lob.fileclose( l_bfile );
 15  end;
 16  /
PL/SQL procedure successfully completed.
There are methods other than the load from file routines by which you can populate a LOB using PL/SQL. Using DBMS_LOB and its supplied routines is by far the easiest if you are going to load the entire file. If you need to process the contents of the file while loading it, you may also use DBMS_LOB.READ on the BFILE to read the data. The use of UTL_RAW.CAST_TO_VARCHAR2 is handy here if the data you are reading is in fact text, not RAW. You may then use DBMS_LOB.WRITE or WRITEAPPEND to place the data into a CLOB or BLOB.
Loading LOB Data via SQLLDR
We will now investigate how to load data into a LOB via SQLLDR. There is more than one method for doing this, but we will investigate the two most common:
    *    When the data is “inline” with the rest of the data.
    *    When the data is stored out of line, and the input data contains a file name to be loaded with the row. These are also known as secondary data files (SDFs) in SQLLDR terminology.
We will start with data that is inline.
Loading LOB Data That Is Inline
These LOBs will typically have newlines and other special characters embedded in them. Therefore, you will almost always use one of the four methods detailed in the “How Do I Load Data with Embedded Newlines?” section to load this data. Let’s begin by modifying the DEPT table to have a CLOB instead of a big VARCHAR2 field for the COMMENTS column:
ops$tkyte@ORA10G> truncate table dept;
Table truncated.
 
ops$tkyte@ORA10G> alter table dept drop column comments;
Table altered.
 
ops$tkyte@ORA10G> alter table dept add comments clob;
Table altered.
For example, say we have a data file (demo.dat) that has the following contents:
10, Sales,Virginia,This is the Sales
Office in Virginia|
20,Accounting,Virginia,This is the Accounting
Office in Virginia|
30,Consulting,Virginia,This is the Consulting
Office in Virginia|
40,Finance,Virginia,"This is the Finance
Office in Virginia, it has embedded commas and is
much longer than the other comments field. If you
feel the need to add double quoted text in here like
this: ""You will need to double up those quotes!"" to
preserve them in the string. This field keeps going for up to
1000000 bytes (because of the control file definition I used)
or until we hit the magic end of record marker,
the | followed by an end of line - it is right here ->"|
Each record ends with a pipe symbol (|), followed by the end-of-line marker. The text for department 40 is much longer than the rest, with many newlines, embedded quotes, and commas. Given this data file, we can create a control file such as this:
LOAD DATA
INFILE demo.dat "str X'7C0A'"
INTO TABLE DEPT
REPLACE
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(DEPTNO,
  DNAME        "upper(:dname)",
  LOC          "upper(:loc)",
  COMMENTS     char(1000000)
)
Note    This example is from UNIX, where the end-of-line marker is 1 byte, hence the STR setting in the preceding control file. On Windows, it would have to be '7C0D0A'.
To load the data file, we specify CHAR(1000000) on column COMMENTS since SQLLDR defaults to CHAR(255) for any input field. The CHAR(1000000) will allow SQLLDR to handle up to 1,000,000 bytes of input text. You must set this to a value that is larger than any expected chunk of text in the input file. Reviewing the loaded data, we see the following:
ops$tkyte@ORA10G> select comments from dept;
 
COMMENTS
-------------------------------------------------------------------------------
This is the Consulting
Office in Virginia
 
This is the Finance
Office in Virginia, it has embedded commas and is
much longer than the other comments field. If you
feel the need to add double quoted text in here like
this: "You will need to double up those quotes!" to
preserve them in the string. This field keeps going for up to
1000000 bytes or until we hit the magic end of record marker,
the | followed by an end of line - it is right here ->
 
This is the Sales
Office in Virginia
 
This is the Accounting
Office in Virginia
The one thing to observe here is that the doubled-up quotes are no longer doubled up. SQLLDR removed the extra quotes we placed there.
Loading LOB Data That Is Out of Line
A common scenario is to have a data file that contains the names of files to load into the LOBs, instead of having the LOB data mixed in with the structured data. This offers a greater degree of flexibility, as the data file given to SQLLDR does not have to use one of the four methods to get around having embedded newlines in the input data, as would frequently happen with large amounts of text or binary data. SQLLDR calls this type of additional data file a LOBFILE.
SQLLDR can also support the loading of a structured data file that points to another, single data file. We can tell SQLLDR how to parse LOB data from this other file, so that each row in the structured data gets loaded with a piece of it. I find this mode to be of limited use (I’ve never found a use for it myself to date), and I will not discuss it here. SQLLDR refers to these externally referenced files as complex secondary data files.
LOBFILES are relatively simple data files aimed at facilitating LOB loading. The attribute that distinguishes LOBFILEs from the main data files is that in LOBFILEs, there is no concept of a record, hence newlines never get in the way. In LOBFILEs, the data is in any of the following formats:
    *    Fixed-length fields (e.g., load bytes 100 through 1000 from the LOBFILE)
    *    Delimited fields (terminated by something or enclosed by something)
    *    Length/value pairs, a variable-length field
The most common of these types is the delimited fields—ones that are terminated by an end-of-file (EOF), in fact. Typically, you have a directory full of files you would like to load into LOB columns, and each file in its entirety will go into a BLOB. The LOBFILE statement with TERMINATED BY EOF is what you will use.
So, let’s say we have a directory full of files we would like to load into the database. We would like to load the OWNER of the file, the TIME_STAMP of the file, the NAME of the file, and the file itself. Our table we would load into would be as follows:
ops$tkyte@ORA10G> create table lob_demo
  2  ( owner      varchar2(255),
  3    time_stamp date,
  4    filename   varchar2(255),
  5    data       blob
  6  )
  7  /
Table created.
Using a simple ls –l on UNIX, and dir /q /n on Windows, and capturing that output, we can generate our input file and load it using a control file such as this on UNIX:
LOAD DATA
INFILE *
REPLACE
INTO TABLE LOB_DEMO
( owner       position(17:25),
  time_stamp  position(44:55) date "Mon DD HH24:MI",
  filename    position(57:100),
  data        LOBFILE(filename) TERMINATED BY EOF
)
BEGINDATA
-rw-r--r--    1 tkyte    tkyte     1220342 Jun 17 15:26 classes12.zip
-rw-rw-r--    1 tkyte    tkyte          10 Jul 16 16:38 foo.sql
-rw-rw-r--    1 tkyte    tkyte         751 Jul 16 16:36 t.ctl
-rw-rw-r--    1 tkyte    tkyte         491 Jul 16 16:38 testa.sql
-rw-rw-r--    1 tkyte    tkyte         283 Jul 16 16:38 testb.sql
-rw-rw-r--    1 tkyte    tkyte         231 Jul 16 16:38 test.sh
-rw-rw-r--    1 tkyte    tkyte         235 Apr 28 18:03 test.sql
-rw-rw-r--    1 tkyte    tkyte        1649 Jul 16 16:36 t.log
-rw-rw-r--    1 tkyte    tkyte        1292 Jul 16 16:38 uselast.sql
-rw-rw-r--    1 tkyte    tkyte         909 Jul 16 16:38 userbs.sql
Now, if we inspect the contents of the LOB_DEMO table after running SQLLDR, we will discover the following:
ops$tkyte@ORA10G> select owner, time_stamp, filename, dbms_lob.getlength(data)
  2  from lob_demo
  3  /
 
OWNER    TIME_STAM FILENAME       DBMS_LOB.GETLENGTH(DATA)
-------- --------- -------------- ------------------------
tkyte    17-JUN-05 classes12.zip                   1220342
tkyte    16-JUL-05 foo.sql                              10
tkyte    16-JUL-05 t.ctl                               875
tkyte    16-JUL-05 testa.sql                           491
tkyte    16-JUL-05 testb.sql                           283
tkyte    16-JUL-05 test.sh                             231
tkyte    28-APR-05 test.sql                            235
tkyte    16-JUL-05 t.log                                 0
tkyte    16-JUL-05 uselast.sql                        1292
tkyte    16-JUL-05 userbs.sql                          909
 
10 rows selected.
This works with CLOBs as well as BLOBs. Loading a directory of text files using SQLLDR in this fashion is easy.
</quote> 

Continuing on my previous post

Ray, July 08, 2006 - 3:18 pm UTC

Hi Tom, sorry, I forgot to mention the structure of my target table.

Create table test (contract_no number(10)
,synopsis clob);

Sorry for the inconvenience.

Regards
Ray


problem in sql loader

Chandana, July 18, 2006 - 6:54 am UTC

hi tom,

I want to insert NULL into the DATE column. If the data file contains a date then it has to be populated; otherwise just NULL should go into the column.

the control file is
LOAD DATA
REPLACE
INTO TABLE TEST_LDR
(COL1 POSITION(1:8) DATE 'RRRRMMDD')

the data file is
20060706
20060708
20060709

20060711
20060712

the table test_ldr contains only one column with DATE datatype.

When I'm running this, the log file contains the following info:

Record 4: Discarded - all columns null.
Table TEST_LDR:
5 Rows successfully loaded.
0 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
1 Row not loaded because all fields were null.

Total logical records skipped: 0
Total logical records read: 6
Total logical records rejected: 0
Total logical records discarded: 1

Hope you help me soon.

Thanks,
Chandana

Tom Kyte
July 18, 2006 - 8:40 am UTC

not sure that sqlldr will be the tool for you then. Not sure what use a record with all nulls would be (where is the primary key for this table at least??)

Check Date values in SQL Loader

Madhava Reddy, July 18, 2006 - 4:29 pm UTC

Hi Tom,
We have a job that loads data from a flat file using SQL*Loader. The date values are being loaded into varchar2 columns and then the date columns are being updated after validating the data in the corresponding varchar2 columns. We are calling a function to do this (the function just does to_date(p_date,'RRMMDD') and returns NULL if the value is not a valid date). I am trying to migrate this conversion to the control file. But the issue I have is that, if the value for the date column is not a valid date, then the whole record gets rejected. Instead, is there a way to load a null value into the date column in such situations?

Tom Kyte
July 19, 2006 - 8:56 am UTC

....
that_column "your_function(:that_column,'rrmmdd')",
....


just have sqlldr's control file reference your function to convert the date. Your function doesn't fail so the record won't fail.
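
For completeness, a minimal sketch of the kind of function being described (safe_to_date is a made-up name; the questioner already has an equivalent):

create or replace function safe_to_date( p_str in varchar2, p_fmt in varchar2 )
return date
as
begin
    return to_date( p_str, p_fmt );
exception
    when others then
        -- any conversion error: return NULL instead of letting the record be rejected
        return null;
end;
/

and then in the control file:

that_column "safe_to_date(:that_column,'rrmmdd')",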

loader

Ranjeet kumar bhatia, July 19, 2006 - 1:44 am UTC

thanx

How to stop Developers from using Sql Loader from their PC

A reader, July 19, 2006 - 3:57 pm UTC

We have some developers that are using SQL*Loader from their PCs, which management doesn't want. Is there any way I can stop it (by using a database-level trigger, etc.)?

Thanks in advance

Tom Kyte
July 22, 2006 - 4:19 pm UTC

not really - laughing out loud. why not? All sqlldr does is "insert".

I guess really the answer would be "revoke insert from those that should not have insert"

sqlldr isn't "magic", you could do in excel literally what they are using sqlldr for, or APEX, or sqlplus, or toad - or - since they are developers - they could write a program.




sql loader and clob, sort of

A reader, July 20, 2006 - 6:31 pm UTC

Hi

I have a table with clobs

The data file comes as follows

select 1 from dual;12312.33;32.33,22

the first field is the clob; it can have from 20 to 25,000 characters

my control file is

LOAD DATA
INFILE tkprof.dat
INTO TABLE TKPROF_TABLE
replace
FIELDS TERMINATED BY ";"
(
SQL_STATEMENT CHAR,
PARSE_CNT DECIMAL EXTERNAL,
PARSE_CPU DECIMAL EXTERNAL
)

I am getting errors for those records with a huge field 1 (the query)

I tried to change

SQL_STATEMENT CHAR,
to
SQL_STATEMENT CLOB,

No luck, it does not work. I wonder how I can load this? Is it a must to use LOBFILE in my situation?


Tom Kyte
July 22, 2006 - 5:31 pm UTC

sql_statement char(25000),
...

is what you need to do - otherwise the char defaults to 255

Using Sequence and direct path load

Jamu, July 20, 2006 - 10:32 pm UTC

a. Can we use sequence and also do a DIRECT load through Sql*loader? If no, is there a good alternative to get the sequence in the table other than conventional path that is taking quite some time (3 minutes for a million records vs 35 secs through direct method).


Though not related to this thread, can you still reply to this second question ?

I have a procedure:

begin
execute immediate 'alter session enable parallel dml';

insert /* + append */ into t ...select .. from t2

Will the append hint benefit from the parallel dml? Also is it required to commit after insert to do a select from table t?

Thanks so much.

Tom Kyte
July 22, 2006 - 5:41 pm UTC

a) yes you can, insert /*+ APPEND */ from an external table (which is a direct path load) can access all things sql easily.


append does not "need" parallel - it'll direct path load if it can.

? on SQL*LOADER, sequence and multi-table INTO

Duke Ganote, August 10, 2006 - 3:02 pm UTC

I'm following up on this
http://asktom.oracle.com/pls/asktom/f?p=100:11:::::P11_QUESTION_ID:2976181076629#14453815375721
but I don't understand why the sequence CURRVAL isn't functioning consistently.  Am I missing something obvious?

<b>DATABASE OBJECTS:</b>
drop table t1;
drop table t2;
drop sequence seq#_generator;
create table t1 ( seq# number, payload varchar2(20) );
create table t2 ( seq# number, seq_dt date );
create sequence seq#_generator;

<b>CONTROL FILE:</b>

LOAD DATA
INFILE *
APPEND
INTO TABLE T2
( seq#     EXPRESSION "seq#_generator.NEXTVAL"
, seq_dt   SYSDATE 
)
INTO TABLE T1
( PAYLOAD  CHAR(10)
, seq#     EXPRESSION "seq#_generator.CURRVAL"
)
--3456789 123456789 
--       1         2
BEGINDATA
0101 31601
0101I24200
0101000000
0101000000

<b>RUN OF SQL*LOADER:</b>

C:\>sqlldr userID/password@database control_file.ctl

SQL*Loader: Release 10.2.0.1.0 - Production on Thu Aug 10 14:43:43 2006

Copyright (c) 1982, 2005, Oracle.  All rights reserved.

Commit point reached - logical record count 3
Commit point reached - logical record count 4

<b>RESULTS:</b>

SQL> select * from t2;

      SEQ# SEQ_DT
---------- ---------
         1 10-AUG-06
         2 10-AUG-06
         3 10-AUG-06
         4 10-AUG-06

<b>looks good, but I don't understand this...</b>

SQL> select * from t1;

      SEQ# PAYLOAD
---------- --------------------
         3 0101 31601
         3 0101I24200
         3 0101000000
         4 0101000000
 

Never mind... known bug: CURRVAL, NEXTVAL NOT WORKING IN SQL*LOADER

Duke Ganote, August 10, 2006 - 5:16 pm UTC

Once I looked at the data, I realized it was CURRVAL that wasn't working, and that let me do a more effective search in metalink. NOTE 1031704.6

CURRVAL Workaround

Duke Ganote, August 11, 2006 - 9:04 am UTC

Metalink's proposed workaround doesn't seem very scalable, but fortunately I'm only inducting a batch of 15,000 records daily.

"Setting the row=1 or bindsize=recordlength forces Sql*Loader to process 1 row at a time, and hence keeps the nextval/currval sequence correct for each record."

sequence in sql*loader

A reader, September 30, 2006 - 3:13 pm UTC

Hi

I have this contro file

LOAD DATA
INFILE SQL_PIECE.dat
INTO TABLE SQL_PIECE
APPEND
FIELDS TERMINATED BY '@'
(
ID_SQL SQL_PIECE_SEQ1.NEXTVAL,
PROG_NAME CHAR,
PROG_ID_SQL DECIMAL EXTERNAL,
CLOB_FILE FILLER CHAR(80),
SQL_TEXT LOBFILE (CLOB_FILE) terminated by eof
)

It keeps on saying incorrect syntax. I know it is the sequence part which is giving the error; however, I can't find anywhere in the documentation how to load a sequence using my control file format.

Can you throw some light on this?

Cheers

Tom Kyte
October 01, 2006 - 12:46 am UTC

"sql_piece_seq1.nextval"

double quote sql to be invoked.
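
One hedged way to write that: the line below combines Tom's fix (double-quoted SQL) with the EXPRESSION keyword shown in an earlier follow-up on this page, so the column does not also consume one of the '@'-delimited input fields. Confirm EXPRESSION is available in your release before relying on it.

ID_SQL EXPRESSION "sql_piece_seq1.nextval",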

Partitioned column value in sql loader

raajesh, October 03, 2006 - 2:15 pm UTC

Hi Tom,

I have a table like this

DeptID PrimaryKey
DeptName
PersonName
Counter

Now, in my CSV file, I have the values for DeptID, DeptName, PersonName.

I need to set the value for counter, that is partitioned by Dept Name. For example

Dept ID DeptName PersonName Counter
1 A A1 1
2 A A2 2
3 B B1 1
4 B B2 2

How to get this done in the control file itself?

Can Analytic functions be used for the same? I tried using ROW_NUMBER but got an exception that

"window functions cannot be used here".

Please clarify.

Thanks.
Raajesh

Tom Kyte
October 03, 2006 - 4:53 pm UTC

what is counter? what is the LOGIC behind counter. Your example is rather simple and begs more questions than it provides answers.

When records in a line change

Rory Concepcion, October 03, 2006 - 9:45 pm UTC

Hi Tom,

I have a question.
What if my flat file looked like this.

200609191041,PDT,myhphost,ORCL2,D,CO,20060919104000,S,tci_myhphost_ng,TCI_DEV_UNIX_ORACLE,dbdaily,NBUBP,2,100,10000,102,ORCL2_myhphost_20060919104000
200609191200,PDT,myhphost,ORCL2,D,CO,20060919104000,E,tci_myhphost_ng,TCI_DEV_UNIX_ORACLE,dbdaily,NBUBP,2,100,10000,102,ORCL2_myhphost_20060919104000
200609191200,PDT,myhphost,ORCL2,D,20060919104000,F,tci_myhphost_ng,NETBACKUP,Netbackup exit code:23

I would be inserting records of a line in table t1 if 8th record is "S" or "E" and will be inserting them to table t2 if it is "F".
My problem is, the structure of the lines is only consistent when 8th record is "S" or "E". But when it is "F" (where it is now in 7th place), the whole line changes. My 6th record changes causing the move of the 8th record.
Can I do this in sqlloader?
Can I also do this column changes in external tables?
Please show example based from data above.
Thanks.

Tom Kyte
October 04, 2006 - 7:01 am UTC

do you mean "8th and 7th column/attribute????"

I would use an external table and a simple when clause to insert into one or the other table

when (col8 in ( 'S', 'E' ) )
when (col7 = 'F' )


Like this:

insert
when (deptno=10) then
into emp_10 ( empno, ename, job, mgr, hiredate, sal, comm, deptno )
values ( empno, ename, job, mgr, hiredate, sal, comm, deptno )
when (deptno=20) then
into emp_20 ( empno, ename, job, mgr, hiredate, sal, comm, deptno )
values ( empno, ename, job, mgr, hiredate, sal, comm, deptno )
when (deptno=30) then
into emp_30 ( empno, ename, job, mgr, hiredate, sal, comm, deptno )
values ( empno, ename, job, mgr, hiredate, sal, comm, deptno )
else
into leftover ( empno, ename, job, mgr, hiredate, sal, comm, deptno )
values ( empno, ename, job, mgr, hiredate, sal, comm, deptno )
select * from external_table e
/


Explanations

Raajesh, October 04, 2006 - 3:21 am UTC

Hi Tom,

Sorry if my example was poor.

The counter is a field, that is partitioned by DeptName. In normal SQL it is like this

ROW_NUMBER () over (Partition by DeptName)

In one of your follow-ups you mentioned that it is not possible. That was way back in 2003. I would like to know if it is possible now, with the newer versions.

Regards
Raajesh

Tom Kyte
October 04, 2006 - 7:08 am UTC

use an external table:

SQL> create table emp as select * from scott.emp;

Table created.

SQL> create table e1 as select * from emp where 1=0;

Table created.

SQL> create table e2 as select * from emp where 1=0;

Table created.

SQL>
SQL> insert
  2    when (rn=1) then
  3      into e1 ( empno, ename, job, mgr, hiredate, sal, comm, deptno )
  4      values ( empno, ename, job, mgr, hiredate, sal, comm, deptno )
  5    else
  6      into e2 ( empno, ename, job, mgr, hiredate, sal, comm, deptno )
  7           values ( empno, ename, job, mgr, hiredate, sal, comm, deptno )
  8  select e.*, row_number() over (partition by deptno order by ename)rn from emp e
  9  /

14 rows created.

SQL> select deptno, ename from e1 order by deptno;

DEPTNO ENAME
------ ----------
    10 CLARK
    20 ADAMS
    30 ALLEN

SQL> select deptno, ename from e2 order by deptno;

DEPTNO ENAME
------ ----------
    10 KING
    10 MILLER
    20 SCOTT
    20 JONES
    20 FORD
    20 SMITH
    30 BLAKE
    30 JAMES
    30 MARTIN
    30 TURNER
    30 WARD

11 rows selected.


In this example "emp" was pretending to be an external table. 

Thank you

Raajesh, October 04, 2006 - 1:09 pm UTC

Thanks Tom. The example was indeed a lot of help. I was able to achieve this using external tables.

Regards
Raajesh

When records in a line change

Rory Concepcion, October 04, 2006 - 6:54 pm UTC

Thanks a lot Tom.
c",)

Discard and use of when clause in Sql Loader

A reader, October 23, 2006 - 11:57 am UTC

Tom,

Recently I called sqlloader from a ksh script. The SQL*Loader control file contains a WHEN clause which excludes a few records from the input file.

In that case the return code is not 0, its value is 2. And since my ksh script was testing that the return code should be <> '0', the script went into error.

Do you mention this in your new book? Or is it a bug for Oracle 8.1.7.4 only?

Thanks for your helpful web site

Tom Kyte
October 23, 2006 - 12:47 pm UTC

http://docs.oracle.com/docs/cd/A87860_01/doc/server.817/a76955/ch06.htm#1956

it is documented to exit with 2 (warning) when records are discarded.

you care if it exits with 1 or 3
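
For example, a hedged ksh sketch of checking those codes (credentials and file names below are placeholders; on Unix, sqlldr exits with 0 for success, 1 for failure, 2 for warning, 3 for fatal):

sqlldr userid=scott/tiger control=load.ctl data=load.dat
rc=$?
case $rc in
  0) echo "load succeeded" ;;
  2) echo "load succeeded with warnings (e.g. rows discarded by a WHEN clause)" ;;
  1|3) echo "load failed with exit code $rc" >&2
       exit $rc ;;
esac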

How to load filename into table from sqlldr

A reader, March 13, 2007 - 6:39 am UTC

Hi Tom,
My requirement is to load all the files in a directory (all the files have only one column). I have to load the contents along with the file name. My table structure is temp(file_name varchar2(100), text varchar2(1000)). I need to populate the file name at the same time as I load the contents of the file. Is it possible to load the file name? Please help me.
Tom Kyte
March 13, 2007 - 11:30 am UTC

write a script

script will do "ls" or "dir" or whatever magic command lists files

script will format this data into a data file to be loaded by sqlldr (as sqlldr needs to be told the name of the files to be loaded)

so, sure, once you have a file full of filenames - you can use sqlldr to load

a) the file itself
b) the name of the file
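
A minimal sketch of such a script, assuming the staging table is the temp(file_name, text) table described above and that the input files match *.txt with one value per line (file names and credentials are placeholders):

#!/bin/ksh
# prefix every line of every *.txt file with the file's own name, then load the combined file
for f in *.txt
do
   awk -v fn="$f" '{ print fn "|" $0 }' "$f"
done > combined.dat

sqlldr userid=scott/tiger control=temp.ctl data=combined.dat

with a control file temp.ctl along these lines:

LOAD DATA
INFILE 'combined.dat'
APPEND
INTO TABLE temp
FIELDS TERMINATED BY '|'
( file_name, text )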

Date Formate Of SQL loader to import data

Nazmul Hoque, August 01, 2007 - 6:45 am UTC

Hi Tom,

I have a problem inserting a date into a table from text data with SQL*Loader. Please advise; details are as under. I am using 8.1.7:

my text data sample :
01D000:8>>0023236490----0707286:084604459I*G

ctl file is :
---------
LOAD data
infile 'F:\softwon\hrm-SDL\DATA.txt'
into table TEMPINDATA
(empid position(17:21),
ATTEND_DATE position(27:32) date "yymmdd",
IN_HOUR position(34:35),
IN_MIN position(36:37),
cardno position(17:21))
----------

and getting eror for the date as below :


SQL*Loader: Release 8.1.7.0.0 - Production on Wed Aug 1 16:24:45 2007

(c) Copyright 2000 Oracle Corporation. All rights reserved.

Control File: pdata.ctl
Data File: F:\softwon\hrm-SDL\DATA.txt
Bad File: DATA.bad
Discard File: none specified

(Allow all discards)

Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 65536 bytes
Continuation: none specified
Path used: Conventional

Table TEMPINDATA, loaded from every logical record.
Insert option in effect for this table: INSERT

Column Name Position Len Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
EMPID 17:21 5 CHARACTER
ATTEND_DATE 27:32 6 DATE yymmdd
IN_HOUR 34:35 2 CHARACTER
IN_MIN 36:37 2 CHARACTER
CARDNO 17:21 5 CHARACTER

Record 1: Rejected - Error on table TEMPINDATA, column ATTEND_DATE.
ORA-01843: not a valid month

Records 2 through 50: Rejected - Error on table TEMPINDATA, column ATTEND_DATE.
ORA-01843: not a valid month


MAXIMUM ERROR COUNT EXCEEDED - Above statistics reflect partial run.

Table TEMPINDATA:
0 Rows successfully loaded.
51 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.


Space allocated for bind array: 2048 bytes(64 rows)
Space allocated for memory besides bind array: 0 bytes

Total logical records skipped: 0
Total logical records read: 64
Total logical records rejected: 51
Total logical records discarded: 0

Run began on Wed Aug 01 16:24:45 2007
Run ended on Wed Aug 01 16:24:46 2007

Elapsed time was: 00:00:00.27
CPU time was: 00:00:00.03

Tom Kyte
August 05, 2007 - 9:58 am UTC

sorry, the binary data you pasted is unusable by me (and there is no create table)

suggestion:

create a table
use varchar2(4000) for all types
load into that table

see what is getting loaded (e.g. use normal debugging/diagnosing techniques to investigate what is happening)
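
For example, a minimal debugging sketch along those lines (the staging table name is made up; the positions are copied from your control file):

create table tempindata_stage
( empid       varchar2(4000),
  attend_date varchar2(4000),
  in_hour     varchar2(4000),
  in_min      varchar2(4000),
  cardno      varchar2(4000)
);

LOAD DATA
infile 'F:\softwon\hrm-SDL\DATA.txt'
into table TEMPINDATA_STAGE
( empid       position(17:21),
  attend_date position(27:32),
  in_hour     position(34:35),
  in_min      position(36:37),
  cardno      position(17:21)
)

select attend_date from tempindata_stage where rownum <= 5;

If the strings you see are not really in YYMMDD form, adjust the positions or the date mask accordingly.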

ORA-01401: inserted value too large for column

chitta, October 03, 2007 - 10:09 am UTC

Hi Tom,
I am using sqlloader to load data to our Oracle 9i database.
In my table a column has datatype varchar2(17), but the data file has 20-character values for that column, so I got the error ORA-01401: inserted value too large for column. How can I overcome the error without altering my column size?
Is it possible to restrict the value of the column to 17 characters in the loading script while loading the data?

Please advise.

A reader, October 07, 2007 - 12:47 am UTC

Please try using the SUBSTR function to load only the first 17 characters while loading the data, as sketched below...
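
A minimal control-file sketch of that idea (the table, column and file names are placeholders; SQL*Loader applies the quoted SQL expression before inserting):

LOAD DATA
INFILE 'data.txt'
APPEND
INTO TABLE t
FIELDS TERMINATED BY ','
( id,
  short_col "substr(:short_col, 1, 17)"
)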

Reader, February 16, 2010 - 11:50 am UTC

Tom,

I need to load the following using SQL loader to a varchar2 field -
Börse

Can you please let me know what I am supposed to do?

--table
create table tst
(id number,
description varchar2(400));


--tst.dat
1]öabc]

--ctl file; tst.ctl

load data
INFILE 'c:\tst.dat'
BADFILE 'c:\test.bad'
DISCARDFILE 'c:\test.dis'
APPEND
INTO TABLE tst
fields terminated by "]"
optionally enclosed by '"'
TRAILING NULLCOLS
( ID
, DESCRIPTION char(256)
)




Tom Kyte
February 17, 2010 - 8:10 am UTC

what problem are you having exactly?

Problem in loading sequence

Sattam, February 22, 2010 - 2:55 pm UTC

Hi Tom,

I am using SQL*Loader: Release 10.2.0.1.0
I was not able to load a sequence from SQL*Loader doing exactly what is shown here. Then I replicated your example and loaded the table t. Please see the result:

ID A B C D
1 abc def efg 3 4
2 fff degf efg 7 2
Thanks in advance,
Sattam


Tom Kyte
March 01, 2010 - 7:08 am UTC

big page here. Not sure what you used "as shown here" to test with.

How about the full test case and details on what you are trying to accomplish?



The only example I see that looks relevant was:

ops$tkyte%ORA10GR2> create table t ( id int, a int, b varchar2(15), c int, d int );

Table created.

ops$tkyte%ORA10GR2> create sequence S;

Sequence created.

ops$tkyte%ORA10GR2> !cat t.ctl
load data
INFILE *
INTO TABLE t
APPEND
FIELDS TERMINATED BY ',' trailing nullcols
(a,b,c,d, ID "s.nextval")
begindata
1,abc def efg,3,4
2,fff degf efg,7,2

ops$tkyte%ORA10GR2> !sqlldr / t

SQL*Loader: Release 10.2.0.4.0 - Production on Mon Mar 1 08:06:58 2010

Copyright (c) 1982, 2007, Oracle.  All rights reserved.

Commit point reached - logical record count 2

ops$tkyte%ORA10GR2>
ops$tkyte%ORA10GR2> select * from t;

        ID          A B                        C          D
---------- ---------- --------------- ---------- ----------
         1          1 abc def efg              3          4
         2          2 fff degf efg             7          2



and that looks pretty darn "the same as"

ops$tkyte@ORA9IR2> !sqlldr / t
 
SQL*Loader: Release 9.2.0.4.0 - Production on Fri Dec 19 07:24:18 2003
 
Copyright (c) 1982, 2002, Oracle Corporation.  All rights reserved.
 
Commit point reached - logical record count 2
 
ops$tkyte@ORA9IR2> select * from t;
 
        ID          A B                        C          D
---------- ---------- --------------- ---------- ----------
         1          1 abc def efg              3          4
         2          2 fff degf efg             7          2
 


which both look like yours????? Well, you seem to be missing a column - but that is likely your mistake - you don't show your work.

Loading .txt file data in non sequence column order using sqlloader

Raghavendra, April 30, 2010 - 9:00 am UTC

Hi Tom,
I saw your responses in this site. So nice.
Currently I need your help on loading the .txt file into oracle table using sqlloader.
1>> I have input file as .txt like below
columns---- a|b|c|d|
Nov 20 2009 23:09:34:234PM|0.0| |343.927834
Nov 20 2009 23:09:34:234PM|23.45| |344.232334
2>> Loading the data in oracle like below table
Table structure:
create table RTS_TRD(id number(20),column_3 varchar2(200),a date,coulmn_1 varchar2(200),c float,d varchar2(200),b float,column_4 varchar2(200));
Issue 1: Before loading, I need to convert the date to "MM/DD/YYYY HH24:MI:SS" format for column "a". I successfully converted it to the required date format.
Issue 2: I need only four digits after the decimal point, so how can I truncate the float value for columns "b" and "d"? Ex: 343.927834,
after truncate: 343.9278
Issue 3: How do I load the data into table "RTS_TRD" in the exact positions of columns "a,b,c,d"?

Tom Kyte
April 30, 2010 - 9:17 am UTC

ops$tkyte%ORA11GR2> !cat t.ctl
LOAD DATA
INFILE *
INTO TABLE rts_trd
REPLACE
FIELDS TERMINATED BY '|'
trailing nullcols
(
a "to_date( substr( :a, 1, 20 ), 'Mon DD YYYY HH24:MI:SS' )",
b "trunc(to_number(trim(:b)),4)",
c "trim(:c)",
d "trunc(to_number(trim(:d)),4)"
)
BEGINDATA
Nov 20 2009 23:09:34:234PM|0.0| |343.927834
Nov 20 2009 23:09:34:234PM|23.45| |344.232334

ops$tkyte%ORA11GR2> desc rts_trd
 Name                                                  Null?    Type
 ----------------------------------------------------- -------- ------------------------------------
 ID                                                             NUMBER(20)
 COLUMN_3                                                       VARCHAR2(200)
 A                                                              DATE
 COULMN_1                                                       VARCHAR2(200)
 C                                                              NUMBER
 D                                                              VARCHAR2(200)
 B                                                              FLOAT(126)
 COLUMN_4                                                       VARCHAR2(200)

ops$tkyte%ORA11GR2> !sqlldr / t

SQL*Loader: Release 11.2.0.1.0 - Production on Fri Apr 30 10:17:35 2010

Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.

Commit point reached - logical record count 2

ops$tkyte%ORA11GR2> select * from rts_trd
  2  /

        ID C A                    C          C D                   B C
---------- - -------------------- - ---------- ---------- ---------- -
             20-nov-2009 23:09:34              343.9278            0
             20-nov-2009 23:09:34              344.2323        23.45


constant data and date loading using sqlloader

Raghavendra, May 03, 2010 - 6:22 am UTC

Thank you very much for your reply.
Can you please suggest on my request.
Target table like below :
CREATE TABLE MARKS_TRD_TRADE_DATA_TMP
(
MARKS_TRD_TRADE_DATA_ID NUMBER NOT NULL,
PORTFOLIO_CODE VARCHAR2(256 BYTE) NOT NULL,
HEDGE_ID VARCHAR2(200 BYTE),
CONTRACT_ALLOC VARCHAR2(200 BYTE),
BROKER_COMM FLOAT(126),
ALLOC_PERCENTAGE FLOAT(126),
ALLOC_TIME DATE,
EXEC_BROKER_COMM FLOAT(126),
CLEARING_BROKER_COMM FLOAT(126),
APPROVAL_ID VARCHAR2(200 BYTE),
APPROVAL_DATE DATE,
TRADE_STATUS VARCHAR2(200 BYTE),
TRADE_STAGE VARCHAR2(200 BYTE),
EXTERNAL_DESTN VARCHAR2(200 BYTE),
SETTLE_DATE DATE,
TRANS_REF VARCHAR2(200 BYTE),
BUY_SELL_IND VARCHAR2(200 BYTE),
TRADE_EXEC_DATE DATE NOT NULL,
TRADE_EXEC_TIME DATE,
MARKET_EXCHANGE_CODE VARCHAR2(200 BYTE),
CONTRACT_ID VARCHAR2(200 BYTE),
COUNTRY_CODE VARCHAR2(200 BYTE),
CONTRACT_NAME VARCHAR2(200 BYTE),
INSTRUMENT_TYPE VARCHAR2(200 BYTE),
PUT_CALL_IND VARCHAR2(200 BYTE),
QUOTED_CCY VARCHAR2(200 BYTE),
CONTRACT_SIZE VARCHAR2(200 BYTE),
EXEC_BROKER_CODE VARCHAR2(200 BYTE),
EXEC_BROKER_NAME VARCHAR2(200 BYTE),
CLEARING_BROKER_CODE VARCHAR2(200 BYTE),
CLEARING_BROKER_NAME VARCHAR2(200 BYTE),
CONTRACT_PRICE FLOAT(126),
CASH_FLOW_AMOUNT FLOAT(126),
SETTLE_AMOUNT FLOAT(126),
CANCEL_IND VARCHAR2(200 BYTE),
CANCEL_DATE DATE,
TRADER_ID VARCHAR2(200 BYTE),
POOL_NUMBER VARCHAR2(200 BYTE),
TRANSMISSION_DATE DATE,
BROKER_CODE VARCHAR2(200 BYTE),
ENTITY_ID VARCHAR2(200 BYTE),
STATUS_TIME DATE,
CUSTODIAN_STATUS VARCHAR2(200 BYTE),
CUSTODIAN_CODE VARCHAR2(200 BYTE),
BACKOFFICE_STATUS VARCHAR2(200 BYTE),
ENTRY_DATE DATE,
EFFECT_DATE DATE,
WITHDRAWAL_DATE DATE,
SEQUENCE VARCHAR2(200 BYTE),
NARRATIVE VARCHAR2(200 BYTE),
NARRATIVE_TYPE VARCHAR2(200 BYTE),
SHARES VARCHAR2(200 BYTE),
PRICE FLOAT(126),
EXPIRATION_DATE DATE,
SERIAL_ID VARCHAR2(200 BYTE),
BASKET_DETAIL_WRK_ID VARCHAR2(200 BYTE),
RULE_ID VARCHAR2(200 BYTE),
RULE_NAME VARCHAR2(200 BYTE),
SETTLE_DEPOSITORY VARCHAR2(200 BYTE),
SECURITY_COUNTRY_OF_ISSUE VARCHAR2(200 BYTE),
BOOK_VALUE_CCY VARCHAR2(200 BYTE),
BOOK_VALUE FLOAT(126),
ACCRUED_INT_CCY VARCHAR2(200 BYTE),
CLOSE_INT_CCY VARCHAR2(200 BYTE),
CLOSE_INT_AMOUNT FLOAT(126),
CLOSE_SETTLE_CCY VARCHAR2(200 BYTE),
CLOSE_SETTLE_AMOUNT FLOAT(126),
COLLATERAL_IND VARCHAR2(200 BYTE),
FUTURES_COLLATERAL VARCHAR2(200 BYTE),
CONFIRMATION_IND VARCHAR2(200 BYTE),
FEE_CCY VARCHAR2(200 BYTE),
TOTAL_FEES FLOAT(126),
INCOME_FREQ VARCHAR2(200 BYTE),
INT_MTHD VARCHAR2(200 BYTE),
NEXT_COUPON_DATE DATE,
REG_ACCOUNT_NAME VARCHAR2(200 BYTE),
REG_ACCOUNT_NUMBER VARCHAR2(200 BYTE),
TAX_AMOUNT FLOAT(126),
UNDERLYING_PAR_VALUE FLOAT(126),
EXEC_FX_FLAG VARCHAR2(200 BYTE),
SOFT_BROKER VARCHAR2(200 BYTE),
FIN_TRADE_ID VARCHAR2(200 BYTE),
HEDGE_DESC VARCHAR2(200 BYTE),
SHARES_QTY FLOAT(126),
MISC_FEES FLOAT(126),
SETTLE_CCY VARCHAR2(200 BYTE),
SWIFT VARCHAR2(200 BYTE),
MESSAGE_ID VARCHAR2(200 BYTE),
DESCRIPTION VARCHAR2(200 BYTE),
GROSS_AMOUNT FLOAT(126),
NET_AMOUNT FLOAT(126),
SOFT_COMMISSION VARCHAR2(200 BYTE),
UNIT_COST FLOAT(126),
VALUATION_CCY_CODE VARCHAR2(200 BYTE),
RELATED_REF VARCHAR2(200 BYTE),
EXCHANGE_RATE FLOAT(126),
LOT_POSITION_ID VARCHAR2(200 BYTE),
LOT_POSITION_QTY VARCHAR2(200 BYTE),
POSITION_ID VARCHAR2(200 BYTE),
LOT_NUMBER VARCHAR2(200 BYTE),
LOT_ACQ_DATE DATE,
LOT_COST VARCHAR2(200 BYTE),
LOT_UNIT_PRICE FLOAT(126),
BASE_LOT_UNIT_PRICE FLOAT(126),
POSITION_QTY VARCHAR2(200 BYTE),
LOT_DATE DATE,
VALUATION_AMOUNT FLOAT(126),
BROKER_COMM_CCY VARCHAR2(200 BYTE),
TRANS_FEES_PAID_TO_BROKER FLOAT(126),
TOTAL_COST FLOAT(126),
MARKET_PRICE FLOAT(126),
MARKET_VALUE FLOAT(126),
GMAN_LOT_POS_WRK_IDENT VARCHAR2(200 BYTE),
GMAN_BROKER_CODE VARCHAR2(200 BYTE),
GFS_BROKER_CODE VARCHAR2(200 BYTE),
GFS_BROKER_NAME VARCHAR2(200 BYTE),
CLEARING_BIC VARCHAR2(200 BYTE),
CLEARING_BROKER_CODE_TYPE VARCHAR2(200 BYTE),
FX_CONTRACT_TYPE VARCHAR2(200 BYTE),
OPEN_CLOSE_IND VARCHAR2(200 BYTE),
FX_CONTRACT_CLOSE_TYPE VARCHAR2(200 BYTE),
FX_SETTLE_TYPE VARCHAR2(200 BYTE),
CUSTODIAN_ACCT_NAME VARCHAR2(200 BYTE),
CUSTODIAN_ACCT_NUM VARCHAR2(200 BYTE),
REG_ABA_NAME VARCHAR2(200 BYTE),
REG_ABA_NUMBER VARCHAR2(200 BYTE),
TEXT_NUMBER VARCHAR2(200 BYTE),
REF_TEXT_NUMBER VARCHAR2(200 BYTE),
INITIAL_CUSIP VARCHAR2(200 BYTE),
INITIAL_ISSUER VARCHAR2(200 BYTE),
TARGET_CUSIP VARCHAR2(200 BYTE),
TARGET_ISSUER VARCHAR2(200 BYTE),
TRADE_TYPE VARCHAR2(200 BYTE),
FX_PURCHASE_CCY_CODE VARCHAR2(200 BYTE),
FX_PURCHASE_AMOUNT FLOAT(126),
FX_DELIVERY_AGENT_CODE VARCHAR2(200 BYTE),
FX_DELIVERY_AGENT_NAME VARCHAR2(200 BYTE),
FX_SALES_CCY_CODE VARCHAR2(200 BYTE),
FX_SALES_AMOUNT FLOAT(126),
FX_INT_PURCHASE_AGENT_CODE VARCHAR2(200 BYTE),
FX_INT_PURCHASE_AGENT_NAME VARCHAR2(200 BYTE),
FX_RECEV_AGENT_CODE VARCHAR2(200 BYTE),
FX_RECEV_AGENT_NAME VARCHAR2(200 BYTE),
FX_INT_SALE_AGENT_CODE VARCHAR2(200 BYTE),
FX_INT_SALE_AGENT_NAME VARCHAR2(200 BYTE),
BENEF_ACCOUNT_NUMBER VARCHAR2(200 BYTE),
BENEF_ACCOUNT_NAME VARCHAR2(200 BYTE),
SENDER_TO_RECEV_INFO VARCHAR2(200 BYTE),
FX_GAIN_LOSS_IND VARCHAR2(200 BYTE),
BUSINESS_UNIT_FILE VARCHAR2(200 BYTE),
FILE_NAME VARCHAR2(200 BYTE),
CREATION_DATE DATE,
EXCEPTION_FLAG VARCHAR2(256 BYTE),
SOURCE_DB_NAME VARCHAR2(200 BYTE),
SYSTEM_ID VARCHAR2(200 BYTE),
SEC_ID VARCHAR2(200 BYTE),
DELIVERY_ID VARCHAR2(200 BYTE),
ACCOUNT_NUMBER VARCHAR2(200 BYTE),
ACCOUNT_NAME VARCHAR2(200 BYTE),
SETTLEMENT_INSTR_1 VARCHAR2(200 BYTE),
SETTLEMENT_INSTR_2 VARCHAR2(200 BYTE),
SETTLEMENT_INSTR_3 VARCHAR2(200 BYTE),
SETTLEMENT_INSTR_4 VARCHAR2(200 BYTE),
EXECUTING_BIC VARCHAR2(200 BYTE),
ACCOUNT_ID VARCHAR2(200 BYTE),
TRADE_ORDER_ID VARCHAR2(200 BYTE),
TRADE_ID NUMBER,
ACCRUED_INTEREST_AMOUNT VARCHAR2(200 BYTE),
EXERCISE_DATE DATE,
COUPON_RATE FLOAT(126),
STRIKE_PRICE FLOAT(126),
REALIZED_GAIN_LOSS FLOAT(126),
FORWARD_POINTS FLOAT(126),
MATURITY_DATE DATE,
DATED_DATE DATE,
TRADE_DATE DATE,
FACE_VALUE FLOAT(126),
PRINCIPAL_VALUE FLOAT(126),
SETTLE_CASH FLOAT(126),
CUSIP VARCHAR2(256 BYTE),
TRADER_COMMENTS VARCHAR2(256 BYTE),
CREATE_STAMP DATE,
MODIFY_STAMP DATE,
CREATE_USER VARCHAR2(256 BYTE),
MODIFY_USER VARCHAR2(256 BYTE),
PAYMENT_DATE DATE,
PAYMENT_FREQ VARCHAR2(256 BYTE),
ACCRUAL_METHOD VARCHAR2(256 BYTE),
TRADER_CODE VARCHAR2(256 BYTE),
ROLLING_MATURITY NUMBER,
ACCRUAL_FROM DATE,
ACCRUAL_TO DATE,
TRADE_NUMBER NUMBER,
ISSUER_CODE VARCHAR2(256 BYTE),
YIELD FLOAT(126)
)
Source flat file:
Entry_date|trade_number|Trade_stage|Trade_status|Exception_flag|BUY_SELL_IND|INSTRUMENT_TYPE|BROKER_CODE|ISSUER_CODE|COUPON_RATE|PRICE|YIELD|maturity_date|Dated_Date|trade_date|settle_date|face_value|principal_value|accrued_interest_amount|settle_cash|collateral_date|CUSIP|Trader_Comments|system_stamp|create_stamp|modify_stamp|create_user|modify_user|forward|backward|override|reason|payment_date|payment_freq|accrual_method|factor|trader_code|lock_status|lock_admin|pairoff_ptr|net_cash|trade_time|collateral_type|pre_trade_id|rolling_maturity|rate_reset|accrual_from|accrual_to|vpdate_method|s_calcode
Nov 8 2002 12:00:00:000AM|211040061|CON|OCAN| |B |DN |CRS |FHL |0.0||1.75|Dec 5 2003 12:00:00:000AM||Nov 4 2002 12:00:00:000AM|Nov 12 2002 12:00:00:000AM|125000000.00|122642361.11|0.00|122642361.11|| ||0001000034a231da|Nov 4 2002 11:32:32:066AM|Nov 4 2002 12:46:43:830PM|guerral|guerral|||0|Overlapping Trades caused a Violation: Because of Trade Entry Timing/Delay the Limit was exceeded. Reserve Analyst preapproved trade and must receive a copy of the ticket.||N|NONE||LIG| ||||Nov 4 2002 11:32:00:000AM|NULL|0|0| ||||
Nov 8 2002 12:00:00:000AM|211040099|CON|OK| |B |IBN |CTW |FHL |2.3999999999999999|100.0||Nov 26 2004 12:00:00:000AM|Nov 26 2002 12:00:00:000AM|Nov 4 2002 12:00:00:000AM|Nov 26 2002 12:00:00:000AM|12000000.00|12000000.00|0.00|12000000.00||3133MTMS4||0001000034a231e0|Nov 4 2002 3:20:54:106PM|Nov 4 2002 4:51:27:546PM|haddadc|haddadc|||0|Preapproved by Credit Analyst: Temporary override of the Limits which was preapproved by a Reserve Analyst.|Nov 26 2002 12:00:00:000AM|S|30/360||CS| ||||Nov 4 2002 3:20:00:000PM|NULL|0|0| ||||


My issue here is: I need to load the data for the "entry_date" column into the target "entry_date" column in "MM/DD/YYYY HH24:MI:SS" format, and likewise for other columns such as "coupon_rate" truncate the value to 4 digits after the decimal point. But I also need to skip some columns: (collateral_date, lock_status, lock_admin, system_stamp, forward, backward, override, reason, factor, pairoff_ptr, trade_time, collateral_type, pre_trade_id, rate_reset, vpdate_method, s_calcode, net_cash).

Tom Kyte
May 06, 2010 - 11:41 am UTC

this is a massive create table

this is a massive string of data


do you think you could rephrase the question with perhaps a four or five column table and some readable data?

Make it small
Make it concise
Make it complete
but most of all, make it small.

Loading the data

Raghavendra, May 03, 2010 - 6:34 am UTC

It's very urgent.
Please provide a solution to the above issue.
Tom Kyte
May 06, 2010 - 11:42 am UTC

are you "serious"???????

the sense of entitlement astounds sometimes.

Excellent replies of different loader queries...!!

Parimal, May 08, 2010 - 10:11 am UTC

I got one .txt file. I am not able to insert the values, including the TEXT column values, into a table, as each TEXT column value has a description of varying length.

The TEXT column is the third column from last in the sample rows given below.

Every column is separated by a comma, but the TEXT column description itself contains many commas and newline characters, and that is where the loader gives an error: it is not able to figure out the newlines even after I include OPTIONALLY ENCLOSED BY '"' in the ctl file.

It will be really great if you can guide me through it.

Following are some 3-example rows: -
--------
1-row data
--------

3794574,audible,03/01/2010 12:00 PM,03/01/2010 12:00 PM,1,"cu called in asking about canceling the account since he doesn't want to pay the monthly membership/..
offered the light annual plan for 9.95 he said yes...
changed the plan for him and changed the credit card number too...

AudibleListener LightEverest Annual Membership- Upgrade (MB_LIST_000000EAa)

Signup Date: 03/01/10
First Billed On: 03/01/10
Last Billed On: 03/01/10
Next Renewal:

March 1, 2011


Purchase Price: $9.95
Contract Price: $9.95
Cancel Status: Not committed",1,Mariel Herrera


-------
2-nd row data
--------

3794580,audible,03/01/2010 12:01 PM,03/01/2010 12:01 PM,1,Gave current pricing plans. Wanted to know of any promos for old members. Told him he would receive those in his email. Changed email address and updated opt-in for email only.,1,Brian James

-------
3rd row data
--------

3794586,audible,03/01/2010 12:03 PM,03/01/2010 05:42 PM,2,"Dear Anona Ore,

Thank you for contacting Audible, this is in regard's to the title ""Erak's Ransom"" by John Flanagan.

I need to escalate this issue to the Content department to find out when we will get it. Normally, I could pull up the release date at the websites of Amazon and ""Barnes & Noble"", by looking when the audio version will be available. However, they have no entry for the CD/cassette version. I went the publisher's website, Recorded Books, and it looks like a CD and cassette version is already available. However, a digital audio version is not due out until May 1. That may be the release date on our site.

As soon as I get the response from Content, I will let you know. Let me know if you have any questions. Thank you and have a great day!

For additional questions, how to contact us and hours of operation, please visit us at: www.audible.com/contactus

Sincerely,
Victor K.
Audible Customer Support",1,Victor Kelmeris

--------------------
--------------------

Tom Kyte
May 10, 2010 - 6:17 pm UTC

That data is not going to be loadable with sqlldr.


see http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:30092999095211#1827234800346433394 for formats that would work.

Loading the data

Raghavendra, May 12, 2010 - 7:15 am UTC

Hi All,

Last week I posted my issue with the subject "Loading the data" on May 3, 2010 - 6am Central time zone, along with the big data model given above. I have now resolved the issue at my end. Thank you very much for your support.
If anybody wants the resolution steps, please let me know and I will forward them.

SQL Loader LOB File - Ref Env Variables

olerag, June 08, 2010 - 2:35 pm UTC

Tom,

In our LOB File load, the control file has the following syntax:

LOAD DATA
INFILE *
replace
INTO table my_xml
fields terminated by ","
(
rec_id,
lob_file FILLER char,
XML_CLOB LOBFILE(lob_file) TERMINATED BY EOF
)
begindata
1,one.xml
2,two.xml
3,three.xml
4,/home/tom/myloadpath/four.xml
5,$TOMS_PATH/five.xml

Rows 1-3 load great. Also, if I put the complete path in, there is no problem either, so record #4 also loads. However, referencing an environment variable does not seem to work, so "five" doesn't load.

Environment-variable referencing in SQL*Loader works in all other facets, including load file statements and when indicating bad/log/discard files in the control file (although for the latter the hyphen character must be used to start/end the string).

Any ideas on whether environment variables can be used in the "begindata" portion of the control script when loading LOB files?
Tom Kyte
June 10, 2010 - 11:13 am UTC

no, data is data - there are no environment variables to be used there.
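
If the path really must come from the environment, one possible workaround (outside of sqlldr itself) is to expand the variables before invoking it, e.g. by keeping a template and generating the real control file from it (file names are placeholders; envsubst ships with GNU gettext):

envsubst < my_xml.ctl.template > my_xml.ctl     # replaces $TOMS_PATH and friends with their values
sqlldr userid=scott/tiger control=my_xml.ctl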

how to load the data as (Date format in sql

Raghava, July 10, 2010 - 2:35 am UTC

Currently I am using the Oracle date format MM/DD/YYYY H24:MI:SS PM. I need to load the data into the "abc" table using the 1.txt file below.
1.txt
----------
b,d,e,a,c
23,1/23/2007 6:00:05 AM,234.4567834,6/2/2007 6:00:05 PM,11/3/2010
----------------
'abc" table syntax:
abc { a date, b number, c date, d date, e float) }

and After loaded the above data using .ctl file, i wil expect the data appeared in table "abc" ouput as like below
6/2/2007 6:00:05 PM , 23,11/3/2010,1/23/2007 6:00:05 AM,234.4567
floating value as after decimal points as 4 numbers only.
Please suggest me.
Tom Kyte
July 19, 2010 - 8:35 am UTC

ops$tkyte%ORA10GR2> desc abc
 Name                                     Null?    Type
 ---------------------------------------- -------- ----------------------------
 A                                                 DATE
 B                                                 NUMBER
 C                                                 DATE
 D                                                 DATE
 E                                                 FLOAT(126)

ops$tkyte%ORA10GR2> select * from abc;

no rows selected

ops$tkyte%ORA10GR2> !cat abc.ctl
LOAD DATA
INFILE *
INTO TABLE abc
REPLACE
FIELDS TERMINATED BY ','
(
 b
,d date "mm/dd/yyyy hh:mi:ss am"
,e
,a date "mm/dd/yyyy hh:mi:ss am"
,c date "mm/dd/yyyy"
)
BEGINDATA
23,1/23/2007 6:00:05 AM,234.4567834,6/2/2007 6:00:05 PM,11/3/2010

ops$tkyte%ORA10GR2> !sqlldr / abc

SQL*Loader: Release 10.2.0.4.0 - Production on Mon Jul 19 09:32:56 2010

Copyright (c) 1982, 2007, Oracle.  All rights reserved.

Commit point reached - logical record count 1

ops$tkyte%ORA10GR2> select * from abc;

A                             B C                    D
-------------------- ---------- -------------------- --------------------
         E
----------
02-jun-2007 18:00:05         23 03-nov-2010 00:00:00 23-jan-2007 06:00:05
234.456783


A reader, July 13, 2010 - 6:53 am UTC

can you please reply asap on my above issue.

Function Call in SQL Loader

Abhisek, July 19, 2010 - 9:15 am UTC

Hi Tom,

I think it is possible to call any built-in or user-defined function during the SQL*Loader process. I have made a function which takes the first character of each row in the data file and passes it as a parameter. It works fine for a conventional-path load.

The logic of the program is: check the first character of the row; 'I' means insert, 'U' means update, 'D' means delete.

The operation will be carried out on millions of records.

Can you please suggest whether the same can be applied to a direct-path SQL*Loader load? If yes, a small example would be appreciated.

Thanks again.
Tom Kyte
July 19, 2010 - 2:10 pm UTC

I'd need to know what you are trying to do a bit better.

So, you have a function
It takes a single character as input.

You have said IUD = insert/update/delete - so what, so are you returning the word "Insert" or doing an insert (and if so, of what exactly, to where)

full example necessary. Not sure what you are trying to do.

Trigger or Package

Abhisek, July 22, 2010 - 6:45 am UTC

Hi Tom,

I have a requirement to load data that is incremental in nature. There is a flag at the beginning of each row that signifies whether the row will be inserted, deleted or updated based on the defined keys. I can't argue with this logic, as it is the requirement.

Table looks like:

CREATE TABLE DEMO
(
     COL1 VARCHAR2(10)
    ,COL2 NUMBER
    ,COL3 NUMBER
    ,COL4 DATE
    ,COL5 VARCHAR2(10)
)    


The file may look like this:

COL1 COL2 COL3 COL4 COL5
I 11 10 01.01.2010 DEMO
I 12 11 01.01.2011 ABC
U 11 12 01.01.2012 DEMO
I 13 10 01.01.2009 XYZ
U 12 10 01.01.2011 PQR
D 13 10 01.01.2009 XYZ

If I find 'I' in col1, that row will be inserted. If I find 'U' in col1, it will be updated, and if I find 'D' in col1, it will be deleted.

What is the best possible way to handle it:
1. triggers
2. Packages
3. Functions inside SQL Loader

Please suggest, as I am not very sure about it. But I would definitely not like to go for triggers unless the WORLD is going to blast.


Tom Kyte
July 23, 2010 - 9:17 am UTC

external table and a merge statement. No code, no sqlldr.


I'd give you an example, but you don't define any keys or give creates or inserts to work with :(

Follwo up: Data looks like this

Abhisek, July 22, 2010 - 6:47 am UTC

"I"|"11"|"10"|"01.01.2010"|"DEMO"
"I"|"12"|"11"|"01.01.2011"|"ABC"
"U"|"11"|"12"|"01.01.2012"|"DEMO"
"I"|"13"|"10"|"01.01.2009"|"XYZ"
"U"|"12"|"10"|"01.01.2011"|"PQR"
"D"|"13"|"10"|"01.01.2009"|"XYZ"


Followup

Abhisek, July 23, 2010 - 5:45 pm UTC

Hi Tom,

Am sorry for that.. but using SQL Loader is a requirement as it has been part of system and I cannot avoid that.

Thats why I gave you the data sample and create table script. I can provide you the ctl file:

LOAD DATA
INTO TABLE DEMO
FIELDS TERMINATED BY "|" OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
( col1,
col2,
col3,
col4,
col5
)
So please suggest me considering that I am using SQL loader. Sorry for the stubborn request..


Tom Kyte
July 26, 2010 - 8:06 am UTC

... but using SQL Loader is a requirement as it has been part
of system and I cannot avoid that.
...

Look - we all have to realize that this word 'requirement' is utter nonsense.

YOU ARE MAKING A MAJOR CHANGE TO SOMETHING HERE. Otherwise you would not be asking the question would you? If everything was staying the same - we wouldn't be here. You are doing something new, all rules are lifted. The way you used to do it doesn't count - I don't care about it, it is not relevant. You ask how best to do something - and you receive a response.


I don't care about sqlldr, it won't do what you need. It can LOAD (insert) records - period. You wrote:

... There is a flag at the beginning that signifies whether the row will be inserted, deleted or updated based on the defined keys....

that sort of precludes sqlldr from being relevant to the discussion.


So, please suggest to the powers that be that "it ain't the right way, I've been told what is and it is called MERGE"


(if you persist, if you really really really want to blindly do it "the way its been done forever, so it must be right", then go ahead, use sqlldr to load this data into any table - and then MERGE this newly loaded table with the real stuff - you can be as inefficient as you like)
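
For what it is worth, a minimal MERGE sketch of that last option, assuming (purely for illustration) that the file has been loaded into a staging table DEMO_STAGE with the same five columns, that COL2 is the key, and that COL1 carries the I/U/D flag:

merge into demo d
using (select col1, col2, col3, col4, col5 from demo_stage) s
on (d.col2 = s.col2)
when matched then
   update set d.col3 = s.col3, d.col4 = s.col4, d.col5 = s.col5
   delete where (s.col1 = 'D')
when not matched then
   insert (col1, col2, col3, col4, col5)
   values (s.col1, s.col2, s.col3, s.col4, s.col5)
   where (s.col1 = 'I');

Rows flagged 'U' are simply updated, rows flagged 'D' are matched and then removed by the DELETE WHERE clause, and rows flagged 'I' are inserted.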

External table and SQL Loader

Charlie 木匠, July 27, 2010 - 1:02 pm UTC

As I recently learned, many Oracle developers don't even know that external tables exist. sigh! :(

Exteral Table

A reader, September 09, 2010 - 4:11 pm UTC

So, I got the chance to use an external table...

Thanks Tom for your suggestions, which let me push this change through.

But I see a problem here, which may not be a problem for you :)

Requirements:
My staging table used to contain three indicators: I=INSERT, U=UPDATE, D=DELETE

Now with the new suggested way, I can:
1. Load the external table with data.
2. Use a merge statement between staging and final table to update or insert rows.

Problem with me:

1. How do we handle the DELETE indicator which used to delete from the final table?

2. My data file name changes every time. At any point in time I may have multiple files whose exact names I don't know; I only know they start with a known name followed by a timestamp.
So I know "DEMO" when the file is DEMO_01_JAN_2010. How can we handle that with an external table?

SQL*Loader was fine because I simply invoked it once per file.
Tom Kyte
September 09, 2010 - 9:05 pm UTC

1) merge can delete as well, read about it, sql language reference manual.


2) alter table can deal with that, you can change the file associated with the external table
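
A minimal sketch of point 2, assuming an external table named DEMO_EXT (table and file names are placeholders), run just before each load by whatever script discovers the file names:

alter table demo_ext location ( 'DEMO_01_JAN_2010.txt' );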

jelena, September 14, 2010 - 3:52 am UTC

Hi,

is there any way to read two records from a txt file and load them into one record in a table with some sqlldr operations? For example, I have a txt file like this:
H0123456789...
M542678552..
M789456123..
T0123456789...
I want the first and last records to somehow be read as one record, without changing the file.

Tnx, jelena

Tom Kyte
September 14, 2010 - 7:15 am UTC

not really - not with sqlldr, but with an external table INSTEAD of sqlldr (much better, lots of flexibility) you could easily.


select max(decode( tag, 'H0', data ))||max(decode( tag, 'T0', data )) from et;


would do it (assuming you mapped the first two characters to a field called "TAG" and the rest of the data to a field called data...)

http://asktom.oracle.com/pls/asktom/asktom.search?p_string=external+table
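
A minimal external-table sketch of that idea (the directory, file name and tag values are assumptions; 'H0' and 'T0' simply mirror the first two characters of the sample lines above):

create or replace directory data_dir as '/tmp';

create table et
( tag  varchar2(2),
  data varchar2(4000)
)
organization external
( type oracle_loader
  default directory data_dir
  access parameters
  ( records delimited by newline
    fields
    ( tag  position(1:2)    char(2),
      data position(3:4000) char(3998)
    )
  )
  location ( 'records.txt' )
);

select max(decode( tag, 'H0', data )) || max(decode( tag, 'T0', data )) as merged
  from et;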

Using Sqlloader

ashok chejerla, September 21, 2011 - 1:38 pm UTC

Can u pls let me know if i can use a sql query instead of text file as an input to Sqlloader.
Tom Kyte
September 21, 2011 - 10:41 pm UTC

U says:

if U would like to load the text of a query - sure, go ahead.

but since sqlldr loads DATA - it would load the text of the query itself. if you want to load the OUTPUT of some query - you would not use sqlldr of course, you would just use:

insert into your_table <YOUR_QUERY>;

right in sql itself.



meaning, if you are copying data within a database - it would be BEYOND silly to use sqlldr, you would just insert into your_table <YOUR_QUERY>;

SQL Loader Control file Data File

Rafee, September 30, 2011 - 4:26 am UTC

The source data file contains both a control record and detail records.
The control records indicate total entered credits and debits as well as total number of records.
This control record is validated against the sum of the detail records.
This is my requirement: can I have one example (Sample Data file/Control file) to satisfy this requirement?
I need to validate the number of records in the data file against the number of records loaded into the table, via the control script.
Tom Kyte
September 30, 2011 - 6:32 pm UTC

can I have one example (Sample Data file/Control file)

simply put, no - not right now.

but that is only because we don't have a schema, we don't have sample data.

I cannot believe you expect me to provide the sample data???? You are the only one here that knows the true format of that data file.

And I'd want to use external tables - NOT sqlldr for this.

We'll load the data and then any validation will take place in SQL. If we use sqlldr - we'll have to load and then validate. If we use external tables - we may well be able to load and validate at the same time. It depends. Have to see an example though.

A reader, November 14, 2011 - 12:19 am UTC

hi Tom,
Can we use a computed column in another column?

Say we have table
create table tbldemo(col1 varchar2(100),col2 varchar2(100),col3 varchar2(100),col4 varchar2(100))

sql loader logic:
load data into table tbldemo fields terminated by ','
( col1 "ucase(:col1)",
  col2,
  col3 ":col2 || :col1",
  col4 "'new value' || :col3"
)

The problem is I am unable to use the value of col3 in col4. Please help with a solution if possible.
Tom Kyte
November 15, 2011 - 8:23 am UTC

the INPUT column values are 'constant' for a given execution of the insert.

You'll have to use the function in the function


col4 "'new value ' || :col2 || :col1"


SQL Loader

Pavan, March 13, 2014 - 2:41 pm UTC

Hi Tom,

Good day!!

I am a little confused about how to achieve the below using SQLLDR.

We get an inbound file like the below:
AAA|john|123|MN|
BBB|john|10000|
BBB|john|50000|
AAA|rohan|456|CA|
BBB|john|10000|
BBB|john|50000|
AAA|john|123|MN|
BBB|john|90000|
BBB|john|80000|

We are staging all AAA segments in to table_aaa
with the structure like

SEGID - NAME - ID - STATE - RECORD_ID
-------------------------------------
AAA john 123 MN 1
AAA rohan 456 CA 4
AAA john 123 MN 7

similarly BBB segments into table_bbb like the below

SEGID - NAME - DISEASE_CD - RECORD_ID - AAA_RECORD_ID
---------------------------------------
BBB john 10000 2 1
BBB john 50000 3 1
BBB rohan 10000 5 2
BBB rohan 50000 6 2
BBB john 90000 7 3
BBB john 80000 8 3

I would like to get the AAA_RECORD_ID generated as shown above for tagging the BBB records with corresponding AAA records. Could you please help with achieving this with SQLLDR.

Thank you!

More to Explore

CLOB/BLOB/etc

Complete documentation on Securefiles and Large Objects here