

Question and Answer

Connor McDonald

Thanks for the question.

Asked: August 10, 2000 - 9:12 am UTC

Last updated: February 23, 2017 - 2:53 am UTC

Version: 8.0.5


You Asked

Hi Tom,

I need to load data from a text file into multiple tables.
In each row, the first, second and fourth fields should go into
table "A", the third and sixth fields should go into table "B", and
the fifth field should go into table "C".
The fields are terminated by ";". Can we load this into multiple tables?

Thanks in advance,

Jagan


and Tom said...

in 8.0, we can do this with some PLSQL code. See
http://asktom.oracle.com/~tkyte/SkipCols/index.html
for the delimited package I use below.


Basically I started with:

create table t1 (field1 int, field2 int, field4 int );
create table t2 (field3 int, field6 int);
create table t3 (field5 int );

And a control file that was:


LOAD DATA
INFILE *
REPLACE
INTO TABLE T1
(
FIELD1 position(1:1024) "delimited.word(:field1,1,NULL,';')",
FIELD2 position(1:1) "delimited.word(:field1,2,NULL,';')",
FIELD4 position(1:1) "delimited.word(:field1,4,NULL,';')"
)
INTO TABLE T2
(
FIELD3 position(1:1024) "delimited.word(:field3,3,NULL,';')",
FIELD6 position(1:1) "delimited.word(:field3,6,NULL,';')"
)
INTO TABLE T3
(
FIELD5 position(1:1024) "delimited.word(:field5,5,NULL,';')"
)
BEGINDATA
1;2;3;4;5;6;7;8;9;10;11;12;13;14;15;16;17;18;19;20;21;22


After running it -- I saw:


ops$tkyte@8i> select * from t1;

FIELD1 FIELD2 FIELD4
---------- ---------- ----------
1 2 4

ops$tkyte@8i> c/1/2/
1* select * from t2
ops$tkyte@8i> /

FIELD3 FIELD6
---------- ----------
3 6

ops$tkyte@8i> c/2/3
1* select * from t3
ops$tkyte@8i> /

FIELD5
----------
5

ops$tkyte@8i>



Comments

How to do this

pawan, December 11, 2002 - 9:27 am UTC

I am new to Oracle and have learnt a lot from this site. I have a very simple question.
Name Null? Type
----------------------------------------- -------- ----------------------------
BUSINESS_UNIT_ID_CREDIT NOT NULL VARCHAR2(5)
CREDIT_MGR_ID VARCHAR2(5)
CREDIT_MGR_NAME VARCHAR2(50)
FI_NUMBER VARCHAR2(18)
FI_NAME VARCHAR2(50)
FI_CITY VARCHAR2(50)
FI_STATE VARCHAR2(3)
PROMOTION_CD NOT NULL VARCHAR2(18)
PROMOTION_DESC NOT NULL VARCHAR2(20)
CHECK_NO VARCHAR2(14)
CHECK_DT DATE
LOC_ID VARCHAR2(5)
PROMOTION_DT NOT NULL DATE
PROMOTION_CATEGORY NOT NULL VARCHAR2(20)
PROMOTION_RSN_CD NOT NULL VARCHAR2(3)
PROMOTION_RSN_DESC NOT NULL VARCHAR2(25)
PROMOTION_ID NOT NULL VARCHAR2(9)
REFERENCE_ID VARCHAR2(14)
DYS_PAST_DUE NOT NULL NUMBER(6)
PROMOTION_BALANCE NOT NULL NUMBER(15,2)
AGED_AMT_1_5 NUMBER(15,2)
AGED_AMT_6_30 NUMBER(15,2)
AGED_AMT_31_60 NUMBER(15,2)
AGED_AMT_61_90 NUMBER(15,2)
AGED_AMT_91_180 NUMBER(15,2)
AGED_AMT_181_365 NUMBER(15,2)
AGED_AMT_OVER_365 NUMBER(15,2)
PROMOTION_STATUS NOT NULL VARCHAR2(2)
PROMOTION_LST_UPD_DT NOT NULL DATE


I have a table called PROMOTIONS with the structure shown above. I need to load data into the table from a flat file (data.csv). The data is pipe delimited. I need to skip the first row, which has the date, and then for each record in the flat file I need to skip the first field (which is always "ORACLE GROUP") and start loading from the second field onwards. The last two columns in the table, PROMOTION_STATUS and PROMOTION_LST_UPD_DT, are to be populated with 'O' and SYSDATE respectively. A sample data file (data.csv) is as under:
12/02/02| | | | | | | | | | | | | | | | | | | | | | | | | | | | |
ORACLE GROUP|01100| | |UNIDENTIFIED | | | |601 |DIVISION PROMOTION | | | |05/31/02|PROMOTIONAL RELATED |034|PROMOTIONAL CREDIT |P80165280|102501 | 185| -49.00| | | | | | -49.00| |
ORACLE GROUP|01100| | |UNIDENTIFIED | | | |651 |DIVISION CREDIT MEMO | | | |12/28/01|RETURNS |202|UNKNOWN - NF |96204325 | | 340| -1070.19| | | | | | -1070.19| |
ORACLE GROUP|01100| | |UNIDENTIFIED | | | |651 |DIVISION CREDIT MEMO | | | |01/09/02|RETURNS |202|UNKNOWN - NF |96213909 | | 328| -551.29| | | | | | -551.29| |
ORACLE GROUP|01100| | |UNIDENTIFIED | | | |651 |DIVISION CREDIT MEMO | | | |02/05/02|RETURNS |202|UNKNOWN - NF |96239262 | | 301| -146.48| | | | | | -146.48| |
ORACLE GROUP|01100| | |UNIDENTIFIED | | | |651 |DIVISION CREDIT MEMO | | | |02/13/02|RETURNS |202|UNKNOWN - NF |96247863 | | 293| -15.90| | | | | | -15.90| |
ORACLE GROUP|01100| | |UNIDENTIFIED | | | |651 |DIVISION CREDIT MEMO | | | |02/20/02|RETURNS |202|UNKNOWN - NF |96251086 | | 286| -1261.51| | | | | | -1261.51| |
ORACLE GROUP|01100| | |UNIDENTIFIED | | | |651 |DIVISION CREDIT MEMO | | | |02/20/02|RETURNS |202|UNKNOWN - NF |96251087 | | 286| -2923.89| | | | | | -2923.89| |
ORACLE GROUP|01100| | |UNIDENTIFIED | | | |651 |DIVISION CREDIT MEMO | | | |02/21/02|RETURNS |202|UNKNOWN - NF |96253112 | | 285| -22.91| | | | | | -22.91| |
ORACLE GROUP|01100| | |UNIDENTIFIED | | | |651 |DIVISION CREDIT MEMO | | | |03/19/02|COLLECTION RELATED |241|UNKNOWN - NF |96280401 |N/A/AB068 | 259| -76.32| | | | | | -76.32| |


I am new to Oracle and will be obliged if you can show me a SQL*Loader script to do it. I don't have the manuals handy and someone guided me to this site.
Thanks


Tom Kyte
December 11, 2002 - 9:32 am UTC

Hey, that "i don't have the manuals handy" doesn't cut it in 2002!!!

If you can get here, you have the manuals totally handy.

http://otn.oracle.com/
 -> DOCUMENTATION!!!  It is all there.


Anyway, to answer your question -- to skip the first record, simply use skip=


sqlldr userid=x/y control=yourfile.ctl skip=1


In order to skip that first field, use FILLER, see:
http://asktom.oracle.com/~tkyte/SkipCols/index.html



Here is

Pawan, December 11, 2002 - 10:21 am UTC

Thanks Tom. Here is what I have coded. The BUSINESS_NAME field is not in the table; I have just used it as a filler to skip the first field in the flat file. The final table is not yet created, so I cannot test it yet.
LOAD DATA
INFILE /prod/data/input/di.csv
TRUNCATE INTO TABLE PRD.PROMOTIONS
FIELDS TERMINATED BY '|'
(
BUSINESS_NAME FILLER,
BUSINESS_UNIT_ID_CREDIT,
CREDIT_MGR_ID,
CREDIT_MGR_NAME,
FI_NUMBER,
FI_NAME,
FI_CITY,
FI_STATE,
PROMOTION_CD,
PROMOTION_DESC,
CHECK_NO,
CHECK_DT,
LOC_ID,
PROMOTION_DT,
PROMOTION_CATEGORY,
PROMOTION_RSN_CD,
PROMOTION_RSN_DESC,
PROMOTION_ID,
REFERENCE_ID,
DYS_PAST_DUE,
PROMOTION_BALANCE,
AGED_AMT_1_5,
AGED_AMT_6_30,
AGED_AMT_31_60,
AGED_AMT_61_90,
AGED_AMT_91_180,
AGED_AMT_181_365,
AGED_AMT_OVER_365,
PROMOTION_STATUS,
PROMOTION_LST_UPD_DT
)

I will use the skip=1 in the command line while invoking SQLLOADER.

I can't load

Pawan, December 11, 2002 - 5:31 pm UTC

Sorry to be a pain Tom. I wrote the control file and tried running SQL*Loader from the command prompt, but it fails.
Here is my control file
----------------------------------------------------
LOAD DATA
INFILE "/home/ /di.csv"
INTO TABLE SPOT.PROMOTIONS
FIELDS TERMINATED BY '|'
TRAILING NULLCOLS
(
BUSINESS_NAME FILLER,
BUSINESS_UNIT_ID_CREDIT,
CREDIT_MGR_ID,
CREDIT_MGR_NAME,
FI_NUMBER,
FI_NAME,
FI_CITY,
FI_STATE,
PROMOTION_CD,
PROMOTION_DESC,
CHECK_NO,
CHECK_DT DATE NULLIF CHECK_DT=BLANKS,
LOC_ID,
PROMOTION_DT "MM/DD/YYYY",
PROMOTION_CATEGORY,
PROMOTION_RSN_CD,
PROMOTION_RSN_DESC,
PROMOTION_ID,
REFERENCE_ID,
DYS_PAST_DUE,
PROMOTION_BALANCE,
AGED_AMT_1_5,
AGED_AMT_6_30,
AGED_AMT_31_60,
AGED_AMT_61_90,
AGED_AMT_91_180,
AGED_AMT_181_365,
AGED_AMT_OVER_365,
PROMOTION_STATUS,    -- <== I need to enter a default value 'O'
PROMOTION_LST_UPD_DT sysdate
)
---------------------------------------------------
The last two fields in the table are not in the flat file. I need to have default values for those two - one will have SYSDATE and the other PROMOTION_STATUS will have 'O'.

When I run using
sqlldr userid=u/p control=promo.ctl skip=1
Nothing gets loaded and I get the following in the log file
Record 1: Rejected - Error on table PROMOTIONS, column PROMOTION_DT.
ORA-00984: column not allowed here

Please help me

Tom Kyte
December 11, 2002 - 10:49 pm UTC

just to prove the manuals are handy:

http://docs.oracle.com/docs/cd/A87860_01/doc/server.817/a76955/ch05.htm#4959

read about "generating data" with sqlldr - but, that isn't your error.

PROMOTION_DT "MM/DD/YYYY",

is clearly the error (not the last two columns).


PROMOTION_DT date "MM/DD/YYYY",
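
Putting the pieces together, the tail end of the field list might then look something like this sketch (using CONSTANT for the status and the SYSDATE keyword for the timestamp, both described under "generating data" in the link above; only the changed lines are shown):

...
PROMOTION_DT DATE "MM/DD/YYYY",
...
PROMOTION_STATUS CONSTANT 'O',
PROMOTION_LST_UPD_DT SYSDATE
)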

Thanks

pawan, December 12, 2002 - 9:46 am UTC

Thanks Tom. I pored over the manual online yesterday and solved the problem.

Another SQLLoader Error

Steve, February 19, 2003 - 11:30 am UTC

Hi Tom,

We are having an SQL Loader error. I'm not sure if I
should add this to this thread or not (what do you prefer?)

We occasionally get the following error while SQL-Loading
into a partitioned table:

Record 712: Rejected - Error on table XXXX.
ORA-06540: PL/SQL: compilation error
ORA-06553: PLS-907: cannot load library unit DWADM.PP_CALLS (referenced by DWADM.LONG_DISTANCE_IND)
ORA-06512: at "DDSADM.MO_AFTER_INS_ROW_TR", line 11
ORA-04088: error during execution of trigger 'DDSADM.MO_AFTER_INS_ROW_TR'


Table XXXX:
1859 Rows successfully loaded.
1 Row not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.


Space allocated for bind array: 64428 bytes(42 rows)
Space allocated for memory besides bind array: 0 bytes

Total logical records skipped: 0
Total logical records read: 1860
Total logical records rejected: 1
Total logical records discarded: 0


I've checked metalink and your site but struck out. Here's
some additional info/notes.

1. Solaris 8, Oracle 8.1.7, 64 bit.
2. The target table is large, partitioned, and has a trigger.
3. There are no database links involved (single instance)
4. there are 2 schemas: ddsadm and dwadm
5. we are loading into ddsadm.

One other thing (not sure if this is related, relevant, or helpful): we often see views (not sure about functions) become invalid and we don't know why. Could views become invalid due to "rolling" partition deletion/creation?


Tom Kyte
February 19, 2003 - 2:01 pm UTC

views can become invalid due to index maintenance, which occurs a lot with rolling windows. The "updatable or not" setting must be re-evaluated for a view.

Does the affected procedure make use of these views? could it be that a combination of

o loading
o triggering
o rolling
o index invalidation/rebuilding

is happening at the same time?

Bingo!

Steve, February 19, 2003 - 2:23 pm UTC

You got it. All (without exception) are happening right around 8 PM (and almost exactly at 20:00). This is when our cronjob starts. This cron job:

o exports the oldest partition
o truncates the oldest partition
o moves the oldest partition (now truncated) to another tablespace
o rebuilds the indexes on that truncated partition.

As far as I can tell, this is what is happening:

SQL*Loader -> into Table T
-> Trigger on T uses fn1, fn2, and fn3.

the trigger only uses functions but does not access the views. Does the same apply to functions? i.e. Can functions become invalid due to the rolling window activity (which would explain the SQL*Loader error we are getting)




Tom Kyte
February 19, 2003 - 3:41 pm UTC

are the functions in a package?
are there other functions in the package dependent on the views?
do the views actually go invalid?
do the procedures access the partitioned table itself? (if so -- that is definitely the cause; they'll go invalid at that point since Oracle remembers a procedure is dependent on a TABLE, not a partition of a table, so partition operations invalidate dependent procedures.) They do not know if you coded:


select ... from partitioned_table

or
select .... from partitioned_table partition(p1)


so they invalidate to check it out.

Is this it?...

Steve, February 20, 2003 - 6:29 am UTC

Note: Functions are not in a package.

Is this the reason...

1. The table with the rolling window is: PP_CALLS.
2. One of the functions in question is: LONG_DISTANCE_IND

3. The error contains:
ORA-06553: PLS-907: cannot load library unit DWADM.PP_CALLS (referenced by
DWADM.LONG_DISTANCE_IND)

4. The trigger on the SQL*Loader table calls the function: LONG_DISTANCE_IND

5. The function contains the following lines:

FUNCTION LONG_DISTANCE_IND
(col1 PP_CALLS.CT_CT_ID%TYPE
col2 PP_CALLS.SERVED_MSISDN%TYPE...
)
RETURN PP_CALLS.LONG_DISTANCE_ID%TYPE
IS
-- PL/SQL Specification
LONG_DISTANCE_ID PP_CALLS.LONG_DISTANCE_ID%TYPE;
...


i.e. a bunch of references to types in the rolling window table.

Do you think that's the cause?

Thanks a lot Tom.

Steve

Tom Kyte
February 20, 2003 - 7:52 am UTC

yes, consider:

ops$tkyte@ORA817DEV> CREATE TABLE t
  2  (
  3    year_month  varchar(4)
  4  )
  5  PARTITION BY RANGE (year_month) (
  6    PARTITION part_jan_2000 VALUES LESS THAN ('0002') ,
  7    PARTITION part_feb_2000 VALUES LESS THAN ('0003') ,
  8    PARTITION junk VALUES LESS THAN (MAXVALUE)
  9  )
 10  ;

Table created.

ops$tkyte@ORA817DEV>
ops$tkyte@ORA817DEV> create or replace function f return number
  2  as
  3          data t.year_month%type;
  4  begin
  5          null;
  6  end;
  7  /

Function created.

ops$tkyte@ORA817DEV>
ops$tkyte@ORA817DEV> select status from user_objects where object_name = 'F';

STATUS
-------
VALID

ops$tkyte@ORA817DEV> alter table t drop partition part_jan_2000;

Table altered.

ops$tkyte@ORA817DEV> select status from user_objects where object_name = 'F';

STATUS
-------
INVALID


the partition operation invalidates the function -- if you do this at the wrong time and try to revalidate the function -- you could get "wedged" as you are here.  The function needs to DDL lock the table -- the table cannot be locked because you are rolling the window -- the function cannot recompile itself -- bamm. 

Thanks!

steve, February 20, 2003 - 8:17 am UTC

As always, a big big thanks Tom!


sqloader cobol file on unix

lura, September 11, 2003 - 2:05 am UTC

Hi Tom,
questions about SQL*Loader (loading a .dat file, generated by a COBOL extract program, into a
table; Oracle 9i, platform is Unix).

1. I have a record in the file like abc123... I cannot load it correctly (the field location is not right), but if I create a new file using vi and copy the
record into the new one, it loads successfully. What happened here?

2. For COBOL types PIC 9(n) COMP, X(n), etc., how can I use
POSITION to locate them?

Thank you very much!








Tom Kyte
September 11, 2003 - 8:39 am UTC

1) no clue what you are saying here.

i've no idea why you "can not load it correctly"
no idea how vi fixed it.

2) you know the start field position and the length, right? That is what you'll need in order to use POSITION, which just needs the start and length or the start and stop column numbers.
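
For example, if the extract had a PIC X(8) field, a 7-digit display numeric and a PIC 9(8) COMP counter, the field list might look roughly like this sketch (names, positions and the 4-byte COMP size are assumptions; COMP lengths and byte order depend on your COBOL compiler and platform):

LOAD DATA
INFILE 'extract.dat'
INTO TABLE t
(acct_no    POSITION(1:8)   CHAR,              -- PIC X(8)
 amount     POSITION(9:15)  INTEGER EXTERNAL,  -- PIC 9(7) DISPLAY
 rec_count  POSITION(16:19) INTEGER(4)         -- PIC 9(8) COMP (assumed 4-byte binary)
)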

SQL Loader

Zilani, February 22, 2004 - 1:39 am UTC

Hi Tom 
I have a problem of using SQL Loader. I have a table as follows-
SQL> desc Mtblabc;
 Name                 Null?    Type
 -------------------- -------- ------------
 abc_ID               NOT NULL VARCHAR2(16)
 abc_ADDRESS                   ADDRESS
 abc_TYPE             NOT NULL CHAR(1)


there is a column of object type(ADDRESS).

SQL> desc address;
 Name        Null?    Type
 ----------------------------------------------------- 
 HOUSE_NO_ROAD    VARCHAR2(256)
 VILLAGE_CODE   VARCHAR2(2)
 UPAZILLA_CODE   VARCHAR2(2)
 UNION_CODE      VARCHAR2(2)
 DISTRICT_CODE   VARCHAR2(3)

How do I create the control file for this object type column. I did it in the following way-

-- File Name : mtblabc.ctl 
-- File Creation Date : 19-FEB-04 

LOAD DATA 
 INFILE  'mtblabc.txt'
 BADFILE  'mtblabc.bad'
APPEND 
 INTO TABLE
mtblabc  
 TRAILING NULLCOLS 
 (
abc_id               CHAR TERMINATED BY ",",
abc_address        VARRAY COUNT (ADDRESS)
(
ADDRESS     COLUMN OBJECT
(
 house_no_road     TERMINATED BY ",",
 village_code     TERMINATED BY ",",
 upzilla_code     TERMINATED BY ",",
 union_code     TERMINATED BY ",",
 district_code     TERMINATED BY ","
)
)

abc_type        CHAR TERMINTATED BY ","

) 

Is it the right way? Please help me.
When I want to load from this ctl file it gives an error.
Give me some suggestions or any related documents.

Thanks in advance.
 

Tom Kyte
February 22, 2004 - 9:34 am UTC

well, what error and what does your input file look like and how about giving me "create" statements so I can try your example?

SQL Loader

Zilani, March 03, 2004 - 1:58 am UTC

Thanks Tom for your patience. I have already solved my problem. Anyway, I am also giving you the "create" commands below and the control file. My table is a large table, so I cut it down and gave you a small portion, using "abc" for readability. You can try it; better suggestions will be appreciated. In the INFILE clause of the control file I have to give the full path -- without the path sqlldr could not read the .txt file. Is there any way to avoid the full path?
Zilani

CREATE OR REPLACE TYPE ADDRESS AS OBJECT
( house_no_road VARCHAR2(256),
village_code VARCHAR2(2),
union_code VARCHAR2(2),
upazilla_code VARCHAR2(2),
district_code VARCHAR2(3)
);
/

CREATE TABLE Mtblabc
( abc_ID VARCHAR2(16) NOT NULL,
abc_ADDRESS ADDRESS,
abc_TYPE CHAR(1) NOT NULL
);


**********Control File mtblabc.ctl**********
LOAD DATA
INFILE 'F:\CreditBureauDatabase\TechnicalDevelopment\Loader\mtblabc.txt'
BADFILE 'F:\CreditBureauDatabase\TechnicalDevelopment\Loader\mtblabc.bad'
APPEND
INTO TABLE
mtblabc
TRAILING NULLCOLS
(
abc_id CHAR TERMINATED BY ",",
abc_address COLUMN OBJECT (
house_no_road CHAR TERMINATED BY ",",
village_code CHAR TERMINATED BY ",",
union_code CHAR TERMINATED BY ",",
upazilla_code CHAR TERMINATED BY ",",
district_code CHAR TERMINATED BY ","),
abc_type CHAR TERMINATED BY ","
)


*************Data File mtblabc.txt**********
1,1,2,3,4,5,1


Tom Kyte
March 03, 2004 - 9:48 am UTC

run sqlldr from the directory where the file to be loaded resides. it uses the current working directory.
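
For example (paths and credentials are made up):

cd /u01/loads                                   # directory that contains mtblabc.txt
sqlldr userid=scott/tiger control=mtblabc.ctl

Alternatively, pass the data file explicitly with data=/full/path/mtblabc.txt on the command line, or keep the full path in the INFILE clause as you did.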

SQL Loader

Zilani, March 07, 2004 - 2:02 am UTC

Thanks Tom
A great job.


need information

Shivcharan Lavande, May 25, 2004 - 4:02 am UTC

I need to know what would be the ideal way to do the following task.

A client has given me an Excel file with a lot of data. Now the client wants me to
upload this data into table t1 using SQL*Loader.

But the Excel file does not contain all the fields; based on the
given Excel data I need to derive some more fields and then insert into table t1.

Say the Excel file contains 4 columns; using a SQL script I derive
3 more columns, so the consolidated result is 7 columns.

Should I create a temp table?





Tom Kyte
May 25, 2004 - 7:28 am UTC

given excel has tiny limits on sizes -- this is really pretty "small data". I would probably save it as a csv file (comma/tab delimited file), use sqlldr or an external table (9i) to load it into a scratch table, and process it further into the real table.
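
A sketch of the external table approach (9i), assuming the sheet was saved as data.csv with four comma-separated columns; the directory, column names and derivation expressions are just placeholders:

create or replace directory load_dir as '/u01/loads';

create table scratch_ext
( c1 varchar2(100),
  c2 varchar2(100),
  c3 varchar2(100),
  c4 varchar2(100)
)
organization external
( type oracle_loader
  default directory load_dir
  access parameters
  ( records delimited by newline
    fields terminated by ',' optionally enclosed by '"'
    missing field values are null
  )
  location ('data.csv')
)
reject limit unlimited;

insert into t1 (col1, col2, col3, col4, col5, col6, col7)
select c1, c2, c3, c4,
       /* derived column 1 */ null,
       /* derived column 2 */ null,
       /* derived column 3 */ null
  from scratch_ext;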



SQL Loader

deepak samant, June 03, 2004 - 9:24 am UTC

Tom,
Your solutions have helped me in solving many of my problems. Could you please provide me the solution for the following problem?

SQL*Loader-704: Internal error: ulnai1: bad row out of bounds [13]

Regards,
Deepak


Tom Kyte
June 03, 2004 - 1:40 pm UTC

you'll need to contact support for "internal" errors -- unless you give me a reproducible test case...

Loading Variable length records in multiple tables

Yogesh B, July 21, 2004 - 12:38 pm UTC

I want to load variable length records into multiple tables, based on the value in the first column. How should I write the control file?

columns are delimited by '|', first column may contain values from 1-100 ..




Tom Kyte
July 21, 2004 - 2:06 pm UTC

can we use an external table? sqlldr isn't going to be "easier" with that.

Oracle 8.0

Yogesh, July 22, 2004 - 4:52 am UTC

No, can't use external tables, as 8.0 doesn't support them.

Tom Kyte
July 22, 2004 - 7:23 am UTC

http://asktom.oracle.com/~tkyte/SkipCols/index.html

you have to use position, parse the lines. You can use the package referenced by that link as a starting place.

search for that package name on this site to see various examples.

Sql*Loader

ARC, July 22, 2004 - 9:37 am UTC

Hi Tom,
I have a table with a column of LONG data type.
How can I load data into it using SQL*Loader?
Please help me.
Thanks in advance.
ARC

Tom Kyte
July 23, 2004 - 7:35 am UTC

it loads like any other datatype? you might need to specify char(1000000) or something (datatype defaults to char(255))

if you have my book "Expert one on one Oracle" -- in the chapter on sqlldr -- i cover the ins and outs of using sqlldr and spend some time talking about longs.
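
A sketch of such a control file (the table and file names are assumptions; the CHAR(1000000) override is the point, since fields otherwise default to CHAR(255)):

LOAD DATA
INFILE 'long_data.dat'
INTO TABLE t
FIELDS TERMINATED BY '|'
( id,
  txt CHAR(1000000)
)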

Loader Problem

Yogesh, July 22, 2004 - 11:48 am UTC

I have a test data file which has the following data

1|abcd
2|pqr
3|ooo

My controlfile is

load data
append
into table loader1
when (1)='1'
Fields terminated by '|'
(
no1 integer EXTERNAL,
name CHAR
)
into table loader1
when (1)='2'
Fields terminated by '|'
(
no1 integer EXTERNAL,
name CHAR
)
into table loader2
when (1)='3'
Fields terminated by '|'
(
no1 integer EXTERNAL,
name CHAR
)

Table structure of loader1 and loader2 is

Name Null? Type
------------------------------- -------- ----
NO1 NUMBER
NAME VARCHAR2(10)

I'm executing sql loader as
sqlldr / control=loaddata.ctl log=loaddata.log bad =loaddata.bad data=loaddata.dat

Only loader1 table is getting loaded with : 1,abcd

Following is the log

Table LOADER1, loaded when 1:1 = 0X31(character '1')
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect

Column Name Position Len Term Encl Datatype
NO1 FIRST * | CHARACTER
NAME NEXT * | CHARACTER

Table LOADER1, loaded when 1:1 = 0X32(character '2')
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect

Column Name Position Len Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
NO1 NEXT * | CHARACTER
NAME NEXT * | CHARACTER

Table LOADER2, loaded when 1:1 = 0X33(character '3')
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect

Column Name Position Len Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
NO1 NEXT * | CHARACTER
NAME NEXT * | CHARACTER

Record 2: Discarded - all columns null.
Record 3: Discarded - all columns null.

Table LOADER1:
1 Row successfully loaded.
0 Rows not loaded due to data errors.
2 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.


Table LOADER1:
0 Rows successfully loaded.
0 Rows not loaded due to data errors.

2 Rows not loaded because all WHEN clauses were failed.
1 Row not loaded because all fields were null.


Table LOADER2:
0 Rows successfully loaded.
0 Rows not loaded due to data errors.
2 Rows not loaded because all WHEN clauses were failed.
1 Row not loaded because all fields were null.

where I'm going wrong ?



Tom Kyte
July 23, 2004 - 7:57 am UTC

you cannot use the delimited trick here since the first into clause parses the entire line.

see my answer two above where I said "you have to use position" - -as you need to read and re-read the same line over and over -- using "delimited" won't work here at all.

Any better solution ?

Yogesh, July 23, 2004 - 11:18 am UTC

I used the following control file and it worked. Thanks a lot .. but I'm not sure if it is the best way of writing the control file... especially the "when (1)='1'" part ... Can we write this in some better way?

load data
append
into table loader1
when (1)='1'
(
no1 position (1:1024) "delimited.word(:no1,1,NULL,'|')",
name position (1:1) "delimited.word(:no1,2,NULL,'|')"
)
into table loader1
when (1)='2'
(
no1 position (1:1024) "delimited.word(:no1,1,NULL,'|')",
name position (1:1) "delimited.word(:no1,2,NULL,'|')"
)
into table loader2
when (1)='3'
and (2)='0'
(
no1 position (1:1024) "delimited.word(:no1,1,NULL,'|')",
name position (1:1) "delimited.word(:no1,2,NULL,'|')"
)


Tom Kyte
July 23, 2004 - 3:59 pm UTC

you need to use when (1:2) = '1|'

else 1, 10, 100 all look the same.

Sql*Loader

ARC, July 23, 2004 - 2:04 pm UTC

Hi Tom,
Thank you very much. Really it worked out.
I have another strange issue. In my data file, the data for one column comes in multiple lines. How can I load that?

Thanks In Advance
ARC

Tom Kyte
July 23, 2004 - 4:54 pm UTC

do you have my book "Expert one on one Oracle" -- i cover your options in there, in the chapter on sqlldr

see also:
http://asktom.oracle.com/pls/asktom/f?p=100:11:::::P11_QUESTION_ID:1286201753718
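
One common option (a sketch only; the '+' continuation convention, file and table names are all assumptions) is CONTINUEIF, which joins physical lines into one logical record before the fields are parsed. Here any physical line whose last non-blank character is '+' is continued onto the next line:

LOAD DATA
INFILE 'data.dat'
CONTINUEIF LAST = '+'
INTO TABLE t
FIELDS TERMINATED BY ','
( id,
  big_text CHAR(4000)
)

CONTINUEIF THIS, CONTINUEIF NEXT, and CONCATENATE (a fixed number of lines per record) are the other variants documented in the Server Utilities guide.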

reader

A reader, November 25, 2004 - 2:35 pm UTC

Is it possible to construct the control file
so that the rows from the input file are validated based
on the value of a column in an OR predicate

value1 OR value2

and inserted into the table, while rows with a column value
other than value1 OR value2 are not inserted into the
table?

Basically, can an OR predicate be used in a control file?
If not, is there a workaround (other than a trigger
on the input table)?


Tom Kyte
November 25, 2004 - 3:41 pm UTC

external tables -- then you can filter based on anything you want at all -- you have the full power of SQL.

the control file filters are very primitive in comparison (and OR is one of the things it is not so good at)

In this case you could probably use two INTO's with a "when x=y" filter on each (two inserts basically -- single pass on input file)
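
A sketch of that single-pass approach (table and field names are made up); the POSITION(1) on the second INTO clause resets the field scanner back to the start of the record so the same fields can be re-read:

LOAD DATA
INFILE 'data.dat'
APPEND
INTO TABLE t
WHEN (code = 'VALUE1')
FIELDS TERMINATED BY ','
( code POSITION(1) CHAR,
  col2,
  col3 )
INTO TABLE t
WHEN (code = 'VALUE2')
FIELDS TERMINATED BY ','
( code POSITION(1) CHAR,
  col2,
  col3 )

Rows with any other value fail both WHEN clauses and land in the discard file.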

reader

A reader, March 08, 2005 - 12:31 pm UTC

conventional SQL*Loader discards duplicate rows
from the input file while loading into tables

Can the direct load SQL*Loader be configured to
identify duplicate rows and put them in a bad file
instead of loading duplicate rows into tables,
which causes unique constraint issues subsequent
to the load, when enabling unique index constraints?

Tom Kyte
March 08, 2005 - 3:12 pm UTC

no, the goal is "make load fast" here - data is blown in -- to check for duplicates would mandate the index be in place

RE : How to load this data

A reader, April 07, 2005 - 12:03 pm UTC

Hi Tom,

I have a pipe ("|") delimited text data file with 4 fields. I want to load the data into the table with the following structure :

TABLE TEST

Fld1 VARCHAR2(20)
Fld2 VARCHAR2(20)
Fld3 NUMBER(28,12)
Fld4 VARCHAR2(20) <== field that determines the load
Fld5 varchar2(20)

My test data is as follows :

1|ABC|90050|XYZ|annanana
230303|JAJA|00000|ABC|hxhxhxhxhx
23|XYZB|92857|DEF|xxjxjxjxj
23033|JZJZJ|23549|XYZ|sjasjasja
2333888888|JZJZJZ|00006|DEF|uauauauau

I want to load this data whenever field4 is either ABC or DEF. After reading through this discussion I understand that a delimited file will be parsed once and so only the first WHEN condition will be evaluated and so per your example, I created the delimited package. I am not sure what condition should I give to extract only ABC or DEF from field4. I tried several options and finally gave it up. Is there a way to load this data. Below is the sample control file to load this data.

LOAD DATA
INFILE '/tmp/test.dat'
TRUNCATE
INTO TABLE test
(
fld1 POSITION(1:108) "delimited.word(:fld1,1,'''','|')"
, fld2 POSITION(1:1) "delimited.word(:fld1,2,'''','|')"
, fld3 POSITION(1:1) "delimited.word(:fld1,3,'''','|')"
, fld4 POSITION(1:1) "delimited.word(:fld1,4,'''','|')"
, fid5 POSITION(1:1) "delimited.word(:fld1,5,'''','|')"
)

Question

1. Is it doable with SQL*Loader? The DBAs in my group are opposed to using external tables and they do not want to productionize my code if I use them.
2. How should I frame the WHEN condition to extract the correct field position?
I tried using WHEN fld4 = 'DEF' and it did not work.
How should I use the POSITION keyword in the WHEN clause to identify the correct records to load?
The problem is that we are not getting fixed-width files, as these files are generated by external systems.

Any help in this regard will be highly appreciated.

Thanks

Tom Kyte
April 07, 2005 - 12:20 pm UTC

1) ugh -- i hate when that happens. probably gave you a really good technical reason like "they didn't have it in version 6"

rather than call plsql for each line, I'd probably rather just make two passes on the input, one with a control file "when field=abc" and the other with "when field=xyz"
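
A sketch of that two-pass idea (control file names are made up): each control file is an ordinary single-table load whose only difference is the WHEN clause, for example WHEN (fld4 = 'ABC') in one and WHEN (fld4 = 'DEF') in the other, and then:

sqlldr userid=u/p control=load_abc.ctl data=/tmp/test.dat
sqlldr userid=u/p control=load_def.ctl data=/tmp/test.dat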



WHEN Clause

Greg W, April 07, 2005 - 3:10 pm UTC

It seems that the "WHEN" clause works for more than one instance if the field that you're checking is the first one in each record. (Unless I didn't understand the line: "that a delimited file will be parsed once and so only the first WHEN condition will be evaluated"
So... if possible can you modify the input file to get the column of interest to the front? Something like:

awk -F\| '{ print $4"|"$0}' < input > mod.dat
So your input file of:
1|ABC|90050|XYZ|annanana
230303|JAJA|00000|ABC|hxhxhxhxhx
23|XYZB|92857|DEF|xxjxjxjxj
23033|JZJZJ|23549|XYZ|sjasjasja
2333888888|JZJZJZ|00006|DEF|uauauauau

looks like:
XYZ|1|ABC|90050|XYZ|annanana
ABC|230303|JAJA|00000|ABC|hxhxhxhxhx
DEF|23|XYZB|92857|DEF|xxjxjxjxj
XYZ|23033|JZJZJ|23549|XYZ|sjasjasja
DEF|2333888888|JZJZJZ|00006|DEF|uauauauau



Hey Tom - here's the table create:
create table t1 (
Fld1 VARCHAR2(20),
Fld2 VARCHAR2(20),
Fld3 NUMBER(28,12),
Fld4 VARCHAR2(20),
Fld5 varchar2(20));


Then use a control file like:
LOAD DATA
INFILE 'mod.dat'
BADFILE 'mod.bad'
DISCARDFILE 'mod.discard'
into table t1 APPEND
when fil1 = 'ABC'
fields terminated by '|' TRAILING NULLCOLS
( fil1 FILLER position (1),
FLD1,
FLD2,
FLD3,
FLD4,
FLD5)

into table t1 APPEND
when fil1 = 'DEF'
fields terminated by '|' TRAILING NULLCOLS
( fil1 FILLER position (1),
FLD1,
FLD2,
FLD3,
FLD4,
FLD5)

The log file is:
=============================================
SQL*Loader: Release 10.1.0.3.0 - Production on Thu Apr 7 18:54:54 2005

Copyright (c) 1982, 2004, Oracle. All rights reserved.

Control File: mod.ctl
Data File: mod.dat
Bad File: mod.bad
Discard File: mod.discard
(Allow all discards)

Number to load: ALL
Number to skip: 0
Errors allowed: 100000
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional

Table T1, loaded when FIL1 = 0X414243(character 'ABC')
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect

Column Name Position Len Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
FIL1 1 * | CHARACTER
(FILLER FIELD)
FLD1 NEXT * | CHARACTER
FLD2 NEXT * | CHARACTER
FLD3 NEXT * | CHARACTER
FLD4 NEXT * | CHARACTER
FLD5 NEXT * | CHARACTER

Table T1, loaded when FIL1 = 0X444546(character 'DEF')
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect

Column Name Position Len Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
FIL1 1 * | CHARACTER
(FILLER FIELD)
FLD1 NEXT * | CHARACTER
FLD2 NEXT * | CHARACTER
FLD3 NEXT * | CHARACTER
FLD4 NEXT * | CHARACTER
FLD5 NEXT * | CHARACTER

Record 1: Discarded - failed all WHEN clauses.
Record 4: Discarded - failed all WHEN clauses.
Record 6: Discarded - failed all WHEN clauses.

Table T1:
1 Row successfully loaded.
0 Rows not loaded due to data errors.
5 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.


Table T1:
2 Rows successfully loaded.
0 Rows not loaded due to data errors.
4 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.


Space allocated for bind array: 198144 bytes(64 rows)
Read buffer bytes: 1048576

Total logical records skipped: 0
Total logical records read: 6
Total logical records rejected: 0
Total logical records discarded: 3

Run began on Thu Apr 07 18:54:54 2005
Run ended on Thu Apr 07 18:54:55 2005

Elapsed time was: 00:00:00.18
CPU time was: 00:00:00.06
================================================

Did I miss the mark completely? Or is this helpful?

Greg




Tom Kyte
April 07, 2005 - 4:34 pm UTC

as long as the when clause isn't satisfied -- yes. once you fall into a "true" condition -- the referenced fields in that part of the control file advance the line pointer along.

if I were to preprocess the file, I would just filter out the records I didn't want in this case.

Why this control file giving a problem in Oracle 9i not in Oracle 8i ?

Parag J Patankar, April 15, 2005 - 2:12 pm UTC

Hi Tom,

I have the following SQL*Loader ctl file in Oracle 8.1.7, which works fine:

LOAD DATA INFILE '$A2_INFOCENTRE/TQT3S.load' INTO TABLE TQT3S TRUNCATE FIELDS TERMINATED BY '|' TRAILING NULLCOLS
(G4150 ,
G4000 ,
G4300 ,
G4740 )

and my table structure is 
00:04:17 SQL> desc tqt3s
 Name                                      Null?    Type
 ----------------------------------------- -------- ----------------------------
 G4150                                              DATE
 G4000                                     NOT NULL VARCHAR2(3)
 G4300                                     NOT NULL VARCHAR2(5)
 G4740                                     NOT NULL VARCHAR2(1500)

But when I run SQL*Loader with the same ctl file in Oracle 9i R2, it gives a problem (max record length exceeded). It works, however, when I run SQL*Loader with the following ctl file:

LOAD DATA INFILE '$A2_INFOCENTRE/TQT3S.load' INTO TABLE TQT3S TRUNCATE FIELDS TERMINATED BY '|' TRAILING NULLCOLS
(G4150 ,
G4000 ,
G4300 ,
G4740 CHAR(1500)) <== put format char(1500)

Can you tell me what is the reason ?

regards & thanks
pjp 

Tom Kyte
April 15, 2005 - 2:20 pm UTC

same input file exactly?

Input File

Parag J Patankar, April 15, 2005 - 2:26 pm UTC

Hi Tom,

Can you give me your mail ID I will mail it. As this table is very large I can not cut and paste records. I have just copied single record for your reference

20050402|LC1|00426|P1012












@

regards & thanks
pjp

Tom Kyte
April 15, 2005 - 2:38 pm UTC

no, all you need to do is say "the input files were identical" in answer to my question.

the 9i ctl file would work if the last field was 255 or less. neither should have worked without the char(1500) if the field was >255 in the input file since 255 is the default char size in sqlldr and has been for a long time.

sql functions in WHEN CLAUSES

Rahul, April 27, 2005 - 8:49 pm UTC

Tom,

Posting this after trying to get an answer everywhere else. But my question is relevant to this thread, so, here it goes.

I want to access a sql function like 'length' in my WHEN clause in sql loader.

For example:

WHEN addr<>'Error' AND addr<>''

works fine, but when I do this:

WHEN address<>'Error' AND address<>'' AND length(state)=2

then I get an error
"Expecting positive integer or column name, found keyword length."

also tried this:
WHEN addr<>'Error' AND addr<>'' AND "length(state)"=2

then I got the error
"Expecting quoted string or hex identifier, found "2""

In my own defense, I have searched the web and your website but couldn't find anything. If there is a documentation for this sort of thing, please point me to it.

Thank you,
Rahul



Tom Kyte
April 27, 2005 - 9:51 pm UTC

not going to happen, sqlldr is very "primitive"

however, are you using 9i or above? external tables will make this *easy*

sql functions in WHEN CLAUSES followup

Rahul, April 28, 2005 - 10:30 am UTC

Your answer:

"not going to happen, sqlldr is very "primitive"

however, are you using 9i or above? external tables will make this *easy* "

Tom,

Thank you for a quick reply. I am using 10g. I was afraid of that and wanted to dig more into external tables. Well, this is a good place to start. Can you point me to some documentation regarding this part of the external tables?

Love 10g by the way, used the cardinality function for the collections(I don't think that was in 9i).

Thank you

Tom Kyte
April 28, 2005 - 10:34 am UTC

you have the full power of the WHERE clause!

select * from flat_file where <anything you want>;
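
For example, once the flat file is mapped as an external table (say FLAT_FILE, with ADDR and STATE columns following the names in your question, and a TARGET_TABLE whose columns line up -- all assumptions), the filter is just ordinary SQL; note that in SQL the "not empty" test becomes IS NOT NULL:

insert into target_table
select *
  from flat_file
 where addr <> 'Error'
   and addr is not null
   and length(state) = 2;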

cardinality is 9i and above.

Sanjiv Sarkar, May 03, 2005 - 1:48 am UTC

Hi Tom,
I have a problem. I have the following ctl file
Load Data
APPEND
into table batch_conv.temp_super_consumers
TRAILING NULLCOLS
(SEQ_NUM POSITION(1:3) INTEGER EXTERNAL, OLD_SPONSOR_ID POSITION(4:13) INTEGER EXTERNAL, SPONS_CONS_ID POSITION(14:33) INTEGER EXTERNAL,
OLD_CONSUMER_ID POSITION(77:86) CHAR, SPLIT_FLAG POSITION(87:87) CHAR TERMINATED BY WHITESPACE)

I want to pass a constant value (eg. 6) to the column SEQ_NUM. How should i do it? Please help.

Tom Kyte
May 03, 2005 - 8:49 am UTC

seq_num constant 6

see the sqlldr documentation in the Server Utilities guide for more details.
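
Applied to the field list in the question, that might look like this (the POSITION clause is dropped for SEQ_NUM because a CONSTANT column is generated rather than read from the data file):

(SEQ_NUM CONSTANT 6,
 OLD_SPONSOR_ID POSITION(4:13) INTEGER EXTERNAL,
 SPONS_CONS_ID POSITION(14:33) INTEGER EXTERNAL,
 OLD_CONSUMER_ID POSITION(77:86) CHAR,
 SPLIT_FLAG POSITION(87:87) CHAR TERMINATED BY WHITESPACE)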



Loading data into multiple tables from one control file

Laxman Kondal, May 22, 2005 - 8:56 pm UTC

Hi Tom

I am trying to load data from one control file into multiple tables and in position case it works and in comma delimited case it inserts into only the first table. Here is the control file:

This works fine

load data
infile *
INTO TABLE dept1
WHEN recid = '1'
(recid FILLER POSITION(1:1) INTEGER EXTERNAL,
deptno POSITION(3:4) INTEGER EXTERNAL,
dname POSITION(8:21) CHAR,
loc POSITION(23:35) )

INTO TABLE emp1
WHEN recid = '2'
(recid FILLER POSITION(1:1) INTEGER EXTERNAL,
empno POSITION(3:6) INTEGER EXTERNAL,
ename POSITION(8:17) CHAR,
deptno POSITION(19:20) INTEGER EXTERNAL)

INTO TABLE dept2
WHEN recid = '3'
(recid FILLER POSITION(1:1) INTEGER EXTERNAL,
deptno POSITION(3:4) INTEGER EXTERNAL,
dname POSITION(8:21) CHAR,
loc POSITION(23:35) )

INTO TABLE emp2
WHEN recid = '4'
(recid FILLER POSITION(1:1) INTEGER EXTERNAL,
empno POSITION(3:6) INTEGER EXTERNAL,
ename POSITION(8:17) CHAR,
deptno POSITION(19:20) INTEGER EXTERNAL)

BEGINDATA
1 50 Manufacturing Springfield
2 1119 Smith 50
2 1120 Snyder 50
1 60 Shipping Woodbridge
2 1121 Stevens 60
3 50 Manufacturing Springfield
4 1119 Smith 50
4 1120 Snyder 50
3 60 Shipping Woodbridge
4 1121 Stevens 60

and when I change this position to commma delimited then it works for first INTO TABLE dept1 only.

load data
infile *
INTO TABLE dept1
WHEN recid = '1'
FIELDS TERMINATED BY '|'
(recid FILLER INTEGER EXTERNAL,
deptno INTEGER EXTERNAL,
dname CHAR,
loc CHAR)

INTO TABLE emp1
WHEN recid = '2'
FIELDS TERMINATED BY '|'
(recid FILLER INTEGER EXTERNAL,
empno INTEGER EXTERNAL,
ename CHAR,
deptno INTEGER EXTERNAL)

INTO TABLE dept2
WHEN recid = '3'
FIELDS TERMINATED BY '|'
(recid FILLER INTEGER EXTERNAL,
deptno INTEGER EXTERNAL,
dname CHAR,
loc CHAR)

INTO TABLE emp2
WHEN recid = '4'
FIELDS TERMINATED BY '|'
(recid FILLER INTEGER EXTERNAL,
empno INTEGER EXTERNAL,
ename CHAR,
deptno INTEGER EXTERNAL)

BEGINDATA
1|50|Manufacturing|Springfield
2|1119|Smith|50
2|1120|Snyder|50
1|60|Shipping|Woodbridge
2|1121|Stevens|60
3|50|Manufacturing|Springfield
4|1119|Smith|50
4|1120|Snyder|50
3|60|Shipping|Woodbridge
4|1121|Stevens|60


And BAD FILE reads:

Record 2: Discarded - failed all WHEN clauses.
Record 3: Discarded - failed all WHEN clauses.
Record 5: Discarded - failed all WHEN clauses.
Record 6: Discarded - failed all WHEN clauses.
Record 7: Discarded - failed all WHEN clauses.
Record 8: Discarded - failed all WHEN clauses.
Record 9: Discarded - failed all WHEN clauses.
Record 10: Discarded - failed all WHEN clauses.

In document it says:
=========================================================
Relative Positioning Based on Delimiters
The same record could be loaded with a different specification. The following control file uses relative positioning instead of fixed positioning. It specifies that each field is delimited by a single blank (" ") or with an undetermined number of blanks and tabs (WHITESPACE):

INTO TABLE emp
(empno INTEGER EXTERNAL TERMINATED BY " ",
ename CHAR TERMINATED BY WHITESPACE)
INTO TABLE emp
(empno INTEGER EXTERNAL TERMINATED BY " ",
ename CHAR TERMINATED BY WHITESPACE)


The important point in this example is that the second empno field is found immediately after the first ename, although it is in a separate INTO TABLE clause. Field scanning does not start over from the beginning of the record for a new INTO TABLE clause. Instead, scanning continues where it left off.
============================================================

I tried using 'TERMINATED BY WHITESPACE' on the last field, but it is still not loading the remaining three tables.

Where am I going wrong in this case? I need to load data from a '|' delimited file which feeds 29 tables, and the first field is the name of the table the record is supposed to be inserted into.

Is there any way multiple logical records can be loaded into multiple tables in Oracle9iR2 without using external tables?

Thanks and regards.




Tom Kyte
May 23, 2005 - 8:18 am UTC

the problem lies in the way sqlldr parses strings -- it is a single pass, the line is not reparsed for each into, so the first into clause parsed the line and left the "pointer" at the end; the second and remaining intos "didn't have anything left in the string"

you can use something like this:

http://asktom.oracle.com/~tkyte/SkipCols/index.html

the delimited package parses the line and you can reference the i'th column.

You can achieve the same with substr in the ctl file itself if you like.
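
A sketch of that substr/instr variant for the first INTO clause, following the same pattern as the original answer above: the first field grabs the whole line with POSITION(1:1024) and each SQL expression picks out one '|'-delimited piece of it (the other INTO clauses would repeat the same pattern, which also resets the line pointer):

INTO TABLE dept1
WHEN (1:1) = '1'
(deptno POSITION(1:1024) "substr(:deptno, instr(:deptno,'|',1,1)+1, instr(:deptno,'|',1,2)-instr(:deptno,'|',1,1)-1)",
 dname  POSITION(1:1)    "substr(:deptno, instr(:deptno,'|',1,2)+1, instr(:deptno,'|',1,3)-instr(:deptno,'|',1,2)-1)",
 loc    POSITION(1:1)    "substr(:deptno, instr(:deptno,'|',1,3)+1)"
)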



Getting SQL*Loader to begin at the beginning.

A reader, May 23, 2005 - 9:58 am UTC

True, when you have multiple tables being loaded from delimited records, SQL*Loader wants to start the fields for the next table where it left off from the prior one. So you tell it to start over at position 1.

load data
infile *
INTO TABLE dept1
WHEN recid = '1'
FIELDS TERMINATED BY '|'
(recid FILLER POSITION (1) INTEGER EXTERNAL,
deptno INTEGER EXTERNAL,
dname CHAR,
loc CHAR)

INTO TABLE emp1
WHEN recid = '2'
FIELDS TERMINATED BY '|'
(recid FILLER POSITION (1) INTEGER EXTERNAL,
empno INTEGER EXTERNAL,
ename CHAR,
deptno INTEGER EXTERNAL)

You CAN use both POSITION and TERMINATED BY.

Tom Kyte
May 23, 2005 - 2:31 pm UTC

thanks, forest for the trees day.

appreciate the input very much.

Loading data into multiple tables from one control file

Laxman Kondal, May 23, 2005 - 2:17 pm UTC

Hi Tom

I changed it to position (1) as suggested by 'A reader' THANKS, and it works

load data
infile *
INTO TABLE dept1
WHEN recid = '1'
FIELDS TERMINATED BY '|'
(recid FILLER INTEGER EXTERNAL,
deptno INTEGER EXTERNAL,
dname CHAR,
loc CHAR TERMINATED BY WHITESPACE)

INTO TABLE emp1
WHEN recid = '2'
FIELDS TERMINATED BY '|'
(recid FILLER POSITION (1) INTEGER EXTERNAL,
empno INTEGER EXTERNAL,
ename CHAR,
deptno INTEGER EXTERNAL TERMINATED BY WHITESPACE)

INTO TABLE dept2
WHEN recid = '3'
FIELDS TERMINATED BY '|'
(recid FILLER POSITION (1) INTEGER EXTERNAL,
deptno INTEGER EXTERNAL,
dname CHAR,
loc CHAR TERMINATED BY WHITESPACE)

INTO TABLE emp2
WHEN recid = '4'
FIELDS TERMINATED BY '|'
(recid FILLER POSITION (1) INTEGER EXTERNAL,
empno INTEGER EXTERNAL,
ename CHAR,
deptno INTEGER EXTERNAL)

BEGINDATA
1|50|Manufacturing|Springfield
2|1119|Smith|50
2|1120|Snyder|50
1|60|Shipping|Woodbridge
2|1121|Stevens|60
3|50|Manufacturing|Springfield
4|1119|Smith|50
4|1120|Snyder|50
3|60|Shipping|Woodbridge
4|1121|Stevens|60

So both can be used?


html file

Parag Jayant Patankar, May 24, 2005 - 5:13 am UTC

Hi Tom,

I have generated simple html output file of emp table by 

SQL> set markup html on spool on
SQL> spool c:\temp\t.html
SQL> select * from emp;
SQL> spool off
SQL> set markup html off

in Oracle 9.2. My questions regarding this  are 

1. Can we upload this file ( html output ) into emp table again using sqlloader ? if yes pl show it to me by example.

2. Can I use this file in external table ? if yes pl show it to me by example.

regards & thanks
pjp
 

Tom Kyte
May 24, 2005 - 8:04 am UTC

1) not easily, it is HTML, HTML is not a suitable data interchange format.
2) see #1, parsing HTML is a pain in the butt, I would not consider doing it myself.


if you want to unload and reload data, see:
http://asktom.oracle.com/~tkyte/flat/index.html


Escaped Delimiter in the file

Rahul, June 14, 2005 - 3:25 pm UTC

Hi Tom,

Appreciate all the help you are giving to the oracle community.

I am using Oracle 10g and the sqlldr version is 10.1.0.3.0.

I am trying to load a file with an escaped delimiter. My delimiter is a
pipe(|) and the escape character is the backslash character(\). That means the
user wants the pipe character in their text field if it is escaped.

So, my data looks something like:
123456|ABCD\||RAHUL|04/17/2002|

So, my date is the fourth column,but it is trying to load 'RAHUL' into the
date column. The other data is normal. This is the only row that isn't loading. It should load "ABCD|" into the second column.

Thank you,
Rahul.

Tom Kyte
June 14, 2005 - 4:25 pm UTC

look at the OTHER place you asked this same thing.

Sorry about that.

Rahul, June 14, 2005 - 6:04 pm UTC

Hi Tom,

Sorry about that duplicate post. Somehow the post before that was last answered in 2003, so, I thought you weren't monitoring that one. I should have known that it is you maintaining the database.

Rahul

Skipping only the last record in a flat file?

A reader, June 14, 2005 - 10:27 pm UTC

How do I skip a trailer record in a flat file?

For example if I have

field1,field2,field3

Total no of records in extract: 272

Is there a way to instruct SQL*Loader to skip the 'Total no of records...' line without flagging it as a bad record?

Tom Kyte
June 15, 2005 - 3:21 am UTC

use the WHEN clause to say "WHEN the first 7 characters are not 'TOTAL '" in the control file. the Server Utilities guide documents this clause.
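
A sketch (field names taken from the example above; the literal must match however the trailer line actually starts, and it assumes no real data row begins with 'Total'):

LOAD DATA
INFILE 'extract.dat'
APPEND
INTO TABLE t
WHEN (1:5) != 'Total'
FIELDS TERMINATED BY ','
( field1,
  field2,
  field3
)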

SQLLOADER with partition tables

Hannu, June 20, 2005 - 11:41 pm UTC

Let's say I have two partitioned tables and would like to load the data into particular partitions:
create table t_part1
(id number(10),
name varchar2(20),
account number(10)
)
partition by range(id)
(
partition P1 values less than (200),
partition P2 values less than (400)
)

create table t_part2
(id number(10),
name varchar2(20),
account number(10)
)
partition by range(id)
(
partition P1 values less than (200),
partition P2 values less than (400)
)

test.txt :
6,P1,111
350,P2,999
2,P1A,8888

CTL File :
LOAD DATA
INFILE 'C:\test.txt'
BADFILE 'c:\test.bad'

TRUNCATE

INTO TABLE T_PART1
PARTITION(P1)
WHEN (NAME='P1')

FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(ID INTEGER EXTERNAL,
NAME CHAR(10),
ACCOUNT INTEGER EXTERNAL
)


INTO TABLE T_PART2
PARTITION(P1)
WHEN (NAME='P1A')
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(ID INTEGER EXTERNAL,
NAME CHAR(10),
ACCOUNT INTEGER EXTERNAL
)

When SQL*Loader is run, the data is loaded only into the first table, while in the second table the partition P1 is truncated without loading any data.
I really appreciate your response.


Tom Kyte
June 21, 2005 - 8:17 am UTC

you have to "reset" the line pointer in sqlldr.  else the first into clause has already fully parsed the line.

but... in your case that'll be problematic because if we 'reset' the line pointer using position(1:1), it'll skip the first character in the second into clause, eg:

ops$tkyte@ORA9IR2> !sqlldr / t.ctl
 
SQL*Loader: Release 9.2.0.5.0 - Production on Tue Jun 21 08:16:47 2005
 
Copyright (c) 1982, 2002, Oracle Corporation.  All rights reserved.
 
Commit point reached - logical record count 2
 
ops$tkyte@ORA9IR2> !cat t.ctl
LOAD DATA
INFILE *
INTO TABLE DEPT1
truncate
when (dname='ACCOUNTING')
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(DEPTNO, DNAME, LOC)
 
INTO TABLE DEPT2
truncate
when (dname='ART')
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(dummy filler position(1:1),
 deptno, DNAME, LOC)
BEGINDATA
10,"ACCOUNTING",CLEVELAND
20,"ART",SALEM
 
ops$tkyte@ORA9IR2> select * from dept1;
 
    DEPTNO DNAME          LOC
---------- -------------- -------------
        10 ACCOUNTING     CLEVELAND
 
ops$tkyte@ORA9IR2> select * from dept2;
 
    DEPTNO DNAME          LOC
---------- -------------- -------------
         0 ART            SALEM
 


see how deptno is missing the first character..  You might have to make two passes on the input file. 

sqlloader

hannu, June 21, 2005 - 12:13 pm UTC

Can you please give an example of making two passes for the input file

Appreciate your help

Tom Kyte
June 21, 2005 - 5:16 pm UTC

sqlldr u/p control_file1.ctl input.data
sqlldr u/p control_file2.ctl input.data

run sqlldr twice. with different ctl files.

When are blanks not BLANKS?

Richard, June 30, 2005 - 6:53 pm UTC

Hi Tom,

Thanks for all the great information here.

I have a control file which discards records and I can't work out why:
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
OPTIONS (ERRORS=0)
LOAD DATA
INFILE *
DISCARDMAX 0
APPEND
INTO TABLE XXXX.DMPK
WHEN (S_PKTYP=BLANKS)
FIELDS TERMINATED BY '~#~' OPTIONALLY ENCLOSED BY '"'
(
BATCH_ID CONSTANT 1111,
BDT_SRC_ID CONSTANT 22222,
STUDYID CONSTANT '123456/789',
PID POSITION(1) "RTRIM(LTRIM(:PID))",
SUBJID "RTRIM(LTRIM(:SUBJID))",
VOBS "RTRIM(LTRIM(:VOBS))",
CENTREID "RTRIM(LTRIM(:CENTREID))",
VISIT "RTRIM(LTRIM(:VISIT))",
VDAT DATE "YYYYMMDDHH24MISS" NULLIF VDAT=BLANKS,
CRFPAGNO "RTRIM(LTRIM(:CRFPAGNO))",
S_TOBS "RTRIM(LTRIM(:S_TOBS))",
S_PKTYP "RTRIM(LTRIM(:S_PKTYP))",
S_PKTYP_DECODE "RTRIM(LTRIM(:S_PKTYP_DECODE))",
S_RLABNO "RTRIM(LTRIM(:S_RLABNO))",
S_ACTNO "RTRIM(LTRIM(:S_ACTNO))",
S_REQNO "RTRIM(LTRIM(:S_REQNO))",
D_SMPNO NULLIF D_SMPNO=BLANKS,
PKQUES1 "RTRIM(LTRIM(:PKQUES1))",
PKQUES1_DECODE "RTRIM(LTRIM(:PKQUES1_DECODE))",
PKQUES2 DATE "YYYYMMDDHH24MISS" NULLIF PKQUES2=BLANKS,
PKQUES3 "RTRIM(LTRIM(:PKQUES3))",
PKQUES4 DATE "YYYYMMDDHH24MISS" NULLIF PKQUES4=BLANKS,
PKQUES5 "RTRIM(LTRIM(:PKQUES5))",
PKQUES9 "RTRIM(LTRIM(:PKQUES9))",
PKQUES9_DECODE "RTRIM(LTRIM(:PKQUES9_DECODE))"
) BEGINDATA
"AAA.BBB.CCCCC"~#~"CCCCC"~#~"A201080.0"~#~"BBB"~#~"DAY5"~#~"20021002000000"~#~"31"~#~"200.PRE"~#~""~#~""~#~"XXXXXXXXYYYYYY"~#~"XXXXXXXX"~#~"YYYYYY"~#~"1"~#~"Y"~#~"Yes"~#~"20021002000000"~#~"08:10"~#~""~#~""~#~""~#~""
"AAA.DDD.EEEEE"~#~"EEEEE"~#~"A201090.0"~#~"DDD"~#~"DAY7"~#~"20021130000000"~#~"33"~#~"200.POST"~#~""~#~""~#~"WWWWWWWWZZZZZZ"~#~"WWWWWWWW"~#~"ZZZZZZ"~#~"1"~#~"Y"~#~"Yes"~#~"20021130000000"~#~"10:30"~#~""~#~""~#~""~#~""
etc...

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

The (hopefully) relevant bits of the log look like this:

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
SQL*Loader: Release 8.1.7.0.0 - Production on Thu Jun 30 23:33:54 2005
...
Table XXXX.DMPK, loaded when S_PKTYP = BLANKS
Insert option in effect for this table: APPEND

Column Name Position Len Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
...
S_PKTYP NEXT * O(") CHARACTER
Terminator string : '~#~'
SQL string for column : "RTRIM(LTRIM(:S_PKTYP))"
...
Record 1: Discarded - failed all WHEN clauses.
Discard limit reached - processing terminated on data file CtlDataFile_568535.ctl.
...
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

and the table looks like this:

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
NAME Null? Type
------------------------------- --------- -----
BATCH_ID NUMBER(9,0)
BDT_SRC_ID NUMBER(9,0)
STUDYID VARCHAR2(10)
PID VARCHAR2(13)
SUBJID VARCHAR2(22)
VOBS VARCHAR2(9)
CENTREID VARCHAR2(3)
VISIT VARCHAR2(10)
VDAT DATE
CRFPAGNO VARCHAR2(7)
S_TOBS VARCHAR2(14)
S_PKTYP VARCHAR2(2)
S_PKTYP_DECODE VARCHAR2(80)
S_RLABNO VARCHAR2(14)
S_ACTNO VARCHAR2(8)
S_REQNO VARCHAR2(6)
D_SMPNO NUMBER(5,0)
PKQUES1 VARCHAR2(20)
PKQUES1_DECODE VARCHAR2(80)
PKQUES2 DATE
PKQUES3 VARCHAR2(5)
PKQUES4 DATE
PKQUES5 VARCHAR2(5)
PKQUES9 VARCHAR2(20)
PKQUES9_DECODE VARCHAR2(80)

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

How do I get the records without an S_PKTYP value to load?

Thanks,

Richard

Tom Kyte
June 30, 2005 - 8:59 pm UTC

problem is -- it isn't "blank", it isn't there at all.

if there were a blank there it would load. any chance of getting a blank there?

Richard, July 01, 2005 - 5:27 am UTC

I take it there's no equivalent of "WHEN S_PKTYP IS NULL"? I have already tried all the combinations of that that I can think of without success. WHEN S_PKTYP="" appears to be automatically converted to WHEN S_PKTYP=BLANKS.

The control file is generated as part of an ETL procedure, so theoretically it would be possible to get a space in there. Ideally, I'd only want to modify the piece that generates the control file, rather than having to shepherd a space through other parts of the ETL procedure.

Do you know of any tricks for this. For example, would something like this work?

WHEN (S_PKTYP = ' ')
...
S_PKTYP NULLIF S_PKTYP=' ' "NVL(LTRIM(RTRIM(:S_PKTYP)),' ')"

or would this result in a space being loaded into the (VARCHAR2) field?

Tom Kyte
July 01, 2005 - 10:10 am UTC

when (s_pktyp=' ')
....

s_pktyp "LTRIM(RTRIM(:S_PKTYP))"



the when clause is applied way before the SQL function is applied (the SQL function is only applied to the data in the insert, not by sqlldr itself). the ltrim(rtrim()) would get rid of the blanks.



Richard, July 01, 2005 - 12:13 pm UTC

Thanks - it helps to know the execution order of the WHEN clause vs the SQL, but this still requires a space in the data, doesn't it?

Does this mean that there is no way to modify the control file (WHEN clause / col defs only, not the data in this case) and get data to load when the key column (S_PKTYP) is empty? If so, I guess I'll just have to bite the bullet and code in a default value - either a space or something more visible like this:

WHEN S_PKTYP='!NULL!'
...
S_PKTYP NULLIF S_PKTYP='!NULL!' "LTRIM(RTRIM(:S_PKTYP))"

Right?

Final question: is NULLIF evaluated before or after the SQL? So, in the example above, would ' !NULL! ' be loaded as NULL or '!NULL!' (ignoring the fact that it wouldn't be loaded at all given that the WHEN clause would kick in before the trimming)?

Tom Kyte
July 01, 2005 - 1:51 pm UTC

yes, there is no way around that that I'm aware of, if you want to use BLANKS, there must be a space.

nullif is a sqlldr thing, controls what sqlldr will bind into the generated insert statement.

Sqlldr

Laxman Kondal, July 06, 2005 - 3:56 pm UTC

Hi Tom

I get this at the end of the sqlldr log

---------------
Bind array size not used in direct path.
Column array rows : 5000
Stream buffer bytes: 256000
Read buffer bytes: 1048576

Total logical records skipped: 0
Total logical records read: 108533
Total logical records rejected: 0
Total logical records discarded: 1
Total stream buffers loaded by SQL*Loader main thread: 50
Total stream buffers loaded by SQL*Loader load thread: 6

Run began on Wed Jul 06 14:58:46 2005
Run ended on Wed Jul 06 14:58:57 2005

Elapsed time was: 00:00:11.65
CPU time was: 00:00:04.81
------------

and when I open the discard file there is nothing - what does this
'Total logical records discarded: 1'
mean?

And secondly, I am running this from PL/SQL using your run_cmd function
---------
v_str := p_userlogin||'/'||p_pwd||'@'||v_glb_name
--sqlldr
v_imp_cmd := v_bin||'bin/sqlldr '
||v_str||' '
||v_dir||'load_hi_from_txt.ctl '
||'LOG='||v_dir||'load_hi_txt_log.log '
||'BAD='||v_dir||'load_hi_txt_bad.bad '
||'DISCARD='||v_dir||'load_hi_txt_discard.dis '
||'DIRECT=true ';
--
v_rn_cmd_ret := Run_Cmd(v_imp_cmd);
---------

When there is an error then v_rn_cmd_ret = 1,
and on other occasions v_rn_cmd_ret = 0 or 2. In the case of the above log I got v_rn_cmd_ret = 2.

What's the difference between 0 and 2? And if I need the PL/SQL code to continue when it has loaded all records, which v_rn_cmd_ret should I use? In this case v_rn_cmd_ret = 2, I don't find any record in the discard file, and the BAD file is not created at all.

Thanks and regards.

Tom Kyte
July 07, 2005 - 8:27 am UTC

are you sure you are opening the right "discard" file and that no one else has overwritten it.


http://docs.oracle.com/docs/cd/B10501_01/server.920/a96652/ch04.htm#1005021

for return codes.

SQLLDR

Laxman Kondal, July 07, 2005 - 10:55 am UTC

Hi Tom

Thanks for the reply. The discard file does not show any record when opened after running sqlldr.

Tail part of log file shows:
--
Total logical records read: 994
Total logical records rejected: 0
Total logical records discarded: 1
--
and there are only 993 records in the tables. Does this 994 include the one logical record discarded, and how can I eliminate this discarded record showing up in the log file and get exit code 0 instead of 2?

--------------------
Date cache:
Max Size: 1000
Entries : 1
Hits : 22
Misses : 0

Bind array size not used in direct path.
Column array rows : 5000
Stream buffer bytes: 256000
Read buffer bytes: 1048576

Total logical records skipped: 0
Total logical records read: 994
Total logical records rejected: 0
Total logical records discarded: 1
Total stream buffers loaded by SQL*Loader main thread: 14
Total stream buffers loaded by SQL*Loader load thread: 0

Run began on Thu Jul 07 10:27:12 2005
Run ended on Thu Jul 07 10:27:15 2005

Elapsed time was: 00:00:02.81
CPU time was: 00:00:00.25
----------------

Thanks and regards

Tom Kyte
July 07, 2005 - 1:30 pm UTC

perhaps the discarded record is "blank"? and that is why you cannot "see it"

SQLLDR

Laxman Kondal, July 07, 2005 - 11:17 am UTC

Hi Tom

Never mind, I did some searching in my code and found that there are 26 tables being loaded; not all of them may have records to load, but all are loaded/appended via the same txt file. So the first thing it does is create a file with no records:
------
l_rows := dump_csv( 'select null FROM DUAL',
'|',
'EXP_DIR',
'exp_txt_file.txt',
'w');
------
which creates a new file and writes a NULL row; that row is discarded, and so the discard file is created.

The requirement is to create the file every time and then append to it.

Is there anything other than NULL I can use just to create a new file (without writing any logical record) and then append to it?

Thanks and regards

Tom Kyte
July 07, 2005 - 1:40 pm UTC

utl_file can create a file.
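
For what it's worth, a minimal sketch of that (assuming the same EXP_DIR directory used by dump_csv above): open the file in write mode and close it immediately, which creates (or truncates) an empty file containing no logical records at all, so there is nothing for sqlldr to discard:

declare
    l_file utl_file.file_type;
begin
    -- 'w' creates the file if it does not exist, or truncates it if it does
    l_file := utl_file.fopen( 'EXP_DIR', 'exp_txt_file.txt', 'w' );
    utl_file.fclose( l_file );
end;
/

The subsequent dump_csv calls can then open the same file in append ('a') mode.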

SQLLDR

Laxman Kondal, July 07, 2005 - 11:52 am UTC

Hi Tom

Finally I got it done by adding an IF clause in dump_csv: if the mode is append, just close the file; otherwise write to the file. That returns exit code = 0.

Thanks and regards.

bad file

Geetha, July 12, 2005 - 1:22 am UTC

Hi Tom,
I am new to the subject; I know PL/SQL and SQL, but I have only a little knowledge of SQL*Loader. I want to know how we get external data into our database (using SQL*Loader), and how to create the bad and control files.
If anything is wrong please excuse me
Thanks in advance,
Geetha

Tom Kyte
July 12, 2005 - 4:40 pm UTC

Server Utilities guide:

</code> http://docs.oracle.com/docs/cd/B10501_01/server.920/a96652/toc.htm <code>

documents sqlldr, examples are in $ORACLE_HOME/rdbms/demo

and if you have access to my book Expert One on One Oracle -- I think I have a pretty decent sqlldr chapter in there.
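
To give a flavour of what a control file looks like, a tiny sketch (the table and file names here are made up, not from the question). The control file describes the input and the target table; the bad file is created by sqlldr automatically when it rejects a record - you do not create it yourself:

-- emp.ctl
LOAD DATA
INFILE 'emp.dat'
INTO TABLE emp_stage
FIELDS TERMINATED BY ','
( empno, ename, hiredate DATE "DD-MON-YYYY" )

and then from the command line:

sqlldr scott/tiger control=emp.ctl log=emp.log bad=emp.bad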

Rejected data's..

Sabir, August 26, 2005 - 5:08 am UTC

Tom,

while loading through SQL*Loader I received the following error, with 51 records rejected:

Total of skipped logical record: 0
Total of read logical record: 8883
Total of refused logical record: 51
Total of abandoned logical record: 0

The error I found in the log file is as follows:

Record 8883:It was refused. - The error occurred by table MST_KYOTAKU_200508 and row COLUMN25.
The field of the data file exceeds the maximum length.

It says that a field in the data file exceeds the maximum length, but the length of every field is no more than 320, and my
data type is varchar2(2000).

What mistake did I make? The data is in Japanese.
Kindly let me know how to go about diagnosing this, i.e. how to load all of the data.


Tom Kyte
August 26, 2005 - 8:25 am UTC

varchar2(2000) is 2000 bytes - could a 320 CHARACTER field be more than 2000 bytes in your character set?
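
(If multi-byte data is the cause, two things worth checking - this is a sketch, not part of the original exchange: give the column character rather than byte semantics, and give sqlldr an explicit, generous field length in the control file, since it assumes CHAR(255) by default.)

-- let the column hold 2000 characters regardless of how many bytes each character needs
-- (CHAR length semantics need 9i or later)
alter table mst_kyotaku_200508 modify ( column25 varchar2(2000 char) );

-- and in the control file, a wider field (hypothetical entry):
-- COLUMN25 CHAR(6000)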

Rejected data's..

sabir, August 26, 2005 - 11:08 pm UTC

Tom,

The problem has been solved by changing the varchar2(2000) to char(1000).

Thanks for your patience.

But you don't need the filler to reset the position...

Vladimir Andreev, September 26, 2005 - 9:26 am UTC

<quote src=followup from June 20, 2005>
but... in your case that'll be problematic because if we 'reset' the line 
pointer using position(1:1), it'll skip the first character in the second into 
clause, eg:
</quote>

We can 'reset' the line using position(1) and avoid the first character being 'eaten' by a filler field:

SQL> ! cat t.ctl
LOAD DATA
INFILE *
INTO TABLE DEPT1
truncate
when (dname='ACCOUNTING')
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(DEPTNO, DNAME, LOC)
 
INTO TABLE DEPT2
truncate
when (dname='ART')
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(deptno<b> position(1)</b>, DNAME, LOC)
BEGINDATA
10,"ACCOUNTING",CLEVELAND
20,"ART",SALEM

SQL> !sqlldr flado t.ctl
Password:*

SQL*Loader: Release 9.2.0.4.0 - Production on Mon Sep 26 14:10:17 2005

Copyright (c) 1982, 2002, Oracle Corporation.  All rights reserved.

Commit point reached - logical record count 2

SQL> select * from dept1;

    DEPTNO DNAME           LOC
---------- --------------- ---------------
        10 ACCOUNTING      CLEVELAND

SQL> select * from dept2;

    DEPTNO DNAME           LOC
---------- --------------- ---------------
        20 ART             SALEM


see how deptno is <b>not</b> missing the first character..  You <b>don't</b> have to make two passes on the input file.

;-)

Cheers,
Flado 

Tom Kyte
September 26, 2005 - 9:41 am UTC

very nice, thanks for the followup

SQLLDR using parameters

V, December 29, 2005 - 12:22 pm UTC

Is there any way to pass in the file to load using sqlldr?
i.e. Control file would look like:

LOAD DATA
infile $1
...
...

Tom Kyte
December 29, 2005 - 1:17 pm UTC

$ sqlldr .... data=filename .....
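
In other words (a sketch - the file names are invented), leave a placeholder INFILE in the control file and override it at run time with the DATA command line parameter:

-- t.ctl
LOAD DATA
INFILE 'placeholder.dat'
INTO TABLE t
FIELDS TERMINATED BY ','
( x, y )

sqlldr scott/tiger control=t.ctl data=tuesday_feed.dat log=tuesday_feed.log

With a single INFILE, the value given on the command line is used instead of the one named in the control file.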







SQLLDR want to load data where all columns are null

parag jayant patankar, March 28, 2006 - 6:12 am UTC

Hi, 

I am having following table 


09:48:42 SQL> desc t
 Name                                      Null?    Type
 ----------------------------------------- -------- ----------------------------
 A                                                   CHAR(1)
 B                                                   NUMBER


I am having following contorlfile t.ctl to upload records 


LOAD DATA 
INFILE * 
INTO TABLE T 
REPLACE 
FIELDS TERMINATED BY ',' 
TRAILING NULLCOLS 
( 
a  nullif a="(null)" 
,b nullif b="(null)" 
) 
BEGINDATA 
A,2 
(null),(null) 
(null),(null) 
(null),9 
B,(null) 
(null),10 
(null),(null) 
C,1 
(null),(null) 
(null),(null) 
(null),(null) 


When I try to do the upload using sqlldr (sqlldr xxxx/yyyy control=t.ctl log=t.log) I get the following in the log:


Table T, loaded from every logical record. 
Insert option in effect for this table: REPLACE 
TRAILING NULLCOLS option in effect 


   Column Name                  Position   Len  Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
A                                   FIRST     *   ,       CHARACTER
    NULL if A = 0X286e756c6c29(character '(null)')
B                                    NEXT     *   ,       CHARACTER
    NULL if B = 0X286e756c6c29(character '(null)')


Record 2: Discarded - all columns null. 
Record 3: Discarded - all columns null. 
Record 7: Discarded - all columns null. 
Record 9: Discarded - all columns null. 
Record 10: Discarded - all columns null. 
Record 11: Discarded - all columns null. 


Table T: 
  5 Rows successfully loaded. 
  0 Rows not loaded due to data errors. 
  0 Rows not loaded because all WHEN clauses were failed. 


I also want to insert the records where all fields are null. How can I do this in SQLLDR?


thanks & regards 
pjp 


 

Tom Kyte
March 28, 2006 - 8:02 am UTC

first thing that popped into my head:

change

(
a nullif a="(null)"
,b nullif b="(null)"
)

to

(
a "decode(:a,'(null)',NULL,:a)",
b "decode(:b,'(null)',NULL,:b)"
)
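
The reason this works (my wording, not part of the original followup): to sqlldr the fields now contain the literal text '(null)', so the record is no longer "all columns null" and is not discarded; the DECODE in the generated INSERT is what turns the value into a real NULL. Put together, the control file would look something like:

LOAD DATA
INFILE *
INTO TABLE T
REPLACE
FIELDS TERMINATED BY ','
TRAILING NULLCOLS
(
 a "decode(:a,'(null)',NULL,:a)"
,b "decode(:b,'(null)',NULL,:b)"
)
BEGINDATA
A,2
(null),(null)
(null),9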

Multi table inserts

Jamu, July 21, 2006 - 9:31 am UTC

Is it not possible to do multi table inserts based on the value of a column using comma delimiters?

LOAD DATA
INFILE 'd:\aa.TXT'
INTO TABLE p
FIELDS TERMINATED BY '^' OPTIONALLY ENCLOSED BY '" ' TRAILING NULLCOLS
when a = '1'
(...
)
into table t
when a= '2
(
)

Tom Kyte
July 23, 2006 - 7:32 am UTC

it is not possible to do a multi-table insert with sqlldr at all.

with SQL and external tables - it is.

sqlldr can have multiple into statements (not a multi-table insert, that is sql) but the when clause is very primitive (supports AND only - no OR)
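
For comparison, a multi-table insert driven from an external table could look something like this (a sketch - the table and column names are invented, not taken from the question):

insert first
  when a = '1' then
    into p ( a, b, c ) values ( a, b, c )
  when a = '2' then
    into t ( a, b, c ) values ( a, b, c )
select a, b, c
  from my_external_table;

Here the WHEN clauses are full SQL conditions, so OR, IN, LIKE and so on are all available - none of the sqlldr restrictions apply.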

Using ROWS,BINDSIZE,READSIZE

Jamu, August 17, 2006 - 10:03 am UTC

Thanks Tom for the reply. I saw it just today, having been out of the country.

Need your expert advice on the following:

My sqlldr syntax is:

SQLLDR userid/pwd@connstr control=d:\qlloader\test_loader.txt bindsize=20971520 readsize=20971520 rows=5000

The max row length for a record is 463 bytes derived from sum(data_length) from user_tab_columns. And the actual record in the datafile is also the same.

I expected the load to go at 5000 rows at a time (5000 * 463 = 2,315,000), which is much less than the bindsize/readsize given. But the load goes through at ROWS 2087 (this is the max I can achieve), as shown in the log file extract below:

value used for ROWS parameter changed from 5000 to 2087
Space allocated for bind array: 20999394 bytes(2087 rows)

I had changed the READSIZE value in my above sqlldr statement after hitting the following:
"SQL*Loader-569: READSIZE parameter exceeds maximum value 20971520 for platform"

Even the space allocated for the bind array, 20999394 bytes, should have taken care of 5000 rows given the size of each row. Is there any way to get sqlldr to use my ROWS value? Any other configuration/parameter that I am missing?

Thanks

Tom Kyte
August 17, 2006 - 10:21 am UTC

the bind size is setup based on the MAXIMUM POSSIBLE record, not the "actual observed max record" - for the simple reason that we don't know what the actual observed max record width will be until we hit it!  You have to size for worst case.

Look at your ctl file and use that (with worst case sizes - defaults to char(255) for each column).

consider:


ops$tkyte%ORA10GR2> !head t.ctl
LOAD DATA
INFILE *
INTO TABLE t
REPLACE
FIELDS TERMINATED BY '|'
(
x
)
BEGINDATA
         1

<b>here it'll assume char(255)</b>

ops$tkyte%ORA10GR2> !sqlldr / t bindsize=1000000 readsize=1000000 rows=5000

SQL*Loader: Release 10.2.0.2.0 - Production on Thu Aug 17 10:16:36 2006

Copyright (c) 1982, 2005, Oracle.  All rights reserved.

Commit point reached - logical record count 3875
Commit point reached - logical record count 5950
Commit point reached - logical record count 9825
Commit point reached - logical record count 11907
Commit point reached - logical record count 15782
Commit point reached - logical record count 17864
Commit point reached - logical record count 21739
Commit point reached - logical record count 23821
Commit point reached - logical record count 27696
Commit point reached - logical record count 29778
Commit point reached - logical record count 33653
Commit point reached - logical record count 35735
Commit point reached - logical record count 39610
Commit point reached - logical record count 41692
Commit point reached - logical record count 45567
Commit point reached - logical record count 47649
Commit point reached - logical record count 50033

ops$tkyte%ORA10GR2> edit t.ctl

ops$tkyte%ORA10GR2> !head t.ctl
LOAD DATA
INFILE *
INTO TABLE t
REPLACE
FIELDS TERMINATED BY '|'
(
x char(4000)
)
BEGINDATA
         1

<b>now it assumes 4000 characters instead, same parameters to sqlldr, but very different bind sizes</b>

ops$tkyte%ORA10GR2> !sqlldr / t bindsize=1000000 readsize=1000000 rows=5000

SQL*Loader: Release 10.2.0.2.0 - Production on Thu Aug 17 10:16:51 2006

Copyright (c) 1982, 2005, Oracle.  All rights reserved.

Commit point reached - logical record count 249
Commit point reached - logical record count 498
Commit point reached - logical record count 747
Commit point reached - logical record count 996
Commit point reached - logical record count 1245
Commit point reached - logical record count 1494
Commit point reached - logical record count 1743
Commit point reached - logical record count 1992
Commit point reached - logical record count 2241
Commit point reached - logical record count 2490
Commit point reached - logical record count 2739
Commit point reached - logical record count 2988
Commit point reached - logical record count 3237
Commit point reached - logical record count 3486
Commit point reached - logical record count 3735
 

Performance in load

Jamu, August 17, 2006 - 1:13 pm UTC

While I was more or less able to achieve the ROWS value specified, it did not make any appreciable difference to the time taken to load. The load with rows=2000 took 2 minutes for 1.2 million records and with rows=40000 took 1:30 min. I have read somewhere on your site about a load of a million records in a few seconds. One more load, into a wider table with 115 columns, was taking 15 minutes for 5 million records with rows=2000 and took 13 min with rows=14000 (this was the max I could get after modifying the control file with all the data lengths). There was no other activity on the box at the time of the load. Is there anything you could throw light on?

Tom Kyte
August 17, 2006 - 1:28 pm UTC



it is a function of rows*width.
it is a function of your IO capabilities.

I would not expect huge differences with such large arraysizes (and you'd likely find they are the "same" if you ran enough observations - two runs would not be definitive - too many outside variables affect you in a conventional path load)


what does "direct=y" do for you.
does parallel direct path load apply in your case (can you use that).

just a tiny example, direct pathed:

Table BIG_TABLE:
1000000 Rows successfully loaded.
0 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.

Date cache:
Max Size: 1000
Entries : 13
Hits : 1999987
Misses : 0

Bind array size not used in direct path.
Column array rows : 5000
Stream buffer bytes: 256000
Read buffer bytes: 1048576

Total logical records skipped: 0
Total logical records read: 1000000
Total logical records rejected: 0
Total logical records discarded: 0
Total stream buffers loaded by SQL*Loader main thread: 1616
Total stream buffers loaded by SQL*Loader load thread: 0

Run began on Thu Aug 17 13:21:38 2006
Run ended on Thu Aug 17 13:21:55 2006

Elapsed time was: 00:00:16.74
CPU time was: 00:00:03.86


conventional pathed:


Table BIG_TABLE:
1000000 Rows successfully loaded.
0 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.


Space allocated for bind array: 231168 bytes(64 rows)
Read buffer bytes: 1048576

Total logical records skipped: 0
Total logical records read: 1000000
Total logical records rejected: 0
Total logical records discarded: 0

Run began on Thu Aug 17 13:23:51 2006
Run ended on Thu Aug 17 13:24:42 2006

Elapsed time was: 00:00:51.33
CPU time was: 00:00:18.75



Using Direct Path

Jamu, August 17, 2006 - 3:36 pm UTC

Thanks for such a quick response.

I am not able to use a direct path load, as a sequence is used for one of the columns - and the sequence values are required to follow the order of the records in the file.

for eg in data file:
aaa,111,someremarks
aaa,222,somemoreremarks
bbb,333,remarks
bbb,444,remarks
...
and the table has a row-key column that gets the sequence during the load. The sequence has to be applied in the order of the records in the file for our logic, and direct path does not guarantee this. I hope I am right here!
Any more hints on where to look at?

Tom Kyte
August 17, 2006 - 3:48 pm UTC

use insert /*+ APPEND */ and an external table.

or a direct path load, sequences can be used (up your sequence cache!!)

Table BIG_TABLE, loaded from every logical record.
Insert option in effect for this table: REPLACE

Column Name Position Len Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
ID FIRST * | CHARACTER
SQL string for column : "s.nextval"

OWNER NEXT * | CHARACTER
..
SECONDARY NEXT * | CHARACTER


Table BIG_TABLE:
1000000 Rows successfully loaded.
0 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
...
Elapsed time was: 00:00:11.51
CPU time was: 00:00:06.13


Sequence not used in Direct Path

Jamu, August 17, 2006 - 4:27 pm UTC

--Direct Path is not using sequence..

LOAD DATA
INFILE 'D:\dataload.TXT'
REPLACE INTO TABLE TBL
FIELDS TERMINATED BY '^' OPTIONALLY ENCLOSED BY '" ' TRAILING NULLCOLS
(

C1 CHAR(1),
C2 CHAR(1),
C3 CHAR(10) "NVL(:C3,'AAA')",
ROW_KEY CHAR(20) "row_key_SEQ.NEXTVAL"
)
CREATE TABLE TBL(
C1 CHAR(1),
C2 CHAR(1),
C3 CHAR(10) ,
ROW_KEY CHAR(20)
)

drop sequence row_key_seq
create sequence row_key_SEQ

Datafile:

A^B^CCCC
B^C^DDDD
C^D^EEEE
D^E^FFFF


sqlldr uid/pwd@cstr control=loadtest.txt direct=y

The row_key column is null after load.

Tom Kyte
August 17, 2006 - 5:24 pm UTC

how about the external table - use external_table=generate_only on the sqlldr command line.
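
That is, something like (a sketch, reusing the file names from the example above):

sqlldr uid/pwd@cstr control=loadtest.txt external_table=generate_only log=loadtest.log

Nothing is loaded; sqlldr just writes the external table DDL and the INSERT statement it would have used into the log file.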

External table clause

Jamu, August 17, 2006 - 9:28 pm UTC

The log file generated with external_table=generate_only is:


SQL*Loader: Release 10.2.0.1.0 - Production on Thu Aug 17 21:15:56 2006

Copyright (c) 1982, 2005, Oracle. All rights reserved.

Control File: testload.txt
Data File: c:\dataload.TXT
Bad File: dataload.bad
Discard File: none specified

(Allow all discards)

Number to load: ALL
Number to skip: 0
Errors allowed: 50
Continuation: none specified
Path used: External Table

Table TBL, loaded from every logical record.
Insert option in effect for this table: REPLACE
TRAILING NULLCOLS option in effect

Column Name Position Len Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
C1 FIRST 1 ^ CHARACTER
Enclosure string : '" '
C2 NEXT 1 ^ CHARACTER
Enclosure string : '" '
C3 NEXT 10 ^ CHARACTER
Enclosure string : '" '
SQL string for column : "NVL(:C3,'AAA')"
ROW_KEY NEXT 20 ^ CHARACTER
Enclosure string : '" '
SQL string for column : "row_key_SEQ.NEXTVAL"



CREATE DIRECTORY statements needed for files
------------------------------------------------------------------------
CREATE DIRECTORY SYS_SQLLDR_XT_TMPDIR_00000 AS 'C:\'
CREATE DIRECTORY SYS_SQLLDR_XT_TMPDIR_00000 AS 'c:\'


CREATE TABLE statement for external table:
------------------------------------------------------------------------
CREATE TABLE "SYS_SQLLDR_X_EXT_TBL"
(
"C1" CHAR(1),
"C2" CHAR(1),
"C3" VARCHAR(255),
"ROW_KEY" VARCHAR(255)
)
ORGANIZATION external
(
TYPE oracle_loader
DEFAULT DIRECTORY SYS_SQLLDR_XT_TMPDIR_00000
ACCESS PARAMETERS
(
RECORDS DELIMITED BY NEWLINE CHARACTERSET WE8MSWIN1252
BADFILE 'SYS_SQLLDR_XT_TMPDIR_00000':'dataload.bad'
LOGFILE 'testload.log_xt'
READSIZE 1048576
FIELDS TERMINATED BY "^" OPTIONALLY ENCLOSED BY '" ' LDRTRIM
MISSING FIELD VALUES ARE NULL
REJECT ROWS WITH ALL NULL FIELDS
(
"C1" CHAR(1)
TERMINATED BY "^" OPTIONALLY ENCLOSED BY '" ',
"C2" CHAR(1)
TERMINATED BY "^" OPTIONALLY ENCLOSED BY '" ',
"C3" CHAR(10)
TERMINATED BY "^" OPTIONALLY ENCLOSED BY '" ',
"ROW_KEY" CHAR(20)
TERMINATED BY "^" OPTIONALLY ENCLOSED BY '" '
)
)
location
(
'dataload.TXT'
)
)REJECT LIMIT UNLIMITED


INSERT statements used to load internal tables:
------------------------------------------------------------------------
INSERT /*+ append */ INTO TBL
(
C1,
C2,
C3,
ROW_KEY
)
SELECT
"C1",
"C2",
NVL("C3",'AAA'),
row_key_SEQ.NEXTVAL
FROM "SYS_SQLLDR_X_EXT_TBL"


statements to cleanup objects created by previous statements:
------------------------------------------------------------------------
DROP TABLE "SYS_SQLLDR_X_EXT_TBL"
DROP DIRECTORY SYS_SQLLDR_XT_TMPDIR_00000
DROP DIRECTORY SYS_SQLLDR_XT_TMPDIR_00000



Run began on Thu Aug 17 21:15:56 2006
Run ended on Thu Aug 17 21:15:58 2006

Elapsed time was: 00:00:02.69
CPU time was: 00:00:00.10
--------------------------------------------------

Are you suggesting using this script to do the load through external table instead of using direct=y from sqlldr?

Your example above, I thought, did use the sequence. Was the example you gave through the conventional path?

Tom Kyte
August 18, 2006 - 7:56 am UTC

yes, I'm suggesting (always will suggest) using an external table - they rock.

I found a bug filed against the sequence with direct path load. It is filed as a documentation bug however, I'm not happy with the results there. It silently failed, there is no indication it doesn't work, I don't think that is right and will be chasing that down.

Thanks so much

Jamu, August 18, 2006 - 9:21 am UTC

Thanks for the bug information. And sure will use the external table.

By the way, when we use the APPEND hint for the insert, new blocks are grabbed without looking at the existing free space in the used blocks, right?

And when we truncate the table, does the entire space used get released back to the system?

One more thing: does the external table script that gets generated need to be tuned for heavy loads (50 million+ rows), or is the script generated taking into account the number of records in the file?

Thanks again for the time and such quick responses.

Tom Kyte
August 18, 2006 - 10:30 am UTC

append is a direct path load, it writes above the high water mark, correct.

when you truncate, you lower the high water mark to the lowest point.

You can "parallel" the external table and insert - enabling a parallel direct path load if you like, much much easier than scripting sqlldr to do the same.
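
A sketch of that (invented names - EXT_SRC is the external table over the flat file, TGT is the real table):

alter table ext_src parallel 4;            -- read the flat file with parallel slaves
alter session enable parallel dml;         -- required before a parallel INSERT

insert /*+ append parallel(tgt, 4) */ into tgt ( c1, c2, c3 )
select c1, c2, c3
  from ext_src;

commit;                                    -- the loading session cannot even query TGT until it commits

That gives a parallel direct path load with two statements, no sqlldr scripting required.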

skip and FILLER keyword

Alex, July 25, 2007 - 12:17 pm UTC

Tom

Regarding

http://asktom.oracle.com/~tkyte/SkipCols/index.html

Q1 What is the max length of the column that one can skip ?

I have a situation like this :

(
field1 char(14),
field2 char(80),
field3 filler,
field4 char(20),
.
.
.
)

originally field3 was defined as CHAR(8000) (in the table it was defined as a LONG datatype),
but now I want to skip this field altogether (the LONG column was also removed from the table, as I am moving to a partitioned table), so I used FILLER, but then I got the error

"Field in data file exceeds maximum length".


Q2 how would you overcome this problem ?

Thank you for your help.
Tom Kyte
July 26, 2007 - 5:39 pm UTC

... FIELDS TERMINATED BY ','
(a, b filler char(8000), c)

....

a FILLER field defaults to char(255) like any other field - give it an explicit length.

SQLLDR

Robert, June 20, 2008 - 6:06 am UTC

Hello Tom

We're using SQLLDR to distribute data from one flat file to several tables too.

The controlfile looks like this:
=============================================================
OPTIONS(rows=1000,bindsize=5000000,readsize=5000000)

LOAD DATA
APPEND

INTO TABLE TB_CART_ST_REVERSAL
WHEN (TABLENAME = 'TB_CART_ST_REVERSAL|') -- Filter
FIELDS TERMINATED BY '|'
TRAILING NULLCOLS
(
TABLENAME position(1:20) NULLIF TABLENAME = BLANKS "rtrim(:TABLENAME,'|')",
ID NULLIF ID = BLANKS "TRIM(:ID)",
SECURITY NULLIF SECURITY = BLANKS "TRIM(:SECURITY)",
CUSTOMER NULLIF CUSTOMER = BLANKS "TRIM(:CUSTOMER)",
CCY NULLIF CCY = BLANKS "TRIM(:CCY)",
<...an so on...>
)


INTO TABLE TB_CART_MM_REVERSAL
WHEN (TABLENAME = 'TB_CART_MM_REVERSAL|') -- Filter
FIELDS TERMINATED BY '|'
TRAILING NULLCOLS
(
TABLENAME position(1:20) NULLIF TABLENAME = BLANKS "rtrim(:TABLENAME,'|')",
ID NULLIF ID = BLANKS "TRIM(:ID)",
CLIENT NULLIF CLIENT = BLANKS "TRIM(:CLIENT)",
NAME NULLIF NAME = BLANKS "TRIM(:NAME)",
CCY NULLIF CCY = BLANKS "TRIM(:CCY)",
<...and so on...>
)
=============================================================

Our administrators tried to set up an automation using a program called UC4
to control the flow of copying, loading and archiving the provided flat files.

This works properly up to the point when a flat file contains rows
that fail all WHEN clauses.

Record 1: Discarded - failed all WHEN clauses.
Record 2: Discarded - failed all WHEN clauses.

In this case SQLLDR returns an errorcode 2.

If there's a real error, like a bad date format,
and the row is stored in the BAD file,
SQLLDR returns errorcode 2 as well!

And even if there are blank rows in a file,
we get the errorcode 2.

We only get code 0 if all the records are matched
by a WHEN clause.

We want to distinguish between BAD lines and DISCARDED lines.
The flow should only raise an alarm if there are BAD lines.
Blank lines and comments in the flat file should be ignored.
The return codes of SQLLDR cannot be used for this.

SKIP would be a solution if the comments had a fixed length and were
positioned at the beginning, but that's not guaranteed.

I would need something like a WHEN discarded THEN skip.

Question:
Is there anything that could solve my problem?

Greetings and thanks in advance
Robert

Tom Kyte
June 20, 2008 - 10:26 am UTC

I'd prefer to use an external table - then you would be using PLSQL to access this data and you have the ability to PROGRAM things.

http://docs.oracle.com/docs/cd/B19306_01/server.102/b14215/ldr_params.htm#sthref646
those are the sqlldr exit codes, three conditions cause "ex_warn" (#2)

All or some rows rejected EX_WARN
All or some rows discarded EX_WARN
Discontinued load EX_WARN


short of grepping the log files after the fact and looking for what you want to see - not really.

Reader, July 29, 2008 - 12:44 pm UTC

Tom,
Can you please give some suggestions regarding automating the creation of SQL*Loader control files?

This is just a sample table. I have to create control files for 10 tables which have more than 100 columns. Can you please let me know if I can automate this?

create table tst
(
dt_key number(8)
,Record_Type varchar2(2)
,ID varchar2(20)
,Description varchar2(36)
,C_ID varchar2(9)
,S_ID varchar2(7)
,Common_Code number(9)
)

LOAD DATA
INFILE infile.dat
BADFILE infile.bad
DISCARDFILE infile.dsc
INSERT
INTO TABLE tst
(Record_Type position(1:2) char
,ID position(3:22) char
,Description position(23:58) char
,C_ID position(59:67) char
,S_ID position(68:74) char
,Common_Code POSITION(75:83) INTEGER EXTERNAL
,dt_key POSITION(84:91) INTEGER EXTERNAL
)
Tom Kyte
August 01, 2008 - 9:40 am UTC

not unless you can tell us the fixed logic here.

for example, record_type - is that special - is it always first
dt_key (which seems to be a date, probably yyyymmdd - you used the wrong datatype in your create table) - is that special - always last

all other columns - are they always in the middle?


if so, pretty easy, just query the dictionary, use a plsql block to write (using dbms_output) your control file, the dictionary has the names, the positions, the lengths of all columns
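
A rough sketch of that kind of generator (assumptions: the table name is hard-coded, every field's width in the file equals DATA_LENGTH in the dictionary - true for the CHAR columns here, but you would substitute your real widths for the NUMBER columns - and the dictionary column order matches the file order):

set serveroutput on

declare
   l_pos  pls_integer := 1;
   l_sep  varchar2(1) := ' ';
begin
   dbms_output.put_line( 'LOAD DATA' );
   dbms_output.put_line( 'INFILE infile.dat' );
   dbms_output.put_line( 'BADFILE infile.bad' );
   dbms_output.put_line( 'DISCARDFILE infile.dsc' );
   dbms_output.put_line( 'INSERT' );
   dbms_output.put_line( 'INTO TABLE tst' );
   dbms_output.put_line( '(' );
   for c in ( select column_name, data_type, data_length
                from user_tab_columns
               where table_name = 'TST'
               order by column_id )
   loop
      dbms_output.put_line( l_sep || c.column_name ||
          ' position(' || l_pos || ':' || ( l_pos + c.data_length - 1 ) || ') ' ||
          case when c.data_type = 'NUMBER'
               then 'integer external'
               else 'char'
          end );
      l_pos := l_pos + c.data_length;
      l_sep := ',';
   end loop;
   dbms_output.put_line( ')' );
end;
/

Spool that to a .ctl file (one per table) and adjust by hand wherever the generated widths don't match the file.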

Reader, July 29, 2008 - 5:10 pm UTC

Can you please answer the above question?
Tom Kyte
August 01, 2008 - 10:38 am UTC

wow, guess I'll have to quit my day job.

you couldn't even wait 5 hours, 5 short hours...

wow......


How to get the name of data file loaded

A reader, August 12, 2008 - 11:39 am UTC

hi,

In SQLLDR, is there any way I can get the name of the data file being loaded into the table (along with the other data in the data file)? SQLLDR will be called from VBA (we are not working in Unix).

thanks
Anto
Tom Kyte
August 13, 2008 - 8:19 am UTC

you would use VBA to create the control file, the control file would have a constant field, that constant field would be the name of the file you are going to load - it would then be loaded as a constant.

I would suggest that you actually just do your own load routine - calling sqlldr will involve you executing an external program and then reading the log file to figure out if it worked or not - you'll have much better flow control if you do the insertion yourself, you'll at least know what happened.
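
To illustrate the first suggestion (a sketch - the table, columns and file name are all invented): the control file that VBA writes out for each run simply embeds the file name twice, once as the INFILE and once as a CONSTANT column:

LOAD DATA
INFILE 'c:\feeds\orders_20080812.csv'
APPEND
INTO TABLE orders_stage
FIELDS TERMINATED BY ','
(
  order_id,
  order_amt,
  source_file CONSTANT 'orders_20080812.csv'
)

Every row loaded from that file then carries the file name in SOURCE_FILE.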

Robert, August 29, 2008 - 1:26 pm UTC

Hello Tom
Some months ago I asked for a solution to distinguish discarded and rejected rows.
You told me to work with external tables. Our admins didn't want that and they used SKIP to avoid discarded rows.
Now we have the problem that the header of the files is not fixed, so SKIP cannot be used anymore.
So I have to find another way to avoid discards.

One idea is to store comments and blank-rows in a temp-table, which will be truncated on each load.

With the following part in the control file I store all rows that don't look like data in the table TB_CART_NO_DATA_LINE_CONTAINER:

INTO TABLE TB_CART_NO_DATA_LINE_CONTAINER
TRUNCATE
WHEN (LINE_PREFIX != 'TB_CART') -- Filter
TRAILING NULLCOLS
(
LINE_PREFIX                 position(1:7) NULLIF LINE_PREFIX  = BLANKS   ":LINE_PREFIX"
)

This works fine if the row is not empty.
Then I don't get a discard and the return code of sqlldr is zero, even if there are comments in the file.

But it does not work for lines that only consist of a return.

How can I catch these rows in a WHEN-CLAUSE ?
I tried everything, but there's no way to make a comparison like LINE_PREFIX IS NULL.

I know this is all quite a stupid solution.
It would be better if SQLLDR provided different return values for each case.

Thanks in advance
   Robert



Tom Kyte
August 30, 2008 - 9:54 am UTC

can you bring your admins to this site and have them say why external tables are so scary to them?


here is the butt easy way around your problem

a) create a stage table, seq number, data varchar2(4000)
b) sqlldr into that
c) run a process against that table to do whatever you need.

sqlldr is not incredibly programmable...
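
A minimal sketch of a/b/c (names invented; the 4000-byte maximum line length is an assumption):

create table stage_lines
( seq   number,
  line  varchar2(4000)
);

-- stage.ctl : no WHEN clauses, so every line - data, comment or otherwise - is simply loaded
LOAD DATA
INFILE 'flatfile.txt'
TRUNCATE
INTO TABLE stage_lines
(
  seq   RECNUM,
  line  POSITION(1:4000)
)

The filtering that the WHEN clauses used to do then becomes ordinary SQL, e.g.

select line
  from stage_lines
 where line like 'TB_CART%'
 order by seq;

(Completely empty lines are worth testing separately - if sqlldr still counts them as discards, the external table route is the cleaner answer.)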

load 2 files into 1 table based on a Key

Nicole, September 25, 2008 - 2:18 pm UTC

I'm not sure if this is possible and nothing that I've been reading is pointing me in the right direction. I really appreciate any help you could give me.
I have 1 table with the following fields:

COLUMNA
COLUMNB
COLUMNC
COLUMND

The data for COLUMNA, COLUMNB and COLUMNC are in 1 .csv file and the data for COLUMNA and COLUMND are in another .csv file. I have no problem loading the data from the first file into my table, however, I can't work out how to load the data from the second file into the same table based on the key (COLUMNA).
Is this possible with sql loader?
Tom Kyte
September 25, 2008 - 4:43 pm UTC

use an external table - you need SQL, not sqlldr

sqlldr can only "load" a record - insert it.

You want to load a JOIN of these two files.

create two external tables and then you can load:


insert into mytable
select a.a, a.b, a.c, b.d
from a, b
where a.a = b.a;

(or an outer join or full outer join if there could be data in A not in B and vice versa)

Table creation corresponding to .bad file

Dheeraj, September 30, 2008 - 4:54 am UTC

Hi Tom,

In continuation of the above, I have a requirement wherein, when I execute the SQLLDR command, I wish to insert all the rejected records
into a table along with their Oracle errors.
(Basically, I wish to use the bad file for the rejected rows and the log file for the Oracle errors.)

Example:

SQLLDR table structure:

t_sqlldr_data

empid: NUMBER
deptid: NUMBER
deptname: VARCHAR2(10)

Data:

empid  deptid  deptname
1      2       def

Bad table structure:

t_sqlldr_bad

empid: VARCHAR2(100)
deptid: VARCHAR2(100)
deptname: VARCHAR2(100)
ora_error: VARCHAR2(100)

Bad table data:

empid  deptid  deptname  ora_error
1      abc     def       Record 111: Rejected - Error on table t_sqlldr_data, column deptid. ORA-01722 invalid number

OR
bad_col: VARCHAR2(100)
ora_error: VARCHAR2(100)

bad_col   ora_error
abc       Record 111: Rejected - Error on table t_sqlldr_data, column deptid. ORA-01722 invalid number

I want to kick off the above process as soon as SQLLDR has run successfully.

Any pointers shall be highly appreciated.

Thanks,

Dheeraj

Tom Kyte
September 30, 2008 - 8:34 am UTC

your "requirement" cannot be satisfied - unless you do it.

If you want to "kick off some process", you would do that in your script. You must be doing "something" to run sqlldr - after you do that something, do something else - that something else loads your bad file into a table - after merging up the errors (that is, you write a program)


or, simply use external tables plus dml error logging:

ops$tkyte%ORA10GR2> create or replace directory my_dir as '/home/tkyte'
  2  /

Directory created.

ops$tkyte%ORA10GR2>
ops$tkyte%ORA10GR2> CREATE TABLE ET
  2  (
  3    "USERNAME" VARCHAR2(4000),
  4    "USER_ID" varchar2(4000),
  5    "CREATED" varchar2(4000)
  6  )
  7  ORGANIZATION external
  8  (
  9    TYPE oracle_loader
 10    DEFAULT DIRECTORY MY_DIR
 11    ACCESS PARAMETERS
 12    (
 13      RECORDS DELIMITED BY NEWLINE CHARACTERSET US7ASCII
 14      BADFILE 't.bad'
 15      LOGFILE 't.log_xt'
 16      FIELDS TERMINATED BY "|" LDRTRIM
 17      REJECT ROWS WITH ALL NULL FIELDS
 18      (
 19        "USERNAME" CHAR(4000) TERMINATED BY "|",
 20        "USER_ID" CHAR(4000) TERMINATED BY "|",
 21        "CREATED" CHAR(4000) TERMINATED BY "|"
 22      )
 23    )
 24    location
 25    (
 26      't.dat'
 27    )
 28  )
 29  /

Table created.

ops$tkyte%ORA10GR2>
ops$tkyte%ORA10GR2> create table t as select * from all_users where 1=0;

Table created.

ops$tkyte%ORA10GR2> exec dbms_errlog.create_error_log('T');

PL/SQL procedure successfully completed.

ops$tkyte%ORA10GR2>
ops$tkyte%ORA10GR2> !cat t.dat
BIG_TABLExxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx|58|14-DEC-05
DIP|19a|30-JUN-05
TSMSYS|21|32-JUN-05
OPS$ORA10GR2|56|14-DEC-05
FOO$TKYTE|60|19-DEC-05
A|388|24-JUN-08
MAP|263|07-JAN-08

ops$tkyte%ORA10GR2> insert into t select * from et log errors reject limit unlimited;

4 rows created.

ops$tkyte%ORA10GR2> select * from t;

USERNAME                          USER_ID CREATED
------------------------------ ---------- ---------
OPS$ORA10GR2                           56 14-DEC-05
FOO$TKYTE                              60 19-DEC-05
A                                     388 24-JUN-08
MAP                                   263 07-JAN-08

ops$tkyte%ORA10GR2> select ora_err_number$, ora_err_mesg$, substr(username,1,20), substr(user_id,1,10), substr(created,1,10) from err$_t;

ORA_ERR_NUMBER$ ORA_ERR_MESG$                            SUBSTR(USERNAME,1,20 SUBSTR(USE SUBSTR(CRE
--------------- ---------------------------------------- -------------------- ---------- ----------
          12899 ORA-12899: value too large for column "O BIG_TABLExxxxxxxxxxx 58         14-DEC-05
                PS$TKYTE"."T"."USERNAME" (actual: 48, ma
                ximum: 30)

           1722 ORA-01722: invalid number                DIP                  19a        30-JUN-05
           1847 ORA-01847: day of month must be between  TSMSYS               21         32-JUN-05
                1 and last day of month




Table creation corresponding to .bad file

Dheeraj, October 17, 2008 - 1:49 am UTC

Makes good sense, Tom.

However, is it possible to tweak the above output so that we need not display all the columns of a rejected row? In the 2nd row of the example shown above, we don't really know which column has the invalid number unless we see the rejected row data as well as each column's data type.
(Imagine 1000 rows rejected for a table having 100+ columns.)

Instead, we would want following output:

Value too large for column - user_name 'BIG-xxxxxxxxxxx'
Invalid number, user_id 19a

An error message that we could pull from .bad file itself.

Cheers,

Dheeraj
Tom Kyte
October 17, 2008 - 9:23 pm UTC

or, you could just take the row from the bad table, try to insert it and get the error message specific to that very row.....


eg: you don't really need to write any code to get the exact specific error message for that row, just do an insert into T select from error-log-table where...; for that failed row.

How to load multiple rows into one table column

David, November 13, 2008 - 2:06 pm UTC

Tom,

I have data to load like this:

"ID","Field1","Field2","Field3","Field4","Field5",
1,103712,2004,6,22,36820
2,103712,2004,6,22,36821
3,103712,2004,6,22,36822

I need to load this data into a table like this:
SQL> desc ahstdn
 Name                          Null?    Type
 --------------------------- -------- ---------------
 DNHSTD                                VARCHAR2(1)
 DNPAT                                 NUMBER(9)
 DNTDT                       NOT NULL  DATE
 DNSEQ                                 NUMBER(5)

So I need to load the data from "Field2", "Field3" and "Field4" into the table column "DNTDT". Do you think I can do this with SQL*Loader?

Many thanks for your great help!


Problem faced with TAB-delimited data

Rajeshwaran Jeyabal, January 24, 2009 - 5:33 am UTC

Tom,

     I have a control file like this.

LOAD DATA
INFILE *
TRUNCATE
INTO TABLE EMP1
FIELDS TERMINATED BY WHITESPACE OPTIONALLY ENCLOSED BY " " TRAILING NULLCOLS
(
 EMPNO  ,
 ENAME ,
 JOB   ,
 MGR   ,
 HIREDATE ,
 SAL      ,
 COMM     ,
 DEPTNO   
)

BEGINDATA
      7369 SMITH      CLERK           7902 17-DEC-80        800                    20
      7499 ALLEN      SALESMAN        7698 20-FEB-81       1600        300         30
      7521 WARD       SALESMAN        7698 22-FEB-81       1250        500         30
      7566 JONES      MANAGER         7839 02-APR-81       2975                    20
      7654 MARTIN     SALESMAN        7698 28-SEP-81       1250       1400         30
      7698 BLAKE      MANAGER         7839 01-MAY-81       2850                    30
      7782 CLARK      MANAGER         7839 09-JUN-81       2450                    10
      7788 SCOTT      ANALYST         7566 19-APR-87       3000                    20
      7839 KING       PRESIDENT            17-NOV-81       5000                    10
      7844 TURNER     SALESMAN        7698 08-SEP-81       1500          0         30
      7876 ADAMS      CLERK           7788 23-MAY-87       1100                    20
      7900 JAMES      CLERK           7698 03-DEC-81        950                    30
      7902 FORD       ANALYST         7566 03-DEC-81       3000                    20
      7934 MILLER     CLERK           7782 23-JAN-82       1300                    10

Once done, here is the data from the EMP1 table. One record is rejected; the contents of BAD.TXT are:
      7839 KING       PRESIDENT            17-NOV-81       5000                    10


scott@ORCL> select * from emp1;

     EMPNO ENAME      JOB              MGR HIREDATE         SAL       COMM     DEPTNO
---------- ---------- --------- ---------- --------- ---------- ---------- ----------
      7369 SMITH      CLERK           7902 17-DEC-80        800         20
      7499 ALLEN      SALESMAN        7698 20-FEB-81       1600        300         30
      7521 WARD       SALESMAN        7698 22-FEB-81       1250        500         30
      7566 JONES      MANAGER         7839 02-APR-81       2975         20
      7654 MARTIN     SALESMAN        7698 28-SEP-81       1250       1400         30
      7698 BLAKE      MANAGER         7839 01-MAY-81       2850         30
      7782 CLARK      MANAGER         7839 09-JUN-81       2450         10
      7788 SCOTT      ANALYST         7566 19-APR-87       3000         20
      7844 TURNER     SALESMAN        7698 08-SEP-81       1500          0         30
      7876 ADAMS      CLERK           7788 23-MAY-87       1100         20
      7900 JAMES      CLERK           7698 03-DEC-81        950         30
      7902 FORD       ANALYST         7566 03-DEC-81       3000         20
      7934 MILLER     CLERK           7782 23-JAN-82       1300         10

13 rows selected.

1) If you look at the first record in the EMP1 table, the column COMM has the value 20, but it should have gone into the DEPTNO column.
   The same thing happened for records 4, 6, 7, 9, 10, 11 and 12 with the above TAB-delimited data.
   Can you please show me how to fix this? I also tried the option "FIELDS TERMINATED BY x'09'", but it seemed to be of no use.



Tom Kyte
January 24, 2009 - 1:27 pm UTC

ops$tkyte%ORA10GR2> !cat t.ctl
LOAD DATA
INFILE *
TRUNCATE
INTO TABLE EMP
FIELDS TERMINATED BY x'09'
( EMPNO  , ENAME , JOB   , MGR   , HIREDATE , SAL      , COMM     , DEPTNO   )
BEGINDATA
7369    SMITH   CLERK   7902    17-DEC-80       800             20

ops$tkyte%ORA10GR2> !od -c t.ctl
0000000   L   O   A   D       D   A   T   A  \n   I   N   F   I   L   E
0000020       *  \n   T   R   U   N   C   A   T   E  \n   I   N   T   O
0000040       T   A   B   L   E       E   M   P  \n   F   I   E   L   D
0000060   S       T   E   R   M   I   N   A   T   E   D       B   Y
0000100   x   '   0   9   '      \n   (       E   M   P   N   O
0000120   ,       E   N   A   M   E       ,       J   O   B
0000140   ,       M   G   R               ,       H   I   R   E   D   A
0000160   T   E       ,       S   A   L                           ,
0000200   C   O   M   M                       ,       D   E   P   T   N
0000220   O               )  \n   B   E   G   I   N   D   A   T   A  \n
0000240   7   3   6   9  \t   S   M   I   T   H  \t   C   L   E   R   K
0000260  \t   7   9   0   2  \t   1   7   -   D   E   C   -   8   0  \t
0000300   8   0   0  \t  \t   2   0  \n
0000310

ops$tkyte%ORA10GR2> !sqlldr / t

SQL*Loader: Release 10.2.0.1.0 - Production on Sat Jan 24 13:24:40 2009

Copyright (c) 1982, 2005, Oracle.  All rights reserved.

Commit point reached - logical record count 1

ops$tkyte%ORA10GR2> exec print_table( 'select * from emp' );
.EMPNO                        : 7369
.ENAME                        : SMITH
.JOB                          : CLERK
.MGR                          : 7902
.HIREDATE                     : 17-dec-1980 00:00:00
.SAL                          : 800
.COMM                         :
.DEPTNO                       : 20
-----------------

PL/SQL procedure successfully completed.

Tab-delimited data having NULL values.

Rajeshwaran, Jeyabal, January 26, 2009 - 11:31 pm UTC


Tom,
Here are my control file and log file. I am using SQL*Loader on Oracle 10gR1.

CONTROL FILE
============

LOAD DATA
INFILE *
TRUNCATE
INTO TABLE EMP1
FIELDS TERMINATED BY x'09' TRAILING NULLCOLS
(
EMPNO,
ENAME,
JOB,
MGR,
HIREDATE,
SAL,
COMM,
DEPTNO
)

BEGINDATA
7369 SMITH CLERK 7902 17-DEC-80 800 20
7499 ALLEN SALESMAN 7698 20-FEB-81 1600 300 30
7521 WARD SALESMAN 7698 22-FEB-81 1250 500 30
7566 JONES MANAGER 7839 02-APR-81 2975 20
7654 MARTIN SALESMAN 7698 28-SEP-81 1250 1400 30
7698 BLAKE MANAGER 7839 01-MAY-81 2850 30
7782 CLARK MANAGER 7839 09-JUN-81 2450 10
7788 SCOTT ANALYST 7566 19-APR-87 3000 20
7839 KING PRESIDENT 17-NOV-81 5000 10
7844 TURNER SALESMAN 7698 08-SEP-81 1500 0 30
7876 ADAMS CLERK 7788 23-MAY-87 1100 20
7900 JAMES CLERK 7698 03-DEC-81 950 30
7902 FORD ANALYST 7566 03-DEC-81 3000 20
7934 MILLER CLERK 7782 23-JAN-82 1300 10

LOG FILE
=========

SQL*Loader: Release 10.1.0.2.0 - Production on Tue Jan 27 09:46:03 2009

Copyright (c) 1982, 2004, Oracle. All rights reserved.

Control File: C:\TEMP\CTL.TXT
Data File: C:\TEMP\CTL.TXT
Bad File: C:\TEMP\BAD.TXT
Discard File: none specified

(Allow all discards)

Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional

Table EMP1, loaded from every logical record.
Insert option in effect for this table: TRUNCATE
TRAILING NULLCOLS option in effect

Column Name Position Len Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
EMPNO FIRST * WHT CHARACTER
ENAME NEXT * WHT CHARACTER
JOB NEXT * WHT CHARACTER
MGR NEXT * WHT CHARACTER
HIREDATE NEXT * WHT CHARACTER
SAL NEXT * WHT CHARACTER
COMM NEXT * WHT CHARACTER
DEPTNO NEXT * WHT CHARACTER

Record 1: Rejected - Error on table EMP1, column COMM.
ORA-01722: invalid number

Record 4: Rejected - Error on table EMP1, column COMM.
ORA-01722: invalid number

Record 6: Rejected - Error on table EMP1, column COMM.
ORA-01722: invalid number

Record 7: Rejected - Error on table EMP1, column COMM.
ORA-01722: invalid number

Record 8: Rejected - Error on table EMP1, column COMM.
ORA-01722: invalid number

Record 9: Rejected - Error on table EMP1, column MGR.
ORA-01722: invalid number

Record 11: Rejected - Error on table EMP1, column COMM.
ORA-01722: invalid number

Record 12: Rejected - Error on table EMP1, column COMM.
ORA-01722: invalid number

Record 13: Rejected - Error on table EMP1, column COMM.
ORA-01722: invalid number

Record 14: Rejected - Error on table EMP1, column COMM.
ORA-01722: invalid number


Table EMP1:
4 Rows successfully loaded.
10 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.


Space allocated for bind array: 132096 bytes(64 rows)
Read buffer bytes: 1048576

Total logical records skipped: 0
Total logical records read: 14
Total logical records rejected: 10
Total logical records discarded: 0

Run began on Tue Jan 27 09:46:03 2009
Run ended on Tue Jan 27 09:46:03 2009

Elapsed time was: 00:00:00.28
CPU time was: 00:00:00.03

scott@ORCL> DESC EMP1;
Name Null? Type
----------------------------------------------------------------------- -------- ------------------
EMPNO NUMBER(4)
ENAME VARCHAR2(10)
JOB VARCHAR2(9)
MGR NUMBER(4)
HIREDATE DATE
SAL NUMBER(7,2)
COMM NUMBER(7,2)
DEPTNO NUMBER(2)

scott@ORCL> SELECT * FROM EMP1;

EMPNO ENAME JOB MGR HIREDATE SAL COMM DEPTNO
---------- ---------- --------- ---------- --------- ---------- ---------- ----------
7499 ALLEN SALESMAN 7698 20-FEB-81 1600 300 30
7521 WARD SALESMAN 7698 22-FEB-81 1250 500 30
7654 MARTIN SALESMAN 7698 28-SEP-81 1250 1400 30
7844 TURNER SALESMAN 7698 08-SEP-81 1500 0 30

Here is my bad file details.
===========================
7369 SMITH CLERK 7902 17-DEC-80 800 20
7566 JONES MANAGER 7839 02-APR-81 2975 20
7698 BLAKE MANAGER 7839 01-MAY-81 2850 30
7782 CLARK MANAGER 7839 09-JUN-81 2450 10
7788 SCOTT ANALYST 7566 19-APR-87 3000 20
7839 KING PRESIDENT 17-NOV-81 5000 10
7876 ADAMS CLERK 7788 23-MAY-87 1100 20
7900 JAMES CLERK 7698 03-DEC-81 950 30
7902 FORD ANALYST 7566 03-DEC-81 3000 20
7934 MILLER CLERK 7782 23-JAN-82 1300 10

1) Only four records got inserted, Tom; the rest of them are treated as bad records. Why is this happening?
I want NULL in the COMM column when no value exists; you can see that the rejected records have the COMM column as NULL.
Also, the COMM column is nullable in the table definition. I am working with Oracle 10g on a Windows environment.


Tom Kyte
January 28, 2009 - 8:20 am UTC

you do know that with html, I cannot actually "see" your tabs.

I showed you mine:

ops$tkyte%ORA10GR2> !od -c t.ctl


now, you figure out a way to show me your tabs..... I cannot tell what your inputs actually *are*

Tab-delimited data having NULL values

Rajeshwaran, Jeyabal, January 28, 2009 - 9:36 am UTC

Tom,

 Really sorry to trouble you again.

 Here is how I generate the tab-delimited data from SQL*Plus.
scott@ORCL> col TAB# new_value TAB NOPRINT
scott@ORCL> select chr(9) TAB# from dual;




scott@ORCL> set colsep "&TAB"
scott@ORCL> spool c:\temp\emp_text.txt
scott@ORCL> select * from emp;

     EMPNO      ENAME           JOB                    MGR      HIREDATE               SAL            COMM          DEPTNO
----------      ----------      ---------       ----------      ---------       ----------      ----------      ----------
      7369      SMITH           CLERK                 7902      17-DEC-80              800                              20
      7499      ALLEN           SALESMAN              7698      20-FEB-81             1600             300              30
      7521      WARD            SALESMAN              7698      22-FEB-81             1250             500              30
      7566      JONES           MANAGER               7839      02-APR-81             2975                              20
      7654      MARTIN          SALESMAN              7698      28-SEP-81             1250            1400              30
      7698      BLAKE           MANAGER               7839      01-MAY-81             2850                              30
      7782      CLARK           MANAGER               7839      09-JUN-81             2450                              10
      7788      SCOTT           ANALYST               7566      19-APR-87             3000                              20
      7839      KING            PRESIDENT                       17-NOV-81             5000                              10
      7844      TURNER          SALESMAN              7698      08-SEP-81             1500               0              30
      7876      ADAMS           CLERK                 7788      23-MAY-87             1100                              20
      7900      JAMES           CLERK                 7698      03-DEC-81              950                              30
      7902      FORD            ANALYST               7566      03-DEC-81             3000                              20
      7934      MILLER          CLERK                 7782      23-JAN-82             1300                              10

14 rows selected.

scott@ORCL> spool off;

 After that I manually removed the query (select * from emp) and the output column names (empno, ename, job, mgr, hiredate, sal, comm, deptno), and at the
 end of the file I removed "14 rows selected." and finally saved the file. So now the contents look like this:

      7369 SMITH      CLERK           7902 17-DEC-80        800                    20
      7499 ALLEN      SALESMAN        7698 20-FEB-81       1600        300         30
      7521 WARD       SALESMAN        7698 22-FEB-81       1250        500         30
      7566 JONES      MANAGER         7839 02-APR-81       2975                    20
      7654 MARTIN     SALESMAN        7698 28-SEP-81       1250       1400         30
      7698 BLAKE      MANAGER         7839 01-MAY-81       2850                    30
      7782 CLARK      MANAGER         7839 09-JUN-81       2450                    10
      7788 SCOTT      ANALYST         7566 19-APR-87       3000                    20
      7839 KING       PRESIDENT            17-NOV-81       5000                    10
      7844 TURNER     SALESMAN        7698 08-SEP-81       1500          0         30
      7876 ADAMS      CLERK           7788 23-MAY-87       1100                    20
      7900 JAMES      CLERK           7698 03-DEC-81        950                    30
      7902 FORD       ANALYST         7566 03-DEC-81       3000                    20
      7934 MILLER     CLERK           7782 23-JAN-82       1300                    10

Here is my control file.
========================

LOAD DATA
INFILE *
TRUNCATE
INTO TABLE EMP1
FIELDS TERMINATED BY x'09' TRAILING NULLCOLS
(
 EMPNO,
 ENAME,
 JOB,
 MGR,
 HIREDATE,
 SAL,
 COMM,
 DEPTNO 
)

BEGINDATA
      7369 SMITH      CLERK           7902 17-DEC-80        800                    20
      7499 ALLEN      SALESMAN        7698 20-FEB-81       1600        300         30
      7521 WARD       SALESMAN        7698 22-FEB-81       1250        500         30
      7566 JONES      MANAGER         7839 02-APR-81       2975                    20
      7654 MARTIN     SALESMAN        7698 28-SEP-81       1250       1400         30
      7698 BLAKE      MANAGER         7839 01-MAY-81       2850                    30
      7782 CLARK      MANAGER         7839 09-JUN-81       2450                    10
      7788 SCOTT      ANALYST         7566 19-APR-87       3000                    20
      7839 KING       PRESIDENT            17-NOV-81       5000                    10
      7844 TURNER     SALESMAN        7698 08-SEP-81       1500          0         30
      7876 ADAMS      CLERK           7788 23-MAY-87       1100                    20
      7900 JAMES      CLERK           7698 03-DEC-81        950                    30
      7902 FORD       ANALYST         7566 03-DEC-81       3000                    20
      7934 MILLER     CLERK           7782 23-JAN-82       1300                    10

Here is my Bad file contents
============================

      7369 SMITH      CLERK           7902 17-DEC-80        800                    20
      7566 JONES      MANAGER         7839 02-APR-81       2975                    20
      7698 BLAKE      MANAGER         7839 01-MAY-81       2850                    30
      7782 CLARK      MANAGER         7839 09-JUN-81       2450                    10
      7788 SCOTT      ANALYST         7566 19-APR-87       3000                    20
      7839 KING       PRESIDENT            17-NOV-81       5000                    10
      7876 ADAMS      CLERK           7788 23-MAY-87       1100                    20
      7900 JAMES      CLERK           7698 03-DEC-81        950                    30
      7902 FORD       ANALYST         7566 03-DEC-81       3000                    20
      7934 MILLER     CLERK           7782 23-JAN-82       1300                    10

Here is my log file contents
=============================

SQL*Loader: Release 10.1.0.2.0 - Production on Wed Jan 28 19:52:42 2009

Copyright (c) 1982, 2004, Oracle.  All rights reserved.

Control File:   c:\temp\ctl.txt
Data File:      c:\temp\ctl.txt
  Bad File:     c:\temp\bad.txt
  Discard File: c:\temp\disc.txt 
 (Allow all discards)

Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array:     64 rows, maximum of 256000 bytes
Continuation:    none specified
Path used:      Conventional

Table EMP1, loaded from every logical record.
Insert option in effect for this table: TRUNCATE
TRAILING NULLCOLS option in effect

   Column Name                  Position   Len  Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
EMPNO                               FIRST     *  WHT      CHARACTER            
ENAME                                NEXT     *  WHT      CHARACTER            
JOB                                  NEXT     *  WHT      CHARACTER            
MGR                                  NEXT     *  WHT      CHARACTER            
HIREDATE                             NEXT     *  WHT      CHARACTER            
SAL                                  NEXT     *  WHT      CHARACTER            
COMM                                 NEXT     *  WHT      CHARACTER            
DEPTNO                               NEXT     *  WHT      CHARACTER            

Record 1: Rejected - Error on table EMP1, column COMM.
ORA-01722: invalid number

Record 4: Rejected - Error on table EMP1, column COMM.
ORA-01722: invalid number

Record 6: Rejected - Error on table EMP1, column COMM.
ORA-01722: invalid number

Record 7: Rejected - Error on table EMP1, column COMM.
ORA-01722: invalid number

Record 8: Rejected - Error on table EMP1, column COMM.
ORA-01722: invalid number

Record 9: Rejected - Error on table EMP1, column MGR.
ORA-01722: invalid number

Record 11: Rejected - Error on table EMP1, column COMM.
ORA-01722: invalid number

Record 12: Rejected - Error on table EMP1, column COMM.
ORA-01722: invalid number

Record 13: Rejected - Error on table EMP1, column COMM.
ORA-01722: invalid number

Record 14: Rejected - Error on table EMP1, column COMM.
ORA-01722: invalid number


Table EMP1:
  4 Rows successfully loaded.
  10 Rows not loaded due to data errors.
  0 Rows not loaded because all WHEN clauses were failed.
  0 Rows not loaded because all fields were null.


Space allocated for bind array:                 132096 bytes(64 rows)
Read   buffer bytes: 1048576

Total logical records skipped:          0
Total logical records read:            14
Total logical records rejected:        10
Total logical records discarded:        0

Run began on Wed Jan 28 19:52:42 2009
Run ended on Wed Jan 28 19:52:42 2009

Elapsed time was:     00:00:00.24
CPU time was:         00:00:00.02

scott@ORCL> select * from emp1;

     EMPNO ENAME      JOB              MGR HIREDATE         SAL       COMM     DEPTNO
---------- ---------- --------- ---------- --------- ---------- ---------- ----------
      7499 ALLEN      SALESMAN        7698 20-FEB-81       1600        300         30
      7521 WARD       SALESMAN        7698 22-FEB-81       1250        500         30
      7654 MARTIN     SALESMAN        7698 28-SEP-81       1250       1400         30
      7844 TURNER     SALESMAN        7698 08-SEP-81       1500          0         30

scott@ORCL> desc emp1;
 Name                                                                    Null?    Type
 ----------------------------------------------------------------------- -------- ------------------
 EMPNO                                                                            NUMBER(4)
 ENAME                                                                            VARCHAR2(10)
 JOB                                                                              VARCHAR2(9)
 MGR                                                                              NUMBER(4)
 HIREDATE                                                                         DATE
 SAL                                                                              NUMBER(7,2)
 COMM                                                                             NUMBER(7,2)
 DEPTNO                                                                           NUMBER(2)

Questions
=========
1) Why are the records rejected even though the column COMM can accept NULL values?
2) If you look at record no. 9 (produced in the bad file), it is rejected - but the column MGR can accept NULL values?

Thanks in Advance,
Rajesh.

Tom Kyte
January 28, 2009 - 4:00 pm UTC

nope, I don't have windows and oracle. I only have linux and oracle. So, you as a developer, please - show me what is really truly in YOUR files (you do know exactly, precisely what is in your files right???)

or at the very very very very VERY VERY least consider this

use a character different from tab, you know, so you and I can "talk". Why not substitute # for TAB?? Simple "debugging 101" type of stuff, something simple we can both "see" on the screen. When you make it visible - it *might* become obvious.

Tab-delimited data having NULL values

Rajeshwaran, Jeyabal, January 31, 2009 - 3:33 pm UTC

Tom,

Here's my data that needs to be loaded. For visibility I have added "\T" to denote a tab delimiter.
Can you show me how the control file needs to be modified to load all of the data without any bad records?

7839\T KING \T PRESIDENT\T \T 17-NOV-81\T 5000\T \T 10
7698\T BLAKE \T MANAGER \T 7839\T 01-MAY-81\T 2850\T \T 30
7782\T CLARK \T MANAGER \T 7839\T 09-JUN-81\T 2450\T \T 10
7566\T JONES \T MANAGER \T 7839\T 02-APR-81\T 2975\T \T 20
7654\T MARTIN \T SALESMAN \T 7698\T 28-SEP-81\T 1250\T 1400\T 30
7499\T ALLEN \T SALESMAN \T 7698\T 20-FEB-81\T 1600\T 300\T 30
7844\T TURNER \T SALESMAN \T 7698\T 08-SEP-81\T 1500\T 0\T 30
7900\T JAMES \T CLERK \T 7698\T 03-DEC-81\T 950\T \T 30
7521\T WARD \T SALESMAN \T 7698\T 22-FEB-81\T 1250\T 500\T 30
7902\T FORD \T ANALYST \T 7566\T 03-DEC-81\T 3000\T \T 20
7369\T SMITH \T CLERK \T 7902\T 17-DEC-80\T 800\T \T 20
7788\T SCOTT \T ANALYST \T 7566\T 09-DEC-82\T 3000\T \T 20
7876\T ADAMS \T CLERK \T 7788\T 12-JAN-83\T 1100\T \T 20
7934\T MILLER \T CLERK \T 7782\T 23-JAN-82\T 1300\T \T 10
Tom Kyte
February 02, 2009 - 10:08 am UTC

Use the one I gave you already above... I did, it worked *perfectly*

did you even try it????

ops$tkyte%ORA10GR2> !cat test.ctl | sed 's/     /\\t/g'
LOAD DATA
INFILE *
TRUNCATE
INTO TABLE EMP
FIELDS TERMINATED BY x'09'
( EMPNO  , ENAME , JOB   , MGR   , HIREDATE , SAL      , COMM     , DEPTNO   )
BEGINDATA
7839\tKING\tPRESIDENT\t\t17-NOV-81\t5000\t\t10
7698\tBLAKE\tMANAGER\t7839\t01-MAY-81\t2850\t\t30
7782\tCLARK\tMANAGER\t7839\t09-JUN-81\t2450\t\t10
7566\tJONES\tMANAGER\t7839\t02-APR-81\t2975\t\t20
7654\tMARTIN\tSALESMAN\t7698\t28-SEP-81\t1250\t1400\t30
7499\tALLEN\tSALESMAN\t7698\t20-FEB-81\t1600\t300\t30
7844\tTURNER\tSALESMAN\t7698\t08-SEP-81\t1500\t0\t30
7900\tJAMES\tCLERK\t7698\t03-DEC-81\t950\t\t30
7521\tWARD\tSALESMAN\t7698\t22-FEB-81\t1250\t500\t30
7902\tFORD\tANALYST\t7566\t03-DEC-81\t3000\t\t20
7369\tSMITH\tCLERK\t7902\t17-DEC-80\t800\t\t20
7788\tSCOTT\tANALYST\t7566\t09-DEC-82\t3000\t\t20
7876\tADAMS\tCLERK\t7788\t12-JAN-83\t1100\t\t20
7934\tMILLER\tCLERK\t7782\t23-JAN-82\t1300\t\t10

ops$tkyte%ORA10GR2> truncate table emp;

Table truncated.

ops$tkyte%ORA10GR2> select * from emp;

no rows selected

ops$tkyte%ORA10GR2> !sqlldr / test

SQL*Loader: Release 10.2.0.4.0 - Production on Mon Feb 2 10:03:12 2009

Copyright (c) 1982, 2007, Oracle.  All rights reserved.

Commit point reached - logical record count 14

ops$tkyte%ORA10GR2> select * from emp;

     EMPNO ENAME      JOB              MGR HIREDATE         SAL       COMM
---------- ---------- --------- ---------- --------- ---------- ----------
    DEPTNO
----------
      7839 KING       PRESIDENT            17-NOV-81       5000
        10

      7698 BLAKE      MANAGER         7839 01-MAY-81       2850
        30

      7782 CLARK      MANAGER         7839 09-JUN-81       2450
        10

      7566 JONES      MANAGER         7839 02-APR-81       2975
        20

      7654 MARTIN     SALESMAN        7698 28-SEP-81       1250       1400
        30

      7499 ALLEN      SALESMAN        7698 20-FEB-81       1600        300
        30

      7844 TURNER     SALESMAN        7698 08-SEP-81       1500          0
        30

      7900 JAMES      CLERK           7698 03-DEC-81        950
        30

      7521 WARD       SALESMAN        7698 22-FEB-81       1250        500
        30

      7902 FORD       ANALYST         7566 03-DEC-81       3000
        20

      7369 SMITH      CLERK           7902 17-DEC-80        800
        20

      7788 SCOTT      ANALYST         7566 09-DEC-82       3000
        20

      7876 ADAMS      CLERK           7788 12-JAN-83       1100
        20

      7934 MILLER     CLERK           7782 23-JAN-82       1300
        10


14 rows selected.

Load values which is not in the .dat file

hanar, February 03, 2009 - 6:29 am UTC

Hello,
I used sqlldr to load a .dat file into a table, and it was successful, but in the table there is an extra column which is not available in the .dat file. I want to add a value to that column as well (one value for all records).
How can I do that in my ctl file?

Appreciate your support....

Tom Kyte
February 03, 2009 - 10:24 am UTC

In reading the documentation.... (always illuminating - before I knew anything about Oracle - I read some of it and learned a bit...)

http://docs.oracle.com/docs/cd/B19306_01/server.102/b14215/ldr_field_list.htm#sthref1248

hello

hanar, February 08, 2009 - 8:38 am UTC

Sorry it doesn't help,
Tom Kyte
February 09, 2009 - 6:40 pm UTC

I am sorry too -

you asked "how to load a constant"

and someone posts

"here is a link to how to load a constant"


and it follows with simple text:


CONSTANT Parameter

To set a column to a constant value, use CONSTANT followed by a value:

CONSTANT value


CONSTANT data is interpreted by SQL*Loader as character input. It is converted, as necessary, to the database column type.

You may enclose the value within quotation marks, and you must do so if it contains whitespace or reserved words. Be sure to specify a legal value for the target column. If the value is bad, every record is rejected.

Numeric values larger than 2^32 - 1 (4,294,967,295) must be enclosed in quotation marks.


so, it sounds like

column_name CONSTANT 42

might work to load the constant number 42 into the column column_name - doesn't it?

open mouth, insert spoon:

ops$tkyte%ORA10GR2> desc emp
 Name                                     Null?    Type
 ---------------------------------------- -------- ----------------------------
 EMPNO                                             NUMBER(4)
 ENAME                                             VARCHAR2(10)
 JOB                                               VARCHAR2(9)
 MGR                                               NUMBER(4)
 HIREDATE                                          DATE
 SAL                                               NUMBER(7,2)
 COMM                                              NUMBER(7,2)
 DEPTNO                                            NUMBER(2)
 EXTRA_FIELD                                       NUMBER

ops$tkyte%ORA10GR2> !cat t.ctl
LOAD DATA
INFILE *
TRUNCATE
INTO TABLE EMP
FIELDS TERMINATED BY x'09'
( EMPNO  , ENAME , JOB   , MGR   , HIREDATE , SAL      , COMM     , DEPTNO  , <b>extra_field CONSTANT 42</b> )
BEGINDATA
7369    SMITH   CLERK   7902    17-DEC-80       800             20

ops$tkyte%ORA10GR2> !sqlldr / t

SQL*Loader: Release 10.2.0.1.0 - Production on Mon Feb 9 18:41:34 2009

Copyright (c) 1982, 2005, Oracle.  All rights reserved.

Commit point reached - logical record count 1

ops$tkyte%ORA10GR2> select * from emp;

     EMPNO ENAME      JOB              MGR HIREDATE         SAL       COMM
---------- ---------- --------- ---------- --------- ---------- ----------
    DEPTNO EXTRA_FIELD
---------- -----------
      7369 SMITH      CLERK           7902 17-DEC-80        800
        20         <b> 42
</b>

ops$tkyte%ORA10GR2>


and all I can say is...
wow

Grumpy & Old

Satan, February 10, 2009 - 2:19 am UTC

Tom, you are being grumpy and sarcastic. Oh boy, you are definitely getting OLD :).
Tom Kyte
February 10, 2009 - 7:06 am UTC

well, do you disagree?

I mean, what is going on here - a pointer to the documentation, about using a constant, and we can no longer take the initiative to figure it out? I mean, it was *right there*.


LOAD REPORT : STYLE INPUT DATA

Rajeshwaran, Jeyabal, April 28, 2009 - 11:15 am UTC

Hi Tom,

I need your suggestion in this scenario.
I have a data file like the one below:
===================================================================================================
REPORT: FERAS-RESP      TEST DATA SYSTEM                          
RUN DATE: 20081219           TEST DATA REPORT                              
                                                                                
SUBMITTER ID: SH9533                                                            
FILE ID: 0000002329       REJECTED  TEST                                        
                                                                                
RECORD  SEQ     ERROR                                                           
 TYPE   NO      CODE    ERROR DESCRIPTION                                       
  CCC  0000017  302 TEST DATA DISCRITPTIONS                        
 END OF REPORT                                                                  
                                                                                
REPORT: REPORT-RESP      TEST DATA SYSTEM                        
RUN DATE: 20081219           TEST DATA DISCRITPTIONS                             
                                                                                
SUBMITTER ID: SH9533                                                            
FILE ID: 0000002328       REJECTED  TEST                                        
                                                                                
RECORD  SEQ     ERROR                                                           
 TYPE   NO      CODE    ERROR DESCRIPTION                                       
  BBB  0020176  213 TEST DATA DISCRITPTIONS                
  CCC  0020177  302 TEST DATA DISCRITPTIONS                       
 END OF REPORT                                                                  
                                                                                
REPORT: REPORT-RESP      TEST DATA SYSTEM                        
RUN DATE: 20081219          TEST DATA DISCRITPTIONS                             
                                                                                
SUBMITTER ID: SH9533                                                            
FILE ID: 0000002323       REJECTED  TEST                                        
                                                                                
RECORD  SEQ     ERROR                                                           
 TYPE   NO      CODE    ERROR DESCRIPTION                                       
  CCC  0000096  302 TEST DATA DISCRITPTIONS                       
 END OF REPORT                                                                  
                                                                                
REPORT: REPORT-RESP      TEST DATA SYSTEM                        
RUN DATE: 20081219           REPORT RESPONSE REPORT                              
                                                                                
SUBMITTER ID: SH9533                                                            
FILE ID: 0000002320       REJECTED  TEST                                        
                                                                                
RECORD  SEQ     ERROR                                                           
 TYPE   NO      CODE    ERROR DESCRIPTION                                       
  CCC  0040557  302 TEST DATA DISCRITPTIONS                      
 END OF REPORT                                                                  
                                                                                
REPORT: REPORT-RESP      TEST DATA SYSTEM                        
RUN DATE: 20081219           REPORT RESPONSE REPORT                              
                                                                                
SUBMITTER ID: SH9533                                                            
FILE ID: 0000002317       REJECTED  TEST                                        
                                                                                
RECORD  SEQ     ERROR                                                           
 TYPE   NO      CODE    ERROR DESCRIPTION                                       
  CCC  0000279  302 TEST DATA DISCRITPTIONS                        
 END OF REPORT                                                                  
                                                                                
REPORT: REPORT-RESP      TEST DATA SYSTEM                        
RUN DATE: 20081219           REPORT RESPONSE REPORT                              
                                                                                
SUBMITTER ID: SH9533                                                            
FILE ID: 0000002331       REJECTED  TEST                                        
                                                                                
RECORD  SEQ     ERROR                                                           
 TYPE   NO      CODE    ERROR DESCRIPTION                                       
  BBB  0013235  213 TEST DATA DISCRITPTIONS                 
  CCC  0013236  302 TEST DATA DISCRITPTIONS                       
 END OF REPORT                                                                  
***** E N D    O F    DATA IN THE FILE *****                             
===================================================================================================
 And My control file is 

 LOAD DATA
INFILE *
TRUNCATE
INTO TABLE T 
WHEN Y <> '' TRAILING NULLCOLS
(
 x RECNUM,
 y POSITION (1:4000)
)

===================================================================================================
 CREATE TABLE IR_STG
(
   FILE_ID      VARCHAR2(40),
   FILE_STATUS  VARCHAR2(40),
   FILE_MODE    VARCHAR2(10),
   SEG_NAME     VARCHAR2(10),
   ERR_CODE     VARCHAR2(10),
   ERR_DESC     VARCHAR2(400)
);

CREATE OR REPLACE
PROCEDURE PRC_FERAS_INBOUND AS
REC_IR IR_STG%ROWTYPE;
BEGIN
    FOR REC IN (SELECT X, TRIM(Y) AS Y FROM T ORDER BY X)
    LOOP
         IF REC.Y LIKE 'FILE%' THEN

            REC_IR.FILE_ID     := SUBSTR(REC.Y, 10, 10);
            REC_IR.FILE_STATUS := SUBSTR(REC.Y, 27, 8);
            REC_IR.FILE_MODE   := SUBSTR(REC.Y, 37, 4);

         ELSIF SUBSTR(REC.Y,1,3) IN ('AAA','BBB','CCC','ZZZ','YYY') THEN

            REC_IR.SEG_NAME := SUBSTR(REC.Y, 1, 3);
            REC_IR.ERR_CODE := SUBSTR(REC.Y, 15, 3);
            REC_IR.ERR_DESC := SUBSTR(REC.Y, 19, 60);
            INSERT INTO IR_STG VALUES REC_IR;
            COMMIT;

        ELSIF REC.Y = 'END OF REPORT' THEN

            REC_IR.FILE_ID     := NULL;
            REC_IR.FILE_STATUS := NULL;
            REC_IR.FILE_MODE   := NULL;
            REC_IR.SEG_NAME    := NULL;
            REC_IR.ERR_CODE    := NULL;
            REC_IR.ERR_DESC    := NULL;

       END IF;

    END LOOP;

EXCEPTION
  WHEN OTHERS THEN
   ROLLBACK;
   RAISE_APPLICATION_ERROR (-20458, SQLERRM);
END PRC_FERAS_INBOUND;

===================================================================================================

SQL> Select * from IR_STG;

FILE_ID                                  FILE_STATUS                              FILE_MODE  SEG_NAME     ERR_CODE
---------------------------------------- ---------------------------------------- ---------- ------------ ---------- 
0000002329                               REJECTED                                 PROD       CCC          302        
0000002328                               REJECTED                                 PROD       BBB          213        
0000002328                               REJECTED                                 PROD       CCC          302        
0000002323                               REJECTED                                 PROD       CCC          302        
0000002320                               REJECTED                                 PROD       CCC          302        
0000002317                               REJECTED                                 PROD       CCC          302        
0000002331                               REJECTED                                 PROD       BBB          213        
0000002331                               REJECTED                                 PROD       CCC          302        

I learnt this approach from your book "Expert One-on-One Oracle".
In chapter 9, DATA LOADING, under the topic "LOAD REPORT: STYLE INPUT DATA", you provided this approach.

QUESTIONS
1) Is this the right approach to follow for the above data file?
2) Also, will the data in the data file be read sequentially or randomly by SQL*Loader?
Can I guarantee that the data between the "FILE ID" and "END OF REPORT" segments will be loaded into the table in the same order as it appears in the data file?

Please suggest whether this is the right approach to follow for these kinds of data files.

Thanks,
Rajesh.
Tom Kyte
April 28, 2009 - 11:29 am UTC

...
EXCEPTION
  WHEN OTHERS THEN
   ROLLBACK;
   RAISE_APPLICATION_ERROR (-20458,SQLERRM);
END PRC_FERAS_INBOUND;

....


you didn't learn that from me (if you read the chapter on transactions, I tell you not to do that :) )


sqlldr will load the records in order (do not use parallel)...

Is this a right approach?

Rajeshwaran, Jeyabal, April 28, 2009 - 12:10 pm UTC

Thanks for correcting me Tom, I will change that piece of code.

Please answer my question 1.

But is the approach I followed for loading my data into the database using this control file right, Tom? Shall I go with this approach to move my data into the production DB?
Please correct me if I am wrong with my control file.


Tom Kyte
April 28, 2009 - 12:18 pm UTC

for #1, if your code

a) matches your specifications
b) has been tested
c) works
d) and you like it and are happy with it

then you are doing it ok I guess, I don't review all code that comes across these pages.

You need to procedurally process this data; your control file loads the text into a table so you can read it out line by line.


You could just use utl_file directly and read the input file line by line as well if the file exists on the server.
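
A minimal sketch of that utl_file approach, assuming a directory object named DATA_DIR pointing at the server directory that holds the file (the directory and file names here are illustrative, not from the question):

declare
    l_file  utl_file.file_type;
    l_line  varchar2(4000);
begin
    l_file := utl_file.fopen( 'DATA_DIR', 'report.txt', 'r', 4000 );
    loop
        begin
            utl_file.get_line( l_file, l_line );
        exception
            when no_data_found then exit;   -- get_line raises this at end of file
        end;
        -- the same parsing/insert logic as the staging procedure would go here,
        -- e.g. checking for 'FILE%' lines and the AAA/BBB/CCC segment prefixes
        dbms_output.put_line( l_line );
    end loop;
    utl_file.fclose( l_file );
end;
/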

Hats off to TOM !!!

Rajeshwaran, Jeyabal, April 28, 2009 - 12:37 pm UTC

Tom,

Can't "count" how many times I have been benefited by the discussions and
explanations here. Really learned a lot.

This count query never returns - it's still counting - no matter how much
we tune it... Thanks for your Great Support.

DATA Files Without any File Extension.

Rajeshwaran, Jeyabal, April 29, 2009 - 11:50 pm UTC

Tom,

  I have a data file (DATA_20070803_0637) without any file extension that needs to be processed by SQL*Loader.
  I am not able to process the data file; it says the file does not exist ( Unable to open file (d:\feras\sqlloader\DATA_20070803_0637.dat) ).
  I am working with Oracle 10gR1 on Windows XP.

   sqlldr userid=scott/tiger@ORCL control=D:\FERAS\SQLLOADER\ctl.txt log=D:\FERAS\SQLLOADER\log.txt bad=D:\FERAS\SQLLOADER\bad.txt discard=D:\FERAS\SQLLOADER\disc.txt data=d:\feras\sqlloader\DATA_20070803_0637

SQL*Loader: Release 10.1.0.2.0 - Production on Thu Apr 30 08:50:04 2009

Copyright (c) 1982, 2004, Oracle.  All rights reserved.

SQL*Loader-500: Unable to open file (d:\feras\sqlloader\DATA_20070803_0637.dat)
SQL*Loader-553: file not found
SQL*Loader-509: System error: The system cannot find the file specified.
SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.

Questions
1) Why does the SQL*Loader program say UNABLE TO OPEN FILE (D:\FERAS\SQLLOADER\DATA_20070803_0637.DAT) when I supplied the data file as DATA_20070803_0637?
2) Why is a .DAT extension added to the data file (only for data files without any extension) during execution of SQL*Loader?
3) Is it possible to process data files without any file extension in SQL*Loader? If yes, can you show me how?

Thanks,
Rajesh.

Tom Kyte
April 30, 2009 - 9:19 am UTC

http://docs.oracle.com/docs/cd/B19306_01/server.102/b14215/ldr_params.htm#sthref557

If you do not specify a file extension or file type, the default is .dat.


but in Windows a rather simple workaround exists - just use "." at the end


C:\oracle\product\10.1.0\db_1\BIN>sqlldr scott/tiger emp.ctl data=data

SQL*Loader: Release 10.1.0.4.0 - Production on Thu Apr 30 09:00:55 2009

Copyright (c) 1982, 2004, Oracle.  All rights reserved.

SQL*Loader-500: Unable to open file (data.dat)
SQL*Loader-553: file not found
SQL*Loader-509: System error: The system cannot find the file specified.
SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.

C:\oracle\product\10.1.0\db_1\BIN>sqlldr scott/tiger emp.ctl data=data.

SQL*Loader: Release 10.1.0.4.0 - Production on Thu Apr 30 09:00:58 2009

Copyright (c) 1982, 2004, Oracle.  All rights reserved.

Commit point reached - logical record count 14




In Unix you would use a symbolic link:
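
For example, with the file and control file names from the question (paths omitted; the link simply gives the real file the .dat name sqlldr is going to look for):

$ ln -s DATA_20070803_0637 DATA_20070803_0637.dat
$ sqlldr userid=scott/tiger control=ctl.txt data=DATA_20070803_0637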


Parallel Loading

Rajeshwaran, Jeyabal, June 15, 2009 - 12:53 am UTC

Tom,

I have some questions regarding your post above:
"sqlldr will load the records in order (do not use parallel)..."

1) Does SQL*Loader load data in order? Is that true for both conventional path and direct path loading?
2) Why do you say "do not use parallel"? Is there any specific reason for this? If so, can you please provide an example?

Thanks,
Rajesh
Tom Kyte
June 15, 2009 - 12:17 pm UTC

1) because the files it reads are sequential files, it'll read N records and array insert N records, read N more records, insert them and so on.

2) if you use parallel, then the file is chopped into M smaller chunks and loaded in parallel, it will not read from the start to the finish in order anymore.

Parallel Loading

Rajeshwaran, Jeyabal, June 16, 2009 - 11:27 am UTC

Tom,

So using parallel operations in SQL*Loader will result in faster loading of data. Am I right?
Tom Kyte
June 16, 2009 - 2:39 pm UTC

you might be

a) right
b) wrong
c) neither right nor wrong

think about it.... conceptualize it..... understand it.....


parallel loading

reader, February 01, 2010 - 5:32 pm UTC

Hi Tom,

I have a problem loading two mutually exclusive data files in parallel. I need to generate a non-overlapping sequence number for each record loaded. If I use RECNUM in my SQL*Loader control file then I will get duplicates as each file loads in parallel.

so I call:

sqlldr user/pwd control=test.ctl data=data1.lis direct = true parallel=true

sqlldr user/pwd control=test.ctl data=data2.lis direct=true parallel=true

And in the control file the primary key column value is generated using RECNUM for both data files. Please can you tell me how I can generate a unique sequence number for loading two files in parallel.

Tom Kyte
February 02, 2010 - 12:08 pm UTC

well, I would use an external table and a parallel direct path insert - that would be easiest, just use

sqlldr u/p control=test.ctl external_table=generate_only

on it to get the template, then add the two files to the create table and go for it. Use a sequence to generate unique numbers, in parallel, with direct path.
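
A sketch of what that could look like; the column names, directory object, target table and degree of parallelism below are made up, since the real definition would come from the EXTERNAL_TABLE=GENERATE_ONLY run described above:

create table ext_data
( c1  varchar2(30),
  c2  varchar2(30)
)
organization external
( type oracle_loader
  default directory data_dir
  access parameters
  ( records delimited by newline
    fields terminated by ','
    ( c1, c2 )
  )
  location( 'data1.lis', 'data2.lis' )   -- both files in the ONE external table
)
reject limit unlimited
parallel 4;

create sequence load_seq cache 1000;

alter session enable parallel dml;

insert /*+ append parallel(t,4) */ into target_table t ( id, c1, c2 )
select load_seq.nextval, c1, c2
  from ext_data;

commit;

The sequence values will be unique but not gap-free or in file order, which is all the original requirement (non-overlapping keys across the two files) needs.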

A reader, February 02, 2010 - 1:13 pm UTC

can you give an example please :)
Tom Kyte
February 02, 2010 - 2:24 pm UTC

an example of what?

if you mean external tables or external_table=generate_only - there is a really neat search box on the home page...

parallel direct path insert:

http://www.oracle.com/pls/db102/search?remark=quick_search&word=parallel+direct+path+insert&tab_id=&format=ranked

Parallel Load

reader, February 03, 2010 - 2:59 pm UTC

Hi Tom,

That last question wasn't by me, by the way.... OK, if I use a database sequence and I am using append and parallel, won't that disable the direct path load, because we are using SQL?

I was under the impression that doing any SQL in a direct path load (append hint, etc.) would automatically disable direct path and then I'd be back to conventional?

Thanks for all you help on this website!!
Tom Kyte
February 04, 2010 - 11:49 am UTC

I said to use an external table and parallel direct path insert - via insert /*+ append */

eg: sqlldr = NO, not using it, SQL = YES, using it.

use sqlldr to generate the external table definition - create the external table mapped to BOTH files and do parallel direct path insert - using SEQUENCES.

Reader from non-UK, February 04, 2010 - 12:22 pm UTC

Now it is more clear

OK

Kumar, February 15, 2010 - 3:57 am UTC

Hi Tom,
Does the data dictionary store information about tables being loaded through SQL*Loader? Actually, I would like to get the control file name from the database itself for a particular table which gets loaded through SQL*Loader.
Are any dictionary views available for this?

Reader, March 09, 2010 - 2:23 pm UTC

tom,

Can you tell me if there are any disadvantages of using direct path loading?

In direct path loading, data cannot be accessed until it is loaded. Other than that, is there any disadvantage?

Tom Kyte
March 09, 2010 - 4:16 pm UTC

off the top of my head...

triggers are not fired.

indexes could be left in direct load state.

it doesn't use the buffer cache, so it could be slower than non-direct path.
it doesn't use the buffer cache, so it could be faster than non-direct path.

it locks the segment you are loading into (usually 'the table', but if you direct path into a single partition - just that partition)

it might not generate redo (if noarchivelog mode, no redo, if archive log mode AND table is nologging AND dba permits non-logged operations then no redo) - which can be good and bad.

Loading multiple flat file to the single target table

Atanu Chatterjee, November 07, 2010 - 12:20 pm UTC

I do not know whether it is possible through SQL*Loader, but the requirement is that I have to load multiple flat files into a single target table, like below:

Column1, Column2 from the first flat file; Column3, Column4 from the second flat file.

Note - a. The number of rows is the same in both flat files.
b. The number of fields and rows in both flat files is quite high, so it is tedious to merge these two flat files manually. Please give some idea.
Tom Kyte
November 08, 2010 - 8:14 am UTC

you have two files.

create two EXTERNAL TABLES (I would prefer this myself over sqlldr any day) or two control files (since they have different layouts)

load them as separate and distinct entities - since they are. Just load twice - once for file one and once for file two.

Oracle Error

A reader, March 22, 2011 - 3:26 pm UTC

Hi Tom,

I am getting an error while trying to direct-path load a very big file (maybe 800 MB) into several tables (~10-12 tables) through WHEN clauses.

ORA-39776: fatal Direct Path API error loading table

and an internal error code:
ORA-00600: internal error code, arguments: [kohdtf048], [], [], [], [], [], [],
[]

When I take out direct=true and load the data, it works.


Normally people suggest an ORA-600 means Oracle Support, but I think it happens only due to the direct load option.

I have the direct load option for many other loads and they work well, but they load data into only 1-2 tables.

So does that mean that I should go for a conventional load only? Or do I have to commit after 10000 rows using the ROWS option?

Could you please suggest.
Tom Kyte
March 23, 2011 - 8:15 am UTC

ora-600 ALWAYS means Oracle support.

if you want to use direct path - contact support please.

Of course you could also use a multi-table insert and an external table instead of the archaic sqlldr tool.
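
A sketch of that multi-table insert idea; it assumes an external table (here called big_file_ext) has already been generated over the 800 MB file, and record_type and the target tables are illustrative names only:

insert first
  when record_type = 'A' then
    into table_a ( c1, c2 ) values ( c1, c2 )
  when record_type = 'B' then
    into table_b ( c1, c3 ) values ( c1, c3 )
  -- ... one WHEN branch per target table, mirroring the WHEN clauses in the
  --     sqlldr control file; use INSERT ALL instead of INSERT FIRST if a row
  --     may need to go into more than one table ...
select record_type, c1, c2, c3
  from big_file_ext;

commit;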

formatting query

Anandmohan, April 25, 2011 - 12:30 pm UTC

If we give: select * from emp where deptno=10;

then the output should be:
select *
from emp
where deptno=10;
Please provide some PL/SQL or SQL code.
Tom Kyte
April 25, 2011 - 1:30 pm UTC

please download sqldeveloper, it has a pretty printer in it.

formatting query

A reader, April 25, 2011 - 9:57 pm UTC

I know that there is SQL Developer for formatting queries, but I want to try it manually by writing PL/SQL or SQL code. Can anyone help me?
Tom Kyte
April 26, 2011 - 7:17 am UTC

go for it, why do you need our help if you want to write it yourself?????

You are a programmer right?


You asked me above for the code - you didn't want to code it yourself yesterday. I told you where to get the code (sql developer).

If you now want to write it yourself, go for it.

formatting

A reader, April 26, 2011 - 11:14 am UTC

Don't get serious... are you telling me that I have to look at the source code of SQL Developer?
Tom Kyte
April 26, 2011 - 2:44 pm UTC

is your keyboard broken?

I'm telling you that you either

a) want to use sql developer because you want to pretty print some SQL (you want to use a tool)

b) write a pretty printer because you have an application that wants a pretty printer as part of it (I don't sit here and write code for people on demand all day - that is not my purpose. You however probably do write code for your employer all day, and if they've asked you to develop a pretty printer... think about it)


How about you get serious. Why would I "please provide some pl/sql or sql code" for a pretty printer? Such code exists and you are free to use it (sqldeveloper). If you want to develop such code 1) don't expect I have it for you, 2) don't expect that I will write it for you.

function in sqlloader

A reader, April 27, 2011 - 6:16 am UTC

Consider the following tables:
a
b

I need to load data into a using SQL*Loader. One particular column in table a needs to be derived by looking up a value from b.
I plan to use SQL*Loader in such a way that I will have a function which returns the value for this column:

....
....

FIELDS TERMINATED BY '|'
OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
...
...
..
calculate_me "fn_getDept()",
.....
)


The function returns a value (based on a select like: select deptno from emp where emp_id = i_emp).

I guess this will cause a table lookup for every row, and the SQL*Loader load may slow down for huge files.
Is there any option in SQL which will invoke fn_getDept only once, since it is constant for all the rows
of a single load?
Tom Kyte
April 27, 2011 - 7:54 am UTC

function result cache in 11g would accomplish most of that. The function would still be called for each row - but it would only really execute once (as long as the table it queried did not change)

But, I would just use an external table:

insert into my_real_table
select ...
  from external_table, lookup_table
 where join_condition
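
For the 11g result-cache idea, a sketch only; fn_getDept and the emp/emp_id/deptno names are taken from the question, and the fixed g_emp_id key is a made-up stand-in for however the "constant for all rows" value is really obtained:

create or replace function fn_getDept return number
    result_cache
is
    g_emp_id  constant number := 7788;   -- hypothetical fixed key
    l_deptno  number;
begin
    select deptno
      into l_deptno
      from emp
     where emp_id = g_emp_id;
    return l_deptno;
end;
/

The function is still called for every row, as noted above, but after the first call the result comes from the cache until the emp table changes.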

difference between Oracle Apps reports and PL/SQL reports

A reader, April 27, 2011 - 11:09 am UTC

What is the difference between Oracle Apps reports and PL/SQL reports? Why do we go for reports in Apps?
Tom Kyte
April 27, 2011 - 1:27 pm UTC

That is like asking "what is the difference between a finished application and a programming language"

You can use plsql to code reports that you would sell as an application.

You can buy an application that uses various languages to produce reports for you, the reports are part of the application, you don't have to build them.

need some help on this

venkat, October 05, 2011 - 8:36 am UTC

i can't provide you any create table script here, but i need to know some info from you sir.

I have a requirement:
the user will upload a file to the server, and as soon as the file is uploaded it starts a Unix job, which in turn calls SQL*Loader to load data from the (.csv) file into a table.

the file has only 3 columns

name
age
location

table has 5 columns

name
age
location
uploaded_by
uploaded_date
.
Now, when I execute the SQL*Loader script, I populate the uploaded_by field in the table with the user who uploaded the data,
like this:
uploaded_by "nvl(:uploaded_by,USER)"
and uploaded_date with sysdate,
like this:
uploaded_date "nvl(:uploaded_date,sysdate)"

but here the problem is that uploaded_by is getting copied from the ENVIRONMENT variable we defined in .profile file.

what i need is

can i populate the column uploaded_by with a shell variable

like this

uploaded_by "nvl(:uploaded_by,$upl_user)"
uploaded_date "nvl(:uploaded_date,sysdate)"

please answer this. it might sound stupid to you.
Tom Kyte
October 05, 2011 - 11:01 am UTC

... it might sound stupid to you. ..

No, just very confusing. why can't you provide a create table and data? I KNOW you can - you just won't. That is strike one.

... but here the problem is that uploaded_by is getting copied from the ENVIRONMENT
variable we defined in .profile file. ...

how is that happening? we don't do anything with the environment...


what i need is

can i populate the column uploaded_by with a shell variable


sure, if you know how to write shell scripts and you have your shell script write a control file and that control file has the environment variable value in it (this by the way is what I believe to be the answer to your question, but I'm not entirely sure).
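
A sketch of that shell-writes-the-control-file idea, using the column names from the question; $upl_user comes from the question, while the table name, file names and login are made up:

#!/bin/sh
# the shell expands $upl_user while writing the control file,
# before sqlldr ever runs
cat > load.ctl <<EOF
LOAD DATA
INFILE 'upload.csv'
APPEND
INTO TABLE upload_tab
FIELDS TERMINATED BY ','
TRAILING NULLCOLS
( name,
  age,
  location,
  uploaded_by   "nvl(:uploaded_by, '$upl_user')",
  uploaded_date "nvl(:uploaded_date, sysdate)"
)
EOF

sqlldr userid=scott/tiger control=load.ctl log=load.log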

you are at it again

venkata, October 05, 2011 - 2:22 pm UTC

wow...bingo...pretty good reply for a pretty good question..
Tom Kyte
October 06, 2011 - 9:11 am UTC

or a pretty good reply for a very vague, ill-posed question - sure.


go ahead and read the question again, it starts with the famous:

i can't provide you any create table script here, but i need to know some info from you sir.


which can be stated in other words as:

I'm not going to go to the trouble of actually spending my time setting up a good example like you do, but I want you to solve my problem anyway.


Then they wrote some good stuff - but not meaningful stuff yet - and then wrote:

but here the problem is that uploaded_by is getting copied from the ENVIRONMENT variable we defined
in .profile file.


huh? How does uploaded_by get copied from the ENVIRONMENT variable? How did that happen? sqlldr doesn't get anything from the environment for loading really. So, there is a bit of logic going on somewhere here that just isn't being explained at all. I don't even really know what that means "getting copied from the environment"



But lo and behold, I do think I answered the question:


sure, if you know how to write shell scripts and you have your shell script write a control file and that control file has the environment variable value in it (this by the way is what I believe to be the answer to your question, but I'm not entirely sure).




to give a real answer, I actually like to show the answer - but given they were not willing to take the 30 seconds or so to put together a 'shell of a test case', well, so be it.

if you were in front of me

venkata, October 06, 2011 - 11:23 am UTC

very very well said.

sql loader issue

A reader, November 13, 2011 - 10:36 pm UTC

Hi Tom,
I have two columns in a table to which I have to append the timestamp value as a character string. The problem is that the same current timestamp is populated for n rows, but if I use ROWS=1 then every row is unique.

create table t1(col1 varchar2(100), col2 varchar2(100), col3 varchar2(100));

My expected result is:
col1 col2 col3
value1 value1_col2_01012011153630.555GMT value1_col3_01012011153630.555GMT

value2 value2_col2_01012011153630.900GMT value2_col3_01012011153630.900GMT

The data file has values for col1 and col2 only:
value1,col2
value2,col3

Can you please help?

Tom Kyte
November 15, 2011 - 8:22 am UTC

you can either use rows=1 (but please do expect duplicates, it is inevitable, anyone that glues a timestamp onto anything HAS TO expect duplicates - computers are pretty darn fast and there is this concurrency thing)

or you can really slow it down by using a trigger to assign the values.
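
A sketch of the trigger alternative; t1 and its columns come from the question, the timestamp format is a guess at the layout shown there, and (as noted earlier on this page) triggers do not fire for direct path loads, so this assumes a conventional path load:

create or replace trigger t1_stamp
before insert on t1
for each row
begin
    -- rows inserted in the same millisecond will still collide, as warned above
    :new.col2 := :new.col1 || '_col2_' ||
                 to_char( systimestamp at time zone 'GMT',
                          'MMDDYYYYHH24MISS.FF3' ) || 'GMT';
    :new.col3 := :new.col1 || '_col3_' ||
                 to_char( systimestamp at time zone 'GMT',
                          'MMDDYYYYHH24MISS.FF3' ) || 'GMT';
end;
/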

Concatenation

A reader, November 15, 2011 - 10:56 pm UTC

Hi Tom,

Could you please tell me whether we can use a concatenated field in another column? Let's say:

Load data into emp fields terminated by ","
(
first_name "ucase(:first_name)",
last_name "ucase(:last_name)",
full_name "first_name" || " " || ":last_name",
address ":full_name" || " Street"
)

data is like :
Andrew,Walton
Andy,Mich

When I run it, I don't get the value in the ADDRESS field properly. Can we use a derived/computed/concatenated column as a reference for another column?

Any suggestion.

Tom Kyte
November 16, 2011 - 9:43 am UTC

No, the bind values are set during the initial execution of the insert sqlldr does. The values DO NOT, CANNOT change during the execution of the insert. Things do not happen left to right or right to left in SQL - it is not a procedural language.

The binds are set (:first_name, :last_name, :full_name, :address), and then sqlldr executes the insert statement:

insert into emp (first_name, last_name, full_name, address )
values ( ucase(:first_name), ucase(:last_name),
:first_name || ' ' || :last_name, :full_name || ' Street' );




when you see it like that - maybe it makes more sense. You see that :full_name IS NOT the column "full_name", it is just a bind variable that is set to whatever sqlldr happened to have parsed out of the string.
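
A sketch of a control file that avoids the problem by deriving every column from the original binds only (never from :full_name or :address); it is untested, uses the table and column names from the question, adds TRAILING NULLCOLS because the last two fields are not present in the data, and swaps the non-existent ucase() for Oracle's upper():

LOAD DATA
INFILE *
APPEND
INTO TABLE emp
FIELDS TERMINATED BY ','
TRAILING NULLCOLS
( first_name "upper(:first_name)",
  last_name  "upper(:last_name)",
  full_name  ":first_name || ' ' || :last_name",
  address    ":first_name || ' ' || :last_name || ' Street'"
)
BEGINDATA
Andrew,Walton
Andy,Mich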



Reader, December 06, 2011 - 3:29 pm UTC

Tom,

I am using SQL*Loader to load data in to a table. Here are the steps in a shell script -

1) Download CSV file from a website
2) Use SQL*loader to load Staging table using the file from Step 1. I use TRUNCATE in SQL*LOADER control file.
3) Run pl/sql procedure to move data from Staging to actual table.

If step one fails to download the file, SQL*Loader says file not found, and the error is:
SQL*Loader-500: Unable to open file test.csv
SQL*Loader-553: file not found
SQL*Loader-509: System error: No such file or directory

Does this truncate data from the table and then give the file not found error?


Tom Kyte
December 07, 2011 - 1:06 pm UTC

I strongly recommend you NOT use sqlldr - use an external table. You can skip the overhead of loading a staging table altogether; PL/SQL will read the records straight from the file.

search this site for

'external table'

for tons of examples (it is documented as well). You can switch over to an external table easily by running

sqlldr user/password controlfile.ctl EXTERNAL_TABLE=GENERATE_ONLY


the resulting log file will have your create table statement all done for you!


It truncated before looking for the file.

Where is the Link?

Duane, February 22, 2017 - 8:19 pm UTC

This posting contains the same link 6 times. The link does not work. I am unable to find it using web searches, ask the expert searches, or fiddling with the URL. " http://asktom.oracle.com/~tkyte/SkipCols/index.html"
Connor McDonald
February 23, 2017 - 2:53 am UTC

You don't really need it - you needed it for Oracle 8.0 (aka the year 1997), when we didn't have the FILLER and BOUNDFILLER keywords in sqlldr.

I assume you're on release 8i or later? If so, just use FILLER or BOUNDFILLER.
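
For reference, a minimal sketch of that FILLER approach with made-up table and field names - the unwanted delimited field is read into a FILLER field and simply never loaded:

LOAD DATA
INFILE *
REPLACE
INTO TABLE t
FIELDS TERMINATED BY ';'
( col1,
  col2,
  skip_me FILLER,     -- read from the record, never loaded
  col4
)
BEGINDATA
1;2;3;4

BOUNDFILLER (9i and later) works the same way, except that the skipped value is also bound, so later SQL string expressions in the field list can still reference it.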
