What slows sqlldr down?
robert, January 23, 2003 - 8:30 pm UTC
We sent many fixed length data files to clients.
A new business requirement is that we audit exactly what is sent (the files are < 500 records).
Still on 8i, so I wrote a very generic pl/sql module that basically reads column info (all char), builds the control file at run time, sends the sqlldr command to the cron queue, and loads tables that are structured to match the data file layout.
I know you generally advise against this, but it's fixed-length string data and this is the best way I could think of to ensure encapsulation, easy use and a correct load.
All user has to do is create a table based on the file layout and call the module.
Basically I call dbms_sleep to slow the pl/sql down a little and check for log file existence to determine the end of the load.
It had been working just great until the other day, when this load of 268 records took 14 minutes (!)... as you can imagine, the log file showed up, my pl/sql code continued on, and a SELECT ran against the partially loaded table....
A couple of hours later I re-ran it and the load took 0.2 secs.
So my question is: what could cause sqlldr to slow down?
Can it be a bottleneck in the OS, in Oracle, or both?
(It can't be a table lock because I'm still in beta with this module and no one is reading those tables, let alone updating/locking them.)
Do you know if the log file shows up as soon as sqlldr is started?
The boss wants total auditability of the data we send, so I have now added code that puts the module into a loop scanning the log file for the line "%CPU%TIME%" to ensure the load has ended.
Your comment is, as always, most appreciated.
January 23, 2003 - 8:41 pm UTC
Sounds like
o archives filled up or
o checkpoint not complete (you can look at the alert log to check those, look
for cannot allocate new log)
o there is a unique index on the table and something else was playing with
the table -- blocking you (they inserted a row you wanted to)
o the table was locked (is it a child table in a foreign key relationship without
indexes on the fkey?)
those are the culprits that pop into my head first...
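The first two culprits can be checked from the shell by grepping the alert log for the telltale messages. A sketch only -- the alert log path below is a made-up stand-in (every instance puts it somewhere different), and we fake the file so the example is self-contained:

```shell
# ALERT would normally be your instance's alert log
# (e.g. somewhere under background_dump_dest); faked here.
ALERT=${ALERT:-/tmp/alert_demo.log}
printf 'Thread 1 advanced to log sequence 42\nCheckpoint not complete\n' > "$ALERT"

# Case-insensitive search for the two messages mentioned above.
hits=$(grep -iE "checkpoint not complete|cannot allocate new log" "$ALERT")
echo "$hits"
```

If either message shows up around the time of the slow load, the redo logs are too small or too few for the load rate.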
sql loader
atul, January 23, 2003 - 9:23 pm UTC
Sir,
I have got a test.dat file with no further info...
i.e. I don't have the table info etc...
So with only the .dat file, can I use SQL*Loader to load the data?
Thanks.
atul
January 24, 2003 - 7:09 am UTC
umm, no. if you don't have a table structure -- what would you be loading it into?
you need a ctl file to tell sqlldr how to load it into what table.
Diff flat files
Anirudh, March 21, 2003 - 7:21 am UTC
Hi Tom,
Please suggest what to do in the following scenario:
I have a flat file, let's say containing the data for the emp table. A description file (.doc) comes along with that flat file, describing the positions and lengths of the fields. It's fine that I can use SQL*Loader to load that data into the emp table, but it may be the case that some of the fields are swapped/reshuffled -- so in the first flat file I may get empno, ename, mgr, sal... and in the second it may be empno, sal, ename... (of course this will be stated in the description file each time the flat file comes). In this case, do I have any alternative other than reading the description file, dynamically building the ctl file for the loader, and then doing the load?
I hope I am clear.
March 21, 2003 - 8:58 am UTC
you would be creating a control file to load the data -- each time.
seems like "data standards" would be really useful to you
.ctl file
Tom, March 21, 2003 - 9:50 am UTC
(Original post) There is a description file(.doc) which comes along with that flat file describing the
positions and lengths of the fields. ...
(Followup:)You would be creating a control file to load the data -- each time.
If the .doc file is in the SAME format each time, you can parse it with sophisticated filters (awk/sed, or perl) to produce a control file. I have seen this done, it is not pretty, but sometimes you cannot control your data sources. I second the part about "data standards", if possible. Otherwise, you will be manually creating the .ctl files each time you get a .doc file.
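To make the filter idea concrete: assuming the description file could be exported to plain text with one "NAME START LENGTH" triple per line (that layout is an assumption, not something from the post), a few lines of awk can emit the POSITION clauses:

```shell
# layout.txt is a hypothetical plain-text export of the .doc file:
cat > /tmp/layout.txt <<'EOF'
EMPNO 1 4
ENAME 5 10
EOF

# Turn each "name start length" line into a POSITION(start:end) clause.
awk 'BEGIN { print "LOAD DATA"; print "INTO TABLE emp"; print "(" }
     { end = $2 + $3 - 1
       printf "%s%s POSITION(%d:%d) CHAR", sep, $1, $2, end; sep = ",\n" }
     END { print "\n)" }' /tmp/layout.txt > /tmp/emp.ctl

cat /tmp/emp.ctl
```

The generated /tmp/emp.ctl is then fed straight to sqlldr with control=/tmp/emp.ctl.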
Carriage Return or Linefeed in data
Abdul Samed, May 02, 2003 - 5:03 am UTC
Hi Tom,
We have a datafile in which one of the fields, say Address1, sometimes contains a carriage return (sometimes it has 2 lines of data), so while loading through SQL*Loader it treats a single record as 2 records:
the first record is whatever comes before the carriage return, and
the second record is whatever comes after it.
Please let me know how to solve this issue.
Many thanks.
Regards, Aby
May 02, 2003 - 8:16 am UTC
well, any reasonable piece of software would consider 2 records to be 2 records (a newline is the convention for saying "new record coming" after all)
I don't know what issue I'm to solve here. No example input datafile. No example table. No example result set.
Tell me -- if you gave the file to someone who didn't read English (or whatever language the input data is in) and had never seen a number before in their life (so it would look like a random string of characters), would you be able to describe to them the algorithm by which you determine conclusively "this is the end of the record"? If you can -- add that here as well. If you cannot, I'll probably not be able to either.
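One note worth adding: if whoever produces the file can be persuaded to end each logical record with an unambiguous terminator string, the "str" option on INFILE tells SQL*Loader to use that string instead of the newline, so embedded newlines inside Address1 survive. A sketch only -- the file, table and column names here are made up:

```
LOAD DATA
INFILE 'addresses.dat' "str '|\n'"
INTO TABLE addresses
FIELDS TERMINATED BY ','
TRAILING NULLCOLS
(id, name, address1)
```

Each record in addresses.dat would then end with a pipe followed by a newline; a bare newline inside a field no longer ends the record.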
LRM-00112 Error
ramks, May 05, 2003 - 11:14 am UTC
Hi Tom
I have the following Sql Loader Call in my shell script
which is running in sun os 5.8
sqlldr userid=${FCP_LOGIN} \
data="${lc_inboundpath}/${lc_filename} \
control="${lc_controlfile}" \
log="${lc_cust_top}/log/${lc_filename}".log
bad="${lc_cust_top}/log/${lc_filename}".bad
All the variables are set before this call, but when I run it, it gives the error "LRM-00112 Multiple Values Not allowed for Parameter bad".
Can you help in this regard?
thanks in adv
May 05, 2003 - 11:34 am UTC
echo that command out
echo sqlldr userid=.......
and see what is it resolving to. maybe a space in a file name or something like that.
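For what it's worth, the command as posted has an unterminated quote on the data= line and no continuation backslash at the end of the log= line, either of which would mangle the argument list. A sketch of a quoting pattern that echoes cleanly (variable names from the post, values made up):

```shell
# Stand-in values for the variables in the post.
lc_inboundpath=/tmp/in; lc_filename=cust01; lc_controlfile=/tmp/c.ctl
lc_cust_top=/tmp/top;   FCP_LOGIN=scott/tiger

# Every line ends in a continuation backslash and every quote is closed.
cmd="sqlldr userid=${FCP_LOGIN} \
data=${lc_inboundpath}/${lc_filename} \
control=${lc_controlfile} \
log=${lc_cust_top}/log/${lc_filename}.log \
bad=${lc_cust_top}/log/${lc_filename}.bad"

# Echo it first, as suggested, before ever running it.
echo "$cmd"
```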
I have a simple (for you) doubt
Raju, February 20, 2004 - 7:16 pm UTC
Tom,
Thanks for the wonderful service. Second time poster but regular reader( my homepage). I have a simple SQL*loader issue. I have a .dat file which looks like
----
Tom
C Horizons
9732994000
Sam
TXS
7322221345
Veronica
T Systems
123 456 7890
----
---
and so on
-------------------------
Basically the format is of the type name, company, phone. There is no "record" delimiter. I want to load this into a table with three (3) fields: name, company, phone. Can we do it using SQL*Loader -- maybe with some workaround?
Thanks for all the help.
February 20, 2004 - 7:37 pm UTC
I don't think so -- we have "concatenate 3", but that'll just create records like:
TomC Horizons9732994000
SamTXS7322221345
VeronicaT Systems123 456 7890
no separators. we'd have to have something at the end -- any chance you are on unix? it'd be trivial to fix with pipes.
Yes I am on Unix
Raju, February 21, 2004 - 9:28 am UTC
Tom,
The datafile is on Unix. Can I just add a delimiter (| or a comma) at the end of each line and then use the CONCATENATE command? Can you please guide me?
Thanks
February 21, 2004 - 11:22 am UTC
$ mknod pipe.dat p
$ cat t.dat | sed 's/$/,/' this'll add a comma for us
Tom,
C Horizons,
9732994000,
Sam,
TXS,
7322221345,
Veronica,
T Systems,
123 456 7890,
$ cat t.dat | sed 's/$/,/' > pipe.dat & do this to start the load
[1] 4978
$ sqlldr / t data=pipe.dat
SQL*Loader: Release 9.2.0.4.0 - Production on Sat Feb 21 11:19:23 2004
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Commit point reached - logical record count 3
[1]+ Done cat t.dat | sed 's/$/,/' >pipe.dat
and t.ctl would be:
$ cat t.ctl
LOAD DATA
concatenate 3
INTO TABLE T
replace
FIELDS TERMINATED BY ','
(a,b,c)
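An alternative to the named pipe, assuming the same three-lines-per-record layout: join each group of three lines into one comma-delimited line up front with paste, and drop concatenate from the control file entirely (FIELDS TERMINATED BY ',' is then enough on its own).

```shell
# Three physical lines per logical record, as in the example above.
cat > /tmp/t.dat <<'EOF'
Tom
C Horizons
9732994000
Sam
TXS
7322221345
EOF

# paste reads three lines at a time and glues them with commas.
paste -d, - - - < /tmp/t.dat > /tmp/t.csv
cat /tmp/t.csv
```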
Thanks Tom
Raju, February 21, 2004 - 1:51 pm UTC
Thanks for the help. It worked like a charm.
Developer
Dave Suzuki, March 02, 2004 - 12:33 am UTC
Hello Tom,
I am new to SQL*Loader.
We are using the 8.1.7 version of the DB.
Toad and SQL*Plus are available to us.
We are loading data from our legacy system into the Oracle DB.
I have created a few temp tables for this purpose.
Now I need to create a control file. Or am I missing something?
Do I need to create a data file, or is it part of the control file?
Could you please give me an example?
Please note: I need to put this control file on the UNIX server from Windows.
Dave
March 02, 2004 - 7:32 am UTC
you create a control file (sort of like a "program") to tell sqlldr how to read your input datafiles.
so yes, you need to create a control file for sqlldr.
you then need to provide some datafile to be loaded. this is the file that your sqlldr control file describes.
have you checked out the server utilities guide? it describes the tools like sqlldr.
Developer
Dave, March 02, 2004 - 10:11 am UTC
have you checked out the server utilities guide?
No, I do not have the server utilities guide.
So the data file contains the data to be loaded, isn't it?
Here is my script for the table.
Please give me some hints for creating the control file.
CREATE TABLE NFS_POSITION_TMP
(
TRX_LOAD_ID NUMBER NOT NULL,
PRG_EXEC_ID NUMBER NOT NULL,
ORIGINAL_FILE_NAME VARCHAR2(50),
REC_CODE VARCHAR2(6),
HDR_REC_CLIENT_ID VARCHAR2(8),
FILLER1_HDR VARCHAR2(22),
FILE_ORIGIN VARCHAR2(21),
FILLER2_HDR VARCHAR2(9),
FILE_NAME VARCHAR2(21),
FILLER3_HDR VARCHAR2(9),
TRANS_CREATION_DATE VARCHAR2(11),
FILLER4_HDR VARCHAR2(38),
FIRM VARCHAR2(9),
BRANCH VARCHAR2(8),
ACCOUNT_NO VARCHAR2(11),
ACCOUNT_TYPE VARCHAR2(6),
FILLER1_POS1 VARCHAR2(7),
CUSIP VARCHAR2(14),
FILLER2_POS1 VARCHAR2(6),
SECURITY_TYPE VARCHAR2(6),
SECURITY_TYPE_MOD VARCHAR2(6),
MKT_CODE VARCHAR2(6),
FILLER3_POS VARCHAR2(6),
DTC_ELIG_CODE VARCHAR2(6),
MARGIN_IND VARCHAR2(6),
REG_REP VARCHAR2(8),
MARGIN_LST_ACT_DATE VARCHAR2(11),
STOCK_REC_LST_ACT_DATE VARCHAR2(11),
SYMBOL VARCHAR2(14),
MKT_PRICE VARCHAR2(23),
POS_FLD_SIGN1_POS1 VARCHAR2(6),
TRADE_DATE_QTY VARCHAR2(20),
POS_FLD_SIGN2_POS1 VARCHAR2(6),
FILLER3_POS1 VARCHAR2(14),
STLMNT_DATE_QTY VARCHAR2(20),
POS_FLD_SIGN1_POS2 VARCHAR2(6),
SGRTD_QTY VARCHAR2(20),
POS_FLD_SIGN2_POS2 VARCHAR2(6),
TRANSIT_QTY VARCHAR2(20),
POS_FLD_SIGN3_POS2 VARCHAR2(6),
TRANSFER_QTY VARCHAR2(20),
POS_FLD_SIGN4_POS2 VARCHAR2(6),
LGL_TRANSFER_QTY VARCHAR2(20),
POS_FLD_SIGN5_POS2 VARCHAR2(6),
NON_NEGBLE_QTY VARCHAR2(20),
POS_FLD_SIGN6_POS2 VARCHAR2(6),
FILLER4_POS2 VARCHAR2(9),
TRADE_DATE_SHRT_SALE_QTY VARCHAR2(20),
POS_FLD_SIGN1_POS3 VARCHAR2(6),
STLMNT_DATE_SHRT_SALE_QTY VARCHAR2(20),
POS_FLD_SIGN2_POS3 VARCHAR2(6),
MTD_POS_TRADE_DATE_BAL VARCHAR2(16),
POS_FLD_SIGN3_POS3 VARCHAR2(6),
MTD_POS_STLMNT_DATE_BAL VARCHAR2(16),
POS_FLD_SIGN4_POS3 VARCHAR2(6),
MTD_POS_TRADE_DATE_POS_COMM VARCHAR2(16),
POS_FLD_SIGN5_POS3 VARCHAR2(6),
MTD_SETTLE_DATE_POS_COMM VARCHAR2(16),
POS_FLD_SIGN6_POS3 VARCHAR2(6),
FILLER1_POS3 VARCHAR2(10),
FILLER2_POS3 VARCHAR2(6),
FILLER3_POS3 VARCHAR2(6),
NO_OF_SEC_DESC_LINES VARCHAR2(6),
SHORT_NAME VARCHAR2(15),
FILLER4_POS3 VARCHAR2(7),
SEC_DESC_LINE1 VARCHAR2(25),
SEC_DESC_LINE2 VARCHAR2(25),
SEC_DESC_LINE3 VARCHAR2(25),
SEC_DESC_LINE4 VARCHAR2(25),
SEC_DESC_LINE5 VARCHAR2(25),
SEC_DESC_LINE6 VARCHAR2(25),
FILLER1_POS5 VARCHAR2(20),
FILLER2_POS5 VARCHAR2(6),
FILLER3_POS5 VARCHAR2(20),
FILLER4_POS5 VARCHAR2(6),
FILLER5_POS5 VARCHAR2(20),
FILLER6_POS5 VARCHAR2(6),
FILLER7_POS5 VARCHAR2(7),
FILLER8_POS5 VARCHAR2(6),
FILLER9_POS5 VARCHAR2(17),
FILLER10_POS5 VARCHAR2(6),
FILLER11_POS5 VARCHAR2(17),
FILLER12_POS5 VARCHAR2(9),
FILLER1_POS6 VARCHAR2(6),
FILLER2_POS6 VARCHAR2(17),
MTD_POS_INC_TRADE_DATE VARCHAR2(16),
POS_FLD_SIGN1_POS6 VARCHAR2(6),
MTD_POS_INC_STLE_DATE VARCHAR2(16),
POS_FLD_SIGN2_POS6 VARCHAR2(6),
REG_REP2 VARCHAR2(8),
AGENCY_CODE VARCHAR2(13),
PROD_CODE VARCHAR2(17),
MATURITY_DATE VARCHAR2(13),
CASH_COLLCTD_BAL VARCHAR2(18),
POS_FLD_SIGN3_POS6 VARCHAR2(6),
FILLER3_POS6 VARCHAR2(23),
RECTTL_CODE_TLR VARCHAR2(6),
FILLER1_TLR VARCHAR2(25),
TTL_RECS VARCHAR2(20),
FILLER2_TLR VARCHAR2(10),
TTL_LOGICAL_RECS VARCHAR2(20),
FILLER3_TLR VARCHAR2(51),
LOADED_FLG VARCHAR2(1) DEFAULT 'N' NOT NULL,
LOAD_DATE DATE DEFAULT TRUNC(SYSDATE),
REC_NUM NUMBER NOT NULL,
CONTENT VARCHAR2(450),
CREATE_USER_ID VARCHAR2(11) NOT NULL,
CREATE_DATE DATE NOT NULL,
UPDATE_USER_ID VARCHAR2(11) NOT NULL,
UPDATE_DATE DATE NOT NULL
)
Dave
California, Irvine
March 02, 2004 - 3:59 pm UTC
yes, yes you DO so have it. Everyone on the planet has it.
otn.oracle.com -> documentation
please check out the server utilities guide, lots of examples.
A create table statement cannot tell you how to create a control file to load a DATA FILE. The datafile is what drives the control file, not the table. The control file tells sqlldr how to read the datafile, your input file.
To Dave Suzuki
denni50, March 02, 2004 - 4:51 pm UTC
here is a sample of a control file I use. I replaced my
columns with yours...you can do some tweaking as needed.
This should get you going.
load data
infile "F:\DATA\loaddata.csv"
into table yourtable append
fields terminated by "," optionally enclosed by '"'
trailing nullcols
(
TRX_LOAD_ID,
ORIGINAL_FILE_NAME,
REC_CODE,
HDR_REC_CLIENT_ID,
FILLER1_HDR,
FILE_ORIGIN,
FILLER2_HDR,
FILE_NAME,
FILLER3_HDR,
TRANS_CREATION_DATE,
FILLER4_HDR,
FIRM,
BRANCH,
ACCOUNT_NO,
ACCOUNT_TYPE,
FILLER1_POS1,
CUSIP,
FILLER2_POS1,
SECURITY_TYPE,
SECURITY_TYPE_MOD,
MKT_CODE,
FILLER3_POS,
DTC_ELIG_CODE,
MARGIN_IND,
REG_REP,
MARGIN_LST_ACT_DATE,
STOCK_REC_LST_ACT_DATE,
SYMBOL,
MKT_PRICE,
POS_FLD_SIGN1_POS1,
TRADE_DATE_QTY,
POS_FLD_SIGN2_POS1,
FILLER3_POS1,
STLMNT_DATE_QTY,
POS_FLD_SIGN1_POS2,
SGRTD_QTY,
POS_FLD_SIGN2_POS2,
TRANSIT_QTY,
POS_FLD_SIGN3_POS2,
TRANSFER_QTY,
POS_FLD_SIGN4_POS2,
LGL_TRANSFER_QTY,
POS_FLD_SIGN5_POS2,
NON_NEGBLE_QTY,
POS_FLD_SIGN6_POS2,
FILLER4_POS2,
TRADE_DATE_SHRT_SALE_QTY,
POS_FLD_SIGN1_POS3,
STLMNT_DATE_SHRT_SALE_QTY,
POS_FLD_SIGN2_POS3,
MTD_POS_TRADE_DATE_BAL,
POS_FLD_SIGN3_POS3,
MTD_POS_STLMNT_DATE_BAL,
POS_FLD_SIGN4_POS3,
MTD_POS_TRADE_DATE_POS_COMM,
POS_FLD_SIGN5_POS3,
MTD_SETTLE_DATE_POS_COMM,
POS_FLD_SIGN6_POS3,
FILLER1_POS3,
FILLER2_POS3,
FILLER3_POS3,
NO_OF_SEC_DESC_LINES,
SHORT_NAME,
FILLER4_POS3,
SEC_DESC_LINE1,
SEC_DESC_LINE2,
SEC_DESC_LINE3,
SEC_DESC_LINE4,
SEC_DESC_LINE5,
SEC_DESC_LINE6,
FILLER1_POS5,
FILLER2_POS5,
FILLER3_POS5,
FILLER4_POS5,
FILLER5_POS5,
FILLER6_POS5,
FILLER7_POS5,
FILLER8_POS5,
FILLER9_POS5,
FILLER10_POS5,
FILLER11_POS5,
FILLER12_POS5,
FILLER1_POS6,
FILLER2_POS6,
MTD_POS_INC_TRADE_DATE,
POS_FLD_SIGN1_POS6,
MTD_POS_INC_STLE_DATE,
POS_FLD_SIGN2_POS6,
AGENCY_CODE,
PROD_CODE,
MATURITY_DATE,
CASH_COLLCTD_BAL,
POS_FLD_SIGN3_POS6,
FILLER3_POS6,
RECTTL_CODE_TLR,
FILLER1_TLR,
TTL_RECS,
FILLER2_TLR,
TTL_LOGICAL_RECS,
FILLER3_TLR,
LOADED_FLG,
LOAD_DATE,
REC_NUM,
CONTENT,
CREATE_USER_ID,
CREATE_DATE,
UPDATE_USER_ID,
UPDATE_DATE)
give your control file a name with .ctl then go to DOS:
c:> sqlldr.exe user@database control=f:\data\controlname.ctl log=f:\data\controlname.log
(will be prompted for password).
and that's it!
oops!
denni50, March 02, 2004 - 5:00 pm UTC
make sure you change the DOS directory to f:
then enter sqlldr commands.
f:> sqlldr.exe...etc
Sql Loader and Bind Variables
Mir, March 04, 2004 - 4:16 pm UTC
Hi Tom,
I have the following Sql Loader Control file. I am looking for a way to load data from various directories into the staging table.
How can I pass values to the siteid and sitedir columns dynamically?
for example:
when I am loading data from, say, c:\www01\1.log I want siteid to be "1" and sitedir to be "www01",
and likewise for c:\www02\1.log I want siteid to be "2" and sitedir to be "www02".
load data
into table stg_log append
fields terminated by ' ' optionally enclosed by '"' TRAILING NULLCOLS
(
logid SEQUENCE(MAX, 1),
c_date position(1:19) date 'RRRR-MM-DD HH24:MI:SS',
c_ip char,
cs_username char,
s_ip char,
s_port char,
cs_method char,
cs_uri_stem char(4000),
cs_uri_query char(4000),
sc_status char,
c_user_agent char
siteid number(*,0),
sitedir varchar2(10))
March 04, 2004 - 5:29 pm UTC
You need to dynamically create the control file on the fly then. In unix, this is really easy. In windows (without a proper scripting language) it is rather cumbersome. But the script would be something akin to:
echo load data > t.ctl
echo into table stg_log append >> t.ctl
.....
echo siteid constant %1, >> t.ctl
echo sitedir constant "%2" >> t.ctl
echo ) >> t.ctl
sqlldr u/p t.ctl data=c:\%2\1.log ....
and you would run the script as
script 1 www01
for example.
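On unix the same idea reads more naturally as a here-document. A sketch, with the column list trimmed to a couple of fields from the posted control file (pass siteid and sitedir as arguments; defaults here only so the sketch runs standalone):

```shell
# Usage: sh gen_load.sh 1 www01   (siteid, sitedir)
siteid=${1:-1}; sitedir=${2:-www01}

# Unquoted EOF so ${siteid}/${sitedir} expand into the control file.
cat > /tmp/t.ctl <<EOF
load data
into table stg_log append
fields terminated by ' ' optionally enclosed by '"' TRAILING NULLCOLS
(
logid SEQUENCE(MAX, 1),
c_ip char,
siteid constant ${siteid},
sitedir constant '${sitedir}'
)
EOF

cat /tmp/t.ctl
# then:  sqlldr u/p control=/tmp/t.ctl data=... (not run here)
```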
Oracle Consultant
Dawar, March 06, 2004 - 11:55 am UTC
Tom,
Is it possible to load multiple files through one control file?
Regards,
Dawar
March 06, 2004 - 12:52 pm UTC
sqlldr will load a file at a time, but a single control file can be used to load many files
sqlldr t.ctl data=file1.dat
sqlldr t.ctl data=file2.dat
Oracle Consultant
Dawar, March 07, 2004 - 12:16 am UTC
Tom,
Thanks for your earlier feed back about multiple files in one control file.
So it is possible to use multiple 'infile' parameters as:
load data
infile 'c:\data1.txt'
infile 'c:\data2.txt'
into table MY_TABLE
terminated by ','
(x,
y
)
Dawar
March 07, 2004 - 9:19 am UTC
yes, that'll work as well (pretty easy to "test" no?)
sqlldr objects
Mariano, March 22, 2004 - 5:39 pm UTC
Tom, hi.
I'm trying to load a flat file into the database with an object column (its attribute is like hh24:mi:ss but it has some
inner arithmetical operations in order to achieve the results it was built for) but I'm having different errors with it.
let's say:
*********************************************
create or replace type hour as object
(
hhmmss varchar2(16)
);
*********************************************
i have this sample file (in this order: name, number, number, blank space, hour, blank space, hour, date, date):
*********************************************
Guest 10;8042;30;;15:52;;15:57;21/03/2004;21/03/2004
Guest 10;8042;30;;15:58;;22:02;21/03/2004;21/03/2004
Guest 11;8043;8;;16:02;;16:08;21/03/2004;21/03/2004
Guest 11;8043;8;;16:11;;22:01;21/03/2004;21/03/2004
*********************************************
I build this control file:
********************************************
LOAD DATA
INFILE 'c:\lucent\data\loginlogoutdiario.txt'
BADFILE 'c:\lucent\data\loginlogoutdiario.bad'
DISCARDFILE 'c:\lucent\data\loginlogoutdiario.dsc'
APPEND
INTO TABLE clucent_login_logout
WHEN ((1) != 'Customer')
FIELDS TERMINATED BY ';'
TRAILING NULLCOLS
(LLL_ID SEQUENCE(MAX,1),
LLL_NOMBRE,
LLL_LOGINID,
LLL_SKILL,
filler_1 filler,
LLL_SINCE EXPRESSION "hour(:lll_since)",
filler_2 filler,
LLL_TILL EXPRESSION "hour(:lll_till)",
LLL_INITIALDATE date 'dd/mm/yyyy',
LLL_FINALDATE date 'dd/mm/yyyy')
********************************************************
On running, i'm getting (e.g.):
Registro 2: Rejected - Error on table CLUCENT_LOGIN_LOGOUT.
ORA-01008: not all variables bound
********************************************************
I know it's not just a sqlldr problem but a wrong approach I'm using with objects; however, I have solutions for this in pl/sql but not here.
How can I load the object without problems with sqlldr?
Thanks.-
March 22, 2004 - 9:11 pm UTC
TRAILING NULLCOLS
(LLL_ID SEQUENCE(MAX,1),
LLL_NOMBRE,
LLL_LOGINID,
LLL_SKILL,
filler_1 filler,
LLL_SINCE column object ( hhmmss ),
filler_2 filler,
LLL_TILL column object ( hhmmss ),
LLL_INITIALDATE date 'dd/mm/yyyy',
LLL_FINALDATE date 'dd/mm/yyyy')
Thanks but...
Mariano, March 23, 2004 - 9:05 am UTC
That was as cool as it was simple. However, if I want to verify the input string before loading it (e.g., let's say it is lacking a digit) I can use my own constructor function. I was reading Database Utilities
</code>
http://download-west.oracle.com/docs/cd/B10501_01/server.920/a96652/ch07.htm#1656 <code>
and trying to apply the example to my case, but it's raising
ORA-00904: invalid column name.
Here's my control file:
*********************************************************
LOAD DATA
INFILE 'c:\lucent\data\ctiauxdaily.txt'
BADFILE 'c:\lucent\data\ctiauxdaily.bad'
DISCARDFILE 'c:\lucent\data\ctiauxdaily.dsc'
APPEND
INTO TABLE clucent_aux_daily
WHEN (1)!= 'Totals'
FIELDS TERMINATED BY ';'
TRAILING NULLCOLS
(LAD_ID SEQUENCE(MAX,1),
LAD_NAME,
LAD_LOGID,
LAD_STAFFED_TIME EXPRESSION "accumulate(:LAD_STAFFED_TIME)"
LAD_DATE date 'dd/mm/yyyy')
**********************************************************
accumulate is a function that's part of my object hour.
Can you suggest a solution to this?
Thanks
March 23, 2004 - 9:39 am UTC
if the data is "bad", it will be automagically filtered to the bad file. you can post process the bad file.
Post process
Mariano, March 23, 2004 - 10:05 am UTC
It's a good suggestion and it's my plan B; however, I have methods in my constructor to handle the error. Let's say I need an input like 1283:45:06 (hours:minutes:seconds) and the flat file has 1283:45:6 (lacking a digit -- I cannot control the system that's giving me this info), but then the special constructor has routines that decide whether it's 06 seconds or 60 seconds (and in the latter case transforms it into 1 minute). Doing it with the bad file means I have to do 2 processes, when the Oracle documentation suggests I can do it in one (of course, I'm horribly failing at it).
Plus, the attribute is used for different input masks (hh24:mi:ss is its simplest mask, but it can also hold accumulative hours like the example I mention above), so maybe the default constructor fails to notice there's a digit lacking and inserts the value into the table anyway.
OK, I can check the table after the insert and repair the corrupted data, but that's a 2-step process and I'd like to exhaust every chance of doing it in just one (it's a standalone process running in the background using dbms_pipe).
Any tip you can give me will be greatly appreciated.
As always, thank you a lot.
March 23, 2004 - 10:32 am UTC
then you don't need/want to call a function, just put the logic into your constructor itself:
CREATE OR REPLACE TYPE hour
AS OBJECT
(
hhmmss varchar2(10),
constructor FUNCTION hour(hhmmss in varchar2) RETURN self AS result
);
/
CREATE OR REPLACE TYPE body hour
AS
constructor FUNCTION hour(hhmmss in varchar2) RETURN self AS result
is
begin
self.hhmmss := 'blah';
return ;
end;
end;
/
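The 'blah' assignment above is just a placeholder for the poster's own validation logic. A sketch of one possible body, assuming the business rule is that a lone trailing digit means a dropped leading zero (the 06-versus-60 decision described above is the poster's to make); not tested against the poster's other masks:

```sql
CREATE OR REPLACE TYPE BODY hour AS
  CONSTRUCTOR FUNCTION hour(hhmmss IN varchar2) RETURN SELF AS RESULT
  IS
    l_val varchar2(16) := hhmmss;
  BEGIN
    -- If only one digit follows the last colon ('15:52:6'),
    -- assume a leading zero was dropped: make it '15:52:06'.
    IF LENGTH(l_val) - INSTR(l_val, ':', -1) = 1 THEN
      l_val := SUBSTR(l_val, 1, INSTR(l_val, ':', -1)) || '0' ||
               SUBSTR(l_val, -1);
    END IF;
    SELF.hhmmss := l_val;
    RETURN;
  END;
END;
/
```

With this in place, the plain `LLL_SINCE column object ( hhmmss )` mapping runs the fix-up automatically on every row loaded.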
Constant Parameter
Mariano, March 24, 2004 - 8:01 am UTC
Tom,
1) In general, can we assume that anything we suppose can be done with a particular feature, but which is not mentioned in the documentation, really cannot be done? E.g., I couldn't find an answer in the documentation for the example I'm posting in my 2nd question, in spite of several special sqlldr functions and parameters that seem, put together, as if they could work out my problem.
2)Let's say i've got a flat file like this:
*************************************************
21/03/2004
GUEST1;8000;9000;1000;1500
GUEST2;8000;9000;1000;1500
GUEST3;8000;9000;1000;1500
GUEST4;8000;9000;1000;1500
*************************************************
I want to load into a table like this:
create table download_tab (
cname varchar2(100),
cvalue1 number,
cvalue2 number,
cvalue3 number,
cvalue4 number,
cdate date);
My control file is...
******************************
LOAD DATA
INFILE 'c:\test.txt'
BADFILE 'c:\test.bad'
DISCARDFILE 'c:\test.dsc'
APPEND
INTO TABLE download_tab
-- this condition in order to not load the first line.-
WHEN (3:3) != '/'
FIELDS TERMINATED BY ';'
TRAILING NULLCOLS
(cname,
cvalue1,
cvalue2,
cvalue3,
cvalue4)
*****************************************
As you may see, I have a date column that's in neither the flat file nor the control file. How can I load into the cdate column not a constant date or sysdate, but the value in the first row of my flat file example?
Using RECNUM with CONSTANT?
Thanks.-
Regards.-
March 24, 2004 - 9:32 am UTC
I'm not touching this one:
...can we assume that anything we suppose it can be done with a
particular feature that's not mention in the documentation...
don't know what you tried to say...
but, sqlldr is a very simple tool -- very simple. You are not going to "program" it to the extent you seem to want to. No, I don't know of any way short of loading the first row into a table (using LOAD=1) and then running sqlldr again with SKIP=1 to load the rest and let them reference the already loaded value.
REF in sqlldr
A reader, March 24, 2004 - 11:07 am UTC
I am having difficulties inserting this object type:
Create or Replace Type Répertoire AS OBJECT (
ID NUMBER, -- PK
Propriétaire Ref Disque,
Nom Varchar2(255),
Père Ref Répertoire,
Member ...)
Create or Replace Type Disque AS OBJECT (
Domain Varchar2(15), -- NT Domain Name
Machine Varchar2(15), -- NT Machine Name
Nom Character, -- PK(Domain,Machine,Nom)
Member ...)
Create Table Disques of Disque(
CONSTRAINT PK_DISQUES primary key (Domain,Machine,Nom),
CONSTRAINT CKC_Nom_DISQUES CHECK (UPPER(Nom)>='A' and UPPER(Nom)
<='Z')
);
Prompt Table Répertoires
Create Table Répertoires of Répertoire(
Père With ROWID SCOPE IS Répertoires,
CONSTRAINT PK_Répertoires primary key(ID)
);
Alter Table Répertoires
ADD CONSTRAINT FK_Répertoires_PERE FOREIGN KEY (Père)
REFERENCES Répertoires ON DELETE CASCADE;
Control file:
LOAD DATA
INFILE 'Repertoires.txt'
INTO TABLE Répertoires
FIELDS TERMINATED BY ';'
TRAILING NULLCOLS
( ID "IDRépertoires.Nextval",
DDomain FILLER char(15),
DMachine FILLER char(15),
DNom FILLER Char,
Propriétaire REF(Constant 'DISQUES',DDomain,DMachine,DNom),
Nom Char(255)
)
WIth this one line there's no problem:
DOM_TEST;AMD2800;M;ROOT
But the log reports that "Error SQL*Loader-350: the directive REF for column PROPRIETAIRE expects 2 arguments, but 4 have been found":
SQL*Loader-350: Erreur de syntaxe à la ligne 12.
La directive REF pour la colonne PROPRIÉTAIRE attend 2 arguments ; 4
trouvés.
SQL*Loader-2026: Chargement arrêté : poursuite impossible.
I have barely found any reference to REFs in the sqlldr docs.
Can you help me ?
March 24, 2004 - 11:21 am UTC
sorry, but I'm not really sure what you are trying to accomplish there.
Discard into table?
Salaam Yitbarek, June 04, 2004 - 11:42 am UTC
A shot in the dark, here, but can SQL Loader put the discarded records into a table instead of a file?
June 04, 2004 - 12:32 pm UTC
no, it cannot.
constraints
jasdeep, June 05, 2004 - 12:24 pm UTC
hi tom
I was learning about sqlldr by using the ulcase demos from rdbms/demo/ and was creating my own tables, but I couldn't enter data into the ulcase2 emp table as it says the deptno column (which is the last column of the table) does not exist.
I added that column and then found out I hadn't created the foreign key reference in this table, which is based on the dept table.
My question is: how would we know that such references have to be made, and which constraints have to be applied, when we get data files from a database other than Oracle?
thanks
June 05, 2004 - 1:38 pm UTC
sorry -- not making sense to me here. "i haven't use reference key in this table" -- not sure what you mean. an example would be useful.
Populating Default Column Values
Sujit Mondal, June 24, 2004 - 11:08 am UTC
I have a ctl script like below:
LOAD DATA
INFILE '$OUT_DIR/BCAST_PGM_QHR.DAT'
APPEND
INTO TABLE RSCDWDBO.T8043_NTI_PGM_QHR_FCT
FIELDS TERMINATED BY '|'
TRAILING NULLCOLS
(
C8043_NBC_TELECAST_DM_ID INTEGER EXTERNAL,
C8043_DATE_DM_ID INTEGER EXTERNAL,
C8043_BROADCAST_WEEK_DM_ID INTEGER EXTERNAL,
C8043_NETWORK_DM_ID INTEGER EXTERNAL,
C8043_NMR_SHOW_CD_DM_ID INTEGER EXTERNAL,
C8043_CLOCK_QHR_DM_ID INTEGER EXTERNAL,
C8043_NTI_YLY_UE_ID INTEGER EXTERNAL,
C8043_SMPL_DM_ID INTEGER EXTERNAL,
C8043_VCR_INDICATOR INTEGER EXTERNAL,
C8043_AA_IND INTEGER EXTERNAL,
C8043_CNTRB_DUR INTEGER EXTERNAL,
C8043_HH_PROJ INTEGER EXTERNAL,
C8043_HH_RTG INTEGER EXTERNAL,
C8043_FEED_DT_REF_ID INTEGER EXTERNAL,
C8043_F2_5 INTEGER EXTERNAL,
C8043_F6_8 INTEGER EXTERNAL
)
BCAST_PGM_QHR.DAT like below:
1|1|1|2|1|1|1|2|1|1|1|2|130|21|230|350
2|1|3|2|1|5|1|1|1|1|1|2|130|21|230|350
And sometime BCAST_PGM_QHR.DAT is like below (Values for last two columns are missing):
3|1|1|2|1|1|1|2|1|1|1|2|140|25
4|1|3|2|1|5|1|1|1|1|1|2|150|25
I want to populate zeros for the last two columns when their values are missing (by default NULL is populated).
Can you please let me know how I can do this by changing the ctl file?
Regards
-Sujit
June 24, 2004 - 2:51 pm UTC
C8043_F2_5 INTEGER EXTERNAL "nvl(:c8043_f2_5,0)",
C8043_F6_8 INTEGER EXTERNAL "nvl(:c8043_f6_8,0)"
)
Is this possible?
michael, August 05, 2004 - 6:47 pm UTC
Table
SQL> desc dim_cond_qual;
Name Null? Type
----------------------------------------- -------- ----------------------------
DCQ_ID NUMBER(38)
COND_QUAL VARCHAR2(400)
Control file
OPTIONS (SKIP=1, ROWS=999999, ERRORS=999999, DIRECT=Y)
LOAD DATA
INFILE 'c:\test.txt'
BADFILE 'c:\test.bad'
DISCARDFILE 'c:\test.dsc'
INTO TABLE dim_cond_qual
FIELDS TERMINATED BY ','
TRAILING NULLCOLS
(
COND_QUAL
,dcq_id "dim_cond_qual_seq.nextval"
)
Note: The datafile has only one column of data.
Datafile
ColumnName
a=1
a=1;b=1
a=2;b=1
I would like to load into the table sequence for DCQ_ID and data from Datafile for COND_QUAL.
How can I achieve this?
Thank you for your time and assistance.
August 05, 2004 - 9:27 pm UTC
looks great -- when I ran it, it worked fine. what's the issue?
Sequence does not populate
michael, August 05, 2004 - 11:52 pm UTC
The sequence did not populate because of DIRECT=Y option.
sqlldr
prince, August 11, 2004 - 1:18 am UTC
Hi Tom,
All our conversion programs are written in Pro*C and use SQLLDR.
We found out from the "prstat" Unix command that SQLLDR is taking 25% of the CPU, and it also shows the Oracle user name and password when you grep
the
Unix process list.
Below is a sample:
$ ps -ef |grep 4921
a1ctrb1 9253 9215 0 11:52:29 pts/4 0:00 grep 4921
orbit 4921 4920 24 11:41:28 pts/3 10:09 sqlldr
userid=MFGR_PROXY/xxxxxxxx@ODSPRDGR control=/usr1/gr.ebs/bin//conv
Below I have attached the source code I use to call SQLLDR in my Pro*C program. Please advise me so that the CPU time is less and SQLLDR does not show the
Oracle user name and password at the Unix prompt when grepping the SQLLDR process.
sprintf(TempStr,"mv %s/%s
%s/%s",s02_incoming,"rmmfxref",s02_incoming,"rmmfxref.dat");
printf("The File Name is <%s>\n",TempStr);
system(TempStr);
userid.len =
sprintf(userid.arr,"%s/%s@%s",s02_orauid,s02_orapwd,s02_orasid);
sprintf(sqlloader,"sqlldr userid=%s control=%s/%s log=/tmp/%s
errors=4000000 rows = 3000000 bindsize = 1048576",userid.arr
,s02_bin2,"convctrl.ctl","ulcase2.log");
system(sqlloader);
if (sqlca.sqlcode < 0 )
{
printf("Error While Connecting to SQL Loader <%d>",sqlca.sqlcode);
}
Thanks and Regards,
Prince.
August 11, 2004 - 9:56 am UTC
why would you want the cpu time to be less? You want this to take longer?????
cpu is not something you can put into the bank and use later.
you cannot "save it up"
google "hideargs" for a c program that shows how to spawn a program and hide the arguments. or you can just write your command to a script, run the script and erase the script.
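Another option specific to sqlldr: put the credentials in a parameter file and pass parfile= on the command line, so ps only ever shows the parfile name. A sketch (the parfile path is made up; credentials taken from the post, password elided):

```shell
# Write the sensitive parameters to a file only the owner can read.
PARFILE=/tmp/load.par
umask 077
cat > "$PARFILE" <<'EOF'
userid=MFGR_PROXY/xxxxxxxx@ODSPRDGR
control=/usr1/gr.ebs/bin/convctrl.ctl
log=/tmp/ulcase2.log
EOF

# The real call would be:  sqlldr parfile=/tmp/load.par
# ps then shows only the parfile name, never the password.
cmd="sqlldr parfile=$PARFILE"
echo "$cmd"
```

Remember to remove the parfile (or keep it permanently locked down to the owning account) once the load completes.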
Why Direct path is used when transformation is done?
Tony, September 03, 2004 - 6:24 am UTC
As per the Oracle documentation, we can't apply transformations in a direct path load. But the log file shows that a direct path load is done even when I do transformations by applying LPAD etc. functions and set DIRECT=TRUE.
Please clarify.
September 03, 2004 - 11:48 am UTC
the restriction was lifted and the documentation was modified -- which version of the doc did you use, and what version of sqlldr were you using?
you can direct path with SQL functions these days (9i and up)
Help Required
Parag Jayant Patankar, November 16, 2004 - 6:13 am UTC
Hi Tom,
I am trying to upload data with SQL*Loader on AIX 4.3.3 (Unix).
inf="/$A2_VERSION/$A2_ENV/cft/receive/netp/recv/netp.txt"
baf="$A2_IMP/netpbad.txt"
echo $inf
echo $baf
sqlldr atlas/atlas control=/f205/$A2_ENV/site/jcl/v9spd910.ctl log=$A2_IMP/v9spd910.log rows=1
load data
infile $inf
badfile $baf
TRUNCATE
into table v9spd910
fields terminated by " "
trailing nullcols
(
v0010n position(0001:0999)
)
Actually, I am trying to avoid hard-coding paths and filenames in the ctl file so I can use these scripts in any of my test environments without modification.
When I try to run it, it gives me the following error:
**SPRTR012**(atlas){1_QC1}/f205/qc1/site/jcl > v9spd910load
/f205/qc1/cft/receive/netp/recv/netp.txt
/f205/qc1/data1/imp/netpbad.txt
SQL*Loader: Release 8.1.7.4.0 - Production on Tue Nov 16 19:13:41 2004
(c) Copyright 2000 Oracle Corporation. All rights reserved.
SQL*Loader-350: Syntax error at line 2.
Illegal combination of non-alphanumeric characters
infile $inf
^
**SPRTR012**(atlas){1_QC1}/f205/qc1/site/jcl >
I have tried different combinations in the variables, including and excluding the "'" character, but every time it gives me the same error.
Kindly help me on this. ( Preferably, in Unix Scripting )
regards & thanks
pjp
November 16, 2004 - 6:45 am UTC
sqlldr .... data=INPUT_FILENAME bad=BADFILE_NAME
they can all be specified on the command line and passed in.
so, a single constant control file -- and your script becomes one line long -- a call to sqlldr with data=$inf, bad=$baf and so on.
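A sketch of what that single constant control file might look like for this load; the invocation line mirrors the original script, and the data= and bad= command-line parameters override the placeholders inside the file:

```sql
-- v9spd910.ctl -- a sketch: no environment-specific paths inside
load data
infile 'placeholder.txt'   -- overridden by data= on the command line
badfile 'placeholder.bad'  -- overridden by bad= on the command line
truncate
into table v9spd910
fields terminated by ' '
trailing nullcols
( v0010n position(0001:0999) )
```

Invoked from the script as one line, e.g. `sqlldr atlas/atlas control=v9spd910.ctl data=$inf bad=$baf log=$A2_IMP/v9spd910.log rows=1`.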
A reader, February 01, 2005 - 10:56 am UTC
Hi Tom,
I have the following table:
Create table nitstoryn
(
SRL NUMBER(12),
STORYFILE_NME VARCHAR2(30),
AUTHOR VARCHAR2(42),
BODY CLOB
);
I am using SQL*Loader to load the data. Obviously the "Body" is big, almost 10,000 characters.
When I use a control file, which has the positions, it is ok.
Load Data
INFILE 'national.TXT'
TRUNCATE
INTO TABLE nitstoryn
(
srl POSITION(1:12) CHAR,
storyfile_nme POSITION(13:42) CHAR,
author POSITION(43:74) CHAR,
body POSITION(75:10000) CHAR
)
However, when I use a control file, which has delimiter like:
Load Data
INFILE 'national.TXT'
TRUNCATE
INTO TABLE nitstoryn
FIELDS TERMINATED BY '||'
(
srl CHAR ,
storyfile_nme CHAR ,
author CHAR ,
body CHAR
)
I get this error in the log file:
"Record 1: Rejected - Error on table NITSTORYN, column BODY.
Field in data file exceeds maximum length"
Could you please tell me why it is like this?
Best Regards,
- Arash
February 01, 2005 - 2:03 pm UTC
(
srl CHAR ,
storyfile_nme CHAR ,
author CHAR ,
body CHAR(1000000)
)
char defaults to 255 unless told otherwise...
You are great! Thanks.
A reader, February 01, 2005 - 4:22 pm UTC
Speed problem
Mohammad, February 17, 2005 - 7:23 pm UTC
Hi Tom,
I have 81 different CSV flat files that I import into Oracle with SQL*Loader, from a Windows station connecting to an Oracle Windows server, so I run the command below 81 times:
sqlldr UserId=u/p@netservicename control=my.ctl direct=true
It takes about 20 minutes in total, and it seems all the time is spent on the 81 logins (each login takes about 15 seconds), not on the actual import.
I need to know if there is a way to import the 81 flat files into 81 tables calling sqlldr just once, or at least to reduce the login time to the Oracle server.
Many Thanks
February 17, 2005 - 7:41 pm UTC
what is in your sqlnet.ora? on the client?
SqlNet.ora file contents
Mohammad, February 17, 2005 - 11:12 pm UTC
Hi Tom,
This is the contents of my oracle client sqlnet.ora
------------------------------------------------------------
# sqlnet.ora Network Configuration File:
#C:\Oracle\product\10.1.0\Client_1\network\admin\sqlnet.ora
# Generated by Oracle configuration tools.
SQLNET.AUTHENTICATION_SERVICES= (NTS)
------------------------------------------------------------
Regards
February 18, 2005 - 7:37 am UTC
put a # in front of sqlnet.auth..... and see how fast you connect.
Many Thanks
Mohammad, February 20, 2005 - 6:00 pm UTC
Thanks Tom,
Really appreciate, it is very fast now.
Best Regards,
Mohammad
Sql Loader
Mohammad, February 22, 2005 - 8:42 pm UTC
Hi Tom,
I have a control file as :
Load Data
Infile "C:\Skills.csv" "str X'7c0d0a'"
Into Table Skills
REPLACE
Fields terminated by X'09'
TRAILING NULLCOLS
(
ID,
NAME,
DESCRIPTION,
NOTES char(10000),
COLOR,
MANPOWER,
SKILLPRIORITY,
THRESHOLD,
REQUIRELEVEL,
WORKSITEID)
and a tab-separated data file such as:
1 Check-in Check-in 16776960 1 0 0 3 1|
2 Gate Agent Arr & Boarding 33023 1 0 0 0 1|
..................................................
In SQL*Loader on Oracle 10g it works very well, but on Oracle 9i, when it reaches the 5th field, color, and the color data is 8 characters long, it can't load that row. How can I modify my control file to work with Oracle 9i as well?
Thank you in advance,
Best Regards,
Mohammad
February 23, 2005 - 1:59 am UTC
"it can't load that row"
very vague, no information -- no error message, no symptoms
and since there is no table create, I cannot run your ctl file to see what it might have been....
SQL*Loader
Dominic Cote, February 24, 2005 - 10:01 pm UTC
Hi ! I have a good question about SQL*Loader...
Is it possible to call SQL*Loader from a SQL script in Windows? I'll give you an example.
I have something like this in my SQL script:
Set echo on
set linesize 2000
etc...
declare
l_string varchar2(2000);
l_user varchar2(20) := '@user';
l_control varchar2(20) := '@Ctl_file';
begin
l_string := '$sqlldr userid='||l_user||' control='||etc...
execute immediate l_string;
end;
I save my file with a .sql extension.
In a SQL session, I call my script like:
SQL>@c:\temp\my_script.sql
With import and export in Oracle 8i, it worked.
But in Oracle 10g, when I call SQL*Loader in a script it doesn't work and I don't know why.
Do you have any idea ?
February 25, 2005 - 4:46 pm UTC
that is not a SQL script
that is an anonymous block of PLSQL -- PLSQL is capable of two things:
interacting with SQL
calling PLSQL stuff.
that won't work at all with import and export in 8i (now, in 10g, when the Data Pump impdp and expdp utilities have plsql api's -- that changes) but in 8i -- nope, you are not doing that.
In 9i, you have external tables, which are basically SQLLDR on steroids in the database, so sqlldr becomes simply "select *"
search this site for
calling sqlldr from plsql
for some other ideas, including using utl_file to read small amounts of data, using a java stored procedure to run a host command.
What if there are multiple rows on one line?
Yogesh Thakare, March 07, 2005 - 5:21 pm UTC
What if there are multiple rows on one line?
We receive a file from one of our vendors, which is a very large file (270MB), and there is more than one row on a single line (and it's not comma separated either). I could go on adding a <Line Feed> after the end of each row, but that's time-consuming.
Would you please suggest a better way to load such a file? E.g., there are 3 rows before the <end of line>:
eg
Row1Col1 Row1Col2 Row1Col3 Row2Col1 Row2Col2 Row2Col3 Row3Col1 Row3Col2 Row3Col3
Row4Col1 Row4Col2 Row4Col3 Row5Col1 Row5Col2 Row5Col3 Row6Col1 Row6Col2 Row6Col3
TIA,
Yogesh
March 08, 2005 - 7:41 am UTC
map that as an external table and use a multi-table insert
so, say I had a table:
create table t ( id int, day1 int, day2 int, day3 int, day4 int );
I can:
insert ALL
into t2 ( id, day, val ) values ( id, 'DAY1', day1 )
into t2 ( id, day, val ) values ( id, 'DAY2', day2 )
into t2 ( id, day, val ) values ( id, 'DAY3', day3 )
into t2 ( id, day, val ) values ( id, 'DAY4', day4 )
select * from t;
insert that single row multiple times using different sets of columns. using sqlldr, I would have multiple INTO clauses instead.
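For the sqlldr variant, a sketch assuming the vendor file holds fixed-width 8-character columns separated by single spaces (the table and column names are hypothetical); each INTO clause picks one row's worth of positions out of the same physical line:

```sql
load data
infile 'vendor.dat'
into table t3 ( c1 position(01:08), c2 position(10:17), c3 position(19:26) )
into table t3 ( c1 position(28:35), c2 position(37:44), c3 position(46:53) )
into table t3 ( c1 position(55:62), c2 position(64:71), c3 position(73:80) )
```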
Insert ALL - 8i ??
Greg, March 08, 2005 - 9:15 am UTC
In the previous post, you mentioned external tables and multi-table insert ..
IIRC, those external tables are 9i and later ... any way of pulling this trick off in 8i? Or do we just suffer with multiple "INTO TABLE xxx ..." clauses? :\
(Yeah, I know it's sad .. we still haven't upgraded .. *sigh* ... )
March 08, 2005 - 10:52 am UTC
if you were loading, you would use sqlldr with multiple into's
A little out of topic, but what is your opinion of this practice: SQL*Loader Case Studies
Christian Veit, March 15, 2005 - 8:51 am UTC
Hi Tom,
I have a simular file structure like the one described in this case study:
"</code>
http://download-west.oracle.com/docs/cd/B10501_01/server.920/a96652/ch10.htm#1007588" <code>
In short, I have a parent row with multiple child rows, but the child rows do not include the parent key:
1000, Parent1
ChildA1
ChildA2
...
1001, Parent2
ChildB1
ChildB2
...
Now this case study suggests using a row-level before-trigger to save the value of the parent key in a global package variable and use it for the child records where it is missing.
So this relies totally on the insert order.
I think this is somewhat "dangerous" and limited (probably no direct path because of the trigger, no parallelism in loading because of the ordering) - what is your opinion?
My idea was to use the rownum keyword (to get the file line number) in sqlldr and do a self-join, or better an analytic function, with the line number to get the parent key. Should be possible, what do you think?
(Version is 9.2.0.4)
Many greetings and thank you for this site!!!
March 15, 2005 - 8:57 am UTC
you have to use rows=1 (to load slow by slow) and it would in fact work since sqlldr processes sequentially.
no direct path, no parallel -- those are given yes.
SQL Loader issue
Kamal, March 22, 2005 - 1:30 pm UTC
Hi Tom
I am using SQL*Loader to load data from a data file into a temp table. The data is stored as a CSV file, and I use ',' as the delimiter. The problem is that the data itself contains commas, so a single value gets split across two columns and the load fails. I cannot use fixed-length fields since the sizes vary and the user stores the data as CSV. Please tell me how to load data that contains a comma as a single column value.
Example
the data will be like this
"Hilary, Ellision 20"
thanks in advance
kamal
March 22, 2005 - 6:03 pm UTC
optionally enclosed by....
ctl-f for that on this page and you'll see at least one example
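For reference, a minimal sketch (the table and column names here are made up):

```sql
load data
infile 'data.csv'
into table temp_t
fields terminated by ',' optionally enclosed by '"'
trailing nullcols
( full_name, age )
-- a record like:  "Hilary, Ellision",20
-- loads full_name as the single value: Hilary, Ellision
```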
HOW TO LOAD MULTIPLE USER DATAFILES INTO SINGLE TABLES USING SQL*LOADER
SHEKAR, April 10, 2005 - 7:57 am UTC
HOW TO LOAD MULTIPLE USER DATAFILES INTO MULTIPLE TABLES USING SQL*LOADER
April 10, 2005 - 10:41 am UTC
run sqlldr over and over...
sqlldr .... data=file1.dat
sqlldr .... data=file2.dat
SQL Loader
Shirish, May 18, 2005 - 5:16 pm UTC
Hi Tom,
I have a flat file from which I have to transfer data to Oracle 9i. The flat files are generated every 3 minutes, and every hour I have to transfer the data to the Oracle DB.
The files generated every 3 minutes have different file names, and the number of rows is not fixed.
Please let me know, step by step, how to go about this.
Regards
Shirish
May 18, 2005 - 8:04 pm UTC
sounds like you are writing a program outside of the database to load data by calling sqlldr, or using an external table
but, it is a program outside of the database I would suspect, that reads a directory and loads files.
SQL LOADER
Shirish, May 19, 2005 - 2:15 pm UTC
Hi Tom,
Yes, you are absolutely right here. We have written a program which creates these flat files every 3 minutes.
Now we have to transfer data from these flat files to the Oracle 9i database after every 1 hour.
Say, suppose the flat files are generated every 3 minutes ( i.e. in 1 hour ==> 20 files will be generated.)
Now we have to transfer these 20 files to the Oracle DB.
This is a continuous process which will happen after every 1 hour.( i.e. after every hour we will have to transfer the next 20 generated files to the Oracle DB and so on.....)
Kindly let me know the steps in detail that are involved
in transferring the data from the flat files to the Oracle DB using SQL*Loader or any other tools.
Regards
Shirish
May 19, 2005 - 2:43 pm UTC
I'm saying you need to write a PROGRAM to do this, not that you have written a program to generate the files.
You need to write yet another program to process said files.
sqlldr from a client workstation could be an obvious choice to load data, yes.
Mutiple table loading from multiple infiles
Satyabrata Padhy, June 08, 2005 - 5:26 am UTC
Hi Tom,
Can we load multiple tables with one control file that specifies two infiles? The requirement is that infile_1 data go into table_1 and infile_2 data go into table_2.
Please help me out.
Thanks in advance.
Regards,
Satya
June 08, 2005 - 8:52 am UTC
why would you want a single control file in this case? You'll want two control files for sure.
Mutiple table loading from multiple infiles
AndyQ, June 08, 2005 - 10:00 am UTC
Is this a limitation within sqlldr? Being able to execute one control file to load various flat files into various tables would be useful.
June 08, 2005 - 10:28 am UTC
sqlldr can load data from more than one file (multiple infile statements)
sqlldr can load data into more than one table (multiple into statements)
but to load file1 into table1 and file2 into table2, two control files are the only concise, expedient way to accomplish it.
You would have to have something in the file that identified what "file" it was and use a when clause in the INTO statement in order to do it in a single ctl file.
a file to go into a table -> a control file to me.
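If you truly needed one control file, a sketch under the assumption that each record starts with a two-character tag identifying its source file (the tags and layouts here are hypothetical):

```sql
load data
infile 'combined.dat'
into table table_1 when (1:2) = 'F1'
( col_a position(3:10) )
into table table_2 when (1:2) = 'F2'
( col_b position(3:10) )
```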
Applying function while loading to avoid Update after Load
Vivek Sharma, July 21, 2005 - 8:46 am UTC
Hi Tom,
I have a table with 4 columns. I have a TXT file in which the data for first 3 columns are available. Last column, I have to apply a function and populate. I tried this but am unable to load the data. The table structure is
SQL> desc sales_range
Name Null? Type
----------------------------------------------------- -------- -------------------------
SALESMAN_ID NUMBER(5)
SALESMAN_NAME VARCHAR2(35)
SALES_AMOUNT NUMBER(10)
SALES_DATE DATE
The function I created is
function sales_fnx(s_id sales_range.salesman_id%type)
return date
as
s_dt date;
begin
if mod(s_id,2)=0 then
s_dt:=to_date('12-31-1999','MM-DD-YYYY');
elsif mod(s_id,3)=0 then
s_dt:=to_date('01-01-2000','MM-DD-YYYY');
elsif mod(s_id,5)=0 then
s_dt:=to_date('01-02-2000','MM-DD-YYYY');
elsif mod(s_id,7)=0 then
s_dt:=to_date('01-03-2000','MM-DD-YYYY');
else
s_dt:=to_date('01-04-2000','MM-DD-YYYY');
end if;
return s_dt;
end;
the Control file is
load data
INFILE sales.txt
APPEND INTO TABLE sales_range
trailing nullcols
(salesman_id,
salesman_name,
sales_amount,
sales_date "sales_fnx(:salesman_id)")
and the TXT file is
15491 , CPTPAUWNYIWPPCFAHXDKLLDPQ , 24310
49769 , DGVJOEADCXOCIJZEAGOLONKQE , 14632
21887 , RSKFSHQFBCIARNRIIRXHLRAOD , 4915
27342 , CISZAGOFWJOUELSZWNXWTFKSC , 26290
52364 , RUVPHBAQFIROWBBLMTJWEZPWH , 10766
But it rejects the record. What could be the reason ? How can I populate the data to a column which is not present in the External File.
If SQL*Loader populates the data based on the function, I eliminate the step of updating the sales_date column after the load. The steps otherwise would be:
1. Load the data into three columns
2. Update sales_date column with sales_fnx(salesman_id) value.
Please help.
Regards
Vivek
July 21, 2005 - 3:50 pm UTC
it loaded "as is" for me; suggest you review your log file to see why the record was rejected. I cannot reproduce.
mohan, October 24, 2005 - 9:44 am UTC
Hi
When using SQL*Loader to load data from a flat file into a relational table, no data gets into the table due to errors loading the records.
LOAD DATA
INFILE 'C:\test_data.dat'
REPLACE
INTO TABLE test_1
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
(PARTITION_KEY ,
ENTITY_CODE ,
ENTITY_DESC ,
GROUP_CODE ,
COUNTRY_CODE ,
ET_CODE_USER ,
SWAPS_NETTING_IND ,
SWAPS_MARGIN_CALL_IND ,
REPOS_MARGIN_CALL_IND ,
FLAG_TRIGGER_BEFORE ,
FLAG_TRIGGER_AFTER ,
ATTRIBUTE_1 ,
ATTRIBUTE_2 ,
ATTRIBUTE_3 ,
ATTRIBUTE_4 ,
AMOUNT_1 ,
AMOUNT_2 ,
COUNTRY_OF_RESIDENCE ,
BRANCH_SUBSIDIARY ,
SEBC_NOT_QUALIFYING ,
LONG_NAME ,
STREET_1 ,
STREET_2 ,
ZIP_CODE ,
STATE ,
CITY ,
RISK_AREA ,
RISK_SUB_AREA ,
STATUS ,
IDENTIFIER_1 ,
ECONOMIC_SECTOR ,
ADMIN_PROFILE_ID ,
BIRTH_DATE ,
CAD_TYPE ,
INDUSTRY_SECTOR ,
REVIEW_DATE ,
INCORPORATION_COUNTRY ,
OUTLOOK_RATING ,
INTERNET_ADDRESS ,
EMAIL_ADDRESS ,
BANKSCOPE_CODE ,
CAD_PARAMETER ,
LT_INTERNAL_RATING ,
TOTAL_ANNUAL_SALES ,
TOTAL_ASSETS ,
TOTAL_CURRENCY ,
IS_LOMBARD ,
DEFAULT_FLAG ,
INTERNAL_RATING_BSC ,
ENTITY_GL_CRITERIA ,
SPECIALIZED_LENDING ,
ENTITY_CODE_2 ,
LT_INTERNAL_RATING_DATE ,
TOTAL_SALES_DATE ,
TOTAL_ASSETS_DATE ,
CONTINUOUS_PD ,
RATING_METHOD ,
CLAIMS_BY_OTHER_BANKS ,
ATTRIBUTE_5 ,
ATTRIBUTE_6 ,
ATTRIBUTE_7 ,
ATTRIBUTE_8 ,
ATTRIBUTE_9 ,
MANAGED_AS_RETAIL ,
CHECK_ERROR_NUMBER ,
IMPORT_SOURCE ,
LONG_DESC ,
SUFICIENT_INFO_ON_DEFAULT ,
AVOID_RECLASSIFICATION )
"1002"|"1113"|""C"UST|NAME"||"SG"|||||||||||||||||||||||||||||||||||||||||||||||||||||||||||"F"|||||"F"
"1003"|"1114"|"'C'UST|NAME"||"SG"|||||||||||||||||||||||||||||||||||||||||||||||||||||||||||"F"||
Thanks
Mohan
October 24, 2005 - 11:39 am UTC
fix the errors?
Lines terminated by
Harry, November 10, 2005 - 5:23 am UTC
Hi Tom,
Thank you for the great site.
My question is: how do I load data that contains a line feed? I need to preserve the line feed in the field. Thanks a lot.
DATA:(there is a line feed in the ename column)
7369,"Mark
T","clrkl",7902,"1980-12-17 00:00:00",800.00,,20$
7369,"SMITH","clrkl",7902,"1980-12-17 00:00:00",800.00,,20$
CONTROL FILE:
load data
infile 'test.dmp'
--APPEND
truncate
into table emp
fields terminated by ',' OPTIONALLY ENCLOSED BY '"'
(empno terminated by ',',
ename terminated by ',',
job terminated by ',',
mgr terminated by ',',
hiredate DATE "yyyy-mm-dd hh24:mi:ss" terminated by ',',
sal terminated by ',',
comm terminated by ',',
deptno terminated by '$')
November 11, 2005 - 11:38 am UTC
do you have access to my book "Expert one on one Oracle" - if so the chapter on sqlldr covers this in detail.
reader
A reader, November 16, 2005 - 10:50 am UTC
Like to know how to load "null" into a numeric field
with SQL*Loader.
Example:
========
SQL> desc test
Name Null? Type
----------------------------------------- -------- ----------------------------
COL1 NUMBER
COL2 DATE
COL3 NUMBER
JServer Release 9.2.0.5.0 - Production
$ cat test.ctl
load data
infile test.dat
into table y.test append
fields terminated by ',' optionally enclosed by '"'
(
col1 ,
col2 date "yyyymmdd",
col3
)
$ cat test.dat
1,20051027,null
Record 1: Rejected - Error on table HRG0110.TEST, column COL3.
ORA-01722: invalid number
0 Rows successfully loaded.
1 Row not loaded due to data errors.
November 16, 2005 - 6:05 pm UTC
1,2,
just leave it blank and specify "trailing nullcols" after the "into" clause.
reader
A reader, November 17, 2005 - 7:12 am UTC
What will be the solution if "null" is to be loaded
in the first column, COL1
November 17, 2005 - 8:18 am UTC
,2,1
just don't "put anything" there.
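Putting both answers together, a sketch for the test table above; an empty field between delimiters loads as NULL, and trailing nullcols covers fields missing at the end of a record:

```sql
load data
infile 'test.dat'
into table test append
fields terminated by ',' optionally enclosed by '"'
trailing nullcols
( col1, col2 date "yyyymmdd", col3 )
-- data line  ,20051027,2  loads col1 as NULL
-- data line  1,20051027,  loads col3 as NULL
```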
A reader, December 07, 2005 - 3:41 pm UTC
I call sql loader thru VB using 'shell' command.
Can I pass in sql loader a name of the file I load as an input parameter and use it in sql loader as const.
This way we can control files we already proccesed.
I have:
LOAD DATA
INFILE *
APPEND
INTO TABLE TEMP_ORDERS
when REC_TYPE='H'
(REC_TYPE position(1:5) char,
CLIENT_CODE position(6:11) char,
ORDER_NUMBER position(12:17) char,
....,
FILE_NAME my input parameter)
Thanks.
December 08, 2005 - 1:40 am UTC
have your VB program write the control file. there are no methods to "pass" something into the control file.
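A sketch of what the generated control file could look like; since there is no parameter mechanism, the VB program embeds the current file name via the CONSTANT keyword each time it writes the file (the file name below is made up):

```sql
LOAD DATA
INFILE 'orders_20051207.dat'
APPEND
INTO TABLE TEMP_ORDERS
when REC_TYPE='H'
( REC_TYPE position(1:5) char,
  CLIENT_CODE position(6:11) char,
  ORDER_NUMBER position(12:17) char,
  FILE_NAME CONSTANT 'orders_20051207.dat' )
```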
SQLLOADER performance
ashish, December 20, 2005 - 11:22 am UTC
Hi Tom,
We are working on a datawarehouse application and have hourly ETL's working against the database.
We have around 2 million rows getting loaded into many tables each hour, and we aren't able to keep up with the log files generated by another system.
What are the best options to load the data into the warehouse more efficiently? Since these tables have RI constraints, can we use direct=true with parallel=yes to get the data loaded directly into the datafiles?
But we have indexes on these tables for the foreign keys, plus the primary key index.
Should we make the indexes unusable before the load, use skip_index_maintenance=true, and then rebuild them online after the load is complete?
I am not sure, since the rebuild is also going to take some time, and more redo will be generated by this activity.
Please guide us to a suitable approach.
Thanks in advance!
December 20, 2005 - 12:59 pm UTC
what log files? what competing system?
SQLLOADER performance
Ashish, December 20, 2005 - 11:06 pm UTC
Hi tom,
Log files (meaning data files) are generated by the app servers and need to be loaded; that's what our competing system is. They generate a lot of data every minute in files and put them in a shared file store. We pick these files up, move them to our local production db host, and then run SQL*Loader on them.
But with the performance being too slow, we are lagging behind, with many files accumulating on that file store.
Thanks
December 21, 2005 - 7:18 am UTC
with a measly 2 million records?? you cannot load that in a couple of seconds?
What are the table(s) you are loading into and how heavily are they indexed? If you tkprof a load, what do you see?
SQLLOADER performance
Ashish Kapur, December 21, 2005 - 9:50 am UTC
Hi tom,
2 million records are for the parent table; the child tables have a minimum of 3 rows for each parent record. We receive these files every minute, to be loaded into the warehouse.
Right now the table is partitioned by week; we have indexes on the PK, a date-based range-partitioned index, and the foreign key indexes.
We still lag behind by many files. What is the optimal way to run SQL*Loader?
1. Stay in conventional mode and disable all RI constraints.
2. Use direct mode, disable all indexes and rebuild them once the load is complete; not sure about this, since the load happens every hour.
3. Use external tables, create a staging area, do the transformations there, and move the data from staging to the real production environment.
4. Do we need RI enabled on the staging area while doing transformations? Can we turn RI off entirely in staging and, using joins, push only the data that have valid parent-child relationships? We will have indexes in place to do the joins.
5. Can we just MOVE the data from the staging tables to the production tables, so that we don't have to use DELETE in staging to get rid of the moved rows?
Your thoughts will be appreciated!
December 21, 2005 - 7:32 pm UTC
please be more clear on the volumes here. total record count + WIDTH of records per minute/hour/whatever.
what is the database you are loading into - describe the system a little better (what is the goal of this system, what does it do)
Tkprof SQLLOADER
A Reader, December 21, 2005 - 9:54 am UTC
It would be interesting to tkprof SQL*Loader. Please let us know how to do it.
Thanks
December 21, 2005 - 7:33 pm UTC
logon trigger that issues alter session command.
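A sketch of such a trigger; the account name is hypothetical, and you would disable or drop the trigger once the tracing is done:

```sql
create or replace trigger trace_loader_logon
after logon on database
begin
  if user = 'LOAD_USER' then  -- hypothetical loading account
    execute immediate 'alter session set timed_statistics=true';
    execute immediate 'alter session set sql_trace=true';
  end if;
end;
/
```

Every sqlldr session connecting as that user is then traced; run tkprof on the resulting trace file.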
SQLLOADER performance
Ashish Kapur, December 23, 2005 - 10:02 am UTC
Hi Tom,
We have 3 tables which are bound by RI constraints.
The parent table (A) gets 2 million records per hour; for each parent record there are 3 child records in table (B), and based on that table there is 1 other child record in table (C).
This means we have 2 million + 6 million + 1 million records/hr, and we need to extract, transform and then load the data into these tables with RI enabled.
As far as indexes are concerned, the PK and FKs are indexed, and the tables are range partitioned by date with a LOCAL index on that column.
Partitioning is based on WEEK.
Please let us know, which way to go and which method will be the most optimal method to do this ETL.
December 23, 2005 - 12:43 pm UTC
IN BULK - bulk up as much as you can.
array processing will be "key"
SQL*LOADER performace
Ashish Kapur, December 24, 2005 - 12:16 am UTC
Hi Tom,
An example would be what we would look forward to.
Thanks in anticipation.
December 24, 2005 - 9:04 am UTC
I have literally thousands of examples on this site showing bulk operations. They are well documented as well.
Pl/SQL + SQLloader
Tony, December 26, 2005 - 4:35 am UTC
Tom
Happy Christmas.
I'd like to know how we can invoke SQL*Loader from a PL/SQL block.
Cheers
Tony
December 26, 2005 - 10:49 am UTC
sql loader with record delimiters
ravi, March 02, 2006 - 8:32 am UTC
Hi Tom,
My Dat file look something like this when i open it in unix.
06/02/15,bbip19658354,Internet,5456221,12500553^M
06/02/15,bbip19927623,Internet,10794257,34952296^M
06/02/15,bbip10211641,Internet,144490200,125243823^M
i am trying to load this using the following controlfile
load data
infile '/btsahcsp/home/btsahcsp/scripts/sample.tmp' "str '^M'"
badfile '/btsahcsp/home/btsahcsp/scripts/sample.bad'
truncate into table USAGE_DATA_INCOMING_STG_TBL
fields terminated by ',' trailing nullcols
(
record_date DATE "yy/mm/dd", sid, tap_name, upstream_usage, downstream_usage,
total_usage ":upstream_usage + :downstream_usage", sno SEQUENCE(MAX,1),
cust_type char(3) "nvl(:cust_type,'BTC')"
)
i am getting the following error. please help
SQL*Loader-510: Physical record in data file (/btsahcsp/home/btsahcsp/scripts/sample.tmp) is longer than the maximum(1048576)
Thanks in advance,
ravi
March 02, 2006 - 12:41 pm UTC
well, is that str ^M two characters or one?
For in your dat file, I'm sure the ^M is a single character, and this file was created on windoze, which uses carriage return/linefeeds instead of just linefeeds.
So, if you just typed ^M as two characters -- you don't have any record ending with those two characters.
Consider using HEX (the ascii code for ^M, a carriage return, is 13 decimal) instead of two characters.
Or get rid of the ^M (the utility dos2unix is included in many unix distributions to fix this windows issue).
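A sketch of the hex approach applied to this control file; x'0D' is the carriage return that displays as ^M (if each line also ends with a linefeed, the terminator is likely x'0D0A'):

```sql
load data
infile '/btsahcsp/home/btsahcsp/scripts/sample.tmp' "str x'0D0A'"
badfile '/btsahcsp/home/btsahcsp/scripts/sample.bad'
truncate into table USAGE_DATA_INCOMING_STG_TBL
fields terminated by ',' trailing nullcols
( record_date DATE "yy/mm/dd", sid, tap_name, upstream_usage, downstream_usage,
  total_usage ":upstream_usage + :downstream_usage", sno SEQUENCE(MAX,1),
  cust_type char(3) "nvl(:cust_type,'BTC')" )
```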
Problems with tab delimited
Craig, May 17, 2006 - 7:17 pm UTC
I'm having a bear of a time with this one. It seems as if in a tab delimited file, null fields are being ignored by sqlldr. Witness:
test_comma.txt:
"ABc","1","a1"
,"2",
"ghi",,"c3"
"jkl","4","d4"
"mno",,
,,"f6"
"stu",,
test_tab.txt:
"ABc" "1" "a1"
"2"
"ghi" "c3"
"jkl" "4" "d4"
"mno"
"f6"
"stu"
You'll have to trust me that those are tabs.
DROP TABLE cnelson.t;
CREATE TABLE cnelson.t
( a VARCHAR2(5),
b VARCHAR2(5),
c VARCHAR2(5)
);
First, let's process the comma delimited file.
OPTIONS (errors=10, direct=false, skip=0)
LOAD DATA
INFILE 'test_comma.txt'
BADFILE 'bad_comma.txt'
TRUNCATE
INTO TABLE CNELSON.T
FIELDS TERMINATED BY X'2C' OPTIONALLY ENCLOSED BY X'22'
TRAILING NULLCOLS
(
A char(5),
B char(5),
C char(5)
)
-bash-2.05b$ sqlldr control=test_comma.ctl
SQL*Loader: Release 9.2.0.4.0 - Production on Wed May 17 18:09:41 2006
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Commit point reached - logical record count 8
-bash-2.05b$
SQL> select * from cnelson.t;
A B C
----- ----- -----
ABc 1 a1
2
ghi c3
jkl 4 d4
mno
f6
stu
7 rows selected.
SQL>
Well, that seems to look ok. Let's try the tab delimited file now.
OPTIONS (errors=10, direct=false, skip=0)
LOAD DATA
INFILE 'test_tab.txt'
BADFILE 'bad_tab.txt'
TRUNCATE
INTO TABLE CNELSON.T
FIELDS TERMINATED BY X'09' OPTIONALLY ENCLOSED BY X'22'
TRAILING NULLCOLS
(
A char(5),
B char(5),
C char(5)
)
-bash-2.05b$ sqlldr control=test_tab.ctl
SQL*Loader: Release 9.2.0.4.0 - Production on Wed May 17 18:06:07 2006
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Commit point reached - logical record count 8
-bash-2.05b$
SQL> select * from cnelson.t;
A B C
----- ----- -----
ABc 1 a1
2
ghi c3
jkl 4 d4
mno
f6
stu
7 rows selected.
SQL>
That doesn't look right at all!! It seems that whenever a field is "missing" (i.e. null), it shifts all the subsequent fields for that record. When I remove TRAILING NULLCOLS, the records that have nulls error out, just as you would expect if you had a missing field. But I DON'T have a missing field... it's just null. The correct number of delimiters is present in each record. Please help straighten me out!!
Thanks!
May 18, 2006 - 10:36 am UTC
don't have time to look at this right now - however if you have access to "Expert One on One Oracle" - I did write up all about tabs and sqlldr and the different behaviours in the chapter on sqlldr.
date in milliseconds
jasdeep, May 18, 2006 - 4:31 pm UTC
Hi Tom
I have this problem: I got a .csv file from a client which I have to load into my database.
In it, the date is in 'mm/dd/yyyy hh24:mi:ss.999' format.
When I load it, the records are rejected. I tried deleting the digits after the '.' from 2 records, and only those 2 records went into the db.
I can't edit the whole file. Is there a way to get these records loaded?
May 19, 2006 - 11:45 am UTC
if you are loading into a DATE type
...
date_col "to_date( substr(:date_col,1,19), 'mm/dd/yyyy hh24:mi:ss')",
.....
or whatever SQL function you need to write to make it work (eg: if jan 1st is really 1/1/2006 - not 01/01/2006, you would need to fix up the substr, maybe
substr( :date_col, 1, length(:date_col)-4 )
Problems with tab delimited
Craig, May 19, 2006 - 10:59 am UTC
I appreciate whatever time you have to spend on my problem(s). I have now re-read that chapter and gone through your examples, but my issues remain. Below I have recreated my issue with your data, only I've removed all the EXTRA tabs, so that there is just one tab between each field. For clarity:
10\tSales\tVirginia
20\t\tIllinois
LOAD DATA
INFILE *
INTO TABLE DEPT
REPLACE
FIELDS TERMINATED BY X'09'
TRAILING NULLCOLS
(DEPTNO,
DNAME "nvl(:DNAME, 'xxx')",
LOC)
BEGINDATA
10 Sales Virginia
20 Illinois
SQL> select * from dept;
DEPTNO DNAME LOC
---------- -------------- -------------
10 Sales Virginia
20 xxx Illinois
That works.
LOAD DATA
INFILE *
INTO TABLE DEPT
REPLACE
FIELDS TERMINATED BY X'09'
TRAILING NULLCOLS
(DEPTNO,
DNAME "nvl(:DNAME, 'xxx')",
LOC)
BEGINDATA
10 Sales "Virginia"
20 "Illinois"
SQL> select * from dept;
DEPTNO DNAME LOC
---------- -------------- -------------
10 Sales "Virginia"
20 xxx "Illinois"
This also works, but of course the LOC data is loaded with the quotes, so I'll tack on optionally enclosed by '"'.
LOAD DATA
INFILE *
INTO TABLE DEPT
REPLACE
FIELDS TERMINATED BY X'09' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(DEPTNO,
DNAME "nvl(:DNAME, 'xxx')",
LOC)
BEGINDATA
10 Sales "Virginia"
20 "Illinois"
SQL> select * from dept;
DEPTNO DNAME LOC
---------- -------------- -------------
10 Sales Virginia
20 Illinois
Clearly this is not right. It's as if sqlldr sees the 2 juxtaposed tabs as one big delimiter, but only if I'm using optionally enclosed. Surely this is not the correct behavior?
Regarding the milliseconds question above, You could always just substring 'mm/dd/yyyy hh24:mi:ss.999' to be a more oracle-friendly format.
date_field "to_date(substr(:date_field, 1, 19), 'MM/DD/YYYY HH24:MI:SS')"
May 19, 2006 - 12:23 pm UTC
can you use trim instead of using optionally enclosed by?
date
jasdeep, May 19, 2006 - 11:54 am UTC
you mean to say use substr in control file
May 19, 2006 - 12:23 pm UTC
yes, just like the above example used NVL() for example
trim instead of optionally enclosed by
Craig, May 19, 2006 - 2:41 pm UTC
Do you mean I should try to trim each field, or is there a way to "blanket" trim the quotes off? I have the O'Reilly SQL*Loader text and the online documentation, but there are about a gozillion features that don't get much press... Am I wrong in thinking that optionally enclosed *should* handle my situation?
thanks
May 20, 2006 - 4:37 pm UTC
instead of using the optionally enclosed by, use a sql function to optionally trim the leading and trail "'s
It has to do with the processing of optionally enclosed by, which treats whitespace differently (and a TAB counts as whitespace).
For example, if you used comma delimited data - this would not happen.
Tab Delimitor Control File With Trim Option
Kiran Arsid, May 31, 2006 - 11:25 pm UTC
Hi,
I have a .txt file which is tab-delimited, with some of the fields enclosed in double quotes. A sample of the data file is:
Shipped/t/t1008836/t"CUSTOMER NAME"/t144/t/t20060430
where /t is the tab between each field.
I used the trim option with " in it in order to eliminate the quotes, but it is not working; it gives me a syntax error. Can you please suggest how to remove the quotes? The control file I am using is below:
LOAD DATA
INFILE 'datafile.dat'
BADFILE 'badfile.dat'
REPLACE
INTO TABLE HISTORY_STAGING_TBL
FIELDS TERMINATED BY X'09'
TRAILING NULLCOLS
(DEMAND_TYPE CHAR NULLIF DEMAND_TYPE = BLANKS,
SUPPLY_TYPE CHAR NULLIF SUPPLY_TYPE = BLANKS,
ITEM CHAR NULLIF ITEM = BLANKS,
CUSTOMER_NAME CHAR "TRIM('"' from :CUSTOMER_NAME)",
SUPPLIER_NAME CHAR,
YTD_TOTAL INTEGER EXTERNAL,
CUTOFF_DATE DATE "YYYYMMDD")
Thanks
Kiran.
June 01, 2006 - 9:59 am UTC
quickest fix that is easy to read:
change '"' into chr(34)
else, you'll have to escape the " in the '"'
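Applied to the control file above, the corrected field would read (only this line changes):

```text
CUSTOMER_NAME CHAR "TRIM(chr(34) from :CUSTOMER_NAME)",
```

The original failed because the " inside '"' ended the quoted SQL string early; chr(34) avoids nesting the quote character altogether.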
Tab Delimited with trim option
Kiran Arsid, June 01, 2006 - 2:06 pm UTC
Hi,
First, thank you for your suggestion. I used chr(34) instead of '"' in the trim and it is working fine.
Thanks a lot once again.
Kiran.
A record spanning multiple lines
Nik, June 08, 2006 - 5:35 pm UTC
One field in my records has CR/LF characters and hence the data file has records spanning multiple lines.
How do I load such data using a sql loader?
Is there a way to change the record delimiter so that it reads the records even after the line is over?
truncate
A reader, June 28, 2006 - 8:14 pm UTC
what does truncate mean in a sqlldr statement?
June 28, 2006 - 9:00 pm UTC
means "truncate the table before loading it"
Question about initial response to this question
Mathew Butler, June 29, 2006 - 5:02 am UTC
June 29, 2006 - 7:29 am UTC
I meant the "archive destination filled up" - sorry about that.
What does this error mean
Austin, June 30, 2006 - 4:31 am UTC
I am using an .xls file to load a table in Oracle. The table columns are:
desc test
--------------
a varchar2(20)
b varchar2(20)
data in xls file is as follows
cell
------
row 1 2
2 3
3 4
5 6
When I run it from the command line, the following error log is generated:
SQL*Loader: Release 9.2.0.1.0 - Production on Fri Jun 30 13:53:30 2006
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Control File: load_toad.ctl
Data File: C:\m.xls
Bad File: C:\m.bad
Discard File: C:\m.dsc
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Continuation: none specified
Path used: Direct
Table "TEST", loaded from every logical record.
Insert option in effect for this table: APPEND
Column Name Position Len Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
A FIRST * WHT CHARACTER
B NEXT * WHT CHARACTER
Record 1: Rejected - Error on table "TEST", column B.
Field in data file exceeds maximum length
Record 2: Rejected - Error on table "TEST", column B.
Field in data file exceeds maximum length
Record 3: Rejected - Error on table "TEST", column B.
Field in data file exceeds maximum length
Record 5: Rejected - Error on table "TEST", column A.
Field in data file exceeds maximum length
Record 4: Rejected - Error on table "TEST", column B.
ORA-01401: inserted value too large for column
Record 6: Rejected - Error on table "TEST", column A.
Field in data file exceeds maximum length
Table "TEST":
0 Rows successfully loaded.
6 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Bind array size not used in direct path.
Column array rows : 5000
Stream buffer bytes: 256000
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 6
Total logical records rejected: 6
Total logical records discarded: 0
Total stream buffers loaded by SQL*Loader main thread: 0
Total stream buffers loaded by SQL*Loader load thread: 0
Run began on Fri Jun 30 13:53:30 2006
Run ended on Fri Jun 30 13:53:36 2006
Elapsed time was: 00:00:06.15
CPU time was: 00:00:00.03
Can you please tell me what is the problem here.
Regards
Austin
June 30, 2006 - 7:30 am UTC
yeah, easy.
.xls is a proprietary binary format file. It is not something sqlldr will read and process.
adding a constant field
abz, November 21, 2006 - 5:53 am UTC
Oracle 9.2
I have a control file as below.
LOAD DATA
INFILE 'C:\ip.txt'
BADFILE 'C:\ip.BAD'
DISCARDFILE 'C:\ip.DIS'
INSERT
INTO TABLE ip
(
p1 POSITION(1) char terminated by ' ',
p2 char terminated by ' ',
p3 char terminated by ' '
)
I want to add another field p4, but the value of that
field will not come from the ip.txt file, instead, it will be a constant "AB" for all records.
Can you please tell me what's the syntax in the .ctl file for this?
November 22, 2006 - 3:26 pm UTC
did you look up "constant" in the server utilities guide :)
that would be it.
p4 constant 'AB'
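So the control file above becomes (only p4 is new; its value never comes from ip.txt):

```text
LOAD DATA
INFILE 'C:\ip.txt'
BADFILE 'C:\ip.BAD'
DISCARDFILE 'C:\ip.DIS'
INSERT
INTO TABLE ip
(
p1 POSITION(1) char terminated by ' ',
p2 char terminated by ' ',
p3 char terminated by ' ',
p4 CONSTANT 'AB'
)
```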
A reader, December 20, 2006 - 4:04 pm UTC
Tom,
How can I load primary key data from the database sequence?
Can you please give an example?
Thanks.
December 20, 2006 - 6:58 pm UTC
if you are using sqlldr
....,
column_name "sequence.nextval",
.....
works
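A fuller sketch, with invented names (table t, sequence t_seq, column name). The EXPRESSION keyword tells SQL*Loader the column's value comes entirely from the SQL string, so it does not consume a field from the data file:

```text
-- hypothetical: t, t_seq and name are assumptions, not from the question
LOAD DATA
INFILE 'data.csv'
INTO TABLE t
FIELDS TERMINATED BY ','
( name char(100),
  id   EXPRESSION "t_seq.nextval"
)
```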
SQL Loader Problem exist in my case
Chetan Sachdeva, December 29, 2006 - 6:35 am UTC
Hi,
I have similar kind of problem. I have a table structure with two fields:
CREATE TABLE RECON_IMPORT_TXN_TEST
(
ID_REC NUMBER(10) NOT NULL,
DT_TRAN timestamp
)
I am using SQL loader to insert data in that table through following control file:
load data
infile "C:\abc_CTL.dat"
append into table TEST
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(
id_rec integer,
dt_tran "to_timestamp(substr(:dt_tran,1,26), 'Mon dd yyyy hh:mi:ss:ff3AM')",
dt_lme_prmt "to_timestamp(substr(:dt_lme_prmt,1,26), 'Mon dd yyyy hh:mi:ss:ff3AM')"
)
But its giving me error. my date in dat file is as: Nov 14 2006 12:00:00:000AM
Kindly suggest me any solution..
Thanks in advance..
Chetan Sachdeva
)
December 29, 2006 - 9:57 am UTC
sorry?
I mean, you have two timestamps referenced in the control file, you have one character string.
dt_lme_prmt - what the heck is that, there is no such column in your example
sqlldr date problem
A reader, February 13, 2007 - 1:54 pm UTC
I have .csv file which has a date column
some values in that column is like '/ / '
I tried using replace it didn't work.
Can you give me a solution.
Regards
February 13, 2007 - 4:15 pm UTC
insufficient data
no example to work with or anything.
Problem in calling SQL Loader from Java
Karthik, March 01, 2007 - 8:13 am UTC
Hi,
I have a java class which calls SQL Loader.
Process p = Runtime.getRuntime().exec(wFullCommand);
try {
p.waitFor();
}
catch(InterruptedException e) {
System.out.print(e.getMessage());
}
This was working properly in Oracle 9i. When I migrate this code to 10g I'm getting an error message " end-of-file on communication channel".
But the same file, when loaded through the command line, works perfectly. I have checked the file paths and TNS entries for correctness. This Java class is called from an Oracle package.
Thanks,
Karthik
March 02, 2007 - 11:34 am UTC
why wouldn't you use an external table in both 9i and 10g - infinitely better error handling abilities, no need for java.
what is getting the 3113, your session? sqlldr? not very clear here what is happening.
Problem in calling SQL Loader from Java
Karthik, March 05, 2007 - 1:31 am UTC
Hi Tom,
Thanks for your response.
My session is throwing the 3113 Error. It is like this -
Oracle package calls a java class which inturn invokes SQL Loader. When SQL Loader is invoked with the same arguments on the command line, it works. But my session throws this error when invoked through the package.
Thanks,
Karthik
March 05, 2007 - 12:43 pm UTC
(external table is the correct answer here by the way, erase that java code, it is not only not necessary but the wrong approach)
3113 in your session would likely indicate "a bug" - please utilize support
but please - really - erase the code, we don't want or need it.
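For reference, the external-table shape of this load looks roughly like the sketch below — the directory path, file, table and column names are all invented. The point is that the load becomes plain SQL in your own session, with sqlldr-grade parsing but real error handling:

```sql
-- all names here are hypothetical
create directory load_dir as '/tmp/loads';

create table ext_data
( a varchar2(100),
  b varchar2(100)
)
organization external
( type oracle_loader
  default directory load_dir
  access parameters
  ( records delimited by newline
    fields terminated by ','
  )
  location ('data.csv')
);

insert into real_table select a, b from ext_data;
```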
Loading Status
Deepak, April 05, 2007 - 1:15 pm UTC
Hi Tom,
Have a requirement of monitoring the data loading status through SQL Loader by querying the Database. Am using Oracle 9i. Please let me know, if there are any SYS tables/views from where we can get the load status information?
April 05, 2007 - 1:47 pm UTC
define "loading status" for me.
Loading Status
Deepak, April 06, 2007 - 5:18 am UTC
Tom,
What I meant by loading status is the number of rows inserted in the table (committed/uncommitted) by a SQL*Loader session up to a certain point in time. The requirement is basically to develop a GUI which will display the number of rows loaded so far out of 'N' rows.
April 06, 2007 - 11:43 am UTC
that will not happen - sqlldr doesn't display that information. You would have to write your own loader process that called dbms_application_info to make that information available.
Or, you will process the "output" (console output) from sqlldr - it can show you the number of records processed.
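A sketch of the dbms_application_info approach — a hand-written loader that publishes its running row count, which the GUI can then poll from v$session_longops (the loop body and totals below are invented):

```sql
declare
  l_rindex binary_integer := dbms_application_info.set_session_longops_nohint;
  l_slno   binary_integer;
  l_total  number := 1000;  -- assumed: total rows expected in the file
begin
  for i in 1 .. l_total loop
    -- insert one row of the file here ...
    dbms_application_info.set_session_longops
      ( rindex    => l_rindex,
        slno      => l_slno,
        op_name   => 'my load',
        sofar     => i,
        totalwork => l_total,
        units     => 'rows' );
  end loop;
end;
/
-- the GUI polls:
-- select opname, sofar, totalwork
--   from v$session_longops where opname = 'my load';
```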
sqlldr converts µ to ¿
A reader, April 17, 2007 - 9:00 am UTC
Hi Tom,
When I try to load data using SQL*Loader, the character 'µ' gets converted to '¿'. The database character set is UTF8 and the database version is 10g on RAC. If I insert using TOAD or SQL*Plus interactively with insert into try values('µ'); I do not get any data conversion. The issue only appears when I run insert into try values('µ'); through sqlplus in a script, or use sqlldr.
Could you please help me to correct this problem?
Thanks
April 17, 2007 - 10:17 am UTC
sounds like your character set was not set
what is your NLS_LANG when running toad versus when you execute sqlldr.
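For example, on Unix you would set it in the environment before running sqlldr — the value below is an assumption; it must match the actual encoding of the data file and terminal:

```shell
# assumed character set; adjust to match your data file's real encoding
export NLS_LANG=AMERICAN_AMERICA.UTF8
# sqlldr user/password control=load.ctl   # run in this same shell session
echo "$NLS_LANG"
```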
SQLLDR and "case statement" or "case expression"?
Phil, April 20, 2007 - 6:28 pm UTC
Hi Tom
I have some data which, annoyingly, has a slightly inconsistent date format. Can I handle this with a column definition such as the following, which does not currently work?
load data infile 'test.dat' append into test_table fields terminated by '|' trailing nullcols (
END_TIME "case
when :END_TIME like '%/%/%/%' then
decode(:END_TIME,0,null,to_date(:END_TIME, 'DD/MM/YY/ HH24:MI:SS'))
else
decode(:END_TIME,0,null,to_date(:END_TIME, 'DD/MM/YY HH24:MI:SS'))
end",
and_so_on,
last_col)
April 20, 2007 - 8:09 pm UTC
guess you could use replace(:end_time, '/ ', ' ' ) but that would permit other invalid dates from creeping in.
what you have works - why not run with it?
can i add datafile name to each record?
nairs, November 02, 2007 - 4:40 am UTC
Hi Tom,
Thank you so much for being there for us.
I have a lot of files to upload into a single table using SQL*Loader. Each file is from a different date identified by the file name, but unfortunately the timestamp is not present as a field in the data file. The only way I can identify the timestamp is by the data file name, so I need the corresponding data file name as part of each record.
Is there anyway I can add the datafile name as a column value for each record for the file?
load data
infile 'xxxx010105.CSV' BADFILE 'xxxx010105.bad' DISCARDFILE 'xxxx010105.discard'
infile 'xxxx010205.CSV' BADFILE 'xxxx010205.bad' DISCARDFILE 'xxxx010205.discard'
....
infile 'xxxx100107.CSV' BADFILE 'xxxx100107.bad' DISCARDFILE 'xxxx100107.discard'
append into table stage_tab
fields terminated by "," optionally enclosed by '"'
(C1,C2,C3,C4,C5,C6)
Many Thanks.
Reader, March 07, 2008 - 10:20 am UTC
I need to load the below data into a table called test_load
create table test_load
(id number
,dt date
,code varchar2(1)
);
id,tdate,time,code
1,2008-02-28,09:20:30,'A'
2,2008-02-29,14:20:40,'B'
3,2008-03-04,12:10:30,'C'
4,2008-03-06,16:00:00,'D'
The file contains date and time as separate fields. I need to load them into one field in the table.
I used control file as shown below, but I got the error
SQL*Loader-291: Invalid bind variable :tdate in SQL string for column dt.
load data
INFILE '/home/data/test_load.csv'
BADFILE '/home/bad/test_load.bad'
DISCARDFILE '/home/discard/test_load.dis'
TRUNCATE
INTO TABLE test_load
fields terminated by ","
optionally enclosed by '"'
TRAILING NULLCOLS
( id
,tdate "to_date((:tdate||:time),'YYYY-MM-DD HH24:MI:SS')
,code char(1)
)
Can you please tell me how to concatenate two columns
in SQL*Loader control file?
adding the file name in the csv file before loading
ajay, April 09, 2008 - 12:32 am UTC
Hi tom ,
I want to add the data file name to each row in the csv file before it is loaded into the table.
Can you please tell me if it is possible to do this, and if so, how?
I am working with SQL*Loader on Unix.
April 09, 2008 - 2:48 pm UTC
you'd have to dynamically generate the ctl file and inject the filename into it. eg: echo the ctl file out from the shell script you are running to invoke sqlldr
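A sketch of that, assuming a staging table stage_tab with an extra SRC_FILE column (both names are assumptions); the commented sqlldr line shows where the load would run:

```shell
# generate a control file per data file, injecting the file name as a
# CONSTANT column; stage_tab and SRC_FILE are assumed names
f=xxxx010105.CSV
cat > gen.ctl <<EOF
load data
infile '$f'
append into table stage_tab
fields terminated by "," optionally enclosed by '"'
(C1,C2,C3,C4,C5,C6,
 SRC_FILE constant '$f')
EOF
# sqlldr user/password control=gen.ctl   # then load with the generated file
grep constant gen.ctl
```

Looping over the files and repeating the generate-then-load step gives every row its source file name.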
Sql Loader
John Christopher E., April 11, 2008 - 1:56 am UTC
Hi Tom,
Thanks in advance.
When I try to load the table from a text-format flat file through SQLLDR, it successfully loads all the rows into the table, but each column is stored with only the first character of the field, so my entire table contains just one character for every column loaded. A new text file is also created in the same location as my flat file, with the flat file's name, which contains the SQL*Loader log; the details are given below.
SQL*Loader: Release 9.2.0.1.0 - Production on Fri Apr 4 12:06:13 2008
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Control File: NameLDR.CTL
Data File: NameLDR.txt
Bad File: NameLDR.bad
Discard File: NameLDR.dsc
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Continuation: none specified
Path used: Direct
Table "HX_NAMES_LOOKUP_TMP", loaded from every logical record.
Insert option in effect for this table: INSERT
Column Name Position Len Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
COUNTRY_NM_FORMAT FIRST 1 CHARACTER
NAME NEXT 1 CHARACTER
NAME_INITIALS NEXT 1 CHARACTER
NAME_PREFIX NEXT 1 CHARACTER
NAME_SUFFIX NEXT 1 CHARACTER
NAME_ROYAL_PREFIX NEXT 1 CHARACTER
NAME_ROYAL_SUFFIX NEXT 1 CHARACTER
NAME_TITLE NEXT 1 CHARACTER
LAST_NAME_SRCH NEXT 1 CHARACTER
FIRST_NAME_SRCH NEXT 1 CHARACTER
LAST_NAME NEXT 1 CHARACTER
FIRST_NAME NEXT 1 CHARACTER
MIDDLE_NAME NEXT 1 CHARACTER
SECOND_LAST_NAME NEXT 1 CHARACTER
SECOND_LAST_SRCH NEXT 1 CHARACTER
NAME_AC NEXT 1 CHARACTER
PREF_FIRST_NAME NEXT 1 CHARACTER
PARTNER_LAST_NAME NEXT 1 CHARACTER
PARTNER_ROY_PREFIX NEXT 1 CHARACTER
LAST_NAME_PREF_NLD NEXT 1 CHARACTER
NAME_DISPLAY NEXT 1 CHARACTER
NAME_FORMAL NEXT 1 CHARACTER
COUNTRY_CD NEXT 1 CHARACTER
SEX NEXT 1 CHARACTER
Table "HX_NAMES_LOOKUP_TMP":
189 Rows successfully loaded.
0 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Bind array size not used in direct path.
Column array rows : 5000
Stream buffer bytes: 256000
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 189
Total logical records rejected: 0
Total logical records discarded: 0
Total stream buffers loaded by SQL*Loader main thread: 1
Total stream buffers loaded by SQL*Loader load thread: 0
Run began on Fri Apr 04 12:06:13 2008
Run ended on Fri Apr 04 12:06:14 2008
Elapsed time was: 00:00:00.30
CPU time was: 00:00:00.03
April 11, 2008 - 7:54 am UTC
ummm, the control file would be *really* useful don't you think?
That is where you told us what to do, and we are only doing what you told us - so I need to see that to tell you what you put in there wrong.
adding the datafile name to each record in the csv file
arunkumar, April 14, 2008 - 12:50 am UTC
Hi Tom,
I want to add the data file name to each record in the csv file before it is loaded into the table.
Is there any functionality in SQL*Loader which does this job?
More specifically, I am asking whether, by setting something in the ctl file for a column, it would automatically load the file name into the table while loading the csv file.
April 16, 2008 - 1:47 pm UTC
look up two reviews, the one by 'ajay'
Sql Loader
John Christopher E., April 21, 2008 - 5:36 am UTC
Hi Tom,
You needed the control file,
LOAD DATA
INFILE 'NameLDR.txt'
BADFILE 'NameLDR.bad'
DISCARDFILE 'NameLDR.dsc'
PRESERVE BLANKS
INTO TABLE "HX_NAMES_LOOKUP_TMP"
INSERT
(COUNTRY_NM_FORMAT,
NAME,
NAME_INITIALS,
NAME_PREFIX,
NAME_SUFFIX,
NAME_ROYAL_PREFIX,
NAME_ROYAL_SUFFIX,
NAME_TITLE,
LAST_NAME_SRCH,
FIRST_NAME_SRCH,
LAST_NAME,
FIRST_NAME,
MIDDLE_NAME,
SECOND_LAST_NAME,
SECOND_LAST_SRCH,
NAME_AC,
PREF_FIRST_NAME,
PARTNER_LAST_NAME,
PARTNER_ROY_PREFIX,
LAST_NAME_PREF_NLD,
NAME_DISPLAY,
NAME_FORMAL,
COUNTRY_CD,
SEX)
April 23, 2008 - 4:45 pm UTC
you do not describe the input file with things like:
...
fields terminated by ","
optionally enclosed by '"'
.....
so, by default, everything is a char(1), you got what you asked for - since you didn't ask us to look for something to delimit the fields.
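For example, adding one clause fixes it — the delimiter below is a guess; use whatever actually separates the fields in NameLDR.txt:

```text
LOAD DATA
INFILE 'NameLDR.txt'
BADFILE 'NameLDR.bad'
DISCARDFILE 'NameLDR.dsc'
PRESERVE BLANKS
INTO TABLE "HX_NAMES_LOOKUP_TMP"
INSERT
FIELDS TERMINATED BY ','
(COUNTRY_NM_FORMAT,
 NAME,
-- ... remaining columns exactly as before ...
 SEX)
```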
Why decimal portion not getting loaded properly ?
Parag J Patankar, May 21, 2008 - 6:33 am UTC
Hi,
I am trying upload data using sqlldr ( sql loader ) in Oracle 9.2 in AIX, Details are
Table
drop table parag;
CREATE TABLE parag
(
source_sys char (00006) not null,
portfolio_FO char (00006) not null,
prod_id char (00016) not null,
V00230 char (00003) not null,
inst_id char (00008) not null,
V09490F char (00016) not null,
V09060 number (15,2) not null,
V09040 char (00008) not null,
V09000 char (00008) not null
);
GRANT ALL ON parag TO PUBLIC;
Control File
load data
infile '$A2_FIC/parag.dat'
replace
into table parag
fields terminated by ',' trailing nullcols
(
source_sys,
portfolio_FO,
prod_id,
V00230,
inst_id,
V09490F,
V09060 decimal external,
V09040,
V09000
)
Data
F01,1517,BROK4,INO,FXSPT,0,-182867.10,20080331,20080331
F02,7822,BROK4,INO,FXSPT,0,-1765629.15,20080331,20080331
F03,1515,STFXSPTFWD,EUR,FXD,48486608,-4652.20,20080331,20080331
F04,1515,STFXSPTFWD,INO,FXD,48486608,274003.25,20080331,20080331
F05,1515,STFXSPTFWD,INO,FXD,48485733,1504906.30,20080331,20080331
F06,1515,STFXSPTFWD,INO,FXD,48549114,7498008.35,20080331,20080331
F07,1515,STFXSPTFWD,INO,FXD,48548049,40765000.40,20080331,20080331
F08,1515,STFXSPTFWD,INO,FXD,48485744,80125030.45,20080331,20080331
F09,1515,STFXSPTFWD,INO,FXD,48552707,-2768610.50,20080331,20080331
F10,1515,STFXSPTFWD,INO,FXD,47923730,-6268500.55,20080331,20080331
F11,1515,STFXSPTFWD,INO,FXD,48179724,-9870000.60,20080331,20080331
F12,1515,STFXSPTFWD,INO,FXD,47404500,-13656000.65,20080331,20080331
But my result is showing,
SQL> /
SOURCE V09060
------ ----------
F01 -182867.1
F02 -1765629.2
F03 -4652.2
F04 274003.25
F05 1504906.3
F06 7498008.35
F07 40765000.4
F08 80125030.5
F09 -2768610.5
F10 -6268500.6
F11 -9870000.6
F12 -13656001
12 rows selected.
Questions
1/ My table column v09060 is 15,2 is this short ? ( looks like ) but not understood for e.g. F02, v09060 data including decimal point is 10 while in my table it is 15,2
2/ How to overcome this problem ? without giving V09060 "to_number(:v09060,'999999999999.99')", in control file
Pl guide me.
thanks & regards
parag
May 21, 2008 - 8:57 am UTC
1) i did not understand this question. "is this short?" ????? I don't know what you are trying to ask.
number(15,2) is 15 digits of precision with up to 2 digits on the right of the decimal.
2) i don't know what the "problem" is here.
everything I see looks correct - you have exactly what you loaded?
sqlloader
Parag J Patankar, May 21, 2008 - 9:53 am UTC
Hi Tom,
Sorry for the unclear question. I want to ask: when I am loading data through SQL*Loader, why does the decimal portion (scale) get truncated, while the length of the data is only 12?
SQL> select length(-13656000.65) from dual;
LENGTH(-13656000.65)
--------------------
12
But in the table I had defined it as (15,2). So if I want to insert values without rounding off / truncation, what should I define as my data type's precision and scale?
thanks & regards
parag j patankar
May 21, 2008 - 10:26 am UTC
that is just sqlplus's default numformat (show numformat)
ops$tkyte%ORA9IR2> select x, to_char(x,'9999999999999.99') xx from t;
X XX
---------- -----------------
-182867.1 -182867.10
-1765629.2 -1765629.15
-4652.2 -4652.20
274003.25 274003.25
1504906.3 1504906.30
7498008.35 7498008.35
40765000.4 40765000.40
80125030.5 80125030.45
-2768610.5 -2768610.50
-6268500.6 -6268500.55
-9870000.6 -9870000.60
-13656001 -13656000.65
12 rows selected.
the data is "all there", you are just seeing a report with a number format applied.
use to_char or set numformat to what you want.
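In SQL*Plus that is:

```sql
-- session-wide default numeric display width
set numformat 999999999999990.99

-- or format just the one column in this report
column v09060 format 999999999999990.99
select source_sys, v09060 from parag;
```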
Pass user-defined parameters to control file
David Shen, May 29, 2008 - 6:41 pm UTC
Hi Tom,
I would like to suggest Oracle add this feature to the next version of SQL*LOADER. The SQL*LOADER executable can read both the command line and the control file, so it won't be difficult to make "pass user-defined parameters to control file through command line" possible.
Summary of the feature wanted from the next version of SQL*LOADER:
1) user-defined parameters can be accepted on the command line,
2) control file can read any pre-defined or user-defined parameters on the command line,
3) control file can use the values of any parameters to populate table columns.
If you could, please pass my suggestion along to the guys who are developing the next version of SQL*LOADER.
Thank you so much.
David Shen
May 29, 2008 - 10:03 pm UTC
the only way to get enhancement requests in is via metalink. That is the only way - you use an iTar.
Reader, June 06, 2008 - 11:18 am UTC
Tom,
Can you tell me how to load data only when a certain field in the data file is not null?
The WHEN clause expects <> or =.
Is there any way to specify a NOT NULL condition?
June 09, 2008 - 10:25 am UTC
eh? not sure at all what you mean.
but not very well
jamshade, June 11, 2008 - 4:47 am UTC
It is my second day in the Access class. I studied the database assistant course and passed it successfully, and now I am starting the new Access technician class. In this class we are reading about the SQL system, which up to now I have not caught on to very well, but as I can see it is very important. But not very well????????
Load selected No of rows
krish, July 10, 2008 - 9:51 am UTC
Hi Tom,
I am a regular reader of this forum. Can you please tell me how I can load a specific number of rows through SQL*Loader? Just as SKIP is used to skip the header, is there a method by which I can load only the top 5 rows? In that case, what would the .ctl file be?
Thanks in advance
July 10, 2008 - 10:24 am UTC
there is skip to skip over the first N records.
there is load - to load N records
if you wanted the first 5, you would use "load=5"
if you wanted 6-10, you would skip=5 load=5
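As command lines (credentials and control file name are placeholders):

```text
sqlldr scott/tiger control=t.ctl load=5           # rows 1-5
sqlldr scott/tiger control=t.ctl skip=5 load=5    # rows 6-10
```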
Total logical records read: 1
karthikeyan T M, May 05, 2009 - 10:59 am UTC
thanks a lot for your comments. It is really useful to fix the issue which we have faced. Thank you
Loading of data
anupam pandey, May 18, 2009 - 9:10 am UTC
We have a control file which is as follows
LOAD DATA
APPEND INTO TABLE SE_ROLLUPS_TMP7
FIELDS TERMINATED BY "^A"
(
TIME_ID ,
ENGINE_ID ,
KEY_ID char(2000) "substr(:KEY_ID,1,255)"
)
In Table SE_ROLLUPS_TMP7 ,column KEY_ID is varchar2(300).
While loading the data we got an error saying
Record 1: Rejected - Error on table SE_ROLLUPS_TMP7, column KEY_ID.
Field in data file exceeds maximum length
When we investigated, we found that the length of the key_id field in the data file was 2384, while the buffer specified in the control file was only 2000. When we increased the buffer size to varchar(3000), the data got loaded.
The problem here is that we don't know what the length of that column will be.
So the question is: is there any setting by which we can tell Oracle to read the whole field and take the substring portion for loading?
Thanks And Regards,
Anupam Pandey
May 23, 2009 - 11:17 am UTC
... The problem here is we dont know what will be the lenth of that column . ...
then you, as data processing professionals, need to get an understanding of the data you are tasked with processing???
you can say char(some_big_number), but if you don't know, how can we know? We are software, we must be told what to do.
Or you might be able to do something like this
ops$tkyte%ORA10GR2> create table se_rollups_tmp7
2 ( time_id number,
3 engine_id number,
4 key_id varchar2(255)
5 )
6 /
Table created.
ops$tkyte%ORA10GR2> !cat test.ctl
LOAD DATA
APPEND INTO TABLE SE_ROLLUPS_TMP7
FIELDS TERMINATED BY "^A"
(
TIME_ID ,
ENGINE_ID ,
KEY_ID position(1:500) "substr(:key_id,instr(:key_id,'^A',1,2)+2, 255)"
)
ops$tkyte%ORA10GR2>
ops$tkyte%ORA10GR2> !sqlldr / test data=test.dat
SQL*Loader: Release 10.2.0.4.0 - Production on Sat May 23 10:55:20 2009
Copyright (c) 1982, 2007, Oracle. All rights reserved.
Commit point reached - logical record count 2
ops$tkyte%ORA10GR2> select time_id, engine_id, substr(key_id,1,20), length(key_id), substr(key_id,length(key_id)-5) from se_rollups_tmp7;
TIME_ID ENGINE_ID SUBSTR(KEY_ID,1,20) LENGTH(KEY_ID)
---------- ---------- -------------------- --------------
SUBSTR(KEY_ID,LENGTH(KEY_ID)-5)
-------------------------------------------------------------------------------
1 2 hello world 11
world
3 4 hello world123456789 255
901234
I have not read any of your books, but I desperately require help
Sandip Sen Majumder, April 13, 2010 - 8:13 am UTC
this is my table
create table crimes.veho (
date_time varchar2(20),
smac varchar2(20),
dmac varchar2(20),
etype varchar2(8) ,
sip varchar2(20),
dip varchar2(20),
sport number(10),
dport number(10),
sno number(15),
psize number(10),
protocol varchar2(50),
sid number(10),
status varchar2(10),
v_data varchar2(3500))
tablespace vehere;
and this is control file
SQL*Loader: Release 9.0.1.1.1 - Production on Tue Apr 13 18:36:09 2010
(c) Copyright 2001 Oracle Corporation. All rights reserved.
Control File: C:\vfone_crimes.ctl
Data File: C:\xmlDir\130410034724.txt
Bad File: C:\130410034724.bad
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table VEHO, loaded from every logical record.
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
DATE_TIME FIRST * | O(,) CHARACTER
SMAC NEXT * | O(,) CHARACTER
DMAC NEXT * | O(,) CHARACTER
ETYPE NEXT * | O(,) CHARACTER
SIP NEXT * | O(,) CHARACTER
DIP NEXT * | O(,) CHARACTER
SPORT NEXT * | O(,) CHARACTER
DPORT NEXT * | O(,) CHARACTER
SNO NEXT * | O(,) CHARACTER
PSIZE NEXT * | O(,) CHARACTER
PROTOCOL NEXT * | O(,) CHARACTER
SID NEXT * | O(,) CHARACTER
STATUS NEXT * | O(,) CHARACTER
V_DATA NEXT * | O(,) CHARACTER
Record 1: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 2: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 3: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 4: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 5: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 6: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 7: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 8: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 9: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 10: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 11: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 12: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 13: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 14: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 15: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 16: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 17: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 18: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 19: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 20: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 21: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 22: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 23: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 24: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 25: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 26: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 27: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 28: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 29: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 44: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 65: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 69: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 75: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 77: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 83: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 84: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 85: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 86: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 87: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 88: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 91: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 92: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 93: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 94: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 95: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 96: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 97: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 98: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 99: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 100: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
Record 101: Rejected - Error on table VEHO, column V_DATA.
Field in data file exceeds maximum length
MAXIMUM ERROR COUNT EXCEEDED - Above statistics reflect partial run.
Table VEHO:
50 Rows successfully loaded.
51 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 231168 bytes(64 rows)
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 101
Total logical records rejected: 51
Total logical records discarded: 0
Run began on Tue Apr 13 18:36:09 2010
Run ended on Tue Apr 13 18:36:09 2010
Elapsed time was: 00:00:00.13
CPU time was: 00:00:00.04
How can I solve this problem? Please help me out.
April 13, 2010 - 9:31 am UTC
hah, here we go:
I haven't done anything to research my problem.
Here is my table (that is good)
Here is my control file (followed by NOT a control file...)
anyway, pretty trivial:
date_time varchar2(20),
..
v_data varchar2(3500)
date_time in a string???? wow. I see lots of problems in your life.
But the problem is simple: v_data is 3,500 bytes, while the default length of a sqlldr character field is 255. You appear to have data that exceeds that 255-byte default.
therefore, in your control file (not your log file) use:
....
status,
v_data char(3500)
)
......
tell sqlldr how big it is.
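Putting that together, a minimal control file illustrating the fix might look like the following. The file name and the columns other than v_data are assumptions pieced together from the fragments above:

```
LOAD DATA
INFILE 'veho.dat'
APPEND
INTO TABLE veho
FIELDS TERMINATED BY ','
(
  date_time,
  status,
  v_data char(3500)   -- override the 255-byte default for this wide column
)
```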
Strange problem with SQL*Loader ...
Bob, April 16, 2010 - 6:17 pm UTC
Hi Tom - hope you are well. I have this strange problem with sql*loader. My setup on Oracle 10g is as follows:
i) create table y (a varchar2(100), b varchar2(100) );
ii)create table y_good(a number, b number);
iii)create table y_bad(a number, b number);
DROP TRIGGER Y_TRG;
CREATE OR REPLACE TRIGGER Y_TRG
BEFORE INSERT OR UPDATE
ON Y FOR EACH ROW
WHEN (
new.a is not null and new.b is not null
)
declare
l number;
b_exc exception;
begin
if inserting then
if y_fn(:new.a, :new.b) = 'N'
then
raise b_exc;
else
y_proc('Y');
end if;
end if;
exception
when b_exc then
y_proc('N');
raise_application_error(-20101, 'Error: b_exc help!!');
when others then raise_application_error(-20102, SQLERRM);
end;
/
CREATE OR REPLACE function y_fn(a number, b number) return varchar2 is
begin
if a+b > 4 then return 'Y'; else return 'N'; end if;
end;
/
CREATE OR REPLACE procedure y_proc(p_flag in varchar2)
is
PRAGMA AUTONOMOUS_TRANSACTION;
begin
if p_flag = 'Y' then
insert into y_good values (2,2);
else
insert into y_bad values(3,3);
end if;
commit;
end;
/
Now if I run SQL*Loader to populate table y as follows:
b.ctl is:
OPTIONS (READSIZE=4586000, BINDSIZE=4586000, ROWS=1000, SILENT=DISCARDS, DISCARDMAX=50 )
LOAD DATA
INFILE b1.dat
APPEND
INTO TABLE y
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
a char(100),
b char(100)
)
b1.dat is:
1,1
2,6
2,8
SQL*Loader command is:
sqlldr user/passwd control=b.ctl
b.log gives me:
SQL*Loader: Release 9.2.0.8.0 - Production on Sat Apr 17 01:10:05 2010
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Control File: b.ctl
Data File: b1.dat
Bad File: b1.bad
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 1000 rows, maximum of 4586000 bytes
Continuation: none specified
Path used: Conventional
Silent options: DISCARDS
Table Y, loaded from every logical record.
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
A FIRST 100 , O(") CHARACTER
B NEXT 100 , O(") CHARACTER
Record 1: Rejected - Error on table Y.
ORA-20101: Error: b_exc help!!
ORA-06512: at "Y_TRG", line 18
ORA-04088: error during execution of trigger 'Y_TRG'
Table Y:
2 Rows successfully loaded.
1 Row not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 204000 bytes(1000 rows)
Read buffer bytes: 4586000
Total logical records skipped: 0
Total logical records read: 3
Total logical records rejected: 1
Total logical records discarded: 0
Run began on Sat Apr 17 01:10:05 2010
Run ended on Sat Apr 17 01:10:08 2010
Elapsed time was: 00:00:02.68
CPU time was: 00:00:00.01
Now I select from each of the tables :-)
SQL> select * from y_good;
A B
---------- ----------
2 2
2 2
2 rows selected.
correct ..
SQL> select * from y;
A
---------------------------------------
B
---------------------------------------
2
6
2
8
2 rows selected.
correct ...
And ...
SQL> select * from y_bad;
A B
---------- ----------
3 3
3 3
2 rows selected.
wrong ...
Why do I get two records in y_bad when I am expecting only one!?
April 17, 2010 - 12:44 pm UTC
because you make assumptions about how many times a trigger can or should fire and you use the most evil of features (well, two of them)
o autonomous transactions, which by definition break the rules of transactions - goodbye ACID properties.
o triggers.
http://asktom.oracle.com/Misc/something-different-part-i-of-iii.html
http://asktom.oracle.com/Misc/part-ii-seeing-restart.html
http://asktom.oracle.com/Misc/part-iii-why-is-restart-important-to.html
Truth be told, regardless of how good a programmer you are, if this code was submitted to me for review in a code review, I would reject it out of hand - it would not be considered, it would be erased. Period.
Do things in straightforward, concise, easy to understand ways, not through trickery and magic. Triggers are trickery and magic - coupled with autonomous transactions they are black magic.
I hate this code:
exception
when b_exc then
y_proc('N');
raise_application_error(-20101, 'Error: b_exc help!!');
when others then raise_application_error(-20102, SQLERRM);
end;
can you explain the when others? Why is it there - other than to hide the REAL line number the error occurred on from view?
SQL loader
A reader, April 18, 2010 - 5:47 am UTC
What a response, thanks - lesson learnt! I don't profess to be a good programmer - only to be a better one from using this site! The second link was pretty useful ...
So is it possible to do checking before loading records by calling a function in SQL*Loader, without storing the result?
I see your point about the "others exception". We don't get the line number ...So under what circumstances "should" we use others?
April 19, 2010 - 8:46 am UTC
... So is it possible to do checking before loading records by calling a function
in SQL*Loader but not storing the result. ...
sure, you can call functions right in the sql statement.
See my first original example - I show a control file that calls "upper", you could write your own plsql functions that take the data as input, validate it, and either
a) fail because data is bad
b) succeed and return data because it is good
... So under what circumstances "should" we use others?
..
Almost never. If you ask me you would use it ONLY at the top level in a call to the database (so a client might use it to call a stored procedure) and ONLY to log the error/translate the error
Eg: a client wants to call procedure P, they could:
begin p; end;
but that would return things like "ora-1555 snapshot too old" to the end user and no one in the administration team would know about it.
or they could:
begin
p;
exception
when others then
log_error( ..... ); -- this is an autonomous transaction, the only valid
-- use of an autonomous transaction, log error is
-- autonomous<b>
RAISE_application_error( -20002, 'an unrecoverable error has occurred and the development team has been notified, please try again later' );</b>
end;
When Others
Gaurav Sachan, April 19, 2010 - 3:53 am UTC
"When Others" is just a generic exception handler for any situation which has not been handled explicitly beforehand.
April 19, 2010 - 8:55 am UTC
and should typically NEVER be used.
Rajj, August 27, 2010 - 1:40 am UTC
Hi Tom,
Thanks as usual for your great help and for your university
Please help me, how to proceed in the below scenario.
I have a scenario where my client has shared a huge amount of data to load into a table. The data is divided into separate sheets in the same Excel file, which has 52 sub-sheets.
Is there a way for SQL*Loader to load all the sub-sheets' data into the table? I am using Oracle 10g Release 2.
Regards,
Raj
September 07, 2010 - 7:39 am UTC
you won't be using sqlldr, you'll have to get the data out of the spreadsheet into a normal file before sqlldr can touch it.
and if it is in a spreadsheet - you have by definition a teeny tiny bit of data to load, it cannot be huge, spreadsheets are extremely limited in size.
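Once each worksheet has been exported to its own CSV file (Excel's "Save As" only exports the active sheet, so this must be done per sheet or with a conversion tool), a small shell loop can feed them to sqlldr one at a time. The file names and the connect string below are assumptions; the loop prints the commands so they can be inspected first:

```shell
# hypothetical exported sheets: sheet_01.csv, sheet_02.csv, ... sheet_52.csv
for f in sheet_01.csv sheet_02.csv; do
  log="${f%.csv}.log"      # one log file per sheet
  # drop the leading "echo" once the commands look right
  echo sqlldr user/pass control=load.ctl data="$f" log="$log"
done
```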
How to create a new record at every occurrence of the character " ' " using SQL*Loader
Defri, October 06, 2010 - 2:24 am UTC
Dear Tom,
I have data from text file like this :
UNB+UNOA:2+TPS+MPN+101005:1712+1'UNH+1+CODECO:D:95 B:UN:ITG14'BGM+34++9'TDT+20+1057+1++MPN:172:20+++YH FB:103::KINTAMANI'RFF+VON:1057'LOC+9+IDSUB:139:6+TPS :GAT:ZZZ'DTM+178:190012312300:203'
I want to insert that data into a table using SQL*Loader like this:
UNB+UNOA:2+TPS+MPN+101005:1712+1'
UNH+1+CODECO:D:95B:UN:ITG14'
BGM+34++9'TDT+20+1057+1++MPN:172:20+++YHFB:103::KI NTAMANI'
RFF+VON:1057'LOC+9+IDSUB:139:6+TPS:GAT:ZZZ'
DTM+178:190012312300:203'
Every time the character " ' " is found, I want to create a new record.
Please help me.
Thanks for your support.
Regards,
Defry
October 06, 2010 - 5:03 am UTC
using the STR attribute of the INFILE clause, you can specify what the "end of line" character is.
ops$tkyte%ORA11GR2> select to_char( ascii(''''), 'xx' ) from dual;
TO_
---
27
ops$tkyte%ORA11GR2> !cat t.dat
UNB+UNOA:2+TPS+MPN+101005:1712+1'UNH+1+CODECO:D:95B:UN:ITG14'BGM+34++9'TDT+20+1057+1++MPN:172:20+++YH FB:103::KINTAMANI'RFF+VON:1057'LOC+9+IDSUB:139:6+TPS:GAT:ZZZ'DTM+178:190012312300:203'
ops$tkyte%ORA11GR2> desc t
Name Null? Type
------------------------------------ -------- -------------------------
DATA VARCHAR2(4000)
ops$tkyte%ORA11GR2> !cat t.ctl
LOAD DATA
INFILE t.dat "str X'27'"
INTO TABLE t
REPLACE
(data char(1000))
ops$tkyte%ORA11GR2> !sqlldr / t
SQL*Loader: Release 11.2.0.1.0 - Production on Wed Oct 6 06:02:55 2010
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Commit point reached - logical record count 7
ops$tkyte%ORA11GR2> select * from t;
DATA
------------------------------------------------------------------------
UNB+UNOA:2+TPS+MPN+101005:1712+1
UNH+1+CODECO:D:95B:UN:ITG14
BGM+34++9
TDT+20+1057+1++MPN:172:20+++YH FB:103::KINTAMANI
RFF+VON:1057
LOC+9+IDSUB:139:6+TPS:GAT:ZZZ
DTM+178:190012312300:203
7 rows selected.
sql loader
A reader, October 11, 2010 - 10:46 pm UTC
Hi Tom,
I have some very basic questions to ask.
1) I heard that in a serial direct load the data is inserted beyond the high water mark. Can you please explain the reason behind this?
2) Does the SQL*Loader utility also employ the same method of inserting beyond the HWM?
3) A direct path load in SQL*Loader bypasses the buffer cache. I can understand that skipping the cache saves time, but how can it find and save to the exact table where the data is to be loaded? What is internally working here exactly?
October 12, 2010 - 7:23 am UTC
1) because direct path loads always write above the high water mark by definition - they cannot reuse existing free space. A direct path load gets a new - totally empty block - fills it up and writes it directly (hence the name) to disk, bypassing the buffer cache and all. It is designed for high speed, bulk loads of data - walking freelists for every row to be inserted (as a conventional path insert does...) would be a waste of time. The assumption is
o you have a ton of data
o that you want to load
o as efficiently as possible
o in parallel or not - doesn't matter to us (only DIRECT matters)
2) sqlldr doesn't dictate this, sqlldr is just asking the server to "direct path load" this data - just like create table as select does, or alter table t move, or insert /*+ APPEND */, or your own custom OCI program could.
the database direct path load always loads above the high water mark by design.
3) huh? You are doing the direct load, you have made the request, you already KNOW the table. I didn't understand the meaning of this question.
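The same above-the-HWM behavior can be seen with a plain direct path insert. This is only a sketch with assumed table names:

```sql
-- big_t and staging_t are hypothetical; /*+ APPEND */ requests a direct path insert
insert /*+ APPEND */ into big_t select * from staging_t;
-- the new blocks were formatted in memory and written above the high water mark,
-- bypassing the buffer cache; the session must commit before it can query big_t
-- (otherwise ORA-12838 is raised)
commit;
```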
sqlloader
A reader, October 12, 2010 - 9:00 am UTC
Hi Tom,
I mean, what is happening internally in a direct path load, even though we know which table we have to insert the data into? How can it bypass the DB cache? I can see the benefit, but I couldn't work out the method of bypassing it.
October 12, 2010 - 9:20 am UTC
the server code does the load, your dedicated or shared server would
a) receive the data to insert from the client
b) use that stuff from (a) to fill up a piece of memory with the data
c) write that data to the data files instead of putting it into the cache.
direct load
A reader, October 12, 2010 - 10:21 pm UTC
Hi Tom,
Is there any difference between direct load (insert /*+append*/ into tablename nologging ....) and direct path load?
Does sqlldr perform a direct path load or a conventional path load by default?
October 13, 2010 - 6:51 am UTC
if the insert /*+ append */ can be done (eg: no triggers, foreign keys - all of the requirements for a direct path load are met) then it is a direct path load - yes.
sqlldr by default does a conventional path load (normal insert, reuses existing freespace, uses the buffer cache). You have to go out of your way to use a direct path load with sqlldr (direct=y)
sql loader
A reader, October 18, 2010 - 9:04 am UTC
Hi Tom,
You said, "You have to go out of your way to use a direct path load with sqlldr (direct=y)".
Does it mean we get all the properties of a direct path load with the direct=y clause in the sqlldr command? i.e., can we load the data directly into the datafiles through the "sqlldr" utility with the direct=y clause?
October 25, 2010 - 8:55 am UTC
Data loading question
Katja, December 21, 2010 - 5:52 pm UTC
Hi Tom,
I am having a problem loading some data. The table I'm loading into is as follows:
NFI_PLOT NUMBER(7,0)
POLY_ID VARCHAR2(20 BYTE)
SAMPLE_DATE DATE
LANDUSE_NUM NUMBER(1,0)
INFO_DATE DATE
INFO_SOURCE CHAR(1 BYTE)
POLY_AREA NUMBER(9,6)
LANDUSE VARCHAR2(4 BYTE)
RAW_PACK_ID_NUM NUMBER(7,0)
... plus some other fields that will be filled later.
My control file is as follows:
load data
infile '/dwarfs/fog/data/on_great_lakes_plus/on_pp_lu.csv'
append
into table nfi_pptmp.tmp_pp_landuse
fields terminated by ","
optionally enclosed by '"'
trailing nullcols
(nfi_plot,
poly_id,
sample_date,
landuse_num,
info_date,
info_source,
poly_area,
landuse char,
raw_pack_id_num constant 992)
My data is as follows (created in excel, but moved to unix and ran dos2unix on it):
1397701,1397701_1,30-Jun-06,1,1-Jul-95,I,0.492,CONU
1425231,1425231_1,29-Jul-09,1,29-Jul-09,O,400.116574,UNK
1425226,1425226_1,29-Jul-09,1,29-Jul-09,O,400.069482,UNK
1418356,1418356_1,29-Jul-09,1,29-Jul-09,O,400.186724,UNK
1418351,1418351_1,29-Jul-09,1,29-Jul-09,O,400.148144,UNK
1418346,1418346_1,29-Jul-09,1,29-Jul-09,O,400.104683,UNK
1411476,1411476_1,29-Jul-09,1,29-Jul-09,O,400.212071,UNK
1411471,1411471_1,29-Jul-09,1,29-Jul-09,O,400.17711,UNK
1404601,1404601_1,29-Jul-09,1,29-Jul-09,O,400.261212,UNK
1404596,1404596_1,29-Jul-09,1,29-Jul-09,O,400.234782,UNK
1397721,1397721_1,29-Jul-09,1,29-Jul-09,O,400.277685,UNK
1397716,1397716_1,29-Jul-09,1,29-Jul-09,O,400.25486,UNK
1397701,1397701_10,30-Jun-06,1,1-Jul-95,I,250.195,UNK
1397701,1397701_11,30-Jun-06,1,1-Jul-95,I,1.623,CONU
The log file reports the following:
SQL*Loader: Release 9.2.0.1.0 - Production on Tue Dec 21 14:46:11 2010
Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
Control File: lload_on_lu.ctl
Data File: /dwarfs/fog/data/on_great_lakes_plus/on_pp_lu.csv
Bad File: on_pp_lu.bad
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table NFI_PPTMP.TMP_PP_LANDUSE, loaded from every logical record.
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
NFI_PLOT FIRST * , O(") CHARACTER
POLY_ID NEXT * , O(") CHARACTER
SAMPLE_DATE NEXT * , O(") CHARACTER
LANDUSE_NUM NEXT * , O(") CHARACTER
INFO_DATE NEXT * , O(") CHARACTER
INFO_SOURCE NEXT * , O(") CHARACTER
POLY_AREA NEXT * , O(") CHARACTER
LANDUSE NEXT * , O(") CHARACTER
RAW_PACK_ID_NUM CONSTANT
Value is '992'
Record 1: Rejected - Error on table NFI_PPTMP.TMP_PP_LANDUSE, column LANDUSE.
ORA-01401: inserted value too large for column
Record 14: Rejected - Error on table NFI_PPTMP.TMP_PP_LANDUSE, column LANDUSE.
ORA-01401: inserted value too large for column
Record 15: Rejected - Error on table NFI_PPTMP.TMP_PP_LANDUSE, column LANDUSE.
ORA-01401: inserted value too large for column
etc.
Values in LANDUSE of 4 characters are deemed 'too large', but the table I'm loading into has LANDUSE as varchar2(4 bytes). Do I need to modify my table to be able to load the records with LANDUSE values of CONU or AGRB etc.? Or could there be something else that I've missed?
Thank you very much! If you've written about this elsewhere, please forgive me for posting here, and I would really appreciate being directed to that location.
Katja
December 22, 2010 - 1:57 pm UTC
what is your character set - is it a multibyte character set - does it take more than 4 bytes to store your data in that character set.
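To check the database side of that question, something like the following should work (the client-side character set comes from the NLS_LANG environment setting, not a query):

```sql
-- if the database character set is multibyte (e.g. AL32UTF8), a VARCHAR2(4 BYTE)
-- column may hold fewer than 4 characters of multibyte data
select value
  from nls_database_parameters
 where parameter = 'NLS_CHARACTERSET';
```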
Problems loading data - fixed!
A reader, December 22, 2010 - 9:17 am UTC
Hi Tom,
I forgot to mention that the table in the database is set (not changeable) and that the data it already contains does include 4-character LANDUSE values.
I thought about it overnight, and looked up some things, and decided I could try to select a substring (first 4 characters) of the LANDUSE field as I'm loading it - I did this and it worked. Thanks for listening!
Katja : )
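For reference, the substring approach Katja describes can be expressed directly in the control file. Only the LANDUSE line changes; the rest mirrors the control file above (note substr counts characters, so this assumes a single-byte character set):

```
...
poly_area,
landuse "substr(:landuse, 1, 4)",
raw_pack_id_num constant 992)
```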
How to use SQL*LOADER to load data in |SMITH|ALFRED| format
phoenixbai, December 28, 2010 - 2:33 am UTC
The data I need to upload to a table using SQL*Loader is in the format below:
|AD |Argentina |
|CN |China |
|US |America |
|GB |England |
so how should I write my control file to read the data into two columns table?
I googled, and people say to use enclosed fields with the delimiter (|). I tried with the below, but it is not working:
LOAD DATA
INFILE *
insert
INTO TABLE tmp_country_mapping
FIELDS ENCLOSED BY '|'
(country_id, country_name)
BEGINDATA
|AD |Argentina |
|CN |China |
|US |America |
|GB |England |
And I don't want the trailing spaces after the value of the second column either.
December 28, 2010 - 10:17 am UTC
you wrote: "but not working"
I'll write: "but my car won't start"
do you understand how vague "but not working" is?
Your fields are terminated by | - not really enclosed by - otherwise you don't have any field terminators. So, we'll just say "terminated by" and skip the "first field" with a filler and trim anything we want.
ops$tkyte%ORA11GR2> truncate table tmp_country_mapping;
Table truncated.
ops$tkyte%ORA11GR2> desc tmp_country_mapping
Name Null? Type
---------------------------------------- -------- ----------------------------
COUNTRY_ID VARCHAR2(20)
COUNTRY_NAME VARCHAR2(30)
ops$tkyte%ORA11GR2> !cat t.ctl
LOAD DATA
INFILE *
insert
INTO TABLE tmp_country_mapping
FIELDS terminated BY '|'
(
X FILLER,
country_id "trim(:country_id)",
country_name "trim(:country_name)"
)
BEGINDATA
|AD |Argentina |
|CN |China |
|US |America |
|GB |England |
ops$tkyte%ORA11GR2> !sqlldr / t
SQL*Loader: Release 11.2.0.2.0 - Production on Wed Dec 29 02:51:05 2010
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Commit point reached - logical record count 4
ops$tkyte%ORA11GR2> select '"' || country_id || '"', '"'||country_name||'"' from tmp_country_mapping;
'"'||COUNTRY_ID||'"' '"'||COUNTRY_NAME||'"'
---------------------- --------------------------------
"AD" "Argentina"
"CN" "China"
"US" "America"
"GB" "England"
Thank you so much :D
PhoenixBai, December 28, 2010 - 7:18 pm UTC
Thank you Tom very much.
It works perfectly.
Sorry for being 'vague', won't happen again!
And wish you a great day!!!
WHEN clause restriction.
Rajeshwaran, Jeyabal, January 29, 2011 - 10:23 am UTC
Here is the Control file along with data.
LOAD DATA
infile *
APPEND
INTO TABLE T
WHEN (Y = '2')
FIELDS TERMINATED BY ',' TRAILING NULLCOLS
(
X ,
Y
)
INTO TABLE T
WHEN (Y = '4')
FIELDS TERMINATED BY ',' TRAILING NULLCOLS
(
X ,
Y
)
begindata
1,2
3,4
5,6
7,0
rajesh@10GR2>
rajesh@10GR2> select * from t;
X Y
---------- ----------
1 2
Elapsed: 00:00:00.01
rajesh@10GR2>
Log file contents
Control File: ctl_1.txt
Data File: ctl_1.txt
Bad File: ctl_1.bad
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table T, loaded when Y = 0X32(character '2')
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
X FIRST * , CHARACTER
Y NEXT * , CHARACTER
Table T, loaded when Y = 0X34(character '4')
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
X NEXT * , CHARACTER
Y NEXT * , CHARACTER
Record 2: Discarded - failed all WHEN clauses.
Record 3: Discarded - failed all WHEN clauses.
Record 4: Discarded - failed all WHEN clauses.
Table T:
1 Row successfully loaded.
0 Rows not loaded due to data errors.
3 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Table T:
0 Rows successfully loaded.
0 Rows not loaded due to data errors.
4 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 66048 bytes(64 rows)
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 4
Total logical records rejected: 0
Total logical records discarded: 3
Tom:
I am invoking this Control file using SQL*loader from Window OS. I dont know why the value of Y=4 is not getting loaded. Is that we cannot have multiple WHEN clause for same table ('T') in a single control file?
February 01, 2011 - 4:16 pm UTC
http://docs.oracle.com/docs/cd/E11882_01/server.112/e16536/ldr_control_file.htm#sthref699
<quote>
The important point in this example is that the second empno field is found immediately after the first ename, although it is in a separate INTO TABLE clause. Field scanning does not start over from the beginning of the record for a new INTO TABLE clause. Instead, scanning continues where it left off.</quote>
I see you figured it out below...
WHEN clause restriction.
Rajeshwaran, Jeyabal, January 30, 2011 - 2:55 am UTC
I made changes to Control file like below, and data loaded correctly.
LOAD DATA
infile *
APPEND
INTO TABLE T
WHEN (Y = '2')
FIELDS TERMINATED BY ',' TRAILING NULLCOLS
(
X ,
Y
)
INTO TABLE T
WHEN (Y = '4')
FIELDS TERMINATED BY ',' TRAILING NULLCOLS
(
X POSITION(1),
Y
)
begindata
1,2
3,4
5,6
7,0
rajesh@10GR2> truncate table t ;
Table truncated.
Elapsed: 00:00:00.70
rajesh@10GR2>
rajesh@10GR2> host
Microsoft Windows XP [Version 5.1.2600]
(C) Copyright 1985-2001 Microsoft Corp.
D:\>cd trash
D:\Trash>
D:\Trash>sqlldr userid=rajesh@10GR2 control=ctl_1.txt log=log.txt
Password:
SQL*Loader: Release 11.2.0.1.0 - Production on Sun Jan 30 14:02:46 2011
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Commit point reached - logical record count 3
D:\Trash>exit
rajesh@10GR2>
rajesh@10GR2> select * from t;
X Y
---------- ----------
1 2
3 4
Elapsed: 00:00:00.01
rajesh@10GR2>
rajesh@10GR2>
File Name column update through SQL Loader
Abhisek, March 08, 2011 - 6:22 am UTC
Hi Tom
I have a column that I need to populate with the name of the file being loaded through SQL*Loader. Can it be handled through SQL*Loader?
I tried two ways, which work:
1. I can create a control file with a CONSTANT column value and, before running SQL*Loader, run a sed command to replace the CONSTANT with the file name.
2. I can create a Perl script to build a temporary datafile, appending the file name to the end of each line, and then run SQL*Loader to load the data from the new datafile.
I believe it can be done through external tables, but can it be done through SQL*Loader?
Option 3 SQL Loader
Abhisek, March 08, 2011 - 5:14 pm UTC
Hi Tom,
The options 1,2 and 4 are fine for me. But I would appreciate if you could explain the process of named pipe with respect to SQL Loader.
March 09, 2011 - 7:31 am UTC
I thought I did?
use mknod to make a named pipe in Unix/Linux.
cat the file into sed, change the first bit of each line into the filename,<rest of line>, redirect that output into the named pipe.
start sqlldr and direct sqlldr to read from the named pipe.
$ mknod sqlldr_input.dat p
<b>did the mknod, made a named pipe...</b>
$ cat test.dat | sed 's/^/test.dat,/' > sqlldr_input.dat &
[1] 1220
<b>Ran a process to put data into the pipe, it will run until the pipe is emptied...</b>
$ sqlldr / test data=sqlldr_input.dat
SQL*Loader: Release 11.2.0.2.0 - Production on Wed Mar 9 08:30:15 2011
Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
Commit point reached - logical record count 4
[1]+ Done cat test.dat | sed 's/^/test.dat,/' > sqlldr_input.dat
<b>emptied the pipe...</b>
$ sqlplus /
SQL*Plus: Release 11.2.0.2.0 Production on Wed Mar 9 08:30:33 2011
Copyright (c) 1982, 2010, Oracle. All rights reserved.
Connected to:
Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
ops$tkyte%ORA11GR2> select * from dept;
FILENAME DEPTNO DNAME LOC
------------------------------ ---------- -------------- -------------
test.dat 10 ACCOUNTING NEW YORK
test.dat 20 RESEARCH DALLAS
test.dat 30 SALES CHICAGO
test.dat 40 OPERATIONS BOSTON
<b>data is loaded...</b>
Thanks a lot
A reader, March 09, 2011 - 7:50 am UTC
special charecters after data load
A reader, May 11, 2011 - 4:50 pm UTC
Hi Tom,
I am having some problem loading data in to a table. Below is the data from .csv file and the data from the table after the load.
DB version is 10.2.0.4
CSV File:
"PAYROLL","Payrol"
Control File:
OPTIONS (SKIP=0)
LOAD DATA
INFILE 'O:\EMK\SYBASE_Exports\PS_ACCESS_GRP_TBL1.csv'
BADFILE 'C:\sybase_migration\datafiles\PS_ACCESS_GRP_TBL.bad'
TRUNCATE INTO TABLE PS_ACCESS_GRP_TBL
FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
ACCESS_GROUP "trim(:ACCESS_GROUP)",
descr "trim(:descr)"
)
Loded Data in the Database:
ÿþ",ÿþ"
Thanks
May 12, 2011 - 7:36 am UTC
what are your charactersets, both in the database and in your client (sqlldr clients) environment.
sm42737, January 05, 2012 - 8:22 am UTC
Hi Tom,
I need to add the FILE NAME as the last column of the target table for every file whose data is being loaded; I have multiple files, and each file has multiple records.
The list of files to be loaded is in a list file (MSC.lst), and the data is loading into my intended table.
I am using the following Shell script to load the data:
ls -1 | sed 's/\(.*\)\..*/\1/' >> MSC.lst
file="MSC.lst"
while read line
do
sqlldr $dbUser/$dbPass@$dbServ control=$cont/loader.ctl data=$da/"$line".txt log=$lo/$SYSDATE/"$line".log bad=$ba/$SYSDATE/"$line".bad discard=$di/$SYSDATE/"$line".dis errors=200000
done <"$file"
Please suggest how to populate the last column with the file name from which the record got loaded.
January 05, 2012 - 10:12 am UTC
You'll have to use sed again to add the field to each file.
use a named pipe (mknod p)
in your do loop something similar to the following - you can work out the right escaping of things:
do
cat $da/"$line".txt | sed 's/$/,$line.txt/' > my_named_pipe &
sqlldr ..... data=my_named_pipe .....
done <"$file"
in short, have sqlldr read from a named pipe, the named pipe is populated by sed which adds the filename to the end.
A reader, January 06, 2012 - 3:34 am UTC
Hi Tom,
OK, I got the concept you shared, but it is adding the literal text "$line" at the end of each record instead of its value, i.e. the file name.
For example
Original data in the file:
BSM646I,012751003377500
BSM646W,012751003377501
After applying the suggestion:
BSM646I,012751003377500,$line
BSM646W,012751003377501,$line
Requirement:
BSM646I,012751003377500,MSCODSMSC20120106001.txt
BSM646W,012751003377501,MSCODSMSC20120106001.txt
Please suggest
January 10, 2012 - 9:28 pm UTC
you cannot figure out a shell script? really? I wrote:
"in your do loop something similar to the following - you can work out the right escaping of things"
sigh, I figured you knew how to do all of that, since you had done it in the prior script :(
[tkyte@localhost ~]$ cat test.sh
#/bin/bash -vx
export line=foobar
<b>
cat $line.txt | sed 's/$/,'$line'.txt/'
</b>
[tkyte@localhost ~]$ sh test.sh
a,b,c,foobar.txt
d,e,f,foobar.txt
[tkyte@localhost ~]$
To: A Reader
Greg, January 06, 2012 - 4:37 pm UTC
Try replacing the single quotes <'> in the sed command with double quotes <">. Unix will not expand variables in single quotes (at least ksh and bash - the ones that I use)
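The difference is easy to see with echo; `line` here stands in for the loop variable from the script above:

```shell
line=MSCODSMSC20120106001
echo 's/$/,$line.txt/'      # $line inside single quotes is NOT expanded
                            # prints: s/$/,$line.txt/
echo 's/$/,'$line'.txt/'    # closing the quotes around $line lets the shell expand it
                            # prints: s/$/,MSCODSMSC20120106001.txt/
echo "s/\$/,$line.txt/"     # double quotes also expand $line (the \$ keeps sed's $ literal)
                            # prints: s/$/,MSCODSMSC20120106001.txt/
```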
SQL Loader Rollback
A reader, January 13, 2012 - 1:38 pm UTC
hi Tom,
As far as I knew, we cannot roll back records once SQL*Loader has started. But somewhere on a site it was written that we can roll back even a SQL*Loader operation.
Is that true?
Let's say I have 5000 records. After 3000 records, I have a corrupt record which should not be loaded. As I understand it, SQL*Loader can skip corrupted records up to the number we define in the ERRORS option.
Is it also possible to roll back all 3000 records?
Please suggest.
January 17, 2012 - 9:55 am UTC
using an external table would absolutely be the way to go here.
You can use a simple insert as select - the insert either works entirely or not at all (no need to even rollback)
If you want to load everything you can - but log errors into another table (either all errors or errors up to some number of errors), you can use dml error logging.
http://asktom.oracle.com/Misc/how-cool-is-this.html
http://asktom.oracle.com/Misc/how-cool-is-this-part-ii.html
You can take your existing sqlldr control file and turn it into an external table in seconds:
sqlldr u/p control=file.name external_table=generate_only
the logfile will have the create table in it, that sqlldr command won't load anything - it will just create the external table create statement.
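As a sketch of the DML error logging route (the table names here are assumptions), the insert either succeeds as a whole or the bad rows land in an error table instead of aborting the statement:

```sql
-- create the error table ERR$_T for T, once
begin
  dbms_errlog.create_error_log(dml_table_name => 'T');
end;
/
-- load from the external table; bad rows go to ERR$_T instead of failing the insert
insert into t
  select * from ext_t
  log errors into err$_t ('nightly load') reject limit unlimited;
```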
Followup
A reader, January 17, 2012 - 1:04 pm UTC
Thanks Tom,
I will use the methods you specified rather than the already-implemented external table. But my curiosity remains unanswered.
Is it possible to roll back the data inserted by SQL*Loader in any case? If yes, how can we do so?
As I understand it, SQL*Loader is used for loading data, and it auto-commits after every n rows, where n is calculated from the options we provide in ROWS, BINDSIZE..
Please suggest.
January 17, 2012 - 3:32 pm UTC
sqlldr commits, so no.
that is why I told you about external tables - sqlldr is so 1990.
devi
Devi, January 18, 2012 - 1:30 am UTC
I have '-' in one of my fields. If I have to skip loading records where field2 contains '-' anywhere in the value, do I need to use external tables? If not, how do I solve my problem? My data is not fixed length.
January 18, 2012 - 7:25 am UTC
using an external table would certainly make this trivial - so easy.
Is the '-' in a certain place, or does it move around? The WHEN clause in sqlldr is very simplistic, not very flexible at all.
Devi, January 18, 2012 - 1:46 am UTC
my text file is like this:
abc,123,123-op
abwec,123234,123-op
abqc,,123 op
abdfghhhc,123,123-op
I do not want to load records which have '-' in the 3rd field.
How shall I load the data using sqlldr?
January 18, 2012 - 7:27 am UTC
use an external table would be my advice, sqlldr is very much 1990's technology...
you could load into a view of your table with an instead of trigger - the instead of trigger on the view would decide whether to load the row or not. Expect the performance hit you would expect from something like this.
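With an external table, the filter becomes a simple predicate. In this sketch, ext_t and target are assumed names, with columns c1, c2, c3 matching the three comma-separated fields of the file above:

```sql
-- skip any record whose third field contains a '-' anywhere
insert into target (c1, c2, c3)
  select c1, c2, c3
    from ext_t
   where instr(c3, '-') = 0;
```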
Invalid number format model
Max, February 09, 2012 - 1:30 pm UTC
LOAD DATA
INFILE xxx.dat
INTO TABLE AAAA
FIELD1 SYSDATE,
FIELD2 "TO_NUMBER(TO_CHAR(:FIELD1,'IYYYIW'))"
I am getting an "invalid format model" error for FIELD2.
Can you please answer?
February 09, 2012 - 7:38 pm UTC
give complete example please.
an entire control file with at least one row in a begindata section - along with a create table.
Multiple data load
dani, January 02, 2013 - 12:04 pm UTC
HI,
I would like to know
1. How many records can be added into a table at a time, and into how many tables can data be added at a time?
2. Could you give a detailed example of a load involving integrity constraints?
January 04, 2013 - 2:54 pm UTC
1) as many as you want. No limits
2) I don't know what you mean.
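On the "how many tables" part, a control file may contain multiple INTO TABLE clauses - here is a small sketch, with table names (t1, t2) and the first-character routing made up for the example:

```
LOAD DATA
INFILE *
INTO TABLE t1
WHEN (1:1) = 'A'
FIELDS TERMINATED BY ','
-- POSITION(1) restarts the field scan at the beginning of the record;
-- without it, delimited fields in a later INTO TABLE clause continue
-- scanning where the previous clause left off
( rec_type FILLER POSITION(1), c1, c2 )
INTO TABLE t2
WHEN (1:1) = 'B'
FIELDS TERMINATED BY ','
( rec_type FILLER POSITION(1), c1, c2 )
BEGINDATA
A,hello,world
B,foo,bar
```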
Multiple values to STR clause
Stan, February 27, 2013 - 4:46 pm UTC
Hi Tom,
I have a scenario where the data file can come with LF or CRLF as the record terminator (it can be provided by a UNIX server or a Windows server) and I have no control over it. Is there any way in the control file options to specify more than one record terminator? I can convert the Windows file to unix with the dos2unix command, but I wanted to explore whether it is possible to handle this in the control file.
Thanks!
February 28, 2013 - 7:55 am UTC
use LF as the end of line and rtrim() a chr(13) from the last column loaded. You'll be able to deal with either file format then.
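A sketch of that suggestion in control-file form - the table and column names are placeholders:

```
LOAD DATA
INFILE 'data.txt'
INTO TABLE t
FIELDS TERMINATED BY ','
( c1,
  c2,
  -- LF ends the record either way; strip the trailing CR that a
  -- Windows-produced (CRLF) file leaves on the last field
  c3 "rtrim(:c3, chr(13))"
)
```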
How to load specific lines.
A reader, May 08, 2013 - 4:53 am UTC
Hi Tom,
How can we load specific lines into tables.
Input file:
#################
good1,trtrtyry,1
hello2,sfdfdfdfdf,438437847
good3,trtrtyry,1
hello4,sfdfdfdfdf,438437847
good5,trtrtyry,1
hello6,sfdfdfdfdf,438437847
good7,trtrtyry,1
hello8,sfdfdfdfdf,438437847
good9,trtrtyry,1
hello10,sfdfdfdfdf,438437847
good11,trtrtyry,1
hello12,sfdfdfdfdf,438437847
############
I just want to load all lines except the 6th to 10th lines (NOTE: there is no specific order to the data in any column, so we can't put a column condition).
I tried to make use of RECNUM, but as we all know it is not supported in a WHEN condition :(.
I could append a line number to each line using unix and then load, but I don't want to do it that way. Is there any trick to do this (using only SQL*Loader)?
Regards,
Reader.
May 08, 2013 - 1:16 pm UTC
how do you know it is lines 6 and 10? is it "just because it is line 6 and 10, it is always 6 and 10"?
if so, just use an external table with RECNUM and you'll be able to insert into yourtable select * from et where r not in (6,10);
ops$tkyte%ORA11GR2> create or replace directory my_dir as '/tmp'
2 /
Directory created.
ops$tkyte%ORA11GR2>
ops$tkyte%ORA11GR2> CREATE TABLE et
2 ( c1 varchar2(30), c2 varchar2(30), c3 varchar2(30), r number
3 )
4 ORGANIZATION EXTERNAL
5 ( TYPE ORACLE_LOADER
6 DEFAULT DIRECTORY "MY_DIR"
7 ACCESS PARAMETERS
8 ( fields terminated by ',' missing field values are null
9 ( c1 char, c2 char, c3 char, r RECNUM ) )
10 LOCATION
11 ( 'test.dat'
12 )
13 )
14 /
Table created.
ops$tkyte%ORA11GR2> select * from et where r not in (6,10);
C1 C2 C3 R
------------------------------ ------------------------------ ------------------------------ ----------
good1 trtrtyry 1 1
hello2 sfdfdfdfdf 438437847 2
good3 trtrtyry 1 3
hello4 sfdfdfdfdf 438437847 4
good5 trtrtyry 1 5
good7 trtrtyry 1 7
hello8 sfdfdfdfdf 438437847 8
good9 trtrtyry 1 9
good11 trtrtyry 1 11
hello12 sfdfdfdfdf 438437847 12
10 rows selected.
cont to my recent post.
A reader, May 09, 2013 - 3:52 am UTC
Hi Tom,
Thank you for your reply.
But I was already aware of how to do it using an external table.
Could you please tell me, is it possible to do it using SQL*Loader? (In some posts you have mentioned CONTINUEIF - I am not sure whether CONTINUEIF can be used for this case.)
Thanks for your reply.
A reader, May 09, 2013 - 3:51 pm UTC
SQL*Loader-500: Unable to open file
samar, May 10, 2013 - 2:07 pm UTC
Hi Tom,
I am trying to load the sample data file below using SQL*Loader (Release 8.0.6.3.0) into a table that has 3 columns. The data for the 'comments' column appears on multiple lines.
I thought of using a record terminator to load this data and used the same control file as mentioned in this post. However, when I run it I get this error message:
"SQL*Loader-500: Unable to open file (test1.dat)"
Am I missing something here?
load data
infile test1.dat "str X'7C0D0A'"
into table example
TRAILING NULLCOLS
fields terminated by ' '
(col1 ,
col2 ,
comments)
CREATE TABLE example(
col1 NUMBER,
col2 NUMBER,
comments VARCHAR2(4000)
)
test1.dat
88 123
I find your book very useful.
I tried many of your examples given in
your book.The examples are very much
meaningful
and educative.|
88 076
I find your book very useful. I tried many of your examples
given in your book.The examples are very much meaningful and educative.|
88 300
I find your book very useful. I tried many of your examples given in your book.
The examples are very much meaningful and educative.|
May 10, 2013 - 3:32 pm UTC
and if you just do infile test1.dat without the str clause - does it find test1.dat?
looks ok to me at a quick glance, there is no way I have 8.0.6 around to test it with however.
SQL*Loader-500: Unable to open file
samar, May 11, 2013 - 5:36 am UTC
Yes, it reads the test1.dat file. It loads only the first row, with the 'comments' column as null, and rejects the other rows of the first record set. It does the same for the other 2 records of the data file.
I understand it takes line feed (\n) as the default record terminator.
Is there any alternative way to load such records?
May 12, 2013 - 3:40 pm UTC
try using data=test1.dat on the command line instead of the control file.
8.0 software is ancient (more than 16 years...).
I assume you are on windows right? using a carriage return/linefeed is in fact correct for you right?
SQL*Loader-500: Unable to open file
samar, May 14, 2013 - 11:57 am UTC
I was able to load the data with SQL*Loader version 11.2, so it looks like the old version (8.0) was the issue.
Thanks for your support.
Regards,
Samar
can a different column be referred to when loading
Umesh Kasturi, July 25, 2013 - 8:57 am UTC
Hi Tom,
While loading data with SQL*Loader, can a different column be referred to in a column's expression?
-- an example table to load data
create table t
( id number
,name varchar2(20)
,hdate date);
-- the following procedure inserts data into table T
create or replace procedure insert_t ( pid number, pname varchar2, phdate date)
as
begin
insert into t values ( pid,pname,phdate);
commit;
end;
/
I want to load data using the above procedure from SQL*Loader. This procedure logic is just an example.
When loading data for the column hdate, I want to refer to the first 2 columns, id and name. How can this be achieved?
load data
INFILE *
replace INTO TABLE T
FIELDS TERMINATED BY ',' optionally enclosed by '"'
TRAILING NULLCOLS
(id filler
, name filler
,hdate "insert_t(:id,:name,:hdate)"
)
begindata
1,Tom,12-jan-2001
2,Tim,13-jan-2001
Thanks
August 02, 2013 - 5:16 pm UTC
you really don't want to use that procedure. that procedure has a commit in it! you would be throwing all transactional properties out the window! remove that commit!!!!!
I really don't like procedures like this, a TABLE API. ugh, they are almost always pointless. Just insert.
I'll have to assume there is a sound technical reason - that the procedure is doing something to the data, something useful....
you can use an instead of trigger on a view, don't expect it to be blazing fast as you've turned a nice array based process into a slow by slow procedural process...
ops$tkyte%ORA11GR2> create table t
2 ( id number,
3 name varchar2(20),
4 hdate date
5 );
Table created.
ops$tkyte%ORA11GR2>
ops$tkyte%ORA11GR2> create or replace procedure insert_t( p_id number, p_name varchar2, p_hdate date )
2 as
3 begin
4 insert into t (id, name, hdate ) values ( p_id, 'x'||p_name, p_hdate );
5 end;
6 /
Procedure created.
ops$tkyte%ORA11GR2>
ops$tkyte%ORA11GR2> create or replace view v
2 as
3 select * from t;
View created.
ops$tkyte%ORA11GR2>
ops$tkyte%ORA11GR2> create or replace trigger v_trigger
2 instead of insert on v
3 begin
4 insert_t( :new.id, :new.name, :new.hdate );
5 end;
6 /
Trigger created.
ops$tkyte%ORA11GR2>
ops$tkyte%ORA11GR2> !cat v.ctl
load data
INFILE *
replace INTO TABLE ops$tkyte.v
FIELDS TERMINATED BY ',' optionally enclosed by '"'
TRAILING NULLCOLS
(id , name ,hdate
)
begindata
1,Tom,12-jan-2001
2,Tim,13-jan-2001
ops$tkyte%ORA11GR2> !sqlldr / v
SQL*Loader: Release 11.2.0.3.0 - Production on Fri Aug 2 13:10:00 2013
Copyright (c) 1982, 2011, Oracle and/or its affiliates. All rights reserved.
Commit point reached - logical record count 2
ops$tkyte%ORA11GR2> select * from t;
ID NAME HDATE
---------- ------------------------------ ---------
1 xTom 12-JAN-01
2 xTim 13-JAN-01
My Above question
Umesh Kasturi, July 29, 2013 - 4:11 am UTC
Tom
Can you please look into my above question
Thanks
August 02, 2013 - 5:33 pm UTC
i don't necessarily check here every single day....
Ganesh, September 12, 2013 - 11:18 am UTC
Data to load:
"HEADER", "This_File_Name_date.csv"
"R1C1","R1C2","R1C3"
"R2C1","R2C2","R2C3"
"R3C1","R3C2","R3C3"
"FOOTER",3
Table to hold the data:
CREATE TABLE stage_table
(source_file VARCHAR2(100),
column1 VARCHAR2(10),
column2 VARCHAR2(10),
column3 VARCHAR2(10));
Control file:
load data
infile 'This_File_Name.csv'
truncate
into table stage_table
fields terminated by ',' optionally enclosed by '"' trailing nullcols
(
source_file,
column1,
column2,
column3
)
1. Is it possible to read the file name from the "HEADER" record and load it into the source_file column of each record that we load?
2. How do I skip the last line in the data file?
Thanks
September 23, 2013 - 5:59 pm UTC
use an external table
ops$tkyte%ORA11GR2> CREATE TABLE t
2 (
3 "SOURCE_FILE" VARCHAR2(100),
4 "COLUMN1" VARCHAR2(10),
5 "COLUMN2" VARCHAR2(10),
6 "COLUMN3" VARCHAR2(10)
7 )
8 ORGANIZATION external
9 (
10 TYPE oracle_loader
11 DEFAULT DIRECTORY home
12 ACCESS PARAMETERS
13 (
14 RECORDS DELIMITED BY NEWLINE CHARACTERSET US7ASCII
15 READSIZE 1048576
16 SKIP 1
17 FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"' LDRTRIM
18 MISSING FIELD VALUES ARE NULL
19 REJECT ROWS WITH ALL NULL FIELDS
20 (
21 "SOURCE_FILE" CHAR(255)
22 TERMINATED BY "," OPTIONALLY ENCLOSED BY '"',
23 "COLUMN1" CHAR(255)
24 TERMINATED BY "," OPTIONALLY ENCLOSED BY '"',
25 "COLUMN2" CHAR(255)
26 TERMINATED BY "," OPTIONALLY ENCLOSED BY '"',
27 "COLUMN3" CHAR(255)
28 TERMINATED BY "," OPTIONALLY ENCLOSED BY '"'
29 )
30 )
31 location
32 (
33 'test.dat'
34 )
35 )REJECT LIMIT UNLIMITED
36 /
Table created.
ops$tkyte%ORA11GR2> select * from t where source_file <> 'FOOTER';
SOURCE_FIL COLUMN1 COLUMN2 COLUMN3
---------- ---------- ---------- ----------
R1C1 R1C2 R1C3
R2C1 R2C2 R2C3
R3C1 R3C2 R3C3
ops$tkyte%ORA11GR2>
sqlldr is the legacy data loading tool of the 20th century
Ganesh, September 13, 2013 - 11:09 am UTC
I have found the answer for my second question - using WHEN filters out the last line.
OOW-fever?
Hoek, September 24, 2013 - 9:51 am UTC
Tom, the source_file column's data is missing; it's got column1's data in it, and so on... you probably missed it because of OOW-fever? ;-)
(Yes, of course I'm jealous and I'd love to be there as well)
for Ganesh
Barbara Boehmer, September 29, 2013 - 6:39 pm UTC
Ganesh,
Here is one method that loads the filename into a separate table, then uses an expression to retrieve that filename and use it in the stage_table.
Barbara
SCOTT@orcl12c> host type this_file_name.csv
"HEADER", "This_File_Name_date.csv"
"R1C1","R1C2","R1C3"
"R2C1","R2C2","R2C3"
"R3C1","R3C2","R3C3"
"FOOTER",3
SCOTT@orcl12c> host type filename_table.ctl
options(load=1)
load data
infile 'This_File_Name.csv'
truncate
into table filename_table
fields terminated by ',' optionally enclosed by '"' trailing nullcols
(
header filler,
source_file
)
SCOTT@orcl12c> host type stage_table.ctl
options(skip=1)
load data
infile 'This_File_Name.csv'
truncate
into table stage_table
when column1 != 'FOOTER'
fields terminated by ',' optionally enclosed by '"' trailing nullcols
(
source_file expression "(select source_file from filename_table)",
column1,
column2,
column3
)
SCOTT@orcl12c> create table filename_table
2 (source_file varchar2(23))
3 /
Table created.
SCOTT@orcl12c> host sqlldr scott/tiger control=filename_table.ctl log=filename_table.log
SQL*Loader: Release 12.1.0.1.0 - Production on Sun Sep 29 11:31:36 2013
Copyright (c) 1982, 2013, Oracle and/or its affiliates. All rights reserved.
Path used: Conventional, LOAD=1
Commit point reached - logical record count 1
Table FILENAME_TABLE:
1 Row successfully loaded.
Check the log file:
filename_table.log
for more information about the load.
SCOTT@orcl12c> select * from filename_table
2 /
SOURCE_FILE
-----------------------
This_File_Name_date.csv
1 row selected.
SCOTT@orcl12c> CREATE TABLE stage_table
2 (source_file VARCHAR2(23),
3 column1 VARCHAR2(10),
4 column2 VARCHAR2(10),
5 column3 VARCHAR2(10))
6 /
Table created.
SCOTT@orcl12c> host sqlldr scott/tiger control=stage_table.ctl log=stage_table.log
SQL*Loader: Release 12.1.0.1.0 - Production on Sun Sep 29 11:31:36 2013
Copyright (c) 1982, 2013, Oracle and/or its affiliates. All rights reserved.
Path used: Conventional
Commit point reached - logical record count 4
Table STAGE_TABLE:
3 Rows successfully loaded.
Check the log file:
stage_table.log
for more information about the load.
SCOTT@orcl12c> select * from stage_table
2 /
SOURCE_FILE COLUMN1 COLUMN2 COLUMN3
----------------------- ---------- ---------- ----------
This_File_Name_date.csv R1C1 R1C2 R1C3
This_File_Name_date.csv R2C1 R2C2 R2C3
This_File_Name_date.csv R3C1 R3C2 R3C3
3 rows selected.
Ganesh, November 14, 2013 - 3:26 pm UTC
Thanks Barbara Boehmer. I had done the same thing before you posted your solution, but I did it as an update statement outside SQL*Loader. Your approach is more elegant. I will include it in my solution now, which will save a few lines of code and look prettier.
Hi Tom, I do use external tables, but I don't think they replace SQL*Loader completely. At times you have to do something from your own PC and cannot access the server or seek help from DBAs or server admins. Sqlldr comes in very handy in such situations.
Setting Context when using SQL Loader
DVN, January 07, 2014 - 10:54 pm UTC
Hi Tom,
We are in the process of implementing Virtual private database in our application
Can you tell me how we can set the context when using SQL*Loader so that the policy-with-check-option error will not occur?
January 08, 2014 - 6:09 pm UTC
I would strongly suggest that you do not use the 20th century tool for loading data - but rather use the tool designed for the 21st century
see
http://www.oracle.com/technetwork/issue-archive/2012/12-mar/o22asktom-1518271.html look for (ctl-f for) "Once you’ve decided to use large"
using external tables, you would just be in an application - like any other application - and you'd just do whatever you do in any of your applications.
otherwise - you'd have to do something like an on-logon trigger and review what information you can find in the v$ tables to figure out "this session needs to have the context set" - OR - use a "special user" for sqlldr and have your policy function do the right thing for that "special user", or give your sqlldr "special user" access to views that are not policy protected.
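A rough sketch of the on-logon-trigger idea for the "special user" approach - the trigger name, the account name LOADER_USER, and the package load_ctx_pkg are all assumptions for illustration, not part of any real schema:

```sql
-- assumes load_ctx_pkg is the trusted package bound to the
-- application context that the VPD policy function checks
create or replace trigger set_load_context
after logon on database
begin
  -- treat the dedicated load account as "the sqlldr user"
  if sys_context('userenv', 'session_user') = 'LOADER_USER' then
    load_ctx_pkg.set_ctx;  -- sets the context attributes the policy expects
  end if;
end;
/
```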
sqlldr when clause
Rajesh, September 26, 2014 - 11:58 am UTC
Hi Tom,
While loading data into a table using sqlldr, my data is pipe delimited. One field in the file may contain a value like 0, 0.0, 0.00 or 0.000. I want to load the data into the table by comparing that field with a when clause.
Now I am using something like
when field='0'
into table
when field='0.0'
into table
when field='0.00'
into table
when field='0.000'
into table
But is there anything like when to_number(field)=0? (I tried this, and it does not work.)
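The sqlldr WHEN clause only compares a field to a literal string - it cannot apply SQL functions such as to_number. As suggested repeatedly in this thread, an external table expresses such a filter naturally. A sketch, with the directory, table, and file names (my_dir, et, target_table, data.dat) made up for the example:

```sql
-- external table over the pipe-delimited file
create table et
( f1 varchar2(30), f2 varchar2(30), f3 varchar2(30) )
organization external
( type oracle_loader
  default directory my_dir
  access parameters
  ( fields terminated by '|' missing field values are null )
  location ( 'data.dat' )
);

-- 0, 0.0, 0.00 and 0.000 all satisfy to_number(f3) = 0,
-- replacing the four separate WHEN clauses with one predicate
insert into target_table (f1, f2, f3)
select f1, f2, f3
  from et
 where to_number(f3) = 0;
```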