SQLLOADER - Error Field in data file exceeds maximum length

Question and Answer

Chris Saxon

Thanks for the question, Janette.

Asked: July 11, 2001 - 3:24 pm UTC

Last updated: September 01, 2023 - 12:56 pm UTC

Version: 8.1.6

Viewed 100K+ times!

You Asked

Hi,
I am trying to load a data file into a database table using SQL loader. I received the data in an Excel spreadsheet but I saved it as a comma delimited file.
When I run the SQL Loader command no records are loaded - looking at the log file I get the following error:
Rejected - Error on table PROSPECTUS, column PR_FULL_DESC.
Field in data file exceeds maximum length
I don't understand why this is happening: the longest PR_FULL_DESC value in any record I've tried to load is 400 characters, and the column is defined as VARCHAR2(4000) in the database table.
Please help!
Thanks,
Janette

The control file I used is:
LOAD DATA
INFILE 'p.dat'
APPEND INTO TABLE PROSPECTUS
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
PR_CODE NULLIF PR_CODE=BLANKS,
PR_NAME NULLIF PR_NAME=BLANKS,
PR_REF_NUM NULLIF PR_REF_NUM=BLANKS,
PR_TRANSACTION NULLIF PR_TRANSACTION=BLANKS,
PR_SHORT_DESC NULLIF PR_SHORT_DESC=BLANKS,
PR_FULL_DESC NULLIF PR_FULL_DESC=BLANKS,
PR_LOCN NULLIF PR_LOCN=BLANKS)

Database Table - PROSPECTUS:
Name Null? Type
------------------------------- -------- ----
PR_CODE VARCHAR2(30)
PR_NAME VARCHAR2(30)
PR_REF_NUM VARCHAR2(30)
PR_TRANSACTION VARCHAR2(6)
PR_SHORT_DESC VARCHAR2(1000)
PR_FULL_DESC VARCHAR2(4000)
PR_LOCN VARCHAR2(80)

Sample data (just one record):
PADDY,,PRP002,C,"A bright, spacious, attractive, ground floor, three bedroom apartment, situated on the corner of Townsend Street and Lombard Street East","A bright, spacious, attractive, ground floor, three bedroom apartment, situated on the corner of Townsend Street and Lombard Street East. Trinity Square is well located within walking distance of Trinity College, I.F.S.C., Pearse Street DART station and Temple Bar. The apartment will be sold with the benefit of a secure private car space.",CARDIFF,



and Tom said...

The default datatype in SQLLDR is CHAR(255). Simply code:
...
PR_SHORT_DESC char(1000) NULLIF PR_SHORT_DESC=BLANKS,
PR_FULL_DESC char(4000) NULLIF PR_FULL_DESC=BLANKS,
...

in the control file to have sqlldr allocate a big enough buffer to hold it.
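For Janette's table, the whole corrected control file would look something like this (a sketch built from the control file in the question; only the two columns longer than 255 bytes need an explicit length):

```
LOAD DATA
INFILE 'p.dat'
APPEND INTO TABLE PROSPECTUS
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
PR_CODE NULLIF PR_CODE=BLANKS,
PR_NAME NULLIF PR_NAME=BLANKS,
PR_REF_NUM NULLIF PR_REF_NUM=BLANKS,
PR_TRANSACTION NULLIF PR_TRANSACTION=BLANKS,
-- these two exceed the CHAR(255) default, so size them explicitly
PR_SHORT_DESC CHAR(1000) NULLIF PR_SHORT_DESC=BLANKS,
PR_FULL_DESC CHAR(4000) NULLIF PR_FULL_DESC=BLANKS,
PR_LOCN NULLIF PR_LOCN=BLANKS
)
```

The remaining columns are all 255 bytes or shorter, so they can keep the default.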



Rating

  (50 ratings)

Is this answer out of date? If it is, please let us know via a Comment

Comments

thanks for the sqlldr response

Gail, October 23, 2001 - 10:14 am UTC

it helped me with my sqlldr error getting the

Field in data file exceeds maximum length

message.

Gail

SQLLDR Varchar2 Max Lengths

Jeff W, August 28, 2002 - 6:14 pm UTC

When in doubt, consult "Ask Tom" first!! I spent hours trying to figure out what I found on your site in less than 5 minutes! The search is powerful, and the answer I was looking for was right on the money.

Thanks Tom!!

Thank god...

john, February 04, 2003 - 4:39 pm UTC

I guess this was in the concepts guide somewhere, but beats me if I remember ever having seen it...

Thank god Tom addressed this as it had been driving me crazy...

Field...exceeds maximum length for a "filler" field.

news0reader, November 02, 2004 - 7:57 pm UTC

Tom, I'm using sql loader and I get the "exceeds maximum length" message for a field I've specified as a "filler" (Oracle 9i). The data file description is as follows:

(sample.txt)
col0 = number
col1 = text (up to 10000 chars)
col2 = number

My table has the following form:
col0 = number
col1 = number

My control file looks like this:

OPTIONS (
SKIP=1
)
LOAD DATA
INFILE sample.txt "str X'02'"
insert

INTO TABLE foo
fields terminated by X'01'
trailing nullcols
(
aid
, bio filler
, b_id
)

And I get many records rejected with message:

>>>
Record 13: Rejected - Error on table foo, column BIO. Field in data file exceeds maximum length
<<<

I've tried setting a large bindsize, to no avail. I've also tried adding various combinations of varchar(10000) as a descriptor in the control file...without success. The difference here, from samples I've seen online, is that the column at issue is one I want to skip.

Thanx

--A

Tom Kyte
November 03, 2004 - 6:57 am UTC

bio char(10000) filler


else defaults to char(255)

SQL Loader does not recognize qualified filler

news0reader, November 04, 2004 - 6:55 pm UTC

Hi Tom, I had tried that before I wrote, and since, but I get a SQL*Loader error. Part of my ctl file looks like this:

...
, rev_id
, review char(10000) filler
, t_article
...

When run it produces this:

SQL*Loader: Release 9.2.0.1.0 - Production on Thu Nov 4 18:21:22 2004

Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.

SQL*Loader-350: Syntax error at line 28.
Expecting "," or ")", found keyword filler.
, review char(10000) filler



Tom Kyte
November 05, 2004 - 4:07 pm UTC

FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(DEPTNO,
DNAME filler char(1000),
LOC
)


filler char(.....)

not

char(.....) filler

(sorry about that -- it seems every time I don't set up a test case myself .....)
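Applying the corrected keyword order to the control file from the earlier comment, a sketch (using the same hypothetical sample.txt layout and table foo) would be:

```
OPTIONS (SKIP=1)
LOAD DATA
INFILE sample.txt "str X'02'"
INSERT
INTO TABLE foo
FIELDS TERMINATED BY X'01'
TRAILING NULLCOLS
(
aid
-- FILLER goes right after the field name; the length follows it
, bio FILLER CHAR(10000)
, b_id
)
```

That is, the FILLER keyword comes immediately after the field name, and the CHAR(n) length after that.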

Qualifier Comes After Main Entry

news0reader, November 08, 2004 - 4:40 pm UTC

Tom, thanx for the clarification: it makes sense that the qualifier comes after the keyword. While Oracle docs seem to have BNF-like diagrams for every construct and its brother, I didn't find one for sql loader directives!

--A

You are my god!!!!

database godess, December 16, 2004 - 7:40 pm UTC

Thank you so much for solving my problem with the char(4000) in sql*loader control file tip! I'd been struggling for hours.

A reader, February 10, 2005 - 2:24 pm UTC


SQL Loader Error

Suhail Wasim, February 16, 2005 - 3:38 pm UTC

Hi,
This piece of info did help me a lot plus saved me hours of work.

thanks a ton!

SQL Loader Error

Amol, May 17, 2005 - 3:44 am UTC

Thanks for the help

Thanks and how about for a number field!!!!

swathi, August 11, 2005 - 2:56 pm UTC

Hi Tom,

Thanks for the help. It solved my problem with VARCHAR. But now I am getting the error for a number field. My data in that field does not contain more than 4-digit numbers, and I have specified the column in Oracle as ID NUMBER(19,0). However, when I try to load the datafile into Oracle I get this error:

"field in datafile exceeds maximum length"

Please help.

Thanks

Tom Kyte
August 12, 2005 - 8:11 am UTC

Need a teeny tiny, as in SMALL, example.

eg: one line of input and the control file, and if the table has more than 5 columns, pare the example down to 5 or fewer columns (and the error might even pop right out at you!!)

The create table statement would be mandatory, as would the ctl file and the sample data.

JUST What I needed - Sqlldr - Error Field in data file exceeds maximum length

Michelle Walker, September 17, 2005 - 11:57 am UTC

Was the exact answer I needed.

Thanks

On point

A reader, October 19, 2005 - 12:55 pm UTC

Thanx Tom! Worked like a charm.

Imara Phillip

SQLLoader issue with default size of char

Sri Tadimalla, November 30, 2005 - 11:05 am UTC

Saved me hours of pain! Thanks Tom! You are the greatest!

Thank you!

Arun Mathur, January 19, 2006 - 9:35 am UTC

Also, I enjoy reading ur, I mean, your, blog (Kidding) :)

Regards,
Arun


Still helpful almost 5 years later

A reader, February 13, 2006 - 12:56 pm UTC

Your advice is still helpful almost 5 years after you first wrote it. Our DBA couldn't figure this out but your answer came up as soon as I searched for the error message in the sqlldr log.

Thanks

A DB Developer

Still helpful almost 5 years and 3 month later

A reader, May 31, 2006 - 1:28 pm UTC

Thanks Tom.

Many thanks

Henry, June 06, 2006 - 11:27 am UTC

Saved me from pulling my hair out! I've come across this error before and 'cured' it by shortening the offending fields - only possible because there were only one or two. But today I had many, and could not think of a way round it, so googled the error message and found this response. Just what I needed - maybe I'll stay sane for a few more weeks! Thanks again

Thanks!

Brent, August 17, 2006 - 10:45 am UTC

It just saved me some real agony.

Maximum string input in sqlloader (more than 4000 bytes)

Moses Valle, August 17, 2011 - 2:39 am UTC

Hi master Tom! Can SQL*Loader load a string (VARCHAR2) from a flat file into a table when the string is more than 4000 bytes wide?

I tried using this in the ctl file.

backup_target CHAR(32000) NULLIF BACKUP_TARGET=BLANKS "SUBSTR(:backup_target, 1, 4000)"

The string I will load from the flat file is 7000+ bytes wide. After running the loader I receive no errors, but the data are not inserted and my input is dumped to the .bad file.

Please enlighten me :D Thanks so much!
Tom Kyte
August 17, 2011 - 4:25 am UTC

need a full example please.

Maximum string input in sqlloader (more than 4000 bytes)

Moses Valle, August 31, 2011 - 2:29 am UTC

Here is what I am doing.

I have these files, ctrlfile.ctl and inputstrng.dat
I created a table in the HR schema called 'substr_test'. It has a column named 'backup_target' with the datatype VARCHAR2(4000).

----actual contents of ctrlfile.ctl ----start
load data
infile 'c:\inputstrng.dat'

APPEND

into table HR.substr_test

fields terminated by '|'
TRAILING nullcols
(
backup_target   CHAR(4000) NULLIF BACKUP_TARGET=BLANKS "SUBSTR(:backup_target, 1, 4000)"
)
----contents of ctrlfile.ctl ----end

Content of inputstrng.dat:

A very very long paragraph with 8000 characters (no carriage returns).

-----
> I run this in the command prompt:

  sqlldr hr/hr@orcl control=c:\ctrlfile.ctl

> After running, Oracle displays no errors or anything; the contents of the input string are dumped to the .bad file.

> What I want is to take the first 4000 characters of my 8000-byte-wide input string (inputstrng.dat) and load them into my desired table (hr.substr_test).

> Is this possible sir Tom? Thanks so much! :)

Tom Kyte
August 31, 2011 - 1:38 pm UTC

that is not going to be possible - you cannot have a string greater than 4000 bytes in SQL - unless you use a clob (which isn't a 'string' really... it is a clob..)

The binding of the string just won't work - it cannot exceed 4000 bytes.
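If values longer than 4000 bytes really must be stored, the column itself has to be a CLOB, as the follow-up below explores. A minimal sketch (the table and file names here are made up for illustration):

```
-- table with a CLOB instead of VARCHAR2(4000)
CREATE TABLE backup_log (backup_target CLOB);

-- control file: CHAR(32000) sizes the input field buffer;
-- with a CLOB target there is no 4000-byte bind limit
LOAD DATA
INFILE 'inputstrng.dat'
APPEND INTO TABLE backup_log
FIELDS TERMINATED BY '|'
TRAILING NULLCOLS
(
backup_target CHAR(32000)
)
```

With a VARCHAR2(4000) target the SUBSTR trick cannot rescue the row, because, as noted above, the bind fails before the SQL function ever runs.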

Sql Loader Clob

Moses Valle, September 14, 2011 - 2:15 am UTC

Thank you very much sir Tom :) I guess we will be proceeding to our long-term solution — which is to use CLOB.. :)
--

I have a follow-up question, sir. This is also related to my previous question about loading data from a flat file (.dat) using SQL*Loader.

Here is an example of what i am doing:

-> I want to insert the data from the .dat file to my desired table.

-> I have a table named 'TEST_TABLE' with columns/datatypes,

char_content  VARCHAR(4000)
clob_content  CLOB
num_content   NUMBER

-> I also have a sqlldr .ctl file which contains:

--START

load data
infile 'c:\clobtest1.dat'
APPEND
into table HR.TEST_TABLE
fields terminated by '|'
TRAILING nullcols
(
CHAR_CONTENT   CHAR(4000) NULLIF BACKUP_TARGET=BLANKS,
CLOB_CONTENT   CHAR(10000) ENCLOSED BY '<' AND '>',
NUM_CONTENT  
)

-- END

-> I have two flat files named 'clobtest1.dat' and 'clobtest2.dat'

clobtest1.dat contains:

-- START

sample content for char_content | < sample content for clob content > | 123456

-- END

clobtest2.dat contains the SAME data as clobtest1.dat WITHOUT the '<' and '>'

-> Question: What do I need to change in my .CTL file so that I can successfully load the data (char_content, clob_content, num_content) from clobtest2.dat?

I've tried other ways to load the CLOB column data, but they require additional delimiters, string positions, file locations, etc. I hope there is a way to load the data as is (clobtest2.dat).

-> THANK YOU IN ADVANCE SIR TOM! God bless you richly! :)

Tom Kyte
September 14, 2011 - 7:25 pm UTC

Just add the word optionally (I had to change backup_target to a column that exists, by the way...)

ops$tkyte%ORA11GR2> desc t
 Name                                     Null?    Type
 ---------------------------------------- -------- ----------------------------
 CHAR_CONTENT                                      VARCHAR2(4000)
 CLOB_CONTENT                                      CLOB
 NUM_CONTENT                                       NUMBER

ops$tkyte%ORA11GR2> !cat t.ctl
load data
APPEND
into table t
fields terminated by '|'
TRAILING nullcols
(
CHAR_CONTENT   CHAR(4000) NULLIF num_content=BLANKS,
CLOB_CONTENT   CHAR(10000) optionally ENCLOSED BY '<' AND '>',
NUM_CONTENT  
)

ops$tkyte%ORA11GR2> !cat t1.dat
sample content for char_content | < sample content for clob content > | 123456

ops$tkyte%ORA11GR2> !cat t2.dat
sample content for char_content | sample content for clob content | 123456

ops$tkyte%ORA11GR2> select * from t;

no rows selected

Elapsed: 00:00:00.00
ops$tkyte%ORA11GR2> !sqlldr / t data=t1.dat

SQL*Loader: Release 11.2.0.2.0 - Production on Wed Sep 14 20:24:44 2011

Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.

Commit point reached - logical record count 1

ops$tkyte%ORA11GR2> !sqlldr / t data=t2.dat

SQL*Loader: Release 11.2.0.2.0 - Production on Wed Sep 14 20:24:47 2011

Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.

Commit point reached - logical record count 1

ops$tkyte%ORA11GR2> select * from t;

CHAR_CONTENT
-------------------------------------------------------------------------------
CLOB_CONTENT
-------------------------------------------------------------------------------
NUM_CONTENT
-----------
sample content for char_content
 sample content for clob content
     123456

sample content for char_content
sample content for clob content
     123456


Elapsed: 00:00:00.01
ops$tkyte%ORA11GR2> 

thanksss!

Moses Valle, September 15, 2011 - 2:42 am UTC

Thank you for your response sir tom! Much appreciated :)

All is okay now :)

i need the answer

venkatesh, June 06, 2012 - 4:12 am UTC

hi tom,

am also facing the same issue "Data exceeds the maximum length".. i didnt get the answer from this site. pls can you send the solution for this issue.
Tom Kyte
June 06, 2012 - 5:46 am UTC

I am facing the same issue with my car won't start. i didn't get the answer from you. pls, can you send the solution for this issue.


now we are even. I know exactly as much about your problem as you do mine.


*give an example*
*build the CONTEXT*
*show us how to reproduce your issue using the smallest bit of COMPLETE CODE possible*


maybe your control file doesn't have the proper length, but we'll never know because

a) we don't have your control file
b) we don't know your data

CLOB with multiline data

Bala, July 01, 2012 - 3:13 am UTC

Tom,

I am trying to load multiline data into a CLOB column, but I always run into problems: either the records go to the bad file or only a partial record is loaded. Please help.

My table
-----------
create table testlob_loader (no number, remarks clob);

CTL file
---------
LOAD DATA
INFILE 'sampleclob.dat' "str '|\n'"
BADFILE 'testlob_loader.bad'
APPEND INTO TABLE testlob_loader
FIELDS TERMINATED BY ','
(
no,
remarks CHAR(12000) optionally ENCLOSED BY '<' AND '>'
)

My data file
-------------
1,<my test remarks column sdkfjasdfkjsdfkjsdkfjsdfkjsdfkjasdfk
ksdfjasdkfjasdkfjasdkfjsdkfjasdkfj
ksdfjkasdfjkasdfjasdkfj>|
2,<workingkasdfjskdfjasdkfjasdkfjasdkfjasdkfjsdkfjasdkfjsdkfjasdkfjasdkvdjvndfjhasdfjasjfkavcnjadhfjhsadf8934r893ruwey892389eusfj O62723823823!@&*((P)JHVR%#@@^*()__{:NB  VCCDXD
..kjh55%$#@!@$%^&*()_+|}{{P:"??~~~~~~````````>|

Running the sql loader like this
-----------------------------------
sqlldr <userid/pwd> control=testlob_loader.ctl LOG=testlob_loader.log

Getting Log like this
-----------------------
Table TESTLOB_LOADER, loaded from every logical record.
Insert option in effect for this table: APPEND

   Column Name                  Position   Len  Term Encl Datatype
------------------------------ ---------- ----- ---- ---- ---------------------
NO                                  FIRST     *   ,       CHARACTER            
REMARKS                              NEXT  4000   ,  O(<) CHARACTER            
                                                      O(>)

value used for ROWS parameter changed from 64 to 60
Record 1: Rejected - Error on table TESTLOB_LOADER, column REMARKS.
no terminator found after TERMINATED and ENCLOSED field

Table TESTLOB_LOADER:
  0 Rows successfully loaded.
  1 Row not loaded due to data errors.
  0 Rows not loaded because all WHEN clauses were failed.
  0 Rows not loaded because all fields were null.


Space allocated for bind array:                 255600 bytes(60 rows)
Read   buffer bytes: 1048576

Total logical records skipped:          0
Total logical records read:             1
Total logical records rejected:         1
Total logical records discarded:        0

Tom Kyte
July 02, 2012 - 7:15 am UTC

use:

INFILE demo.dat "str X'7C0A'"

(X'7C0A' is the hex for '|' followed by a linefeed.)

\n is not meaningful to us.

A reader, July 02, 2012 - 10:39 am UTC

Hi Tom,

Still not working after changing the str to the hex value as provided. Please see the log. It is supposed to be 2 records as per the data file; not sure why only 1 was read, and even that was rejected. But when I look at the .bad file it shows both records, so it looks like everything is being read as one record. I have to keep the data in the .dat file and cannot use a LOB file for loading this CLOB data. So please help: what is wrong here? Is it due to the encoding of the file? Otherwise, would some other record-end indicator work better?

Total logical records skipped: 0
Total logical records read: 1
Total logical records rejected: 1
Total logical records discarded: 0
Tom Kyte
July 02, 2012 - 1:49 pm UTC

worked dandy for me; make sure you have a newline on the end of the last line (else it doesn't end in |\n!)

ops$tkyte%ORA10GR2> desc t
 Name                                     Null?    Type
 ---------------------------------------- -------- ----------------------------
 NO                                                NUMBER
 REMARKS                                           CLOB

ops$tkyte%ORA10GR2> !cat t.ctl
LOAD DATA
INFILE test.dat "str X'7C0A'"
replace into table T
FIELDS TERMINATED BY ','
(
no,
remarks CHAR(12000) optionally ENCLOSED BY '<' AND '>'
)

ops$tkyte%ORA10GR2> !cat test.dat
1,<my test remarks column sdkfjasdfkjsdfkjsdkfjsdfkjsdfkjasdfk
ksdfjasdkfjasdkfjasdkfjsdkfjasdkfj
ksdfjkasdfjkasdfjasdkfj>|
2,<workingkasdfjskdfjasdkfjasdkfjasdkfjasdkfjsdkfjasdkfjsdkfjasdkfjasdkvdjvndfjhasdfjasjfkavcnjadhfj
hsadf8934r893ruwey892389eusfj O62723823823!@&*((P)JHVR%#@@^*()__{:NB  VCCDXD
..kjh55%$#@!@$%^&*()_+|}{{P:"??~~~~~~````````>|


ops$tkyte%ORA10GR2> !sqlldr / t

SQL*Loader: Release 10.2.0.5.0 - Production on Mon Jul 2 11:00:29 2012

Copyright (c) 1982, 2007, Oracle.  All rights reserved.

Commit point reached - logical record count 2

ops$tkyte%ORA10GR2> select * from t;

        NO
----------
REMARKS
-------------------------------------------------------------------------------
         1
my test remarks column sdkfjasdfkjsdfkjsdkfjsdfkjsdfkjasdfk
ksdfjasdkfjasdkfjasdkfjsdkfjasdkfj
ksdfjkasdfjkasdfjasdkfj

         2
workingkasdfjskdfjasdkfjasdkfjasdkfjasdkfjsdkfjasdkfjsdkfjasdkfjasdkvdjvndfjhas
dfjasjfkavcnjadhfj
hsadf8934r893ruwey892389eusfj O62723823823!@&*((P)JHVR%#@@^*()__{:NB  VCCDXD
..kjh55%$#@!@$%^&*()_+|}{{P:"??~~~~~~````````


A reader, July 02, 2012 - 9:34 pm UTC

Hi,

It's working now. I think it was the Windows vs. Unix difference in handling \n. I am testing on Windows, so after I used |\r\n it works, as Windows treats line endings as \r\n.
Thanks a lot for your help.

software SE

Lin, November 15, 2012 - 2:37 pm UTC

I changed my code from CHAR NULLIF NOTE=BLANKS to CHAR(4000) NULLIF NOTE=BLANKS and it works fine now.

Anuradha, December 26, 2012 - 6:10 am UTC

Hi Tom,

I have a table with 3 fields. I am trying to load it with a data file which has 4 fields.

Is it possible to skip the first field from data file and load rest 3 inside table?


Tom Kyte
January 04, 2013 - 10:57 am UTC

see the filler keyword for sqlldr (assuming you are using sqlldr)


resolving the issues instantly & saving time

krishnareddynv, August 25, 2013 - 7:17 am UTC

Dear All,
You all are REAL TEACHERS for providing immediate solutions.
Thanking You for saving time and helping us to resolve the many issues in time correctly.

Meta, January 17, 2014 - 12:20 pm UTC

Thanks Tom, saved my day.

Question/problem

Dhruv, January 21, 2014 - 8:52 am UTC

Hi Tom,

Trying to load data from a .dat file into a table. The table has only one column (str1). The .dat file contains rows, and I need to load each of these rows into this column. The rows are 2 to 3 lines long.

I am getting the below error:
SQL LOADER - Error Field in data file exceeds maximum length

Control file:
load data
infile 'ajay.dat'
insert into table bbc.ajay1.bk2
trailing nullcols
( str1 char(4000))


Thanks

A Thankful user, February 16, 2015 - 10:44 am UTC

Hi Tom,

I just wanted to thank you. Can always find the right to the point answers from your site.

Regards,

Great !

Suddhasatwa, June 22, 2015 - 11:58 am UTC

Thanks for the solution .. provided in 2004 for Oracle 9i, and it works flawlessly in 2015 with 11gR2!

In multi-byte character sets, do we need to consider data_length or char_length

Moorthy Rekapalli, October 02, 2015 - 8:40 pm UTC

Tom,

Thank you very much for your help to Oracle community through this forum.

I have a follow-up question. In the following example, for columns that have char semantics and longer than 255, do we need to use data_length or char_length in sqlldr control files?

create table test(desc_byte varchar2(400 byte), char_desc varchar2(400 char));

column column_name format a15

select column_name, data_length, char_length, char_col_decl_length, default_length
  from user_tab_columns
 where table_name = 'TEST';


COLUMN_NAME     DATA_LENGTH CHAR_LENGTH CHAR_COL_DECL_LENGTH DEFAULT_LENGTH
--------------- ----------- ----------- -------------------- --------------
DESC_BYTE               400         400                  400
CHAR_DESC              1600         400                 1600


Thanks,
Moorthy.
Connor McDonald
October 03, 2015 - 2:12 am UTC

From the documentation:

"The lengths are always interpreted in bytes, even if character-length semantics are in effect for the file. This is necessary because the file could contain a mix of fields, some processed with character-length semantics and others processed with byte-length semantics"

Just confirming on data_length vs. char_length

Moorthy Rekapalli, October 05, 2015 - 3:58 pm UTC

Tom,

Thank you for your prompt reply.

For char_desc column, do we specify char(1600) or char(400) in sqlldr control file?

Based on your response, thinking that it is data_length in bytes, which is char(1600). But, just wanted to confirm.

Thanks,
Moorthy.
Chris Saxon
October 05, 2015 - 11:53 pm UTC

yes, bytes
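So for the TEST table above, a sketch of the corresponding control-file field entries (sizes taken from the DATA_LENGTH column of the query output) would be:

```
(
DESC_BYTE CHAR(400),    -- VARCHAR2(400 BYTE): DATA_LENGTH = 400
CHAR_DESC CHAR(1600)    -- VARCHAR2(400 CHAR): DATA_LENGTH = 1600
)
```

In other words, size the SQL*Loader field from DATA_LENGTH, not CHAR_LENGTH.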

Quite a Useful Tip

Amish Parikh, October 13, 2015 - 12:22 pm UTC

Being new to data migration, such tips are real effort and time savers. I had spent a couple of hours on this same problem.

This is exactly what I needed

Anne, December 11, 2015 - 4:20 pm UTC

Thanks so much for the 'tip' about specifying the char(1000) in the .ctl file. This solved my problem.

This article is STILL helping people!

Cathy Joyner, August 18, 2016 - 8:26 pm UTC

It's remarkable... this issue was originally posted in 2001. I've spent the last two days fighting with this problem and finally resorted to Ask Tom. It fixed my problem in 5 minutes. WHY did I not start here? I should know better. At any rate, fifteen years after the original post, I wanted to say THANK YOU! You're STILL helping people every day with your excellent information!! :)
Connor McDonald
August 18, 2016 - 11:11 pm UTC

Thank you for taking the time to give us feedback

great catch

Gaylon, September 29, 2016 - 1:02 pm UTC

Never knew about the 255 char default. Handy to know...

A reader, January 17, 2018 - 3:44 pm UTC


TOM is always rocking

Shan, January 29, 2018 - 5:55 pm UTC

Your solutions really help me resolve my issues.
Connor McDonald
January 30, 2018 - 2:35 am UTC

glad we could help

Unknown length of CLOB data

A reader, April 24, 2018 - 12:37 pm UTC

What if I don't know how big my CLOB data will be?

What syntax do I use to SQL*Load in an unknown length of character data terminated by ',' into a CLOB? Not a BFILE, but stored in the input data file.
Connor McDonald
April 26, 2018 - 10:48 am UTC

You would need to choose an appropriate upper bound.
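A sketch of what that might look like (the table, column, and file names and the 1000000 bound are all hypothetical; pick a bound larger than any value you expect):

```
LOAD DATA
INFILE 'data.csv'
APPEND INTO TABLE my_table
FIELDS TERMINATED BY ','
TRAILING NULLCOLS
(
id,
big_text CHAR(1000000)  -- CLOB column; a generous upper bound
)
```

A record whose field exceeds the bound is still rejected, so err on the large side.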

related q

Colin de Silva, June 29, 2018 - 9:15 am UTC

Hi there,
I have a related question.
Is there an option for sqlldr to truncate a field if it is outside the bounds specified, rather than abort the record (and abandon the job if aborted records > x)?

cheers,
Colin.
Chris Saxon
July 05, 2018 - 4:23 pm UTC

You can call functions to assign values to the columns. So you can substr the input to limit it:

SQL> create table t (
  2    id int,
  3    c1 varchar2(2)
  4  );

Table T created.

SQL>
SQL> ho type sqlldr.ctl
load data
infile *
into table t
fields terminated by ","
 ( id, c1 "substr(:c1, 1, 2)" )
begindata
1,x
2,xx
3,xxx
4,xxxx

SQL>
SQL> ho sqlldr userid=chris/chris@db

SQL*Loader: Release 12.2.0.1.0 - Production on Thu Jul 5 17:21:56 2018

Copyright (c) 1982, 2017, Oracle and/or its affiliates.  All rights reserved.

Path used:      Conventional
Commit point reached - logical record count 4

Table T:
  4 Rows successfully loaded.

Check the log file:
  sqlldr.log
for more information about the load.

SQL>
SQL> select * from t;
  ID C1
   1 x
   2 xx
   3 xx
   4 xx

Substr not worked in ctl file

Mahesh Rai, October 05, 2019 - 8:05 am UTC

Char(10000) "substr(:col1,1,4000)"

This is not working to cut 4k from 10k length string.
Connor McDonald
October 07, 2019 - 12:56 am UTC

Wow... one star. No test case, no error messages, just a one line saying "not worked", and what's more, I can prove you didn't test it, because it works!!!!

--
-- test.dat
--
1,my test remarks column sdkfjasdfkjsdfkjs....9000 more chars

--
-- table
--
SQL> create table t ( pk int, x varchar2(4000));

Table created.

--
-- control file
--
LOAD DATA
INFILE "c:\temp\test.dat"
replace into table T
FIELDS TERMINATED BY ','
(
pk,
x CHAR(12000) "substr(:x, 1, 4000)"
)

--
-- execution
--
X:\>sqlldr control=c:\temp\test.ctl userid=/@db19_pdb1

SQL*Loader: Release 19.0.0.0.0 - Production on Mon Oct 7 08:53:30 2019
Version 19.3.0.0.0

Copyright (c) 1982, 2019, Oracle and/or its affiliates.  All rights reserved.

Path used:      Conventional
Commit point reached - logical record count 1

Table T:
  1 Row successfully loaded.

Check the log file:
  test.log
for more information about the load.

--
-- result
--
SQL> select length(x) from t;

 LENGTH(X)
----------
      4000


A reader, December 04, 2020 - 6:13 am UTC


LOVED IT

Hassan Shahzad, April 22, 2021 - 6:50 pm UTC

Thank you so much. It did solve my problem.
The command i wrote in my control file was:

LOAD DATA
INFILE *
INTO TABLE dummy3
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
cast char(4000),
country,
genre,
description char(4000)
)
Connor McDonald
April 27, 2021 - 5:56 am UTC

Glad it worked out

ORA-01461: can bind a LONG value only for insert into a LONG column even with CLOB and FILLER

Shobhit Mittal, September 13, 2021 - 3:34 pm UTC

First of all, just an update that the code posted on 'October 07, 2019 - 12:56 am UTC' does not work for me, as it throws the error ORA-01461: can bind a LONG value only for insert into a LONG column.

I started working on fixing this issue using CLOB and FILLER to skip loading the data in column.

I am working on a CTL file where one of the columns can have more than 16,000 characters, which we want to skip loading into the table.

Table: XXAP_CHR_EXP_LINE has column XXAP_CHR_EXP_LINE as CLOB

CTL: XXAP_CHR_EXP_LINE FILLER CHAR(20000) OPTIONALLY ENCLOSED BY '"'
which is interpreted as:
EXPENSE_LINE_ITEM_GUESTS NEXT 20000 | O(") CHARACTER
(FILLER FIELD)

When we run this with data where that column is around 15,000 chars, it still fails with 'ORA-01461: can bind a LONG value only for insert into a LONG column'.

I also checked with 10,000 chars and it works fine. Not sure why it's not working with CLOB. Even if I increase the CHAR length in the CTL beyond 20000, say to 50000, it still doesn't work.

Regards,
Shobhit
Chris Saxon
September 14, 2021 - 1:04 pm UTC

Sorry, I'm unable to reproduce this.

Please provide a complete test case (create table + sqlldr control file) showing how you hit this problem

Great

Tiru, March 23, 2023 - 12:06 pm UTC

Thank you Tom, it helped me a lot.

Trying to import data into a dev database through SQL Developer, I got the below error.

Lakshmi, September 01, 2023 - 7:35 am UTC

File C:\Users\Rsurasi\Desktop\WC_MP_ELIG_HB_PP3.xlsx cannot be opened due to the following error: Tried to allocate an array of length 173,101,780, but the maximum length for this record type is 100,000,000. If the file is not corrupt and not large, please open an issue on bugzilla to request increasing the maximum allowable size for this record type. You can set a higher override value with IOUtils.setByteArrayMaxOverride().

Sql Developer version is 3.2.20.09.87

Please tell me the solution for this. Do I need to increase the JVM heap size in sqldeveloper.conf, or should I try another format such as CSV? Will this fix the issue?
Chris Saxon
September 01, 2023 - 12:56 pm UTC

Bugzilla?! If you spot issues then you need to raise them with Oracle support: https://support.oracle.com

I'm unclear what you're trying to do, so not sure if there's a workaround here.