Hi Tom,
I have multiple CSV files in a directory, and the directory keeps getting new CSV files added to it. I need to load all of the CSV files into one table with SQL*Loader. All the files have the same columns, only different data.
This is how my control file looks:
load data
infile 'test.csv' "str '\r\n'"
append
into table TABLE1
fields terminated by ','
OPTIONALLY ENCLOSED BY '"' AND '"'
trailing nullcols
( COLUMN1 CHAR(4000),
COLUMN2 CHAR(4000),
COLUMN3 CHAR(4000)
)
And here is my batch file (I've used generic names for demonstration purposes):
@echo off
sqlldr 'username/pw@dbip' CONTROL=test.ctl LOG=C:\Users\Desktop\sqloader_files\test.log BAD=C:\Users\Desktop\sqloader_files\test.bad skip=1
pause
This works fine with one CSV file; however, I need to iterate through all the files and call SQL*Loader for each one. When I am finished, I just move all the CSV files to a different folder and then execute the batch file again when new files come along. Do you think this is a good solution?
I need to do this in Windows, but I am not familiar with any scripting language. Is it possible to do this with just a for loop in cmd, or do I need to write a script in PowerShell or some other Windows scripting language? I would appreciate any help.
Thanks in advance
You can specify multiple input files in your control file!
Provided they have the same record format, they can all go in the same table. You can do this by listing them out explicitly:
infile 'file1.txt'
infile 'file2.txt'
...
Or using wildcards (? for a single character, * for any number):
infile 'file?.txt'
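So in your case you shouldn't need a loop at all: point the infile at the whole directory with a wildcard. The following is only a sketch, assuming your CSV files sit in C:\Users\Desktop\sqloader_files (the folder your log and bad files already use) and all end in .csv:
load data
infile 'C:\Users\Desktop\sqloader_files\*.csv' "str '\r\n'"
append
into table TABLE1
fields terminated by ','
OPTIONALLY ENCLOSED BY '"' AND '"'
trailing nullcols
( COLUMN1 CHAR(4000),
COLUMN2 CHAR(4000),
COLUMN3 CHAR(4000)
)
One thing to double-check in the log is your skip=1: with several files feeding a single load it may only skip the very first record, so the header rows of the other files might need to be filtered out separately.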
Here's a complete example of the wildcard approach:
C:\temp>dir test*
Volume in drive C is System
Volume Serial Number is E6AB-58D8
Directory of C:\temp
12/07/2017 15:22 127 test.ctl
12/07/2017 15:27 14 test1.txt
12/07/2017 15:27 14 test2.txt
3 File(s) 155 bytes
0 Dir(s) 153,746,407,424 bytes free
C:\temp>type test*
test.ctl
load data
infile 'test*.txt'
truncate
into table t
fields terminated by ','
OPTIONALLY ENCLOSED BY '"' AND '"'
( x, y )
test1.txt
r1,c2
r2,c2
test2.txt
r3,c2
r4,c2
C:\temp>sqlldr control=test.ctl userid=chris/chris@db
SQL*Loader: Release 12.1.0.2.0 - Production on Wed Jul 12 15:29:08 2017
Copyright (c) 1982, 2014, Oracle and/or its affiliates. All rights reserved.
Path used: Conventional
Commit point reached - logical record count 2
Commit point reached - logical record count 4
Table T:
4 Rows successfully loaded.
Check the log file:
test.log
for more information about the load.
Or you could build a process to concatenate all the files into one:
C:\temp>type test1.txt test2.txt > big_test.txt
test1.txt
test2.txt
C:\temp>type big_test.txt
r1,c2
r2,c2
r3,c2
r4,c2
Then process that.
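If you go the concatenation route, you could wrap the whole thing in a batch file much like the one you already have. This is only a rough sketch: the big_test.txt name comes from the example above, the "loaded" archive folder is a name made up here, and it assumes test.ctl points its infile at big_test.txt:
@echo off
setlocal
rem Folder with the incoming CSV files (the same one your LOG/BAD files use)
set SRC=C:\Users\Desktop\sqloader_files
rem Made-up archive folder for CSVs that have already been loaded
set DONE=%SRC%\loaded
if not exist "%DONE%" mkdir "%DONE%"
cd /d "%SRC%"
rem Concatenate every CSV into one input file (the filenames are echoed to the
rem console, not into the output, as in the type example above)
type *.csv > big_test.txt
rem Load the combined file; test.ctl's infile should point at big_test.txt
sqlldr username/pw@dbip CONTROL=test.ctl LOG=test.log BAD=test.bad skip=1
rem Move the processed CSVs aside so the next run only sees new files
move *.csv "%DONE%" > nul
pause
endlocal
The same header caveat applies here: after concatenating, skip=1 only skips the first header row in the combined file, so if every CSV has a header you'd want to strip the rest before (or instead of) concatenating.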