automate the creation of external tables
Question and Answer

Tom Kyte

Thanks for the question, eashwer.

Asked: April 23, 2006 - 11:49 am UTC

Last updated: April 14, 2009 - 10:01 am UTC

Version: 9i

Viewed 1000+ times

You Asked

In the past, I've used Unix shell scripts to perform
the following in sequence:

1. Browse through a Unix directory for a listing of files matching
a pattern.
2. In a loop, supply each matching file name to a shell script,
which calls SQL*Loader (with a control file appropriate for the filename pattern) and loads the file into a destination
table.


I wish to use external tables, but I don't know whether I'd
be able to automate the above steps within the database,
all within the confines of a package/stored procedure.


Can external tables be supplied a filename from a host OS directory
location, a dynamic field list (based on the layout of the file being read),
a table name based on the pattern, etc.? Sample code would help.



and Tom said...

Reading the directory is the "hard part", the rest is easy.

http://asktom.oracle.com/pls/asktom/f?p=100:11:::::P11_QUESTION_ID:439619916584

can read the directory. Then ALTER TABLE can be used to change the file that the external table points to. You'd have to pick the right table to use, then use dynamic SQL to alter it and dynamic SQL to load from it (since you don't want the static references, and the corresponding invalidations, that would happen otherwise):

alter table ET location ('FILENAME');

so, just like you have N control files right now - you would have N external tables (they are your control files).

just like you pass in the data=filename to sqlldr, you would alter table to tell it where to read from.
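Putting those pieces together, a minimal PL/SQL sketch might look like the following. It assumes an existing external table ET_EMP, a target table EMP with a matching column list, and a hypothetical get_dir_list pipelined function (e.g. built with the Java technique from the linked thread) that returns filenames in a directory; all of these names are illustrative, not from the original answer.

```sql
create or replace procedure load_matching_files( p_pattern in varchar2 )
as
begin
    for f in ( select filename
                 from table( get_dir_list( 'MY_DIR' ) )  -- hypothetical helper
                where filename like p_pattern )
    loop
        -- repoint the external table at the next matching file
        execute immediate
            'alter table et_emp location ( ''' || f.filename || ''' )';

        -- load it; dynamic SQL avoids static references and the
        -- invalidations Tom mentions
        execute immediate 'insert into emp select * from et_emp';

        commit;
    end loop;
end;
/
```

Each external table here plays the role of one control file in the old SQL*Loader setup, and the ALTER TABLE ... LOCATION replaces the data=filename argument.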




Comments

External table for directory

Georg, April 24, 2006 - 4:42 pm UTC

Maybe you can create one additional file containing the names of the data files, and set up an external table on this "dir file". Then you have access to all the filenames from SQL.
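A sketch of this idea, assuming a file filelist.txt in the MY_DIR directory object that lists one data-file name per line (the directory, file, and table names are illustrative):

```sql
create table dir_file
( filename varchar2(255) )
organization external
( type oracle_loader
  default directory my_dir
  access parameters
  ( records delimited by newline
    fields ( filename char(255) ) )
  location ( 'filelist.txt' )
);

-- every name in the listing file is now queryable from SQL
select filename from dir_file;
```

The listing file itself still has to be produced outside the database (e.g. by a cron job running ls), but the loop over filenames can then live entirely in PL/SQL.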

Move loaded files to processed folder

Manuel Vidigal, April 13, 2009 - 7:31 am UTC

Hi Tom,

We use Perl scripts to do the steps above, only because we have a step 3: after processing the files in the database, if everything goes smoothly, we need to move the processed files to another directory so they don't get processed twice.

What do you think is the best and safest approach?

1 - Keep using the perl script

2 - Use a Java stored procedure to issue an OS command to move the file?

3 - Is there a better way to do this?

I'm asking you this, because I've been doing some tests with the java stored procedure, and it won't give me any error when there's already a file with the same name on the processed folder.

Thanks in advance.
Tom Kyte
April 13, 2009 - 5:29 pm UTC

how about (3)

you are in a database
databases remember things
remember that you processed this file, so as to not process it again?


... I'm asking you this, because I've been doing some tests with the java stored procedure, and it won't give me any error when there's already a file with the same name on the processed folder. ...

then you have a bug in your Java stored procedure; Java would return an error if you call its API to move a file. Or are you just running an OS command (for which it is really hard to figure out "did it work or not")?

Processed folder

Manuel Vidigal, April 13, 2009 - 6:35 pm UTC

Yes, I was testing the OS command. Although I don't know Java, I will try to find an example of a Java stored procedure that uses the API to move a file.

Another reason for the processed folder is that every other day we need to clean out the processed files.

If you really needed to move the files to another folder, would you do it with the Java API?
Tom Kyte
April 14, 2009 - 10:01 am UTC

but, like I said, you have a database

a database remembers things

utl_file can remove a file.


therefore, you could remember what you have processed IN THE DATABASE (and not process it again), and the database could clean out the processed files (dbms_job, dbms_scheduler, whatever, to get a job to run; determine what has been processed using the DATABASE; clean it out)


I don't think you need to move anything. Moving makes it all less transactional (the file system will NOT roll back).

I think you want to

a) process input
b) remember that you processed input
c) commit;


all nice and transactional - no mess, no muss, no fuss. And no code external from the database itself.
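The process/remember/commit pattern Tom describes might be sketched like this, with a cleanup job using UTL_FILE.FREMOVE to delete old files. The table, directory, and procedure names are illustrative, and ET is assumed to be an existing external table loading into EMP:

```sql
create table processed_files
( filename      varchar2(255) primary key,
  processed_on  date default sysdate );

-- a) process input, b) remember it, c) commit -- one transaction
create or replace procedure process_file( p_filename in varchar2 )
as
begin
    execute immediate
        'alter table et location ( ''' || p_filename || ''' )';
    execute immediate 'insert into emp select * from et';

    -- the primary key also guards against processing a file twice
    insert into processed_files ( filename ) values ( p_filename );

    commit;  -- the load and the "remember" succeed or fail together
end;
/

-- cleanup job body (run via dbms_scheduler): remove files the
-- DATABASE says were processed more than two days ago
begin
    for f in ( select filename from processed_files
                where processed_on < sysdate - 2 )
    loop
        utl_file.fremove( 'MY_DIR', f.filename );
        delete from processed_files where filename = f.filename;
    end loop;
    commit;
end;
/
```

Because the "have I seen this file?" state lives in a table rather than in the file system, a failed load rolls back both the data and the bookkeeping together.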

Thanx for your time

Manuel Vidigal, April 14, 2009 - 10:50 am UTC

