And how from an OCI app
Dieter Oberkofler, January 08, 2009 - 1:07 am UTC
... and how would you recommend doing this from an OCI based app?
January 08, 2009 - 9:11 am UTC
OCI would be running on the client machine as a client server application I presume.
so, it would just log into the database and insert the data.
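For illustration only - a minimal staging table such a client-server app could insert into; the table and column names here are hypothetical:

```sql
-- Hypothetical staging table; the OCI client would bind the file
-- contents to the PAYLOAD column and insert one row per uploaded file.
create table uploads
( fname    varchar2(255),
  uploaded date default sysdate,
  payload  blob
);
```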
Amy Ho, January 13, 2009 - 3:08 pm UTC
Thank you for the information about Apex. I am wondering if there is an example program using OCI?
January 16, 2009 - 4:05 pm UTC
if your dba installed them, there are a ton of demo oci programs, some that do LOB operations, on the server.
uploading files using sql*net API
Hadar Paiss, January 16, 2009 - 4:27 pm UTC
Another way to copy files from client to server is to use the Oracle*Net API.
Oracle itself uses it in Oracle Applications.
There used to be a sample of an ftp-like client and server. The client uses the TNS API and the server is started by the TNS listener (a bit like an external procedure).
The beauty of this is that it is independent of platform and networking protocol.
I used it with PCs, VAX/VMS and Unix over DECnet and TCP without changing my code.
If you are using APEX or the like, you can use the 'curl' program, available for *nix and Windows, to POST the file to the server easily.
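A sketch of such a curl invocation - the host, port, path, credentials and form-field name are all placeholders for whatever your upload page actually expects:

```shell
# Placeholder endpoint and credentials - substitute your own upload URL.
# -F sends a multipart/form-data POST with the file attached.
curl -F "file=@data.csv" \
     -u upload_user:secret \
     http://appserver:8080/upload
```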
January 16, 2009 - 6:24 pm UTC
that doesn't really exist anymore...
and it wouldn't really get it into the database - you'd have to write a server yourself to upload to and... well, it would be really complicated.
can we do reverse meaning from server to client using plsql
sarma, July 31, 2009 - 11:48 am UTC
Is there a way to read a file from server and write it to local machine using pl/sql?
August 04, 2009 - 12:25 pm UTC
no. plsql runs in the server (forget about plsql in Forms for a minute - that would be something different - but even then it would be Forms calling a plsql stored procedure to access a file, and then Forms running some plsql in the Forms server to process the data).
the server, rightly so, cannot write to the client file system. It would be very virus like if it could.
The client would have to officially export their file system, permit the server to mount it, and the server would have to be able to see that file system as if it owned it.
client based data import operations
Dieter, August 19, 2011 - 11:24 am UTC
I was wondering what you think of the following use case.
We have a lot of data upload interfaces that basically consist of a thin client layer (currently in C++ using OCI) that imports the physical file (TXT, CSV, XML, etc.) and then inserts the contents into an Oracle table for further processing in PL/SQL.
I would prefer to take advantage of external tables, but this would force me to still upload the file from the client filesystem to the server filesystem, where it can be accessed using an Oracle directory.
Following your advice on how to upload, this would involve the following steps:
1) Use a C++/OCI client application to read the file contents on the client filesystem and insert it into a BLOB.
2) On the server, some PL/SQL code would read the BLOB and write it to the filesystem on the server.
3) Now the physical file on the server can be processed using an external table.
It seems kind of complicated and inefficient, and I wanted to hear your opinion on this approach or any better alternatives for client-based data import operations.
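A sketch of step 2, assuming a staging table UPLOADS(fname, payload) and a server directory object IMP_DIR - both names are hypothetical:

```sql
declare
  l_blob   blob;
  l_file   utl_file.file_type;
  l_chunk  pls_integer := 32767;
  l_pos    pls_integer := 1;
  l_len    pls_integer;
begin
  select payload into l_blob from uploads where fname = 'data.csv';

  l_len  := dbms_lob.getlength( l_blob );
  l_file := utl_file.fopen( 'IMP_DIR', 'data.csv', 'wb', 32767 );
  while l_pos <= l_len
  loop
    -- write the blob out to the server filesystem in 32k pieces
    utl_file.put_raw( l_file, dbms_lob.substr( l_blob, l_chunk, l_pos ), true );
    l_pos := l_pos + l_chunk;
  end loop;
  utl_file.fclose( l_file );
end;
/
```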
August 30, 2011 - 1:08 pm UTC
why not just have a fileserver somewhere that is accessible to the database machine AND to this client machine.
then it becomes a simple file copy.
Dieter Oberkofler, September 02, 2011 - 12:29 am UTC
Accessing a file server from heterogeneous clients always creates problems because it relies on the individual configuration of each client.
September 02, 2011 - 8:35 am UTC
well, that was one reason.
but anyway, upload the file to a webserver. the clients don't have to have access to a file server - you just need to get the file up there - just upload to some application server you probably already have in place somewhere.
How about DBFS?
Micheal Kutz, September 02, 2011 - 10:08 am UTC
Since I am required to archive any data file I parse, I'm looking at using DBFS (new in 11.2).
From what I can see, it turns a database table into a filesystem that can be mounted on Linux machines and accessed with an FTP-like client (dbfs_client) on other OSs.
I'm on Linux therefore I can mount the table as a filesystem, thus skipping Dieter's "step 2".
Any thoughts/comments on this solution?
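A sketch of the mount step - the user, service name and mount point are placeholders, and the wallet/password setup is omitted:

```shell
# mount the DBFS store as a filesystem (placeholder names)
dbfs_client dbfs_user@ORCL /mnt/dbfs

# the uploaded file then appears as an ordinary file
ls /mnt/dbfs/staging
```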
September 02, 2011 - 11:11 am UTC
yes that would work as well - but it would require a custom application to be written, placed and configured (software installed and maintained) on the clients - which is something easily avoided with...
a single application server
a file upload screen
then you could upload into the database - into the application server - whatever and use an external table on it.
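For instance, a sketch of an external table over an uploaded CSV file - the directory, file and column names are placeholders:

```sql
-- imp_dir is a hypothetical directory object pointing at the
-- location where the uploaded file lands.
create table ext_upload
( id    number,
  name  varchar2(100)
)
organization external
( type oracle_loader
  default directory imp_dir
  access parameters
  ( records delimited by newline
    fields terminated by ','
  )
  location ( 'data.csv' )
)
reject limit unlimited;
```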
Loading File to Server from Client
Shan, April 27, 2012 - 1:08 pm UTC
Loading the file to a table and then reading that file using a PL/SQL block and putting it on the Oracle server. Will this work?
April 28, 2012 - 8:09 am UTC
PL/SQL can only read files that are available on a file system available to the server.
If you could not use notepad/vi on the file on the server, plsql will not be able to see it either. PL/SQL runs in the server, on the server.
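A minimal sketch of what PL/SQL can do - open a file through a directory object that maps to a path on the database server; the directory and file names are placeholders:

```sql
declare
  l_file utl_file.file_type;
  l_line varchar2(32767);
begin
  -- IMP_DIR must be a directory object pointing at a server-side path
  l_file := utl_file.fopen( 'IMP_DIR', 'data.csv', 'r' );
  utl_file.get_line( l_file, l_line );
  dbms_output.put_line( l_line );
  utl_file.fclose( l_file );
end;
/
```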
Re: client based data import operations
Narendra, July 26, 2021 - 10:30 am UTC
My question is similar to that of what Dieter asked in 2011 but in the context of Cloud. In a Cloud (at Customer) environment, we have been told that we will not have access to database server operating system/file systems.
In such environment, what is the recommendation to bulk load data into a database, and more specifically to be able to use external tables?