DBMS_FILE_TRANSFER.PUT_FILE multiple "source_file_name"

Question and Answer

Connor McDonald

Thanks for the question, Saptadip.

Asked: June 12, 2018 - 8:06 pm UTC

Last updated: October 08, 2025 - 4:58 am UTC

Version: 11.2.0.4

Viewed 10K+ times

You Asked

Hi

I am using Data Pump to export a dump file from a database, and while exporting I split the dump into multiple files. Now I want to transfer those files to another server using DBMS_FILE_TRANSFER.PUT_FILE.

I know I can explicitly mention the file names in "source_file_name", but I don't want to do it that way because the file names can differ from run to run.

So I am wondering if there is a way to supply the source_file_name values without hardcoding them.

Any help would be appreciated.

Regards,
Saptadip

and Connor said...

Perhaps approach this a different way. You can use an external table definition to perform a directory listing of the file(s) in a particular folder.

There's a demo of that here:

http://www.oracle-developer.net/display.php?id=513
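
For reference, here's a minimal sketch of that technique, which relies on the external table preprocessor feature (available from 11.1.0.7 onwards). Every name below (dump_dir, list_files.sh, my_external_table) is hypothetical, and the shell script must be created on the database server by a DBA:

-- list_files.sh is an executable script in the same directory object,
-- containing something like:  /bin/ls -1 /u01/app/oracle/dpdump
create table my_external_table (
  file_name varchar2(255)
)
organization external (
  type oracle_loader
  default directory dump_dir
  access parameters (
    records delimited by newline
    preprocessor dump_dir:'list_files.sh'
  )
  location ('list_files.sh')  -- dummy location file; rows come from the script's output
);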

So then your process would be:

begin
  for i in ( select file_name from my_external_table )
  loop
    -- supply the directory objects and destination db link in place of the "..."
    DBMS_FILE_TRANSFER.PUT_FILE( ..., i.file_name, ... );
  end loop;
end;



Comments

dbms_file_transfer security

Narendra, October 01, 2025 - 6:18 am UTC

Hello Connor/Chris,

Would you know if I should be aware of any security concerns about granting access to DBMS_FILE_TRANSFER to normal local database users? I am considering a use case where local users in a PDB who can do Data Pump export and import can use DBMS_FILE_TRANSFER to copy dump files between different PDBs, which will allow them to:
1. Code the entire workflow (export-copy-import) from within the database
2. Control the file transfer, as it depends on the database link

I have to admit I am assuming that DBMS_FILE_TRANSFER allows copying of Data Pump dump files. The documentation appears to be vague on this.

Thanks in advance
Connor McDonald
October 02, 2025 - 1:31 am UTC

I am assuming that DBMS_FILE_TRANSFER allows copying of datapump dump files

This is supported.

Would you know if I should be aware of any security concerns about granting access

No more than normal in the sense that you want to tightly control what directories are allowed and what potential impacts you might have on the OS.

e.g. I would have a dedicated file system so that if someone runs amok and fills it with datapump files, then nothing else is going to be impacted.

If it's just export/import you want, check out network-style import, which may obviate the need to use dbms_file_transfer at all.
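
For example, a network-mode import pulls the data straight across a database link, so no dump file ever lands on disk (the connection and link names here are hypothetical):

impdp admin@target_pdb schemas=app network_link=source_pdb_link logfile=net_imp.log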

Re: dbms_file_transfer security

Narendra, October 02, 2025 - 12:40 pm UTC

Thanks Connor for quick response.
One of the requirements is to be able to retain a few historical dumps, so a network-link-based approach won't help.
If I am allowed to be more greedy, may I ask what your recommendation would be for implementing this automation of export-copy-import?
1. Handled completely from within database
1.1. Using Scheduler Chains?
1.2. Using AQ?
2. Coordinated from outside the database (using an external program like shell/java/python etc.)

Thanks in advance
Connor McDonald
October 03, 2025 - 1:28 am UTC

If I've got access to the OS, I've built solutions in the past using external tables, which in turn call a preprocessor script that does the actual work (using standard shell).

I'm not saying that is *the* best way to do it, but often you've already got a series of scripts written for DBA use that you'd like to expose in a controlled manner to non-DBAs, so it avoids re-inventing the wheel.

So you end up with something like

select * from copy_test_to_dev;

where "copy_test_to_dev" is an external table that simply calls a shell script, which (for example) does:

- expdp to unload test
- sqlplus to drop dev objects
- impdp to load the dump file
- emails an admin that this was done
- emails the dev team to advise when completed

etc etc
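
A minimal sketch of such a wrapper, assuming the preprocessor technique shown earlier (the script name, directory object, and table name are all hypothetical):

-- refresh_dev.sh is an executable script owned by the oracle OS user; it runs
-- expdp / sqlplus / impdp / mailx and echoes status lines, which become the rows.
create table copy_test_to_dev (
  status varchar2(4000)
)
organization external (
  type oracle_loader
  default directory script_dir
  access parameters (
    records delimited by newline
    preprocessor script_dir:'refresh_dev.sh'
  )
  location ('refresh_dev.sh')
);

Granting SELECT on this table is then the "controlled manner" of exposing the script to non-DBAs.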

Re: dbms_file_transfer security

Narendra, October 03, 2025 - 6:21 am UTC

Thanks Connor
In today's world where cloud is the norm, having access to the OS is almost out of the question. What would you recommend in a cloud setup where there is no access to the OS of the DB server?

Thanks in advance
Connor McDonald
October 06, 2025 - 4:07 am UTC

Easy way - have a small VM that connects to the database. It can sit waiting for requests, or run on a regular cron and check for requests.

Harder way - code it all up in PL/SQL.
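
For the PL/SQL route, the export leg can be driven entirely with DBMS_DATAPUMP; here's a minimal sketch (the schema, file names, and directory are hypothetical), with the copy leg then handled by DBMS_FILE_TRANSFER as above:

declare
  h         number;
  job_state varchar2(30);
begin
  h := dbms_datapump.open(operation => 'EXPORT', job_mode => 'SCHEMA');
  dbms_datapump.add_file(h, filename => 'app_%U.dmp', directory => 'DATA_PUMP_DIR');
  dbms_datapump.add_file(h, filename => 'app_exp.log', directory => 'DATA_PUMP_DIR',
                         filetype => dbms_datapump.ku$_file_type_log_file);
  dbms_datapump.metadata_filter(h, 'SCHEMA_EXPR', q'[in ('APP')]');
  dbms_datapump.start_job(h);
  dbms_datapump.wait_for_job(h, job_state);  -- blocks until the job completes
  dbms_output.put_line('job finished: ' || job_state);
end;
/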

To Narendra

J. Laurindo Chiappa, October 07, 2025 - 5:44 pm UTC

DBMS_FILE_TRANSFER was really designed with non-cloud environments in mind: for cloud databases, what about using an OCI Object Storage bucket and exporting/importing from/to it, like:

expdp system/password@//your-db-tnsnames-alias DIRECTORY=DATA_PUMP_DIR DUMPFILE='https://objectstorage.sa-saopaulo-1.oraclecloud.com/n/YOUR_NAMESPACE/b/YOUR_BUCKET_NAME/o/your_dumpfile_%U.dmp' CREDENTIAL='DEF_CRED_NAME' SCHEMAS=YOUR_SCHEMA PARALLEL=4

This "example" uses the expdp client, what could not be available in some Clouds, but you must bel able to do the same via PL/SQL , using DBMS_DATAPUMP....
Connor McDonald
October 08, 2025 - 4:58 am UTC

DBMS_FILE_TRANSFER was really designed with non-cloud environments in mind

Agreed - it was released way back in Oracle 10g.
