Datapump (Export the live database & import to remote host, with no data loss)
Question and Answer

Connor McDonald

Thanks for the question, Ess.

Asked: July 26, 2018 - 7:36 pm UTC

Last updated: August 20, 2018 - 3:59 am UTC

Version: 12.1.0.2.0

You Asked

Hello Community,

I would appreciate it if anyone could provide high-level steps for the following:

I have a requirement to export a whole or partial database (live, archive-log enabled, OLTP) while the database remains online and available, and import it to a remote host, ensuring there is no downtime or interruption on the live (source) database and no data loss on the target database.

This is for Oracle 12.1.0.2.0 on non-Windows platforms (SunOS, Linux, AIX).

and Connor said...

It depends on what you mean by "no data loss", because the moment you take a copy it is a point-in-time copy of the data.

Anyway, there are some options you might want to consider:

1) DataGuard (whole database only)

You create a standby (the target) and then sever the link between the primary and the target, leaving the target as a standalone, byte-for-byte replica.

https://docs.oracle.com/en/database/oracle/oracle-database/12.2/sbydb/creating-oracle-data-guard-physical-standby.html#GUID-B511FB6E-E3E7-436D-94B5-071C37550170
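One common way to build the standby is RMAN active duplication. A minimal sketch, assuming an identical directory layout on both hosts and hypothetical Oracle Net aliases "prod" and "stby" (requires a live Oracle environment to run):

```sql
-- Run RMAN from the standby host, connected to both instances:
--   rman target sys@prod auxiliary sys@stby

DUPLICATE TARGET DATABASE
  FOR STANDBY
  FROM ACTIVE DATABASE
  DORECOVER
  NOFILENAMECHECK;

-- Later, to sever the link and leave the target standalone
-- (any redo not yet applied is lost at this point):
ALTER DATABASE ACTIVATE PHYSICAL STANDBY DATABASE;
ALTER DATABASE OPEN;
```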

2) Transportable tablespace/database

You export metadata about the tablespaces, then copy the tablespace datafiles to the target and plug them in.

whole database

https://docs.oracle.com/database/121/ADMIN/transport.htm#ADMIN13724

or partial

https://docs.oracle.com/database/121/ADMIN/transport.htm#ADMIN11394
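Note that the tablespaces being transported must be read-only while the metadata export and datafile copy run, so writers to those tablespaces are briefly interrupted. A sketch of the partial (tablespace-level) flow, with a hypothetical tablespace USERS_TS and datafile path:

```sql
-- On the source: check the set is self-contained, then freeze it
EXEC DBMS_TTS.TRANSPORT_SET_CHECK('USERS_TS', TRUE);
SELECT * FROM transport_set_violations;
ALTER TABLESPACE users_ts READ ONLY;

-- Export just the metadata (run from the OS shell):
--   expdp system/password directory=MYDIR dumpfile=tts.dmp
--         transport_tablespaces=USERS_TS logfile=tts.log

-- Copy tts.dmp plus the tablespace's datafiles to the target, then:
--   impdp system/password directory=MYDIR dumpfile=tts.dmp
--         transport_datafiles='/u01/oradata/users_ts01.dbf'

-- Back on the source, resume normal service:
ALTER TABLESPACE users_ts READ WRITE;
```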

3) Pluggable database

Use a clone command to copy a pluggable database to a destination, although in 12.1 this requires the source PDB to be opened read-only, i.e. an interruption to the primary. This is much better in 12.2, where hot cloning lets the source stay read-write.
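A sketch of the 12.1 flow, with hypothetical names (PDB "pdb1" on the source CDB; a database link "clone_link" from the target CDB back to the source):

```sql
-- On the source CDB: the PDB must be read-only in 12.1
ALTER PLUGGABLE DATABASE pdb1 CLOSE IMMEDIATE;
ALTER PLUGGABLE DATABASE pdb1 OPEN READ ONLY;

-- On the target CDB: clone it across the link
CREATE PLUGGABLE DATABASE pdb1_copy FROM pdb1@clone_link;
ALTER PLUGGABLE DATABASE pdb1_copy OPEN;

-- On the source CDB: restore read-write service
ALTER PLUGGABLE DATABASE pdb1 CLOSE IMMEDIATE;
ALTER PLUGGABLE DATABASE pdb1 OPEN;
```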

4) Datapump

Various modes to choose from

Full

expdp system/password full=Y directory=MYDIR dumpfile=full.dmp logfile=full.log
impdp system/password full=Y directory=MYDIR dumpfile=full.dmp logfile=impfull.log

Schema level

expdp system/password schemas=HR directory=MYDIR dumpfile=hr.dmp logfile=hr.log
impdp system/password schemas=HR directory=MYDIR dumpfile=hr.dmp logfile=imphr.log

Adding a "consistent=y" parameter to the export ensures all the data is exported as at a common point in time. (This is legacy syntax; Data Pump translates it internally to "flashback_time=systimestamp".)
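The same point-in-time consistency can be requested with the native Data Pump parameter directly, e.g. in a parameter file (file name hypothetical):

```
# hr_consistent.par
schemas=HR
directory=MYDIR
dumpfile=hr.dmp
logfile=hr.log
flashback_time=systimestamp
```

Run with: expdp system/password parfile=hr_consistent.par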



Comments

Great help. Appreciate the effort, Mac

Ess Kay, July 27, 2018 - 1:08 pm UTC

Thanks much, Mac, for the response.
This is a follow-up question; I had no other way to ask it.
This data movement is part of implementing TDE on Oracle 12.1. Since the client wants to avoid downtime, or keep it to a minimum, I have to find a solution to encrypt the data (tablespaces) accordingly.
Appreciate your expert input.
Connor McDonald
July 28, 2018 - 4:24 am UTC

For me, I'd move to 12.2, where the encryption can be done online with no outage.


Ess Kay, July 30, 2018 - 2:57 pm UTC

The client is not planning to move to 12.2.
Any other suggestions?
Connor McDonald
August 01, 2018 - 12:35 pm UTC

You could look at DBMS_REDEFINITION, but that works on a table-by-table basis.
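A minimal online-redefinition sketch for one table (schema, table, interim-table, and tablespace names are all hypothetical; the interim table sits in an encrypted tablespace):

```sql
-- Verify the table can be redefined using its primary key
EXEC DBMS_REDEFINITION.CAN_REDEF_TABLE('HR', 'EMPLOYEES', DBMS_REDEFINITION.CONS_USE_PK);

-- Interim table with the same shape, in an encrypted tablespace
CREATE TABLE hr.employees_int TABLESPACE enc_ts
  AS SELECT * FROM hr.employees WHERE 1 = 0;

DECLARE
  num_errors PLS_INTEGER;
BEGIN
  DBMS_REDEFINITION.START_REDEF_TABLE('HR', 'EMPLOYEES', 'EMPLOYEES_INT');
  DBMS_REDEFINITION.COPY_TABLE_DEPENDENTS('HR', 'EMPLOYEES', 'EMPLOYEES_INT',
      DBMS_REDEFINITION.CONS_ORIG_PARAMS, TRUE, TRUE, TRUE, TRUE, num_errors);
  DBMS_REDEFINITION.SYNC_INTERIM_TABLE('HR', 'EMPLOYEES', 'EMPLOYEES_INT');
  DBMS_REDEFINITION.FINISH_REDEF_TABLE('HR', 'EMPLOYEES', 'EMPLOYEES_INT');
END;
/
```

After FINISH_REDEF_TABLE the names are swapped, and the interim table (which then holds the old, unencrypted data) can be dropped.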

Will REDEFINITION be ONLINE?

Ess Kay, August 02, 2018 - 8:44 pm UTC

Will the database be 100% available during REDEFINITION operation?

Connor McDonald
August 20, 2018 - 3:59 am UTC

"Almost".

We take a lock at the commencement and the conclusion of the operation, but not in between, which is when all the data movement occurs.

More to Explore

Data Pump

All of the database utilities including Data Pump are explained in the Utilities guide.