Our next objective is to migrate a further set of selected schemas from an Oracle 10g database (character set WE8ISO8859P1, BYTE semantics) to an Oracle 19c database (character set AL32UTF8, CHAR semantics).
We are following the guidance in Doc ID 260192.1, "Changing the NLS_CHARACTERSET to AL32UTF8 / UTF8 (Unicode) in 8i, 9i, 10g and 11g", which states at the end of section 5b:
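(For context on why the pre-created tables need CHAR semantics or wider BYTE columns: characters that occupy 1 byte in WE8ISO8859P1 can occupy 2 bytes in AL32UTF8, so data that fit a VARCHAR2(n BYTE) column on 10g can overflow the same definition on 19c. An illustrative query, run on the AL32UTF8 side:

```sql
-- 'é' is 1 byte in WE8ISO8859P1 but 2 bytes in AL32UTF8, so five
-- accented characters need 10 bytes and overflow VARCHAR2(5 BYTE).
-- In an AL32UTF8 database this returns chars = 5, bytes = 10.
SELECT LENGTH('ééééé')  AS chars,
       LENGTHB('ééééé') AS bytes
  FROM dual;
-- VARCHAR2(5 CHAR) allows 5 characters regardless of their byte length,
-- which is why the note recommends CHAR semantics or larger BYTE sizes.
```

)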
"
When using full export/import the tables are pre-created in the new database using CHAR semantics OR larger
column sizes in BYTES before importing:

conn / as sysdba
spool scott_tabledef.sql
set echo off heading off feedback off verify off pagesize 0 linesize 80 long 9999
EXECUTE DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM,'PRETTY',true);
EXECUTE DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM,'SQLTERMINATOR',true);
SELECT DBMS_METADATA.GET_DDL ('TABLE', TABLE_NAME, OWNER) FROM dba_tables WHERE OWNER in
('SCOTT') order by OWNER , TABLE_NAME ;
EXECUTE DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM,'DEFAULT');
"
Section 12 then goes on to say:
"
Use the IGNORE=Y parameter for imp or the TABLE_EXISTS_ACTION=TRUNCATE option for impdp to import the data
into the pre-created tables."
I have tried applying the above a number of times and cannot get it to work with the schemas targeted for migration.
Using the TABLE_EXISTS_ACTION=TRUNCATE parameter as advised, impdp generates the following message for each table of the import:
"
<schema>.<table_name> exists and has been truncated. Data will be loaded but all dependent metadata will be skipped due to table_exists_action of truncate
"
This implies that either:
1) impdp attempts to load the data with the constraints (including reference/foreign key constraints) and the indexes already in place,
or
2) the constraints and indexes must be applied manually afterwards.
The first option is problematic: the tables would have to be loaded in a precise order to avoid violating the foreign keys, and the data would be loaded with all indexes pre-created.
The second option would mean further calls to DBMS_METADATA to separately generate SQL definition files for the indexes and constraints.
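(For completeness, the extra DBMS_METADATA calls that option 2 would require might look like the sketch below. This is illustrative only: in practice you would also want to filter out system-generated NOT NULL check constraints and the indexes Oracle creates implicitly for primary/unique key constraints, otherwise the spooled DDL contains duplicates.

```sql
-- Sketch: generate index and constraint DDL separately from the tables.
-- $db_user is the same shell variable used elsewhere in this post.
SELECT DBMS_METADATA.GET_DDL('INDEX', index_name, owner)
  FROM dba_indexes
 WHERE owner = upper('$db_user');

SELECT DBMS_METADATA.GET_DDL('CONSTRAINT', constraint_name, owner)
  FROM dba_constraints
 WHERE owner = upper('$db_user')
   AND constraint_type IN ('P','U','C');

SELECT DBMS_METADATA.GET_DDL('REF_CONSTRAINT', constraint_name, owner)
  FROM dba_constraints
 WHERE owner = upper('$db_user')
   AND constraint_type = 'R';
```

)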
The following has been tried against our source (10g) and target (19c) databases and would appear to be a cleaner option:
i) Generate the table definitions on the source (10g) without any constraints and without any ref constraints:
set echo off heading off feedback off verify off pagesize 0 linesize 32000 trims on long 10000000 longchunksize 32000
spool tabledef_$db_user.sql
select 'alter session set NLS_LENGTH_SEMANTICS=CHAR;' from dual; -- include this line if changing to CHAR semantics
EXECUTE DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM,'PRETTY',true);
EXECUTE DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM,'SQLTERMINATOR',true);
EXECUTE DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM,'CONSTRAINTS',false);
EXECUTE DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM,'REF_CONSTRAINTS',false);
SELECT DBMS_METADATA.GET_DDL ('TABLE', TABLE_NAME, OWNER) FROM dba_tables WHERE OWNER = upper('$db_user') order by TABLE_NAME ;
EXECUTE DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM,'DEFAULT');
(In the above, $db_user refers to a Solaris environment variable containing the schema name.)
ii) Create the tables on the 19c database using the output spooled from Step i) above.
iii) Run impdp on 19c with CONTENT=DATA_ONLY (to import the table rows only).
iv) Run impdp on 19c (again) with CONTENT=METADATA_ONLY EXCLUDE=TABLE/TABLE to import all other metadata apart from the table definitions applied in Step ii).
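For reference, the two impdp runs in Steps iii) and iv) would be driven by parameter files along these lines. The directory object, dump file name and schema name below are placeholders for our actual values:

```
# step_iii.par -- rows only, into the tables pre-created in Step ii)
DIRECTORY=DATA_PUMP_DIR
DUMPFILE=schema_export.dmp
SCHEMAS=SCOTT
CONTENT=DATA_ONLY

# step_iv.par -- remaining metadata, skipping the table DDL from Step ii)
DIRECTORY=DATA_PUMP_DIR
DUMPFILE=schema_export.dmp
SCHEMAS=SCOTT
CONTENT=METADATA_ONLY
EXCLUDE=TABLE/TABLE
```
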
Can you see any issues in proceeding in this manner? If I have missed or misinterpreted anything in note 260192.1, please advise.
Thanks.