'BEFORE CREATE ON SCHEMA' trigger apparently not firing before Create Table

Connor McDonald

Thanks for the question, OracleLearner.

Asked: November 29, 2019 - 1:32 pm UTC

Last updated: May 12, 2020 - 10:13 am UTC

Version: 19.5.0.0.0

Viewed 1000+ times

You Asked

In an Oracle 8.1.7 instance set up with character set US7ASCII:

Connected to:
Oracle8i Enterprise Edition Release 8.1.7.4.0 - Production
With the Partitioning option
JServer Release 8.1.7.4.0 - Production

SQL> create table t1 (c1 varchar2(3));

Table created.

SQL> create table t2 (c1 varchar2(3));

Table created.

SQL> REM simulate a legacy application inserting 8-bit characters into these tables, i.e. characters outside the configured 7-bit character set

SQL> insert into t1 values(chr(128)||chr(129)||chr(130));

1 row created.

SQL> insert into t2 values(chr(128)||chr(129)||chr(130));

1 row created.



Run the original exp utility for tables t1 and t2.


Running the original imp against the target 19c database (character set AL32UTF8) with the above export file should result in two tables on the target, each with one row containing three 3-byte replacement characters (so 9 bytes per row), provided NLS_LENGTH_SEMANTICS is set to CHAR.
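The arithmetic behind the expected 9 bytes can be sketched in Python (an illustration only; the conversion itself happens inside the database during import):

```python
# Sketch (not part of the original test case): during conversion into
# AL32UTF8, each byte that is invalid in the declared source character
# set becomes the Unicode replacement character U+FFFD, which UTF-8
# encodes as the 3 bytes ef bf bd.
row = "\ufffd" * 3                # three replacement characters
encoded = row.encode("utf-8")
print(encoded.hex(","))           # ef,bf,bd,ef,bf,bd,ef,bf,bd
print(len(encoded))               # 9 bytes -- too big for VARCHAR2(3 BYTE)
```

This matches the dump(c1,1016) output later in the question.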

By default the import will execute 'ALTER SESSION SET NLS_LENGTH_SEMANTICS = BYTE' during the run.

To try to get around this, create a schema-level trigger in a privileged user that executes 'ALTER SESSION SET NLS_LENGTH_SEMANTICS = CHAR':


create or replace trigger test_user_before_create
before create on test_user.schema 
begin
     execute immediate 'ALTER SESSION SET NLS_LENGTH_SEMANTICS = CHAR';
end;
/


Run the original imp against the 19c database...


C:\dmp>set NLS_LANG=ENGLISH_UNITED KINGDOM.US7ASCII

C:\dmp>imp test_user@al32pdb file=test_user.00000 log=test_user_imp.log

Import: Release 19.0.0.0.0 - Production on Fri Nov 29 11:44:39 2019
Version 19.5.0.0.0

Copyright (c) 1982, 2019, Oracle and/or its affiliates.  All rights reserved.

Password:

Connected to: Oracle Database 19c Enterprise Edition Release 19.0.0.0.0 - Production
Version 19.5.0.0.0

Export file created by EXPORT:V08.01.07 via conventional path
import done in US7ASCII character set and AL16UTF16 NCHAR character set
import server uses AL32UTF8 character set (possible charset conversion)
export server uses US7ASCII NCHAR character set (possible ncharset conversion)
. importing TEST_USER's objects into TEST_USER
. importing TEST_USER's objects into TEST_USER
. . importing table                           "T1"
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column "TEST_USER"."T1"."C1" (actual: 9, maximum: 3)
Column 1 ...          0 rows imported
. . importing table                           "T2"          1 rows imported
Import terminated successfully with warnings.

C:\dmp>sqlplus test_user@al32pdb

SQL*Plus: Release 19.0.0.0.0 - Production on Fri Nov 29 11:47:42 2019
Version 19.5.0.0.0

Copyright (c) 1982, 2019, Oracle.  All rights reserved.

Enter password:
Last Successful login time: Fri Nov 29 2019 11:44:44 +00:00

Connected to:
Oracle Database 19c Enterprise Edition Release 19.0.0.0.0 - Production
Version 19.5.0.0.0

SQL> desc t1
 Name                                      Null?    Type
 ----------------------------------------- -------- ----------------------------
 C1                                                 VARCHAR2(3)

SQL> desc t2
 Name                                      Null?    Type
 ----------------------------------------- -------- ----------------------------
 C1                                                 VARCHAR2(3 CHAR)

SQL> select vsize(c1) from t2;

 VSIZE(C1)
----------
         9

SQL> select dump(c1,1016) from t2;

DUMP(C1,1016)
--------------------------------------------------------------------------------
Typ=1 Len=9 CharacterSet=AL32UTF8: ef,bf,bd,ef,bf,bd,ef,bf,bd

SQL>


If the trigger is set to alter the length semantics 'BEFORE CREATE', why does t1 not get the CHAR qualification on its column?

When this was tried with a production schema containing a number of tables, some with thousands of rows, every VARCHAR2 column was created with CHAR semantics. This was repeated a number of times with the same result: no CHAR missing on any column.

It seems as if there is some synchronization/timing issue with the trigger that kicks in when the export is small?

Thanks,






and Chris said...

The trigger is firing, but only after your statement has been parsed. So the first statement is parsed with NLS_LENGTH_SEMANTICS = BYTE and thus executes with that setting.

So, as you find, the first statement is "skipped".

To ensure you use CHAR semantics, use a logon trigger instead:

create or replace trigger test_user_logon
after logon on test_user.schema 
begin
  execute immediate 'ALTER SESSION SET NLS_LENGTH_SEMANTICS = CHAR';
end;
/

Rating

  (3 ratings)

Comments

OracleLearner, December 05, 2019 - 4:12 pm UTC

Thanks for considering my question and responding.

I have tried the logon trigger; however, it doesn't meet the need, as the import explicitly sets NLS_LENGTH_SEMANTICS=BYTE after the logon trigger has fired and before the first CREATE TABLE statement.

The lines from a trace done on the import:

ALTER SESSION SET PLSQL_OPTIMIZE_LEVEL = 2 NLS_LENGTH_SEMANTICS = 'BYTE' PLSQL_CODE_TYPE = INTERPRETED PLSQL_WARNINGS = 'DISABLE:ALL' PLSQL_CCFLAGS = ''


Thanks.
Chris Saxon
December 05, 2019 - 5:28 pm UTC

Ahhh... import preserves the NLS_LENGTH_SEMANTICS from the source table.

I think you'll have to precreate the table. Then load the data in.
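For the t1 example from the question, a minimal sketch of that approach might be (IGNORE=Y tells imp to load into the existing table rather than fail on the CREATE):

```sql
-- On the 19c target, precreate the table with explicit CHAR semantics
create table t1 (c1 varchar2(3 char));

-- Then rerun the import, ignoring "table already exists" errors:
--   imp test_user@al32pdb file=test_user.00000 ignore=y
```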

OracleLearner, May 11, 2020 - 3:07 pm UTC

The next objective we have is migrating additional selected schemas from an Oracle 10g database (character set WE8ISO8859P1 with BYTE semantics) to an Oracle 19c database (character set AL32UTF8 with CHAR semantics).

Following the guidance contained in Doc ID 260192.1 "Changing the NLS_CHARACTERSET to AL32UTF8 / UTF8 (Unicode) in 8i, 9i , 10g and 11g" at the base of section 5b

"
When using full export/import the tables are pre-created in the new database using CHAR semantics OR larger
columns sizes in BYTES before importing


conn / as sysdba
spool scott_tabledef.sql
set echo off heading off feedback off verify off pagesize 0 linesize 80 long 9999
EXECUTE DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM,'PRETTY',true);
EXECUTE
DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM,'SQLTERMINATOR',true);
SELECT DBMS_METADATA.GET_DDL ('TABLE', TABLE_NAME, OWNER) FROM dba_tables WHERE OWNER in
('SCOTT') order by OWNER , TABLE_NAME ;
EXECUTE DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM,'DEFAULT');

"

Section 12 then goes on to say:
"
Use the IGNORE=Y parameter for imp or the TABLE_EXISTS_ACTION=TRUNCATE option for impdp to import the data into the pre-created tables.
"

I have tried applying the above a number of times and cannot get it to work with the schemas targeted for migration.

Using the TABLE_EXISTS_ACTION=TRUNCATE parameter as advised, impdp generates the following message for each table of the import:

"
<schema>.<table_name> exists and has been truncated. Data will be loaded but all dependent metadata will be skipped due to table_exists_action of truncate
"

This implies that either:
1) impdp attempts to load the data with the constraints (including foreign key constraints) and indexes already created, or
2) the constraints and indexes are applied manually afterwards.

The first option is problematic, as tables would have to be loaded in a precise order to avoid violating the foreign keys, and the load would run with all indexes in place.
The second option would mean more calls to dbms_metadata to separately generate SQL definition files for the indexes and constraints.


The following has been tried against our source (10g) and target (19c) databases and would appear to be a cleaner option:


i) Generate the Table definitions on source(10g) without any constraints and without any ref constraints
set echo off heading off feedback off verify off pagesize 0 linesize 32000 trims on long 10000000 longchunksize 32000
spool tabledef_$db_user.sql
select 'alter session set NLS_LENGTH_SEMANTICS=CHAR;' from dual; -- include this line if changing to CHAR semantics
EXECUTE DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM,'PRETTY',true);
EXECUTE DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM,'SQLTERMINATOR',true);
EXECUTE DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM,'CONSTRAINTS',false);
EXECUTE DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM,'REF_CONSTRAINTS',false);
SELECT DBMS_METADATA.GET_DDL ('TABLE', TABLE_NAME, OWNER) FROM dba_tables WHERE OWNER = upper('$db_user') order by TABLE_NAME ;
EXECUTE DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM,'DEFAULT');


(In the above, $db_user references a Solaris environment variable containing the schema name.)


ii) Create the tables on the 19c database using the output spooled from Step i) above.


iii) Run impdp on 19c with CONTENT=DATA_ONLY (to import the table rows only).


iv) Run impdp on 19c (again) with CONTENT=METADATA_ONLY EXCLUDE=TABLE/TABLE to import all other metadata apart from the table definitions applied in Step ii).
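Assuming a Data Pump export of the schema (the directory object and dump file names below are placeholders), steps iii) and iv) might look something like:

```
# iii) load the table rows only into the precreated tables
impdp test_user@al32pdb directory=dump_dir dumpfile=test_user.dmp content=data_only

# iv) bring in the remaining metadata, skipping the table definitions
impdp test_user@al32pdb directory=dump_dir dumpfile=test_user.dmp content=metadata_only exclude=TABLE/TABLE
```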


Can you see any issues in proceeding in this manner? If I have missed or misinterpreted anything in note 260192.1, please advise.

Thanks.
Chris Saxon
May 12, 2020 - 10:13 am UTC

Your method seems reasonable.

Another approach is:

- Create the table with constraints and indexes
- Disable constraints - either all or just FKs
- Optionally make indexes unusable
- Import the data
- Reenable constraints and rebuild indexes as needed

This avoids an extra import step. And you could check the table definitions match the source before loading the data, potentially saving lots of time if something goes wrong!
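A sketch of that sequence (the table, constraint, and index names below are invented placeholders):

```sql
-- Disable FK constraints so the load order across tables doesn't matter
alter table child_tab disable constraint child_parent_fk;

-- Optionally skip index maintenance during the load
alter index child_tab_ix unusable;
alter session set skip_unusable_indexes = true;

-- ... run the import with CONTENT=DATA_ONLY here ...

-- Reenable and validate the constraints, rebuild the index
alter table child_tab enable validate constraint child_parent_fk;
alter index child_tab_ix rebuild;
```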

OracleLearner, May 15, 2020 - 1:10 pm UTC

Chris,

Thanks for your advice and your time on this.

I'll give your suggestions a try.
