  • How to perform a DDL statement within a transaction


Chris Saxon

Thanks for the question, Mark.

Asked: October 01, 2016 - 11:42 am UTC

Last updated: October 04, 2016 - 11:22 am UTC

Version: 11.2.0.4

Viewed 1000+ times

You Asked

Connor/Chris/Tom,

I have an interesting business application scenario that I would like to get your help/advice on.

I am one of the principal developers who inherited this system. It is *very* flexible, and this fact is a major appeal to our customers. During the deployment phase, customers create their own specific configuration, which is stored in user-defined application tables. As a consequence of saving that configuration, database objects in the Oracle schema (e.g. tables/views/indexes) are created dynamically to support it, using (mostly) native dynamic SQL, either directly within the business API PL/SQL subroutines or via table triggers.

As we all know, the execution of a DDL statement has the nasty side effect of committing the main transaction. Currently there are scenarios where the application metadata and the associated Oracle database objects get out of synchronization. For a well-behaved application we have to create a process that guarantees synchronization.

I did think of one solution where the Oracle database objects could be created asynchronously: generate the necessary DDL statement, wrap it in a message, and enqueue it to an Advanced Queue. Another database process is then responsible for dequeuing the messages and executing the DDL they contain. Because the enqueue participates in the transaction, the entire business transaction, including the enqueue, is atomic.
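A minimal sketch of that enqueue step, assuming a queue DDL_QUEUE with an object payload type DDL_MSG_T has already been set up via DBMS_AQADM (the queue, type, and table names here are all illustrative, not from the original system):

```sql
-- Assumed one-off setup (illustrative):
--   create type ddl_msg_t as object (ddl_text varchar2(4000));
--   DBMS_AQADM.CREATE_QUEUE_TABLE / CREATE_QUEUE / START_QUEUE for DDL_QUEUE
declare
  l_enq_opts  dbms_aq.enqueue_options_t;
  l_msg_props dbms_aq.message_properties_t;
  l_msg_id    raw(16);
begin
  -- the transactional work (hypothetical config table)
  insert into app_config (cfg_id, cfg_name) values (1, 'NEW_WIDGET');

  -- enqueue the DDL instead of executing it; the enqueue is part of
  -- this transaction, so a rollback discards the message as well
  dbms_aq.enqueue(
    queue_name         => 'DDL_QUEUE',
    enqueue_options    => l_enq_opts,
    message_properties => l_msg_props,
    payload            => ddl_msg_t('create table widget_1 (id number)'),
    msgid              => l_msg_id);

  commit;  -- config row and DDL message become visible atomically
end;
/
```

A dequeuing job then calls DBMS_AQ.DEQUEUE and runs each payload with EXECUTE IMMEDIATE in its own session, where the implicit DDL commit does no harm.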

We also have business PL/SQL APIs that issue DDL statements and then depend on them further down in the same subroutine. This cannot be performed asynchronously, and I am stuck! I did some research on the internet and someone suggested edition-based redefinition, but I can't see how that feature would help. We may be in a position in the future to refactor some (or all) of the system, and I would like to address this particular issue if possible.

Many thanks


and Connor said...

You can use an autonomous transaction to do the work...but I stress, you will still need to build a mechanism to handle rollback. Here's a possibility:

procedure P1   
- does the transactional work
- calls procedure P2 (defined as autonomous)
  - does the DDL
  - submit proc P3 via dbms_job (we'll come back to this)
  - commit
- does work depending on success of P2
- etc etc
- log a record somewhere that I was successful
- commit;


So what does P3 do? It would run (say) 60 seconds from now and check whether P1 completed without error (hence the "log a record somewhere that I was successful"). If P1 was NOT successful, P3's job is to reverse out the DDL that was done in P2, hence making the DDL pseudo-transactional.
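Put together, that outline might look something like the sketch below. The table names (ddl_audit, orders, widget_1), the sequence, and the exact DDL are all hypothetical stand-ins; the moving parts that matter are the AUTONOMOUS_TRANSACTION pragma and the DBMS_JOB.SUBMIT call:

```sql
create or replace procedure p2 (p_txn_id number) is
  pragma autonomous_transaction;  -- DDL commits only this transaction
  l_job binary_integer;
begin
  execute immediate 'create table widget_1 (id number)';

  -- schedule P3 to run ~60 seconds from now to check on P1
  dbms_job.submit(l_job, 'p3(' || p_txn_id || ');', sysdate + 60/86400);
  commit;
end;
/

create or replace procedure p3 (p_txn_id number) is
  l_cnt pls_integer;
begin
  select count(*) into l_cnt
  from   ddl_audit                -- hypothetical success log
  where  txn_id = p_txn_id
  and    status = 'SUCCESS';

  if l_cnt = 0 then
    -- P1 never logged success: reverse out P2's DDL
    execute immediate 'drop table widget_1 purge';
  end if;
end;
/

create or replace procedure p1 is
  l_txn_id number := ddl_txn_seq.nextval;      -- hypothetical sequence
begin
  insert into orders (order_id) values (42);   -- transactional work
  p2(l_txn_id);                                -- DDL + job, autonomous
  execute immediate                            -- depends on P2's DDL,
    'insert into widget_1 values (1)';         -- so must be dynamic
  insert into ddl_audit (txn_id, status)       -- "log that I succeeded"
  values (l_txn_id, 'SUCCESS');
  commit;
end;
/
```

Note the insert into the freshly created table in P1 also has to be dynamic SQL, because the table does not exist when P1 is compiled. If P1 rolls back, its ddl_audit row disappears with it, and P3 drops the orphaned table a minute later.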

Hope this helps - but let's face it, DDL in this way is... just bad news :-)


Comments

What I thought... but

Mark, October 01, 2016 - 3:01 pm UTC

Hi Connor,

I presume the "bad news" relates to issuing DDL in a synchronous fashion as opposed to doing it asynchronously. If it relates to creating Oracle objects on the fly at all, can you think of any other way of keeping the application's flexibility without going for a generic data model, of which I am not a fan?

Many thanks
Mark.
Connor McDonald
October 01, 2016 - 11:59 pm UTC

Bad news on several fronts:

- DDL isn't really scalable, i.e., it's expensive to do (e.g., depending on what options you have installed, check all the database-level DDL triggers that fire)

- even my suggested approach is still only a "pseudo-transactional" mechanism - you can still end up with 'orphans' or some other mess

- if *lots* of things are being created at *runtime*, it suggests that downstream lots of things are generating dynamic SQL at runtime to access them...and that has security risks written all over it.

As much as I'm a "relational person", if you really must have dynamic attributes, nominated at runtime, then I'd perhaps consider storing *just* those parts of the application in something like JSON/XML or similar.

Sorry - more

Mark, October 03, 2016 - 2:11 pm UTC

Hi,

Thanks for taking the time.

Is there a simple way of determining what DDL triggers fire when a DDL statement is executed?

There is a lot of dynamic SQL in this system to service the dynamically created objects and I agree this is a risk. How about generating PL/SQL for a specific dynamic object - effectively modifying most, if not all, dynamic SQL into static?

Currently the dynamic attribute data is stored in dynamically created partitioned tables. The data is then analyzed using SQL. Considering there could be millions/billions of rows in such tables, would the use of JSON/XML scale?

Many thanks
Mark.

Chris Saxon
October 04, 2016 - 11:22 am UTC

SQL> select distinct TRIGGERING_EVENT from dba_triggers;

TRIGGERING_EVENT
-------------------------------------------------------------
UPDATE OR DELETE
DROP
ERROR
LOGOFF
CREATE OR ALTER OR DROP OR TRUNCATE OR RENAME
INSERT OR UPDATE OR DELETE
INSERT OR UPDATE
UPDATE
DELETE
DDL
TRUNCATE
RENAME
CREATE OR ALTER
ALTER
CREATE
INSERT
DROP OR TRUNCATE


So anything that is not one of the standard DML operations (insert/update/delete) might be a candidate here.
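To list the actual triggers rather than just the distinct event types, a filter along these lines against the same view excludes the DML-only ones (this is a sketch, not from the original answer):

```sql
select owner, trigger_name, triggering_event, status
from   dba_triggers
where  triggering_event not like '%INSERT%'
and    triggering_event not like '%UPDATE%'
and    triggering_event not like '%DELETE%'
order  by owner, trigger_name;
```

This still returns non-DDL system events such as LOGOFF and ERROR, but it narrows the list to the triggers worth inspecting.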

JSON/XML are still just stored in the database, so they'll scale in similar fashion - the overhead being the parsing cost, which can be incurred inside or outside the database.
