
Question and Answer

Connor McDonald

Thanks for the question, Noob.

Asked: August 20, 2017 - 4:04 pm UTC

Last updated: August 24, 2017 - 1:10 am UTC

Version: 11.2.0.4.0

Viewed 1000+ times

You Asked

Hi Tom!

We have a problem with database (warehouse) performance. Our DBA said that the main reason is some company software that creates and deletes about 20,000 tables per day, which impacts the data dictionary. Can that be true? And what can we do about it?

Thank you!


and Connor said...

"We have problem with database (warehouse) performance"

That doesn't really tell us a lot. It's like me saying: "My car won't go above 30 km/h... please tell me why?". It could be any of dozens of causes.

"Our DBA said that the main reason is some kind of company software create and delete about 20000 tables per day"


That *could* be a factor. It could be something else. Your DBA should be able to back up his/her hypothesis with evidence for you, such as AWR reports or similar. Approach this scientifically and logically with them:

- find a time when performance is bad
- collect system metrics
- analyze the data
- find the root cause.
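For the "collect system metrics" step, assuming you have a Diagnostics Pack license, an AWR report covering the slow window can be generated like this (the snapshot query just helps you pick the right interval; the report script itself is interactive):

```sql
-- Requires the Oracle Diagnostics Pack license.
-- List recent AWR snapshots to find the pair bracketing the slow period
SELECT snap_id, begin_interval_time
FROM   dba_hist_snapshot
ORDER  BY snap_id;

-- Then, as a DBA in SQL*Plus, run the standard report script and
-- supply the begin/end snapshot IDs when prompted:
-- @?/rdbms/admin/awrrpt.sql
```

The resulting report shows top wait events and top SQL for the interval, which is the evidence the answer above is asking the DBA to produce.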


Comments

Noob, August 21, 2017 - 11:26 am UTC

Sorry for the unclear question. In that case, may I rephrase it: could creating and deleting tables many times (20,000 per day over 1-2 years) damage the data dictionary? If yes, how can we restore it?

Connor McDonald
August 23, 2017 - 2:45 am UTC

One of the most expensive things you can do in an Oracle database is DDL.

From an Oracle perspective, DDL is something you do rarely, with perhaps "truncate" being the exception.


Recreate Tables

Christian, August 22, 2017 - 10:31 am UTC

An application which recreates 20,000 tables each day doesn't sound like an application that was made for an Oracle database. This is highly uncommon, or at least IMO it is not what you want to do if you design an application that works well in Oracle. So who knows what else this tool does that might work well on another RDBMS but works horribly on Oracle.

My bet is that this is a tool built for other RDBMSs as well, and the vendor did a 1:1 conversion from another RDBMS to Oracle. My other bet, regarding the recreated tables, is that they are temporary tables. In other RDBMSs (say, PostgreSQL), if you want to store data temporarily you create a temporary table on the fly. That table is visible to your session only and is automatically dropped when you no longer need it. In Oracle this is fundamentally different:
Temporary tables are created once, like any other heap table. Their structure is permanent and visible to all sessions, and you reference them in your code as you would any other heap table. The data in them, however, is visible only to your session; it is temporary and is cleaned up for you (on commit, or at the end of your session).

So if you write a process which loads some data into a temporary table for whatever reason, in many other RDBMSs you would:
1.) create a temporary table
2.) fill the table and use it
3.) nothing, as the table is dropped automatically when you don't need it anymore

In Oracle you of course could
1.) create a temporary table (maybe use some mechanism to create a unique name because of concurrency)
2.) fill the table and use it
3.) drop it manually as it doesn't magically disappear

just to use the same mechanism. However, step 1 in Oracle is not usually done during the process which fills the temp table but once, when you install the application. Step 3 is omitted completely, of course. You simply use the temporary table as you would use a heap table.
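A minimal sketch of the Oracle-style approach described above (the table name and columns are illustrative):

```sql
-- Created ONCE, at install time, like any other schema object
CREATE GLOBAL TEMPORARY TABLE staging_gtt (
  id  NUMBER,
  val VARCHAR2(100)
) ON COMMIT DELETE ROWS;  -- or ON COMMIT PRESERVE ROWS to keep rows for the session

-- At runtime, each session simply uses it; rows are private to the session
INSERT INTO staging_gtt VALUES (1, 'work in progress');
SELECT * FROM staging_gtt;
COMMIT;  -- with DELETE ROWS, the commit empties the table for this session

-- No DROP needed: the definition persists, the data does not.
```

Because the definition is permanent, no DDL is issued per run, which is exactly what avoids the dictionary churn being discussed.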

Long story short: without further analysis it is of course difficult to find the exact cause, but my bet is your DBA is right: the root cause is the application creating and dropping 20,000 tables a day. This is almost certainly completely unnecessary. Of course this could be only the start; as already said, who knows what else this tool is doing wrong (maybe the dreaded max+1 sequence generator, ...). You could gather dictionary stats and see if that helps.
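Gathering dictionary statistics, as suggested, is a one-liner (run as a suitably privileged user):

```sql
-- Refresh optimizer statistics on the SYS/SYSTEM dictionary tables
EXEC DBMS_STATS.GATHER_DICTIONARY_STATS;

-- Heavy DDL churn can also leave fixed-object (X$) stats stale; these
-- can be gathered as well, ideally while a representative load is running
EXEC DBMS_STATS.GATHER_FIXED_OBJECTS_STATS;
```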

Maybe you have the possibility to trace a session that you know will recreate some tables and see where the time is spent. Maybe take a look at what AWR/ASH tells you. If you really find that creating and dropping objects becomes slower and slower, and gathering dictionary statistics doesn't help either, then I am not sure what to do (except export/import into a fresh database). Of course, just because I don't know a way doesn't mean that there isn't one ;). But you have to pinpoint the exact reason (maybe your DBA already has, and is just telling you his findings).
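One way to trace such a session, assuming you can identify its SID and SERIAL# (the values 123/456 and the username filter below are placeholders):

```sql
-- Find the session of interest
SELECT sid, serial# FROM v$session WHERE username = 'APP_USER';

-- Enable SQL trace for it, including wait events and bind values
EXEC DBMS_MONITOR.SESSION_TRACE_ENABLE(session_id => 123, serial_num => 456, waits => TRUE, binds => TRUE);

-- ... let the application recreate its tables, then stop tracing:
EXEC DBMS_MONITOR.SESSION_TRACE_DISABLE(session_id => 123, serial_num => 456);

-- On the server, format the raw trace file from the diag trace directory:
-- tkprof ora_12345.trc out.txt sort=exeela
```

The tkprof output shows per-statement elapsed time and waits, so recursive dictionary SQL driven by the DDL will stand out if it is the bottleneck.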

And of course I'd talk to the vendor of this software and tell them to carefully look up how temporary tables work in Oracle if they really want to support this RDBMS, because as already said: creating and dropping 20,000 database objects on a day-to-day basis doesn't sound like good software design to me when the application runs on an Oracle database.

cheers

Noob, August 23, 2017 - 10:21 am UTC

Thank you for answer.

I absolutely understand that recreating thousands of tables per day is a bad approach for Oracle. But I want to find out whether it could irreparably damage the data dictionary.

For example, we stopped recreating tables at 7:00 AM and collected statistics on the data dictionary. Then for the whole day we had bad performance with "select" operations (slower than I had ever seen before). We can't export/import the DB (30 TB) because the estimated time is about 6 days (which is very strange too).

If you say yes - recreating thousands of tables can irreparably damage the data dictionary - it means every ordinary DB user with rights to create and delete tables can destroy an Oracle DB…

If you say no - we will try to find another cause of our problem.

Connor McDonald
August 24, 2017 - 1:03 am UTC

"recreating thousands of tables can irreparably damage the data dictionary"

No...it will not.

But it's a large load on the system, and depending on which options you have installed, lots of triggers (and hence extra work) may fire whenever you issue DDL.
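To see whether any such triggers fire on DDL or on database-wide events in your system, a query along these lines can help (run with DBA privileges):

```sql
-- Triggers defined at DATABASE or SCHEMA level, e.g. DDL/event triggers
SELECT owner, trigger_name, triggering_event, status
FROM   dba_triggers
WHERE  base_object_type IN ('DATABASE', 'SCHEMA')
ORDER  BY owner, trigger_name;
```

Each enabled trigger listed here adds work to every matching DDL statement, which multiplies quickly at 20,000 creates/drops per day.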


To Noob

J. Laurindo Chiappa, August 23, 2017 - 2:11 pm UTC

See, I don't like the word 'damage', because it leads people to think of things breaking, exceptional/boundary conditions, 'errors'... That is not the case; your database is WORKING AS EXPECTED, right? For sure, it is a documented scenario that frequent DDL CAN cause performance degradation (due to object invalidations, internal cache management, degraded internal object statistics, greater CPU consumption, lock/latch waits, the COMMITs implied by any DDL, etc.) - see the Oracle docs and http://www.nocoug.org/download/2011-11/Zitelli-NoCOUG-Four_Things.pdf , https://asktom.oracle.com/pls/apex/f?p=100:11:0::::P11_QUESTION_ID:47911859692542 together with My Oracle Support notes 1453975.1, 1373500.1 and 1476047.1 as a reference...

So my answer would be: NO, you will NOT ** BREAK ** anything in the database by issuing frequent DDLs - the database will act as expected, and you CAN see performance reduction, AS DOCUMENTED and AS EXPECTED...

But to be clear: no one can say how MUCH slowdown will result from the (not recommended) frequent DDLs. MAYBE this is the main culprit of your bad performance, but MAYBE not... To confirm whether this is the root cause of your performance problems, among other things you can do a trace+tkprof of your 'slow' sessions, you can repeat your slow queries in your test/homologation/QA databases, and you can extract general database performance indicators from the Oracle performance tools (such as AWR/ASH)... That should be your next move...

Regards,

Chiappa
Connor McDonald
August 24, 2017 - 1:10 am UTC

nice input
