
Question and Answer

Connor McDonald

Thanks for the question.

Asked: August 07, 2020 - 1:44 am UTC

Last updated: August 10, 2020 - 4:11 am UTC

Version: Oracle 11.2.0.4

Viewed 1000+ times

You Asked

Hi Team,

We have PeopleSoft systems connected to an Oracle 11.2.0.4 database. Monthly financial close activities run at the beginning of each month, and many jobs run as part of this, internally accessing very large numbers of records.

Our database is 2.5 TB in size; the important tables are partitioned and contain millions of rows. Each month-end activity adds roughly another 100 GB of data to the database.

We face intermittent performance issues in these jobs. For example, last month none of the jobs had any issues, but at this month-end the allocation job (one of the month-end jobs) ran for 5 hours, whereas it completed in 1 hour last time.

There are no changes in the queries hitting the database. We also took an AWR report: no other queries were making this query slower, and no other jobs were consuming database resources during that time.

How do we approach this situation? If we go for query tuning, the same queries ran in very little time last time, and next month we may see no issue at all (based on previous experience). So do we really need query tuning, or is there some other approach?

Also, when the business reports slowness in query execution, all we can really do is monitor whether any other jobs are running or any other queries are blocking the table. Is there anything we can do so that the currently running query runs faster, without cancelling it?

I know this is a lot of information and questions. Please let us know how to proceed in this kind of situation, and how we can explain to the business why the job was slow only this month. We could not find any abnormality in the AWR report or on the database side.

Thanks.
Ajmal Palliyal


and Connor said...

There is always a clash of goals when it comes to SQL performance:

a) When things are running well, we do *not* want the SQL plans to change

b) But when the data changes, we *want* the SQL plans to reflect the changed data

Of course, the ideal is that in (b) we only change SQL plans when we know things are going to be better, not worse. That is the optimizer's aim, but it does not always succeed.

I would recommend taking a look at SQL Plan Management (SPM) for a selection of business-critical queries. SPM will lock a known good plan in place even if the optimizer finds what it thinks is a better one. The big benefit of SPM is that the new plan the optimizer found is still *stored*, just not *used*. You can then, from time to time, review these new plans and decide whether they are *truly* better.

Optimizer product manager Nigel Bayliss has an excellent white paper on SPM:

https://blogs.oracle.com/optimizer/new-sql-plan-management-white-paper
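To make the workflow concrete, here is a minimal sketch of capturing and evolving a baseline on 11.2 with the `DBMS_SPM` package. It assumes the query whose plan you want to protect is still in the cursor cache; the `sql_id` and `sql_handle` values below are placeholders you would replace with your own (find them in `V$SQL` and `DBA_SQL_PLAN_BASELINES` respectively).

```sql
-- 1. Capture the currently good plan from the cursor cache as an
--    ACCEPTED baseline ('abcd1234efgh5' is a placeholder sql_id).
DECLARE
  l_plans PLS_INTEGER;
BEGIN
  l_plans := DBMS_SPM.LOAD_PLANS_FROM_CURSOR_CACHE(sql_id => 'abcd1234efgh5');
END;
/

-- 2. From now on the optimizer will only use ACCEPTED plans for that
--    statement, but any new plan it finds is still stored (accepted = NO).
SELECT sql_handle, plan_name, enabled, accepted
FROM   dba_sql_plan_baselines;

-- 3. Periodically ask SPM to verify whether the stored-but-unaccepted
--    plans really perform better, accepting them only if they do
--    ('SQL_1234abcd5678ef90' is a placeholder sql_handle).
SELECT DBMS_SPM.EVOLVE_SQL_PLAN_BASELINE(
         sql_handle => 'SQL_1234abcd5678ef90')
FROM   dual;
```

The evolve step returns a report describing whether each candidate plan was accepted, which gives you the audit trail you can take back to the business when a plan does change.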



