Performance issue for global temp table


Question and Answer

Connor McDonald

Thanks for the question, Wang.

Asked: January 18, 2017 - 3:47 am UTC

Last updated: January 20, 2017 - 12:09 pm UTC

Version: 11.2.0.4

Viewed 10K+ times

You Asked

Hi Tom,

We have a reporting system whose database was migrated from Sybase to Oracle. There were lots of temp tables (#xxx) in the stored procedures (SPs). Now we are encountering performance issues after migrating these temp tables to Oracle global temporary tables, especially when they are used inside sub-SPs.

When different parent SPs invoke the same sub-SP, the number of rows in those temp tables can differ hugely. We have tried the following:

1. Adding hints to the queries, but the result is that one parent SP performs well while the others perform badly.

2. We found that the same SP sometimes executed quickly and sometimes slowly. Querying v$sql, I found that several SQL statements in the SP have more than one plan_hash_value; the bad plan obviously has more buffer_gets and a higher time cost. I suspect this is caused by the temp table statistics leading the optimizer to a wrong plan.
I also found a document somewhere that suggested using dynamic sampling on the temp tables, so I did this:
dbms_stats.delete_table_stats('&&SCHEMA_ID', 'TT_XXX');
dbms_stats.lock_table_stats('&&SCHEMA_ID', 'TT_XXX');
However, it still has no effect.

Since the original reporting SPs' logic is complicated, and there are temp tables everywhere in the parent SPs and sub-SPs, we still want to keep them. Do you have any suggestions or best practices for tuning this kind of performance issue? Thanks.




and Connor said...

One option could be to move to 12c where there are session specific stats for global temporary tables. But even that doesn't necessarily solve the entire issue. For example, look at this in a single session:


SQL>
SQL> create global temporary table gtt on commit preserve rows
  2  as select * from dba_objects
  3  where 1=0;

Table created.

SQL>
SQL> insert into gtt
  2  select * from dba_objects
  3  where rownum <= 100;

100 rows created.

SQL> commit;

Commit complete.

SQL>
SQL> variable rc refcursor
SQL> exec open :rc for 'select count(*) from gtt';

PL/SQL procedure successfully completed.

SQL> print rc

  COUNT(*)
----------
       100

1 row selected.

SQL> select * from table(dbms_xplan.display_cursor());

PLAN_TABLE_OUTPUT
----------------------------------------------------------------------------------------------------------------------------------
SQL_ID  gmxmjw679mxx5, child number 0
-------------------------------------
select count(*) from gtt

Plan hash value: 3344941513

-------------------------------------------------------------------
| Id  | Operation          | Name | Rows  | Cost (%CPU)| Time     |
-------------------------------------------------------------------
|   0 | SELECT STATEMENT   |      |       |     2 (100)|          |
|   1 |  SORT AGGREGATE    |      |     1 |            |          |
|   2 |   TABLE ACCESS FULL| GTT  |   100 |     2   (0)| 00:00:01 |
-------------------------------------------------------------------

Note
-----
   - dynamic statistics used: dynamic sampling (level=2)


18 rows selected.

SQL>
SQL> insert into gtt
  2  select * from dba_objects
  3  where rownum <= 10000;

10000 rows created.

SQL>
SQL> variable rc refcursor
SQL> exec open :rc for 'select count(*) from gtt';

PL/SQL procedure successfully completed.

SQL> print rc

  COUNT(*)
----------
     10100

1 row selected.

SQL> select * from table(dbms_xplan.display_cursor());

PLAN_TABLE_OUTPUT
----------------------------------------------------------------------------------------------------------------------------------
SQL_ID  gmxmjw679mxx5, child number 0
-------------------------------------
select count(*) from gtt

Plan hash value: 3344941513

-------------------------------------------------------------------
| Id  | Operation          | Name | Rows  | Cost (%CPU)| Time     |
-------------------------------------------------------------------
|   0 | SELECT STATEMENT   |      |       |     2 (100)|          |
|   1 |  SORT AGGREGATE    |      |     1 |            |          |
|   2 |   TABLE ACCESS FULL| GTT  |   100 |     2   (0)| 00:00:01 |
-------------------------------------------------------------------

Note
-----
   - dynamic statistics used: dynamic sampling (level=2)


18 rows selected.

SQL>


So I loaded 100 rows and the query was optimized for that; then, when I put another 10,000 rows in, the same child cursor was re-used and the plan still assumed just 100 rows.
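(As an aside, the 12c session-private statistics mentioned at the start are driven by the GLOBAL_TEMP_TABLE_STATS table preference. A minimal sketch, assuming you are able to gather stats after each load of the GTT, and bearing in mind the caveat above that this alone doesn't necessarily solve the entire issue:

-- SESSION is the 12c default for this preference; setting it explicitly documents the intent
exec dbms_stats.set_table_prefs(user, 'GTT', 'GLOBAL_TEMP_TABLE_STATS', 'SESSION')

insert into gtt select * from dba_objects where rownum <= 10000;
commit;

-- stats gathered here are visible only to this session; other sessions are unaffected
exec dbms_stats.gather_table_stats(user, 'GTT')
)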

What you could do is have some simple permutations of the same SQL to cater for (say) four orders of magnitude of row counts. A different table alias is sufficient to form a new SQL statement (and hence a new cursor), eg


SQL>
SQL> create global temporary table gtt on commit preserve rows
  2  as select * from dba_objects
  3  where 1=0;

Table created.

SQL>
SQL> insert into gtt
  2  select * from dba_objects
  3  where rownum <= 100;

100 rows created.

SQL> commit;

Commit complete.

SQL>
SQL> variable rc refcursor
SQL> exec open :rc for 'select count(*) from gtt g_100_or_less';

PL/SQL procedure successfully completed.

SQL> print rc

  COUNT(*)
----------
       100

1 row selected.

SQL> select * from table(dbms_xplan.display_cursor());

PLAN_TABLE_OUTPUT
----------------------------------------------------------------------------------------------------------------------------------
SQL_ID  grk7zx21v16c9, child number 0
-------------------------------------
select count(*) from gtt g_100_or_less

Plan hash value: 3344941513

-------------------------------------------------------------------
| Id  | Operation          | Name | Rows  | Cost (%CPU)| Time     |
-------------------------------------------------------------------
|   0 | SELECT STATEMENT   |      |       |     2 (100)|          |
|   1 |  SORT AGGREGATE    |      |     1 |            |          |
|   2 |   TABLE ACCESS FULL| GTT  |   100 |     2   (0)| 00:00:01 |
-------------------------------------------------------------------

Note
-----
   - dynamic statistics used: dynamic sampling (level=2)


18 rows selected.

SQL>
SQL> insert into gtt
  2  select * from dba_objects
  3  where rownum <= 800;

800 rows created.

SQL>
SQL> variable rc refcursor
SQL> exec open :rc for 'select count(*) from gtt g_1000_or_less';

PL/SQL procedure successfully completed.

SQL> print rc

  COUNT(*)
----------
       900

1 row selected.

SQL> select * from table(dbms_xplan.display_cursor());

PLAN_TABLE_OUTPUT
----------------------------------------------------------------------------------------------------------------------------------
SQL_ID  c4yxah6ngtjf9, child number 0
-------------------------------------
select count(*) from gtt g_1000_or_less

Plan hash value: 3344941513

-------------------------------------------------------------------
| Id  | Operation          | Name | Rows  | Cost (%CPU)| Time     |
-------------------------------------------------------------------
|   0 | SELECT STATEMENT   |      |       |     4 (100)|          |
|   1 |  SORT AGGREGATE    |      |     1 |            |          |
|   2 |   TABLE ACCESS FULL| GTT  |   900 |     4   (0)| 00:00:01 |
-------------------------------------------------------------------

Note
-----
   - dynamic statistics used: dynamic sampling (level=2)


18 rows selected.

SQL>
SQL> insert into gtt
  2  select * from dba_objects
  3  where rownum <= 8000;

8000 rows created.

SQL>
SQL> variable rc refcursor
SQL> exec open :rc for 'select count(*) from gtt g_10k_or_less';

PL/SQL procedure successfully completed.

SQL> print rc

  COUNT(*)
----------
      8900

1 row selected.

SQL> select * from table(dbms_xplan.display_cursor());

PLAN_TABLE_OUTPUT
----------------------------------------------------------------------------------------------------------------------------------
SQL_ID  1p4u746xy4kta, child number 0
-------------------------------------
select count(*) from gtt g_10k_or_less

Plan hash value: 3344941513

-------------------------------------------------------------------
| Id  | Operation          | Name | Rows  | Cost (%CPU)| Time     |
-------------------------------------------------------------------
|   0 | SELECT STATEMENT   |      |       |    25 (100)|          |
|   1 |  SORT AGGREGATE    |      |     1 |            |          |
|   2 |   TABLE ACCESS FULL| GTT  |  8242 |    25   (0)| 00:00:01 |
-------------------------------------------------------------------

Note
-----
   - dynamic statistics used: dynamic sampling (level=2)


18 rows selected.

SQL>
SQL>
SQL>


So, assuming you have control over the *insertion* code, the logic would be along the lines of:

insert ...

if sql%rowcount < 100 then
  open rc for 'select ... from gtt g_100_or_less';
elsif sql%rowcount < 1000 then
  open rc for 'select ... from gtt g_1000_or_less';
elsif sql%rowcount < 10000 then
  open rc for 'select ... from gtt g_10k_or_less';
else
  open rc for 'select ... from gtt g_big';
end if;



That way, you get reasonably accurate optimizer estimates without literal SQL polluting your cursor cache. Depending on the volume of executions, you could adjust the number of permutations to best suit your needs.
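Wired into a stored procedure, that might look something like the following sketch (the procedure name and the count(*) query are purely illustrative, re-using the GTT from the demo above):

create or replace procedure report_from_gtt
  (p_limit in number, p_rc out sys_refcursor) as
  l_rows pls_integer;
begin
  -- load the temp table (here just demo rows from dba_objects)
  insert into gtt
  select * from dba_objects
  where rownum <= p_limit;
  l_rows := sql%rowcount;

  -- pick the alias permutation matching the volume just loaded, so each
  -- band of row counts gets its own cursor and its own dynamic sampling
  if l_rows < 100 then
    open p_rc for 'select count(*) from gtt g_100_or_less';
  elsif l_rows < 1000 then
    open p_rc for 'select count(*) from gtt g_1000_or_less';
  elsif l_rows < 10000 then
    open p_rc for 'select count(*) from gtt g_10k_or_less';
  else
    open p_rc for 'select count(*) from gtt g_big';
  end if;
end;
/

and called as, say:

variable rc refcursor
exec report_from_gtt(800, :rc)
print rc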


Comments

12c: Session private statistics for global temporary tables

Duke Ganote, January 18, 2017 - 5:55 pm UTC

I had to look that up (although I generally avoid GTTs). This?
http://www.oracle.com/technetwork/issue-archive/2014/14-jan/o14asktom-2079690.html
Connor McDonald
January 19, 2017 - 1:12 am UTC

Yes, that's it.

Wang Tiger, January 19, 2017 - 6:10 am UTC

Thanks, McDonald, for the detailed and useful info on this. I was just wondering whether there is a hint that could force a re-parse of the query and get a new plan in the same session to replace the cached one. If so, things would be easy.
Connor McDonald
January 20, 2017 - 12:09 pm UTC

Yeah...no such hint exists.
