Advice on rewriting code for a big table frequently used

Question and Answer
Chris Saxon

Thanks for the question, Bruno.

Asked: June 13, 2023 - 8:31 am UTC

Last updated: June 13, 2023 - 12:46 pm UTC

Version: 19.0.0.0

Viewed 1000+ times

You Asked

Hello,

We have a PL/SQL package used to build a dashboard. There is a table that stores the value for an indicator.
Basically, the function in the package searches for the matching record and updates it if found, or inserts a new record with the value.

For the lookup
        select oper_valeur_aff.valeur,
               nvl(oper_valeur_aff.arrete,'N')
          into w_montant,w_arrete
          from batix.oper_valeur_aff
          where oper_valeur_aff.societe=w_soc
            and oper_valeur_aff.affaire=w_code
            and oper_valeur_aff.oper=w_oper
            and oper_valeur_aff.date_debut=w_datdeb
            and oper_valeur_aff.date_fin=w_datfin
            and oper_valeur_aff.rupt_page=w_rupt_page
            and oper_valeur_aff.typlig=w_typlig
            and nvl(oper_valeur_aff.code_detail,'-1')=nvl(w_code_detail,'-1')
            and (oper_valeur_aff.arrete='O' or
                oper_valeur_aff.dat_dernier_calcul>=w_date_calcul-w_delai_raffraich);


For the update part
          update batix.oper_valeur_aff
            set oper_valeur_aff.valeur=w_montant,
                oper_valeur_aff.dat_dernier_calcul=w_date_calcul,
                oper_valeur_aff.arrete=w_arrete
            where oper_valeur_aff.societe=w_soc
              and oper_valeur_aff.affaire=w_code
              and oper_valeur_aff.oper=substr(w_oper,1,10)
              and oper_valeur_aff.date_debut=w_datdeb
              and oper_valeur_aff.date_fin=w_datfin
              and oper_valeur_aff.rupt_page=w_rupt_page
              and oper_valeur_aff.typlig=w_typlig
              and nvl(oper_valeur_aff.code_detail,'-1')=nvl(w_code_detail,'-1');
        if sql%notfound then
          insert into
            batix.oper_valeur_aff
              (societe,affaire,oper,date_debut,date_fin,dat_dernier_calcul,
              valeur,arrete,rupt_page,typlig,code_detail,per_debut,per_fin)
            values
              (w_soc,w_code,substr(w_oper,1,10),w_datdeb,w_datfin,
              w_date_calcul,w_montant,w_arrete,w_rupt_page,w_typlig,w_code_detail,w_perdeb,w_perfin);
        end if;


This table can be big, over 20 million records, and it grows over time.
The TKPROF output shows that we have to do something to optimize the process.

Extract from TKPROF

SELECT OPER_VALEUR_AFF.VALEUR, NVL(OPER_VALEUR_AFF.ARRETE,'N') 
FROM
 BATIX.OPER_VALEUR_AFF WHERE OPER_VALEUR_AFF.SOCIETE=:B10 AND 
  OPER_VALEUR_AFF.AFFAIRE=:B9 AND OPER_VALEUR_AFF.OPER=:B8 AND 
  OPER_VALEUR_AFF.DATE_DEBUT=:B7 AND OPER_VALEUR_AFF.DATE_FIN=:B6 AND 
  OPER_VALEUR_AFF.RUPT_PAGE=:B5 AND OPER_VALEUR_AFF.TYPLIG=:B4 AND 
  NVL(OPER_VALEUR_AFF.CODE_DETAIL,'-1')=NVL(:B3 ,'-1') AND 
  (OPER_VALEUR_AFF.ARRETE='O' OR OPER_VALEUR_AFF.DAT_DERNIER_CALCUL>=:B2 -:B1 
  )

call     count       cpu    elapsed       disk      query    current        rows
------- ------  -------- ---------- ---------- ---------- ----------  ----------
Parse        1      0.00       0.00          0          0          0           0
Execute  52390     10.32      10.63          0          0          3           0
Fetch    52390     17.65      26.76      41052     262339          0       45644
------- ------  -------- ---------- ---------- ---------- ----------  ----------
total   104781     27.98      37.39      41052     262339          3       45644

Misses in library cache during parse: 1
Misses in library cache during execute: 1
Optimizer mode: ALL_ROWS
Parsing user id: 641     (recursive depth: 1)
Number of plan statistics captured: 1

Rows (1st) Rows (avg) Rows (max)  Row Source Operation
---------- ---------- ----------  ---------------------------------------------------
         1          1          1  TABLE ACCESS BY INDEX ROWID BATCHED OPER_VALEUR_AFF (cr=5 pr=5 pw=0 time=1 us cost=5 size=51 card=1)
         1          1          1   INDEX RANGE SCAN OPER_VALEUR_AFF_CLE2 (cr=4 pr=4 pw=0 time=1 us cost=4 size=0 card=1)(object id 118685)


Elapsed times include waiting on following events:
  Event waited on                             Times   Max. Wait  Total Waited
  ----------------------------------------   Waited  ----------  ------------
  db file sequential read                     41052        0.02         23.91
  latch: redo allocation                          1        0.00          0.00

********************************************************************************

So my question is: what would be the best way to optimize this part? I was planning to use a collection, but I guess we could face memory issues.

Thanks a lot.

and Chris said...

The problem is:

Execute 52390

You're running the SQL statement lots of times. Every execution is fast: 37.39 s / 52,390 executions ≈ 0.7 milliseconds on average.

Find a way to rewrite the process so you process the data in fewer executions. Ideally one!

Given you have UPDATE followed by SQL%NOTFOUND ... INSERT ..., it suggests you could rewrite the whole process as one MERGE statement. This handles UPDATE-if-exists, INSERT-if-not-exists logic for you.
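A single-row MERGE that replaces your UPDATE + SQL%NOTFOUND + INSERT pair could look like the sketch below. It assumes the same columns and w_* variables as your code and is untested against your schema; the bigger win comes when you can make the USING clause a query that produces all the rows in one pass, so the whole load runs as one statement instead of 52,390.

```sql
-- Sketch only: column names and w_* variables taken from the question;
-- verify against the real table definition before using.
merge into batix.oper_valeur_aff t
using (select w_soc                as societe,
              w_code               as affaire,
              substr(w_oper,1,10)  as oper,
              w_datdeb             as date_debut,
              w_datfin             as date_fin,
              w_rupt_page          as rupt_page,
              w_typlig             as typlig,
              w_code_detail        as code_detail
       from dual) s
on (    t.societe    = s.societe
    and t.affaire    = s.affaire
    and t.oper       = s.oper
    and t.date_debut = s.date_debut
    and t.date_fin   = s.date_fin
    and t.rupt_page  = s.rupt_page
    and t.typlig     = s.typlig
    and nvl(t.code_detail,'-1') = nvl(s.code_detail,'-1'))
when matched then update
  set t.valeur             = w_montant,
      t.dat_dernier_calcul = w_date_calcul,
      t.arrete             = w_arrete
when not matched then insert
  (societe, affaire, oper, date_debut, date_fin, dat_dernier_calcul,
   valeur, arrete, rupt_page, typlig, code_detail, per_debut, per_fin)
  values
  (s.societe, s.affaire, s.oper, s.date_debut, s.date_fin, w_date_calcul,
   w_montant, w_arrete, s.rupt_page, s.typlig, s.code_detail, w_perdeb, w_perfin);
```

Note that the columns in the ON clause are not updated in the WHEN MATCHED branch, which MERGE requires. If you can express the source values as a query over the driving tables instead of `dual`, the same statement processes every row in one execution.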

Is this answer out of date? If it is, please let us know via a Comment

More to Explore

SQL

The Oracle documentation contains a complete SQL reference.