INSERT/DELETE on PARENT-CHILD-SUB CHILD table dependency

Question and Answer

Connor McDonald

Thanks for the question.

Asked: May 16, 2017 - 6:46 am UTC

Last updated: April 04, 2019 - 4:38 am UTC

Version: 11.2

Viewed 1000+ times

You Asked

Hi Chris/Connor,

I have written the procedure below to move records from tables (from the Primary DB to the Archival DB).

The logic is to MERGE the records into the Archival DB and then DELETE those rows from the Primary DB.
The Archival DB already has the same table structures as the Primary DB.
This procedure is called in a loop over around 30-40 table names (e.g. tb_orders, tb_order_history, and so on).
This works perfectly fine when there are no foreign key constraints.

But I am not able to build logic that handles the movement and deletion of parent-child records.
The scenario is Parent : Child : Sub-child -
-- TB_ORDERS
   -- TB_ORDER_DETAILS
         -- TB_ORDERS_HISTORY

   Can you suggest how to handle such scenarios, PLEASE!!
   

PROCEDURE p_dynamic_merge(piv_table_name       IN     VARCHAR2, -- Table name which need to archive
                          pin_max_process      IN     NUMBER, -- Degree of PARALLEL Operation e.g. 8
                          pin_max_merge        IN     NUMBER, -- BULK COLLECT LIMIT e.g. 5000
                          piv_where_clause     IN     VARCHAR2, -- WHERE condition to fetch specific rows to merge over the DB link
                          pon_total_rec        OUT    NUMBER, -- No. of rows merged.
                          pov_retval           OUT    VARCHAR2,
                          pov_errmsg           OUT    VARCHAR2)
IS
    -- Variable declaration --
    v_insert_src           VARCHAR2(20000);
    v_insert_tgt           VARCHAR2(20000);
    v_update_src           VARCHAR2(20000);
    v_condition_src        VARCHAR2(20000);
    v_temp_src             VARCHAR2(1000);
    v_temp_tgt             VARCHAR2(1000); 
    v_final_query          VARCHAR2(20000);
    n_cnt                  PLS_INTEGER;
    v_from                 VARCHAR2(100);
    v_to                   VARCHAR2(100);
    v_str_cur              VARCHAR2(10000);
    n_row_updated          NUMBER(10); 
    d_date                 DATE;
    
    TYPE r_cursor IS REF CURSOR;
    c_cur_var r_cursor;

    TYPE t_rowid IS TABLE OF ROWID; -- index by binary_integer;
    tb_rowid  t_rowid := t_rowid();

    err_archive_data       EXCEPTION;
BEGIN
     
     v_insert_src    := NULL;
     v_insert_tgt    := NULL;
     v_update_src    := NULL;
     v_condition_src := NULL;
     v_temp_src      := NULL;
     v_temp_tgt      := NULL;
     n_cnt           := 0;
  -- Get source table columns --
  SELECT LISTAGG(' src.' || column_name, ', ') WITHIN GROUP (ORDER BY column_id) 
     INTO   v_insert_src
     FROM   user_tab_columns
     WHERE  table_name = UPPER(piv_table_name);

     v_insert_tgt := REPLACE(v_insert_src, 'src.', 'tgt.');

     -- Get Non-key columns --
     SELECT LISTAGG(' tgt.' || tab.column_name || ' = ' || ' src.' || tab.column_name, ', ') WITHIN GROUP (ORDER BY tab.column_name) 
     INTO   v_update_src
     FROM (SELECT column_name
           FROM   user_tab_columns
           WHERE  table_name = UPPER(piv_table_name)
           MINUS
           SELECT column_name
           FROM   user_constraints a, user_cons_columns b
           WHERE  a.constraint_name = b.constraint_name
           AND    a.constraint_type = 'P'
           AND    a.table_name = UPPER(piv_table_name)) tab;

     -- Get primary key columns --
     SELECT LISTAGG(' src.'||column_name || ' = '|| ' tgt.'||column_name, ' AND ') WITHIN GROUP (ORDER BY column_name)
     INTO   v_condition_src
     FROM   user_constraints a
     ,      user_cons_columns b
     WHERE  a.constraint_name = b.constraint_name
     AND    a.constraint_type = 'P'
     AND    a.table_name = UPPER(piv_table_name); 

     n_row_updated := 0;
     
     v_str_cur := 'SELECT ROWID FROM ' || piv_table_name || ' WHERE '|| piv_where_clause;

     OPEN c_cur_var FOR v_str_cur;
     LOOP

         FETCH c_cur_var BULK COLLECT INTO tb_rowid LIMIT pin_max_merge;
   EXIT WHEN tb_rowid.COUNT = 0;

   -- A collection type declared in PL/SQL (tb_rowid) cannot be referenced in SQL over a DB link,
   -- so load a GTT with the ROWIDs and reference that in the MERGE statement.
   DBMS_OUTPUT.PUT_LINE('B4 GTT :'||TO_CHAR(SYSDATE,'DD/MM/YYYY HH24:MI:SS'));
   
   FORALL i IN tb_rowid.FIRST .. tb_rowid.LAST 
             INSERT /*+ PARALLEL(tb_archive_staging, 2) */ INTO tb_archive_staging VALUES (tb_rowid(i));
    
         DBMS_OUTPUT.PUT_LINE('After GTT :'||TO_CHAR(SYSDATE,'DD/MM/YYYY HH24:MI:SS'));
   
   v_final_query := 'MERGE /*+ PARALLEL('||piv_table_name||', '||pin_max_process||') */ INTO '|| piv_table_name ||'@'||pkv_archive_db_link||' tgt
                           USING ( SELECT /*+ PARALLEL('||piv_table_name||', '||pin_max_process||') */ * FROM ' || piv_table_name || ' WHERE ROWID IN (SELECT row_id FROM tb_archive_staging)) src
                           ON ('|| v_condition_src ||')
                           WHEN MATCHED THEN
                           UPDATE
                           SET '|| v_update_src ||'
                           WHEN NOT MATCHED THEN
                           INSERT (' || v_insert_tgt || ')
                           VALUES (' || v_insert_src ||')';

      -- Enable DML Parallelism --
         BEGIN
           EXECUTE IMMEDIATE 'ALTER SESSION ENABLE PARALLEL DML' ;
         EXCEPTION
           WHEN OTHERS THEN
             NULL;
         END;
         DBMS_OUTPUT.PUT_LINE('B4 DB Link :'||TO_CHAR(SYSDATE,'DD/MM/YYYY HH24:MI:SS'));
   
   EXECUTE IMMEDIATE v_final_query;
   n_row_updated := n_row_updated + SQL%ROWCOUNT; -- capture the row count immediately after the MERGE

   DBMS_OUTPUT.PUT_LINE('After DB Link :'||TO_CHAR(SYSDATE,'DD/MM/YYYY HH24:MI:SS'));
   
   COMMIT;

   -- A locally declared collection type cannot be referenced inside a dynamic
   -- PL/SQL block, so use FORALL with a dynamic DELETE and bind each ROWID:
   FORALL i IN tb_rowid.FIRST .. tb_rowid.LAST
       EXECUTE IMMEDIATE 'DELETE FROM '||piv_table_name||' WHERE ROWID = :1'
           USING tb_rowid(i);
   COMMIT;
   
         tb_rowid.DELETE;
   
     END LOOP;
     CLOSE c_cur_var;
  
     pon_total_rec := n_row_updated;
     
EXCEPTION
     WHEN OTHERS THEN
          ROLLBACK;
          pov_retval := SQLCODE;
          pov_errmsg := SQLERRM||'#'||dbms_utility.format_error_backtrace;
END p_dynamic_merge;

and Connor said...

You are discovering what a massive task it is to handle replication of data in a safe and consistent manner (which is why we have Streams and GoldenGate).

If you are persisting with this path, you would need to consider

a) using deferred constraints, or
b) disabling constraints during data loads and re-enabling them afterwards.
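Both options can be sketched as follows. The table, column, and constraint names (tb_order_details, order_id, fk_details_orders) are assumptions based on the hierarchy in the question, not a tested implementation against the actual schema:

```sql
-- (a) Recreate the child FK as deferrable, so parent and child rows can be
--     merged/deleted in any order within one transaction (names assumed):
ALTER TABLE tb_order_details DROP CONSTRAINT fk_details_orders;
ALTER TABLE tb_order_details
  ADD CONSTRAINT fk_details_orders
  FOREIGN KEY (order_id) REFERENCES tb_orders (order_id)
  DEFERRABLE INITIALLY IMMEDIATE;

-- In the archival session, postpone FK checking until COMMIT:
SET CONSTRAINTS ALL DEFERRED;
-- ... run the MERGE and DELETE for every table in the hierarchy ...
COMMIT;  -- the FK constraints are validated here

-- (b) Or disable the FK during the load and re-enable it afterwards:
ALTER TABLE tb_order_details DISABLE CONSTRAINT fk_details_orders;
-- ... load / delete ...
ALTER TABLE tb_order_details ENABLE VALIDATE CONSTRAINT fk_details_orders;
```

Note that (b) leaves a window in which orphan rows can be created, and the ENABLE VALIDATE step will fail if any were.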

But building your own replication system is non-trivial. Very, very non-trivial.
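If the constraints stay enabled and immediate, the remaining alternative is to drive the table loop in dependency order: MERGE parents into the archive before children, and DELETE from the primary child-first. That would mean splitting the MERGE and DELETE phases of p_dynamic_merge into separate calls. A minimal sketch, with the table list assumed from the question:

```sql
DECLARE
  TYPE t_tabs IS TABLE OF VARCHAR2(30);
  -- Parent-first order, per the hierarchy in the question (assumed names):
  l_tabs t_tabs := t_tabs('TB_ORDERS', 'TB_ORDER_DETAILS', 'TB_ORDERS_HISTORY');
BEGIN
  -- Phase 1: MERGE into the archive parent-first, so each child row's
  -- parent already exists on the archive side when it arrives.
  FOR i IN 1 .. l_tabs.COUNT LOOP
    NULL; -- call the MERGE-only phase for l_tabs(i)
  END LOOP;

  -- Phase 2: DELETE from the primary child-first (reverse order), so a
  -- parent row is never removed while its children still exist.
  FOR i IN REVERSE 1 .. l_tabs.COUNT LOOP
    NULL; -- call the DELETE-only phase for l_tabs(i)
  END LOOP;
END;
/
```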

Rating

  (2 ratings)


Comments

Tool for Purge

A reader, July 10, 2017 - 8:39 am UTC

Hi Connor,

Can you please help me understand or advise on the following:

Can GoldenGate be used to delete table data based on a certain range, e.g. copy 7 years of data from the Primary to a Secondary and then delete those 7 years of data from the Primary?

Golden Gate Replication Query

Lal, April 03, 2019 - 10:09 am UTC

I also have a similar query.

Our current Prod has booking data (around 10 tables) for the past 10 years.
I need to replicate these tables using GoldenGate for a near-real-time copy,
but we are interested in records for the past one year only.

There is a main table which has the booking_creation_date. The child tables reference the main table via the booking_id column.

The question is: how can I set up the GoldenGate replication with an initial load of the last one year (booking_creation_date >= '01-01-2018')?
Also, how do I apply the same filter to the child tables, which do not have the date column but reference the main table via booking_id?

Request your expert advice/solution on the same.
Connor McDonald
April 04, 2019 - 4:38 am UTC

Let's say you wanted to do this:

CREATION_TIME >='2012-01-01'

Then you could do it using something like:

Replicat:

MAP {source table}, TARGET {target table}, 
FILTER (@COMPUTE (@DATE("JUL", "YYYY-MM-DD", creation_time) >=
                              @DATE("JUL", "YYYY-MM-DD", "2012-01-01")) > 0);


Extract:

TABLE {source table}, 
FILTER (@COMPUTE (@DATE("JUL", "YYYY-MM-DD", creation_time) >=
                              @DATE("JUL", "YYYY-MM-DD", "2012-01-01")) > 0);



I don't know of a way to map that to the child table - I think you'll need to try the Goldengate forums for that one.

https://community.oracle.com/community/groundbreakers/business_intelligence/system_management_and_integration/goldengate
