I want to replicate data from a local database table to a remote database table using a trigger. Below is an example trigger definition.
CREATE OR REPLACE TRIGGER test_trigger
AFTER INSERT OR UPDATE OR DELETE ON test_tbl
FOR EACH ROW
BEGIN
  IF INSERTING THEN
    INSERT INTO test_tbl@remote_db
    VALUES ( :new.COL1
           , :new.REF1 );
  ELSIF UPDATING THEN
    UPDATE test_tbl@remote_db
       SET COL1 = :new.COL1
         , REF1 = :new.REF1
     WHERE REF1 = :old.REF1;
  ELSIF DELETING THEN
    DELETE FROM test_tbl@remote_db
     WHERE REF1 = :old.REF1;
  END IF;
EXCEPTION
  WHEN OTHERS THEN
    dbms_output.put_line('failed on update');
END;
/
Everything works fine as long as remote_db is up and available.
If remote_db suddenly becomes unreachable - a network outage or a server crash - then the local test_tbl is locked when the trigger fires and the session is stuck waiting. All the applications that update this table then become blocked, waiting for the local table to be unlocked. Is there a way to define the trigger such that it won't lock the local table if the remote database isn't accessible?
Thanks
It looks like you're building your own replication mechanism from the ground up. That's a *big* exercise, and fraught with risk.
I'd strongly recommend using something like materialized views and Oracle replication.
http://docs.oracle.com/cd/E11882_01/server.112/e10706/toc.htm It's going to look after 'queueing' the updates, ensuring things are applied in the right order, etc...
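For the common one-way case, the shape of that setup is roughly the following. This is a sketch only, assuming the schema from the question; `local_db` is a hypothetical database link pointing back from the remote site to the master, and the 5-minute refresh interval is just an illustration.

```sql
-- On the local (master) site: a materialized view log records changes
-- to test_tbl so the remote site can do incremental (fast) refreshes.
CREATE MATERIALIZED VIEW LOG ON test_tbl
  WITH PRIMARY KEY;

-- On the remote site: a materialized view pulls the changes on a schedule.
-- If the network is down, the scheduled refresh simply fails and is retried
-- later; local DML on test_tbl is never blocked.
CREATE MATERIALIZED VIEW test_tbl_mv
  REFRESH FAST
  START WITH SYSDATE
  NEXT SYSDATE + 5/1440      -- refresh every 5 minutes (example interval)
  AS SELECT * FROM test_tbl@local_db;
```

The key difference from the trigger approach is that replication becomes asynchronous: an outage delays the refresh instead of hanging your applications.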
If you are updating the table at both sites...then that's a whole new (bigger) ballgame.
As an aside,
EXCEPTION
WHEN OTHERS THEN
dbms_output.put_line('failed on update');
is a bad, bad idea - swallowing every error means the remote DML can fail silently and the two sites quietly drift out of sync. I'm hoping you've got that in there purely for testing.
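If you really must catch OTHERS, log the error somewhere durable and re-raise it so the caller still sees the failure - DBMS_OUTPUT goes nowhere in a background or application session. A sketch, where err_log is a hypothetical logging table:

```sql
EXCEPTION
  WHEN OTHERS THEN
    -- err_log (logged_at, msg) is assumed to exist for this example
    INSERT INTO err_log (logged_at, msg)
    VALUES (SYSTIMESTAMP, SQLERRM);
    RAISE;  -- re-raise so the original transaction fails visibly
END;
```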