Hello Connor
Thank you very much for your response.
There are about 15 tables in our database that we need to track. I also need to transform the data; sometimes several tables contribute to one row of changes. So I went with the trigger option.
I created this table:
create table CCARE.CHANGES_EVENT_LOG
(
changes_event_log_id NUMBER not null,
table_name VARCHAR2(200),
identifier_1_number NUMBER,
identifier_1_text VARCHAR2(4000),
identifier_2_number NUMBER,
identifier_2_text VARCHAR2(4000),
identifier_3_number NUMBER,
identifier_3_text VARCHAR2(4000),
reg_date_time DATE default sysdate
)
partition by range (REG_DATE_TIME)
(
partition D200901 values less than (TO_DATE(' 2020-09-01 00:00:00', 'SYYYY-MM-DD HH24:MI:SS', 'NLS_CALENDAR=GREGORIAN'))
...);
alter table CCARE.CHANGES_EVENT_LOG
add constraint PK_CHANGES_EVENT_LOG primary key (CHANGES_EVENT_LOG_ID);
The table is range-partitioned on the reg_date_time column, one partition per 5 days. Sadly, the column is DATE, not TIMESTAMP (it's not easy to change this table, because we would need to stop a lot of applications).
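If the table could ever be rebuilt (for example online via DBMS_REDEFINITION), INTERVAL partitioning would create the 5-day partitions automatically instead of us pre-creating them; a sketch with an abbreviated column list:

```sql
-- Sketch only: abbreviated column list, illustrative table name.
create table CCARE.CHANGES_EVENT_LOG_NEW
(
  changes_event_log_id NUMBER not null,
  table_name           VARCHAR2(200),
  identifier_3_number  NUMBER,
  reg_date_time        DATE default sysdate
)
partition by range (reg_date_time)
interval (numtodsinterval(5, 'DAY'))  -- one new partition per 5 days, created on demand
(
  partition D200901 values less than
    (TO_DATE('2020-09-01', 'YYYY-MM-DD'))
);
```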
In the triggers on these tables I just insert the table name and the identifiers of the changed rows; changes_event_log_id comes from a sequence. I thought it would then be very easy to track changes: save the last processed changes_event_log_id and on the next iteration run a select like this:
select * from CCARE.CHANGES_EVENT_LOG where changes_event_log_id > :last_changes_event_log_id;
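For context, each tracked table has a trigger along these lines (a sketch: the SUBSCRIBER table, the trigger name, and the sequence name CHANGES_EVENT_LOG_SEQ are illustrative, not my real names):

```sql
-- Sketch only: one such trigger exists per tracked table.
create or replace trigger CCARE.TRG_SUBSCRIBER_LOG
after insert or update or delete on CCARE.SUBSCRIBER
for each row
begin
  insert into CCARE.CHANGES_EVENT_LOG
    (changes_event_log_id, table_name, identifier_1_number)
  values
    (CCARE.CHANGES_EVENT_LOG_SEQ.nextval,
     'SUBSCRIBER',
     coalesce(:new.subscriber_id, :old.subscriber_id));  -- :old for deletes
end;
/
```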
But I faced two major problems:
First, we are using Oracle RAC with 3 nodes, and I saw huge gaps in changes_event_log_id, because in RAC Oracle caches sequence values per node, and ORDER sequences would be another overhead. Second, there were phantom rows appearing between transactions: the first transaction may last longer than the second, so its ids become visible later, and tracking by changes_event_log_id alone had big issues.
I decided not to change this method; instead, the processor now changes the status of each row (I used the identifier_3_number column, because I couldn't alter the table under heavy load), since that is very fast, and I made another thread which detects unprocessed rows:
select * from CCARE.CHANGES_EVENT_LOG t where t.identifier_3_number is null;
But it was very slow (about 300 ms, which is an issue for us), so I decided to make a query like this:
select t.changes_event_log_id
from CCARE.CHANGES_EVENT_LOG t
where t.reg_date_time > v_last_action
and t.identifier_3_number is null
on each iteration with v_last_action = sysdate - 1/24/10 (6 minutes back), and on every 50th iteration with v_last_action = sysdate - 1 (a full day back).
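The window logic above can be sketched like this (v_iteration is an illustrative name for my loop counter; 1/24/10 of a day is 6 minutes):

```sql
-- Sketch of the polling loop's window selection.
if mod(v_iteration, 50) = 0 then
  v_last_action := sysdate - 1;        -- every 50th pass: look back a full day
else
  v_last_action := sysdate - 1/24/10;  -- normally: look back 6 minutes
end if;

select t.changes_event_log_id
  bulk collect into v_ids
  from CCARE.CHANGES_EVENT_LOG t
 where t.reg_date_time > v_last_action
   and t.identifier_3_number is null;
```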
It worked fine, but I see that under heavy database load I get delays of several seconds, rarely 30 seconds or even more; in a normal situation it's no more than 15-20 milliseconds.
As I see, there are no more than 3000 records per second at peak (30 on average), and about 2M records per day.
Would you suggest using CQN (Continuous Query Notification) to track this table with a query like this?
select t.changes_event_log_id from CCARE.CHANGES_EVENT_LOG t where t.identifier_3_number is null
Would it cause big overhead and performance degradation for the database system?
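In case it helps, this is roughly how I would register it (a sketch based on the documented DBMS_CQ_NOTIFICATION API; CCARE.CHG_CALLBACK is an assumed callback procedure with the standard CQ_NOTIFICATION$_DESCRIPTOR signature, and I assume the CHANGE NOTIFICATION privilege is granted):

```sql
-- Sketch only: register a query-result-change notification on the log table.
declare
  v_regid   NUMBER;
  v_cursor  SYS_REFCURSOR;
  v_reginfo CQ_NOTIFICATION$_REG_INFO;
begin
  v_reginfo := CQ_NOTIFICATION$_REG_INFO(
    callback          => 'CCARE.CHG_CALLBACK',
    qosflags          => DBMS_CQ_NOTIFICATION.QOS_QUERY    -- query-result-change mode
                       + DBMS_CQ_NOTIFICATION.QOS_ROWIDS,  -- include rowids in notifications
    timeout           => 0,
    operations_filter => 0,
    transaction_lag   => 0);
  v_regid := DBMS_CQ_NOTIFICATION.NEW_REG_START(v_reginfo);
  open v_cursor for
    select t.changes_event_log_id
      from CCARE.CHANGES_EVENT_LOG t
     where t.identifier_3_number is null;
  close v_cursor;
  DBMS_CQ_NOTIFICATION.REG_END;
end;
/
```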
Thank you