Handle individual UKs on bulk inserts

Question and Answer

Connor McDonald

Thanks for the question, Alexander.

Asked: October 20, 2016 - 2:17 pm UTC

Last updated: October 20, 2016 - 4:01 pm UTC

Version: 11.2

Viewed 1000+ times

You Asked

Hello,
I need to execute a bulk insert into a table where two columns have unique constraints. One column has native values (which cannot be changed) and the other contains an abstract pseudo-random value (I generate it myself, but I cannot change the algorithm). For example:
create table PERSON (
SSN varchar2(7),
TOKEN varchar2(20)
);
alter table PERSON add constraint PK_PERSON primary key (SSN);
alter table PERSON add constraint UK_PERSON_TOKEN unique (TOKEN);

My scenario is -
1) execute bulk insert
2) analyze the errors; for errors that can be fixed by using new values (UK_PERSON_TOKEN violations), I create a subset of the original collection containing only the failed records, re-generate the TOKEN values, and retry step #1 with this new collection.

To make this analysis possible I need to know which record caused the issue and which constraint was violated.

I can use two methods -
A) The 'SAVE EXCEPTIONS' clause. For individual errors it populates the sql%bulk_exceptions collection. Each element has ERROR_INDEX and ERROR_CODE values (in my case the code is always 1, i.e. ORA-00001). But it doesn't store error messages, so I am unable to identify which column had the duplicate value: SSN (non-recoverable) or TOKEN (recoverable).
B) The 'LOG ERRORS REJECT LIMIT UNLIMITED' clause. It populates an error log table with all failed records and error information, including the error message, from which I can deduce the constraint name. The only part missing is the index within the collection used for the bulk operation. The only way to identify a failed collection element is to loop through the whole collection and try to match each element against an error log record. That is slow, and it also doesn't work well when the initial collection contains two records with duplicate SSN values.

So my question is: is it possible to get information from a bulk insert that includes both the collection index and the constraint violated by that collection element?

Thank you,
Alex

and Chris said...

Log errors includes the name of the failed constraint, and the error table holds the values from the failed columns. So you can bulk-select the rows that violated the constraint you want to re-run:

create table t (
  x int constraint x_u unique,
  y int constraint y_u unique
);

exec dbms_errlog.create_error_log ('t');

declare
  type rec is table of t%rowtype index by binary_integer;
  r rec;
  fail rec;
begin
  r(1).x := 1;
  r(1).y := 1;
  r(2).x := 1;
  r(2).y := 2;
  r(3).x := 3;
  r(3).y := 1;
  
  forall i in 1 .. r.count 
    insert into t values (r(i).x, r(i).y)
    log errors reject limit unlimited;

  select x, y  
  bulk collect into fail 
  from   err$_t
  where  ora_err_mesg$ like '%Y_U%';
  
  for i in 1 .. fail.count loop
    dbms_output.put_line(
      'X: ' || fail(i).x || 
      '; Y: ' || fail(i).y
    );
  end loop;
end;
/

X: 3; Y: 1


Then manipulate your array as needed before re-inserting.

Or, if you're happy dealing only with failures from one constraint, you could always ignore the other!

The ignore_row_on_dupkey_index hint silently swallows ORA-00001 exceptions for the named index. So you can use it to raise exceptions only for one of the constraints!

Then you could use either log errors or forall ... save exceptions:

rollback;

declare
  type rec is table of t%rowtype index by binary_integer;
  r rec;
  
  bulk_errors exception;
  pragma exception_init (bulk_errors, -24381);
begin
  r(1).x := 1;
  r(1).y := 1;
  r(2).x := 1;
  r(2).y := 2;
  r(3).x := 3;
  r(3).y := 1;
  
  forall i in 1 .. r.count save exceptions
    insert /*+ ignore_row_on_dupkey_index(t,x_u) */ 
    into t values (r(i).x, r(i).y);

exception
  when bulk_errors then
    for i in 1 .. sql%bulk_exceptions.count loop
      dbms_output.put_line(
        'X: ' || r(sql%bulk_exceptions (i).error_index).x || 
        '; Y: ' || r(sql%bulk_exceptions (i).error_index).y 
      );
    end loop;
end;
/

X: 3; Y: 1


Comments

Now I have some way to handle errors efficiently

Alexander Zemerov, October 20, 2016 - 3:55 pm UTC

Well, LOG ERRORS doesn't help me very much - I don't know the index of the failed record within my collection (as I stated in my question).
But using SAVE EXCEPTIONS and ignoring one of the constraints is a very nice idea.
Thank you so much!

Chris Saxon
October 20, 2016 - 4:01 pm UTC

I don't know the index of failed record within my collection

You don't need to - you just select all values into a new collection.

Glad the hint helps though.

Found additional solution and limitation

Alexander Zemerov, October 20, 2016 - 4:56 pm UTC

On a good note -
Instead of the IGNORE_ROW_ON_DUPKEY_INDEX hint we can use CHANGE_DUPKEY_ERROR_INDEX, which lets us distinguish two different UKs. Unfortunately, like many other Oracle features, it is not flexible - you cannot parameterize it with a custom error code (it always substitutes ORA-38911 for ORA-00001), so you can single out only one constraint this way.
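A minimal sketch of this, reusing the t/x_u/y_u table from the answer above: the hint converts ORA-00001 on the named index into ORA-38911, which can be trapped in its own handler.

```sql
declare
  dup_on_x exception;
  pragma exception_init (dup_on_x, -38911);  -- raised instead of ORA-00001 for x_u
begin
  insert into t values (1, 1);
  insert /*+ change_dupkey_error_index(t, x_u) */
  into t values (1, 2);   -- duplicate X => ORA-38911, not ORA-00001
exception
  when dup_on_x then
    dbms_output.put_line('X_U violated');
  when dup_val_on_index then
    dbms_output.put_line('some other unique constraint violated');
end;
/
```

Only one index can be named per statement, which is the limitation described above.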

On a bad note -
It looks like both IGNORE_ROW_ON_DUPKEY_INDEX and CHANGE_DUPKEY_ERROR_INDEX work only with unique indexes, not with unique constraints. I.e. if you create unique constraints on non-unique indexes, the above hints fail with "ORA-38913: Index specified in the index hint is invalid".

create table t (
  x int,
  y int
);
create index x_u on T(x);
create index y_u on T(y);
alter table T add constraint x_u primary key (x) using index x_u;
alter table T add constraint y_u unique (y) using index y_u; 
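A possible workaround, assuming you are free to rebuild the indexes, is to create the indexes themselves as unique so the hints have a unique index to name:

```sql
drop table t purge;

create table t (
  x int,
  y int
);
-- unique indexes, so the dupkey hints can reference them
create unique index x_u on t(x);
create unique index y_u on t(y);
alter table t add constraint x_u primary key (x) using index x_u;
alter table t add constraint y_u unique (y) using index y_u;
```

Note this trades away the option of making the constraints deferrable, since deferrable unique/primary key constraints require non-unique indexes.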
