Not thinking in sets means thinking in slow-by-slow, row-by-row terms. If you hear yourself saying or thinking "for each record, do this, then do that, then do the other thing", stop. You should be thinking: we need to do this to everything, then that, then the other thing.
You could load a warehouse by reading a record, verifying that all of that record's foreign keys are good, validating each field, and then inserting it.
What you should do instead is
load everything in one pass, running a single query that uses outer joins to verify the foreign keys and validates each field as it goes.
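The outer-join trick can be seen in miniature with any SQL engine. Here is a sketch in Python against an in-memory SQLite database; every table and column name is invented for illustration:

```python
import sqlite3

# Set-based FK check: one LEFT OUTER JOIN flags every row whose
# foreign key has no match in the lookup table -- no per-row loop.
con = sqlite3.connect(":memory:")
con.executescript("""
    create table dept (deptno integer primary key);
    insert into dept values (10), (20);
    create table staging (empno integer, deptno integer);
    insert into staging values (1, 10), (2, 20), (3, 99);  -- 99: bad FK
""")
rows = con.execute("""
    select s.empno,
           case when d.deptno is null then 1 else 0 end as bad_fk
    from staging s
    left outer join dept d on d.deptno = s.deptno
    order by s.empno
""").fetchall()
print(rows)  # -> [(1, 0), (2, 0), (3, 1)]
```

One statement classifies every row at once; row 3's missing department shows up as `bad_fk = 1` instead of raising an error mid-loop.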
For example, suppose you have an external table mapped to a gzip-compressed file. Your job is to load that file, verify data integrity, and maybe even transform some of the data.
You could:
for x in (select * from external_table)
loop
    -- check this
    -- check that
    -- check the other thing
    if (all_ok)   -- all_ok stands in for the result of those checks
    then
        insert into production_table values x;
    else
        insert into error_log values x;   -- log the bad record
    end if;
end loop;
That would be slow by slow. It would take linearly longer for every extra record you add.
Think about doing it this way instead: rather than doing something for each record, you do something to all records.
insert /*+ append */ all
  when (flag1+flag2+flag3+flag4+flag5 = 0)
  then into production_table (all of the et columns)
  else into error_logging_table (all of the et columns and the flags)
select et.*,
       case when lookup1.key is null then 1 else 0 end flag1,
       case when lookup2.key is null then 1 else 0 end flag2,
       case when lookup3.key is null then 1 else 0 end flag3,
       case when count(*) over (partition by unique_column) > 1
            then 1 else 0 end flag4,
       case when (a+b+c > 500) then 1 else 0 end flag5,
       .......
  from et, lookup1, lookup2, lookup3
 where .....
Here we decompress, verify/validate/transform (in the select, if you want), load, and compress (it is a direct path load, so we can hybrid columnar compress on engineered systems), all in one step.
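The whole pattern can be sketched end to end. SQLite has no multi-table INSERT ALL, so this sketch (all names invented, window functions need SQLite 3.25+) runs two INSERT ... SELECT statements off the same validating query; the set-based idea of computing every flag in one pass is the same:

```python
import sqlite3

# One validating query computes every flag; two INSERT...SELECTs route
# clean rows to prod and flagged rows to an error table. (Oracle would
# do both in a single multi-table INSERT ALL; SQLite lacks that, so we
# reuse the same subquery.)
con = sqlite3.connect(":memory:")
con.executescript("""
    create table et   (id integer, dept integer, amt integer);
    create table dept (deptno integer primary key);
    create table prod (id integer, dept integer, amt integer);
    create table errs (id integer, dept integer, amt integer,
                       flag_fk integer, flag_dup integer, flag_rng integer);
    insert into dept values (10), (20);
    insert into et values (1, 10, 100),                 -- clean
                          (2, 10, 100), (2, 10, 100),   -- duplicates
                          (3, 99, 100),                 -- bad foreign key
                          (4, 10, 900);                 -- out of range
""")
validating = """
    select e.id, e.dept, e.amt,
           case when d.deptno is null then 1 else 0 end as flag_fk,
           case when count(*) over (partition by e.id) > 1
                then 1 else 0 end as flag_dup,
           case when e.amt > 500 then 1 else 0 end as flag_rng
    from et e left outer join dept d on d.deptno = e.dept
"""
con.execute(f"insert into prod select id, dept, amt from ({validating})"
            " where flag_fk + flag_dup + flag_rng = 0")
con.execute(f"insert into errs select * from ({validating})"
            " where flag_fk + flag_dup + flag_rng > 0")
good = con.execute("select count(*) from prod").fetchone()[0]
bad = con.execute("select count(*) from errs").fetchone()[0]
print(good, bad)  # -> 1 4
```

Note how the duplicate check mirrors the `count(*) over (partition by ...)` flag above, wrapped in a `case` so that clean rows contribute 0 to the flag sum.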
If you are using the word "for" a lot when thinking about your large processes, you are thinking procedurally. Get out of that habit.