In Oracle, operations performed in BULK will generally outperform slow-by-slow (row-by-row) processing - in 10g, 9i, 7.x, whatever version.
In fact, if you can get it into a SINGLE SQL STATEMENT, the biggest bulk process of them all, you are in general best off...
150,000 is a very small number of records.
My choice would be just a normal insert. There would typically be no need for a direct path operation on such a tiny set of data. A conventional path insert is free to reuse existing free space, whereas a direct path insert is not - it always allocates new space above the high water mark.
So, if this is data to be loaded from a file, I would use:
insert into table select * from external_table;
that is, the file would be mapped as an external table (so you can query the file as if it were a table) and just inserted.
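A minimal sketch of what that setup might look like - the directory path, table names, and column list here are all made-up for illustration, not anything from the original question:

```sql
-- Hypothetical names: LOAD_DIR, EMP_EXT, EMP and emp.dat are assumptions.
create or replace directory load_dir as '/data/loads';

create table emp_ext (
  empno  number,
  ename  varchar2(30),
  sal    number
)
organization external (
  type oracle_loader
  default directory load_dir
  access parameters (
    records delimited by newline
    fields terminated by ','
    missing field values are null
  )
  location ('emp.dat')
);

-- Now the flat file can be queried and loaded like any other table:
insert into emp select * from emp_ext;
```

Once the external table is defined, the "load" really is that one insert - you can also filter, join, or transform in the SELECT if the data needs cleanup on the way in.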
I would not write any code for this at all - if some of the data might "fail", I would use the new DML error logging feature of 10g (see http://asktom.oracle.com/Misc/how-cool-is-this.html) - but NO CODE, just an insert.
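To make that concrete, here is roughly what DML error logging looks like. Again, EMP and EMP_EXT are hypothetical names; DBMS_ERRLOG generates the ERR$_EMP table for you:

```sql
-- Create the error log table (defaults to the name ERR$_EMP):
begin
  dbms_errlog.create_error_log( dml_table_name => 'EMP' );
end;
/

-- The insert keeps going past bad rows instead of rolling back:
insert into emp
select * from emp_ext
log errors into err$_emp ('load 1') reject limit unlimited;

-- Rows that failed (constraint violations, conversion errors, ...)
-- are captured with the Oracle error number and message:
select ora_err_number$, ora_err_mesg$ from err$_emp;
```

The good rows go into the table, the bad rows go into the error log with the reason they failed - and you still wrote no procedural code.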
If you needed to load over a network (e.g. the file is on your PC and the server is over "there" somewhere), then I would use sqlldr - but again, NO CODE.
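A sketch of what that sqlldr setup might be - the control file contents, connect string, and table name below are assumptions for illustration:

```sql
-- emp.ctl : a minimal SQL*Loader control file (names are hypothetical)
-- load data
-- infile 'emp.dat'
-- into table emp
-- fields terminated by ','
-- ( empno, ename, sal )

-- Then, from the client machine where the file lives:
-- $ sqlldr userid=scott/tiger@remote_db control=emp.ctl log=emp.log
```

sqlldr reads the file locally and loads it across the network into the remote database - declarative configuration in the control file, still no procedural code.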