How to handle errors in an Oracle parallel function


Question and Answer

Connor McDonald

Thanks for the question, naveen.

Asked: February 22, 2018 - 9:30 am UTC

Last updated: February 27, 2018 - 1:48 am UTC

Version: Oracle 10g

Viewed 1000+ times

You Asked

I have a parallel-enabled pipelined function, like the one below, which takes a cursor as input and creates a flat file using UTL_FILE:

CREATE OR REPLACE FUNCTION generate_file (p_input IN SYS_REFCURSOR)
   RETURN dump_ntt  -- this is just a nested table type
   PIPELINED
   PARALLEL_ENABLE(PARTITION p_input BY ANY)
IS
   TYPE t_input IS RECORD
   (
     meta      VARCHAR2 (300)
   );

   TYPE tb_input IS TABLE OF t_input;
   v_input        tb_input;
   fHandle_ccif   UTL_FILE.FILE_TYPE;
   fHandle_idx    UTL_FILE.FILE_TYPE;
BEGIN
   FETCH p_input BULK COLLECT INTO v_input;
   CLOSE p_input;

   FOR i IN 1 .. v_input.COUNT
   LOOP
     fHandle_idx := UTL_FILE.FOPEN ('HIST_LETTER_PDF', 'H' || 'filename' || '.idx', 'w');
     UTL_FILE.PUT_LINE (fHandle_idx, v_input(i).meta);
     UTL_FILE.FCLOSE (fHandle_idx);
   END LOOP;

   PIPE ROW (dump_ot (1));
   RETURN;  -- a pipelined function ends with a bare RETURN
END;
/


This function is called by 8 parallel query slaves, like this:

  SELECT *
  FROM   TABLE(
               generate_file(
                              CURSOR(
                                     SELECT /*+ PARALLEL(s,8) */ COLUMN1
                                     FROM   table1 s
                                    )
                            )
              ) nt;



Everything works as desired. Now I need to handle exceptions so that even if one record fails, the others continue to be processed.

Any ideas, given that this is a parallel, pipelined function?





and Connor said...

I think you have misinterpreted the parallel element here.

It looks to me like you want to write a file - that must be done in serial. You can *read* the data in parallel, but ultimately you need to write one file, and that is a serial process.

So you don't need a parallel function to do this - just the query is parallel, as you've already done.
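
For illustration, here is a minimal serial sketch of that idea (the directory, table and column names are taken from the question; the per-record exception handler is one assumed way of letting the remaining rows carry on when a single row fails):

DECLARE
   fHandle   UTL_FILE.FILE_TYPE;
BEGIN
   -- one file, opened once, written by this single serial session
   fHandle := UTL_FILE.FOPEN ('HIST_LETTER_PDF', 'Hfilename.idx', 'w');

   -- the scan of table1 can still run with 8 parallel slaves
   FOR r IN (SELECT /*+ PARALLEL(s,8) */ column1
             FROM   table1 s)
   LOOP
      BEGIN
         UTL_FILE.PUT_LINE (fHandle, r.column1);
      EXCEPTION
         WHEN OTHERS THEN
            -- record the problem row and keep going with the rest
            DBMS_OUTPUT.put_line ('Row skipped: ' || SQLERRM);
      END;
   END LOOP;

   UTL_FILE.FCLOSE (fHandle);
EXCEPTION
   WHEN OTHERS THEN
      -- make sure the file handle is released before re-raising
      IF UTL_FILE.IS_OPEN (fHandle) THEN
         UTL_FILE.FCLOSE (fHandle);
      END IF;
      RAISE;
END;
/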



Comments

Questions

Gh, February 24, 2018 - 7:25 am UTC

Where does dump_ot come from?
Why a function and not a procedure?
Why pipelined?
Why are the file open and close inside the loop?

Separate file for each record using parallel function

naveen mani, February 26, 2018 - 4:02 am UTC

The requirement is to create a separate file for each record of the driving query. These separate files will be sent to Unix software for further processing.
Connor McDonald
February 27, 2018 - 1:48 am UTC

In that case, I would look at using something like

DBMS_PARALLEL_EXECUTE

Search this site for that and you'll find plenty of examples.
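
As a rough sketch of how that could look here (the task name, chunk size, and the write_files_for_range procedure, which would write one file per row in a rowid range, are hypothetical placeholders, not part of the question):

BEGIN
   DBMS_PARALLEL_EXECUTE.create_task (task_name => 'GEN_FILES');

   -- carve table1 into rowid ranges of roughly 10,000 rows each
   DBMS_PARALLEL_EXECUTE.create_chunks_by_rowid (
      task_name   => 'GEN_FILES',
      table_owner => USER,
      table_name  => 'TABLE1',
      by_row      => TRUE,
      chunk_size  => 10000);

   -- each chunk runs the statement with its own :start_id / :end_id rowids,
   -- on up to 8 job slaves running concurrently
   DBMS_PARALLEL_EXECUTE.run_task (
      task_name      => 'GEN_FILES',
      sql_stmt       => 'begin write_files_for_range(:start_id, :end_id); end;',
      language_flag  => DBMS_SQL.NATIVE,
      parallel_level => 8);

   DBMS_PARALLEL_EXECUTE.drop_task (task_name => 'GEN_FILES');
END;
/

Checking DBMS_PARALLEL_EXECUTE.task_status (or the USER_PARALLEL_EXECUTE_CHUNKS view) before dropping the task shows which chunks, if any, failed, which also gives you a natural place to handle per-chunk errors.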
