Manually pivoting large amounts of data to be PIPELINED through a function


Question and Answer

Chris Saxon

Thanks for the question, Dawne.

Asked: August 10, 2017 - 12:39 pm UTC

Last updated: August 11, 2017 - 10:10 am UTC

Version: 12.1.0.2.0

Viewed 1000+ times

You Asked

I have 3 SQL statements that run independently and produce the required output in the required format. However, I need to combine these statements so that they are all contained in, and called from, one stored procedure or similar. I need to use a PIPELINED function so that the customer can call the function and retrieve the data. The final output contains 103 columns.

The 1st SQL retrieves the requested data. The data has an ID, a schedule date and an amount. The schedule date is split out into quarter hours, so that 1 day for 1 ID produces 96 data rows. This is done using code like this:

JOIN (SELECT level,
             (sched_start_date + (LEVEL-1)/96) AS loc_start_date_time,
             (sched_start_date + LEVEL/96)     AS loc_end_date_time
      FROM   (SELECT to_date('01-12-2017', 'DD-MM-YYYY')   AS sched_start_date,
                     to_date('31-12-2017', 'DD-MM-YYYY')+1 AS sched_end_date
              FROM   dual)
      CONNECT BY LEVEL <= (sched_end_date - sched_start_date) * 96) range
  ON  std.loc_start_date_time < range.loc_end_date_time
  AND std.loc_end_date_time   > range.loc_start_date_time


The 2nd SQL then takes this data and manually pivots it into another temporary table that has the 103 columns. There are 96 quarter-hour columns, named like: 00, 15, 30, 45, 100, 115, 130, 145, .... 1100, 1115, 1130, 1145.
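
A cut-down sketch of that manual pivot, with placeholder table and column names and only the first few of the 96 quarter-hour columns shown:

SELECT id,
       dy,
       MAX(CASE WHEN qtr = '0000' THEN amt END) AS "00",
       MAX(CASE WHEN qtr = '0015' THEN amt END) AS "15",
       MAX(CASE WHEN qtr = '0030' THEN amt END) AS "30",
       MAX(CASE WHEN qtr = '0045' THEN amt END) AS "45"
       -- ... and so on for the remaining quarter-hour columns
FROM   interval_rows          -- placeholder for the table built by the 1st SQL
GROUP  BY id, dy;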

The 3rd SQL is used to output the data using the PIPELINE function.

The input for all of this is a date range (start_date, end_date), so there will be a huge number of data rows to process. This is why I chose NOT to use the PIVOT function. Is there a way to do this without pivoting the data? Is it possible to use the PIPELINED function so that I export data in chunks rather than 1 row at a time? I need the chunk of data to pivot on.
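
For reference, by PIPELINED function I mean the usual pipelined table function shape, roughly like this (the type, function, table and column names are placeholders, cut down from the full 103 columns):

CREATE OR REPLACE TYPE sched_row_t AS OBJECT (
  id  NUMBER,
  dy  DATE,
  q00 NUMBER,
  q15 NUMBER
  -- one attribute per quarter-hour column in the real version
);
/
CREATE OR REPLACE TYPE sched_tab_t AS TABLE OF sched_row_t;
/
CREATE OR REPLACE FUNCTION get_sched (
  p_start_date DATE,
  p_end_date   DATE
) RETURN sched_tab_t PIPELINED IS
BEGIN
  FOR r IN (
    SELECT id, dy, q00, q15            -- stand-in for the pivoted result set
    FROM   pivoted_interval_data       -- placeholder table name
    WHERE  dy BETWEEN p_start_date AND p_end_date
  ) LOOP
    PIPE ROW ( sched_row_t(r.id, r.dy, r.q00, r.q15) );
  END LOOP;
  RETURN;
END;
/

-- the customer then calls it with:
-- SELECT * FROM TABLE(get_sched(:start_date, :end_date));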
Thanks!

and Chris said...

What exactly is the problem that this "huge amount of data rows to process" and pivoting is causing you? Did you test this and run into a problem, or are you just assuming there will be an issue?

But more importantly, why are you splitting this into three statements?!

Surely you can select your data, chunk it into 15 min intervals and pivot all in one go?

e.g.:

with sched as (
  select date'2017-12-01'+(rownum-1)/12 st, date'2017-12-01'+rownum/6 en
  from   dual
  connect by level <= 50
), dts as (
  select date'2017-12-01' + (rownum - 1) / 96 dt
  from   dual
  connect by level <= ((date'2017-12-10' - date'2017-12-01') + 1 ) * 96
), qtr_hours as (
  select trunc(dt) dy, to_char(dt, 'hh24mi') qtr,
         st
  from   dts
  left join   sched
  on     st <= dt
  and    en > dt
)
  select * from qtr_hours
  pivot (
    count(st) for qtr in ('0000', '0015', '0030', '0045') -- etc.
  )
  order  by 1;

DY                    '0000'  '0015'  '0030'  '0045'  
01-DEC-2017 00:00:00  1       1       1       1       
02-DEC-2017 00:00:00  7       7       7       7       
03-DEC-2017 00:00:00  13      13      13      13      
04-DEC-2017 00:00:00  19      19      19      19      
05-DEC-2017 00:00:00  25      25      25      25      
06-DEC-2017 00:00:00  20      20      20      20      
07-DEC-2017 00:00:00  14      14      14      14      
08-DEC-2017 00:00:00  8       8       8       8       
09-DEC-2017 00:00:00  2       2       2       2       
10-DEC-2017 00:00:00  0       0       0       0 

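If you do still need to hand the result back through a pipelined table function, the whole statement can go inside one, parameterised by the date range. A rough sketch along those lines, reusing the dummy schedule data from above (the type and function names are invented for the example, and only four of the quarter-hour columns are declared):

create or replace type qtr_row_t as object (
  dy    date,
  q0000 number,
  q0015 number,
  q0030 number,
  q0045 number
  -- add one attribute per remaining quarter-hour column
);
/
create or replace type qtr_tab_t as table of qtr_row_t;
/
create or replace function qtr_counts (
  p_start date,
  p_end   date
) return qtr_tab_t pipelined is
begin
  for r in (
    with sched as (
      -- dummy schedule rows as above; swap in your real schedule data
      select p_start + (rownum-1)/12 st, p_start + rownum/6 en
      from   dual
      connect by level <= 50
    ), dts as (
      select p_start + (rownum - 1) / 96 dt
      from   dual
      connect by level <= ((p_end - p_start) + 1) * 96
    ), qtr_hours as (
      select trunc(dt) dy, to_char(dt, 'hh24mi') qtr, st
      from   dts
      left join sched
      on     st <= dt
      and    en > dt
    )
    select * from qtr_hours
    pivot (
      count(st) for qtr in ('0000' as q0000, '0015' as q0015,
                            '0030' as q0030, '0045' as q0045) -- etc.
    )
  ) loop
    pipe row ( qtr_row_t(r.dy, r.q0000, r.q0015, r.q0030, r.q0045) );
  end loop;
  return;
end;
/

select * from table(qtr_counts(date'2017-12-01', date'2017-12-10'));
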

If there's more to this, please post:

- DDL (create table) for your tables
- Sample data (insert into) to show the problem
- Any other code you've written
- A description of exactly why what you're doing is failing
