SQL Query to split rows based on dates

Question and Answer

Chris Saxon

Thanks for the question, dave.

Asked: March 17, 2017 - 5:20 am UTC

Last updated: March 17, 2017 - 3:29 pm UTC

Version: Oracle 11g

Viewed 1000+ times

You Asked

Could you please help me with a SQL that would generate the desired output?

create table user_tmp 
(vname varchar2(15), external_pgm varchar2(1), eff_date date, exp_date date);

insert 
  into user_tmp 
  values ('JOHN', 'A', date'2016-01-01', date'2016-04-01');

insert 
  into user_tmp 
  values ('JOHN', 'B', date'2016-02-15', date'2016-05-01');

COMMIT;

VNAME   A-IND  B-IND  EFF_DATE     EXP_DATE
JOHN      Y     N      01-JAN-16    14-FEB-16
JOHN      Y     Y      15-FEB-16    01-APR-16
JOHN      N     Y      02-APR-16    01-MAY-16


with LiveSQL Test Case:

and Chris said...

Thanks for providing a LiveSQL test. But we still need a description of the process for generating your output!

I'm guessing that where the programs overlap, you want to generate a new row showing this. If so, you can do it by:

- Generating the dates between the first eff_date and the last exp_date
- Joining your table to these generated dates where each date falls between eff_date and exp_date

create table user_tmp 
(vname varchar2(15), external_pgm varchar2(1), eff_date date, exp_date date);

insert 
  into user_tmp 
  values ('JOHN', 'A', date'2016-01-01', date'2016-04-01');

insert 
  into user_tmp 
  values ('JOHN', 'B', date'2016-02-15', date'2016-05-01');

COMMIT;

with rws as (
  select date'2016-01-01'+level-1 dt from dual 
  connect by level <= (select max(exp_date)-min(eff_date)+1 from user_tmp)
), dates as (
  select dt, vname, external_pgm from rws
  join   user_tmp
  on     dt between eff_date and exp_date
)
  select * from dates;

DT                   VNAME           E
-------------------- --------------- -
01-JAN-2016 00:00:00 JOHN            A
02-JAN-2016 00:00:00 JOHN            A
03-JAN-2016 00:00:00 JOHN            A
...


So at this point you have a list of dates and which programs are running on each. What you want is a single row for each date, converting the programs to columns, i.e. a pivot! To write the pivot clause you need:

- The value to appear in the columns. You just want a Y flag, so use min('Y')
- The column whose values become column headings, which is your external_pgm
- The list of external_pgm values to become columns ('A', 'B')

This gives a pivot clause of:

pivot (min('Y') for external_pgm in ('A' a, 'B' b))


Plug this into your previous query. Then group by the generated A and B columns, mapping nulls to N if you want. And return the min and max dates for these groups:

with rws as (
  select date'2016-01-01'+level-1 dt from dual 
  connect by level <= (select max(exp_date)-min(eff_date)+1 from user_tmp)
), dates as (
  select dt, vname, external_pgm from rws
  join   user_tmp
  on     dt between eff_date and exp_date
), cols as (
  select * from dates
  pivot (min('Y') for external_pgm in ('A' a, 'B' b))
)
  select vname, nvl(a, 'N') a, nvl(b, 'N') b, min(dt), max(dt)
  from   cols
  group  by vname, nvl(a, 'N'), nvl(b, 'N')
  order  by min(dt);

VNAME           A B MIN(DT)              MAX(DT)            
--------------- - - -------------------- --------------------
JOHN            Y N 01-JAN-2016 00:00:00 14-FEB-2016 00:00:00
JOHN            Y Y 15-FEB-2016 00:00:00 01-APR-2016 00:00:00
JOHN            N Y 02-APR-2016 00:00:00 01-MAY-2016 00:00:00


If each program only occurs once, you're done! But if program A could restart later, you'll run into problems:

insert  into user_tmp 
  values('JOHN','A', '15-APR-2016', '01-JUN-2016' );
  
with rws as (
  select date'2016-01-01'+level-1 dt from dual 
  connect by level <= (select max(exp_date)-min(eff_date)+1 from user_tmp)
), dates as (
  select dt, vname, external_pgm from rws
  join   user_tmp
  on     dt between eff_date and exp_date
), cols as (
  select * from dates
  pivot (min('Y') for external_pgm in ('A' a, 'B' b))
)
  select vname, nvl(a, 'N') a, nvl(b, 'N') b, min(dt), max(dt)
  from   cols
  group  by vname, nvl(a, 'N'), nvl(b, 'N')
  order  by min(dt);

VNAME           A B MIN(DT)              MAX(DT)            
--------------- - - -------------------- --------------------
JOHN            Y N 01-JAN-2016 00:00:00 01-JUN-2016 00:00:00
JOHN            Y Y 15-FEB-2016 00:00:00 01-MAY-2016 00:00:00
JOHN            N Y 02-APR-2016 00:00:00 14-APR-2016 00:00:00


If this is the case, you can use the Tabibitosan method to group consecutive rows together. You do this by:

- Assigning a row_number() to each (A, B) group, ordered by date
- Subtracting this number from dt

This gives consecutive dates for an (A, B) pair the same value; non-consecutive dates get a different value. So repeat the previous group by, but include this expression.
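To see the arithmetic in isolation, here is a minimal sketch (the sample dates are invented for illustration): subtracting the row number from each date gives the same result for every date in a consecutive run, and a different result once there is a gap.

```sql
with demo as (
  select date'2016-01-01' dt from dual union all
  select date'2016-01-02' from dual union all
  select date'2016-01-03' from dual union all
  select date'2016-05-02' from dual union all
  select date'2016-05-03' from dual
)
select dt,
       row_number() over (order by dt) rn,
       dt - row_number() over (order by dt) grp  -- constant within each run
from   demo;
```

The first three rows all get grp = 31-DEC-2015; after the gap, the last two both get grp = 28-APR-2016. Grouping by grp therefore separates the two runs.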

And voila:

with rws as (
  select date'2016-01-01'+level-1 dt from dual 
  connect by level <= (select max(exp_date)-min(eff_date)+1 from user_tmp)
), dates as (
  select dt, vname, external_pgm from rws
  join   user_tmp
  on     dt between eff_date and exp_date
), cols as (
  select * from dates
  pivot (min('Y') for external_pgm in ('A' a, 'B' b))
), tabibitosan as (
  select vname, nvl(a, 'N') a, nvl(b, 'N') b, dt, 
         dt - row_number() over (
           partition by nvl(a, 'N'), nvl(b, 'N') order by dt
         ) grp
  from   cols
)
  select vname, a, b, min(dt), max(dt) from tabibitosan
  group  by vname, grp, a, b
  order  by min(dt);

VNAME           A B MIN(DT)              MAX(DT)            
--------------- - - -------------------- --------------------
JOHN            Y N 01-JAN-2016 00:00:00 14-FEB-2016 00:00:00
JOHN            Y Y 15-FEB-2016 00:00:00 01-APR-2016 00:00:00
JOHN            N Y 02-APR-2016 00:00:00 14-APR-2016 00:00:00
JOHN            Y Y 15-APR-2016 00:00:00 01-MAY-2016 00:00:00
JOHN            Y N 02-MAY-2016 00:00:00 01-JUN-2016 00:00:00


For more on pivoting, read: https://blogs.oracle.com/sql/entry/how_to_convert_rows_to

And Tabibitosan, watch: https://www.youtube.com/watch?v=yvimYixXo2Q


Comments

Great explanation

dave noel, March 17, 2017 - 2:48 pm UTC

Thank you so much for the wonderful explanation and solution.

Chris Saxon
March 17, 2017 - 3:29 pm UTC

Thanks! :)
