JSON containing date in Oracle Database


Question and Answer

Chris Saxon

Thanks for the question, anwesh.

Asked: July 12, 2019 - 1:52 pm UTC

Last updated: January 12, 2022 - 4:45 pm UTC

Version: 11

Viewed 10K+ times

You Asked

I have a SQL query like this:

With jtable as (:inp)
Select * from table t,jtable where t.DATE = jtable.DATE


Here, inp is a dataframe which I take care of by converting to JSON, and everything works perfectly fine except when it has to deal with a datetime object.
So if I have a dataframe with a datetime object and I push it to JSON, I get it as a POSIX timestamp, and hence the date in Oracle (which is still a datetime) won't be equal to it.

I tried changing it to the string TO_DATE(%y-%m-%d, YYYY-MM-DD) when I see a datetime in the dataframe, but it's non-numeric and hence binding doesn't work.

Is there any way to work around this so that I don't have to change my SQL queries? (The SQL query can be changed to use TO_DATE, and I can pass the datetime simply as a string.)

json_data == [{"TEST_DATE": 16163612361}]

when the dataframe has TEST_DATE = pd.Timestamp(2019, 6, 28) or something. Is there any way to tackle it?

and Chris said...

If you're on 12c, you can use the JSON* functions to extract attributes.

With these, the database can implicitly convert the values to a DATE, provided the document formats the dates to the ISO 8601 standard.

For example:

with rws as (
  select '{
  "dates" : [
    "2019-07-15",
    "15-JUL-2019",
    "2019-07-15 02:11:27 PM",
    "2019-07-15 14:11:27",
    "2019-07-15T14:11:27",
    "2019-07-15T14:11:27Z",
    "2019-07-15T14:11:27+00:00"
  ]
}' doc
  from   dual
)
  select j.*
  from   rws, json_table (
    doc, '$'
    columns 
    nested path '$.dates[*]' 
    columns (
      dt_str varchar2 path '$',
      dt_dt  date path '$' null on error
    )
  ) j;
  
DT_STR                       DT_DT                  
2019-07-15                   15-JUL-2019 00:00:00    
15-JUL-2019                  <null>                  
2019-07-15 02:11:27 PM       <null>                  
2019-07-15 14:11:27          <null>                  
2019-07-15T14:11:27          15-JUL-2019 00:00:00    
2019-07-15T14:11:27Z         15-JUL-2019 00:00:00    
2019-07-15T14:11:27+00:00    15-JUL-2019 00:00:00


So if your JSON contains dates in any other format, you'll have to convert the values yourself with to_date.
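
For instance, if the document stored the dates in a DD-MON-YYYY format instead, a rough sketch of that explicit conversion (using the same json_table approach as above; the document here is just a placeholder) could be:

with rws as (
  select '{ "d" : "15-JUL-2019" }' doc
  from   dual
)
  select to_date ( j.dt_str, 'DD-MON-YYYY' ) dt
  from   rws, json_table (
    doc, '$'
    columns ( dt_str varchar2 path '$.d' )
  ) j;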

If you're on 11g, you'll have to parse your JSON some other way (e.g. with the APEX_JSON package). But you'll almost certainly have to explicitly convert the values to make them DATEs.
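
As a rough sketch of that approach, assuming APEX (and therefore the APEX_JSON package) is installed in the 11g database, and using a placeholder document and format mask:

declare
  l_dt date;
begin
  apex_json.parse ( '{ "dates" : [ "2019-07-15", "2019-07-16" ] }' );

  for i in 1 .. apex_json.get_count ( p_path => 'dates' ) loop
    -- get the string out of the document, then convert it explicitly;
    -- the format mask must match the strings you actually store
    l_dt := to_date (
      apex_json.get_varchar2 ( p_path => 'dates[%d]', p0 => i ),
      'YYYY-MM-DD'
    );
    dbms_output.put_line ( to_char ( l_dt, 'DD-MON-YYYY HH24:MI:SS' ) );
  end loop;
end;
/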


Comments

The example in the answer is truncating the time

A. Olsen, January 12, 2022 - 4:08 pm UTC

Please note that the answer lists:
2019-07-15T14:11:27Z 15-JUL-2019 00:00:00

I would like to know how to get 15-JUL-2019 14:11:27 (on 19c).
Chris Saxon
January 12, 2022 - 4:45 pm UTC

Declare the column as a timestamp:

with rws as (
  select '{
  "dates" : [
    "2019-07-15",
    "15-JUL-2019",
    "2019-07-15 02:11:27 PM",
    "2019-07-15 14:11:27",
    "2019-07-15T14:11:27",
    "2019-07-15T14:11:27Z",
    "2019-07-15T14:11:27+00:00"
  ]
}' doc
  from   dual
)
  select j.*
  from   rws, json_table (
    doc, '$'
    columns 
    nested path '$.dates[*]' 
    columns (
      dt_str varchar2 path '$',
      dt_dt  timestamp path '$' null on error
    )
  ) j;
  
  
DT_STR                       DT_DT                             
2019-07-15                   15-JUL-2019 00.00.00.000000000    
15-JUL-2019                                                    
2019-07-15 02:11:27 PM                                         
2019-07-15 14:11:27                                            
2019-07-15T14:11:27          15-JUL-2019 14.11.27.000000000    
2019-07-15T14:11:27Z         15-JUL-2019 14.11.27.000000000    
2019-07-15T14:11:27+00:00    15-JUL-2019 14.11.27.000000000 
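
If you want a DATE that keeps the time portion (rather than a TIMESTAMP), one option, shown here only as a minimal sketch, is to cast the result; the same cast works on the dt_dt column above:

select cast ( timestamp '2019-07-15 14:11:27' as date ) dt
from   dual;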
