
Question and Answer

Chris Saxon

Thanks for the question, Barbara.

Asked: October 23, 2019 - 10:09 am UTC

Last updated: November 06, 2019 - 2:13 pm UTC

Version: 12c

Viewed 1000+ times

You Asked

Hello AskTom-Team,

Is there a possibility to constrain and check JSON data?
For example:

insert into test(questionnaire_id, var_val, year)
values ( 1, '{
        "f1":  "2571",
        "f11":  "38124",
        "f31":  "332.64",
        "f41":  "332.64",
        "f190":  "4.62",
        "f194":  "7.00",
        "f201":  "46.86",
        "f488":  "Braunschweig"
    }', 2017);


I want to check, e.g.:
1) whether the value for f1 exists in another table wz,
2) whether the value belonging to f31 is greater than or equal to 0,
3) whether the combination of values from f11 and f488 (e.g. 38124, Braunschweig) exists in a table plz.

I found something for SQL Server tables but I cannot transform this to Oracle tables. I really appreciate your help.

Thanks!

and Chris said...

1. Add a virtual column returning the value for f1. And create a foreign key on that.

2. Add a check constraint verifying the value >= 0.

3. Not sure what you're asking here... if you mean the f11/f488 pair must exist in another table, there's a sketch after the demo below.

create table wz (
  f1 varchar2(10) primary key
);

create table test (
  questionnaire_id integer, 
  var_val varchar2(1000)
    check ( var_val is json ), 
  year    integer
);

insert into wz values ( '2571' );
commit;

insert into test(questionnaire_id, var_val, year)
values ( 1, '{
      "f1":  "2571",
      "f11":  "38124",
      "f31":  "332.64",
      "f41":  "332.64",
      "f190":  "4.62",
      "f194":  "7.00",
      "f201":  "46.86",
      "f488":  "Braunschweig"
  }',2017
);

alter table test
  add f1 varchar2(10) 
    generated always as ( 
      json_value ( var_val, '$.f1' returning varchar2(10) ) 
    )
    references wz ( f1 );
  
insert into test(questionnaire_id, var_val, year)
values ( 2, '{
    "f1":  "not here",
    "f11":  "38124",
    "f31":  "332.64",
    "f41":  "332.64",
    "f190":  "4.62",
    "f194":  "7.00",
    "f201":  "46.86",
    "f488":  "Braunschweig"
}',2017
);

ORA-02291: integrity constraint (CHRIS.SYS_C009543) violated - parent key not found

alter table test
  add constraint f31_c
  check ( json_value ( var_val, '$.f31' ) >= 0 );
  
insert into test(questionnaire_id, var_val, year)
values ( 3, '{
      "f1":  "2571",
      "f11":  "38124",
      "f31":  -1,
      "f41":  "332.64",
      "f190":  "4.62",
      "f194":  "7.00",
      "f201":  "46.86",
      "f488":  "Braunschweig"
  }',2017
);

ORA-02290: check constraint (CHRIS.F31_C) violated

select questionnaire_id from test;

QUESTIONNAIRE_ID   
                  1 
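
Regarding point 3: if the intent is that the (f11, f488) pair (postal code and city) must exist in a lookup table plz, one option is to extend the same virtual-column idea to a composite foreign key. This is only a sketch; the plz column names (plz_code, city) and its composite primary key are assumptions for illustration.

-- assumed lookup table; column names are made up
create table plz (
  plz_code varchar2(10),
  city     varchar2(30),
  constraint plz_pk primary key ( plz_code, city )
);

-- expose f11 and f488 as virtual columns...
alter table test add f11_v varchar2(10)
  generated always as ( json_value ( var_val, '$.f11' returning varchar2(10) ) );

alter table test add f488_v varchar2(30)
  generated always as ( json_value ( var_val, '$.f488' returning varchar2(30) ) );

-- ...then add a composite foreign key over the pair
alter table test add constraint test_plz_fk
  foreign key ( f11_v, f488_v ) references plz ( plz_code, city );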


Comments

Limit for virtual columns

Barbara, October 24, 2019 - 8:20 am UTC

Thanks for the great solution, it works fine for me.
I was wondering whether there is a limit on the number of virtual columns. In the scenario mentioned above, the table test contains items and answers from a yearly questionnaire. Each year there are some new items in the questionnaire. As a consequence I have to add new virtual columns yearly for constraining the data. Will this be a problem (because I learned that adding new (not virtual) columns to a table is not a good idea)?
BTW, because of the new items in the questionnaire I decided to use JSON data (instead of a column for each questionnaire item) to be most flexible. But then I faced the problem of constraining it.
I hope it is okay that I posted the related question here instead of asking a new one.
Chris Saxon
October 24, 2019 - 8:26 am UTC

There's a limit of 1,000 columns per table in Oracle Database. Both physical and virtual columns count towards this limit.

because I learned that adding new (not virtual) columns to a table is not a good idea

I'm not sure who told you that, but it's TERRIBLE advice!

If you have a new data item to store, you should create a new column. Adding a column is simple. And makes your data much easier to work with.
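
For example (the column name and constraint are made up), a new questionnaire item becomes a regular column with one DDL statement, optionally with its constraint in the same step:

alter table test add f700_score number check ( f700_score between 0 and 100 );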

JSON SCHEMA

Barbara Grave, November 05, 2019 - 10:10 am UTC

I just came across the possibility of using a JSON schema for the validation of my JSON data. In your opinion, would this also be a (good) possibility to check the data types of the incoming JSON data? Especially because I have to validate the data types of 700 variables (f1 - f700), which would lead to 700 virtual columns. Should I rather use the virtual columns for other constraints, e.g. for the foreign keys?
I'm looking forward to your reply.
Chris Saxon
November 05, 2019 - 1:20 pm UTC

What checks are you planning on doing exactly?

You could define virtual columns for every attribute, but there's no need to. You can add a check constraint doing the validation as with the value >= 0 example above.
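
For instance, a range check on f190 could look like the sketch below (the attribute, range, and constraint name are just for illustration). Note that with the default NULL ON ERROR behaviour, json_value returns NULL for a missing or non-numeric f190, and a NULL result passes a check constraint, so this enforces the range but not presence or data type on its own.

alter table test
  add constraint f190_range_c
  check ( json_value ( var_val, '$.f190' returning number ) between 0 and 100 );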

JSON SCHEMA

Barbara Grave, November 06, 2019 - 7:41 am UTC

I have to check, for all of the 700 variables:
- the data type.
For some of the 700 variables I have to check/add:
- > 0 or
- between 0 and 100 or
- an fk to another table
- NOT NULL constraints
Regarding the NOT NULL constraints: the NOT NULL constraint may not be valid for all years, e.g. a variable becomes mandatory in year x (before year x it was not mandatory), or a new variable is added that is mandatory (in terms of a virtual column there would be NULL values for the years the variable was not asked).
As I understand it:
For the check constraints (> 0, between 0 and 100) I can add constraints to JSON values as suggested by you, e.g.:
alter table test
  add constraint vc_f31
  check ( json_value ( var_val, '$.f31' ) >= 0 );

For the foreign key I have to add a virtual column (since I cannot add an fk to JSON values, is this correct?), following again your suggestion:
alter table test
    add f1 varchar2(10)
    generated always as (
        json_value ( var_val, '$.f1' returning varchar2(10) )
    )
    references wz ( f1 );

Since the NOT NULL constraints are not fixed over time, I cannot implement them in the database. Is this correct? What would be a good solution for this?
For checking the data types of the 700 variables, what would be the best solution: adding a virtual column for each of the variables or using a JSON schema?
Chris Saxon
November 06, 2019 - 2:13 pm UTC

If you need to validate the data types, etc. for the contents of your JSON...

...why not shred it into its relational form and store the data in a "proper" table?

This will make adding constraints etc. MUCH easier. You can use plain SQL, instead of first having to extract the relevant items with JSON_value or similar.
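
A sketch of that shredding step, using JSON_TABLE and assuming a hypothetical target table test_relational with just a few of the attributes:

-- hypothetical relational target; the constraints become ordinary column constraints
create table test_relational (
  questionnaire_id integer,
  f1               varchar2(10) references wz ( f1 ),
  f11              varchar2(10),
  f31              number check ( f31 >= 0 ),
  f488             varchar2(30),
  year             integer
);

-- shred each JSON document into columns with json_table
insert into test_relational ( questionnaire_id, f1, f11, f31, f488, year )
  select t.questionnaire_id, j.f1, j.f11, j.f31, j.f488, t.year
  from   test t,
         json_table (
           t.var_val, '$'
           columns (
             f1   varchar2(10) path '$.f1',
             f11  varchar2(10) path '$.f11',
             f31  number       path '$.f31',
             f488 varchar2(30) path '$.f488'
           )
         ) j;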

since I cannot add an fk to JSON values, is this correct?

Correct. You can't create an FK on "part" of the data in a column.

Since the NOT NULL constraints are not fixed over time, I cannot implement them in the database. Is this correct?

Wrong. You can create constraints like:

check ( case
          when extract ( year from dt ) = 2019
          then <expr>
          else 1
        end is not null )


<expr> could be a regular column. Or a JSON_value call. Or deterministic functions that return a single value.

Though this assumes you can get the DT value from another column in the row. If you mean conditionally based on SYSDATE, then no, you can't make a constraint to do this.
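
For instance, a concrete (assumed) version of that pattern using the year column already stored in test, supposing f201 only became mandatory from 2019 onwards:

-- hedged sketch: f201 must be present when year >= 2019; older rows are exempt
alter table test
  add constraint f201_nn_from_2019_c
  check ( case
            when year >= 2019
            then json_value ( var_val, '$.f201' )
            else 'x'
          end is not null );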
