
I have a JSON object in the below format in a table a, in a column json_data:

{
    "body": {
        "emp number": "***",
        "id": "**",
        "details": [
            {
                "location": "***",
                "designation": "***",
                "employee type": "***"
            }
        ]
    }
}
details is an array and it is dynamic in size, but the format remains the same, i.e. it will always have the key location.

This location is present in another table b.

What is the best way to validate whether a specific employee having multiple locations has the same list of locations in both tables?

The below query works for one location:

    select b.location
    from b
    left join a
        on a.employee_number = b.employee_number
    where a.location is not null
        and b.location = (((a.json_data ->> 'body')::json -> 'details') -> 0)::json ->> 'location'

2 Answers


  1. There is no reliable way to enforce relational integrity between an attribute inside a JSON value and another table. Validation with a query won’t prevent inconsistencies. The solution is a different design: don’t store the details in a JSON array.

    In your case, it seems like the best solution would be not to store the data twice. If there is only one place in the database where relationships between employees and locations are stored, there can be no inconsistency.
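
    As a rough sketch only (the table and column names here are made up, not taken from the question), a normalized layout could look like this:

    -- Hypothetical normalized layout: the employee/location relationship
    -- lives in exactly one table, so there is nothing to keep in sync.
    create table employee (
        employee_number text primary key
    );

    create table employee_location (
        employee_number text not null references employee (employee_number),
        location        text not null,
        primary key (employee_number, location)
    );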

  2. I would aggregate the locations found in a.json_data and in table b into arrays (properly sorted) for the same employee number, then compare the arrays and use it all as a scalar subquery.

    (
     -- locations taken from the JSON "details" array, sorted
     select array_agg(e ->> 'location' order by e ->> 'location')
     from json_array_elements(a.json_data -> 'body' -> 'details') e
    ) =
    (
     -- locations stored in table b for the same employee, sorted
     select array_agg(b.location order by b.location)
     from b
     where b.employee_number = a.json_data -> 'body' ->> 'emp number'
    )
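
    A minimal end-to-end sketch of that idea (assuming, as an illustration, that b.employee_number is text-compatible; add a cast if it is numeric):

    -- One row per employee in a, with a flag telling whether the
    -- location list in the JSON matches the location list in b.
    select a.json_data -> 'body' ->> 'emp number' as emp_number,
           (
            select array_agg(e ->> 'location' order by e ->> 'location')
            from json_array_elements(a.json_data -> 'body' -> 'details') e
           )
           =
           (
            select array_agg(b.location order by b.location)
            from b
            where b.employee_number = a.json_data -> 'body' ->> 'emp number'
           ) as locations_match
    from a;

    Note that if an employee has no rows in b, array_agg returns null and the comparison yields null rather than false, so you may want to wrap it in coalesce(..., false).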
    

    Unrelated, but this is a sub-optimal data design that is hard to work with.
