I’m trying to import multiple JSON files into my Postgres database with the following code.
import json
import psycopg2
con = psycopg2.connect(database="kw_du", user="postgres", password="f67dd8hzFE5",host="localhost")
cur = con.cursor()
with open(r"C:\Users\diego\OneDrive\Desktop\Workplace\220906\nu_2022_09_01.json", 'r') as data_file:
    data = json.load(data_file)
with con.cursor() as cur:
    cur.executemany('INSERT INTO test_nu(vp_ip, system_activ, vpm_activ) VALUES(%s)', [(json.dumps(d),) for d in data])
con.commit()
con.close()
The JSON file has the following structure:
[{"vp_id":1,"system_activ":1,"vpm_activ":1},{"vp_id":2,"system_activ":2,"vpm_activ":2}]
When executing the code I get the following error message:
line 13, in
cur.executemany('INSERT INTO test_nu (vp_id, system_aktiv) VALUES(%s)', [(json.dumps(d),) for d in data])
psycopg2.errors.SyntaxError: ERROR: INSERT has more target columns
than expressions LINE 1: INSERT INTO test_nu (vp_id, system_aktiv)
VALUES('{"vp_id": …
I’ve seen that "ERROR: INSERT has more target columns than expressions" can have various causes, but I was not able to solve the problem. Could the quoting or the {} be the issue? It seems like the JSON object doesn’t get parsed: VALUES('{"vp_id": … Appreciate any hint.
2 Answers
You can save a number of round-trips to the database by passing json.dumps(data) to a single query. Fiddle here.
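The query from the fiddle isn’t reproduced above; a minimal sketch of the single-round-trip idea, assuming the asker’s test_nu(vp_id, system_activ, vpm_activ) table and using jsonb_array_elements to unpack the serialised list server-side:

```python
import json

def insert_rows(cur, rows):
    # Serialise the whole list once and let Postgres expand it into rows.
    # (Sketch; the linked fiddle's exact query isn't shown in the answer.)
    cur.execute(
        "INSERT INTO test_nu (vp_id, system_activ, vpm_activ) "
        "SELECT (e->>'vp_id')::int, (e->>'system_activ')::int, (e->>'vpm_activ')::int "
        "FROM jsonb_array_elements(%s::jsonb) AS e",
        (json.dumps(rows),),
    )
```

Because the whole list travels as one parameter, there is always exactly one placeholder no matter how many columns are inserted, which sidesteps the target-columns/expressions mismatch in the question.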
You can use json_populate_recordset; refer to the sample query in the fiddle here.
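The sample query from the fiddle is not shown; a hedged sketch of how json_populate_recordset could be called from psycopg2, assuming test_nu exists with columns vp_id, system_activ and vpm_activ:

```python
import json

def insert_rows_recordset(cur, rows):
    # json_populate_recordset expands a JSON array into rows typed like
    # test_nu, so column names and values can never fall out of step.
    # (Sketch; the linked fiddle isn't reproduced here.)
    cur.execute(
        "INSERT INTO test_nu (vp_id, system_activ, vpm_activ) "
        "SELECT vp_id, system_activ, vpm_activ "
        "FROM json_populate_recordset(NULL::test_nu, %s::json)",
        (json.dumps(rows),),
    )
```

NULL::test_nu only supplies the row type; the table itself is not read, and any JSON keys without a matching column are ignored.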