My table structure is
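The table definition itself isn't shown in the post; judging from the `::json[]` cast suggested in the reply, it was presumably something along these lines (a guess, with hypothetical column names and types):

```python
# hypothetical reconstruction of the table; the actual DDL isn't shown in the post
cur.execute("create table testtable (id text, data json[])")
```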
My code:

```python
import psycopg2
from psycopg2.extras import Json

# cur is a cursor obtained from an open psycopg2 connection
insertdata = {"id": "001", "data": [Json({"foo": 1}), Json({"foo": 2})]}
print(type(insertdata["data"][0]))
cur.execute("insert into testtable values(%(id)s, %(data)s)", insertdata)
```

The result says my data is a list of text, but surely it is `psycopg2._json.Json`. What data type exactly should I convert it to?
Replies: 1 comment
Looking at mogrify:

```python
>>> print(cur.mogrify("%(data)s", insertdata).decode())
ARRAY['{"foo": 1}','{"foo": 2}']
```

The adapter for the `Json` wrapper doesn't add a cast to the snippet it generates, so it is passed as "unknown" to the parser. Postgres can cast unknown -> json, but, because there is no such type as an "unknown array" (there is "anyarray", but it is only a pseudotype used as a parameter type; I don't think it can really be instantiated), the array is temporarily converted to a text array. Because there is no implicit text[] -> json[] cast, the automatic cast fails.

You can cast the expression, as suggested, using:

```python
cur.execute("insert into testtable values(%(id)s, %(data)s::json[])", insertdata)
```

Note that psycopg 3 has a more refined adaptation system, able to figure out the type of the parameter, so the query above doesn't require an explicit cast.
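For reference, a minimal psycopg 3 sketch of the same insert (assuming an already open connection `conn` and the same hypothetical `testtable`), where the `Json` wrappers are adapted to the right type without a manual cast:

```python
import psycopg
from psycopg.types.json import Json

# assumes an open connection, e.g. conn = psycopg.connect("dbname=test")
with conn.cursor() as cur:
    insertdata = {"id": "001", "data": [Json({"foo": 1}), Json({"foo": 2})]}
    # psycopg 3 infers the parameter types, so no ::json[] cast is needed
    cur.execute(
        "insert into testtable values (%(id)s, %(data)s)",
        insertdata,
    )
```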