Why does SQLAlchemy add \ to " in a valid JSON string for the PostgreSQL json field?

SQLAlchemy 0.9 added native PostgreSQL JSON data type support. But when I define a mapped object that has a JSON field and set its value to a valid JSON string:

json = '{"HotCold":"Cold,"Value":"10C"}' 

The database receives data in the form:

 "{\"HotCold\":\"Cold\",\"Value":\"10C\"}" 

All the internal double quotes are escaped, but if I set the JSON field from a Python dict:

 json = {"HotCold": "Cold, "Value": "10C"} 

I get JSON data in the database as:

 {"HotCold":"Cold,"Value":"10C"} 

Why? Do I have to pass data in dict form to make it compatible with SQLAlchemy JSON support?
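Here is a minimal sketch of what I am doing (table, column, and database names are simplified stand-ins):

 import sqlalchemy as sa
 from sqlalchemy.dialects.postgresql import JSON

 metadata = sa.MetaData()
 example = sa.Table('example', metadata, sa.Column('data', JSON))

 engine = sa.create_engine("postgresql:///mydb")
 with engine.connect() as conn:
     # Passing the JSON *string*: stored double-encoded, with escaped quotes
     conn.execute(example.insert().values(data='{"HotCold": "Cold", "Value": "10C"}'))
     # Passing a dict: stored as the expected JSON document
     conn.execute(example.insert().values(data={"HotCold": "Cold", "Value": "10C"}))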

+6
3 answers

Short answer: Yes, you must.

The JSON type in SQLAlchemy is used to store a Python structure as JSON. It is effectively the equivalent of:

 database_value = json.dumps(python_value) 

on storing, and

 python_value = json.loads(database_value) 

on loading.

You stored a string, and that string was converted to a JSON value: a JSON-encoded string. That the string itself happened to contain JSON was merely a coincidence. Do not store JSON strings; store Python values that are JSON-serializable.

Quick demo to illustrate:

 >>> print json.dumps({'foo': 'bar'})
 {"foo": "bar"}
 >>> print json.dumps('This is a "string" with quotes!')
 "This is a \"string\" with quotes!" 

Notice how the second example applies the same kind of quoting and escaping.

Use the SQLAlchemy JSON type to store additional structured data on an object; PostgreSQL gives you access to the contents in SQL expressions on the server side, and SQLAlchemy gives you full access to the contents as Python values on the Python side.
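A sketch of both sides, assuming a table like the one in the question and the select() calling style of the 0.9 era (the ['HotCold'] indexing and .astext operators come from the PostgreSQL dialect):

 import sqlalchemy as sa
 from sqlalchemy.dialects.postgresql import JSON

 metadata = sa.MetaData()
 example = sa.Table('example', metadata, sa.Column('data', JSON))
 engine = sa.create_engine("postgresql:///mydb")

 with engine.connect() as conn:
     # Python side: store a dict, get a dict back -- no json.loads() needed
     conn.execute(example.insert().values(data={"HotCold": "Cold", "Value": "10C"}))
     row = conn.execute(sa.select([example.c.data])).fetchone()
     print(row[0]["HotCold"])  # Cold (for the row just inserted)

 # Server side: filter on a key inside the JSON document in SQL
 stmt = sa.select([example]).where(example.c.data["HotCold"].astext == "Cold")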

Keep in mind that you should always assign a whole new value to such an attribute. Do not mutate a value inside it and expect SQLAlchemy to detect the change automatically; see the PostgreSQL JSON documentation.
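A sketch of the difference (the Reading class is a made-up example; MutableDict.as_mutable is the stock SQLAlchemy extension for tracking in-place changes):

 import sqlalchemy as sa
 from sqlalchemy.dialects.postgresql import JSON
 from sqlalchemy.ext.declarative import declarative_base
 from sqlalchemy.ext.mutable import MutableDict

 Base = declarative_base()

 class Reading(Base):
     __tablename__ = 'reading'
     id = sa.Column(sa.Integer, primary_key=True)
     data = sa.Column(JSON)  # plain JSON: only whole-value assignment is detected
     extra = sa.Column(MutableDict.as_mutable(JSON))  # in-place changes are tracked

 # With a plain JSON column, assign a new value instead of mutating:
 # reading.data = dict(reading.data, Value='15C')   # change detected
 # reading.data['Value'] = '15C'                    # change NOT detected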

+8

Meh, but I don't want to make three round trips: call json.loads() to hand a dict to SQLAlchemy, which would then run json.dumps() on it, only for Postgres to parse the string all over again.

So instead, I declared the jsonb column as Text in my table metadata. Now I take the JSON strings as-is, SQLAlchemy passes them through unchanged, and Postgres stores them as jsonb objects.

 import sqlalchemy as sa

 metadata = sa.MetaData()
 rawlog = sa.Table('rawlog', metadata, sa.Column('document', sa.Text))

 engine = sa.create_engine("postgresql:///mydb")
 with engine.connect() as conn:
     conn.execute(rawlog.insert().values(document=document))

where document is a string, not a Python object.
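One caveat with this approach: since SQLAlchemy now sees the column as plain Text, the server-side JSON operators are not available on it directly. A sketch of casting it back to JSONB when you need them (the 'level' key is made up for illustration):

 from sqlalchemy.dialects.postgresql import JSONB

 # Cast the Text-declared column to JSONB to use JSON operators in SQL
 stmt = sa.select([rawlog]).where(
     sa.cast(rawlog.c.document, JSONB)['level'].astext == 'error'
 )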

0

Today I came across a similar scenario:

After inserting a new row with a JSONB field via SQLAlchemy, I checked the PostgreSQL DB and saw:

 "jsonb_fld"
 "{\"addr\": \"66 RIVERSIDE DR\", \"state\": \"CA\", ... 

Reviewing the Python code showed that it was setting the value of the JSONB field like this:

  row[some_jsonb_field] = json.dumps(some_dict) 

After I took out the json.dumps(...) call and just did:

  row[some_jsonb_field] = some_dict 

everything looked right in the database: no extra \ or " characters.

I was reminded that Python and SQLAlchemy, in cases like this, already take care of details such as json.dumps. Less code, more satisfaction.

0
