SQLAlchemy: checking column size gives odd result

Is it possible to query a column for the maximum possible data size (in bytes) that can be stored in it? For example, let's say I declare a column using

content = Column(LargeBinary) 

then how can I retrieve that information for content? Following the inspection approach suggested in this question:

 table = File.__table__
 field = table.c["content"]
 print("max=" + str(field.type.length))

I get max=None, while I was expecting max=65535, since field.type is BLOB. What am I doing wrong?

1 answer

As far as I understand, the maximum length of a column depends on the dialect, and this information is not stored in the SQLAlchemy type itself. Depending on the backend, though, you can get it dynamically. For example, for MySQL you can read it from the INFORMATION_SCHEMA.COLUMNS table:

 q = select(["CHARACTER_MAXIMUM_LENGTH"]).select_from("INFORMATION_SCHEMA.COLUMNS").where( and_("table_name = '%s'" % File.__tablename__, "table_schema = '%s'" % schema, "column_name = '%s'" % column_name)) result = session.execute(q) print(result.first()[0]) # tested - prints 65535 

I am sure there is a cleaner way to write this query.
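For example, here is a sketch of the same lookup using text() with bound parameters instead of string interpolation (the session, schema and File objects are assumed to be the same as above; this is just one possible cleanup, not the canonical form):

 from sqlalchemy import text

 # Same INFORMATION_SCHEMA query, but with bound parameters instead of % formatting.
 q = text(
     "SELECT CHARACTER_MAXIMUM_LENGTH "
     "FROM INFORMATION_SCHEMA.COLUMNS "
     "WHERE table_name = :table_name "
     "AND table_schema = :table_schema "
     "AND column_name = :column_name"
 )
 result = session.execute(q, {"table_name": File.__tablename__,
                              "table_schema": schema,
                              "column_name": "content"})
 print(result.scalar())  # should print 65535 for a MySQL BLOB column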

field.type.length refers to the user-defined maximum length, which is initialized to None unless explicitly specified:

 def __init__(self, length=None):
     self.length = length

And since you do not provide the length argument, you get None.
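As a small sketch of the flip side (assuming SQLAlchemy 1.4+ for the declarative_base import): if you declare the column with an explicit length, the inspection from the question returns that value instead of None. On the MySQL dialect, LargeBinary with a length should still render as a BLOB-family type.

 from sqlalchemy import Column, Integer, LargeBinary
 from sqlalchemy.orm import declarative_base

 Base = declarative_base()

 class File(Base):
     __tablename__ = "file"
     id = Column(Integer, primary_key=True)
     content = Column(LargeBinary(65535))  # explicit, user-defined length

 field = File.__table__.c["content"]
 print("max=" + str(field.type.length))  # max=65535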



