Writing data to a Postgres database

I want to write a pandas DataFrame to a Postgres table. I am making a DB connection as follows:

import psycopg2
import pandas as pd
import sqlalchemy

def connect(user, password, db, host='localhost', port=5432):
    '''Returns an engine (our connection) and a metadata object'''
    url = 'postgresql://{}:{}@{}:{}/{}'
    url = url.format(user, password, host, port, db)

    # The return value of create_engine() is our connection object
    con = sqlalchemy.create_engine(url, client_encoding='utf8')

    # We then bind the connection to MetaData()
    meta = sqlalchemy.MetaData(bind=con, reflect=True)

    return con, meta

con, meta = connect('user_name', 'password', 'db_name', host='host_name')
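
(A quick sanity check, not in the original post: printing the reflected table names shows what reflect=True picked up. Note that MetaData without a schema argument only reflects the default public schema.)

print(meta.tables.keys())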

When I read from an already completed table, it works fine:

df = pd.read_sql("SELECT * FROM db.table_name limit 10",con=con)
print df

I would like to write df to a table. To test this, I have a temporary table called "test" with two fields, a name and an age.

# create a temp df
table = [['name', 'age'], ['nameA', 20], ['nameB', 30]]
headers = table.pop(0)
df = pd.DataFrame(table, columns=headers)
# write to db
df.to_sql('db.test', con, if_exists='replace', index=False)

Then I check if the temp table is populated:

df = pd.read_sql("SELECT * FROM db.test limit 10",con=con)
print df

I get an empty DataFrame! df.to_sql raises no errors, but nothing seems to be written to the database. What am I missing, and how can I fix it?

Versions:

Pandas: 0.19.2
SQLAlchemy: 1.1.10
Postgres: 9.4.9
1 answer

The problem is that df.to_sql('db.test', ...) does not treat 'db' as a schema name: pandas creates a table literally named "db.test" in the default (public) schema, which is why your SELECT * FROM db.test comes back empty. One workaround is to use pd.io.sql.SQLDatabase with schema-aware metadata:

# bind the metadata to the target schema, not just the engine
meta = sqlalchemy.MetaData(con, schema='db_name')
meta.reflect()
# SQLDatabase is the pandas wrapper that to_sql uses internally;
# giving it schema-aware metadata makes the write land in db_name.test
pdsql = pd.io.sql.SQLDatabase(con, meta=meta)
pdsql.to_sql(df, 'test', if_exists='replace')
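
To confirm the write landed where intended, you can read the table back with a schema-qualified query (a quick check added here, not part of the original answer; adjust db_name to your schema):

df_check = pd.read_sql("SELECT * FROM db_name.test LIMIT 10", con=con)
print(df_check)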

That said, it is unsurprising that df.to_sql() raised no errors: it did create a table, just under a different name than you intended.
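
As an aside (my addition, not part of the original answer): to_sql also accepts a schema argument, which may be the simplest fix if your pandas version honors it for Postgres:

df.to_sql('test', con, schema='db_name', if_exists='replace', index=False)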

