PostgreSQL connection unexpectedly closes when performing a large insert

I am populating a PostgreSQL table with ~11 million rows that were previously selected from another database, using Python and psycopg2. The whole process takes about 1.5 hours, but after ~30 minutes I get the exception "connection closed unexpectedly". The source code is as follows:

incursor = indb.cursor()
incursor.execute("SELECT ...")
indb.commit()  # (1) close the read transaction
outcursor = outdb.cursor()
rows = 0
for (col1, col2, col3) in incursor:  # incursor yields ~11 million rows
    outcursor.execute("INSERT ...", (col1, col2, col3))  # This fails after ~30 minutes
    rows += 1
    if rows % 100 == 0:  # (2) Commit every 100 rows
        outcursor.close()
        outdb.commit()
        outcursor = outdb.cursor()
incursor.close()
outcursor.close()
outdb.commit()

I added (1) and (2) after the first attempt failed, assuming that an open transaction has an upper time limit of ~30 minutes or that a cursor has an upper limit on pending inserts. It seems that neither assumption is true and the error lies elsewhere.

The target database runs inside a VirtualBox VM. Apart from that, the setup is plain: a Python script connecting through psycopg2 to PostgreSQL.


"" - postgresql. PostgreSQL statement_timeout, , ERROR: canceling statement due to statement timeout ( ). psycopg2. , .

So what could be closing the connection? Is there anything between the two machines that drops TCP connections, for example a stateful firewall or NAT device that silently discards connections it considers idle after 30 minutes? If so, a workaround is to enable TCP keepalives. PostgreSQL supports TCP keepalive settings on the server side (tcp_keepalives_interval and friends), and libpq accepts matching parameters on the client side, so you can enable them on whichever end you control.

One caveat: on Linux the kernel's default keepalive idle time is 7200 seconds, i.e. 2 hours (PostgreSQL's tcp_keepalives_* settings fall back to the OS defaults when left at 0). Since your connection dies after 30 minutes, the defaults will not help; set lower values explicitly, either as client connection parameters (in the conninfo string) or as GUC variables in postgresql.conf.
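
A minimal sketch of the client-side approach with psycopg2 (host, database, and credentials are placeholders); the keepalives* options are standard libpq connection parameters and can be passed straight to connect():

import psycopg2

outdb = psycopg2.connect(
    host="dbhost", dbname="target", user="loader", password="secret",
    keepalives=1,            # turn TCP keepalives on for this connection
    keepalives_idle=60,      # start probing after 60 s of inactivity (instead of the 2 h OS default)
    keepalives_interval=10,  # re-probe every 10 s
    keepalives_count=5,      # give up after 5 unanswered probes
)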


As an aside, see the guide on db copy: for this volume of data, PostgreSQL's COPY is much faster than row-by-row INSERTs.
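
A minimal sketch of bulk loading with psycopg2's copy_from, assuming a hypothetical target table items(col1, col2, col3) and a tab-separated dump of the source data:

import psycopg2

conn = psycopg2.connect("dbname=target")  # placeholder DSN
cur = conn.cursor()
with open("rows.tsv") as f:  # hypothetical tab-separated dump (copy_from expects tab-separated by default)
    cur.copy_from(f, "items", columns=("col1", "col2", "col3"))
conn.commit()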


I have Django admin commands that update hundreds of thousands of rows. After a while I see the same error. I suspect memory usage exceeds a limit, but I do not know how to manage the transaction manually inside a management command.
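
For what it's worth, here is a sketch of chunked commits in a Django management command, assuming a hypothetical myapp.models.Item model with a processed field; transaction.atomic (Django >= 1.6) commits each chunk independently instead of holding one giant transaction:

from django.core.management.base import BaseCommand
from django.db import transaction

from myapp.models import Item  # hypothetical app and model


class Command(BaseCommand):
    help = "Update rows in independently committed chunks"

    def handle(self, *args, **options):
        pks = list(Item.objects.values_list("pk", flat=True))
        for start in range(0, len(pks), 1000):
            chunk = pks[start:start + 1000]
            with transaction.atomic():  # each 1000-row chunk commits on its own
                Item.objects.filter(pk__in=chunk).update(processed=True)  # hypothetical field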
