Why does Rails disconnect Postgres connections on large bulk inserts?

I'm brand new to the world of Linux and server administration, and I'm stuck.

I have a Rails application that occasionally has to do large data inserts, usually around 20,000 rows. The code works fine in development (OS X), but on the production server (Ubuntu, on a Linode VPS) it fails every time, usually after about 1700 inserts. The exact number varies (1655, 1697, 1756), but it is consistently in that ballpark.

I don't see much useful in the production.log file, just:

Connecting to database specified by database.yml 

which appears a few seconds after the failure.

In the main PostgreSQL log:

 2012-10-21 23:01:28 EDT LOG: could not receive data from client: Connection reset by peer
 2012-10-21 23:01:28 EDT LOG: unexpected EOF on client connection

I'm running Rails 3.2.8, Ruby 1.9.3-p194, psql 1.9.4, nginx, and Unicorn.

I deployed following the steps described in: http://railscasts.com/episodes/335-deploying-to-a-vps

Other notes:

a) I tried wrapping and not wrapping the ActiveRecord inserts in a transaction (both variants are sketched after these notes). No difference.

b) Ruby does a fair amount of work to collect and organize the data before inserting it into the db, including a few calls to a third-party web service. But I've confirmed that those calls succeed and the data looks fine.
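
For context, the insert code is roughly the sketch below; Record and rows are placeholders for my actual model and the prepared row data, not the real names.

    # Hypothetical sketch of the two variants from note a); Record and rows
    # stand in for my real model and the prepared row data.

    # Variant A: each row saved in its own implicit transaction
    rows.each do |attrs|
      Record.create!(attrs)
    end

    # Variant B: the whole batch wrapped in a single transaction
    ActiveRecord::Base.transaction do
      rows.each do |attrs|
        Record.create!(attrs)
      end
    end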

Any ideas? Or at least any suggestions as to where to look next? Many thanks.

1 answer

The moral of this story is: "When in doubt, blame the unicorns."

(Unicorn was configured with a 30-second worker timeout. The long import ran past it, so the Unicorn master killed the worker mid-request, and Postgres saw the severed connection as "Connection reset by peer" / "unexpected EOF on client connection".)
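
For reference, the worker timeout lives in the Unicorn config file. A minimal sketch, assuming the config/unicorn.rb layout from the Railscasts episode (the values here are illustrative; only the timeout directive matters):

    # config/unicorn.rb -- illustrative values; the timeout directive is the
    # one that matters: the master kills any worker busy longer than this,
    # severing its Postgres connection mid-insert.
    worker_processes 2
    timeout 30

Bumping the timeout is a quick workaround, but a 20,000-row import is usually better moved into a background job so web workers never run that long.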

