psycopg2: DatabaseError: error with no message from the libpq

I have an application that parses CSV files and loads the data into a Postgres 9.3 database. When run serially, the insert/cursor statements work without problems.

I added Celery to the mix so that the data files are parsed and inserted in parallel. Parsing works fine, but as soon as the workers run the insert statements I get:

[2015-05-13 11:30:16,464: ERROR/Worker-1] ingest_task.work_it: Exception
Traceback (most recent call last):
  File "ingest_tasks.py", line 86, in work_it
    rowcount = ingest_data.load_data(con=con, statements=statements)
  File "ingest_data.py", line 134, in load_data
    ingest_curs.execute(statement)
DatabaseError: error with no message from the libpq
python postgresql python-multiprocessing celery psycopg
1 answer

I ran into a similar problem when calling engine.execute() from multiple processes. I finally solved it by simply calling engine.dispose() as the first line of the function that each subprocess enters, as suggested in the SQLAlchemy documentation:

When a program uses multiprocessing or fork(), and an Engine object is copied to the child process, engine.dispose() should be called so that the engine creates brand new database connections local to that fork. Database connections generally do not travel across process boundaries.
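The pattern can be sketched with the standard library alone. This is a minimal illustration, not the asker's actual code: sqlite3 stands in for Postgres, multiprocessing stands in for Celery workers, and the function and path names are made up. The key point is the same one the documentation makes: each forked worker must open its own connection instead of reusing a connection object inherited from the parent. With SQLAlchemy the equivalent move is calling engine.dispose() at the top of the worker function.

```python
# Per-process-connection sketch: sqlite3 stands in for Postgres,
# multiprocessing for Celery workers; all names are illustrative.
# With SQLAlchemy you would instead call engine.dispose() at the top
# of the worker function so the fork builds fresh connections.
import multiprocessing
import os
import sqlite3
import tempfile


def load_rows(db_path, rows):
    # Open the connection *inside* the worker process -- never reuse
    # a connection object created in the parent before the fork.
    con = sqlite3.connect(db_path)
    with con:
        con.executemany("INSERT INTO data(val) VALUES (?)", rows)
    con.close()
    return len(rows)


def main():
    db_path = os.path.join(tempfile.mkdtemp(), "ingest.db")

    # Parent process sets up the schema, then closes its connection
    # before any workers are spawned.
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE data (val TEXT)")
    con.commit()
    con.close()

    # Each chunk is inserted by a separate worker process.
    chunks = [[("a",), ("b",)], [("c",)]]
    with multiprocessing.Pool(2) as pool:
        counts = pool.starmap(load_rows, [(db_path, c) for c in chunks])

    con = sqlite3.connect(db_path)
    total = con.execute("SELECT COUNT(*) FROM data").fetchone()[0]
    con.close()
    return counts, total


if __name__ == "__main__":
    print(main())  # -> ([2, 1], 3)
```

The same principle applies to a plain psycopg2 connection in a Celery task: create the connection inside the task (or in a per-worker initialization hook), not at module import time in the parent process.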

