I am working on a manage.py command that spawns about 200 threads to check remote hosts. My database setup only allows me to use 120 connections, so I need some kind of connection pooling. I tried using a dedicated thread like this:
    import threading
    from threading import Thread

    class Pool(Thread):
        def __init__(self):
            Thread.__init__(self)
            # allow at most 10 simultaneous ORM calls
            self.semaphore = threading.BoundedSemaphore(10)

        def give(self, trackers):
            self.semaphore.acquire()
            data = ...  # some ORM call (not lazy, the query is triggered here)
            self.semaphore.release()
            return data
I pass an instance of this object to every checking thread, but I still get "OperationalError: FATAL: sorry, too many clients already" inside the pool object once about 120 threads have been started. I expected that only 10 connections to the database would be open, with the remaining threads waiting for a free semaphore slot. I can verify that the semaphore itself works by commenting out release(): in that case only 10 threads do any work, while the rest wait until the application terminates.
As far as I understand, each thread opens its own connection to the database, even though the actual ORM call lives inside another thread's object, but why? Is there a way to execute all database queries in a single thread only?
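To make it clearer what I am asking for, here is a rough sketch of the kind of single-thread dispatcher I have in mind (the names DBWorker and fetch are just placeholders, not code from my project; it assumes Python 3's queue module): one thread owns the database work, and the checking threads hand it requests over a queue.

    import queue
    import threading

    class DBWorker(threading.Thread):
        def __init__(self):
            threading.Thread.__init__(self)
            self.daemon = True
            self.requests = queue.Queue()

        def run(self):
            # Only this thread ever touches the ORM, so ideally Django
            # would open a single connection bound to it.
            while True:
                trackers, result = self.requests.get()
                result['data'] = ...  # the ORM query would go here
                result['ready'].set()

        def fetch(self, trackers):
            # Called from any of the ~200 checking threads.
            result = {'ready': threading.Event()}
            self.requests.put((trackers, result))
            result['ready'].wait()
            return result['data']

Is something along these lines the right approach, or is there a more idiomatic way to do this with Django?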
multithreading django postgresql orm connection-pooling
Riz