I have a problem with my Django application accumulating Postgres connections over time. Every 30 minutes a new connection is established and the old connections are not closed (see screenshot). Since max_connections is set to 100, after a while all available connections are used up.
Does anyone know what causes this problem?

I first noticed this after I added some celery tasks, so I am fairly sure it has to do with celery.
So, I tried to close the connection manually after each task using the after_return method:

from celery import Task, task
from django.db import connection

class DBTask(Task):
    abstract = True

    def after_return(self, *args, **kwargs):
        connection.close()

@task(name='example', base=DBTask)
def example_task(value):
    ...
But that doesn't help either. Perhaps I am completely wrong and it is not related to celery at all.
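To make the suspected failure mode concrete, here is a stdlib-only sketch of it (sqlite3 standing in for Postgres, plain threads standing in for celery workers; nothing Django-specific): every worker thread that opens its own connection and never closes it leaves that connection open until the process exits.

```python
import sqlite3
import threading

opened = []  # connections the "worker" threads opened
lock = threading.Lock()

def task():
    # Each worker thread opens its own connection, much like the
    # thread-local connection django.db hands out inside a worker.
    conn = sqlite3.connect(":memory:", check_same_thread=False)
    with lock:
        opened.append(conn)
    conn.execute("SELECT 1")
    # No conn.close() here: this is the leak.

threads = [threading.Thread(target=task) for _ in range(5)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# All 5 connections are still open after the "tasks" finished.
```

This is exactly the growth pattern from the screenshot: one new connection per task invocation, none ever released.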
My database configuration:
DATABASES = {
    'default': {
        'ENGINE': 'django.contrib.gis.db.backends.postgis',
        'NAME': 'production',
        'USER': 'production',
        'HOST': 'some.host',
        'CONN_MAX_AGE': 0,
    },
}
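My understanding of CONN_MAX_AGE (my reading of the docs, not verified against Django's source): it is only evaluated when an HTTP request starts or finishes, so CONN_MAX_AGE: 0 closes connections after every request, but a celery worker never fires those request signals. A simplified, illustrative sketch of the decision (should_close is my name, not Django's):

```python
import time

def should_close(opened_at, conn_max_age, now=None):
    # Simplified stand-in for the per-request age check Django runs:
    #   conn_max_age = None  -> persistent connection, never closed by age
    #   conn_max_age = 0     -> treated as expired immediately
    #   conn_max_age = N > 0 -> closed once older than N seconds
    if conn_max_age is None:
        return False
    if now is None:
        now = time.monotonic()
    return (now - opened_at) >= conn_max_age
```

The catch is that this check only runs on Django's request_started/request_finished signals, which never fire inside a celery task, so CONN_MAX_AGE: 0 guarantees nothing for task connections.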
Installed packages:
- django 1.8.9
- psycopg2 2.6.1
- celery 3.1.20
- django-celery 3.1.17
The app is hosted on webfaction.
As suggested in a similar question, CONN_MAX_AGE: 0 is already set.
EDIT:
The connection.close() in after_return has no effect; the connections still accumulate.
EDIT 2:
Calling connection.close() at the end of the task function does not help either.
EDIT 3:
Here is the relevant code:
celery_tasks.py
@task(name='push_notifications', base=DBTask)
def push_notifications_task(user_id):
    user = CustomUser.objects.get(id=user_id)
    PusherAPI().push_notifications(user)
    connection.close()
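One detail that may matter here: django.db.connection is a thread-local proxy, so connection.close() only closes the connection belonging to the thread it runs in. A minimal stdlib model of that behaviour (the dict "connections" are stand-ins, not Django objects):

```python
import threading

_local = threading.local()  # per-thread storage, as django.db uses

def get_connection():
    # Lazily create one "connection" per thread, mimicking the way
    # django.db.connection hands each thread its own object.
    if not hasattr(_local, "conn"):
        _local.conn = {"owner": threading.current_thread().name,
                       "closed": False}
    return _local.conn

seen = []
lock = threading.Lock()

def worker():
    conn = get_connection()
    with lock:
        seen.append(conn)

threads = [threading.Thread(target=worker) for _ in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# "Closing" from the main thread touches only the main thread's object;
# the three worker connections are unaffected.
get_connection()["closed"] = True
```

So a close() call only helps if it runs in the same thread that opened the connection.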
models.py
class PusherAPI(object):

    def push_notifications(self, user):
        from .serializers import NotificationSerializer
        self.pusher.trigger(
            'user_%s' % user.slug,
            'notifications',
            NotificationSerializer(user).data
        )
serializers.py
class NotificationSerializer(object):

    def __init__(self, user=None):
        if user is None:
            self.user = get_current_user()
        else:
            self.user = user

    @property
    def data(self):
        notifications = self.user.notifications.unread()
        ...
        return note_dict
The only db queries in the task are CustomUser.objects.get(id=user_id) and notifications = self.user.notifications.unread().
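For completeness, the workaround I would try next (an assumption on my part, not yet verified): register a handler on celery's task_postrun signal and call django.db.connections.close_all() there (close_all() exists since Django 1.8). Since neither celery nor Django can run in this snippet, Signal and FakeConnection below are toy stand-ins for celery.signals.task_postrun and django.db.connections:

```python
# Toy stand-ins: `Signal` mimics celery.signals.task_postrun,
# `FakeConnection`/`connections` mimic django.db.connections.

class Signal:
    def __init__(self):
        self.receivers = []

    def connect(self, receiver):
        # celery signals are usable as decorators, like this one.
        self.receivers.append(receiver)
        return receiver

    def send(self, **kwargs):
        for receiver in self.receivers:
            receiver(**kwargs)

task_postrun = Signal()  # celery fires this after every task

class FakeConnection:
    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True

connections = [FakeConnection(), FakeConnection()]

@task_postrun.connect
def close_db_connections(**kwargs):
    # In a real project this body would be:
    #     from django.db import connections
    #     connections.close_all()
    for conn in connections:
        conn.close()

task_postrun.send(task_id="push_notifications")
```

The appeal of the signal approach over the DBTask base class is that it runs after every task, in the worker thread that executed the task, regardless of the task's base class.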