How can I limit database query time during web requests?

We have a fairly typical Django application running on PostgreSQL 9.0. We recently discovered some database queries that took more than 4 hours to complete, caused by inefficient searches in the admin interface. Although we plan to fix those queries, as a line of defense we would like to artificially limit database query time to 15 seconds - but only in the context of a web request; batch jobs and Celery tasks should not be subject to this limit.

How can we do this? Or is this a terrible idea?

+8
performance django postgresql
3 answers

The best way to do this is to set up a role/user that is used only for web requests, and then set statement_timeout for that role.

ALTER ROLE role_name SET statement_timeout = 15000;  -- value is in milliseconds

All other roles will use the global statement_timeout setting (which is disabled by default).
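
To make the Django web processes actually connect as that limited role, one option is to point the web settings at it while batch and Celery processes keep an unrestricted user. A minimal sketch, assuming a hypothetical web_user role configured with the ALTER ROLE above; the role name, database name and credentials are placeholders:

    # settings.py used by the web processes (sketch only).
    # Batch/Celery processes would load a settings module whose USER
    # points at an unrestricted role, so they get no statement_timeout.
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.postgresql_psycopg2',
            'NAME': 'mydb',
            'USER': 'web_user',       # role with statement_timeout = 15000
            'PASSWORD': 'secret',
            'HOST': 'localhost',
            'PORT': '5432',
        }
    }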

+20

You could also handle this yourself: enforce the 15-second rule by finding the queries that violate it and killing them.

Query pg_stat_activity to find the offenders, and call pg_terminate_backend(procpid) on each one to kill it.

Something like this in a loop:

    SELECT pg_terminate_backend(pg_stat_activity.procpid)
    FROM pg_stat_activity
    WHERE pg_stat_activity.datname = 'TARGET_DB'
      AND usename = 'WEBUSERNAME'
      AND (now() - query_start) > interval '00:00:15';
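
As a rough illustration of the loop, here is a sketch in Python with psycopg2. The database name, usernames and the 5-second polling interval are assumptions, and the script has to connect as a role that is allowed to call pg_terminate_backend (a superuser on 9.0):

    # terminate_long_web_queries.py -- a sketch, not production code.
    # Kills web-user queries that have been running longer than 15 seconds.
    # TARGET_DB and WEBUSERNAME are placeholders, as in the query above.
    import time
    import psycopg2

    KILL_SQL = """
        SELECT pg_terminate_backend(procpid)
        FROM pg_stat_activity
        WHERE datname = 'TARGET_DB'
          AND usename = 'WEBUSERNAME'
          AND (now() - query_start) > interval '15 seconds';
    """

    while True:
        conn = psycopg2.connect(dbname='TARGET_DB', user='postgres')
        conn.autocommit = True
        try:
            cur = conn.cursor()
            cur.execute(KILL_SQL)
            cur.close()
        finally:
            conn.close()
        time.sleep(5)  # poll every few seconds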
0

As for the timing, you could pass all your queries through a class that, when instantiated, spawns two threads: one for the query and one for a timer. If the timer reaches 15 seconds, stop the query thread.

As for figuring out whether a query originated from a web request, I don't know Django well enough to say whether it can help you there. Put simply, the class that handles your database calls could take an optional constructor parameter, something like context, which would be 'http' for a web request and empty for anything else.
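
A rough sketch of that idea, assuming psycopg2 (whose connection objects expose a cancel() method). Python cannot forcibly kill a thread, so instead of destroying the query thread this cancels the query on the server, which unblocks the thread; the class name and the context flag are made up for illustration:

    import threading

    import psycopg2
    import psycopg2.extensions

    class TimedQuery(object):
        """Run a query in a worker thread; a timer cancels it for web calls."""

        def __init__(self, conn, context=''):
            self.conn = conn        # an open psycopg2 connection
            self.context = context  # 'http' for web requests, '' otherwise
            self.result = None

        def _run(self, sql, params):
            cur = self.conn.cursor()
            try:
                cur.execute(sql, params)
                self.result = cur.fetchall()
            except psycopg2.extensions.QueryCanceledError:
                self.result = None  # the timer cancelled this query
            finally:
                cur.close()

        def execute(self, sql, params=None, timeout=15.0):
            worker = threading.Thread(target=self._run, args=(sql, params))
            worker.start()
            if self.context == 'http':
                worker.join(timeout)       # the "timer"
                if worker.is_alive():
                    self.conn.cancel()     # abort the running query server-side
                    worker.join()
            else:
                worker.join()              # batch/celery tasks run unrestricted
            return self.result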

-4
