Flask Deployment and Parallel Requests

When I test my new Flask application with the built-in development server, everything is single-threaded and blocking: the server cannot start serving one request before it has finished another. It can only process one request at a time.

For a deployed web service, this is clearly undesirable. How do you deploy Flask applications so that requests can be handled in parallel?

Is there anything to consider regarding thread safety and concurrency inside the code (protecting shared objects with locks, etc.), or are all the deployment options equivalent in that respect?
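To make the thread-safety part of the question concrete, here is a minimal stdlib-only sketch (the counter is a hypothetical example of shared state, not anything from Flask): once a deployment runs several threads, any mutable object shared between requests needs a lock.

```python
import threading

# Hypothetical shared state, e.g. an in-process request counter.
# With a multi-threaded WSGI server, two requests can touch it at once.
counter = 0
lock = threading.Lock()

def handle_request():
    """Simulates a request handler that mutates shared state."""
    global counter
    with lock:  # without the lock, concurrent increments can lose updates
        counter += 1

# Simulate 100 concurrent "requests".
threads = [threading.Thread(target=handle_request) for _ in range(100)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 100
```

Note that this only helps within one process: with multi-process servers (e.g. several uWSGI or Gunicorn workers), each worker has its own memory, so an in-process lock does not coordinate across workers, and truly shared state needs an external store such as Redis or a database.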

+6
3 answers

I am using uWSGI with the gevent loop. It works a treat. In fact, that is how I use py-redis, which is blocking, without it actually blocking the server.

In addition, I have uWSGI log requests after the response has been sent, so the worker can keep accepting more requests in the meantime.
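As a rough sketch of that setup (the module name `myapp`, the app object name `app`, and the port are assumptions, not from the answer), uWSGI can be started with the gevent loop like this:

```shell
# Serve the Flask app object "app" from myapp.py with the gevent loop,
# allowing up to 100 cooperatively scheduled greenlets per worker.
uwsgi --http :8080 --module myapp:app --gevent 100
```

With the gevent loop, blocking socket I/O (such as py-redis calls) is monkey-patched into cooperative yields, which is what lets a blocking client library avoid blocking the whole worker.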

+5

There are some good options. I think the two most popular are:

- Apache with mod_wsgi
- Nginx with uWSGI

Both of them have worked well for me.
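For the Nginx + uWSGI option, the glue is roughly the following (the socket path and site file are assumptions): Nginx speaks the uwsgi protocol to a uWSGI server that runs several worker processes, which is where the parallelism comes from.

```nginx
# Sketch of an Nginx site config for a uWSGI backend.
server {
    listen 80;
    location / {
        include uwsgi_params;           # standard uwsgi protocol parameters
        uwsgi_pass unix:/tmp/myapp.sock; # uWSGI listens on this socket
    }
}
```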

+3

I am using Nginx + Gunicorn, though uWSGI seems to be the de facto standard.
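As a sketch of that setup (the module name, bind address, and worker count are assumptions), Gunicorn is typically started behind Nginx like this:

```shell
# Run 4 worker processes serving the Flask app object "app" from myapp.py.
# Nginx would proxy_pass to http://127.0.0.1:8000 in front of this.
gunicorn --workers 4 --bind 127.0.0.1:8000 myapp:app
```

Each worker is a separate process handling requests independently, so four workers can serve four requests truly in parallel without any changes to the Flask code.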

+1

Source: https://habr.com/ru/post/926394/

