First of all, I am in no way dissatisfied with how my Django site performs; it does not receive massive traffic, just over 1,000 visits per day.
I was curious how well it would handle large traffic peaks, so I benchmarked it with ab (ApacheBench).
I noticed that with a concurrency level greater than 1, throughput is the same as with a single simultaneous connection.
Shouldn't requests per second increase with concurrency?
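For reference, the runs below were produced with commands of this shape (the URL is a placeholder, not my actual site):

    ab -n 100 -c 4 http://mysite.example/
    ab -n 100 -c 1 http://mysite.example/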
I'm on a virtual machine with 1 GB of RAM, running apache2 (prefork MPM), mod_wsgi, memcached and MySQL.
All page content is cached, so the database takes no hits. If memcached evicts an entry, only two light (indexed) queries are executed, and the result should be re-cached immediately.
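The caching follows the usual Django cache-aside idiom, roughly like this (the key name, timeout, and rebuild helper are illustrative, not my actual code):

    from django.core.cache import cache

    def get_start_page_fragment():
        # Served from memcached on a hit; falls through only on a miss/eviction
        fragment = cache.get('start_page_fragment')
        if fragment is None:
            # On a miss, run the two light indexed queries to rebuild the value
            fragment = build_fragment_from_db()  # hypothetical helper
            # Write it straight back so the next request hits the cache again
            cache.set('start_page_fragment', fragment, 60 * 15)
        return fragment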
Benchmark data (note: I also ran it with 2,000 and 10,000 requests, with the same results):
For the start page, served by Django through apache2/mod_wsgi:
-n100 -c4: http://dpaste.com/97999/ (58.2 reqs/s)
-n100 -c1: http://dpaste.com/97998/ (57.7 reqs/s)
For robots.txt, served directly by apache2:
-n100 -c4: http://dpaste.com/97992/ (4917 reqs/s)
-n100 -c1: http://dpaste.com/97991/ (1412 reqs/s)
This is my apache conf: http://dpaste.com/97995/
Edit: added more information
wsgi.conf: http://dpaste.com/98461/
mysite.conf: http://dpaste.com/98462/
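For context, an embedded-mode mod_wsgi setup under the prefork MPM generally has this shape; a minimal sketch, with directive values and paths as illustrative assumptions rather than my exact conf:

    <IfModule mpm_prefork_module>
        # prefork: one single-threaded worker process per in-flight request
        StartServers          5
        MinSpareServers       5
        MaxSpareServers      10
        MaxClients           40
        MaxRequestsPerChild 500
    </IfModule>

    # Embedded mode: Django runs inside the Apache worker processes themselves
    WSGIScriptAlias / /path/to/myproject/wsgi_handler.py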
My WSGI handler:

    import os, sys

    # Point Django at the project settings before the handler is imported
    os.environ['DJANGO_SETTINGS_MODULE'] = 'myproject.settings'

    import django.core.handlers.wsgi

    # mod_wsgi looks for a module-level callable named "application"
    application = django.core.handlers.wsgi.WSGIHandler()
performance django concurrency apache2 mod-wsgi
schmilblick