This question concerns performance issues that may or may not arise from having a large number of sleeping Python threads on a web server.
Background: I am implementing an online store using Django / Satchmo. One requirement is deferred payment: a customer can reserve a product and allow a third party to pay for it later (via a random, unique URL).
To handle un-reserving an item, I create a thread that sleeps for the duration of the reservation period and then, when it wakes up, deletes the reservation / marks the product as sold. It looks like this:
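The original snippet appears to be missing from this copy. Based on the mention of a timer below, a minimal sketch of the approach using `threading.Timer` might look like the following; the `reserve`/`unreserve` helpers and the in-memory `reservations` dict are illustrative placeholders, not Satchmo or Django API:

```python
import threading

# Hypothetical in-memory store of reservations, keyed by product id.
# In the real app this would be the Django ORM / Satchmo models.
def unreserve(product_id, reservations):
    """Delete the reservation / mark the product as sold when the timer fires."""
    reservations.pop(product_id, None)

def reserve(product_id, reservations, reservation_ttl=900):
    """Record a reservation and schedule its expiry on a sleeping thread.

    reservation_ttl is the sleep time in seconds (15 minutes here,
    or ~5 days for the unique-URL case described below).
    """
    reservations[product_id] = "reserved"
    timer = threading.Timer(reservation_ttl, unreserve,
                            args=(product_id, reservations))
    timer.daemon = True  # don't keep the server process alive on shutdown
    timer.start()
    return timer
```

Each call to `reserve` leaves one thread sleeping inside `Timer` until the TTL elapses, which is exactly the pattern whose cost the question asks about.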
I use the same method for scrubbing the unique URLs once they expire, except the timer sleeps much longer (usually 5 days).
So my question for SO is as follows:
Does a large number of sleeping threads seriously affect performance? Are there better methods for scheduling a one-off event in the future? I would like to keep this in Python if possible; no shelling out to `at` or `cron`.
The site is not exactly high-traffic; a (generous) upper limit on items ordered per week would be around 100. Combined with cart reservations, this could mean more than 100 sleeping threads at any given time. Will I regret scheduling tasks this way?
thanks
performance python multithreading
pisswillis