Handling concurrent / asynchronous requests with Python BaseHTTPServer

I set up a threaded (with Python threads) HTTP server by creating a class that inherits from HTTPServer and ThreadingMixIn:

    class ThreadedHTTPServer(ThreadingMixIn, HTTPServer):
        pass

I have a handler class that inherits from BaseHTTPRequestHandler and I start the server with something like this:

    class MyHandler(BaseHTTPRequestHandler):
        ...

    server = ThreadedHTTPServer(('localhost', 8080), MyHandler)

    # Prevent issues with socket reuse
    server.allow_reuse_address = True

    # Start the server
    server.serve_forever()

This is all pretty simple. The problem I am facing is that, whether I use ThreadingMixIn, ForkingMixIn, or otherwise, each request ends up blocking until the request handler returns. This can easily be seen by running this sample code:

    import time
    from BaseHTTPServer import BaseHTTPRequestHandler

    class MyHandler(BaseHTTPRequestHandler):
        def respond(self, status_code):
            self.send_response(status_code)
            self.end_headers()

        def do_GET(self):
            print "Entered GET request handler"
            time.sleep(10)
            print "Sending response!"
            self.respond(200)

If the server processed requests simultaneously, we could send two requests and see the server enter both GET request handlers before sending either response. Instead, the server enters the GET request handler for the first request, waits for it to return, and only then enters it for the second, so the second request takes ~20 seconds to return instead of 10.

Is there an easy way for me to implement a system in which the server does not wait for a handler to return? In particular, I am trying to write a system that waits to receive several requests before responding to any of them (a form of long polling), and I run into problems when the first request blocks all future requests from connecting to the server.
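For reference, here is a minimal sketch of that kind of request aggregation, assuming the ThreadedHTTPServer from above. WAIT_FOR and GatherHandler are made-up names for illustration: each handler thread blocks on a shared threading.Condition until WAIT_FOR requests have arrived, and since ThreadingMixIn runs each request in its own thread, blocking inside do_GET does not prevent new connections from being accepted:

    import threading
    from BaseHTTPServer import BaseHTTPRequestHandler, HTTPServer
    from SocketServer import ThreadingMixIn

    WAIT_FOR = 3          # hypothetical: respond once this many requests arrive
    cond = threading.Condition()
    arrived = [0]         # shared counter, guarded by cond

    class ThreadedHTTPServer(ThreadingMixIn, HTTPServer):
        pass

    class GatherHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Each request runs in its own thread, so blocking here does
            # not stop the server from accepting further connections.
            with cond:
                arrived[0] += 1
                cond.notify_all()
                while arrived[0] < WAIT_FOR:
                    cond.wait()
            self.send_response(200)
            self.end_headers()
            self.wfile.write("all %d requests have arrived\n" % WAIT_FOR)

    server = ThreadedHTTPServer(('localhost', 8080), GatherHandler)
    server.serve_forever()

(The counter is never reset here, so only the first batch actually waits; a real implementation would reset it per batch and likely add a timeout.)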

1 answer
Your definition

    class ThreadedHTTPServer(ThreadingMixIn, HTTPServer):
        pass

is correct. Your client may simply not be making simultaneous requests. If you do make concurrent requests, the threaded server works as expected. Here is the client:

    #!/usr/bin/env python
    import sys
    import urllib2
    from threading import Thread

    def make_request(url):
        print urllib2.urlopen(url).read()

    def main():
        port = int(sys.argv[1]) if len(sys.argv) > 1 else 8000
        for _ in range(10):
            Thread(target=make_request,
                   args=("http://localhost:%d" % port,)).start()

    main()

And the corresponding server:

    import time
    from BaseHTTPServer import BaseHTTPRequestHandler, HTTPServer, test as _test
    from SocketServer import ThreadingMixIn

    class ThreadedHTTPServer(ThreadingMixIn, HTTPServer):
        pass

    class SlowHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            self.send_response(200)
            self.send_header("Content-type", "text/plain")
            self.end_headers()
            self.wfile.write("Entered GET request handler")
            time.sleep(1)
            self.wfile.write("Sending response!")

    def test(HandlerClass=SlowHandler, ServerClass=ThreadedHTTPServer):
        _test(HandlerClass, ServerClass)

    if __name__ == '__main__':
        test()

All 10 requests finish in about 1 second. If you remove ThreadingMixIn from the server definition, all 10 requests take about 10 seconds to complete.
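If you want to check those numbers yourself, here is a small variation on the client above (my own sketch, not part of the original answer) that joins the threads and reports the total elapsed time:

    #!/usr/bin/env python
    import sys
    import time
    import urllib2
    from threading import Thread

    def make_request(url):
        urllib2.urlopen(url).read()

    def main():
        port = int(sys.argv[1]) if len(sys.argv) > 1 else 8000
        url = "http://localhost:%d" % port
        start = time.time()
        threads = [Thread(target=make_request, args=(url,)) for _ in range(10)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()  # wait for every response before stopping the clock
        print "10 requests took %.1f seconds" % (time.time() - start)

    main()

With ThreadingMixIn in place this should print roughly 1 second; against a plain HTTPServer it should print roughly 10.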
