Python urllib2 timing

I would like to collect statistics about how much time each phase of a web request takes. With httplib I can do:

    import httplib
    import time

    def run(self):
        conn = httplib.HTTPConnection('www.example.com')
        start = time.time()
        conn.request('GET', '/')
        request_time = time.time()
        resp = conn.getresponse()   # headers have arrived; the body is not read yet
        response_time = time.time()
        resp.read()                 # read the body so the transfer timer is meaningful
        conn.close()
        transfer_time = time.time()
        self.custom_timers['request sent'] = request_time - start
        self.custom_timers['response received'] = response_time - start
        self.custom_timers['content transferred'] = transfer_time - start
        assert (resp.status == 200), 'Bad Response: HTTP %s' % resp.status

Are these statistics available from a higher-level interface like urllib2? Is there a high-level library that offers such statistics?


As mentioned in a related question, a good way to do this now is to use requests. It lets you measure request latency directly, although measuring the content transfer time separately is less straightforward; one approximation is to compare the timing of a HEAD request with that of a GET request.
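A minimal sketch of this idea, assuming Python 3 and the third-party requests package; it spins up a throwaway local HTTP server (a hypothetical endpoint, just so the example is self-contained) and splits the timing into "headers arrived" versus "body transferred":

```python
import threading
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

import requests  # third-party: pip install requests


class Handler(BaseHTTPRequestHandler):
    # Tiny local endpoint so the sketch does not depend on the network.
    def do_GET(self):
        body = b'x' * 10000
        self.send_response(200)
        self.send_header('Content-Length', str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the example's output quiet


server = HTTPServer(('127.0.0.1', 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = 'http://127.0.0.1:%d/' % server.server_address[1]

start = time.time()
resp = requests.get(url, stream=True)       # stream=True defers the body download
header_time = resp.elapsed.total_seconds()  # request sent -> response headers arrived
content = resp.content                      # accessing .content pulls the body
transfer_time = time.time() - start         # total wall-clock time including the body

server.shutdown()
```

Response.elapsed only covers the span from sending the request to parsing the response headers, so the gap between header_time and transfer_time is a rough stand-in for the content transfer phase.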


time.time is not the most reliable or accurate option. For profiling purposes you can use Python's timeit module: http://docs.python.org/library/timeit.html Here is a snippet that uses timeit:

    import timeit

    statmnt = 'print "Replace print with the snippet you want to profile"'
    setup = 'print "Replace this line with some snippet specific imports"'
    n = 1  # number of times you want the timeit module to execute statmnt
    t = timeit.Timer(statmnt, setup)
    qTime = t.timeit(n)

In your case, you would create three timeit objects, one each for the request, response, and content phases. For more information about the timeit module, see the documentation.
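As a concrete sketch, Timer also accepts a callable instead of a statement string, which avoids the quoting in the snippet above. The do_request function here is a hypothetical stand-in; you would replace its body with the phase of the request you want to profile:

```python
import timeit


# Hypothetical stand-in for one phase of a web request; swap in your real call.
def do_request():
    total = 0
    for i in range(1000):
        total += i
    return total


n = 10
t = timeit.Timer(do_request)  # a callable avoids building statement strings
elapsed = t.timeit(n)         # total seconds to run do_request() n times
per_call = elapsed / n
```

Averaging over n runs smooths out timer resolution and scheduling noise, which is the main reason to prefer timeit over a single pair of time.time() calls.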
