Response time for urllib in Python

I want to measure the response time when I use urllib. I wrote the code below, but the time it measures is more than just the response time. Can I get the response time with urllib, or should I use some other method?

    import urllib
    import datetime

    def main():
        urllist = [
            "http://google.com",
        ]
        for url in urllist:
            opener = urllib.FancyURLopener({})
            try:
                start = datetime.datetime.now()
                f = opener.open(url)
                end = datetime.datetime.now()
                diff = end - start
                print int(round(diff.microseconds / 1000))
            except IOError, e:
                print 'error', url
            else:
                print f.getcode(), f.geturl()

    if __name__ == "__main__":
        main()
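For reference, a minimal Python 2 sketch of the same measurement (same URL and urllib opener as above): diff.microseconds only holds the sub-second part of the timedelta, so whole seconds are dropped, whereas total_seconds() (Python 2.7+) covers the full duration. The measured interval still includes DNS lookup and connection setup, not just the server's response.

    import urllib
    import datetime

    start = datetime.datetime.now()
    f = urllib.FancyURLopener({}).open("http://google.com")
    diff = datetime.datetime.now() - start

    # diff.microseconds is only the sub-second component;
    # total_seconds() includes whole seconds as well (Python 2.7+)
    print int(round(diff.total_seconds() * 1000))
    print f.getcode(), f.geturl()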
1 answer

Save yourself some trouble and use the requests module. Its response objects provide a datetime.timedelta field called elapsed, which tells you how long the request took.

    >>> import requests
    >>> response = requests.get('http://www.google.com')
    >>> print response.elapsed
    0:00:01.762032
    >>> response.elapsed
    datetime.timedelta(0, 1, 762032)
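If you want a millisecond figure like in the question, the timedelta can be converted with total_seconds() (a small sketch continuing the session above; the number simply mirrors the example output):

    >>> int(round(response.elapsed.total_seconds() * 1000))
    1762

Note that elapsed measures the time from sending the request until the response headers are parsed, so it does not include downloading the full body.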