How to handle timeouts with httplib (python 2.6)?

I use httplib to access an API via HTTPS, and I need exception handling for the case where the API is unavailable.

Here's an example connection:

connection = httplib.HTTPSConnection('non-existent-api.com', timeout=1)
connection.request('POST', '/request.api', xml, headers={'Content-Type': 'text/xml'})
response = connection.getresponse()

Since the host does not exist, this should time out, so I was expecting an exception to be raised, but response.read() just returns an empty string.

How can I find out whether a timeout occurred? Better yet, what is the best way to gracefully handle a third-party API being unavailable?

3 answers

In Python 2.6 and later, httplib accepts a timeout argument directly (as in your example). On earlier versions, neither urllib nor httplib exposes a timeout; set a process-wide default on the socket module instead:

import socket
socket.setdefaulttimeout(10) # or whatever timeout you want
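A minimal sketch of how the default timeout propagates: every socket created after the call inherits it, which is what makes the pre-2.6 workaround effective for httplib connections.

```python
import socket

# Set a process-wide default; note the function is setdefaulttimeout,
# not settimeout (settimeout is a method on socket objects, not the module).
socket.setdefaulttimeout(10)

# Every socket created afterwards inherits the default timeout.
s = socket.socket()
print(s.gettimeout())  # 10.0
```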

What kind of failure do you want to handle?

The API host may be unreachable, or the API may respond but return an HTTP error status such as 404 or 500.

So first, check that the API responds at all. Sending a lightweight HEAD request and inspecting the status code is enough:

import httplib

conn = httplib.HTTPConnection('www.google.com')  # plain HTTP here for simplicity
conn.request('HEAD', '/')  # a HEAD request is enough to check availability
res = conn.getresponse()

if res.status == 200:
    print "ok"
else:
    print "problem: the request returned %s because %s" % (res.status, res.reason)
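The status check above can be factored into a small predicate. This is a sketch, and is_healthy is a hypothetical helper name, not part of httplib; it treats any 4xx/5xx status as a failure rather than only accepting 200.

```python
def is_healthy(status):
    # 2xx and 3xx responses count as "the API works";
    # 4xx/5xx mean the server answered but with an error.
    return status < 400

print(is_healthy(200))  # True
print(is_healthy(500))  # False
```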

Second, if the server is unreachable (the API is down, DNS fails, the connection is refused, and so on), catch the resulting exceptions:

import httplib
import socket

try:
   # You probably don't need a timeout here unless you also want to measure response time
   conn = httplib.HTTPSConnection('www.google.com') 
   conn.connect()
except (httplib.HTTPException, socket.error) as ex:
   print "Error: %s" % ex
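To answer the original question of telling a timeout apart from other failures: socket.timeout is a subclass of socket.error, so check for it first. The classify_error helper below is a hypothetical name for illustration.

```python
import socket

def classify_error(exc):
    # Distinguish a timeout from other socket-level failures.
    # Order matters: socket.timeout is a subclass of socket.error.
    if isinstance(exc, socket.timeout):
        return 'timeout'
    if isinstance(exc, socket.error):
        return 'connection error'
    return 'other'

print(classify_error(socket.timeout('timed out')))   # timeout
print(classify_error(socket.error('refused')))       # connection error
```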



This is what I found works correctly with httplib2. Posting it in case it helps someone:

import httplib2, socket

def check_url(url):
    h = httplib2.Http(timeout=0.1)  # 100 ms timeout
    try:
        resp, content = h.request(url, 'HEAD')
    except (httplib2.HttpLib2Error, socket.error):
        print "Request timed out for", url
        return False
    return int(resp['status']) < 400
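On the "gracefully deal with third-party failures" part of the question: once failures are detectable, a common pattern is to retry transient errors a few times before giving up. A minimal sketch, where with_retries is a hypothetical helper, not from any library:

```python
import time

def with_retries(fn, attempts=3, delay=0.05):
    # Call fn(), retrying on any exception up to `attempts` times,
    # sleeping briefly between attempts; re-raise on the last failure.
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(delay)

# Usage: a call that fails twice, then succeeds on the third attempt.
calls = {'n': 0}
def flaky():
    calls['n'] += 1
    if calls['n'] < 3:
        raise IOError('temporary failure')
    return 'ok'

result = with_retries(flaky)
print(result)  # ok
```

In production you would typically retry only on timeout/connection errors (not HTTP 4xx) and use exponential backoff instead of a fixed delay.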
