Why is urllib2 not working for me?

I installed 3 different Python scripts on my Ubuntu 10.04 32-bit machine with Python 2.6.5.

They all use urllib2, and I always get this error:

urllib2.URLError: <urlopen error [Errno 110] Connection timed out> 

Why?

Examples:

    >>> import urllib2
    >>> response = urllib2.urlopen("http://www.google.com")
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "/usr/lib/python2.6/urllib2.py", line 126, in urlopen
        return _opener.open(url, data, timeout)
      File "/usr/lib/python2.6/urllib2.py", line 391, in open
        response = self._open(req, data)
      File "/usr/lib/python2.6/urllib2.py", line 409, in _open
        '_open', req)
      File "/usr/lib/python2.6/urllib2.py", line 369, in _call_chain
        result = func(*args)
      File "/usr/lib/python2.6/urllib2.py", line 1161, in http_open
        return self.do_open(httplib.HTTPConnection, req)
      File "/usr/lib/python2.6/urllib2.py", line 1136, in do_open
        raise URLError(err)
    urllib2.URLError: <urlopen error [Errno 110] Connection timed out>
    >>> response = urllib2.urlopen("http://search.twitter.com/search.atom?q=hello&rpp=10&page=1")
    Traceback (most recent call last):
      ... (identical traceback) ...
    urllib2.URLError: <urlopen error [Errno 110] Connection timed out>

UPDATE:

    $ ping google.com
    PING google.com (72.14.234.104) 56(84) bytes of data.
    64 bytes from google.com (72.14.234.104): icmp_seq=1 ttl=54 time=25.3 ms
    64 bytes from google.com (72.14.234.104): icmp_seq=2 ttl=54 time=24.6 ms
    64 bytes from google.com (72.14.234.104): icmp_seq=3 ttl=54 time=25.1 ms
    64 bytes from google.com (72.14.234.104): icmp_seq=4 ttl=54 time=25.0 ms
    64 bytes from google.com (72.14.234.104): icmp_seq=5 ttl=54 time=23.9 ms
    ^C
    --- google.com ping statistics ---
    5 packets transmitted, 5 received, 0% packet loss, time 4003ms
    rtt min/avg/max/mdev = 23.959/24.832/25.365/0.535 ms
    $ w3m http://www.google.com
    w3m: Can't load http://www.google.com.
    $ telnet google.com 80
    Trying 1.0.0.0...
    telnet: Unable to connect to remote host: Connection timed out

UPDATE 2:

I am at home, using a router and an access point. However, I just noticed that Firefox does not work for me either, while Chrome, Synaptic, and other browsers such as Midori and Epiphany do work.

UPDATE 3:

    >>> useragent = 'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Ubuntu/10.04 Chromium/6.0.472.62 Chrome/6.0.472.62 Safari/534.3)'
    >>> request = urllib2.Request('http://www.google.com/')
    >>> request.add_header('User-agent', useragent)
    >>> urllib2.urlopen(request)
    Traceback (most recent call last):
      ... (identical traceback) ...
    urllib2.URLError: <urlopen error [Errno 110] Connection timed out>

UPDATE 4:

    >>> socket.setdefaulttimeout(50)
    >>> urllib2.urlopen('http://www.google.com')
    Traceback (most recent call last):
      ... (identical traceback) ...
    urllib2.URLError: <urlopen error [Errno 110] Connection timed out>

UPDATE 5:

Wireshark results (packet sniffer):

Firefox: http://bit.ly/chtynm

Chrome: http://bit.ly/9ZjILK

Midori: http://bit.ly/cKilC4

Midori is another browser that works for me. Only Firefox does not work.

8 answers

As suggested, first rule out your network setup.

First, verify that you can ping the host you are trying to connect to:

 $ ping www.google.com 

Then try an HTTP connection using, for example, w3m :

 $ w3m http://www.google.com 
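If both of those fail as well, you can reproduce the same check from Python with a raw TCP connection, which isolates the network layer from urllib2 entirely. A minimal sketch (the host and port are just examples):

```python
import socket

def can_connect(host, port, timeout=5):
    """Attempt a plain TCP connection; return (ok, error) instead of raising."""
    try:
        sock = socket.create_connection((host, port), timeout)
        sock.close()
        return True, None
    except socket.error as err:   # covers refused, timed out, DNS failure
        return False, err

ok, err = can_connect("www.google.com", 80)
print(ok, err)
```

If this returns `False` with `[Errno 110]`, the problem is below urllib2 (routing, firewall, or proxy), not in your Python code.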

The only thing I can think of right now, XRobot, is that they do not trust you.

They? The web servers :)

First of all, you should know that many web servers filter out clients they consider malicious, such as robots (maybe they know you are a robot, umm, XRobot :)). How do they do it? There are many ways to filter: for example, showing a captcha on the web page, or filtering on the User-Agent header.

And since your ICMP request works and the Chrome browser works, but w3m does not, I suggest you change the User-Agent as follows:

    user_agent = ('Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.10) '
                  'Gecko/20100915 Ubuntu/10.04 (lucid) Firefox/3.6.10')
    request = urllib2.Request('http://www.google.com/')
    request.add_header('User-agent', user_agent)
    response = urllib2.urlopen(request)

Maybe I get paranoia here, but hopefully this can help you :)
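If you need the header on every request, an alternative is to install a global opener once, so that every later `urlopen()` call sends the browser-like User-Agent. This is a sketch, not part of the original answer; the User-Agent string is just an example, and the `try/except` import keeps it runnable on Python 3 as well:

```python
try:
    import urllib2                      # Python 2
except ImportError:
    import urllib.request as urllib2    # the same API lives here in Python 3

user_agent = ('Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.10) '
              'Gecko/20100915 Ubuntu/10.04 (lucid) Firefox/3.6.10')

# Build an opener whose extra headers are sent with every request,
# then install it as the default used by urllib2.urlopen().
opener = urllib2.build_opener()
opener.addheaders = [('User-agent', user_agent)]
urllib2.install_opener(opener)
```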


What URL are you trying to connect to? There can be many reasons for this error; most come down to an incorrect hostname or IP address, or a problem with your connection to the remote host.


It seems that Chrome and Synaptic may be going through an HTTP proxy. In Chrome, select "Options" / "Under the Hood" / "Change proxy settings". Then check your GNOME proxy settings:

 $ gconftool-2 -R /system/proxy 
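You can also ask Python directly which proxies it would pick up. urllib2 honours the `http_proxy`/`https_proxy` environment variables by default, so a stale entry there produces exactly this kind of timeout. A small sketch (runs on both Python 2 and 3):

```python
import os

try:
    from urllib import getproxies           # Python 2
except ImportError:
    from urllib.request import getproxies   # Python 3

# The proxy mapping urllib2 would use for new connections,
# e.g. {'http': 'http://10.0.0.1:8080/'} if a proxy is configured.
proxies = getproxies()
print(proxies)

# The environment variables the mapping is typically built from.
for name in ('http_proxy', 'https_proxy', 'HTTP_PROXY', 'HTTPS_PROXY'):
    if name in os.environ:
        print('%s=%s' % (name, os.environ[name]))
```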

Have you checked your network connection? Something on the other end is not responding, either because it is down or because the connection is being dropped.

Also, please post the version of Python you are using.

UPDATE:

This is almost certainly a network problem. I also have an Ubuntu 10.04 machine (32-bit) with Python 2.6.5, an almost untouched installation, and I cannot reproduce the problem.

    Python 2.6.5 (r265:79063, Apr 16 2010, 13:09:56)
    [GCC 4.4.3] on linux2
    Type "help", "copyright", "credits" or "license" for more information.
    >>> import urllib2
    >>> response = urllib2.urlopen("http://www.google.com")
    >>> print response.read(100)
    <!doctype html><html><head><meta http-equiv="content-type" content="text/html; charset=ISO-8859-1"><

Follow these steps one at a time:

  • Check that you are connected and the network works: ping google.com
  • If that is fine and your Internet connection is just slow, do this:

    import socket
    socket.setdefaulttimeout(300) #in seconds.

This raises the default timeout for all new sockets.
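Note that since Python 2.6, `urlopen()` also accepts a per-call `timeout` argument, which is usually preferable to changing the process-wide default. A sketch (the URL is just an example; the actual request is left commented out):

```python
import socket

try:
    import urllib2                      # Python 2
except ImportError:
    import urllib.request as urllib2    # Python 3 location of the same API

# Process-wide default, in seconds, applied to every new socket.
socket.setdefaulttimeout(300)

# Per-call alternative: only this one request gets a 10-second limit.
# response = urllib2.urlopen("http://www.google.com", timeout=10)
```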


I experienced similar behavior. Eventually I remembered that I had once run a script that set a proxy. Removing the proxy from urllib2 solved my problem. This does not explain the telnet or w3m mysteries, but it may help someone with the urllib2 part.

This page helped me figure out how to remove proxies.

http://www.decalage.info/en/python/urllib2noproxy

Here is the code:

    proxy_handler = urllib2.ProxyHandler({})
    opener = urllib2.build_opener(proxy_handler)
    urllib2.install_opener(opener)

I think there are some problems with permissions. I had the same issue on my Ubuntu 11.10. Calling Python with sudo helped. Give it a try:

    jeffisabelle:~ $ python
    Python 2.7.2+ (default, Oct 4 2011, 20:03:08)
    [GCC 4.6.1] on linux2
    Type "help", "copyright", "credits" or "license" for more information.
    >>> import urllib2
    >>> response = urllib2.urlopen("http://www.google.com")
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "/usr/lib/python2.7/urllib2.py", line 126, in urlopen
        return _opener.open(url, data, timeout)
      File "/usr/lib/python2.7/urllib2.py", line 394, in open
        response = self._open(req, data)
      File "/usr/lib/python2.7/urllib2.py", line 412, in _open
        '_open', req)
      File "/usr/lib/python2.7/urllib2.py", line 372, in _call_chain
        result = func(*args)
      File "/usr/lib/python2.7/urllib2.py", line 1201, in http_open
        return self.do_open(httplib.HTTPConnection, req)
      File "/usr/lib/python2.7/urllib2.py", line 1171, in do_open
        raise URLError(err)
    urllib2.URLError: <urlopen error [Errno 110] Connection timed out>
    jeffisabelle:~ $ sudo python
    Python 2.7.2+ (default, Oct 4 2011, 20:03:08)
    [GCC 4.6.1] on linux2
    Type "help", "copyright", "credits" or "license" for more information.
    >>> import urllib2
    >>> response = urllib2.urlopen("http://www.google.com")
    >>>
