Is there a way to improve the speed of loading URLs in Python?
I have a program that I originally wrote in VB6, and it is far faster than my Python version without even trying. I reworked the program in Python (on Linux), and now it runs much slower, roughly twice as long. Even the first Python version seemed to take longer than the VB6 version did on Windows.
I tried urllib (2.7), urllib.request (3.3), and requests. I am currently trying urllib3, and it is not any faster. What usually takes 45 minutes on Windows takes about two hours to complete the same task on the same computer on the same internet connection. The task is just to probe the web for files and download them when a probe finds what it is looking for ... simply a range of potential file names.
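Since most of the time here is spent waiting on the network rather than on Python itself, one common way to close the gap is to issue requests concurrently so slow responses overlap. This is a minimal sketch of that idea; `fetch_all` and its parameters are my own names, not part of any library:

```python
import concurrent.futures

def fetch_all(urls, fetch, max_workers=8):
    # Run `fetch` over every URL on a thread pool so slow responses
    # overlap instead of being waited on one at a time.
    # Returns a dict mapping each URL to its fetch result.
    with concurrent.futures.ThreadPoolExecutor(max_workers=max_workers) as pool:
        return dict(zip(urls, pool.map(fetch, urls)))
```

With the urllib3 pool from the code below, a hypothetical call might look like `fetch_all(urls, lambda u: http.request('HEAD', u).status)`; a HEAD request avoids downloading the body just to learn whether the file exists.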
I will also ask, since it has happened more than once today: how can I catch error code 110 (connection timed out)? The check in the code below does not work, and the error still killed the program.
import os
import urllib.request

import urllib3

http = urllib3.PoolManager()

def dl_10(self):
    self.NxtNum10 = int(self.HiStr10)
    while self.NxtNum10 < int(self.HiStr10) + 9999:
        url = 'http://www.example.com/videos/encoded/' + str(self.NxtNum10) + '.mp4'
        # note: this GET downloads the whole body just to read the status,
        # and urlretrieve below then downloads the same file a second time
        r = http.request('GET', url)
        if r.status == 404:
            self.NxtNum10 += 1
            continue
        elif r.status == 110:  # this never fires -- see question above
            continue
        else:
            urllib.request.urlretrieve(url, str(self.NxtNum10) + '_1.mp4')
            statinfo = os.stat(str(self.NxtNum10) + '_1.mp4')
            if statinfo.st_size < 10000:
                os.remove(str(self.NxtNum10) + '_1.mp4')
            else:
                self.End10 = self.NxtNum10
        self.NxtNum10 += 1
        self.counter += 1
    self.NxtNum10 = 'FINISHED'
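For what it's worth, the `r.status == 110` check can never fire: 110 is the OS-level errno `ETIMEDOUT`, not an HTTP status code, so a connection timeout reaches Python as a raised exception (e.g. `socket.timeout`, or an `OSError` carrying errno 110) rather than as a status on the response. A sketch of distinguishing it, with `classify_error` being my own hypothetical helper:

```python
import errno
import socket

def classify_error(exc):
    # Connection timeouts surface as exceptions, not HTTP statuses:
    # errno 110 is the OS-level ETIMEDOUT code.
    if isinstance(exc, socket.timeout):
        return 'timeout'
    if isinstance(exc, OSError) and exc.errno == errno.ETIMEDOUT:
        return 'timeout'
    return 'other'
```

In the loop above one could then wrap the request in `try: r = http.request('GET', url) except Exception as exc:` and retry or skip when `classify_error(exc) == 'timeout'`, instead of testing `r.status`.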