I use requests to download files, but for large files I have to check the file size on disk every time because I can't show progress as a percentage, and I would also like to know the download speed. How can I do this? Here is my code:
import requests
import sys
import time
import os

def downloadFile(url, directory):
    localFilename = url.split('/')[-1]
    r = requests.get(url, stream=True)
    start = time.clock()
    f = open(directory + '/' + localFilename, 'wb')
    for chunk in r.iter_content(chunk_size=512 * 1024):
        if chunk:
            f.write(chunk)
            f.flush()
            os.fsync(f.fileno())
    f.close()
    return (time.clock() - start)

def main():
    if len(sys.argv) > 1:
        url = sys.argv[1]
    else:
        url = raw_input("Enter the URL : ")
    directory = raw_input("Where would you want to save the file ?")
    time_elapsed = downloadFile(url, directory)
    print "Download complete..."
    print "Time Elapsed: " + str(time_elapsed)

if __name__ == "__main__":
    main()
One way I can think of would be to check the file size on disk on every loop iteration and compute the progress percentage from the Content-Length header, but that again seems wasteful for large files (around 500 MB). Is there a better way to do this?
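There is no need to re-read the file from disk: since you are already writing the chunks yourself, you can keep a running byte counter and compare it against the Content-Length header, which requests exposes via `r.headers`. A minimal sketch of that idea (in Python 3 syntax; the function names, chunk size, and the `format_progress` helper are my own choices, not part of your original code):

```python
import os
import time

def format_progress(downloaded, total, elapsed):
    """Build a progress string from bytes downloaded, total bytes,
    and elapsed seconds. total may be 0 if the server omits
    Content-Length; elapsed may be 0 right at the start."""
    percent = downloaded * 100.0 / total if total else 0.0
    speed = downloaded / elapsed / 1024.0 if elapsed > 0 else 0.0  # KB/s
    return "%5.1f%%  %8.1f KB/s" % (percent, speed)

def download_file(url, directory):
    # Requires `pip install requests`; imported here so the helper
    # above stays testable without the dependency.
    import requests
    local_filename = url.split('/')[-1]
    r = requests.get(url, stream=True)
    # headers is case-insensitive; servers are not required to send this.
    total = int(r.headers.get('Content-Length', 0))
    downloaded = 0
    start = time.time()
    with open(os.path.join(directory, local_filename), 'wb') as f:
        for chunk in r.iter_content(chunk_size=512 * 1024):
            if chunk:
                f.write(chunk)
                downloaded += len(chunk)
                print(format_progress(downloaded, total,
                                      time.time() - start))
    return time.time() - start
```

Note that `time.time()` is used rather than `time.clock()`: on Linux, `clock()` measures CPU time, which is close to zero while the process is blocked waiting on the network, so it badly misreports both elapsed time and speed.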