I want to download a huge file from an FTP server in chunks of 50-100 MB each. At each step I want to be able to set the starting offset and the length of the fragment I need. I will not have the previous pieces stored locally (i.e. I cannot ask the program to resume the download).
What is the best way to do this? I mostly use wget, but is there something better?
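One option worth noting: unlike wget, curl supports arbitrary byte ranges over FTP with its `-r`/`--range` flag (which the server must support via the REST command). A minimal sketch, assuming a hypothetical 250 MB file at `ftp://example.com/pub/big.bin`, that prints one curl command per 50 MB chunk as a dry run:

```shell
# Assumed values for illustration only: total file size and chunk size in bytes.
total=262144000          # 250 MB (hypothetical)
chunk=52428800           # 50 MB per piece
start=0
i=0
while [ "$start" -lt "$total" ]; do
  end=$((start + chunk - 1))
  # Clamp the last range to the end of the file.
  [ "$end" -ge "$total" ] && end=$((total - 1))
  # Each command fetches exactly one byte range into its own part file.
  echo curl -r "${start}-${end}" -o "part_${i}" "ftp://example.com/pub/big.bin"
  start=$((end + 1))
  i=$((i + 1))
done
```

Drop the `echo` to actually run each transfer; since every chunk is fetched independently by offset and length, no previously downloaded pieces need to be kept around.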
Hello! I'm really interested in a prebuilt / built-in tool, not in using a library for this purpose ... since wget / FTP (also, I think) allow resuming downloads, I don't see why this would be a problem ... (I can't figure out all the options, though!)
Hi noinfection - I looked at this and it won't work ... I don't want to store the whole huge file at the end; I just process it in pieces ... FYI all - I'm looking into continuing an FTP download after a reconnect, which seems interesting ...
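On continuing an FTP download after a reconnect: both curl and wget can resume from the size of the partial local file (curl with `-C -`, wget with `-c`), again relying on the server's REST support. A sketch with a hypothetical URL, shown as a dry run:

```shell
# Hypothetical URL; -C - tells curl to inspect the partial local file
# and continue the transfer from its current size after a reconnect.
url="ftp://example.com/pub/big.bin"
cmd="curl -C - -o big.bin $url"
echo "$cmd"   # dry run; in practice, run the command directly
```

Note this resumes a single growing local file, which differs from the chunked approach in the question, where each fragment is fetched by explicit offset and length.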
Tags: shell, ftp, wget
Dave