I found your post while googling to find out whether someone had already written a parallel wget equivalent that does this. It's definitely possible, and it would help with very large files over a relatively high-latency link: I've gotten >10x speed improvements using multiple parallel TCP connections.
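For the curious, here's a minimal sketch of that idea, assuming the server supports HTTP Range requests and reports Content-Length; the URL, part count, and function names are mine for illustration, not from any existing tool:

```python
# Sketch: download one large file over several parallel TCP connections
# by splitting it into byte ranges. Assumes the server honors Range
# requests and the file is large relative to PARTS.
import concurrent.futures
import requests

URL = "http://example.com/big.file"   # hypothetical URL
PARTS = 8

def fetch_range(start, end):
    # Each worker opens its own connection and requests one slice.
    headers = {"Range": f"bytes={start}-{end}"}
    return requests.get(URL, headers=headers).content

def parallel_download():
    size = int(requests.head(URL).headers["Content-Length"])
    step = size // PARTS
    ranges = [(i * step, size - 1 if i == PARTS - 1 else (i + 1) * step - 1)
              for i in range(PARTS)]
    with concurrent.futures.ThreadPoolExecutor(max_workers=PARTS) as pool:
        chunks = list(pool.map(lambda r: fetch_range(*r), ranges))
    return b"".join(chunks)
```

Splitting by byte ranges like this is what lets a single large file benefit from several TCP connections at once, instead of only parallelizing across separate files.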
However, since your organization runs both the application and the web service, I'd guess your link is high-bandwidth and low-latency, so I suspect this approach won't help you.
Since you're transferring a large number of small files (by modern standards), I suspect you're actually paying for connection setup rather than raw transfer speed. You could verify this by timing how long a similar page full of small images takes to load. In your situation the fix may be to go serial rather than parallel: check whether your client HTTP library can use persistent HTTP connections, so the three-way handshake is performed once per page or less, rather than once per image.
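Here's a sketch of that serial-but-persistent approach, assuming a Python client; the URLs are made up. A single `requests.Session` pools and reuses a keep-alive connection, so TCP (and any TLS) setup is paid once rather than once per image:

```python
# Sketch: fetch many small files over one persistent connection.
import requests

image_urls = [f"http://example.com/img/{i}.png" for i in range(100)]

session = requests.Session()   # pools and reuses keep-alive connections
for url in image_urls:
    resp = session.get(url)    # reuses the same socket: no new handshake
    resp.raise_for_status()
    # ... process resp.content ...
```

Compared with calling `requests.get(url)` in the loop (a fresh connection each time), the session version should make the setup cost nearly vanish from your timings if connection setup really is the bottleneck.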
If you end up getting really fanatical about TCP latency, it's also possible to cheat, as some major web services do.
(My own problem is at the other end of the TCP performance spectrum, where long round-trip times really start to drag down throughput on multi-TB file transfers, so if you do turn up a parallel HTTP library, I'd love to hear about it. The only tool I found, called "puf", parallelizes by files rather than byte ranges. If the above doesn't help you and you really do need a parallel transfer tool, get in touch as well: I may have given up and written it myself by then.)