What is the fastest way to send a large binary file from one computer to another over the Internet?

I need to send large binary files (2 GB to 10 GB) from one computer (the client) to another (the server) over the Internet. At first I tried a WCF service hosted in IIS, using the wsHttpBinding binding with message security, but it took a very long time (several days), which is unacceptable for me. Now I'm thinking of writing client and server applications that use sockets directly. Would that be faster?

What is the best way to do this?

thanks

+6
c# large-files wcf large-file-upload
6 answers

Plain old FTP would be suitable in this case. With it you will be able to resume an interrupted transfer without having to redo the work from the beginning. You should assume that transfers this large will get interrupted at some point.
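
If you go this route, resuming is easy to drive from C#: the framework's own FtpWebRequest issues the FTP REST command for you when you set ContentOffset. A minimal download-resume sketch (the URL, credentials, and path are placeholders, not anything from your setup):

    using System.IO;
    using System.Net;

    class FtpResumeDownload
    {
        // Pick up a download from wherever a previous attempt stopped.
        static void Download(string url, string localPath)
        {
            long alreadyHave = File.Exists(localPath) ? new FileInfo(localPath).Length : 0;

            var request = (FtpWebRequest)WebRequest.Create(url);
            request.Method = WebRequestMethods.Ftp.DownloadFile;
            request.Credentials = new NetworkCredential("user", "password"); // placeholders
            request.ContentOffset = alreadyHave; // sends an FTP REST command to skip what we already have

            using (var response = (FtpWebResponse)request.GetResponse())
            using (Stream remote = response.GetResponseStream())
            using (var local = new FileStream(localPath, FileMode.Append, FileAccess.Write))
            {
                var buffer = new byte[81920];
                int read;
                while ((read = remote.Read(buffer, 0, buffer.Length)) > 0)
                    local.Write(buffer, 0, read);
            }
        }
    }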

+9

When sending large amounts of data, you are limited by the bandwidth of the connection, and you have to handle connection failures. Even small outages can have a big impact if they force you to resend a lot of data.

You can use BITS; it transfers data in the background and splits the data into blocks, so it takes care of a lot for you.

It relies on IIS (on the server) and has a client API for transfers, so you do not have to write the low-level transfer code yourself.

I don't know whether it will be faster, but it is at least much more reliable than a single HTTP or FTP request, and you can get it up and running very quickly.

If bandwidth is the bottleneck and you do not strictly need to send the data over the Internet, consider high-bandwidth (if high-latency) alternatives, for example sending a DVD by courier.

You can use BITS from .NET; CodeProject has a wrapper.
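
Roughly, driving it through that wrapper looks like the sketch below. The type and method names follow the SharpBITS.NET wrapper as I remember it, so treat them as an assumption and check the wrapper's documentation; the server URL and local path are placeholders.

    using SharpBits.Base; // SharpBITS.NET, the CodeProject wrapper around the BITS COM API

    class BitsUploadSketch
    {
        static void Main()
        {
            var manager = new BitsManager();

            // Upload jobs require BITS 1.5+ on the client and a
            // BITS-enabled IIS virtual directory on the server.
            BitsJob job = manager.CreateJob("big transfer", JobType.Upload);
            job.AddFile(@"http://server/uploads/data.bin",  // remote URL (placeholder)
                        @"C:\data\data.bin");               // local file (placeholder)
            job.Resume(); // starts the job; BITS then retries and resumes on its own
        }
    }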

+4

Well, bandwidth is your problem, and dropping down to raw sockets won't help you there, because WCF overhead hardly matters for long binary transfers. Maybe an option for you is a lossless compression algorithm? Do a dry run with zip on a local copy of the file; if it compresses well, you can find a suitable streaming algorithm. By the way, I suggest adding resume support :)
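
If the dry run shows the file shrinks, GZipStream from System.IO.Compression already gives you a streaming lossless codec in C#, so you never have to hold the whole 10 GB in memory. A minimal sketch; wire it to whatever outgoing stream you end up with:

    using System.IO;
    using System.IO.Compression;

    class StreamingCompression
    {
        // Compress on the fly while writing to any outgoing stream
        // (a socket NetworkStream, a WCF stream, a file...).
        static void CopyCompressed(Stream source, Stream destination)
        {
            using (var gzip = new GZipStream(destination, CompressionMode.Compress))
            {
                var buffer = new byte[81920];
                int read;
                while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
                    gzip.Write(buffer, 0, read); // only one small buffer in memory at a time
            }
        }
    }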

+1

It is usually best to use something that has already been written for this kind of task, e.g. FTP, SCP, rsync, etc.

FTP supports resuming a transfer if the connection breaks, although I'm not sure whether both uploads and downloads can be resumed. rsync is much better at that.

EDIT: Another option that may be worth considering, although I'm not very familiar with it, is BitTorrent.

Another option is to roll your own client/server using a protocol library such as UDT, which can give you better-than-TCP performance. See: http://udt.sourceforge.net/

+1

Check out Attachmore (www.attachmore.com). You can send and share files of any type and size with it.

0

Although higher-level frameworks carry some bandwidth overhead, I have found WCF file transfer in streamed mode to be more than fast enough, usually as fast as a regular file copy over SMB. I have transferred hundreds of thousands of small files per session, as well as large files of 6 to 10 GB and sometimes larger, and never had serious problems over any decent connection.
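
For what it's worth, the streamed setup boils down to a contract whose operation takes a single Stream, plus a binding with TransferMode.Streamed; note that message security on wsHttpBinding forces the whole message to be buffered, which may be part of why the asker's first attempt was so slow. A bare-bones client-side sketch follows; the endpoint address, timeout, and names are illustrative, not my actual code:

    using System;
    using System.IO;
    using System.ServiceModel;

    [ServiceContract]
    public interface IFileTransfer
    {
        // A single Stream parameter is what lets WCF stream the body
        // instead of buffering the whole message in memory.
        [OperationContract]
        void Upload(Stream data);
    }

    class StreamingClient
    {
        static void Send(string path)
        {
            var binding = new BasicHttpBinding
            {
                TransferMode = TransferMode.Streamed,   // chunked transfer, not buffered
                MaxReceivedMessageSize = long.MaxValue, // lift the small default limit
                SendTimeout = TimeSpan.FromHours(2)     // multi-GB files need long timeouts
            };
            var factory = new ChannelFactory<IFileTransfer>(
                binding, new EndpointAddress("http://server/filetransfer")); // placeholder
            IFileTransfer proxy = factory.CreateChannel();

            using (FileStream file = File.OpenRead(path))
            {
                proxy.Upload(file); // WCF reads the stream in chunks as it sends
            }
            factory.Close();
        }
    }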

I really like the interfaces it provides. They let you do some pretty cool things that just can't work over FTP, such as duplex endpoints and callbacks. You get programmatic control over all aspects of the connection on both sides, and the two ends can exchange messages along with the files. Interesting stuff.

That said, FTP is quick and easy if you don't need all of those things.

0
