File Replication / Synchronization between Multiple Sites Using BitTorrent

I need to create a distributed system that relies on the replication of large files between sites.

I was thinking about using p2p technology, such as bittorrent, to save bandwidth and increase reliability.

Am I terribly wrong?

Has anyone ever created such a solution?

What libraries do you recommend?

Tags: synchronization, p2p, replication, distributed, bittorrent
3 answers

A promising new solution from the developers of BitTorrent: BitTorrent Sync.

It has the following features:

  • Unlimited and free!
  • Currently supports Windows, Mac, and Linux; mobile versions are in the works.
  • Specially designed to handle large files.
  • Private and secure: all traffic is encrypted.
  • Peer discovery protocols.
  • Supports relaying traffic for nodes that cannot connect directly (e.g. behind NAT).

I just found this open-source project from Twitter that hits the nail on the head:

https://github.com/lg/murder

From the docs:

Murder is a method of using BitTorrent to distribute files to a large number of servers within a production environment. This allows for scalable and fast deploys in environments of hundreds to tens of thousands of servers, where centralized distribution systems wouldn't otherwise function. A "murder" is normally used to refer to a flock of crows, which in this case applies to a bunch of servers doing something.


If you have more than two sites, then IMHO p2p is the best solution.

Just install rtorrent, deluge, or any other high-performance torrent client on each site. Then you only need to distribute the .torrent files via scp/sftp and enjoy.
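
To make that step concrete, here is a minimal Python sketch of the ".torrent push" using paramiko's SFTP client. The host names, user name, and watch directory are placeholders rather than anything from the original setup, and plain scp or rsync works just as well:

    # Push a freshly created .torrent file to every site over SFTP.
    # Clients such as rtorrent or deluge can be configured to watch a
    # directory and start downloading anything dropped into it.
    import paramiko

    SITES = ["site-a.example.com", "site-b.example.com", "site-c.example.com"]
    TORRENT = "dataset.torrent"        # metainfo file produced on the origin site
    WATCH_DIR = "/srv/torrents/watch"  # directory the site's torrent client watches

    for host in SITES:
        ssh = paramiko.SSHClient()
        ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        ssh.connect(host, username="replicator")  # assumes key-based auth
        try:
            sftp = ssh.open_sftp()
            sftp.put(TORRENT, f"{WATCH_DIR}/{TORRENT}")
            sftp.close()
        finally:
            ssh.close()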

To protect the content from third-party torrent clients, set the private flag when creating the .torrent file and use your own tracker; opentracker is a good choice.
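
For illustration, below is a self-contained Python sketch that builds a single-file .torrent with the private flag set and a self-hosted tracker as the announce URL. The field names follow the standard metainfo format (BEP 3) and the private-torrent extension (BEP 27); the tracker URL, file name, and piece size are placeholder choices:

    # Build a private, single-file .torrent pointing at your own tracker.
    import hashlib
    import os

    def bencode(value):
        """Encode ints, strings, lists and dicts per BEP 3."""
        if isinstance(value, int):
            return b"i%de" % value
        if isinstance(value, str):
            value = value.encode()
        if isinstance(value, bytes):
            return b"%d:%s" % (len(value), value)
        if isinstance(value, list):
            return b"l" + b"".join(bencode(v) for v in value) + b"e"
        if isinstance(value, dict):  # keys must be sorted as byte strings
            items = sorted((k.encode() if isinstance(k, str) else k, v)
                           for k, v in value.items())
            return b"d" + b"".join(bencode(k) + bencode(v) for k, v in items) + b"e"
        raise TypeError("cannot bencode %r" % type(value))

    def make_private_torrent(path, tracker, piece_length=4 * 1024 * 1024):
        pieces = []
        with open(path, "rb") as f:
            while chunk := f.read(piece_length):
                pieces.append(hashlib.sha1(chunk).digest())
        meta = {
            "announce": tracker,  # your own tracker, e.g. an opentracker instance
            "info": {
                "name": os.path.basename(path),
                "length": os.path.getsize(path),
                "piece length": piece_length,
                "pieces": b"".join(pieces),
                "private": 1,  # BEP 27: no DHT, PEX, or outside trackers
            },
        }
        out = path + ".torrent"
        with open(out, "wb") as f:
            f.write(bencode(meta))
        return out

    # Example with placeholder values (opentracker listens on 6969 by default):
    # make_private_torrent("dataset.bin", "http://tracker.internal:6969/announce")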

One more tip: if your torrent client supports super-seeding mode (aka BEP 16, or initial seeding), enable it. This helps distribute the content with minimal duplicate uploads from the seeding site.
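
As a sketch of enabling it programmatically (assuming the python-libtorrent 1.x bindings, where torrent_handle exposes super_seeding(); the 2.x bindings moved this to torrent_flags), the origin site could seed like this:

    # Seed a torrent with super seeding ("initial seeding") enabled, so the
    # full payload leaves the origin site roughly once and the other sites
    # exchange pieces among themselves.
    import time
    import libtorrent as lt

    ses = lt.session()
    info = lt.torrent_info("dataset.bin.torrent")  # metainfo created earlier
    handle = ses.add_torrent({"ti": info, "save_path": "/srv/data"})
    handle.super_seeding(True)  # aka BEP 16 / initial seeding

    while True:
        status = handle.status()
        print("up: %.1f kB/s, peers: %d"
              % (status.upload_rate / 1024, status.num_peers))
        time.sleep(10)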

