Git vs SVN - Network Performance (for backup)

Which is better for transferring large, frequently updated files over limited bandwidth? I could not find any comparisons.

UPDATE

So as not to exclude other solutions: is there something better suited to sending deltas of large files? (I have tried Unison.)

+4
3 answers

For large, frequently modified binaries, git and svn should behave roughly the same for push/commit and pull/update operations. With large files you are limited by the size of the delta you send, and both git and svn compress the data, so neither has an obvious advantage unless one of them happens to handle your file format better.
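A quick way to check how well git's delta compression handles your particular file type before you commit to either tool (the paths and file names below are just placeholders):

    # Commit two revisions of a large binary into a scratch repository,
    # then see how much git actually stores for them.
    git init /tmp/delta-test && cd /tmp/delta-test
    cp /path/to/bigfile-v1.bin bigfile.bin        # placeholder paths
    git add bigfile.bin && git commit -m "v1"
    cp /path/to/bigfile-v2.bin bigfile.bin
    git commit -am "v2"
    git gc                        # repack so git can deltify the two blobs
    git count-objects -v          # size-pack (KiB) is the packed size of both revisions

If size-pack comes out close to the sum of the two raw files, your format neither deltifies nor compresses well, and neither git nor svn will save you much bandwidth.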

However, there is one critical point: cloning a git repository of this kind will be slow, because a clone has to pull down the entire history, not just the latest snapshot.

So, as long as you can avoid the clone operation in particular, either tool should work fine for you.
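(If you do need a fresh copy on another machine, a shallow clone is the usual workaround, since it downloads only recent history instead of every old delta; note that older git versions limit what you can do from a shallow clone. The URL below is a placeholder.)

    # Fetch only the latest commit instead of the full history
    git clone --depth 1 git://example.com/project.git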

(I also suggest Dropbox as a good candidate for this task.)

+3

Unison or rsync is probably the best choice. Storing a large number of large binary files in a source control system can cause headaches.
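For example, a typical rsync invocation for this kind of backup; rsync already does block-level delta transfer and can compress and rate-limit the stream. The host and paths here are placeholders.

    # Send only the changed blocks of each file over SSH, compressed,
    # updating destination files in place and capping bandwidth (KB/s).
    rsync -avz --inplace --bwlimit=500 \
        /data/bigfiles/ user@backup.example.com:/srv/bigfiles/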

+2

I cannot give specific numbers right now, but I use both SVN and git, and the latter is faster.

More propaganda: http://whygitisbetterthanx.com/#git-is-fast

Git can use four main network protocols for data transfer: Local, Secure Shell (SSH), Git and HTTP.

...

The Git protocol is often the fastest transfer protocol available. If you are serving a lot of traffic for a public project, or serving a very large project that does not require user authentication for read access, you will probably want to set up a Git daemon to serve your project.

From http://progit.org/book/ch4-1.html
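For illustration, the same repository cloned over each of those transports (host and paths are placeholders):

    git clone /srv/git/project.git                              # local
    git clone ssh://user@git.example.com/srv/git/project.git    # SSH
    git clone git://git.example.com/project.git                 # Git protocol
    git clone http://git.example.com/git/project.git            # HTTP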

And an informal note describing the protocol: http://git-scm.com/gitserver.txt
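A minimal git daemon setup along those lines looks roughly like this (paths and host are placeholders; on a real server you would normally run it under a service manager):

    # Serve repositories under /srv/git read-only over git:// (port 9418)
    git daemon --reuseaddr --base-path=/srv/git --export-all /srv/git
    # Or, instead of --export-all, mark individual repositories as public:
    touch /srv/git/project.git/git-daemon-export-ok
    # Clients then clone with:
    git clone git://git.example.com/project.git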

+1
