Git archive vs cp -R

If I have a clone of the git repository as a cached copy on a remote server for Capistrano/Vlad-style deployment, is it better to do A)

git archive --format=tar origin/master | (cd #{destination} && tar xf -) 

or B)

 cp -R cached-copy #{destination} && rm -Rf #{destination}/.git 

To clarify, the repository is already on the remote server, and I just want to copy a specific version to the releases directory on the same server during deployment.

+4
4 answers

I would say, rather:

 rsync -avP /local/repo/* server:/remote/repo 

This works as long as you are happy to skip all the top-level dot files in the repo, not just .git (the shell glob * does not match dot files). If you want to skip only .git, you will need the -f (filter) option; see the man page.

I love rsync. It works great, and most of the time you can use it the same way you would use scp!
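As a sketch of the skip-only-.git variant (all paths here are invented for the demo; the simple --exclude=.git spelling is used, which is equivalent to the filter rule -f '- .git'):

```shell
# Demo setup (hypothetical layout): a cached clone containing .git and a dot file.
mkdir -p cached-copy/.git
printf 'hello\n' > cached-copy/app.rb
printf 'SECRET=1\n' > cached-copy/.env

mkdir -p releases/current
# -a = archive mode; --exclude=.git skips only the .git directory,
# so dot files such as .env are still copied.
rsync -a --exclude=.git cached-copy/ releases/current/
```

The trailing slash on cached-copy/ tells rsync to copy the directory's contents rather than the directory itself.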

+8

A)

You save network overhead by not moving the .git directory, which can be quite large depending on how much history and how many objects are not in the current HEAD.

If you ever want a real git repository at the remote end, you are better off keeping a real repository there and transferring only the deltas.
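A sketch of that combined approach (repository names and the release path are made up for the demo): keep a cached clone on the server, fetch only the deltas into it, and export a .git-free tree with git archive:

```shell
# Demo setup: a throwaway "origin" repository standing in for the real project.
git init -q -b master origin-repo
printf 'v1\n' > origin-repo/app.rb
git -C origin-repo add app.rb
git -C origin-repo -c user.email=d@example.com -c user.name=demo commit -q -m 'v1'

# The cached copy on the server; updating it moves only deltas over the network.
git clone -q origin-repo cached-copy
git -C cached-copy fetch -q origin

# Export one specific version (here origin/master) without any .git directory.
mkdir -p releases/1
git -C cached-copy archive --format=tar origin/master | tar -xf - -C releases/1
```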

+3

None!

The best way to do this is:

  • git fetch into your cache
  • Clone the cache into the release directory (with the --no-checkout option)
  • Check out the commit you want.

A local git clone uses hard links. This means that, as long as you do not modify the files, you can have 1000 deployments and use (practically) only the space needed for one. This method is also much faster than archive or rsync.
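The steps above can be sketched like this (all paths invented for the demo; git hard-links the objects of a local clone automatically, so each release shares object storage with the cache):

```shell
# Demo setup: a cached repository with one committed file.
git init -q -b master cached-copy
printf 'v1\n' > cached-copy/app.rb
git -C cached-copy add app.rb
git -C cached-copy -c user.email=d@example.com -c user.name=demo commit -q -m 'v1'

# 1) Update the cache (commented out here, since the demo cache has no remote):
#    git -C cached-copy fetch
# 2) Clone the cache into the release directory without checking anything out;
#    a local clone hard-links the objects, so it is nearly free on disk.
git clone -q --no-checkout cached-copy releases/2
# 3) Check out exactly the version you want (here the tip of master).
git -C releases/2 checkout -q master
```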

+3

Also, someone has already written this code for you:

vigetlab capistrano_rsync_with_remote_cache

I use this with Subversion and it works well for me.

+2
