How to permanently delete a large file for the whole team

Someone on my team pushed a large file to the git server, and every team member now has a clone of the project containing that large file.

I followed the guide at http://help.github.com/removing-sensitive-data/ and it works both in my own working tree and on the remote server. But as soon as another team member fetches new data from the remote and then pushes commits, the large file can easily be reintroduced on the server.

Typically, a team member does the following to share their changes with others:

git fetch origin
git rebase origin/master
git push origin

In the rebase step, the old large file is re-injected into their local commits. Of course, the direct way is to require everyone on the team to re-clone the project after the large file has been deleted, but not everyone would be happy about that. Is there any way to do this besides having everyone re-clone the entire project?

Any suggestions? Thanks.

+7
3 answers

Take a look at git filter-branch with a tree filter. You need to rewrite the commit that introduced the file. Once that is done, everyone can fetch the rewritten history. Their remote branches will no longer fast-forward, because every commit after the removal of the offending file is different. When they rebase their current changes on top of the newly rewritten branches, the large object should no longer be pushed back up.
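For example, a minimal sketch of the rewrite, assuming the large file lives at path/to/big-file.bin (a placeholder path) and using an index filter, which never checks files out and is therefore faster than a tree filter:

# drop the file from every commit on every branch and tag
git filter-branch --index-filter \
    'git rm --cached --ignore-unmatch path/to/big-file.bin' \
    --prune-empty --tag-name-filter cat -- --all

# publish the rewritten history
git push origin --force --all
git push origin --force --tags

After the force push, everyone else fetches and rebases their local work on top of the rewritten branches as described above.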

An alternative is git rebase --preserve-merges -i, where you edit the offending commit.
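As a rough outline, assuming abc1234 is the commit that added the file (a placeholder hash) and the same placeholder path as above:

git rebase -i --preserve-merges abc1234^
# in the todo list, change "pick" to "edit" for the offending commit, then:
git rm --cached path/to/big-file.bin
git commit --amend
git rebase --continue

Note that this rewrites only the branch you are on; the filter-branch approach above rewrites every ref that contains the file.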

+3

If the time it takes to delete the large file from history is reasonable, you can write a script that removes the file, ask everyone to run that same script locally after the rewrite, and use a server-side hook to verify that the file is not pushed back.
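As an illustration only, a server-side update hook along these lines could reject pushes that bring the file back (the banned path is a placeholder):

#!/bin/sh
# .git/hooks/update on the server: refuse any ref update whose new commits
# contain the banned path again.
refname="$1"
oldrev="$2"
newrev="$3"
banned="path/to/big-file.bin"
zero="0000000000000000000000000000000000000000"

if [ "$newrev" = "$zero" ]; then
    exit 0                      # ref deletion: nothing to check
fi
if [ "$oldrev" = "$zero" ]; then
    range="$newrev"             # brand-new ref: check its whole history
else
    range="$oldrev..$newrev"    # existing ref: check only the new commits
fi

for commit in $(git rev-list "$range"); do
    if git ls-tree -r --name-only "$commit" | grep -qx "$banned"; then
        echo "Push rejected: commit $commit reintroduces $banned" >&2
        exit 1
    fi
done
exit 0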

0

The Pro Git book has a detailed example of doing this with git filter-branch (rather than the tree filter mentioned in the other answers). The chapter is here:

'Removing Objects' http://progit.org/book/ch9-7.html
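That chapter also covers reclaiming the space afterwards: after the filter-branch rewrite, the old objects are still reachable through the backup refs and the reflog, so roughly the following cleanup (run in each rewritten clone) is needed before the repository actually shrinks:

# remove the backup refs left by filter-branch, expire the reflog, and repack
rm -rf .git/refs/original
git reflog expire --expire=now --all
git gc --prune=now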

0