Gradually converting a repo to use git-lfs for certain files, one file at a time


We have a high-value repo that is expensive to disrupt.

It also has large XML files that cause significant problems when merging through the web app and when viewing git log history. They are also likely making our CI/CD inefficient. In other words, the usual things that spur people to move to git-lfs.

We want to carefully do this. One file at a time.

I have seen approaches similar to what's listed below:

cp *.xml ~/tmp                 # stash copies outside the repo
git rm *.xml
git commit
git lfs track '*.xml'          # quote the glob so the shell doesn't expand it
git add .gitattributes
git commit; git push
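
For reference, git lfs track '*.xml' records the rule in .gitattributes as a line like this (the standard rule git-lfs writes):

*.xml filter=lfs diff=lfs merge=lfs -text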

In a fresh directory:

git clone --mirror $remote; cd repo.git   # a mirror clone's directory ends in .git
bfg --delete-files '*.xml'
git reflog expire --expire=now --all && git gc --prune=now --aggressive
git push

Back in src:

mv repo repo.bloated
git clone $remote; cd repo
cp ~/tmp/*.xml .
git add *.xml                  # the .gitattributes rule now stores them in LFS
git commit; git push
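
A quick sanity check at this point (not in the original recipe; some-file.xml is a placeholder for one of your paths) confirms the re-added files really landed in LFS:

git lfs ls-files                  # lists the paths now stored as LFS objects
git show HEAD:some-file.xml       # prints a short LFS pointer block, not the XML content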

How can I do something similar but start with just one large XML file to reduce risk during the transition? We would prefer to stay in easy contact with the dev maintaining that file, isolate the changes, and crawl before we walk. Changing hundreds of files at once could hold up developers and be expensive.

Do we just change * to the specific file name in the above example?

1 Answer

In case anyone has a similar concern: we have experimented with this, and I'm happy to say we were able to do it safely and carefully.

Gradually transitioning to git-lfs works. To try it file by file, replace the * in the snippets above with the specific file name and migrate one file at a time; a sketch of the full single-file flow is below.
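
For example, with a hypothetical file data/huge.xml (substitute your own path), the three phases look like this:

# phase 1: pull the file out of regular git and start tracking it
cp data/huge.xml ~/tmp
git rm data/huge.xml
git commit
git lfs track data/huge.xml        # writes a rule for just this path to .gitattributes
git add .gitattributes
git commit; git push

# phase 2 (optional history scrub), in a fresh mirror clone
git clone --mirror $remote; cd repo.git
bfg --delete-files huge.xml        # note: bfg matches file names, not full paths
git reflog expire --expire=now --all && git gc --prune=now --aggressive
git push

# phase 3: re-add the file in a fresh clone
git clone $remote; cd repo
cp ~/tmp/huge.xml data/
git add data/huge.xml              # the .gitattributes rule now stores it in LFS
git commit; git push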

Later you can go as whole-hog as you need to, using bfg and so on.
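
If you eventually want a one-shot, whole-repo conversion, git-lfs also ships its own history-rewriting subcommand; a minimal sketch (git lfs migrate is a standard git-lfs command, but this exact invocation is an assumption, not part of the original workflow):

git lfs migrate import --everything --include='*.xml'   # rewrites all refs, converting *.xml to LFS
git push --force --all             # history was rewritten, so force-push (and re-push tags if any)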

If you are absolutely sure you want to do this to try out git-lfs, you can. We did it in our repo to get more familiar with git-lfs before wider adoption, and it works just fine.