There are large files in the repo stored with git lfs. git lfs technically works, but there is usually a better place to host a large file (Google Drive? Dropbox? YouTube? etc.).
I think GitHub is storing these large objects, but it severely limits the bandwidth on them, and basic git operations fail unless the user explicitly tells git lfs to skip them.
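For context: an LFS-tracked file in the repo itself is only a small pointer file; the real content lives in GitHub's LFS storage, which is what the bandwidth quota applies to. As an illustration (the oid below is the one from the error message; the byte size is hypothetical, not taken from the repo), a pointer file looks roughly like:

```
version https://git-lfs.github.com/spec/v1
oid sha256:cfdd1eb9175ecfd150293e4d0d1d6aead1c95b73254f11487a6686650010a464
size 536870912
```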
To Reproduce
```
$ git clone https://github.com/betodealmeida/shillelagh
Updating files: 100% (4/4), done.
Downloading talks/Designated Driver Episode #5 • Shillelagh.mkv (512 MB)
Error downloading object: talks/Designated Driver Episode #5 • Shillelagh.mkv (cfdd1eb): Smudge error: Error downloading talks/Designated Driver Episode #5 • Shillelagh.mkv (cfdd1eb9175ecfd150293e4d0d1d6aead1c95b73254f11487a6686650010a464): batch response: This repository exceeded its LFS budget. The account responsible for the budget should increase it to restore access.
Errors logged to '/.../shillelagh/.git/lfs/logs/20251111T093539.747692.log'.
Use `git lfs logs last` to view the log.
error: external filter 'git-lfs filter-process' failed
fatal: talks/Designated Driver Episode #5 • Shillelagh.mkv: smudge filter lfs failed
```
Basically any local git operation will generate an error message like this: you can't check out main, you can't fetch, you can't pull, you can't change branches, etc.
For anyone trying to work around this, the workaround is:
`export GIT_LFS_SKIP_SMUDGE=1`
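A minimal sketch of applying the workaround (the clone URL is the one from the reproduction above; with the variable set, LFS-tracked files are checked out as small pointer stubs instead of being downloaded):

```shell
# Tell git-lfs to skip the smudge (download) step for this shell session.
export GIT_LFS_SKIP_SMUDGE=1
echo "GIT_LFS_SKIP_SMUDGE=$GIT_LFS_SKIP_SMUDGE"

# With the variable set, clone/fetch/checkout/pull succeed; LFS files
# are left as pointer files rather than the real 512 MB object, e.g.:
#   git clone https://github.com/betodealmeida/shillelagh
#   git checkout main
```

To make the skip permanent across repos rather than per-session, git-lfs also supports `git lfs install --skip-smudge`, which configures the smudge filter to pass pointer files through untouched.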