GitLab not checking filesize of files in repo properly?


I am a part of a Game Development Team. We are developing an Unreal Engine 4 game for PC. We have a private GitLab Server handling version control of the project.

Right now, when anyone tries to Git Pull from a branch that isn’t master from our Git Project, they receive a Smudge Error like the following:

Downloading UE4Proj/Content/Game/Art/Architecture/MedievalVillage/Textures/Bucket_A.uasset (1.3 MB)
Error downloading object: UE4Proj/Content/Game/Art/Architecture/MedievalVillage/Textures/Bucket_A.uasset (c366451): Smudge error: Error downloading Ue4Proj/Content/Game/Art/Architecture/MedievalVillage/Textures/Bucket_A.uasset (c366451e360519497bf1719bacdc40c938c833adf9b8060d90d0829fec15d6c8): expected OID c366451e360519497bf1719bacdc40c938c833adf9b8060d90d0829fec15d6c8, got e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855 after 0 bytes written

error: external filter 'git-lfs filter-process' failed
fatal: UE4Proj/Content/Game/Art/Architecture/MedievalVillage/Textures/Bucket_A.uasset: smudge filter lfs failed

It’s not just a single file either, and looks so far to have affected all branches that aren’t master:

Downloading UE4Proj/Content/Game/Art/Architecture/MedievalVillage/Textures/Bucket_C.uasset (1.4 MB)
Error downloading object: UE4Proj/Content/Game/Art/Architecture/MedievalVillage/Textures/Bucket_C.uasset (4023793): Smudge error: Error downloading UE4Proj/Content/Game/Art/Architecture/MedievalVillage/Textures/Bucket_C.uasset (40237933795d4ca4b4c58e2884e219c3e4cd8168af176f01b33d71e3353376d7): expected OID 40237933795d4ca4b4c58e2884e219c3e4cd8168af176f01b33d71e3353376d7, got e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855 after 0 bytes written

error: external filter 'git-lfs filter-process' failed
fatal: UE4Proj/Content/Game/Art/Architecture/MedievalVillage/Textures/Bucket_C.uasset: smudge filter lfs failed

So this clearly isn’t an isolated incident. Unreal Engine reports no issues with either of these UASSET files, which tells me there is nothing wrong with their content.
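One detail worth noting: the "got" hash in both errors is the well-known SHA-256 of zero bytes, meaning the smudge filter received an empty download rather than a merely mangled one. You can confirm this yourself:

```shell
# SHA-256 of the empty input — matches the "got" hash in both Smudge Errors,
# so the server handed back 0 bytes for these LFS objects:
printf '' | sha256sum
# e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855  -
```

That points at the objects being missing or empty on the server side, not at the UASSET content itself.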

To Git Pull the project, we have Team Members using Sourcetree and the Windows Command Line. They’re all receiving these same errors from our non-master branches.

Now you may be wondering why I’m asking here as opposed to GitHub or Stack Overflow (as this looks like Git is the source of the issue). I initially thought that too: I posted this same issue to Stack Overflow, and a user there believes the issue is with my GitLab Server:

Your server should check the size of the object that’s uploaded and not accept the object if it’s not the right size, even if it doesn’t check the hash, so your server is broken if it doesn’t do that. The Git LFS upload protocol is designed to perform a verify operation where the server can validate the uploaded object. It doesn’t provide a way to delete objects, though. I can’t speak for what GitLab does or doesn’t do; I’m not familiar with it, so you’ll have to ask them.
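For context on the verify step that answer mentions: after a client uploads an LFS object, the Git LFS protocol lets it POST the object’s OID and size back to a "verify" URL that the server returned from the batch API, so the server can confirm the stored object. A rough sketch (the URL is a placeholder — the real one comes from the batch response, and the size here is just an example value):

```shell
# Hypothetical verify request per the Git LFS batch API; the actual verify
# URL and auth token are returned by the server's /objects/batch response.
curl -X POST \
  -H "Accept: application/vnd.git-lfs+json" \
  -H "Content-Type: application/vnd.git-lfs+json" \
  -d '{"oid": "c366451e360519497bf1719bacdc40c938c833adf9b8060d90d0829fec15d6c8", "size": 1363148}' \
  https://gitlab.example.com/group/project.git/info/lfs/verify
```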

Could there be something wrong with the configuration of my GitLab Server? This problem has never occurred before. I recently upgraded from 13.0.14 to 13.6.1, at which time I had disabled Puma and enabled Unicorn. I have since switched the two so Puma is enabled (and performed gitlab-ctl reconfigure), but that doesn’t seem to have affected these Smudge Errors that keep occurring.

I would just delete what look like corrupted Branches … but at this point it appears that EVERY Branch that isn’t master is corrupted.

This seems to say that you also saw these errors while you were running unicorn, is that correct?

If so I highly doubt that unicorn/puma has anything to do with them.

I don’t know how GitLab’s LFS store works, but I would try to find out if the files actually stored on the server are good and match the pointers stored in Git (I have just read that that’s how git-lfs works).
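One way to do that check, assuming an Omnibus install with the default storage path (verify `lfs_storage_path` in your `gitlab.rb` — the path below is an assumption):

```shell
# 1) On any clone, print the LFS pointer that git itself stores for the file
#    (shows "version", "oid sha256:<OID>", and "size"):
git cat-file -p master:UE4Proj/Content/Game/Art/Architecture/MedievalVillage/Textures/Bucket_A.uasset

# 2) On the GitLab server, hash the stored object and compare it to the
#    pointer's OID. LFS objects are stored in a 2-level directory tree
#    keyed by the first four hex characters of the OID:
OID=c366451e360519497bf1719bacdc40c938c833adf9b8060d90d0829fec15d6c8
sha256sum "/var/opt/gitlab/gitlab-rails/shared/lfs-objects/$(echo "$OID" | cut -c1-2)/$(echo "$OID" | cut -c3-4)/$OID"
# If the printed hash differs from $OID, or the file is missing or empty,
# the stored object is corrupt.
```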

Maybe get one developer to upload one of the affected files to another project (make one just for testing this) and get another developer to check it out, to see whether the file survives a round trip through GitLab. Maybe try each combination of SSH/HTTP(S). If there are problems, it of course makes sense to check whether the file on the server is sane, so you know whether it was the upload or the download that caused the problem.
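That round-trip test could look something like this (the project URL and file name are hypothetical — substitute your own):

```shell
# Machine A: record the file's hash, then push it through LFS.
sha256sum Bucket_A.uasset                      # note this hash for later
git clone git@gitlab.example.com:team/lfs-roundtrip-test.git
cd lfs-roundtrip-test
git lfs track "*.uasset"
cp ../Bucket_A.uasset .
git add .gitattributes Bucket_A.uasset
git commit -m "LFS round-trip test"
git push origin master                         # repeat over HTTPS to compare transports

# Machine B: clone fresh and compare hashes.
git clone git@gitlab.example.com:team/lfs-roundtrip-test.git fresh
sha256sum fresh/Bucket_A.uasset                # must match the hash recorded on machine A
```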

So I resolved the issue: we rolled our GitLab VM back to a snapshot taken before a Git Push that had caused the VM to lock up (at the time, it took several attempts before the Git Branch was pushed). After rolling back, I added more RAM and CPU to the VM; the Git Branch was pushed again, this time without the GitLab VM locking up. Once successfully pushed, the branch could be pulled without error! We no longer receive Smudge Errors when pulling that branch (or any other branch in the Git Project, for that matter, as I’ll explain further).

We believe that ever since the GitLab VM locked up during that Git Push, the Project Git Repository has been corrupted. Indeed, anyone pulling from any of our Project Branches (except master, for some reason) was receiving Smudge Errors, including from branches that had previously been pullable without error! I am very thankful that regular snapshots and backups are performed on our Git Server.

Word of warning to anyone hosting a Git Server: if a big push is about to be attempted, snapshot AND backup the Git Server first. Trust me, you REALLY don’t want a corrupted Git repository!
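For anyone on Omnibus GitLab, the application-level backup is a one-liner, and by default it covers repositories, uploads, and LFS objects (snapshot the VM separately at the hypervisor level):

```shell
# Take a full GitLab application backup before attempting a big push:
sudo gitlab-backup create
# On GitLab versions before 12.1, the equivalent was:
# sudo gitlab-rake gitlab:backup:create
```

Note that the backup archive does not include `gitlab.rb` or your secrets file, so back those up separately as well.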