At work, I’m running GitLab 13.4.4.
I just enabled LFS in gitlab.rb for the first time, and am trying it out.
The only thing I did was uncomment and configure these 2 lines:
- gitlab_rails['lfs_enabled'] = true
- gitlab_rails['lfs_storage_path'] = "/var/opt/lfs"
Then I ran gitlab-ctl reconfigure.
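For reference, the relevant part of my /etc/gitlab/gitlab.rb now looks roughly like this (everything else is left at its defaults, and I'm assuming the storage directory has to exist and be writable by the Omnibus git user):

# LFS settings changed in /etc/gitlab/gitlab.rb
gitlab_rails['lfs_enabled'] = true
gitlab_rails['lfs_storage_path'] = "/var/opt/lfs"

# quick check that the storage path exists and who owns it
ls -ld /var/opt/lfs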
To test, I created a fresh git repo using the web interface, cloned it on my computer (macOS High Sierra), and ran this:
git lfs install
git lfs track "*.mp4"
git add .gitattributes
cp -v /small_movie.mp4 .
git add small_movie.mp4
git commit
cp -v /big_movie.mp4 .
git add big_movie.mp4
git commit
git push origin master
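For what it's worth, I believe these standard git-lfs commands can be used to double-check which files are actually tracked and which endpoint the client resolves for pushes:

git lfs ls-files    # lists files stored as LFS pointers in the current checkout
git lfs env         # shows the resolved LFS endpoint and transfer settings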
Even though small_movie.mp4 is 241 MB and big_movie.mp4 is 1.1 GB, the git push command output said this:
$ git push origin master
Locking support detected on remote "origin". Consider enabling it with:
$ git config lfs.https://company-software-server.com/tal/lfs-test-repo.git/info/lfs.locksverify true
Uploading LFS objects: 0% (0/1), 4.7 GB | 7.9 MB/s, done.
LFS: Put "https://company-software-server.com/tal/lfs-test-repo.git/gitlab-lfs/objects/f2314cbc2a5d01df0cbf97f89dfa338f3613610ab7971c34d7e3c8766900744d/1147248372": read tcp 192.168.XX.XYZ:56903->129.14.XX.YY:443: i/o timeout
error: failed to push some refs to 'https://company-software-server.com/tal/lfs-test-repo.git'
Not only did it try to push 4.7 GB of data for some reason (both movies put together are nowhere near that big), but it also failed.
The only server between my client and GitLab is an NGINX reverse proxy. I configured NGINX with:
- client_max_body_size 0;
so that the upload size is unlimited and NGINX doesn't block large uploads.
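In case the timeouts are relevant, the proxy block in front of GitLab is roughly shaped like the sketch below. The client_max_body_size line is the only change I actually made; the timeout and buffering lines are just examples of settings I have seen suggested for large uploads, and the hostname and backend address are placeholders:

server {
    listen 443 ssl;
    server_name company-software-server.com;        # placeholder

    location / {
        client_max_body_size 0;                     # the change I made: no upload size limit

        # settings I have seen suggested for large/slow LFS uploads (not confirmed on my setup):
        proxy_read_timeout 600s;
        proxy_send_timeout 600s;
        proxy_request_buffering off;

        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto $scheme;

        proxy_pass http://GITLAB_BACKEND;           # placeholder for the GitLab server address
    }
}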
Am I missing something? Why did it try to push way more data than the size of the files? Why could it have failed?
Update: Bypassing NGINX entirely and pushing to a LAN IP over plain HTTP (no encryption) as a test, I get very similar results. It starts out fairly fast, slows down over time, sends 2.3 GB of data (again, far more than the size of both files), and then fails.
Update 2: I can confirm that LFS is enabled in the web interface settings of my repo on my GitLab server. It looks like it was enabled by default.
Update 3: I read somewhere that LFS defaults to HTTPS. My guess is that when I bypass NGINX (which provides SSL) locally and tell git to use HTTP, LFS is still trying to use HTTPS and failing. Over the proxied connection, there may be a header or something else that NGINX isn't handling correctly for LFS. Any idea what it could be? The inflated size in the push output is apparently caused by retries after the failures and is expected.
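If the HTTP-vs-HTTPS guess is right, I assume the LAN test would need the LFS endpoint pinned to plain HTTP explicitly, something like this (lfs.url is a standard git-lfs config key; GITLAB_LAN_IP is a placeholder, and I have not verified that this changes anything):

git config lfs.url "http://GITLAB_LAN_IP/tal/lfs-test-repo.git/info/lfs"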
Thanks