We’re trying to migrate a number of repositories from Bitbucket Cloud to GitLab.com. The repos themselves are not too big (200–400MB) and most complete without a problem. But two of them have large LFS data (38GB and 17GB), and these are proving problematic.
The first time I tried the imports, they all completed, but many LFS objects were missing. My guess was that only the objects referenced at HEAD were included, but I wasn’t sure.
Question 1: Is it expected that the Bitbucket Cloud importer would import all LFS objects?
I have since invited users to the GitLab.com group, in the hope of getting the Merge Request authors to line up. I also purchased 40GB of extra storage to accommodate all the LFS data (if I can somehow get it there).
Now I’ve deleted those repos and tried to import again, but the ones with large LFS data seem to stay in the “importing” state indefinitely. The GitLab usage page shows one of them as consuming 7.7GB, but this is not increasing (it’s been running overnight). I’ve tried a few times, deleting the project and starting again each time, using both the Bitbucket Cloud and the From URL importers, and they always get stuck, with a varying data size shown on the usage page.
Question 2: How can I move forward with this? Is there any diagnostic information available on GitLab.com? I’ve tried hitting the import status API endpoint, but that doesn’t give any more information — just that the import is “started”, with apparently no errors.
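For reference, this is roughly how I’m polling the import status (the project ID and token here are placeholders, not our real values):

```shell
# Query the import status of a project on GitLab.com.
# 12345 is a placeholder project ID; $GITLAB_TOKEN is a personal access token
# with read_api scope.
curl --header "PRIVATE-TOKEN: $GITLAB_TOKEN" \
     "https://gitlab.com/api/v4/projects/12345/import"

# The response includes an "import_status" field ("scheduled", "started",
# "finished", "failed") and an "import_error" field, which in my case is null.
```

In our case the response just reports `"import_status": "started"` with no error, so it doesn’t tell us whether the LFS transfer is stalled or merely slow.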
We are currently on the free tier and intend to upgrade to Premium, but we are reluctant to do so if we can’t even get past the import stage.
Any ideas how we could proceed?
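In case it’s relevant, the fallback I’m considering is bypassing the importer for the LFS data entirely: import (or push) the Git history first, then mirror the LFS objects manually. A sketch of what I have in mind, assuming the GitLab project already exists and the URLs below are stand-ins for our real repos:

```shell
# Mirror-clone the repo from Bitbucket (Git data only at this point).
git clone --mirror https://bitbucket.org/myorg/myrepo.git
cd myrepo.git

# Fetch every LFS object for every ref, not just those referenced at HEAD.
git lfs fetch --all origin

# Push all branches/tags to GitLab, then push all LFS objects explicitly.
git remote add gitlab https://gitlab.com/mygroup/myrepo.git
git push --mirror gitlab
git lfs push --all gitlab
```

My hope is that `git lfs fetch --all` / `git lfs push --all` would cover the objects the importer is missing, though I don’t know whether this avoids whatever is causing the hang, and it wouldn’t bring over the Merge Requests the way the Bitbucket Cloud importer does.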