Can't Use S3 Presigned URLs for Project Imports Anymore

We’re moving to a company-wide GitLab.com instance this fall. As we start preparing for the migration, I’ve run into an issue with the GitLab import API. Previously, we migrated from a temporary self-hosted GitLab instance to our current self-hosted instance, and I was able to write some scripts to handle the export and import for me. However, it looks like these APIs have changed since I last used them.

I am looking to use a pre-signed AWS S3 URL pointing to an exported project tar.gz file, something like the sketch below. Doing it this way is secure and lets the target GitLab instance download the file from a private S3 bucket without any additional authentication. The problem is that the docs no longer seem to offer a pre-signed URL option. Instead, they appear to want AWS credentials passed in the API call, which seems highly insecure and far from optimal. I have tried the other import endpoints, but I get a message telling me to use the projects/remote-import-s3 endpoint.
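For reference, this is roughly what I have in mind. The bucket name, object key, and region here are placeholders for our real values:

```python
import boto3

# Sketch: generate a pre-signed GET URL for the exported archive in a
# private bucket. Credentials come from the usual AWS config/env vars.
s3 = boto3.client("s3", region_name="us-east-1")

presigned_url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "my-export-bucket", "Key": "exports/my-project.tar.gz"},
    ExpiresIn=3600,  # URL stays valid for one hour
)
print(presigned_url)
```

I would then hand that URL to the import API, so no AWS credentials ever leave our side.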

My question is this: is there still a way to use a pre-signed AWS S3 URL with the new projects/remote-import-s3 endpoint? The docs don’t indicate that there is. Here’s a link to the docs: Project import/export API | GitLab.

Hi,

From what I see in the docs, there is a Python example that uses a pre-signed URL.

Did you try doing it that way? The curl example seems to work differently, requiring tokens.

I did do a Google search for GitLab and pre-signed S3 URLs, and the issues that have been opened on this in the past seem to suggest that pre-signed URLs work fine.

I don’t use S3 so I can’t test it myself, but perhaps it’s worth trying that Python script example. From memory, it follows the pattern sketched below.
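Something along these lines; this is untested on my end, and the endpoint path, host, token, and parameter names are my reading of the docs rather than anything verified, so double-check them there:

```python
import boto3
import requests

# Sketch of the docs' approach: generate a pre-signed URL for the
# export archive, then hand only that URL to GitLab -- no AWS
# credentials are sent in the API call itself.
s3 = boto3.client("s3", region_name="us-east-1")
file_url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "my-export-bucket", "Key": "exports/my-project.tar.gz"},
)

# Placeholder host, token, and target project path.
r = requests.post(
    "https://gitlab.example.com/api/v4/projects/remote-import",
    headers={"PRIVATE-TOKEN": "<your_access_token>"},
    data={"path": "imported-project", "url": file_url},
)
print(r.status_code, r.text)
```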