I am writing a job which generates a couple of large artifacts in a multi-step pipeline, but when it goes to register these artifacts, the job fails with the following:
Uploading artifacts…
artifacts: found 14 matching files
WARNING: Uploading artifacts to coordinator… failed id=835195188 responseStatus=500 Internal Server Error status=500 Internal Server Error token=18VbH-TN
WARNING: Retrying… error=invalid argument
WARNING: Uploading artifacts to coordinator… failed id=835195188 responseStatus=500 Internal Server Error status=500 Internal Server Error token=18VbH-TN
WARNING: Retrying… error=invalid argument
WARNING: Uploading artifacts to coordinator… failed id=835195188 responseStatus=500 Internal Server Error status=500 Internal Server Error token=18VbH-TN
FATAL: invalid argument
ERROR: Job failed: command terminated with exit code 1
We are using gitlab.com w/ our own runners.
GitLab Enterprise Edition 13.6.0-pre 61ba1c10173
GitLab Runners are version 12.5.
@herriojr did you resolve this? I’m running into the same issue with gitlab-ce (using local S3 via MinIO).
@bill.evans I found out there is a size limit on the default artifact storage. I ended up having to upload to S3 myself. I don’t know if they support different backends for artifacts, but I wish they did so I could keep the same CI code regardless and just store everything on my own S3 instance, since we run a self-hosted runner.
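In case it helps anyone else, the workaround ends up looking roughly like this in .gitlab-ci.yml (an untested sketch: the job name, bucket, and file paths are placeholders, and it assumes the runner image has the AWS CLI installed with credentials provided via masked CI/CD variables):

```yaml
build_os:
  stage: build
  script:
    - ./build.sh                                   # placeholder for the real build; produces out/system.img
    # Push the big output straight to our own bucket instead of registering
    # it as a GitLab artifact (which is what hits the size limit).
    - aws s3 cp out/system.img "s3://my-build-artifacts/$CI_PIPELINE_ID/system.img"
  artifacts:
    # Keep only small files as regular GitLab artifacts.
    paths:
      - out/build.log
    expire_in: 1 week
```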
Thank you, that’s an interesting thought.
From Continuous Integration and Deployment Admin settings | GitLab, the default (if not set) is 100MB… and the artifacts I’m trying to store on my on-prem local S3 instance (running the MinIO Docker image) are definitely under that limit. I’ll play with the setting anyway, in case the default isn’t what the docs say it is.
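For anyone else on a self-hosted instance who lands here later: besides the Admin Area UI, the instance-wide limit can apparently also be changed through the application settings API. A rough sketch, where the token and hostname are placeholders and the value is in MB:

```shell
# Sketch: raise the instance-wide maximum artifact size to 2 GB (value is in MB).
# <admin-token> and gitlab.example.com are placeholders.
curl --request PUT --header "PRIVATE-TOKEN: <admin-token>" \
  "https://gitlab.example.com/api/v4/application/settings?max_artifacts_size=2048"
```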
Yeah, we use GitLab SaaS, not a fully self-hosted instance, so I can’t configure the maximum artifact size. We just have a few self-hosted runners, hence my wanting to be able to configure where artifacts get uploaded, given that our runners are in our own VPC and would have access to S3 from there.
They claim the artifact max size on gitlab.com is 1GB, so I wouldn’t have thought that would be a constraint for you. Perhaps we’re no closer to solving this puzzle…
Hi Bill, yeah, no, I’m building full Android OSes, which exceed the 1GB limit.
I happened to see a related answer on Stack Overflow suggesting that the max-size setting in GitLab may not be sufficient: you may also need to increase the maximum attachment/body size on any reverse proxy (e.g., nginx) in front of it. I recognize you worked through this already, but in case it’s relevant:
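For nginx specifically, the setting in question is client_max_body_size. A rough sketch of the kind of change being described (values illustrative, not taken from the answer):

```nginx
# Reverse proxy in front of GitLab: raise the upload size limit so large
# artifact uploads aren't rejected (nginx's default client_max_body_size is 1m).
server {
    # ... existing proxy_pass configuration ...
    client_max_body_size 2048m;
}
```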
Hi Bill, that solution appears to be for self-hosted instances and not SaaS, so it would not apply in my case. Thanks!