Uploading artifacts to coordinator... too large archive

Hi, I have GitLab CE running on Kubernetes with many projects, all of which upload artifacts. For some reason, one project keeps failing with "too large archive" when uploading artifacts to the coordinator during its CI pipeline.

I exec'd into the runner and ran du -h on the artifact directory (the_dir), and the total came out to 693MB. The maximum artifact upload size is set to 1024MB, so the upload is clearly under the limit, but it still fails.
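
In case it matters, here is roughly how I measured it (the pod name is omitted); I'm assuming the zipped archive the runner actually uploads is no bigger than the raw directory:

# from a machine with cluster access; <runner-pod> is a placeholder
kubectl exec -it <runner-pod> -- sh

# inside the pod: the last line of du's output is the directory total
du -h the_dir        # came out to 693MB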

GitLab was installed via Helm, chart version 12.10.1.

I'm not sure what the issue is; if anyone could help, I would be forever thankful.

Been facing the same issue for the past few days. Did you manage to solve it?

We have a self-hosted instance with several projects. We’ve been getting status code 413 (Request Entity Too Large) for pushes and artifact uploads. An excerpt from the logs of a failed build:

I tried:

  • Increasing Admin Area > Settings > CI/CD > Max Artifacts Size;
  • Increasing the repository size limit inside each repo;
  • Editing gitlab.rb (with sudo; see the sketch after this list) and:
    1. increasing nginx['client_max_body_size'] or setting it to 0;
    2. running gitlab-ctl reconfigure;
    3. and/or gitlab-ctl restart nginx.
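
Roughly what the gitlab.rb change looked like - just a sketch, the value is only an example:

# /etc/gitlab/gitlab.rb
nginx['client_max_body_size'] = '2048m'   # or 0 to remove the nginx limit entirely

followed by sudo gitlab-ctl reconfigure and, when that alone didn't seem to apply it, sudo gitlab-ctl restart nginx.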

Screenshot from /var/opt/gitlab/nginx/conf/gitlab-http.conf after gitlab-ctl reconfigure: [screenshot]

I looked all over the web for an answer, but to no avail.

Did anybody ever find a solution for this? I’m facing the same issue, and no settings seem to be able to fix it.

We are facing this same problem. Was anyone able to fix it?

We solved this for our infrastructure setup.

For anyone who stumbles upon this, here’s what might help:

  1. Firstly, do the steps that GitLab's own documentation tells you to do: increase the Max Artifacts Size for your repository/group/organisation.
  2. Increase Nginx's client_max_body_size to the desired value or set it to 0, then restart the GitLab service. The command for this depends on your setup - whether it is Omnibus, Kubernetes or something else.
  3. Thirdly, and most importantly if the above two things don't work: check any other proxy servers that might sit between your GitLab Runner (which is uploading the artifacts) and your GitLab instance/object storage. In our case, since it's a Kubernetes setup, we had an Nginx Ingress, so we needed to increase the max body size there as well (see the sketch below).
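
For point 3, the fix in our case was essentially raising the proxy body size on the Nginx Ingress in front of GitLab. A rough sketch for an ingress-nginx controller (the annotation below is specific to ingress-nginx; "0" disables the limit):

metadata:
  annotations:
    nginx.ingress.kubernetes.io/proxy-body-size: "0"

If GitLab itself was installed from its Helm chart, the same annotation can usually be passed in through the chart's ingress annotation values rather than editing the Ingress object by hand.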

We've tried everything at our location, and nothing worked. It turns out the problem was Cloudflare: the free plan limits you to 100MB requests. The solution for us was either paying a lot for the Professional plan, which would still be too small at 500MB, or collapsing multiple jobs into a single job so that no artifacts had to be passed around. We opted for the latter.
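
For anyone wondering what "collapsing jobs" looks like in practice, here is a simplified .gitlab-ci.yml sketch (the job name and make targets are made up): instead of a build job uploading artifacts for a later test job to download through Cloudflare, both steps run inside one job, so nothing large ever passes through the proxy.

# one combined job: no artifacts are passed between jobs, so nothing goes through Cloudflare
build_and_test:
  stage: test
  script:
    - make build
    - make test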

Yep, this is one of the problems with Cloudflare, but not the only one. Cloudflare claims to be a CDN and compares itself to the likes of CloudFront and Akamai, yet it only allows CDN capability for HTML/CSS/JS and images - and be careful even with images, because a site that serves a lot of image traffic has ended up blocked unless it took the Enterprise plan. Any other content outside of HTML/CSS/JS that they consider excessive will cause you problems. I don't consider them a true CDN if they pick and choose what content you can and cannot have on your site. That means even if you want to provide perfectly legal downloads for your clients, you can still fall foul of their T&Cs, because they decide what counts as CDN content. You can even ask on their forums, like I did when I wanted to provide downloads (exe, zip), and the replies automatically assume you are asking because you want to do something illegal - a bad look when someone wants to do something legal: https://community.cloudflare.com/t/terms-of-service-point-2-8/353593/6. If you pay Cloudflare a lot of money for the Enterprise plan they will let you do what you want, but otherwise their remaining plans are highly restrictive. They don't care about the little guy; if they did, they would have had my business. All of the above is easy enough to find with a quick search, so these are not wild claims.

That aside, you can configure GitLab to use a CDN:

### Rails asset / CDN host
###! Defines a url for a host/cdn to use for the Rails assets
###! Docs: https://docs.gitlab.com/omnibus/settings/configuration.html#set-a-content-delivery-network-url
# gitlab_rails['cdn_host'] = 'https://mycdnsubdomain.fictional-cdn.com'

I use BunnyCDN - not with GitLab, but it would be workable with the configuration above if I provided my own URL. That way any assets/artifacts would be uploaded/downloaded via the CDN and therefore bypass Cloudflare.
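
A minimal sketch of what that might look like in gitlab.rb - the hostname below is a made-up BunnyCDN pull-zone URL, not something from this thread:

gitlab_rails['cdn_host'] = 'https://my-gitlab.b-cdn.net'

followed by sudo gitlab-ctl reconfigure.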

There are plenty of other CDNs to choose from that don't take the same attitude and are far less restrictive than Cloudflare. I only see Cloudflare as useful as an extra security layer for your server and for basic web caching of HTML/CSS/JS. Maybe if they change their attitude I'll change my mind, but until then there are better alternatives, especially for those who simply cannot afford an Enterprise plan for a small site providing downloads or their own personal GitLab.