What are the resource limits for CI jobs on GitLab.com? RAM? CPUs?

I am running a Python script in CI on GitLab.com.
What are the resource limits for CI jobs on GitLab.com?

  • max RAM?
  • max CPUs?
  • max disk space?

Hi @westshawn and welcome to the GitLab Community Forums.

If you’re using the GitLab.com autoscaling shared runners, I believe the runners have 1 CPU, 4 GB RAM, and 22 GB of storage (16 GB available).

You also have the option to install GitLab Runner on your own infrastructure and use it to run your CI jobs. Doing this, there is essentially no resource limit, since it circumvents the limitations of shared resources.
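As a rough sketch, a self-hosted runner's capacity is controlled in its `config.toml`. The names, token, and values below are placeholders, not anything specific to your setup:

```toml
# Illustrative fragment of /etc/gitlab-runner/config.toml -- all values are examples
concurrent = 4               # total jobs this runner host may run at once

[[runners]]
  name = "my-runner"         # hypothetical runner name
  url = "https://gitlab.com"
  token = "REDACTED"         # produced when you register the runner
  executor = "docker"
  limit = 2                  # max concurrent jobs for this runner entry
  [runners.docker]
    image = "ubuntu:latest"
    memory = "4g"            # optional: cap container memory
    cpus = "2"               # optional: cap container CPUs
```

So "no resource limit" really means: the limits become whatever your own hardware and this configuration allow.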


Thank You!!

Hi @gitlab-greg, is there any official documentation confirming these resource limits?

Thx, Martin

All your CI/CD jobs run on n1-standard-1 instances with 3.75GB of RAM, CoreOS and the latest Docker Engine installed. Instances provide 1 vCPU and 25GB of HDD disk space.

Source: https://docs.gitlab.com/ce/user/gitlab_com/index.html#shared-runners


You can also confirm these stats in real time by adding a few commands to a GitLab CI job.

For example:

image: "ubuntu:latest"

check-resources:
  script:
    - free -m | grep -v "Swap" # RAM
    - df -h | grep -E "Filesystem|overlay" # storage
    - lscpu | grep -E "^CPU\(s\)" # CPUs

The output should look something like:

$ free -m | grep -v "Swap"
              total        used        free      shared  buff/cache   available
Mem:           3693         481        2221         199         991        2790
$ df -h | grep -E "Filesystem|overlay"
Filesystem      Size  Used Avail Use% Mounted on
overlay          22G  4.6G   17G  22% /
$ lscpu | grep -E "^CPU\(s\)"
CPU(s):                          1
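Since the original question is about a Python script, a sketch of reading the same numbers from inside Python itself (assumes a Linux runner, where `SC_PHYS_PAGES` is available via `os.sysconf`; the reported values are whatever the current machine has, not GitLab.com's fixed limits):

```python
import os
import shutil

cpus = os.cpu_count()              # logical CPU count
disk = shutil.disk_usage("/")      # total/used/free bytes for the root filesystem

# Total physical RAM in MiB via POSIX sysconf (Linux-specific keys)
page_size = os.sysconf("SC_PAGE_SIZE")
phys_pages = os.sysconf("SC_PHYS_PAGES")
ram_mib = page_size * phys_pages // (1024 * 1024)

print(f"CPUs: {cpus}")
print(f"RAM:  {ram_mib} MiB")
print(f"Disk: {disk.total / 1024**3:.1f} GiB total, {disk.free / 1024**3:.1f} GiB free")
```

Running this as the job's script gives output comparable to the `free`/`df`/`lscpu` commands above, without any extra shell steps.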

I thought we could use half of an n2-standard-2 instance now, but that was not correct :smile: