Hi everyone,
I’ve got a CI script that uses Python and at some point generates data that is written to disk. The data is highly hierarchical and produces a lot of nested directories and files.
Everything runs in the python:latest Docker image on GitLab's Linux shared runners.
One night, the CI job failed with an "OSError: [Errno 36] File name too long" error.
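If it helps, here is a minimal sketch (hypothetical names, just to illustrate) of the kind of thing that seems to trigger the same error on Linux, as soon as a single path component gets too long:

```python
# Minimal sketch: on Linux, a single path component longer than NAME_MAX
# (commonly 255 bytes on ext4 and overlayfs) raises errno 36 (ENAMETOOLONG).
import os
import tempfile

with tempfile.TemporaryDirectory() as tmp:
    long_name = "x" * 300  # one directory name, longer than the usual 255-byte limit
    try:
        os.mkdir(os.path.join(tmp, long_name))
    except OSError as exc:
        print(exc.errno, exc.strerror)  # prints: 36 File name too long
```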
I was wondering if somebody knows what the file name and path length limits are on GitLab shared runners. Ideally, it would be great if somebody knows a workaround for this kind of problem.
I was assuming that GitLab was using a file system such as ext4 (or something similar), which I thought would allow very long file names and paths.
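To see what the runner actually reports, I was planning to run something like this inside the job (just a sketch; the working directory is an assumption about where the generated data ends up):

```python
# Sketch: ask the filesystem under the job's working directory for its limits.
import os

workdir = os.getcwd()  # assumption: the generated data lives under the job's working directory
print("NAME_MAX:", os.pathconf(workdir, "PC_NAME_MAX"))  # max bytes in a single file/directory name
print("PATH_MAX:", os.pathconf(workdir, "PC_PATH_MAX"))  # max bytes in a full path
```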
Any ideas?
Thanks to all!