Pipeline hits timeout due to Gulp 'never finishing'?

Problem to solve

I run npm in the pipeline and all is good: I generate a folder (public) for the pipeline to zip and ship to the server. But the whole pipeline stalls on the gulp step. I’ve simplified the entire gulp process down to simply moving a single file into the public folder, just to try and debug the damn thing … but ALL I get is three blinking dots, until eventually this:

$ npm run build

phoenix@1.0.0 build
gulp
[11:39:11] Using gulpfile ~/ … /gulpfile.js
[11:39:11] Starting 'default'...
[11:39:11] Starting 'worker'...
Move Worker.js to public/assets/js
[11:39:13] Finished 'worker' after 2.38 s
[11:39:13] Finished 'default' after 2.38 s
< HERE I STARE AT THREE BLINKING DOTS FOR ABOUT 12 MORE MINUTES >
WARNING: step_script could not run to completion because the timeout was exceeded. For more control over job and script timeouts see: Configuring runners | GitLab
ERROR: Job failed: execution took longer than 15m0s seconds

Notice that the console.log from the gulp move_worker function gets printed just fine, so the function is called. But after that the pipeline just hangs until the timeout.

This is the content of my gulp files:

[ FILE gulpfile.js ]

const gulp = require('gulp');

function worker ( done ) {
    require('./gulp/tasks/worker.js');
    done();
}

exports.default = gulp.series(
    worker
);

[ FILE gulp/tasks/worker.js ]

const gulp = require('gulp');

let worker_path = 'src/modules/service-worker/worker.js',
    output_path = 'public/assets/js';

function move_worker () {
    console.log( 'Move Worker.js to public/assets/js' );
    return gulp.src( worker_path )
        .pipe( gulp.dest( output_path ));
}

exports.default = move_worker();

[ FILE package.json ]
( this is the script run from the pipeline )

...
"scripts": {
    "build": "gulp"
}
...

Steps to reproduce

I have fiddled with the paths to provoke a ‘wrong path’ error, and that works as expected (changing any of the paths throws the expected errors), so the paths are not the issue. I suspect I have to do something ‘pipeline’ or ‘Docker’ specific to return the ‘right thing’ to the pipeline, but I wouldn’t have a clue what that would be.
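
For what it’s worth, this is what I plan to try next: restructure the task so the stream itself tells gulp (and Node) when the copy has finished, instead of calling done() right away and exporting the result of move_worker() rather than the function. Just a sketch of my next attempt (collapsed into a single gulpfile for clarity), not a confirmed fix:

[ SKETCH gulpfile.js ]

const gulp = require('gulp');

// Same paths as in gulp/tasks/worker.js
const worker_path = 'src/modules/service-worker/worker.js';
const output_path = 'public/assets/js';

function worker () {
    console.log( 'Move Worker.js to public/assets/js' );
    // Returning the stream lets gulp wait for the copy to actually finish
    // before it marks 'worker' (and 'default') as done.
    return gulp.src( worker_path )
        .pipe( gulp.dest( output_path ));
}

exports.worker = worker;
exports.default = gulp.series( worker );

The idea being that gulp then has a real stream to wait on, and nothing runs as a side effect of a require().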

.gitlab-ci.yml – Configuration

(partial – it works just fine without the gulp step)

script:
  - set +e
  - npm run build # The build step produces a zip at dist/archive.zip – this works when I manually create the public folder
  - curl -u $ARTIFACTORY_USER:$ARTIFACTORY_KEY -T "dist/archive.zip" https: ... left out # it's then moved to the server

artifacts:
  paths:
    - dist/archive.zip

The above configuration works just fine when I manually (in the pipeline) create the public folder using this, and skip the gulp build step:

mkdir public && touch public/file.txt

So it all basically boils down to what’s going on in the gulp files … as far as I can reason.
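
If anyone is debugging the same thing: the next step on my list is to check whether something keeps the Node process alive after the tasks finish, using the third-party why-is-node-running package. Treat this purely as a debugging sketch under that assumption, not a known fix:

[ SKETCH gulpfile.js – log whatever handles keep the process alive ]

// npm install --save-dev why-is-node-running
const log = require('why-is-node-running'); // the package suggests requiring it first
const gulp = require('gulp');

function worker () {
    return gulp.src( 'src/modules/service-worker/worker.js' )
        .pipe( gulp.dest( 'public/assets/js' ));
}

function report ( done ) {
    // Give the streams a moment to close, then dump any handles that are
    // still open and therefore keeping gulp from exiting.
    setTimeout( function () {
        log();
        done();
    }, 5000 );
}

exports.default = gulp.series( worker, report );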

Anyone have ANY hint about what it is that I’m totally missing … apart from GitLab pipelines being a complete black box to work with?

Hi mate, did you ever resolve this? I have been getting the same thing for a while and can’t see why, only that it’s gulp. If I re-run the pipeline, sometimes it works and sometimes it takes a few retries to get through.