I have a legacy system that uses Control-M to run a Windows .bat file, which runs for around 2 hours to build the required artifacts for the production application. I am using on-premises GitLab for SCM. How can I develop a GitLab pipeline using .gitlab-ci.yml to run this long-running batch file?
I have no experience with runners on Windows hosts, so I can't help with that part, but for a project you can go to Settings > CI/CD > General pipelines and set the job timeout. I believe the default is one hour, which will not work if you have to run a job that takes around two hours. You can also set a timeout per job with the `timeout:` keyword in .gitlab-ci.yml.
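A minimal sketch of such a job, assuming a Windows runner registered with the shell executor and tagged `windows`; `build.bat`, the `output/` directory, and the tag name are placeholders for your actual setup:

```yaml
build_artifacts:
  stage: build
  tags:
    - windows          # route the job to the Windows runner
  timeout: 3h          # per-job timeout, overrides the project default
  script:
    - .\build.bat      # the existing long-running batch file
  artifacts:
    paths:
      - output/        # adjust to wherever the batch file writes
    expire_in: 1 week
```

The `timeout:` keyword caps this single job; the project-level setting in Settings > CI/CD still acts as the upper bound for the whole pipeline.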
I would have the pipeline only trigger the build and not wait for the result inside the job, e.g. by writing a lock file on the host; a separate daemon watches the lock file and starts the build when it sees it being updated. A raised timeout may work, but it blocks the job and the runner queue, depending on how many pipelines are triggered in parallel.
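A minimal sketch of that decoupling, assuming a lock-file path shared between the runner and the build host; `build.trigger`, `request_build`, and `poll_for_build` are hypothetical names for illustration:

```python
from pathlib import Path

# Hypothetical path shared between the CI job and the build daemon.
LOCK_FILE = Path("build.trigger")

def request_build(commit_sha: str) -> None:
    """CI job side: record which commit to build and return immediately,
    so the pipeline job finishes in seconds instead of hours."""
    LOCK_FILE.write_text(commit_sha)

def poll_for_build(last_mtime: float) -> tuple[bool, float]:
    """Daemon side: report whether the trigger file changed since the
    last check, based on its modification time."""
    if not LOCK_FILE.exists():
        return False, last_mtime
    mtime = LOCK_FILE.stat().st_mtime
    return mtime > last_mtime, mtime
```

The daemon would call `poll_for_build` in a loop (or use a filesystem watcher) and launch the two-hour batch file itself, outside any CI job timeout.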
Once the batch file on Windows completes, it can send the artifacts back to GitLab using the REST API, attaching the binaries and test results as a new comment on the commit or merge request. Alternatively, use the generic package registry to store the artifacts as objects and only link to them.
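For the generic package registry route, the upload is a single `PUT` to a well-known URL. A small sketch that builds that URL, assuming an on-premises instance at a placeholder address; the package name and version shown are examples:

```python
from urllib.parse import quote

def generic_package_url(base_url: str, project_id: int,
                        package: str, version: str, filename: str) -> str:
    """Build the upload URL for GitLab's generic package registry.
    PUT the file body to this URL with a PRIVATE-TOKEN (or, inside a
    pipeline, a JOB-TOKEN) header to publish the artifact."""
    return (f"{base_url}/api/v4/projects/{project_id}"
            f"/packages/generic/{quote(package, safe='')}"
            f"/{quote(version, safe='')}/{quote(filename, safe='')}")
```

From the Windows host this maps to e.g. `curl --header "PRIVATE-TOKEN: <token>" --upload-file app.zip <url>`; the resulting package page gives you a stable link to attach to the comment.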
Overall, if the artifact's file size is too large, consider external storage: tag the artifacts there and deploy from that location in the next step. Or go back to raising the timeout, if that works for you.