We had a working CI testing pipeline on gitlab.com that reported our unit test results and code coverage properly in our merge requests, set up using something like the following:
pytest:
  stage: test
  before_script:
    - pip install numpy scipy pandas xarray pytest pytest-cov netCDF4
  script:
    - pytest --cov --cov-report term --cov-report xml:coverage.xml --junitxml=report.xml --cov-config=.coveragerc
  coverage: '/(?i)total.*? (100(?:\.0+)?\%|[1-9]?\d(?:\.\d+)?\%)$/'
  artifacts:
    when: always
    paths:
      - report.xml
    reports:
      junit: report.xml
      coverage_report:
        coverage_format: cobertura
        path: coverage.xml
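(As an aside, the coverage: regex just pulls the percentage off the TOTAL line that pytest-cov prints in the job log. The module names and figures below are placeholders, purely to illustrate what the regex matches:)

Name                Stmts   Miss  Cover
---------------------------------------
ourpkg/__init__.py      3      0   100%
ourpkg/core.py        120     10    92%
---------------------------------------
TOTAL                 123     10    92%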
We recently added a more detailed, data-based reporting CI job that uses Quarto notebooks to run and compare different versions of our repo. Because this takes a fair bit of processing, we only want it to run on merge requests, so we set it up like the following:
sensitivity_report:
  stage: deploy
  rules:
    - if: $CI_PIPELINE_SOURCE == 'merge_request_event'
  before_script:
    # snipped code installing quarto and associated libraries
  script:
    # snipped code using quarto to render a particular notebook
  artifacts:
    name: ${CI_COMMIT_REF_NAME}_sensitivity_report
    paths:
      - ${CI_COMMIT_REF_NAME}_sensitivity_report.*
Now the pytest job is running happily on any commit, and sensitivity_report is only running on merge requests. Everything seems good, except that the merge request Overview page now shows only the outcome of the sensitivity report; the unit test pass rate and code coverage are no longer displayed. Before we added the sensitivity report job, the Overview page showed the test summary and coverage as expected. Both jobs are still being run, and both appear in the pipelines attached to the commit itself.
It feels like, because the merge request is splitting the jobs into two separate pipelines (a branch pipeline for the commit and a merge request pipeline for the MR), GitLab only looks in the merge request pipeline for test results; since pytest runs in the branch pipeline, its results never make it into the MR. This isn't great, because we want it to be clear in our MR whether the tests are passing and whether our code coverage is adequate.
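One idea I've come across but haven't tried yet is to use workflow:rules to switch between branch and merge request pipelines, so that a push to a branch with an open MR creates only the merge request pipeline, and pytest (which has no rules of its own) runs there alongside sensitivity_report. A sketch, adapted from the pattern in the GitLab docs:

# Untested sketch: prefer a merge request pipeline when an MR is open,
# suppress the duplicate branch pipeline, and fall back to a branch
# pipeline for commits with no open MR.
workflow:
  rules:
    - if: $CI_PIPELINE_SOURCE == 'merge_request_event'
    - if: $CI_COMMIT_BRANCH && $CI_OPEN_MERGE_REQUESTS
      when: never
    - if: $CI_COMMIT_BRANCH

If that works the way I understand it, the JUnit and coverage artifacts from pytest would land in the same pipeline the MR widget reads, but I'm not sure it's the intended fix.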
I would appreciate any ideas on how I can get our merge request Overview page to collect these pytest results and show them like it did before we added the sensitivity report job.