Upload external logs to pipeline results?

We …:

  • use third-party software on our local server to do builds.
  • have installed and registered a GL runner on that server.
  • can successfully call the local build script from a pipeline inside GL.

The third-party build software generates log files. It would be nice to access them via the GL UI when the builds fail, rather than log into our local server and track down the logs for the failed job.

I tried registering them as artifacts…:

Is there a “better” way to register log files with dynamic names and make them easily accessible / visible from the pipeline results log? I feel like I’m applying a “square peg to a round hole” method.

Hello @cburgwork :wave:,

GitLab allows you to define artifacts in your .gitlab-ci.yml file, which can include your log files.

If you’ve defined the path of your log files correctly and it still says “No artifacts found”, I suspect there’s an issue with the artifact path configuration.

Here’s an example of how you might define your artifacts:

job:
  script: <your script>
  artifacts:
    paths:
      - path/to/your/logs/*.log

In this example, path/to/your/logs/*.log should be replaced with the path to your log files. The *.log part is a wildcard that will match any file ending in .log.

Since you’re using dynamic log filenames, you’ll probably want to use a * wildcard somewhere in the path.
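One thing to watch out for: by default, artifacts are only uploaded when the job succeeds. Since you mainly want the logs when builds fail, you can set artifacts:when to always (or on_failure) so they’re uploaded either way. A sketch (the expire_in value is just an example, tune it to taste):

job:
  script: <your script>
  artifacts:
    when: always           # upload artifacts even if the job fails
    paths:
      - path/to/your/logs/*.log
    expire_in: 1 week      # optional: how long GitLab keeps the artifacts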

Alternatively, if you want to save any/all *.log files created during the pipeline, you could do something like:

job:
  script: <your script>
  artifacts:
    paths:
      - "**/*.log"

That should save all *.log files created during the CI job as an artifact, regardless of where the *.log files are stored. (Note the quotes: an unquoted value starting with * is invalid YAML.)

Once you’ve defined your artifacts, they should appear in the GitLab UI under the “Job Artifacts” section of the job detail page. You can then click on the “Browse” button to view the files directly in your browser.

If your log files are still not showing up as expected, make sure that:

  1. The paths to the log files are correct.
  2. The log files are being generated before the job finishes.
  3. The job in which you’re defining the artifacts is the job that’s generating the logs.

If you can’t figure out where the log files are going, you might consider adding an after_script:

job:
  script: <your script>
  after_script:
    - ls -al **/*.log
    - find . -type f -name "*.log"
  artifacts:
    paths:
      - "**/*.log"

I hope this helps! Let us know how it goes, and if you have any other questions.

Greg, thanks for your detailed reply.

The main issue is that the artifacts are available from the job page but not the pipeline page.

On the pipeline page, it says “No artifacts found”:

On the job page, you can see in the pipeline log that it found the artifacts:

After clicking the Download button on the job page, it downloads the artifacts:

Why are the artifacts only available on the job page and not the pipeline page?

Also, a suggestion for a possible improvement: make the log contents visible directly in the UI, instead of having to download them and view them externally.
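For viewing logs without downloading, one option that might partially cover this (assuming your pipelines run for merge requests) is the artifacts:expose_as keyword, which links selected artifacts directly in the merge request widget. A sketch:

job:
  script: <your script>
  artifacts:
    expose_as: 'build logs'   # label shown in the merge request widget
    paths:
      - path/to/your/logs/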
