Getting 400 Bad Request while uploading artifacts to an S3 bucket in a self-hosted GitLab setup

I am evaluating self-hosted GitLab on AWS. Earlier I was using MinIO for all backend storage; now I am using external storage (AWS S3 buckets) for artifacts, cache, backups, etc. When I run a sample .gitlab-ci.yml file that simply uploads an artifact file, I get the error below (attached screenshot).

However, I can see the artifact file in the S3 bucket. Below are the logs from the workhorse container:

{"content_type":"application/json","correlation_id":"01FRHH0D564FYNKTEBBTRRGD19","duration_ms":13167,"host":"gitlab.127.0.0.1.nip.io","level":"info","method":"POST","msg":"access","proto":"HTTP/1.1","referrer":"","remote_addr":"10.42.0.13:41844","remote_ip":"10.42.0.13","route":"^/api/v4/jobs/[0-9]+/artifacts\\z","status":400,"system":"http","time":"2022-01-04T03:09:44Z","ttfb_ms":13167,"uri":"/api/v4/jobs/1/artifacts?artifact_format=zip\u0026artifact_type=archive\u0026expire_in=1+week","user_agent":"gitlab-runner 14.6.0 (14-6-stable; go1.13.8; linux/amd64)","written_bytes":1338}

When I revert the changes and use MinIO for all storage, artifact upload works fine. Below is the sample .gitlab-ci.yml file:

image: ubuntu:18.04

stages:
  - build_stage
  - test_stage

build:
  stage: build_stage
  script:
    - echo "building..." >> ./build_result.txt
  artifacts:
    paths:
      - build_result.txt
    expire_in: 1 week

test:
  stage: test_stage
  script:
    - ls
    - cat build_result.txt

External storage config used (values.yaml):

global:
  minio:
    enabled: false
  hosts:
    domain: gok8s.cloud
  registry:
    bucket: dev-registry
  appConfig:
    lfs:
      bucket: dev-lfs
      connection: # https://gitlab.com/gitlab-org/charts/gitlab/blob/master/doc/charts/globals.md#connection
        secret: s3-config
        key: connection
    artifacts:
      bucket: dev-artifacts
      connection:
        secret: s3-config
        key: connection
    uploads:
      bucket: dev-uploads
      connection:
        secret: s3-config
        key: connection
    packages:
      bucket: dev-packages
      connection:
        secret: s3-config
        key: connection
    backups:
      bucket: dev-backups
      tmpBucket: dev-tmp
certmanager-issuer:
  email: raghav@gok8s.cloud
gitlab:
  toolbox:
    backups:
      objectStorage:
        config:
          secret: s3-config
          key: connection
registry:
  storage:
    secret: s3-config
    key: connection

I created the secret s3-config with the appropriate key, its value being my AWS credentials.
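For reference, this is a minimal sketch of what my connection file looks like (the provider/region values and credential placeholders here are illustrative, not my real values):

```yaml
# s3-connection.yaml — fog-style connection referenced by
# global.appConfig.*.connection (secret: s3-config, key: connection)
provider: AWS
region: us-east-1
aws_access_key_id: <ACCESS_KEY_ID>
aws_secret_access_key: <SECRET_ACCESS_KEY>
```

I then created the secret with `kubectl create secret generic s3-config --from-file=connection=s3-connection.yaml`.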

Am I missing any additional configuration? Thanks in advance.