I am also struggling with this.
I was able to proof-of-concept a workaround, which was basically to have the first job of the child pipeline manually download the artifacts (using API calls, which I know the OP wasn’t interested in). This has some complications:
The CI_JOB_TOKEN can download artifacts, but only if you know the job ID (plus the project ID, which is readily available in the child, and the pipeline ID, which can be passed down via a variable from the trigger job). API calls that search for jobs by name within a pipeline are unavailable to the job token, and the call that downloads by ref name fetches the artifacts of the latest successful job – which by definition isn’t the one currently executing.
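For illustration, the child side of this might look roughly like the following sketch. The job name is my own, and PARENT_JOB_ID is assumed to have been injected into this file beforehand (see below) – it is not a variable GitLab provides:

```yaml
# Hypothetical first job of the child pipeline: pull the parent job's
# artifacts via the jobs API using the job token.
fetch-parent-artifacts:
  stage: .pre
  script:
    # PARENT_JOB_ID is assumed to have been substituted into this file
    # by the generator job; CI_API_V4_URL and CI_PROJECT_ID are
    # predefined by GitLab.
    - 'curl --fail --location --header "JOB-TOKEN: ${CI_JOB_TOKEN}" --output artifacts.zip "${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/jobs/${PARENT_JOB_ID}/artifacts"'
    - unzip -o artifacts.zip
```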
My workaround to this complication was to use dynamically generated child pipelines, weaving the CI_JOB_ID into the child pipeline definition in the job that initially created the content, using some sed magic. Note that the child pipeline definition is now also part of the artifacts.
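A minimal sketch of the parent side, assuming a template file child-template.yml containing a __PARENT_JOB_ID__ placeholder (both the file name and the placeholder are my own inventions, not GitLab conventions):

```yaml
# Hypothetical generator job: produce the content, then substitute the
# current job's ID into the child pipeline definition.
generate-child:
  stage: build
  script:
    - make build   # produces the actual content
    # weave this job's ID into the child pipeline via sed
    - sed "s/__PARENT_JOB_ID__/${CI_JOB_ID}/" child-template.yml > child-pipeline.yml
  artifacts:
    paths:
      - child-pipeline.yml
      - build/

# Trigger the dynamically generated child pipeline.
run-child:
  stage: deploy
  trigger:
    include:
      - artifact: child-pipeline.yml
        job: generate-child
    strategy: depend
```

The child pipeline then knows exactly which job ID to fetch artifacts from, rather than guessing by ref name.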
While it worked in test conditions, the real application of this is ugly in my situation, so I ended up dropping the idea altogether. I’m planning to wait until this feature gets supported directly. Maybe this gives someone else an idea, though.
This doesn’t solve the problem of pulling the artifacts back from the child pipeline into the parent. See https://gitlab.com/gitlab-org/gitlab/-/issues/215725, for instance.