X-Robots-Tag response header for static pages

I want to deploy some GitLab Pages which I don't want to be indexed,
notably those under the /.well-known/ path.
I have added a robots.txt, but that only controls crawling, not indexing.
So if a crawler happens upon the pages some other way, it may still index them.

Since the files hosted there, like PGP keys, aren't HTML, I can't add a meta tag like:
<meta name="robots" content="noindex">

Is there a way to specify a directory or set of files for which Netlify would add
X-Robots-Tag: noindex, nofollow to the response headers?

It doesn't look like Google likes indexing .well-known content, but DuckDuckGo has indexed
https://www.facebook.com/.well-known/change-password, so there appears to be
no hard rule against doing it.

I know this isn't a super important issue for OpenPGP keys in particular,
but I think a solution should exist for the more general case.

Never mind, I found this:

I assume it works for GitLab as well.
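For reference, here is a minimal sketch of what such a rule could look like, assuming the Netlify-style `_headers` file format (a plain file in the publish root) is honored; the path pattern is just an illustration for the /.well-known/ case:

```
# _headers — placed in the root of the published site (assumption: Netlify-style syntax)
# Apply the noindex/nofollow directive to everything under /.well-known/
/.well-known/*
  X-Robots-Tag: noindex, nofollow
```

Whether GitLab Pages processes this file the same way Netlify does would need to be verified against the GitLab Pages docs, e.g. by checking the response headers of a deployed page with `curl -I`.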