[14:54:02] well hmm
[14:54:25] because running the Quibble job would need some place to store logs, and I wasn't a fan of uploading them to a local log instance I...
[14:54:31] went to explore S3
[14:54:51] I found out WMCS has learned to expose object storage to us with an S3 API
[14:56:58] eventually I read a bunch of documentation and forged a patch https://gerrit.wikimedia.org/r/c/integration/config/+/1156847
[14:57:19] I generated the secrets for the project "integration/config"
[14:57:45] which is a `config` (aka trusted) project: https://zuul-dev.wmcloud.org/t/wikimedia/projects
[14:57:59] and per the doc I read at https://zuul-dev.wmcloud.org/t/wikimedia/projects
[14:58:16] the secret is only exposed to that job because it is defined in the trusted project
[15:01:41] hashar: to add a little more nuance/clarity: it's exposed only to the playbooks defined in that job.
[15:02:43] so the base job's pre playbook has access to it. the quibble job's run playbook does not have access to it. the base job's post-logs playbook does have access to it.
[15:03:23] in other words, when you inherit a job, you don't change anything about the secret access that your parent job's playbooks already established.
[15:04:03] and in other, other words: secrets are scoped to playbooks, and that scope is determined by job definitions.
[15:06:14] ( end result: https://object.eqiad1.wikimediacloud.org/swift/v1/AUTH_a3598983742448b3b056b5fcb228faa9/artifacts/f3c/wikimedia/f3c765289d8247b98ad4344d483674e1/index.html ) \o/
[15:06:40] \o/
[15:07:12] I am not sure what that AUTH_Xxxxx thing is and whether it is leaking some kind of credentials
[15:09:07] corvus: what I don't get is that if I create a job that has the base job as parent
[15:09:15] would it override the post-run steps?
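A minimal sketch of the secret scoping described above, as it might appear in a trusted config project's zuul.d/jobs.yaml — the secret, job, and playbook names here are illustrative, not the actual integration/config definitions:

```yaml
# Hypothetical trusted-project job definitions (names are made up).
- secret:
    name: log_upload_credentials
    data:
      access_key: !encrypted/pkcs1-oaep [ ... ]
      secret_key: !encrypted/pkcs1-oaep [ ... ]

- job:
    name: base
    # The secret is attached here, in the trusted project, so it is
    # available to the playbooks this job definition declares — e.g.
    # the post-run playbook that uploads logs.
    post-run: playbooks/base/post-logs.yaml
    secrets:
      - log_upload_credentials

- job:
    name: quibble
    parent: base
    # Inheriting does not change the parent's secret scoping: this
    # job's own run playbook has no access to log_upload_credentials,
    # while the inherited base post-logs playbook still does.
    run: playbooks/quibble/run.yaml
```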
[15:09:16] https://gerrit.wikimedia.org/g/integration/config/+/refs/heads/zuul3/zuul.d/jobs.yaml
[15:09:24] that is the base job, which has a post-run to post logs
[15:11:12] the great thing is that upload-logs-s3 worked out of the box
[15:12:01] A job only has a single main playbook, and when inheriting from a parent ... the pre- and post-playbooks are appended and prepended in a nesting fashion.
[15:12:09] https://zuul-ci.org/docs/zuul/latest/config/job.html
[15:34:35] hashar: the AUTH_ thing is fine... it's just the tenant id. i forget why they call it that, but half of opendev's public clouds do the same.
[15:35:37] hashar: which zuul ran that job that uploaded the logs?
[15:38:45] hmm
[15:38:49] oh, that is the dev one
[15:39:12] https://zuul-dev.wmcloud.org/t/wikimedia/build/f3c765289d8247b98ad4344d483674e1
[15:40:53] the recipe uploads both to the docker compose log container: https://zuul-logs.wmcloud.org/f3c/wikimedia/f3c765289d8247b98ad4344d483674e1/
[15:40:58] and to the s3 bucket
[15:41:42] so my guess is that on that zuul-dev I could drop the logs container and replace the base log url with the one exposed by swift
[15:42:17] anyway, that is the end of the week for me. Next week I will attempt to run Quibble
[15:44:09] and I will play with zuul.projects / job.required-projects | https://zuul-ci.org/docs/zuul/latest/job-content.html#var-zuul.projects
[15:47:24] moar sailing over the week-end :]
[17:42:21] neat that artifact upload works. Have a nice sailing weekend hasharAway :)
[17:42:58] i see zuul-web's links to those artifacts are missing the AUTH_XXX thing, so getting a 404 there. That's an oddity.
[21:56:35] yeah, it's probably because we're using the s3 api instead of the swift api. i think we get the correct url automatically if we do that.
[21:56:49] however, it's easy to modify the url that gets returned, so we can just do that if we want to keep using s3
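On the missing AUTH_ segment: the fix discussed above amounts to composing the returned log URL so it includes the tenant id. A small sketch, assuming the Swift public-URL layout seen in the working artifact link (the function name and parameters are illustrative, not part of the upload-logs-s3 role):

```python
# Hypothetical helper: build the publicly reachable Swift URL for an
# uploaded artifact so that the AUTH_<tenant> segment is included.
def swift_public_url(endpoint: str, tenant_id: str,
                     container: str, object_path: str) -> str:
    """Compose a Swift public object URL.

    endpoint    - object storage host, e.g. "https://object.eqiad1.wikimediacloud.org"
    tenant_id   - the project/tenant id (the "AUTH_..." segment)
    container   - the bucket/container name
    object_path - per-build path of the object inside the container
    """
    return f"{endpoint}/swift/v1/{tenant_id}/{container}/{object_path}"

# Recomposing the working URL from the build discussed above:
url = swift_public_url(
    "https://object.eqiad1.wikimediacloud.org",
    "AUTH_a3598983742448b3b056b5fcb228faa9",
    "artifacts",
    "f3c/wikimedia/f3c765289d8247b98ad4344d483674e1/index.html",
)
```

Dropping the tenant_id segment from such a URL reproduces the 404 zuul-web hit.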