feat(g-w): reporting artifact size at creation time #8322
Conversation
matt-boris
left a comment
You're going to have to update the String() string method of the TaskArtifact interface across all types that implement it. I think that's what's causing your test failures right now, and they're hard to diagnose because you're getting output like:
--- FAIL: TestFileArtifactWithNames (0.05s)
artifacts_test.go:68: Expected different artifacts to be generated...
Expected:
["S3 Artifact - Name: 'public/build/firefox.exe', Path: '/home/task_177219180842834/taskcluster/workers/generic-worker/testdata/SampleArtifacts/_/X.txt', Expires: 2026-02-27T12:33:32.906Z, Content Encoding: 'gzip', MIME Type: 'text/plain; charset=utf-8'"]
Actual:
["S3 Artifact - Name: 'public/build/firefox.exe', Path: '/home/task_177219180842834/taskcluster/workers/generic-worker/testdata/SampleArtifacts/_/X.txt', Expires: 2026-02-27T12:33:32.906Z, Content Encoding: 'gzip', MIME Type: 'text/plain; charset=utf-8'"]
Here's the method in the interface https://github.com/taskcluster/taskcluster/blob/main/workers/generic-worker/artifacts/artifacts.go#L46-L52
The good thing is, this test is also failing on the insecure engine, so you'll be able to test/validate your fixes locally.
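To illustrate the reviewer's point, here is a minimal, self-contained sketch of why a String() method must include every field the tests compare. The struct and field names below are assumptions for illustration only, not the real generic-worker types; the actual interface lives in workers/generic-worker/artifacts/artifacts.go.

```go
package main

import "fmt"

// Hypothetical, simplified stand-in for an S3 artifact type.
// Field names are illustrative assumptions, not the real struct.
type S3Artifact struct {
	Name            string
	ContentType     string
	ContentEncoding string
	ContentLength   int64
}

// String must render every field that tests compare. If ContentLength
// is omitted here, two artifacts differing only in size produce
// identical strings, and the failure output shows Expected == Actual.
func (a S3Artifact) String() string {
	return fmt.Sprintf(
		"S3 Artifact - Name: %q, Content Encoding: %q, MIME Type: %q, Content Length: %d",
		a.Name, a.ContentEncoding, a.ContentType, a.ContentLength,
	)
}

func main() {
	a := S3Artifact{
		Name:            "public/build/firefox.exe",
		ContentType:     "text/plain; charset=utf-8",
		ContentEncoding: "gzip",
		ContentLength:   12,
	}
	b := a
	b.ContentLength = 34
	// With ContentLength in String(), the two renderings now differ.
	fmt.Println(a.String() != b.String())
}
```

Once String() includes the new field, a size mismatch shows up directly in the Expected/Actual diff instead of printing two identical lines.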
Thanks for the hints! Checking. The outputs are identical indeed.
reference: issue 8101
---
Generic worker now reports artifact file sizes to the queue via the optional `contentLength` parameter when creating artifacts. This applies to all file-based artifacts (S3 and object) as well as log artifacts. The reported size is the original file size on disk, before any encoding such as gzip compression.
This feature relies on the API changes introduced in [v96.7.0](https://github.com/taskcluster/taskcluster/blob/main/CHANGELOG.md#v9670) and will fail schema validation if services are not upgraded.
Thanks for calling this out! 👍🏻
Final request here from me at this point would be could you please look into adding a generic worker integration test for this? Where you create a file in a payload with a known size, then can confirm with the queue after the task runs that it has the correct contentLength set?
Edit: once tests are updated to include ContentLength in the expected artifacts, I think validateArtifacts() helper will be sufficient here! 🙏🏻 Thank you!
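The check the reviewer describes (a payload file of known size, then confirming the queue reports the matching contentLength) can be sketched as a small helper. The Artifact struct and helper name below are hypothetical stand-ins for the real tcqueue types and the validateArtifacts() helper mentioned above:

```go
package main

import "fmt"

// Hypothetical shape for illustration; a real generic-worker test
// would use serviceFactory.Queue and the tcqueue artifact types.
type Artifact struct {
	Name          string
	ContentLength int64
}

// assertContentLength mirrors what a validateArtifacts()-style check
// could do: find the named artifact in the queue's listing and verify
// it reports the known on-disk size of the payload file.
func assertContentLength(artifacts []Artifact, name string, want int64) error {
	for _, a := range artifacts {
		if a.Name == name {
			if a.ContentLength != want {
				return fmt.Errorf("artifact %s: got contentLength %d, want %d", name, a.ContentLength, want)
			}
			return nil
		}
	}
	return fmt.Errorf("artifact %s not found", name)
}

func main() {
	// Simulated queue response for a task that uploaded a 12-byte file.
	got := []Artifact{{Name: "public/build/firefox.exe", ContentLength: 12}}
	fmt.Println(assertContentLength(got, "public/build/firefox.exe", 12)) // <nil>
}
```

Folding the expected ContentLength into the artifacts the test already compares, as the edit suggests, makes a separate helper unnecessary.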
Force-pushed from 755bedc to 8d6f178
Force-pushed from 8d6f178 to 5412956
Generic-worker will send contentLength of the original artifact before compression to the queue.createArtifact
Force-pushed from 5412956 to ca2d791
taskID := submitAndAssert(t, td, payload, "completed", "completed")
queue := serviceFactory.Queue(nil, config.RootURL)
Awesome cleanup here! Thanks so much!
Merging, as all tests passed; we're having an issue with statuses reporting back, likely due to the mozcloud migration work going on.
Generic-worker will send contentLength of the original artifact before compression to the queue.createArtifact
This will be a breaking change for worker deployers, since it depends on the schema validation in services, which needs to be deployed before this.
Second part of #8101