fix: update batch handling to ensure each operation has its own unique idempotency-token #2905
Context

If batch is used to update an object, and a subrequest has an
x-goog-idempotency-token
set on it, that token can be used to short-circuit execution of the request if that token has already been processed (essentially interpreting the request as a retry).

The bug
Previously, we didn't set an idempotency token for batch operations, but the IdempotencyIdInterceptor would still attach one if a token was visible in the thread local.
The fix
Update our handling so that we always set a new unique token for each batch operation, making it immune to any possibly still-visible thread-local value.
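The fix can be sketched roughly as follows. The class and field names here are hypothetical stand-ins, not the library's actual internals: the point is simply that each batch operation mints a fresh token instead of consulting the thread local.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.UUID;

// Hypothetical sketch of the fix: every batch operation gets its own
// freshly minted idempotency token, so a value leaked into the thread
// local can never be reused across operations or batches.
public class BatchIdempotencyTokens {
    // Stand-in for the thread local that IdempotencyIdInterceptor reads;
    // the real field in the library may differ.
    static final ThreadLocal<String> LEAKED_TOKEN = new ThreadLocal<>();

    static List<String> mintTokens(int operationCount) {
        List<String> tokens = new ArrayList<>();
        for (int i = 0; i < operationCount; i++) {
            // Always generate a new token; never consult LEAKED_TOKEN.
            tokens.add(UUID.randomUUID().toString());
        }
        return tokens;
    }
}
```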
HTTP request logs showing what this will now look like.
In this particular case, the leak of the thread-local value occurred during a JSON resumable upload. The JSON resumable upload has been updated to properly clean up its token after execution.
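The cleanup pattern can be sketched like this. The names are illustrative (the library's actual upload code differs); what matters is that the token is removed in a finally block so it cannot outlive the upload on that thread.

```java
import java.util.UUID;

// Illustrative sketch of thread-local cleanup: set a token for the
// duration of a (simulated) resumable upload and remove it in a
// finally block so it cannot leak into later requests on this thread.
public class UploadTokenCleanup {
    static final ThreadLocal<String> IDEMPOTENCY_TOKEN = new ThreadLocal<>();

    static void runUpload(Runnable upload) {
        IDEMPOTENCY_TOKEN.set(UUID.randomUUID().toString());
        try {
            upload.run();
        } finally {
            // The cleanup: clear the token after execution.
            IDEMPOTENCY_TOKEN.remove();
        }
    }
}
```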
Fixes #2902 by ensuring every operation has a unique id. In #2902, if a token had leaked, it would be applied to the first batch and processed as expected; but on reaching the second batch the token would still be the same, so the request was interpreted as having already happened and the response from the first batch was returned.
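To illustrate why a repeated token returns the first batch's response, here is a toy model of the server-side short-circuit described above (this is not the real service's code, only the dedupe-by-token behavior):

```java
import java.util.HashMap;
import java.util.Map;

// Toy model of idempotency-token deduplication: the first request seen
// with a token is executed and its response cached; any later request
// carrying the same token is short-circuited to the cached response.
public class TokenDedupe {
    final Map<String, String> processed = new HashMap<>();

    String handle(String token, String freshResponse) {
        return processed.computeIfAbsent(token, t -> freshResponse);
    }
}
```

With a leaked token reused across batches, the second batch gets the first batch's cached response; with a unique token per operation, each batch produces its own result.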