update google_dataflow_flex_template_job to send sdkpipeline parameters via environment field #6357
Fixes hashicorp/terraform-provider-google#14679
Since the above issue was introduced, the flex template job resource could send fields via both the `environment` block and the `parameters` block. This caused permadiffs, because the `environment` block fields were not flagged as computed, and guaranteed errors on `max_workers` and `num_workers`: because a nil integer is equivalent to 0 within Terraform, those values were always sent via both blocks, even when the user set them only in `parameters` and then applied a plan through the permadiff.
This has been resolved by updating `resourceDataflowFlexJobSetupEnv()`. The function now updates the ResourceData to assign the parameter values to their corresponding environment fields and removes those parameters from the `parameters` block sent to the API. A bonus undocumented issue was also resolved.
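To illustrate the idea (this is a simplified sketch, not the provider's actual implementation; the key names and the `splitParameters` helper are assumptions for illustration), the fix amounts to partitioning the user-supplied parameters so that environment-owned keys are sent only via the environment block:

```go
package main

import "fmt"

// environmentKeys lists keys that the Dataflow API accepts only via the
// environment block, never as pipeline parameters. The exact set here is
// illustrative, not the provider's real list.
var environmentKeys = map[string]bool{
	"maxWorkers":          true,
	"numWorkers":          true,
	"serviceAccountEmail": true,
}

// splitParameters moves any environment-owned keys out of the user-supplied
// parameters map into a separate environment map, so that each value is sent
// to the API exactly once and never duplicated across both blocks.
func splitParameters(params map[string]string) (pipeline, env map[string]string) {
	pipeline = map[string]string{}
	env = map[string]string{}
	for k, v := range params {
		if environmentKeys[k] {
			env[k] = v
		} else {
			pipeline[k] = v
		}
	}
	return pipeline, env
}

func main() {
	p, e := splitParameters(map[string]string{
		"maxWorkers": "5",
		"inputFile":  "gs://bucket/input.txt",
	})
	fmt.Println(p) // only true pipeline parameters remain
	fmt.Println(e) // environment-owned values, sent via the environment block
}
```

Because the split is deterministic, a value set only in `parameters` can no longer reappear in the request's environment block with a zero value, which is what produced the duplicate-field errors.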
This technically will not be a breaking change!
Release Note Template for Downstream PRs (will be copied)
Derived from GoogleCloudPlatform/magic-modules#9031