Purpose of this PR
In #1955 the use of `sparkJobNamespace` was replaced with `sparkJobNamespaces`. However, the README.md and the Quick Start Guide still reference the old Helm value.

Proposed changes: update both documents to use the new `sparkJobNamespaces` value (a hedged example is sketched below).
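A minimal sketch of the kind of install command the docs show after this change; the Helm repo alias, release name, and the exact form of the old README command are assumptions based on the project's usual README example, not taken from this PR:

```bash
# Old README value before #1955 (single namespace as a string), shown for contrast:
#   --set sparkJobNamespace=default

# New value after #1955: a list of namespaces to watch.
helm install spark-operator spark-operator/spark-operator \
  --namespace spark-operator \
  --create-namespace \
  --set "sparkJobNamespaces={default}"
```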
Change Category
Indicate the type of change by marking the applicable boxes:
Additional Notes
--set "sparkJobNamespaces={default}"
and--set sparkJobNamespaces[0]=default
(for personal preference I went with the first one)About the Spark Job Namespace
section in the quick start guide was referencing quite heavily the old default value and how the empty string was enabling the deployment of spark applications on all namespaces. From the test I did with Helm chart v1.2.7 it seems to me that the empty list enables the execution of spark applications only in the spark-operator namespace. Based on this information I updated the doc describing this behavior, I may be wrong here, so please correct me if I am missing something
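The two `--set` forms above should behave the same; a small sketch showing both, with the operator assumed to be deployed in the spark-operator namespace:

```bash
# List syntax with curly braces (the form used in this PR's docs):
helm upgrade --install spark-operator spark-operator/spark-operator \
  --namespace spark-operator \
  --set "sparkJobNamespaces={default}"

# Index syntax, equivalent for a single entry:
helm upgrade --install spark-operator spark-operator/spark-operator \
  --namespace spark-operator \
  --set "sparkJobNamespaces[0]=default"

# Leaving the value unset keeps the chart default (an empty list); in the
# v1.2.7 test described above, the operator then only watched the
# namespace it is deployed in.
```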