
[Feedback] docs/components/spark-operator/getting-started.md #3893

Closed
JTBS opened this issue Oct 3, 2024 · 4 comments · May be fixed by #3917
JTBS commented Oct 3, 2024

Please change the documentation from:

```sh
helm install my-release spark-operator/spark-operator \
  --namespace spark-operator \
  --create-namespace \
  --set "sparkJobNamespaces={spark}" \
  --set webhook.enable=true \
  --set webhook.port=443
```

to:

```sh
helm install my-release spark-operator/spark-operator \
  --namespace spark-operator \
  --create-namespace \
  --set "spark.jobNamespaces={spark}" \
  --set webhook.enable=true \
  --set webhook.port=443
```

Change: the value needs to be `spark.jobNamespaces`, NOT `sparkJobNamespaces`. I spent a lot of time figuring this out :) so hopefully this saves time for others.
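As a side note, the same settings can be passed with a values file instead of repeated `--set` flags, which makes a key typo like this easier to spot. This is only a sketch: the file name `values.yaml` is arbitrary, and the YAML keys simply mirror the corrected flags above rather than quoting the chart's documented schema:

```sh
# Sketch only: a values-file equivalent of the --set flags above.
# The file name and structure are assumptions mirroring the corrected
# spark.jobNamespaces / webhook.* keys, not copied from the chart docs.
cat > values.yaml <<'EOF'
spark:
  jobNamespaces:
    - spark
webhook:
  enable: true
  port: 443
EOF

helm install my-release spark-operator/spark-operator \
  --namespace spark-operator \
  --create-namespace \
  -f values.yaml
```

Either way, `helm get values my-release --namespace spark-operator` shows the values the release actually received, so a misspelled key such as `sparkJobNamespaces` stands out immediately.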

varodrig (Contributor) commented:

/assign @varodrig

varodrig (Contributor) commented Nov 30, 2024

This has already been resolved, so the issue can be closed.
[screenshot attached]

varodrig (Contributor) commented Dec 2, 2024

/close
cc @hbelmiro @JTBS


@varodrig: Closing this issue.

In response to this:

> /close
> cc @hbelmiro

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes/test-infra repository.
