Spark operator - driver pod is not starting #2198

Closed
yuvarajgopal opened this issue Sep 27, 2024 · 8 comments
Labels
question Further information is requested

Comments

@yuvarajgopal

  • ✋ I have searched the open/closed issues and my issue is not listed.

After installing the spark-operator on AKS using the Helm chart and deploying the spark-pi sample app, there are no events and no driver pod; the spark-submit job is not working.

yuvarajgopal added the question label on Sep 27, 2024
@Cian911 (Contributor) commented Sep 27, 2024

Was there a SparkApplication created?

If so, can you view it to check what error occurred?

kubectl get sparkapplications

kubectl get sparkapplication ${YOUR_SPARK_APPLICATION_NAME} -o yaml | less
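
If the application has no status section at all, the operator may never have reconciled it, so it is also worth checking the resource's events and the controller logs. A sketch, assuming a Helm install into the spark-operator namespace (the label selector is an assumption; adjust it to whatever labels your release actually uses):

kubectl describe sparkapplication ${YOUR_SPARK_APPLICATION_NAME}

kubectl logs -n spark-operator -l app.kubernetes.io/name=spark-operator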

@yuvarajgopal (Author)

kubectl get sparkapplications -n spark-operator
NAME               STATUS   ATTEMPTS   START   FINISH   AGE
spark-pi                                                152m
spark-pi-minimal                                        162m

kubectl get sparkapplication spark-pi -o yaml -n spark-operator
apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  annotations:
    kubectl.kubernetes.io/last-applied-configuration: |
      {"apiVersion":"sparkoperator.k8s.io/v1beta2","kind":"SparkApplication","metadata":{"annotations":{},"name":"spark-pi","namespace":"spark-operator"},"spec":{"arguments":["5000"],"driver":{"cores":1,"labels":{"version":"3.5.2"},"memory":"512m","serviceAccount":"spark-sa"},"executor":{"cores":1,"instances":1,"labels":{"version":"3.5.2"},"memory":"512m"},"image":"spark:3.5.2","imagePullPolicy":"IfNotPresent","mainApplicationFile":"local:///opt/spark/examples/jars/spark-examples_2.12-3.5.2.jar","mainClass":"org.apache.spark.examples.SparkPi","mode":"cluster","sparkVersion":"3.5.2","type":"Scala"}}
  creationTimestamp: "2024-09-27T14:02:03Z"
  generation: 1
  name: spark-pi
  namespace: spark-operator
  resourceVersion: "3194547"
  uid: f0ffe140-699f-498a-9b92-58c88e061dd6
spec:
  arguments:
  - "5000"
  driver:
    cores: 1
    labels:
      version: 3.5.2
    memory: 512m
    serviceAccount: spark-sa
  executor:
    cores: 1
    instances: 1
    labels:
      version: 3.5.2
    memory: 512m
  image: spark:3.5.2
  imagePullPolicy: IfNotPresent
  mainApplicationFile: local:///opt/spark/examples/jars/spark-examples_2.12-3.5.2.jar
  mainClass: org.apache.spark.examples.SparkPi
  mode: cluster
  sparkVersion: 3.5.2
  type: Scala

@ChenYi015 (Contributor)

@yuvarajgopal The Spark operator's default job namespaces setting is ["default"]. You can change the namespace of your Spark app to default and try again.
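
To make that concrete, here is a minimal sketch of the same spark-pi manifest from above, moved to the default namespace (note the spark-sa service account referenced by the driver would also need to exist there):

apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  name: spark-pi
  namespace: default   # moved from spark-operator to match the operator's default job namespaces
spec:
  type: Scala
  mode: cluster
  image: spark:3.5.2
  imagePullPolicy: IfNotPresent
  mainClass: org.apache.spark.examples.SparkPi
  mainApplicationFile: local:///opt/spark/examples/jars/spark-examples_2.12-3.5.2.jar
  sparkVersion: 3.5.2
  arguments:
  - "5000"
  driver:
    cores: 1
    memory: 512m
    serviceAccount: spark-sa   # must exist in the default namespace
    labels:
      version: 3.5.2
  executor:
    cores: 1
    instances: 1
    memory: 512m
    labels:
      version: 3.5.2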

@yuvarajgopal (Author)

Wow! Thanks @ChenYi015, it started. But the problem is that my spark-operator controller and webhook pods are running in the spark-operator namespace, and I would need to move everything to default. Can't we run the Spark driver in a different namespace?

@ChenYi015 (Contributor)

@yuvarajgopal You can run the spark operator in the spark-operator namespace and run spark applications in any namespace. For example, if you want to run spark applications in namespaces ns1, ns2 and ns3 (make sure these namespaces exist), then you can install the operator as follows:

helm install spark-operator spark-operator/spark-operator \
    --namespace spark-operator \
    --create-namespace \
    --set 'spark.jobNamespaces={ns1,ns2,ns3}'
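
A quick way to verify afterwards (a sketch, assuming the spark-pi manifest above is saved as spark-pi.yaml with namespace: ns1):

kubectl apply -f spark-pi.yaml

kubectl get sparkapplication spark-pi -n ns1

kubectl get pods -n ns1   # a driver pod (typically named spark-pi-driver) should appear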

@Thrynk commented Oct 9, 2024

I tried it and it works! Thank you @ChenYi015. I think the mistake mainly comes from the getting-started documentation not being up to date regarding the spark.jobNamespaces override for the Helm chart.

@ChenYi015 (Contributor)

@Thrynk Thanks for the feedback on the docs. Would you like to raise an issue about the Spark operator docs being out of date? We will update them when we have time.

@Thrynk commented Oct 9, 2024

It seems there already is one.
