[SPARK-44690][BUILD][3.5] Downgrade Scala to 2.13.8 #42362
Conversation
Should we downgrade in the master branch too for now? |
Mistaken operation... |
If we cannot accept the solution proposed in #41943 for Spark 4.0, then we should temporarily downgrade in master too. |
Yes let's downgrade master too for consistency. Later, we either resolve this another way or just drop Java 8 support in 4.0 |
Okay, let me submit a new PR to revert it in master |
Closing this one since #42364 has been merged. |
The test results are not as we expected; more investigation is needed to solve this problem.
Scenario 1: If the minimum supported version is Java 11, build with Java 11 and test with Java 17 and Java 21-ea. Modified the Build & Test workflow accordingly; the runs failed.
Scenario 2: If the minimum supported version is Java 17, build with Java 17 and test with Java 21-ea. Modified the Build & Test workflow accordingly; the run failed. |
Actually, for that particular problem, I think the answer is straightforward: those internal classes really just don't exist in later Java versions :) However, from looking at StorageUtils, all that is referenced only to try to bridge Java 8 and 9 support. A lot of that hacky code should go away along with the problem. I'm confident that much is fixable as part of updating past Java 8. There may be other issues though! |
Okay, let's address these issues after we stop supporting Java 8. I believe they can definitely be solved too. |
@srowen You're right. After making some changes to the core module's code, it is now possible to build Spark with Java 11 (`-target:11`) and test with Java 17. Indeed, this task will need to be tackled once we've discontinued support for Java 8. Happy ~ |
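For readers following along, here is a simplified, hypothetical sketch of the kind of Java 8/9 bridging code in StorageUtils that the discussion above refers to. It is not the actual Spark source, just an illustration of why the hack exists: freeing a direct ByteBuffer needs `sun.misc.Unsafe.invokeCleaner` on Java 9+, while Java 8 has to go through `DirectBuffer.cleaner()`, so both paths end up behind reflection.

```scala
// Hypothetical, simplified sketch (not the actual Spark code) of Java 8 / Java 9+
// bridging for releasing direct ByteBuffers, as referenced in the comment above.
import java.nio.ByteBuffer

object DirectBufferCleanerSketch {
  private val isJava9OrLater: Boolean =
    !System.getProperty("java.specification.version").startsWith("1.")

  private val clean: ByteBuffer => Unit =
    if (isJava9OrLater) {
      // Java 9+: sun.misc.Unsafe.invokeCleaner, reached via reflection because
      // the method does not exist on Java 8.
      val unsafeClass = Class.forName("sun.misc.Unsafe")
      val theUnsafe = unsafeClass.getDeclaredField("theUnsafe")
      theUnsafe.setAccessible(true)
      val unsafe = theUnsafe.get(null)
      val invokeCleaner = unsafeClass.getMethod("invokeCleaner", classOf[ByteBuffer])
      buffer => invokeCleaner.invoke(unsafe, buffer)
    } else {
      // Java 8: DirectBuffer.cleaner().clean(), again via reflection so this
      // sketch never references sun.nio.ch.DirectBuffer at compile time.
      val cleanerOf = Class.forName("sun.nio.ch.DirectBuffer").getMethod("cleaner")
      buffer => {
        val cleaner = cleanerOf.invoke(buffer)
        if (cleaner != null) cleaner.getClass.getMethod("clean").invoke(cleaner)
      }
    }

  def dispose(buffer: ByteBuffer): Unit =
    if (buffer != null && buffer.isDirect) clean(buffer)
}
```

Once Java 8 support is dropped, only the first branch is needed and the version check disappears, which is the simplification srowen alludes to.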
What changes were proposed in this pull request?
The aim of this PR is to downgrade the Scala 2.13 dependency in Apache Spark 3.5 to 2.13.8, to ensure that Apache Spark 3.5 behaves the same as Apache Spark 3.4.x in terms of Maven build and testing: it can be built with `-target:jvm-1.8` and tested with Java 11/17.

Why are the changes needed?
As reported in SPARK-44376, there are issues when building and testing with Maven using Java 11/17 with `-target:jvm-1.8`:

- `build/mvn clean install -Pscala-2.13` with Java 17
- `build/mvn clean install -Pscala-2.13 -Djava.version=17` with Java 17
- `build/mvn clean package -Pscala-2.13 -DskipTests` or `build/mvn clean install -Pscala-2.13 -DskipTests` with Java 8 first, then run `build/mvn test -Pscala-2.13` with Java 17

This is inconsistent with the behavior of Apache Spark 3.4.x, so we need to use the previous Scala 2.13 version that supports this behavior in Apache Spark 3.5.0.
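As a side note (not part of this PR), one quick way to sanity-check the `-target:jvm-1.8` claim above is to inspect the class-file version of the compiled output: Java 8 bytecode carries major version 52. A minimal Scala helper for such a check might look like the following; the object name and usage are hypothetical.

```scala
// Hypothetical helper: report the class-file major version of compiled classes.
// Class files store a big-endian major version at bytes 6-7; 52 = Java 8.
import java.io.DataInputStream
import java.nio.file.{Files, Paths}

object ClassFileVersionCheck {
  def majorVersion(classFilePath: String): Int = {
    val in = new DataInputStream(Files.newInputStream(Paths.get(classFilePath)))
    try {
      require(in.readInt() == 0xCAFEBABE, s"$classFilePath is not a class file")
      in.readUnsignedShort() // minor version, ignored
      in.readUnsignedShort() // major version: 52 = Java 8, 55 = Java 11, 61 = Java 17
    } finally in.close()
  }

  def main(args: Array[String]): Unit =
    args.foreach(p => println(s"$p -> class-file major version ${majorVersion(p)}"))
}
```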
Does this PR introduce any user-facing change?
No.
How was this patch tested?