[SPARK-49700][CONNECT][SQL] Unified Scala Interface for Connect and Classic #48818
base: master
Conversation
This makes Source use the classic API.
@@ -1244,24 +1243,6 @@ class Dataset[T] private[sql](
    select(replacedAndExistingColumns ++ newColumns : _*)
  }

  /** @inheritdoc */
  private[spark] def withColumns(
Moved to super class.
 * Execute a block of code with this session set as the active session, and restore the
 * previous session on completion.
 */
private[sql] def withActive[T](block: => T): T = {
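The `withActive` helper shown in the diff sets the session as active, runs the block, and restores the previous session afterwards. A minimal self-contained sketch of that save-run-restore pattern, using a hypothetical `Session` case class and a thread-local holder (names are illustrative, not Spark's actual internals):

```scala
object SessionDemo {
  final case class Session(name: String)

  // Thread-local holder for the currently "active" session; None when unset.
  private val active = new ThreadLocal[Option[Session]] {
    override def initialValue(): Option[Session] = None
  }

  /**
   * Run `block` with `session` as the active session, restoring the
   * previously active session on completion (even if `block` throws).
   */
  def withActive[T](session: Session)(block: => T): T = {
    val previous = active.get()
    active.set(Some(session))
    try block
    finally active.set(previous)
  }

  def current: Option[Session] = active.get()
}
```

The `try`/`finally` guarantees restoration on exceptions, which is why nested `withActive` calls compose correctly: each inner call restores exactly what the outer call installed.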
Moved to super class.
@@ -330,8 +333,6 @@ class SparkSessionBuilderSuite extends SparkFunSuite with Eventually {
    .set(wh, "./data1")
    .set(td, "bob")

val sc = new SparkContext(conf)
Oops...
I can spin this off in a separate PR.
// The only way this could have ever worked is if this is effectively a no-op.
Will remove in the next iteration.
For anyone interested: this is mostly ready for review. I am still looking at some of the developer APIs; most of them use classic now, but in most cases the interfaces should suffice. Moving them back might make the lives of Spark library developers a bit easier.
What changes were proposed in this pull request?
This PR makes the shared SQL (JVM) interface the primary interface.
The implementations are moved to the `classic` and `connect` sub-packages. Please note that this does change a number of developer APIs:
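The restructuring described here follows a familiar shape: a shared interface (trait) holds the common API and any logic that can be expressed against it, while each backend supplies its own concrete implementation. A minimal sketch of that pattern, with hypothetical names (`ClassicDataset` here is a toy in-memory stand-in, not Spark's actual class):

```scala
// Shared interface: lives in the common package and is what user code
// programs against.
trait Dataset[T] {
  def count(): Long

  // Logic expressible purely in terms of the interface can live in the
  // trait itself, so both backends inherit it.
  def isEmpty: Boolean = count() == 0L
}

// "classic"-style backend: a toy eager, local implementation.
final class ClassicDataset[T](data: Seq[T]) extends Dataset[T] {
  def count(): Long = data.size.toLong
}

// A "connect"-style backend would instead implement count() by sending a
// plan to a remote server; callers holding a Dataset[T] cannot tell the
// difference.
```

This is why the change affects developer APIs but not user-facing behavior: code written against the shared trait keeps working, while code that referenced the concrete classes directly must now import them from the backend sub-packages.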
Why are the changes needed?
This is the final step in creating a unified Scala interface for both Classic and Connect.
Does this PR introduce any user-facing change?
No.
How was this patch tested?
Existing tests.
Was this patch authored or co-authored using generative AI tooling?
No.