Fix get_type for higher-order array functions #13756
```diff
@@ -21,10 +21,11 @@ use arrow::{
     compute::can_cast_types,
     datatypes::{DataType, TimeUnit},
 };
+use datafusion_common::utils::coerced_fixed_size_list_to_list;
 use datafusion_common::{
     exec_err, internal_datafusion_err, internal_err, plan_err,
     types::{LogicalType, NativeType},
-    utils::{coerced_fixed_size_list_to_list, list_ndims},
+    utils::list_ndims,
     Result,
 };
 use datafusion_expr_common::{
```
```diff
@@ -414,7 +415,18 @@ fn get_valid_types(
             _ => Ok(vec![vec![]]),
         }
     }
+
+    fn array(array_type: &DataType) -> Option<DataType> {
+        match array_type {
+            DataType::List(_) => Some(array_type.clone()),
+            DataType::LargeList(field) | DataType::FixedSizeList(field, _) => {
+                Some(DataType::List(Arc::clone(field)))
+            }
+            _ => None,
+        }
+    }
+
-    fn array(array_type: &DataType) -> Option<DataType> {
+    fn recursive_array(array_type: &DataType) -> Option<DataType> {
```
- Can we extend the existing `array` function to handle nested arrays instead of creating another signature for nested arrays?
- I don't know how to do this, please advise!
- I don't understand -- if the goal is to remove recursive flattening, should we be adding new code to support it 🤔
- The pre-existing recursive type normalization matters for `flatten` only, because it (currently) operates recursively and would otherwise need to gain code to handle `FixedSizeList` inputs. The recursive array-ification was useless for the other array functions and was made non-recursive.
```diff
         match array_type {
             DataType::List(_)
             | DataType::LargeList(_)
```
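To illustrate the distinction in the thread above -- normalizing only the outermost level versus every nesting level -- here is a small self-contained sketch. It only builds the Arrow types by hand; the field names and the exact recursive result are illustrative assumptions (the real recursive behaviour lives in `coerced_fixed_size_list_to_list`), not code from the PR:

```rust
use std::sync::Arc;
use arrow_schema::{DataType, Field};

fn main() {
    // A two-level fixed-size input: FixedSizeList<FixedSizeList<Int32, 2>, 3>
    let inner =
        DataType::FixedSizeList(Arc::new(Field::new("item", DataType::Int32, true)), 2);
    let outer =
        DataType::FixedSizeList(Arc::new(Field::new("item", inner.clone(), true)), 3);

    // Non-recursive `array` coercion: only the outer level becomes a List;
    // the element type is kept as-is (still a FixedSizeList).
    let non_recursive = DataType::List(Arc::new(Field::new("item", inner, true)));

    // Recursive normalization (what `flatten` relies on, as I read the
    // discussion): every nesting level becomes a List.
    let recursive = DataType::List(Arc::new(Field::new(
        "item",
        DataType::List(Arc::new(Field::new("item", DataType::Int32, true))),
        true,
    )));

    println!("input:         {outer:?}");
    println!("non-recursive: {non_recursive:?}");
    println!("recursive:     {recursive:?}");
}
```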
```diff
@@ -653,6 +665,13 @@ fn get_valid_types(
                 array(&current_types[0])
                     .map_or_else(|| vec![vec![]], |array_type| vec![vec![array_type]])
             }
+            ArrayFunctionSignature::RecursiveArray => {
+                if current_types.len() != 1 {
+                    return Ok(vec![vec![]]);
+                }
+                recursive_array(&current_types[0])
+                    .map_or_else(|| vec![vec![]], |array_type| vec![vec![array_type]])
+            }
             ArrayFunctionSignature::MapArray => {
                 if current_types.len() != 1 {
                     return Ok(vec![vec![]]);
```
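As a usage sketch, a UDF that still wants the recursive coercion would presumably declare the new variant in its signature. The import path and constructor below are my assumptions (they mirror how other `ArrayFunctionSignature` variants are commonly declared), not something shown in this diff:

```rust
use datafusion_expr::{ArrayFunctionSignature, Signature, TypeSignature, Volatility};

// Hypothetical: a single-argument signature that opts into the recursive list
// normalization (the kind of thing `flatten` would need). Which functions
// actually adopt `RecursiveArray` is not shown in this diff.
fn recursive_array_signature() -> Signature {
    Signature::new(
        TypeSignature::ArraySignature(ArrayFunctionSignature::RecursiveArray),
        Volatility::Immutable,
    )
}
```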
The second file in the diff (the module that defines `array_element_udf`) adds a regression test:
```diff
@@ -993,3 +993,84 @@ where
     let data = mutable.freeze();
     Ok(arrow::array::make_array(data))
 }
+
+#[cfg(test)]
+mod tests {
+    use super::array_element_udf;
+    use arrow_schema::{DataType, Field};
+    use datafusion_common::{Column, DFSchema, ScalarValue};
+    use datafusion_expr::expr::ScalarFunction;
+    use datafusion_expr::{cast, Expr, ExprSchemable};
+    use std::collections::HashMap;
+
+    #[test]
+    fn test_array_element_return_type() {
```
- I think we can add tests in an slt file that cover the array signature test cases, so we can avoid creating a Rust test here.
- The Rust test allows explicitly exercising the various ways of getting an expression's type. I can add an slt test -- how would it look?
- I did try to write some slt regression tests, but I couldn't expose the bug that way. Yet, the unit test proves the bug exists.
```diff
+        let complex_type = DataType::FixedSizeList(
+            Field::new("some_arbitrary_test_field", DataType::Int32, false).into(),
+            13,
+        );
```

- When I change this complex type to

  ```rust
  let complex_type = DataType::List(
      Field::new("some_arbitrary_test_field", DataType::Int32, false).into(),
  );
  ```

  the test passes. It also passes when `complex_type` is a

  ```rust
  let complex_type = DataType::Struct(Fields::from(vec![
      Arc::new(Field::new("some_arbitrary_test_field", DataType::Int32, false)),
  ]));
  ```

  It seems to me like there is something about `FixedSizeList` specifically that is causing issues.

- Weird -- when I remove this line in expr_schema the test passes (with `FixedSizeList`):

  ```diff
  diff --git a/datafusion/expr/src/expr_schema.rs b/datafusion/expr/src/expr_schema.rs
  index 3317deafb..50aeb222f 100644
  --- a/datafusion/expr/src/expr_schema.rs
  +++ b/datafusion/expr/src/expr_schema.rs
  @@ -152,6 +152,7 @@ impl ExprSchemable for Expr {
                       .map(|e| e.get_type(schema))
                       .collect::<Result<Vec<_>>>()?;
   
  +
                   // Verify that function is invoked with correct number and type of arguments as defined in `TypeSignature`
                   let new_data_types = data_types_with_scalar_udf(&arg_data_types, func)
                       .map_err(|err| {
  @@ -168,7 +169,7 @@ impl ExprSchemable for Expr {
   
                   // Perform additional function arguments validation (due to limited
                   // expressiveness of `TypeSignature`), then infer return type
  -                Ok(func.return_type_from_exprs(args, schema, &new_data_types)?)
  +                Ok(func.return_type_from_exprs(args, schema, &arg_data_types)?)
              }
              Expr::WindowFunction(window_function) => self
                  .data_type_and_nullable_with_window_function(schema, window_function)
  ```

  Which basically says: pass the input data types directly to the function call rather than calling … (datafusion/datafusion/expr/src/expr_schema.rs, line 171 in 68ead28) 🤔 This looks like it was added in September via 1b3608d (before that, the input types were passed directly) 🤔

- It doesn't seem right to me that …

- correct, #13756 (comment)

- I did the same, basically removing this block (datafusion/datafusion/expr/src/expr_schema.rs, lines 155 to 167 in b30c200); it's enough to fix the unit test in this PR. Agreed.

- The reason is that we can't guarantee the input is already coerced. To determine the return type of a function for a given set of inputs, we follow these steps: … That is why we have the coercion there.

- I like the idea in principle. It should be combined with a new `ScalarUDFImpl` sub-trait that doesn't have return-type-related methods at all, since they are not to be used once the plan is constructed. In a logical plan we can. My understanding is that the coercing analyzer also calls the … But the real problem is that, for the same types, the …

- Is there an example of the difference between these two, especially for functions? For `Expr::ScalarFunction` there is no difference in the LogicalPlan -- we don't do anything special -- but I think this is what you don't expect. What should we have in the LogicalPlan, …

- Why …

- The difference is more apparent for duplicate syntax (such as IS NULL vs IS UNKNOWN) and syntax sugar (order by 1, order by all, select *). For a fully resolved logical plan it's a fair question to ask what the type of an expression is (and this may or may not be an O(1)-available answer); however, there is no point in asking a UDF what its type is, since we already asked it. Think of this as the engine and the UDF being implemented by independent parties, with the UDF being a contract layer.
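For orientation, the flow being debated (and the "steps" elided above) can be sketched roughly as follows. This is not the actual DataFusion source -- it only mirrors the snippet quoted from `expr_schema.rs`, and the crate paths are my assumptions:

```rust
use arrow_schema::DataType;
use datafusion_common::{ExprSchema, Result};
use datafusion_expr::type_coercion::functions::data_types_with_scalar_udf;
use datafusion_expr::{Expr, ExprSchemable, ScalarUDF};

// Rough sketch of the return-type flow for a scalar function call:
// 1. look up each argument's type from the schema,
// 2. coerce/validate those types against the UDF's TypeSignature,
// 3. ask the UDF for its return type.
// The disagreement above is whether step 3 should receive the coerced types
// (`new_data_types`) or the original ones (`arg_data_types`).
fn scalar_call_return_type(
    func: &ScalarUDF,
    args: &[Expr],
    schema: &dyn ExprSchema,
) -> Result<DataType> {
    let arg_data_types = args
        .iter()
        .map(|e| e.get_type(schema))
        .collect::<Result<Vec<_>>>()?;
    // Step 2: signature-based coercion.
    let new_data_types = data_types_with_scalar_udf(&arg_data_types, func)?;
    // Step 3: infer the return type from the (coerced) argument types.
    func.return_type_from_exprs(args, schema, &new_data_types)
}
```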
```diff
+        let array_type =
+            DataType::List(Field::new_list_field(complex_type.clone(), true).into());
+        let index_type = DataType::Int64;
+
+        let schema = DFSchema::from_unqualified_fields(
+            vec![
+                Field::new("my_array", array_type.clone(), false),
+                Field::new("my_index", index_type.clone(), false),
+            ]
+            .into(),
+            HashMap::default(),
+        )
+        .unwrap();
+
+        let udf = array_element_udf();
+
+        // ScalarUDFImpl::return_type
+        assert_eq!(
+            udf.return_type(&[array_type.clone(), index_type.clone()])
+                .unwrap(),
+            complex_type
+        );
+
+        // ScalarUDFImpl::return_type_from_exprs with typed exprs
+        assert_eq!(
+            udf.return_type_from_exprs(
+                &[
+                    cast(Expr::Literal(ScalarValue::Null), array_type.clone()),
+                    cast(Expr::Literal(ScalarValue::Null), index_type.clone()),
+                ],
+                &schema,
+                &[array_type.clone(), index_type.clone()]
+            )
+            .unwrap(),
+            complex_type
+        );
+
+        // ScalarUDFImpl::return_type_from_exprs with exprs not carrying type
+        assert_eq!(
+            udf.return_type_from_exprs(
+                &[
+                    Expr::Column(Column::new_unqualified("my_array")),
+                    Expr::Column(Column::new_unqualified("my_index")),
+                ],
+                &schema,
+                &[array_type.clone(), index_type.clone()]
+            )
+            .unwrap(),
+            complex_type
+        );
+
+        // Via ExprSchemable::get_type (e.g. SimplifyInfo)
+        let udf_expr = Expr::ScalarFunction(ScalarFunction {
+            func: array_element_udf(),
+            args: vec![
+                Expr::Column(Column::new_unqualified("my_array")),
+                Expr::Column(Column::new_unqualified("my_index")),
+            ],
+        });
+        assert_eq!(
+            ExprSchemable::get_type(&udf_expr, &schema).unwrap(),
+            complex_type
+        );
```
- This didn't pass before the change. The assertions above did pass.
```diff
+    }
+}
```
On the new `array` helper:

- So this says that if the type is a list, keep the type, but if the type is a large list / fixed size list then take the field type? Why doesn't it also take the field type for `List` 🤔? (Aka, it doesn't make sense to me that `List` is treated differently than `LargeList` and `FixedSizeList`.)
- For backwards compat I should keep `LargeList` so it stays `LargeList`; will push shortly. Not my invention, it was like this before. I think the intention is "converge List, LL and FSL into one type... or maybe two types... to keep UDF impl simpler". I am not attached to this approach, but I think code may be reliant on that.
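To make the asymmetry concrete, here is a self-contained sketch that copies the `array` helper from the diff into a tiny program and prints what each input coerces to (the field name and the `main` wrapper are just for illustration):

```rust
use std::sync::Arc;
use arrow_schema::{DataType, Field};

// Mirror of the `array` helper from the diff, reproduced here so the example
// is self-contained; in the PR it is a nested function inside `get_valid_types`.
fn array(array_type: &DataType) -> Option<DataType> {
    match array_type {
        DataType::List(_) => Some(array_type.clone()),
        DataType::LargeList(field) | DataType::FixedSizeList(field, _) => {
            Some(DataType::List(Arc::clone(field)))
        }
        _ => None,
    }
}

fn main() {
    let field = Arc::new(Field::new("item", DataType::Int32, true));

    // List is kept exactly as-is (same field, same nullability).
    println!("{:?}", array(&DataType::List(Arc::clone(&field))));
    // LargeList and FixedSizeList are rewritten to a List wrapping the same
    // field, which is the asymmetry the reviewer is asking about.
    println!("{:?}", array(&DataType::LargeList(Arc::clone(&field))));
    println!("{:?}", array(&DataType::FixedSizeList(field, 4)));
    // Non-array inputs do not match the signature at all.
    println!("{:?}", array(&DataType::Int32));
}
```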