[Feature] Modify signal_interpolate for input with NaNs and extrapolation #666
Conversation
…lation of points outside the data range
…if this is the appropriate place to do this)
Codecov Report
@@            Coverage Diff             @@
##              dev     #666      +/-   ##
==========================================
- Coverage   52.75%   52.75%   -0.01%
==========================================
  Files         277      277
  Lines       12615    12634      +19
==========================================
+ Hits         6655     6665      +10
- Misses       5960     5969       +9
What's the status here? Is it good for review/merge or still WIP?
@DominiqueMakowski @JanCBrammer I think it's good for review, but let me know if I'm missing anything!
Will try to have a look this week. Please @ me here in case you don't hear back.
Co-authored-by: Dominique Makowski <[email protected]>
Gentle ping just in case @JanCBrammer
@danibene, @DominiqueMakowski, I recall that settling on an API for Pandas-style NaN interpolation […] The NaN interpolation is a rather big modification of the API: going from mandatory custom extrapolation values […] I'm not opposed to adding the […]
I changed how the indices corresponding to the original first and last x values are identified, since if the x values are decimals (e.g., 0.33, 0.66, ...), rounding to the nearest integer is not very precise.
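A hypothetical illustration of the rounding issue (the values and the lookup below are made up for illustration; they are not the library's actual code):

```python
# Hypothetical illustration: with decimal x values, rounding them to integers
# does not reliably recover the positions of the first and last samples in the
# new grid, whereas matching on the values themselves does.
import numpy as np

x_values = np.array([0.33, 0.66, 0.99])
x_new = np.linspace(0.33, 0.99, 100)

# Rounding-based lookup: both endpoints collapse to indices 0 and 1, which says
# nothing about where 0.33 and 0.99 actually sit in x_new.
first_rounded = int(np.round(x_values[0]))   # 0
last_rounded = int(np.round(x_values[-1]))   # 1

# Value-based lookup: find the closest matching positions in the new grid.
first_idx = int(np.argmin(np.abs(x_new - x_values[0])))    # 0
last_idx = int(np.argmin(np.abs(x_new - x_values[-1])))    # 99
```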
@Tam-Pham tell us what you think so we can decide whether to merge and release 2.1
@Tam-Pham bump
I'm not sure what to do here. I don't have a clear idea of the risk/benefit balance, and @JanCBrammer's comment makes me cautious. Is it indeed a change that will benefit many users and carries no risk?
It's hard for me to say. Maybe an alternative could be that I remove the change that @JanCBrammer was hesitant about, i.e.: […]
and I could open a separate PR for a separate function that calls […]
So if I understand correctly, we want to add the possibility of interpolating arrays with NaNs pandas-style, to bypass the usage of pandas and allow more control over it? To do so, we have modified signal_interpolate, which previously required x_values and y_values; now, if only one is passed and it contains NaNs, they will be interpolated. Am I right?
Yes, that's all correct! Do you think we should keep this addition in this PR, or just the changes that allow extrapolation of points outside of the data range?
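A minimal sketch of the usage being discussed, assuming a single array containing NaNs can be passed on its own and the NaNs are filled in pandas-style (the exact call form is an assumption, not taken from the PR):

```python
# Minimal sketch, assuming the PR's behavior: a single array containing NaNs is
# passed and the NaNs are interpolated over their positions. The exact argument
# handling for the single-array case is an assumption.
import numpy as np
import neurokit2 as nk

signal = np.array([1.0, 2.0, np.nan, np.nan, 5.0, 6.0])

# Only one array is provided; its NaNs are filled by interpolation.
filled = nk.signal_interpolate(signal, method="linear")
print(filled)  # expected to be close to [1., 2., 3., 4., 5., 6.]
```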
I think it's fine. Perhaps to make it clearer we can have an internal function called […]. I guess as long as it doesn't break the current behavior it's good.
Done, let me know if you'd like me to change anything! Also, I changed an unrelated line in cd02053 (NeuroKit/neurokit2/signal/signal_interpolate.py, lines 80 to 81 in 277f614):
It returned the original y values if the length of the new x values matched the length of the original x values, but the lengths can still match without the values matching, so instead I changed it to check that all the values match. Here is an example of the difference in behavior in case the x values don't match but their length does:
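A hypothetical illustration of that difference (the numbers and the checks shown here are made up for illustration; they are not the library's actual code):

```python
# Hypothetical illustration: same length, different x values.
import numpy as np

x_values = np.array([0, 2, 4, 6])
y_values = np.array([10.0, 20.0, 30.0, 40.0])
x_new = np.array([1, 3, 5, 7])  # same length as x_values, but different values

# Old check: only compared lengths, so the original y values were returned
# unchanged even though x_new differs from x_values.
old_shortcut = len(x_new) == len(x_values)      # True -> returned y_values as-is

# New check: the values themselves must all match before short-circuiting,
# so in this case interpolation is actually performed.
new_shortcut = bool(np.all(x_new == x_values))  # False -> interpolate
print(old_shortcut, new_shortcut)
```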
Thanks a ton Dani for all that work!!
Description
Modify the signal_interpolate() function to accept a single array containing NaNs as input and to extrapolate. (See #651 for the original motivation for doing so.)
Proposed Changes
I modified the signal_interpolate() function as follows:
- x_values or y_values can be None, for interpolation of missing values like in Pandas
- fill_value as an argument to be passed to the Scipy interpolate function, such that it can be set to "extrapolate" (sketched below)
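A minimal sketch of the proposed extrapolation usage, assuming fill_value is forwarded to SciPy's interp1d as described (the exact call form of signal_interpolate here is an assumption):

```python
# Minimal sketch, assuming fill_value is forwarded to scipy.interpolate.interp1d
# so that points outside the original x range can be extrapolated.
import numpy as np
import neurokit2 as nk

x_values = np.arange(10)
y_values = np.sin(x_values)
x_new = np.arange(-2, 12)  # extends beyond the original x range [0, 9]

# With fill_value="extrapolate", the out-of-range points are extrapolated
# instead of raising an error or being clipped.
interpolated = nk.signal_interpolate(
    x_values, y_values, x_new=x_new, method="quadratic", fill_value="extrapolate"
)
```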
Checklist
Here are some things to check before creating the PR. If you encounter any issues, do let us know :)