Status of the project and foreseeable future? #76
Comments
Thanks for starting the discussion @neutrinoceros. We have only touched on this topic lightly, so this is a good place for a more structural discussion. To answer your concrete questions:
What I would say is that we should encourage users of this package to instead adopt a direct numpy build dependency over time, so this package can fade away (it's an elaborate hack for the lack of features in Python packaging regarding ABI compatibility after all). There are a couple of other things to think about:
For myself, I'd prefer to first experiment with this in a few other packages (e.g., PyWavelets and SciPy) before making any changes to this package.
Why not? Our public API is all C, so, while inconvenient, C++ users can just cast. Since they can enforce compiling with the new numpy (at least rather quickly), hacks should be simpler, as you don't need both versions side-by-side for as long?
I wasn't thinking about C++, only C and maybe Cython. I checked a bit more, and it is indeed okay, ABI-wise, for C to go from a const parameter at build time to a non-const one at runtime (a case we did not previously allow, but now do). It's the 2->1 case in https://stackoverflow.com/questions/5083765/does-changing-fmystruct-a-to-fconst-mystruct-a-breaks-api-abi-in-c. We recently had problems with this exact thing in Cython (see the README of https://github.com/lysnikolaou/const-cython), which is why it came to mind. But that's for signatures in Cython code.
Can you please clarify what you mean here, @rgommers?
Once NumPy 2.0 is out, you may want to bump the build requirement to NumPy >=2.0 rather than something lower. (This is necessary to ensure that downstream must adjust to API changes. That allows us to do ABI breaks as long as they are disguised as API changes; in the simplest case, removing a function/macro.)
Interesting. Would you recommend pinning the build requirement that way?
Yes, that's recommended.
Thank you!
Also, numpy==1.25.0 is Python 3.9+, so any project that supports 3.8 still requires this package.
You can also take over the relevant 3.8 constraints from this package.
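As an illustration of "taking over the 3.8 constraints" (not this package's actual metadata): a marker-gated pin in a project's own build requirements. The real 3.8 pins vary by platform, so the version below is just the common case.

```python
# Illustrative only: inline an oldest-supported pin for Python 3.8
# instead of depending on this package. In practice such specifiers
# would live in pyproject.toml's [build-system] requires list; they
# are shown here as plain PEP 508 strings.
build_requires = [
    'numpy==1.17.3; python_version=="3.8"',  # common 3.8 case; real pins vary by platform
]
```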
Copying this comment from @matejsp because it belongs in this issue: One idea for this project's future, and for easier usage/upgrades, would be to replace all versions (those that are 1.19.x & >=py3.9).
ALTERNATIVE: I don't know if you can have multiple ranges that work with setuptools (maybe a >= lower bound that then excludes all versions up to 1.25.*; a sketch of this follows below).
For most platforms you would then just pick the best numpy version with the best OS compatibility and STILL retain backwards compatibility :) Could be a win-win :D
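For what it's worth, PEP 440 does allow a single specifier to combine a lower bound with wildcard exclusions, so the "multiple ranges" idea above is expressible with setuptools. A hypothetical example, with illustrative versions:

```python
# Hypothetical sketch of the "one lower bound plus exclusions" idea:
# keep 1.19.x available while skipping the in-between minor series,
# so resolvers land on either 1.19.x or 1.25+. Not a recommendation,
# just PEP 440 syntax in action.
build_requires = [
    "numpy>=1.19.3,!=1.20.*,!=1.21.*,!=1.22.*,!=1.23.*,!=1.24.*",
]
```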
I just released a new version and closed all open PRs and almost all open issues. I think we can try a new release which uses more flexible constraints and targets numpy 1.25/1.26, making use of the fact that those versions target an older C API by default. That will resolve several "build from source" issues, where the oldest numpy version that we target now no longer builds from source for some reason, but we must keep targeting it because (a) it's the first release with wheels and (b) backwards compatibility. So far that has held up well - at least no complaints regarding ABIs/versioning since 1.25.0 was released in June '23, AFAIK. I think I'd go for this flavor, compared to the pins we have now.
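A hedged guess at the kind of "flavor" being described, with Python 3.9+ moved to 1.25.x/1.26.x so that the default older-C-API targeting applies. The concrete versions are guesses, not the released pin set:

```python
# Hedged sketch of the "more flexible constraints" flavor discussed
# above; the concrete versions are illustrative, not the project's pins.
build_requires = [
    'numpy==1.25.0; python_version>="3.9" and python_version<"3.12"',
    'numpy==1.26.0; python_version=="3.12"',  # 1.26 is the first series with 3.12 support
]
```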
For projects which require Python 3.9+ anyway, is there actually much of a reason to specify any bound at all (unless they have some requirement at build time)? A wheel build should presumably pick up 1.25 or 1.26.
We do get real bug reports for such situations. I just closed one from scikit-learn, where building against 1.23.x (as in this repo) broke for folks who installed 1.22.x from source on a technically not-yet-supported Python version. We really should try to avoid this kind of thing - it's not hard to do so, I believe. For the time being, projects will commonly be supporting the 1.21 - 1.24 versions as a runtime dependency, and those versions will run into compat issues. Also, I think we should keep a lower bound in place.
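To make the scenario concrete, the kind of runtime (not build-time) requirement in question looks something like the following; the bounds are illustrative:

```python
# Illustrative runtime requirement for a project that tests against
# NumPy 1.21 - 1.24: a floor at the oldest tested release, plus a cap
# below the next ABI break.
install_requires = [
    "numpy>=1.21,<2.0",
]
```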
Well, sklearn is "special" since they support Python 3.8, so they clearly need oldest-supported-numpy! I think it is also a good choice here to just do it as you did. But if we rely on Python >=3.9 and oldest-supported-numpy becomes less interesting, I am not sure that there is any reasonable situation where:
There are lots of projects that still support Python 3.8. Once 3.9 is the minimum, I think the advice is to not use this package anymore.
What I was wondering was whether it isn't OK to just skip the pinning entirely.
Oh, I'm sure there's a failure mode that people are going to hit - the only question is how often. Example: when building everything from source, for whatever reason.
There are also other cases like this.
As I'm sure you guys know, NumPy 1.25 introduced a mechanism to control backward compatibility at build time, as described in the 1.25 release notes.
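A minimal sketch of how that mechanism can be used with a setuptools build; the module and file names below are hypothetical, while NPY_TARGET_VERSION and NPY_1_22_API_VERSION are real NumPy macros:

```python
# setup.py -- illustrative sketch, not this project's code.
# When building against NumPy >=1.25, extensions target an older C API
# by default; NPY_TARGET_VERSION selects the oldest NumPy release whose
# C API the extension should remain compatible with at runtime.
from setuptools import Extension, setup
import numpy

ext = Extension(
    "mypkg._ext",                       # hypothetical extension module
    sources=["src/_ext.c"],             # hypothetical source file
    include_dirs=[numpy.get_include()],
    define_macros=[
        # Stay compatible back to the NumPy 1.22 C API:
        ("NPY_TARGET_VERSION", "NPY_1_22_API_VERSION"),
    ],
)

setup(name="mypkg", ext_modules=[ext])
```

With a setting like this, a wheel built against NumPy 1.25+ should load on any runtime NumPy from 1.22 up to the next ABI break.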
So, what is the future of this project? I can imagine it will stop receiving updates at some point in the near future, but will it still be relevant when CPython 3.12.0 final is released (currently scheduled for early October)?
I figure there have already been some internal discussions about it, but maybe this issue can serve as a reference for maintainers of downstream code.