NumPy 2 bringup #1997
Can we merge these two issues just to make it easier to track them?
The issue Axel raises seems like a subpoint of this one (depending on what we decide); namely, do we want to opt in to this newer/slimmer ABI, and how does that fit into NumPy 2?
Sure. IMO Axel's issue is a subset of this one. I don't have strong opinions on which one to keep, or if you want to keep both, but I also don't want to get lost in two mega-threads 😬
Added Axel's item to the list above.
Handling the ABI is the key point here (that, and current packages missing a […]). Normally I'd say we do a dual migration (keep 1.x; add 2.0), but numpy has around 5000* dependents in conda-forge, so that would be a pretty substantial CI impact, especially if it takes a while to drop 1.x.

*Obviously not all of them are compiling against numpy, but still...
Thanks Axel! 🙏 Anyone should feel free to update the issue as needed 🙂
Following up on our discussion earlier about improving the visibility of […]. Also included a note about one approach we might take to ensure that value is embedded in built binaries. Though maybe there are better approaches for that portion.
There should now be a string baked into binaries built against NumPy to note what kind of NumPy compatibility they have.
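If that lands, one low-tech way to check a built artifact would be to scan its raw bytes for the marker. A minimal sketch, assuming only that the embedded string contains "NumPy"; the exact marker text and the module path are illustrative, not confirmed values:

```python
from pathlib import Path

def numpy_markers(binary_path, token=b"NumPy"):
    """Return printable runs containing `token` found in a compiled extension,
    without importing it (roughly what `strings file.so | grep NumPy` does)."""
    data = Path(binary_path).read_bytes()
    hits, start = [], 0
    while (idx := data.find(token, start)) != -1:
        # expand the hit to the surrounding run of printable ASCII bytes
        lo = idx
        while lo > 0 and 32 <= data[lo - 1] < 127:
            lo -= 1
        hi = idx
        while hi < len(data) and 32 <= data[hi] < 127:
            hi += 1
        hits.append(data[lo:hi].decode("ascii"))
        start = hi
    return hits
```

For example, `numpy_markers("mymodule.cpython-312-x86_64-linux-gnu.so")` (hypothetical path) would list any embedded NumPy-compatibility strings.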
It is worth noting that thanks to Axel and others we now have NumPy 2.0.0rc1 packages: conda-forge/numpy-feedstock#311. Also, ecosystem support of NumPy 2 is being tracked in this issue Ralf opened: numpy/numpy#26191. We are now in a good spot to start testing building packages with NumPy 2.
I discussed this with @rgommers recently, and one important point that he brought up is the situation with […], in particular since […].
Of course, if there's appetite for a split into […].
That doesn't sound good to me as a custom conda-forge split. If we want anything like that, let's do this properly and create a numpy-headers package that's officially supported by NumPy and that can be used by anyone (unless they need a static library or […]).
We do have a run_export on numpy.
Yeah... Clearly I shouldn't be writing these comments from a phone 😅 I misremembered that part, but in that case the task becomes easier: we just set up the right run export in numpy itself, and then remove […].
Another question we have to answer soon: what mechanism do we want to use for setting […]? Right now I'm thinking of setting […]. The only issue there is that the run-export on […], while the upper bound ([…]) […].
What if we had something like...?

```yaml
{% set version = "2.0.0" %}

package:
  name: numpy
  version: {{ version }}

...

build:
  ...
  run_exports:
    - {{ pin_subpackage("numpy", lower_bound=os.environ.get("NPY_FEATURE_VERSION", version)) }}
```

That way we can defer this environment variable setting to packages […]. If they don't set something, we can provide a sensible default (either […]). We could also consider whether conda-build could allow […].
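The lower-bound selection in that Jinja expression can be sketched in plain Python. The helper name is hypothetical; the behavior (a feedstock-set NPY_FEATURE_VERSION overriding the default, which is the numpy version being built) is read straight off the snippet:

```python
import os

def numpy_lower_bound(build_version, env=None):
    """Pick the run-export lower bound: a feedstock-provided
    NPY_FEATURE_VERSION wins; otherwise fall back to the numpy
    version being built (the snippet's `version`)."""
    env = os.environ if env is None else env
    return env.get("NPY_FEATURE_VERSION", build_version)

# A feedstock opting into an older feature level:
numpy_lower_bound("2.0.0", {"NPY_FEATURE_VERSION": "1.22"})  # → "1.22"
# No override set, so the lower bound equals the build version:
numpy_lower_bound("2.0.0", {})  # → "2.0.0"
```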
I don't think this type of […]. Hence, doing nothing should be the right default here; trying to change it from NumPy's default will likely only be the cause of extra complexity/confusion, and perhaps bugs.
I'd be surprised if it works like that. AFAIU, that […].
Leaving aside NEP 29, this is a quantity we have to be able to control IMO. Otherwise our metadata for packages building against numpy is bound to be wrong, and to deteriorate over time (when numpy inevitably moves the lower bound and we don't move in sync across all our feedstocks). I don't see how we can reasonably avoid making […].
I very well could be wrong. It is easy to test […]. Really we just need more ideas to sample from. It's more important that we have a large variety before selecting one. So feel free to propose more.
It won't be wrong. The metadata that is in the upstream releases (i.e. the […]).
I'm not sure I follow. Say a project has […]. Or do you mean that the default for that in numpy is so low (1.19?) that it won't ever become a factor? That seems doubtful to me. Even aside from those questions, we still have an interest in providing a common baseline for numpy compatibility (so that most of conda-forge is still usable with the oldest supported numpy), and in avoiding that individual packages move on too quickly (unless they really need to) or extremely slowly (i.e. going back to 1.19 adds about 2 years on top of what NEP 29 foresees w.r.t. being able to use a given ABI feature).
In summary, this seems highly dubious to me. There's still a lower bound somewhere, either in numpy's default feature level or in an explicit override of […].
This. And it's not doubtful, it is guaranteed to work. The whole point is to take away the build-time version as a thing that dynamically overrides the declared runtime dependency range.
No, that does not make sense. If a package has […].
No, and no. The lower bound is whatever […]. What the conda-forge tooling should do is check that the […].
You chopped off the part of my quote that accounts for the scenario you describe.
I'm not saying we should disregard runtime constraints. I'm saying we also need to express constraints arising from the feature level - both of those can be attached to the same package without conflict. They stack and the intersection of both is what's actually permissible for the final artefact.
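To make the stacking point concrete, here's a toy sketch. It's illustrative only: the function names are made up, versions are compared as dotted-integer tuples, and real solvers use full version ordering:

```python
def parse(v):
    """Parse a dotted version string into a comparable tuple, e.g. '1.22' -> (1, 22)."""
    return tuple(int(p) for p in v.split("."))

def intersect(range_a, range_b):
    """Intersect two (inclusive-lower, exclusive-upper) version ranges.

    The run-time requirement and the feature-level requirement both attach
    to the package; what's actually permissible is their intersection.
    Returns None when the ranges don't overlap.
    """
    lo = max(range_a[0], range_b[0], key=parse)
    hi = min(range_a[1], range_b[1], key=parse)
    return (lo, hi) if parse(lo) < parse(hi) else None

# e.g. runtime metadata says >=1.22,<3 while the feature level says >=1.25,<3:
intersect(("1.22", "3"), ("1.25", "3"))  # → ("1.25", "3")
```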
I don't see this happening soon enough to be available for the 2.0 transition, it would need work on conda-build AFAICT.
I'm not sure what you mean here. Presumably by "python package" you don't mean "pure python" packages? Anything else that has a run-export (to my knowledge) uses the build-time version as a lower bound. That's precisely the issue that requires attention here, because of the very unusual situation where building against numpy 2.0 produces something compatible with […].
Migration is a-go! :)
Oh, you know, just a little migration... *checks notes*... 2800 affected feedstocks 🫣😅
OK, it looks like that number is not correct - one of the packages that stood out to me was importlib_metadata, which doesn't depend on numpy at all. It's probably not a great sign that this is being picked up.
Just so it doesn't get lost in the notifications, something seems to be off with the python 3.12 pinning. I know it is confusing to do dual migrations at the same time so I'm not sure there is an easy answer.
Yeah, thanks for the note @hmaarrfk & @xhochy. In previous numpy bumps, we always did them for the open python 3.x & pypy migrations as well, and this wasn't an issue. Here however, bumping the 3.12 migrator gets immediately picked up by a given feedstock rerendering, and so runs into breakage if that package isn't yet numpy 2.0-compatible (basically all of them). I undid the changes to the 312 migrator to unbreak this, but I'm kind of unsure how to square this circle now. I think for now the least bad option is to just accept that we don't get numpy 2.0 builds for 3.12. Alternatives might be: […]
With 1., the numpy2 migrator would look like […], while the python312 one would stay untouched.
Next snag: bot isn't opening PRs 😑
We are 87% of the way through the Python 3.12 migration. Maybe we are close to closing this out? Perhaps with a little effort? Wonder if there are other configuration options available to us if we make the NumPy 2 migration depend on the Python 3.12 migration? IOW the "no ice cream until you eat your vegetables" approach 😉
Yeah, there's […].
The migrator status page is somewhat misleading on this - it reports the percentage of "PRs opened", not "PRs closed". So we're actually at ~67% with the 3.12 migration, with about ~900 feedstocks still to go. That's IMO quite a bit too early to close? I guess the only nice alternative to points 1. & 2. above would be to implement something in cf-scripts like […]. I guess there aren't really any other zips that are affected in this way, so it'd be a bit tailor-made for the numpy/python situation. From that POV, maybe we could also just deal with the hassle of closing the superfluous PRs this one time, because once we're on numpy 2.0 (and have dropped 3.8 around October), we can completely remove numpy from the python zip going forward.
FWIW, I think we need to migrate 3.12 as well, as we won't simply be able to switch this one later (due to lots of missing dependencies then). So if we don't do it as part of the overall numpy 2.0 migration, we'd need a separate migrator (now or later) that catches up 3.12 specifically. I feel that a second such migration wastes more time than it would do to deal with the overzealous PRs to python-but-not-numpy-related feedstocks (which by themselves aren't harmful even if merged).
After some discussion, we decided to soft-close the Python 3.12 migration. Meaning that the Python 3.12 migration is still running, but we now also include Python 3.12 in the global pinning. As a result, re-rendering a feedstock could add Python 3.12 (even if its dependencies are not available on Python 3.12 yet), meaning the feedstock would get Python 3.12 builds that would fail. These could simply be ignored for now. Based on the Python 3.12 migration status, this is only relevant for ~11% of feedstocks (admittedly that is ~500 feedstocks). The rest have either already been migrated or have a migration PR. With Python 3.12 now addressed, we restarted the NumPy 2 migration today ( conda-forge/conda-forge-pinning-feedstock#5851 ). Here is the status of the NumPy 2 migration. It may take a bit to start adding PRs to all the feedstocks that can now take a PR. So please be patient 🙂 Thanks all! 🙏
statsmodels is still seeing some NumPy 1.22 builds in my NumPy 2 migration. Have these not been migrated yet? Or will they never be migrated (and so need to be skipped; if so, any pointers would be helpful)? These all fail. All of the NumPy 2-targeted builds pass (as do a few NumPy 1.22 ones, although most of these should be disabled when the recipe is rerendered).
Those are all the PyPy builds, which are not participating in the numpy 2.0 migration (IOW, they should run just as before, but have broken for unrelated reasons). I haven't opened an issue yet, but PyPy support is (unfortunately!) being wound down, so don't hesitate to skip those. I did just that in the statsmodels PR even before seeing your comment here... :)
JFYI NumPy is planning to release 2.0.0 on June 16th.
More updates: […]
Do we still need this issue pinned, or even open?
Nearly half of the feedstocks affected by this migration are in PR or awaiting PR. Also, we are still seeing activity on projects adding NumPy 2 support, and we are still waiting on a NumPy 2-compatible TensorFlow: tensorflow/tensorflow#67291. Maybe we can leave this as-is for a bit longer to see if we can get more of these through.
The migration definitely needs more time, and (while independent) I think this issue should stay open, but we can IMO unpin it if there's any other issue that currently deserves a pinned spot.
TensorFlow 2.18.0 supports NumPy 2; xref: conda-forge/tensorflow-feedstock#408, conda-forge/tensorflow-feedstock#389
NumPy is currently working on a NumPy 2.0 release, which is planned to come out later this year. Here are the current (draft) release notes. Also here's the upstream tracking issue ( numpy/numpy#24300 ) and the ecosystem compatibility tracker.

Some questions worth discussing:

- `numpy` pins in packages (hot-fix packages using NumPy to require `<2`? conda-forge-repodata-patches-feedstock#516)?

Todos:

- `{{ pin_compatible("numpy") }}` regro/cf-scripts#2469
- `pin_compatible("numpy"...)` regro/cf-scripts#2470
- `{{ pin_compatible("numpy") }}` from NumPy docs #2156

cc @conda-forge/core