Default version capping included by pixi add? #639
Hey @paugier, we do this deliberately; check out the blog post @wolfv made: https://prefix.dev/blog/the_python_packaging_debate. There is another issue that talks about overriding the version of certain packages: #620. On top of that, we might allow overriding the automatic version specification entirely through a machine-local configuration.
Thanks for the quick reply! I guess you know what you are doing, but I still don't understand the motivation for including strict upper bounds for every dependency. There are cases for which it is certainly incorrect, for example capping dependencies that don't follow semver. But this may be due to my misunderstanding of Pixi.

I'd like to invest time in Pixi mostly to be able to maintain, in the repositories of some Python libraries, what is needed to create/maintain the conda-forge recipe (in particular with a CI check that the conda package builds without issue). So I would keep the dependencies in pixi.toml in sync with what goes into the conda-forge recipe; I don't know if that is the right approach. It seems to me that with upper bounds on dependencies, we have one syntax to say two very different things:
If I publish a conda-forge package with such upper bounds, … Therefore, I feel that there is something wrong with the generalization of strict upper bounds, and even after reading the blog post I don't envision a good short-term solution.

Maybe tools like Pixi should be able to generate, from soft requirements (without upper bounds unless there is a known issue) plus a successful lockfile, a set of strong requirements (with upper bounds) that lead with very high probability to a working environment? Then, one could ask for different things.
We've changed it to a range in this PR: #536, because of this issue: #285. I can't seem to find it, but we have been talking about making the way we deal with this range globally configurable, because I see why this would be different for Python libraries.

That said, it is a helper for the user, and they aren't forced to use this helper requirement; the requirement is in this format to help the user edit it to what they really want the dependency to be. If you specify a spec yourself, it is used as-is.

It would help us if you would describe what your perfect-world solution would look like, with some use-cases.
@paugier Thanks for your input. I think the "strictness" requirement of the dependencies is highly dependent on the ecosystem. Although a minor version bump in a Python package is (not uncommonly) still compatible, this can be very different for C++ packages, where an ABI break may have been silently introduced. It's hard for us to know upfront whether a package is a Python package or not, so we chose to be relatively restrictive. Of course, as a user you are free to alter the requirement any way you please, but for current pixi projects, which are generally not published, I think the strict default suffices.

Like @ruben-arts said, in the future it would be great if we had a project or global configuration that allows configuring the default behavior. We could even write a less restrictive configuration to the project configuration by default when you initialize a Python project.

I would also be interested to know what you think we should default to: no upper bound at all, or an upper bound on the major version?
I'd be interested to know how that could be done, because that would be awesome! I see some issues, because most incompatibilities are discovered at run or build time, not while resolving an environment.
Yes, it makes sense!
Yes, sure! I was not explicit enough when I wrote "a successful lockfile": I meant a lockfile for which all the tests succeed. Wouldn't it be enough to have minimum versions in pixi.toml, plus upper bounds for known issues, plus the date of the last lockfile that led to a successful build and passing tests? With this data, it should be possible to compute a working environment, shouldn't it? We would just need a command to tell Pixi that a given lockfile leads to a successful build and passing tests. I remember that I was able to recreate working environments with https://pypi.org/project/pypi-timemachine/
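The "world as of a date" idea can be sketched in a few lines. This is only a toy illustration of the principle behind tools like pypi-timemachine, not pixi's implementation; the release metadata and helper name are hypothetical:

```python
from datetime import date

# Hypothetical release metadata: package -> list of (version, release_date).
RELEASES = {
    "numpy": [("1.24.0", date(2022, 12, 18)), ("1.26.0", date(2023, 9, 16))],
    "scipy": [("1.10.0", date(2023, 1, 3)), ("1.11.0", date(2023, 6, 25))],
}

def resolve_as_of(cutoff: date) -> dict[str, str]:
    """Pick, per package, the newest version released on or before `cutoff`."""
    env = {}
    for pkg, releases in RELEASES.items():
        candidates = [(v, d) for v, d in releases if d <= cutoff]
        if candidates:
            # Newest release that existed at the cutoff date.
            env[pkg] = max(candidates, key=lambda vd: vd[1])[0]
    return env

print(resolve_as_of(date(2023, 3, 1)))
# {'numpy': '1.24.0', 'scipy': '1.10.0'}
```

Combined with minimum versions and known-bad caps from the manifest, replaying resolution at the date of the last known-good lockfile would reproduce an environment very close to the one that passed the tests.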
Pixi looks great, thanks for that! I'm also very confused by this automatic pinning, because I expect the CLI to help me fill in the toml file to reflect my specification, not to add extra information from the resolution process that I didn't want to put there.
Would it be acceptable to pass an extra flag to the `pixi add` command? It would also make sense to have a `--pin` flag. I'm also confused that we can't have packages without a version spec in dependency lists.
The reason we do this is that if you do not specify a spec, we basically take the "current" version, which is pinned exactly in the lock-file. After that, a developer starts building their software using that version. Later, once you add more packages or remove the lock-file, this could completely break if the spec is very "loose", due to the chaotic nature of package resolution. Without a spec you could end up with a wildly different version than what you initially developed with, which can cause annoying dependency conflicts.

With the current approach we try to have a middle ground for people that don't really care too much, but where there is still the possibility of flexibility and simple upgrade paths without (semver) breakage. We think it's a sane default, which is not uncommon in other package managers (cargo does the same thing, and so does any node package manager). Note that if you do specify a spec alongside your package name, that spec is used verbatim (e.g. "cmake *" will result in exactly that spec being added).

That being said, I do like your suggestion of the --pin flag! That sounds like a good approach for people to modify the default behavior if so desired.

As for having to always specify a requirement: this is on purpose too. I think it's very important to think about the bounds you provide. From my personal experience, having too-loose bounds will always bite you later.
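The default described here, pinning from the resolved version up to the next semver-breaking release, follows the usual caret rule (the same one cargo uses). A minimal sketch of that rule, assuming simple `major.minor.patch` version strings and not taken from pixi's actual code:

```python
def semver_pin(version: str) -> str:
    """Build a '>=current, <next-breaking' spec from a resolved version.

    Under semver, the next breaking release is the next major version,
    or the next minor version while the major version is still 0.
    """
    parts = [int(p) for p in version.split(".")]
    major = parts[0]
    minor = parts[1] if len(parts) > 1 else 0
    upper = f"{major + 1}" if major > 0 else f"0.{minor + 1}"
    return f">={version},<{upper}"

print(semver_pin("3.12.1"))  # >=3.12.1,<4
print(semver_pin("0.4.2"))   # >=0.4.2,<0.5
```

The 0.x special case matters in practice: many Python and Rust libraries treat 0.x minor bumps as breaking, which is exactly why a plain "less than next major" rule would be too loose for them.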
As I typically run commands like …
After careful consideration and the feedback here, we conclude:
Let us know what you think: do you agree on the naming? Feel free to pick this up! If we get around to it before then, we will assign someone to this issue.
I agree on the naming, thanks for the update.
Won't …? Maybe … I'm also wondering whether we might persist the choices in …
The … Do you mean …? I think the potential …
I would also prefer changing the logic.
Notice that …
I see, that sounds like a good idea!
Awesome! Thanks a lot!
I think being able to specify one's favorite pinning style should be configurable in the global configuration file:

```toml
# config.toml
default-pin = "semver"  # or "major", "minor", "exact", "unconstrained"
```
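To make the proposed values concrete, here is a rough sketch of what each pinning style could produce for a resolved version. This is my interpretation of the names, not pixi's definitions, and the `pin` helper is hypothetical:

```python
def pin(version: str, style: str) -> str:
    """Turn a resolved version into a requirement spec for a given pin style."""
    major, minor, *_ = (int(p) for p in version.split("."))
    if style == "exact":
        return f"=={version}"
    if style == "major":
        return f">={version},<{major + 1}"
    if style == "minor":
        return f">={version},<{major}.{minor + 1}"
    if style == "semver":
        # Like "major", but treat 0.x minor bumps as breaking.
        return f">={version},<{major + 1}" if major > 0 else f">={version},<0.{minor + 1}"
    if style == "unconstrained":
        return "*"
    raise ValueError(f"unknown style: {style}")

for style in ("semver", "major", "minor", "exact", "unconstrained"):
    print(style, "->", pin("1.4.2", style))
```

For versions with major > 0, "semver" and "major" coincide; they only differ for 0.x releases, which is where the distinction earns its keep.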
Yeah, exactly! I'd like that as well! @baszalmstra is working on …
Re-opening; I missed the rest of the discussion here before, as multiple features are being discussed.
Problem description
pixi automatically adds upper bounds for all dependencies. I don't think this is a reasonable choice for Python libraries (see for example https://iscinumpy.dev/post/bound-version-constraints/). I used Poetry, which also does this, and had to switch to PDM to avoid this behavior.
pixi even includes upper bounds for dependencies not using semver...