
refactor requires-python marker simplification #6268

Merged
BurntSushi merged 15 commits into main from ag/reveal-what-is-hidden on Sep 3, 2024

Conversation


@BurntSushi BurntSushi commented Aug 20, 2024

This commit refactors how we deal with requires-python so that instead of
simplifying markers of dependencies inside the resolver, we do it at the
edges of our system. When writing markers to output, we simplify when
there's an obvious requires-python context. And when reading markers
as input, we complexify markers with the relevant requires-python
constraint.
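
To make that concrete, here is a minimal, self-contained sketch of the convention. The helpers below are stand-ins operating on PEP 508 marker strings, not uv's actual API: with a requires-python of `>=3.9`, a clause implied by it is dropped on output ("simplify") and re-attached on input ("complexify").

```rust
// Illustrative stand-ins only; uv's real simplification works on MarkerTree,
// not on strings.
fn simplify(marker: &str, requires_python_clause: &str) -> String {
    // On output, a conjunct that is implied by requires-python can be dropped.
    marker
        .split(" and ")
        .filter(|clause| *clause != requires_python_clause)
        .collect::<Vec<_>>()
        .join(" and ")
}

fn complexify(marker: &str, requires_python_clause: &str) -> String {
    // On input, the requires-python clause is re-attached so the marker stays
    // meaningful outside its original context.
    format!("{requires_python_clause} and {marker}")
}

fn main() {
    let requires_python = "python_full_version >= '3.9'";
    let written = simplify(
        "python_full_version >= '3.9' and sys_platform == 'linux'",
        requires_python,
    );
    assert_eq!(written, "sys_platform == 'linux'");
    assert_eq!(
        complexify(&written, requires_python),
        "python_full_version >= '3.9' and sys_platform == 'linux'"
    );
}
```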

There are some lingering problems though:

  • This PR doesn't leverage the type system as much as I would hope to
    force folks into dealing with marker serialization correctly. It's a
    very subtle problem and I think we should invest in that, but I think
    it would be better in a follow-up PR since this one is already pretty
    big.
  • Some of the snapshot updates are wrong, some are fixing a bug and
    still others require investigation to figure out whether they're
    correct or not.
  • I side-stepped the issue of MarkerTree's hidden state by
    implementing requires-python marker simplification in a cursed way:
    run restrict_versions, serialize the marker to string form and then
    re-parse it. This should ensure any hidden state in the MarkerTree
    is removed. Of course, this isn't what we should do. I think we should
    just remove the hidden state entirely, but I took a shortcut to prove
    out the idea.
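
A toy sketch of why the serialize-and-re-parse shortcut works (this is not uv's MarkerTree; the struct below is a stand-in whose "hidden state" is a python bound that its string form never renders, so a round trip through the string necessarily discards it):

```rust
// Stand-in marker type: the implied python_full_version lower bound is kept
// internally but is not part of the rendered string.
#[derive(Debug)]
struct ToyMarker {
    rendered: String,
    hidden_python_lower_bound: Option<String>,
}

impl ToyMarker {
    fn render(&self) -> String {
        // The hidden bound is deliberately absent from the serialized form.
        self.rendered.clone()
    }

    fn parse(s: &str) -> ToyMarker {
        ToyMarker {
            rendered: s.to_string(),
            hidden_python_lower_bound: None,
        }
    }
}

fn main() {
    let marker = ToyMarker {
        rendered: "sys_platform == 'linux'".to_string(),
        hidden_python_lower_bound: Some("3.11".to_string()),
    };
    // Serialize, then re-parse: any state the string form does not carry is
    // guaranteed to be gone afterwards.
    let round_tripped = ToyMarker::parse(&marker.render());
    assert!(round_tripped.hidden_python_lower_bound.is_none());
    println!("{round_tripped:?}");
}
```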

Fixes #6269, Fixes #6412, Fixes #6836

Comment on lines -7966 to 8666
╰─▶ Because your project depends on datasets<2.19 and datasets>=2.19, we can conclude that your project's requirements are unsatisfiable.
╰─▶ Because only datasets<2.19 is available and your project depends on datasets>=2.19, we can conclude that your project's requirements are unsatisfiable.
"###);

Yeah — this seems wrong in this context.

@zanieb zanieb Aug 20, 2024

Looking at this...

Moving the transformation call so we can see the display before mutation...

diff --git a/crates/uv-resolver/src/error.rs b/crates/uv-resolver/src/error.rs
index 4a38e3e30..dc9eeecf8 100644
--- a/crates/uv-resolver/src/error.rs
+++ b/crates/uv-resolver/src/error.rs
@@ -222,7 +222,6 @@ impl std::fmt::Display for NoSolutionError {
 
         // Transform the error tree for reporting
         let mut tree = self.error.clone();
-        simplify_derivation_tree_markers(&self.python_requirement, &mut tree);
         let should_display_tree = std::env::var_os("UV_INTERNAL__SHOW_DERIVATION_TREE").is_some()
             || tracing::enabled!(tracing::Level::TRACE);
 
@@ -230,6 +229,7 @@ impl std::fmt::Display for NoSolutionError {
             display_tree(&tree, "Resolver derivation tree before reduction");
         }
 
+        simplify_derivation_tree_markers(&self.python_requirement, &mut tree);
         collapse_no_versions_of_workspace_members(&mut tree, &self.workspace_members);

Using a cargo insta test invocation:

UV_INTERNAL__SHOW_DERIVATION_TREE=1 cit unconditional_overlapping_marker_disjoint_version_constraints

We get the following trees:

Resolver derivation tree before reduction
  root==0a0.dev0 depends on project*
      project==0.1.0 depends on datasets{python_full_version >= '3.11'}>=2.19
      no versions of datasets{python_full_version >= '3.11'}>=2.19
    no versions of project<0.1.0 | >0.1.0
Resolver derivation tree after reduction
  project==0.1.0 depends on datasets>=2.19
  no versions of datasets>=2.19

It looks like your simplifications here are fine; the tree is already wrong before then.

Here are the (correct) trees from main:

Resolver derivation tree before reduction
  root==0a0.dev0 depends on project*
      project==0.1.0 depends on datasets>=2.19
      project==0.1.0 depends on datasets<2.19
    no versions of project<0.1.0 | >0.1.0
Resolver derivation tree after reduction
  project==0.1.0 depends on datasets>=2.19
  project==0.1.0 depends on datasets<2.19

Without UV_EXCLUDE_NEWER, the message is entirely unhinged.

Resolver derivation tree before reduction
    root==0a0.dev0 depends on project*
        project==0.1.0 depends on datasets{python_full_version >= '3.11'}>=2.19
        project==0.1.0 depends on datasets<2.19
        no versions of datasets{python_full_version >= '3.11'}>2.19.0, <2.19.1 | >2.19.1, <2.19.2 | >2.19.2, <2.20.0 | >2.20.0, <2.21.0 | >2.21.0
    no versions of project<0.1.0 | >0.1.0
Resolver derivation tree after reduction
    project==0.1.0 depends on datasets>=2.19
    project==0.1.0 depends on datasets<2.19
    no versions of datasets>2.19.0, <2.19.1 | >2.19.1, <2.19.2 | >2.19.2, <2.20.0 | >2.20.0, <2.21.0 | >2.21.0
× No solution found when resolving dependencies:
╰─▶ Because only the following versions of datasets are available:
        datasets<=2.19.0
        datasets==2.19.1
        datasets==2.19.2
        datasets==2.20.0
        datasets==2.21.0
    and your project depends on datasets<2.19, we can conclude that your project and datasets>=2.19.0 are incompatible.
    And because your project depends on datasets>=2.19, we can conclude that your project's requirements are unsatisfiable.


codspeed-hq bot commented Aug 21, 2024

CodSpeed Performance Report

Merging #6268 will not alter performance

Comparing ag/reveal-what-is-hidden (92e68dd) with main (4a6bea5)

Summary

✅ 14 untouched benchmarks

@BurntSushi (Member, Author) commented

Here is my current state: I still cannot fully explain the diff on the transformers ecosystem test and I haven't yet investigated the root cause in the change to the error message.

I've spent a lot of time looking at the transformers test case. I've minimized the pyproject.toml down a bit, although it still has quite a large number of dependencies:

[project]
name = "transformers"
version = "4.39.0.dev0"
requires-python = ">=3.9.0"
dependencies = []

[project.urls]
repository = "https://github.com/huggingface/transformers"

[project.optional-dependencies]
accelerate = [
    "accelerate>=0.21.0",
]
flax = [
    "flax>=0.4.1,<=0.7.0",
]
torch-speech = [
    "librosa",
]
quality = [
    "datasets!=2.5.0",
]
all = [
    "tensorflow>=2.6,<2.16",
    "onnxconverter-common",
    "tensorflow-text<2.16",
    "jax>=0.4.1,<=0.4.13",
]
agents = [
    "opencv-python",
]

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

Then lock this using main. Then mv uv.lock main.lock and re-lock using this PR. You end up with a pretty sizeable diff:

$ diff main.lock pr1.lock | wc -l
434

Here's one example of a difference. This is from main:

[[package]]
name = "tensorflow-macos"
version = "2.15.1"
source = { registry = "https://pypi.org/simple" }
resolution-markers = [
    "python_full_version < '3.10' and platform_machine == 'aarch64' and platform_system == 'Linux'",
]
dependencies = [
    { name = "tensorflow-cpu-aws", marker = "python_full_version < '3.10' and platform_machine == 'aarch64' and platform_system == 'Linux'" },
]
wheels = [
    { url = "https://files.pythonhosted.org/packages/b3/c8/b90dc41b1eefc2894801a120cf268b1f25440981fcf966fb055febce8348/tensorflow_macos-2.15.1-cp310-cp310-macosx_12_0_arm64.whl", hash = "sha256:b8f01d7615fe4ff3b15a12f84471bd5344fed187543c4a091da3ddca51b6dc26", size = 2158 },
    { url = "https://files.pythonhosted.org/packages/bc/11/b73387ad260614ec43c313a630d14fe5522455084abc207fce864aaa3d73/tensorflow_macos-2.15.1-cp311-cp311-macosx_12_0_arm64.whl", hash = "sha256:58fca6399665f19e599c591c421672d9bc8b705409d43ececd0931d1d3bc6a7e", size = 2159 },
    { url = "https://files.pythonhosted.org/packages/3a/54/95b9459cd48d92a0522c8dd59955210e51747a46461bcedb64a9a77ba822/tensorflow_macos-2.15.1-cp39-cp39-macosx_12_0_arm64.whl", hash = "sha256:cca3c9ba5b96face05716792cb1bcc70d84c5e0c34bfb7735b39c65d0334b699", size = 2158 },
]

And this is from this branch:

[[package]]
name = "tensorflow-macos"
version = "2.15.1"
source = { registry = "https://pypi.org/simple" }
resolution-markers = [
    "python_full_version < '3.10' and platform_machine == 'arm64' and platform_system == 'Darwin'",
]
wheels = [
    { url = "https://files.pythonhosted.org/packages/b3/c8/b90dc41b1eefc2894801a120cf268b1f25440981fcf966fb055febce8348/tensorflow_macos-2.15.1-cp310-cp310-macosx_12_0_arm64.whl", hash = "sha256:b8f01d7615fe4ff3b15a12f84471bd5344fed187543c4a091da3ddca51b6dc26", size = 2158 },
    { url = "https://files.pythonhosted.org/packages/bc/11/b73387ad260614ec43c313a630d14fe5522455084abc207fce864aaa3d73/tensorflow_macos-2.15.1-cp311-cp311-macosx_12_0_arm64.whl", hash = "sha256:58fca6399665f19e599c591c421672d9bc8b705409d43ececd0931d1d3bc6a7e", size = 2159 },
    { url = "https://files.pythonhosted.org/packages/3a/54/95b9459cd48d92a0522c8dd59955210e51747a46461bcedb64a9a77ba822/tensorflow_macos-2.15.1-cp39-cp39-macosx_12_0_arm64.whl", hash = "sha256:cca3c9ba5b96face05716792cb1bcc70d84c5e0c34bfb7735b39c65d0334b699", size = 2158 },
]

Here's another example. From main:

[[package]]
name = "flatbuffers"
version = "24.3.25"
source = { registry = "https://pypi.org/simple" }
resolution-markers = [
    "python_full_version < '3.10' and platform_machine == 'aarch64' and platform_system == 'Linux'",
]

From this branch:

[[package]]
name = "flatbuffers"
version = "24.3.25"
source = { registry = "https://pypi.org/simple" }
resolution-markers = [
    "python_full_version < '3.10' and platform_machine == 'arm64' and platform_system == 'Darwin'",
]

This represents a key thing I don't yet understand: why does flatbuffers 24.3.25 end up in only one fork? Why doesn't it end up in both forks in both cases? I don't know.

One thing I do know is that this is related to fork prioritization. If I remove forks.sort() from uv-resolver/src/resolver/mod.rs in both main and this branch, then the differences in the lock file go away.

Because of that, I spent some time trying to make fork prioritization in this branch work in the same way as in main. But I think this is very hard to do because of the different requires-python handling and because of the hidden state for Python versions inside of MarkerTree. I think fork prioritization in this branch, as-is, matches the spirit of what it's trying to do: solve for more constraining python_version requirements before less constraining ones.

Here are my thoughts in summary for where to go from here:

  • Keep trying to discover whether the resolution in this branch is actually incorrect, or if it is just a "different but correct" resolution. If both resolutions are correct, then I'd be tempted to merge (after fixing the error message issue) since this PR does fix other outstanding bugs.
  • I am wondering whether both resolutions are actually incorrect, and perhaps there is a deeper unresolved bug that this PR is exacerbating or just letting manifest in a different way. What leads me to this thought is the shenanigans around tensorflow-macos in both lock files. Moreover, in both cases, dependencies like flatbuffers have a version that ends up in one single fork, but the fork it ends up in is different between this branch and main. Why is that? Should it end up in both forks in both cases?
  • I believe it is correct to say that the instigator for all of this is numpy. Because the dependency tree is so large, there are a number of different numpy dependencies with interesting markers. I believe it is the only reason we fork at all in this case.

This is rather odd overall.

@BurntSushi (Member, Author) commented

I think the resolve_warm_jupyter_universal benchmark is pretty noisy. We should probably be good to merge. With that said, this PR does put more pressure on marker ops that could probably be amortized with some refactoring. For example:

impl Forks {
    fn new(
        name_to_deps: BTreeMap<PackageName, Vec<PubGrubDependency>>,
        python_requirement: &PythonRequirement,
    ) -> Forks {
        let python_marker = python_requirement.to_marker_tree();
        let python_marker = python_marker.as_ref();
        // ... (rest of `Forks::new` elided)
    }
}
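
For illustration, here is a self-contained sketch of the kind of amortization meant here. The types are stand-ins rather than uv's `PythonRequirement` and `MarkerTree`; the only point is that the requires-python marker can be computed once and reused across dependencies instead of being rebuilt repeatedly.

```rust
use std::collections::BTreeMap;

// Stand-in types, only to show the shape of the amortization.
#[derive(Debug)]
struct MarkerTree(String);

struct PythonRequirement {
    specifier: String,
}

impl PythonRequirement {
    // Pretend this conversion is costly enough to be worth amortizing.
    fn to_marker_tree(&self) -> MarkerTree {
        MarkerTree(format!("python_full_version {}", self.specifier))
    }
}

fn main() {
    let python_requirement = PythonRequirement {
        specifier: ">= '3.9'".to_string(),
    };
    let name_to_deps: BTreeMap<&str, Vec<&str>> = BTreeMap::from([
        ("flatbuffers", vec![">=24.3.25"]),
        ("numpy", vec![">=1.26", "<2"]),
    ]);

    // Computed once, outside the per-dependency loop, and reused below.
    let python_marker = python_requirement.to_marker_tree();
    for (name, versions) in &name_to_deps {
        for version in versions {
            println!("{name}{version} under {:?}", python_marker);
        }
    }
}
```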

@BurntSushi force-pushed the ag/reveal-what-is-hidden branch 2 times, most recently from 6fc3d2a to cfbcd7c on September 3, 2024 at 15:25
@BurntSushi force-pushed the ag/reveal-what-is-hidden branch from cfbcd7c to 3e6abec on September 3, 2024 at 17:47
@BurntSushi (Member, Author) commented

I've added regression tests for #6269, #6412 and #6836.

@BurntSushi force-pushed the ag/reveal-what-is-hidden branch from 3e6abec to 9ac6c68 on September 3, 2024 at 18:01
These are regression tests for #6269, #6412 and #6836. In this commit,
their test outputs are all wrong. We'll update these snapshots after
fixing the underlying bug by refactoring how `requires-python`
simplification works.
Conversions from strings to paths are always infallible.
When I first wrote this routine, it was intended to only emit a trace
for the final "unioned" resolution. But we actually moved that semantic
operation to the construction of the resolution *graph*. So there is no
unioned `Resolution` any more.

But this is still useful to see. So I changed this to just emit a trace
of *every* resolution right before constructing the graph.

It might be nice to also emit a trace of the unioned graph too. Or
perhaps we should do that instead if this proves too noisy. (Although
this is only emitted at TRACE level.)
This commit refactors how we deal with `requires-python` so that instead of
simplifying markers of dependencies inside the resolver, we do it at the
edges of our system. When writing markers to output, we simplify when
there's an obvious `requires-python` context. And when reading markers
as input, we complexify markers with the relevant `requires-python`
constraint.
This update changes the error message to one that is worse than the
status quo, but it is still correct because `datasets >= 2.19` doesn't
actually exist given our `EXCLUDE_NEWER` in tests at present.

The underlying cause here seems to be in how PubGrub deals with
reporting incompatibilities. Namely, when it has `foo < 1` and
`foo >= 1`, it reports an incompatibility immediately before looking for
versions. But when it has `foo < 1` and `foo >= 1 ; marker`, then
because they aren't both pubgrub "packages," it starts requesting
versions first and hits the "not available" error path instead of the
"incompatible" error path.

Since this is more of an underlying issue with how we set up
`PubGrubPackage` and our interaction with pubgrub, we ended up deciding
to move forward here with the regression since this PR is fixing a
correctness issue. In particular, if one changes the `requires-python`
to `>=3.8`, then both `main` and this PR produce similarly bad error
messages.
It is not clear whether this update is correct or not. Moreover, it's
not clear whether the status quo is correct or not. The problem is that
`transformers` is so big that it's very hard to understand what the
right output is without a deeper investigation.

One thing that is interesting is if fork prioritization is removed in
this PR *and* on `main`, then the differences in this ecosystem test go
away.

We've decided for now to move forward with this update even though we're
uncertain because this PR fixes a few outstanding correctness issues.
Unlike the previous update, this message is specifically referring to a
fork's markers inside the resolver. We probably *could* massage the
message to be simplified with respect to requires-python, but it's not
obvious to me that that is the right thing to do.
A `Dependency` now has both "simplified" and "complexified" markers, so
just update the snapshots to match the new reality.
The output no longer results in installing two different versions of
astroid unconditionally on Python 3.10.

Fixes #6269
The `tomli` dependency is now included for `python_version <= 3.11`,
which is what is expected.

Fixes #6412
The `importlib-metadata` dependency is no longer unconditionally repeated in the
output for Python 3.10 (or Python 3.7).

Fixes #6836
Previously we were using `[+-~]`, but this includes the full range of
characters from `+` to `~`. Incidentally, this does include `-`. We
instead rewrite this as `[-+~]`, which probably matches the intent.
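
A minimal sketch of the difference, using the `regex` crate (the two pattern strings are the point; everything else is just a harness):

```rust
use regex::Regex;

fn main() {
    // `[+-~]` is a range from '+' (0x2B) to '~' (0x7E): it matches letters,
    // digits, and most punctuation, not just the three intended characters.
    let range = Regex::new(r"[+-~]").unwrap();
    assert!(range.is_match("a"));
    assert!(range.is_match("-"));

    // `[-+~]` puts `-` first, so it is treated literally: only '-', '+',
    // and '~' match.
    let literal = Regex::new(r"[-+~]").unwrap();
    assert!(literal.is_match("-"));
    assert!(!literal.is_match("a"));
}
```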
The key change here is to use raw strings so that we don't need to
double-escape things like `\d`. And in particular, we rely on the fact
that `"\n"` and `r"\n"` are precisely equivalent when fed to
`Regex::new` in the `regex` crate.
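
And a small sketch of the raw-string point: both forms compile to the same matcher, while the raw form avoids double escaping for classes like `\d`.

```rust
use regex::Regex;

fn main() {
    // "\n" puts a literal newline in the pattern; r"\n" passes a backslash
    // and an 'n', which the regex parser also interprets as a newline.
    // Both compile to equivalent matchers.
    let literal_newline = Regex::new("\n").unwrap();
    let raw_newline = Regex::new(r"\n").unwrap();
    assert!(literal_newline.is_match("a\nb"));
    assert!(raw_newline.is_match("a\nb"));

    // Raw strings pay off for classes like \d, which would otherwise need
    // double escaping ("\\d") in an ordinary string literal.
    let digits = Regex::new(r"\d+").unwrap();
    assert!(digits.is_match("uv 0.4.4"));
}
```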
This is to account for slight differences on our Windows CI.
@BurntSushi force-pushed the ag/reveal-what-is-hidden branch 2 times, most recently from b81645e to 08fc8ad on September 3, 2024 at 19:01
And otherwise make the regexes a little more robust.
@BurntSushi force-pushed the ag/reveal-what-is-hidden branch from da57490 to 92e68dd on September 3, 2024 at 22:31
@BurntSushi merged commit 8403a6d into main on Sep 3, 2024
58 checks passed
@BurntSushi deleted the ag/reveal-what-is-hidden branch on September 3, 2024 at 22:41
tmeijn pushed a commit to tmeijn/dotfiles that referenced this pull request Sep 4, 2024
This MR contains the following updates:

| Package | Update | Change |
|---|---|---|
| [astral-sh/uv](https://github.com/astral-sh/uv) | patch | `0.4.0` -> `0.4.4` |

MR created with the help of [el-capitano/tools/renovate-bot](https://gitlab.com/el-capitano/tools/renovate-bot).

**Proposed changes to behavior should be submitted there as MRs.**

---

### Release Notes

<details>
<summary>astral-sh/uv (astral-sh/uv)</summary>

### [`v0.4.4`](https://github.com/astral-sh/uv/blob/HEAD/CHANGELOG.md#044)

[Compare Source](astral-sh/uv@0.4.3...0.4.4)

##### Enhancements

-   Allow customizing the project environment path with `UV_PROJECT_ENVIRONMENT` ([#6834](astral-sh/uv#6834))
-   Warn when `VIRTUAL_ENV` is set but will not be respected in project commands ([#6864](astral-sh/uv#6864))
-   Add `--no-hashes` to `uv export` ([#6954](astral-sh/uv#6954))
-   Make HTTP headers title case for backward compatibility ([#6887](astral-sh/uv#6887))
-   Pin `.python-version` in `uv init` ([#6869](astral-sh/uv#6869))
-   Support `file://` URLs for `UV_PYTHON_INSTALL_MIRROR` ([#6950](astral-sh/uv#6950))
-   Introduce more docker tags for uv ([#6053](astral-sh/uv#6053))

##### Bug fixes

-   Avoid canonicalizing the cache directory ([#6949](astral-sh/uv#6949))
-   Show all PyPy versions in `uv python list --all-versions` ([#6917](astral-sh/uv#6917))
-   Avoid incorrect `requires-python` marker simplifications ([#6268](astral-sh/uv#6268))

##### Documentation

-   Add documentation for `UV_PROJECT_ENVIRONMENT` ([#6987](astral-sh/uv#6987))
-   Add optional dependencies section to the lockfile document ([#6982](astral-sh/uv#6982))
-   Document use of the `file://` scheme in Python installation mirrors ([#6984](astral-sh/uv#6984))
-   Fix outdated references to the help menu documentation in the first steps page ([#6980](astral-sh/uv#6980))
-   Show env option in CLI reference documentation ([#6863](astral-sh/uv#6863))
-   Add bind mount example to `docker.md` ([#6921](astral-sh/uv#6921))

### [`v0.4.3`](https://github.com/astral-sh/uv/blob/HEAD/CHANGELOG.md#043)

[Compare Source](astral-sh/uv@0.4.2...0.4.3)

##### Enhancements

-   Show build backend output when `--verbose` is provided ([#6903](astral-sh/uv#6903))
-   Allow `uv sync --frozen --package` without copying member `pyproject.toml` ([#6943](astral-sh/uv#6943))

##### Bug fixes

-   Avoid panic with missing temporary directory ([#6929](astral-sh/uv#6929))
-   Avoid updating incorrect dependencies for sorted `uv add` ([#6939](astral-sh/uv#6939))
-   Use lower-bound semantics for all Python compatibility comparisons ([#6882](astral-sh/uv#6882))

### [`v0.4.2`](https://github.com/astral-sh/uv/blob/HEAD/CHANGELOG.md#042)

[Compare Source](astral-sh/uv@0.4.1...0.4.2)

##### Enhancements

-   Adding support for `.pyc` files in `uv run` ([#6886](astral-sh/uv#6886))
-   Treat missing `top_level.txt` as non-fatal ([#6881](astral-sh/uv#6881))

##### Bug fixes

-   Fix `is_disjoint` check for supported environments ([#6902](astral-sh/uv#6902))
-   Remove dangling archives in `uv cache clean ${package}` ([#6915](astral-sh/uv#6915))
-   Error when discovered Python is incompatible with `--isolated` workspace ([#6885](astral-sh/uv#6885))
-   Warn when discovered Python is incompatible with PEP 723 script ([#6884](astral-sh/uv#6884))

### [`v0.4.1`](https://github.com/astral-sh/uv/blob/HEAD/CHANGELOG.md#041)

[Compare Source](astral-sh/uv@0.4.0...0.4.1)

##### Enhancements

-   Add `uv export --format requirements-txt` ([#6778](astral-sh/uv#6778))
-   Allow `@` references in `uv tool install --from` ([#6842](astral-sh/uv#6842))
-   Normalize version specifiers by sorting ([#6333](astral-sh/uv#6333))
-   Respect the user's upper-bound in `requires-python` ([#6824](astral-sh/uv#6824))
-   Use Windows registry to discover Python on Windows directly ([#6761](astral-sh/uv#6761))
-   Hint at `--no-workspace` in `uv init` failures ([#6815](astral-sh/uv#6815))
-   Update to last PyPy releases ([#6784](astral-sh/uv#6784))

##### Bug fixes

-   Avoid deadlocks when multiple uv processes lock resources ([#6790](astral-sh/uv#6790))
-   Expand tildes when matching against `PATH` ([#6829](astral-sh/uv#6829))
-   Fix `uv init --no-project` alias ([#6837](astral-sh/uv#6837))
-   Ignore pre-release segments when discovering via `requires-python` ([#6813](astral-sh/uv#6813))
-   Support inline optional tables in `uv add` and `uv remove` ([#6787](astral-sh/uv#6787))
-   Update default `hello.py` to pass `ruff format` ([#6811](astral-sh/uv#6811))
-   Avoid stripping root for user path display ([#6865](astral-sh/uv#6865))
-   Error when user-provided environments are disjoint with Python ([#6841](astral-sh/uv#6841))
-   Retain alphabetical sorting for `pyproject.toml` in `uv add` operations ([#6388](astral-sh/uv#6388))

##### Documentation

-   Add a link to the multiple index docs in the alternative index guide ([#6826](astral-sh/uv#6826))
-   Add docs for inline exclude newer in PEP 723 scripts ([#6831](astral-sh/uv#6831))
-   Enumerate available Docker tags ([#6768](astral-sh/uv#6768))
-   Omit `[pip]` section from configuration file docs ([#6814](astral-sh/uv#6814))
-   Update `project.urls` in `pyproject.toml` ([#6844](astral-sh/uv#6844))
-   Add docs for AWS CodeArtifact usage ([#6816](astral-sh/uv#6816))

##### Other changes

</details>

BurntSushi added a commit that referenced this pull request Sep 6, 2024
I split this change into its own commit because I'm hoping it
crystallizes what it means when we say "a `MarkerTree` has hidden state."
That is, it isn't so much that there is some explicit member of a
`MarkerTree` that is omitted, but rather, the lower and upper version
bounds on `python_full_version` are rewritten as "unbounded" when
traversing the ADD for display.

We will actually retain this functionality, but rejigger it so that it's
explicit when we do this. In particular, this simplification has been
problematic for us because it fundamentally changes the truth tables of
a marker expression *unless* you are extremely careful to interpret it
only under the original context in which it was simplified. This is
quite difficult to do generally, and in prior work in #6268, we
completed a refactor where we worked around this type of simplification
and moved it to the edges of uv.

In subsequent commits, we'll re-implement this form of simplification as
a more explicit step.
BurntSushi added a commit that referenced this pull request Sep 7, 2024