Comparing changes

pypa/pip — base: 24.3.1 → head: 25.0

Commits on Oct 27, 2024

  1. Bump for development

    sbidoul committed Oct 27, 2024 (0f9e238)
  2. Merge pull request #13049 from sbidoul/release/24.3.1

    Release/24.3.1
    sbidoul authored Oct 27, 2024 (4204359)

Commits on Nov 3, 2024

  1. Skip self version check on EXTERNALLY-MANAGED environments

    ichard26 committed Nov 3, 2024 (69533e3)
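
PEP 668 marks distro-managed environments with an `EXTERNALLY-MANAGED` marker file in the interpreter's stdlib directory. A minimal sketch of such a check (hypothetical helper name, not pip's internal code):

```python
import sysconfig
from pathlib import Path

def is_externally_managed() -> bool:
    """Return True if this environment is marked as externally managed
    per PEP 668 (illustrative helper, not pip's actual implementation)."""
    marker = Path(sysconfig.get_path("stdlib")) / "EXTERNALLY-MANAGED"
    return marker.is_file()

# The self version check (and its "upgrade pip" hint) is pointless in such
# environments, since `pip install --upgrade pip` would be refused anyway.
if is_externally_managed():
    print("skipping self version check")
```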

Commits on Nov 9, 2024

  1. Override rich.console pipe handler for rich 13.8.0+

    Explicitly override `rich.console.Console.on_broken_pipe()` to reraise
    the original exception, to bring the behavior of rich 13.8.0+ in line
    with older versions. The new versions close the output file descriptors
    and exit with an error instead, which prevents pip's pipe handler from firing.
    This is the minimal change needed to make pip's test suite pass after
    upgrading vendored rich.
    
    Bug #13006
    Bug #13072
    mgorny committed Nov 9, 2024 (099ae97)
  2. Merge pull request #13073 from mgorny/rich-pipe-handling

    uranusjr authored Nov 9, 2024 (fe0925b)
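
The override described in the pipe-handler commit above can be sketched without importing rich itself; the stand-in base class below mimics the rich ≥ 13.8.0 behavior of exiting on a broken pipe (names and details are illustrative, not pip's actual code):

```python
class BaseConsole:
    """Stand-in for rich.console.Console: rich >= 13.8.0 closes output
    and exits on a broken pipe instead of propagating the error."""

    def on_broken_pipe(self) -> None:
        raise SystemExit(1)

class PipConsole(BaseConsole):
    """Restore the pre-13.8.0 behavior by reraising, so the caller's own
    pipe handler can fire (a sketch of the approach, not pip's class)."""

    def on_broken_pipe(self) -> None:
        raise BrokenPipeError() from None

try:
    PipConsole().on_broken_pipe()
except BrokenPipeError:
    print("pip's pipe handler can take over")
```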

Commits on Dec 7, 2024

  1. Accommodate for recent pathname2url() changes upstream

    - UNC paths converted to URLs now start with two slashes, like earlier (yes, really)
    - Trailing slashes are now preserved on Windows, matching POSIX behaviour
    ichard26 committed Dec 7, 2024 (5beed92)
  2. Merge pull request #13105 from ichard26/windows-paths

    Accommodate for recent pathname2url() changes upstream
    sbidoul authored Dec 7, 2024 (a75dad5)
  3. Import self version check eagerly in install command to fix RCE (#13085)

    The comment was preserved as it is still relevant, but a note about
    preventing arbitrary code execution was added. See #13079 for the
    security bug report.
    
    Signed-off-by: Caleb Brown <calebbrown@google.com>
    calebbrown authored Dec 7, 2024 (634bf25)
  4. Remove gone_in for issue 11859

    sbidoul authored and ichard26 committed Dec 7, 2024 (60cba9c)
  5. pre-commit autoupdate (#12898)

    updates:
    - [github.com/pre-commit/pre-commit-hooks: v4.6.0 → v5.0.0](pre-commit/pre-commit-hooks@v4.6.0...v5.0.0)
    - [github.com/psf/black-pre-commit-mirror: 24.4.2 → 24.10.0](psf/black-pre-commit-mirror@24.4.2...24.10.0)
    - [github.com/astral-sh/ruff-pre-commit: v0.5.6 → v0.8.1](astral-sh/ruff-pre-commit@v0.5.6...v0.8.1)
    - [github.com/pre-commit/mirrors-mypy: v1.12.1 → v1.13.0](pre-commit/mirrors-mypy@v1.12.1...v1.13.0)
    
    Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
    pre-commit-ci[bot] authored Dec 7, 2024 (07c7a14)
  6. dc6c4f3
  7. Upgrade msgpack to 1.1.0

    ichard26 committed Dec 7, 2024 (c0900b8)
  8. 0904ed7
  9. Upgrade idna to 3.10

    ichard26 committed Dec 7, 2024 (c530f32)
  10. Upgrade rich to 13.9.4

    ichard26 committed Dec 7, 2024 (0daddaf)
  11. Upgrade packaging to 24.2

    ichard26 committed Dec 7, 2024 (2845a52)
  12. Upgrade tomli to 2.2.1

    ichard26 committed Dec 7, 2024 (79148f1)
  13. Convert more record classes to dataclasses (#12659)

    - Removes BestCandidateResult's iter_all() and iter_applicable()
      methods as they were redundant
    - Removes ParsedLine's is_requirement attribute as it was awkward to use
      (to please mypy, you would need to add asserts on .requirement)
    - Removes ParsedRequirement's defaults as they conflict with slots (Python
      3.10 dataclasses have a built-in workaround that we can't use yet...)
    ichard26 authored Dec 7, 2024 (8dbbb2e)
  14. Correct typos in docs and code comments (#13032)

    Also move "Selected quotes from research participants" out of a random paragraph.
    JoshuaPerdue authored Dec 7, 2024 (a194b45)
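
The slots-versus-defaults conflict noted in the dataclass commit above is easy to reproduce; a minimal illustration (the class names below are not pip's):

```python
from dataclasses import dataclass

@dataclass
class WithSlots:
    """Manual __slots__ works fine as long as no field has a default."""
    __slots__ = ("requirement",)
    requirement: str

# A default value is stored as a class attribute, which collides with the
# slot descriptor of the same name at class-creation time.
try:
    @dataclass
    class Broken:
        __slots__ = ("requirement",)
        requirement: str = ""
except ValueError as exc:
    assert "conflicts" in str(exc)

# Python 3.10+ sidesteps this with @dataclass(slots=True), which builds a
# fresh class; pip couldn't use it yet while still supporting 3.8/3.9.
```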

Commits on Dec 8, 2024

  1. Merge pull request #13074 from ichard26/vendoring-bumps-25.0

    Vendoring bumps for 25.0
    sbidoul authored Dec 8, 2024 (4599fc7)
  2. Merge pull request #13106 from sbidoul/rm-gone-in-for-11859-sbi

    Remove gone_in for issue 11859 (deprecation of --build-option and --global-option)
    sbidoul authored Dec 8, 2024 (c432c33)
  3. Merge pull request #13064 from ichard26/block-upgrade-prompt

    Skip self version check on EXTERNALLY-MANAGED environments
    pradyunsg authored Dec 8, 2024 (2324303)

Commits on Dec 9, 2024

  1. Inherit HTTP cache file read/write permissions from cache directory (#13070)
    
    The NamedTemporaryFile class used to create HTTP cache files is hard-coded to use
    file mode 600 (owner read/write only). This makes it impossible to share a pip
    cache with other users.
    
    With this patch, once a cache file is committed, its permissions are updated to
    inherit the read/write permissions of the cache directory. As before, the owner
    read/write permissions will always be set to avoid a completely unusable cache.
    
    Co-authored-by: Richard Si <sichard26@gmail.com>
    JustinVanHeek and ichard26 authored Dec 9, 2024 (667acf4)

Commits on Dec 10, 2024

  1. Remove unused news file GHA workflow and bot config (#13107)

    The PSF Chronographer essentially replaces the news file check GHA
    workflow. And we haven't used the triage-new-issues bot in ages.
    ichard26 authored Dec 10, 2024 (d3ac6a2)
  2. Fix options and default in --keyring-provider help (#13110)

    The `auto` mode was added a while ago, but the option help was not
    updated to reflect this.
    radoering authored Dec 10, 2024 (947917b)

Commits on Dec 12, 2024

  1. Return file size along with count on cache purge or removal (#13108)

    charwick authored Dec 12, 2024 (8936fee)

Commits on Dec 14, 2024

  1. Pass CA and client TLS certificates to build subprocesses (#13063)

    The _PIP_STANDALONE_CERT environment variable hack is no longer required
    as pip doesn't run a zip archive of itself to provision build dependencies
    these days (which due to a CPython bug would leave behind temporary certifi
    files).
    
    Some people do depend on this private envvar in the wild, so the removal
    has been called out in the news entry.
    ichard26 authored Dec 14, 2024 (34fc0e2)

Commits on Dec 16, 2024

  1. Add missing space in running as root warning message (#13116)

    tifv authored Dec 16, 2024 (90add48)
  2. Rerun time based retry tests to avoid flaky failures (#12869)

    Also increase the time tolerance to account for more extreme variation.
    ichard26 authored Dec 16, 2024 (3b91f42)

Commits on Dec 21, 2024

  1. Remove redundant prefix in failed PEP 517 builds error message

    The error handling logic will add the ERROR: prefix already. Including one in the error message results in two ERROR: prefixes.
    
        ERROR: ERROR: Failed to build installable wheels for some pyproject.toml based projects (simplewheel, singlemodule)
    ichard26 authored Dec 21, 2024 (e9f58aa)
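
The doubled prefix arises whenever the message already carries its own "ERROR: " and the error-handling layer prepends another; a tiny illustration of the before/after (the function below is a stand-in, not pip's logging code):

```python
def render_error(message: str) -> str:
    # The error-handling layer adds the prefix itself (illustrative).
    return f"ERROR: {message}"

# Before the fix: the message arrived with its own prefix baked in.
assert render_error("ERROR: Failed to build wheels").count("ERROR:") == 2
# After the fix: plain message, a single prefix.
assert render_error("Failed to build wheels").count("ERROR:") == 1
```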

Commits on Dec 22, 2024

  1. Merge pull request #13122 from pypa/ichard26-patch-1

    Remove redundant prefix in failed PEP 517 builds error message
    sbidoul authored Dec 22, 2024 (9626dec)
  2. Speed up nox docs sessions (#13118)

    * Use --jobs auto in docs nox sessions
    
    On my 8-core system, a clean cold build takes 5-6 seconds instead of
    10-11 seconds. Nothing major, but it's a welcome QoL improvement. Note
    that this flag has no effect on Windows and will be ignored there.
    
    * Stop installing pip twice in docs nox sessions
    
    At some point, session.install("pip") in the docs and docs-live nox
    sessions was changed to install pip in editable mode, presumably to
    enable reruns w/o dependency installation (-R flag) to pick up changes
    for our pip sphinx extension. This has no effect, though, because pip
    is reinstalled normally, as it's declared in docs/requirements.txt.
    
    I think it's a fair compromise that if you want to pick up changes in
    pip's source that show up in the documentation, you should not be using
    the -R nox flag.
    ichard26 authored Dec 22, 2024 (7c218b9)

Commits on Dec 24, 2024

  1. test: Skip build/install steps with nox's --no-install flag

    This saves time when you want to rerun the test suite with different
    pytest arguments but you haven't made any code changes in-between.
    ichard26 committed Dec 24, 2024 (c340d7e)

Commits on Dec 25, 2024

  1. CI: micro-optimize test collection & pass nox's --no-install

    Pytest can be pretty slow to collect pip's entire test suite and
    prepare for test execution. I've observed a ~15s delay from invoking
    pytest to the first test running in CI in the worst case. This can be
    improved by reducing how many files pytest has to process while
    collecting tests. In short, passing tests/unit is faster than -m unit.
    
    In addition, use nox's --no-install flag to skip redundant build and
    install steps on the 2nd nox session invocation (for the integration
    tests), which was made possible by the previous commit.
    ichard26 committed Dec 25, 2024 (5ce1145)
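
Combined, the two commits above enable rerun workflows like the following (illustrative invocations; the session name follows pip's noxfile, and `--no-install` only takes effect when the session virtualenv is reused):

```shell
# First run: builds pip, installs it plus the test dependencies, and
# collects only tests/unit (faster than letting pytest scan the whole
# tree and filter with -m unit).
nox -s test-3.13 -- tests/unit --verbose --numprocesses auto

# Rerun with different pytest arguments: reuse the existing virtualenv
# (-R) and skip the redundant build/install steps (--no-install).
nox -R --no-install -s test-3.13 -- tests/functional -k "not test_install"
```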

Commits on Dec 26, 2024

  1. Merge pull request #13126 from ichard26/microoptimize-ci

    Faster test session reruns & microoptimize CI
    ichard26 authored Dec 26, 2024 (c10dda5)

Commits on Dec 28, 2024

  1. Remove section about non-existing --force-keyring flag (#12455)

    I must have messed up while merging/rebasing at some point...
    Darsstar authored Dec 28, 2024 (dd6c4ad)

Commits on Dec 29, 2024

  1. ci: use much faster D: drive for TEMP on Windows (#13129)

    This is apparently an inherent limitation of Azure (which powers GHA):
    the OS lives on a slow, read-optimized C: drive, while the D: drive is
    fast working space.
    
    A Dev Drive/ReFS volume was considered, but after a fair bit of testing,
    it offered a smaller improvement in Windows CI times than simply
    moving TEMP to the D: drive.
    ichard26 authored Dec 29, 2024 (f8f0f5a)

Commits on Dec 31, 2024

  1. Trim pyproject.toml and MANIFEST.in (#13137)

    This mostly removes legacy references to files that do not exist
    anymore. In addition, the smarter exclude_also coverage option is
    used instead of exclude_lines.
    ichard26 authored Dec 31, 2024 (bc553db)

Commits on Jan 5, 2025

  1. perf: Avoid unnecessary URL processing while parsing links (#13132)

    There are three optimizations in this commit, in descending order of
    impact:
    
    - If the file URL in the "project detail" response is already absolute,
      then avoid calling urljoin() as it's expensive (mostly because it
      calls urlparse() on both of its URL arguments) and does nothing. While
      it'd be more correct to check whether the file URL has a scheme, we'd
      need to parse the URL which is what we're trying to avoid in the first
      place. Anyway, by simply checking if the URL starts with http[s]://,
      we can avoid slow urljoin() calls for PyPI responses.
    
    - Replacing urllib.parse.urlparse() with urllib.parse.urlsplit() in
      _ensure_quoted_url(). The URL parsing functions are equivalent for our
      needs[^1]. However, urlsplit() is faster, and we achieve better cache
      utilization of its internal cache if we call it directly[^2].
    
    - Calculating the Link.path property in advance as it's very hot.
    
    [^1]: we don't care about URL parameters AFAIK (which are different than
      the query component!)
    
    [^2]: urlparse() calls urlsplit() internally, but it passes the authority
      parameter (unlike any of our calls) so it bypasses the cache.
    
    Co-authored-by: Stéphane Bidoul <stephane.bidoul@acsone.eu>
    ichard26 and sbidoul authored Jan 5, 2025 (ffbf6f0)
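
The first two optimizations above can be sketched with the standard library (the helper name is hypothetical; pip's real code lives in its link collector):

```python
from urllib.parse import urljoin, urlsplit

def absolute_file_url(base_url: str, file_url: str) -> str:
    """Fast path from the first bullet: skip the expensive urljoin()
    when the index response already contains an absolute http(s) URL,
    as PyPI responses do (illustrative sketch)."""
    if file_url.startswith(("https://", "http://")):
        return file_url  # already absolute: no URL parsing needed
    return urljoin(base_url, file_url)

# Second bullet: urlsplit() skips the rarely-used "parameters" component,
# is faster than urlparse(), and hits its internal cache directly.
parts = urlsplit("https://pypi.org/simple/pip/")
assert (parts.scheme, parts.netloc, parts.path) == ("https", "pypi.org", "/simple/pip/")
```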

Commits on Jan 10, 2025

  1. Switch to ubuntu-22.04 for github workflow

    notatallshaw committed Jan 10, 2025 (eafc29e)
  2. NEWS ENTRY

    notatallshaw committed Jan 10, 2025 (c93a9c0)
  3. Merge pull request #13152 from notatallshaw/ubuntu-22.04

    Switch to ubuntu-22.04 for github workflow
    notatallshaw authored Jan 10, 2025 (285ff72)

Commits on Jan 11, 2025

  1. ci: run zipapp tests on M1 macOS (#13130)

    The macos-latest runner is significantly faster than even the
    ubuntu-latest runners (11 minutes vs 17 minutes). Once the Windows jobs
    are made faster in a separate commit, we should have ~15 minute CI. ✨
    ichard26 authored Jan 11, 2025 (cb56edb)
  2. Fix mypy 1.14.1 error (#13148)

    notatallshaw authored Jan 11, 2025 (23e9222)
  3. pre-commit autoupdate: ruff (#13144)

    updates:
    - [github.com/astral-sh/ruff-pre-commit: v0.8.2 → v0.8.6](astral-sh/ruff-pre-commit@v0.8.2...v0.8.6)
    
    Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
    pre-commit-ci[bot] authored Jan 11, 2025 (39be130)

Commits on Jan 12, 2025

  1. Pass --proxy to build subprocesses (#13124)

    Similar to --cert and --client-cert, the --proxy flag was not passed
    down to the isolated build environment. This was simply an oversight.
    
    I opted to store the original proxy string in a new attribute on the
    session as digging into the .proxies dictionary felt janky, and so did
    passing the proxy string to the finder as an argument.
    
    Co-authored-by: lcmartin <luis.martinez@collins.com>
    ichard26 and martinezlc99 authored Jan 12, 2025 (d1c0dad)
  2. Support PEP 639 License-Expression and License-File in JSON output (#13134)
    
    Adds PEP 639 support to `pip inspect` and `pip install --report`.
    sbidoul authored Jan 12, 2025 (394e032)
  3. Add non-functional WheelDistribution.locate_file() method (#11685)

    importlib.metadata.Distribution was always meant to be a proper ABC with
    API enforcement, but this enforcement was never added (probably due to
    Python 2.7 compatibility concerns). Upstream would like to fix this
    wart, so let's define a locate_file() method that simply raises
    NotImplementedError as permitted.
    
    Co-authored-by: Stéphane Bidoul <stephane.bidoul@gmail.com>
    Co-authored-by: Richard Si <sichard26@gmail.com>
    3 people authored Jan 12, 2025 (d2bb8eb)
  4. Upgrade pyproject-hooks to 1.2.0 (#13125)

    ichard26 authored Jan 12, 2025 (dafc095)
  5. Deprecate Python 2 relic --no-python-version-warning

    ichard26 committed Jan 12, 2025 (d18fca1)
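
The `locate_file()` commit above targets the `importlib.metadata.Distribution` ABC, which declares `read_text()` and `locate_file()` as the methods subclasses must provide. A minimal subclass showing the shape (illustrative only, not pip's `WheelDistribution`):

```python
from importlib.metadata import Distribution

class InMemoryWheelDistribution(Distribution):
    """A distribution inspected without an on-disk layout (sketch)."""

    def read_text(self, filename):
        return None  # metadata loading elided in this sketch

    def locate_file(self, path):
        # There is no installed layout to resolve paths against, so
        # refuse, as the upstream ABC explicitly permits.
        raise NotImplementedError(f"cannot locate {path!r} in a wheel")

dist = InMemoryWheelDistribution()
try:
    dist.locate_file("METADATA")
except NotImplementedError as exc:
    assert "METADATA" in str(exc)
```
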
Showing 149 changed files with 14,968 additions and 13,122 deletions.
  1. +5 −1 .github/dependabot.yml
  2. +0 −7 .github/triage-new-issues.yml
  3. +34 −44 .github/workflows/ci.yml
  4. +0 −25 .github/workflows/news-file.yml
  5. +44 −0 .github/workflows/release.yml
  6. +5 −5 .pre-commit-config.yaml
  7. +7 −0 AUTHORS.txt
  8. +4 −6 MANIFEST.in
  9. +67 −2 NEWS.rst
  10. +67 −0 build-project.py
  11. +2 −0 build-requirements.in
  12. +24 −0 build-requirements.txt
  13. +5 −9 docs/html/conf.py
  14. +1 −1 docs/html/development/architecture/command-line-interface.rst
  15. +1 −1 docs/html/development/architecture/overview.rst
  16. +2 −5 docs/html/development/release-process.rst
  17. +3 −3 docs/html/reference/requirements-file-format.md
  18. +0 −19 docs/html/topics/authentication.md
  19. +1 −1 docs/html/ux-research-design/research-results/improving-pips-documentation.md
  20. +1 −1 docs/html/ux-research-design/research-results/personas.md
  21. +3 −3 docs/html/ux-research-design/research-results/pip-force-reinstall.md
  22. +2 −2 docs/html/ux-research-design/research-results/prioritizing-features.md
  23. +3 −2 docs/html/ux-research-design/research-results/users-and-security.md
  24. +1 −0 docs/requirements.txt
  25. +16 −8 noxfile.py
  26. +4 −6 pyproject.toml
  27. +1 −1 src/pip/__init__.py
  28. +6 −2 src/pip/_internal/build_env.py
  29. +9 −0 src/pip/_internal/cli/base_command.py
  30. +2 −2 src/pip/_internal/cli/cmdoptions.py
  31. +1 −0 src/pip/_internal/cli/index_command.py
  32. +1 −1 src/pip/_internal/cli/progress_bars.py
  33. +4 −1 src/pip/_internal/commands/cache.py
  34. +8 −7 src/pip/_internal/commands/install.py
  35. +8 −1 src/pip/_internal/commands/show.py
  36. +1 −1 src/pip/_internal/configuration.py
  37. +41 −32 src/pip/_internal/index/package_finder.py
  38. +2 −2 src/pip/_internal/metadata/__init__.py
  39. +2 −0 src/pip/_internal/metadata/_json.py
  40. +7 −1 src/pip/_internal/metadata/importlib/_dists.py
  41. +23 −9 src/pip/_internal/models/link.py
  42. +12 −0 src/pip/_internal/network/cache.py
  43. +1 −0 src/pip/_internal/network/session.py
  44. +1 −0 src/pip/_internal/operations/build/metadata_editable.py
  45. +11 −13 src/pip/_internal/operations/freeze.py
  46. +1 −1 src/pip/_internal/pyproject.py
  47. +92 −43 src/pip/_internal/req/req_file.py
  48. +2 −2 src/pip/_internal/req/req_install.py
  49. +1 −1 src/pip/_internal/resolution/resolvelib/factory.py
  50. +9 −1 src/pip/_internal/self_outdated_check.py
  51. +0 −36 src/pip/_internal/utils/encoding.py
  52. +8 −1 src/pip/_internal/utils/logging.py
  53. +15 −14 src/pip/_internal/utils/misc.py
  54. +1 −0 src/pip/_internal/utils/packaging.py
  55. +1 −1 src/pip/_internal/utils/unpacking.py
  56. +2 −1 src/pip/_vendor/cachecontrol/__init__.py
  57. +2 −2 src/pip/_vendor/cachecontrol/adapter.py
  58. +1 −0 src/pip/_vendor/cachecontrol/cache.py
  59. +1 −1 src/pip/_vendor/cachecontrol/caches/file_cache.py
  60. +1 −0 src/pip/_vendor/cachecontrol/controller.py
  61. +2 −2 src/pip/_vendor/cachecontrol/filewrapper.py
  62. +4 −1 src/pip/_vendor/cachecontrol/heuristics.py
  63. +2 −1 src/pip/_vendor/idna/__init__.py
  64. +31 −27 src/pip/_vendor/idna/codec.py
  65. +6 −4 src/pip/_vendor/idna/compat.py
  66. +161 −119 src/pip/_vendor/idna/core.py
  67. +3,537 −3,539 src/pip/_vendor/idna/idnadata.py
  68. +7 −4 src/pip/_vendor/idna/intranges.py
  69. +1 −2 src/pip/_vendor/idna/package_data.py
  70. +8,261 −8,178 src/pip/_vendor/idna/uts46data.py
  71. +8 −8 src/pip/_vendor/msgpack/__init__.py
  72. +5 −3 src/pip/_vendor/msgpack/ext.py
  73. +29 −51 src/pip/_vendor/msgpack/fallback.py
  74. +2 −2 src/pip/_vendor/packaging/__init__.py
  75. +4 −4 src/pip/_vendor/packaging/_elffile.py
  76. +1 −0 src/pip/_vendor/packaging/_manylinux.py
  77. +145 −0 src/pip/_vendor/packaging/licenses/__init__.py
  78. +759 −0 src/pip/_vendor/packaging/licenses/_spdx.py
  79. +15 −9 src/pip/_vendor/packaging/markers.py
  80. +83 −24 src/pip/_vendor/packaging/metadata.py
  81. +19 −8 src/pip/_vendor/packaging/specifiers.py
  82. +15 −25 src/pip/_vendor/packaging/tags.py
  83. +33 −44 src/pip/_vendor/packaging/utils.py
  84. +26 −7 src/pip/_vendor/packaging/version.py
  85. +14 −10 src/pip/_vendor/platformdirs/__init__.py
  86. +1 −1 src/pip/_vendor/platformdirs/android.py
  87. +6 −0 src/pip/_vendor/platformdirs/api.py
  88. +14 −0 src/pip/_vendor/platformdirs/macos.py
  89. +0 −6 src/pip/_vendor/platformdirs/unix.py
  90. +2 −2 src/pip/_vendor/platformdirs/version.py
  91. +0 −1 src/pip/_vendor/pyproject_hooks.pyi
  92. +17 −9 src/pip/_vendor/pyproject_hooks/__init__.py
  93. +0 −8 src/pip/_vendor/pyproject_hooks/_compat.py
  94. +181 −101 src/pip/_vendor/pyproject_hooks/_impl.py
  95. +5 −2 src/pip/_vendor/pyproject_hooks/_in_process/__init__.py
  96. +113 −77 src/pip/_vendor/pyproject_hooks/_in_process/_in_process.py
  97. 0 src/pip/_vendor/pyproject_hooks/py.typed
  98. +1 −8 src/pip/_vendor/requests/certs.py
  99. +0 −2 src/pip/_vendor/rich/_inspect.py
  100. +1 −1 src/pip/_vendor/rich/_null_file.py
  101. +3 −4 src/pip/_vendor/rich/_win32_console.py
  102. +1 −0 src/pip/_vendor/rich/align.py
  103. +1 −0 src/pip/_vendor/rich/ansi.py
  104. +30 −23 src/pip/_vendor/rich/cells.py
  105. +2 −2 src/pip/_vendor/rich/color.py
  106. +55 −27 src/pip/_vendor/rich/console.py
  107. +2 −1 src/pip/_vendor/rich/default_styles.py
  108. +1 −2 src/pip/_vendor/rich/filesize.py
  109. +1 −1 src/pip/_vendor/rich/highlighter.py
  110. +1 −1 src/pip/_vendor/rich/live.py
  111. +8 −0 src/pip/_vendor/rich/logging.py
  112. +5 −5 src/pip/_vendor/rich/padding.py
  113. +13 −7 src/pip/_vendor/rich/panel.py
  114. +46 −25 src/pip/_vendor/rich/pretty.py
  115. +25 −9 src/pip/_vendor/rich/progress.py
  116. +1 −1 src/pip/_vendor/rich/progress_bar.py
  117. +29 −4 src/pip/_vendor/rich/prompt.py
  118. +33 −19 src/pip/_vendor/rich/segment.py
  119. +1 −0 src/pip/_vendor/rich/spinner.py
  120. +1 −1 src/pip/_vendor/rich/style.py
  121. +16 −8 src/pip/_vendor/rich/syntax.py
  122. +15 −8 src/pip/_vendor/rich/table.py
  123. +10 −6 src/pip/_vendor/rich/text.py
  124. +2 −2 src/pip/_vendor/rich/theme.py
  125. +70 −26 src/pip/_vendor/rich/traceback.py
  126. +16 −8 src/pip/_vendor/rich/tree.py
  127. +3 −0 src/pip/_vendor/tomli/LICENSE-HEADER
  128. +1 −4 src/pip/_vendor/tomli/__init__.py
  129. +158 −79 src/pip/_vendor/tomli/_parser.py
  130. +10 −5 src/pip/_vendor/tomli/_re.py
  131. +8 −8 src/pip/_vendor/vendor.txt
  132. BIN tests/data/packages/license.dist-0.1-py2.py3-none-any.whl
  133. BIN tests/data/packages/license.dist-0.2-py2.py3-none-any.whl
  134. +2 −2 tests/functional/test_cache.py
  135. +32 −0 tests/functional/test_install.py
  136. +16 −0 tests/functional/test_proxy.py
  137. +42 −1 tests/functional/test_show.py
  138. +17 −0 tests/lib/__init__.py
  139. +1 −1 tests/ruff.toml
  140. +18 −2 tests/unit/test_collector.py
  141. +4 −4 tests/unit/test_index.py
  142. +31 −0 tests/unit/test_network_cache.py
  143. +114 −0 tests/unit/test_req_file.py
  144. +14 −0 tests/unit/test_self_check_outdated.py
  145. +10 −16 tests/unit/test_urls.py
  146. +1 −45 tests/unit/test_utils.py
  147. +7 −5 tests/unit/test_utils_retry.py
  148. +0 −122 tools/vendoring/patches/packaging.patch
  149. +0 −20 tools/vendoring/patches/requests.patch
6 changes: 5 additions & 1 deletion .github/dependabot.yml
@@ -3,8 +3,12 @@ updates:
 - package-ecosystem: "github-actions"
   directory: "/"
   schedule:
-    interval: "monthly"
+    interval: "weekly"
+  groups:
+    github-actions:
+      patterns:
+        - "*"
 - package-ecosystem: "pip"
   directory: "/"
   schedule:
     interval: "weekly"
7 changes: 0 additions & 7 deletions .github/triage-new-issues.yml

This file was deleted.

78 changes: 34 additions & 44 deletions .github/workflows/ci.yml
@@ -25,7 +25,7 @@ concurrency:
jobs:
docs:
name: docs
runs-on: ubuntu-latest
runs-on: ubuntu-22.04

steps:
- uses: actions/checkout@v4
@@ -36,7 +36,7 @@ jobs:
- run: nox -s docs

determine-changes:
runs-on: ubuntu-latest
runs-on: ubuntu-22.04
outputs:
tests: ${{ steps.filter.outputs.tests }}
vendoring: ${{ steps.filter.outputs.vendoring }}
@@ -64,7 +64,7 @@ jobs:

packaging:
name: packaging
runs-on: ubuntu-latest
runs-on: ubuntu-22.04

steps:
- uses: actions/checkout@v4
@@ -83,7 +83,7 @@ jobs:

vendoring:
name: vendoring
runs-on: ubuntu-latest
runs-on: ubuntu-22.04

needs: [determine-changes]
if: >-
@@ -112,7 +112,7 @@ jobs:
strategy:
fail-fast: true
matrix:
os: [ubuntu-latest, macos-13, macos-latest]
os: [ubuntu-22.04, macos-13, macos-latest]
python:
- "3.8"
- "3.9"
@@ -129,7 +129,7 @@ jobs:
allow-prereleases: true

- name: Install Ubuntu dependencies
if: matrix.os == 'ubuntu-latest'
if: matrix.os == 'ubuntu-22.04'
run: |
sudo apt-get update
sudo apt-get install bzr
@@ -149,17 +149,17 @@ jobs:
- name: Run unit tests
run: >-
nox -s test-${{ matrix.python.key || matrix.python }} --
-m unit
tests/unit
--verbose --numprocesses auto --showlocals
- name: Run integration tests
run: >-
nox -s test-${{ matrix.python.key || matrix.python }} --
-m integration
nox -s test-${{ matrix.python.key || matrix.python }} --no-install --
tests/functional
--verbose --numprocesses auto --showlocals
--durations=5
tests-windows:
name: tests / ${{ matrix.python }} / ${{ matrix.os }} / ${{ matrix.group }}
name: tests / ${{ matrix.python }} / ${{ matrix.os }} / ${{ matrix.group.number }}
runs-on: ${{ matrix.os }}-latest

needs: [packaging, determine-changes]
@@ -180,54 +180,46 @@ jobs:
# - "3.11"
# - "3.12"
- "3.13"
group: [1, 2]
group:
- { number: 1, pytest-filter: "not test_install" }
- { number: 2, pytest-filter: "test_install" }

steps:
# The D: drive is significantly faster than the system C: drive.
# https://github.com/actions/runner-images/issues/8755
- name: Set TEMP to D:/Temp
run: |
mkdir "D:\\Temp"
echo "TEMP=D:\\Temp" >> $env:GITHUB_ENV
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python }}
allow-prereleases: true

# We use C:\Temp (which is already available on the worker)
# as a temporary directory for all of the tests because the
# default value (under the user dir) is more deeply nested
# and causes tests to fail with "path too long" errors.
- run: pip install nox
env:
TEMP: "C:\\Temp"

# Main check
- name: Run unit tests
if: matrix.group == 1
run: >-
nox -s test-${{ matrix.python }} --
-m unit
--verbose --numprocesses auto --showlocals
env:
TEMP: "C:\\Temp"

- name: Run integration tests (group 1)
if: matrix.group == 1
- name: Run unit tests (group 1)
if: matrix.group.number == 1
run: >-
nox -s test-${{ matrix.python }} --
-m integration -k "not test_install"
tests/unit
--verbose --numprocesses auto --showlocals
env:
TEMP: "C:\\Temp"
- name: Run integration tests (group 2)
if: matrix.group == 2
- name: Run integration tests (group ${{ matrix.group.number }})
run: >-
nox -s test-${{ matrix.python }} --
-m integration -k "test_install"
nox -s test-${{ matrix.python }} --no-install --
tests/functional -k "${{ matrix.group.pytest-filter }}"
--verbose --numprocesses auto --showlocals
env:
TEMP: "C:\\Temp"
tests-zipapp:
name: tests / zipapp
runs-on: ubuntu-latest
# The macos-latest (M1) runners are the fastest available on GHA, even
# beating out the ubuntu-latest runners. The zipapp tests are slow by
# nature, and we don't care where they run, so we pick the fastest one.
runs-on: macos-latest

needs: [packaging, determine-changes]
if: >-
@@ -240,18 +232,16 @@ jobs:
with:
python-version: "3.10"

- name: Install Ubuntu dependencies
run: |
sudo apt-get update
sudo apt-get install bzr
- name: Install MacOS dependencies
run: brew install breezy subversion

- run: pip install nox

# Main check
- name: Run integration tests
run: >-
nox -s test-3.10 --
-m integration
tests/functional
--verbose --numprocesses auto --showlocals
--durations=5
--use-zipapp
@@ -268,7 +258,7 @@ jobs:
- tests-zipapp
- vendoring

runs-on: ubuntu-latest
runs-on: ubuntu-22.04

steps:
- name: Decide whether the needed jobs succeeded or failed
25 changes: 0 additions & 25 deletions .github/workflows/news-file.yml

This file was deleted.

44 changes: 44 additions & 0 deletions .github/workflows/release.yml
@@ -0,0 +1,44 @@
name: Publish Python 🐍 distribution 📦 to PyPI

on:
push:
tags:
- "*"

jobs:
build:
name: Build distribution 📦
runs-on: ubuntu-latest

steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4
with:
persist-credentials: false
- name: Build a binary wheel and a source tarball
run: ./build-project.py
- name: Store the distribution packages
uses: actions/upload-artifact@65c4c4a1ddee5b72f698fdd19549f0f0fb45cf08 # v4
with:
name: python-package-distributions
path: dist/

publish-to-pypi:
name: >-
Publish Python 🐍 distribution 📦 to PyPI
needs:
- build
runs-on: ubuntu-latest
environment:
name: pypi
url: https://pypi.org/project/pip/${{ github.ref_name }}
permissions:
id-token: write # IMPORTANT: mandatory for trusted publishing

steps:
- name: Download all the dists
uses: actions/download-artifact@fa0a91b85d4f404e444e00e005971372dc801d16 # v4
with:
name: python-package-distributions
path: dist/
- name: Publish distribution 📦 to PyPI
uses: pypa/gh-action-pypi-publish@76f52bc884231f62b9a034ebfe128415bbaabdfc # release/v1
10 changes: 5 additions & 5 deletions .pre-commit-config.yaml
@@ -2,7 +2,7 @@ exclude: 'src/pip/_vendor/'
 
 repos:
 - repo: https://github.com/pre-commit/pre-commit-hooks
-  rev: v4.6.0
+  rev: v5.0.0
   hooks:
   - id: check-builtin-literals
   - id: check-added-large-files
@@ -17,25 +17,25 @@ repos:
   exclude: .patch
 
 - repo: https://github.com/psf/black-pre-commit-mirror
-  rev: 24.4.2
+  rev: 24.10.0
   hooks:
   - id: black
 
 - repo: https://github.com/astral-sh/ruff-pre-commit
-  rev: v0.5.6
+  rev: v0.9.1
   hooks:
   - id: ruff
     args: [--fix, --exit-non-zero-on-fix]
 
 - repo: https://github.com/pre-commit/mirrors-mypy
-  rev: v1.12.1
+  rev: v1.14.1
   hooks:
   - id: mypy
     exclude: tests/data
     args: ["--pretty", "--show-error-codes"]
     additional_dependencies: [
      'keyring==24.2.0',
-     'nox==2023.4.22',
+     'nox==2024.03.02',
      'pytest',
      'types-docutils==0.20.0.3',
      'types-setuptools==68.2.0.0',
7 changes: 7 additions & 0 deletions AUTHORS.txt
@@ -125,6 +125,7 @@ burrows
Bussonnier Matthias
bwoodsend
c22
Caleb Brown
Caleb Martinez
Calvin Smith
Carl Meyer
@@ -134,6 +135,7 @@ Carter Thayer
Cass
Chandrasekhar Atina
Charlie Marsh
charwick
Chih-Hsuan Yen
Chris Brinker
Chris Hunt
@@ -403,18 +405,22 @@ Josh Cannon
Josh Hansen
Josh Schneier
Joshua
JoshuaPerdue
Juan Luis Cano Rodríguez
Juanjo Bazán
Judah Rand
Julian Berman
Julian Gethmann
Julien Demoor
July Tikhonov
Jussi Kukkonen
Justin van Heek
jwg4
Jyrki Pulliainen
Kai Chen
Kai Mueller
Kamal Bin Mustafa
Karolina Surma
kasium
kaustav haldar
keanemind
@@ -625,6 +631,7 @@ R. David Murray
Rafael Caricio
Ralf Schmitt
Ran Benita
Randy Döring
Razzi Abuissa
rdb
Reece Dunham
10 changes: 4 additions & 6 deletions MANIFEST.in
@@ -5,23 +5,22 @@ include README.rst
include SECURITY.md
include pyproject.toml

include build-requirements.in
include build-requirements.txt
include build-project.py

include src/pip/_vendor/README.rst
include src/pip/_vendor/vendor.txt
include src/pip/_vendor/pyparsing/diagram/template.jinja2
recursive-include src/pip/_vendor *LICENSE*
recursive-include src/pip/_vendor *COPYING*

include docs/docutils.conf
include docs/requirements.txt

exclude .git-blame-ignore-revs
exclude .coveragerc
exclude .mailmap
exclude .appveyor.yml
exclude .readthedocs.yml
exclude .pre-commit-config.yaml
exclude .readthedocs-custom-redirects.yml
exclude tox.ini
exclude noxfile.py

recursive-include src/pip/_vendor *.pem
@@ -34,6 +33,5 @@ recursive-exclude src/pip/_vendor *.pyi
prune .github
prune docs/build
prune news
prune tasks
prune tests
prune tools
69 changes: 67 additions & 2 deletions NEWS.rst
@@ -9,6 +9,71 @@
.. towncrier release notes start
25.0 (2025-01-26)
=================

Deprecations and Removals
-------------------------

- Deprecate the ``no-python-version-warning`` flag as it has long done nothing
since Python 2 support was removed in pip 21.0. (`#13154 <https://github.com/pypa/pip/issues/13154>`_)

Features
--------

- Prefer to display :pep:`639` ``License-Expression`` in ``pip show`` if metadata version is at least 2.4. (`#13112 <https://github.com/pypa/pip/issues/13112>`_)
- Support :pep:`639` ``License-Expression`` and ``License-File`` metadata fields in JSON
output. ``pip inspect`` and ``pip install --report`` now emit
``license_expression`` and ``license_file`` fields in the ``metadata`` object,
if the corresponding fields are present in the installed ``METADATA`` file. (`#13134 <https://github.com/pypa/pip/issues/13134>`_)
- Files in the network cache will inherit the read/write permissions of pip's cache
directory (in addition to the current user retaining read/write access). This
enables a single cache to be shared among multiple users. (`#11012 <https://github.com/pypa/pip/issues/11012>`_)
- Return the size, along with the number, of files cleared on ``pip cache purge`` and ``pip cache remove`` (`#12176 <https://github.com/pypa/pip/issues/12176>`_)
- Cache ``python-requires`` checks while filtering potential installation candidates. (`#13128 <https://github.com/pypa/pip/issues/13128>`_)
- Optimize package collection by avoiding unnecessary URL parsing and other processing. (`#13132 <https://github.com/pypa/pip/issues/13132>`_)
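
The shared-cache entry above boils down to deriving each new cache file's permission bits from the cache directory's own mode. A rough stdlib-only sketch of the described behavior (an illustration, not pip's actual code; ``inherited_file_mode`` is a hypothetical helper name):

```python
import os
import stat
import tempfile

def inherited_file_mode(cache_dir: str) -> int:
    # Copy the directory's read/write bits (files don't need the
    # execute bits) while always keeping read/write for the owner.
    dir_mode = stat.S_IMODE(os.stat(cache_dir).st_mode)
    return (dir_mode & 0o666) | 0o600

# A group-writable cache directory yields group-writable cache files,
# which is what lets several users share one cache.
cache = tempfile.mkdtemp()
os.chmod(cache, 0o775)
print(oct(inherited_file_mode(cache)))  # 0o664
```
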

Bug Fixes
---------

- Reorder the encoding detection when decoding a requirements file, relying on
UTF-8 over the locale encoding by default, matching the documented behaviour.
(`#12771 <https://github.com/pypa/pip/issues/12771>`_)
- The pip version self check is disabled on ``EXTERNALLY-MANAGED`` environments. (`#11820 <https://github.com/pypa/pip/issues/11820>`_)
- Fix a security bug allowing a specially crafted wheel to execute code during
installation. (`#13079 <https://github.com/pypa/pip/issues/13079>`_)
- The inclusion of ``packaging`` 24.2 changes how pre-release specifiers with ``<`` and ``>``
behave. Including a pre-release version with these specifiers now implies
accepting pre-releases (e.g., ``<2.0dev`` can include ``1.0rc1``). To avoid
implying pre-releases, avoid specifying them (e.g., use ``<2.0``).
The exception is ``!=``, which never implies pre-releases. (`#13163 <https://github.com/pypa/pip/issues/13163>`_)
- The ``--cert`` and ``--client-cert`` command-line options are now respected while
installing build dependencies. Consequently, the private ``_PIP_STANDALONE_CERT``
environment variable is no longer used. (`#5502 <https://github.com/pypa/pip/issues/5502>`_)
- The ``--proxy`` command-line option is now respected while installing build dependencies. (`#6018 <https://github.com/pypa/pip/issues/6018>`_)
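
The ``packaging`` 24.2 change described above is easy to check interactively. This sketch assumes the standalone ``packaging`` distribution (24.2 or newer) is importable:

```python
from packaging.specifiers import SpecifierSet

# A bare exclusive bound never matches pre-releases unless asked to.
plain = SpecifierSet("<2.0")
print(plain.contains("1.0rc1"))                    # False
print(plain.contains("1.0rc1", prereleases=True))  # True

# Spelling a pre-release inside the bound implies accepting
# pre-releases (the packaging 24.2 behavior described above).
with_pre = SpecifierSet("<2.0dev")
print(with_pre.contains("1.0rc1"))  # True with packaging 24.2+
```
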

Vendored Libraries
------------------

- Upgrade CacheControl to 0.14.1
- Upgrade idna to 3.10
- Upgrade msgpack to 1.1.0
- Upgrade packaging to 24.2
- Upgrade platformdirs to 4.3.6
- Upgrade pyproject-hooks to 1.2.0
- Upgrade rich to 13.9.4
- Upgrade tomli to 2.2.1

Improved Documentation
----------------------

- Removed section about non-existing ``--force-keyring`` flag. (`#12455 <https://github.com/pypa/pip/issues/12455>`_)

Process
-------

- Started releasing to PyPI from a GitHub Actions CI/CD workflow that implements trusted publishing and bundles :pep:`740` digital attestations.

24.3.1 (2024-10-27)
===================

@@ -3350,7 +3415,7 @@ Improved Documentation
- Upgrade the bundled copy of requests to 2.6.0, fixing CVE-2015-2296.
- Display format of latest package when using ``pip list --outdated``. (#2475)
- Don't use pywin32 as ctypes should always be available on Windows, using
pywin32 prevented uninstallation of pywin32 on Windows. (:pull:`2467`)
pywin32 prevented uninstallation of pywin32 on Windows. (:pr:`2467`)
- Normalize the ``--wheel-dir`` option, expanding out constructs such as ``~``
when used. (#2441)
- Display a warning when an undefined extra has been requested. (#2142)
@@ -3641,7 +3706,7 @@ Improved Documentation
--no-download`` are now formally deprecated. See #906 for discussion on
possible alternatives, or lack thereof, in future releases.
- **DEPRECATION** ``pip zip`` and ``pip unzip`` are now formally deprecated.
- pip will now install Mac OSX platform wheels from PyPI. (:pull:`1278`)
- pip will now install Mac OSX platform wheels from PyPI. (:pr:`1278`)
- pip now generates the appropriate platform-specific console scripts when
installing wheels. (#1251)
- pip now confirms a wheel is supported when installing directly from a path or
67 changes: 67 additions & 0 deletions build-project.py
@@ -0,0 +1,67 @@
#!/usr/bin/env python3
"""Build pip using pinned build requirements."""

import subprocess
import tempfile
import venv
from os import PathLike
from pathlib import Path
from types import SimpleNamespace


class EnvBuilder(venv.EnvBuilder):
    """A subclass of venv.EnvBuilder that exposes the python executable command."""

    def ensure_directories(
        self, env_dir: str | bytes | PathLike[str] | PathLike[bytes]
    ) -> SimpleNamespace:
        context = super().ensure_directories(env_dir)
        self.env_exec_cmd = context.env_exec_cmd
        return context


def get_git_head_timestamp() -> str:
    return subprocess.run(
        [
            "git",
            "log",
            "-1",
            "--pretty=format:%ct",
        ],
        text=True,
        stdout=subprocess.PIPE,
    ).stdout.strip()


def main() -> None:
    with tempfile.TemporaryDirectory() as build_env:
        env_builder = EnvBuilder(with_pip=True)
        env_builder.create(build_env)
        subprocess.run(
            [
                env_builder.env_exec_cmd,
                "-Im",
                "pip",
                "install",
                "--no-deps",
                "--only-binary=:all:",
                "--require-hashes",
                "-r",
                Path(__file__).parent / "build-requirements.txt",
            ],
            check=True,
        )
        subprocess.run(
            [
                env_builder.env_exec_cmd,
                "-Im",
                "build",
                "--no-isolation",
            ],
            check=True,
            env={"SOURCE_DATE_EPOCH": get_git_head_timestamp()},
        )


if __name__ == "__main__":
    main()
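
``build-project.py`` above pins ``SOURCE_DATE_EPOCH`` to the last commit's timestamp so repeated builds of the same commit produce identical archives. A minimal sketch of how a build tool consumes that reproducible-builds convention (illustrative only; ``build_timestamp`` is a hypothetical helper, not part of any backend's API):

```python
import os
import time

def build_timestamp() -> int:
    # Honor SOURCE_DATE_EPOCH when set, as reproducible build tools do,
    # otherwise fall back to the current wall-clock time.
    raw = os.environ.get("SOURCE_DATE_EPOCH")
    return int(raw) if raw is not None else int(time.time())

os.environ["SOURCE_DATE_EPOCH"] = "1700000000"
print(build_timestamp())  # 1700000000
```
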
2 changes: 2 additions & 0 deletions build-requirements.in
@@ -0,0 +1,2 @@
build
setuptools
24 changes: 24 additions & 0 deletions build-requirements.txt
@@ -0,0 +1,24 @@
#
# This file is autogenerated by pip-compile with Python 3.12
# by the following command:
#
# pip-compile --allow-unsafe --generate-hashes build-requirements.in
#
build==1.2.2.post1 \
--hash=sha256:1d61c0887fa860c01971625baae8bdd338e517b836a2f70dd1f7aa3a6b2fc5b5 \
--hash=sha256:b36993e92ca9375a219c99e606a122ff365a760a2d4bba0caa09bd5278b608b7
# via -r build-requirements.in
packaging==24.2 \
--hash=sha256:09abb1bccd265c01f4a3aa3f7a7db064b36514d2cba19a2f694fe6150451a759 \
--hash=sha256:c228a6dc5e932d346bc5739379109d49e8853dd8223571c7c5b55260edc0b97f
# via build
pyproject-hooks==1.2.0 \
--hash=sha256:1e859bd5c40fae9448642dd871adf459e5e2084186e8d2c2a79a824c970da1f8 \
--hash=sha256:9e5c6bfa8dcc30091c74b0cf803c81fdd29d94f01992a7707bc97babb1141913
# via build

# The following packages are considered to be unsafe in a requirements file:
setuptools==75.8.0 \
--hash=sha256:c5afc8f407c626b8313a86e10311dd3f661c6cd9c09d4bf8c15c0e11f9f2b0e6 \
--hash=sha256:e3982f444617239225d675215d51f6ba05f845d4eec313da4418fdbb56fb27e3
# via -r build-requirements.in
14 changes: 5 additions & 9 deletions docs/html/conf.py
@@ -17,7 +17,6 @@
# first-party extensions
"sphinx.ext.autodoc",
"sphinx.ext.todo",
"sphinx.ext.extlinks",
"sphinx.ext.intersphinx",
# our extensions
"pip_sphinxext",
@@ -26,6 +25,7 @@
"sphinx_copybutton",
"sphinx_inline_tabs",
"sphinxcontrib.towncrier",
"sphinx_issues",
]

# General information about the project.
@@ -71,14 +71,6 @@
"pypug": ("https://packaging.python.org", None),
}

# -- Options for extlinks -------------------------------------------------------------

extlinks = {
"issue": ("https://github.com/pypa/pip/issues/%s", "#%s"),
"pull": ("https://github.com/pypa/pip/pull/%s", "PR #%s"),
"pypi": ("https://pypi.org/project/%s/", "%s"),
}

# -- Options for towncrier_draft extension --------------------------------------------

towncrier_draft_autoversion_mode = "draft" # or: 'sphinx-release', 'sphinx-version'
@@ -137,3 +129,7 @@ def to_document_name(path: str, base_dir: str) -> str:
copybutton_prompt_text = r"\$ | C\:\> "
copybutton_prompt_is_regexp = True
copybutton_only_copy_prompt_lines = False

# -- Options for sphinx_issues --------------------------------------------------------

issues_default_group_project = "pypa/pip"
Original file line number Diff line number Diff line change
@@ -154,7 +154,7 @@ Its main addition consists of the following function:
.. py:method:: get_default_values()
Overrides the original method to allow updating the defaults ater the instantiation of the
Overrides the original method to allow updating the defaults after the instantiation of the
option parser.

It allows overriding the default options and arguments using the ``Configuration`` class
2 changes: 1 addition & 1 deletion docs/html/development/architecture/overview.rst
@@ -134,7 +134,7 @@ Once it has those, it selects one file and downloads it.
cannot….should not be …. ? I want only the Flask …. Why am I getting the
whole list?

Answer: It's not every file, just files of Flask. No API for getting alllllll
Answer: It's not every file, just files of Flask. No API for getting all
files on PyPI. It’s for getting all files of Flask.)

.. _`tracking issue`: https://github.com/pypa/pip/issues/6831
7 changes: 2 additions & 5 deletions docs/html/development/release-process.rst
@@ -146,11 +146,8 @@ Creating a new release
This will update the relevant files and tag the correct commit.
#. Submit the ``release/YY.N`` branch as a pull request and ensure CI passes.
Merge the changes back into ``main`` and pull them back locally.
#. Build the release artifacts using ``nox -s build-release -- YY.N``.
This will checkout the tag, generate the distribution files to be
uploaded and checkout the main branch again.
#. Upload the release to PyPI using ``nox -s upload-release -- YY.N``.
#. Push the tag created by ``prepare-release``.
#. Push the tag created by ``prepare-release``. This will trigger the release
workflow on GitHub and publish to PyPI.
#. Regenerate the ``get-pip.py`` script in the `get-pip repository`_ (as
documented there) and commit the results.
#. Submit a Pull Request to `CPython`_ adding the new version of pip
6 changes: 3 additions & 3 deletions docs/html/reference/requirements-file-format.md
@@ -56,9 +56,9 @@ examples of all these forms, see {ref}`pip install Examples`.

### Encoding

Requirements files are `utf-8` encoding by default and also support
{pep}`263` style comments to change the encoding (i.e.
`# -*- coding: <encoding name> -*-`).
The default encoding for requirement files is `UTF-8` unless a different
encoding is specified using a {pep}`263` style comment (e.g. `# -*- coding:
<encoding name> -*-`).
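
The rule above (UTF-8 unless a PEP 263 coding comment on one of the first two lines says otherwise) can be sketched with a small stdlib-only helper; this is a simplified illustration, not pip's actual implementation:

```python
import re

# PEP 263 style coding declaration, e.g. "# -*- coding: latin-1 -*-"
CODING_COMMENT = re.compile(rb"coding[:=]\s*([-\w.]+)")

def detect_encoding(data: bytes) -> str:
    """Return the declared encoding of a requirements file, or UTF-8."""
    # Only the first two lines may carry a coding comment, per PEP 263.
    for line in data.splitlines()[:2]:
        match = CODING_COMMENT.search(line)
        if match:
            return match.group(1).decode("ascii")
    return "utf-8"

print(detect_encoding(b"# -*- coding: latin-1 -*-\nrequests\n"))  # latin-1
print(detect_encoding(b"requests==2.32.3\n"))                     # utf-8
```
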

### Line continuations

19 changes: 0 additions & 19 deletions docs/html/topics/authentication.md
@@ -163,25 +163,6 @@ from the subprocess in which they run Pip. You won't know whether the keyring
backend is waiting the user input or not in such situations.
```

pip is conservative and does not query keyring at all when `--no-input` is used
because the keyring might require user interaction such as prompting the user
on the console. You can force keyring usage by passing `--force-keyring` or one
of the following:

```bash
# possibly with --user, --global or --site
$ pip config set global.force-keyring true
# or
$ export PIP_FORCE_KEYRING=1
```

```{warning}
Be careful when doing this since it could cause tools such as pipx and Pipenv
to appear to hang. They show their own progress indicator while hiding output
from the subprocess in which they run Pip. You won't know whether the keyring
backend is waiting the user input or not in such situations.
```

Note that `keyring` (the Python package) needs to be installed separately from
pip. This can create a bootstrapping issue if you need the credentials stored in
the keyring to download and install keyring.
Original file line number Diff line number Diff line change
@@ -512,7 +512,7 @@ _Suggested content:_
_Page purpose:_

- To onboard people who want to contribute to pip's docs
- To share previous research and recommendataions related to pip's docs
- To share previous research and recommendations related to pip's docs

_Suggested content:_

2 changes: 1 addition & 1 deletion docs/html/ux-research-design/research-results/personas.md
@@ -2,7 +2,7 @@

## Problem

We want to develop personas for pip's user to facilate faster user-centered decision making for the pip development team.
We want to develop personas for pip's user to facilitate faster user-centered decision making for the pip development team.

[Skip to recommendations](#recommendations)

Original file line number Diff line number Diff line change
@@ -66,7 +66,7 @@ Most respondents use `--force-reinstall` "almost never" (65.6%):
![screenshot of survey question of how often users use --force-reinstall](https://i.imgur.com/fjLQUPV.png)
![bar chart of how often users use --force-reinstall](https://i.imgur.com/Xe1XDkI.png)

Amongst respondents who said they use `--force-resinstall` often or very often:
Amongst respondents who said they use `--force-reinstall` often or very often:

- 54.54% (6/11) of respondents thought that pip should install the same version of requests - i.e. that `--force-reinstall` should _not_ implicitly upgrade
- 45.45% (5/11) of respondents thought that pip should upgrade requests to the latest version - i.e that `--force-reinstall` _should_ implicitly upgrade
@@ -76,7 +76,7 @@ Respondents find `--force-reinstall` less useful than useful:
![screenshot of survey question of how useful users find --force-reinstall](https://i.imgur.com/6cv4lFn.png)
![bar chart of how useful users find --force-reinstall](https://i.imgur.com/gMUBDBo.png)

Amongst respondents who said they find `--force-resinstall` useful or very useful:
Amongst respondents who said they find `--force-reinstall` useful or very useful:

- 38.46% (20/52) of respondents thought that pip should install the same version of requests - i.e. that `--force-reinstall` should _not_ implicitly upgrade
- 50% (26/52) of respondents thought that pip should upgrade requests to the latest version - i.e that `--force-reinstall` _should_ implicitly upgrade
@@ -89,7 +89,7 @@ In this case, we recommend showing the following message when a user tries to us

> Error: the pip install --force-reinstall option no longer exists. Use pip uninstall then pip install to replace up-to-date packages, or pip install --upgrade to update your packages to the latest available versions.
Should the pip development team wish to keep `--force-resintall`, we recommend maintaining the current (implicit upgrade) behaviour, as pip's users have not expressed a clear preference for a different behaviour.
Should the pip development team wish to keep `--force-reinstall`, we recommend maintaining the current (implicit upgrade) behaviour, as pip's users have not expressed a clear preference for a different behaviour.

In this case, we recommend upgrading the [help text](https://pip.pypa.io/en/stable/reference/pip_install/#cmdoption-force-reinstall) to be more explicit:

Original file line number Diff line number Diff line change
@@ -109,9 +109,9 @@ Results varied by the amount of Python experience the user had.

![Screenshot of Install a package from wheels](https://i.imgur.com/9DMBfNL.png)

#### Install apackage from a local directory
#### Install a package from a local directory

![Screenshot of Install apackage from a local directory](https://i.imgur.com/Jp95rak.png)
![Screenshot of Install a package from a local directory](https://i.imgur.com/Jp95rak.png)

#### Control where you want your installed package to live on your computer

Original file line number Diff line number Diff line change
@@ -56,6 +56,8 @@ Both of these groups identified their "sphere of influence" and did their best t

### User thoughts about security

Selected quotes from research participants

#### Responsibility as author

Participants who spent a lot of their time writing Python code - either for community or as part of their job - expressed a responsibility to their users for the code they wrote - people who wrote code which was made public expressed a stronger responsibility.
@@ -76,8 +78,7 @@ Participants also explained they rely on code security scanning and checking sof
#### Reliance on good software development practices

A small number of participants e### Selected quotes from research participants
xplained they have good software practices in place, which help with writing secure software.
A small number of participants explained they have good software practices in place, which help with writing secure software.

> "We have a book about ethics of code - we have mandatory certification."
1 change: 1 addition & 0 deletions docs/requirements.txt
@@ -7,6 +7,7 @@ myst_parser
sphinx-copybutton
sphinx-inline-tabs
sphinxcontrib-towncrier >= 0.2.0a0
sphinx-issues

# `docs.pipext` uses pip's internals to generate documentation. So, we install
# the current directory to make it work.
24 changes: 16 additions & 8 deletions noxfile.py
@@ -19,6 +19,7 @@

nox.options.reuse_existing_virtualenvs = True
nox.options.sessions = ["lint"]
nox.needs_version = ">=2024.03.02" # for session.run_install()

LOCATIONS = {
"common-wheels": "tests/data/common_wheels",
@@ -44,7 +45,9 @@ def run_with_protected_pip(session: nox.Session, *arguments: str) -> None:
env = {"VIRTUAL_ENV": session.virtualenv.location}

command = ("python", LOCATIONS["protected-pip"]) + arguments
session.run(*command, env=env, silent=True)
# By using run_install(), these installation steps can be skipped when -R
# or --no-install is passed.
session.run_install(*command, env=env, silent=True)


def should_update_common_wheels() -> bool:
@@ -84,8 +87,13 @@ def test(session: nox.Session) -> None:
session.log(msg)

# Build source distribution
# HACK: we want to skip building and installing pip when nox's --no-install
# flag is given (to save time when running tests back to back with different
# arguments), but unfortunately nox does not expose this configuration state
# yet. https://github.com/wntrblm/nox/issues/710
no_install = "-R" in sys.argv or "--no-install" in sys.argv
sdist_dir = os.path.join(session.virtualenv.location, "sdist")
if os.path.exists(sdist_dir):
if not no_install and os.path.exists(sdist_dir):
shutil.rmtree(sdist_dir, ignore_errors=True)

run_with_protected_pip(session, "install", "build")
@@ -94,7 +102,7 @@ def test(session: nox.Session) -> None:
# pip, so uninstall pip to force build to provision a known good version of pip.
run_with_protected_pip(session, "uninstall", "pip", "-y")
# fmt: off
session.run(
session.run_install(
"python", "-I", "-m", "build", "--sdist", "--outdir", sdist_dir,
silent=True,
)
@@ -127,7 +135,6 @@ def test(session: nox.Session) -> None:

@nox.session
def docs(session: nox.Session) -> None:
session.install("-e", ".")
session.install("-r", REQUIREMENTS["docs"])

def get_sphinx_build_command(kind: str) -> List[str]:
@@ -143,6 +150,7 @@ def get_sphinx_build_command(kind: str) -> List[str]:
"-c", "docs/html", # see note above
"-d", "docs/build/doctrees/" + kind,
"-b", kind,
"--jobs", "auto",
"docs/" + kind,
"docs/build/" + kind,
]
@@ -154,7 +162,6 @@ def get_sphinx_build_command(kind: str) -> List[str]:

@nox.session(name="docs-live")
def docs_live(session: nox.Session) -> None:
session.install("-e", ".")
session.install("-r", REQUIREMENTS["docs"], "sphinx-autobuild")

session.run(
@@ -163,6 +170,7 @@ def docs_live(session: nox.Session) -> None:
"-b=dirhtml",
"docs/html",
"docs/build/livehtml",
"--jobs=auto",
*session.posargs,
)

@@ -331,7 +339,7 @@ def build_release(session: nox.Session) -> None:
)

session.log("# Install dependencies")
session.install("build", "twine")
session.install("twine")

with release.isolated_temporary_checkout(session, version) as build_dir:
session.log(
@@ -367,11 +375,11 @@ def build_dists(session: nox.Session) -> List[str]:
)

session.log("# Build distributions")
session.run("python", "-m", "build", silent=True)
session.run("python", "build-project.py", silent=True)
produced_dists = glob.glob("dist/*")

session.log(f"# Verify distributions: {', '.join(produced_dists)}")
session.run("twine", "check", *produced_dists, silent=True)
session.run("twine", "check", "--strict", *produced_dists, silent=True)

return produced_dists

10 changes: 4 additions & 6 deletions pyproject.toml
@@ -60,8 +60,6 @@ exclude = ["contrib", "docs", "tests*", "tasks"]
"pip" = ["py.typed"]
"pip._vendor" = ["vendor.txt"]
"pip._vendor.certifi" = ["*.pem"]
"pip._vendor.requests" = ["*.pem"]
"pip._vendor.distlib._backport" = ["sysconfig.cfg"]
"pip._vendor.distlib" = [
"t32.exe",
"t64.exe",
@@ -123,6 +121,9 @@ drop = [
"bin/",
# interpreter and OS specific msgpack libs
"msgpack/*.so",
# optional accelerator extension libraries
"*mypyc*.so",
"tomli/*.so",
# unneeded parts of setuptools
"easy_install.py",
"setuptools",
@@ -304,10 +305,7 @@ source0 = [
]

[tool.coverage.report]
exclude_lines = [
# We must re-state the default because the `exclude_lines` option overrides
# it.
"pragma: no cover",
exclude_also = [
# This excludes typing-specific code, which will be validated by mypy anyway.
"if TYPE_CHECKING",
]
2 changes: 1 addition & 1 deletion src/pip/__init__.py
@@ -1,6 +1,6 @@
from typing import List, Optional

__version__ = "24.3.1"
__version__ = "25.0"


def main(args: Optional[List[str]] = None) -> int:
8 changes: 6 additions & 2 deletions src/pip/_internal/build_env.py
@@ -246,6 +246,8 @@ def _install_requirements(
# target from config file or env var should be ignored
"--target",
"",
"--cert",
finder.custom_cert or where(),
]
if logger.getEffectiveLevel() <= logging.DEBUG:
args.append("-vv")
@@ -270,21 +272,23 @@ def _install_requirements(
for link in finder.find_links:
args.extend(["--find-links", link])

if finder.proxy:
args.extend(["--proxy", finder.proxy])
for host in finder.trusted_hosts:
args.extend(["--trusted-host", host])
if finder.client_cert:
args.extend(["--client-cert", finder.client_cert])
if finder.allow_all_prereleases:
args.append("--pre")
if finder.prefer_binary:
args.append("--prefer-binary")
args.append("--")
args.extend(requirements)
extra_environ = {"_PIP_STANDALONE_CERT": where()}
with open_spinner(f"Installing {kind}") as spinner:
call_subprocess(
args,
command_desc=f"pip subprocess to install {kind}",
spinner=spinner,
extra_environ=extra_environ,
)


9 changes: 9 additions & 0 deletions src/pip/_internal/cli/base_command.py
@@ -29,6 +29,7 @@
NetworkConnectionError,
PreviousBuildDirError,
)
from pip._internal.utils.deprecation import deprecated
from pip._internal.utils.filesystem import check_path_owner
from pip._internal.utils.logging import BrokenStdoutLoggingError, setup_logging
from pip._internal.utils.misc import get_prog, normalize_path
@@ -228,4 +229,12 @@ def _main(self, args: List[str]) -> int:
)
options.cache_dir = None

if options.no_python_version_warning:
deprecated(
reason="--no-python-version-warning is deprecated.",
replacement="to remove the flag as it's a no-op",
gone_in="25.1",
issue=13154,
)

return self._run_wrapper(level_number, options, args)
4 changes: 2 additions & 2 deletions src/pip/_internal/cli/cmdoptions.py
@@ -260,8 +260,8 @@ class PipOption(Option):
default="auto",
help=(
"Enable the credential lookup via the keyring library if user input is allowed."
" Specify which mechanism to use [disabled, import, subprocess]."
" (default: disabled)"
" Specify which mechanism to use [auto, disabled, import, subprocess]."
" (default: %default)"
),
)

1 change: 1 addition & 0 deletions src/pip/_internal/cli/index_command.py
@@ -123,6 +123,7 @@ def _build_session(
"https": options.proxy,
}
session.trust_env = False
session.pip_proxy = options.proxy

# Determine if we can prompt the user for authentication or not
session.auth.prompting = not options.no_input
2 changes: 1 addition & 1 deletion src/pip/_internal/cli/progress_bars.py
@@ -63,7 +63,7 @@ def _raw_progress_bar(
size: Optional[int],
) -> Generator[bytes, None, None]:
def write_progress(current: int, total: int) -> None:
sys.stdout.write("Progress %d of %d\n" % (current, total))
sys.stdout.write(f"Progress {current} of {total}\n")
sys.stdout.flush()

current = 0
5 changes: 4 additions & 1 deletion src/pip/_internal/commands/cache.py
@@ -8,6 +8,7 @@
from pip._internal.exceptions import CommandError, PipError
from pip._internal.utils import filesystem
from pip._internal.utils.logging import getLogger
from pip._internal.utils.misc import format_size

logger = getLogger(__name__)

@@ -180,10 +181,12 @@ def remove_cache_items(self, options: Values, args: List[Any]) -> None:
if not files:
logger.warning(no_matching_msg)

bytes_removed = 0
for filename in files:
bytes_removed += os.stat(filename).st_size
os.unlink(filename)
logger.verbose("Removed %s", filename)
logger.info("Files removed: %s", len(files))
logger.info("Files removed: %s (%s)", len(files), format_size(bytes_removed))

def purge_cache(self, options: Values, args: List[Any]) -> None:
if args:
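
The updated message above formats the byte count with pip's internal ``format_size`` helper. A simplified stand-in with the same flavor of decimal units (pip's real helper may round differently):

```python
def format_size(num_bytes: int) -> str:
    # Human-readable decimal (SI) units for cache size reporting.
    if num_bytes > 10_000_000:
        return f"{num_bytes / 1_000_000:.1f} MB"
    if num_bytes > 10_000:
        return f"{num_bytes // 1000} kB"
    if num_bytes > 1000:
        return f"{num_bytes / 1000:.1f} kB"
    return f"{num_bytes} bytes"

print(format_size(512))          # 512 bytes
print(format_size(2_048))        # 2.0 kB
print(format_size(123_456_789))  # 123.5 MB
```
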
15 changes: 8 additions & 7 deletions src/pip/_internal/commands/install.py
@@ -10,6 +10,13 @@
from pip._vendor.packaging.utils import canonicalize_name
from pip._vendor.rich import print_json

# Eagerly import self_outdated_check to avoid crashes. Otherwise,
# this module would be imported *after* pip was replaced, resulting
# in crashes if the new self_outdated_check module was incompatible
# with the rest of pip that's already imported, or allowing a
# wheel to execute arbitrary code on install by replacing
# self_outdated_check.
import pip._internal.self_outdated_check # noqa: F401
from pip._internal.cache import WheelCache
from pip._internal.cli import cmdoptions
from pip._internal.cli.cmdoptions import make_target_python
@@ -408,12 +415,6 @@ def run(self, options: Values, args: List[str]) -> int:
# If we're not replacing an already installed pip,
# we're not modifying it.
modifying_pip = pip_req.satisfied_by is None
if modifying_pip:
# Eagerly import this module to avoid crashes. Otherwise, this
# module would be imported *after* pip was replaced, resulting in
# crashes if the new self_outdated_check module was incompatible
# with the rest of pip that's already imported.
import pip._internal.self_outdated_check # noqa: F401
protect_pip_from_modification_on_windows(modifying_pip=modifying_pip)

reqs_to_build = [
@@ -432,7 +433,7 @@ def run(self, options: Values, args: List[str]) -> int:

if build_failures:
raise InstallationError(
"ERROR: Failed to build installable wheels for some "
"Failed to build installable wheels for some "
"pyproject.toml based projects ({})".format(
", ".join(r.name for r in build_failures) # type: ignore
)
9 changes: 8 additions & 1 deletion src/pip/_internal/commands/show.py
@@ -66,6 +66,7 @@ class _PackageInfo(NamedTuple):
author: str
author_email: str
license: str
license_expression: str
entry_points: List[str]
files: Optional[List[str]]

@@ -161,6 +162,7 @@ def _get_requiring_packages(current_dist: BaseDistribution) -> Iterator[str]:
author=metadata.get("Author", ""),
author_email=metadata.get("Author-email", ""),
license=metadata.get("License", ""),
license_expression=metadata.get("License-Expression", ""),
entry_points=entry_points,
files=files,
)
@@ -180,13 +182,18 @@ def print_results(
if i > 0:
write_output("---")

metadata_version_tuple = tuple(map(int, dist.metadata_version.split(".")))

write_output("Name: %s", dist.name)
write_output("Version: %s", dist.version)
write_output("Summary: %s", dist.summary)
write_output("Home-page: %s", dist.homepage)
write_output("Author: %s", dist.author)
write_output("Author-email: %s", dist.author_email)
write_output("License: %s", dist.license)
if metadata_version_tuple >= (2, 4) and dist.license_expression:
write_output("License-Expression: %s", dist.license_expression)
else:
write_output("License: %s", dist.license)
write_output("Location: %s", dist.location)
if dist.editable_project_location is not None:
write_output(
2 changes: 1 addition & 1 deletion src/pip/_internal/configuration.py
@@ -330,7 +330,7 @@ def iter_config_files(self) -> Iterable[Tuple[Kind, List[str]]]:
This should be treated like items of a dictionary. The order
here doesn't affect what gets overridden. That is controlled
by OVERRIDE_ORDER. However this does control the order they are
displayed to the user. It's probably most ergononmic to display
displayed to the user. It's probably most ergonomic to display
things in the same order as OVERRIDE_ORDER
"""
# SMELL: Move the conditions out of this function
73 changes: 41 additions & 32 deletions src/pip/_internal/index/package_finder.py
@@ -334,44 +334,30 @@ class CandidatePreferences:
allow_all_prereleases: bool = False


@dataclass(frozen=True)
class BestCandidateResult:
"""A collection of candidates, returned by `PackageFinder.find_best_candidate`.
This class is only intended to be instantiated by CandidateEvaluator's
`compute_best_candidate()` method.
"""

def __init__(
self,
candidates: List[InstallationCandidate],
applicable_candidates: List[InstallationCandidate],
best_candidate: Optional[InstallationCandidate],
) -> None:
"""
:param candidates: A sequence of all available candidates found.
:param applicable_candidates: The applicable candidates.
:param best_candidate: The most preferred candidate found, or None
if no applicable candidates were found.
"""
assert set(applicable_candidates) <= set(candidates)

if best_candidate is None:
assert not applicable_candidates
else:
assert best_candidate in applicable_candidates
self._applicable_candidates = applicable_candidates
self._candidates = candidates
:param all_candidates: A sequence of all available candidates found.
:param applicable_candidates: The applicable candidates.
:param best_candidate: The most preferred candidate found, or None
if no applicable candidates were found.
"""

self.best_candidate = best_candidate
all_candidates: List[InstallationCandidate]
applicable_candidates: List[InstallationCandidate]
best_candidate: Optional[InstallationCandidate]

def iter_all(self) -> Iterable[InstallationCandidate]:
"""Iterate through all candidates."""
return iter(self._candidates)
def __post_init__(self) -> None:
assert set(self.applicable_candidates) <= set(self.all_candidates)

def iter_applicable(self) -> Iterable[InstallationCandidate]:
"""Iterate through the applicable candidates."""
return iter(self._applicable_candidates)
if self.best_candidate is None:
assert not self.applicable_candidates
else:
assert self.best_candidate in self.applicable_candidates
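The refactor above replaces a hand-written `__init__` with a frozen dataclass whose invariants run once in `__post_init__`. A simplified mirror of the pattern (names shortened from pip's `BestCandidateResult`):

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass(frozen=True)
class Result:
    all_items: List[str]
    applicable: List[str]
    best: Optional[str]

    def __post_init__(self) -> None:
        # Same invariants as the class above: applicable is a subset of all,
        # and best is drawn from applicable (or None when it is empty).
        assert set(self.applicable) <= set(self.all_items)
        if self.best is None:
            assert not self.applicable
        else:
            assert self.best in self.applicable
```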


class CandidateEvaluator:
@@ -675,11 +661,29 @@ def find_links(self) -> List[str]:
def index_urls(self) -> List[str]:
return self.search_scope.index_urls

@property
def proxy(self) -> Optional[str]:
return self._link_collector.session.pip_proxy

@property
def trusted_hosts(self) -> Iterable[str]:
for host_port in self._link_collector.session.pip_trusted_origins:
yield build_netloc(*host_port)

@property
def custom_cert(self) -> Optional[str]:
# session.verify is either a boolean (use default bundle/no SSL
# verification) or a string path to a custom CA bundle to use. We only
# care about the latter.
verify = self._link_collector.session.verify
return verify if isinstance(verify, str) else None

@property
def client_cert(self) -> Optional[str]:
cert = self._link_collector.session.cert
assert not isinstance(cert, tuple), "pip only supports PEM client certs"
return cert

@property
def allow_all_prereleases(self) -> bool:
return self._candidate_prefs.allow_all_prereleases
@@ -732,6 +736,11 @@ def _sort_links(self, links: Iterable[Link]) -> List[Link]:
return no_eggs + eggs

def _log_skipped_link(self, link: Link, result: LinkType, detail: str) -> None:
# This is a hot method so don't waste time hashing links unless we're
# actually going to log 'em.
if not logger.isEnabledFor(logging.DEBUG):
return

entry = (link, result, detail)
if entry not in self._logged_links:
# Put the link at the end so the reason is more visible and because
@@ -929,7 +938,7 @@ def _format_versions(cand_iter: Iterable[InstallationCandidate]) -> str:
"Could not find a version that satisfies the requirement %s "
"(from versions: %s)",
req,
_format_versions(best_candidate_result.iter_all()),
_format_versions(best_candidate_result.all_candidates),
)

raise DistributionNotFound(f"No matching distribution found for {req}")
@@ -963,15 +972,15 @@ def _should_install_candidate(
logger.debug(
"Using version %s (newest of versions: %s)",
best_candidate.version,
_format_versions(best_candidate_result.iter_applicable()),
_format_versions(best_candidate_result.applicable_candidates),
)
return best_candidate

# We have an existing version, and it's the best version
logger.debug(
"Installed version (%s) is most up-to-date (past versions: %s)",
installed_version,
_format_versions(best_candidate_result.iter_applicable()),
_format_versions(best_candidate_result.applicable_candidates),
)
raise BestVersionAlreadyInstalled

4 changes: 2 additions & 2 deletions src/pip/_internal/metadata/__init__.py
@@ -30,7 +30,7 @@ def _should_use_importlib_metadata() -> bool:
"""Whether to use the ``importlib.metadata`` or ``pkg_resources`` backend.
By default, pip uses ``importlib.metadata`` on Python 3.11+, and
``pkg_resourcess`` otherwise. This can be overridden by a couple of ways:
``pkg_resources`` otherwise. This can be overridden by a couple of ways:
* If environment variable ``_PIP_USE_IMPORTLIB_METADATA`` is set, it
dictates whether ``importlib.metadata`` is used, regardless of Python
@@ -71,7 +71,7 @@ def get_default_environment() -> BaseEnvironment:
This returns an Environment instance from the chosen backend. The default
Environment instance should be built from ``sys.path`` and may use caching
to share instance state accorss calls.
to share instance state across calls.
"""
return select_backend().Environment.default()

2 changes: 2 additions & 0 deletions src/pip/_internal/metadata/_json.py
@@ -23,6 +23,8 @@
("Maintainer", False),
("Maintainer-email", False),
("License", False),
("License-Expression", False),
("License-File", True),
("Classifier", True),
("Requires-Dist", True),
("Requires-Python", False),
8 changes: 7 additions & 1 deletion src/pip/_internal/metadata/importlib/_dists.py
@@ -2,6 +2,7 @@
import importlib.metadata
import pathlib
import zipfile
from os import PathLike
from typing import (
Collection,
Dict,
@@ -95,6 +96,11 @@ def read_text(self, filename: str) -> Optional[str]:
raise UnsupportedWheel(error)
return text

def locate_file(self, path: str | PathLike[str]) -> pathlib.Path:
# This method doesn't make sense for our in-memory wheel, but the API
# requires us to define it.
raise NotImplementedError


class Distribution(BaseDistribution):
def __init__(
@@ -190,7 +196,7 @@ def read_text(self, path: InfoPath) -> str:
return content

def iter_entry_points(self) -> Iterable[BaseEntryPoint]:
# importlib.metadata's EntryPoint structure sasitfies BaseEntryPoint.
# importlib.metadata's EntryPoint structure satisfies BaseEntryPoint.
return self._dist.entry_points

def _metadata_impl(self) -> email.message.Message:
32 changes: 23 additions & 9 deletions src/pip/_internal/models/link.py
@@ -170,12 +170,23 @@ def _ensure_quoted_url(url: str) -> str:
and without double-quoting other characters.
"""
# Split the URL into parts according to the general structure
# `scheme://netloc/path;parameters?query#fragment`.
result = urllib.parse.urlparse(url)
# `scheme://netloc/path?query#fragment`.
result = urllib.parse.urlsplit(url)
# If the netloc is empty, then the URL refers to a local filesystem path.
is_local_path = not result.netloc
path = _clean_url_path(result.path, is_local_path=is_local_path)
return urllib.parse.urlunparse(result._replace(path=path))
return urllib.parse.urlunsplit(result._replace(path=path))


def _absolute_link_url(base_url: str, url: str) -> str:
"""
A faster implementation of urllib.parse.urljoin with a shortcut
for absolute http/https URLs.
"""
if url.startswith(("https://", "http://")):
return url
else:
return urllib.parse.urljoin(base_url, url)
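The fast path above avoids a comparatively expensive `urljoin()` parse for the common case where an index page's file URL is already absolute. A self-contained sketch of the same shortcut:

```python
import urllib.parse

def absolute_link_url(base_url: str, url: str) -> str:
    # Absolute http(s) URLs are returned unchanged; only relative
    # references pay the cost of urljoin().
    if url.startswith(("https://", "http://")):
        return url
    return urllib.parse.urljoin(base_url, url)
```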


@functools.total_ordering
@@ -185,6 +196,7 @@ class Link:
__slots__ = [
"_parsed_url",
"_url",
"_path",
"_hashes",
"comes_from",
"requires_python",
@@ -241,6 +253,8 @@ def __init__(
# Store the url as a private attribute to prevent accidentally
# trying to set a new value.
self._url = url
# The .path property is hot, so calculate its value ahead of time.
self._path = urllib.parse.unquote(self._parsed_url.path)

link_hash = LinkHash.find_hash_url_fragment(url)
hashes_from_link = {} if link_hash is None else link_hash.as_dict()
@@ -270,7 +284,7 @@ def from_json(
if file_url is None:
return None

url = _ensure_quoted_url(urllib.parse.urljoin(page_url, file_url))
url = _ensure_quoted_url(_absolute_link_url(page_url, file_url))
pyrequire = file_data.get("requires-python")
yanked_reason = file_data.get("yanked")
hashes = file_data.get("hashes", {})
@@ -322,7 +336,7 @@ def from_element(
if not href:
return None

url = _ensure_quoted_url(urllib.parse.urljoin(base_url, href))
url = _ensure_quoted_url(_absolute_link_url(base_url, href))
pyrequire = anchor_attribs.get("data-requires-python")
yanked_reason = anchor_attribs.get("data-yanked")

@@ -421,7 +435,7 @@ def netloc(self) -> str:

@property
def path(self) -> str:
return urllib.parse.unquote(self._parsed_url.path)
return self._path

def splitext(self) -> Tuple[str, str]:
return splitext(posixpath.basename(self.path.rstrip("/")))
@@ -452,10 +466,10 @@ def _egg_fragment(self) -> Optional[str]:
project_name = match.group(1)
if not self._project_name_re.match(project_name):
deprecated(
reason=f"{self} contains an egg fragment with a non-PEP 508 name",
reason=f"{self} contains an egg fragment with a non-PEP 508 name.",
replacement="to use the req @ url syntax, and remove the egg fragment",
gone_in="25.0",
issue=11617,
gone_in="25.1",
issue=13157,
)

return project_name
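This file also precomputes the hot `.path` value once in `__init__` instead of calling `urllib.parse.unquote` on every property access. A stripped-down sketch of that caching pattern (class name is illustrative):

```python
import urllib.parse

class MiniLink:
    __slots__ = ("_url", "_path")

    def __init__(self, url: str) -> None:
        self._url = url
        # Unquote the URL path once at construction time; the property
        # below then just returns the cached value.
        self._path = urllib.parse.unquote(urllib.parse.urlsplit(url).path)

    @property
    def path(self) -> str:
        return self._path
```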
12 changes: 12 additions & 0 deletions src/pip/_internal/network/cache.py
@@ -76,6 +76,18 @@ def _write(self, path: str, data: bytes) -> None:

with adjacent_tmp_file(path) as f:
f.write(data)
# Inherit the read/write permissions of the cache directory
# to enable multi-user cache use-cases.
mode = (
os.stat(self.directory).st_mode
& 0o666 # select read/write permissions of cache directory
| 0o600 # set owner read/write permissions
)
# Change permissions only if there is no risk of following a symlink.
if os.chmod in os.supports_fd:
os.chmod(f.fileno(), mode)
elif os.chmod in os.supports_follow_symlinks:
os.chmod(f.name, mode, follow_symlinks=False)

replace(f.name, path)
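The permission arithmetic above keeps the cache directory's read/write bits while always guaranteeing owner read/write. The mode calculation in isolation (helper name is illustrative):

```python
def inherited_mode(directory_mode: int) -> int:
    # Select the directory's read/write permission bits (mask 0o666),
    # then force owner read/write (0o600) so the owner is never locked out.
    return (directory_mode & 0o666) | 0o600
```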

1 change: 1 addition & 0 deletions src/pip/_internal/network/session.py
@@ -339,6 +339,7 @@ def __init__(
# Namespace the attribute with "pip_" just in case to prevent
# possible conflicts with the base class.
self.pip_trusted_origins: List[Tuple[str, Optional[int]]] = []
self.pip_proxy = None

# Attach our User Agent to the request
self.headers["User-Agent"] = user_agent()
1 change: 1 addition & 0 deletions src/pip/_internal/operations/build/metadata_editable.py
@@ -38,4 +38,5 @@ def generate_editable_metadata(
except InstallationSubprocessError as error:
raise MetadataGenerationFailed(package_details=details) from error

assert distinfo_dir is not None
return os.path.join(metadata_dir, distinfo_dir)
24 changes: 11 additions & 13 deletions src/pip/_internal/operations/freeze.py
@@ -1,9 +1,10 @@
import collections
import logging
import os
from dataclasses import dataclass, field
from typing import Container, Dict, Generator, Iterable, List, NamedTuple, Optional, Set

from pip._vendor.packaging.utils import canonicalize_name
from pip._vendor.packaging.utils import NormalizedName, canonicalize_name
from pip._vendor.packaging.version import InvalidVersion

from pip._internal.exceptions import BadCommand, InstallationError
@@ -220,19 +221,16 @@ def _get_editable_info(dist: BaseDistribution) -> _EditableInfo:
)


@dataclass(frozen=True)
class FrozenRequirement:
def __init__(
self,
name: str,
req: str,
editable: bool,
comments: Iterable[str] = (),
) -> None:
self.name = name
self.canonical_name = canonicalize_name(name)
self.req = req
self.editable = editable
self.comments = comments
name: str
req: str
editable: bool
comments: Iterable[str] = field(default_factory=tuple)

@property
def canonical_name(self) -> NormalizedName:
return canonicalize_name(self.name)

@classmethod
def from_dist(cls, dist: BaseDistribution) -> "FrozenRequirement":
2 changes: 1 addition & 1 deletion src/pip/_internal/pyproject.py
@@ -73,7 +73,7 @@ def load_pyproject_toml(
build_system = None

# The following cases must use PEP 517
# We check for use_pep517 being non-None and falsey because that means
# We check for use_pep517 being non-None and falsy because that means
# the user explicitly requested --no-use-pep517. The value 0 as
# opposed to False can occur when the value is provided via an
# environment variable or config file option (due to the quirk of
135 changes: 92 additions & 43 deletions src/pip/_internal/req/req_file.py
@@ -2,12 +2,16 @@
Requirements file parsing
"""

import codecs
import locale
import logging
import optparse
import os
import re
import shlex
import sys
import urllib.parse
from dataclasses import dataclass
from optparse import Values
from typing import (
TYPE_CHECKING,
@@ -25,7 +29,6 @@
from pip._internal.cli import cmdoptions
from pip._internal.exceptions import InstallationError, RequirementsFileParseError
from pip._internal.models.search_scope import SearchScope
from pip._internal.utils.encoding import auto_decode

if TYPE_CHECKING:
from pip._internal.index.package_finder import PackageFinder
@@ -81,52 +84,66 @@
str(o().dest) for o in SUPPORTED_OPTIONS_EDITABLE_REQ
]

# order of BOMS is important: codecs.BOM_UTF16_LE is a prefix of codecs.BOM_UTF32_LE
# so data.startswith(BOM_UTF16_LE) would be true for UTF32_LE data
BOMS: List[Tuple[bytes, str]] = [
(codecs.BOM_UTF8, "utf-8"),
(codecs.BOM_UTF32, "utf-32"),
(codecs.BOM_UTF32_BE, "utf-32-be"),
(codecs.BOM_UTF32_LE, "utf-32-le"),
(codecs.BOM_UTF16, "utf-16"),
(codecs.BOM_UTF16_BE, "utf-16-be"),
(codecs.BOM_UTF16_LE, "utf-16-le"),
]

PEP263_ENCODING_RE = re.compile(rb"coding[:=]\s*([-\w.]+)")
DEFAULT_ENCODING = "utf-8"

logger = logging.getLogger(__name__)


@dataclass(frozen=True)
class ParsedRequirement:
def __init__(
self,
requirement: str,
is_editable: bool,
comes_from: str,
constraint: bool,
options: Optional[Dict[str, Any]] = None,
line_source: Optional[str] = None,
) -> None:
self.requirement = requirement
self.is_editable = is_editable
self.comes_from = comes_from
self.options = options
self.constraint = constraint
self.line_source = line_source
# TODO: replace this with slots=True when dropping Python 3.9 support.
__slots__ = (
"requirement",
"is_editable",
"comes_from",
"constraint",
"options",
"line_source",
)

requirement: str
is_editable: bool
comes_from: str
constraint: bool
options: Optional[Dict[str, Any]]
line_source: Optional[str]


@dataclass(frozen=True)
class ParsedLine:
def __init__(
self,
filename: str,
lineno: int,
args: str,
opts: Values,
constraint: bool,
) -> None:
self.filename = filename
self.lineno = lineno
self.opts = opts
self.constraint = constraint

if args:
self.is_requirement = True
self.is_editable = False
self.requirement = args
elif opts.editables:
self.is_requirement = True
self.is_editable = True
__slots__ = ("filename", "lineno", "args", "opts", "constraint")

filename: str
lineno: int
args: str
opts: Values
constraint: bool

@property
def is_editable(self) -> bool:
return bool(self.opts.editables)

@property
def requirement(self) -> Optional[str]:
if self.args:
return self.args
elif self.is_editable:
# We don't support multiple -e on one line
self.requirement = opts.editables[0]
else:
self.is_requirement = False
return self.opts.editables[0]
return None
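The two classes above use a hand-written `__slots__` tuple on a frozen dataclass because `@dataclass(slots=True)` only exists from Python 3.10 onward (hence the TODO about dropping 3.9). A minimal sketch of the combination:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Parsed:
    # Manual slots: fields must not have class-level defaults, or they
    # would collide with the slot descriptors.
    __slots__ = ("filename", "lineno")

    filename: str
    lineno: int
```

Instances get no `__dict__`, and assignment is rejected by the frozen machinery.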


def parse_requirements(
@@ -179,7 +196,7 @@ def handle_requirement_line(
line.lineno,
)

assert line.is_requirement
assert line.requirement is not None

# get the options that apply to requirements
if line.is_editable:
@@ -301,7 +318,7 @@ def handle_line(
affect the finder.
"""

if line.is_requirement:
if line.requirement is not None:
parsed_req = handle_requirement_line(line, options)
return parsed_req
else:
@@ -340,7 +357,7 @@ def _parse_and_recurse(
parsed_files_stack: List[Dict[str, Optional[str]]],
) -> Generator[ParsedLine, None, None]:
for line in self._parse_file(filename, constraint):
if not line.is_requirement and (
if line.requirement is None and (
line.opts.requirements or line.opts.constraints
):
# parse a nested requirements file
@@ -568,7 +585,39 @@ def get_file_content(url: str, session: "PipSession") -> Tuple[str, str]:
# Assume this is a bare path.
try:
with open(url, "rb") as f:
content = auto_decode(f.read())
raw_content = f.read()
except OSError as exc:
raise InstallationError(f"Could not open requirements file: {exc}")

content = _decode_req_file(raw_content, url)

return url, content


def _decode_req_file(data: bytes, url: str) -> str:
for bom, encoding in BOMS:
if data.startswith(bom):
return data[len(bom) :].decode(encoding)

for line in data.split(b"\n")[:2]:
if line[0:1] == b"#":
result = PEP263_ENCODING_RE.search(line)
if result is not None:
encoding = result.groups()[0].decode("ascii")
return data.decode(encoding)

try:
return data.decode(DEFAULT_ENCODING)
except UnicodeDecodeError:
locale_encoding = locale.getpreferredencoding(False) or sys.getdefaultencoding()
        logger.warning(
"unable to decode data from %s with default encoding %s, "
"falling back to encoding from locale: %s. "
"If this is intentional you should specify the encoding with a "
"PEP-263 style comment, e.g. '# -*- coding: %s -*-'",
url,
DEFAULT_ENCODING,
locale_encoding,
locale_encoding,
)
return data.decode(locale_encoding)
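The BOM table earlier in this file is order-sensitive, as its comment notes: `codecs.BOM_UTF16_LE` (`b"\xff\xfe"`) is a prefix of `codecs.BOM_UTF32_LE` (`b"\xff\xfe\x00\x00"`), so the UTF-32 entries must be tested first. A minimal sketch of the detection loop:

```python
import codecs
from typing import List, Tuple

# UTF-32 entries come before UTF-16 on purpose; see comment above.
BOMS: List[Tuple[bytes, str]] = [
    (codecs.BOM_UTF8, "utf-8"),
    (codecs.BOM_UTF32_LE, "utf-32-le"),
    (codecs.BOM_UTF32_BE, "utf-32-be"),
    (codecs.BOM_UTF16_LE, "utf-16-le"),
    (codecs.BOM_UTF16_BE, "utf-16-be"),
]

def decode_with_bom(data: bytes) -> str:
    for bom, encoding in BOMS:
        if data.startswith(bom):
            # Strip the BOM before decoding the remainder.
            return data[len(bom):].decode(encoding)
    return data.decode("utf-8")
```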
4 changes: 2 additions & 2 deletions src/pip/_internal/req/req_install.py
@@ -837,7 +837,7 @@ def install(
"try using --config-settings editable_mode=compat. "
"Please consult the setuptools documentation for more information"
),
gone_in="25.0",
gone_in="25.1",
issue=11457,
)
if self.config_settings:
@@ -925,7 +925,7 @@ def check_legacy_setup_py_options(
reason="--build-option and --global-option are deprecated.",
issue=11859,
replacement="to use --config-settings",
gone_in="25.0",
gone_in=None,
)
logger.warning(
"Implying --no-binary=:all: due to the presence of "
2 changes: 1 addition & 1 deletion src/pip/_internal/resolution/resolvelib/factory.py
@@ -309,7 +309,7 @@ def iter_index_candidate_infos() -> Iterator[IndexCandidateInfo]:
specifier=specifier,
hashes=hashes,
)
icans = list(result.iter_applicable())
icans = result.applicable_candidates

# PEP 592: Yanked releases are ignored unless the specifier
# explicitly pins a version (via '==' or '===') that can be
10 changes: 9 additions & 1 deletion src/pip/_internal/self_outdated_check.py
@@ -26,7 +26,11 @@
get_best_invocation_for_this_python,
)
from pip._internal.utils.filesystem import adjacent_tmp_file, check_path_owner, replace
from pip._internal.utils.misc import ensure_dir
from pip._internal.utils.misc import (
ExternallyManagedEnvironment,
check_externally_managed,
ensure_dir,
)

_WEEK = datetime.timedelta(days=7)

@@ -231,6 +235,10 @@ def pip_self_version_check(session: PipSession, options: optparse.Values) -> Non
installed_dist = get_default_environment().get_distribution("pip")
if not installed_dist:
return
try:
check_externally_managed()
except ExternallyManagedEnvironment:
return

upgrade_prompt = _self_version_check_logic(
state=SelfCheckState(cache_dir=options.cache_dir),
36 changes: 0 additions & 36 deletions src/pip/_internal/utils/encoding.py

This file was deleted.

9 changes: 8 additions & 1 deletion src/pip/_internal/utils/logging.py
@@ -137,12 +137,19 @@ def __rich_console__(
yield Segment("\n")


class PipConsole(Console):
def on_broken_pipe(self) -> None:
# Reraise the original exception, rich 13.8.0+ exits by default
# instead, preventing our handler from firing.
raise BrokenPipeError() from None


class RichPipStreamHandler(RichHandler):
KEYWORDS: ClassVar[Optional[List[str]]] = []

def __init__(self, stream: Optional[TextIO], no_color: bool) -> None:
super().__init__(
console=Console(file=stream, no_color=no_color, soft_wrap=True),
console=PipConsole(file=stream, no_color=no_color, soft_wrap=True),
show_time=False,
show_level=False,
show_path=False,
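The `PipConsole` override above restores pre-13.8.0 rich behavior by re-raising the broken pipe instead of letting the console exit. A generic sketch of the hook-override pattern (the base class here is a stand-in, not rich's actual `Console`):

```python
class Console:
    # Stand-in for rich's hook, which closes output and exits by default.
    def on_broken_pipe(self) -> None:
        raise SystemExit(1)

class PipConsole(Console):
    def on_broken_pipe(self) -> None:
        # Re-raise so the application's own pipe handler can fire.
        raise BrokenPipeError() from None
```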
29 changes: 15 additions & 14 deletions src/pip/_internal/utils/misc.py
@@ -19,12 +19,13 @@
Any,
BinaryIO,
Callable,
Dict,
Generator,
Iterable,
Iterator,
List,
Mapping,
Optional,
Sequence,
TextIO,
Tuple,
Type,
@@ -667,7 +668,7 @@ def __init__(
def build_wheel(
self,
wheel_directory: str,
config_settings: Optional[Dict[str, Union[str, List[str]]]] = None,
config_settings: Optional[Mapping[str, Any]] = None,
metadata_directory: Optional[str] = None,
) -> str:
cs = self.config_holder.config_settings
@@ -678,15 +679,15 @@ def build_wheel(
def build_sdist(
self,
sdist_directory: str,
config_settings: Optional[Dict[str, Union[str, List[str]]]] = None,
config_settings: Optional[Mapping[str, Any]] = None,
) -> str:
cs = self.config_holder.config_settings
return super().build_sdist(sdist_directory, config_settings=cs)

def build_editable(
self,
wheel_directory: str,
config_settings: Optional[Dict[str, Union[str, List[str]]]] = None,
config_settings: Optional[Mapping[str, Any]] = None,
metadata_directory: Optional[str] = None,
) -> str:
cs = self.config_holder.config_settings
@@ -695,27 +696,27 @@ def build_editable(
)

def get_requires_for_build_wheel(
self, config_settings: Optional[Dict[str, Union[str, List[str]]]] = None
) -> List[str]:
self, config_settings: Optional[Mapping[str, Any]] = None
) -> Sequence[str]:
cs = self.config_holder.config_settings
return super().get_requires_for_build_wheel(config_settings=cs)

def get_requires_for_build_sdist(
self, config_settings: Optional[Dict[str, Union[str, List[str]]]] = None
) -> List[str]:
self, config_settings: Optional[Mapping[str, Any]] = None
) -> Sequence[str]:
cs = self.config_holder.config_settings
return super().get_requires_for_build_sdist(config_settings=cs)

def get_requires_for_build_editable(
self, config_settings: Optional[Dict[str, Union[str, List[str]]]] = None
) -> List[str]:
self, config_settings: Optional[Mapping[str, Any]] = None
) -> Sequence[str]:
cs = self.config_holder.config_settings
return super().get_requires_for_build_editable(config_settings=cs)

def prepare_metadata_for_build_wheel(
self,
metadata_directory: str,
config_settings: Optional[Dict[str, Union[str, List[str]]]] = None,
config_settings: Optional[Mapping[str, Any]] = None,
_allow_fallback: bool = True,
) -> str:
cs = self.config_holder.config_settings
@@ -728,9 +729,9 @@ def prepare_metadata_for_build_wheel(
def prepare_metadata_for_build_editable(
self,
metadata_directory: str,
config_settings: Optional[Dict[str, Union[str, List[str]]]] = None,
config_settings: Optional[Mapping[str, Any]] = None,
_allow_fallback: bool = True,
) -> str:
) -> Optional[str]:
cs = self.config_holder.config_settings
return super().prepare_metadata_for_build_editable(
metadata_directory=metadata_directory,
@@ -764,7 +765,7 @@ def warn_if_run_as_root() -> None:
logger.warning(
"Running pip as the 'root' user can result in broken permissions and "
"conflicting behaviour with the system package manager, possibly "
"rendering your system unusable."
"rendering your system unusable. "
"It is recommended to use a virtual environment instead: "
"https://pip.pypa.io/warnings/venv. "
"Use the --root-user-action option if you know what you are doing and "
1 change: 1 addition & 0 deletions src/pip/_internal/utils/packaging.py
@@ -11,6 +11,7 @@
logger = logging.getLogger(__name__)


@functools.lru_cache(maxsize=32)
def check_requires_python(
requires_python: Optional[str], version_info: Tuple[int, ...]
) -> bool:
2 changes: 1 addition & 1 deletion src/pip/_internal/utils/unpacking.py
@@ -176,7 +176,7 @@ def untar_file(filename: str, location: str) -> None:
)
mode = "r:*"

tar = tarfile.open(filename, mode, encoding="utf-8")
tar = tarfile.open(filename, mode, encoding="utf-8") # type: ignore
try:
leading = has_leading_dir([member.name for member in tar.getmembers()])

3 changes: 2 additions & 1 deletion src/pip/_vendor/cachecontrol/__init__.py
Original file line number Diff line number Diff line change
@@ -6,9 +6,10 @@
Make it easy to import from cachecontrol without long namespaces.
"""

__author__ = "Eric Larson"
__email__ = "eric@ionrock.org"
__version__ = "0.14.0"
__version__ = "0.14.1"

from pip._vendor.cachecontrol.adapter import CacheControlAdapter
from pip._vendor.cachecontrol.controller import CacheController
4 changes: 2 additions & 2 deletions src/pip/_vendor/cachecontrol/adapter.py
Original file line number Diff line number Diff line change
@@ -77,7 +77,7 @@ def send(

return resp

def build_response(
def build_response( # type: ignore[override]
self,
request: PreparedRequest,
response: HTTPResponse,
@@ -143,7 +143,7 @@ def _update_chunk_length(self: HTTPResponse) -> None:
_update_chunk_length, response
)

resp: Response = super().build_response(request, response) # type: ignore[no-untyped-call]
resp: Response = super().build_response(request, response)

# See if we should invalidate the cache.
if request.method in self.invalidating_methods and resp.ok:
1 change: 1 addition & 0 deletions src/pip/_vendor/cachecontrol/cache.py
@@ -6,6 +6,7 @@
The cache object API for implementing caches. The default is a thread
safe in-memory dictionary.
"""

from __future__ import annotations

from threading import Lock
2 changes: 1 addition & 1 deletion src/pip/_vendor/cachecontrol/caches/file_cache.py
@@ -6,7 +6,7 @@
import hashlib
import os
from textwrap import dedent
from typing import IO, TYPE_CHECKING, Union
from typing import IO, TYPE_CHECKING
from pathlib import Path

from pip._vendor.cachecontrol.cache import BaseCache, SeparateBodyBaseCache
1 change: 1 addition & 0 deletions src/pip/_vendor/cachecontrol/controller.py
@@ -5,6 +5,7 @@
"""
The httplib2 algorithms ported for use with requests.
"""

from __future__ import annotations

import calendar
4 changes: 2 additions & 2 deletions src/pip/_vendor/cachecontrol/filewrapper.py
@@ -38,10 +38,10 @@ def __init__(
self.__callback = callback

def __getattr__(self, name: str) -> Any:
# The vaguaries of garbage collection means that self.__fp is
# The vagaries of garbage collection means that self.__fp is
# not always set. By using __getattribute__ and the private
# name[0] allows looking up the attribute value and raising an
# AttributeError when it doesn't exist. This stop thigns from
# AttributeError when it doesn't exist. This stops things from
# infinitely recursing calls to getattr in the case where
# self.__fp hasn't been set.
#
5 changes: 4 additions & 1 deletion src/pip/_vendor/cachecontrol/heuristics.py
@@ -68,7 +68,10 @@ def update_headers(self, response: HTTPResponse) -> dict[str, str]:

if "expires" not in response.headers:
date = parsedate(response.headers["date"])
expires = expire_after(timedelta(days=1), date=datetime(*date[:6], tzinfo=timezone.utc)) # type: ignore[index,misc]
expires = expire_after(
timedelta(days=1),
date=datetime(*date[:6], tzinfo=timezone.utc), # type: ignore[index,misc]
)
headers["expires"] = datetime_to_header(expires)
headers["cache-control"] = "public"
return headers
3 changes: 2 additions & 1 deletion src/pip/_vendor/idna/__init__.py
@@ -1,4 +1,3 @@
from .package_data import __version__
from .core import (
IDNABidiError,
IDNAError,
@@ -20,8 +19,10 @@
valid_string_length,
)
from .intranges import intranges_contain
from .package_data import __version__

__all__ = [
"__version__",
"IDNABidiError",
"IDNAError",
"InvalidCodepoint",
58 changes: 31 additions & 27 deletions src/pip/_vendor/idna/codec.py
@@ -1,49 +1,51 @@
from .core import encode, decode, alabel, ulabel, IDNAError
import codecs
import re
from typing import Any, Tuple, Optional
from typing import Any, Optional, Tuple

_unicode_dots_re = re.compile('[\u002e\u3002\uff0e\uff61]')
from .core import IDNAError, alabel, decode, encode, ulabel

_unicode_dots_re = re.compile("[\u002e\u3002\uff0e\uff61]")

class Codec(codecs.Codec):

def encode(self, data: str, errors: str = 'strict') -> Tuple[bytes, int]:
if errors != 'strict':
raise IDNAError('Unsupported error handling \"{}\"'.format(errors))
class Codec(codecs.Codec):
def encode(self, data: str, errors: str = "strict") -> Tuple[bytes, int]:
if errors != "strict":
raise IDNAError('Unsupported error handling "{}"'.format(errors))

if not data:
return b"", 0

return encode(data), len(data)

def decode(self, data: bytes, errors: str = 'strict') -> Tuple[str, int]:
if errors != 'strict':
raise IDNAError('Unsupported error handling \"{}\"'.format(errors))
def decode(self, data: bytes, errors: str = "strict") -> Tuple[str, int]:
if errors != "strict":
raise IDNAError('Unsupported error handling "{}"'.format(errors))

if not data:
return '', 0
return "", 0

return decode(data), len(data)


class IncrementalEncoder(codecs.BufferedIncrementalEncoder):
def _buffer_encode(self, data: str, errors: str, final: bool) -> Tuple[bytes, int]:
if errors != 'strict':
raise IDNAError('Unsupported error handling \"{}\"'.format(errors))
if errors != "strict":
raise IDNAError('Unsupported error handling "{}"'.format(errors))

if not data:
return b'', 0
return b"", 0

labels = _unicode_dots_re.split(data)
trailing_dot = b''
trailing_dot = b""
if labels:
if not labels[-1]:
trailing_dot = b'.'
trailing_dot = b"."
del labels[-1]
elif not final:
# Keep potentially unfinished label until the next call
del labels[-1]
if labels:
trailing_dot = b'.'
trailing_dot = b"."

result = []
size = 0
@@ -54,32 +56,33 @@ def _buffer_encode(self, data: str, errors: str, final: bool) -> Tuple[bytes, in
size += len(label)

# Join with U+002E
result_bytes = b'.'.join(result) + trailing_dot
result_bytes = b".".join(result) + trailing_dot
size += len(trailing_dot)
return result_bytes, size


class IncrementalDecoder(codecs.BufferedIncrementalDecoder):
def _buffer_decode(self, data: Any, errors: str, final: bool) -> Tuple[str, int]:
if errors != 'strict':
raise IDNAError('Unsupported error handling \"{}\"'.format(errors))
if errors != "strict":
raise IDNAError('Unsupported error handling "{}"'.format(errors))

if not data:
return ('', 0)
return ("", 0)

if not isinstance(data, str):
data = str(data, 'ascii')
data = str(data, "ascii")

labels = _unicode_dots_re.split(data)
trailing_dot = ''
trailing_dot = ""
if labels:
if not labels[-1]:
trailing_dot = '.'
trailing_dot = "."
del labels[-1]
elif not final:
# Keep potentially unfinished label until the next call
del labels[-1]
if labels:
trailing_dot = '.'
trailing_dot = "."

result = []
size = 0
@@ -89,7 +92,7 @@ def _buffer_decode(self, data: Any, errors: str, final: bool) -> Tuple[str, int]
size += 1
size += len(label)

result_str = '.'.join(result) + trailing_dot
result_str = ".".join(result) + trailing_dot
size += len(trailing_dot)
return (result_str, size)

@@ -103,7 +106,7 @@ class StreamReader(Codec, codecs.StreamReader):


def search_function(name: str) -> Optional[codecs.CodecInfo]:
if name != 'idna2008':
if name != "idna2008":
return None
return codecs.CodecInfo(
name=name,
@@ -115,4 +118,5 @@ def search_function(name: str) -> Optional[codecs.CodecInfo]:
streamreader=StreamReader,
)


codecs.register(search_function)
10 changes: 6 additions & 4 deletions src/pip/_vendor/idna/compat.py
Original file line number Diff line number Diff line change
@@ -1,13 +1,15 @@
from .core import *
from .codec import *
from typing import Any, Union

from .core import decode, encode


def ToASCII(label: str) -> bytes:
return encode(label)


def ToUnicode(label: Union[bytes, bytearray]) -> str:
return decode(label)

def nameprep(s: Any) -> None:
raise NotImplementedError('IDNA 2008 does not utilise nameprep protocol')

def nameprep(s: Any) -> None:
raise NotImplementedError("IDNA 2008 does not utilise nameprep protocol")
280 changes: 161 additions & 119 deletions src/pip/_vendor/idna/core.py

Large diffs are not rendered by default.

7,076 changes: 3,537 additions & 3,539 deletions src/pip/_vendor/idna/idnadata.py

Large diffs are not rendered by default.

11 changes: 7 additions & 4 deletions src/pip/_vendor/idna/intranges.py
@@ -8,6 +8,7 @@
import bisect
from typing import List, Tuple


def intranges_from_list(list_: List[int]) -> Tuple[int, ...]:
"""Represent a list of integers as a sequence of ranges:
((start_0, end_0), (start_1, end_1), ...), such that the original
@@ -20,18 +21,20 @@ def intranges_from_list(list_: List[int]) -> Tuple[int, ...]:
ranges = []
last_write = -1
for i in range(len(sorted_list)):
if i+1 < len(sorted_list):
if sorted_list[i] == sorted_list[i+1]-1:
if i + 1 < len(sorted_list):
if sorted_list[i] == sorted_list[i + 1] - 1:
continue
current_range = sorted_list[last_write+1:i+1]
current_range = sorted_list[last_write + 1 : i + 1]
ranges.append(_encode_range(current_range[0], current_range[-1] + 1))
last_write = i

return tuple(ranges)


def _encode_range(start: int, end: int) -> int:
return (start << 32) | end


def _decode_range(r: int) -> Tuple[int, int]:
return (r >> 32), (r & ((1 << 32) - 1))

@@ -43,7 +46,7 @@ def intranges_contain(int_: int, ranges: Tuple[int, ...]) -> bool:
# we could be immediately ahead of a tuple (start, end)
# with start < int_ <= end
if pos > 0:
left, right = _decode_range(ranges[pos-1])
left, right = _decode_range(ranges[pos - 1])
if left <= int_ < right:
return True
# or we could be immediately behind a tuple (int_, end)
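Aside (not part of the diff): the intranges changes are whitespace-only, but the 32-bit packing they reformat is worth seeing in isolation. A minimal sketch of the same encoding:

```python
# Each half-open (start, end) range is packed into one int as
# (start << 32) | end, so a sorted tuple of these ints can be
# binary-searched with bisect.
def encode_range(start: int, end: int) -> int:
    return (start << 32) | end

def decode_range(r: int):
    return (r >> 32), (r & ((1 << 32) - 1))

packed = encode_range(65, 91)  # covers ord('A')..ord('Z')
print(decode_range(packed))    # (65, 91)
```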
3 changes: 1 addition & 2 deletions src/pip/_vendor/idna/package_data.py
@@ -1,2 +1 @@
__version__ = '3.7'

__version__ = "3.10"
16,439 changes: 8,261 additions & 8,178 deletions src/pip/_vendor/idna/uts46data.py

Large diffs are not rendered by default.

16 changes: 8 additions & 8 deletions src/pip/_vendor/msgpack/__init__.py
@@ -1,20 +1,20 @@
from .exceptions import *
from .ext import ExtType, Timestamp

# ruff: noqa: F401
import os

from .exceptions import * # noqa: F403
from .ext import ExtType, Timestamp

version = (1, 0, 8)
__version__ = "1.0.8"
version = (1, 1, 0)
__version__ = "1.1.0"


if os.environ.get("MSGPACK_PUREPYTHON"):
from .fallback import Packer, unpackb, Unpacker
from .fallback import Packer, Unpacker, unpackb
else:
try:
from ._cmsgpack import Packer, unpackb, Unpacker
from ._cmsgpack import Packer, Unpacker, unpackb
except ImportError:
from .fallback import Packer, unpackb, Unpacker
from .fallback import Packer, Unpacker, unpackb


def pack(o, stream, **kwargs):
8 changes: 5 additions & 3 deletions src/pip/_vendor/msgpack/ext.py
@@ -1,6 +1,6 @@
from collections import namedtuple
import datetime
import struct
from collections import namedtuple


class ExtType(namedtuple("ExtType", "code data")):
@@ -157,12 +157,14 @@ def to_datetime(self):
:rtype: `datetime.datetime`
"""
utc = datetime.timezone.utc
return datetime.datetime.fromtimestamp(0, utc) + datetime.timedelta(seconds=self.to_unix())
return datetime.datetime.fromtimestamp(0, utc) + datetime.timedelta(
seconds=self.seconds, microseconds=self.nanoseconds // 1000
)

@staticmethod
def from_datetime(dt):
"""Create a Timestamp from datetime with tzinfo.
:rtype: Timestamp
"""
return Timestamp.from_unix(dt.timestamp())
return Timestamp(seconds=int(dt.timestamp()), nanoseconds=dt.microsecond * 1000)
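Aside (not part of the diff): the `to_datetime` change above replaces the float-based `to_unix()` path with integer arithmetic. A standalone sketch of the same conversion, using only the stdlib:

```python
from datetime import datetime, timedelta, timezone

# Sketch of the new exact conversion: build from the epoch using integer
# seconds plus nanoseconds truncated to microseconds, avoiding the float
# rounding the old to_unix()-based path was subject to.
def ts_to_datetime(seconds: int, nanoseconds: int) -> datetime:
    utc = timezone.utc
    return datetime.fromtimestamp(0, utc) + timedelta(
        seconds=seconds, microseconds=nanoseconds // 1000
    )

dt = ts_to_datetime(1_700_000_000, 123_456_789)
print(dt.microsecond)  # 123456
```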
80 changes: 29 additions & 51 deletions src/pip/_vendor/msgpack/fallback.py
@@ -1,27 +1,22 @@
"""Fallback pure Python implementation of msgpack"""
from datetime import datetime as _DateTime
import sys
import struct

import struct
import sys
from datetime import datetime as _DateTime

if hasattr(sys, "pypy_version_info"):
# StringIO is slow on PyPy, StringIO is faster. However: PyPy's own
# StringBuilder is fastest.
from __pypy__ import newlist_hint
from __pypy__.builders import BytesBuilder

try:
from __pypy__.builders import BytesBuilder as StringBuilder
except ImportError:
from __pypy__.builders import StringBuilder
USING_STRINGBUILDER = True
_USING_STRINGBUILDER = True

class StringIO:
class BytesIO:
def __init__(self, s=b""):
if s:
self.builder = StringBuilder(len(s))
self.builder = BytesBuilder(len(s))
self.builder.append(s)
else:
self.builder = StringBuilder()
self.builder = BytesBuilder()

def write(self, s):
if isinstance(s, memoryview):
@@ -34,17 +29,17 @@ def getvalue(self):
return self.builder.build()

else:
USING_STRINGBUILDER = False
from io import BytesIO as StringIO
from io import BytesIO

newlist_hint = lambda size: []
_USING_STRINGBUILDER = False

def newlist_hint(size):
return []

from .exceptions import BufferFull, OutOfData, ExtraData, FormatError, StackError

from .exceptions import BufferFull, ExtraData, FormatError, OutOfData, StackError
from .ext import ExtType, Timestamp


EX_SKIP = 0
EX_CONSTRUCT = 1
EX_READ_ARRAY_HEADER = 2
@@ -231,6 +226,7 @@ class Unpacker:
def __init__(
self,
file_like=None,
*,
read_size=0,
use_list=True,
raw=False,
@@ -333,6 +329,7 @@ def feed(self, next_bytes):

# Use extend here: INPLACE_ADD += doesn't reliably typecast memoryview in jython
self._buffer.extend(view)
view.release()

def _consume(self):
"""Gets rid of the used parts of the buffer."""
@@ -649,50 +646,31 @@ class Packer:
The error handler for encoding unicode. (default: 'strict')
DO NOT USE THIS!! This option is kept for very specific usage.
Example of streaming deserialize from file-like object::
unpacker = Unpacker(file_like)
for o in unpacker:
process(o)
Example of streaming deserialize from socket::
unpacker = Unpacker()
while True:
buf = sock.recv(1024**2)
if not buf:
break
unpacker.feed(buf)
for o in unpacker:
process(o)
Raises ``ExtraData`` when *packed* contains extra bytes.
Raises ``OutOfData`` when *packed* is incomplete.
Raises ``FormatError`` when *packed* is not valid msgpack.
Raises ``StackError`` when *packed* contains too nested.
Other exceptions can be raised during unpacking.
:param int buf_size:
Internal buffer size. This option is used only for C implementation.
"""

def __init__(
self,
*,
default=None,
use_single_float=False,
autoreset=True,
use_bin_type=True,
strict_types=False,
datetime=False,
unicode_errors=None,
buf_size=None,
):
self._strict_types = strict_types
self._use_float = use_single_float
self._autoreset = autoreset
self._use_bin_type = use_bin_type
self._buffer = StringIO()
self._buffer = BytesIO()
self._datetime = bool(datetime)
self._unicode_errors = unicode_errors or "strict"
if default is not None:
if not callable(default):
raise TypeError("default must be callable")
if default is not None and not callable(default):
raise TypeError("default must be callable")
self._default = default

def _pack(
@@ -823,18 +801,18 @@ def pack(self, obj):
try:
self._pack(obj)
except:
self._buffer = StringIO() # force reset
self._buffer = BytesIO() # force reset
raise
if self._autoreset:
ret = self._buffer.getvalue()
self._buffer = StringIO()
self._buffer = BytesIO()
return ret

def pack_map_pairs(self, pairs):
self._pack_map_pairs(len(pairs), pairs)
if self._autoreset:
ret = self._buffer.getvalue()
self._buffer = StringIO()
self._buffer = BytesIO()
return ret

def pack_array_header(self, n):
@@ -843,7 +821,7 @@ def pack_array_header(self, n):
self._pack_array_header(n)
if self._autoreset:
ret = self._buffer.getvalue()
self._buffer = StringIO()
self._buffer = BytesIO()
return ret

def pack_map_header(self, n):
@@ -852,7 +830,7 @@ def pack_map_header(self, n):
self._pack_map_header(n)
if self._autoreset:
ret = self._buffer.getvalue()
self._buffer = StringIO()
self._buffer = BytesIO()
return ret

def pack_ext_type(self, typecode, data):
@@ -941,11 +919,11 @@ def reset(self):
This method is useful only when autoreset=False.
"""
self._buffer = StringIO()
self._buffer = BytesIO()

def getbuffer(self):
"""Return view of internal buffer."""
if USING_STRINGBUILDER:
if _USING_STRINGBUILDER:
return memoryview(self.bytes())
else:
return self._buffer.getbuffer()
4 changes: 2 additions & 2 deletions src/pip/_vendor/packaging/__init__.py
@@ -6,10 +6,10 @@
__summary__ = "Core utilities for Python packages"
__uri__ = "https://github.com/pypa/packaging"

__version__ = "24.1"
__version__ = "24.2"

__author__ = "Donald Stufft and individual contributors"
__email__ = "donald@stufft.io"

__license__ = "BSD-2-Clause or Apache-2.0"
__copyright__ = "2014 %s" % __author__
__copyright__ = f"2014 {__author__}"
8 changes: 4 additions & 4 deletions src/pip/_vendor/packaging/_elffile.py
@@ -48,8 +48,8 @@ def __init__(self, f: IO[bytes]) -> None:

try:
ident = self._read("16B")
except struct.error:
raise ELFInvalid("unable to parse identification")
except struct.error as e:
raise ELFInvalid("unable to parse identification") from e
magic = bytes(ident[:4])
if magic != b"\x7fELF":
raise ELFInvalid(f"invalid magic: {magic!r}")
@@ -67,11 +67,11 @@ def __init__(self, f: IO[bytes]) -> None:
(2, 1): ("<HHIQQQIHHH", "<IIQQQQQQ", (0, 2, 5)), # 64-bit LSB.
(2, 2): (">HHIQQQIHHH", ">IIQQQQQQ", (0, 2, 5)), # 64-bit MSB.
}[(self.capacity, self.encoding)]
except KeyError:
except KeyError as e:
raise ELFInvalid(
f"unrecognized capacity ({self.capacity}) or "
f"encoding ({self.encoding})"
)
) from e

try:
(
1 change: 1 addition & 0 deletions src/pip/_vendor/packaging/_manylinux.py
@@ -164,6 +164,7 @@ def _parse_glibc_version(version_str: str) -> tuple[int, int]:
f"Expected glibc version with 2 components major.minor,"
f" got: {version_str}",
RuntimeWarning,
stacklevel=2,
)
return -1, -1
return int(m.group("major")), int(m.group("minor"))
145 changes: 145 additions & 0 deletions src/pip/_vendor/packaging/licenses/__init__.py
@@ -0,0 +1,145 @@
#######################################################################################
#
# Adapted from:
# https://github.com/pypa/hatch/blob/5352e44/backend/src/hatchling/licenses/parse.py
#
# MIT License
#
# Copyright (c) 2017-present Ofek Lev <oss@ofek.dev>
#
# Permission is hereby granted, free of charge, to any person obtaining a copy of this
# software and associated documentation files (the "Software"), to deal in the Software
# without restriction, including without limitation the rights to use, copy, modify,
# merge, publish, distribute, sublicense, and/or sell copies of the Software, and to
# permit persons to whom the Software is furnished to do so, subject to the following
# conditions:
#
# The above copyright notice and this permission notice shall be included in all copies
# or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED,
# INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A
# PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
# HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF
# CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE
# OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
#
#
# With additional allowance of arbitrary `LicenseRef-` identifiers, not just
# `LicenseRef-Public-Domain` and `LicenseRef-Proprietary`.
#
#######################################################################################
from __future__ import annotations

import re
from typing import NewType, cast

from pip._vendor.packaging.licenses._spdx import EXCEPTIONS, LICENSES

__all__ = [
"NormalizedLicenseExpression",
"InvalidLicenseExpression",
"canonicalize_license_expression",
]

license_ref_allowed = re.compile("^[A-Za-z0-9.-]*$")

NormalizedLicenseExpression = NewType("NormalizedLicenseExpression", str)


class InvalidLicenseExpression(ValueError):
"""Raised when a license-expression string is invalid
>>> canonicalize_license_expression("invalid")
Traceback (most recent call last):
...
packaging.licenses.InvalidLicenseExpression: Invalid license expression: 'invalid'
"""


def canonicalize_license_expression(
raw_license_expression: str,
) -> NormalizedLicenseExpression:
if not raw_license_expression:
message = f"Invalid license expression: {raw_license_expression!r}"
raise InvalidLicenseExpression(message)

# Pad any parentheses so tokenization can be achieved by merely splitting on
# whitespace.
license_expression = raw_license_expression.replace("(", " ( ").replace(")", " ) ")
licenseref_prefix = "LicenseRef-"
license_refs = {
ref.lower(): "LicenseRef-" + ref[len(licenseref_prefix) :]
for ref in license_expression.split()
if ref.lower().startswith(licenseref_prefix.lower())
}

# Normalize to lower case so we can look up licenses/exceptions
# and so boolean operators are Python-compatible.
license_expression = license_expression.lower()

tokens = license_expression.split()

# Rather than implementing boolean logic, we create an expression that Python can
# parse. Everything that is not involved with the grammar itself is treated as
# `False` and the expression should evaluate as such.
python_tokens = []
for token in tokens:
if token not in {"or", "and", "with", "(", ")"}:
python_tokens.append("False")
elif token == "with":
python_tokens.append("or")
elif token == "(" and python_tokens and python_tokens[-1] not in {"or", "and"}:
message = f"Invalid license expression: {raw_license_expression!r}"
raise InvalidLicenseExpression(message)
else:
python_tokens.append(token)

python_expression = " ".join(python_tokens)
try:
invalid = eval(python_expression, globals(), locals())
except Exception:
invalid = True

if invalid is not False:
message = f"Invalid license expression: {raw_license_expression!r}"
raise InvalidLicenseExpression(message) from None

# Take a final pass to check for unknown licenses/exceptions.
normalized_tokens = []
for token in tokens:
if token in {"or", "and", "with", "(", ")"}:
normalized_tokens.append(token.upper())
continue

if normalized_tokens and normalized_tokens[-1] == "WITH":
if token not in EXCEPTIONS:
message = f"Unknown license exception: {token!r}"
raise InvalidLicenseExpression(message)

normalized_tokens.append(EXCEPTIONS[token]["id"])
else:
if token.endswith("+"):
final_token = token[:-1]
suffix = "+"
else:
final_token = token
suffix = ""

if final_token.startswith("licenseref-"):
if not license_ref_allowed.match(final_token):
message = f"Invalid licenseref: {final_token!r}"
raise InvalidLicenseExpression(message)
normalized_tokens.append(license_refs[final_token] + suffix)
else:
if final_token not in LICENSES:
message = f"Unknown license: {final_token!r}"
raise InvalidLicenseExpression(message)
normalized_tokens.append(LICENSES[final_token]["id"] + suffix)

normalized_expression = " ".join(normalized_tokens)

return cast(
NormalizedLicenseExpression,
normalized_expression.replace("( ", "(").replace(" )", ")"),
)
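Aside (not part of the diff): the tokenization trick at the top of `canonicalize_license_expression` — padding parentheses so the whole expression splits on whitespace — is simple enough to sketch standalone:

```python
# Sketch of the tokenization step used above: padding "(" and ")" with
# spaces lets a single str.split() produce a clean token stream.
def tokenize(expr: str):
    return expr.replace("(", " ( ").replace(")", " ) ").split()

print(tokenize("(MIT OR Apache-2.0) AND LicenseRef-Custom"))
# ['(', 'MIT', 'OR', 'Apache-2.0', ')', 'AND', 'LicenseRef-Custom']
```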
759 changes: 759 additions & 0 deletions src/pip/_vendor/packaging/licenses/_spdx.py

Large diffs are not rendered by default.

24 changes: 15 additions & 9 deletions src/pip/_vendor/packaging/markers.py
@@ -18,9 +18,9 @@

__all__ = [
"InvalidMarker",
"Marker",
"UndefinedComparison",
"UndefinedEnvironmentName",
"Marker",
"default_environment",
]

@@ -232,7 +232,7 @@ def _evaluate_markers(markers: MarkerList, environment: dict[str, str]) -> bool:


def format_full_version(info: sys._version_info) -> str:
version = "{0.major}.{0.minor}.{0.micro}".format(info)
version = f"{info.major}.{info.minor}.{info.micro}"
kind = info.releaselevel
if kind != "final":
version += kind[0] + str(info.serial)
@@ -309,17 +309,23 @@ def evaluate(self, environment: dict[str, str] | None = None) -> bool:
"""
current_environment = cast("dict[str, str]", default_environment())
current_environment["extra"] = ""
# Work around platform.python_version() returning something that is not PEP 440
# compliant for non-tagged Python builds. We preserve default_environment()'s
# behavior of returning platform.python_version() verbatim, and leave it to the
# caller to provide a syntactically valid version if they want to override it.
if current_environment["python_full_version"].endswith("+"):
current_environment["python_full_version"] += "local"
if environment is not None:
current_environment.update(environment)
# The API used to allow setting extra to None. We need to handle this
# case for backwards compatibility.
if current_environment["extra"] is None:
current_environment["extra"] = ""

return _evaluate_markers(self._markers, current_environment)
return _evaluate_markers(
self._markers, _repair_python_full_version(current_environment)
)


def _repair_python_full_version(env: dict[str, str]) -> dict[str, str]:
"""
Work around platform.python_version() returning something that is not PEP 440
compliant for non-tagged Python builds.
"""
if env["python_full_version"].endswith("+"):
env["python_full_version"] += "local"
return env
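Aside (not part of the diff): the helper extracted above is self-contained, so its behavior can be demonstrated directly. PEP 440 forbids a bare trailing `+`, which non-tagged CPython builds can report via `platform.python_version()`:

```python
# Standalone reproduction of the extracted helper: append a local-version
# label so e.g. "3.13.0+" becomes the PEP 440-valid "3.13.0+local".
def repair_python_full_version(env: dict) -> dict:
    if env["python_full_version"].endswith("+"):
        env["python_full_version"] += "local"
    return env

print(repair_python_full_version({"python_full_version": "3.13.0+"}))
# {'python_full_version': '3.13.0+local'}
```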
107 changes: 83 additions & 24 deletions src/pip/_vendor/packaging/metadata.py
@@ -5,6 +5,8 @@
import email.message
import email.parser
import email.policy
import pathlib
import sys
import typing
from typing import (
Any,
@@ -15,15 +17,16 @@
cast,
)

from . import requirements, specifiers, utils
from . import licenses, requirements, specifiers, utils
from . import version as version_module
from .licenses import NormalizedLicenseExpression

T = typing.TypeVar("T")


try:
ExceptionGroup
except NameError: # pragma: no cover
if sys.version_info >= (3, 11): # pragma: no cover
ExceptionGroup = ExceptionGroup
else: # pragma: no cover

class ExceptionGroup(Exception):
"""A minimal implementation of :external:exc:`ExceptionGroup` from Python 3.11.
@@ -42,9 +45,6 @@ def __init__(self, message: str, exceptions: list[Exception]) -> None:
def __repr__(self) -> str:
return f"{self.__class__.__name__}({self.message!r}, {self.exceptions!r})"

else: # pragma: no cover
ExceptionGroup = ExceptionGroup


class InvalidMetadata(ValueError):
"""A metadata field contains invalid data."""
@@ -128,6 +128,10 @@ class RawMetadata(TypedDict, total=False):
# No new fields were added in PEP 685, just some edge case were
# tightened up to provide better interoptability.

# Metadata 2.4 - PEP 639
license_expression: str
license_files: list[str]


_STRING_FIELDS = {
"author",
@@ -137,6 +141,7 @@ class RawMetadata(TypedDict, total=False):
"download_url",
"home_page",
"license",
"license_expression",
"maintainer",
"maintainer_email",
"metadata_version",
@@ -149,6 +154,7 @@ class RawMetadata(TypedDict, total=False):
_LIST_FIELDS = {
"classifiers",
"dynamic",
"license_files",
"obsoletes",
"obsoletes_dist",
"platforms",
@@ -167,7 +173,7 @@ class RawMetadata(TypedDict, total=False):


def _parse_keywords(data: str) -> list[str]:
"""Split a string of comma-separate keyboards into a list of keywords."""
"""Split a string of comma-separated keywords into a list of keywords."""
return [k.strip() for k in data.split(",")]


@@ -216,16 +222,18 @@ def _get_payload(msg: email.message.Message, source: bytes | str) -> str:
# If our source is a str, then our caller has managed encodings for us,
# and we don't need to deal with it.
if isinstance(source, str):
payload: str = msg.get_payload()
payload = msg.get_payload()
assert isinstance(payload, str)
return payload
# If our source is a bytes, then we're managing the encoding and we need
# to deal with it.
else:
bpayload: bytes = msg.get_payload(decode=True)
bpayload = msg.get_payload(decode=True)
assert isinstance(bpayload, bytes)
try:
return bpayload.decode("utf8", "strict")
except UnicodeDecodeError:
raise ValueError("payload in an invalid encoding")
except UnicodeDecodeError as exc:
raise ValueError("payload in an invalid encoding") from exc


# The various parse_FORMAT functions here are intended to be as lenient as
@@ -251,6 +259,8 @@ def _get_payload(msg: email.message.Message, source: bytes | str) -> str:
"home-page": "home_page",
"keywords": "keywords",
"license": "license",
"license-expression": "license_expression",
"license-file": "license_files",
"maintainer": "maintainer",
"maintainer-email": "maintainer_email",
"metadata-version": "metadata_version",
@@ -426,7 +436,7 @@ def parse_email(data: bytes | str) -> tuple[RawMetadata, dict[str, list[str]]]:
payload = _get_payload(parsed, data)
except ValueError:
unparsed.setdefault("description", []).append(
parsed.get_payload(decode=isinstance(data, bytes))
parsed.get_payload(decode=isinstance(data, bytes)) # type: ignore[call-overload]
)
else:
if payload:
@@ -453,8 +463,8 @@ def parse_email(data: bytes | str) -> tuple[RawMetadata, dict[str, list[str]]]:


# Keep the two values in sync.
_VALID_METADATA_VERSIONS = ["1.0", "1.1", "1.2", "2.1", "2.2", "2.3"]
_MetadataVersion = Literal["1.0", "1.1", "1.2", "2.1", "2.2", "2.3"]
_VALID_METADATA_VERSIONS = ["1.0", "1.1", "1.2", "2.1", "2.2", "2.3", "2.4"]
_MetadataVersion = Literal["1.0", "1.1", "1.2", "2.1", "2.2", "2.3", "2.4"]

_REQUIRED_ATTRS = frozenset(["metadata_version", "name", "version"])

@@ -535,7 +545,7 @@ def _process_name(self, value: str) -> str:
except utils.InvalidName as exc:
raise self._invalid_metadata(
f"{value!r} is invalid for {{field}}", cause=exc
)
) from exc
else:
return value

@@ -547,7 +557,7 @@ def _process_version(self, value: str) -> version_module.Version:
except version_module.InvalidVersion as exc:
raise self._invalid_metadata(
f"{value!r} is invalid for {{field}}", cause=exc
)
) from exc

def _process_summary(self, value: str) -> str:
"""Check the field contains no newlines."""
@@ -591,10 +601,12 @@ def _process_dynamic(self, value: list[str]) -> list[str]:
for dynamic_field in map(str.lower, value):
if dynamic_field in {"name", "version", "metadata-version"}:
raise self._invalid_metadata(
f"{value!r} is not allowed as a dynamic field"
f"{dynamic_field!r} is not allowed as a dynamic field"
)
elif dynamic_field not in _EMAIL_TO_RAW_MAPPING:
raise self._invalid_metadata(f"{value!r} is not a valid dynamic field")
raise self._invalid_metadata(
f"{dynamic_field!r} is not a valid dynamic field"
)
return list(map(str.lower, value))

def _process_provides_extra(
@@ -608,7 +620,7 @@ def _process_provides_extra(
except utils.InvalidName as exc:
raise self._invalid_metadata(
f"{name!r} is invalid for {{field}}", cause=exc
)
) from exc
else:
return normalized_names

@@ -618,7 +630,7 @@ def _process_requires_python(self, value: str) -> specifiers.SpecifierSet:
except specifiers.InvalidSpecifier as exc:
raise self._invalid_metadata(
f"{value!r} is invalid for {{field}}", cause=exc
)
) from exc

def _process_requires_dist(
self,
@@ -629,10 +641,49 @@ def _process_requires_dist(
for req in value:
reqs.append(requirements.Requirement(req))
except requirements.InvalidRequirement as exc:
raise self._invalid_metadata(f"{req!r} is invalid for {{field}}", cause=exc)
raise self._invalid_metadata(
f"{req!r} is invalid for {{field}}", cause=exc
) from exc
else:
return reqs

def _process_license_expression(
self, value: str
) -> NormalizedLicenseExpression | None:
try:
return licenses.canonicalize_license_expression(value)
except ValueError as exc:
raise self._invalid_metadata(
f"{value!r} is invalid for {{field}}", cause=exc
) from exc

def _process_license_files(self, value: list[str]) -> list[str]:
paths = []
for path in value:
if ".." in path:
raise self._invalid_metadata(
f"{path!r} is invalid for {{field}}, "
"parent directory indicators are not allowed"
)
if "*" in path:
raise self._invalid_metadata(
f"{path!r} is invalid for {{field}}, paths must be resolved"
)
if (
pathlib.PurePosixPath(path).is_absolute()
or pathlib.PureWindowsPath(path).is_absolute()
):
raise self._invalid_metadata(
f"{path!r} is invalid for {{field}}, paths must be relative"
)
if pathlib.PureWindowsPath(path).as_posix() != path:
raise self._invalid_metadata(
f"{path!r} is invalid for {{field}}, "
"paths must use '/' delimiter"
)
paths.append(path)
return paths


class Metadata:
"""Representation of distribution metadata.
@@ -688,8 +739,8 @@ def from_raw(cls, data: RawMetadata, *, validate: bool = True) -> Metadata:
field = _RAW_TO_EMAIL_MAPPING[key]
exc = InvalidMetadata(
field,
"{field} introduced in metadata version "
"{field_metadata_version}, not {metadata_version}",
f"{field} introduced in metadata version "
f"{field_metadata_version}, not {metadata_version}",
)
exceptions.append(exc)
continue
@@ -733,6 +784,8 @@ def from_email(cls, data: bytes | str, *, validate: bool = True) -> Metadata:
metadata_version: _Validator[_MetadataVersion] = _Validator()
""":external:ref:`core-metadata-metadata-version`
(required; validated to be a valid metadata version)"""
# `name` is not normalized/typed to NormalizedName so as to provide access to
# the original/raw name.
name: _Validator[str] = _Validator()
""":external:ref:`core-metadata-name`
(required; validated using :func:`~packaging.utils.canonicalize_name` and its
@@ -770,6 +823,12 @@ def from_email(cls, data: bytes | str, *, validate: bool = True) -> Metadata:
""":external:ref:`core-metadata-maintainer-email`"""
license: _Validator[str | None] = _Validator()
""":external:ref:`core-metadata-license`"""
license_expression: _Validator[NormalizedLicenseExpression | None] = _Validator(
added="2.4"
)
""":external:ref:`core-metadata-license-expression`"""
license_files: _Validator[list[str] | None] = _Validator(added="2.4")
""":external:ref:`core-metadata-license-file`"""
classifiers: _Validator[list[str] | None] = _Validator(added="1.1")
""":external:ref:`core-metadata-classifier`"""
requires_dist: _Validator[list[requirements.Requirement] | None] = _Validator(
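Aside (not part of the diff): the new `_process_license_files` validation accepts only relative, fully resolved, forward-slash paths. A boolean sketch of the same checks, assuming stdlib `pathlib` semantics:

```python
import pathlib

# Sketch of the Metadata-2.4 License-File path rules: no parent
# references, no globs, not absolute on POSIX or Windows, and
# '/'-delimited (PureWindowsPath.as_posix() is the round-trip check).
def check_license_file_path(path: str) -> bool:
    if ".." in path or "*" in path:
        return False
    if (pathlib.PurePosixPath(path).is_absolute()
            or pathlib.PureWindowsPath(path).is_absolute()):
        return False
    return pathlib.PureWindowsPath(path).as_posix() == path

print(check_license_file_path("LICENSES/MIT.txt"))  # True
print(check_license_file_path("C:/LICENSE"))        # False
```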
27 changes: 19 additions & 8 deletions src/pip/_vendor/packaging/specifiers.py
@@ -234,7 +234,7 @@ def __init__(self, spec: str = "", prereleases: bool | None = None) -> None:
"""
match = self._regex.search(spec)
if not match:
raise InvalidSpecifier(f"Invalid specifier: '{spec}'")
raise InvalidSpecifier(f"Invalid specifier: {spec!r}")

self._spec: tuple[str, str] = (
match.group("operator").strip(),
@@ -256,7 +256,7 @@ def prereleases(self) -> bool:
# operators, and if they are if they are including an explicit
# prerelease.
operator, version = self._spec
if operator in ["==", ">=", "<=", "~=", "==="]:
if operator in ["==", ">=", "<=", "~=", "===", ">", "<"]:
# The == specifier can include a trailing .*, if it does we
# want to remove before parsing.
if operator == "==" and version.endswith(".*"):
@@ -694,12 +694,18 @@ class SpecifierSet(BaseSpecifier):
specifiers (``>=3.0,!=3.1``), or no specifier at all.
"""

def __init__(self, specifiers: str = "", prereleases: bool | None = None) -> None:
def __init__(
self,
specifiers: str | Iterable[Specifier] = "",
prereleases: bool | None = None,
) -> None:
"""Initialize a SpecifierSet instance.

:param specifiers:
The string representation of a specifier or a comma-separated list of
specifiers which will be parsed and normalized before use.
May also be an iterable of ``Specifier`` instances, which will be used
as is.
:param prereleases:
This tells the SpecifierSet if it should accept prerelease versions if
applicable or not. The default of ``None`` will autodetect it from the
@@ -710,12 +716,17 @@ def __init__(self, specifiers: str = "", prereleases: bool | None = None) -> Non
raised.
"""

# Split on `,` to break each individual specifier into it's own item, and
# strip each item to remove leading/trailing whitespace.
split_specifiers = [s.strip() for s in specifiers.split(",") if s.strip()]
if isinstance(specifiers, str):
# Split on `,` to break each individual specifier into its own item, and
# strip each item to remove leading/trailing whitespace.
split_specifiers = [s.strip() for s in specifiers.split(",") if s.strip()]

# Make each individual specifier a Specifier and save in a frozen set for later.
self._specs = frozenset(map(Specifier, split_specifiers))
# Make each individual specifier a Specifier and save in a frozen set
# for later.
self._specs = frozenset(map(Specifier, split_specifiers))
else:
# Save the supplied specifiers in a frozen set.
self._specs = frozenset(specifiers)

# Store our prereleases value so we can use it later to determine if
# we accept prereleases or not.
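Aside (not part of the diff): the widened `SpecifierSet` constructor branches on input type — a string is split on commas as before, while any other iterable is assumed to already hold `Specifier` objects. A simplified sketch (plain strings stand in for `Specifier` instances):

```python
# Sketch of the str-vs-iterable branching in the new __init__: only a
# string is comma-split and stripped; other iterables are frozen as-is.
def build_spec_set(specifiers):
    if isinstance(specifiers, str):
        parts = [s.strip() for s in specifiers.split(",") if s.strip()]
        return frozenset(parts)  # stand-in for frozenset(map(Specifier, parts))
    return frozenset(specifiers)

print(sorted(build_spec_set(">=3.0, !=3.1")))  # ['!=3.1', '>=3.0']
```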