diff --git a/dev/index.html b/dev/index.html index a3e2f8636b..4cb67a248a 100644 --- a/dev/index.html +++ b/dev/index.html @@ -984,8 +984,7 @@
PDM, as described, is a modern Python package and dependency manager supporting the latest PEP standards. But it is more than a package manager. It boosts your development workflow in various aspects. The most significant benefit is that it installs and manages packages in a way similar to npm, without needing to create a virtualenv at all!
PDM requires Python 3.8+ to be installed. It works on multiple platforms including Windows, Linux and macOS.
Note
You can still have your project working on lower Python versions, read how to do it here.
"},{"location":"#recommended-installation-method","title":"Recommended installation method","text":"PDM requires python version 3.8 or higher.
Like Pip, PDM provides an installation script that will install PDM into an isolated environment.
Linux/Mac: curl -sSL https://pdm-project.org/install-pdm.py | python3 -\n
Windows: (Invoke-WebRequest -Uri https://pdm-project.org/install-pdm.py -UseBasicParsing).Content | py -\n
For security reasons, you should verify the checksum of install-pdm.py
. It can be downloaded from install-pdm.py.sha256.
For example, on Linux/Mac:
curl -sSLO https://pdm-project.org/install-pdm.py\ncurl -sSL https://pdm-project.org/install-pdm.py.sha256 | shasum -a 256 -c -\n# Run the installer\npython3 install-pdm.py [options]\n
The installer will install PDM into the user site and the location depends on the system:
$HOME/.local/bin
for Unix
%APPDATA%\\Python\\Scripts
on Windows
You can pass additional options to the script to control how PDM is installed:
usage: install-pdm.py [-h] [-v VERSION] [--prerelease] [--remove] [-p PATH] [-d DEP]\n\noptional arguments:\n -h, --help show this help message and exit\n -v VERSION, --version VERSION | envvar: PDM_VERSION\n Specify the version to be installed, or HEAD to install from the main branch\n --prerelease | envvar: PDM_PRERELEASE Allow prereleases to be installed\n --remove | envvar: PDM_REMOVE Remove the PDM installation\n -p PATH, --path PATH | envvar: PDM_HOME Specify the location to install PDM\n -d DEP, --dep DEP | envvar: PDM_DEPS Specify additional dependencies, can be given multiple times\n
You can either pass the options after the script or set the env var value.
"},{"location":"#other-installation-methods","title":"Other installation methods","text":"HomebrewScooppipxpipasdfinside projectbrew install pdm\n
scoop bucket add frostming https://github.com/frostming/scoop-frostming.git\nscoop install pdm\n
pipx install pdm\n
Install the head version from the GitHub repository. Make sure you have installed Git LFS on your system.
pipx install git+https://github.com/pdm-project/pdm.git@main#egg=pdm\n
To install PDM with all features:
pipx install pdm[all]\n
See also: https://pypa.github.io/pipx/
pip install --user pdm\n
Assuming you have asdf installed.
asdf plugin add pdm\nasdf local pdm latest\nasdf install pdm\n
By copying the Pyprojectx wrapper scripts to a project, you can install PDM as an (npm-style) dev dependency inside that project. This allows different projects/branches to use different PDM versions.
To initialize a new or existing project, cd into the project folder and:
Linux/Mac: curl -LO https://github.com/pyprojectx/pyprojectx/releases/latest/download/wrappers.zip && unzip wrappers.zip && rm -f wrappers.zip\n./pw --init pdm\n
Windows: Invoke-WebRequest https://github.com/pyprojectx/pyprojectx/releases/latest/download/wrappers.zip -OutFile wrappers.zip; Expand-Archive -Path wrappers.zip -DestinationPath .; Remove-Item -Path wrappers.zip\n.\\pw --init pdm\n
When installing pdm with this method, you need to run all pdm
commands through the pw
wrapper:
./pw pdm install\n
"},{"location":"#update-the-pdm-version","title":"Update the PDM version","text":"pdm self update\n
"},{"location":"#packaging-status","title":"Packaging Status","text":""},{"location":"#shell-completion","title":"Shell Completion","text":"PDM supports generating completion scripts for Bash, Zsh, Fish or Powershell. Here are some common locations for each shell:
Bash: pdm completion bash > /etc/bash_completion.d/pdm.bash-completion\n
# Make sure ~/.zfunc is added to fpath, before compinit.\npdm completion zsh > ~/.zfunc/_pdm\n
Oh-My-Zsh:
mkdir $ZSH_CUSTOM/plugins/pdm\npdm completion zsh > $ZSH_CUSTOM/plugins/pdm/_pdm\n
Then make sure the pdm plugin is enabled in ~/.zshrc
pdm completion fish > ~/.config/fish/completions/pdm.fish\n
# Create a directory to store completion scripts\nmkdir $PROFILE\\..\\Completions\necho @'\nGet-ChildItem \"$PROFILE\\..\\Completions\\\" | ForEach-Object {\n . $_.FullName\n}\n'@ | Out-File -Append -Encoding utf8 $PROFILE\n# Generate script\nSet-ExecutionPolicy Unrestricted -Scope CurrentUser\npdm completion powershell | Out-File -Encoding utf8 $PROFILE\\..\\Completions\\pdm_completion.ps1\n
"},{"location":"#virtualenv-and-pep-582","title":"Virtualenv and PEP 582","text":"PDM offers experimental support for PEP 582 as an opt-in feature, in addition to virtualenv management. Although the Python Steering Council has rejected PEP 582, you can still test it out using PDM.
To learn more about the two modes, refer to the relevant chapters on Working with virtualenv and Working with PEP 582.
"},{"location":"#pdm-eco-system","title":"PDM Eco-system","text":"Awesome PDM is a curated list of awesome PDM plugins and resources.
"},{"location":"#sponsors","title":"Sponsors","text":""},{"location":"dev/benchmark/","title":"Benchmark","text":"This page has been removed, please visit https://lincolnloop.github.io/python-package-manager-shootout/ for a detailed benchmark report.
"},{"location":"dev/changelog/","title":"Changelog","text":"Attention
Major and minor releases also include changes listed within prior beta releases.
"},{"location":"dev/changelog/#release-v2112-2024-01-02","title":"Release v2.11.2 (2024-01-02)","text":""},{"location":"dev/changelog/#bug-fixes","title":"Bug Fixes","text":"pdm update --update-eager
can hit InconsistentCandidate error when dependency is included both through default dependencies and extra. #2495pdm install
should not warn when overwriting its own symlinks on install
/update
. #2502pdm export
. #1910--skip-existing
to pdm publish
to ignore the uploading error if the package already exists. #2362==major.minor.*
as default requires python for application projects. #2382package-type
field in the tool.pdm
table to differentiate between library and application projects. #2394pdm lock
now supports --update-reuse
option to keep the pinned versions in the lockfile if possible. #2419inherit_metadata
to inherit and merge markers from parent requirements. This is enabled by default when creating a new lockfile. #2421symlink_individual
for creating a symlink for each individual package file and hardlink
for creating hardlinks. #2425reuse-installed
. When this strategy is enabled, PDM will try to reuse the versions already installed in the environment, even if the package names are given in the command line following add
or update
. This strategy is supported by add
, update
and lock
commands. #2479PDM_CACHE_DIR
environment variable to configure cache directory location. #2485pdm init -n
. #2436pdm init
now implies --lib
if --backend
is passed. #2437install.cache_method = \"symlink\"
. #2466KeyError
raised by pdm update --unconstrained
when the project itself is listed as a dependency. #2483-r
requirements paths relative to the requirement file they are specified in #2422pdm publish
fails with HTTP error. #2400__init__.py
contains an unusual line. #2378pdm init
being read-only when copied from a read-only PDM installation. #2379export
command. #2390--extra--index-url
#2342include_packages
and exclude_packages
config under tool.pdm.source
table. #1645requires-python
range. And provide a way to ignore them per-package. #2304-q/--quiet
option to suppress some warnings printed to the console. This option is mutually exclusive with -v/--verbose
. #2304--strategy/-S
option for lock
command, to specify one or more strategy flags for resolving dependencies. --static-urls
and --no-cross-platform
are deprecated at the same time. #2310pdm.cli.commands.venv.backend.Backend._ensure_clean
to empty the .venv
folder instead of deleting it. #2282--venv
option. #2314--no-default
is requested. #2230--no-isolated
does. #2071--no-lock
option doesn't work as expected. Also support --no-lock
option for add
, remove
and update
commands. #2245findpython
to find pythons with the spec given by the user. #2225virtualenv
python.exe
binary under bin/
as well as Scripts/
and the virtualenv
/conda
root. #2236${PROJECT_ROOT}
variable in the lockfile. #2240pdm run
should only find local file if the command starts with ./
. #2221--overwrite
option to pdm init
to overwrite existing files(default False). #2163pdm list
command. Add --tree
as an alias and preferred name of --graph
option. #2165pdm run
to run a script with the relative or absolute path. #2217@ file://
dependencies can not be updated. #2169requires-python
cause PDM to crash. #2175comarable_version(\"1.2.3+local1\") == Version(\"1.2.3\")
. #2182pdm-pep517
. #2167sitecustomize.py
. #2139keyring
, copier
, cookiecutter
, template
, truststore
dependency groups. #2109PDM_PROJECT
for -p/--project
option. #2126pyproject.toml
if both --unconstrained
and --dry-run
are passed to pdm update
. #2125build-system
table when importing from other package manager. #2126unearth
to 0.10.0 #2113pdm install
. #2086*_lock
hooks are always emitted with dry_run=True in pdm update
. #2060pdm install --plugins
can't install self. #2062cookiecutter
and copier
as project generator. #2059pdm init
now accepts a template argument to initialize project from a built-in or Git template. #2053DeprecationWarning
with FutureWarning
for better exposure. #2012install-pdm.py
and its checksum file on the docs site. #2026--edit/-e
to pdm config
to edit the config file in default editor. #2028--project
option to pdm venv
to support another path as the project root. #2042truststore
as the SSL backend. This only works on Python 3.10 or newer. #2049pdm self list
. #2018url
field when converting requirements from a Pipfile-style file requirement. #2032No significant changes.
"},{"location":"dev/changelog/#release-v273-2023-06-13","title":"Release v2.7.3 (2023-06-13)","text":""},{"location":"dev/changelog/#bug-fixes_17","title":"Bug Fixes","text":"parser
argument is passed to BaseCommand.__init__()
method. #2007pdm list
. #1973pdm init -n
doesn't respect the --python
option. #1984setup.py
if it prints something to stdout. #1995install-pdm.py
. #1996PDM_PYPI_USERNAME
and PDM_PYPI_PASSWORD
when there are no defaults in config. #1961repository.custom.verify_ssl
config option as well as new command line argument of publish
command. #1928PATH
env var. #1944ResourceWarning
s when running the test suite with warnings enabled. #1915tool.poetry.build
doesn't exist. #1935pdm import
clobbers build-system.requires
value in pyproject.toml
. #1948pdm sync
instead of pdm install --no-lock
. #1947PATH
env var isn't set correctly when running under non-isolation mode. #1904tool.pdm.plugins
setting. #1461--json
flag to both run
and info
command allowing to dump scripts and infos as JSON. #1854_
) as internal tasks and hide them from the listing. #1855pdm init -n
(non-interactive mode), a venv will be created by default. Previously, the selected Python will be used under PEP 582 mode. #1862--no-cross-platform
to pdm lock
to create a non-cross-platform lockfile. #1898--venv
option descriptions in zsh completion script. #1847package
and package[extra]
. #1851FileNotFoundError
if the requirement path is not found. #1875No significant changes.
"},{"location":"dev/changelog/#release-v254-2023-05-05","title":"Release v2.5.4 (2023-05-05)","text":""},{"location":"dev/changelog/#bug-fixes_24","title":"Bug Fixes","text":"<2.0
to avoid incompatibility with cachecontrol
. #1886markdown-exec
to 1.5.0
for rendering TOC in CLI reference page. #1836PDM_USE_VENV
as PDM_IN_VENV
for --venv
flag as it mistakenly override another existing env var. #1829pdm --pep582
raises an argument error. #1823resolution.respect-source-order
is enabled, sources are lazily evaluated. This means that if a match is found on the first source, the remaining sources will not be requested. #1509--venv <venv>
to run a command in the virtual environment with the given name. #1705PDM_PREFER_BINARY
environment variable. #1817pdm lock
. #1796environment.is_global
property. #1814pdm init -p <dir>
if the target directory is not created yet. #1822pdm-backend
. #1684pdm.toml
which can be committed to the VCS. #1742Environment
is renamed to PythonLocalEnvironment
and GlobalEnvironment
is renamed to PythonEnvironment
. Move pdm.models.environment
module to pdm.environments
package. #1791unearth
to 0.8 to allow calling keyring from CLI. #1653venv
command to show the path or the python interpreter for a managed venv. #1680--lib
option to init
command to create a library project without prompting. #1708pdm fix
to migrate to the new PDM features. Add a hint when invoking PDM commands. #1743.pdm-python
in project root .gitignore
when running pdm init
. #1749PDM_IGNORE_ACTIVE_VENV
env var. #1782pre_invoke
to emit before any command is invoked. #1792pdm export
due to non-deterministic order of group iteration. #1786pdm show --version
#1788installer
to 0.7.0
and emit a warning if the RECORD validation fails. #1784pdm export
output doesn't include the extras of the dependencies. #1767pdm export
. #1730venv.prompt
configuration when using conda
as the backend. #1734.
with -
when normalizing package name. #1745pdm venv activate
without specifying env_name
to activate in project venv created by conda #1735ruff
as the linter. #1715asdf
. #1725requires-python
doesn't work for all dependencies. #1690pdm-pep517
instead of setuptools
. #1658importlib.resources
. #1660pdm run
. #1652pdm config
. #1622python
. #1626packaging>=22
. #1619subdirectory
attribute to the lockfile entry. #1630pyproject.toml
. #1310pytest
plugin pdm.pytest
for plugin developers. #1594pdm.lock
with an @generated
comment. #1611sitecustomize
to the home directory if it exists in the filesystem(not packed in a zipapp). #1572build-system.requires
, since build
and hatch
both support it. Be aware it is not allowed in the standard. #1560packaging 22.0
. #1562__file__
usages with importlib.resources
, to make PDM usable in a zipapp. #1567package==22.0
from the dependencies to avoid some breakages to the end users. #1568pdm.pep517
as the metadata transformer for unknown custom build backends. #1546installer
to 0.6.0
. #1550unearth
to 0.6.3
and test against packaging==22.0
. #1555pdm use
. #1542tool.pdm.overrides
table to tool.pdm.resolution.overrides
. The old name is deprecated at the same time. #1503--backend
option to pdm init
command, users can choose a favorite backend from setuptools
, flit
, hatchling
and pdm-pep517
(default), since they all support PEP 621 standards. #1504{args[:default]}
placeholder. #1507python.use_venv=False
#1508findpython
installed. #1516install.cache
set to true
and caching method is pth
. #863pdm-pep517
. #1504pep517
with pyproject-hooks
because of the rename. #1528setup.py
format, users are encouraged to migrate to the PEP 621 metadata. #1504sitecustomize.py
respect the PDM_PROJECT_MAX_DEPTH
environment variable #1471python_version
in the environment marker. When the version contains only one digit, the result was incorrect. #1484venv.prompt
configuration to allow customizing prompt when a virtualenv is activated #1332ca_certs
or from the command line via pdm publish --ca-certs <path> ...
. #1392plugin
command to self
, and it can not only manage plugins but also all dependencies. Add a subcommand self update
to update PDM itself. #1406pdm init
to receive a Python path or version via --python
option. #1412requires-python
when importing from other formats. #1426pdm
instead of pip
to resolve and install build requirements. So that PDM configurations can control the process. #1429pdm config
command. #1450pdm lock --check
flag to validate whether the lock is up to date. #1459pip
when creating a new venv. #1463pdm list
command with new formats: --csv,--markdown
and add options --fields,--sort
to control the output contents. Users can also include licenses
in the --fields
option to display the package licenses. #1469pdm lock --check
in pre-commit. #1471--project
argument. #1220pypi.[ca,client]_cert[s]
config items are passed to distribution builder install steps to allow for custom PyPI index sources with self signed certificates. #1396pdm init
. #1410python*
command in pdm run
. #1414==1.*
). #1465importlib-metadata
from PyPI for Python < 3.10. #1467pypi.[ca,client]_cert[s]
config items are passed to distribution builder install steps to allow for custom PyPI index sources with self signed certificates. #1396pdm init
. #1410pdm lock --refresh
if some packages has URLs. #1361file://
links the first time they are added. #1325data-requires-python
when parsing package links. #1334editables
package isn't installed for self package. #1344setup-script
, run-setuptools
, and is-purelib
. #1327pdm run
. #1312post_lock
for add
and update
operations, to ensure the pyproject.toml
is updated before the hook is run. #1320add
or update
command. #1287packaging
dependency to ensure that packaging.utils.parse_wheel_filename
is available. #1293pypi.ca_certs
config entry. #1240pdm export
to available pre-commit hooks. #1279summary
field in pdm.lock
contains the description
from the package's pyproject.toml
. #1274pdm show
for a package that is only available as source distribution. #1276pip
. #1268pdm
module, it will be removed in the future. #1282python
executable in the PATH
. #1255metadata.files
in pdm.lock
. #1256[metada.files]
table of the lock file. #1259env_file
variables no longer override existing environment variables. #1235<this_package_name>[group1, group2]
#1241requires-python
when creating the default venv. #1237PYTHONPATH
. #1211pwsh
as an alias of powershell
for shell completion. #1216zsh
completion regarding --pep582
flag. #12184.0
. #1203unearth
to fix a bug that install links with weak hashes are skipped. This often happens on self-hosted PyPI servers. #1202pdm venv
commands into the main program. Make PEP 582 an opt-in feature. #1162global_project.fallback_verbose
defaulting to True
. When set to False
disables message Project is not found, fallback to the global project
#1188--only-keep
option to pdm sync
to keep only selected packages. Originally requested at #398. #1191unearth
to 0.4.1
to skip the wheels with invalid version parts. #1178PDM_RESOLVE_MAX_ROUNDS
environment variable (was spelled \u2026ROUDNS
before). #1180--no-clean
option from pdm sync
command. #1191[project]
table is not allowed, according to PEP 621. They are however still allowed in the [tool.pdm.dev-dependencies]
table. PDM will emit a warning when it finds editable dependencies in the [project]
table, or will abort when you try to add them into the [project]
table via CLI. #1083setup.py
project. #1062rich
. #1091-v
option. #1096unearth
to replace pip
's PackageFinder
and related data models. PDM no longer relies on pip
internals, which are unstable across updates. #1096find_matches()
to speed up the resolution. #1098publish
to PDM since it is required for so many people and it will make the workflow easier. #1107composite
script kind allowing to run multiple defined scripts in a single command as well as reusing scripts but overriding env
or env_file
. #1117--skip
to opt-out some scripts and hooks from any execution (both scripts and PDM commands). #1127pre/post_publish
, pre/post_run
and pre/post_script
hooks as well as an extensive lifecycle and hooks documentation. #1147[tool.pdm.build]
, according to pdm-pep517 1.0.0
. At the same time, warnings will be shown against old usages. #1153pyproject.toml
rather than building it. #1156[tool.pdm.build]
table. #1157post_use
hook triggered after successfully switching Python version. #1163respect-source-order
under [tool.pdm.resolution]
to respect the source order in the pyproject.toml
file. Packages will be returned by source earlier in the order or later ones if not found. #593tomllib
on Python 3.11 #1072click
, halo
, colorama
and log_symbols
. PDM has no vendors now. #1091pdm-pep517
to 1.0.0
. #1153pdm 0.x
) is no longer supported. #1157tox.ini
file for easier local testing against all Python versions. #1160venv
scheme for prefix
kind install scheme. #1158skip-add-to-path
option to installer in order to prevent changing PATH
. Replace bin
variable name with bin_dir
. #1145[tool.poetry.build]
config table. #1131pdm
process and not to the process actually being run. #1095version
in the cache key of the locked candidates if they are from a URL requirement. #1099requires-python
pre-release versions caused pdm update
to fail with InvalidPyVersion
. #1111setup.cfg
or setup.py
. #1101pdm update
command. #1104venv
install scheme when available. This scheme is more stable than posix_prefix
scheme since the latter is often patched by distributions. #1106pdm.lock
by --lockfile
option or PDM_LOCKFILE
env var. #1038pyproject.toml
when running pdm add --no-editable <package>
. #1050get_sysconfig_path.py
script. #1056${PROJECT_ROOT}
variable in the result of export
command. #1079pdm init
and create default README for libraries. #1041requirements.txt
. #1036pdm use
error. #1039optional
key when converting from Poetry's dependency entries. #1042--no-editable
is passed. pdm add --no-editable
will now override the editable
mode of the given packages. #1011pdm lock --refresh
. #1019${PROJECT_ROOT}
in the output of pdm list
. #1004installer 0.5.x
. #1002license
field to \"None\". #991pdm search
command. #993~/.local
) with global_project.user_site
config. #885auto_global
to global_project.fallback
and deprecate the old name. #986show
command. #966_.site_packages
is overridden by default option value. #985pdm-pep517
to support PEP 639. #959pythonfinder
to findpython
as the Python version finder. #930pre_*
and post_*
scripts for task composition. Pre- and Post- scripts for init
, build
, install
and lock
will be run if present. #789--config/-c
option to specify another global configuration file. #883[tool.pdm.overrides]
table. #909use_venv
to python.use_venv
; rename config feature.install_cache
to install.cache
; rename config feature.install_cache_method
to install.cache_method
; rename config parallel_install
to install.parallel
. #914[tool.pdm.overrides]
table. #861requires-python
should be produced if ANY(*
) is given. #917pdm.lock
gets created when --dry-run
is passed to pdm add
. #918path
. #904ExtrasError
to ExtrasWarning
for better understanding. Improve the warning message. #892Candidate
into a new class PreparedCandidate
. Candidate
no longer holds an Environment
instance. #920pip>=22.0
. #875pdm run
, it will run the Python REPL. #856direct_url.json
for a local pre-built wheel. #861pip<22.0
. #874network
marker. #858-
unexpectedly. #853use
command to save the human effort. And introduce an -i
option to ignored that remembered value. #846RECORD
. #847ModuleNotFoundError
during uninstall when the modules required are removed. #850pdm update
even if --no-sync
is passed. #837feature.install_cache_method
config. #822lock --refresh
to update the hash stored with the lock file without updating the pinned versions. #642[tool.pdm.overrides]
table. #790post_init
, pre_lock
, post_lock
, pre_install
and post_install
. #798install --check
to check if the lock file is up to date. #810allow_prereleases
setting. Now non-named requirements are resolved earlier than pinned requirements. #799atoml
to tomlkit
as the style-preserving TOML parser. The latter has supported TOML v1.0.0. #809minimum
, without upper bounds. #787sysconfig
to return the PEP 582 scheme in pdm run
. #784--pre/--prelease
option for pdm add
and pdm update
. It will allow prereleases to be pinned. #774git+https
candidates cannot be resolved. #771version
from [project]
table to [tool.pdm]
table, delete classifiers
from dynamic
, and warn usage about the deprecated usages. #748x >= VERSION
when adding dependencies. #752pdm list --freeze
to fix a bug due to Pip's API change. #533requires-python
. #744-s/--section
option from all previously supported commands. Use -G/--group
instead. #756importlib
to replace imp
in the sitecustomize
module for Python 3. #574pdm export
. #741-s/--site-packages
to pdm run
as well as a script config item. When it is set to True
, site-packages from the selected interpreter will be loaded into the running environment. #733NO_SITE_PACKAGES
isn't set in pdm run
if the executable is out of local packages. #733pdm run
, but keep them seen when PEP 582 is enabled. #708pip
with --isolated
when building wheels. In this way some env vars like PIP_REQUIRE_VIRTUALENV
can be ignored. #669pip
is not DEBUNDLED. #685summary
is None
, the lockfile can't be generated. #719${PROJECT_ROOT}
should be written in the URL when relative path is given. #721pdm import
can't merge the settings correctly. #723--no-sync
option to update
command. #684find_links
source type. It can be specified via type
key of [[tool.pdm.source]]
table. #694--dry-run
option to add
, install
and remove
commands. #698project.core.ui.display_columns
), fixing unnecessary wrapping due to / with empty lines full of spaces in case of long URLs in the last column. #680check_update
is boolean. #689setup_dev.py
in favor of pip install
. #676requires-python
is not recognized in candidates evaluation. #657installer
to 0.3.0
, fixing a bug that broke installation of some packages with unusual wheel files. #653packaging
and typing-extensions
to direct dependencies. #674requires-python
now participates in the resolution as a dummy requirement. #658--no-isolation
option for install
, lock
, update
, remove
, sync
commands. #640project_max_depth
configurable and default to 5
. #643pdm-pep517
backend on Python 2.7 when installing self as editable. #640*-nspkg.pth
files in install_cache
mode. It will still work without them. #623-r/--reinstall
option to sync
command to force re-install the existing dependencies. #601pdm cache clear
can clear cached installations if not needed any more. #604setuptools
won't see the dependencies under local packages. #601direct_url.json
when installing wheels. #607*
fails to be converted as SpecifierSet
. #609--json
to the list command which outputs the dependency graph as a JSON document. #583feature.install_cache
. When it is turned on, wheels will be installed into a centralized package repo and create .pth
files under project packages directory to link to the cached package. #589pdm show
. #580~/.pyenv/shims/python3
as the pyenv interpreter. #590-s/--section
option in favor of -G/--group
. #591pdm/installers/installers.py
is renamed to pdm/installers/manager.py
to be more accurate. The Installer
class under that file is renamed to InstallerManager
and is exposed in the pdm.core.Core
object for overriding. The new pdm/installers/installers.py
contains some installation implementations. #589pkg_resources.Distribution
to the implementation of importlib.metadata
. #592distlib
. #519--<field-name>
options in pdm show. When no package is given, show this project. #527--freeze
option to pdm list
command which shows the dependencies list as pip's requirements.txt format. #531PYTHONPATH
. #522pdm-pep517
to 0.8.0
. #524toml
to tomli
. #541plugin
to manage pdm plugins, including add
, remove
and list
commands. #510resolvelib
any more. This makes PDM more stable across updates of sub-dependencies. #515-u/--unconstrained
to support unconstraining version specifiers when adding packages. #501No significant changes.
"},{"location":"dev/changelog/#release-v161-2021-05-31","title":"Release v1.6.1 (2021-05-31)","text":"No significant changes.
"},{"location":"dev/changelog/#release-v160-2021-05-31","title":"Release v1.6.0 (2021-05-31)","text":""},{"location":"dev/changelog/#features-improvements_57","title":"Features & Improvements","text":"pdm export
no longer produces requirements file applicable for all platforms due to the new approach. #456--no-editable
option to install non-editable versions of all packages. #443--no-self
option to prevent the project itself from being installed. #444.gitignore
file in the __pypackages__
directory. #446PDM_PROJECT_ROOT
env var. Change to the project root when executing scripts. #470tomlkit
to atoml
as the style-preserving TOML parser and writer. #465--dev
flag for older versions of PDM. #444--config-setting
. #452keyring
as a dependency and guide users to install it when it is not available. #442distlib
. #447pdm.cli.actions
#428pdm use
with no argument given, which will list all available pythons for pick. #409setup.py
failed for NameError. #407install
and sync
commands. Add a new option --prod/--production
to exclude them. Improve the dependency selection logic to be more convenient to use \u2014 the more common the usage is, the shorter the command is. #391source-includes
to mark files to be included only in sdist builds. #390pdm-pep517
to 0.7.0
; update resolvelib
to0.7.0
. #390-d/--dev
option in install
and sync
commands. #391pdm run
on a directory not initialized yet.resolvelib
to 0.6.0
. #381pdm.models.readers
to improve typing support #321project.python_executable
to project.python
that contains all info of the interpreter. #382:all
given to -s/--section
to refer to all sections under the same species. Adjust add
, sync
, install
, remove
and update
to support the new dev-dependencies
groups. Old behavior will be kept the same. #351dev-dependencies
is now a table of dependencies groups, where key is the group name and value is an array of dependencies. These dependencies won't appear in the distribution's metadata. dev-depedencies
of the old format will turn into dev
group under dev-dependencies
. #351dev-dependencies
, includes
, excludes
and package-dir
out from [project]
table to [tool.pdm]
table. The migration will be done automatically if old format is detected. #351--dry-run
option for update
command to display packages that need update, install or removal. Add --top
option to limit to top level packages only. #358init
command via -n/--non-interactive
option. No question will be asked in this mode. #368pdm info
, also add an option --packages
to show that value only. #372<script>-X.Y
variant to the bin folder. #365-g/--global
that was deprecated in 1.4.0
. One should use -g -p <project_path>
for that purpose. #361pdm init
#352pdm-pep517
to 0.6.1
. #353type
argument to pdm cache clear
and improve its UI. #343entry-points
. #344models.project_info.ProjectInfo
, which indexes distlib.metadata._data
#335pdm.plugins
to pdm
. Export some useful objects and models for shorter import path. #318cmd
in tools.pdm.scripts
configuration items now allows specifying an argument array instead of a string.stream
singleton, improve the UI related code. #320cache
command, add list
, remove
and info
subcommands. #329pyproject.toml
. #308pdm.iostream
to improve 'typing' support #301specifiers.py
to a separated module. #303setup.py
has no intall_requires
key. #299pdm init
fails when pyproject.toml
exists but has no [project]
section. #295-I/--ignore-python
passed or PDM_IGNORE_SAVED_PYTHON=1
, ignore the interpreter set in .pdm.toml
and don't save to it afterwards. #283-p/--project
is introduced to specify another path for the project base. It can also be combined with -g/--global
option. The latter is changed to a flag only option that does not accept values. #286-f setuppy
for pdm export
to export the metadata as setup.py #289src
directory can't be uninstalled correctly. #277pdm sync
or pdm install
is not present in the error message. #274requires-python
attribute when fetching the candidates of a package. #264egg-info
directory when dependencies change. So that pdm list --graph
won't show invalid entries. #240requirements.txt
file, build the package to find the name if not given in the URL. #245name
and version
if not. #253packaging
. #130sections
value of a pinned candidate to be reused. #234>
, >=
, <
, <=
to combine with star versions. #254--save-compatible
slightly. Now the version specifier saved is using the REAL compatible operator ~=
as described in PEP 440. Before: requests<3.0.0,>=2.19.1
, After: requests~=2.19
. The new specifier accepts requests==2.19.0
as compatible version. #225${PROJECT_ROOT}
in the dependency specification can be expanded to refer to the project root in pyproject.toml. The environment variables will be kept as they are in the lock file. #226PYTHONPATH
(with python -I
mode) when executing pip commands. #231pip 21.0
. #235pyproject.toml
.pdm use <path-to-python-root>
. #221PYTHONPATH
manipulation under Windows platform. #215/search
endpoint is not available on given index. #211Poetry
, Pipfile
, flit
) can also be imported as PEP 621 metadata. #175pdm search
to query the /search
HTTP endpoint. #195classifiers
dynamic in pyproject.toml
template for autogeneration. #209is_subset()
returns incorrect result. #206pdm-pep517
to <0.3.0
, this is the last version to support legacy project metadata format.[metadata.files]
table. #196pip-shims
package as a dependency. #132pdm --pep582
can enable PEP 582 globally by manipulating the WinReg. #191__pypackages__
into PATH
env var during pdm run
. #193pdm run
:-s/--site-packages
to include system site-packages when running. #178setuptools
is installed before invoking editable install script. #174wheel
not wheels
for global projects #182sitecustomize.py
instead of a .pth
file to enable PEP 582. Thanks @Aloxaf. Update get_package_finder()
to be compatible with pip 20.3
. #185pyproject.toml
.[tool.pdm.scripts]
section.pdm run --list/-l
to show the list of script shortcuts. #168pdm install
. #169build-system.requires
anymore. #167build
to a home-grown version. #162LogWrapper
. #164is_subset
and is_superset
may return wrong result when wildcard excludes overlaps with the upper bound. #165pycomplete
. #159sitecustomize.py
incorrectly gets injected into the editable console scripts. #158find_matched()
is exhausted when accessed twice. #149pdm-pep517
to 0.2.0
that supports reading version from SCM. #146wheel==0.35
. #135pdm export
fails when the project doesn't have name
property. #126pip
to 20.1
. #125export
to export to alternative formats. #117resolvelib
0.4.0. #118resolvelib 0.3.0
. #116show
command to show package metadata. #114setuptools
to be installed in the isolated environment.pdm use
. #96python_requires
when initializing project. #89wheel
package is available before building packages. #90pythonfinder
, python-cfonts
, pip-shims
and many others. Drop dependency vistir
. #89pdm build
. #81pmd import
to import project metadata from Pipfile
, poetry
, flit
, requirements.txt
. #79pdm init
and pdm install
will auto-detect possible files that can be imported.package_dir
is mapped. #81pdm init
will use the current directory rather than finding the parents when global project is not activated.plugins
to entry_points
click
to argparse
, for better extensibility. #73-g/--global
to manage global project. The default location is at ~/.pdm/global-project
.-p/--project
to select project root other than the default one. #30pdm config del
to delete an existing config item. #71pdm init
. #674.0
as infinite upper bound when checking subsetting. #66ImpossiblePySpec
's hash clashes with normal one.pdm config
to inspect configurations. #26pdm cache clear
to clean caches. #63--python
option in pdm init
. #49python_requires
when initializing and defaults to >={current_version}
. #50setup.py
.pdm --help
. #42python-cfonts
to display banner. #42_editable_intall.py
compatible with Py2.pdm list --graph
to show a dependency graph of the working set. #10pdm update --unconstrained
to ignore the version constraint of given packages. #13pdm install
. #33pdm info
to show project environment information. #9pip
to 20.0
, update pip_shims
to 0.5.0
. #28setup_dev.py
for the convenience to setup pdm for development. #29pdm init
to bootstrap a project.pdm build
command.pdm init
to bootstrap a project.First off, thanks for taking the time to contribute! Contributions include but are not restricted to:
The following is a set of guidelines for contributing.
"},{"location":"dev/contributing/#a-recommended-flow-of-contributing-to-an-open-source-project","title":"A recommended flow of contributing to an Open Source project","text":"This section is for beginners to OSS. If you are an experienced OSS developer, you can skip this section.
git clone https://github.com/pdm-project/pdm.git\n# Or if you prefer SSH clone:\ngit clone git@github.com:pdm-project/pdm.git\n
git remote add fork https://github.com/yourname/pdm.git\ngit fetch fork\n
where fork
is the remote name of the fork repository.
ProTips:
To bring your local main branch up to date:
git pull origin main\n# In rare cases where your local main branch diverges from the remote main:\ngit fetch origin && git reset --hard origin/main\n
We recommend working in a virtual environment. Feel free to create a virtual environment with either the venv
module or the virtualenv
tool. For example:
python -m venv .venv\n. .venv/bin/activate # linux\n.venv/Scripts/activate # windows\n
Make sure your pip
is newer than 21.3
to install PDM in develop/editable mode.
python -m pip install -U \"pip>=21.3\"\npython -m pip install -e .\n
Make sure PDM uses the virtual environment you just created:
pdm config -l python.use_venv true\npdm config -l venv.in_project true\n
Install PDM development dependencies:
pdm install\n
Now, all dependencies are installed into the Python environment you chose, which will be used for development after this point.
"},{"location":"dev/contributing/#run-tests","title":"Run tests","text":"pdm run test\n
The test suite is still simple and needs expansion! Please help write more test cases.
Note
You can also run your test suite against all supported Python versions using tox
with the tox-pdm
plugin. You can either run it by yourself with:
tox\n
or from pdm
with:
pdm run tox\n
"},{"location":"dev/contributing/#code-style","title":"Code style","text":"PDM uses pre-commit
for linting. Install pre-commit
first, for example with pip or pipx:
python -m pip install pre-commit\n
pipx install pre-commit\n
Then initialize pre-commit
:
pre-commit install\n
You can now lint the code with:
pdm run lint\n
PDM uses black
for code style and isort
for sorting import statements. If you are not following them, the CI will fail and your Pull Request will not be merged.
When you make changes such as fixing a bug or adding a feature, you must add a news fragment describing your change. News fragments are placed in the news/
directory, and should be named according to this pattern: <issue_num>.<issue_type>.md
(e.g., 566.bugfix.md
).
feature
: Features and improvements
bugfix: Bug fixes
refactor: Code restructures
doc: Added or improved documentation
dep: Changes to dependencies
removal: Removals or deprecations in the API
misc: Miscellaneous changes that don't fit any of the other categories
The contents of the file should be a single sentence in the imperative mood that describes your changes. (e.g., Deduplicate the plugins list.
) See entries in the Change Log for more examples.
If you make some changes to the docs/
and you want to preview the build result, simply do:
pdm run doc\n
"},{"location":"dev/contributing/#release","title":"Release","text":"Once all changes are done and ready to release, you can preview the changelog contents by running:
pdm run release --dry-run\n
Make sure the next version and the changelog are as expected in the output.
Then cut a release on the main branch:
pdm run release\n
GitHub action will create the release and upload the distributions to PyPI.
Read more options about version bumping by pdm run release --help
.
Some reusable fixtures for pytest
.
New in version 2.4.0
To enable them in your test, add pdm.pytest
as a plugin. You can do so in your root conftest.py
:
# single plugin\npytest_plugins = \"pytest.plugin\"\n\n# many plugins\npytest_plugins = [\n ...\n \"pdm.pytest\",\n ...\n]\n
"},{"location":"dev/fixtures/#pdm.pytest.IndexMap","title":"IndexMap = Dict[str, Path]
module-attribute
","text":"Path some root-relative http paths to some local paths
"},{"location":"dev/fixtures/#pdm.pytest.IndexOverrides","title":"IndexOverrides = Dict[str, str]
module-attribute
","text":"PyPI indexes overrides fixture format
"},{"location":"dev/fixtures/#pdm.pytest.IndexesDefinition","title":"IndexesDefinition = Dict[str, Union[Tuple[IndexMap, IndexOverrides, bool], IndexMap]]
module-attribute
","text":"Mock PyPI indexes format
"},{"location":"dev/fixtures/#pdm.pytest.Distribution","title":"Distribution
","text":"A mock Distribution
"},{"location":"dev/fixtures/#pdm.pytest.LocalFileAdapter","title":"LocalFileAdapter
","text":" Bases: requests.adapters.BaseAdapter
A local file adapter for request.
Allows to mock some HTTP requests with some local files
"},{"location":"dev/fixtures/#pdm.pytest.MockWorkingSet","title":"MockWorkingSet
","text":" Bases: collections.abc.MutableMapping
A mock working set
"},{"location":"dev/fixtures/#pdm.pytest.PDMCallable","title":"PDMCallable
","text":" Bases: Protocol
The PDM fixture callable signature
"},{"location":"dev/fixtures/#pdm.pytest.PDMCallable.__call__","title":"__call__(args, strict=False, input=None, obj=None, env=None, **kwargs)
","text":"Parameters:
Name Type Description Defaultargs
str | list[str]
the command arguments as a single lexable string or a strings array
requiredstrict
bool
raise an exception on failure instead of returning if enabled
False
input
str | None
an optional string to be submitted too stdin
None
obj
Project | None
an optional existing Project
.
None
env
Mapping[str, str] | None
override the environment variables with those
None
Returns:
Type DescriptionRunResult
The command result
"},{"location":"dev/fixtures/#pdm.pytest.RunResult","title":"RunResult
dataclass
","text":"Store a command execution result.
"},{"location":"dev/fixtures/#pdm.pytest.RunResult.exception","title":"exception: Exception | None = None
instance-attribute
class-attribute
","text":"If set, the exception raised on execution
"},{"location":"dev/fixtures/#pdm.pytest.RunResult.exit_code","title":"exit_code: int
instance-attribute
","text":"The execution exit code
"},{"location":"dev/fixtures/#pdm.pytest.RunResult.output","title":"output: str
property
","text":"The execution stdout
output (stdout
alias)
outputs: str
property
","text":"The execution stdout
and stderr
outputs concatenated
stderr: str
instance-attribute
","text":"The execution stderr
output
stdout: str
instance-attribute
","text":"The execution stdout
output
print()
","text":"A debugging facility
"},{"location":"dev/fixtures/#pdm.pytest.TestRepository","title":"TestRepository
","text":" Bases: BaseRepository
A mock repository to ease testing dependencies
"},{"location":"dev/fixtures/#pdm.pytest.build_env","title":"build_env(build_env_wheels, tmp_path_factory)
","text":"A fixture build environment
Parameters:
Name Type Description Defaultbuild_env_wheels
Iterable[Path]
a list of wheel to install in the environment
requiredReturns:
Type DescriptionPath
The build environment temporary path
"},{"location":"dev/fixtures/#pdm.pytest.build_env_wheels","title":"build_env_wheels()
","text":"Expose some wheels to be installed in the build environment.
Override to provide your owns.
Returns:
Type DescriptionIterable[Path]
a list of wheels paths to install
"},{"location":"dev/fixtures/#pdm.pytest.local_finder_artifacts","title":"local_finder_artifacts()
","text":"The local finder search path as a fixture
Override to provides your own artifacts.
Returns:
Type DescriptionPath
The path to the artifacts root
"},{"location":"dev/fixtures/#pdm.pytest.pdm","title":"pdm(core, monkeypatch)
","text":"A fixture allowing to execute PDM commands
Returns:
Type DescriptionPDMCallable
A pdm
fixture command.
project(project_no_init)
","text":"A fixture creating an initialized test project for the current test.
Returns:
Type DescriptionProject
The initialized project
"},{"location":"dev/fixtures/#pdm.pytest.project_no_init","title":"project_no_init(tmp_path, mocker, core, pdm_session, monkeypatch, build_env)
","text":"A fixture creating a non-initialized test project for the current test.
Returns:
Type DescriptionProject
The non-initialized project
"},{"location":"dev/fixtures/#pdm.pytest.pypi_indexes","title":"pypi_indexes()
","text":"Provides some mocked PyPI entries
Returns:
Type DescriptionIndexesDefinition
a definition of the mocked indexes
"},{"location":"dev/fixtures/#pdm.pytest.remove_pep582_path_from_pythonpath","title":"remove_pep582_path_from_pythonpath(pythonpath)
","text":"Remove all pep582 paths of PDM from PYTHONPATH
"},{"location":"dev/fixtures/#pdm.pytest.repository","title":"repository(project, mocker, repository_pypi_json, local_finder)
","text":"A fixture providing a mock PyPI repository
Returns:
Type DescriptionTestRepository
A mock repository
"},{"location":"dev/fixtures/#pdm.pytest.repository_pypi_json","title":"repository_pypi_json()
","text":"The test repository fake PyPI definition path as a fixture
Override to provides your own definition path.
Returns:
Type DescriptionPath
The path to a fake PyPI repository JSON definition
"},{"location":"dev/fixtures/#pdm.pytest.venv_backends","title":"venv_backends(project, request)
","text":"A fixture iterating over venv
backends
working_set(mocker, repository)
","text":"a mock working set as a fixture
Returns:
Type DescriptionMockWorkingSet
a mock working set
"},{"location":"dev/write/","title":"PDM Plugins","text":"PDM is aiming at being a community driven package manager. It is shipped with a full-featured plug-in system, with which you can:
The core PDM project focuses on dependency management and package publishing. Other functionalities you wish to integrate with PDM are preferred to lie in their own plugins and released as standalone PyPI projects. In case the plugin is considered a good supplement of the core project it may have a chance to be absorbed into PDM.
"},{"location":"dev/write/#write-your-own-plugin","title":"Write your own plugin","text":"In the following sections, I will show an example of adding a new command hello
which reads the hello.name
config.
The PDM's CLI module is designed in a way that user can easily \"inherit and modify\". To write a new command:
from pdm.cli.commands.base import BaseCommand\n\nclass HelloCommand(BaseCommand):\n\"\"\"Say hello to the specified person.\n If none is given, will read from \"hello.name\" config.\n \"\"\"\n\n def add_arguments(self, parser):\n parser.add_argument(\"-n\", \"--name\", help=\"the person's name to whom you greet\")\n\n def handle(self, project, options):\n if not options.name:\n name = project.config[\"hello.name\"]\n else:\n name = options.name\n print(f\"Hello, {name}\")\n
First, let's create a new HelloCommand
class inheriting from pdm.cli.commands.base.BaseCommand
. It has two major functions:
add_arguments()
to manipulate the argument parser passed as the only argument, where you can add additional command line arguments to ithandle()
to do something when the subcommand is matched, you can do nothing by writing a single pass
statement. It accepts two arguments: an pdm.project.Project
object as the first one and the parsed argparse.Namespace
object as the second.The document string will serve as the command help text, which will be shown in pdm --help
.
Besides, PDM's subcommand has two default options: -v/--verbose
to change the verbosity level and -g/--global
to enable global project. If you don't want these default options, override the arguments
class attribute to a list of pdm.cli.options.Option
objects, or assign it to an empty list to have no default options:
class HelloCommand(BaseCommand):\n\narguments = []\n
Note
The default options are loaded first, then add_arguments()
is called.
Write a function somewhere in your plugin project. There is no limit on what the name of the function is, but the function should take only one argument -- the PDM core object:
def hello_plugin(core):\ncore.register_command(HelloCommand, \"hello\")\n
Call core.register_command()
to register the command. The second argument as the name of the subcommand is optional. PDM will look for the HelloCommand
's name
attribute if the name is not passed.
Let's recall the first code snippet, hello.name
config key is consulted for the name if not passed via the command line.
class HelloCommand(BaseCommand):\n\"\"\"Say hello to the specified person.\n If none is given, will read from \"hello.name\" config.\n \"\"\"\n\n def add_arguments(self, parser):\n parser.add_argument(\"-n\", \"--name\", help=\"the person's name to whom you greet\")\n\n def handle(self, project, options):\n if not options.name:\nname = project.config[\"hello.name\"]\nelse:\n name = options.name\n print(f\"Hello, {name}\")\n
Till now, if you query the config value by pdm config get hello.name
, an error will pop up saying it is not a valid config key. You need to register the config item, too:
from pdm.project.config import ConfigItem\n\ndef hello_plugin(core):\n core.register_command(HelloCommand, \"hello\")\ncore.add_config(\"hello.name\", ConfigItem(\"The person's name\", \"John\"))\n
where ConfigItem
class takes 4 parameters, in the following order:
description
: a description of the config itemdefault
: default value of the config itemglobal_only
: whether the config is allowed to set in home config onlyenv_var
: the name of environment variable which will be read as the config valueBesides of commands and configurations, the core
object exposes some other methods and attributes to override. PDM also provides some signals you can listen to. Please read the API reference for more details.
When developing a plugin, one hopes to activate and plugin in development and get updated when the code changes.
You can achieve this by installing the plugin in editable mode. To do this, specify the dependencies in tool.pdm.plugins
array:
[tool.pdm]\nplugins = [\n\"-e file:///${PROJECT_ROOT}\"\n]\n
Then install it with:
pdm install --plugins\n
After that, all the dependencies are available in a project plugin library, including the plugin itself, in editable mode. That means any change to the codebase will take effect immediately without re-installation. The pdm
executable also uses a Python interpreter under the hood, so if you run pdm
from inside the plugin project, the plugin in development will be activated automatically, and you can do some testing to see how it works.
PDM exposes some pytest fixtures as a plugin in the pdm.pytest
module. To benefit from them, you must add pdm[pytest]
as a test dependency.
To enable them in your test, add pdm.pytest
as a plugin. You can do so by in your root conftest.py
:
# single plugin\npytest_plugins = \"pytest.plugin\"\n\n# many plugins\npytest_plugins = [\n ...\n \"pdm.pytest\",\n ...\n]\n
You can see some usage examples into PDM own tests, especially the conftest.py file for configuration.
See the pytest fixtures documentation for more details.
"},{"location":"dev/write/#publish-your-plugin","title":"Publish your plugin","text":"Now you have defined your plugin already, let's distribute it to PyPI. PDM's plugins are discovered by entry point types. Create an pdm
entry point and point to your plugin callable (yeah, it doesn't need to be a function, any callable object can work):
PEP 621:
# pyproject.toml\n\n[project.entry-points.pdm]\nhello = \"my_plugin:hello_plugin\"\n
setuptools:
# setup.py\n\nsetup(\n ...\n entry_points={\"pdm\": [\"hello = my_plugin:hello_plugin\"]}\n ...\n)\n
"},{"location":"dev/write/#activate-the-plugin","title":"Activate the plugin","text":"As plugins are loaded via entry points, they can be activated with no more steps than just installing the plugin. For convenience, PDM provides a plugin
command group to manage plugins.
Assume your plugin is published as pdm-hello
:
pdm self add pdm-hello\n
Now type pdm --help
in the terminal, you will see the new added hello
command and use it:
$ pdm hello Jack\nHello, Jack\n
See more plugin management subcommands by typing pdm self --help
in the terminal.
To specify the required plugins for a project, you can use the tool.pdm.plugins
config in the pyproject.toml
file. These dependencies can be installed into a project plugin library by running pdm install --plugins
. The project plugin library will be loaded in subsequent PDM commands.
This is useful when you want to share the same plugin set with the contributors.
# pyproject.toml\n[tool.pdm]\nplugins = [\n\"pdm-packer\"\n]\n
Run pdm install --plugins
to install and activate the plugins.
Alternatively, you can have project-local plugins that are not published to PyPI, by using editable local dependencies:
# pyproject.toml\n[tool.pdm]\nplugins = [\n\"-e file:///${PROJECT_ROOT}/plugins/my_plugin\"\n]\n
"},{"location":"reference/api/","title":"API Reference","text":""},{"location":"reference/api/#pdm.core.Core","title":"pdm.core.Core
","text":"A high level object that manages all classes and configurations
"},{"location":"reference/api/#pdm.core.Core.add_config","title":"add_config(name, config_item)
staticmethod
","text":"Add a config item to the configuration class.
Parameters:
Name Type Description Defaultname
str
The name of the config item
requiredconfig_item
pdm.project.config.ConfigItem
The config item to add
required"},{"location":"reference/api/#pdm.core.Core.create_project","title":"create_project(root_path=None, is_global=False, global_config=None)
","text":"Create a new project object
Parameters:
Name Type Description Defaultroot_path
PathLike
The path to the project root directory
None
is_global
bool
Whether the project is a global project
False
global_config
str
The path to the global config file
None
Returns:
Type DescriptionProject
The project object
"},{"location":"reference/api/#pdm.core.Core.handle","title":"handle(project, options)
","text":"Called before command invocation
"},{"location":"reference/api/#pdm.core.Core.load_plugins","title":"load_plugins()
","text":"Import and load plugins under pdm.plugin
namespace A plugin is a callable that accepts the core object as the only argument.
def my_plugin(core: pdm.core.Core) -> None:\n ...\n
"},{"location":"reference/api/#pdm.core.Core.main","title":"main(args=None, prog_name=None, obj=None, **extra)
","text":"The main entry function
"},{"location":"reference/api/#pdm.core.Core.register_command","title":"register_command(command, name=None)
","text":"Register a subcommand to the subparsers, with an optional name of the subcommand.
Parameters:
Name Type Description Defaultcommand
Type[pdm.cli.commands.base.BaseCommand]
The command class to register
requiredname
str
The name of the subcommand, if not given, command.name
is used
None
"},{"location":"reference/api/#pdm.core.Project","title":"pdm.core.Project
","text":"Core project class.
Parameters:
Name Type Description Defaultcore
Core
The core instance.
requiredroot_path
str | Path | None
The root path of the project.
requiredis_global
bool
Whether the project is global.
False
global_config
str | Path | None
The path to the global config file.
None
"},{"location":"reference/api/#pdm.project.core.Project.config","title":"config: Mapping[str, Any]
cached
property
","text":"A read-only dict configuration
"},{"location":"reference/api/#pdm.project.core.Project.default_source","title":"default_source: RepositoryConfig
property
","text":"Get the default source from the pypi setting
"},{"location":"reference/api/#pdm.project.core.Project.project_config","title":"project_config: Config
cached
property
","text":"Read-and-writable configuration dict for project settings
"},{"location":"reference/api/#pdm.project.core.Project.find_interpreters","title":"find_interpreters(python_spec=None)
","text":"Return an iterable of interpreter paths that matches the given specifier,
which can beget_provider(strategy='all', tracked_names=None, for_install=False, ignore_compatibility=True, direct_minimal_versions=False)
","text":"Build a provider class for resolver.
:param strategy: the resolve strategy :param tracked_names: the names of packages that needs to update :param for_install: if the provider is for install :param ignore_compatibility: if the provider should ignore the compatibility when evaluating candidates :param direct_minimal_versions: if the provider should prefer minimal versions instead of latest :returns: The provider object
"},{"location":"reference/api/#pdm.project.core.Project.get_reporter","title":"get_reporter(requirements, tracked_names=None, spinner=None)
","text":"Return the reporter object to construct a resolver.
:param requirements: requirements to resolve :param tracked_names: the names of packages that need to be updated :param spinner: optional spinner object :returns: a reporter
"},{"location":"reference/api/#pdm.project.core.Project.get_repository","title":"get_repository(cls=None, ignore_compatibility=True)
","text":"Get the repository object
"},{"location":"reference/api/#pdm.project.core.Project.resolve_interpreter","title":"resolve_interpreter()
","text":"Get the Python interpreter path.
"},{"location":"reference/api/#pdm.project.core.Project.use_pyproject_dependencies","title":"use_pyproject_dependencies(group, dev=False)
","text":"Get the dependencies array and setter in the pyproject.toml Return a tuple of two elements, the first is the dependencies array, and the second value is a callable to set the dependencies array back.
"},{"location":"reference/api/#pdm.project.core.Project.write_lockfile","title":"write_lockfile(toml_data, show_message=True, write=True, **_kwds)
","text":"Write the lock file to disk.
"},{"location":"reference/api/#signals","title":"Signals","text":"New in version 1.12.0
The signal definition for PDM.
Examplefrom pdm.signals import post_init, post_install\n\ndef on_post_init(project):\n project.core.ui.echo(\"Project initialized\")\n# Connect to the signal\npost_init.connect(on_post_init)\n# Or use as a decorator\n@post_install.connect\ndef on_post_install(project, candidates, dry_run):\n project.core.ui.echo(\"Project install succeeded\")\n
"},{"location":"reference/api/#pdm.signals.post_build","title":"post_build: NamedSignal = pdm_signals.signal('post_build')
module-attribute
","text":"Called after a project is built.
Parameters:
Name Type Description Defaultproject
Project
The project object
requiredartifacts
Sequence[str]
The locations of built artifacts
requiredconfig_settings
dict[str, str] | None
Additional config settings passed via args
required"},{"location":"reference/api/#pdm.signals.post_init","title":"post_init: NamedSignal = pdm_signals.signal('post_init')
module-attribute
","text":"Called after a project is initialized.
Parameters:
Name Type Description Defaultproject
Project
The project object
required"},{"location":"reference/api/#pdm.signals.post_install","title":"post_install: NamedSignal = pdm_signals.signal('post_install')
module-attribute
","text":"Called after a project is installed.
Parameters:
Name Type Description Defaultproject
Project
The project object
requiredcandidates
dict[str, Candidate]
The candidates installed
requireddry_run
bool
If true, won't perform any actions
required"},{"location":"reference/api/#pdm.signals.post_lock","title":"post_lock: NamedSignal = pdm_signals.signal('post_lock')
module-attribute
","text":"Called after a project is locked.
Parameters:
Name Type Description Defaultproject
Project
The project object
requiredresolution
dict[str, Candidate]
The resolved candidates
requireddry_run
bool
If true, won't perform any actions
required"},{"location":"reference/api/#pdm.signals.post_publish","title":"post_publish: NamedSignal = pdm_signals.signal('post_publish')
module-attribute
","text":"Called after a project is published.
Parameters:
Name Type Description Defaultproject
Project
The project object
required"},{"location":"reference/api/#pdm.signals.post_run","title":"post_run: NamedSignal = pdm_signals.signal('post_run')
module-attribute
","text":"Called after any run.
Parameters:
Name Type Description Defaultproject
Project
The project object
requiredscript
str
the script name
requiredargs
Sequence[str]
the command line provided arguments
required"},{"location":"reference/api/#pdm.signals.post_script","title":"post_script: NamedSignal = pdm_signals.signal('post_script')
module-attribute
","text":"Called after any script.
Parameters:
Name Type Description Defaultproject
Project
The project object
requiredscript
str
the script name
requiredargs
Sequence[str]
the command line provided arguments
required"},{"location":"reference/api/#pdm.signals.post_use","title":"post_use: NamedSignal = pdm_signals.signal('post_use')
module-attribute
","text":"Called after use switched to a new Python version.
Parameters:
Name Type Description Defaultproject
Project
The project object
requiredpython
PythonInfo
Information about the new Python interpreter
required"},{"location":"reference/api/#pdm.signals.pre_build","title":"pre_build: NamedSignal = pdm_signals.signal('pre_build')
module-attribute
","text":"Called before a project is built.
Parameters:
Name Type Description Defaultproject
Project
The project object
requireddest
str
The destination location
requiredconfig_settings
dict[str, str] | None
Additional config settings passed via args
required"},{"location":"reference/api/#pdm.signals.pre_install","title":"pre_install: NamedSignal = pdm_signals.signal('pre_install')
module-attribute
","text":"Called before a project is installed.
Parameters:
Name Type Description Defaultproject
Project
The project object
requiredcandidates
dict[str, Candidate]
The candidates to install
requireddry_run
bool
If true, won't perform any actions
required"},{"location":"reference/api/#pdm.signals.pre_invoke","title":"pre_invoke: NamedSignal = pdm_signals.signal('pre_invoke')
module-attribute
","text":"Called before any command is invoked.
Parameters:
Name Type Description Defaultproject
Project
The project object
requiredcommand
str | None
the command name
requiredoptions
Namespace
the parsed arguments
required"},{"location":"reference/api/#pdm.signals.pre_lock","title":"pre_lock: NamedSignal = pdm_signals.signal('pre_lock')
module-attribute
","text":"Called before a project is locked.
Parameters:
Name Type Description Defaultproject
Project
The project object
requiredrequirements
list[Requirement]
The requirements to lock
requireddry_run
bool
If true, won't perform any actions
required"},{"location":"reference/api/#pdm.signals.pre_publish","title":"pre_publish: NamedSignal = pdm_signals.signal('pre_publish')
module-attribute
","text":"Called before a project is published.
Parameters:
Name Type Description Defaultproject
Project
The project object
required"},{"location":"reference/api/#pdm.signals.pre_run","title":"pre_run: NamedSignal = pdm_signals.signal('pre_run')
module-attribute
","text":"Called before any run.
Parameters:
Name Type Description Defaultproject
Project
The project object
requiredscript
str
the script name
requiredargs
Sequence[str]
the command line provided arguments
required"},{"location":"reference/api/#pdm.signals.pre_script","title":"pre_script: NamedSignal = pdm_signals.signal('pre_script')
module-attribute
","text":"Called before any script.
Parameters:
Name Type Description Defaultproject
Project
The project object
requiredscript
str
the script name
requiredargs
Sequence[str]
the command line provided arguments
required"},{"location":"reference/build/","title":"Build Configuration","text":"pdm
uses PEP 517 to build the package. It acts as a build frontend that calls the build backend to build the package.
A build backend is what drives the build system to build source distributions and wheels from arbitrary source trees.
If you run pdm init
, PDM will let you choose the build backend to use. Unlike other package managers, PDM does not force you to use a specific build backend. You can choose the one you like. Here is a list of build backends and corresponding configurations initially supported by PDM:
pyproject.toml
configuration:
[build-system]\nrequires = [\"pdm-backend\"]\nbuild-backend = \"pdm.backend\"\n
Read the docs
pyproject.toml
configuration:
[build-system]\nrequires = [\"setuptools\", \"wheel\"]\nbuild-backend = \"setuptools.build_meta\"\n
Read the docs
pyproject.toml
configuration:
[build-system]\nrequires = [\"flit_core >=3.2,<4\"]\nbuild-backend = \"flit_core.buildapi\"\n
Read the docs
pyproject.toml
configuration:
[build-system]\nrequires = [\"hatchling\"]\nbuild-backend = \"hatchling.build\"\n
Read the docs
pyproject.toml
configuration:
[build-system]\nrequires = [\"maturin>=1.4,<2.0\"]\nbuild-backend = \"maturin\"\n
Read the docs
Apart from the above-mentioned backends, you can also use any other backend that supports PEP 621; however, poetry-core is not supported because it does not support reading PEP 621 metadata.
Info
If you are using a custom build backend that is not in the above list, PDM will handle relative paths as PDM-style (${PROJECT_ROOT}
variable).
Options:
-h
, --help
: Show this help message and exit.-V
, --version
: Show the version and exit-c
, --config
: Specify another config file path [env var: PDM_CONFIG_FILE
] -v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-I
, --ignore-python
: Ignore the Python path saved in.pdm-python
. [env var: PDM_IGNORE_SAVED_PYTHON
]--pep582
SHELL
: Print the command line to be eval'd by the shellCommands:
"},{"location":"reference/cli/#add","title":"add","text":"Add package(s) to pyproject.toml and install them
Package Arguments:
-e
, --editable
: Specify editable packagespackages
: Specify packagesOptions:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-g
, --global
: Use the global project, supply the project root with -p
option-p
, --project
: Specify another path as the project root, which changes the base ofpyproject.toml
and __pypackages__
[env var: PDM_PROJECT
]-L
, --lockfile
: Specify another lockfile path. Default:pdm.lock
. [env var: PDM_LOCKFILE
]--no-lock
: Don't try to create or update the lockfile. [env var: PDM_NO_LOCK
]--save-compatible
: Save compatible version specifiers--save-wildcard
: Save wildcard version specifiers--save-exact
: Save exact version specifiers--save-minimum
: Save minimum version specifiers--update-reuse
: Reuse pinned versions already present in lock file if possible--update-eager
: Try to update the packages and their dependencies recursively--update-all
: Update all dependencies and sub-dependencies--update-reuse-installed
: Reuse installed packages if possible--pre
, --prerelease
: Allow prereleases to be pinned-u
, --unconstrained
: Ignore the version constraint of packages--dry-run
: Show the difference only and don't perform any action--venv
NAME
: Run the command in the virtual environment with the given key. [env var: PDM_IN_VENV
]-k
, --skip
: Skip some tasks and/or hooks by their comma-separated names. Can be supplied multiple times. Use:all
to skip all hooks. Use:pre
and:post
to skip all pre or post hooks.-d
, --dev
: Add packages into dev dependencies-G
, --group
: Specify the target dependency group to add into--no-sync
: Only writepyproject.toml
and do not sync the working set (default: True
)Install Options:
--no-editable
: Install non-editable versions for all packages--no-self
: Don't install the project itself. [env var: PDM_NO_SELF
]--fail-fast
, -x
: Abort on first installation error--no-isolation
: Disable isolation when building a source distribution that follows PEP 517, as in: build dependencies specified by PEP 518 must be already installed if this option is used.Build artifacts for distribution
Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-p
, --project
: Specify another path as the project root, which changes the base ofpyproject.toml
and __pypackages__
[env var: PDM_PROJECT
]--no-isolation
: Disable isolation when building a source distribution that follows PEP 517, as in: build dependencies specified by PEP 518 must be already installed if this option is used.-k
, --skip
: Skip some tasks and/or hooks by their comma-separated names. Can be supplied multiple times. Use:all
to skip all hooks. Use:pre
and:post
to skip all pre or post hooks.--no-sdist
: Don't build source tarballs (default: True
)--no-wheel
: Don't build wheels (default: True
)-d
, --dest
: Target directory to put artifacts (default: dist
)--no-clean
: Do not clean the target directory (default: True
)--config-setting
, -C
: Pass options to the backend. options with a value must be specified after \"=\": --config-setting=--opt(=value)
or -C--opt(=value)
Control the caches of PDM
Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress outputCommands:
"},{"location":"reference/cli/#clear","title":"clear","text":"Clean all the files under cache directory
Positional Arguments:
type
: Clear the given type of cachesOptions:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress outputRemove files matching the given pattern
Positional Arguments:
pattern
: The pattern to removeOptions:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress outputList the built wheels stored in the cache
Positional Arguments:
pattern
: The pattern to list (default: *
)Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress outputShow the info and current size of caches
Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress outputGenerate completion scripts for the given shell
Positional Arguments:
shell
: The shell to generate the scripts for. If not given, PDM will properly guess from SHELL
env var.Options:
-h
, --help
: Show this help message and exit.Display the current configuration
Positional Arguments:
key
: Config keyvalue
: Config valueOptions:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-g
, --global
: Use the global project, supply the project root with -p
option-p
, --project
: Specify another path as the project root, which changes the base ofpyproject.toml
and __pypackages__
[env var: PDM_PROJECT
]-l
, --local
: Set config in the project's local configuration file-d
, --delete
: Unset a configuration key-e
, --edit
: Edit the configuration file in the default editor(defined by EDITOR env var)Export the locked packages set to other formats
Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-g
, --global
: Use the global project, supply the project root with -p
option-p
, --project
: Specify another path as the project root, which changes the base ofpyproject.toml
and __pypackages__
[env var: PDM_PROJECT
]-L
, --lockfile
: Specify another lockfile path. Default:pdm.lock
. [env var: PDM_LOCKFILE
]-f
, --format
: Specify the export file format (default: requirements
)--without-hashes
: Don't include artifact hashes (default: True
)-o
, --output
: Write output to the given file, or print to stdout if not given--pyproject
: Read the list of packages frompyproject.toml
--expandvars
: Expand environment variables in requirements--self
: Include the project itself--editable-self
: Include the project itself as an editable dependencyDependencies Selection:
-G
, --group
GROUP
: Select group of optional-dependencies separated by comma or dev-dependencies (with -d
). Can be supplied multiple times, use:all
to include all groups under the same species.--no-default
: Don't include dependencies from the default group (default: True
)-d
, --dev
: Select dev dependencies--prod
, --production
: Unselect dev dependencies (default: True
)Fix the project problems according to the latest version of PDM
Positional Arguments:
problem
: Fix the specific problem, or all if not givenOptions:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-g
, --global
: Use the global project, supply the project root with -p
option-p
, --project
: Specify another path as the project root, which changes the base ofpyproject.toml
and __pypackages__
[env var: PDM_PROJECT
]--dry-run
: Only show the problemsImport project metadata from other formats
Positional Arguments:
filename
: The file nameOptions:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-g
, --global
: Use the global project, supply the project root with -p
option-p
, --project
: Specify another path as the project root, which changes the base ofpyproject.toml
and __pypackages__
[env var: PDM_PROJECT
]-d
, --dev
: import packages into dev dependencies-G
, --group
: Specify the target dependency group to import into-f
, --format
: Specify the file format explicitlyShow the project information
Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-g
, --global
: Use the global project, supply the project root with -p
option-p
, --project
: Specify another path as the project root, which changes the base ofpyproject.toml
and __pypackages__
[env var: PDM_PROJECT
]--venv
NAME
: Run the command in the virtual environment with the given key. [env var: PDM_IN_VENV
]--python
: Show the interpreter path--where
: Show the project root path--packages
: Show the local packages root--env
: Show PEP 508 environment markers--json
: Dump the information in JSONInitialize a pyproject.toml for PDM
Positional Arguments:
template
: Specify the project template, which can be a local path or a Git URLgenerator_args
: Arguments passed to the generatorOptions:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-g
, --global
: Use the global project, supply the project root with -p
option-p
, --project
: Specify another path as the project root, which changes the base ofpyproject.toml
and __pypackages__
[env var: PDM_PROJECT
]-k
, --skip
: Skip some tasks and/or hooks by their comma-separated names. Can be supplied multiple times. Use:all
to skip all hooks. Use:pre
and:post
to skip all pre or post hooks.--copier
: Use Copier to generate project [not installed] (default: builtin
)--cookiecutter
: Use Cookiecutter to generate project [not installed] (default: builtin
)-r
, --overwrite
: Overwrite existing filesBuiltin Generator Options:
-n
, --non-interactive
: Don't ask questions but use default values--python
: Specify the Python version/path to use--lib
: Create a library project--backend
: Specify the build backend, which implies --libInstall dependencies from lock file
Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-g
, --global
: Use the global project, supply the project root with -p
option-p
, --project
: Specify another path as the project root, which changes the base ofpyproject.toml
and __pypackages__
[env var: PDM_PROJECT
]--dry-run
: Show the difference only and don't perform any action-L
, --lockfile
: Specify another lockfile path. Default:pdm.lock
. [env var: PDM_LOCKFILE
]--no-lock
: Don't try to create or update the lockfile. [env var: PDM_NO_LOCK
]-k
, --skip
: Skip some tasks and/or hooks by their comma-separated names. Can be supplied multiple times. Use:all
to skip all hooks. Use:pre
and:post
to skip all pre or post hooks.--venv
NAME
: Run the command in the virtual environment with the given key. [env var: PDM_IN_VENV
]--check
: Check if the lock file is up to date and fail otherwise--plugins
: Install the plugins specified inpyproject.toml
Install Options:
--no-editable
: Install non-editable versions for all packages--no-self
: Don't install the project itself. [env var: PDM_NO_SELF
]--fail-fast
, -x
: Abort on first installation error--no-isolation
: Disable isolation when building a source distribution that follows PEP 517, as in: build dependencies specified by PEP 518 must be already installed if this option is used.Dependencies Selection:
-G
, --group
GROUP
: Select group of optional-dependencies separated by comma or dev-dependencies (with -d
). Can be supplied multiple times, use:all
to include all groups under the same species.--no-default
: Don't include dependencies from the default group (default: True
)-d
, --dev
: Select dev dependencies--prod
, --production
: Unselect dev dependencies (default: True
)List packages installed in the current working set
Positional Arguments:
patterns
: Filter packages by patterns. e.g. pdm list requests-* flask-*. In --tree mode, only show the subtree of the matched packages.
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-g
, --global
: Use the global project, supply the project root with -p
option-p
, --project
: Specify another path as the project root, which changes the base ofpyproject.toml
and __pypackages__
[env var: PDM_PROJECT
]--venv
NAME
: Run the command in the virtual environment with the given key. [env var: PDM_IN_VENV
]--freeze
: Show the installed dependencies in pip's requirements.txt format--tree
, --graph
: Display a tree of dependencies-r
, --reverse
: Reverse the dependency tree--resolve
: Resolve all requirements to output licenses (instead of just showing those currently installed)--fields
: Select information to output as a comma separated string. All fields: groups,homepage,licenses,location,name,version. (default: name,version,location
)--sort
: Sort the output using a given field name. If nothing is set, no sort is applied. Multiple fields can be combined with ','.--csv
: Output dependencies in CSV document format--json
: Output dependencies in JSON document format--markdown
: Output dependencies and legal notices in markdown document format - best effort basis--include
: Dependency groups to include in the output. By default all are included--exclude
: Exclude dependency groups from the outputResolve and lock dependencies
Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-g
, --global
: Use the global project, supply the project root with -p
option-p
, --project
: Specify another path as the project root, which changes the base ofpyproject.toml
and __pypackages__
[env var: PDM_PROJECT
]-L
, --lockfile
: Specify another lockfile path. Default:pdm.lock
. [env var: PDM_LOCKFILE
]--no-isolation
: Disable isolation when building a source distribution that follows PEP 517, as in: build dependencies specified by PEP 518 must be already installed if this option is used.-k
, --skip
: Skip some tasks and/or hooks by their comma-separated names. Can be supplied multiple times. Use:all
to skip all hooks. Use:pre
and:post
to skip all pre or post hooks.--refresh
: Don't update pinned versions, only refresh the lock file--check
: Check if the lock file is up to date and quit--update-reuse
: Reuse pinned versions already present in lock file if possible (default: all
)--update-reuse-installed
: Reuse installed packages if possibleLock Strategy:
--strategy
, -S
STRATEGY
: Specify lock strategy (cross_platform, static_urls, direct_minimal_versions, inherit_metadata). Add 'no_' prefix to disable. Can be supplied multiple times or split by comma.--no-cross-platform
: [DEPRECATED] Only lock packages for the current platform--static-urls
: [DEPRECATED] Store static file URLs in the lockfile--no-static-urls
: [DEPRECATED] Do not store static file URLs in the lockfileDependencies Selection:
-G
, --group
GROUP
: Select group of optional-dependencies separated by comma or dev-dependencies (with -d
). Can be supplied multiple times, use:all
to include all groups under the same species.--no-default
: Don't include dependencies from the default group (default: True
)-d
, --dev
: Select dev dependencies--prod
, --production
: Unselect dev dependencies (default: True
)Build and publish the project to PyPI
Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-p
, --project
: Specify another path as the project root, which changes the base ofpyproject.toml
and __pypackages__
[env var: PDM_PROJECT
]-k
, --skip
: Skip some tasks and/or hooks by their comma-separated names. Can be supplied multiple times. Use:all
to skip all hooks. Use:pre
and:post
to skip all pre or post hooks.-r
, --repository
: The repository name or url to publish the package to [env var: PDM_PUBLISH_REPO
]-u
, --username
: The username to access the repository [env var: PDM_PUBLISH_USERNAME
]-P
, --password
: The password to access the repository [env var: PDM_PUBLISH_PASSWORD
]-S
, --sign
: Upload the package with PGP signature-i
, --identity
: GPG identity used to sign files.-c
, --comment
: The comment to include with the distribution file.--no-build
: Don't build the package before publishing (default: True
)--skip-existing
: Skip uploading files that already exist. This may not work with some repository implementations.--no-very-ssl
: Disable SSL verification--ca-certs
: The path to a PEM-encoded Certificate Authority bundle to use for publish server validation [env var: PDM_PUBLISH_CA_CERTS
]Remove packages from pyproject.toml
Positional Arguments:
packages
: Specify the packages to removeOptions:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-g
, --global
: Use the global project, supply the project root with -p
option-p
, --project
: Specify another path as the project root, which changes the base ofpyproject.toml
and __pypackages__
[env var: PDM_PROJECT
]--dry-run
: Show the difference only and don't perform any action-L
, --lockfile
: Specify another lockfile path. Default:pdm.lock
. [env var: PDM_LOCKFILE
]--no-lock
: Don't try to create or update the lockfile. [env var: PDM_NO_LOCK
]-k
, --skip
: Skip some tasks and/or hooks by their comma-separated names. Can be supplied multiple times. Use:all
to skip all hooks. Use:pre
and:post
to skip all pre or post hooks.--venv
NAME
: Run the command in the virtual environment with the given key. [env var: PDM_IN_VENV
]-d
, --dev
: Remove packages from dev dependencies-G
, --group
: Specify the target dependency group to remove from--no-sync
: Only writepyproject.toml
and do not uninstall packages (default: True
)Install Options:
--no-editable
: Install non-editable versions for all packages--no-self
: Don't install the project itself. [env var: PDM_NO_SELF
]--fail-fast
, -x
: Abort on first installation error--no-isolation
: Disable isolation when building a source distribution that follows PEP 517, as in: build dependencies specified by PEP 518 must be already installed if this option is used.Run commands or scripts with local packages loaded
Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-g
, --global
: Use the global project, supply the project root with -p
option-p
, --project
: Specify another path as the project root, which changes the base ofpyproject.toml
and __pypackages__
[env var: PDM_PROJECT
]-k
, --skip
: Skip some tasks and/or hooks by their comma-separated names. Can be supplied multiple times. Use:all
to skip all hooks. Use:pre
and:post
to skip all pre or post hooks.--venv
NAME
: Run the command in the virtual environment with the given key. [env var: PDM_IN_VENV
]-l
, --list
: Show all available scripts defined inpyproject.toml
-j
, --json
: Output all scripts infos in JSONExecution Parameters:
-s
, --site-packages
: Load site-packages from the selected interpreterscript
: The command to runargs
: Arguments that will be passed to the commandSearch for PyPI packages
Positional Arguments:
query
: Query string to searchOptions:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress outputManage the PDM program itself (previously known as plugin)
Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress outputCommands:
"},{"location":"reference/cli/#list_2","title":"list","text":"List all packages installed with PDM
Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output--plugins
: List plugins onlyInstall packages to the PDM's environment
Positional Arguments:
packages
: Specify one or many package names, each package can have a version specifierOptions:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output--pip-args
: Arguments that will be passed to pip installRemove packages from PDM's environment
Positional Arguments:
packages
: Specify one or many package namesOptions:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output--pip-args
: Arguments that will be passed to pip uninstall-y
, --yes
: Answer yes on the questionUpdate PDM itself
Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output--head
: Update to the latest commit on the main branch--pre
: Update to the latest prerelease version--pip-args
: Additional arguments that will be passed to pip installManage the PDM program itself (previously known as plugin)
Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress outputCommands:
"},{"location":"reference/cli/#list_3","title":"list","text":"List all packages installed with PDM
Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output--plugins
: List plugins onlyInstall packages to the PDM's environment
Positional Arguments:
packages
: Specify one or many package names, each package can have a version specifierOptions:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output--pip-args
: Arguments that will be passed to pip installRemove packages from PDM's environment
Positional Arguments:
packages
: Specify one or many package namesOptions:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output--pip-args
: Arguments that will be passed to pip uninstall-y
, --yes
: Answer yes on the questionUpdate PDM itself
Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output--head
: Update to the latest commit on the main branch--pre
: Update to the latest prerelease version--pip-args
: Additional arguments that will be passed to pip installShow the package information
Positional Arguments:
package
: Specify the package name, or show this package if not givenOptions:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-g
, --global
: Use the global project, supply the project root with -p
option-p
, --project
: Specify another path as the project root, which changes the base ofpyproject.toml
and __pypackages__
[env var: PDM_PROJECT
]--venv
NAME
: Run the command in the virtual environment with the given key. [env var: PDM_IN_VENV
]--name
: Show name--version
: Show version--summary
: Show summary--license
: Show license--platform
: Show platform--keywords
: Show keywordsSynchronize the current working set with lock file
Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-g
, --global
: Use the global project, supply the project root with -p
option-p
, --project
: Specify another path as the project root, which changes the base ofpyproject.toml
and __pypackages__
[env var: PDM_PROJECT
]--dry-run
: Show the difference only and don't perform any action-L
, --lockfile
: Specify another lockfile path. Default:pdm.lock
. [env var: PDM_LOCKFILE
]-k
, --skip
: Skip some tasks and/or hooks by their comma-separated names. Can be supplied multiple times. Use:all
to skip all hooks. Use:pre
and:post
to skip all pre or post hooks.--clean
: Clean packages not in the lockfile--only-keep
: Only keep the selected packages--venv
NAME
: Run the command in the virtual environment with the given key. [env var: PDM_IN_VENV
]-r
, --reinstall
: Force reinstall existing dependenciesInstall Options:
--no-editable
: Install non-editable versions for all packages--no-self
: Don't install the project itself. [env var: PDM_NO_SELF
]--fail-fast
, -x
: Abort on first installation error--no-isolation
: Disable isolation when building a source distribution that follows PEP 517, as in: build dependencies specified by PEP 518 must be already installed if this option is used.Dependencies Selection:
-G
, --group
GROUP
: Select group of optional-dependencies separated by comma or dev-dependencies (with -d
). Can be supplied multiple times, use:all
to include all groups under the same species.--no-default
: Don't include dependencies from the default group (default: True
)-d
, --dev
: Select dev dependencies--prod
, --production
: Unselect dev dependencies (default: True
)Update package(s) in pyproject.toml
Positional Arguments:
packages
: If packages are given, only update themOptions:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-g
, --global
: Use the global project, supply the project root with -p
option-p
, --project
: Specify another path as the project root, which changes the base ofpyproject.toml
and __pypackages__
[env var: PDM_PROJECT
]-L
, --lockfile
: Specify another lockfile path. Default:pdm.lock
. [env var: PDM_LOCKFILE
]--no-lock
: Don't try to create or update the lockfile. [env var: PDM_NO_LOCK
]--save-compatible
: Save compatible version specifiers--save-wildcard
: Save wildcard version specifiers--save-exact
: Save exact version specifiers--save-minimum
: Save minimum version specifiers--update-reuse
: Reuse pinned versions already present in lock file if possible--update-eager
: Try to update the packages and their dependencies recursively--update-all
: Update all dependencies and sub-dependencies--update-reuse-installed
: Reuse installed packages if possible--pre
, --prerelease
: Allow prereleases to be pinned-u
, --unconstrained
: Ignore the version constraint of packages-k
, --skip
: Skip some tasks and/or hooks by their comma-separated names. Can be supplied multiple times. Use:all
to skip all hooks. Use:pre
and:post
to skip all pre or post hooks.--venv
NAME
: Run the command in the virtual environment with the given key. [env var: PDM_IN_VENV
]-t
, --top
: Only update those listed inpyproject.toml
--dry-run
, --outdated
: Show the difference only without modifying the lockfile content--no-sync
: Only update lock file but do not sync packages (default: True
)Install Options:
--no-editable
: Install non-editable versions for all packages--no-self
: Don't install the project itself. [env var: PDM_NO_SELF
]--fail-fast
, -x
: Abort on first installation error--no-isolation
: Disable isolation when building a source distribution that follows PEP 517, as in: build dependencies specified by PEP 518 must be already installed if this option is used.Dependencies Selection:
-G
, --group
GROUP
: Select group of optional-dependencies separated by comma or dev-dependencies (with -d
). Can be supplied multiple times, use:all
to include all groups under the same species.--no-default
: Don't include dependencies from the default group (default: True
)-d
, --dev
: Select dev dependencies--prod
, --production
: Unselect dev dependenciesUse the given python version or path as base interpreter
Positional Arguments:
python
: Specify the Python version or pathOptions:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-g
, --global
: Use the global project, supply the project root with -p
option-p
, --project
: Specify another path as the project root, which changes the base ofpyproject.toml
and __pypackages__
[env var: PDM_PROJECT
]-k
, --skip
: Skip some tasks and/or hooks by their comma-separated names. Can be supplied multiple times. Use:all
to skip all hooks. Use:pre
and:post
to skip all pre or post hooks.-f
, --first
: Select the first matched interpreter-i
, --ignore-remembered
: Ignore the remembered selection--venv
: Use the interpreter in the virtual environment with the given nameVirtualenv management
Options:
-h
, --help
: Show this help message and exit.-p
, --project
: Specify another path as the project root, which changes the base ofpyproject.toml
and __pypackages__
[env var: PDM_PROJECT
]--path
: Show the path to the given virtualenv--python
: Show the python interpreter path for the given virtualenvCommands:
"},{"location":"reference/cli/#create","title":"create","text":"Create a virtualenv
Positional Arguments:
python
: Specify which python should be used to create the virtualenvvenv_args
: Additional arguments that will be passed to the backendOptions:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-w
, --with
: Specify the backend to create the virtualenv-f
, --force
: Recreate if the virtualenv already exists-n
, --name
: Specify the name of the virtualenv--with-pip
: Install pip with the virtualenvList all virtualenvs associated with this project
Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress outputRemove the virtualenv with the given name
Positional Arguments:
env
: The key of the virtualenvOptions:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-y
, --yes
: Answer yes on the following questionPrint the command to activate the virtualenv with the given name
Positional Arguments:
env
: The key of the virtualenvOptions:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress outputPurge selected/all created Virtualenvs
Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-f
, --force
: Force purging without prompting for confirmation-i
, --interactive
: Interactively purge selected VirtualenvsThe default theme used by PDM is as follows:
Key Default Styleprimary
cyan success
green warning
yellow error
red info
blue req
bold green You can change the theme colors with pdm config
command. For example, to change the primary
color to magenta
:
pdm config theme.primary magenta\n
Or use a hex color code:
pdm config theme.success '#51c7bd'\n
"},{"location":"reference/configuration/#available-configurations","title":"Available Configurations","text":"The following configuration items can be retrieved and modified by pdm config
command.
build_isolation
Isolate the build environment from the project environment Yes Yes PDM_BUILD_ISOLATION
cache_dir
The root directory of cached files The default cache location on OS No PDM_CACHE_DIR
check_update
Check if there is any newer version available True No PDM_CHECK_UPDATE
global_project.fallback
Use the global project implicitly if no local project is found False
No global_project.fallback_verbose
If True show message when global project is used implicitly True
No global_project.path
The path to the global project <default config location on OS>/global-project
No global_project.user_site
Whether to install to user site False
No install.cache
Enable caching of wheel installations False Yes install.cache_method
Specify how to create links to the caches(symlink/symlink_individual/hardlink/pth
) symlink
Yes install.parallel
Whether to perform installation and uninstallation in parallel True
Yes PDM_PARALLEL_INSTALL
python.use_pyenv
Use the pyenv interpreter True
Yes python.use_venv
Use virtual environments when available True
Yes PDM_USE_VENV
python.providers
List of python provider names for findpython All providers supported by findpython Yes pypi.url
The URL of PyPI mirror https://pypi.org/simple
Yes PDM_PYPI_URL
pypi.username
The username to access PyPI Yes PDM_PYPI_USERNAME
pypi.password
The password to access PyPI Yes PDM_PYPI_PASSWORD
pypi.ignore_stored_index
Ignore the configured indexes False
Yes PDM_IGNORE_STORED_INDEX
pypi.ca_certs
Path to a PEM-encoded CA cert bundle (used for server cert verification) The CA certificates from certifi Yes pypi.client_cert
Path to a PEM-encoded client cert and optional key No pypi.client_key
Path to a PEM-encoded client cert private key, if not in pypi.client_cert No pypi.verify_ssl
Verify SSL certificate when query PyPI True
Yes pypi.json_api
Consult PyPI's JSON API for package metadata False
Yes PDM_PYPI_JSON_API
pypi.<name>.url
The URL of custom package source https://pypi.org/simple
Yes pypi.<name>.username
The username to access custom source Yes pypi.<name>.password
The password to access custom source Yes pypi.<name>.type
index
or find_links
index
Yes pypi.<name>.verify_ssl
Verify SSL certificate when query custom source True
Yes strategy.save
Specify how to save versions when a package is added minimum
(can be: exact
, wildcard
, minimum
, compatible
) Yes strategy.update
The default strategy for updating packages reuse
(can be : eager
, reuse
, all
, reuse-installed
) Yes strategy.resolve_max_rounds
Specify the max rounds of resolution process 10000 Yes PDM_RESOLVE_MAX_ROUNDS
strategy.inherit_metadata
Inherit the groups and markers from parents for each package True
Yes venv.location
Parent directory for virtualenvs <default data location on OS>/venvs
No venv.backend
Default backend to create virtualenv virtualenv
Yes PDM_VENV_BACKEND
venv.prompt
Formatted string to be displayed in the prompt when virtualenv is active {project_name}-{python_version}
Yes PDM_VENV_PROMPT
venv.in_project
Create virtualenv in .venv
under project root True
Yes PDM_VENV_IN_PROJECT
venv.with_pip
Install pip when creating a new venv False
Yes PDM_VENV_WITH_PIP
repository.<name>.url
The URL of custom package source https://pypi.org/simple
Yes repository.<name>.username
The username to access custom repository Yes repository.<name>.password
The password to access custom repository Yes repository.<name>.ca_certs
Path to a PEM-encoded CA cert bundle (used for server cert verification) The CA certificates from certifi Yes repository.<name>.verify_ssl
Verify SSL certificate when uploading to repository True
Yes If the corresponding env var is set, the value will take precedence over what is saved in the config file.
"},{"location":"reference/pep621/","title":"PEP 621 Metadata","text":"The project metadata are stored in the pyproject.toml
. The specifications are defined by PEP 621, PEP 631 and PEP 639. Read the detailed specifications in the PEPs.
In the following part of this document, metadata should be written under [project]
table if not given explicitly.
You can split a long description onto multiple lines, thanks to TOML support for multiline strings. Just remember to escape new lines, so the final description appears on one line only in your package metadata. Indentation will be removed as well when escaping new lines:
description = \"\"\"\\\n Lorem ipsum dolor sit amet, consectetur adipiscing elit, \\\n sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. \\\n Ut enim ad minim veniam, quis nostrud exercitation ullamco \\\n laboris nisi ut aliquip ex ea commodo consequat.\\\n\"\"\"\n
See TOML's specification on strings.
"},{"location":"reference/pep621/#package-version","title":"Package version","text":"StaticDynamic[project]\nversion = \"1.0.0\"\n
[project]\n...\ndynamic = [\"version\"]\n\n[tool.pdm]\nversion = { source = \"file\", path = \"mypackage/__version__.py\" }\n
The version will be read from the mypackage/__version__.py
file searching for the pattern: __version__ = \"{version}\"
.
Read more information about other configurations in dynamic project version from the pdm-backend
documentation.
The required version of Python is specified as the string requires-python
:
requires-python = \">=3.9\"\nclassifiers = [\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.9\",\n \"Programming Language :: Python :: 3.10\",\n \"Programming Language :: Python :: 3.11\",\n ...\n]\n
Note: As per PEP 621, PDM is not permitted to dynamically update the classifiers
section like some other non-compliant tools. Thus, you should also include the appropriate trove classifiers as shown above if you plan on publishing your package on PyPI.
The license is specified as the string license
:
license = {text = \"BSD-2-Clause\"}\nclassifiers = [\n \"License :: OSI Approved :: BSD License\",\n ...\n]\n
Note: As per PEP 621, PDM is not permitted to dynamically update the classifiers
section like some other non-compliant tools. Thus, you should also include the appropriate trove classifiers as shown above if you plan on publishing your package on PyPI.
The project.dependencies
is an array of dependency specification strings following the PEP 440 and PEP 508.
Examples:
[project]\n...\ndependencies = [\n# Named requirement\n\"requests\",\n# Named requirement with version specifier\n\"flask >= 1.1.0\",\n# Requirement with environment marker\n\"pywin32; sys_platform == 'win32'\",\n# URL requirement\n\"pip @ git+https://github.com/pypa/pip.git@20.3.1\"\n]\n
"},{"location":"reference/pep621/#optional-dependencies","title":"Optional dependencies","text":"You can have some requirements optional, which is similar to setuptools
' extras_require
parameter.
[project.optional-dependencies]\nsocks = [ 'PySocks >= 1.5.6, != 1.5.7, < 2' ]\ntests = [\n'ddt >= 1.2.2, < 2',\n'pytest < 6',\n'mock >= 1.0.1, < 4; python_version < \"3.4\"',\n]\n
To install a group of optional dependencies:
pdm install -G socks\n
-G
option can be given multiple times to include more than one group.
Depending on which build backend you are using, PDM will expand some variables in the dependency strings.
"},{"location":"reference/pep621/#environment-variables","title":"Environment variables","text":"pdm-backendhatchling[project]\ndependencies = [\"flask @ https://${USERNAME}:${PASSWORD}/artifacts.io/Flask-1.1.2.tar.gz\"]\n
[project]\ndependencies = [\"flask @ https://{env:USERNAME}:{env:PASSWORD}/artifacts.io/Flask-1.1.2.tar.gz\"]\n
Find more usages here
Don't worry about credential leakage, the environment variables will be expanded when needed and kept untouched in the lock file.
"},{"location":"reference/pep621/#relative-paths","title":"Relative paths","text":"When you add a package from a relative path, PDM will automatically save it as a relative path for pdm-backend
and hatchling
.
For example, if you run pdm add ./my-package
, it will result in the following line in pyproject.toml
.
[project]\ndependencies = [\"my-package @ file:///${PROJECT_ROOT}/my-package\"]\n
[project]\ndependencies = [\"my-package @ {root:uri}/my-package\"]\n
By default, hatchling doesn't support direct references in the dependency string, you need to turn it on in pyproject.toml
:
[tool.hatch.metadata]\nallow-direct-references = true\n
The relative path will be expanded based on the project root when installing or locking.
"},{"location":"reference/pep621/#console-scripts","title":"Console scripts","text":"The following content:
[project.scripts]\nmycli = \"mycli.__main__:main\"\n
will be translated to setuptools
style:
entry_points = {\n 'console_scripts': [\n 'mycli=mycli.__main__:main'\n ]\n}\n
Also, [project.gui-scripts]
will be translated to gui_scripts
entry points group in setuptools
style.
Other types of entry points are given by [project.entry-points.<type>]
section, with the same format of [project.scripts]
:
[project.entry-points.pytest11]\nmyplugin = \"mypackage.plugin:pytest_plugin\"\n
If the entry point name contains dots or other special characters, wrap it in quotes:
[project.entry-points.\"flake8.extension\"]\nmyplugin = \"mypackage.plugin:flake8_plugin\"\n
"},{"location":"usage/advanced/","title":"Advanced Usage","text":""},{"location":"usage/advanced/#automatic-testing","title":"Automatic Testing","text":""},{"location":"usage/advanced/#use-tox-as-the-runner","title":"Use Tox as the runner","text":"Tox is a great tool for testing against multiple Python versions or dependency sets. You can configure a tox.ini
like the following to integrate your testing with PDM:
[tox]\nenv_list = py{36,37,38},lint\n\n[testenv]\nsetenv =\nPDM_IGNORE_SAVED_PYTHON=\"1\"\ndeps = pdm\ncommands =\npdm install --dev\npytest tests\n\n[testenv:lint]\ndeps = pdm\ncommands =\npdm install -G lint\nflake8 src/\n
To use the virtualenv created by Tox, you should make sure you have set pdm config python.use_venv true
. PDM then will install dependencies from pdm lock
into the virtualenv. In the dedicated venv you can directly run tools by pytest tests/
instead of pdm run pytest tests/
.
You should also make sure you don't run pdm add/pdm remove/pdm update/pdm lock
in the test commands, otherwise the pdm lock
file will be modified unexpectedly. Additional dependencies can be supplied with the deps
config. Besides, isolated_build
and passenv
config should be set as the above example to make PDM work properly.
To get rid of these constraints, there is a Tox plugin tox-pdm which can ease the usage. You can install it by
pip install tox-pdm\n
Or,
pdm add --dev tox-pdm\n
And you can make the tox.ini
much tidier as following, :
[tox]\nenv_list = py{36,37,38},lint\n\n[testenv]\ngroups = dev\ncommands =\npytest tests\n\n[testenv:lint]\ngroups = lint\ncommands =\nflake8 src/\n
See the project's README for a detailed guidance.
"},{"location":"usage/advanced/#use-nox-as-the-runner","title":"Use Nox as the runner","text":"Nox is another great tool for automated testing. Unlike tox, Nox uses a standard Python file for configuration.
It is much easier to use PDM in Nox, here is an example of noxfile.py
:
import os\nimport nox\n\nos.environ.update({\"PDM_IGNORE_SAVED_PYTHON\": \"1\"})\n@nox.session\ndef tests(session):\n session.run_always('pdm', 'install', '-G', 'test', external=True)\n session.run('pytest')\n\n@nox.session\ndef lint(session):\n session.run_always('pdm', 'install', '-G', 'lint', external=True)\n session.run('flake8', '--import-order-style', 'google')\n
Note that PDM_IGNORE_SAVED_PYTHON
should be set so that PDM can pick up the Python in the virtualenv correctly. Also make sure pdm
is available in the PATH
. Before running nox, you should also ensure configuration item python.use_venv
is true to enable venv reusing.
__pypackages__
directory","text":"By default, if you run tools by pdm run
, __pypackages__
will be seen by the program and all subprocesses created by it. This means virtual environments created by those tools are also aware of the packages inside __pypackages__
, which result in unexpected behavior in some cases. For nox
, you can avoid this by adding a line in noxfile.py
:
os.environ.pop(\"PYTHONPATH\", None)\n
For tox
, PYTHONPATH
will not be passed to the test sessions so this isn't going to be a problem. Moreover, it is recommended to make nox
and tox
live in their own pipx environments so you don't need to install for every project. In this case, PEP 582 packages will not be a problem either.
Only one thing to keep in mind -- PDM can't be installed on Python < 3.7, so if your project is to be tested on those Python versions, you have to make sure PDM is installed on the correct Python version, which can be different from the target Python version the particular job/task is run on.
Fortunately, if you are using GitHub Action, there is pdm-project/setup-pdm to make this process easier. Here is an example workflow of GitHub Actions, while you can adapt it for other CI platforms.
Testing:\nruns-on: ${{ matrix.os }}\nstrategy:\nmatrix:\npython-version: [3.7, 3.8, 3.9, '3.10', '3.11']\nos: [ubuntu-latest, macOS-latest, windows-latest]\n\nsteps:\n- uses: actions/checkout@v3\n- name: Set up PDM\nuses: pdm-project/setup-pdm@v3\nwith:\npython-version: ${{ matrix.python-version }}\n\n- name: Install dependencies\nrun: |\npdm sync -d -G testing\n- name: Run Tests\nrun: |\npdm run -v pytest tests\n
TIPS
For GitHub Action users, there is a known compatibility issue on Ubuntu virtual environment. If PDM parallel install is failed on that machine you should either set parallel_install
to false
or set env LD_PRELOAD=/lib/x86_64-linux-gnu/libgcc_s.so.1
. It is already handled by the pdm-project/setup-pdm
action.
Note
If your CI scripts run without a proper user set, you might get permission errors when PDM tries to create its cache directory. To work around this, you can set the HOME environment variable yourself, to a writable directory, for example:
export HOME=/tmp/home\n
"},{"location":"usage/advanced/#use-pdm-in-a-multi-stage-dockerfile","title":"Use PDM in a multi-stage Dockerfile","text":"It is possible to use PDM in a multi-stage Dockerfile to first install the project and dependencies into __pypackages__
and then copy this folder into the final stage, adding it to PYTHONPATH
.
# build stage\nFROM python:3.8 AS builder\n\n# install PDM\nRUN pip install -U pip setuptools wheel\nRUN pip install pdm\n\n# copy files\nCOPY pyproject.toml pdm.lock README.md /project/\nCOPY src/ /project/src\n\n# install dependencies and project into the local packages directory\nWORKDIR /project\nRUN mkdir __pypackages__ && pdm sync --prod --no-editable\n\n\n# run stage\nFROM python:3.8\n\n# retrieve packages from build stage\nENV PYTHONPATH=/project/pkgs\nCOPY --from=builder /project/__pypackages__/3.8/lib /project/pkgs\n\n# retrieve executables\nCOPY --from=builder /project/__pypackages__/3.8/bin/* /bin/\n\n# set command/entrypoint, adapt to fit your needs\nCMD [\"python\", \"-m\", \"project\"]\n
"},{"location":"usage/advanced/#use-pdm-to-manage-a-monorepo","title":"Use PDM to manage a monorepo","text":"With PDM, you can have multiple sub-packages within a single project, each with its own pyproject.toml
file. And you can create only one pdm.lock
file to lock all dependencies. The sub-packages can have each other as their dependencies. To achieve this, follow these steps:
project/pyproject.toml
:
[tool.pdm.dev-dependencies]\ndev = [\n\"-e file:///${PROJECT_ROOT}/packages/foo-core\",\n\"-e file:///${PROJECT_ROOT}/packages/foo-cli\",\n\"-e file:///${PROJECT_ROOT}/packages/foo-app\",\n]\n
packages/foo-cli/pyproject.toml
:
[project]\ndependencies = [\"foo-core\"]\n
packages/foo-app/pyproject.toml
:
[project]\ndependencies = [\"foo-core\"]\n
Now, run pdm install
in the project root, and you will get a pdm.lock
with all dependencies locked. All sub-packages will be installed in editable mode.
Look at the \ud83d\ude80 Example repository for more details.
"},{"location":"usage/advanced/#hooks-for-pre-commit","title":"Hooks forpre-commit
","text":"pre-commit
is a powerful framework for managing git hooks in a centralized fashion. PDM already uses pre-commit
hooks for its internal QA checks. PDM exposes also several hooks that can be run locally or in CI pipelines.
requirements.txt
","text":"This hook wraps the command pdm export
along with any valid argument. It can be used as a hook (e.g., for CI) to ensure that you are going to check in the codebase a requirements.txt
, which reflects the actual content of pdm lock
.
# export python requirements\n- repo: https://github.com/pdm-project/pdm\nrev: 2.x.y # a PDM release exposing the hook\nhooks:\n- id: pdm-export\n# command arguments, e.g.:\nargs: ['-o', 'requirements.txt', '--without-hashes']\nfiles: ^pdm.lock$\n
"},{"location":"usage/advanced/#check-pdmlock-is-up-to-date-with-pyprojecttoml","title":"Check pdm.lock
is up to date with pyproject.toml","text":"This hook wraps the command pdm lock --check
along with any valid argument. It can be used as a hook (e.g., for CI) to ensure that whenever pyproject.toml
has a dependency added, changed, or removed, pdm.lock
is also up to date.
- repo: https://github.com/pdm-project/pdm\nrev: 2.x.y # a PDM release exposing the hook\nhooks:\n- id: pdm-lock-check\n
"},{"location":"usage/advanced/#sync-current-working-set-with-pdmlock","title":"Sync current working set with pdm.lock
","text":"This hook wraps the command pdm sync
along with any valid argument. It can be used as a hook to ensure that your current working set is synced with pdm.lock
whenever you checkout or merge a branch. Add keyring to additional_dependencies
if you want to use your system's credential store.
- repo: https://github.com/pdm-project/pdm\nrev: 2.x.y # a PDM release exposing the hook\nhooks:\n- id: pdm-sync\nadditional_dependencies:\n- keyring\n
"},{"location":"usage/config/","title":"Configure the Project","text":"PDM's config
command works just like git config
, except that --list
isn't needed to show configurations.
Show the current configurations:
pdm config\n
Get one single configuration:
pdm config pypi.url\n
Change a configuration value and store in home configuration:
pdm config pypi.url \"https://test.pypi.org/simple\"\n
By default, configuration changes are global. If you want the config to be seen by this project only, add a --local
flag:
pdm config --local pypi.url \"https://test.pypi.org/simple\"\n
Any local configurations will be stored in pdm.toml
under the project root directory.
The configuration files are searched in the following order:
<PROJECT_ROOT>/pdm.toml
- The project configuration<CONFIG_ROOT>/config.toml
- The home configuration<SITE_CONFIG_ROOT>/config.toml
- The site configurationwhere <CONFIG_ROOT>
is:
$XDG_CONFIG_HOME/pdm
(~/.config/pdm
in most cases) on Linux as defined by XDG Base Directory Specification~/Library/Application Support/pdm
on macOS as defined by Apple File System Basics%USERPROFILE%\\AppData\\Local\\pdm
on Windows as defined in Known foldersand <SITE_CONFIG_ROOT>
is:
$XDG_CONFIG_DIRS/pdm
(/etc/xdg/pdm
in most cases) on Linux as defined by XDG Base Directory Specification/Library/Application Support/pdm
on macOS as defined by Apple File System BasicsC:\\ProgramData\\pdm\\pdm
on Windows as defined in Known foldersIf -g/--global
option is used, the first item will be replaced by <CONFIG_ROOT>/global-project/pdm.toml
.
You can find all available configuration items in Configuration Page.
"},{"location":"usage/config/#configure-the-python-finder","title":"Configure the Python finder","text":"By default, PDM will try to find Python interpreters in the following sources:
venv
: The PDM virtualenv locationpath
: The PATH
environment variablepyenv
: The pyenv install rootrye
: The rye toolchain install rootasdf
: The asdf python install rootwinreg
: The Windows registryYou can unselect some of them or change the order by setting python.providers
config key:
pdm config python.providers rye # Rye source only\npdm config python.providers pyenv,asdf # pyenv and asdf\n
"},{"location":"usage/config/#allow-prereleases-in-resolution-result","title":"Allow prereleases in resolution result","text":"By default, pdm
's dependency resolver will ignore prereleases unless there are no stable versions for the given version range of a dependency. This behavior can be changed by setting allow_prereleases
to true
in [tool.pdm]
table:
[tool.pdm]\nallow_prereleases = true\n
"},{"location":"usage/config/#configure-the-package-indexes","title":"Configure the package indexes","text":"You can tell PDM where to to find the packages by either specifying sources in the pyproject.toml
or via pypi.*
configurations.
Add sources in pyproject.toml
:
[[tool.pdm.source]]\nname = \"private\"\nurl = \"https://private.pypi.org/simple\"\nverify_ssl = true\n
Change the default index via pdm config
:
pdm config pypi.url \"https://test.pypi.org/simple\"\n
Add extra indexes via pdm config
:
pdm config pypi.extra.url \"https://extra.pypi.org/simple\"\n
The available configuration options are:
url
: The URL of the indexverify_ssl
: (Optional) Whether to verify SSL certificates, defaults to trueusername
: (Optional) The username for the indexpassword
: (Optional) The password for the indextype
: (Optional) index or find_links, defaults to indexBy default, all sources are PEP 503 style \"indexes\" like pip's --index-url
and --extra-index-url
, however, you can set the type to find_links
which contains files or links to be looked for directly. See this answer for the difference between the two types.
These configurations are read in the following order to build the final source list:
pypi.url
, if pypi
doesn't appear in the name
field of any source in pyproject.toml
pyproject.toml
pypi.<name>.url
in PDM config.You can set pypi.ignore_stored_index
to true
to disable all indexes from the PDM config and only use those specified in pyproject.toml
.
Disable the default PyPI index
If you want to omit the default PyPI index, just set the source name to pypi
and that source will replace it.
[[tool.pdm.source]]\nurl = \"https://private.pypi.org/simple\"\nverify_ssl = true\nname = \"pypi\"\n
Indexes in pyproject.toml
or config When you want to share the indexes with other people who are going to use the project, you should add them in pyproject.toml
. For example, some packages only exist in a private index and can't be installed if someone doesn't configure the index. Otherwise, store them in the local config which won't be seen by others.
By default, all sources are considered equal, packages from them are sorted by the version and wheel tags, the most matching one with the highest version is selected.
In some cases you may want to return packages from the preferred source, and search for others if they are missing from the former source. PDM supports this by reading the configuration respect-source-order
:
[tool.pdm.resolution]\nrespect-source-order = true\n
"},{"location":"usage/config/#specify-index-for-individual-packages","title":"Specify index for individual packages","text":"You can bind packages to specific sources with include_packages
and exclude_packages
config under tool.pdm.source
table.
[[tool.pdm.source]]\nname = \"private\"\nurl = \"https://private.pypi.org/simple\"\ninclude_packages = [\"foo\", \"foo-*\"]\nexclude_packages = [\"bar-*\"]\n
With the above configuration, any package matching foo
or foo-*
will only be searched from the private
index, and any package matching bar-*
will be searched from all indexes except private
.
Both include_packages
and exclude_packages
are optional and accept a list of glob patterns, and include_packages
takes effect exclusively when the pattern matches.
You can specify credentials in the URL with ${ENV_VAR}
variable expansion and these variables will be read from the environment variables:
[[tool.pdm.source]]\nname = \"private\"\nurl = \"https://${PRIVATE_PYPI_USERNAME}:${PRIVATE_PYPI_PASSWORD}@private.pypi.org/simple\"\n
"},{"location":"usage/config/#configure-https-certificates","title":"Configure HTTPS certificates","text":"You can use a custom CA bundle or client certificate for HTTPS requests. It can be configured for both indexes(for package download) and repositories(for upload):
pdm config pypi.ca_certs /path/to/ca_bundle.pem\npdm config repository.pypi.ca_certs /path/to/ca_bundle.pem\n
Besides, it is possible to use the system trust store, instead of the bundled certifi certificates for verifying HTTPS certificates. This approach will typically support corporate proxy certificates without additional configuration.
To use truststore
, you need Python 3.10 or newer and install truststore
into the same environment as PDM:
$ pdm self add truststore\n
"},{"location":"usage/config/#index-configuration-merging","title":"Index configuration merging","text":"Index configurations are merged with the name
field of [[tool.pdm.source]]
table or pypi.<name>
key in the config file. This enables you to store the url and credentials separately, to avoid secrets being exposed in the source control. For example, if you have the following configuration:
[[tool.pdm.source]]\nname = \"private\"\nurl = \"https://private.pypi.org/simple\"\n
You can store the credentials in the config file:
pdm config pypi.private.username \"foo\"\npdm config pypi.private.password \"bar\"\n
PDM can retrieve the configurations for private
index from both places.
If the index requires a username and password, but they can't be found from the environment variables nor config file, PDM will prompt you to enter them. Or, if keyring
is installed, it will be used as the credential store. PDM can use the keyring
from either the installed package or the CLI.
If a package is required by many projects on the system, each project has to keep its own copy. This can be a waste of disk space, especially for data science and machine learning projects.
PDM supports caching installations of the same wheel by installing it in a centralized package repository and linking to that installation in different projects. To enable it, run:
pdm config install.cache on\n
It can be enabled on a per-project basis by adding the --local
option to the command.
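For example, to turn the cache on for the current project only, a minimal illustration reusing the --local flag shown earlier could be:
pdm config --local install.cache on\n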
The caches are located in $(pdm config cache_dir)/packages
. You can view the cache usage with pdm cache info
. Note that the cached installs are managed automatically -- they will be deleted if they are not linked to any projects. Manually deleting the caches from disk may break some projects on the system.
In addition, several different ways of linking to cache entries are supported:
symlink
(default), create symlinks to the package directories or children if the parent is a namespace package.symlink_individual
, for each individual files in the package directory, create a symlink to it.hardlink
, create hard links to the package files of the cache entry.You can switch between them by running pdm config [-l] install.cache_method <method>
.
Note
Only the installation of named requirements resolved from PyPI can be cached.
"},{"location":"usage/config/#configure-the-repositories-for-upload","title":"Configure the repositories for upload","text":"When using the pdm publish
command, it reads the repository secrets from the global config file (<CONFIG_ROOT>/config.toml
). The content of the config is as follows:
[repository.pypi]\nusername = \"frostming\"\npassword = \"<secret>\"\n\n[repository.company]\nurl = \"https://pypi.company.org/legacy/\"\nusername = \"frostming\"\npassword = \"<secret>\"\nca_certs = \"/path/to/custom-cacerts.pem\"\n
Alternatively, these credentials can be provided with env vars:
export PDM_PUBLISH_REPO=...\nexport PDM_PUBLISH_USERNAME=...\nexport PDM_PUBLISH_PASSWORD=...\nexport PDM_PUBLISH_CA_CERTS=...\n
A PEM-encoded Certificate Authority bundle (ca_certs
) can be used for local / custom PyPI repositories where the server certificate is not signed by the standard certifi CA bundle.
Note
Repositories are different from indexes in the previous section. Repositories are for publishing while indexes are for locking and resolving. They don't share the configuration.
Tip
You don't need to configure the url
for pypi
and testpypi
repositories, they are filled by default values. The username, password, and certificate authority bundle can be passed in from the command line for pdm publish
via --username
, --password
, and --ca-certs
, respectively.
To change the repository config from the command line, use the pdm config
command:
pdm config repository.pypi.username \"__token__\"\npdm config repository.pypi.password \"my-pypi-token\"\n\npdm config repository.company.url \"https://pypi.company.org/legacy/\"\npdm config repository.company.ca_certs \"/path/to/custom-cacerts.pem\"\n
"},{"location":"usage/config/#password-management-with-keyring","title":"Password management with keyring","text":"When keyring is available and supported, the passwords will be stored to and retrieved from the keyring instead of writing to the config file. This supports both indexes and upload repositories. The service name will be pdm-pypi-<name>
for an index and pdm-repository-<name>
for a repository.
You can enable keyring by either installing keyring
into the same environment as PDM or installing globally. To add keyring to the PDM environment:
pdm self add keyring\n
Alternatively, if you have installed a copy of keyring globally, make sure the CLI is exposed in the PATH
env var to make it discoverable by PDM:
export PATH=$PATH:path/to/keyring/bin\n
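As a minimal sketch (assuming the keyring CLI is installed and your index is named private as in the earlier examples), you could store a password under the matching service name like this:
# prompts for the password and stores it under the service name PDM looks up\nkeyring set pdm-pypi-private <username>\n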
"},{"location":"usage/config/#override-the-resolved-package-versions","title":"Override the resolved package versions","text":"New in version 1.12.0
Sometimes you can't get a dependency resolution due to incorrect version ranges set by upstream libraries that you can't fix. In this case you can use PDM's overrides feature to force a specific version of a package to be installed.
Given the following configuration in pyproject.toml
:
[tool.pdm.resolution.overrides]\nasgiref = \"3.2.10\" # exact version\nurllib3 = \">=1.26.2\" # version range\npytz = \"https://mypypi.org/packages/pytz-2020.9-py3-none-any.whl\" # absolute URL\n
Each entry of that table is a package name with the wanted version. In this example, PDM will resolve the above packages into the given versions no matter whether there is any other resolution available.
Warning
By using [tool.pdm.resolution.overrides]
setting, you are at your own risk of any incompatibilities from that resolution. It can only be used if there is no valid resolution for your requirements and you know the specific version works. Most of the time, you can just add any transitive constraints to the dependencies
array.
New in version 2.7.0
You can add extra options passed to individual pdm commands via the tool.pdm.options
configuration:
[tool.pdm.options]\nadd = [\"--no-isolation\", \"--no-self\"]\ninstall = [\"--no-self\"]\nlock = [\"--no-cross-platform\"]\n
These options will be added right after the command name. For instance, based on the configuration above, pdm add requests
is equivalent to pdm add --no-isolation --no-self requests
.
New in version 2.10.0
You may see some warnings when resolving dependencies like this:
PackageWarning: Skipping scipy@1.10.0 because it requires Python\n<3.12,>=3.8 but the project claims to work with Python>=3.9.\nNarrow down the `requires-python` range to include this version. For example, \">=3.9,<3.12\" should work.\n warnings.warn(record.message, PackageWarning, stacklevel=1)\nUse `-q/--quiet` to suppress these warnings, or ignore them per-package with `ignore_package_warnings` config in [tool.pdm] table.\n
This is because the supported range of Python versions of the package doesn't cover the requires-python
value specified in the pyproject.toml
. You can ignore these warnings in a per-package basis by adding the following config:
[tool.pdm]\nignore_package_warnings = [\"scipy\", \"tensorflow-*\"]\n
Where each item is a case-insensitive glob pattern to match the package name.
"},{"location":"usage/dependency/","title":"Manage Dependencies","text":"PDM provides a bunch of handful commands to help manage your project and dependencies. The following examples are run on Ubuntu 18.04, a few changes must be done if you are using Windows.
"},{"location":"usage/dependency/#add-dependencies","title":"Add dependencies","text":"pdm add
can be followed by one or several dependencies, and the dependency specification is described in PEP 508.
Examples:
pdm add requests # add requests\npdm add requests==2.25.1 # add requests with version constraint\npdm add requests[socks] # add requests with extra dependency\npdm add \"flask>=1.0\" flask-sqlalchemy # add multiple dependencies with different specifiers\n
PDM also allows extra dependency groups by providing -G/--group <name>
option, and those dependencies will go to [project.optional-dependencies.<name>]
table in the project file, respectively.
You can reference other optional groups in optional-dependencies
, even before the package is uploaded:
[project]\nname = \"foo\"\nversion = \"0.1.0\"\n\n[project.optional-dependencies]\nsocks = [\"pysocks\"]\njwt = [\"pyjwt\"]\nall = [\"foo[socks,jwt]\"]\n
After that, dependencies and sub-dependencies will be resolved properly and installed for you, you can view pdm.lock
to see the resolved result of all dependencies.
Local packages can be added with their paths. The path can be a file or a directory:
pdm add ./sub-package\npdm add ./first-1.0.0-py2.py3-none-any.whl\n
The paths MUST start with a .
, otherwise it will be recognized as a normal named requirement. The local dependencies will be written to the pyproject.toml
file with the URL format:
[project]\ndependencies = [\n\"sub-package @ file:///${PROJECT_ROOT}/sub-package\",\n\"first @ file:///${PROJECT_ROOT}/first-1.0.0-py2.py3-none-any.whl\",\n]\n
Using other build backends If you are using hatchling
instead of the pdm backend, the URLs would be as follows:
sub-package @ {root:uri}/sub-package\nfirst @ {root:uri}/first-1.0.0-py2.py3-none-any.whl\n
Other backends doesn't support encoding relative paths in the URL and will write the absolute path instead."},{"location":"usage/dependency/#url-dependencies","title":"URL dependencies","text":"PDM also supports downloading and installing packages directly from a web address.
Examples:
# Install gzipped package from a plain URL\npdm add \"https://github.com/numpy/numpy/releases/download/v1.20.0/numpy-1.20.0.tar.gz\"\n# Install wheel from a plain URL\npdm add \"https://github.com/explosion/spacy-models/releases/download/en_core_web_trf-3.5.0/en_core_web_trf-3.5.0-py3-none-any.whl\"\n
"},{"location":"usage/dependency/#vcs-dependencies","title":"VCS dependencies","text":"You can also install from a git repository url or other version control systems. The following are supported:
git
hg
svn
bzr
The URL should be like: {vcs}+{url}@{rev}
Examples:
# Install pip repo on tag `22.0`\npdm add \"git+https://github.com/pypa/pip.git@22.0\"\n# Provide credentials in the URL\npdm add \"git+https://username:password@github.com/username/private-repo.git@master\"\n# Give a name to the dependency\npdm add \"pip @ git+https://github.com/pypa/pip.git@22.0\"\n# Or use the #egg fragment\npdm add \"git+https://github.com/pypa/pip.git@22.0#egg=pip\"\n# Install from a subdirectory\npdm add \"git+https://github.com/owner/repo.git@master#egg=pkg&subdirectory=subpackage\"\n
"},{"location":"usage/dependency/#hide-credentials-in-the-url","title":"Hide credentials in the URL","text":"You can hide the credentials in the URL by using the ${ENV_VAR}
variable syntax:
[project]\ndependencies = [\n\"mypackage @ git+http://${VCS_USER}:${VCS_PASSWD}@test.git.com/test/mypackage.git@master\"\n]\n
These variables will be read from the environment variables when installing the project.
"},{"location":"usage/dependency/#add-development-only-dependencies","title":"Add development only dependencies","text":"New in 1.5.0
PDM also supports defining groups of dependencies that are useful for development, e.g. some for testing and others for linting. We usually don't want these dependencies appear in the distribution's metadata so using optional-dependencies
is probably not a good idea. We can define them as development dependencies:
pdm add -dG test pytest\n
This will result in a pyproject.toml as following:
[tool.pdm.dev-dependencies]\ntest = [\"pytest\"]\n
You can have several groups of development only dependencies. Unlike optional-dependencies
, they won't appear in the package distribution metadata such as PKG-INFO
or METADATA
. The package index won't be aware of these dependencies. The schema is similar to that of optional-dependencies
, except that it is in tool.pdm
table.
[tool.pdm.dev-dependencies]\nlint = [\n\"flake8\",\n\"black\"\n]\ntest = [\"pytest\", \"pytest-cov\"]\ndoc = [\"mkdocs\"]\n
For backward-compatibility, if only -d
or --dev
is specified, dependencies will go to dev
group under [tool.pdm.dev-dependencies]
by default. Note
The same group name MUST NOT appear in both [tool.pdm.dev-dependencies]
and [project.optional-dependencies]
.
Local directories and VCS dependencies can be installed in editable mode. If you are familiar with pip
, it is just like pip install -e <package>
. Editable packages are allowed only in development dependencies:
Note
Editable installs are only allowed in the dev
dependency group. Other groups, including the default, will fail with a [PdmUsageError]
.
# A relative path to the directory\npdm add -e ./sub-package --dev\n# A file URL to a local directory\npdm add -e file:///path/to/sub-package --dev\n# A VCS URL\npdm add -e git+https://github.com/pallets/click.git@main#egg=click --dev\n
"},{"location":"usage/dependency/#save-version-specifiers","title":"Save version specifiers","text":"If the package is given without a version specifier like pdm add requests
. PDM provides three different behaviors of what version specifier is saved for the dependency, which is given by --save-<strategy>
(Assume 2.21.0
is the latest version that can be found for the dependency):
minimum
: Save the minimum version specifier: >=2.21.0
(default).compatible
: Save the compatible version specifier: >=2.21.0,<3.0.0
.exact
: Save the exact version specifier: ==2.21.0
.wildcard
: Don't constrain version and leave the specifier to be wildcard: *
.One can give --pre/--prerelease
option to pdm add
so that prereleases are allowed to be pinned for the given packages.
To update all dependencies in the lock file:
pdm update\n
To update the specified package(s):
pdm update requests\n
To update multiple groups of dependencies:
pdm update -G security -G http\n
Or using comma-separated list:
pdm update -G \"security,http\"\n
To update a given package in the specified group:
pdm update -G security cryptography\n
If the group is not given, PDM will search for the requirement in the default dependencies set and raises an error if none is found.
To update packages in development dependencies:
# Update all default + dev-dependencies\npdm update -d\n# Update a package in the specified group of dev-dependencies\npdm update -dG test pytest\n
"},{"location":"usage/dependency/#about-update-strategy","title":"About update strategy","text":"Similarly, PDM also provides 3 different behaviors of updating dependencies and sub-dependencies\uff0c which is given by --update-<strategy>
option:
reuse
: Keep all locked dependencies except for those given in the command line (default).reuse-installed
: Try to reuse the versions installed in the working set. This will also affect the packages requested in the command line.eager
: Try to lock a newer version of the packages in command line and their recursive sub-dependencies and keep other dependencies as they are.all
: Update all dependencies and sub-dependencies.One can give -u/--unconstrained
to tell PDM to ignore the version specifiers in the pyproject.toml
. This works similarly to the yarn upgrade -L/--latest
command. Besides, pdm update
also supports the --pre/--prerelease
option.
To remove existing dependencies from project file and the library directory:
# Remove requests from the default dependencies\npdm remove requests\n# Remove h11 from the 'web' group of optional-dependencies\npdm remove -G web h11\n# Remove pytest-cov from the `test` group of dev-dependencies\npdm remove -dG test pytest-cov\n
"},{"location":"usage/dependency/#install-the-packages-pinned-in-lock-file","title":"Install the packages pinned in lock file","text":"There are a few similar commands to do this job with slight differences:
pdm sync
installs packages from the lock file.pdm update
will update the lock file, then sync
.pdm install
will check the project file for changes, update the lock file if needed, then sync
.sync
also has a few options to manage installed packages:
--clean
: will remove packages no longer in the lockfile--only-keep
: only selected packages (using options like -G
or --prod
) will be kept.You can specify another lockfile than the default pdm lock
by using the -L/--lockfile <filepath>
option or the PDM_LOCKFILE
environment variable.
Say we have a project with following dependencies:
[project] # This is production dependencies\ndependencies = [\"requests\"]\n\n[project.optional-dependencies] # This is optional dependencies\nextra1 = [\"flask\"]\nextra2 = [\"django\"]\n\n[tool.pdm.dev-dependencies] # This is dev dependencies\ndev1 = [\"pytest\"]\ndev2 = [\"mkdocs\"]\n
Command What it does Comments pdm install
install all groups locked in the lockfile pdm install -G extra1
install prod deps, dev deps, and \"extra1\" optional group pdm install -G dev1
install prod deps and only \"dev1\" dev group pdm install -G:all
install prod deps, dev deps and \"extra1\", \"extra2\" optional groups pdm install -G extra1 -G dev1
install prod deps, \"extra1\" optional group and only \"dev1\" dev group pdm install --prod
install prod only pdm install --prod -G extra1
install prod deps and \"extra1\" optional pdm install --prod -G dev1
Fail, --prod
can't be given with dev dependencies Leave the --prod
option All development dependencies are included as long as --prod
is not passed and -G
doesn't specify any dev groups.
Besides, if you don't want the root project to be installed, add --no-self
option, and --no-editable
can be used when you want all packages to be installed in non-editable versions.
You may also use the pdm lock command with these options to lock only the specified groups, which will be recorded in the [metadata]
table of the lock file. If no --group/--prod/--dev/--no-default
option is specified, pdm sync
and pdm update
will operate using the groups in the lockfile. However, if any groups that are not included in the lockfile are given as arguments to the commands, PDM will raise an error.
This feature is especially valuable when managing multiple lockfiles, where each may have different versions of the same package pinned. To switch between lockfiles, you can use the --lockfile/-L
option.
For a realistic example, your project depends on a release version of werkzeug
and you may want to work with a local in-development copy of it when developing. You can add the following to your pyproject.toml
:
[project]\nrequires-python = \">=3.7\"\ndependencies = [\"werkzeug\"]\n\n[tool.pdm.dev-dependencies]\ndev = [\"werkzeug @ file:///${PROJECT_ROOT}/dev/werkzeug\"]\n
Then, run pdm lock
with different options to generate lockfiles for different purposes:
# Lock default + dev, write to pdm.lock\n# with the local copy of werkzeug pinned.\npdm lock\n# Lock default, write to pdm.prod.lock\n# with the release version of werkzeug pinned.\npdm lock --prod -L pdm.prod.lock\n
Check the metadata.groups
field in the lockfile to see which groups are included.
Currently, we support three flags to control the locking behavior: cross_platform
, static_urls
and direct_minimal_versions
, with the meanings as follows. You can pass one or more flags to pdm lock
by --strategy/-S
option, either by giving a comma-separated list or by passing the option multiple times. Both of these commands function in the same way:
pdm lock -S cross_platform,static_urls\npdm lock -S cross_platform -S static_urls\n
The flags will be encoded in the lockfile and get read when you run pdm lock
next time. But you can disable flags by prefixing the flag name with no_
:
pdm lock -S no_cross_platform\n
This command makes the lockfile not cross-platform.
"},{"location":"usage/dependency/#cross-platform","title":"Cross platform","text":"New in version 2.6.0
By default, the generated lockfile is cross-platform, which means the current platform isn't taken into account when resolving the dependencies. The result lockfile will contain wheels and dependencies for all possible platforms and Python versions. However, sometimes this will result in a wrong lockfile when a release doesn't contain all wheels. To avoid this, you can tell PDM to create a lockfile that works for this platform only, trimming the wheels not relevant to the current platform. This can be done by passing the --strategy no_cross_platform
option to pdm lock
:
pdm lock --strategy no_cross_platform\n
"},{"location":"usage/dependency/#static-urls","title":"Static URLs","text":"New in version 2.8.0
By default, PDM only stores the filenames of the packages in the lockfile, which benefits the reusability across different package indexes. However, if you want to store the static URLs of the packages in the lockfile, you can pass the --strategy static_urls
option to pdm lock
:
pdm lock --strategy static_urls\n
The settings will be saved and remembered for the same lockfile. You can also pass --strategy no_static_urls
to disable it.
New in version 2.10.0
When it is enabled by passing --strategy direct_minimal_versions
, dependencies specified in the pyproject.toml
will be resolved to the minimal versions available, rather than the latest versions. This is useful when you want to test the compatibility of your project within a range of dependency versions.
For example, if you specified flask>=2.0
in the pyproject.toml
, flask
will be resolved to version 2.0.0
if there is no other compatibility issue.
Note
Version constraints in package dependencies are not future-proof. If you resolve the dependencies to the minimal versions, there will likely be backwards-compatibility issues. For example, flask==2.0.0
requires werkzeug>=2.0
, but in fact, it can not work with Werkzeug 3.0.0
, which is released 2 years after it.
New in version 2.11.0
Previously, the pdm lock
command would record package metadata as it is. When installing, PDM would start from the top requirements and traverse down to the leaf node of the dependency tree. It would then evaluate any marker it encounters against the current environment. If a marker is not satisfied, the package would be discarded. In other words, we need an additional \"resolution\" step in installation.
When the inherit_metadata
strategy is enabled, PDM will inherit and merge environment markers from a package's ancestors. These markers are then encoded in the lockfile during locking, resulting in faster installations. This has been enabled by default from version 2.11.0
, to disable this strategy in the config, use pdm config strategy.inherit_metadata false
.
Similar to pip list
, you can list all packages installed in the packages directory:
pdm list\n
"},{"location":"usage/dependency/#include-and-exclude-groups","title":"Include and exclude groups","text":"By default, all packages installed in the working set will be listed. You can specify which groups to be listed by --include/--exclude
options, and include
has a higher priority than exclude
.
pdm list --include dev\npdm list --exclude test\n
There is a special group :sub
, when included, all transitive dependencies will also be shown. It is included by default.
You can also pass --resolve
to pdm list
, which will show the packages resolved in pdm.lock
, rather than installed in the working set.
By default, name, version and location will be shown in the list output, you can view more fields or specify the order of fields by --fields
option:
pdm list --fields name,licenses,version\n
For all supported fields, please refer to the CLI reference.
Also, you can specify the output format other than the default table output. The supported formats and options are --csv
, --json
, --markdown
and --freeze
.
Or show a dependency tree by:
$ pdm list --tree\ntempenv 0.0.0\n\u2514\u2500\u2500 click 7.0 [ required: <7.0.0,>=6.7 ]\nblack 19.10b0\n\u251c\u2500\u2500 appdirs 1.4.3 [ required: Any ]\n\u251c\u2500\u2500 attrs 19.3.0 [ required: >=18.1.0 ]\n\u251c\u2500\u2500 click 7.0 [ required: >=6.5 ]\n\u251c\u2500\u2500 pathspec 0.7.0 [ required: <1,>=0.6 ]\n\u251c\u2500\u2500 regex 2020.2.20 [ required: Any ]\n\u251c\u2500\u2500 toml 0.10.0 [ required: >=0.9.4 ]\n\u2514\u2500\u2500 typed-ast 1.4.1 [ required: >=1.4.0 ]\nbump2version 1.0.0\n
Note that --fields
option doesn't work with --tree
.
You can also limit the packages to show by passing the patterns to pdm list
:
pdm list flask-* requests-*\n
Be careful with the shell expansion In most shells, the wildcard *
will be expanded if there are matching files under the current directory. To avoid getting unexpected results, you can wrap the patterns with single quotes: pdm list 'flask-*' 'requests-*'
.
In --tree
mode, only the subtree of the matched packages will be displayed. This can be used to achieve the same purpose as pnpm why
, which is to show why a specific package is required.
$ pdm list --tree --reverse certifi\ncertifi 2023.7.22\n\u2514\u2500\u2500 requests 2.31.0 [ requires: >=2017.4.17 ]\n\u2514\u2500\u2500 cachecontrol[filecache] 0.13.1 [ requires: >=2.16.0 ]\n
"},{"location":"usage/dependency/#allow-prerelease-versions-to-be-installed","title":"Allow prerelease versions to be installed","text":"Include the following setting in pyproject.toml
to enable:
[tool.pdm]\nallow_prereleases = true\n
"},{"location":"usage/dependency/#set-acceptable-format-for-locking-or-installing","title":"Set acceptable format for locking or installing","text":"If you want to control the format(binary/sdist) of the packages, you can set the env vars PDM_NO_BINARY
and PDM_ONLY_BINARY
.
Each env var is a comma-separated list of package name. You can set it to :all:
to apply to all packages. For example:
# No binary for werkzeug will be locked nor used for installation\nPDM_NO_BINARY=werkzeug pdm add flask\n# Only binaries will be locked in the lock file\nPDM_ONLY_BINARY=:all: pdm lock\n# No binaries will be used for installation\nPDM_NO_BINARY=:all: pdm install\n# Prefer binary distributions and even if sdist with higher version is available\nPDM_PREFER_BINARY=flask pdm install\n
"},{"location":"usage/dependency/#solve-the-locking-failure","title":"Solve the locking failure","text":"If PDM is not able to find a resolution to satisfy the requirements, it will raise an error. For example,
pdm django==3.1.4 \"asgiref<3\"\n...\n\ud83d\udd12 Lock failed\nUnable to find a resolution for asgiref because of the following conflicts:\n asgiref<3 (from project)\nasgiref<4,>=3.2.10 (from <Candidate django 3.1.4 from https://pypi.org/simple/django/>)\nTo fix this, you could loosen the dependency version constraints in pyproject.toml. If that is not possible, you could also override the resolved version in `[tool.pdm.resolution.overrides]` table.\n
You can either change to a lower version of django
or remove the upper bound of asgiref
. But if it is not eligible for your project, you can try overriding the resolved package versions in pyproject.toml
.
Sometimes users may want to keep track of the dependencies of global Python interpreter as well. It is easy to do so with PDM, via -g/--global
option which is supported by most subcommands.
If the option is passed, <CONFIG_ROOT>/global-project
will be used as the project directory, which is almost the same as normal project except that pyproject.toml
will be created automatically for you and it doesn't support build features. The idea is taken from Haskell's stack.
However, unlike stack
, by default, PDM won't use global project automatically if a local project is not found. Users should pass -g/--global
explicitly to activate it, since it is not very pleasing if packages go to a wrong place. But PDM also leave the decision to users, just set the config global_project.fallback
to true
.
By default, when pdm
uses global project implicitly the following message is printed: Project is not found, fallback to the global project
. To disable this message set the config global_project.fallback_verbose
to false
.
If you want global project to track another project file other than <CONFIG_ROOT>/global-project
, you can provide the project path via -p/--project <path>
option. Especially if you pass --global --project .
, PDM will install the dependencies of the current project into the global Python.
Warning
Be careful with remove
and sync --clean/--pure
commands when global project is used, because it may remove packages installed in your system Python.
You can also export pdm lock
to other formats, to ease the CI flow or image building process. Currently, only requirements.txt
format is supported:
pdm export -o requirements.txt\n
Note
You can also run pdm export
with a .pre-commit
hook.
As any Python deliverable, your project will go through the different phases of a Python project lifecycle and PDM provides commands to perform the expected tasks for those phases.
It also provides hooks attached to these steps allowing for:
Besides, pre_invoke
signal is emitted before ANY command is invoked, allowing plugins to modify the project or options beforehand.
The built-in commands are currently split into 3 groups:
You will most probably need to perform some recurrent tasks between the installation and publication phases (housekeeping, linting, testing, ...) this is why PDM lets you define your own tasks/phases using user scripts.
To provides full flexibility, PDM allows to skip some hooks and tasks on demand.
"},{"location":"usage/hooks/#initialization","title":"Initialization","text":"The initialization phase should occur only once in a project lifetime by running the pdm init
command to initialize an existing project (prompt to fill the pyproject.toml
file).
They trigger the following hooks:
post_init
flowchart LR\n subgraph pdm-init [pdm init]\n direction LR\n post-init{{Emit post_init}}\n init --> post-init\n end
"},{"location":"usage/hooks/#dependencies-management","title":"Dependencies management","text":"The dependencies management is required for the developer to be able to work and perform the following:
lock
: compute a lock file from the pyproject.toml
requirements.sync
: synchronize (add/remove/update) PEP582 packages from the lock file and install the current project as editable.add
: add a dependencyremove
: remove a dependencyAll those steps are directly available with the following commands:
pdm lock
: execute the lock
taskpdm sync
: execute the sync
taskpdm install
: execute the sync
task, preceded from lock
if requiredpdm add
: add a dependency requirement, re-lock and then syncpdm remove
: remove a dependency requirement, re-lock and then syncpdm update
: re-lock dependencies from their latest versions and then syncThey trigger the following hooks:
pre_install
post_install
pre_lock
post_lock
flowchart LR\n subgraph pdm-install [pdm install]\n direction LR\n\n subgraph pdm-lock [pdm lock]\n direction TB\n pre-lock{{Emit pre_lock}}\n post-lock{{Emit post_lock}}\n pre-lock --> lock --> post-lock\n end\n\n subgraph pdm-sync [pdm sync]\n direction TB\n pre-install{{Emit pre_install}}\n post-install{{Emit post_install}}\n pre-install --> sync --> post-install\n end\n\n pdm-lock --> pdm-sync\n end
"},{"location":"usage/hooks/#switching-python-version","title":"Switching Python version","text":"This is a special case in dependency management: you can switch the current Python version using pdm use
and it will emit the post_use
signal with the new Python interpreter.
flowchart LR\n subgraph pdm-use [pdm use]\n direction LR\n post-use{{Emit post_use}}\n use --> post-use\n end
"},{"location":"usage/hooks/#publication","title":"Publication","text":"As soon as you are ready to publish your package/library, you will require the publication tasks:
build
: build/compile assets requiring it and package everything into a Python package (sdist, wheel)upload
: upload/publish the package to a remote PyPI indexAll those steps are available with the following commands:
pdm build
pdm publish
They trigger the following hooks:
pre_publish
post_publish
pre_build
post_build
flowchart LR\n subgraph pdm-publish [pdm publish]\n direction LR\n pre-publish{{Emit pre_publish}}\n post-publish{{Emit post_publish}}\n\n subgraph pdm-build [pdm build]\n pre-build{{Emit pre_build}}\n post-build{{Emit post_build}}\n pre-build --> build --> post-build\n end\n\n %% subgraph pdm-upload [pdm upload]\n %% pre-upload{{Emit pre_upload}}\n %% post-upload{{Emit post_upload}}\n %% pre-upload --> upload --> post-upload\n %% end\n\n pre-publish --> pdm-build --> upload --> post-publish\n end
Execution will stop at first failure, hooks included.
"},{"location":"usage/hooks/#user-scripts","title":"User scripts","text":"User scripts are detailed in their own section but you should know that:
pre_*
and post_*
script, including composite scripts.run
execution will trigger the pre_run
and post_run
hookspre_script
and post_script
hooksGiven the following scripts
definition:
[tool.pdm.scripts]\npre_script = \"\"\npost_script = \"\"\npre_test = \"\"\npost_test = \"\"\ntest = \"\"\npre_composite = \"\"\npost_composite = \"\"\ncomposite = {composite = [\"test\"]}\n
a pdm run test
will have the following lifecycle:
flowchart LR\n subgraph pdm-run-test [pdm run test]\n direction LR\n pre-run{{Emit pre_run}}\n post-run{{Emit post_run}}\n subgraph run-test [test task]\n direction TB\n pre-script{{Emit pre_script}}\n post-script{{Emit post_script}}\n pre-test[Execute pre_test]\n post-test[Execute post_test]\n test[Execute test]\n\n pre-script --> pre-test --> test --> post-test --> post-script\n end\n\n pre-run --> run-test --> post-run\n end
while pdm run composite
will have the following:
flowchart LR\n subgraph pdm-run-composite [pdm run composite]\n direction LR\n pre-run{{Emit pre_run}}\n post-run{{Emit post_run}}\n\n subgraph run-composite [composite task]\n direction TB\n pre-script-composite{{Emit pre_script}}\n post-script-composite{{Emit post_script}}\n pre-composite[Execute pre_composite]\n post-composite[Execute post_composite]\n\n subgraph run-test [test task]\n direction TB\n pre-script-test{{Emit pre_script}}\n post-script-test{{Emit post_script}}\n pre-test[Execute pre_test]\n post-test[Execute post_test]\n\n pre-script-test --> pre-test --> test --> post-test --> post-script-test\n end\n\n pre-script-composite --> pre-composite --> run-test --> post-composite --> post-script-composite\n end\n\n pre-run --> run-composite --> post-run\n end
"},{"location":"usage/hooks/#skipping","title":"Skipping","text":"It is possible to control which task and hook runs for any built-in command as well as custom user scripts using the --skip
option.
It accepts a comma-separated list of hooks/task names to skip as well as the predefined :all
, :pre
and :post
shortcuts respectively skipping all hooks, all pre_*
hooks and all post_*
hooks. You can also provide the skip list in PDM_SKIP_HOOKS
environment variable but it will be overridden as soon as the --skip
parameter is provided.
Given the previous script block, running pdm run --skip=:pre,post_test composite
will result in the following reduced lifecycle:
flowchart LR\n subgraph pdm-run-composite [pdm run composite]\n direction LR\n post-run{{Emit post_run}}\n\n subgraph run-composite [composite task]\n direction TB\n post-script-composite{{Emit post_script}}\n post-composite[Execute post_composite]\n\n subgraph run-test [test task]\n direction TB\n post-script-test{{Emit post_script}}\n\n test --> post-script-test\n end\n\n run-test --> post-composite --> post-script-composite\n end\n\n run-composite --> post-run\n end
"},{"location":"usage/pep582/","title":"Working with PEP 582","text":"PEP 582 has been rejected
This is a rejected PEP. However, due to the fact that this feature is the reason for PDM's birth, PDM will retain the support. We recommend using virtual environments instead.
With PEP 582, dependencies will be installed into __pypackages__
directory under the project root. With PEP 582 enabled globally, you can also use the project interpreter to run scripts directly.
When the project interpreter is a normal Python, this mode is enabled.
Besides, on a project you work with for the first time on your machine, if it contains an empty __pypackages__
directory, PEP 582 is enabled automatically, and virtualenv won't be created.
To make the Python interpreters aware of PEP 582 packages, one need to add the pdm/pep582/sitecustomize.py
to the Python library search path.
One just needs to execute pdm --pep582
, then environment variable will be changed automatically. Don't forget to restart the terminal session to take effect.
The command to change the environment variables can be printed by pdm --pep582 [<SHELL>]
. If <SHELL>
isn't given, PDM will pick one based on some guesses. You can run eval \"$(pdm --pep582)\"
to execute the command.
You may want to write a line in your .bash_profile
(or similar profiles) to make it effective when logging in. For example, in bash you can do this:
pdm --pep582 >> ~/.bash_profile\n
Once again, Don't forget to restart the terminal session to take effect.
How is it done?Thanks to the site packages loading on Python startup. It is possible to patch the sys.path
by executing the sitecustomize.py
shipped with PDM. The interpreter can search the directories for the nearest __pypackage__
folder and append it to the sys.path
variable.
Now there are no built-in support or plugins for PEP 582 in most IDEs, you have to configure your tools manually.
"},{"location":"usage/pep582/#pycharm","title":"PyCharm","text":"Mark __pypackages__/<major.minor>/lib
as Sources Root. Then, select as Python interpreter a Python installation with the same <major.minor>
version.
Additionally, if you want to use tools from the environment (e.g. pytest
), you have to add the __pypackages__/<major.minor>/bin
directory to the PATH
variable in the corresponding run/debug configuration.
Add the following two entries to the top-level dict in .vscode/settings.json
:
{\n\"python.autoComplete.extraPaths\": [\"__pypackages__/<major.minor>/lib\"],\n\"python.analysis.extraPaths\": [\"__pypackages__/<major.minor>/lib\"]\n}\n
This file can be auto-generated with plugin pdm-vscode
.
Enable PEP582 globally, and make sure VSCode runs using the same user and shell you enabled PEP582 for.
Cannot enable PEP582 globally?If for some reason you cannot enable PEP582 globally, you can still configure each \"launch\" in each project: set the PYTHONPATH
environment variable in your launch configuration, in .vscode/launch.json
. For example, to debug your pytest
run:
{\n\"version\": \"0.2.0\",\n\"configurations\": [\n{\n\"name\": \"pytest\",\n\"type\": \"python\",\n\"request\": \"launch\",\n\"module\": \"pytest\",\n\"args\": [\"tests\"],\n\"justMyCode\": false,\n\"env\": {\"PYTHONPATH\": \"__pypackages__/<major.minor>/lib\"}\n}\n]\n}\n
If your package resides in a src
directory, add it to PYTHONPATH
as well:
\"env\": {\"PYTHONPATH\": \"src:__pypackages__/<major.minor>/lib\"}\n
Using Pylance/Pyright? If you have configured \"python.analysis.diagnosticMode\": \"workspace\"
, and you see a ton of errors/warnings as a result. you may need to create pyrightconfig.json
in the workspace directory, and fill in the following fields:
{\n\"exclude\": [\"__pypackages__\"]\n}\n
Then restart the language server or VS Code and you're good to go. In the future (microsoft/pylance-release#1150), maybe the problem will be solved.
Using Jupyter Notebook?If you wish to use pdm to install jupyter notebook and use it in vscode in conjunction with the python extension:
pdm add notebook
or so to install notebook.env
file inside of your project directory with contents like the following:PYTHONPATH=/your-workspace-path/__pypackages__/<major>.<minor>/lib\n
If the above still doesn't work, it's most likely because the environment variable is not properly loaded when the Notebook starts. There are two workarounds.
code .
in Terminal. It will open a new VSCode window in the current directory with the path set correctly. Use the Jupyter Notebook in the new windowimport sys\nsys.path.append('/your-workspace-path/__pypackages__/<major>.<minor>/lib')\n
Reference Issue
PDM Task ProviderIn addition, there is a VSCode Task Provider extension available for download.
This makes it possible for VSCode to automatically detect pdm scripts so they can be run natively as VSCode Tasks.
"},{"location":"usage/pep582/#neovim","title":"Neovim","text":"If using neovim-lsp with pyright and want your __pypackages__
directory to be added to the path, you can add this to your project's pyproject.toml
.
[tool.pyright]\nextraPaths = [\"__pypackages__/<major.minor>/lib/\"]\n
"},{"location":"usage/pep582/#emacs","title":"Emacs","text":"You have a few options, but basically you'll want to tell an LSP client to add __pypackages__
to the paths it looks at. Here are a few options that are available:
pyproject.toml
and pyright","text":"Add this to your project's pyproject.toml
:
[tool.pyright]\nextraPaths = [\"__pypackages__/<major.minor>/lib/\"]\n
"},{"location":"usage/pep582/#eglot-pyright","title":"eglot + pyright","text":"Using pyright and eglot (included in Emacs 29), add the following to your config:
(defun get-pdm-packages-path ()\n\"For the current PDM project, find the path to the packages.\"\n(let ((packages-path (string-trim (shell-command-to-string \"pdm info --packages\"))))\n(concat packages-path \"/lib\")))\n\n(defun my/eglot-workspace-config (server)\n\"For the current PDM project, dynamically generate a python lsp config.\"\n`(:python\\.analysis (:extraPaths ,(vector (get-pdm-packages-path)))))\n\n(setq-default eglot-workspace-configuration #'my/eglot-workspace-config)\n
You'll want pyright installed either globally, or in your project (probably as a dev dependency). You can add this with, for example:
pdm add --dev --group devel pyright\n
"},{"location":"usage/pep582/#lsp-mode-lsp-python-ms","title":"LSP-Mode + lsp-python-ms","text":"Below is a sample code snippet showing how to make PDM work with lsp-python-ms in Emacs. Contributed by @linw1995.
;; TODO: Cache result\n(defun linw1995/pdm-get-python-executable (&optional dir)\n(let ((pdm-get-python-cmd \"pdm info --python\"))\n(string-trim\n(shell-command-to-string\n(if dir\n(concat \"cd \"\ndir\n\" && \"\npdm-get-python-cmd)\npdm-get-python-cmd)))))\n\n(defun linw1995/pdm-get-packages-path (&optional dir)\n(let ((pdm-get-packages-cmd \"pdm info --packages\"))\n(concat (string-trim\n(shell-command-to-string\n(if dir\n(concat \"cd \"\ndir\n\" && \"\npdm-get-packages-cmd)\npdm-get-packages-cmd)))\n\"/lib\")))\n\n(use-package lsp-python-ms\n:ensure t\n:init (setq lsp-python-ms-auto-install-server t)\n:hook (python-mode\n. (lambda ()\n(setq lsp-python-ms-python-executable (linw1995/pdm-get-python-executable))\n(setq lsp-python-ms-extra-paths (vector (linw1995/pdm-get-packages-path)))\n(require 'lsp-python-ms)\n(lsp)))) ; or lsp-deferred\n
"},{"location":"usage/project/","title":"New Project","text":"To start with, create a new project with pdm init
:
mkdir my-project && cd my-project\npdm init\n
You will need to answer a few questions, to help PDM to create a pyproject.toml
file for you. For more usages of pdm init
, please read Create your project from a template.
At first, you need to choose a Python interpreter from a list of Python versions installed on your machine. The interpreter path will be stored in .pdm-python
and used by subsequent commands. You can also change it later with pdm use
.
Alternatively, you can specify the Python interpreter path via PDM_PYTHON
environment variable. When it is set, the path saved in .pdm-python
will be ignored.
After you select the Python interpreter, PDM will ask you whether you want to create a virtual environment for the project. If you choose yes, PDM will create a virtual environment in the project root directory, and use it as the Python interpreter for the project.
If the selected Python interpreter is in a virtual environment, PDM will use it as the project environment and install dependencies into it. Otherwise, __pypackages__
will be created in the project root and dependencies will be installed into it.
For the difference between these two approaches, please refer to the corresponding sections in the docs:
__pypackages__
(PEP 582)A library and an application differ in many ways. In short, a library is a package that is intended to be installed and used by other projects. In most cases it also needs to be uploaded to PyPI. An application, on the other hand, is one that is directly facing end users and may need to be deployed into some production environments.
In PDM, if you choose to create a library, PDM will add a name
, version
field to the pyproject.toml
file, as well as a [build-system]
table for the build backend, which is only useful if your project needs to be built and distributed. So you need to manually add these fields to pyproject.toml
if you want to change the project from an application to a library. Also, a library project will be installed into the environment when you run pdm install
or pdm sync
, unless --no-self
is specified.
requires-python
value","text":"You need to set an appropriate requires-python
value for your project. This is an important property that affects how dependencies are resolved. Basically, each package's requires-python
must cover the project's requires-python
range. For example, consider the following setup:
requires-python = \">=3.9\"
foo
: requires-python = \">=3.7,<3.11\"
Resolving the dependencies will cause a ResolutionImpossible
:
Unable to find a resolution because the following dependencies don't work\non all Python versions defined by the project's `requires-python`\n
Because the dependency's requires-python
is >=3.7,<3.11
, it doesn't cover the project's requires-python
range of >=3.9
. In other words, the project promises to work on Python 3.9, 3.10, 3.11 (and so on), but the dependency doesn't support Python 3.11 (or any higher). Since PDM creates a cross-platform lockfile that should work on all Python versions within the requires-python
range, it can't find a valid resolution. To fix this, you need add a maximum version to requires-python
, like >=3.9,<3.11
.
The value of requires-python
is a version specifier as defined in PEP 440. Here are some examples:
requires-python
Meaning >=3.7
Python 3.7 and above >=3.7,<3.11
Python 3.7, 3.8, 3.9 and 3.10 >=3.6,!=3.8.*,!=3.9.*
Python 3.6 and above, except 3.8 and 3.9"},{"location":"usage/project/#working-with-older-python-versions","title":"Working with older Python versions","text":"Although PDM run on Python 3.8 and above, you can still have lower Python versions for your working project. But remember, if your project is a library, which needs to be built, published or installed, you make sure the PEP 517 build backend being used supports the lowest Python version you need. For instance, the default backend pdm-backend
only works on Python 3.7+, so if you run pdm build
on a project with Python 3.6, you will get an error. Most modern build backends have dropped the support for Python 3.6 and lower, so it is highly recommended to upgrade the Python version to 3.7+. Here are the supported Python range for some commonly used build backends, we only list those that support PEP 621 since otherwise PDM can't work with them.
pdm-backend
>=3.7
Yes setuptools>=60
>=3.7
Experimental hatchling
>=3.7
Yes flit-core>=3.4
>=3.6
Yes flit-core>=3.2,<3.4
>=3.4
Yes Note that if your project is an application (i.e. without the name
metadata), the above limitation of backends does not apply. Therefore, if you don't need a build backend you can use any Python version >=2.7
.
If you are already using other package manager tools like Pipenv or Poetry, it is easy to migrate to PDM. PDM provides import
command so that you don't have to initialize the project manually, it now supports:
Pipfile
pyproject.toml
pyproject.toml
requirements.txt
format used by pipsetup.py
(It requires setuptools
to be installed in the project environment. You can do this by configuring venv.with_pip
to true
for venv and pdm add setuptools
for __pypackages__
)Also, when you are executing pdm init
or pdm install
, PDM can auto-detect possible files to import if your PDM project has not been initialized yet.
Info
Converting a setup.py
will execute the file with the project interpreter. Make sure setuptools
is installed with the interpreter and the setup.py
is trusted.
You must commit the pyproject.toml
file. You should commit the pdm.lock
and pdm.toml
file. Do not commit the .pdm-python
file.
The pyproject.toml
file must be committed as it contains the project's build metadata and dependencies needed for PDM. It is also commonly used by other python tools for configuration. Read more about the pyproject.toml
file at Pip documentation.
You should be committing the pdm.lock
file, by doing so you ensure that all installers are using the same versions of dependencies. To learn how to update dependencies see update existing dependencies.
pdm.toml
contains some project-wide configuration and it may be useful to commit it for sharing.
.pdm-python
stores the Python path used by the current project and doesn't need to be shared.
$ pdm info\nPDM version:\n 2.0.0\nPython Interpreter:\n /opt/homebrew/opt/python@3.9/bin/python3.9 (3.9)\nProject Root:\n /Users/fming/wkspace/github/test-pdm\nProject Packages:\n /Users/fming/wkspace/github/test-pdm/__pypackages__/3.9\n\n# Show environment info\n$ pdm info --env\n{\n\"implementation_name\": \"cpython\",\n \"implementation_version\": \"3.8.0\",\n \"os_name\": \"nt\",\n \"platform_machine\": \"AMD64\",\n \"platform_release\": \"10\",\n \"platform_system\": \"Windows\",\n \"platform_version\": \"10.0.18362\",\n \"python_full_version\": \"3.8.0\",\n \"platform_python_implementation\": \"CPython\",\n \"python_version\": \"3.8\",\n \"sys_platform\": \"win32\"\n}\n
This command is useful for checking which mode is being used by the project: if Project Packages is None, virtualenv mode is enabled.
Now you have set up a new PDM project and have a pyproject.toml file. Refer to the metadata section about how to write pyproject.toml properly.
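As a rough illustration (all field values below are placeholders), a freshly initialized pyproject.toml might contain something like:
[project]\nname = \"my-project\"\nversion = \"0.1.0\"\ndescription = \"A short description of the project\"\nrequires-python = \">=3.9\"\ndependencies = []\n\n[build-system]\nrequires = [\"pdm-backend\"]\nbuild-backend = \"pdm.backend\"\n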
If you are developing a library, once you have added dependencies to your project and finished the coding, it's time to build and publish your package. It is as simple as one command:
pdm publish\n
This will automatically build a wheel and a source distribution (sdist), and upload them to the PyPI index.
To publish to a repository other than PyPI, use the --repository
option; the value can be either the upload URL or the name of a repository stored in the config file.
pdm publish --repository testpypi\npdm publish --repository https://test.pypi.org/legacy/\n
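Named repositories such as testpypi come from PDM's configuration. As a sketch (the exact keys depend on your setup and credentials), the corresponding section of the config file could look like:
[repository.testpypi]\nurl = \"https://test.pypi.org/legacy/\"\nusername = \"__token__\"\n# password can be omitted to rely on keyring or an interactive prompt\npassword = \"pypi-<your-token>\"\n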
"},{"location":"usage/publish/#publish-with-trusted-publishers","title":"Publish with trusted publishers","text":"You can configure trusted publishers for PyPI so that you don't need to expose the PyPI tokens in the release workflow. To do this, follow the guide to add a publisher and write the GitHub Actions workflow as below:
jobs:\npypi-publish:\nname: upload release to PyPI\nruns-on: ubuntu-latest\npermissions:\n# IMPORTANT: this permission is mandatory for trusted publishing\nid-token: write\nsteps:\n- uses: actions/checkout@v3\n\n- uses: pdm-project/setup-pdm@v3\n\n- name: Publish package distributions to PyPI\nrun: pdm publish\n
"},{"location":"usage/publish/#build-and-publish-separately","title":"Build and publish separately","text":"You can also build the package and upload it in two steps, to allow you to inspect the built artifacts before uploading.
pdm build\n
There are many options to control the build process, depending on the backend used. Refer to the build configuration section for more details.
The artifacts will be created at dist/
and are ready to be uploaded to PyPI.
pdm publish --no-build\n
"},{"location":"usage/scripts/","title":"PDM Scripts","text":"Like npm run
, with PDM, you can run arbitrary scripts or commands with local packages loaded.
pdm run flask run -p 54321\n
It will run flask run -p 54321
in an environment that is aware of the packages in your project environment.
PDM also supports custom script shortcuts in the optional [tool.pdm.scripts]
section of pyproject.toml
.
You can then run pdm run <script_name>
to invoke the script in the context of your PDM project. For example:
[tool.pdm.scripts]\nstart = \"flask run -p 54321\"\n
And then in your terminal:
$ pdm run start\nFlask server started at http://127.0.0.1:54321\n
Any following arguments will be appended to the command:
$ pdm run start -h 0.0.0.0\nFlask server started at http://0.0.0.0:54321\n
Yarn-like script shortcuts
There is a builtin shortcut making all scripts available as root commands as long as the script does not conflict with any builtin or plugin-contributed command. In other words, if you have a start
script, you can run both pdm run start
and pdm start
. But if you have an install
script, only pdm run install
will run it, pdm install
will still run the builtin install
command.
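For example, with the following sketch (the commands are placeholders), pdm start works as a shortcut, while the install script can only be reached through pdm run install:
[tool.pdm.scripts]\n# no conflict: both pdm start and pdm run start work\nstart = \"flask run -p 54321\"\n# clashes with the builtin command, so only pdm run install runs it\ninstall = \"echo custom install step\"\n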
PDM supports 4 types of scripts:
"},{"location":"usage/scripts/#cmd","title":"cmd
","text":"Plain text scripts are regarded as normal command, or you can explicitly specify it:
[tool.pdm.scripts]\nstart = {cmd = \"flask run -p 54321\"}\n
In some cases, such as when wanting to add comments between parameters, it might be more convenient to specify the command as an array instead of a string:
[tool.pdm.scripts]\nstart = {cmd = [\n\"flask\",\n\"run\",\n# Important comment here about always using port 54321\n\"-p\", \"54321\"\n]}\n
"},{"location":"usage/scripts/#shell","title":"shell
","text":"Shell scripts can be used to run more shell-specific tasks, such as pipeline and output redirecting. This is basically run via subprocess.Popen()
with shell=True
:
[tool.pdm.scripts]\nfilter_error = {shell = \"cat error.log|grep CRITICAL > critical.log\"}\n
"},{"location":"usage/scripts/#call","title":"call
","text":"The script can be also defined as calling a python function in the form <module_name>:<func_name>
:
[tool.pdm.scripts]\nfoobar = {call = \"foo_package.bar_module:main\"}\n
The function can be supplied with literal arguments:
[tool.pdm.scripts]\nfoobar = {call = \"foo_package.bar_module:main('dev')\"}\n
"},{"location":"usage/scripts/#composite","title":"composite
","text":"This script kind execute other defined scripts:
[tool.pdm.scripts]\nlint = \"flake8\"\ntest = \"pytest\"\nall = {composite = [\"lint\", \"test\"]}\n
Running pdm run all
will run lint
first and then test
if lint
succeeded.
You can also provide arguments to the called scripts:
[tool.pdm.scripts]\nlint = \"flake8\"\ntest = \"pytest\"\nall = {composite = [\"lint mypackage/\", \"test -v tests/\"]}\n
Note
Arguments passed on the command line are given to each called task.
"},{"location":"usage/scripts/#script-options","title":"Script Options","text":""},{"location":"usage/scripts/#env","title":"env
","text":"All environment variables set in the current shell can be seen by pdm run
and will be expanded when executed. Besides, you can also define some fixed environment variables in your pyproject.toml
:
[tool.pdm.scripts]\nstart.cmd = \"flask run -p 54321\"\nstart.env = {FOO = \"bar\", FLASK_ENV = \"development\"}\n
Note how we use TOML's syntax to define a composite dictionary.
Note
Environment variables specified on a composite task level will override those defined by called tasks.
"},{"location":"usage/scripts/#env_file","title":"env_file
","text":"You can also store all environment variables in a dotenv file and let PDM read it:
[tool.pdm.scripts]\nstart.cmd = \"flask run -p 54321\"\nstart.env_file = \".env\"\n
The variables within the dotenv file will not override any existing environment variables. If you want the dotenv file to override existing environment variables use the following:
[tool.pdm.scripts]\nstart.cmd = \"flask run -p 54321\"\nstart.env_file.override = \".env\"\n
Note
A dotenv file specified on a composite task level will override those defined by called tasks.
"},{"location":"usage/scripts/#site_packages","title":"site_packages
","text":"To make sure the running environment is properly isolated from the outer Python interpreter, site-packages from the selected interpreter WON'T be loaded into sys.path
, unless any of the following conditions holds:
- The executable is from PATH but not inside the __pypackages__ folder.
- The -s/--site-packages flag follows pdm run.
- site_packages = true is in either the script table or the global setting key _.

Note that site-packages will always be loaded if running with PEP 582 enabled (without the pdm run prefix).
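For example, to opt a single script, or every script, into loading site-packages, a sketch might look like this (the script name is made up):
[tool.pdm.scripts]\nnotebook = {cmd = \"jupyter notebook\", site_packages = true}\n# or enable it for all scripts via the shared options key\n_ = {site_packages = true}\n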
If you want the options to be shared by all tasks run by pdm run
, you can write them under a special key _
in [tool.pdm.scripts]
table:
[tool.pdm.scripts]\n_.env_file = \".env\"\nstart = \"flask run -p 54321\"\nmigrate_db = \"flask db upgrade\"\n
Besides, inside the tasks, the PDM_PROJECT_ROOT
environment variable will be set to the project root.
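For instance, a script can read that variable like any other environment variable (a minimal sketch; the script name is made up):
[tool.pdm.scripts]\nshow-root = {shell = \"echo Project root is $PDM_PROJECT_ROOT\"}\n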
By default, all user provided extra arguments are simply appended to the command (or to all the commands for composite
tasks).
If you want more control over the user provided extra arguments, you can use the {args}
placeholder. It is available for all script types and will be interpolated properly for each:
[tool.pdm.scripts]\ncmd = \"echo '--before {args} --after'\"\nshell = {shell = \"echo '--before {args} --after'\"}\ncomposite = {composite = [\"cmd --something\", \"shell {args}\"]}\n
will produce the following interpolations (those are not real scripts, just here to illustrate the interpolation):
$ pdm run cmd --user --provided\n--before --user --provided --after\n$ pdm run cmd\n--before --after\n$ pdm run shell --user --provided\n--before --user --provided --after\n$ pdm run shell\n--before --after\n$ pdm run composite --user --provided\ncmd --something\nshell --before --user --provided --after\n$ pdm run composite\ncmd --something\nshell --before --after\n
You may optionally provide default values that will be used if no user arguments are provided:
[tool.pdm.scripts]\ntest = \"echo '--before {args:--default --value} --after'\"\n
will produce the following:
$ pdm run test --user --provided\n--before --user --provided --after\n$ pdm run test\n--before --default --value --after\n
Note
As soon as a placeholder is detected, arguments are not appended anymore. This is important for composite
scripts: if a placeholder is detected on one of the subtasks, none of the subtasks will have the arguments appended; you need to explicitly pass the placeholder to every nested command requiring it.
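A sketch of that pattern, with the placeholder repeated on every subtask that should receive the user arguments (the tool invocations are illustrative):
[tool.pdm.scripts]\nlint = \"flake8 {args:.}\"\ntest = \"pytest {args}\"\nall = {composite = [\"lint {args}\", \"test {args}\"]}\n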
Note
call
scripts don't support the {args}
placeholder as they have access to sys.argv
directly to handle such complex cases and more.
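As a sketch (the module path is hypothetical), such a script simply points at a function that inspects sys.argv itself:
[tool.pdm.scripts]\n# my_package.cli:main can parse sys.argv on its own, e.g. with argparse\nmigrate = {call = \"my_package.cli:main\"}\n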
{pdm}
placeholder","text":"Sometimes you may have multiple PDM installations, or pdm
installed with a different name. This could for example occur in a CI/CD situation, or when working with different PDM versions in different repos. To make your scripts more robust you can use {pdm}
to refer to the PDM entry point executing the script. This will expand to {sys.executable} -m pdm
.
[tool.pdm.scripts]\nwhoami = { shell = \"echo `{pdm} -V` was called as '{pdm} -V'\" }\n
will produce the following output: $ pdm whoami\nPDM, version 0.1.dev2501+g73651b7.d20231115 was called as /usr/bin/python3 -m pdm -V\n\n$ pdm2.8 whoami\nPDM, version 2.8.0 was called as <snip>/venvs/pdm2-8/bin/python -m pdm -V\n
Note
While the above example uses PDM 2.8, this functionality was introduced in the 2.10 series and only backported for the showcase.
"},{"location":"usage/scripts/#show-the-list-of-scripts","title":"Show the List of Scripts","text":"Use pdm run --list/-l
to show the list of available script shortcuts:
$ pdm run --list\n\u256d\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u252c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u252c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256e\n\u2502 Name \u2502 Type \u2502 Description \u2502\n\u251c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u253c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u253c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2524\n\u2502 test_cmd \u2502 cmd \u2502 flask db upgrade \u2502\n\u2502 test_script \u2502 call \u2502 call a python function \u2502\n\u2502 test_shell \u2502 shell \u2502 shell command \u2502\n\u2570\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256f\n
You can add a help
option with the description of the script, and it will be displayed in the Description
column in the above output.
Note
Tasks with a name starting with an underscore (_
) are considered internal (helpers...) and are not shown in the listing.
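Both behaviors can be combined in a sketch like the following (script names and commands are illustrative):
[tool.pdm.scripts]\ntest = {cmd = \"pytest\", help = \"Run the test suite\"}\n# leading underscore: hidden from pdm run --list\n_clean = \"rm -rf build dist\"\n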
Like npm
, PDM also supports task composition via pre and post scripts: a pre script will be run before the given task and a post script will be run after it.
[tool.pdm.scripts]\npre_compress = \"{{ Run BEFORE the `compress` script }}\"\ncompress = \"tar czvf compressed.tar.gz data/\"\npost_compress = \"{{ Run AFTER the `compress` script }}\"\n
In this example, pdm run compress
will run all these 3 scripts sequentially.
The pipeline fails fast
In a pipeline of pre - self - post scripts, a failure will cancel the subsequent execution.
"},{"location":"usage/scripts/#hook-scripts","title":"Hook Scripts","text":"Under certain situations PDM will look for some special hook scripts for execution:
- post_init: Run after pdm init
- pre_install: Run before installing packages
- post_install: Run after packages are installed
- pre_lock: Run before dependency resolution
- post_lock: Run after dependency resolution
- pre_build: Run before building distributions
- post_build: Run after distributions are built
- pre_publish: Run before publishing distributions
- post_publish: Run after distributions are published
- pre_script: Run before any script
- post_script: Run after any script
- pre_run: Run once before run script invocation
- post_run: Run once after run script invocation

Note
Pre & post scripts can't receive any arguments.
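Hook scripts are defined in the same [tool.pdm.scripts] table as any other script; a sketch (the commands themselves are illustrative) could be:
[tool.pdm.scripts]\npost_install = \"python -m compileall -q src\"\npre_build = \"flake8 src\"\n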
Avoid name conflicts
If there exists an install
script under the [tool.pdm.scripts]
table, pre_install
scripts can be triggered by both pdm install
and pdm run install
. So it is recommended not to use these reserved names.
Note
Composite tasks can also have pre and post scripts. Called tasks will run their own pre and post scripts.
"},{"location":"usage/scripts/#skipping-scripts","title":"Skipping scripts","text":"Because, sometimes it is desirable to run a script but without its hooks or pre and post scripts, there is a --skip=:all
which will disable all hooks, pre and post. There is also --skip=:pre
and --skip=:post
allowing to respectively skip all pre_*
hooks and all post_*
hooks.
It is also possible to need a pre script but not the post one, or to need all tasks from a composite tasks except one. For those use cases, there is a finer grained --skip
parameter accepting a list of tasks or hooks name to exclude.
pdm run --skip pre_task1,task2 my-composite\n
This command will run the my-composite
task and skip the pre_task1
hook as well as the task2
and its hooks.
You can also provide your skip list in the PDM_SKIP_HOOKS
environment variable but it will be overridden as soon as the --skip
parameter is provided.
There are more details on hooks and pre/post script behavior on the dedicated hooks page.
"},{"location":"usage/template/","title":"Create Project From a Template","text":"Similar to yarn create
and npm create
, PDM also supports initializing or creating a project from a template. The template is given as a positional argument of pdm init
, in one of the following forms:
- pdm init flask - Initialize the project from the template https://github.com/pdm-project/template-flask
- pdm init https://github.com/frostming/pdm-template-flask - Initialize the project from a Git URL. Both HTTPS and SSH URLs are acceptable.
- pdm init django@v2 - Check out the specific branch or tag. Full Git URLs also support it.
- pdm init /path/to/template - Initialize the project from a template directory on the local filesystem.

And pdm init without a template argument will use the built-in default template.
The project will be initialized in the current directory, and existing files with the same name will be overwritten. You can also use the -p <path>
option to create a project at a new path.
According to the first form of the template argument, pdm init <name>
will refer to the template repository located at https://github.com/pdm-project/template-<name>
. To contribute a template, you can create a template repository and submit a request to transfer the ownership to the pdm-project
organization (the option can be found at the bottom of the repository settings page). The administrators of the organization will review the request and complete the subsequent steps. You will be added as the repository maintainer if the transfer is accepted.
A template repository must be a pyproject-based project, which contains a pyproject.toml
file with PEP-621 compliant metadata. No other special config files are required.
On initialization, the project name in the template will be replaced by the name of the new project. This is done by a recursive full-text search and replace. The import name, which is derived from the project name by replacing all non-alphanumeric characters with underscores and lowercasing, will also be replaced in the same way.
For example, if the project name is foo-project
in the template and you want to initialize a new project named bar-project
, the following replacements will be made:
- foo-project -> bar-project in all .md files and .rst files
- foo_project -> bar_project in all .py files
- foo_project -> bar_project in the directory name
- foo_project.py -> bar_project.py in the file name

Therefore, we don't support name replacement if the import name isn't derived from the project name.
"},{"location":"usage/template/#use-other-project-generators","title":"Use other project generators","text":"If you are seeking for a more powerful project generator, you can use cookiecutter via --cookiecutter
option and copier via --copier
option.
You need to install cookiecutter
and copier
respectively to use them. You can do this by running pdm self add <package>
. To use them:
pdm init --cookiecutter gh:cjolowicz/cookiecutter-hypermodern-python\n# or\npdm init --copier gh:pawamoy/copier-pdm --UNSAFE\n
"},{"location":"usage/venv/","title":"Working with Virtual Environments","text":"When you run pdm init
command, PDM will ask for the Python interpreter to use in the project, which is the base interpreter to install dependencies and run tasks.
Compared to PEP 582, virtual environments are considered more mature and have better support in the Python ecosystem as well as IDEs. Therefore, virtualenv is the default mode if not configured otherwise.
Virtual environments will be used if the project interpreter (the interpreter stored in .pdm-python
, which can be checked by pdm info
) is from a virtualenv.
By default, PDM prefers to use the virtualenv layout as other package managers do. When you run pdm install
the first time on a new PDM-managed project, whose Python interpreter is not decided yet, PDM will create a virtualenv in <project_root>/.venv
, and install dependencies into it. In the interactive session of pdm init
, PDM will also ask to create a virtualenv for you.
You can choose the backend used by PDM to create a virtualenv. Currently it supports three backends:
virtualenv
(default)venv
conda
You can change it by pdm config venv.backend [virtualenv|venv|conda]
.
You can create more than one virtualenv with whatever Python version you want.
# Create a virtualenv based on 3.8 interpreter\n$ pdm venv create 3.8\n# Assign a different name other than the version string\n$ pdm venv create --name for-test 3.8\n# Use venv as the backend to create, support 3 backends: virtualenv(default), venv, conda\n$ pdm venv create --with venv 3.9\n
"},{"location":"usage/venv/#the-location-of-virtualenvs","title":"The location of virtualenvs","text":"If no --name
is given, PDM will create the venv in <project_root>/.venv
. Otherwise, virtualenvs go to the location specified by the venv.location
configuration. They are named as <project_name>-<path_hash>-<name_or_python_version>
to avoid name collision. You can disable the in-project virtualenv creation by pdm config venv.in_project false
. And all virtualenvs will be created under venv.location
.
You can tell PDM to use a virtualenv you created in preceding steps, with pdm use
:
pdm use -f /path/to/venv\n
"},{"location":"usage/venv/#virtualenv-auto-detection","title":"Virtualenv auto-detection","text":"When no interpreter is stored in the project config or PDM_IGNORE_SAVED_PYTHON
env var is set, PDM will try to detect possible virtualenvs to use:
venv
, env
, .venv
directories in the project rootPDM_IGNORE_ACTIVE_VENV
is set$ pdm venv list\nVirtualenvs created with this project:\n\n- 3.8.6: C:\\Users\\Frost Ming\\AppData\\Local\\pdm\\pdm\\venvs\\test-project-8Sgn_62n-3.8.6\n- for-test: C:\\Users\\Frost Ming\\AppData\\Local\\pdm\\pdm\\venvs\\test-project-8Sgn_62n-for-test\n- 3.9.1: C:\\Users\\Frost Ming\\AppData\\Local\\pdm\\pdm\\venvs\\test-project-8Sgn_62n-3.9.1\n
"},{"location":"usage/venv/#show-the-path-or-python-interpreter-of-a-virtualenv","title":"Show the path or python interpreter of a virtualenv","text":"$ pdm venv --path for-test\n$ pdm venv --python for-test\n
"},{"location":"usage/venv/#remove-a-virtualenv","title":"Remove a virtualenv","text":"$ pdm venv remove for-test\nVirtualenvs created with this project:\nWill remove: C:\\Users\\Frost Ming\\AppData\\Local\\pdm\\pdm\\venvs\\test-project-8Sgn_62n-for-test, continue? [y/N]:y\nRemoved C:\\Users\\Frost Ming\\AppData\\Local\\pdm\\pdm\\venvs\\test-project-8Sgn_62n-for-test\n
"},{"location":"usage/venv/#activate-a-virtualenv","title":"Activate a virtualenv","text":"Instead of spawning a subshell like what pipenv
and poetry
do, pdm venv
doesn't create the shell for you but prints the activate command to the console. In this way you won't leave the current shell. You can then feed the output to eval
to activate the virtualenv:
$ eval $(pdm venv activate for-test)\n(test-project-for-test) $ # Virtualenv entered\n
$ eval (pdm venv activate for-test)\n
PS1> Invoke-Expression (pdm venv activate for-test)\n
Additionally, if the project interpreter is a venv Python, you can omit the name argument following activate.
Note
venv activate
does not switch the Python interpreter used by the project. It only changes the shell by injecting the virtualenv paths to environment variables. For the forementioned purpose, use the pdm use
command.
For more CLI usage, see the pdm venv
documentation.
Looking for pdm shell
?
PDM doesn't provide a shell
command because many fancy shell functions may not work perfectly in a subshell, which brings a maintenance burden to support all the corner cases. However, you can still gain the ability via the following ways:
pdm run $SHELL
, this will spawn a subshell with the environment variables set properly. The subshell can be quit with exit
or Ctrl+D
.pdm() {\nlocal command=$1\n\nif [[ \"$command\" == \"shell\" ]]; then\neval $(pdm venv activate)\nelse\ncommand pdm $@\nfi\n}\n
Copy and paste this function to your ~/.bashrc
file and restart your shell.
For fish
shell you can put the following into your ~/fish/config.fish
or in ~/.config/fish/config.fish
function pdm\n set cmd $argv[1]\n\n if test \"$cmd\" = \"shell\"\n eval (pdm venv activate)\n else\n command pdm $argv\n end\n end\n
Now you can run pdm shell
to activate the virtualenv. The virtualenv can be deactivated with deactivate
command as usual.
By default when you activate a virtualenv, the prompt will show: {project_name}-{python_version}
.
For example if your project is named test-project
:
$ eval $(pdm venv activate for-test)\n(test-project-3.10) $ # {project_name} == test-project and {python_version} == 3.10\n
The format can be customized before virtualenv creation with the venv.prompt
configuration or PDM_VENV_PROMPT
environment variable (before a pdm init
or pdm venv create
). Available variables are:
project_name
: name of your projectpython_version
: version of Python (used by the virtualenv)$ PDM_VENV_PROMPT='{project_name}-py{python_version}' pdm venv create --name test-prompt\n$ eval $(pdm venv activate test-prompt)\n(test-project-py3.10) $\n
"},{"location":"usage/venv/#run-a-command-in-a-virtual-environment-without-activating-it","title":"Run a command in a virtual environment without activating it","text":"# Run a script\n$ pdm run --venv test test\n# Install packages\n$ pdm sync --venv test\n# List the packages installed\n$ pdm list --venv test\n
There are other commands supporting --venv
flag or PDM_IN_VENV
environment variable, see the CLI reference. You should create the virtualenv with pdm venv create --name <name>
before using this feature.
By default, if you use pdm use
and select a non-venv Python, the project will be switched to PEP 582 mode. We also allow you to switch to a named virtual environment via the --venv
flag:
# Switch to a virtualenv named test\n$ pdm use --venv test\n# Switch to the in-project venv located at $PROJECT_ROOT/.venv\n$ pdm use --venv in-project\n
"},{"location":"usage/venv/#disable-virtualenv-mode","title":"Disable virtualenv mode","text":"You can disable the auto-creation and auto-detection for virtualenv by pdm config python.use_venv false
. If venv is disabled, PEP 582 mode will always be used even if the selected interpreter is from a virtualenv.
By default PDM will not include pip
in virtual environments. This increases isolation by ensuring that only your dependencies are installed in the virtual environment.
To install pip
once (if for example you want to install arbitrary dependencies in CI) you can run:
# Install pip in the virtual environment\n$ pdm run python -m ensurepip\n# Install arbitrary dependencies\n# These dependencies are not checked for conflicts against lockfile dependencies!\n$ pdm run python -m pip install coverage\n
Or you can create the virtual environment with --with-pip
:
$ pdm venv create --with-pip 3.9\n
See the ensurepip docs for more details on ensurepip
.
If you want to permanently configure PDM to include pip
in virtual environments you can use the venv.with_pip
configuration.
PDM, as described, is a modern Python package and dependency manager supporting the latest PEP standards. But it is more than a package manager. It boosts your development workflow in various aspects.
"},{"location":"#feature-highlights","title":"Feature highlights","text":"PDM requires Python 3.8+ to be installed. It works on multiple platforms including Windows, Linux and macOS.
Note
You can still have your project working on lower Python versions, read how to do it here.
"},{"location":"#recommended-installation-method","title":"Recommended installation method","text":"PDM requires python version 3.8 or higher.
Like Pip, PDM provides an installation script that will install PDM into an isolated environment.
Linux/MacWindowscurl -sSL https://pdm-project.org/install-pdm.py | python3 -\n
(Invoke-WebRequest -Uri https://pdm-project.org/install-pdm.py -UseBasicParsing).Content | py -\n
For security reasons, you should verify the checksum of install-pdm.py
. It can be downloaded from install-pdm.py.sha256.
For example, on Linux/Mac:
curl -sSLO https://pdm-project.org/install-pdm.py\ncurl -sSL https://pdm-project.org/install-pdm.py.sha256 | shasum -a 256 -c -\n# Run the installer\npython3 install-pdm.py [options]\n
The installer will install PDM into the user site and the location depends on the system:
$HOME/.local/bin
for Unix%APPDATA%\\Python\\Scripts
on WindowsYou can pass additional options to the script to control how PDM is installed:
usage: install-pdm.py [-h] [-v VERSION] [--prerelease] [--remove] [-p PATH] [-d DEP]\n\noptional arguments:\n -h, --help show this help message and exit\n -v VERSION, --version VERSION | envvar: PDM_VERSION\n Specify the version to be installed, or HEAD to install from the main branch\n --prerelease | envvar: PDM_PRERELEASE Allow prereleases to be installed\n --remove | envvar: PDM_REMOVE Remove the PDM installation\n -p PATH, --path PATH | envvar: PDM_HOME Specify the location to install PDM\n -d DEP, --dep DEP | envvar: PDM_DEPS Specify additional dependencies, can be given multiple times\n
You can either pass the options after the script or set the env var value.
"},{"location":"#other-installation-methods","title":"Other installation methods","text":"HomebrewScooppipxpipasdfinside projectbrew install pdm\n
scoop bucket add frostming https://github.com/frostming/scoop-frostming.git\nscoop install pdm\n
pipx install pdm\n
Install the head version of GitHub repository. Make sure you have installed Git LFS on your system.
pipx install git+https://github.com/pdm-project/pdm.git@main#egg=pdm\n
To install PDM with all features:
pipx install pdm[all]\n
See also: https://pypa.github.io/pipx/
pip install --user pdm\n
Assuming you have asdf installed.
asdf plugin add pdm\nasdf local pdm latest\nasdf install pdm\n
By copying the Pyprojectx wrapper scripts to a project, you can install PDM as (npm-style) dev dependency inside that project. This allows different projects/branches to use different PDM versions.
To initialize a new or existing project, cd into the project folder and:
Linux/MacWindowscurl -LO https://github.com/pyprojectx/pyprojectx/releases/latest/download/wrappers.zip && unzip wrappers.zip && rm -f wrappers.zip\n./pw --init pdm\n
Invoke-WebRequest https://github.com/pyprojectx/pyprojectx/releases/latest/download/wrappers.zip -OutFile wrappers.zip; Expand-Archive -Path wrappers.zip -DestinationPath .; Remove-Item -Path wrappers.zip\n.\\pw --init pdm\n
When installing pdm with this method, you need to run all pdm
commands through the pw
wrapper:
./pw pdm install\n
"},{"location":"#update-the-pdm-version","title":"Update the PDM version","text":"pdm self update\n
"},{"location":"#packaging-status","title":"Packaging Status","text":""},{"location":"#shell-completion","title":"Shell Completion","text":"PDM supports generating completion scripts for Bash, Zsh, Fish or Powershell. Here are some common locations for each shell:
BashZshFishPowershellpdm completion bash > /etc/bash_completion.d/pdm.bash-completion\n
# Make sure ~/.zfunc is added to fpath, before compinit.\npdm completion zsh > ~/.zfunc/_pdm\n
Oh-My-Zsh:
mkdir $ZSH_CUSTOM/plugins/pdm\npdm completion zsh > $ZSH_CUSTOM/plugins/pdm/_pdm\n
Then make sure pdm plugin is enabled in ~/.zshrc
pdm completion fish > ~/.config/fish/completions/pdm.fish\n
# Create a directory to store completion scripts\nmkdir $PROFILE\\..\\Completions\necho @'\nGet-ChildItem \"$PROFILE\\..\\Completions\\\" | ForEach-Object {\n . $_.FullName\n}\n'@ | Out-File -Append -Encoding utf8 $PROFILE\n# Generate script\nSet-ExecutionPolicy Unrestricted -Scope CurrentUser\npdm completion powershell | Out-File -Encoding utf8 $PROFILE\\..\\Completions\\pdm_completion.ps1\n
"},{"location":"#virtualenv-and-pep-582","title":"Virtualenv and PEP 582","text":"PDM offers experimental support for PEP 582 as an opt-in feature, in addition to virtualenv management. Although the Python Steering Council has rejected PEP 582, you can still test it out using PDM.
To learn more about the two modes, refer to the relevant chapters on Working with virtualenv and Working with PEP 582.
"},{"location":"#pdm-eco-system","title":"PDM Eco-system","text":"Awesome PDM is a curated list of awesome PDM plugins and resources.
"},{"location":"#sponsors","title":"Sponsors","text":""},{"location":"dev/benchmark/","title":"Benchmark","text":"This page has been removed, please visit https://lincolnloop.github.io/python-package-manager-shootout/ for a detailed benchmark report.
"},{"location":"dev/changelog/","title":"Changelog","text":"Attention
Major and minor releases also include changes listed within prior beta releases.
"},{"location":"dev/changelog/#release-v2112-2024-01-02","title":"Release v2.11.2 (2024-01-02)","text":""},{"location":"dev/changelog/#bug-fixes","title":"Bug Fixes","text":"pdm update --update-eager
can hit InconsistentCandidate error when dependency is included both through default dependencies and extra. #2495pdm install
should not warn when overwriting its own symlinks on install
/update
. #2502pdm export
. #1910--skip-existing
to pdm publish
to ignore the uploading error if the package already exists. #2362==major.minor.*
as default requires python for application projects. #2382package-type
field in the tool.pdm
table to differentiate between library and application projects. #2394pdm lock
now supports --update-reuse
option to keep the pinned versions in the lockfile if possible. #2419inherit_metadata
to inherit and merge markers from parent requirements. This is enabled by default when creating a new lockfile. #2421symlink_individual
for creating a symlink for each individual package file and hardlink
for creating hardlinks. #2425reuse-installed
. When this strategy is enabled, PDM will try to reuse the versions already installed in the environment, even if the package names are given in the command line following add
or update
. This strategy is supported by add
, update
and lock
commands. #2479PDM_CACHE_DIR
environment variable to configure cache directory location. #2485pdm init -n
. #2436pdm init
now implies --lib
if --backend
is passed. #2437install.cache_method = \"symlink\"
. #2466KeyError
raised by pdm update --unconstrained
when the project itself is listed as a dependency. #2483-r
requirements paths relative to the requirement file they are specified in #2422pdm publish
fails with HTTP error. #2400__init__.py
contains an unusual line. #2378pdm init
being read-only when copied from a read-only PDM installation. #2379export
command. #2390--extra--index-url
#2342include_packages
and exclude_packages
config under tool.pdm.source
table. #1645requires-python
range. And provide a way to ignore them per-package. #2304-q/--quiet
option to suppress some warnings printed to the console. This option is mutually exclusive with -v/--verbose
. #2304--strategy/-S
option for lock
command, to specify one or more strategy flags for resolving dependencies. --static-urls
and --no-cross-platform
are deprecated at the same time. #2310pdm.cli.commands.venv.backend.Backend._ensure_clean
to empty the .venv
folder instead of deleting it. #2282--venv
option. #2314--no-default
is requested. #2230--no-isolated
does. #2071--no-lock
option doesn't work as expected. Also support --no-lock
option for add
, remove
and update
commands. #2245findpython
to find pythons with the spec given by the user. #2225virtualenv
python.exe
binary under bin/
as well as Scripts/
and the virtualenv
/conda
root. #2236${PROJECT_ROOT}
variable in the lockfile. #2240pdm run
should only find local file if the command starts with ./
. #2221--overwrite
option to pdm init
to overwrite existing files(default False). #2163pdm list
command. Add --tree
as an alias and preferred name of --graph
option. #2165pdm run
to run a script with the relative or absolute path. #2217@ file://
dependencies can not be updated. #2169requires-python
cause PDM to crash. #2175comarable_version(\"1.2.3+local1\") == Version(\"1.2.3\")
. #2182pdm-pep517
. #2167sitecustomize.py
. #2139keyring
, copier
, cookiecutter
, template
, truststore
dependency groups. #2109PDM_PROJECT
for -p/--project
option. #2126pyproject.toml
if both --unconstrained
and --dry-run
are passed to pdm update
. #2125build-system
table when importing from other package manager. #2126unearth
to 0.10.0 #2113pdm install
. #2086*_lock
hooks are always emitted with dry_run=True in pdm update
. #2060pdm install --plugins
can't install self. #2062cookiecutter
and copier
as project generator. #2059pdm init
now accepts a template argument to initialize project from a built-in or Git template. #2053DeprecationWarning
with FutureWarning
for better exposure. #2012install-pdm.py
and its checksum file on the docs site. #2026--edit/-e
to pdm config
to edit the config file in default editor. #2028--project
option to pdm venv
to support another path as the project root. #2042truststore
as the SSL backend. This only works on Python 3.10 or newer. #2049pdm self list
. #2018url
field when converting requirements from a Pipfile-style file requirement. #2032No significant changes.
"},{"location":"dev/changelog/#release-v273-2023-06-13","title":"Release v2.7.3 (2023-06-13)","text":""},{"location":"dev/changelog/#bug-fixes_17","title":"Bug Fixes","text":"parser
argument is passed to BaseCommand.__init__()
method. #2007pdm list
. #1973pdm init -n
doesn't respect the --python
option. #1984setup.py
if it prints something to stdout. #1995install-pdm.py
. #1996PDM_PYPI_USERNAME
and PDM_PYPI_PASSWORD
when there are no defaults in config. #1961repository.custom.verify_ssl
config option as well as new command line argument of publish
command. #1928PATH
env var. #1944ResourceWarning
s when running the test suite with warnings enabled. #1915tool.poetry.build
doesn't exist. #1935pdm import
clobbers build-system.requires
value in pyproject.toml
. #1948pdm sync
instead of pdm install --no-lock
. #1947PATH
env var isn't set correctly when running under non-isolation mode. #1904tool.pdm.plugins
setting. #1461--json
flag to both run
and info
command allowing to dump scripts and infos as JSON. #1854_
) as internal tasks and hide them from the listing. #1855pdm init -n
(non-interactive mode), a venv will be created by default. Previously, the selected Python will be used under PEP 582 mode. #1862--no-cross-platform
to pdm lock
to create a non-cross-platform lockfile. #1898--venv
option descriptions in zsh completion script. #1847package
and package[extra]
. #1851FileNotFoundError
if the requirement path is not found. #1875No significant changes.
"},{"location":"dev/changelog/#release-v254-2023-05-05","title":"Release v2.5.4 (2023-05-05)","text":""},{"location":"dev/changelog/#bug-fixes_24","title":"Bug Fixes","text":"<2.0
to avoid incompatibility with cachecontrol
. #1886markdown-exec
to 1.5.0
for rendering TOC in CLI reference page. #1836PDM_USE_VENV
as PDM_IN_VENV
for --venv
flag as it mistakenly override another existing env var. #1829pdm --pep582
raises an argument error. #1823resolution.respect-source-order
is enabled, sources are lazily evaluated. This means that if a match is found on the first source, the remaining sources will not be requested. #1509--venv <venv>
to run a command in the virtual environment with the given name. #1705PDM_PREFER_BINARY
environment variable. #1817pdm lock
. #1796environment.is_global
property. #1814pdm init -p <dir>
if the target directory is not created yet. #1822pdm-backend
. #1684pdm.toml
which can be committed to the VCS. #1742Environment
is renamed to PythonLocalEnvironment
and GlobalEnvironment
is renamed to PythonEnvironment
. Move pdm.models.environment
module to pdm.environments
package. #1791unearth
to 0.8 to allow calling keyring from CLI. #1653venv
command to show the path or the python interpreter for a managed venv. #1680--lib
option to init
command to create a library project without prompting. #1708pdm fix
to migrate to the new PDM features. Add a hint when invoking PDM commands. #1743.pdm-python
in project root .gitignore
when running pdm init
. #1749PDM_IGNORE_ACTIVE_VENV
env var. #1782pre_invoke
to emit before any command is invoked. #1792pdm export
due to non-deterministic order of group iteration. #1786pdm show --version
#1788installer
to 0.7.0
and emit a warning if the RECORD validation fails. #1784pdm export
output doesn't include the extras of the dependencies. #1767pdm export
. #1730venv.prompt
configuration when using conda
as the backend. #1734.
with -
when normalizing package name. #1745pdm venv activate
without specifying env_name
to activate in project venv created by conda #1735ruff
as the linter. #1715asdf
. #1725requires-python
doesn't work for all dependencies. #1690pdm-pep517
instead of setuptools
. #1658importlib.resources
. #1660pdm run
. #1652pdm config
. #1622python
. #1626packaging>=22
. #1619subdirectory
attribute to the lockfile entry. #1630pyproject.toml
. #1310pytest
plugin pdm.pytest
for plugin developers. #1594pdm.lock
with an @generated
comment. #1611sitecustomize
to the home directory if it exists in the filesystem(not packed in a zipapp). #1572build-system.requires
, since build
and hatch
both support it. Be aware it is not allowed in the standard. #1560packaging 22.0
. #1562__file__
usages with importlib.resources
, to make PDM usable in a zipapp. #1567package==22.0
from the dependencies to avoid some breakages to the end users. #1568pdm.pep517
as the metadata transformer for unknown custom build backends. #1546installer
to 0.6.0
. #1550unearth
to 0.6.3
and test against packaging==22.0
. #1555pdm use
. #1542tool.pdm.overrides
table to tool.pdm.resolution.overrides
. The old name is deprecated at the same time. #1503--backend
option to pdm init
command, users can choose a favorite backend from setuptools
, flit
, hatchling
and pdm-pep517
(default), since they all support PEP 621 standards. #1504{args[:default]}
placeholder. #1507python.use_venv=False
#1508findpython
installed. #1516install.cache
set to true
and caching method is pth
. #863pdm-pep517
. #1504pep517
with pyproject-hooks
because of the rename. #1528setup.py
format, users are encouraged to migrate to the PEP 621 metadata. #1504sitecustomize.py
respect the PDM_PROJECT_MAX_DEPTH
environment variable #1471python_version
in the environment marker. When the version contains only one digit, the result was incorrect. #1484venv.prompt
configuration to allow customizing prompt when a virtualenv is activated #1332ca_certs
or from the command line via pdm publish --ca-certs <path> ...
. #1392plugin
command to self
, and it can not only manage plugins but also all dependencies. Add a subcommand self update
to update PDM itself. #1406pdm init
to receive a Python path or version via --python
option. #1412requires-python
when importing from other formats. #1426pdm
instead of pip
to resolve and install build requirements. So that PDM configurations can control the process. #1429pdm config
command. #1450pdm lock --check
flag to validate whether the lock is up to date. #1459pip
when creating a new venv. #1463pdm list
command with new formats: --csv,--markdown
and add options --fields,--sort
to control the output contents. Users can also include licenses
in the --fields
option to display the package licenses. #1469pdm lock --check
in pre-commit. #1471--project
argument. #1220pypi.[ca,client]_cert[s]
config items are passed to distribution builder install steps to allow for custom PyPI index sources with self signed certificates. #1396pdm init
. #1410python*
command in pdm run
. #1414==1.*
). #1465importlib-metadata
from PyPI for Python < 3.10. #1467pypi.[ca,client]_cert[s]
config items are passed to distribution builder install steps to allow for custom PyPI index sources with self signed certificates. #1396pdm init
. #1410pdm lock --refresh
if some packages has URLs. #1361file://
links the first time they are added. #1325data-requires-python
when parsing package links. #1334editables
package isn't installed for self package. #1344setup-script
, run-setuptools
, and is-purelib
. #1327pdm run
. #1312post_lock
for add
and update
operations, to ensure the pyproject.toml
is updated before the hook is run. #1320add
or update
command. #1287packaging
dependency to ensure that packaging.utils.parse_wheel_filename
is available. #1293pypi.ca_certs
config entry. #1240pdm export
to available pre-commit hooks. #1279summary
field in pdm.lock
contains the description
from the package's pyproject.toml
. #1274pdm show
for a package that is only available as source distribution. #1276pip
. #1268pdm
module, it will be removed in the future. #1282python
executable in the PATH
. #1255metadata.files
in pdm.lock
. #1256[metada.files]
table of the lock file. #1259env_file
variables no longer override existing environment variables. #1235<this_package_name>[group1, group2]
#1241requires-python
when creating the default venv. #1237PYTHONPATH
. #1211pwsh
as an alias of powershell
for shell completion. #1216zsh
completion regarding --pep582
flag. #12184.0
. #1203unearth
to fix a bug that install links with weak hashes are skipped. This often happens on self-hosted PyPI servers. #1202pdm venv
commands into the main program. Make PEP 582 an opt-in feature. #1162global_project.fallback_verbose
defaulting to True
. When set to False
disables message Project is not found, fallback to the global project
#1188--only-keep
option to pdm sync
to keep only selected packages. Originally requested at #398. #1191unearth
to 0.4.1
to skip the wheels with invalid version parts. #1178PDM_RESOLVE_MAX_ROUNDS
environment variable (was spelled \u2026ROUDNS
before). #1180--no-clean
option from pdm sync
command. #1191[project]
table is not allowed, according to PEP 621. They are however still allowed in the [tool.pdm.dev-dependencies]
table. PDM will emit a warning when it finds editable dependencies in the [project]
table, or will abort when you try to add them into the [project]
table via CLI. #1083setup.py
project. #1062rich
. #1091-v
option. #1096unearth
to replace pip
's PackageFinder
and related data models. PDM no longer relies on pip
internals, which are unstable across updates. #1096find_matches()
to speed up the resolution. #1098publish
to PDM since it is required for so many people and it will make the workflow easier. #1107composite
script kind allowing to run multiple defined scripts in a single command as well as reusing scripts but overriding env
or env_file
. #1117--skip
to opt-out some scripts and hooks from any execution (both scripts and PDM commands). #1127pre/post_publish
, pre/post_run
and pre/post_script
hooks as well as an extensive lifecycle and hooks documentation. #1147[tool.pdm.build]
, according to pdm-pep517 1.0.0
. At the same time, warnings will be shown against old usages. #1153pyproject.toml
rather than building it. #1156[tool.pdm.build]
table. #1157post_use
hook triggered after successfully switching Python version. #1163respect-source-order
under [tool.pdm.resolution]
to respect the source order in the pyproject.toml
file. Packages will be returned by source earlier in the order or later ones if not found. #593tomllib
on Python 3.11 #1072click
, halo
, colorama
and log_symbols
. PDM has no vendors now. #1091pdm-pep517
to 1.0.0
. #1153pdm 0.x
) is no longer supported. #1157tox.ini
file for easier local testing against all Python versions. #1160venv
scheme for prefix
kind install scheme. #1158skip-add-to-path
option to installer in order to prevent changing PATH
. Replace bin
variable name with bin_dir
. #1145[tool.poetry.build]
config table. #1131pdm
process and not to the process actually being run. #1095version
in the cache key of the locked candidates if they are from a URL requirement. #1099requires-python
pre-release versions caused pdm update
to fail with InvalidPyVersion
. #1111setup.cfg
or setup.py
. #1101pdm update
command. #1104venv
install scheme when available. This scheme is more stable than posix_prefix
scheme since the latter is often patched by distributions. #1106pdm.lock
by --lockfile
option or PDM_LOCKFILE
env var. #1038pyproject.toml
when running pdm add --no-editable <package>
. #1050get_sysconfig_path.py
script. #1056${PROJECT_ROOT}
variable in the result of export
command. #1079pdm init
and create default README for libraries. #1041requirements.txt
. #1036pdm use
error. #1039optional
key when converting from Poetry's dependency entries. #1042--no-editable
is passed. pdm add --no-editable
will now override the editable
mode of the given packages. #1011pdm lock --refresh
. #1019${PROJECT_ROOT}
in the output of pdm list
. #1004installer 0.5.x
. #1002license
field to \"None\". #991pdm search
command. #993~/.local
) with global_project.user_site
config. #885auto_global
to global_project.fallback
and deprecate the old name. #986show
command. #966_.site_packages
is overridden by default option value. #985pdm-pep517
to support PEP 639. #959pythonfinder
to findpython
as the Python version finder. #930pre_*
and post_*
scripts for task composition. Pre- and Post- scripts for init
, build
, install
and lock
will be run if present. #789--config/-c
option to specify another global configuration file. #883[tool.pdm.overrides]
table. #909use_venv
to python.use_venv
; rename config feature.install_cache
to install.cache
; rename config feature.install_cache_method
to install.cache_method
; rename config parallel_install
to install.parallel
. #914[tool.pdm.overrides]
table. #861requires-python
should be produced if ANY(*
) is given. #917pdm.lock
gets created when --dry-run
is passed to pdm add
. #918path
. #904ExtrasError
to ExtrasWarning
for better understanding. Improve the warning message. #892Candidate
into a new class PreparedCandidate
. Candidate
no longer holds an Environment
instance. #920pip>=22.0
. #875pdm run
, it will run the Python REPL. #856direct_url.json
for a local pre-built wheel. #861pip<22.0
. #874network
marker. #858-
unexpectedly. #853use
command to save the human effort. And introduce an -i
option to ignored that remembered value. #846RECORD
. #847ModuleNotFoundError
during uninstall when the modules required are removed. #850pdm update
even if --no-sync
is passed. #837feature.install_cache_method
config. #822lock --refresh
to update the hash stored with the lock file without updating the pinned versions. #642[tool.pdm.overrides]
table. #790post_init
, pre_lock
, post_lock
, pre_install
and post_install
. #798install --check
to check if the lock file is up to date. #810allow_prereleases
setting. Now non-named requirements are resolved earlier than pinned requirements. #799atoml
to tomlkit
as the style-preserving TOML parser. The latter has supported TOML v1.0.0. #809minimum
, without upper bounds. #787sysconfig
to return the PEP 582 scheme in pdm run
. #784--pre/--prelease
option for pdm add
and pdm update
. It will allow prereleases to be pinned. #774git+https
candidates cannot be resolved. #771version
from [project]
table to [tool.pdm]
table, delete classifiers
from dynamic
, and warn usage about the deprecated usages. #748x >= VERSION
when adding dependencies. #752pdm list --freeze
to fix a bug due to Pip's API change. #533requires-python
. #744-s/--section
option from all previously supported commands. Use -G/--group
instead. #756importlib
to replace imp
in the sitecustomize
module for Python 3. #574pdm export
. #741-s/--site-packages
to pdm run
as well as a script config item. When it is set to True
, site-packages from the selected interpreter will be loaded into the running environment. #733NO_SITE_PACKAGES
isn't set in pdm run
if the executable is out of local packages. #733pdm run
, but keep them seen when PEP 582 is enabled. #708pip
with --isolated
when building wheels. In this way some env vars like PIP_REQUIRE_VIRTUALENV
can be ignored. #669pip
is not DEBUNDLED. #685summary
is None
, the lockfile can't be generated. #719${PROJECT_ROOT}
should be written in the URL when relative path is given. #721pdm import
can't merge the settings correctly. #723--no-sync
option to update
command. #684find_links
source type. It can be specified via type
key of [[tool.pdm.source]]
table. #694--dry-run
option to add
, install
and remove
commands. #698project.core.ui.display_columns
), fixing unnecessary wrapping due to / with empty lines full of spaces in case of long URLs in the last column. #680check_update
is boolean. #689setup_dev.py
in favor of pip install
. #676requires-python
is not recognized in candidates evaluation. #657installer
to 0.3.0
, fixing a bug that broke installation of some packages with unusual wheel files. #653packaging
and typing-extensions
to direct dependencies. #674requires-python
now participates in the resolution as a dummy requirement. #658--no-isolation
option for install
, lock
, update
, remove
, sync
commands. #640project_max_depth
configurable and default to 5
. #643pdm-pep517
backend on Python 2.7 when installing self as editable. #640*-nspkg.pth
files in install_cache
mode. It will still work without them. #623-r/--reinstall
option to sync
command to force re-install the existing dependencies. #601pdm cache clear
can clear cached installations if not needed any more. #604setuptools
won't see the dependencies under local packages. #601direct_url.json
when installing wheels. #607*
fails to be converted as SpecifierSet
. #609--json
to the list command which outputs the dependency graph as a JSON document. #583feature.install_cache
. When it is turned on, wheels will be installed into a centralized package repo and create .pth
files under project packages directory to link to the cached package. #589pdm show
. #580~/.pyenv/shims/python3
as the pyenv interpreter. #590-s/--section
option in favor of -G/--group
. #591pdm/installers/installers.py
is renamed to pdm/installers/manager.py
to be more accurate. The Installer
class under that file is renamed to InstallerManager
and is exposed in the pdm.core.Core
object for overriding. The new pdm/installers/installers.py
contains some installation implementations. #589pkg_resources.Distribution
to the implementation of importlib.metadata
. #592distlib
. #519--<field-name>
options in pdm show. When no package is given, show this project. #527--freeze
option to pdm list
command which shows the dependencies list as pip's requirements.txt format. #531PYTHONPATH
. #522pdm-pep517
to 0.8.0
. #524toml
to tomli
. #541plugin
to manage pdm plugins, including add
, remove
and list
commands. #510resolvelib
any more. This makes PDM more stable across updates of sub-dependencies. #515-u/--unconstrained
to support unconstraining version specifiers when adding packages. #501No significant changes.
"},{"location":"dev/changelog/#release-v161-2021-05-31","title":"Release v1.6.1 (2021-05-31)","text":"No significant changes.
"},{"location":"dev/changelog/#release-v160-2021-05-31","title":"Release v1.6.0 (2021-05-31)","text":""},{"location":"dev/changelog/#features-improvements_57","title":"Features & Improvements","text":"pdm export
no longer produces requirements file applicable for all platforms due to the new approach. #456--no-editable
option to install non-editable versions of all packages. #443--no-self
option to prevent the project itself from being installed. #444.gitignore
file in the __pypackages__
directory. #446PDM_PROJECT_ROOT
env var. Change to the project root when executing scripts. #470tomlkit
to atoml
as the style-preserving TOML parser and writer. #465--dev
flag for older versions of PDM. #444--config-setting
. #452keyring
as a dependency and guide users to install it when it is not available. #442distlib
. #447pdm.cli.actions
#428pdm use
with no argument given, which will list all available pythons for pick. #409setup.py
failed for NameError. #407install
and sync
commands. Add a new option --prod/--production
to exclude them. Improve the dependency selection logic to be more convenient to use \u2014 the more common the usage is, the shorter the command is. #391source-includes
to mark files to be included only in sdist builds. #390pdm-pep517
to 0.7.0
; update resolvelib
to0.7.0
. #390-d/--dev
option in install
and sync
commands. #391pdm run
on a directory not initialized yet.resolvelib
to 0.6.0
. #381pdm.models.readers
to improve typing support #321project.python_executable
to project.python
that contains all info of the interpreter. #382:all
given to -s/--section
to refer to all sections under the same species. Adjust add
, sync
, install
, remove
and update
to support the new dev-dependencies
groups. Old behavior will be kept the same. #351dev-dependencies
is now a table of dependencies groups, where key is the group name and value is an array of dependencies. These dependencies won't appear in the distribution's metadata. dev-depedencies
of the old format will turn into dev
group under dev-dependencies
. #351dev-dependencies
, includes
, excludes
and package-dir
out from [project]
table to [tool.pdm]
table. The migration will be done automatically if old format is detected. #351--dry-run
option for update
command to display packages that need update, install or removal. Add --top
option to limit to top level packages only. #358init
command via -n/--non-interactive
option. No question will be asked in this mode. #368pdm info
, also add an option --packages
to show that value only. #372<script>-X.Y
variant to the bin folder. #365-g/--global
that was deprecated in 1.4.0
. One should use -g -p <project_path>
for that purpose. #361pdm init
#352pdm-pep517
to 0.6.1
. #353type
argument to pdm cache clear
and improve its UI. #343entry-points
. #344models.project_info.ProjectInfo
, which indexes distlib.metadata._data
#335pdm.plugins
to pdm
. Export some useful objects and models for shorter import path. #318cmd
in tools.pdm.scripts
configuration items now allows specifying an argument array instead of a string.stream
singleton, improve the UI related code. #320cache
command, add list
, remove
and info
subcommands. #329pyproject.toml
. #308pdm.iostream
to improve 'typing' support #301specifiers.py
to a separated module. #303setup.py
has no intall_requires
key. #299pdm init
fails when pyproject.toml
exists but has no [project]
section. #295-I/--ignore-python
passed or PDM_IGNORE_SAVED_PYTHON=1
, ignore the interpreter set in .pdm.toml
and don't save to it afterwards. #283-p/--project
is introduced to specify another path for the project base. It can also be combined with -g/--global
option. The latter is changed to a flag only option that does not accept values. #286-f setuppy
for pdm export
to export the metadata as setup.py #289src
directory can't be uninstalled correctly. #277pdm sync
or pdm install
is not present in the error message. #274requires-python
attribute when fetching the candidates of a package. #264egg-info
directory when dependencies change. So that pdm list --graph
won't show invalid entries. #240requirements.txt
file, build the package to find the name if not given in the URL. #245name
and version
if not. #253packaging
. #130sections
value of a pinned candidate to be reused. #234>
, >=
, <
, <=
to combine with star versions. #254--save-compatible
slightly. Now the version specifier saved is using the REAL compatible operator ~=
as described in PEP 440. Before: requests<3.0.0,>=2.19.1
, After: requests~=2.19
. The new specifier accepts requests==2.19.0
as compatible version. #225${PROJECT_ROOT}
in the dependency specification can be expanded to refer to the project root in pyproject.toml. The environment variables will be kept as they are in the lock file. #226PYTHONPATH
(with python -I
mode) when executing pip commands. #231pip 21.0
. #235pyproject.toml
.pdm use <path-to-python-root>
. #221PYTHONPATH
manipulation under Windows platform. #215/search
endpoint is not available on given index. #211Poetry
, Pipfile
, flit
) can also be imported as PEP 621 metadata. #175pdm search
to query the /search
HTTP endpoint. #195classifiers
dynamic in pyproject.toml
template for autogeneration. #209is_subset()
returns incorrect result. #206pdm-pep517
to <0.3.0
, this is the last version to support legacy project metadata format.[metadata.files]
table. #196pip-shims
package as a dependency. #132pdm --pep582
can enable PEP 582 globally by manipulating the WinReg. #191__pypackages__
into PATH
env var during pdm run
. #193pdm run
:-s/--site-packages
to include system site-packages when running. #178setuptools
is installed before invoking editable install script. #174wheel
not wheels
for global projects #182sitecustomize.py
instead of a .pth
file to enable PEP 582. Thanks @Aloxaf. Update get_package_finder()
to be compatible with pip 20.3
. #185pyproject.toml
.[tool.pdm.scripts]
section.pdm run --list/-l
to show the list of script shortcuts. #168pdm install
. #169build-system.requires
anymore. #167build
to a home-grown version. #162LogWrapper
. #164is_subset
and is_superset
may return wrong result when wildcard excludes overlaps with the upper bound. #165pycomplete
. #159sitecustomize.py
incorrectly gets injected into the editable console scripts. #158find_matched()
is exhausted when accessed twice. #149pdm-pep517
to 0.2.0
that supports reading version from SCM. #146wheel==0.35
. #135pdm export
fails when the project doesn't have name
property. #126pip
to 20.1
. #125export
to export to alternative formats. #117resolvelib
0.4.0. #118resolvelib 0.3.0
. #116show
command to show package metadata. #114setuptools
to be installed in the isolated environment.pdm use
. #96python_requires
when initializing project. #89wheel
package is available before building packages. #90pythonfinder
, python-cfonts
, pip-shims
and many others. Drop dependency vistir
. #89pdm build
. #81pdm import
to import project metadata from Pipfile
, poetry
, flit
, requirements.txt
. #79pdm init
and pdm install
will auto-detect possible files that can be imported.package_dir
is mapped. #81pdm init
will use the current directory rather than finding the parents when global project is not activated.plugins
to entry_points
click
to argparse
, for better extensibility. #73-g/--global
to manage global project. The default location is at ~/.pdm/global-project
.-p/--project
to select project root other than the default one. #30pdm config del
to delete an existing config item. #71pdm init
. #674.0
as infinite upper bound when checking subsetting. #66ImpossiblePySpec
's hash clashes with normal one.pdm config
to inspect configurations. #26pdm cache clear
to clean caches. #63--python
option in pdm init
. #49python_requires
when initializing and defaults to >={current_version}
. #50setup.py
.pdm --help
. #42python-cfonts
to display banner. #42_editable_intall.py
compatible with Py2.pdm list --graph
to show a dependency graph of the working set. #10pdm update --unconstrained
to ignore the version constraint of given packages. #13pdm install
. #33pdm info
to show project environment information. #9pip
to 20.0
, update pip_shims
to 0.5.0
. #28setup_dev.py
for the convenience to setup pdm for development. #29pdm init
to bootstrap a project.pdm build
command.pdm init
to bootstrap a project. First off, thanks for taking the time to contribute! Contributions include but are not restricted to:
The following is a set of guidelines for contributing.
"},{"location":"dev/contributing/#a-recommended-flow-of-contributing-to-an-open-source-project","title":"A recommended flow of contributing to an Open Source project","text":"This section is for beginners to OSS. If you are an experienced OSS developer, you can skip this section.
git clone https://github.com/pdm-project/pdm.git\n# Or if you prefer SSH clone:\ngit clone git@github.com:pdm-project/pdm.git\n
git remote add fork https://github.com/yourname/pdm.git\ngit fetch fork\n
where fork
is the remote name of the fork repository. ProTips:
To keep your local main branch up to date:
git pull origin main\n# In rare cases that your local main branch diverges from the remote main:\ngit fetch origin && git reset --hard origin/main\n
We recommend working in a virtual environment. Feel free to create a virtual environment with either the venv
module or the virtualenv
tool. For example:
python -m venv .venv\n. .venv/bin/activate # linux\n.venv/Scripts/activate # windows\n
Make sure your pip
is newer than 21.3
to install PDM in develop/editable mode.
python -m pip install -U \"pip>=21.3\"\npython -m pip install -e .\n
Make sure PDM uses the virtual environment you just created:
pdm config -l python.use_venv true\npdm config -l venv.in_project true\n
Install PDM development dependencies:
pdm install\n
Now, all dependencies are installed into the Python environment you chose, which will be used for development after this point.
"},{"location":"dev/contributing/#run-tests","title":"Run tests","text":"pdm run test\n
The test suite is still simple and needs expansion! Please help write more test cases.
Note
You can also run your test suite against all supported Python versions using tox
with the tox-pdm
plugin. You can either run it by yourself with:
tox\n
or from pdm
with:
pdm run tox\n
"},{"location":"dev/contributing/#code-style","title":"Code style","text":"PDM uses pre-commit
for linting. Install pre-commit
first, for example with pip or pipx:
python -m pip install pre-commit\n
pipx install pre-commit\n
Then initialize pre-commit
:
pre-commit install\n
You can now lint the code with:
pdm run lint\n
PDM uses black
for code style and isort
for sorting import statements. If you are not following them, the CI will fail and your Pull Request will not be merged.
When you make changes such as fixing a bug or adding a feature, you must add a news fragment describing your change. News fragments are placed in the news/
directory, and should be named according to this pattern: <issue_num>.<issue_type>.md
(e.g., 566.bugfix.md
).
feature
: Features and improvementsbugfix
: Bug fixesrefactor
: Code restructuresdoc
: Added or improved documentationdep
: Changes to dependenciesremoval
: Removals or deprecations in the APImisc
: Miscellaneous changes that don't fit any of the other categoriesThe contents of the file should be a single sentence in the imperative mood that describes your changes. (e.g., Deduplicate the plugins list.
) See entries in the Change Log for more examples.
If you make some changes to the docs/
and you want to preview the build result, simply do:
pdm run doc\n
"},{"location":"dev/contributing/#release","title":"Release","text":"Once all changes are done and ready to release, you can preview the changelog contents by running:
pdm run release --dry-run\n
Make sure the next version and the changelog are as expected in the output.
Then cut a release on the main branch:
pdm run release\n
GitHub action will create the release and upload the distributions to PyPI.
Read more options about version bumping by pdm run release --help
.
Some reusable fixtures for pytest
.
New in version 2.4.0
To enable them in your test, add pdm.pytest
as a plugin. You can do so in your root conftest.py
:
# single plugin\npytest_plugins = \"pdm.pytest\"\n\n# many plugins\npytest_plugins = [\n ...\n \"pdm.pytest\",\n ...\n]\n
"},{"location":"dev/fixtures/#pdm.pytest.IndexMap","title":"IndexMap = Dict[str, Path]
module-attribute
","text":"Path some root-relative http paths to some local paths
"},{"location":"dev/fixtures/#pdm.pytest.IndexOverrides","title":"IndexOverrides = Dict[str, str]
module-attribute
","text":"PyPI indexes overrides fixture format
"},{"location":"dev/fixtures/#pdm.pytest.IndexesDefinition","title":"IndexesDefinition = Dict[str, Union[Tuple[IndexMap, IndexOverrides, bool], IndexMap]]
module-attribute
","text":"Mock PyPI indexes format
"},{"location":"dev/fixtures/#pdm.pytest.Distribution","title":"Distribution
","text":"A mock Distribution
"},{"location":"dev/fixtures/#pdm.pytest.LocalFileAdapter","title":"LocalFileAdapter
","text":" Bases: requests.adapters.BaseAdapter
A local file adapter for requests.
Allows mocking some HTTP requests with local files
"},{"location":"dev/fixtures/#pdm.pytest.MockWorkingSet","title":"MockWorkingSet
","text":" Bases: collections.abc.MutableMapping
A mock working set
"},{"location":"dev/fixtures/#pdm.pytest.PDMCallable","title":"PDMCallable
","text":" Bases: Protocol
The PDM fixture callable signature
"},{"location":"dev/fixtures/#pdm.pytest.PDMCallable.__call__","title":"__call__(args, strict=False, input=None, obj=None, env=None, **kwargs)
","text":"Parameters:
Name Type Description Defaultargs
str | list[str]
the command arguments as a single lexable string or an array of strings
requiredstrict
bool
raise an exception on failure instead of returning if enabled
False
input
str | None
an optional string to be submitted to stdin
None
obj
Project | None
an optional existing Project
.
None
env
Mapping[str, str] | None
override the environment variables with those
None
Returns:
Type DescriptionRunResult
The command result
"},{"location":"dev/fixtures/#pdm.pytest.RunResult","title":"RunResult
dataclass
","text":"Store a command execution result.
"},{"location":"dev/fixtures/#pdm.pytest.RunResult.exception","title":"exception: Exception | None = None
instance-attribute
class-attribute
","text":"If set, the exception raised on execution
"},{"location":"dev/fixtures/#pdm.pytest.RunResult.exit_code","title":"exit_code: int
instance-attribute
","text":"The execution exit code
"},{"location":"dev/fixtures/#pdm.pytest.RunResult.output","title":"output: str
property
","text":"The execution stdout
output (stdout
alias)
outputs: str
property
","text":"The execution stdout
and stderr
outputs concatenated
stderr: str
instance-attribute
","text":"The execution stderr
output
stdout: str
instance-attribute
","text":"The execution stdout
output
print()
","text":"A debugging facility
"},{"location":"dev/fixtures/#pdm.pytest.TestRepository","title":"TestRepository
","text":" Bases: BaseRepository
A mock repository to ease testing dependencies
"},{"location":"dev/fixtures/#pdm.pytest.build_env","title":"build_env(build_env_wheels, tmp_path_factory)
","text":"A fixture build environment
Parameters:
Name Type Description Defaultbuild_env_wheels
Iterable[Path]
a list of wheels to install in the environment
requiredReturns:
Type DescriptionPath
The build environment temporary path
"},{"location":"dev/fixtures/#pdm.pytest.build_env_wheels","title":"build_env_wheels()
","text":"Expose some wheels to be installed in the build environment.
Override to provide your own.
Returns:
Type DescriptionIterable[Path]
a list of wheel paths to install
"},{"location":"dev/fixtures/#pdm.pytest.local_finder_artifacts","title":"local_finder_artifacts()
","text":"The local finder search path as a fixture
Override to provide your own artifacts.
Returns:
Type DescriptionPath
The path to the artifacts root
"},{"location":"dev/fixtures/#pdm.pytest.pdm","title":"pdm(core, monkeypatch)
","text":"A fixture allowing to execute PDM commands
Returns:
Type DescriptionPDMCallable
A pdm
fixture command.
project(project_no_init)
","text":"A fixture creating an initialized test project for the current test.
Returns:
Type DescriptionProject
The initialized project
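For illustration, a minimal test sketch that combines the pdm and project fixtures could look like the following (the test name is made up; it assumes pdm.pytest is enabled in your conftest.py as shown above):
def test_show_interpreter_path(pdm, project):\n    # \"project\" is an initialized test project; \"pdm\" invokes the CLI in-process\n    result = pdm([\"info\", \"--python\"], obj=project, strict=True)  # strict raises on failure\n    assert result.exit_code == 0\n    assert result.stdout.strip()  # the interpreter path is printed on stdout\n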
"},{"location":"dev/fixtures/#pdm.pytest.project_no_init","title":"project_no_init(tmp_path, mocker, core, pdm_session, monkeypatch, build_env)
","text":"A fixture creating a non-initialized test project for the current test.
Returns:
Type DescriptionProject
The non-initialized project
"},{"location":"dev/fixtures/#pdm.pytest.pypi_indexes","title":"pypi_indexes()
","text":"Provides some mocked PyPI entries
Returns:
Type DescriptionIndexesDefinition
a definition of the mocked indexes
"},{"location":"dev/fixtures/#pdm.pytest.remove_pep582_path_from_pythonpath","title":"remove_pep582_path_from_pythonpath(pythonpath)
","text":"Remove all pep582 paths of PDM from PYTHONPATH
"},{"location":"dev/fixtures/#pdm.pytest.repository","title":"repository(project, mocker, repository_pypi_json, local_finder)
","text":"A fixture providing a mock PyPI repository
Returns:
Type DescriptionTestRepository
A mock repository
"},{"location":"dev/fixtures/#pdm.pytest.repository_pypi_json","title":"repository_pypi_json()
","text":"The test repository fake PyPI definition path as a fixture
Override to provide your own definition path.
Returns:
Type DescriptionPath
The path to a fake PyPI repository JSON definition
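Because these are regular pytest fixtures, you can override the defaults in your own conftest.py. A minimal sketch, assuming a hypothetical artifacts/fake_pypi.json file in your test tree:
import pytest\nfrom pathlib import Path\n\n@pytest.fixture\ndef repository_pypi_json() -> Path:\n    # point the mocked repository at our own fake PyPI JSON definition\n    return Path(__file__).parent / \"artifacts\" / \"fake_pypi.json\"\n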
"},{"location":"dev/fixtures/#pdm.pytest.venv_backends","title":"venv_backends(project, request)
","text":"A fixture iterating over venv
backends
working_set(mocker, repository)
","text":"a mock working set as a fixture
Returns:
Type DescriptionMockWorkingSet
a mock working set
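Putting several fixtures together, an installation test might be sketched like this (assuming the mocked PyPI definition knows about the requested package):
def test_add_package(pdm, project, working_set, repository):\n    # \"repository\" mocks the PyPI index; \"working_set\" mocks the installed packages mapping\n    pdm([\"add\", \"requests\"], obj=project, strict=True)\n    assert \"requests\" in working_set\n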
"},{"location":"dev/write/","title":"PDM Plugins","text":"PDM is aiming at being a community driven package manager. It is shipped with a full-featured plug-in system, with which you can:
The core PDM project focuses on dependency management and package publishing. Functionality you wish to integrate with PDM should preferably live in its own plugin and be released as a standalone PyPI project. If a plugin is considered a good supplement to the core project, it may have a chance to be absorbed into PDM.
"},{"location":"dev/write/#write-your-own-plugin","title":"Write your own plugin","text":"In the following sections, I will show an example of adding a new command hello
which reads the hello.name
config.
PDM's CLI module is designed so that users can easily \"inherit and modify\" it. To write a new command:
from pdm.cli.commands.base import BaseCommand\n\nclass HelloCommand(BaseCommand):\n\"\"\"Say hello to the specified person.\n If none is given, will read from \"hello.name\" config.\n \"\"\"\n\n def add_arguments(self, parser):\n parser.add_argument(\"-n\", \"--name\", help=\"the person's name to whom you greet\")\n\n def handle(self, project, options):\n if not options.name:\n name = project.config[\"hello.name\"]\n else:\n name = options.name\n print(f\"Hello, {name}\")\n
First, let's create a new HelloCommand
class inheriting from pdm.cli.commands.base.BaseCommand
. It has two major functions:
add_arguments()
to manipulate the argument parser passed as the only argument, where you can add additional command line arguments to ithandle()
to do something when the subcommand is matched, you can do nothing by writing a single pass
statement. It accepts two arguments: a pdm.project.Project
object as the first one and the parsed argparse.Namespace
object as the second. The docstring will serve as the command help text, which will be shown in pdm --help
.
Besides, PDM's subcommand has two default options: -v/--verbose
to change the verbosity level and -g/--global
to enable global project. If you don't want these default options, override the arguments
class attribute to a list of pdm.cli.options.Option
objects, or assign it to an empty list to have no default options:
class HelloCommand(BaseCommand):\n\narguments = []\n
Note
The default options are loaded first, then add_arguments()
is called.
Write a function somewhere in your plugin project. There is no limit on what the name of the function is, but the function should take only one argument -- the PDM core object:
def hello_plugin(core):\ncore.register_command(HelloCommand, \"hello\")\n
Call core.register_command()
to register the command. The second argument as the name of the subcommand is optional. PDM will look for the HelloCommand
's name
attribute if the name is not passed.
Recall from the first code snippet that the hello.name
config key is consulted for the name if it is not passed via the command line.
class HelloCommand(BaseCommand):\n\"\"\"Say hello to the specified person.\n If none is given, will read from \"hello.name\" config.\n \"\"\"\n\n def add_arguments(self, parser):\n parser.add_argument(\"-n\", \"--name\", help=\"the person's name to whom you greet\")\n\n def handle(self, project, options):\n if not options.name:\nname = project.config[\"hello.name\"]\nelse:\n name = options.name\n print(f\"Hello, {name}\")\n
So far, if you query the config value with pdm config get hello.name
, an error will pop up saying it is not a valid config key. You need to register the config item, too:
from pdm.project.config import ConfigItem\n\ndef hello_plugin(core):\n core.register_command(HelloCommand, \"hello\")\ncore.add_config(\"hello.name\", ConfigItem(\"The person's name\", \"John\"))\n
where ConfigItem
class takes 4 parameters, in the following order:
description
: a description of the config itemdefault
: default value of the config itemglobal_only
: whether the config is allowed to set in home config onlyenv_var
: the name of the environment variable which will be read as the config value. Besides commands and configurations, the core
object exposes some other methods and attributes to override. PDM also provides some signals you can listen to. Please read the API reference for more details.
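Tying this back to the ConfigItem parameters above, a hypothetical registration that also sets global_only and env_var could look like this:
from pdm.project.config import ConfigItem\n\ndef hello_plugin(core):\n    core.register_command(HelloCommand, \"hello\")\n    # arguments in order: description, default, global_only, env_var\n    core.add_config(\n        \"hello.name\",\n        ConfigItem(\"The person's name\", \"John\", global_only=False, env_var=\"HELLO_NAME\"),\n    )\n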
When developing a plugin, you usually want to activate the plugin under development and have it updated whenever the code changes.
You can achieve this by installing the plugin in editable mode. To do this, specify the dependencies in tool.pdm.plugins
array:
[tool.pdm]\nplugins = [\n\"-e file:///${PROJECT_ROOT}\"\n]\n
Then install it with:
pdm install --plugins\n
After that, all the dependencies are available in a project plugin library, including the plugin itself, in editable mode. That means any change to the codebase will take effect immediately without re-installation. The pdm
executable also uses a Python interpreter under the hood, so if you run pdm
from inside the plugin project, the plugin in development will be activated automatically, and you can do some testing to see how it works.
PDM exposes some pytest fixtures as a plugin in the pdm.pytest
module. To benefit from them, you must add pdm[pytest]
as a test dependency.
To enable them in your test, add pdm.pytest
as a plugin. You can do so in your root conftest.py
:
# single plugin\npytest_plugins = \"pdm.pytest\"\n\n# many plugins\npytest_plugins = [\n ...\n \"pdm.pytest\",\n ...\n]\n
You can see some usage examples in PDM's own tests, especially the conftest.py file for configuration.
See the pytest fixtures documentation for more details.
"},{"location":"dev/write/#publish-your-plugin","title":"Publish your plugin","text":"Now you have defined your plugin already, let's distribute it to PyPI. PDM's plugins are discovered by entry point types. Create an pdm
entry point and point to your plugin callable (yeah, it doesn't need to be a function, any callable object can work):
PEP 621:
# pyproject.toml\n\n[project.entry-points.pdm]\nhello = \"my_plugin:hello_plugin\"\n
setuptools:
# setup.py\n\nsetup(\n ...\n entry_points={\"pdm\": [\"hello = my_plugin:hello_plugin\"]}\n ...\n)\n
"},{"location":"dev/write/#activate-the-plugin","title":"Activate the plugin","text":"As plugins are loaded via entry points, they can be activated with no more steps than just installing the plugin. For convenience, PDM provides a plugin
command group to manage plugins.
Assume your plugin is published as pdm-hello
:
pdm self add pdm-hello\n
Now type pdm --help
in the terminal, you will see the newly added hello
command and can use it:
$ pdm hello Jack\nHello, Jack\n
See more plugin management subcommands by typing pdm self --help
in the terminal.
To specify the required plugins for a project, you can use the tool.pdm.plugins
config in the pyproject.toml
file. These dependencies can be installed into a project plugin library by running pdm install --plugins
. The project plugin library will be loaded in subsequent PDM commands.
This is useful when you want to share the same plugin set with the contributors.
# pyproject.toml\n[tool.pdm]\nplugins = [\n\"pdm-packer\"\n]\n
Run pdm install --plugins
to install and activate the plugins.
Alternatively, you can have project-local plugins that are not published to PyPI, by using editable local dependencies:
# pyproject.toml\n[tool.pdm]\nplugins = [\n\"-e file:///${PROJECT_ROOT}/plugins/my_plugin\"\n]\n
"},{"location":"reference/api/","title":"API Reference","text":""},{"location":"reference/api/#pdm.core.Core","title":"pdm.core.Core
","text":"A high level object that manages all classes and configurations
"},{"location":"reference/api/#pdm.core.Core.add_config","title":"add_config(name, config_item)
staticmethod
","text":"Add a config item to the configuration class.
Parameters:
Name Type Description Defaultname
str
The name of the config item
requiredconfig_item
pdm.project.config.ConfigItem
The config item to add
required"},{"location":"reference/api/#pdm.core.Core.create_project","title":"create_project(root_path=None, is_global=False, global_config=None)
","text":"Create a new project object
Parameters:
Name Type Description Defaultroot_path
PathLike
The path to the project root directory
None
is_global
bool
Whether the project is a global project
False
global_config
str
The path to the global config file
None
Returns:
Type DescriptionProject
The project object
"},{"location":"reference/api/#pdm.core.Core.handle","title":"handle(project, options)
","text":"Called before command invocation
"},{"location":"reference/api/#pdm.core.Core.load_plugins","title":"load_plugins()
","text":"Import and load plugins under pdm.plugin
namespace A plugin is a callable that accepts the core object as the only argument.
def my_plugin(core: pdm.core.Core) -> None:\n ...\n
"},{"location":"reference/api/#pdm.core.Core.main","title":"main(args=None, prog_name=None, obj=None, **extra)
","text":"The main entry function
"},{"location":"reference/api/#pdm.core.Core.register_command","title":"register_command(command, name=None)
","text":"Register a subcommand to the subparsers, with an optional name of the subcommand.
Parameters:
Name Type Description Defaultcommand
Type[pdm.cli.commands.base.BaseCommand]
The command class to register
requiredname
str
The name of the subcommand, if not given, command.name
is used
None
"},{"location":"reference/api/#pdm.core.Project","title":"pdm.core.Project
","text":"Core project class.
Parameters:
Name Type Description Defaultcore
Core
The core instance.
requiredroot_path
str | Path | None
The root path of the project.
requiredis_global
bool
Whether the project is global.
False
global_config
str | Path | None
The path to the global config file.
None
"},{"location":"reference/api/#pdm.project.core.Project.config","title":"config: Mapping[str, Any]
cached
property
","text":"A read-only dict configuration
"},{"location":"reference/api/#pdm.project.core.Project.default_source","title":"default_source: RepositoryConfig
property
","text":"Get the default source from the pypi setting
"},{"location":"reference/api/#pdm.project.core.Project.project_config","title":"project_config: Config
cached
property
","text":"Read-and-writable configuration dict for project settings
"},{"location":"reference/api/#pdm.project.core.Project.find_interpreters","title":"find_interpreters(python_spec=None)
","text":"Return an iterable of interpreter paths that matches the given specifier,
which can beget_provider(strategy='all', tracked_names=None, for_install=False, ignore_compatibility=True, direct_minimal_versions=False)
","text":"Build a provider class for resolver.
:param strategy: the resolve strategy :param tracked_names: the names of packages that needs to update :param for_install: if the provider is for install :param ignore_compatibility: if the provider should ignore the compatibility when evaluating candidates :param direct_minimal_versions: if the provider should prefer minimal versions instead of latest :returns: The provider object
"},{"location":"reference/api/#pdm.project.core.Project.get_reporter","title":"get_reporter(requirements, tracked_names=None, spinner=None)
","text":"Return the reporter object to construct a resolver.
:param requirements: requirements to resolve :param tracked_names: the names of packages that needs to update :param spinner: optional spinner object :returns: a reporter
"},{"location":"reference/api/#pdm.project.core.Project.get_repository","title":"get_repository(cls=None, ignore_compatibility=True)
","text":"Get the repository object
"},{"location":"reference/api/#pdm.project.core.Project.resolve_interpreter","title":"resolve_interpreter()
","text":"Get the Python interpreter path.
"},{"location":"reference/api/#pdm.project.core.Project.use_pyproject_dependencies","title":"use_pyproject_dependencies(group, dev=False)
","text":"Get the dependencies array and setter in the pyproject.toml Return a tuple of two elements, the first is the dependencies array, and the second value is a callable to set the dependencies array back.
"},{"location":"reference/api/#pdm.project.core.Project.write_lockfile","title":"write_lockfile(toml_data, show_message=True, write=True, **_kwds)
","text":"Write the lock file to disk.
"},{"location":"reference/api/#signals","title":"Signals","text":"New in version 1.12.0
The signal definition for PDM.
Examplefrom pdm.signals import post_init, post_install\n\ndef on_post_init(project):\n project.core.ui.echo(\"Project initialized\")\n# Connect to the signal\npost_init.connect(on_post_init)\n# Or use as a decorator\n@post_install.connect\ndef on_post_install(project, candidates, dry_run):\n project.core.ui.echo(\"Project install succeeded\")\n
"},{"location":"reference/api/#pdm.signals.post_build","title":"post_build: NamedSignal = pdm_signals.signal('post_build')
module-attribute
","text":"Called after a project is built.
Parameters:
Name Type Description Defaultproject
Project
The project object
requiredartifacts
Sequence[str]
The locations of built artifacts
requiredconfig_settings
dict[str, str] | None
Additional config settings passed via args
required"},{"location":"reference/api/#pdm.signals.post_init","title":"post_init: NamedSignal = pdm_signals.signal('post_init')
module-attribute
","text":"Called after a project is initialized.
Parameters:
Name Type Description Defaultproject
Project
The project object
required"},{"location":"reference/api/#pdm.signals.post_install","title":"post_install: NamedSignal = pdm_signals.signal('post_install')
module-attribute
","text":"Called after a project is installed.
Parameters:
Name Type Description Defaultproject
Project
The project object
requiredcandidates
dict[str, Candidate]
The candidates installed
requireddry_run
bool
If true, won't perform any actions
required"},{"location":"reference/api/#pdm.signals.post_lock","title":"post_lock: NamedSignal = pdm_signals.signal('post_lock')
module-attribute
","text":"Called after a project is locked.
Parameters:
Name Type Description Defaultproject
Project
The project object
requiredresolution
dict[str, Candidate]
The resolved candidates
requireddry_run
bool
If true, won't perform any actions
required"},{"location":"reference/api/#pdm.signals.post_publish","title":"post_publish: NamedSignal = pdm_signals.signal('post_publish')
module-attribute
","text":"Called after a project is published.
Parameters:
Name Type Description Defaultproject
Project
The project object
required"},{"location":"reference/api/#pdm.signals.post_run","title":"post_run: NamedSignal = pdm_signals.signal('post_run')
module-attribute
","text":"Called after any run.
Parameters:
Name Type Description Defaultproject
Project
The project object
requiredscript
str
the script name
requiredargs
Sequence[str]
the command line provided arguments
required"},{"location":"reference/api/#pdm.signals.post_script","title":"post_script: NamedSignal = pdm_signals.signal('post_script')
module-attribute
","text":"Called after any script.
Parameters:
Name Type Description Defaultproject
Project
The project object
requiredscript
str
the script name
requiredargs
Sequence[str]
the command line provided arguments
required"},{"location":"reference/api/#pdm.signals.post_use","title":"post_use: NamedSignal = pdm_signals.signal('post_use')
module-attribute
","text":"Called after use switched to a new Python version.
Parameters:
Name Type Description Defaultproject
Project
The project object
requiredpython
PythonInfo
Information about the new Python interpreter
required"},{"location":"reference/api/#pdm.signals.pre_build","title":"pre_build: NamedSignal = pdm_signals.signal('pre_build')
module-attribute
","text":"Called before a project is built.
Parameters:
Name Type Description Defaultproject
Project
The project object
requireddest
str
The destination location
requiredconfig_settings
dict[str, str] | None
Additional config settings passed via args
required"},{"location":"reference/api/#pdm.signals.pre_install","title":"pre_install: NamedSignal = pdm_signals.signal('pre_install')
module-attribute
","text":"Called before a project is installed.
Parameters:
Name Type Description Defaultproject
Project
The project object
requiredcandidates
dict[str, Candidate]
The candidates to install
requireddry_run
bool
If true, won't perform any actions
required"},{"location":"reference/api/#pdm.signals.pre_invoke","title":"pre_invoke: NamedSignal = pdm_signals.signal('pre_invoke')
module-attribute
","text":"Called before any command is invoked.
Parameters:
Name Type Description Defaultproject
Project
The project object
requiredcommand
str | None
the command name
requiredoptions
Namespace
the parsed arguments
required"},{"location":"reference/api/#pdm.signals.pre_lock","title":"pre_lock: NamedSignal = pdm_signals.signal('pre_lock')
module-attribute
","text":"Called before a project is locked.
Parameters:
Name Type Description Defaultproject
Project
The project object
requiredrequirements
list[Requirement]
The requirements to lock
requireddry_run
bool
If true, won't perform any actions
required"},{"location":"reference/api/#pdm.signals.pre_publish","title":"pre_publish: NamedSignal = pdm_signals.signal('pre_publish')
module-attribute
","text":"Called before a project is published.
Parameters:
Name Type Description Defaultproject
Project
The project object
required"},{"location":"reference/api/#pdm.signals.pre_run","title":"pre_run: NamedSignal = pdm_signals.signal('pre_run')
module-attribute
","text":"Called before any run.
Parameters:
Name Type Description Defaultproject
Project
The project object
requiredscript
str
the script name
requiredargs
Sequence[str]
the command line provided arguments
required"},{"location":"reference/api/#pdm.signals.pre_script","title":"pre_script: NamedSignal = pdm_signals.signal('pre_script')
module-attribute
","text":"Called before any script.
Parameters:
Name Type Description Defaultproject
Project
The project object
requiredscript
str
the script name
requiredargs
Sequence[str]
the command line provided arguments
required"},{"location":"reference/build/","title":"Build Configuration","text":"pdm
uses PEP 517 to build the package. It acts as a build frontend that calls the build backend to build the package.
A build backend is what drives the build system to build source distributions and wheels from arbitrary source trees.
If you run pdm init
, PDM will let you choose the build backend to use. Unlike other package managers, PDM does not force you to use a specific build backend. You can choose the one you like. Here is a list of build backends and corresponding configurations initially supported by PDM:
pyproject.toml
configuration:
[build-system]\nrequires = [\"pdm-backend\"]\nbuild-backend = \"pdm.backend\"\n
Read the docs
pyproject.toml
configuration:
[build-system]\nrequires = [\"setuptools\", \"wheel\"]\nbuild-backend = \"setuptools.build_meta\"\n
Read the docs
pyproject.toml
configuration:
[build-system]\nrequires = [\"flit_core >=3.2,<4\"]\nbuild-backend = \"flit_core.buildapi\"\n
Read the docs
pyproject.toml
configuration:
[build-system]\nrequires = [\"hatchling\"]\nbuild-backend = \"hatchling.build\"\n
Read the docs
pyproject.toml
configuration:
[build-system]\nrequires = [\"maturin>=1.4,<2.0\"]\nbuild-backend = \"maturin\"\n
Read the docs
Apart from the above mentioned backends, you can also use any other backend that supports PEP 621, however, poetry-core is not supported because it does not support reading PEP 621 metadata.
Info
If you are using a custom build backend that is not in the above list, PDM will handle the relative paths as PDM-style(${PROJECT_ROOT}
variable).
Options:
-h
, --help
: Show this help message and exit.-V
, --version
: Show the version and exit-c
, --config
: Specify another config file path [env var: PDM_CONFIG_FILE
] -v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-I
, --ignore-python
: Ignore the Python path saved in.pdm-python
. [env var: PDM_IGNORE_SAVED_PYTHON
]--pep582
SHELL
: Print the command line to be eval'd by the shellCommands:
"},{"location":"reference/cli/#add","title":"add","text":"Add package(s) to pyproject.toml and install them
Package Arguments:
-e
, --editable
: Specify editable packagespackages
: Specify packagesOptions:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-g
, --global
: Use the global project, supply the project root with -p
option-p
, --project
: Specify another path as the project root, which changes the base ofpyproject.toml
and __pypackages__
[env var: PDM_PROJECT
]-L
, --lockfile
: Specify another lockfile path. Default:pdm.lock
. [env var: PDM_LOCKFILE
]--no-lock
: Don't try to create or update the lockfile. [env var: PDM_NO_LOCK
]--save-compatible
: Save compatible version specifiers--save-wildcard
: Save wildcard version specifiers--save-exact
: Save exact version specifiers--save-minimum
: Save minimum version specifiers--update-reuse
: Reuse pinned versions already present in lock file if possible--update-eager
: Try to update the packages and their dependencies recursively--update-all
: Update all dependencies and sub-dependencies--update-reuse-installed
: Reuse installed packages if possible--pre
, --prerelease
: Allow prereleases to be pinned-u
, --unconstrained
: Ignore the version constraint of packages--dry-run
: Show the difference only and don't perform any action--venv
NAME
: Run the command in the virtual environment with the given key. [env var: PDM_IN_VENV
]-k
, --skip
: Skip some tasks and/or hooks by their comma-separated names. Can be supplied multiple times. Use:all
to skip all hooks. Use:pre
and:post
to skip all pre or post hooks.-d
, --dev
: Add packages into dev dependencies-G
, --group
: Specify the target dependency group to add into--no-sync
: Only writepyproject.toml
and do not sync the working set (default: True
)Install Options:
--no-editable
: Install non-editable versions for all packages--no-self
: Don't install the project itself. [env var: PDM_NO_SELF
]--fail-fast
, -x
: Abort on first installation error--no-isolation
: Disable isolation when building a source distribution that follows PEP 517, as in: build dependencies specified by PEP 518 must be already installed if this option is used.Build artifacts for distribution
Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-p
, --project
: Specify another path as the project root, which changes the base ofpyproject.toml
and __pypackages__
[env var: PDM_PROJECT
]--no-isolation
: Disable isolation when building a source distribution that follows PEP 517, as in: build dependencies specified by PEP 518 must be already installed if this option is used.-k
, --skip
: Skip some tasks and/or hooks by their comma-separated names. Can be supplied multiple times. Use:all
to skip all hooks. Use:pre
and:post
to skip all pre or post hooks.--no-sdist
: Don't build source tarballs (default: True
)--no-wheel
: Don't build wheels (default: True
)-d
, --dest
: Target directory to put artifacts (default: dist
)--no-clean
: Do not clean the target directory (default: True
)--config-setting
, -C
: Pass options to the backend. options with a value must be specified after \"=\": --config-setting=--opt(=value)
or -C--opt(=value)
Control the caches of PDM
Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress outputCommands:
"},{"location":"reference/cli/#clear","title":"clear","text":"Clean all the files under cache directory
Positional Arguments:
type
: Clear the given type of cachesOptions:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress outputRemove files matching the given pattern
Positional Arguments:
pattern
: The pattern to removeOptions:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress outputList the built wheels stored in the cache
Positional Arguments:
pattern
: The pattern to list (default: *
)Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress outputShow the info and current size of caches
Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress outputGenerate completion scripts for the given shell
Positional Arguments:
shell
: The shell to generate the scripts for. If not given, PDM will properly guess from SHELL
env var.Options:
-h
, --help
: Show this help message and exit.Display the current configuration
Positional Arguments:
key
: Config keyvalue
: Config valueOptions:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-g
, --global
: Use the global project, supply the project root with -p
option-p
, --project
: Specify another path as the project root, which changes the base ofpyproject.toml
and __pypackages__
[env var: PDM_PROJECT
]-l
, --local
: Set config in the project's local configuration file-d
, --delete
: Unset a configuration key-e
, --edit
: Edit the configuration file in the default editor(defined by EDITOR env var)Export the locked packages set to other formats
Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-g
, --global
: Use the global project, supply the project root with -p
option-p
, --project
: Specify another path as the project root, which changes the base ofpyproject.toml
and __pypackages__
[env var: PDM_PROJECT
]-L
, --lockfile
: Specify another lockfile path. Default:pdm.lock
. [env var: PDM_LOCKFILE
]-f
, --format
: Specify the export file format (default: requirements
)--without-hashes
: Don't include artifact hashes (default: True
)-o
, --output
: Write output to the given file, or print to stdout if not given--pyproject
: Read the list of packages frompyproject.toml
--expandvars
: Expand environment variables in requirements--self
: Include the project itself--editable-self
: Include the project itself as an editable dependencyDependencies Selection:
-G
, --group
GROUP
: Select group of optional-dependencies separated by comma or dev-dependencies (with -d
). Can be supplied multiple times, use:all
to include all groups under the same species.--no-default
: Don't include dependencies from the default group (default: True
)-d
, --dev
: Select dev dependencies--prod
, --production
: Unselect dev dependencies (default: True
)Fix the project problems according to the latest version of PDM
Positional Arguments:
problem
: Fix the specific problem, or all if not givenOptions:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-g
, --global
: Use the global project, supply the project root with -p
option-p
, --project
: Specify another path as the project root, which changes the base ofpyproject.toml
and __pypackages__
[env var: PDM_PROJECT
]--dry-run
: Only show the problemsImport project metadata from other formats
Positional Arguments:
filename
: The file nameOptions:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-g
, --global
: Use the global project, supply the project root with -p
option-p
, --project
: Specify another path as the project root, which changes the base ofpyproject.toml
and __pypackages__
[env var: PDM_PROJECT
]-d
, --dev
: import packages into dev dependencies-G
, --group
: Specify the target dependency group to import into-f
, --format
: Specify the file format explicitlyShow the project information
Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-g
, --global
: Use the global project, supply the project root with -p
option-p
, --project
: Specify another path as the project root, which changes the base ofpyproject.toml
and __pypackages__
[env var: PDM_PROJECT
]--venv
NAME
: Run the command in the virtual environment with the given key. [env var: PDM_IN_VENV
]--python
: Show the interpreter path--where
: Show the project root path--packages
: Show the local packages root--env
: Show PEP 508 environment markers--json
: Dump the information in JSONInitialize a pyproject.toml for PDM
Positional Arguments:
template
: Specify the project template, which can be a local path or a Git URLgenerator_args
: Arguments passed to the generatorOptions:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-g
, --global
: Use the global project, supply the project root with -p
option-p
, --project
: Specify another path as the project root, which changes the base ofpyproject.toml
and __pypackages__
[env var: PDM_PROJECT
]-k
, --skip
: Skip some tasks and/or hooks by their comma-separated names. Can be supplied multiple times. Use:all
to skip all hooks. Use:pre
and:post
to skip all pre or post hooks.--copier
: Use Copier to generate project [not installed] (default: builtin
)--cookiecutter
: Use Cookiecutter to generate project [not installed] (default: builtin
)-r
, --overwrite
: Overwrite existing filesBuiltin Generator Options:
-n
, --non-interactive
: Don't ask questions but use default values--python
: Specify the Python version/path to use--lib
: Create a library project--backend
: Specify the build backend, which implies --libInstall dependencies from lock file
Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-g
, --global
: Use the global project, supply the project root with -p
option-p
, --project
: Specify another path as the project root, which changes the base ofpyproject.toml
and __pypackages__
[env var: PDM_PROJECT
]--dry-run
: Show the difference only and don't perform any action-L
, --lockfile
: Specify another lockfile path. Default:pdm.lock
. [env var: PDM_LOCKFILE
]--no-lock
: Don't try to create or update the lockfile. [env var: PDM_NO_LOCK
]-k
, --skip
: Skip some tasks and/or hooks by their comma-separated names. Can be supplied multiple times. Use:all
to skip all hooks. Use:pre
and:post
to skip all pre or post hooks.--venv
NAME
: Run the command in the virtual environment with the given key. [env var: PDM_IN_VENV
]--check
: Check if the lock file is up to date and fail otherwise--plugins
: Install the plugins specified inpyproject.toml
Install Options:
--no-editable
: Install non-editable versions for all packages--no-self
: Don't install the project itself. [env var: PDM_NO_SELF
]--fail-fast
, -x
: Abort on first installation error--no-isolation
: Disable isolation when building a source distribution that follows PEP 517, as in: build dependencies specified by PEP 518 must be already installed if this option is used.Dependencies Selection:
-G
, --group
GROUP
: Select group of optional-dependencies separated by comma or dev-dependencies (with -d
). Can be supplied multiple times, use:all
to include all groups under the same species.--no-default
: Don't include dependencies from the default group (default: True
)-d
, --dev
: Select dev dependencies--prod
, --production
: Unselect dev dependencies (default: True
)List packages installed in the current working set
Positional Arguments:
patterns
: Filter packages by patterns. e.g. pdm list requests- flask-. In --tree mode, only show the subtree of the matched packages.Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-g
, --global
: Use the global project, supply the project root with -p
option-p
, --project
: Specify another path as the project root, which changes the base ofpyproject.toml
and __pypackages__
[env var: PDM_PROJECT
]--venv
NAME
: Run the command in the virtual environment with the given key. [env var: PDM_IN_VENV
]--freeze
: Show the installed dependencies in pip's requirements.txt format--tree
, --graph
: Display a tree of dependencies-r
, --reverse
: Reverse the dependency tree--resolve
: Resolve all requirements to output licenses (instead of just showing those currently installed)--fields
: Select information to output as a comma separated string. All fields: groups,homepage,licenses,location,name,version. (default: name,version,location
)--sort
: Sort the output using a given field name. If nothing is set, no sort is applied. Multiple fields can be combined with ','.--csv
: Output dependencies in CSV document format--json
: Output dependencies in JSON document format--markdown
: Output dependencies and legal notices in markdown document format - best effort basis--include
: Dependency groups to include in the output. By default all are included--exclude
: Exclude dependency groups from the outputResolve and lock dependencies
Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-g
, --global
: Use the global project, supply the project root with -p
option-p
, --project
: Specify another path as the project root, which changes the base ofpyproject.toml
and __pypackages__
[env var: PDM_PROJECT
]-L
, --lockfile
: Specify another lockfile path. Default:pdm.lock
. [env var: PDM_LOCKFILE
]--no-isolation
: Disable isolation when building a source distribution that follows PEP 517, as in: build dependencies specified by PEP 518 must be already installed if this option is used.-k
, --skip
: Skip some tasks and/or hooks by their comma-separated names. Can be supplied multiple times. Use:all
to skip all hooks. Use:pre
and:post
to skip all pre or post hooks.--refresh
: Don't update pinned versions, only refresh the lock file--check
: Check if the lock file is up to date and quit--update-reuse
: Reuse pinned versions already present in lock file if possible (default: all
)--update-reuse-installed
: Reuse installed packages if possibleLock Strategy:
--strategy
, -S
STRATEGY
: Specify lock strategy (cross_platform, static_urls, direct_minimal_versions, inherit_metadata). Add 'no_' prefix to disable. Can be supplied multiple times or split by comma.--no-cross-platform
: [DEPRECATED] Only lock packages for the current platform--static-urls
: [DEPRECATED] Store static file URLs in the lockfile--no-static-urls
: [DEPRECATED] Do not store static file URLs in the lockfileDependencies Selection:
-G
, --group
GROUP
: Select group of optional-dependencies separated by comma or dev-dependencies (with -d
). Can be supplied multiple times, use:all
to include all groups under the same species.--no-default
: Don't include dependencies from the default group (default: True
)-d
, --dev
: Select dev dependencies--prod
, --production
: Unselect dev dependencies (default: True
)Build and publish the project to PyPI
Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-p
, --project
: Specify another path as the project root, which changes the base ofpyproject.toml
and __pypackages__
[env var: PDM_PROJECT
]-k
, --skip
: Skip some tasks and/or hooks by their comma-separated names. Can be supplied multiple times. Use:all
to skip all hooks. Use:pre
and:post
to skip all pre or post hooks.-r
, --repository
: The repository name or url to publish the package to [env var: PDM_PUBLISH_REPO
]-u
, --username
: The username to access the repository [env var: PDM_PUBLISH_USERNAME
]-P
, --password
: The password to access the repository [env var: PDM_PUBLISH_PASSWORD
]-S
, --sign
: Upload the package with PGP signature-i
, --identity
: GPG identity used to sign files.-c
, --comment
: The comment to include with the distribution file.--no-build
: Don't build the package before publishing (default: True
)--skip-existing
: Skip uploading files that already exist. This may not work with some repository implementations.--no-verify-ssl
: Disable SSL verification--ca-certs
: The path to a PEM-encoded Certificate Authority bundle to use for publish server validation [env var: PDM_PUBLISH_CA_CERTS
]Remove packages from pyproject.toml
Positional Arguments:
packages
: Specify the packages to removeOptions:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-g
, --global
: Use the global project, supply the project root with -p
option-p
, --project
: Specify another path as the project root, which changes the base ofpyproject.toml
and __pypackages__
[env var: PDM_PROJECT
]--dry-run
: Show the difference only and don't perform any action-L
, --lockfile
: Specify another lockfile path. Default:pdm.lock
. [env var: PDM_LOCKFILE
]--no-lock
: Don't try to create or update the lockfile. [env var: PDM_NO_LOCK
]-k
, --skip
: Skip some tasks and/or hooks by their comma-separated names. Can be supplied multiple times. Use:all
to skip all hooks. Use:pre
and:post
to skip all pre or post hooks.--venv
NAME
: Run the command in the virtual environment with the given key. [env var: PDM_IN_VENV
]-d
, --dev
: Remove packages from dev dependencies-G
, --group
: Specify the target dependency group to remove from--no-sync
: Only writepyproject.toml
and do not uninstall packages (default: True
)Install Options:
--no-editable
: Install non-editable versions for all packages--no-self
: Don't install the project itself. [env var: PDM_NO_SELF
]--fail-fast
, -x
: Abort on first installation error--no-isolation
: Disable isolation when building a source distribution that follows PEP 517, as in: build dependencies specified by PEP 518 must be already installed if this option is used.Run commands or scripts with local packages loaded
Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-g
, --global
: Use the global project, supply the project root with -p
option-p
, --project
: Specify another path as the project root, which changes the base ofpyproject.toml
and __pypackages__
[env var: PDM_PROJECT
]-k
, --skip
: Skip some tasks and/or hooks by their comma-separated names. Can be supplied multiple times. Use:all
to skip all hooks. Use:pre
and:post
to skip all pre or post hooks.--venv
NAME
: Run the command in the virtual environment with the given key. [env var: PDM_IN_VENV
]-l
, --list
: Show all available scripts defined inpyproject.toml
-j
, --json
: Output all scripts infos in JSONExecution Parameters:
-s
, --site-packages
: Load site-packages from the selected interpreterscript
: The command to runargs
: Arguments that will be passed to the commandSearch for PyPI packages
Positional Arguments:
query
: Query string to searchOptions:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress outputManage the PDM program itself (previously known as plugin)
Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress outputCommands:
"},{"location":"reference/cli/#list_2","title":"list","text":"List all packages installed with PDM
Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output--plugins
: List plugins onlyInstall packages to the PDM's environment
Positional Arguments:
packages
: Specify one or many package names, each package can have a version specifierOptions:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output--pip-args
: Arguments that will be passed to pip installRemove packages from PDM's environment
Positional Arguments:
packages
: Specify one or many package namesOptions:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output--pip-args
: Arguments that will be passed to pip uninstall-y
, --yes
: Answer yes on the questionUpdate PDM itself
Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output--head
: Update to the latest commit on the main branch--pre
: Update to the latest prerelease version--pip-args
: Additional arguments that will be passed to pip installManage the PDM program itself (previously known as plugin)
Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress outputCommands:
"},{"location":"reference/cli/#list_3","title":"list","text":"List all packages installed with PDM
Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output--plugins
: List plugins onlyInstall packages to the PDM's environment
Positional Arguments:
packages
: Specify one or many package names, each package can have a version specifierOptions:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output--pip-args
: Arguments that will be passed to pip installRemove packages from PDM's environment
Positional Arguments:
packages
: Specify one or many package namesOptions:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output--pip-args
: Arguments that will be passed to pip uninstall-y
, --yes
: Answer yes on the questionUpdate PDM itself
Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output--head
: Update to the latest commit on the main branch--pre
: Update to the latest prerelease version--pip-args
: Additional arguments that will be passed to pip installShow the package information
Positional Arguments:
package
: Specify the package name, or show this package if not givenOptions:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-g
, --global
: Use the global project, supply the project root with -p
option-p
, --project
: Specify another path as the project root, which changes the base ofpyproject.toml
and __pypackages__
[env var: PDM_PROJECT
]--venv
NAME
: Run the command in the virtual environment with the given key. [env var: PDM_IN_VENV
]--name
: Show name--version
: Show version--summary
: Show summary--license
: Show license--platform
: Show platform--keywords
: Show keywordsSynchronize the current working set with lock file
Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-g
, --global
: Use the global project, supply the project root with -p
option-p
, --project
: Specify another path as the project root, which changes the base ofpyproject.toml
and __pypackages__
[env var: PDM_PROJECT
]--dry-run
: Show the difference only and don't perform any action-L
, --lockfile
: Specify another lockfile path. Default:pdm.lock
. [env var: PDM_LOCKFILE
]-k
, --skip
: Skip some tasks and/or hooks by their comma-separated names. Can be supplied multiple times. Use:all
to skip all hooks. Use:pre
and:post
to skip all pre or post hooks.--clean
: Clean packages not in the lockfile--only-keep
: Only keep the selected packages--venv
NAME
: Run the command in the virtual environment with the given key. [env var: PDM_IN_VENV
]-r
, --reinstall
: Force reinstall existing dependenciesInstall Options:
--no-editable
: Install non-editable versions for all packages--no-self
: Don't install the project itself. [env var: PDM_NO_SELF
]--fail-fast
, -x
: Abort on first installation error--no-isolation
: Disable isolation when building a source distribution that follows PEP 517, as in: build dependencies specified by PEP 518 must be already installed if this option is used.Dependencies Selection:
-G
, --group
GROUP
: Select group of optional-dependencies separated by comma or dev-dependencies (with -d
). Can be supplied multiple times, use:all
to include all groups under the same species.--no-default
: Don't include dependencies from the default group (default: True
)-d
, --dev
: Select dev dependencies--prod
, --production
: Unselect dev dependencies (default: True
)Update package(s) in pyproject.toml
Positional Arguments:
packages
: If packages are given, only update themOptions:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-g
, --global
: Use the global project, supply the project root with -p
option-p
, --project
: Specify another path as the project root, which changes the base ofpyproject.toml
and __pypackages__
[env var: PDM_PROJECT
]-L
, --lockfile
: Specify another lockfile path. Default:pdm.lock
. [env var: PDM_LOCKFILE
]--no-lock
: Don't try to create or update the lockfile. [env var: PDM_NO_LOCK
]--save-compatible
: Save compatible version specifiers--save-wildcard
: Save wildcard version specifiers--save-exact
: Save exact version specifiers--save-minimum
: Save minimum version specifiers--update-reuse
: Reuse pinned versions already present in lock file if possible--update-eager
: Try to update the packages and their dependencies recursively--update-all
: Update all dependencies and sub-dependencies--update-reuse-installed
: Reuse installed packages if possible--pre
, --prerelease
: Allow prereleases to be pinned-u
, --unconstrained
: Ignore the version constraint of packages-k
, --skip
: Skip some tasks and/or hooks by their comma-separated names. Can be supplied multiple times. Use:all
to skip all hooks. Use:pre
and:post
to skip all pre or post hooks.--venv
NAME
: Run the command in the virtual environment with the given key. [env var: PDM_IN_VENV
]-t
, --top
: Only update those listed inpyproject.toml
--dry-run
, --outdated
: Show the difference only without modifying the lockfile content--no-sync
: Only update lock file but do not sync packages (default: True
)Install Options:
--no-editable
: Install non-editable versions for all packages--no-self
: Don't install the project itself. [env var: PDM_NO_SELF
]--fail-fast
, -x
: Abort on first installation error--no-isolation
: Disable isolation when building a source distribution that follows PEP 517, as in: build dependencies specified by PEP 518 must be already installed if this option is used.Dependencies Selection:
-G
, --group
GROUP
: Select group of optional-dependencies separated by comma or dev-dependencies (with -d
). Can be supplied multiple times, use:all
to include all groups under the same species.--no-default
: Don't include dependencies from the default group (default: True
)-d
, --dev
: Select dev dependencies--prod
, --production
: Unselect dev dependenciesUse the given python version or path as base interpreter
Positional Arguments:
python
: Specify the Python version or pathOptions:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-g
, --global
: Use the global project, supply the project root with -p
option-p
, --project
: Specify another path as the project root, which changes the base ofpyproject.toml
and __pypackages__
[env var: PDM_PROJECT
]-k
, --skip
: Skip some tasks and/or hooks by their comma-separated names. Can be supplied multiple times. Use:all
to skip all hooks. Use:pre
and:post
to skip all pre or post hooks.-f
, --first
: Select the first matched interpreter-i
, --ignore-remembered
: Ignore the remembered selection--venv
: Use the interpreter in the virtual environment with the given nameVirtualenv management
Options:
-h
, --help
: Show this help message and exit.-p
, --project
: Specify another path as the project root, which changes the base ofpyproject.toml
and __pypackages__
[env var: PDM_PROJECT
]--path
: Show the path to the given virtualenv--python
: Show the python interpreter path for the given virtualenvCommands:
"},{"location":"reference/cli/#create","title":"create","text":"Create a virtualenv
Positional Arguments:
python
: Specify which python should be used to create the virtualenvvenv_args
: Additional arguments that will be passed to the backendOptions:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-w
, --with
: Specify the backend to create the virtualenv-f
, --force
: Recreate if the virtualenv already exists-n
, --name
: Specify the name of the virtualenv--with-pip
: Install pip with the virtualenvList all virtualenvs associated with this project
Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress outputRemove the virtualenv with the given name
Positional Arguments:
env
: The key of the virtualenvOptions:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-y
, --yes
: Answer yes on the following questionPrint the command to activate the virtualenv with the given name
Positional Arguments:
env
: The key of the virtualenvOptions:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress outputPurge selected/all created Virtualenvs
Options:
-h
, --help
: Show this help message and exit.-v
, --verbose
: Use -v
for detailed output and -vv
for more detailed-q
, --quiet
: Suppress output-f
, --force
: Force purging without prompting for confirmation-i
, --interactive
: Interactively purge selected VirtualenvsThe default theme used by PDM is as follows:
primary: cyan
success: green
warning: yellow
error: red
info: blue
req: bold green
You can change the theme colors with pdm config
command. For example, to change the primary
color to magenta
:
pdm config theme.primary magenta\n
Or use a hex color code:
pdm config theme.success '#51c7bd'\n
"},{"location":"reference/configuration/#available-configurations","title":"Available Configurations","text":"The following configuration items can be retrieved and modified by pdm config
command.
build_isolation
Isolate the build environment from the project environment Yes Yes PDM_BUILD_ISOLATION
cache_dir
The root directory of cached files The default cache location on OS No PDM_CACHE_DIR
check_update
Check if there is any newer version available True No PDM_CHECK_UPDATE
global_project.fallback
Use the global project implicitly if no local project is found False
No global_project.fallback_verbose
If True show message when global project is used implicitly True
No global_project.path
The path to the global project <default config location on OS>/global-project
No global_project.user_site
Whether to install to user site False
No install.cache
Enable caching of wheel installations False Yes install.cache_method
Specify how to create links to the caches (symlink/symlink_individual/hardlink/pth
) symlink
Yes install.parallel
Whether to perform installation and uninstallation in parallel True
Yes PDM_PARALLEL_INSTALL
python.use_pyenv
Use the pyenv interpreter True
Yes python.use_venv
Use virtual environments when available True
Yes PDM_USE_VENV
python.providers
List of python provider names for findpython All providers supported by findpython Yes pypi.url
The URL of PyPI mirror https://pypi.org/simple
Yes PDM_PYPI_URL
pypi.username
The username to access PyPI Yes PDM_PYPI_USERNAME
pypi.password
The password to access PyPI Yes PDM_PYPI_PASSWORD
pypi.ignore_stored_index
Ignore the configured indexes False
Yes PDM_IGNORE_STORED_INDEX
pypi.ca_certs
Path to a PEM-encoded CA cert bundle (used for server cert verification) The CA certificates from certifi Yes pypi.client_cert
Path to a PEM-encoded client cert and optional key No pypi.client_key
Path to a PEM-encoded client cert private key, if not in pypi.client_cert No pypi.verify_ssl
Verify SSL certificate when querying PyPI True
Yes pypi.json_api
Consult PyPI's JSON API for package metadata False
Yes PDM_PYPI_JSON_API
pypi.<name>.url
The URL of custom package source https://pypi.org/simple
Yes pypi.<name>.username
The username to access custom source Yes pypi.<name>.password
The password to access custom source Yes pypi.<name>.type
index
or find_links
index
Yes pypi.<name>.verify_ssl
Verify SSL certificate when querying custom source True
Yes strategy.save
Specify how to save versions when a package is added minimum
(can be: exact
, wildcard
, minimum
, compatible
) Yes strategy.update
The default strategy for updating packages reuse
(can be: eager
, reuse
, all
, reuse-installed
) Yes strategy.resolve_max_rounds
Specify the max rounds of resolution process 10000 Yes PDM_RESOLVE_MAX_ROUNDS
strategy.inherit_metadata
Inherit the groups and markers from parents for each package True
Yes venv.location
Parent directory for virtualenvs <default data location on OS>/venvs
No venv.backend
Default backend to create virtualenv virtualenv
Yes PDM_VENV_BACKEND
venv.prompt
Formatted string to be displayed in the prompt when virtualenv is active {project_name}-{python_version}
Yes PDM_VENV_PROMPT
venv.in_project
Create virtualenv in .venv
under project root True
Yes PDM_VENV_IN_PROJECT
venv.with_pip
Install pip when creating a new venv False
Yes PDM_VENV_WITH_PIP
repository.<name>.url
The URL of custom package source https://pypi.org/simple
Yes repository.<name>.username
The username to access custom repository Yes repository.<name>.password
The password to access custom repository Yes repository.<name>.ca_certs
Path to a PEM-encoded CA cert bundle (used for server cert verification) The CA certificates from certifi Yes repository.<name>.verify_ssl
Verify SSL certificate when uploading to repository True
Yes If the corresponding env var is set, the value will take precedence over what is saved in the config file.
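For instance, with pypi.url saved in the config file, exporting the matching env var overrides it for the current shell (a hedged sketch; the URLs are only illustrative):
pdm config pypi.url \"https://pypi.org/simple\"\nexport PDM_PYPI_URL=https://test.pypi.org/simple  # takes precedence over the saved value while set\n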
"},{"location":"reference/pep621/","title":"PEP 621 Metadata","text":"The project metadata are stored in the pyproject.toml
. The specifications are defined by PEP 621, PEP 631 and PEP 639. Read the detailed specifications in the PEPs.
In the following part of this document, metadata should be written under [project]
table if not given explicitly.
You can split a long description onto multiple lines, thanks to TOML support for multiline strings. Just remember to escape new lines, so the final description appears on one line only in your package metadata. Indentation will be removed as well when escaping new lines:
description = \"\"\"\\\n Lorem ipsum dolor sit amet, consectetur adipiscing elit, \\\n sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. \\\n Ut enim ad minim veniam, quis nostrud exercitation ullamco \\\n laboris nisi ut aliquip ex ea commodo consequat.\\\n\"\"\"\n
See TOML's specification on strings.
"},{"location":"reference/pep621/#package-version","title":"Package version","text":"StaticDynamic[project]\nversion = \"1.0.0\"\n
[project]\n...\ndynamic = [\"version\"]\n\n[tool.pdm]\nversion = { source = \"file\", path = \"mypackage/__version__.py\" }\n
The version will be read from the mypackage/__version__.py
file searching for the pattern: __version__ = \"{version}\"
.
Read more information about other configurations in dynamic project version from the pdm-backend
documentation.
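As a minimal sketch, the referenced file only needs to contain that assignment (mypackage is the package name from the example above):
# mypackage/__version__.py\n__version__ = \"1.0.0\"\n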
The required version of Python is specified as the string requires-python
:
requires-python = \">=3.9\"\nclassifiers = [\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.9\",\n \"Programming Language :: Python :: 3.10\",\n \"Programming Language :: Python :: 3.11\",\n ...\n]\n
Note: As per PEP 621, PDM is not permitted to dynamically update the classifiers
section like some other non-compliant tools. Thus, you should also include the appropriate trove classifiers as shown above if you plan on publishing your package on PyPI.
The license is specified as the string license
:
license = {text = \"BSD-2-Clause\"}\nclassifiers = [\n \"License :: OSI Approved :: BSD License\",\n ...\n]\n
Note: As per PEP 621, PDM is not permitted to dynamically update the classifiers
section like some other non-compliant tools. Thus, you should also include the appropriate trove classifiers as shown above if you plan on publishing your package on PyPI.
The project.dependencies
is an array of dependency specification strings following the PEP 440 and PEP 508.
Examples:
[project]\n...\ndependencies = [\n# Named requirement\n\"requests\",\n# Named requirement with version specifier\n\"flask >= 1.1.0\",\n# Requirement with environment marker\n\"pywin32; sys_platform == 'win32'\",\n# URL requirement\n\"pip @ git+https://github.com/pypa/pip.git@20.3.1\"\n]\n
"},{"location":"reference/pep621/#optional-dependencies","title":"Optional dependencies","text":"You can have some requirements optional, which is similar to setuptools
' extras_require
parameter.
[project.optional-dependencies]\nsocks = [ 'PySocks >= 1.5.6, != 1.5.7, < 2' ]\ntests = [\n'ddt >= 1.2.2, < 2',\n'pytest < 6',\n'mock >= 1.0.1, < 4; python_version < \"3.4\"',\n]\n
To install a group of optional dependencies:
pdm install -G socks\n
-G
option can be given multiple times to include more than one group.
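For example, to install both optional groups defined above in one command:
pdm install -G socks -G tests\n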
Depending on which build backend you are using, PDM will expand some variables in the dependency strings.
"},{"location":"reference/pep621/#environment-variables","title":"Environment variables","text":"pdm-backendhatchling[project]\ndependencies = [\"flask @ https://${USERNAME}:${PASSWORD}/artifacts.io/Flask-1.1.2.tar.gz\"]\n
[project]\ndependencies = [\"flask @ https://{env:USERNAME}:{env:PASSWORD}/artifacts.io/Flask-1.1.2.tar.gz\"]\n
Find more usages here
Don't worry about credential leakage, the environment variables will be expanded when needed and kept untouched in the lock file.
"},{"location":"reference/pep621/#relative-paths","title":"Relative paths","text":"When you add a package from a relative path, PDM will automatically save it as a relative path for pdm-backend
and hatchling
.
For example, if you run pdm add ./my-package
, it will result in the following line in pyproject.toml
.
[project]\ndependencies = [\"my-package @ file:///${PROJECT_ROOT}/my-package\"]\n
[project]\ndependencies = [\"my-package @ {root:uri}/my-package\"]\n
By default, hatchling doesn't support direct references in the dependency string, you need to turn it on in pyproject.toml
:
[tool.hatch.metadata]\nallow-direct-references = true\n
The relative path will be expanded based on the project root when installing or locking.
"},{"location":"reference/pep621/#console-scripts","title":"Console scripts","text":"The following content:
[project.scripts]\nmycli = \"mycli.__main__:main\"\n
will be translated to setuptools
style:
entry_points = {\n 'console_scripts': [\n 'mycli=mycli.__main__:main'\n ]\n}\n
Also, [project.gui-scripts]
will be translated to gui_scripts
entry points group in setuptools
style.
Other types of entry points are given by [project.entry-points.<type>]
section, with the same format of [project.scripts]
:
[project.entry-points.pytest11]\nmyplugin = \"mypackage.plugin:pytest_plugin\"\n
If the entry point name contains dots or other special characters, wrap it in quotes:
[project.entry-points.\"flake8.extension\"]\nmyplugin = \"mypackage.plugin:flake8_plugin\"\n
"},{"location":"usage/advanced/","title":"Advanced Usage","text":""},{"location":"usage/advanced/#automatic-testing","title":"Automatic Testing","text":""},{"location":"usage/advanced/#use-tox-as-the-runner","title":"Use Tox as the runner","text":"Tox is a great tool for testing against multiple Python versions or dependency sets. You can configure a tox.ini
like the following to integrate your testing with PDM:
[tox]\nenv_list = py{36,37,38},lint\n\n[testenv]\nsetenv =\nPDM_IGNORE_SAVED_PYTHON=\"1\"\ndeps = pdm\ncommands =\npdm install --dev\npytest tests\n\n[testenv:lint]\ndeps = pdm\ncommands =\npdm install -G lint\nflake8 src/\n
To use the virtualenv created by Tox, you should make sure you have set pdm config python.use_venv true
. PDM then will install dependencies from pdm lock
into the virtualenv. In the dedicated venv you can directly run tools by pytest tests/
instead of pdm run pytest tests/
.
You should also make sure you don't run pdm add/pdm remove/pdm update/pdm lock
in the test commands, otherwise the pdm lock
file will be modified unexpectedly. Additional dependencies can be supplied with the deps
config. Besides, isolated_build
and passenv
config should be set as the above example to make PDM work properly.
To get rid of these constraints, there is a Tox plugin tox-pdm which can ease the usage. You can install it by
pip install tox-pdm\n
Or,
pdm add --dev tox-pdm\n
And you can make the tox.ini
much tidier as following, :
[tox]\nenv_list = py{36,37,38},lint\n\n[testenv]\ngroups = dev\ncommands =\npytest tests\n\n[testenv:lint]\ngroups = lint\ncommands =\nflake8 src/\n
See the project's README for a detailed guidance.
"},{"location":"usage/advanced/#use-nox-as-the-runner","title":"Use Nox as the runner","text":"Nox is another great tool for automated testing. Unlike tox, Nox uses a standard Python file for configuration.
It is much easier to use PDM in Nox, here is an example of noxfile.py
:
import os\nimport nox\n\nos.environ.update({\"PDM_IGNORE_SAVED_PYTHON\": \"1\"})\n@nox.session\ndef tests(session):\n session.run_always('pdm', 'install', '-G', 'test', external=True)\n session.run('pytest')\n\n@nox.session\ndef lint(session):\n session.run_always('pdm', 'install', '-G', 'lint', external=True)\n session.run('flake8', '--import-order-style', 'google')\n
Note that PDM_IGNORE_SAVED_PYTHON
should be set so that PDM can pick up the Python in the virtualenv correctly. Also make sure pdm
is available in the PATH
. Before running nox, you should also ensure configuration item python.use_venv
is true to enable venv reusing.
__pypackages__
directory","text":"By default, if you run tools by pdm run
, __pypackages__
will be seen by the program and all subprocesses created by it. This means virtual environments created by those tools are also aware of the packages inside __pypackages__
, which result in unexpected behavior in some cases. For nox
, you can avoid this by adding a line in noxfile.py
:
os.environ.pop(\"PYTHONPATH\", None)\n
For tox
, PYTHONPATH
will not be passed to the test sessions so this isn't going to be a problem. Moreover, it is recommended to make nox
and tox
live in their own pipx environments so you don't need to install for every project. In this case, PEP 582 packages will not be a problem either.
Only one thing to keep in mind -- PDM can't be installed on Python < 3.7, so if your project is to be tested on those Python versions, you have to make sure PDM is installed on the correct Python version, which can be different from the target Python version the particular job/task is run on.
Fortunately, if you are using GitHub Action, there is pdm-project/setup-pdm to make this process easier. Here is an example workflow of GitHub Actions, while you can adapt it for other CI platforms.
Testing:\nruns-on: ${{ matrix.os }}\nstrategy:\nmatrix:\npython-version: [3.7, 3.8, 3.9, '3.10', '3.11']\nos: [ubuntu-latest, macOS-latest, windows-latest]\n\nsteps:\n- uses: actions/checkout@v3\n- name: Set up PDM\nuses: pdm-project/setup-pdm@v3\nwith:\npython-version: ${{ matrix.python-version }}\n\n- name: Install dependencies\nrun: |\npdm sync -d -G testing\n- name: Run Tests\nrun: |\npdm run -v pytest tests\n
TIPS
For GitHub Action users, there is a known compatibility issue on Ubuntu virtual environment. If PDM parallel install is failed on that machine you should either set parallel_install
to false
or set env LD_PRELOAD=/lib/x86_64-linux-gnu/libgcc_s.so.1
. It is already handled by the pdm-project/setup-pdm
action.
Note
If your CI scripts run without a proper user set, you might get permission errors when PDM tries to create its cache directory. To work around this, you can set the HOME environment variable yourself, to a writable directory, for example:
export HOME=/tmp/home\n
"},{"location":"usage/advanced/#use-pdm-in-a-multi-stage-dockerfile","title":"Use PDM in a multi-stage Dockerfile","text":"It is possible to use PDM in a multi-stage Dockerfile to first install the project and dependencies into __pypackages__
and then copy this folder into the final stage, adding it to PYTHONPATH
.
# build stage\nFROM python:3.8 AS builder\n\n# install PDM\nRUN pip install -U pip setuptools wheel\nRUN pip install pdm\n\n# copy files\nCOPY pyproject.toml pdm.lock README.md /project/\nCOPY src/ /project/src\n\n# install dependencies and project into the local packages directory\nWORKDIR /project\nRUN mkdir __pypackages__ && pdm sync --prod --no-editable\n\n\n# run stage\nFROM python:3.8\n\n# retrieve packages from build stage\nENV PYTHONPATH=/project/pkgs\nCOPY --from=builder /project/__pypackages__/3.8/lib /project/pkgs\n\n# retrieve executables\nCOPY --from=builder /project/__pypackages__/3.8/bin/* /bin/\n\n# set command/entrypoint, adapt to fit your needs\nCMD [\"python\", \"-m\", \"project\"]\n
"},{"location":"usage/advanced/#use-pdm-to-manage-a-monorepo","title":"Use PDM to manage a monorepo","text":"With PDM, you can have multiple sub-packages within a single project, each with its own pyproject.toml
file. And you can create only one pdm.lock
file to lock all dependencies. The sub-packages can have each other as their dependencies. To achieve this, follow these steps:
project/pyproject.toml
:
[tool.pdm.dev-dependencies]\ndev = [\n\"-e file:///${PROJECT_ROOT}/packages/foo-core\",\n\"-e file:///${PROJECT_ROOT}/packages/foo-cli\",\n\"-e file:///${PROJECT_ROOT}/packages/foo-app\",\n]\n
packages/foo-cli/pyproject.toml
:
[project]\ndependencies = [\"foo-core\"]\n
packages/foo-app/pyproject.toml
:
[project]\ndependencies = [\"foo-core\"]\n
Now, run pdm install
in the project root, and you will get a pdm.lock
with all dependencies locked. All sub-packages will be installed in editable mode.
Look at the \ud83d\ude80 Example repository for more details.
"},{"location":"usage/advanced/#hooks-for-pre-commit","title":"Hooks forpre-commit
","text":"pre-commit
is a powerful framework for managing git hooks in a centralized fashion. PDM already uses pre-commit
hooks for its internal QA checks. PDM exposes also several hooks that can be run locally or in CI pipelines.
requirements.txt
","text":"This hook wraps the command pdm export
along with any valid argument. It can be used as a hook (e.g., for CI) to ensure that you are going to check in the codebase a requirements.txt
, which reflects the actual content of pdm lock
.
# export python requirements\n- repo: https://github.com/pdm-project/pdm\nrev: 2.x.y # a PDM release exposing the hook\nhooks:\n- id: pdm-export\n# command arguments, e.g.:\nargs: ['-o', 'requirements.txt', '--without-hashes']\nfiles: ^pdm.lock$\n
"},{"location":"usage/advanced/#check-pdmlock-is-up-to-date-with-pyprojecttoml","title":"Check pdm.lock
is up to date with pyproject.toml","text":"This hook wraps the command pdm lock --check
along with any valid argument. It can be used as a hook (e.g., for CI) to ensure that whenever pyproject.toml
has a dependency added/changed/removed, that pdm.lock
is also up to date.
- repo: https://github.com/pdm-project/pdm\nrev: 2.x.y # a PDM release exposing the hook\nhooks:\n- id: pdm-lock-check\n
"},{"location":"usage/advanced/#sync-current-working-set-with-pdmlock","title":"Sync current working set with pdm.lock
","text":"This hook wraps the command pdm sync
along with any valid argument. It can be used as a hook to ensure that your current working set is synced with pdm.lock
whenever you checkout or merge a branch. Add keyring to additional_dependencies
if you want to use your systems credential store.
- repo: https://github.com/pdm-project/pdm\nrev: 2.x.y # a PDM release exposing the hook\nhooks:\n- id: pdm-sync\nadditional_dependencies:\n- keyring\n
"},{"location":"usage/config/","title":"Configure the Project","text":"PDM's config
command works just like git config
, except that --list
isn't needed to show configurations.
Show the current configurations:
pdm config\n
Get one single configuration:
pdm config pypi.url\n
Change a configuration value and store in home configuration:
pdm config pypi.url \"https://test.pypi.org/simple\"\n
By default, the configuration are changed globally, if you want to make the config seen by this project only, add a --local
flag:
pdm config --local pypi.url \"https://test.pypi.org/simple\"\n
Any local configurations will be stored in pdm.toml
under the project root directory.
The configuration files are searched in the following order:
<PROJECT_ROOT>/pdm.toml
- The project configuration<CONFIG_ROOT>/config.toml
- The home configuration<SITE_CONFIG_ROOT>/config.toml
- The site configurationwhere <CONFIG_ROOT>
is:
$XDG_CONFIG_HOME/pdm
(~/.config/pdm
in most cases) on Linux as defined by XDG Base Directory Specification~/Library/Application Support/pdm
on macOS as defined by Apple File System Basics%USERPROFILE%\\AppData\\Local\\pdm
on Windows as defined in Known foldersand <SITE_CONFIG_ROOT>
is:
$XDG_CONFIG_DIRS/pdm
(/etc/xdg/pdm
in most cases) on Linux as defined by XDG Base Directory Specification/Library/Application Support/pdm
on macOS as defined by Apple File System BasicsC:\\ProgramData\\pdm\\pdm
on Windows as defined in Known foldersIf -g/--global
option is used, the first item will be replaced by <CONFIG_ROOT>/global-project/pdm.toml
.
You can find all available configuration items in Configuration Page.
"},{"location":"usage/config/#configure-the-python-finder","title":"Configure the Python finder","text":"By default, PDM will try to find Python interpreters in the following sources:
venv
: The PDM virtualenv locationpath
: The PATH
environment variablepyenv
: The pyenv install rootrye
: The rye toolchain install rootasdf
: The asdf python install rootwinreg
: The Windows registryYou can unselect some of them or change the order by setting python.providers
config key:
pdm config python.providers rye # Rye source only\npdm config python.providers pyenv,asdf # pyenv and asdf\n
"},{"location":"usage/config/#allow-prereleases-in-resolution-result","title":"Allow prereleases in resolution result","text":"By default, pdm
's dependency resolver will ignore prereleases unless there are no stable versions for the given version range of a dependency. This behavior can be changed by setting allow_prereleases
to true
in [tool.pdm]
table:
[tool.pdm]\nallow_prereleases = true\n
"},{"location":"usage/config/#configure-the-package-indexes","title":"Configure the package indexes","text":"You can tell PDM where to to find the packages by either specifying sources in the pyproject.toml
or via pypi.*
configurations.
Add sources in pyproject.toml
:
[[tool.pdm.source]]\nname = \"private\"\nurl = \"https://private.pypi.org/simple\"\nverify_ssl = true\n
Change the default index via pdm config
:
pdm config pypi.url \"https://test.pypi.org/simple\"\n
Add extra indexes via pdm config
:
pdm config pypi.extra.url \"https://extra.pypi.org/simple\"\n
The available configuration options are:
url
: The URL of the indexverify_ssl
: (Optional)Whether to verify SSL certificates, default to trueusername
: (Optional)The username for the indexpassword
: (Optional)The password for the indextype
: (Optional) index or find_links, default to indexBy default, all sources are PEP 503 style \"indexes\" like pip's --index-url
and --extra-index-url
, however, you can set the type to find_links
which contains files or links to be looked for directly. See this answer for the difference between the two types.
These configurations are read in the following order to build the final source list:
pypi.url
, if pypi
doesn't appear in the name
field of any source in pyproject.toml
pyproject.toml
pypi.<name>.url
in PDM config.You can set pypi.ignore_stored_index
to true
to disable all indexes from the PDM config and only use those specified in pyproject.toml
.
Disable the default PyPI index
If you want to omit the default PyPI index, just set the source name to pypi
and that source will replace it.
[[tool.pdm.source]]\nurl = \"https://private.pypi.org/simple\"\nverify_ssl = true\nname = \"pypi\"\n
Indexes in pyproject.toml
or config When you want to share the indexes with other people who are going to use the project, you should add them in pyproject.toml
. For example, some packages only exist in a private index and can't be installed if someone doesn't configure the index. Otherwise, store them in the local config which won't be seen by others.
By default, all sources are considered equal, packages from them are sorted by the version and wheel tags, the most matching one with the highest version is selected.
In some cases you may want to return packages from the preferred source, and search for others if they are missing from the former source. PDM supports this by reading the configuration respect-source-order
:
[tool.pdm.resolution]\nrespect-source-order = true\n
"},{"location":"usage/config/#specify-index-for-individual-packages","title":"Specify index for individual packages","text":"You can bind packages to specific sources with include_packages
and exclude_packages
config under tool.pdm.source
table.
[[tool.pdm.source]]\nname = \"private\"\nurl = \"https://private.pypi.org/simple\"\ninclude_packages = [\"foo\", \"foo-*\"]\nexclude_packages = [\"bar-*\"]\n
With the above configuration, any package matching foo
or foo-*
will only be searched from the private
index, and any package matching bar-*
will be searched from all indexes except private
.
Both include_packages
and exclude_packages
are optional and accept a list of glob patterns, and include_packages
takes effect exclusively when the pattern matches.
You can specify credentials in the URL with ${ENV_VAR}
variable expansion and these variables will be read from the environment variables:
[[tool.pdm.source]]\nname = \"private\"\nurl = \"https://${PRIVATE_PYPI_USERNAME}:${PRIVATE_PYPI_PASSWORD}@private.pypi.org/simple\"\n
"},{"location":"usage/config/#configure-https-certificates","title":"Configure HTTPS certificates","text":"You can use a custom CA bundle or client certificate for HTTPS requests. It can be configured for both indexes(for package download) and repositories(for upload):
pdm config pypi.ca_certs /path/to/ca_bundle.pem\npdm config repository.pypi.ca_certs /path/to/ca_bundle.pem\n
Besides, it is possible to use the system trust store, instead of the bundled certifi certificates for verifying HTTPS certificates. This approach will typically support corporate proxy certificates without additional configuration.
To use truststore
, you need Python 3.10 or newer and install truststore
into the same environment as PDM:
$ pdm self add truststore\n
"},{"location":"usage/config/#index-configuration-merging","title":"Index configuration merging","text":"Index configurations are merged with the name
field of [[tool.pdm.source]]
table or pypi.<name>
key in the config file. This enables you to store the url and credentials separately, to avoid secrets being exposed in the source control. For example, if you have the following configuration:
[[tool.pdm.source]]\nname = \"private\"\nurl = \"https://private.pypi.org/simple\"\n
You can store the credentials in the config file:
pdm config pypi.private.username \"foo\"\npdm config pypi.private.password \"bar\"\n
PDM can retrieve the configurations for private
index from both places.
If the index requires a username and password, but they can't be found from the environment variables nor config file, PDM will prompt you to enter them. Or, if keyring
is installed, it will be used as the credential store. PDM can use the keyring
from either the installed package or the CLI.
If a package is required by many projects on the system, each project has to keep its own copy. This can be a waste of disk space, especially for data science and machine learning projects.
PDM supports caching installations of the same wheel by installing it in a centralized package repository and linking to that installation in different projects. To enable it, run:
pdm config install.cache on\n
It can be enabled on a per-project basis by adding the --local
option to the command.
The caches are located in $(pdm config cache_dir)/packages
. You can view the cache usage with pdm cache info
. Note that the cached installs are managed automatically -- they will be deleted if they are not linked to any projects. Manually deleting the caches from disk may break some projects on the system.
In addition, several different ways of linking to cache entries are supported:
symlink
(default), create symlinks to the package directories or children if the parent is a namespace package.symlink_individual
, for each individual files in the package directory, create a symlink to it.hardlink
, create hard links to the package files of the cache entry.You can switch between them by running pdm config [-l] install.cache_method <method>
.
Note
Only the installation of named requirements resolved from PyPI can be cached.
"},{"location":"usage/config/#configure-the-repositories-for-upload","title":"Configure the repositories for upload","text":"When using the pdm publish
command, it reads the repository secrets from the global config file(<CONFIG_ROOT>/config.toml
). The content of the config is as follows:
[repository.pypi]\nusername = \"frostming\"\npassword = \"<secret>\"\n\n[repository.company]\nurl = \"https://pypi.company.org/legacy/\"\nusername = \"frostming\"\npassword = \"<secret>\"\nca_certs = \"/path/to/custom-cacerts.pem\"\n
Alternatively, these credentials can be provided with env vars:
export PDM_PUBLISH_REPO=...\nexport PDM_PUBLISH_USERNAME=...\nexport PDM_PUBLISH_PASSWORD=...\nexport PDM_PUBLISH_CA_CERTS=...\n
A PEM-encoded Certificate Authority bundle (ca_certs
) can be used for local / custom PyPI repositories where the server certificate is not signed by the standard certifi CA bundle.
Note
Repositories are different from indexes in the previous section. Repositories are for publishing while indexes are for locking and resolving. They don't share the configuration.
Tip
You don't need to configure the url
for pypi
and testpypi
repositories, they are filled by default values. The username, password, and certificate authority bundle can be passed in from the command line for pdm publish
via --username
, --password
, and --ca-certs
, respectively.
To change the repository config from the command line, use the pdm config
command:
pdm config repository.pypi.username \"__token__\"\npdm config repository.pypi.password \"my-pypi-token\"\n\npdm config repository.company.url \"https://pypi.company.org/legacy/\"\npdm config repository.company.ca_certs \"/path/to/custom-cacerts.pem\"\n
"},{"location":"usage/config/#password-management-with-keyring","title":"Password management with keyring","text":"When keyring is available and supported, the passwords will be stored to and retrieved from the keyring instead of writing to the config file. This supports both indexes and upload repositories. The service name will be pdm-pypi-<name>
for an index and pdm-repository-<name>
for a repository.
You can enable keyring by either installing keyring
into the same environment as PDM or installing globally. To add keyring to the PDM environment:
pdm self add keyring\n
Alternatively, if you have installed a copy of keyring globally, make sure the CLI is exposed in the PATH
env var to make it discoverable by PDM:
export PATH=$PATH:path/to/keyring/bin\n
"},{"location":"usage/config/#override-the-resolved-package-versions","title":"Override the resolved package versions","text":"New in version 1.12.0
Sometimes you can't get a dependency resolution due to incorrect version ranges set by upstream libraries that you can't fix. In this case you can use PDM's overrides feature to force a specific version of a package to be installed.
Given the following configuration in pyproject.toml
:
[tool.pdm.resolution.overrides]\nasgiref = \"3.2.10\" # exact version\nurllib3 = \">=1.26.2\" # version range\npytz = \"https://mypypi.org/packages/pytz-2020.9-py3-none-any.whl\" # absolute URL\n
Each entry of that table is a package name with the wanted version. In this example, PDM will resolve the above packages into the given versions no matter whether there is any other resolution available.
Warning
By using [tool.pdm.resolution.overrides]
setting, you are at your own risk of any incompatibilities from that resolution. It can only be used if there is no valid resolution for your requirements and you know the specific version works. Most of the time, you can just add any transient constraints to the dependencies
array.
New in version 2.7.0
You can add extra options passed to individual pdm commands by tool.pdm.options
configuration:
[tool.pdm.options]\nadd = [\"--no-isolation\", \"--no-self\"]\ninstall = [\"--no-self\"]\nlock = [\"--no-cross-platform\"]\n
These options will be added right after the command name. For instance, based on the configuration above, pdm add requests
is equivalent to pdm add --no-isolation --no-self requests
.
New in version 2.10.0
You may see some warnings when resolving dependencies like this:
PackageWarning: Skipping scipy@1.10.0 because it requires Python\n<3.12,>=3.8 but the project claims to work with Python>=3.9.\nNarrow down the `requires-python` range to include this version. For example, \">=3.9,<3.12\" should work.\n warnings.warn(record.message, PackageWarning, stacklevel=1)\nUse `-q/--quiet` to suppress these warnings, or ignore them per-package with `ignore_package_warnings` config in [tool.pdm] table.\n
This is because the supported range of Python versions of the package doesn't cover the requires-python
value specified in the pyproject.toml
. You can ignore these warnings in a per-package basis by adding the following config:
[tool.pdm]\nignore_package_warnings = [\"scipy\", \"tensorflow-*\"]\n
Where each item is a case-insensitive glob pattern to match the package name.
"},{"location":"usage/dependency/","title":"Manage Dependencies","text":"PDM provides a bunch of handful commands to help manage your project and dependencies. The following examples are run on Ubuntu 18.04, a few changes must be done if you are using Windows.
"},{"location":"usage/dependency/#add-dependencies","title":"Add dependencies","text":"pdm add
can be followed by one or several dependencies, and the dependency specification is described in PEP 508.
Examples:
pdm add requests # add requests\npdm add requests==2.25.1 # add requests with version constraint\npdm add requests[socks] # add requests with extra dependency\npdm add \"flask>=1.0\" flask-sqlalchemy # add multiple dependencies with different specifiers\n
PDM also allows extra dependency groups by providing -G/--group <name>
option, and those dependencies will go to [project.optional-dependencies.<name>]
table in the project file, respectively.
You can reference other optional groups in optional-dependencies
, even before the package is uploaded:
[project]\nname = \"foo\"\nversion = \"0.1.0\"\n\n[project.optional-dependencies]\nsocks = [\"pysocks\"]\njwt = [\"pyjwt\"]\nall = [\"foo[socks,jwt]\"]\n
After that, dependencies and sub-dependencies will be resolved properly and installed for you, you can view pdm.lock
to see the resolved result of all dependencies.
Local packages can be added with their paths. The path can be a file or a directory:
pdm add ./sub-package\npdm add ./first-1.0.0-py2.py3-none-any.whl\n
The paths MUST start with a .
, otherwise it will be recognized as a normal named requirement. The local dependencies will be written to the pyproject.toml
file with the URL format:
[project]\ndependencies = [\n\"sub-package @ file:///${PROJECT_ROOT}/sub-package\",\n\"first @ file:///${PROJECT_ROOT}/first-1.0.0-py2.py3-none-any.whl\",\n]\n
Using other build backends If you are using hatchling
instead of the pdm backend, the URLs would be as follows:
sub-package @ {root:uri}/sub-package\nfirst @ {root:uri}/first-1.0.0-py2.py3-none-any.whl\n
Other backends doesn't support encoding relative paths in the URL and will write the absolute path instead."},{"location":"usage/dependency/#url-dependencies","title":"URL dependencies","text":"PDM also supports downloading and installing packages directly from a web address.
Examples:
# Install gzipped package from a plain URL\npdm add \"https://github.com/numpy/numpy/releases/download/v1.20.0/numpy-1.20.0.tar.gz\"\n# Install wheel from a plain URL\npdm add \"https://github.com/explosion/spacy-models/releases/download/en_core_web_trf-3.5.0/en_core_web_trf-3.5.0-py3-none-any.whl\"\n
"},{"location":"usage/dependency/#vcs-dependencies","title":"VCS dependencies","text":"You can also install from a git repository url or other version control systems. The following are supported:
git
hg
svn
bzr
The URL should be like: {vcs}+{url}@{rev}
Examples:
# Install pip repo on tag `22.0`\npdm add \"git+https://github.com/pypa/pip.git@22.0\"\n# Provide credentials in the URL\npdm add \"git+https://username:password@github.com/username/private-repo.git@master\"\n# Give a name to the dependency\npdm add \"pip @ git+https://github.com/pypa/pip.git@22.0\"\n# Or use the #egg fragment\npdm add \"git+https://github.com/pypa/pip.git@22.0#egg=pip\"\n# Install from a subdirectory\npdm add \"git+https://github.com/owner/repo.git@master#egg=pkg&subdirectory=subpackage\"\n
"},{"location":"usage/dependency/#hide-credentials-in-the-url","title":"Hide credentials in the URL","text":"You can hide the credentials in the URL by using the ${ENV_VAR}
variable syntax:
[project]\ndependencies = [\n\"mypackage @ git+http://${VCS_USER}:${VCS_PASSWD}@test.git.com/test/mypackage.git@master\"\n]\n
These variables will be read from the environment variables when installing the project.
"},{"location":"usage/dependency/#add-development-only-dependencies","title":"Add development only dependencies","text":"New in 1.5.0
PDM also supports defining groups of dependencies that are useful for development, e.g. some for testing and others for linting. We usually don't want these dependencies appear in the distribution's metadata so using optional-dependencies
is probably not a good idea. We can define them as development dependencies:
pdm add -dG test pytest\n
This will result in a pyproject.toml as following:
[tool.pdm.dev-dependencies]\ntest = [\"pytest\"]\n
You can have several groups of development only dependencies. Unlike optional-dependencies
, they won't appear in the package distribution metadata such as PKG-INFO
or METADATA
. The package index won't be aware of these dependencies. The schema is similar to that of optional-dependencies
, except that it is in tool.pdm
table.
[tool.pdm.dev-dependencies]\nlint = [\n\"flake8\",\n\"black\"\n]\ntest = [\"pytest\", \"pytest-cov\"]\ndoc = [\"mkdocs\"]\n
For backward-compatibility, if only -d
or --dev
is specified, dependencies will go to dev
group under [tool.pdm.dev-dependencies]
by default. Note
The same group name MUST NOT appear in both [tool.pdm.dev-dependencies]
and [project.optional-dependencies]
.
Local directories and VCS dependencies can be installed in editable mode. If you are familiar with pip
, it is just like pip install -e <package>
. Editable packages are allowed only in development dependencies:
Note
Editable installs are only allowed in the dev
dependency group. Other groups, including the default, will fail with a [PdmUsageError]
.
# A relative path to the directory\npdm add -e ./sub-package --dev\n# A file URL to a local directory\npdm add -e file:///path/to/sub-package --dev\n# A VCS URL\npdm add -e git+https://github.com/pallets/click.git@main#egg=click --dev\n
"},{"location":"usage/dependency/#save-version-specifiers","title":"Save version specifiers","text":"If the package is given without a version specifier like pdm add requests
. PDM provides three different behaviors of what version specifier is saved for the dependency, which is given by --save-<strategy>
(Assume 2.21.0
is the latest version that can be found for the dependency):
minimum
: Save the minimum version specifier: >=2.21.0
(default).compatible
: Save the compatible version specifier: >=2.21.0,<3.0.0
.exact
: Save the exact version specifier: ==2.21.0
.wildcard
: Don't constrain version and leave the specifier to be wildcard: *
.One can give --pre/--prerelease
option to pdm add
so that prereleases are allowed to be pinned for the given packages.
To update all dependencies in the lock file:
pdm update\n
To update the specified package(s):
pdm update requests\n
To update multiple groups of dependencies:
pdm update -G security -G http\n
Or using comma-separated list:
pdm update -G \"security,http\"\n
To update a given package in the specified group:
pdm update -G security cryptography\n
If the group is not given, PDM will search for the requirement in the default dependencies set and raises an error if none is found.
To update packages in development dependencies:
# Update all default + dev-dependencies\npdm update -d\n# Update a package in the specified group of dev-dependencies\npdm update -dG test pytest\n
"},{"location":"usage/dependency/#about-update-strategy","title":"About update strategy","text":"Similarly, PDM also provides 3 different behaviors of updating dependencies and sub-dependencies\uff0c which is given by --update-<strategy>
option:
reuse
: Keep all locked dependencies except for those given in the command line (default).reuse-installed
: Try to reuse the versions installed in the working set. This will also affect the packages requested in the command line.eager
: Try to lock a newer version of the packages in command line and their recursive sub-dependencies and keep other dependencies as they are.all
: Update all dependencies and sub-dependencies.One can give -u/--unconstrained
to tell PDM to ignore the version specifiers in the pyproject.toml
. This works similarly to the yarn upgrade -L/--latest
command. Besides, pdm update
also supports the --pre/--prerelease
option.
To remove existing dependencies from project file and the library directory:
# Remove requests from the default dependencies\npdm remove requests\n# Remove h11 from the 'web' group of optional-dependencies\npdm remove -G web h11\n# Remove pytest-cov from the `test` group of dev-dependencies\npdm remove -dG test pytest-cov\n
"},{"location":"usage/dependency/#install-the-packages-pinned-in-lock-file","title":"Install the packages pinned in lock file","text":"There are a few similar commands to do this job with slight differences:
pdm sync
installs packages from the lock file.pdm update
will update the lock file, then sync
.pdm install
will check the project file for changes, update the lock file if needed, then sync
.sync
also has a few options to manage installed packages:
--clean
: will remove packages no longer in the lockfile--only-keep
: only selected packages (using options like -G
or --prod
) will be kept.You can specify another lockfile than the default pdm lock
by using the -L/--lockfile <filepath>
option or the PDM_LOCKFILE
environment variable.
Say we have a project with following dependencies:
[project] # This is production dependencies\ndependencies = [\"requests\"]\n\n[project.optional-dependencies] # This is optional dependencies\nextra1 = [\"flask\"]\nextra2 = [\"django\"]\n\n[tool.pdm.dev-dependencies] # This is dev dependencies\ndev1 = [\"pytest\"]\ndev2 = [\"mkdocs\"]\n
Command What it does Comments pdm install
install all groups locked in the lockfile pdm install -G extra1
install prod deps, dev deps, and \"extra1\" optional group pdm install -G dev1
install prod deps and only \"dev1\" dev group pdm install -G:all
install prod deps, dev deps and \"extra1\", \"extra2\" optional groups pdm install -G extra1 -G dev1
install prod deps, \"extra1\" optional group and only \"dev1\" dev group pdm install --prod
install prod only pdm install --prod -G extra1
install prod deps and \"extra1\" optional pdm install --prod -G dev1
Fail, --prod
can't be given with dev dependencies Leave the --prod
option All development dependencies are included as long as --prod
is not passed and -G
doesn't specify any dev groups.
Besides, if you don't want the root project to be installed, add --no-self
option, and --no-editable
can be used when you want all packages to be installed in non-editable versions.
You may also use the pdm lock command with these options to lock only the specified groups, which will be recorded in the [metadata]
table of the lock file. If no --group/--prod/--dev/--no-default
option is specified, pdm sync
and pdm update
will operate using the groups in the lockfile. However, if any groups that are not included in the lockfile are given as arguments to the commands, PDM will raise an error.
This feature is especially valuable when managing multiple lockfiles, where each may have different versions of the same package pinned. To switch between lockfiles, you can use the --lockfile/-L
option.
For a realistic example, your project depends on a release version of werkzeug
and you may want to work with a local in-development copy of it when developing. You can add the following to your pyproject.toml
:
[project]\nrequires-python = \">=3.7\"\ndependencies = [\"werkzeug\"]\n\n[tool.pdm.dev-dependencies]\ndev = [\"werkzeug @ file:///${PROJECT_ROOT}/dev/werkzeug\"]\n
Then, run pdm lock
with different options to generate lockfiles for different purposes:
# Lock default + dev, write to pdm.lock\n# with the local copy of werkzeug pinned.\npdm lock\n# Lock default, write to pdm.prod.lock\n# with the release version of werkzeug pinned.\npdm lock --prod -L pdm.prod.lock\n
Check the metadata.groups
field in the lockfile to see which groups are included.
Currently, we support three flags to control the locking behavior: cross_platform
, static_urls
and direct_minimal_versions
, with the meanings as follows. You can pass one or more flags to pdm lock
by --strategy/-S
option, either by giving a comma-separated list or by passing the option multiple times. Both of these commands function in the same way:
pdm lock -S cross_platform,static_urls\npdm lock -S cross_platform -S static_urls\n
The flags will be encoded in the lockfile and get read when you run pdm lock
next time. But you can disable flags by prefixing the flag name with no_
:
pdm lock -S no_cross_platform\n
This command makes the lockfile not cross-platform.
"},{"location":"usage/dependency/#cross-platform","title":"Cross platform","text":"New in version 2.6.0
By default, the generated lockfile is cross-platform, which means the current platform isn't taken into account when resolving the dependencies. The result lockfile will contain wheels and dependencies for all possible platforms and Python versions. However, sometimes this will result in a wrong lockfile when a release doesn't contain all wheels. To avoid this, you can tell PDM to create a lockfile that works for this platform only, trimming the wheels not relevant to the current platform. This can be done by passing the --strategy no_cross_platform
option to pdm lock
:
pdm lock --strategy no_cross_platform\n
"},{"location":"usage/dependency/#static-urls","title":"Static URLs","text":"New in version 2.8.0
By default, PDM only stores the filenames of the packages in the lockfile, which benefits the reusability across different package indexes. However, if you want to store the static URLs of the packages in the lockfile, you can pass the --strategy static_urls
option to pdm lock
:
pdm lock --strategy static_urls\n
The settings will be saved and remembered for the same lockfile. You can also pass --strategy no_static_urls
to disable it.
New in version 2.10.0
When it is enabled by passing --strategy direct_minimal_versions
, dependencies specified in the pyproject.toml
will be resolved to the minimal versions available, rather than the latest versions. This is useful when you want to test the compatibility of your project within a range of dependency versions.
For example, if you specified flask>=2.0
in the pyproject.toml
, flask
will be resolved to version 2.0.0
if there is no other compatibility issue.
Note
Version constraints in package dependencies are not future-proof. If you resolve the dependencies to the minimal versions, there will likely be backwards-compatibility issues. For example, flask==2.0.0
requires werkzeug>=2.0
, but in fact, it can not work with Werkzeug 3.0.0
, which is released 2 years after it.
New in version 2.11.0
Previously, the pdm lock
command would record package metadata as it is. When installing, PDM would start from the top requirements and traverse down to the leaf node of the dependency tree. It would then evaluate any marker it encounters against the current environment. If a marker is not satisfied, the package would be discarded. In other words, we need an additional \"resolution\" step in installation.
When the inherit_metadata
strategy is enabled, PDM will inherit and merge environment markers from a package's ancestors. These markers are then encoded in the lockfile during locking, resulting in faster installations. This has been enabled by default from version 2.11.0
, to disable this strategy in the config, use pdm config strategy.inherit_metadata false
.
Similar to pip list
, you can list all packages installed in the packages directory:
pdm list\n
"},{"location":"usage/dependency/#include-and-exclude-groups","title":"Include and exclude groups","text":"By default, all packages installed in the working set will be listed. You can specify which groups to be listed by --include/--exclude
options, and include
has a higher priority than exclude
.
pdm list --include dev\npdm list --exclude test\n
There is a special group :sub
, when included, all transitive dependencies will also be shown. It is included by default.
You can also pass --resolve
to pdm list
, which will show the packages resolved in pdm.lock
, rather than installed in the working set.
By default, name, version and location will be shown in the list output, you can view more fields or specify the order of fields by --fields
option:
pdm list --fields name,licenses,version\n
For all supported fields, please refer to the CLI reference.
Also, you can specify the output format other than the default table output. The supported formats and options are --csv
, --json
, --markdown
and --freeze
.
Or show a dependency tree by:
$ pdm list --tree\ntempenv 0.0.0\n\u2514\u2500\u2500 click 7.0 [ required: <7.0.0,>=6.7 ]\nblack 19.10b0\n\u251c\u2500\u2500 appdirs 1.4.3 [ required: Any ]\n\u251c\u2500\u2500 attrs 19.3.0 [ required: >=18.1.0 ]\n\u251c\u2500\u2500 click 7.0 [ required: >=6.5 ]\n\u251c\u2500\u2500 pathspec 0.7.0 [ required: <1,>=0.6 ]\n\u251c\u2500\u2500 regex 2020.2.20 [ required: Any ]\n\u251c\u2500\u2500 toml 0.10.0 [ required: >=0.9.4 ]\n\u2514\u2500\u2500 typed-ast 1.4.1 [ required: >=1.4.0 ]\nbump2version 1.0.0\n
Note that --fields
option doesn't work with --tree
.
You can also limit the packages to show by passing the patterns to pdm list
:
pdm list flask-* requests-*\n
Be careful with the shell expansion: in most shells, the wildcard *
will be expanded if there are matching files under the current directory. To avoid getting unexpected results, you can wrap the patterns with single quotes: pdm list 'flask-*' 'requests-*'
.
In --tree
mode, only the subtree of the matched packages will be displayed. This can be used to achieve the same purpose as pnpm why
, which is to show why a specific package is required.
$ pdm list --tree --reverse certifi\ncertifi 2023.7.22\n\u2514\u2500\u2500 requests 2.31.0 [ requires: >=2017.4.17 ]\n\u2514\u2500\u2500 cachecontrol[filecache] 0.13.1 [ requires: >=2.16.0 ]\n
"},{"location":"usage/dependency/#allow-prerelease-versions-to-be-installed","title":"Allow prerelease versions to be installed","text":"Include the following setting in pyproject.toml
to enable:
[tool.pdm]\nallow_prereleases = true\n
"},{"location":"usage/dependency/#set-acceptable-format-for-locking-or-installing","title":"Set acceptable format for locking or installing","text":"If you want to control the format(binary/sdist) of the packages, you can set the env vars PDM_NO_BINARY
and PDM_ONLY_BINARY
.
Each env var is a comma-separated list of package names. You can set it to :all:
to apply to all packages. For example:
# No binary for werkzeug will be locked nor used for installation\nPDM_NO_BINARY=werkzeug pdm add flask\n# Only binaries will be locked in the lock file\nPDM_ONLY_BINARY=:all: pdm lock\n# No binaries will be used for installation\nPDM_NO_BINARY=:all: pdm install\n# Prefer binary distributions, even if an sdist with a higher version is available\nPDM_PREFER_BINARY=flask pdm install\n
"},{"location":"usage/dependency/#solve-the-locking-failure","title":"Solve the locking failure","text":"If PDM is not able to find a resolution to satisfy the requirements, it will raise an error. For example,
pdm add django==3.1.4 \"asgiref<3\"\n...\n\ud83d\udd12 Lock failed\nUnable to find a resolution for asgiref because of the following conflicts:\n asgiref<3 (from project)\nasgiref<4,>=3.2.10 (from <Candidate django 3.1.4 from https://pypi.org/simple/django/>)\nTo fix this, you could loosen the dependency version constraints in pyproject.toml. If that is not possible, you could also override the resolved version in `[tool.pdm.resolution.overrides]` table.\n
You can either change to a lower version of django
or remove the upper bound of asgiref
. But if it is not eligible for your project, you can try overriding the resolved package versions in pyproject.toml
.
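As a minimal sketch, an override entry maps a package name to the version you want to force; the pinned version below is only an illustration and bypasses the declared constraints:
[tool.pdm.resolution.overrides]\nasgiref = \"3.2.10\"\n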
Sometimes users may want to keep track of the dependencies of the global Python interpreter as well. It is easy to do so with PDM, via the -g/--global
option which is supported by most subcommands.
If the option is passed, <CONFIG_ROOT>/global-project
will be used as the project directory, which is almost the same as a normal project except that pyproject.toml
will be created automatically for you and it doesn't support build features. The idea is taken from Haskell's stack.
However, unlike stack
, by default, PDM won't use global project automatically if a local project is not found. Users should pass -g/--global
explicitly to activate it, since it is not very pleasing if packages go to the wrong place. But PDM also leaves the decision to users: just set the config global_project.fallback
to true
.
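For example:
pdm config global_project.fallback true\n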
By default, when pdm
uses the global project implicitly, the following message is printed: Project is not found, fallback to the global project
. To disable this message set the config global_project.fallback_verbose
to false
.
If you want global project to track another project file other than <CONFIG_ROOT>/global-project
, you can provide the project path via -p/--project <path>
option. Especially if you pass --global --project .
, PDM will install the dependencies of the current project into the global Python.
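For example:
pdm install --global --project .\n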
Warning
Be careful with remove
and sync --clean/--pure
commands when global project is used, because it may remove packages installed in your system Python.
You can also export pdm lock
to other formats, to ease the CI flow or image building process. Currently, only requirements.txt
format is supported:
pdm export -o requirements.txt\n
Note
You can also run pdm export
with a .pre-commit
hook.
As any Python deliverable, your project will go through the different phases of a Python project lifecycle and PDM provides commands to perform the expected tasks for those phases.
It also provides hooks attached to these steps allowing for:
Besides, pre_invoke
signal is emitted before ANY command is invoked, allowing plugins to modify the project or options beforehand.
The built-in commands are currently split into 3 groups:
You will most probably need to perform some recurrent tasks between the installation and publication phases (housekeeping, linting, testing, ...). This is why PDM lets you define your own tasks/phases using user scripts.
To provide full flexibility, PDM allows skipping some hooks and tasks on demand.
"},{"location":"usage/hooks/#initialization","title":"Initialization","text":"The initialization phase should occur only once in a project lifetime by running the pdm init
command to initialize an existing project (prompt to fill the pyproject.toml
file).
They trigger the following hooks:
post_init
flowchart LR\n subgraph pdm-init [pdm init]\n direction LR\n post-init{{Emit post_init}}\n init --> post-init\n end
"},{"location":"usage/hooks/#dependencies-management","title":"Dependencies management","text":"The dependencies management is required for the developer to be able to work and perform the following:
lock
: compute a lock file from the pyproject.toml
requirements.sync
: synchronize (add/remove/update) PEP582 packages from the lock file and install the current project as editable.add
: add a dependencyremove
: remove a dependencyAll those steps are directly available with the following commands:
pdm lock
: execute the lock
taskpdm sync
: execute the sync
taskpdm install
: execute the sync
task, preceded from lock
if requiredpdm add
: add a dependency requirement, re-lock and then syncpdm remove
: remove a dependency requirement, re-lock and then syncpdm update
: re-lock dependencies from their latest versions and then syncThey trigger the following hooks:
pre_install
post_install
pre_lock
post_lock
flowchart LR\n subgraph pdm-install [pdm install]\n direction LR\n\n subgraph pdm-lock [pdm lock]\n direction TB\n pre-lock{{Emit pre_lock}}\n post-lock{{Emit post_lock}}\n pre-lock --> lock --> post-lock\n end\n\n subgraph pdm-sync [pdm sync]\n direction TB\n pre-install{{Emit pre_install}}\n post-install{{Emit post_install}}\n pre-install --> sync --> post-install\n end\n\n pdm-lock --> pdm-sync\n end
"},{"location":"usage/hooks/#switching-python-version","title":"Switching Python version","text":"This is a special case in dependency management: you can switch the current Python version using pdm use
and it will emit the post_use
signal with the new Python interpreter.
flowchart LR\n subgraph pdm-use [pdm use]\n direction LR\n post-use{{Emit post_use}}\n use --> post-use\n end
"},{"location":"usage/hooks/#publication","title":"Publication","text":"As soon as you are ready to publish your package/library, you will require the publication tasks:
build
: build/compile assets requiring it and package everything into a Python package (sdist, wheel)upload
: upload/publish the package to a remote PyPI indexAll those steps are available with the following commands:
pdm build
pdm publish
They trigger the following hooks:
pre_publish
post_publish
pre_build
post_build
flowchart LR\n subgraph pdm-publish [pdm publish]\n direction LR\n pre-publish{{Emit pre_publish}}\n post-publish{{Emit post_publish}}\n\n subgraph pdm-build [pdm build]\n pre-build{{Emit pre_build}}\n post-build{{Emit post_build}}\n pre-build --> build --> post-build\n end\n\n %% subgraph pdm-upload [pdm upload]\n %% pre-upload{{Emit pre_upload}}\n %% post-upload{{Emit post_upload}}\n %% pre-upload --> upload --> post-upload\n %% end\n\n pre-publish --> pdm-build --> upload --> post-publish\n end
Execution will stop at first failure, hooks included.
"},{"location":"usage/hooks/#user-scripts","title":"User scripts","text":"User scripts are detailed in their own section but you should know that:
pre_*
and post_*
script, including composite scripts.run
execution will trigger the pre_run
and post_run
hookspre_script
and post_script
hooksGiven the following scripts
definition:
[tool.pdm.scripts]\npre_script = \"\"\npost_script = \"\"\npre_test = \"\"\npost_test = \"\"\ntest = \"\"\npre_composite = \"\"\npost_composite = \"\"\ncomposite = {composite = [\"test\"]}\n
a pdm run test
will have the following lifecycle:
flowchart LR\n subgraph pdm-run-test [pdm run test]\n direction LR\n pre-run{{Emit pre_run}}\n post-run{{Emit post_run}}\n subgraph run-test [test task]\n direction TB\n pre-script{{Emit pre_script}}\n post-script{{Emit post_script}}\n pre-test[Execute pre_test]\n post-test[Execute post_test]\n test[Execute test]\n\n pre-script --> pre-test --> test --> post-test --> post-script\n end\n\n pre-run --> run-test --> post-run\n end
while pdm run composite
will have the following:
flowchart LR\n subgraph pdm-run-composite [pdm run composite]\n direction LR\n pre-run{{Emit pre_run}}\n post-run{{Emit post_run}}\n\n subgraph run-composite [composite task]\n direction TB\n pre-script-composite{{Emit pre_script}}\n post-script-composite{{Emit post_script}}\n pre-composite[Execute pre_composite]\n post-composite[Execute post_composite]\n\n subgraph run-test [test task]\n direction TB\n pre-script-test{{Emit pre_script}}\n post-script-test{{Emit post_script}}\n pre-test[Execute pre_test]\n post-test[Execute post_test]\n\n pre-script-test --> pre-test --> test --> post-test --> post-script-test\n end\n\n pre-script-composite --> pre-composite --> run-test --> post-composite --> post-script-composite\n end\n\n pre-run --> run-composite --> post-run\n end
"},{"location":"usage/hooks/#skipping","title":"Skipping","text":"It is possible to control which task and hook runs for any built-in command as well as custom user scripts using the --skip
option.
It accepts a comma-separated list of hooks/task names to skip as well as the predefined :all
, :pre
and :post
shortcuts respectively skipping all hooks, all pre_*
hooks and all post_*
hooks. You can also provide the skip list in PDM_SKIP_HOOKS
environment variable but it will be overridden as soon as the --skip
parameter is provided.
Given the previous script block, running pdm run --skip=:pre,post_test composite
will result in the following reduced lifecycle:
flowchart LR\n subgraph pdm-run-composite [pdm run composite]\n direction LR\n post-run{{Emit post_run}}\n\n subgraph run-composite [composite task]\n direction TB\n post-script-composite{{Emit post_script}}\n post-composite[Execute post_composite]\n\n subgraph run-test [test task]\n direction TB\n post-script-test{{Emit post_script}}\n\n test --> post-script-test\n end\n\n run-test --> post-composite --> post-script-composite\n end\n\n run-composite --> post-run\n end
"},{"location":"usage/pep582/","title":"Working with PEP 582","text":"PEP 582 has been rejected
This is a rejected PEP. However, because this feature is the reason for PDM's birth, PDM will retain support for it. We recommend using virtual environments instead.
With PEP 582, dependencies will be installed into __pypackages__
directory under the project root. With PEP 582 enabled globally, you can also use the project interpreter to run scripts directly.
When the project interpreter is a normal Python, this mode is enabled.
Besides, on a project you work with for the first time on your machine, if it contains an empty __pypackages__
directory, PEP 582 is enabled automatically, and virtualenv won't be created.
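So, as a minimal sketch, one way to opt into PEP 582 for a project is to create that directory yourself before the first install:
mkdir __pypackages__\npdm install\n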
To make the Python interpreters aware of PEP 582 packages, you need to add pdm/pep582/sitecustomize.py
to the Python library search path.
One just needs to execute pdm --pep582
, and the environment variables will be changed automatically. Don't forget to restart the terminal session for the change to take effect.
The command to change the environment variables can be printed by pdm --pep582 [<SHELL>]
. If <SHELL>
isn't given, PDM will pick one based on some guesses. You can run eval \"$(pdm --pep582)\"
to execute the command.
You may want to write a line in your .bash_profile
(or similar profiles) to make it effective when logging in. For example, in bash you can do this:
pdm --pep582 >> ~/.bash_profile\n
Once again, don't forget to restart the terminal session for the change to take effect.
How is it done? Thanks to the site packages loading on Python startup, it is possible to patch the sys.path
by executing the sitecustomize.py
shipped with PDM. The interpreter can search the directories for the nearest __pypackages__
folder and append it to the sys.path
variable.
Currently there is no built-in support or plugin for PEP 582 in most IDEs; you have to configure your tools manually.
"},{"location":"usage/pep582/#pycharm","title":"PyCharm","text":"Mark __pypackages__/<major.minor>/lib
as Sources Root. Then, select as Python interpreter a Python installation with the same <major.minor>
version.
Additionally, if you want to use tools from the environment (e.g. pytest
), you have to add the __pypackages__/<major.minor>/bin
directory to the PATH
variable in the corresponding run/debug configuration.
Add the following two entries to the top-level dict in .vscode/settings.json
:
{\n\"python.autoComplete.extraPaths\": [\"__pypackages__/<major.minor>/lib\"],\n\"python.analysis.extraPaths\": [\"__pypackages__/<major.minor>/lib\"]\n}\n
This file can be auto-generated with plugin pdm-vscode
.
Enable PEP582 globally, and make sure VSCode runs using the same user and shell you enabled PEP582 for.
Cannot enable PEP582 globally?If for some reason you cannot enable PEP582 globally, you can still configure each \"launch\" in each project: set the PYTHONPATH
environment variable in your launch configuration, in .vscode/launch.json
. For example, to debug your pytest
run:
{\n\"version\": \"0.2.0\",\n\"configurations\": [\n{\n\"name\": \"pytest\",\n\"type\": \"python\",\n\"request\": \"launch\",\n\"module\": \"pytest\",\n\"args\": [\"tests\"],\n\"justMyCode\": false,\n\"env\": {\"PYTHONPATH\": \"__pypackages__/<major.minor>/lib\"}\n}\n]\n}\n
If your package resides in a src
directory, add it to PYTHONPATH
as well:
\"env\": {\"PYTHONPATH\": \"src:__pypackages__/<major.minor>/lib\"}\n
Using Pylance/Pyright? If you have configured \"python.analysis.diagnosticMode\": \"workspace\"
, and you see a ton of errors/warnings as a result. you may need to create pyrightconfig.json
in the workspace directory, and fill in the following fields:
{\n\"exclude\": [\"__pypackages__\"]\n}\n
Then restart the language server or VS Code and you're good to go. In the future (microsoft/pylance-release#1150), maybe the problem will be solved.
Using Jupyter Notebook?If you wish to use pdm to install jupyter notebook and use it in vscode in conjunction with the python extension:
pdm add notebook
or so to install notebook.env
file inside of your project directory with contents like the following:PYTHONPATH=/your-workspace-path/__pypackages__/<major>.<minor>/lib\n
If the above still doesn't work, it's most likely because the environment variable is not properly loaded when the Notebook starts. There are two workarounds.
code .
in Terminal. It will open a new VSCode window in the current directory with the path set correctly. Use the Jupyter Notebook in the new windowimport sys\nsys.path.append('/your-workspace-path/__pypackages__/<major>.<minor>/lib')\n
Reference Issue
PDM Task ProviderIn addition, there is a VSCode Task Provider extension available for download.
This makes it possible for VSCode to automatically detect pdm scripts so they can be run natively as VSCode Tasks.
"},{"location":"usage/pep582/#neovim","title":"Neovim","text":"If using neovim-lsp with pyright and want your __pypackages__
directory to be added to the path, you can add this to your project's pyproject.toml
.
[tool.pyright]\nextraPaths = [\"__pypackages__/<major.minor>/lib/\"]\n
"},{"location":"usage/pep582/#emacs","title":"Emacs","text":"You have a few options, but basically you'll want to tell an LSP client to add __pypackages__
to the paths it looks at. Here are a few options that are available:
pyproject.toml
and pyright","text":"Add this to your project's pyproject.toml
:
[tool.pyright]\nextraPaths = [\"__pypackages__/<major.minor>/lib/\"]\n
"},{"location":"usage/pep582/#eglot-pyright","title":"eglot + pyright","text":"Using pyright and eglot (included in Emacs 29), add the following to your config:
(defun get-pdm-packages-path ()\n\"For the current PDM project, find the path to the packages.\"\n(let ((packages-path (string-trim (shell-command-to-string \"pdm info --packages\"))))\n(concat packages-path \"/lib\")))\n\n(defun my/eglot-workspace-config (server)\n\"For the current PDM project, dynamically generate a python lsp config.\"\n`(:python\\.analysis (:extraPaths ,(vector (get-pdm-packages-path)))))\n\n(setq-default eglot-workspace-configuration #'my/eglot-workspace-config)\n
You'll want pyright installed either globally, or in your project (probably as a dev dependency). You can add this with, for example:
pdm add --dev --group devel pyright\n
"},{"location":"usage/pep582/#lsp-mode-lsp-python-ms","title":"LSP-Mode + lsp-python-ms","text":"Below is a sample code snippet showing how to make PDM work with lsp-python-ms in Emacs. Contributed by @linw1995.
;; TODO: Cache result\n(defun linw1995/pdm-get-python-executable (&optional dir)\n(let ((pdm-get-python-cmd \"pdm info --python\"))\n(string-trim\n(shell-command-to-string\n(if dir\n(concat \"cd \"\ndir\n\" && \"\npdm-get-python-cmd)\npdm-get-python-cmd)))))\n\n(defun linw1995/pdm-get-packages-path (&optional dir)\n(let ((pdm-get-packages-cmd \"pdm info --packages\"))\n(concat (string-trim\n(shell-command-to-string\n(if dir\n(concat \"cd \"\ndir\n\" && \"\npdm-get-packages-cmd)\npdm-get-packages-cmd)))\n\"/lib\")))\n\n(use-package lsp-python-ms\n:ensure t\n:init (setq lsp-python-ms-auto-install-server t)\n:hook (python-mode\n. (lambda ()\n(setq lsp-python-ms-python-executable (linw1995/pdm-get-python-executable))\n(setq lsp-python-ms-extra-paths (vector (linw1995/pdm-get-packages-path)))\n(require 'lsp-python-ms)\n(lsp)))) ; or lsp-deferred\n
"},{"location":"usage/project/","title":"New Project","text":"To start with, create a new project with pdm init
:
mkdir my-project && cd my-project\npdm init\n
You will need to answer a few questions, to help PDM to create a pyproject.toml
file for you. For more usages of pdm init
, please read Create your project from a template.
At first, you need to choose a Python interpreter from a list of Python versions installed on your machine. The interpreter path will be stored in .pdm-python
and used by subsequent commands. You can also change it later with pdm use
.
Alternatively, you can specify the Python interpreter path via PDM_PYTHON
environment variable. When it is set, the path saved in .pdm-python
will be ignored.
After you select the Python interpreter, PDM will ask you whether you want to create a virtual environment for the project. If you choose yes, PDM will create a virtual environment in the project root directory, and use it as the Python interpreter for the project.
If the selected Python interpreter is in a virtual environment, PDM will use it as the project environment and install dependencies into it. Otherwise, __pypackages__
will be created in the project root and dependencies will be installed into it.
For the difference between these two approaches, please refer to the corresponding sections in the docs:
__pypackages__
(PEP 582)A library and an application differ in many ways. In short, a library is a package that is intended to be installed and used by other projects. In most cases it also needs to be uploaded to PyPI. An application, on the other hand, is one that is directly facing end users and may need to be deployed into some production environments.
In PDM, if you choose to create a library, PDM will add a name
, version
field to the pyproject.toml
file, as well as a [build-system]
table for the build backend, which is only useful if your project needs to be built and distributed. So you need to manually add these fields to pyproject.toml
if you want to change the project from an application to a library. Also, a library project will be installed into the environment when you run pdm install
or pdm sync
, unless --no-self
is specified.
requires-python
value","text":"You need to set an appropriate requires-python
value for your project. This is an important property that affects how dependencies are resolved. Basically, each package's requires-python
must cover the project's requires-python
range. For example, consider the following setup:
requires-python = \">=3.9\"
foo
: requires-python = \">=3.7,<3.11\"
Resolving the dependencies will cause a ResolutionImpossible
:
Unable to find a resolution because the following dependencies don't work\non all Python versions defined by the project's `requires-python`\n
Because the dependency's requires-python
is >=3.7,<3.11
, it doesn't cover the project's requires-python
range of >=3.9
. In other words, the project promises to work on Python 3.9, 3.10, 3.11 (and so on), but the dependency doesn't support Python 3.11 (or any higher). Since PDM creates a cross-platform lockfile that should work on all Python versions within the requires-python
range, it can't find a valid resolution. To fix this, you need add a maximum version to requires-python
, like >=3.9,<3.11
.
The value of requires-python
is a version specifier as defined in PEP 440. Here are some examples:
requires-python
Meaning >=3.7
Python 3.7 and above >=3.7,<3.11
Python 3.7, 3.8, 3.9 and 3.10 >=3.6,!=3.8.*,!=3.9.*
Python 3.6 and above, except 3.8 and 3.9"},{"location":"usage/project/#working-with-older-python-versions","title":"Working with older Python versions","text":"Although PDM run on Python 3.8 and above, you can still have lower Python versions for your working project. But remember, if your project is a library, which needs to be built, published or installed, you make sure the PEP 517 build backend being used supports the lowest Python version you need. For instance, the default backend pdm-backend
only works on Python 3.7+, so if you run pdm build
on a project with Python 3.6, you will get an error. Most modern build backends have dropped the support for Python 3.6 and lower, so it is highly recommended to upgrade the Python version to 3.7+. Here are the supported Python range for some commonly used build backends, we only list those that support PEP 621 since otherwise PDM can't work with them.
pdm-backend
>=3.7
Yes setuptools>=60
>=3.7
Experimental hatchling
>=3.7
Yes flit-core>=3.4
>=3.6
Yes flit-core>=3.2,<3.4
>=3.4
Yes Note that if your project is an application (i.e. without the name
metadata), the above limitation of backends does not apply. Therefore, if you don't need a build backend you can use any Python version >=2.7
.
If you are already using other package manager tools like Pipenv or Poetry, it is easy to migrate to PDM. PDM provides import
command so that you don't have to initialize the project manually, it now supports:
Pipfile
pyproject.toml
pyproject.toml
requirements.txt
format used by pipsetup.py
(It requires setuptools
to be installed in the project environment. You can do this by configuring venv.with_pip
to true
for venv and pdm add setuptools
for __pypackages__
)Also, when you are executing pdm init
or pdm install
, PDM can auto-detect possible files to import if your PDM project has not been initialized yet.
Info
Converting a setup.py
will execute the file with the project interpreter. Make sure setuptools
is installed with the interpreter and the setup.py
is trusted.
You must commit the pyproject.toml
file. You should commit the pdm.lock
and pdm.toml
file. Do not commit the .pdm-python
file.
The pyproject.toml
file must be committed as it contains the project's build metadata and dependencies needed for PDM. It is also commonly used by other python tools for configuration. Read more about the pyproject.toml
file at Pip documentation.
You should be committing the pdm.lock
file, by doing so you ensure that all installers are using the same versions of dependencies. To learn how to update dependencies see update existing dependencies.
pdm.toml
contains some project-wide configuration and it may be useful to commit it for sharing.
.pdm-python
stores the Python path used by the current project and doesn't need to be shared.
$ pdm info\nPDM version:\n 2.0.0\nPython Interpreter:\n /opt/homebrew/opt/python@3.9/bin/python3.9 (3.9)\nProject Root:\n /Users/fming/wkspace/github/test-pdm\nProject Packages:\n /Users/fming/wkspace/github/test-pdm/__pypackages__/3.9\n\n# Show environment info\n$ pdm info --env\n{\n\"implementation_name\": \"cpython\",\n \"implementation_version\": \"3.8.0\",\n \"os_name\": \"nt\",\n \"platform_machine\": \"AMD64\",\n \"platform_release\": \"10\",\n \"platform_system\": \"Windows\",\n \"platform_version\": \"10.0.18362\",\n \"python_full_version\": \"3.8.0\",\n \"platform_python_implementation\": \"CPython\",\n \"python_version\": \"3.8\",\n \"sys_platform\": \"win32\"\n}\n
This command is useful for checking which mode is being used by the project:
None
, virtualenv mode is enabled.Now, you have set up a new PDM project and get a pyproject.toml
file. Refer to metadata section about how to write pyproject.toml
properly.
If you are developing a library, after adding dependencies to your project, and finishing the coding, it's time to build and publish your package. It is as simple as one command:
pdm publish\n
This will automatically build a wheel and a source distribution(sdist), and upload them to the PyPI index.
To specify another repository other than PyPI, use the --repository
option, the parameter can be either the upload URL or the name of the repository stored in the config file.
pdm publish --repository testpypi\npdm publish --repository https://test.pypi.org/legacy/\n
"},{"location":"usage/publish/#publish-with-trusted-publishers","title":"Publish with trusted publishers","text":"You can configure trusted publishers for PyPI so that you don't need to expose the PyPI tokens in the release workflow. To do this, follow the guide to add a publisher and write the GitHub Actions workflow as below:
jobs:\npypi-publish:\nname: upload release to PyPI\nruns-on: ubuntu-latest\npermissions:\n# IMPORTANT: this permission is mandatory for trusted publishing\nid-token: write\nsteps:\n- uses: actions/checkout@v3\n\n- uses: pdm-project/setup-pdm@v3\n\n- name: Publish package distributions to PyPI\nrun: pdm publish\n
"},{"location":"usage/publish/#build-and-publish-separately","title":"Build and publish separately","text":"You can also build the package and upload it in two steps, to allow you to inspect the built artifacts before uploading.
pdm build\n
There are many options to control the build process, depending on the backend used. Refer to the build configuration section for more details.
The artifacts will be created at dist/
and able to upload to PyPI.
pdm publish --no-build\n
"},{"location":"usage/scripts/","title":"PDM Scripts","text":"Like npm run
, with PDM, you can run arbitrary scripts or commands with local packages loaded.
pdm run flask run -p 54321\n
It will run flask run -p 54321
in the environment that is aware of packages in your project environment.
PDM also supports custom script shortcuts in the optional [tool.pdm.scripts]
section of pyproject.toml
.
You can then run pdm run <script_name>
to invoke the script in the context of your PDM project. For example:
[tool.pdm.scripts]\nstart = \"flask run -p 54321\"\n
And then in your terminal:
$ pdm run start\nFlask server started at http://127.0.0.1:54321\n
Any following arguments will be appended to the command:
$ pdm run start -h 0.0.0.0\nFlask server started at http://0.0.0.0:54321\n
Yarn-like script shortcuts
There is a builtin shortcut making all scripts available as root commands as long as the script does not conflict with any builtin or plugin-contributed command. Said otherwise, if you have a start
script, you can run both pdm run start
and pdm start
. But if you have an install
script, only pdm run install
will run it, pdm install
will still run the builtin install
command.
PDM supports 4 types of scripts:
"},{"location":"usage/scripts/#cmd","title":"cmd
","text":"Plain text scripts are regarded as normal command, or you can explicitly specify it:
[tool.pdm.scripts]\nstart = {cmd = \"flask run -p 54321\"}\n
In some cases, such as when wanting to add comments between parameters, it might be more convenient to specify the command as an array instead of a string:
[tool.pdm.scripts]\nstart = {cmd = [\n\"flask\",\n\"run\",\n# Important comment here about always using port 54321\n\"-p\", \"54321\"\n]}\n
"},{"location":"usage/scripts/#shell","title":"shell
","text":"Shell scripts can be used to run more shell-specific tasks, such as pipeline and output redirecting. This is basically run via subprocess.Popen()
with shell=True
:
[tool.pdm.scripts]\nfilter_error = {shell = \"cat error.log|grep CRITICAL > critical.log\"}\n
"},{"location":"usage/scripts/#call","title":"call
","text":"The script can be also defined as calling a python function in the form <module_name>:<func_name>
:
[tool.pdm.scripts]\nfoobar = {call = \"foo_package.bar_module:main\"}\n
The function can be supplied with literal arguments:
[tool.pdm.scripts]\nfoobar = {call = \"foo_package.bar_module:main('dev')\"}\n
"},{"location":"usage/scripts/#composite","title":"composite
","text":"This script kind execute other defined scripts:
[tool.pdm.scripts]\nlint = \"flake8\"\ntest = \"pytest\"\nall = {composite = [\"lint\", \"test\"]}\n
Running pdm run all
will run lint
first and then test
if lint
succeeded.
You can also provide arguments to the called scripts:
[tool.pdm.scripts]\nlint = \"flake8\"\ntest = \"pytest\"\nall = {composite = [\"lint mypackage/\", \"test -v tests/\"]}\n
Note
Argument passed on the command line are given to each called task.
"},{"location":"usage/scripts/#script-options","title":"Script Options","text":""},{"location":"usage/scripts/#env","title":"env
","text":"All environment variables set in the current shell can be seen by pdm run
and will be expanded when executed. Besides, you can also define some fixed environment variables in your pyproject.toml
:
[tool.pdm.scripts]\nstart.cmd = \"flask run -p 54321\"\nstart.env = {FOO = \"bar\", FLASK_ENV = \"development\"}\n
Note how we use TOML's syntax to define a composite dictionary.
Note
Environment variables specified on a composite task level will override those defined by called tasks.
"},{"location":"usage/scripts/#env_file","title":"env_file
","text":"You can also store all environment variables in a dotenv file and let PDM read it:
[tool.pdm.scripts]\nstart.cmd = \"flask run -p 54321\"\nstart.env_file = \".env\"\n
The variables within the dotenv file will not override any existing environment variables. If you want the dotenv file to override existing environment variables use the following:
[tool.pdm.scripts]\nstart.cmd = \"flask run -p 54321\"\nstart.env_file.override = \".env\"\n
Note
A dotenv file specified on a composite task level will override those defined by called tasks.
"},{"location":"usage/scripts/#site_packages","title":"site_packages
","text":"To make sure the running environment is properly isolated from the outer Python interpreter, site-packages from the selected interpreter WON'T be loaded into sys.path
, unless any of the following conditions holds:
PATH
but not inside the __pypackages__
folder.-s/--site-packages
flag is following pdm run
.site_packages = true
is in either the script table or the global setting key _
.Note that site-packages will always be loaded if running with PEP 582 enabled(without the pdm run
prefix).
If you want the options to be shared by all tasks run by pdm run
, you can write them under a special key _
in [tool.pdm.scripts]
table:
[tool.pdm.scripts]\n_.env_file = \".env\"\nstart = \"flask run -p 54321\"\nmigrate_db = \"flask db upgrade\"\n
Besides, inside the tasks, PDM_PROJECT_ROOT
environment variable will be set to the project root.
By default, all user provided extra arguments are simply appended to the command (or to all the commands for composite
tasks).
If you want more control over the user provided extra arguments, you can use the {args}
placeholder. It is available for all script types and will be interpolated properly for each:
[tool.pdm.scripts]\ncmd = \"echo '--before {args} --after'\"\nshell = {shell = \"echo '--before {args} --after'\"}\ncomposite = {composite = [\"cmd --something\", \"shell {args}\"]}\n
will produce the following interpolations (those are not real scripts, just here to illustrate the interpolation):
$ pdm run cmd --user --provided\n--before --user --provided --after\n$ pdm run cmd\n--before --after\n$ pdm run shell --user --provided\n--before --user --provided --after\n$ pdm run shell\n--before --after\n$ pdm run composite --user --provided\ncmd --something\nshell --before --user --provided --after\n$ pdm run composite\ncmd --something\nshell --before --after\n
You may optionally provide default values that will be used if no user arguments are provided:
[tool.pdm.scripts]\ntest = \"echo '--before {args:--default --value} --after'\"\n
will produce the following:
$ pdm run test --user --provided\n--before --user --provided --after\n$ pdm run test\n--before --default --value --after\n
Note
As soon a placeholder is detected, arguments are not appended anymore. This is important for composite
scripts because if a placeholder is detected on one of the subtasks, none for the subtasks will have the arguments appended, you need to explicitly pass the placeholder to every nested command requiring it.
Note
call
scripts don't support the {args}
placeholder as they have access to sys.argv
directly to handle such complexe cases and more.
{pdm}
placeholder","text":"Sometimes you may have multiple PDM installations, or pdm
installed with a different name. This could for example occur in a CI/CD situation, or when working with different PDM versions in different repos. To make your scripts more robust you can use {pdm}
to use the PDM entrypoint executing the script. This will expand to {sys.executable} -m pdm
.
[tool.pdm.scripts]\nwhoami = { shell = \"echo `{pdm} -V` was called as '{pdm} -V'\" }\n
will produce the following output: $ pdm whoami\nPDM, version 0.1.dev2501+g73651b7.d20231115 was called as /usr/bin/python3 -m pdm -V\n\n$ pdm2.8 whoami\nPDM, version 2.8.0 was called as <snip>/venvs/pdm2-8/bin/python -m pdm -V\n
Note
While the above example uses PDM 2.8, this functionality was introduced in the 2.10 series and only backported for the showcase.
"},{"location":"usage/scripts/#show-the-list-of-scripts","title":"Show the List of Scripts","text":"Use pdm run --list/-l
to show the list of available script shortcuts:
$ pdm run --list\n\u256d\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u252c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u252c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256e\n\u2502 Name \u2502 Type \u2502 Description \u2502\n\u251c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u253c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u253c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2524\n\u2502 test_cmd \u2502 cmd \u2502 flask db upgrade \u2502\n\u2502 test_script \u2502 call \u2502 call a python function \u2502\n\u2502 test_shell \u2502 shell \u2502 shell command \u2502\n\u2570\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256f\n
You can add an help
option with the description of the script, and it will be displayed in the Description
column in the above output.
Note
Tasks with a name starting with an underscore (_
) are considered internal (helpers...) and are not shown in the listing.
Like npm
, PDM also supports tasks composition by pre and post scripts, pre script will be run before the given task and post script will be run after.
[tool.pdm.scripts]\npre_compress = \"{{ Run BEFORE the `compress` script }}\"\ncompress = \"tar czvf compressed.tar.gz data/\"\npost_compress = \"{{ Run AFTER the `compress` script }}\"\n
In this example, pdm run compress
will run all these 3 scripts sequentially.
The pipeline fails fast
In a pipeline of pre - self - post scripts, a failure will cancel the subsequent execution.
"},{"location":"usage/scripts/#hook-scripts","title":"Hook Scripts","text":"Under certain situations PDM will look for some special hook scripts for execution:
post_init
: Run after pdm init
pre_install
: Run before installing packagespost_install
: Run after packages are installedpre_lock
: Run before dependency resolutionpost_lock
: Run after dependency resolutionpre_build
: Run before building distributionspost_build
: Run after distributions are builtpre_publish
: Run before publishing distributionspost_publish
: Run after distributions are publishedpre_script
: Run before any scriptpost_script
: Run after any scriptpre_run
: Run once before run script invocationpost_run
: Run once after run script invocationNote
Pre & post scripts can't receive any arguments.
Avoid name conflicts
If there exists an install
scripts under [tool.pdm.scripts]
table, pre_install
scripts can be triggered by both pdm install
and pdm run install
. So it is recommended to not use the preserved names.
Note
Composite tasks can also have pre and post scripts. Called tasks will run their own pre and post scripts.
"},{"location":"usage/scripts/#skipping-scripts","title":"Skipping scripts","text":"Because, sometimes it is desirable to run a script but without its hooks or pre and post scripts, there is a --skip=:all
which will disable all hooks, pre and post. There is also --skip=:pre
and --skip=:post
allowing to respectively skip all pre_*
hooks and all post_*
hooks.
It is also possible to need a pre script but not the post one, or to need all tasks from a composite tasks except one. For those use cases, there is a finer grained --skip
parameter accepting a list of tasks or hooks name to exclude.
pdm run --skip pre_task1,task2 my-composite\n
This command will run the my-composite
task and skip the pre_task1
hook as well as the task2
and its hooks.
You can also provide you skip list in PDM_SKIP_HOOKS
environment variable but it will be overridden as soon as the --skip
parameter is provided.
There is more details on hooks and pre/post scripts behavior on the dedicated hooks page.
"},{"location":"usage/template/","title":"Create Project From a Template","text":"Similar to yarn create
and npm create
, PDM also supports initializing or creating a project from a template. The template is given as a positional argument of pdm init
, in one of the following forms:
pdm init flask
- Initialize the project from the template https://github.com/pdm-project/template-flask
pdm init https://github.com/frostming/pdm-template-flask
- Initialize the project from a Git URL. Both HTTPS and SSH URL are acceptable.pdm init django@v2
- To check out the specific branch or tag. Full Git URL also supports it.pdm init /path/to/template
- Initialize the project from a template directory on local filesystem.And pdm init
will use the default template built in.
The project will be initialized at the current directory, existing files with the same name will be overwritten. You can also use the -p <path>
option to create a project at a new path.
According to the first form of the template argument, pdm init <name>
will refer to the template repository located at https://github.com/pdm-project/template-<name>
. To contribute a template, you can create a template repository and establish a request to transfer the ownership to pdm-project
organization (it can be found at the bottom of the repository settings page). The administrators of the organization will review the request and complete the subsequent steps. You will be added as the repository maintainer if the transfer is accepted.
A template repository must be a pyproject-based project, which contains a pyproject.toml
file with PEP-621 compliant metadata. No other special config files are required.
On initialization, the project name in the template will be replaced by the name of the new project. This is done by a recursive full-text search and replace. The import name, which is derived from the project name by replacing all non-alphanumeric characters with underscores and lowercasing, will also be replaced in the same way.
For example, if the project name is foo-project
in the template and you want to initialize a new project named bar-project
, the following replacements will be made:
foo-project
-> bar-project
in all .md
files and .rst
filesfoo_project
-> bar_project
in all .py
filesfoo_project
-> bar_project
in the directory namefoo_project.py
-> bar_project.py
in the file nameTherefore, we don't support name replacement if the import name isn't derived from the project name.
"},{"location":"usage/template/#use-other-project-generators","title":"Use other project generators","text":"If you are seeking for a more powerful project generator, you can use cookiecutter via --cookiecutter
option and copier via --copier
option.
You need to install cookiecutter
and copier
respectively to use them. You can do this by running pdm self add <package>
. To use them:
pdm init --cookiecutter gh:cjolowicz/cookiecutter-hypermodern-python\n# or\npdm init --copier gh:pawamoy/copier-pdm --UNSAFE\n
"},{"location":"usage/venv/","title":"Working with Virtual Environments","text":"When you run pdm init
command, PDM will ask for the Python interpreter to use in the project, which is the base interpreter to install dependencies and run tasks.
Compared to PEP 582, virtual environments are considered more mature and have better support in the Python ecosystem as well as IDEs. Therefore, virtualenv is the default mode if not configured otherwise.
Virtual environments will be used if the project interpreter (the interpreter stored in .pdm-python
, which can be checked by pdm info
) is from a virtualenv.
By default, PDM prefers to use the virtualenv layout as other package managers do. When you run pdm install
the first time on a new PDM-managed project, whose Python interpreter is not decided yet, PDM will create a virtualenv in <project_root>/.venv
, and install dependencies into it. In the interactive session of pdm init
, PDM will also ask to create a virtualenv for you.
You can choose the backend used by PDM to create a virtualenv. Currently it supports three backends:
virtualenv
(default)venv
conda
You can change it by pdm config venv.backend [virtualenv|venv|conda]
.
You can create more than one virtualenvs with whatever Python version you want.
# Create a virtualenv based on 3.8 interpreter\n$ pdm venv create 3.8\n# Assign a different name other than the version string\n$ pdm venv create --name for-test 3.8\n# Use venv as the backend to create, support 3 backends: virtualenv(default), venv, conda\n$ pdm venv create --with venv 3.9\n
"},{"location":"usage/venv/#the-location-of-virtualenvs","title":"The location of virtualenvs","text":"If no --name
is given, PDM will create the venv in <project_root>/.venv
. Otherwise, virtualenvs go to the location specified by the venv.location
configuration. They are named as <project_name>-<path_hash>-<name_or_python_version>
to avoid name collision. You can disable the in-project virtualenv creation by pdm config venv.in_project false
. And all virtualenvs will be created under venv.location
.
You can tell PDM to use a virtualenv you created in preceding steps, with pdm use
:
pdm use -f /path/to/venv\n
"},{"location":"usage/venv/#virtualenv-auto-detection","title":"Virtualenv auto-detection","text":"When no interpreter is stored in the project config or PDM_IGNORE_SAVED_PYTHON
env var is set, PDM will try to detect possible virtualenvs to use:
venv
, env
, .venv
directories in the project rootPDM_IGNORE_ACTIVE_VENV
is set$ pdm venv list\nVirtualenvs created with this project:\n\n- 3.8.6: C:\\Users\\Frost Ming\\AppData\\Local\\pdm\\pdm\\venvs\\test-project-8Sgn_62n-3.8.6\n- for-test: C:\\Users\\Frost Ming\\AppData\\Local\\pdm\\pdm\\venvs\\test-project-8Sgn_62n-for-test\n- 3.9.1: C:\\Users\\Frost Ming\\AppData\\Local\\pdm\\pdm\\venvs\\test-project-8Sgn_62n-3.9.1\n
"},{"location":"usage/venv/#show-the-path-or-python-interpreter-of-a-virtualenv","title":"Show the path or python interpreter of a virtualenv","text":"$ pdm venv --path for-test\n$ pdm venv --python for-test\n
"},{"location":"usage/venv/#remove-a-virtualenv","title":"Remove a virtualenv","text":"$ pdm venv remove for-test\nVirtualenvs created with this project:\nWill remove: C:\\Users\\Frost Ming\\AppData\\Local\\pdm\\pdm\\venvs\\test-project-8Sgn_62n-for-test, continue? [y/N]:y\nRemoved C:\\Users\\Frost Ming\\AppData\\Local\\pdm\\pdm\\venvs\\test-project-8Sgn_62n-for-test\n
"},{"location":"usage/venv/#activate-a-virtualenv","title":"Activate a virtualenv","text":"Instead of spawning a subshell like what pipenv
and poetry
do, pdm venv
doesn't create the shell for you but print the activate command to the console. In this way you won't leave the current shell. You can then feed the output to eval
to activate the virtualenv:
$ eval $(pdm venv activate for-test)\n(test-project-for-test) $ # Virtualenv entered\n
$ eval (pdm venv activate for-test)\n
PS1> Invoke-Expression (pdm venv activate for-test)\n
Additionally, if the project interpreter is a venv Python, you can omit the name argument following activate.
Note
venv activate
does not switch the Python interpreter used by the project. It only changes the shell by injecting the virtualenv paths to environment variables. For the forementioned purpose, use the pdm use
command.
For more CLI usage, see the pdm venv
documentation.
Looking for pdm shell
?
PDM doesn't provide a shell
command because many fancy shell functions may not work perfectly in a subshell, which brings a maintenance burden to support all the corner cases. However, you can still gain the ability via the following ways:
pdm run $SHELL
, this will spawn a subshell with the environment variables set properly. The subshell can be quit with exit
or Ctrl+D
.pdm() {\nlocal command=$1\n\nif [[ \"$command\" == \"shell\" ]]; then\neval $(pdm venv activate)\nelse\ncommand pdm $@\nfi\n}\n
Copy and paste this function to your ~/.bashrc
file and restart your shell.
For fish
shell you can put the following into your ~/fish/config.fish
or in ~/.config/fish/config.fish
function pdm\n set cmd $argv[1]\n\n if test \"$cmd\" = \"shell\"\n eval (pdm venv activate)\n else\n command pdm $argv\n end\n end\n
Now you can run pdm shell
to activate the virtualenv. The virtualenv can be deactivated with deactivate
command as usual.
By default when you activate a virtualenv, the prompt will show: {project_name}-{python_version}
.
For example if your project is named test-project
:
$ eval $(pdm venv activate for-test)\n(test-project-3.10) $ # {project_name} == test-project and {python_version} == 3.10\n
The format can be customized before virtualenv creation with the venv.prompt
configuration or PDM_VENV_PROMPT
environment variable (before a pdm init
or pdm venv create
). Available variables are:
project_name
: name of your projectpython_version
: version of Python (used by the virtualenv)$ PDM_VENV_PROMPT='{project_name}-py{python_version}' pdm venv create --name test-prompt\n$ eval $(pdm venv activate test-prompt)\n(test-project-py3.10) $\n
"},{"location":"usage/venv/#run-a-command-in-a-virtual-environment-without-activating-it","title":"Run a command in a virtual environment without activating it","text":"# Run a script\n$ pdm run --venv test test\n# Install packages\n$ pdm sync --venv test\n# List the packages installed\n$ pdm list --venv test\n
There are other commands supporting --venv
flag or PDM_IN_VENV
environment variable, see the CLI reference. You should create the virtualenv with pdm venv create --name <name>
before using this feature.
By default, if you use pdm use
and select a non-venv Python, the project will be switched to PEP 582 mode. We also allow you to switch to a named virtual environment via the --venv
flag:
# Switch to a virtualenv named test\n$ pdm use --venv test\n# Switch to the in-project venv located at $PROJECT_ROOT/.venv\n$ pdm use --venv in-project\n
"},{"location":"usage/venv/#disable-virtualenv-mode","title":"Disable virtualenv mode","text":"You can disable the auto-creation and auto-detection for virtualenv by pdm config python.use_venv false
. If venv is disabled, PEP 582 mode will always be used even if the selected interpreter is from a virtualenv.
By default PDM will not include pip
in virtual environments. This increases isolation by ensuring that only your dependencies are installed in the virtual environment.
To install pip
once (if for example you want to install arbitrary dependencies in CI) you can run:
# Install pip in the virtual environment\n$ pdm run python -m ensurepip\n# Install arbitrary dependencies\n# These dependencies are not checked for conflicts against lockfile dependencies!\n$ pdm run python -m pip install coverage\n
Or you can create the virtual environment with --with-pip
:
$ pdm venv create --with-pip 3.9\n
See the ensurepip docs for more details on ensurepip
.
If you want to permanently configure PDM to include pip
in virtual environments you can use the venv.with_pip
configuration.