Directory structure:

```
docs/
├── reference/
│   ├── build_failures.md
│   ├── settings.md
│   ├── index.md
│   ├── resolver-internals.md
│   ├── benchmarks.md
│   ├── policies/
│   │   ├── license.md
│   │   ├── versioning.md
│   │   ├── index.md
│   │   └── platforms.md
│   └── cli.md
├── pip/
│   ├── compatibility.md
│   ├── index.md
│   ├── packages.md
│   ├── compile.md
│   ├── dependencies.md
│   ├── inspection.md
│   └── environments.md
├── concepts/
│   ├── tools.md
│   ├── cache.md
│   ├── python-versions.md
│   ├── index.md
│   ├── projects/
│   │   ├── layout.md
│   │   ├── build.md
│   │   ├── sync.md
│   │   ├── config.md
│   │   ├── index.md
│   │   ├── workspaces.md
│   │   ├── init.md
│   │   ├── dependencies.md
│   │   └── run.md
│   └── resolution.md
├── index.md
├── configuration/
│   ├── files.md
│   ├── installer.md
│   ├── indexes.md
│   ├── index.md
│   ├── authentication.md
│   └── environment.md
├── guides/
│   ├── projects.md
│   ├── tools.md
│   ├── publish.md
│   ├── scripts.md
│   ├── index.md
│   ├── integration/
│   │   ├── dependency-bots.md
│   │   ├── docker.md
│   │   ├── fastapi.md
│   │   ├── alternative-indexes.md
│   │   ├── pre-commit.md
│   │   ├── index.md
│   │   ├── pytorch.md
│   │   ├── gitlab.md
│   │   ├── jupyter.md
│   │   └── github.md
│   └── install-python.md
└── getting-started/
    ├── help.md
    ├── first-steps.md
    ├── features.md
    ├── index.md
    └── installation.md
```
uv needs to build packages when there is not a compatible wheel (a pre-built distribution of the package) available. Building packages can fail for many reasons, some of which may be unrelated to uv itself.
An example build failure can be produced by trying to install an old version of numpy on a new, unsupported version of Python:
```console
$ uv pip install -p 3.13 'numpy<1.20'
Resolved 1 package in 62ms
× Failed to build `numpy==1.19.5`
├─▶ The build backend returned an error
╰─▶ Call to `setuptools.build_meta:__legacy__.build_wheel()` failed (exit status: 1)

    [stderr]
    Traceback (most recent call last):
      File "<string>", line 8, in <module>
        from setuptools.build_meta import __legacy__ as backend
      File "/home/konsti/.cache/uv/builds-v0/.tmpi4bgKb/lib/python3.13/site-packages/setuptools/__init__.py", line 9, in <module>
        import distutils.core
    ModuleNotFoundError: No module named 'distutils'

hint: `distutils` was removed from the standard library in Python 3.12. Consider adding a constraint (like `numpy >1.19.5`) to avoid building a version of `numpy` that depends
on `distutils`.
```
Notice that the error message is prefaced by "The build backend returned an error".

The build failure includes the `[stderr]` (and `[stdout]`, if present) from the build backend that was used for the build. The error logs are not from uv itself.

The message following the `╰─▶` is a hint provided by uv, to help resolve common build failures. A hint will not be available for all build failures.
Build failures are usually related to your system and the build backend. It is rare that a build failure is specific to uv. You can confirm that the build failure is not related to uv by attempting to reproduce it with pip:
```console
$ uv venv -p 3.13 --seed
$ source .venv/bin/activate
$ pip install --use-pep517 'numpy==1.19.5'
Collecting numpy==1.19.5
  Using cached numpy-1.19.5.zip (7.3 MB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
ERROR: Exception:
Traceback (most recent call last):
  ...
  File "/Users/example/.cache/uv/archive-v0/3783IbOdglemN3ieOULx2/lib/python3.13/site-packages/pip/_vendor/pyproject_hooks/_impl.py", line 321, in _call_hook
    raise BackendUnavailable(data.get('traceback', ''))
pip._vendor.pyproject_hooks._impl.BackendUnavailable: Traceback (most recent call last):
  File "/Users/example/.cache/uv/archive-v0/3783IbOdglemN3ieOULx2/lib/python3.13/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 77, in _build_backend
    obj = import_module(mod_path)
  File "/Users/example/.local/share/uv/python/cpython-3.13.0-macos-aarch64-none/lib/python3.13/importlib/__init__.py", line 88, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen importlib._bootstrap>", line 1387, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1360, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1310, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 488, in _call_with_frames_removed
  File "<frozen importlib._bootstrap>", line 1387, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1360, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1331, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 935, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 1022, in exec_module
  File "<frozen importlib._bootstrap>", line 488, in _call_with_frames_removed
  File "/private/var/folders/6p/k5sd5z7j31b31pq4lhn0l8d80000gn/T/pip-build-env-vdpjme7d/overlay/lib/python3.13/site-packages/setuptools/__init__.py", line 9, in <module>
    import distutils.core
ModuleNotFoundError: No module named 'distutils'
```
!!! important

    The `--use-pep517` flag should be included with the `pip install` invocation to ensure the same
    build isolation behavior. uv always uses [build isolation by default](../pip/compatibility.md#pep-517-build-isolation).

    We also recommend including the `--force-reinstall` and `--no-cache` options when reproducing
    failures.
Since this build failure occurs in pip too, it is not likely to be a bug with uv.
If a build failure is reproducible with another installer, you should investigate upstream (in this example, `numpy` or `setuptools`), find a way to avoid building the package in the first place, or make the necessary adjustments to your system for the build to succeed.
When generating the cross-platform lockfile, uv needs to determine the dependencies of all packages, even those only installed on other platforms. uv tries to avoid package builds during resolution: it uses any wheel that exists for that version, then tries to find static metadata in the source distribution (mainly `pyproject.toml` with static `project.version`, `project.dependencies`, and `project.optional-dependencies`, or METADATA v2.2+). Only if all of that fails does it build the package.
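That lookup order can be sketched as a chain of metadata sources tried cheapest-first. The source callables below are hypothetical stand-ins for illustration, not uv's actual API:

```python
def resolve_metadata(package, sources):
    """Try each metadata source in order; fall back to building only as a last resort.

    `sources` is a list of callables returning a list of requirement strings,
    or None if that source has no metadata for the package (hypothetical interface).
    """
    for source in sources:
        requirements = source(package)
        if requirements is not None:
            return requirements
    raise LookupError(f"no metadata available for {package}")


# Example: the version has no wheel metadata, but the sdist's pyproject.toml is static.
wheel_metadata = lambda pkg: None
static_sdist_metadata = lambda pkg: ["numpy>=1.20"]
build_sdist = lambda pkg: ["numpy>=1.20"]  # expensive: would invoke the build backend

print(resolve_metadata("example", [wheel_metadata, static_sdist_metadata, build_sdist]))
```

Because the cheap sources are tried first, the expensive `build_sdist` step never runs in this example.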
When installing, uv needs to have a wheel for the current platform for each package. If no matching wheel exists in the index, uv tries to build the source distribution.
You can check which wheels exist for a PyPI project under "Download Files", e.g., https://pypi.org/project/numpy/2.1.1/#files. Wheels with `...-py3-none-any.whl` filenames work everywhere; others have the operating system and platform in the filename. In the linked numpy example, you can see that there are pre-built distributions for Python 3.10 to 3.13 on macOS, Linux, and Windows.
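As a rough illustration of how those filenames encode compatibility: a wheel filename ends in `python-abi-platform` tags, and a `py3-none-any` combination means the wheel installs everywhere. This is a simplified parser that ignores build tags and compressed tag sets:

```python
def wheel_tags(filename: str):
    """Split a wheel filename into its name, version, and compatibility tags."""
    stem = filename.removesuffix(".whl")
    parts = stem.split("-")
    # The last three components are always the python/abi/platform tags.
    name, version = parts[0], parts[1]
    python, abi, platform = parts[-3], parts[-2], parts[-1]
    return name, version, python, abi, platform


def is_universal(filename: str) -> bool:
    """True for wheels like `...-py3-none-any.whl` that work on any platform."""
    _, _, python, abi, platform = wheel_tags(filename)
    return python.startswith("py") and abi == "none" and platform == "any"


print(is_universal("tqdm-4.66.4-py3-none-any.whl"))           # pure-Python wheel
print(is_universal("numpy-2.1.1-cp312-cp312-win_amd64.whl"))  # CPython 3.12, Windows x86_64
```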
The following examples demonstrate common build failures and how to resolve them.
If the build error mentions a missing command, for example, `gcc`:
```
× Failed to build `pysha3==1.0.2`
├─▶ The build backend returned an error
╰─▶ Call to `setuptools.build_meta:__legacy__.build_wheel` failed (exit status: 1)

    [stdout]
    running bdist_wheel
    running build
    running build_py
    creating build/lib.linux-x86_64-cpython-310
    copying sha3.py -> build/lib.linux-x86_64-cpython-310
    running build_ext
    building '_pysha3' extension
    creating build/temp.linux-x86_64-cpython-310/Modules/_sha3
    gcc -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -fPIC -DPY_WITH_KECCAK=1 -I/root/.cache/uv/builds-v0/.tmp8V4iEk/include -I/usr/local/include/python3.10 -c
    Modules/_sha3/sha3module.c -o build/temp.linux-x86_64-cpython-310/Modules/_sha3/sha3module.o

    [stderr]
    error: command 'gcc' failed: No such file or directory
```
Then, you'll need to install it with your system package manager, e.g., to resolve the error above:
```console
$ apt install gcc
```
!!! tip

    When using the uv-managed Python versions, it's common to need `clang` installed instead of
    `gcc`.
Many Linux distributions provide a package that includes all the common build dependencies.
You can address most build requirements by installing it, e.g., for Debian or Ubuntu:
```console
$ apt install build-essential
```
If the build error mentions a missing header or library, e.g., a `.h` file, then you'll need to install it with your system package manager.

For example, installing `pygraphviz` requires Graphviz to be installed:
```
× Failed to build `pygraphviz==1.14`
├─▶ The build backend returned an error
╰─▶ Call to `setuptools.build_meta.build_wheel` failed (exit status: 1)

    [stdout]
    running bdist_wheel
    running build
    running build_py
    ...
    gcc -fno-strict-overflow -Wsign-compare -DNDEBUG -g -O3 -Wall -fPIC -DSWIG_PYTHON_STRICT_BYTE_CHAR -I/root/.cache/uv/builds-v0/.tmpgLYPe0/include -I/usr/local/include/python3.12 -c pygraphviz/graphviz_wrap.c -o
    build/temp.linux-x86_64-cpython-312/pygraphviz/graphviz_wrap.o

    [stderr]
    ...
    pygraphviz/graphviz_wrap.c:9: warning: "SWIG_PYTHON_STRICT_BYTE_CHAR" redefined
        9 | #define SWIG_PYTHON_STRICT_BYTE_CHAR
          |
    <command-line>: note: this is the location of the previous definition
    pygraphviz/graphviz_wrap.c:3023:10: fatal error: graphviz/cgraph.h: No such file or directory
     3023 | #include "graphviz/cgraph.h"
          |          ^~~~~~~~~~~~~~~~~~~
    compilation terminated.
    error: command '/usr/bin/gcc' failed with exit code 1

hint: This error likely indicates that you need to install a library that provides "graphviz/cgraph.h" for `[email protected]`
```
To resolve this error on Debian, you'd install the `libgraphviz-dev` package:

```console
$ apt install libgraphviz-dev
```
Note that installing the `graphviz` package is not sufficient; the development headers need to be installed.
!!! tip

    To resolve an error where `Python.h` is missing, install the [`python3-dev` package](https://packages.debian.org/bookworm/python3-dev).
If the build error mentions a failing import, consider disabling build isolation.
For example, some packages assume that `pip` is available without declaring it as a build dependency:
```
× Failed to build `chumpy==0.70`
├─▶ The build backend returned an error
╰─▶ Call to `setuptools.build_meta:__legacy__.build_wheel` failed (exit status: 1)

    [stderr]
    Traceback (most recent call last):
      File "<string>", line 9, in <module>
    ModuleNotFoundError: No module named 'pip'

    During handling of the above exception, another exception occurred:

    Traceback (most recent call last):
      File "<string>", line 14, in <module>
      File "/root/.cache/uv/builds-v0/.tmpvvHaxI/lib/python3.12/site-packages/setuptools/build_meta.py", line 334, in get_requires_for_build_wheel
        return self._get_build_requires(config_settings, requirements=[])
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
      File "/root/.cache/uv/builds-v0/.tmpvvHaxI/lib/python3.12/site-packages/setuptools/build_meta.py", line 304, in _get_build_requires
        self.run_setup()
      File "/root/.cache/uv/builds-v0/.tmpvvHaxI/lib/python3.12/site-packages/setuptools/build_meta.py", line 522, in run_setup
        super().run_setup(setup_script=setup_script)
      File "/root/.cache/uv/builds-v0/.tmpvvHaxI/lib/python3.12/site-packages/setuptools/build_meta.py", line 320, in run_setup
        exec(code, locals())
      File "<string>", line 11, in <module>
    ModuleNotFoundError: No module named 'pip'
```
To resolve this error, pre-install the build dependencies then disable build isolation for the package:
```console
$ uv pip install pip setuptools
$ uv pip install chumpy --no-build-isolation-package chumpy
```
Note you will need to install the missing package, e.g., `pip`, and all the other build dependencies of the package, e.g., `setuptools`.
If a package fails to build during resolution and the version that failed to build is older than the version you want to use, try adding a constraint with a lower bound (e.g., `numpy>=1.17`). Sometimes, due to algorithmic limitations, the uv resolver tries to find a fitting version using unreasonably old packages, which can be prevented by using lower bounds.
For example, when resolving the following dependencies on Python 3.10, uv attempts to build an old version of `apache-beam`:

```
dill<0.3.9,>=0.2.2
apache-beam<=2.49.0
```
```
× Failed to build `apache-beam==2.0.0`
├─▶ The build backend returned an error
╰─▶ Call to `setuptools.build_meta:__legacy__.build_wheel` failed (exit status: 1)

    [stderr]
    ...
```
Adding a lower bound constraint, e.g., `apache-beam<=2.49.0,>2.30.0`, resolves this build failure, as uv will avoid using an old version of `apache-beam`.
Constraints can also be defined for indirect dependencies using `constraints.txt` files or the `constraint-dependencies` setting.
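For example, the `constraint-dependencies` setting can express such a lower bound for an indirect dependency in `pyproject.toml` (the version bound shown is illustrative):

```toml
[tool.uv]
# Keep the resolver from considering old, unbuildable versions of an
# indirect dependency during resolution.
constraint-dependencies = ["apache-beam>2.30.0"]
```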
If locking fails due to building a package from a platform you do not need to support, consider limiting resolution to your supported platforms.
If you support a large range of Python versions, consider using markers to use older versions for older Python versions and newer versions for newer Python versions. For example, `numpy` only supports four Python minor versions at a time, so to support a wider range of Python versions, e.g., Python 3.8 to 3.13, the `numpy` requirement needs to be split:
```
numpy>=1.23; python_version >= "3.10"
numpy<1.23; python_version < "3.10"
```
If locking fails due to building a package that is only usable on another platform, you can provide dependency metadata manually to skip the build. uv cannot verify this information, so it is important to specify correct metadata when using this override.
The reference section provides information about specific parts of uv:
- Commands: A reference for uv's command line interface.
- Settings: A reference for uv's configuration schema.
- Resolver: Details about the internals of uv's resolver.
- Policies: uv's versioning policy, platform support policy, and license.
Looking for a broader overview? Check out the concepts documentation.
!!! tip
This document focuses on the internal workings of uv's resolver. For using uv, see the
[resolution concept](../concepts/resolution.md) documentation.
As defined in a textbook, resolution, or finding a set of versions to install from a given set of requirements, is equivalent to the SAT problem and thereby NP-complete: in the worst case, you have to try all possible combinations of all versions of all packages, and there are no general, fast algorithms. In practice, this is misleading for a number of reasons:
- The slowest part of resolution in uv is loading package and version metadata, even if it's cached.
- There are many possible solutions, but some are preferable to others. For example, we generally prefer using the latest version of packages.
- Package dependencies are complex: there are contiguous version ranges rather than arbitrary boolean inclusions/exclusions of versions, adjacent releases often have the same or similar requirements, etc.
- For most resolutions, the resolver doesn't need to backtrack, picking versions iteratively is sufficient. If there are version preferences from a previous resolution, barely any work needs to be done.
- When resolution fails, more information is needed than a message that there is no solution (as is seen in SAT solvers). Instead, the resolver should produce an understandable error trace that states which packages are involved, in a way that allows a user to remove the conflict.
uv uses pubgrub-rs, the Rust implementation of PubGrub, an incremental version solver. PubGrub in uv works in the following steps:
- Start with a partial solution that declares which package versions have been selected and which are undecided. Initially, only a virtual root package is decided.
- The highest priority package is selected from the undecided packages. Packages with URLs (including file, git, etc.) have the highest priority, then those with more exact specifiers (such as `==`), then those with less strict specifiers. Inside each category, packages are ordered by when they were first seen (i.e., order in a file), making the resolution deterministic.
- A version is picked for the selected package. The version must work with all specifiers from the requirements in the partial solution and must not be previously marked as incompatible. The resolver prefers versions from a lockfile (`uv.lock` or `-o requirements.txt`) and those installed in the current environment. Versions are checked from highest to lowest (unless using an alternative resolution strategy).
- All requirements of the selected package version are added to the undecided packages. uv prefetches their metadata in the background to improve performance.
- The process is repeated with the next package, unless a conflict is detected, in which case the resolver backtracks. For example, the partial solution contains, among other packages, `a 2` then `b 2` with the requirements `a 2 -> c 1` and `b 2 -> c 2`. No compatible version of `c` can be found. PubGrub can determine this was caused by `a 2` and `b 2` and add the incompatibility `{a 2, b 2}`, meaning that when either is picked, the other cannot be selected. The partial solution is restored to `a 2` with the tracked incompatibility, and the resolver attempts to pick a new version for `b`.
Eventually, the resolver either picks compatible versions for all packages (a successful resolution) or there is an incompatibility including the virtual "root" package which defines the versions requested by the user. An incompatibility with the root package indicates that whatever versions of the root dependencies and their transitive dependencies are picked, there will always be a conflict. From the incompatibilities tracked in PubGrub, an error message is constructed to enumerate the involved packages.
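The `a`/`b`/`c` conflict above can be sketched with a toy solver over a hard-coded index. This is a naive illustration of backtracking only: it retries versions chronologically rather than tracking incompatibilities the way PubGrub does:

```python
# Toy index: package -> {version: {dependency: allowed versions}}.
INDEX = {
    "a": {2: {"c": {1}}, 1: {"c": {1, 2}}},
    "b": {2: {"c": {2}}},
    "c": {1: {}, 2: {}},
}


def solve(requirements, partial=None):
    """requirements: {package: allowed versions}. Returns {package: version} or None."""
    partial = dict(partial or {})
    todo = [pkg for pkg in requirements if pkg not in partial]
    if not todo:
        return partial
    pkg = todo[0]
    # Try versions from highest to lowest, like uv's default strategy.
    for version in sorted(requirements[pkg] & set(INDEX[pkg]), reverse=True):
        new_reqs = dict(requirements)
        conflict = False
        for dep, allowed in INDEX[pkg][version].items():
            # Intersect this version's requirements into the running constraints.
            new_reqs[dep] = new_reqs.get(dep, allowed) & allowed
            if dep in partial and partial[dep] not in allowed:
                conflict = True  # clashes with an already-picked version
        if not conflict:
            solution = solve(new_reqs, {**partial, pkg: version})
            if solution is not None:
                return solution
        # Otherwise backtrack and try the next-lower version of this package.
    return None


# `a 2` and `b 2` conflict on `c`, so the solver backtracks to `a 1`.
print(solve({"a": {1, 2}, "b": {2}}))
```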
!!! tip
For more details on the PubGrub algorithm, see [Internals of the PubGrub
algorithm](https://pubgrub-rs-guide.pages.dev/internals/intro).
In addition to PubGrub's base algorithm, we also use a heuristic that backtracks and switches the order of two packages if they have been conflicting too much.
Python resolvers historically didn't support backtracking, and even with backtracking, resolution was usually limited to a single environment, with one specific architecture, operating system, Python version, and Python implementation. Some packages use contradictory requirements for different environments, for example:
```
numpy>=2,<3 ; python_version >= "3.11"
numpy>=1.16,<2 ; python_version < "3.11"
```
Since Python only allows one version of each package, a naive resolver would error here. Inspired by Poetry, uv uses a forking resolver: whenever there are multiple requirements for a package with different markers, the resolution is split.
In the above example, the partial solution would be split into two resolutions, one for `python_version >= "3.11"` and one for `python_version < "3.11"`.
If markers overlap or are missing a part of the marker space, the resolver splits additional times — there can be many forks per package. For example, given:
```
flask > 1 ; sys_platform == 'darwin'
flask > 2 ; sys_platform == 'win32'
flask
```
A fork would be created for `sys_platform == 'darwin'`, for `sys_platform == 'win32'`, and for `sys_platform != 'darwin' and sys_platform != 'win32'`.
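That splitting can be sketched by treating markers as opaque, non-overlapping strings; uv's real implementation operates on the marker algebra itself, computing overlaps and complements properly:

```python
def fork_requirements(requirements):
    """Split (specifier, marker) pairs into forks; a `None` marker applies everywhere.

    Markers are treated as opaque strings here, so overlapping markers are not
    detected; this only illustrates the per-marker forks plus the remainder fork.
    """
    markers = [marker for _, marker in requirements if marker is not None]
    forks = {}
    for marker in markers:
        forks[marker] = [spec for spec, m in requirements if m in (None, marker)]
    # The remainder fork covers the part of the marker space no marker matched.
    remainder = " and ".join(f"not ({m})" for m in markers)
    forks[remainder] = [spec for spec, m in requirements if m is None]
    return forks


for marker, specs in fork_requirements([
    ("flask>1", "sys_platform == 'darwin'"),
    ("flask>2", "sys_platform == 'win32'"),
    ("flask", None),
]).items():
    print(marker, "->", specs)
```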
Forks can be nested; each fork depends on any forks that occurred before it. Forks with identical packages are merged to keep the number of forks low.
!!! tip

    Forking can be observed in the logs of `uv lock -v` by looking for
    `Splitting resolution on ...`, `Solving split ... (requires-python: ...)`, and `Split ... resolution
    took ...`.
One difficulty in a forking resolver is that where splits occur depends on the order in which packages are seen, which in turn depends on the preferences, e.g., from `uv.lock`. So it is possible for the resolver to solve the requirements with specific forks, write this to the lockfile, and, when the resolver is invoked again, find a different solution because the preferences result in different fork points. To avoid this, the `resolution-markers` of each fork and each package that diverges between forks are written to the lockfile. When performing a new resolution, the forks from the lockfile are used to ensure the resolution is stable. When requirements change, new forks may be added to the saved forks.
To ensure that a resolution with `requires-python = ">=3.9"` can actually be installed for the included Python versions, uv requires that all dependencies have the same minimum Python version. Package versions that declare a higher minimum Python version, e.g., `requires-python = ">=3.10"`, are rejected, because a resolution with that version can't be installed on Python 3.9. For simplicity and forward compatibility, only lower bounds in `requires-python` are respected. For example, if a package declares `requires-python = ">=3.8,<4"`, the `<4` marker is not propagated to the entire resolution.
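The lower-bounds-only rule can be sketched as follows. This is a toy parser handling only `>=` clauses; real `requires-python` specifiers are much richer:

```python
def lower_bound(requires_python: str):
    """Extract the `>=` lower bound as a version tuple; default to (0,)."""
    for clause in requires_python.split(","):
        clause = clause.strip()
        if clause.startswith(">="):
            return tuple(int(part) for part in clause[2:].split("."))
    return (0,)


def accepts(project_requires: str, dependency_requires: str) -> bool:
    """A dependency is rejected if its minimum Python exceeds the project's.

    Upper bounds like `<4` are simply ignored, matching the lower-bounds-only rule.
    """
    return lower_bound(dependency_requires) <= lower_bound(project_requires)


print(accepts(">=3.9", ">=3.8,<4"))  # dependency supports 3.8 and up: accepted
print(accepts(">=3.9", ">=3.10"))    # dependency needs 3.10+: rejected for a 3.9 project
```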
While uv's resolution is universal with respect to environment markers, this doesn't extend to wheel tags. Wheel tags can encode the Python version, Python implementation, operating system, and architecture. For example, `torch-2.4.0-cp312-cp312-manylinux2014_aarch64.whl` is only compatible with CPython 3.12 on arm64 Linux with `glibc>=2.17` (per the `manylinux2014` policy), while `tqdm-4.66.4-py3-none-any.whl` works with all Python 3 versions and interpreters on any operating system and architecture. Most projects have a universally compatible source distribution that can be used when attempting to install a package that has no compatible wheel, but some packages, such as `torch`, don't publish a source distribution. In this case, an installation on, e.g., Python 3.13, an uncommon operating system, or an uncommon architecture will fail and complain that there is no matching wheel.
uv's performance is continually benchmarked against previous releases, and regularly compared to other tools in the space, like pip and Poetry.
The latest benchmarks and details on the benchmarking process can be found in the GitHub repository.
uv is licensed under either of

- Apache License, Version 2.0 (LICENSE-APACHE or https://www.apache.org/licenses/LICENSE-2.0)
- MIT License

at your option.
Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in uv by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any additional terms or conditions.
uv uses a custom versioning scheme in which the minor version number is bumped for breaking changes, and the patch version number is bumped for bug fixes, enhancements, and other non-breaking changes.
uv does not yet have a stable API; once uv's API is stable (v1.0.0), the versioning scheme will adhere to Semantic Versioning.
uv's changelog can be viewed on GitHub.
Cache versions are considered internal to uv, and so may be changed in a minor or patch release. See Cache versioning for more.
The `uv.lock` schema version is considered part of the public API, and so will only be incremented in a minor release as a breaking change. See Lockfile versioning for more.
uv has Tier 1 support for the following platforms:
- macOS (Apple Silicon)
- macOS (x86_64)
- Linux (x86_64)
- Windows (x86_64)
uv is continuously built, tested, and developed against its Tier 1 platforms. Inspired by the Rust project, Tier 1 can be thought of as "guaranteed to work".
uv has Tier 2 support ("guaranteed to build") for the following platforms:
- Linux (PPC64)
- Linux (PPC64LE)
- Linux (aarch64)
- Linux (armv7)
- Linux (i686)
- Linux (s390x)
uv ships pre-built wheels to PyPI for its Tier 1 and Tier 2 platforms. However, while Tier 2 platforms are continuously built, they are not continuously tested or developed against, and so stability may vary in practice.
Beyond the Tier 1 and Tier 2 platforms, uv is known to build on i686 Windows, and known not to build on aarch64 Windows, but does not consider either platform to be supported at this time. The minimum supported Windows versions are Windows 10 and Windows Server 2016, following Rust's own Tier 1 support.
uv supports and is tested against Python 3.8, 3.9, 3.10, 3.11, 3.12, and 3.13.
uv is designed as a drop-in replacement for common `pip` and `pip-tools` workflows.

Informally, the intent is such that existing `pip` and `pip-tools` users can switch to uv without making meaningful changes to their packaging workflows; and, in most cases, swapping out `pip install` for `uv pip install` should "just work".
However, uv is not intended to be an exact clone of `pip`, and the further you stray from common `pip` workflows, the more likely you are to encounter differences in behavior. In some cases, those differences may be known and intentional; in others, they may be the result of implementation details; and in others, they may be bugs.
This document outlines the known differences between uv and `pip`, along with rationale, workarounds, and a statement of intent for compatibility in the future.
uv does not read configuration files or environment variables that are specific to `pip`, like `pip.conf` or `PIP_INDEX_URL`.
Reading configuration files and environment variables intended for other tools has a number of drawbacks:
- It requires bug-for-bug compatibility with the target tool, since users end up relying on bugs in the format, the parser, etc.
- If the target tool changes the format in some way, uv is then locked-in to changing it in equivalent ways.
- If that configuration is versioned in some way, uv would need to know which version of the target tool the user is expecting to use.
- It prevents uv from introducing any settings or configuration that don't exist in the target tool, since otherwise `pip.conf` (or similar) would no longer be usable with `pip`.
- It can lead to user confusion, since uv would be reading settings that don't actually affect its behavior, and many users may not expect uv to read configuration files intended for other tools.
Instead, uv supports its own environment variables, like `UV_INDEX_URL`. uv also supports persistent configuration in a `uv.toml` file or a `[tool.uv.pip]` section of `pyproject.toml`. For more information, see Configuration files.
By default, uv will accept pre-release versions during dependency resolution in two cases:
- If the package is a direct dependency, and its version markers include a pre-release specifier (e.g., `flask>=2.0.0rc1`).
- If all published versions of a package are pre-releases.
If dependency resolution fails due to a transitive pre-release, uv will prompt the user to re-run with `--prerelease allow`, to allow pre-releases for all dependencies.
Alternatively, you can add the transitive dependency to your `requirements.in` file with a pre-release specifier (e.g., `flask>=2.0.0rc1`) to opt in to pre-release support for that specific dependency.
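A rough check for whether a specifier itself names a pre-release might look like this. It is a simplification of PEP 440's pre-release grammar, covering only `aN`/`bN`/`rcN` suffixes attached directly to a version:

```python
import re

# PEP 440 pre-release segments look like 2.0.0a1, 2.0.0b2, or 2.0.0rc1.
PRERELEASE = re.compile(r"\d(a|b|rc)\d+", re.IGNORECASE)


def mentions_prerelease(specifier: str) -> bool:
    """True if a requirement string names a pre-release version (toy check)."""
    return bool(PRERELEASE.search(specifier))


print(mentions_prerelease("flask>=2.0.0rc1"))  # opts the package into pre-releases
print(mentions_prerelease("flask>=2.0.0"))     # final release only
```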
In sum, uv needs to know upfront whether the resolver should accept pre-releases for a given package. `pip`, meanwhile, may respect pre-release identifiers in transitive dependencies depending on the order in which the resolver encounters the relevant specifiers (#1641).
Pre-releases are notoriously difficult to model, and are a frequent source of bugs in packaging tools. Even `pip`, which is viewed as a reference implementation, has a number of open questions around pre-release handling (#12469, #12470, #40505, etc.).
uv's pre-release handling is intentionally limited and intentionally requires user opt-in for
pre-releases, to ensure correctness.
In the future, uv may support pre-release identifiers in transitive dependencies. However, it's likely contingent on evolution in the Python packaging specifications. The existing PEPs do not cover "dependency resolution" and are instead focused on behavior for a single version specifier. As such, there are unresolved questions around the correct and intended behavior for pre-releases in the packaging ecosystem more broadly.
In both uv and `pip`, users can specify multiple package indexes from which to search for the available versions of a given package. However, uv and `pip` differ in how they handle packages that exist on multiple indexes.
For example, imagine that a company publishes an internal version of `requests` on a private index (`--extra-index-url`), but also allows installing packages from PyPI by default. In this case, the private `requests` would conflict with the public `requests` on PyPI.
When uv searches for a package across multiple indexes, it will iterate over the indexes in order (preferring the `--extra-index-url` over the default index), and stop searching as soon as it finds a match. This means that if a package exists on multiple indexes, uv will limit its candidate versions to those present in the first index that contains the package.
`pip`, meanwhile, will combine the candidate versions from all indexes, and select the best version from the combined set, though it makes no guarantees around the order in which it searches indexes, and expects that packages are unique up to name and version, even across indexes.
uv's behavior is such that if a package exists on an internal index, it should always be installed from the internal index, and never from PyPI. The intent is to prevent "dependency confusion" attacks, in which an attacker publishes a malicious package on PyPI with the same name as an internal package, thus causing the malicious package to be installed instead of the internal package. See, for example, the `torchtriton` attack from December 2022.
As of v0.1.39, users can opt in to `pip`-style behavior for multiple indexes via the `--index-strategy` command-line option, or the `UV_INDEX_STRATEGY` environment variable, which supports the following values:

- `first-index` (default): Search for each package across all indexes, limiting the candidate versions to those present in the first index that contains the package, prioritizing the `--extra-index-url` indexes over the default index URL.
- `unsafe-first-match`: Search for each package across all indexes, but prefer the first index with a compatible version, even if newer versions are available on other indexes.
- `unsafe-best-match`: Search for each package across all indexes, and select the best version from the combined set of candidate versions.
While `unsafe-best-match` is the closest to `pip`'s behavior, it exposes users to the risk of "dependency confusion" attacks.
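The difference between the two extremes can be sketched over toy index mappings. The dict-based index shape here is purely illustrative, not uv's internals:

```python
def candidate_versions(package, indexes, strategy="first-index"):
    """Collect candidate versions for a package from a prioritized list of indexes.

    `indexes` is a list of {package: [versions]} dicts, extra indexes first.
    """
    if strategy == "first-index":
        for index in indexes:
            if package in index:
                # Stop at the first index that knows the package at all.
                return list(index[package])
        return []
    # unsafe-best-match: merge candidates from every index before choosing.
    combined = []
    for index in indexes:
        combined.extend(index.get(package, []))
    return combined


private = {"requests": ["2.0.0"]}          # e.g. an --extra-index-url index
pypi = {"requests": ["2.31.0", "2.0.0"]}   # the default index

print(candidate_versions("requests", [private, pypi]))                       # private index only
print(candidate_versions("requests", [private, pypi], "unsafe-best-match"))  # merged candidates
```

With `first-index`, the public `requests` versions are never even considered, which is what defeats the dependency-confusion attack.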
uv also supports pinning packages to dedicated indexes (see: Indexes), such that a given package is always installed from a specific index.
uv uses PEP 517 build isolation by default (akin to `pip install --use-pep517`), following `pypa/build` and in anticipation of `pip` defaulting to PEP 517 builds in the future (pypa/pip#9175).
If a package fails to install due to a missing build-time dependency, try using a newer version of the package; if the problem persists, consider filing an issue with the package maintainer, requesting that they update the packaging setup to declare the correct PEP 517 build-time dependencies.
As an escape hatch, you can preinstall a package's build dependencies, then run `uv pip install` with `--no-build-isolation`, as in:

```console
$ uv pip install wheel && uv pip install --no-build-isolation biopython==1.77
```
For a list of packages that are known to fail under PEP 517 build isolation, see #2252.
While uv includes first-class support for URL dependencies (e.g., `ruff @ https://...`), it differs from `pip` in its handling of transitive URL dependencies in two ways.
First, uv makes the assumption that non-URL dependencies do not introduce URL dependencies into the resolution. In other words, it assumes that dependencies fetched from a registry do not themselves depend on URLs. If a non-URL dependency does introduce a URL dependency, uv will reject the URL dependency during resolution. (Note that PyPI does not allow published packages to depend on URL dependencies; other registries may be more permissive.)
Second, if a constraint (`--constraint`) or override (`--override`) is defined using a direct URL dependency, and the constrained package has a direct URL dependency of its own, uv may reject that transitive direct URL dependency during resolution, if the URL isn't referenced elsewhere in the set of input requirements.
If uv rejects a transitive URL dependency, the best course of action is to provide the URL dependency as a direct dependency in the relevant `pyproject.toml` or `requirements.in` file, as the above constraints do not apply to direct dependencies.
`uv pip install` and `uv pip sync` are designed to work with virtual environments by default.

Specifically, uv will always install packages into the currently active virtual environment, or search for a virtual environment named `.venv` in the current directory or any parent directory (even if it is not activated).
This differs from pip
, which will install packages into a global environment if no virtual
environment is active, and will not search for inactive virtual environments.
In uv, you can install into non-virtual environments by providing a path to a Python executable via
the --python /path/to/python
option, or via the --system
flag, which installs into the first
Python interpreter found on the PATH
, like pip
.
In other words, uv inverts the default, requiring explicit opt-in to installing into the system Python, which can lead to breakages and other complications, and should only be done in limited circumstances.
For more, see "Using arbitrary Python environments".
For a given set of dependency specifiers, it's often the case that there is no single "correct" set of packages to install. Instead, there are many valid sets of packages that satisfy the specifiers.
Neither `pip` nor uv makes any guarantees about the exact set of packages that will be installed; only that the resolution will be consistent, deterministic, and compliant with the specifiers. As such, in some cases, `pip` and uv will yield different resolutions; however, both resolutions should be equally valid.
For example, consider the following requirements:

```
starlette
fastapi
```
At time of writing, the most recent `starlette` version is `0.37.2`, and the most recent `fastapi` version is `0.110.0`. However, `fastapi==0.110.0` also depends on `starlette`, and introduces an upper bound: `starlette>=0.36.3,<0.37.0`.
If a resolver prioritizes including the most recent version of `starlette`, it would need to use an older version of `fastapi` that excludes the upper bound on `starlette`. In practice, this requires falling back to `fastapi==0.1.17`:
```
# This file was autogenerated by uv via the following command:
#    uv pip compile requirements.in
annotated-types==0.6.0
    # via pydantic
anyio==4.3.0
    # via starlette
fastapi==0.1.17
idna==3.6
    # via anyio
pydantic==2.6.3
    # via fastapi
pydantic-core==2.16.3
    # via pydantic
sniffio==1.3.1
    # via anyio
starlette==0.37.2
    # via fastapi
typing-extensions==4.10.0
    # via
    #   pydantic
    #   pydantic-core
```
Alternatively, if a resolver prioritizes including the most recent version of `fastapi`, it would need to use an older version of `starlette` that satisfies the upper bound. In practice, this requires falling back to `starlette==0.36.3`:
```
# This file was autogenerated by uv via the following command:
#    uv pip compile requirements.in
annotated-types==0.6.0
    # via pydantic
anyio==4.3.0
    # via starlette
fastapi==0.110.0
idna==3.6
    # via anyio
pydantic==2.6.3
    # via fastapi
pydantic-core==2.16.3
    # via pydantic
sniffio==1.3.1
    # via anyio
starlette==0.36.3
    # via fastapi
typing-extensions==4.10.0
    # via
    #   fastapi
    #   pydantic
    #   pydantic-core
```
When uv resolutions differ from `pip` in undesirable ways, it's often a sign that the specifiers are too loose, and that the user should consider tightening them. For example, in the case of `starlette` and `fastapi`, the user could require `fastapi>=0.110.0`.
At present, `uv pip check` will surface the following diagnostics:

- A package has no `METADATA` file, or the `METADATA` file can't be parsed.
- A package has a `Requires-Python` that doesn't match the Python version of the running interpreter.
- A package has a dependency on a package that isn't installed.
- A package has a dependency on a package that's installed, but at an incompatible version.
- Multiple versions of a package are installed in the virtual environment.
In some cases, `uv pip check` will surface diagnostics that `pip check` does not, and vice versa. For example, unlike `uv pip check`, `pip check` will not warn when multiple versions of a package are installed in the current environment.
uv does not support the `--user` flag, which installs packages based on the `user` install scheme. Instead, we recommend the use of virtual environments to isolate package installations.
Additionally, pip will fall back to the `user` install scheme if it detects that the user does not have write permissions to the target directory, as is the case on some systems when installing into the system Python. uv does not implement any such fallback.

For more, see #2077.
The `--only-binary` argument is used to restrict installation to pre-built binary distributions. When `--only-binary :all:` is provided, both pip and uv will refuse to build source distributions from PyPI and other registries.
However, when a dependency is provided as a direct URL (e.g., `uv pip install https://...`), pip does not enforce `--only-binary`, and will build source distributions for all such packages. uv, meanwhile, does enforce `--only-binary` for direct URL dependencies, with one exception: given `uv pip install https://... --only-binary flask`, uv will build the source distribution at the given URL if it cannot infer the package name ahead of time, since uv can't determine whether the package is "allowed" in such cases without building its metadata.
Both pip and uv allow editable requirements to be built and installed even when `--only-binary` is provided. For example, `uv pip install -e . --only-binary :all:` is allowed.
The `--no-binary` argument is used to restrict installation to source distributions. When `--no-binary` is provided, uv will refuse to install pre-built binary distributions, but will reuse any binary distributions that are already present in the local cache.
Additionally, and in contrast to pip, uv's resolver will still read metadata from pre-built binary distributions when `--no-binary` is provided.
PEP 600 describes a mechanism through which Python distributors can opt out of `manylinux` compatibility by defining a `manylinux_compatible` function on the `_manylinux` standard library module.

uv respects `manylinux_compatible`, but only tests against the current glibc version, and applies the return value of `manylinux_compatible` globally.
In other words, if `manylinux_compatible` returns `True`, uv will treat the system as `manylinux`-compatible; if it returns `False`, uv will treat the system as `manylinux`-incompatible, without calling `manylinux_compatible` for every glibc version.
This approach is not a complete implementation of the spec, but is compatible with common blanket `manylinux_compatible` implementations like `no-manylinux`:
```python
from __future__ import annotations

manylinux1_compatible = False
manylinux2010_compatible = False
manylinux2014_compatible = False


def manylinux_compatible(*_, **__):  # PEP 600
    return False
```
Unlike pip, uv does not compile `.py` files to `.pyc` files during installation by default (i.e., uv does not create or populate `__pycache__` directories). To enable bytecode compilation during installs, pass the `--compile-bytecode` flag to `uv pip install` or `uv pip sync`.
uv tends to be stricter than `pip`, and will often reject packages that `pip` would install. For example, uv rejects HTML indexes with invalid URL fragments (see: PEP 503), while `pip` will ignore such fragments.
In some cases, uv implements lenient behavior for popular packages that are known to have specific spec compliance issues.
If uv rejects a package that `pip` would install due to a spec violation, the best course of action is to first attempt to install a newer version of the package; and, if that fails, to report the issue to the package maintainer.
uv does not support the complete set of `pip`'s command-line options and subcommands, although it does support a large subset.

Missing options and subcommands are prioritized based on user demand and the complexity of the implementation, and tend to be tracked in individual issues.
If you encounter a missing option or subcommand, please search the issue tracker to see if it has already been reported, and if not, consider opening a new issue. Feel free to upvote any existing issues to convey your interest.
uv does not support `pip`'s `auto` or `import` options for `--keyring-provider`. At present, only the `subprocess` option is supported.
Unlike `pip`, uv does not enable keyring authentication by default.
Unlike `pip`, uv does not wait until a request returns an HTTP 401 before searching for authentication. uv attaches authentication to all requests for hosts with credentials available.
uv does not support features that are considered legacy or deprecated in `pip`. For example, uv does not support `.egg`-style distributions.
However, uv does have partial support for (1) `.egg-info`-style distributions (which are occasionally found in Docker images and Conda environments) and (2) legacy editable `.egg-link`-style distributions.
Specifically, uv does not support installing new `.egg-info`- or `.egg-link`-style distributions, but will respect any such existing distributions during resolution, list them with `uv pip list` and `uv pip freeze`, and uninstall them with `uv pip uninstall`.
When constraints are provided via `--constraint` (or `UV_CONSTRAINT`), uv will not apply the constraints when resolving build dependencies (i.e., to build a source distribution). Instead, build constraints should be provided via the dedicated `--build-constraint` (or `UV_BUILD_CONSTRAINT`) setting.
pip, meanwhile, applies constraints to build dependencies when specified via `PIP_CONSTRAINT`, but not when provided via `--constraint` on the command line.
For example, to ensure that `setuptools 60.0.0` is used to build any packages with a build dependency on `setuptools`, use `--build-constraint`, rather than `--constraint`.
There are a few small but notable differences in the default behaviors of `uv pip compile` and `pip-compile` from `pip-tools`.
By default, uv does not write the compiled requirements to an output file. Instead, uv requires that the user specify an output file explicitly with the `-o` or `--output-file` option.
By default, uv strips extras when outputting the compiled requirements. In other words, uv defaults to `--strip-extras`, while `pip-compile` defaults to `--no-strip-extras`. `pip-compile` is scheduled to change this default in the next major release (v8.0.0), at which point both tools will default to `--strip-extras`. To retain extras with uv, pass the `--no-strip-extras` flag to `uv pip compile`.
By default, uv does not write any index URLs to the output file, while `pip-compile` outputs any `--index-url` or `--extra-index-url` that does not match the default (PyPI). To include index URLs in the output file, pass the `--emit-index-url` flag to `uv pip compile`. Unlike `pip-compile`, uv will include all index URLs when `--emit-index-url` is passed, including the default index URL.
When evaluating Python versions against `requires-python` specifiers, uv truncates the candidate version to the major, minor, and patch components, ignoring (e.g.) pre-release and post-release identifiers.
For example, a project that declares `requires-python: >=3.13` will accept Python 3.13.0b1. While 3.13.0b1 is not strictly greater than 3.13, it is greater than 3.13 when the pre-release identifier is omitted.
While this is not strictly compliant with PEP 440, it is consistent with pip.
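The truncation behavior described above can be sketched as follows. This is an illustrative model, not uv's implementation; the helper names are hypothetical:

```python
# Sketch of requires-python truncation: compare only the
# (major, minor, patch) release components, ignoring pre-release
# and post-release identifiers.
import re


def release_components(version: str) -> tuple[int, int, int]:
    """Extract major.minor.patch, dropping suffixes like 'b1' or '.post1'."""
    match = re.match(r"(\d+)(?:\.(\d+))?(?:\.(\d+))?", version)
    if match is None:
        raise ValueError(f"invalid version: {version}")
    major, minor, patch = (int(g) if g else 0 for g in match.groups())
    return (major, minor, patch)


def satisfies_lower_bound(candidate: str, bound: str) -> bool:
    """Check candidate >= bound on the truncated release components."""
    return release_components(candidate) >= release_components(bound)


# 3.13.0b1 truncates to (3, 13, 0), so it satisfies >=3.13.
print(satisfies_lower_bound("3.13.0b1", "3.13"))  # True
```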
There are usually many possible solutions given a set of requirements, and a resolver must choose between them. uv's resolver and pip's resolver have a different set of package priorities. While both resolvers use the user-provided order as one of their priorities, pip has additional priorities that uv does not have. Hence, uv is more likely to be affected by a change in user order than pip is.
For example, `uv pip install foo bar` prioritizes newer versions of `foo` over `bar` and could result in a different resolution than `uv pip install bar foo`. Similarly, this behavior applies to the ordering of requirements in input files for `uv pip compile`.
uv provides a drop-in replacement for common `pip`, `pip-tools`, and `virtualenv` commands. These commands work directly with the virtual environment, in contrast to uv's primary interfaces where the virtual environment is managed automatically. The `uv pip` interface exposes the speed and functionality of uv to power users and projects that are not ready to transition away from `pip` and `pip-tools`.
The following sections discuss the basics of using `uv pip`:
- Creating and using environments
- Installing and managing packages
- Inspecting environments and packages
- Declaring package dependencies
- Locking and syncing environments
Please note these commands do not exactly implement the interfaces and behavior of the tools they are based on. The further you stray from common workflows, the more likely you are to encounter differences. Consult the pip-compatibility guide for details.
!!! important

    uv does not rely on or invoke pip. The pip interface is named as such to highlight its dedicated
    purpose of providing low-level commands that match pip's interface and to separate it from the
    rest of uv's commands which operate at a higher level of abstraction.
To install a package into the virtual environment, e.g., Flask:

```console
$ uv pip install flask
```

To install a package with optional dependencies enabled, e.g., Flask with the "dotenv" extra:

```console
$ uv pip install "flask[dotenv]"
```

To install multiple packages, e.g., Flask and Ruff:

```console
$ uv pip install flask ruff
```

To install a package with a constraint, e.g., Ruff v0.2.0 or newer:

```console
$ uv pip install 'ruff>=0.2.0'
```

To install a package at a specific version, e.g., Ruff v0.3.0:

```console
$ uv pip install 'ruff==0.3.0'
```

To install a package from disk:

```console
$ uv pip install "ruff @ ./projects/ruff"
```

To install a package from GitHub:

```console
$ uv pip install "git+https://github.com/astral-sh/ruff"
```

To install a package from GitHub at a specific reference:

```console
$ # Install a tag
$ uv pip install "git+https://github.com/astral-sh/ruff@v0.2.0"
$ # Install a commit
$ uv pip install "git+https://github.com/astral-sh/ruff@1fadefa67b26508cc59cf38e6130bde2243c929d"
$ # Install a branch
$ uv pip install "git+https://github.com/astral-sh/ruff@main"
```

See the Git authentication documentation for installation from a private repository.

Editable packages do not need to be reinstalled for changes to their source code to take effect.

To install the current project as an editable package:

```console
$ uv pip install -e .
```

To install a project in another directory as an editable package:

```console
$ uv pip install -e "ruff @ ./project/ruff"
```
Multiple packages can be installed at once from standard file formats.

Install from a `requirements.txt` file:

```console
$ uv pip install -r requirements.txt
```

See the `uv pip compile` documentation for more information on `requirements.txt` files.

Install from a `pyproject.toml` file:

```console
$ uv pip install -r pyproject.toml
```

Install from a `pyproject.toml` file with optional dependencies enabled, e.g., the "foo" extra:

```console
$ uv pip install -r pyproject.toml --extra foo
```

Install from a `pyproject.toml` file with all optional dependencies enabled:

```console
$ uv pip install -r pyproject.toml --all-extras
```
To uninstall a package, e.g., Flask:

```console
$ uv pip uninstall flask
```

To uninstall multiple packages, e.g., Flask and Ruff:

```console
$ uv pip uninstall flask ruff
```
Locking is the practice of writing the exact version of a dependency, e.g., `ruff`, to a file. When working with many dependencies, it is useful to lock the exact versions so the environment can be reproduced. Without locking, the versions of dependencies could change over time, when using a different tool, or across platforms.
uv allows dependencies to be locked in the `requirements.txt` format. It is recommended to use the standard `pyproject.toml` to define dependencies, but other dependency formats are supported as well. See the documentation on declaring dependencies for more details on how to define dependencies.
To lock dependencies declared in a `pyproject.toml`:

```console
$ uv pip compile pyproject.toml -o requirements.txt
```

Note that, by default, the `uv pip compile` output is just displayed; the `--output-file` / `-o` argument is needed to write to a file.
To lock dependencies declared in a `requirements.in`:

```console
$ uv pip compile requirements.in -o requirements.txt
```

To lock dependencies declared in multiple files:

```console
$ uv pip compile pyproject.toml requirements-dev.in -o requirements-dev.txt
```

uv also supports legacy `setup.py` and `setup.cfg` formats. To lock dependencies declared in a `setup.py`:

```console
$ uv pip compile setup.py -o requirements.txt
```

To lock dependencies from stdin, use `-`:

```console
$ echo "ruff" | uv pip compile -
```

To lock with optional dependencies enabled, e.g., the "foo" extra:

```console
$ uv pip compile pyproject.toml --extra foo
```

To lock with all optional dependencies enabled:

```console
$ uv pip compile pyproject.toml --all-extras
```
Note that extras are not supported with the `requirements.in` format.
When using an output file, uv will consider the versions pinned in an existing output file. If a dependency is pinned it will not be upgraded on a subsequent compile run. For example:
```console
$ echo "ruff==0.3.0" > requirements.txt
$ echo "ruff" | uv pip compile - -o requirements.txt
# This file was autogenerated by uv via the following command:
#    uv pip compile - -o requirements.txt
ruff==0.3.0
```
To upgrade a dependency, use the `--upgrade-package` flag:

```console
$ uv pip compile - -o requirements.txt --upgrade-package ruff
```

To upgrade all dependencies, there is an `--upgrade` flag.
Dependencies can be installed directly from their definition files or from compiled `requirements.txt` files with `uv pip install`. See the documentation on installing packages from files for more details.
When installing with `uv pip install`, packages that are already installed will not be removed unless they conflict with the lockfile. This means that the environment can have dependencies that aren't declared in the lockfile, which isn't great for reproducibility. To ensure the environment exactly matches the lockfile, use `uv pip sync` instead.
To sync an environment with a `requirements.txt` file:

```console
$ uv pip sync requirements.txt
```

To sync an environment with a `pyproject.toml` file:

```console
$ uv pip sync pyproject.toml
```
Constraints files are `requirements.txt`-like files that only control the version of a requirement that's installed. However, including a package in a constraints file will not trigger the installation of that package. Constraints can be used to add bounds to dependencies that are not dependencies of the current project.
To define a constraint, define a bound for a package:

```
pydantic<2.0
```

To use a constraints file:

```console
$ uv pip compile requirements.in --constraint constraints.txt
```
Note that multiple constraints can be defined in each file and multiple files can be used.
Overrides files are `requirements.txt`-like files that force a specific version of a requirement to be installed, regardless of the requirements declared by any constituent package, and regardless of whether this would be considered an invalid resolution.
While constraints are additive, in that they're combined with the requirements of the constituent packages, overrides are absolute, in that they completely replace the requirements of the constituent packages.
Overrides are most often used to remove upper bounds from a transitive dependency. For example, if `a` requires `c>=1.0,<2.0` and `b` requires `c>=2.0` and the current project requires `a` and `b`, then the dependencies cannot be resolved.
To define an override, define the new requirement for the problematic package:

```
c>=2.0
```

To use an overrides file:

```console
$ uv pip compile requirements.in --override overrides.txt
```
Now, resolution can succeed. However, note that if `a` is correct that it does not support `c>=2.0`, then a runtime error will likely be encountered when using the packages.
Note that multiple overrides can be defined in each file and multiple files can be used.
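The additive-versus-absolute distinction can be illustrated with a toy model. This is not uv's implementation; the function names are hypothetical and requirements are modeled as plain strings:

```python
# Toy model: constraints combine with a package's declared requirements,
# while overrides replace them outright.


def apply_constraints(declared: list[str], constraints: list[str]) -> list[str]:
    # Constraints are additive: every declared specifier still applies.
    return declared + constraints


def apply_overrides(declared: list[str], overrides: list[str]) -> list[str]:
    # Overrides are absolute: the declared specifiers are discarded.
    return overrides if overrides else declared


# `a` requires c>=1.0,<2.0 and `b` requires c>=2.0 — unsatisfiable together.
requirements_for_c = ["c>=1.0,<2.0", "c>=2.0"]

# A constraint cannot fix this: the conflicting bounds remain in force.
print(apply_constraints(requirements_for_c, ["c>=2.0"]))
# ['c>=1.0,<2.0', 'c>=2.0', 'c>=2.0']

# An override discards the declared bounds, so only c>=2.0 remains.
print(apply_overrides(requirements_for_c, ["c>=2.0"]))
# ['c>=2.0']
```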
It is best practice to declare dependencies in a static file instead of modifying environments with ad-hoc installations. Once dependencies are defined, they can be locked to create a consistent, reproducible environment.
The `pyproject.toml` file is the Python standard for defining configuration for a project.
To define project dependencies in a `pyproject.toml` file:

```toml
[project]
dependencies = [
  "httpx",
  "ruff>=0.3.0",
]
```
To define optional dependencies in a `pyproject.toml` file:

```toml
[project.optional-dependencies]
cli = [
  "rich",
  "click",
]
```
Each of the keys defines an "extra", which can be installed using the `--extra` and `--all-extras` flags or `package[<extra>]` syntax. See the documentation on installing packages for more details.
See the official `pyproject.toml` guide for more details on getting started with a `pyproject.toml`.
It is also common to use a lightweight `requirements.txt` format to declare the dependencies for the project. Each requirement is defined on its own line. Commonly, this file is called `requirements.in` to distinguish it from `requirements.txt`, which is used for the locked dependencies.
To define dependencies in a `requirements.in` file:

```
httpx
ruff>=0.3.0
```
Optional dependency groups are not supported in this format.
To list all of the packages in the environment:

```console
$ uv pip list
```

To list the packages in a JSON format:

```console
$ uv pip list --format json
```

To list all of the packages in the environment in a `requirements.txt` format:

```console
$ uv pip freeze
```

To show information about an installed package, e.g., `numpy`:

```console
$ uv pip show numpy
```
Multiple packages can be inspected at once.
It is possible to install packages with conflicting requirements into an environment if installed in multiple steps.
To check for conflicts or missing dependencies in the environment:

```console
$ uv pip check
```
Each Python installation has an environment that is active when Python is used. Packages can be installed into an environment to make their modules available from your Python scripts. Generally, it is considered best practice not to modify a Python installation's environment. This is especially important for Python installations that come with the operating system, which often manage the packages themselves. A virtual environment is a lightweight way to isolate packages from a Python installation's environment. Unlike `pip`, uv requires using a virtual environment by default.
uv supports creating virtual environments, e.g., to create a virtual environment at `.venv`:

```console
$ uv venv
```

A specific name or path can be specified, e.g., to create a virtual environment at `my-name`:

```console
$ uv venv my-name
```

A Python version can be requested, e.g., to create a virtual environment with Python 3.11:

```console
$ uv venv --python 3.11
```
Note that this requires the requested Python version to be available on the system; if it is unavailable, uv will download Python for you. See the Python version documentation for more details.
When using the default virtual environment name, uv will automatically find and use the virtual environment during subsequent invocations.
```console
$ uv venv
$ # Install a package in the new virtual environment
$ uv pip install ruff
```
The virtual environment can be "activated" to make its packages available:
=== "macOS and Linux"
```console
$ source .venv/bin/activate
```
=== "Windows"
```console
$ .venv\Scripts\activate
```
Since uv has no dependency on Python, it can install into virtual environments other than its own. For example, setting `VIRTUAL_ENV=/path/to/venv` will cause uv to install into `/path/to/venv`, regardless of where uv is installed. Note that if `VIRTUAL_ENV` is set to a directory that is not a PEP 405 compliant virtual environment, it will be ignored.
uv can also install into arbitrary, even non-virtual environments, with the `--python` argument provided to `uv pip sync` or `uv pip install`. For example, `uv pip install --python /path/to/python` will install into the environment linked to the `/path/to/python` interpreter.
For convenience, `uv pip install --system` will install into the system Python environment. Using `--system` is roughly equivalent to `uv pip install --python $(which python)`, but note that executables that are linked to virtual environments will be skipped. Although we generally recommend using virtual environments for dependency management, `--system` is appropriate in continuous integration and containerized environments.
The `--system` flag is also used to opt in to mutating system environments. For example, the `--python` argument can be used to request a Python version (e.g., `--python 3.12`), and uv will search for an interpreter that meets the request. If uv finds a system interpreter (e.g., `/usr/lib/python3.12`), then the `--system` flag is required to allow modification of this non-virtual Python environment. Without the `--system` flag, uv will ignore any interpreters that are not in virtual environments. Conversely, when the `--system` flag is provided, uv will ignore any interpreters that are in virtual environments.
Installing into system Python across platforms and distributions is notoriously difficult. uv supports the common cases, but will not work in all cases. For example, installing into system Python on Debian prior to Python 3.10 is unsupported due to the distribution's patching of `distutils` (but not `sysconfig`).

While we always recommend the use of virtual environments, uv considers them to be required in these non-standard environments.
If uv is installed in a Python environment, e.g., with `pip`, it can still be used to modify other environments. However, when invoked with `python -m uv`, uv will default to using the parent interpreter's environment. Invoking uv via Python adds startup overhead and is not recommended for general usage.
uv itself does not depend on Python, but it does need to locate a Python environment to (1) install dependencies into the environment and (2) build source distributions.
When running a command that mutates an environment, such as `uv pip sync` or `uv pip install`, uv will search for a virtual environment in the following order:

- An activated virtual environment based on the `VIRTUAL_ENV` environment variable.
- An activated Conda environment based on the `CONDA_PREFIX` environment variable.
- A virtual environment at `.venv` in the current directory, or in the nearest parent directory.
If no virtual environment is found, uv will prompt the user to create one in the current directory via `uv venv`.
If the `--system` flag is included, uv will skip virtual environments and search for an installed Python version. Similarly, when running a command that does not mutate the environment, such as `uv pip compile`, uv does not require a virtual environment; however, a Python interpreter is still required. See the documentation on Python discovery for details on the discovery of installed Python versions.
Tools are Python packages that provide command-line interfaces.
!!! note
See the [tools guide](../guides/tools.md) for an introduction to working with the tools
interface — this document discusses details of tool management.
uv includes a dedicated interface for interacting with tools. Tools can be invoked without installation using `uv tool run`, in which case their dependencies are installed in a temporary virtual environment isolated from the current project.
Because it is very common to run tools without installing them, a `uvx` alias is provided for `uv tool run`; the two commands are exactly equivalent. For brevity, the documentation will mostly refer to `uvx` instead of `uv tool run`.
Tools can also be installed with `uv tool install`, in which case their executables are available on the `PATH`; an isolated virtual environment is still used, but it is not removed when the command completes.
In most cases, executing a tool with `uvx` is more appropriate than installing the tool. Installing the tool is useful if you need the tool to be available to other programs on your system, e.g., if some script you do not control requires the tool, or if you are in a Docker image and want to make the tool available to users.
When running a tool with `uvx`, a virtual environment is stored in the uv cache directory and is treated as disposable, i.e., if you run `uv cache clean`, the environment will be deleted. The environment is only cached to reduce the overhead of repeated invocations. If the environment is removed, a new one will be created automatically.
When installing a tool with `uv tool install`, a virtual environment is created in the uv tools directory. The environment will not be removed unless the tool is uninstalled. If the environment is manually deleted, the tool will fail to run.
Unless a specific version is requested, `uv tool install` will install the latest available version of the requested tool. `uvx` will use the latest available version of the requested tool on the first invocation. After that, `uvx` will use the cached version of the tool unless a different version is requested, the cache is pruned, or the cache is refreshed.
For example, to run a specific version of Ruff:

```console
$ uvx ruff@0.6.0 --version
ruff 0.6.0
```
A subsequent invocation of `uvx` will use the latest, not the cached, version.

```console
$ uvx ruff --version
ruff 0.6.2
```
But, if a new version of Ruff was released, it would not be used unless the cache was refreshed.

To request the latest version of Ruff and refresh the cache, use the `@latest` suffix:

```console
$ uvx ruff@latest --version
0.6.2
```
Once a tool is installed with `uv tool install`, `uvx` will use the installed version by default.

For example, after installing an older version of Ruff:

```console
$ uv tool install ruff==0.5.0
```
The version of `ruff` and `uvx ruff` is the same:

```console
$ ruff --version
ruff 0.5.0
$ uvx ruff --version
ruff 0.5.0
```
However, you can ignore the installed version by requesting the latest version explicitly, e.g.:

```console
$ uvx ruff@latest --version
0.6.2
```

Or, by using the `--isolated` flag, which will avoid refreshing the cache but ignore the installed version:

```console
$ uvx --isolated ruff --version
0.6.2
```
`uv tool install` will also respect the `{package}@{version}` and `{package}@latest` specifiers, as in:

```console
$ uv tool install ruff@latest
$ uv tool install ruff@0.6.0
```
By default, the uv tools directory is named `tools` and is in the uv application state directory, e.g., `~/.local/share/uv/tools`. The location may be customized with the `UV_TOOL_DIR` environment variable.
To display the path to the tool installation directory:

```console
$ uv tool dir
```

Tool environments are placed in a directory with the same name as the tool package, e.g., `.../tools/<name>`.
Tool environments are not intended to be mutated directly. It is strongly recommended never to mutate a tool environment manually with a `pip` operation.
Tool environments may be upgraded via `uv tool upgrade`, or re-created entirely via subsequent `uv tool install` operations.
To upgrade all packages in a tool environment:

```console
$ uv tool upgrade black
```

To upgrade a single package in a tool environment:

```console
$ uv tool upgrade black --upgrade-package click
```

To reinstall all packages in a tool environment:

```console
$ uv tool upgrade black --reinstall
```

To reinstall a single package in a tool environment:

```console
$ uv tool upgrade black --reinstall-package click
```
Tool upgrades will respect the version constraints provided when installing the tool. For example, `uv tool install "black>=23,<24"` followed by `uv tool upgrade black` will upgrade Black to the latest version in the range `>=23,<24`.

To instead replace the version constraints, re-install the tool with `uv tool install`:

```console
$ uv tool install "black>=24"
```
Similarly, tool upgrades will retain the settings provided when installing the tool. For example, `uv tool install black --prerelease allow` followed by `uv tool upgrade black` will retain the `--prerelease allow` setting.
Tool upgrades will reinstall the tool executables, even if they have not changed.
Additional packages can be included during tool execution:

```console
$ uvx --with <extra-package> <tool>
```

And, during tool installation:

```console
$ uv tool install --with <extra-package> <tool-package>
```
The `--with` option can be provided multiple times to include additional packages.

The `--with` option supports package specifications, so a specific version can be requested:

```console
$ uvx --with <extra-package>==<version> <tool-package>
```
If the requested version conflicts with the requirements of the tool package, package resolution will fail and the command will error.
Tool executables include all console entry points, script entry points, and binary scripts provided by a Python package. Tool executables are symlinked into the `bin` directory on Unix and copied on Windows.
Executables are installed into the user `bin` directory following the XDG standard, e.g.,
`~/.local/bin`. Unlike other directory schemes in uv, the XDG standard is used on all platforms,
notably including Windows and macOS — there is no clear alternative location to place executables
on these platforms. The installation directory is determined from the first available environment
variable:
- `$UV_TOOL_BIN_DIR`
- `$XDG_BIN_HOME`
- `$XDG_DATA_HOME/../bin`
- `$HOME/.local/bin`
Executables provided by dependencies of tool packages are not installed.
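The first-available fallback above can be sketched as a simple lookup. This is a hypothetical illustration of the documented order, not uv's actual implementation (which is in Rust):

```python
import os

def tool_bin_dir(env: dict) -> str:
    """Resolve the tool executable directory from the first available
    environment variable, following the documented fallback order."""
    if "UV_TOOL_BIN_DIR" in env:
        return env["UV_TOOL_BIN_DIR"]
    if "XDG_BIN_HOME" in env:
        return env["XDG_BIN_HOME"]
    if "XDG_DATA_HOME" in env:
        # The `bin` directory that is a sibling of the XDG data home.
        return os.path.join(env["XDG_DATA_HOME"], "..", "bin")
    return os.path.join(env["HOME"], ".local", "bin")
```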
The `bin` directory must be in the `PATH` variable for tool executables to be available from the
shell. If it is not in the `PATH`, a warning will be displayed. The `uv tool update-shell` command
can be used to add the `bin` directory to the `PATH` in common shell configuration files.
Installation of tools will not overwrite executables in the `bin` directory that were not
previously installed by uv. For example, if `pipx` has been used to install a tool,
`uv tool install` will fail. The `--force` flag can be used to override this behavior.
The invocation `uv tool run <name>` (or `uvx <name>`) is nearly equivalent to:
$ uv run --no-project --with <name> -- <name>
However, there are a couple of notable differences when using uv's tool interface:
- The `--with` option is not needed — the required package is inferred from the command name.
- The temporary environment is cached in a dedicated location.
- The `--no-project` flag is not needed — tools are always run isolated from the project.
- If a tool is already installed, `uv tool run` will use the installed version, but `uv run` will
  not.
If the tool should not be isolated from the project, e.g., when running `pytest` or `mypy`, then
`uv run` should be used instead of `uv tool run`.
uv uses aggressive caching to avoid re-downloading (and re-building) dependencies that have already been accessed in prior runs.
The specifics of uv's caching semantics vary based on the nature of the dependency:
- For registry dependencies (like those downloaded from PyPI), uv respects HTTP caching headers.
- For direct URL dependencies, uv respects HTTP caching headers, and also caches based on the URL itself.
- For Git dependencies, uv caches based on the fully-resolved Git commit hash. As such,
  `uv pip compile` will pin Git dependencies to a specific commit hash when writing the resolved
  dependency set.
- For local dependencies, uv caches based on the last-modified time of the source archive (i.e.,
  the local `.whl` or `.tar.gz` file). For directories, uv caches based on the last-modified time
  of the `pyproject.toml`, `setup.py`, or `setup.cfg` file.
If you're running into caching issues, uv includes a few escape hatches:
- To force uv to revalidate cached data for all dependencies, pass `--refresh` to any command
  (e.g., `uv sync --refresh` or `uv pip install --refresh ...`).
- To force uv to revalidate cached data for a specific dependency, pass `--refresh-package` to any
  command (e.g., `uv sync --refresh-package flask` or `uv pip install --refresh-package flask ...`).
- To force uv to ignore existing installed versions, pass `--reinstall` to any installation command
  (e.g., `uv sync --reinstall` or `uv pip install --reinstall ...`).
By default, uv will only rebuild and reinstall local directory dependencies (e.g., editables) if
the `pyproject.toml`, `setup.py`, or `setup.cfg` file in the directory root has changed. This is a
heuristic and, in some cases, may lead to fewer re-installs than desired.
To incorporate other information into the cache key for a given package, you can add cache key
entries under `tool.uv.cache-keys`, which can include both file paths and Git commit hashes.
For example, if a project uses `setuptools-scm`, and should be rebuilt whenever the commit hash
changes, you can add the following to the project's `pyproject.toml`:
[tool.uv]
cache-keys = [{ git = { commit = true } }]
If your dynamic metadata incorporates information from the set of Git tags, you can expand the cache key to include the tags:
[tool.uv]
cache-keys = [{ git = { commit = true, tags = true } }]
Similarly, if a project reads from a `requirements.txt` to populate its dependencies, you can add
the following to the project's `pyproject.toml`:
[tool.uv]
cache-keys = [{ file = "requirements.txt" }]
Globs are supported, following the syntax of the `glob` crate. For example, to invalidate the
cache whenever a `.toml` file in the project directory or any of its subdirectories is modified,
use the following:
[tool.uv]
cache-keys = [{ file = "**/*.toml" }]
!!! note
The use of globs can be expensive, as uv may need to walk the filesystem to determine whether any files have changed.
This may, in turn, require traversal of large or deeply nested directories.
As an escape hatch, if a project uses `dynamic` metadata that isn't covered by
`tool.uv.cache-keys`, you can instruct uv to always rebuild and reinstall it by adding the project
to the `tool.uv.reinstall-package` list:
[tool.uv]
reinstall-package = ["my-package"]
This will force uv to rebuild and reinstall `my-package` on every run, regardless of whether the
package's `pyproject.toml`, `setup.py`, or `setup.cfg` file has changed.
It's safe to run multiple uv commands concurrently, even against the same virtual environment. uv's cache is designed to be thread-safe and append-only, and thus robust to multiple concurrent readers and writers. uv applies a file-based lock to the target virtual environment when installing, to avoid concurrent modifications across processes.
Note that it's not safe to modify the uv cache (e.g., with `uv cache clean`) while other uv
commands are running, and it is never safe to modify the cache directly (e.g., by removing a file
or directory).
uv provides a few different mechanisms for removing entries from the cache:
- `uv cache clean` removes all cache entries from the cache directory, clearing it out entirely.
- `uv cache clean ruff` removes all cache entries for the `ruff` package, useful for invalidating
  the cache for a single or finite set of packages.
- `uv cache prune` removes all unused cache entries. For example, the cache directory may contain
  entries created in previous uv versions that are no longer necessary and can be safely removed.
  `uv cache prune` is safe to run periodically, to keep the cache directory clean.
It's common to cache package installation artifacts in continuous integration environments (like GitHub Actions or GitLab CI) to speed up subsequent runs.
By default, uv caches both the wheels that it builds from source and the pre-built wheels that it downloads directly, to enable high-performance package installation.
However, in continuous integration environments, persisting pre-built wheels may be undesirable. With uv, it turns out that it's often faster to omit pre-built wheels from the cache (and instead re-download them from the registry on each run). On the other hand, caching wheels that are built from source tends to be worthwhile, since the wheel building process can be expensive, especially for extension modules.
To support this caching strategy, uv provides a `uv cache prune --ci` command, which removes all
pre-built wheels and unzipped source distributions from the cache, but retains any wheels that
were built from source. We recommend running `uv cache prune --ci` at the end of your continuous
integration job to ensure maximum cache efficiency. For an example, see the
GitHub integration guide.
uv determines the cache directory according to, in order:
- A temporary cache directory, if `--no-cache` was requested.
- The specific cache directory specified via `--cache-dir`, `UV_CACHE_DIR`, or `tool.uv.cache-dir`.
- A system-appropriate cache directory, e.g., `$XDG_CACHE_HOME/uv` or `$HOME/.cache/uv` on Unix
  and `%LOCALAPPDATA%\uv\cache` on Windows.
!!! note
uv _always_ requires a cache directory. When `--no-cache` is requested, uv will still use
a temporary cache for sharing data within that single invocation.
In most cases, `--refresh` should be used instead of `--no-cache` — as it will update the cache
for subsequent operations but not read from the cache.
It is important for performance for the cache directory to be located on the same file system as the Python environment uv is operating on. Otherwise, uv will not be able to link files from the cache into the environment, and will instead need to fall back to slow copy operations.
The uv cache is composed of a number of buckets (e.g., a bucket for wheels, a bucket for source distributions, a bucket for Git repositories, and so on). Each bucket is versioned, such that if a release contains a breaking change to the cache format, uv will not attempt to read from or write to an incompatible cache bucket.
For example, uv 0.4.13 included a breaking change to the core metadata bucket. As such, the bucket version was increased from v12 to v13. Within a cache version, changes are guaranteed to be both forwards- and backwards-compatible.
Since changes in the cache format are accompanied by changes in the cache version, multiple versions of uv can safely read and write to the same cache directory. However, if the cache version changed between a given pair of uv releases, then those releases may not be able to share the same underlying cache entries.
For example, it's safe to use a single shared cache for uv 0.4.12 and uv 0.4.13, though the cache itself may contain duplicate entries in the core metadata bucket due to the change in cache version.
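The versioned-bucket scheme can be pictured with a tiny sketch. The directory-naming here is hypothetical; the real cache layout is an internal implementation detail of uv:

```python
def bucket_dir(bucket: str, version: int) -> str:
    """Each cache bucket embeds its format version in its directory name,
    so a release that changes the format writes to a fresh directory
    instead of corrupting entries written by older releases."""
    return f"{bucket}-v{version}"
```

Two uv releases that agree on a bucket version share entries; a version bump yields a new, independent directory, which is why the same cache can safely serve multiple uv versions at the cost of some duplication.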
A Python version is composed of a Python interpreter (i.e., the `python` executable), the standard
library, and other supporting files.
Since it is common for a system to have an existing Python installation, uv supports discovering Python versions. However, uv also supports installing Python versions itself. To distinguish between these two types of Python installations, uv refers to Python versions it installs as managed Python installations and all other Python installations as system Python installations.
!!! note
uv does not distinguish between Python versions installed by the operating system vs those
installed and managed by other tools. For example, if a Python installation is managed with
`pyenv`, it would still be considered a _system_ Python version in uv.
A specific Python version can be requested with the `--python` flag in most uv commands. For
example, when creating a virtual environment:
$ uv venv --python 3.11.6
uv will ensure that Python 3.11.6 is available — downloading and installing it if necessary — then create the virtual environment with it.
The following Python version request formats are supported:
- `<version>` e.g. `3`, `3.12`, `3.12.3`
- `<version-specifier>` e.g. `>=3.12,<3.13`
- `<implementation>` e.g. `cpython` or `cp`
- `<implementation>@<version>` e.g. `cpython@3.12`
- `<implementation><version>` e.g. `cpython3.12` or `cp312`
- `<implementation><version-specifier>` e.g. `cpython>=3.12,<3.13`
- `<implementation>-<version>-<os>-<arch>-<libc>` e.g. `cpython-3.12.3-macos-aarch64-none`
Additionally, a specific system Python interpreter can be requested with:
- `<executable-path>` e.g. `/opt/homebrew/bin/python3`
- `<executable-name>` e.g. `mypython3`
- `<install-dir>` e.g. `/some/environment/`
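As a rough illustration, the request formats above can be told apart with simple patterns. This is a hypothetical classifier, not uv's parser, which handles many more edge cases:

```python
import re

# Ordered (name, pattern) pairs; the first match wins. Purely illustrative.
PATTERNS = [
    ("implementation-version-os-arch-libc",
     re.compile(r"^[a-z]+-\d+\.\d+(\.\d+)?-\w+-\w+-\w+$")),
    ("implementation@version", re.compile(r"^[a-z]+@\d+(\.\d+){0,2}$")),
    ("implementation+version-specifier", re.compile(r"^[a-z]+(>=|<=|==|!=|>|<).+$")),
    ("implementation+version", re.compile(r"^[a-z]+\d+(\.\d+){0,2}$")),
    ("version-specifier", re.compile(r"^(>=|<=|==|!=|>|<).+$")),
    ("version", re.compile(r"^\d+(\.\d+){0,2}$")),
    ("implementation", re.compile(r"^[a-z]+$")),
]

def classify(request: str) -> str:
    """Name the request format, or fall through for path-like requests."""
    for name, pattern in PATTERNS:
        if pattern.match(request):
            return name
    return "unknown"
```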
By default, uv will automatically download Python versions if they cannot be found on the system.
This behavior can be disabled with the `python-downloads` option.
The `.python-version` file can be used to create a default Python version request. uv searches for
a `.python-version` file in the working directory and each of its parents. Any of the request
formats described above can be used, though use of a version number is recommended for
interoperability with other tools.
A `.python-version` file can be created in the current directory with the `uv python pin` command.
Discovery of `.python-version` files can be disabled with `--no-config`.
uv will not search for `.python-version` files beyond project or workspace boundaries.
uv bundles a list of downloadable CPython and PyPy distributions for macOS, Linux, and Windows.
!!! tip
By default, Python versions are automatically downloaded as needed without using
`uv python install`.
To install a Python version at a specific version:
$ uv python install 3.12.3
To install the latest patch version:
$ uv python install 3.12
To install a version that satisfies constraints:
$ uv python install '>=3.8,<3.10'
To install multiple versions:
$ uv python install 3.9 3.10 3.11
To install a specific implementation:
$ uv python install pypy
All of the Python version request formats are supported except those that are used for requesting local interpreters such as a file path.
By default, `uv python install` will verify that a managed Python version is installed or install
the latest version. If a `.python-version` file is present, uv will install the Python version
listed in the file. A project that requires multiple Python versions may define a
`.python-versions` file. If present, uv will install all of the Python versions listed in the file.
!!! important
Support for installing Python executables is in _preview_; this means the behavior is experimental
and subject to change.
To install Python executables into your `PATH`, provide the `--preview` option:
$ uv python install 3.12 --preview
This will install a Python executable for the requested version into `~/.local/bin`, e.g., as
`python3.12`.
!!! tip
If `~/.local/bin` is not in your `PATH`, you can add it with `uv tool update-shell`.
To install `python` and `python3` executables, include the `--default` option:
$ uv python install 3.12 --default --preview
When installing Python executables, uv will only overwrite an existing executable if it is managed
by uv — e.g., if `~/.local/bin/python3.12` exists already, uv will not overwrite it without the
`--force` flag.
uv will update executables that it manages. However, it will prefer the latest patch version of each Python minor version by default. For example:
$ uv python install 3.12.7 --preview # Adds `python3.12` to `~/.local/bin`
$ uv python install 3.12.6 --preview # Does not update `python3.12`
$ uv python install 3.12.8 --preview # Updates `python3.12` to point to 3.12.8
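That update rule amounts to a patch-version comparison within a minor series. The helper below is a simplified sketch, assuming plain `X.Y.Z` version strings:

```python
def should_update_link(current: str, candidate: str) -> bool:
    """Return True if a `python3.x` executable link pointing at `current`
    should be updated to a newly installed `candidate`: the versions must
    share a minor series and the candidate must be a newer patch."""
    cur = tuple(int(part) for part in current.split("."))
    new = tuple(int(part) for part in candidate.split("."))
    return new[:2] == cur[:2] and new > cur
```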
uv will respect Python requirements defined in `requires-python` in the `pyproject.toml` file
during project command invocations. The first Python version that is compatible with the
requirement will be used, unless a version is otherwise requested, e.g., via a `.python-version`
file or the `--python` flag.
To list installed and available Python versions:
$ uv python list
By default, downloads for other platforms and old patch versions are hidden.
To view all versions:
$ uv python list --all-versions
To view Python versions for other platforms:
$ uv python list --all-platforms
To exclude downloads and only show installed Python versions:
$ uv python list --only-installed
To find a Python executable, use the `uv python find` command:
$ uv python find
By default, this will display the path to the first available Python executable. See the discovery rules for details about how executables are discovered.
This interface also supports many request formats, e.g., to find a Python executable that has a version of 3.11 or newer:
$ uv python find '>=3.11'
By default, `uv python find` will include Python versions from virtual environments. If a `.venv`
directory is found in the working directory or any of the parent directories, or the `VIRTUAL_ENV`
environment variable is set, it will take precedence over any Python executables on the `PATH`.
To ignore virtual environments, use the `--system` flag:
$ uv python find --system
When searching for a Python version, the following locations are checked:
- Managed Python installations in the `UV_PYTHON_INSTALL_DIR`.
- A Python interpreter on the `PATH` as `python`, `python3`, or `python3.x` on macOS and Linux, or
  `python.exe` on Windows.
- On Windows, the Python interpreters in the Windows registry and Microsoft Store Python
  interpreters (see `py --list-paths`) that match the requested version.
In some cases, uv allows using a Python version from a virtual environment. In this case, the virtual environment's interpreter will be checked for compatibility with the request before searching for an installation as described above. See the pip-compatible virtual environment discovery documentation for details.
When performing discovery, non-executable files will be ignored. Each discovered executable is queried for metadata to ensure it meets the requested Python version. If the query fails, the executable will be skipped. If the executable satisfies the request, it is used without inspecting additional executables.
When searching for a managed Python version, uv will prefer newer versions first. When searching for a system Python version, uv will use the first compatible version — not the newest version.
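The difference between the two strategies can be sketched as follows. This is a hypothetical helper, with versions modeled as tuples and a request modeled as a version prefix:

```python
def select_python(request, managed, system):
    """Prefer the newest matching managed installation; otherwise fall back
    to the first compatible system installation in discovery order."""
    matches = [v for v in managed if v[:len(request)] == request]
    if matches:
        return max(matches)  # managed: newest matching version wins
    for v in system:
        if v[:len(request)] == request:
            return v  # system: first compatible version, not the newest
    return None
```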
If a Python version cannot be found on the system, uv will check for a compatible managed Python version download.
Python pre-releases will not be selected by default. A pre-release will only be used if there is no other available installation matching the request. For example, if only a pre-release version is available, it will be used; otherwise a stable release version will be used. Similarly, if the path to a pre-release Python executable is provided, the pre-release version will be used, since no other Python version matches that request.
If a pre-release Python version is available and matches the request, uv will not download a stable Python version instead.
By default, uv will automatically download Python versions when needed.
The `python-downloads` option can be used to disable this behavior. By default, it is set to
`automatic`; set it to `manual` to only allow Python downloads during `uv python install`.
!!! tip
The `python-downloads` setting can be set in a
[persistent configuration file](../configuration/files.md) to change the default behavior, or
the `--no-python-downloads` flag can be passed to any uv command.
By default, uv will attempt to use Python versions found on the system and only download managed interpreters when necessary.
The `python-preference` option can be used to adjust this behavior. By default, it is set to
`managed`, which prefers managed Python installations over system Python installations. However,
system Python installations are still preferred over downloading a managed Python version.
The following alternative options are available:
- `only-managed`: Only use managed Python installations; never use system Python installations
- `system`: Prefer system Python installations over managed Python installations
- `only-system`: Only use system Python installations; never use managed Python installations
These options allow disabling uv's managed Python versions entirely or always using them and ignoring any existing system installations.
!!! note
Automatic Python version downloads can be [disabled](#disabling-automatic-python-downloads)
without changing the preference.
uv supports the CPython, PyPy, and GraalPy Python implementations. If a Python implementation is not supported, uv will fail to discover its interpreter.
The implementations may be requested with either the long or short name:
- CPython: `cpython`, `cp`
- PyPy: `pypy`, `pp`
- GraalPy: `graalpy`, `gp`
Implementation name requests are not case sensitive.
See the Python version request documentation for more details on the supported formats.
uv supports downloading and installing CPython and PyPy distributions.
As Python does not publish official distributable CPython binaries, uv instead uses pre-built
distributions from the Astral `python-build-standalone` project. `python-build-standalone` is also
used in many other Python projects, like Rye, Mise, and bazelbuild/rules_python.
The uv Python distributions are self-contained, highly portable, and performant. While Python can
be built from source, as in tools like `pyenv`, doing so requires preinstalled system dependencies,
and creating optimized, performant builds (e.g., with PGO and LTO enabled) is very slow.
These distributions have some behavior quirks, generally as a consequence of portability; and, at
present, uv does not support installing them on musl-based Linux distributions, like Alpine Linux.
See the `python-build-standalone` quirks documentation for details.
PyPy distributions are provided by the PyPy project.
Read the concept documents to learn more about uv's features:
Looking for a quick introduction to features? See the guides instead.
Python project metadata is defined in a `pyproject.toml` file. uv requires this file to identify
the root directory of a project.
!!! tip
`uv init` can be used to create a new project. See [Creating projects](./init.md) for
details.
A minimal project definition includes a name and version:
[project]
name = "example"
version = "0.1.0"
Additional project metadata and configuration includes:
When working on a project with uv, uv will create a virtual environment as needed. While some uv
commands will create a temporary environment (e.g., `uv run --isolated`), uv also manages a
persistent environment with the project and its dependencies in a `.venv` directory next to the
`pyproject.toml`. It is stored inside the project to make it easy for editors to find — they need
the environment to give code completions and type hints. It is not recommended to include the
`.venv` directory in version control; it is automatically excluded from `git` with an internal
`.gitignore` file.
To run a command in the project environment, use `uv run`. Alternatively, the project environment
can be activated as normal for a virtual environment.
When `uv run` is invoked, it will create the project environment if it does not exist yet, or
ensure it is up-to-date if it exists. The project environment can also be explicitly created with
`uv sync`.
It is not recommended to modify the project environment manually, e.g., with `uv pip install`. For
project dependencies, use `uv add` to add a package to the environment. For one-off requirements,
use `uvx` or `uv run --with`.
!!! tip
If you don't want uv to manage the project environment, set [`managed = false`](../../reference/settings.md#managed)
to disable automatic locking and syncing of the project. For example:
```toml title="pyproject.toml"
[tool.uv]
managed = false
```
uv creates a `uv.lock` file next to the `pyproject.toml`.
`uv.lock` is a universal or cross-platform lockfile that captures the packages that would be
installed across all possible Python markers such as operating system, architecture, and Python
version.
Unlike the `pyproject.toml`, which is used to specify the broad requirements of your project, the
lockfile contains the exact resolved versions that are installed in the project environment. This
file should be checked into version control, allowing for consistent and reproducible installations
across machines.
A lockfile ensures that developers working on the project are using a consistent set of package versions. Additionally, it ensures when deploying the project as an application that the exact set of used package versions is known.
The lockfile is created and updated during uv invocations that use the project environment, i.e.,
`uv sync` and `uv run`. The lockfile may also be explicitly updated using `uv lock`.
`uv.lock` is a human-readable TOML file but is managed by uv and should not be edited manually.
There is no Python standard for lockfiles at this time, so the format of this file is specific to
uv and not usable by other tools.
To distribute your project to others (e.g., to upload it to an index like PyPI), you'll need to build it into a distributable format.
Python projects are typically distributed as both source distributions (sdists) and binary
distributions (wheels). The former is typically a `.tar.gz` or `.zip` file containing the project's
source code along with some additional metadata, while the latter is a `.whl` file containing
pre-built artifacts that can be installed directly.
`uv build` can be used to build both source distributions and binary distributions for your
project. By default, `uv build` will build the project in the current directory, and place the
built artifacts in a `dist/` subdirectory:
$ uv build
$ ls dist/
example-0.1.0-py3-none-any.whl
example-0.1.0.tar.gz
You can build the project in a different directory by providing a path to `uv build`, e.g.,
`uv build path/to/project`.
`uv build` will first build a source distribution, and then build a binary distribution (wheel)
from that source distribution.
You can limit `uv build` to building a source distribution with `uv build --sdist`, a binary
distribution with `uv build --wheel`, or build both distributions from source with
`uv build --sdist --wheel`.
`uv build` accepts `--build-constraint`, which can be used to constrain the versions of any build
requirements during the build process. When coupled with `--require-hashes`, uv will enforce that
the requirements used to build the project match specific, known hashes, for reproducibility.
For example, given the following `constraints.txt`:
setuptools==68.2.2 --hash=sha256:b454a35605876da60632df1a60f736524eb73cc47bbc9f3f1ef1b644de74fd2a
Running the following would build the project with the specified version of `setuptools`, and
verify that the downloaded `setuptools` distribution matches the specified hash:
$ uv build --build-constraint constraints.txt --require-hashes
The lockfile is created and updated during uv invocations that use the project environment, i.e.,
`uv sync` and `uv run`. The lockfile may also be explicitly created or updated using `uv lock`:
$ uv lock
If you need to integrate uv with other tools or workflows, you can export `uv.lock` to
`requirements.txt` format with `uv export --format requirements-txt`. The generated
`requirements.txt` file can then be installed via `uv pip install`, or with other tools like `pip`.
In general, we recommend against using both a `uv.lock` and a `requirements.txt` file. If you find
yourself exporting a `uv.lock` file, consider opening an issue to discuss your use case.
To avoid updating the lockfile during `uv sync` and `uv run` invocations, use the `--frozen` flag.
To avoid updating the environment during `uv run` invocations, use the `--no-sync` flag.
To assert the lockfile matches the project metadata, use the `--locked` flag. If the lockfile is
not up-to-date, an error will be raised instead of updating the lockfile.
You can also check if the lockfile is up-to-date by passing the `--check` flag to `uv lock`:
$ uv lock --check
This is equivalent to the `--locked` flag for other commands.
By default, uv will prefer the locked versions of packages when running `uv sync` and `uv lock`.
Package versions will only change if the project's dependency constraints exclude the previous,
locked version.
To upgrade all packages:
$ uv lock --upgrade
To upgrade a single package to the latest version, while retaining the locked versions of all other packages:
$ uv lock --upgrade-package <package>
To upgrade a single package to a specific version:
$ uv lock --upgrade-package <package>==<version>
!!! note
In all cases, upgrades are limited to the project's dependency constraints. For example, if the
project defines an upper bound for a package then an upgrade will not go beyond that version.
Projects may declare the Python versions supported by the project in the
`project.requires-python` field of the `pyproject.toml`.
It is recommended to set a `requires-python` value:
[project]
name = "example"
version = "0.1.0"
requires-python = ">=3.12"
The Python version requirement determines the Python syntax that is allowed in the project and affects selection of dependency versions (they must support the same Python version range).
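For a `>=X.Y` lower bound, the compatibility check reduces to a tuple comparison. This is a minimal sketch; full specifier handling (upper bounds, exclusions, pre-releases) is what the `packaging` library provides:

```python
def satisfies_lower_bound(version: str, requires_python: str) -> bool:
    """Check a version against a `>=X.Y` style requires-python value.
    Only handles the `>=` form used in the example above."""
    assert requires_python.startswith(">=")
    to_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return to_tuple(version) >= to_tuple(requires_python[2:])
```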
Entry points are the official mechanism by which an installed package advertises interfaces. These include:
!!! important
Using the entry point tables requires a [build system](#build-systems) to be defined.
Projects may define command line interfaces (CLIs) for the project in the `[project.scripts]`
table of the `pyproject.toml`.
For example, to declare a command called `hello` that invokes the `hello` function in the
`example` module:
[project.scripts]
hello = "example:hello"
Then, the command can be run from a console:
$ uv run hello
Projects may define graphical user interfaces (GUIs) for the project in the
`[project.gui-scripts]` table of the `pyproject.toml`.
!!! important
These are only different from [command-line interfaces](#command-line-interfaces) on Windows, where
they are wrapped by a GUI executable so they can be started without a console. On other platforms,
they behave the same.
For example, to declare a command called `hello` that invokes the `app` function in the `example`
module:
[project.gui-scripts]
hello = "example:app"
Projects may define entry points for plugin discovery in the `[project.entry-points]` table of the
`pyproject.toml`.
For example, to register the `example-plugin-a` package as a plugin for `example`:
[project.entry-points.'example.plugins']
a = "example_plugin_a"
Then, in `example`, plugins would be loaded with:
from importlib.metadata import entry_points
for plugin in entry_points(group='example.plugins'):
    plugin.load()
!!! note
The `group` key can be an arbitrary value, it does not need to include the package name or
"plugins". However, it is recommended to namespace the key by the package name to avoid
collisions with other packages.
A build system determines how the project should be packaged and installed. Projects may declare
and configure a build system in the `[build-system]` table of the `pyproject.toml`.
uv uses the presence of a build system to determine if a project contains a package that should be installed in the project virtual environment. If a build system is not defined, uv will not attempt to build or install the project itself, just its dependencies. If a build system is defined, uv will build and install the project into the project environment.
The `--build-backend` option can be provided to `uv init` to create a packaged project with an
appropriate layout. The `--package` option can be provided to `uv init` to create a packaged
project with the default build system.
!!! note
While uv will not build and install the current project without a build system definition,
the presence of a `[build-system]` table is not required in other packages. For legacy reasons,
if a build system is not defined, then `setuptools.build_meta:__legacy__` is used to build the
package. Packages you depend on may not explicitly declare their build system but are still
installable. Similarly, if you add a dependency on a local package or install it with `uv pip`,
uv will always attempt to build and install it.
As discussed in build systems, a Python project must be built to be installed. This process is generally referred to as "packaging".
You probably need a package if you want to:
- Add commands to the project
- Distribute the project to others
- Use a `src` and `test` layout
- Write a library
You probably do not need a package if you are:
- Writing scripts
- Building a simple application
- Using a flat layout
While uv usually uses the declaration of a build system to determine if a project should be
packaged, uv also allows overriding this behavior with the `tool.uv.package` setting.
Setting `tool.uv.package = true` will force a project to be built and installed into the project
environment. If no build system is defined, uv will use the setuptools legacy backend.
Setting `tool.uv.package = false` will force a project package not to be built and installed into
the project environment. uv will ignore a declared build system when interacting with the project.
The `UV_PROJECT_ENVIRONMENT` environment variable can be used to configure the project virtual
environment path (`.venv` by default).
If a relative path is provided, it will be resolved relative to the workspace root. If an absolute path is provided, it will be used as-is, i.e. a child directory will not be created for the environment. If an environment is not present at the provided path, uv will create it.
This option can be used to write to the system Python environment, though it is not recommended.
`uv sync` will remove extraneous packages from the environment by default and, as such, may leave
the system in a broken state.
!!! important

    If an absolute path is provided and the setting is used across multiple projects, the
    environment will be overwritten by invocations in each project. This setting is only
    recommended for use with a single project in CI or Docker images.
!!! note

    uv does not read the `VIRTUAL_ENV` environment variable during project operations. A warning
    will be displayed if `VIRTUAL_ENV` is set to a different path than the project's environment.
If your project supports a more limited set of platforms or Python versions, you can constrain the
set of solved platforms via the `environments` setting, which accepts a list of PEP 508 environment
markers. For example, to constrain the lockfile to macOS and Linux, and exclude Windows:
```toml
[tool.uv]
environments = [
    "sys_platform == 'darwin'",
    "sys_platform == 'linux'",
]
```
Or, to exclude alternative Python implementations:
```toml
[tool.uv]
environments = [
    "implementation_name == 'cpython'"
]
```
Entries in the `environments` setting must be disjoint (i.e., they must not overlap). For example,
`sys_platform == 'darwin'` and `sys_platform == 'linux'` are disjoint, but
`sys_platform == 'darwin'` and `python_version >= '3.9'` are not, since both could be true at the
same time.
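The disjointness requirement can be pictured with a small sketch (illustrative only, not uv's implementation): two markers are disjoint only if no single environment can satisfy both at once.

```python
# Illustrative sketch (not uv's implementation): two markers are "disjoint"
# if no single environment can satisfy both at the same time.
def overlap(marker_a, marker_b, environments):
    return any(marker_a(env) and marker_b(env) for env in environments)

darwin = lambda env: env["sys_platform"] == "darwin"
linux = lambda env: env["sys_platform"] == "linux"
py39_plus = lambda env: env["python_version"] >= (3, 9)

environments = [
    {"sys_platform": "darwin", "python_version": (3, 12)},
    {"sys_platform": "linux", "python_version": (3, 12)},
    {"sys_platform": "win32", "python_version": (3, 8)},
]

print(overlap(darwin, linux, environments))      # False: disjoint
print(overlap(darwin, py39_plus, environments))  # True: not disjoint
```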
By default, uv builds all packages in isolated virtual environments, as per PEP 517. Some packages are incompatible with build isolation, whether intentionally (e.g., due to the use of heavy build dependencies, most commonly PyTorch) or unintentionally (e.g., due to the use of legacy packaging setups).
To disable build isolation for a specific dependency, add it to the `no-build-isolation-package`
list in your `pyproject.toml`:
```toml
[project]
name = "project"
version = "0.1.0"
description = "..."
readme = "README.md"
requires-python = ">=3.12"
dependencies = ["cchardet"]

[tool.uv]
no-build-isolation-package = ["cchardet"]
```
Installing packages without build isolation requires that the package's build dependencies are installed in the project environment prior to installing the package itself. This can be achieved by separating out the build dependencies and the packages that require them into distinct extras. For example:
```toml
[project]
name = "project"
version = "0.1.0"
description = "..."
readme = "README.md"
requires-python = ">=3.12"
dependencies = []

[project.optional-dependencies]
build = ["setuptools", "cython"]
compile = ["cchardet"]

[tool.uv]
no-build-isolation-package = ["cchardet"]
```
Given the above, a user would first sync the `build` dependencies:

```console
$ uv sync --extra build
 + cython==3.0.11
 + foo==0.1.0 (from file:///Users/crmarsh/workspace/uv/foo)
 + setuptools==73.0.1
```
Followed by the `compile` dependencies:

```console
$ uv sync --extra compile
 + cchardet==2.1.7
 - cython==3.0.11
 - setuptools==73.0.1
```
Note that `uv sync --extra compile` would, by default, uninstall the `cython` and `setuptools`
packages. To instead retain the build dependencies, include both extras in the second `uv sync`
invocation:

```console
$ uv sync --extra build
$ uv sync --extra build --extra compile
```
Some packages, like `cchardet` above, only require build dependencies for the installation phase
of `uv sync`. Others, like `flash-attn`, require their build dependencies to be present even just
to resolve the project's lockfile during the resolution phase.

In such cases, the build dependencies must be installed prior to running any `uv lock` or
`uv sync` commands, using the lower-level `uv pip` API. For example, given:
```toml
[project]
name = "project"
version = "0.1.0"
description = "..."
readme = "README.md"
requires-python = ">=3.12"
dependencies = ["flash-attn"]

[tool.uv]
no-build-isolation-package = ["flash-attn"]
```
You could run the following sequence of commands to sync `flash-attn`:

```console
$ uv venv
$ uv pip install torch
$ uv sync
```
Alternatively, you can provide the `flash-attn` metadata upfront via the `dependency-metadata`
setting, thereby forgoing the need to build the package during the dependency resolution phase.
For example, to provide the `flash-attn` metadata upfront, include the following in your
`pyproject.toml`:
```toml
[[tool.uv.dependency-metadata]]
name = "flash-attn"
version = "2.6.3"
requires-dist = ["torch", "einops"]
```
!!! tip

    To determine the package metadata for a package like `flash-attn`, navigate to the appropriate
    Git repository, or look it up on [PyPI](https://pypi.org/project/flash-attn) and download the
    package's source distribution. The package requirements can typically be found in the
    `setup.py` or `setup.cfg` file.

    (If the package includes a built distribution, you can unzip it to find the `METADATA` file;
    however, the presence of a built distribution would negate the need to provide the metadata
    upfront, since it would already be available to uv.)
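As a sketch of that manual process: a wheel is just a zip archive whose `*.dist-info/METADATA` file lists `Requires-Dist` entries in plain text. The snippet below builds a tiny in-memory stand-in for a wheel and extracts them (the file contents are illustrative; a real wheel from PyPI would be opened the same way, from a file path):

```python
import io
import zipfile

# Build a tiny in-memory stand-in for a wheel archive.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr(
        "flash_attn-2.6.3.dist-info/METADATA",
        "Metadata-Version: 2.1\n"
        "Name: flash-attn\n"
        "Version: 2.6.3\n"
        "Requires-Dist: torch\n"
        "Requires-Dist: einops\n",
    )

# Locate the METADATA file and pull out the Requires-Dist entries.
with zipfile.ZipFile(buf) as zf:
    metadata_name = next(n for n in zf.namelist() if n.endswith("METADATA"))
    metadata = zf.read(metadata_name).decode()

requires_dist = [
    line.split(": ", 1)[1]
    for line in metadata.splitlines()
    if line.startswith("Requires-Dist:")
]
print(requires_dist)  # ['torch', 'einops']
```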
Once included, you can again use the two-step `uv sync` process to install the build dependencies.
Given the following `pyproject.toml`:
```toml
[project]
name = "project"
version = "0.1.0"
description = "..."
readme = "README.md"
requires-python = ">=3.12"
dependencies = []

[project.optional-dependencies]
build = ["torch", "setuptools", "packaging"]
compile = ["flash-attn"]

[tool.uv]
no-build-isolation-package = ["flash-attn"]

[[tool.uv.dependency-metadata]]
name = "flash-attn"
version = "2.6.3"
requires-dist = ["torch", "einops"]
```
You could run the following sequence of commands to sync `flash-attn`:

```console
$ uv sync --extra build
$ uv sync --extra build --extra compile
```
!!! note

    The `version` field in `tool.uv.dependency-metadata` is optional for registry-based
    dependencies (when omitted, uv will assume the metadata applies to all versions of the
    package), but _required_ for direct URL dependencies (like Git dependencies).
By default, the project will be installed in editable mode, such that changes to the source code
are immediately reflected in the environment. `uv sync` and `uv run` both accept a `--no-editable`
flag, which instructs uv to install the project in non-editable mode. `--no-editable` is intended
for deployment use-cases, such as building a Docker container, in which the project should be
included in the deployed environment without a dependency on the originating source code.
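A minimal sketch of that deployment pattern (base image tag and paths are illustrative; see the Docker integration guide for a complete setup):

```dockerfile
# Illustrative: an image that bundles uv alongside Python.
FROM ghcr.io/astral-sh/uv:python3.12-bookworm-slim

WORKDIR /app
COPY . /app

# Install the project itself (not just its dependencies) without an
# editable link back to the source checkout.
RUN uv sync --no-editable
```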
uv requires that all optional dependencies ("extras") declared by the project are compatible with each other and resolves all optional dependencies together when creating the lockfile.
If optional dependencies declared in one extra are not compatible with those in another extra, uv will fail to resolve the requirements of the project with an error.
To work around this, uv supports declaring conflicting extras. For example, consider two sets of optional dependencies that conflict with one another:
```toml
[project.optional-dependencies]
extra1 = ["numpy==2.1.2"]
extra2 = ["numpy==2.0.0"]
```
If you run `uv lock` with the above dependencies, resolution will fail:
```console
$ uv lock
  x No solution found when resolving dependencies:
  `-> Because myproject[extra2] depends on numpy==2.0.0 and myproject[extra1] depends on numpy==2.1.2, we can conclude that myproject[extra1] and
      myproject[extra2] are incompatible.
      And because your project requires myproject[extra1] and myproject[extra2], we can conclude that your project's requirements are unsatisfiable.
```
But if you specify that `extra1` and `extra2` are conflicting, uv will resolve them separately.
Specify conflicts in the `tool.uv` section:
```toml
[tool.uv]
conflicts = [
    [
        { extra = "extra1" },
        { extra = "extra2" },
    ],
]
```
Now, running `uv lock` will succeed. Note, though, that you can no longer install both `extra1`
and `extra2` at the same time:
```console
$ uv sync --extra extra1 --extra extra2
Resolved 3 packages in 14ms
error: extra `extra1`, extra `extra2` are incompatible with the declared conflicts: {`myproject[extra1]`, `myproject[extra2]`}
```
This error occurs because installing both `extra1` and `extra2` would result in installing two
different versions of a package into the same environment.
The above strategy for dealing with conflicting extras also works with dependency groups:
```toml
[dependency-groups]
group1 = ["numpy==2.1.2"]
group2 = ["numpy==2.0.0"]

[tool.uv]
conflicts = [
    [
        { group = "group1" },
        { group = "group2" },
    ],
]
```
The only difference with conflicting extras is that you need to use `group` instead of `extra`.
Projects help manage Python code spanning multiple files.
!!! tip

    Looking for an introduction to creating a project with uv? See the
    [projects guide](../../guides/projects.md) first.
Working on projects is a core part of the uv experience. Learn more about using projects:
- Understanding project structure and files
- Creating new projects
- Managing project dependencies
- Running commands and scripts in a project
- Using lockfiles and syncing the environment
- Configuring the project for advanced use cases
- Building distributions to publish a project
- Using workspaces to work on multiple projects at once
Inspired by the Cargo concept of the same name, a workspace is "a collection of one or more packages, called workspace members, that are managed together."
Workspaces organize large codebases by splitting them into multiple packages with common dependencies. Think: a FastAPI-based web application, alongside a series of libraries that are versioned and maintained as separate Python packages, all in the same Git repository.
In a workspace, each package defines its own `pyproject.toml`, but the workspace shares a single
lockfile, ensuring that the workspace operates with a consistent set of dependencies.
As such, `uv lock` operates on the entire workspace at once, while `uv run` and `uv sync` operate
on the workspace root by default, though both accept a `--package` argument, allowing you to run a
command in a particular workspace member from any workspace directory.
To create a workspace, add a `tool.uv.workspace` table to a `pyproject.toml`, which will
implicitly create a workspace rooted at that package.
!!! tip

    By default, running `uv init` inside an existing package will add the newly created member to
    the workspace, creating a `tool.uv.workspace` table in the workspace root if it doesn't
    already exist.
In defining a workspace, you must specify the `members` (required) and `exclude` (optional) keys,
which direct the workspace to include or exclude specific directories as members respectively, and
accept lists of globs:
```toml
[project]
name = "albatross"
version = "0.1.0"
requires-python = ">=3.12"
dependencies = ["bird-feeder", "tqdm>=4,<5"]

[tool.uv.sources]
bird-feeder = { workspace = true }

[tool.uv.workspace]
members = ["packages/*"]
exclude = ["packages/seeds"]

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
```
Every directory included by the `members` globs (and not excluded by the `exclude` globs) must
contain a `pyproject.toml` file. However, workspace members can be either applications or
libraries; both are supported in the workspace context.
Every workspace needs a root, which is also a workspace member. In the above example, `albatross`
is the workspace root, and the workspace members include all projects under the `packages`
directory, with the exception of `seeds`.
By default, `uv run` and `uv sync` operate on the workspace root. For example, in the workspace
above, `uv run` and `uv run --package albatross` would be equivalent, while
`uv run --package bird-feeder` would run the command in the `bird-feeder` package.
Within a workspace, dependencies on workspace members are facilitated via `tool.uv.sources`, as
in:
```toml
[project]
name = "albatross"
version = "0.1.0"
requires-python = ">=3.12"
dependencies = ["bird-feeder", "tqdm>=4,<5"]

[tool.uv.sources]
bird-feeder = { workspace = true }

[tool.uv.workspace]
members = ["packages/*"]

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
```
In this example, the `albatross` project depends on the `bird-feeder` project, which is a member
of the workspace. The `workspace = true` key-value pair in the `tool.uv.sources` table indicates
that the `bird-feeder` dependency should be provided by the workspace, rather than fetched from
PyPI or another registry.
!!! note

    Dependencies between workspace members are editable.
Any `tool.uv.sources` definitions in the workspace root apply to all members, unless overridden in
the `tool.uv.sources` of a specific member. For example, given the following `pyproject.toml`:
```toml
[project]
name = "albatross"
version = "0.1.0"
requires-python = ">=3.12"
dependencies = ["bird-feeder", "tqdm>=4,<5"]

[tool.uv.sources]
bird-feeder = { workspace = true }
tqdm = { git = "https://github.com/tqdm/tqdm" }

[tool.uv.workspace]
members = ["packages/*"]

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
```
Every workspace member would, by default, install `tqdm` from GitHub, unless a specific member
overrides the `tqdm` entry in its own `tool.uv.sources` table.
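For example, a member could override the root's `tqdm` source with its own entry, such as pinning to a specific tag (file path and tag are illustrative):

```toml
# packages/bird-feeder/pyproject.toml
[project]
name = "bird-feeder"
version = "1.0.0"
dependencies = ["tqdm>=4,<5"]

[tool.uv.sources]
tqdm = { git = "https://github.com/tqdm/tqdm", tag = "v4.66.0" }
```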
The most common workspace layout can be thought of as a root project with a series of accompanying libraries.
For example, continuing with the above example, this workspace has an explicit root at
`albatross`, with two libraries (`bird-feeder` and `seeds`) in the `packages` directory:
```
albatross
├── packages
│   ├── bird-feeder
│   │   ├── pyproject.toml
│   │   └── src
│   │       └── bird_feeder
│   │           ├── __init__.py
│   │           └── foo.py
│   └── seeds
│       ├── pyproject.toml
│       └── src
│           └── seeds
│               ├── __init__.py
│               └── bar.py
├── pyproject.toml
├── README.md
├── uv.lock
└── src
    └── albatross
        └── main.py
```
Since `seeds` was excluded in the `pyproject.toml`, the workspace has two members total:
`albatross` (the root) and `bird-feeder`.
Workspaces are intended to facilitate the development of multiple interconnected packages within a single repository. As a codebase grows in complexity, it can be helpful to split it into smaller, composable packages, each with their own dependencies and version constraints.
Workspaces help enforce isolation and separation of concerns. For example, in uv, we have separate packages for the core library and the command-line interface, enabling us to test the core library independently of the CLI, and vice versa.
Other common use cases for workspaces include:
- A library with a performance-critical subroutine implemented in an extension module (Rust, C++, etc.).
- A library with a plugin system, where each plugin is a separate workspace package with a dependency on the root.
Workspaces are not suited for cases in which members have conflicting requirements, or desire a
separate virtual environment for each member. In this case, path dependencies are often
preferable. For example, rather than grouping `albatross` and its members in a workspace, you can
always define each package as its own independent project, with inter-package dependencies defined
as path dependencies in `tool.uv.sources`:
```toml
[project]
name = "albatross"
version = "0.1.0"
requires-python = ">=3.12"
dependencies = ["bird-feeder", "tqdm>=4,<5"]

[tool.uv.sources]
bird-feeder = { path = "packages/bird-feeder" }

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
```
This approach conveys many of the same benefits, but allows for more fine-grained control over
dependency resolution and virtual environment management (with the downside that
`uv run --package` is no longer available; instead, commands must be run from the relevant package
directory).
Finally, uv's workspaces enforce a single `requires-python` for the entire workspace, taking the
intersection of all members' `requires-python` values. If you need to support testing a given
member on a Python version that isn't supported by the rest of the workspace, you may need to use
`uv pip` to install that member in a separate virtual environment.
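For intuition, intersecting `requires-python` lower bounds amounts to taking the strictest one (a simplified sketch with illustrative member names; uv's real logic handles full specifier sets, including upper bounds and exclusions):

```python
# Simplified sketch: each member declares a ">=X.Y" lower bound, and the
# workspace-wide bound is the strictest (maximum) of them.
member_bounds = {
    "albatross": (3, 12),    # requires-python = ">=3.12"
    "bird-feeder": (3, 9),   # requires-python = ">=3.9"
    "seeds": (3, 11),        # requires-python = ">=3.11"
}

workspace_bound = max(member_bounds.values())
print("workspace requires-python: >={}.{}".format(*workspace_bound))  # >=3.12
```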
!!! note

    As Python does not provide dependency isolation, uv can't ensure that a package uses its
    declared dependencies and nothing else. For workspaces specifically, uv can't ensure that
    packages don't import dependencies declared by another workspace member.
uv supports creating a project with `uv init`.
When creating projects, uv supports two basic templates: applications and libraries. By default,
uv will create a project for an application. The `--lib` flag can be used to create a project for
a library instead.
uv will create a project in the working directory, or in a target directory by providing a name,
e.g., `uv init foo`. If there's already a project in the target directory, i.e., if there's a
`pyproject.toml`, uv will exit with an error.
Application projects are suitable for web servers, scripts, and command-line interfaces.
Applications are the default target for `uv init`, but can also be specified with the `--app`
flag:

```console
$ uv init example-app
```
The project includes a `pyproject.toml`, a sample file (`hello.py`), a readme, and a Python
version pin file (`.python-version`).
```console
$ tree example-app
example-app
├── .python-version
├── README.md
├── hello.py
└── pyproject.toml
```
The `pyproject.toml` includes basic metadata. It does not include a build system, so it is not a
package and will not be installed into the environment:
```toml
[project]
name = "example-app"
version = "0.1.0"
description = "Add your description here"
readme = "README.md"
requires-python = ">=3.11"
dependencies = []
```
The sample file defines a `main` function with some standard boilerplate:

```python
def main():
    print("Hello from example-app!")


if __name__ == "__main__":
    main()
```
Python files can be executed with `uv run`:

```console
$ uv run hello.py
Hello from example-app!
```
Many use-cases require a package. For example, if you are creating a command-line interface that will be published to PyPI or if you want to define tests in a dedicated directory.
The `--package` flag can be used to create a packaged application:

```console
$ uv init --package example-pkg
```
The source code is moved into a `src` directory with a module directory and an `__init__.py` file:
```console
$ tree example-pkg
example-pkg
├── .python-version
├── README.md
├── pyproject.toml
└── src
    └── example_pkg
        └── __init__.py
```
A build system is defined, so the project will be installed into the environment:
```toml
[project]
name = "example-pkg"
version = "0.1.0"
description = "Add your description here"
readme = "README.md"
requires-python = ">=3.11"
dependencies = []

[project.scripts]
example-pkg = "example_pkg:main"

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
```
!!! tip

    The `--build-backend` option can be used to request an alternative build system.
A command definition is included:
```toml
[project]
name = "example-pkg"
version = "0.1.0"
description = "Add your description here"
readme = "README.md"
requires-python = ">=3.11"
dependencies = []

[project.scripts]
example-pkg = "example_pkg:main"

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
```
The command can be executed with `uv run`:

```console
$ uv run --directory example-pkg example-pkg
Hello from example-pkg!
```
A library provides functions and objects for other projects to consume. Libraries are intended to be built and distributed, e.g., by uploading them to PyPI.
Libraries can be created by using the `--lib` flag:

```console
$ uv init --lib example-lib
```
!!! note

    Using `--lib` implies `--package`. Libraries always require a packaged project.
As with a packaged application, a `src` layout is used. A `py.typed` marker is included to
indicate to consumers that types can be read from the library:
```console
$ tree example-lib
example-lib
├── .python-version
├── README.md
├── pyproject.toml
└── src
    └── example_lib
        ├── py.typed
        └── __init__.py
```
!!! note

    A `src` layout is particularly valuable when developing libraries. It ensures that the library
    is isolated from any `python` invocations in the project root and that distributed library
    code is well separated from the rest of the project source.
A build system is defined, so the project will be installed into the environment:
```toml
[project]
name = "example-lib"
version = "0.1.0"
description = "Add your description here"
readme = "README.md"
requires-python = ">=3.11"
dependencies = []

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
```
!!! tip

    You can select a different build backend template by using `--build-backend` with `hatchling`,
    `flit-core`, `pdm-backend`, `setuptools`, `maturin`, or `scikit-build-core`. An alternative
    backend is required if you want to create a
    [library with extension modules](#projects-with-extension-modules).
The created module defines a simple API function:
```python
def hello() -> str:
    return "Hello from example-lib!"
```
And you can import and execute it using `uv run`:

```console
$ uv run --directory example-lib python -c "import example_lib; print(example_lib.hello())"
Hello from example-lib!
```
Most Python projects are "pure Python", meaning they do not define modules in other languages like C, C++, FORTRAN, or Rust. However, projects with extension modules are often used for performance-sensitive code.
Creating a project with an extension module requires choosing an alternative build system. uv supports creating projects with the following build systems that support building extension modules:
- `maturin` for projects with Rust
- `scikit-build-core` for projects with C, C++, FORTRAN, and Cython
Specify the build system with the `--build-backend` flag:

```console
$ uv init --build-backend maturin example-ext
```
!!! note

    Using `--build-backend` implies `--package`.
The project contains a `Cargo.toml` and a `lib.rs` file in addition to the typical Python project
files:
```console
$ tree example-ext
example-ext
├── .python-version
├── Cargo.toml
├── README.md
├── pyproject.toml
└── src
    ├── lib.rs
    └── example_ext
        ├── __init__.py
        └── _core.pyi
```
!!! note

    If using `scikit-build-core`, you'll see CMake configuration and a `main.cpp` file instead.
The Rust library defines a simple function:
```rust
use pyo3::prelude::*;

#[pyfunction]
fn hello_from_bin() -> String {
    "Hello from example-ext!".to_string()
}

#[pymodule]
fn _core(m: &Bound<'_, PyModule>) -> PyResult<()> {
    m.add_function(wrap_pyfunction!(hello_from_bin, m)?)?;
    Ok(())
}
```
And the Python module imports it:
```python
from example_ext._core import hello_from_bin


def main() -> None:
    print(hello_from_bin())
```
The command can be executed with `uv run`:

```console
$ uv run --directory example-ext example-ext
Hello from example-ext!
```
!!! important

    Changes to the extension code in `lib.rs` or `main.cpp` will require running with
    `--reinstall` to rebuild them.
Dependencies of the project are defined in several tables:

- `project.dependencies`: Published dependencies.
- `project.optional-dependencies`: Published optional dependencies, or "extras".
- `dependency-groups`: Local dependencies for development.
- `tool.uv.sources`: Alternative sources for dependencies during development.
!!! note

    The `project.dependencies` and `project.optional-dependencies` tables can be used even if the
    project isn't going to be published. `dependency-groups` are a recently standardized feature
    and may not be supported by all tools yet.
uv supports modifying the project's dependencies with `uv add` and `uv remove`, but dependency
metadata can also be updated by editing the `pyproject.toml` directly.
To add a dependency:
```console
$ uv add httpx
```
An entry will be added in the `project.dependencies` table:
```toml
[project]
name = "example"
version = "0.1.0"
dependencies = ["httpx>=0.27.2"]
```
The `--dev`, `--group`, or `--optional` flags can be used to add a dependency to an alternative
table.
The dependency will include a constraint, e.g., `>=0.27.2`, for the most recent, compatible
version of the package. An alternative constraint can be provided:
```console
$ uv add "httpx>=0.20"
```
When adding a dependency from a source other than a package registry, uv will add an entry in the
sources table. For example, when adding `httpx` from GitHub:
```console
$ uv add "httpx @ git+https://github.com/encode/httpx"
```
The `pyproject.toml` will include a Git source entry:
```toml
[project]
name = "example"
version = "0.1.0"
dependencies = [
    "httpx",
]

[tool.uv.sources]
httpx = { git = "https://github.com/encode/httpx" }
```
If a dependency cannot be used, uv will display an error:
```console
$ uv add "httpx>9999"
  × No solution found when resolving dependencies:
  ╰─▶ Because only httpx<=1.0.0b0 is available and your project depends on httpx>9999,
      we can conclude that your project's requirements are unsatisfiable.
```
To remove a dependency:
```console
$ uv remove httpx
```
The `--dev`, `--group`, or `--optional` flags can be used to remove a dependency from a specific
table.
If a source is defined for the removed dependency, and there are no other references to the dependency, it will also be removed.
To change an existing dependency, e.g., to use a different constraint for `httpx`:

```console
$ uv add "httpx>0.1.0"
```
!!! note

    In this example, we are changing the constraints for the dependency in the `pyproject.toml`.
    The locked version of the dependency will only change if necessary to satisfy the new
    constraints. To force the package version to update to the latest within the constraints, use
    `--upgrade-package <name>`, e.g.:

    ```console
    $ uv add "httpx>0.1.0" --upgrade-package httpx
    ```

    See the [lockfile](./sync.md#upgrading-locked-package-versions) documentation for more details
    on upgrading packages.
Requesting a different dependency source will update the `tool.uv.sources` table, e.g., to use
`httpx` from a local path during development:
```console
$ uv add "httpx @ ../httpx"
```
To ensure that a dependency is only installed on a specific platform or on specific Python versions, use environment markers.
For example, to install `jax` on Linux, but not on Windows or macOS:

```console
$ uv add "jax; sys_platform == 'linux'"
```
The resulting `pyproject.toml` will then include the environment marker in the dependency
definition:
```toml
[project]
name = "project"
version = "0.1.0"
requires-python = ">=3.11"
dependencies = ["jax; sys_platform == 'linux'"]
```
Similarly, to include `numpy` on Python 3.11 and later:

```console
$ uv add "numpy; python_version >= '3.11'"
```
See Python's environment marker documentation for a complete enumeration of the available markers and operators.
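As a sketch of what evaluation looks like under the hood, a marker compares a variable from the installation environment against a literal (real tools use the `packaging.markers` module, which implements the full grammar; this simplified version handles only equality operators):

```python
import sys

# The environment a marker is evaluated against: a mapping of well-known
# variable names to values derived from the target interpreter and platform.
environment = {
    "sys_platform": sys.platform,
    "python_version": "{}.{}".format(*sys.version_info[:2]),
}


def evaluate(variable: str, op: str, literal: str) -> bool:
    """Evaluate a single comparison like: sys_platform == 'linux'."""
    actual = environment[variable]
    if op == "==":
        return actual == literal
    if op == "!=":
        return actual != literal
    raise ValueError(f"unsupported operator: {op}")


# "jax; sys_platform == 'linux'" would be installed only when this is True:
print(evaluate("sys_platform", "==", "linux"))
```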
!!! tip

    Dependency sources can also be [changed per-platform](#platform-specific-sources).
The `project.dependencies` table represents the dependencies that are used when uploading to PyPI
or building a wheel. Individual dependencies are specified using dependency specifiers syntax, and
the table follows the PEP 621 standard.
`project.dependencies` defines the list of packages that are required for the project, along with
the version constraints that should be used when installing them. Each entry includes a dependency
name and version. An entry may include extras or environment markers for platform-specific
packages.
For example:
```toml
[project]
name = "albatross"
version = "0.1.0"
dependencies = [
    # Any version in this range
    "tqdm >=4.66.2,<5",
    # Exactly this version of torch
    "torch ==2.2.2",
    # Install transformers with the torch extra
    "transformers[torch] >=4.39.3,<5",
    # Only install this package on older python versions
    # See "Environment Markers" for more information
    "importlib_metadata >=7.1.0,<8; python_version < '3.10'",
    "mollymawk ==0.1.0"
]
```
The `tool.uv.sources` table extends the standard dependency tables with alternative dependency
sources, which are used during development.
Dependency sources add support for common patterns that are not supported by the
`project.dependencies` standard, like editable installations and relative paths. For example, to
install `foo` from a directory relative to the project root:
```toml
[project]
name = "example"
version = "0.1.0"
dependencies = ["foo"]

[tool.uv.sources]
foo = { path = "./packages/foo" }
```
The following dependency sources are supported by uv:
- Index: A package resolved from a specific package index.
- Git: A Git repository.
- URL: A remote wheel or source distribution.
- Path: A local wheel, source distribution, or project directory.
- Workspace: A member of the current workspace.
!!! important

    Sources are only respected by uv. If another tool is used, only the definitions in the
    standard project tables will be used. If another tool is being used for development, any
    metadata provided in the source table will need to be re-specified in the other tool's format.
To add a Python package from a specific index, use the `--index` option:
```console
$ uv add torch --index pytorch=https://download.pytorch.org/whl/cpu
```
uv will store the index in `[[tool.uv.index]]` and add a `[tool.uv.sources]` entry:
```toml
[project]
dependencies = ["torch"]

[tool.uv.sources]
torch = { index = "pytorch" }

[[tool.uv.index]]
name = "pytorch"
url = "https://download.pytorch.org/whl/cpu"
```
!!! tip

    The above example will only work on x86-64 Linux, due to the specifics of the PyTorch index.
    See the [PyTorch guide](../../guides/integration/pytorch.md) for more information about
    setting up PyTorch.
Using an `index` source pins a package to the given index — it will not be downloaded from other
indexes.
When defining an index, an `explicit` flag can be included to indicate that the index should only
be used for packages that explicitly specify it in `tool.uv.sources`. If `explicit` is not set,
other packages may be resolved from the index, if not found elsewhere.
```toml
[[tool.uv.index]]
name = "pytorch"
url = "https://download.pytorch.org/whl/cpu"
explicit = true
```
To add a Git dependency source, prefix a Git-compatible URL (i.e., one that you would use with
`git clone`) with `git+`. For example:

```console
$ uv add git+https://github.com/encode/httpx
```

```toml
[project]
dependencies = ["httpx"]

[tool.uv.sources]
httpx = { git = "https://github.com/encode/httpx" }
```
Specific Git references can be requested, e.g., a tag:

```console
$ uv add git+https://github.com/encode/httpx --tag 0.27.0
```

```toml
[project]
dependencies = ["httpx"]

[tool.uv.sources]
httpx = { git = "https://github.com/encode/httpx", tag = "0.27.0" }
```
Or, a branch:

```console
$ uv add git+https://github.com/encode/httpx --branch main
```

```toml
[project]
dependencies = ["httpx"]

[tool.uv.sources]
httpx = { git = "https://github.com/encode/httpx", branch = "main" }
```
Or, a revision (commit):

```console
$ uv add git+https://github.com/encode/httpx --rev 326b9431c761e1ef1e00b9f760d1f654c8db48c6
```

```toml
[project]
dependencies = ["httpx"]

[tool.uv.sources]
httpx = { git = "https://github.com/encode/httpx", rev = "326b9431c761e1ef1e00b9f760d1f654c8db48c6" }
```
A `subdirectory` may be specified if the package isn't in the repository root.
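For example, for a hypothetical repository that keeps its Python package under a `python/` directory (repository URL and package name illustrative):

```toml
[tool.uv.sources]
example = { git = "https://github.com/example/repo", subdirectory = "python" }
```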
To add a URL source, provide an `https://` URL to either a wheel (ending in `.whl`) or a source
distribution (typically ending in `.tar.gz` or `.zip`; see here for all supported formats).

For example:

```console
$ uv add "https://files.pythonhosted.org/packages/5c/2d/3da5bdf4408b8b2800061c339f240c1802f2e82d55e50bd39c5a881f47f0/httpx-0.27.0.tar.gz"
```
Will result in a `pyproject.toml` with:

```toml
[project]
dependencies = ["httpx"]

[tool.uv.sources]
httpx = { url = "https://files.pythonhosted.org/packages/5c/2d/3da5bdf4408b8b2800061c339f240c1802f2e82d55e50bd39c5a881f47f0/httpx-0.27.0.tar.gz" }
```
URL dependencies can also be manually added or edited in the `pyproject.toml` with the
`{ url = <url> }` syntax. A `subdirectory` may be specified if the source distribution isn't in
the archive root.
To add a path source, provide the path of a wheel (ending in `.whl`), a source distribution
(typically ending in `.tar.gz` or `.zip`; see here for all supported formats), or a directory
containing a `pyproject.toml`.

For example:

```console
$ uv add /example/foo-0.1.0-py3-none-any.whl
```
Will result in a `pyproject.toml` with:

```toml
[project]
dependencies = ["foo"]

[tool.uv.sources]
foo = { path = "/example/foo-0.1.0-py3-none-any.whl" }
```
The path may also be a relative path:

```console
$ uv add ./foo-0.1.0-py3-none-any.whl
```
Or, a path to a project directory:

```console
$ uv add ~/projects/bar/
```
!!! important

    An [editable installation](#editable-dependencies) is not used for path dependencies by
    default. An editable installation may be requested for project directories:

    ```console
    $ uv add --editable ~/projects/bar/
    ```

    For multiple packages in the same repository, [_workspaces_](./workspaces.md) may be a better
    fit.
To declare a dependency on a workspace member, add the member name with `{ workspace = true }`.
All workspace members must be explicitly stated. Workspace members are always editable. See the
workspace documentation for more details on workspaces.
```toml
[project]
dependencies = ["foo==0.1.0"]

[tool.uv.sources]
foo = { workspace = true }

[tool.uv.workspace]
members = [
    "packages/foo"
]
```
You can limit a source to a given platform or Python version by providing a
dependency-specifier-compatible environment marker for the source.
For example, to pull `httpx` from GitHub, but only on macOS, use the following:
```toml
[project]
dependencies = ["httpx"]

[tool.uv.sources]
httpx = { git = "https://github.com/encode/httpx", tag = "0.27.2", marker = "sys_platform == 'darwin'" }
```
By specifying the marker on the source, uv will still include httpx
on all platforms, but will
download the source from GitHub on macOS, and fall back to PyPI on all other platforms.
You can specify multiple sources for a single dependency by providing a list of sources,
disambiguated by PEP 508-compatible environment markers.

For example, to pull in different `httpx` tags on macOS vs. Linux:

```toml
[project]
dependencies = ["httpx"]

[tool.uv.sources]
httpx = [
  { git = "https://github.com/encode/httpx", tag = "0.27.2", marker = "sys_platform == 'darwin'" },
  { git = "https://github.com/encode/httpx", tag = "0.24.1", marker = "sys_platform == 'linux'" },
]
```

This strategy extends to using different indexes based on environment markers. For example, to
install `torch` from different PyTorch indexes based on the platform:
```toml
[project]
dependencies = ["torch"]

[tool.uv.sources]
torch = [
  { index = "torch-cpu", marker = "platform_system == 'Darwin'" },
  { index = "torch-gpu", marker = "platform_system == 'Linux'" },
]

[[tool.uv.index]]
name = "torch-cpu"
url = "https://download.pytorch.org/whl/cpu"

[[tool.uv.index]]
name = "torch-gpu"
url = "https://download.pytorch.org/whl/cu124"
```

To instruct uv to ignore the `tool.uv.sources` table (e.g., to simulate resolving with the
package's published metadata), use the `--no-sources` flag:

```console
$ uv lock --no-sources
```

The use of `--no-sources` will also prevent uv from discovering any workspace members that could
satisfy a given dependency.
It is common for projects that are published as libraries to make some features optional to
reduce the default dependency tree. For example, Pandas has an `excel` extra and a `plot` extra
to avoid installation of Excel parsers and `matplotlib` unless someone explicitly requires them.
Extras are requested with the `package[<extra>]` syntax, e.g., `pandas[plot, excel]`.

Optional dependencies are specified in `[project.optional-dependencies]`, a TOML table that maps
from extra name to its dependencies, following dependency specifiers syntax.

Optional dependencies can have entries in `tool.uv.sources` the same as normal dependencies.
```toml
[project]
name = "pandas"
version = "1.0.0"

[project.optional-dependencies]
plot = [
  "matplotlib>=3.6.3"
]
excel = [
  "odfpy>=1.4.1",
  "openpyxl>=3.1.0",
  "python-calamine>=0.1.7",
  "pyxlsb>=1.0.10",
  "xlrd>=2.0.1",
  "xlsxwriter>=3.0.5"
]
```

To add an optional dependency, use the `--optional <extra>` option:

```console
$ uv add httpx --optional network
```
!!! note

    If you have optional dependencies that conflict with one another, resolution will fail
    unless you explicitly [declare them as conflicting](./config.md#conflicting-dependencies).
Sources can also be declared as applying only to a specific optional dependency. For example, to
pull `torch` from different PyTorch indexes based on an optional `cpu` or `gpu` extra:

```toml
[project]
dependencies = []

[project.optional-dependencies]
cpu = [
  "torch",
]
gpu = [
  "torch",
]

[tool.uv.sources]
torch = [
  { index = "torch-cpu", extra = "cpu" },
  { index = "torch-gpu", extra = "gpu" },
]

[[tool.uv.index]]
name = "torch-cpu"
url = "https://download.pytorch.org/whl/cpu"

[[tool.uv.index]]
name = "torch-gpu"
url = "https://download.pytorch.org/whl/cu124"
```

Unlike optional dependencies, development dependencies are local-only and will not be included in
the project requirements when published to PyPI or other indexes. As such, development
dependencies are not included in the `[project]` table.

Development dependencies can have entries in `tool.uv.sources` the same as normal dependencies.

To add a development dependency, use the `--dev` flag:

```console
$ uv add --dev pytest
```
uv uses the `[dependency-groups]` table (as defined in
[PEP 735](https://peps.python.org/pep-0735/)) for declaration of development dependencies. The
above command will create a `dev` group:

```toml
[dependency-groups]
dev = [
  "pytest >=8.1.1,<9"
]
```

The `dev` group is special-cased; there are `--dev`, `--only-dev`, and `--no-dev` flags to toggle
inclusion or exclusion of its dependencies. Additionally, the `dev` group is synced by default.
Development dependencies can be divided into multiple groups, using the `--group` flag.

For example, to add a development dependency in the `lint` group:

```console
$ uv add --group lint ruff
```

This results in the following `[dependency-groups]` definition:

```toml
[dependency-groups]
dev = [
  "pytest"
]
lint = [
  "ruff"
]
```

Once groups are defined, the `--group`, `--only-group`, and `--no-group` options can be used to
include or exclude their dependencies.
!!! tip

    The `--dev`, `--only-dev`, and `--no-dev` flags are equivalent to `--group dev`,
    `--only-group dev`, and `--no-group dev`, respectively.
uv requires that all dependency groups are compatible with each other and resolves all groups together when creating the lockfile.
If dependencies declared in one group are not compatible with those in another group, uv will fail to resolve the requirements of the project with an error.
!!! note

    If you have dependency groups that conflict with one another, resolution will fail
    unless you explicitly [declare them as conflicting](./config.md#conflicting-dependencies).
By default, uv includes the `dev` dependency group in the environment (e.g., during `uv run` or
`uv sync`). The default groups to include can be changed using the `tool.uv.default-groups`
setting.

```toml
[tool.uv]
default-groups = ["dev", "foo"]
```
!!! tip

    To exclude a default group during `uv run` or `uv sync`, use `--no-group <name>`.
Before `[dependency-groups]` was standardized, uv used the `tool.uv.dev-dependencies` field to
specify development dependencies, e.g.:

```toml
[tool.uv]
dev-dependencies = [
  "pytest"
]
```

Dependencies declared in this field will be combined with the contents of the
`dependency-groups.dev` group. Eventually, the `dev-dependencies` field will be deprecated and
removed.
!!! note

    If a `tool.uv.dev-dependencies` field exists, `uv add --dev` will use the existing field
    instead of adding a new `dependency-groups.dev` section.
If a project is structured as a Python package, it may declare dependencies that are required to
build the project, but not required to run it. These dependencies are specified in the
`[build-system]` table under `build-system.requires`, following
[PEP 518](https://peps.python.org/pep-0518/).

For example, if a project uses `setuptools` as its build backend, it should declare `setuptools`
as a build dependency:

```toml
[project]
name = "pandas"
version = "0.1.0"

[build-system]
requires = ["setuptools>=42"]
build-backend = "setuptools.build_meta"
```

By default, uv will respect `tool.uv.sources` when resolving build dependencies. For example, to
use a local version of `setuptools` for building, add the source to `tool.uv.sources`:

```toml
[project]
name = "pandas"
version = "0.1.0"

[build-system]
requires = ["setuptools>=42"]
build-backend = "setuptools.build_meta"

[tool.uv.sources]
setuptools = { path = "./packages/setuptools" }
```
When publishing a package, we recommend running `uv build --no-sources` to ensure that the
package builds correctly when `tool.uv.sources` is disabled, as is the case when using other
build tools, like [`pypa/build`](https://github.com/pypa/build).
A regular installation of a directory with a Python package first builds a wheel and then installs that wheel into your virtual environment, copying all source files. When the package source files are edited, the virtual environment will contain outdated versions.
Editable installations solve this problem by adding a link to the project within the virtual
environment (a `.pth` file), which instructs the interpreter to include the source files
directly.
There are some limitations to editables (mainly: the build backend needs to support them, and native modules aren't recompiled before import), but they are useful for development, as the virtual environment will always use the latest changes to the package.
uv uses editable installation for workspace packages by default.
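The `.pth` mechanism can be observed with the standard library alone. A minimal sketch, using throwaway temporary directories created purely for illustration:

```python
import pathlib
import site
import sys
import tempfile

# A stand-in "site-packages" directory containing a .pth file that
# points at a project's source directory (both are throwaway temp dirs).
site_dir = pathlib.Path(tempfile.mkdtemp())
src_dir = site_dir / "my-project" / "src"
src_dir.mkdir(parents=True)

# A .pth file lists one path per line; at startup, the `site` module
# appends each existing listed path to sys.path.
(site_dir / "_my_project.pth").write_text(f"{src_dir}\n")

# addsitedir() processes the .pth files in a directory, which is what
# happens for the real site-packages directory when Python starts.
site.addsitedir(str(site_dir))
assert str(src_dir) in sys.path
```

Because the interpreter reads the linked source directory directly, edits to those files are visible on the next import without reinstalling.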
To add an editable dependency, use the `--editable` flag:

```console
$ uv add --editable ./path/foo
```

Or, to opt out of using an editable dependency in a workspace:

```console
$ uv add --no-editable ./path/foo
```
uv uses dependency specifiers, previously known as PEP 508. A dependency specifier is composed of, in order:
- The dependency name
- The extras you want (optional)
- The version specifier
- An environment marker (optional)
The version specifiers are comma separated and added together, e.g., `foo >=1.2.3,<2,!=1.4.0` is
interpreted as "a version of `foo` that's at least 1.2.3, but less than 2, and not 1.4.0".

Specifiers are padded with trailing zeros if required, so `foo ==2` matches foo 2.0.0, too.

A star can be used for the last digit with equals, e.g., `foo ==2.1.*` will accept any release
from the 2.1 series. Similarly, `~=` matches where the last digit is equal or higher, e.g.,
`foo ~=1.2` is equal to `foo >=1.2,<2`, and `foo ~=1.2.3` is equal to `foo >=1.2.3,<1.3`.
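The `~=` expansion rule above can be made concrete with a small helper. This is an illustrative sketch of the rule, not uv's implementation:

```python
def expand_compatible(spec: str) -> str:
    """Expand a `~=` specifier into the equivalent `>=,<` pair."""
    version = spec.removeprefix("~=").strip()
    parts = version.split(".")
    # The upper bound bumps the second-to-last release segment and
    # drops the last one, e.g. 1.2.3 -> <1.3 and 1.2 -> <2.
    upper = parts[:-1]
    upper[-1] = str(int(upper[-1]) + 1)
    return f">={version},<{'.'.join(upper)}"

assert expand_compatible("~=1.2") == ">=1.2,<2"
assert expand_compatible("~=1.2.3") == ">=1.2.3,<1.3"
```

A real specifier parser also handles pre-release, post-release, and epoch segments, which this sketch ignores.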
Extras are comma-separated in square brackets between name and version, e.g.,
`pandas[excel,plot] ==2.2`. Whitespace between extra names is ignored.
Some dependencies are only required in specific environments, e.g., a specific Python version or
operating system. For example, to install the `importlib-metadata` backport for the
`importlib.metadata` module, use `importlib-metadata >=7.1.0,<8; python_version < '3.10'`. To
install `colorama` on Windows (but omit it on other platforms), use
`colorama >=0.4.6,<5; platform_system == "Windows"`.

Markers are combined with `and`, `or`, and parentheses, e.g.,
`aiohttp >=3.7.4,<4; (sys_platform != 'win32' or implementation_name != 'pypy') and python_version >= '3.10'`.
Note that versions within markers must be quoted, while versions outside of markers must not be
quoted.
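Marker evaluation can be approximated with the standard library. The sketch below mirrors the marker variables for the `aiohttp` example above; a real resolver uses a full marker parser, and note that Python versions must be compared numerically (plain string comparison would order "3.9" after "3.10"):

```python
import platform
import sys

# Values corresponding to the environment marker variables.
sys_platform = sys.platform
implementation_name = sys.implementation.name
python_version = tuple(int(p) for p in platform.python_version_tuple()[:2])

# The combined marker from the aiohttp example:
#   (sys_platform != 'win32' or implementation_name != 'pypy')
#       and python_version >= '3.10'
include_aiohttp = (
    (sys_platform != "win32" or implementation_name != "pypy")
    and python_version >= (3, 10)
)
print(include_aiohttp)
```

On CPython 3.10+ running on Linux or macOS, this evaluates to `True`; on PyPy for Windows, or any interpreter older than 3.10, it evaluates to `False`.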
When working on a project, it is installed into the virtual environment at `.venv`. This
environment is isolated from the current shell by default, so invocations that require the
project, e.g., `python -c "import example"`, will fail. Instead, use `uv run` to run commands in
the project environment:

```console
$ uv run python -c "import example"
```

When using `run`, uv will ensure that the project environment is up-to-date before running the
given command.
The given command can be provided by the project environment or exist outside of it, e.g.:

```console
$ # Presuming the project provides `example-cli`
$ uv run example-cli foo

$ # Running a `bash` script that requires the project to be available
$ uv run bash scripts/foo.sh
```

Additional dependencies or different versions of dependencies can be requested per invocation.

The `--with` option is used to include a dependency for the invocation, e.g., to request a
different version of `httpx`:

```console
$ uv run --with httpx==0.26.0 python -c "import httpx; print(httpx.__version__)"
0.26.0
$ uv run --with httpx==0.25.0 python -c "import httpx; print(httpx.__version__)"
0.25.0
```

The requested version will be respected regardless of the project's requirements. For example,
even if the project requires `httpx==0.24.0`, the output above would be the same.
Scripts that declare inline metadata are automatically executed in environments isolated from the project. See the scripts guide for more details.
For example, given a script:

```python
# /// script
# dependencies = [
#   "httpx",
# ]
# ///

import httpx

resp = httpx.get("https://peps.python.org/api/peps.json")
data = resp.json()
print([(k, v["title"]) for k, v in data.items()][:10])
```

The invocation `uv run example.py` would run isolated from the project, with only the given
dependencies available.
Resolution is the process of taking a list of requirements and converting them to a list of package versions that fulfill the requirements. Resolution requires recursively searching for compatible versions of packages, ensuring that the requested requirements are fulfilled and that the requirements of the requested packages are compatible.
Most projects and packages have dependencies. Dependencies are other packages that are necessary in order for the current package to work. A package defines its dependencies as requirements, roughly a combination of a package name and acceptable versions. The dependencies defined by the current project are called direct dependencies. The dependencies added by each dependency of the current project are called indirect or transitive dependencies.
!!! note

    See the [dependency specifiers
    page](https://packaging.python.org/en/latest/specifications/dependency-specifiers/)
    in the Python Packaging documentation for details about dependencies.
To help demonstrate the resolution process, consider the following dependencies:
- The project depends on `foo` and `bar`.
- `foo` has one version, 1.0.0:
    - `foo 1.0.0` depends on `lib>=1.0.0`.
- `bar` has one version, 1.0.0:
    - `bar 1.0.0` depends on `lib>=2.0.0`.
- `lib` has two versions, 1.0.0 and 2.0.0. Both versions have no dependencies.
In this example, the resolver must find a set of package versions which satisfies the project
requirements. Since there is only one version of both `foo` and `bar`, those will be used. The
resolution must also include the transitive dependencies, so a version of `lib` must be chosen.
`foo 1.0.0` allows all available versions of `lib`, but `bar 1.0.0` requires `lib>=2.0.0`, so
`lib 2.0.0` must be used.
In some resolutions, there may be more than one valid solution. Consider the following dependencies:
- The project depends on `foo` and `bar`.
- `foo` has two versions, 1.0.0 and 2.0.0:
    - `foo 1.0.0` has no dependencies.
    - `foo 2.0.0` depends on `lib==2.0.0`.
- `bar` has two versions, 1.0.0 and 2.0.0:
    - `bar 1.0.0` has no dependencies.
    - `bar 2.0.0` depends on `lib==1.0.0`.
- `lib` has two versions, 1.0.0 and 2.0.0. Both versions have no dependencies.
In this example, some version of both `foo` and `bar` must be selected; however, determining
which version requires considering the dependencies of each version of `foo` and `bar`.
`foo 2.0.0` and `bar 2.0.0` cannot be installed together as they conflict on their required
version of `lib`, so the resolver must select either `foo 1.0.0` (along with `bar 2.0.0`) or
`bar 1.0.0` (along with `foo 2.0.0`). Both are valid solutions, and different resolution
algorithms may yield either result.
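The search described above can be sketched as a brute-force resolver over this example's hypothetical index; a real resolver (uv uses a PubGrub-based solver) prunes this search rather than enumerating it:

```python
from itertools import product

# The example's hypothetical index: package -> version -> requirements
# (each requirement maps a dependency name to its allowed versions).
INDEX = {
    "foo": {"1.0.0": {}, "2.0.0": {"lib": {"2.0.0"}}},
    "bar": {"1.0.0": {}, "2.0.0": {"lib": {"1.0.0"}}},
    "lib": {"1.0.0": {}, "2.0.0": {}},
}

def resolve(roots):
    """Try version combinations newest-first and return the first set
    that satisfies the roots' requirements (sufficient here, since
    `lib` itself has no dependencies)."""
    names = list(INDEX)
    for combo in product(*(sorted(INDEX[n], reverse=True) for n in names)):
        picked = dict(zip(names, combo))
        if all(
            picked[dep] in allowed
            for root in roots
            for dep, allowed in INDEX[root][picked[root]].items()
        ):
            return picked
    return None

# Trying newest versions first: foo 2.0.0 pairs with bar 1.0.0,
# and lib resolves to 2.0.0.
print(resolve(["foo", "bar"]))
```

Iterating the versions oldest-first instead would find the other valid solution (`foo 1.0.0` with `bar 2.0.0`), illustrating why different algorithms may yield different results.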
Markers allow attaching an expression to requirements that indicates when the dependency should
be used. For example, `bar ; python_version < "3.9"` indicates that `bar` should only be
installed on Python 3.8 and earlier.
Markers are used to adjust a package's dependencies based on the current environment or platform. For example, markers can be used to modify dependencies by operating system, CPU architecture, Python version, Python implementation, and more.
!!! note

    See the [environment
    markers](https://packaging.python.org/en/latest/specifications/dependency-specifiers/#environment-markers)
    section in the Python Packaging documentation for more details about markers.
Markers are important for resolution because their values change the required dependencies. Typically, Python package resolvers use the markers of the current platform to determine which dependencies to use since the package is often being installed on the current platform. However, for locking dependencies this is problematic — the lockfile would only work for developers using the same platform the lockfile was created on. To solve this problem, platform-independent, or "universal" resolvers exist.
uv supports both platform-specific and universal resolution.
uv's lockfile (`uv.lock`) is created with a universal resolution and is portable across
platforms. This ensures that dependencies are locked for everyone working on the project,
regardless of operating system, architecture, and Python version. The uv lockfile is created and
modified by project commands such as `uv lock`, `uv sync`, and `uv add`.

Universal resolution is also available in uv's pip interface, i.e., `uv pip compile`, with the
`--universal` flag. The resulting requirements file will contain markers to indicate which
platform each dependency is relevant for.
During universal resolution, a package may be listed multiple times with different versions or URLs if different versions are needed for different platforms — the markers determine which version will be used. A universal resolution is often more constrained than a platform-specific resolution, since we need to take the requirements for all markers into account.
During universal resolution, all selected dependency versions must be compatible with the entire
`requires-python` range declared in the `pyproject.toml`. For example, if a project's
`requires-python` is `>=3.8`, then uv will not allow any dependency versions that are limited to,
e.g., Python 3.9 and later, as they are not compatible with Python 3.8, the lower bound of the
project's supported range. In other words, the project's `requires-python` must be a subset of
the `requires-python` of all its dependencies.

When evaluating `requires-python` ranges for dependencies, uv only considers lower bounds and
ignores upper bounds entirely. For example, `>=3.8, <4` is treated as `>=3.8`.
By default, uv's pip interface, i.e., `uv pip compile`, produces a resolution that is
platform-specific, like `pip-tools`. There is no way to use platform-specific resolution in uv's
project interface.

uv also supports resolving for specific, alternate platforms and Python versions with the
`--python-platform` and `--python-version` options. For example, if using Python 3.12 on macOS,
`uv pip compile --python-platform linux --python-version 3.10 requirements.in` can be used to
produce a resolution for Python 3.10 on Linux instead. Unlike universal resolution, during
platform-specific resolution, the provided `--python-version` is the exact Python version to use,
not a lower bound.
!!! note

    Python's environment markers expose far more information about the current machine
    than can be expressed by a simple `--python-platform` argument. For example, the
    `platform_version` marker on macOS includes the time at which the kernel was built, which can
    (in theory) be encoded in package requirements. uv's resolver makes a best-effort attempt to
    generate a resolution that is compatible with any machine running on the target
    `--python-platform`, which should be sufficient for most use cases, but may lose fidelity for
    complex package and platform combinations.
If a resolution output file exists, i.e., a uv lockfile (`uv.lock`) or a requirements output file
(`requirements.txt`), uv will prefer the dependency versions listed there. Similarly, if
installing a package into a virtual environment, uv will prefer the already installed version if
present. This means that locked or installed versions will not change unless an incompatible
version is requested or an upgrade is explicitly requested with `--upgrade`.
By default, uv tries to use the latest version of each package. For example,
`uv pip install flask>=2.0.0` will install the latest version of Flask, e.g., 3.0.0. If
`flask>=2.0.0` is a dependency of the project, only `flask` 3.0.0 will be used. This is
important, for example, because running tests will not check that the project is actually
compatible with its stated lower bound of `flask` 2.0.0.

With `--resolution lowest`, uv will install the lowest possible version for all dependencies,
both direct and indirect (transitive). Alternatively, `--resolution lowest-direct` will use the
lowest compatible versions for all direct dependencies, while using the latest compatible
versions for all other dependencies. uv will always use the latest versions for build
dependencies.
For example, given the following `requirements.in` file:

```text
flask>=2.0.0
```

Running `uv pip compile requirements.in` would produce the following `requirements.txt` file:

```text
# This file was autogenerated by uv via the following command:
#    uv pip compile requirements.in
blinker==1.7.0
    # via flask
click==8.1.7
    # via flask
flask==3.0.0
itsdangerous==2.1.2
    # via flask
jinja2==3.1.2
    # via flask
markupsafe==2.1.3
    # via
    #   jinja2
    #   werkzeug
werkzeug==3.0.1
    # via flask
```

However, `uv pip compile --resolution lowest requirements.in` would instead produce:

```text
# This file was autogenerated by uv via the following command:
#    uv pip compile requirements.in --resolution lowest
click==7.1.2
    # via flask
flask==2.0.0
itsdangerous==2.0.0
    # via flask
jinja2==3.0.0
    # via flask
markupsafe==2.0.0
    # via jinja2
werkzeug==2.0.0
    # via flask
```
When publishing libraries, it is recommended to separately run tests with `--resolution lowest`
or `--resolution lowest-direct` in continuous integration to ensure compatibility with the
declared lower bounds.

By default, uv will accept pre-release versions during dependency resolution in two cases:

- If the package is a direct dependency, and its version specifiers include a pre-release
  specifier (e.g., `flask>=2.0.0rc1`).
- If all published versions of a package are pre-releases.
If dependency resolution fails due to a transitive pre-release, uv will prompt use of
`--prerelease allow` to allow pre-releases for all dependencies.

Alternatively, the transitive dependency can be added as a constraint or direct dependency (i.e.,
in `requirements.in` or `pyproject.toml`) with a pre-release version specifier (e.g.,
`flask>=2.0.0rc1`) to opt in to pre-release support for that specific dependency.
Pre-releases are notoriously difficult to model, and are a frequent source of bugs in other packaging tools. uv's pre-release handling is intentionally limited and requires user opt-in for pre-releases to ensure correctness.
For more details, see Pre-release compatibility.
During universal resolution, a package may be listed multiple times with different versions or URLs within the same lockfile, since different versions may be needed for different platforms or Python versions.
The `--fork-strategy` setting can be used to control how uv trades off between (1) minimizing the
number of selected versions and (2) selecting the latest-possible version for each platform. The
former leads to greater consistency across platforms, while the latter leads to use of newer
package versions where possible.

By default (`--fork-strategy requires-python`), uv will optimize for selecting the latest version
of each package for each supported Python version, while minimizing the number of selected
versions across platforms.

For example, when resolving `numpy` with a Python requirement of `>=3.8`, uv would select the
following versions:

```text
numpy==1.24.4 ; python_version == "3.8"
numpy==2.0.2 ; python_version == "3.9"
numpy==2.2.0 ; python_version >= "3.10"
```
This resolution reflects the fact that NumPy 2.2.0 and later require at least Python 3.10, while earlier versions are compatible with Python 3.8 and 3.9.
Under `--fork-strategy fewest`, uv will instead minimize the number of selected versions for each
package, preferring older versions that are compatible with a wider range of supported Python
versions or platforms.

For example, in the scenario above, uv would select `numpy==1.24.4` for all Python versions,
rather than upgrading to `numpy==2.0.2` for Python 3.9 and `numpy==2.2.0` for Python 3.10 and
later.
Like pip, uv supports constraint files (`--constraint constraints.txt`), which narrow the set of
acceptable versions for the given packages. Constraint files are similar to requirements files,
but being listed as a constraint alone will not cause a package to be included in the resolution.
Instead, constraints only take effect if a requested package is already pulled in as a direct or
transitive dependency. Constraints are useful for reducing the range of available versions for a
transitive dependency. They can also be used to keep a resolution in sync with some other set of
resolved versions, regardless of which packages overlap between the two.
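A sketch of this narrowing behavior, with hypothetical package names: constraints intersect with requested packages' specifiers but never add a package on their own:

```python
def apply_constraints(requirements, constraints):
    """Intersect constraints with requested packages only; a constraint
    for a package that was never requested has no effect."""
    merged = {}
    for name, spec in requirements.items():
        extra = constraints.get(name)
        # Comma-joining specifiers means "both must hold".
        merged[name] = f"{spec},{extra}" if extra else spec
    return merged

requirements = {"flask": ">=2.0.0"}
constraints = {"flask": "<3", "numpy": "<2"}  # numpy is NOT pulled in
assert apply_constraints(requirements, constraints) == {"flask": ">=2.0.0,<3"}
```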
Dependency overrides allow bypassing unsuccessful or undesirable resolutions by overriding a package's declared dependencies. Overrides are a useful last resort for cases in which you know that a dependency is compatible with a certain version of a package, despite the metadata indicating otherwise.
For example, if a transitive dependency declares the requirement `pydantic>=1.0,<2.0`, but in
fact works with `pydantic>=2.0`, the user can override the declared dependency by including
`pydantic>=1.0,<3` in the overrides, thereby allowing the resolver to choose a newer version of
`pydantic`.

Concretely, if `pydantic>=1.0,<3` is included as an override, uv will ignore all declared
requirements on `pydantic`, replacing them with the override. In the above example, the
`pydantic>=1.0,<2.0` requirement would be ignored completely, and would instead be replaced with
`pydantic>=1.0,<3`.
While constraints can only reduce the set of acceptable versions for a package, overrides can expand the set of acceptable versions, providing an escape hatch for erroneous upper version bounds. As with constraints, overrides do not add a dependency on the package and only take effect if the package is requested in a direct or transitive dependency.
In a `pyproject.toml`, use `tool.uv.override-dependencies` to define a list of overrides. In the
pip-compatible interface, the `--override` option can be used to pass files with the same format
as constraints files.
If multiple overrides are provided for the same package, they must be differentiated with markers. If a package has a dependency with a marker, it is replaced unconditionally when using overrides — it does not matter if the marker evaluates to true or false.
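A sketch of the replacement behavior, with hypothetical requirements: unlike a constraint, an override replaces a declared requirement wholesale:

```python
def apply_overrides(declared, overrides):
    """Any declared requirement on an overridden package is replaced,
    unconditionally, with the override's specifier."""
    return {name: overrides.get(name, spec) for name, spec in declared.items()}

# A transitive dependency pins pydantic too tightly:
declared = {"pydantic": ">=1.0,<2.0", "httpx": ">=0.27"}
overrides = {"pydantic": ">=1.0,<3"}
assert apply_overrides(declared, overrides) == {
    "pydantic": ">=1.0,<3",
    "httpx": ">=0.27",
}
```

Note that the declared upper bound `<2.0` is discarded entirely rather than intersected, which is exactly what distinguishes an override from a constraint.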
During resolution, uv needs to resolve the metadata for each package it encounters, in order to determine its dependencies. This metadata is often available as a static file in the package index; however, for packages that only provide source distributions, the metadata may not be available upfront.
In such cases, uv has to build the package to determine its metadata (e.g., by invoking
`setup.py`). This can introduce a performance penalty during resolution. Further, it imposes the
requirement that the package can be built on all platforms, which may not be true.
For example, you may have a package that should only be built and installed on Linux, but doesn't build successfully on macOS or Windows. While uv can construct a perfectly valid lockfile for this scenario, doing so would require building the package, which would fail on non-Linux platforms.
The `tool.uv.dependency-metadata` table can be used to provide static metadata for such
dependencies upfront, thereby allowing uv to skip the build step and use the provided metadata
instead.

For example, to provide metadata for `chumpy` upfront, include its `dependency-metadata` in the
`pyproject.toml`:

```toml
[[tool.uv.dependency-metadata]]
name = "chumpy"
version = "0.70"
requires-dist = ["numpy>=1.8.1", "scipy>=0.13.0", "six>=1.11.0"]
```
These declarations are intended for cases in which a package does not declare static metadata upfront, though they are also useful for packages that require disabling build isolation. In such cases, it may be easier to declare the package metadata upfront, rather than creating a custom build environment prior to resolving the package.
For example, you can declare the metadata for `flash-attn`, allowing uv to resolve without
building the package from source (which itself requires installing `torch`):

```toml
[project]
name = "project"
version = "0.1.0"
requires-python = ">=3.12"
dependencies = ["flash-attn"]

[tool.uv.sources]
flash-attn = { git = "https://github.com/Dao-AILab/flash-attention", tag = "v2.6.3" }

[[tool.uv.dependency-metadata]]
name = "flash-attn"
version = "2.6.3"
requires-dist = ["torch", "einops"]
```
Like dependency overrides, `tool.uv.dependency-metadata` can also be used for cases in which a
package's metadata is incorrect or incomplete, or when a package is not available in the package
index. While dependency overrides allow overriding the allowed versions of a package globally,
metadata overrides allow overriding the declared metadata of a specific package.
!!! note

    The `version` field in `tool.uv.dependency-metadata` is optional for registry-based
    dependencies (when omitted, uv will assume the metadata applies to all versions of the
    package), but _required_ for direct URL dependencies (like Git dependencies).
Entries in the `tool.uv.dependency-metadata` table follow the
[Metadata 2.3](https://packaging.python.org/en/latest/specifications/core-metadata/)
specification, though only `name`, `version`, `requires-dist`, `requires-python`, and
`provides-extra` are read by uv. The `version` field is also considered optional. If omitted, the
metadata will be used for all versions of the specified package.
By default, `uv add` adds lower bounds to dependencies and, when using uv to manage projects, uv
will warn if direct dependencies don't have a lower bound.
Lower bounds are not critical in the "happy path", but they are important for cases where there are dependency conflicts. For example, consider a project that requires two packages and those packages have conflicting dependencies. The resolver needs to check all combinations of all versions within the constraints for the two packages; if all of them conflict, an error is reported because the dependencies are not satisfiable. If there are no lower bounds, the resolver can (and often will) backtrack down to the oldest version of a package. This isn't only problematic because it's slow: the old version of the package often fails to build, or the resolver can end up picking a version that's old enough that it doesn't depend on the conflicting package, but also doesn't work with your code.
Lower bounds are particularly critical when writing a library. It's important to declare the
lowest version for each dependency that your library works with, and to validate that the bounds
are correct by testing with `--resolution lowest` or `--resolution lowest-direct`. Otherwise, a
user may receive an old, incompatible version of one of your library's dependencies and the
library will fail with an unexpected error.
uv supports an `--exclude-newer` option to limit resolution to distributions published before a
specific date, allowing reproduction of installations regardless of new package releases. The
date may be specified as an RFC 3339 timestamp (e.g., `2006-12-02T02:07:43Z`) or a local date in
the same format (e.g., `2006-12-02`) in your system's configured time zone.

Note that the package index must support the `upload-time` field as specified in
[PEP 700](https://peps.python.org/pep-0700/). If the field is not present for a given
distribution, the distribution will be treated as unavailable. PyPI provides `upload-time` for
all packages.
To ensure reproducibility, messages for unsatisfiable resolutions will not mention that
distributions were excluded due to the `--exclude-newer` flag; newer distributions will be
treated as if they do not exist.
!!! note

    The `--exclude-newer` option is only applied to packages that are read from a registry (as
    opposed to, e.g., Git dependencies). Further, when using the `uv pip` interface, uv will not
    downgrade previously installed packages unless the `--reinstall` flag is provided, in which
    case uv will perform a new resolution.
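The cutoff behavior can be sketched with the standard library: parse the RFC 3339 timestamp, then drop any distribution whose `upload-time` is later than the cutoff or missing (the filenames below are hypothetical):

```python
from datetime import datetime

def exclude_newer(distributions, cutoff: str):
    """Keep only distributions uploaded at or before the cutoff;
    distributions without an upload-time are treated as unavailable."""
    limit = datetime.fromisoformat(cutoff.replace("Z", "+00:00"))
    kept = []
    for dist in distributions:
        uploaded = dist.get("upload-time")
        if uploaded is None:
            continue  # missing upload-time: treated as unavailable
        if datetime.fromisoformat(uploaded.replace("Z", "+00:00")) <= limit:
            kept.append(dist)
    return kept

dists = [
    {"filename": "pkg-1.0.tar.gz", "upload-time": "2006-01-01T00:00:00Z"},
    {"filename": "pkg-2.0.tar.gz", "upload-time": "2007-01-01T00:00:00Z"},
    {"filename": "pkg-0.9.tar.gz"},  # no upload-time
]
kept = exclude_newer(dists, "2006-12-02T02:07:43Z")
assert [d["filename"] for d in kept] == ["pkg-1.0.tar.gz"]
```

The `replace("Z", "+00:00")` step keeps the sketch portable to Python versions before 3.11, where `datetime.fromisoformat` does not accept a trailing `Z`.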
[PEP 625](https://peps.python.org/pep-0625/) specifies that packages must distribute source
distributions as gzip tarball (`.tar.gz`) archives. Prior to this specification, other archive
formats, which need to be supported for backward compatibility, were also allowed. uv supports
reading and extracting archives in the following formats:

- gzip tarball (`.tar.gz`, `.tgz`)
- bzip2 tarball (`.tar.bz2`, `.tbz`)
- xz tarball (`.tar.xz`, `.txz`)
- zstd tarball (`.tar.zst`)
- lzip tarball (`.tar.lz`)
- lzma tarball (`.tar.lzma`)
- zip (`.zip`)
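Most of these formats can be read with the Python standard library alone; a sketch mapping filenames to stdlib openers (zstd and lzip are not covered by the stdlib and would need third-party codecs):

```python
import tarfile
import zipfile

# Filename suffix -> tarfile open mode. The `r:xz` mode uses the lzma
# module, which also auto-detects the legacy `.lzma` container on read.
TAR_MODES = {
    ".tar.gz": "r:gz", ".tgz": "r:gz",
    ".tar.bz2": "r:bz2", ".tbz": "r:bz2",
    ".tar.xz": "r:xz", ".txz": "r:xz",
    ".tar.lzma": "r:xz",
}

def opener_for(filename: str):
    """Return an (opener, mode) pair for a supported archive filename."""
    if filename.endswith(".zip"):
        return zipfile.ZipFile, "r"
    for suffix, mode in TAR_MODES.items():
        if filename.endswith(suffix):
            return tarfile.open, mode
    raise ValueError(f"unsupported archive: {filename}")

assert opener_for("numpy-1.19.5.tar.gz") == (tarfile.open, "r:gz")
assert opener_for("pkg-1.0.zip") == (zipfile.ZipFile, "r")
```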
For more details about the internals of the resolver, see the resolver reference documentation.
The `uv.lock` file uses a versioned schema. The schema version is included in the `version` field
of the lockfile.

Any given version of uv can read and write lockfiles with the same schema version, but will
reject lockfiles with a greater schema version. For example, if your uv version supports schema
v1, `uv lock` will error if it encounters an existing lockfile with schema v2.
uv versions that support schema v2 may be able to read lockfiles with schema v1 if the schema update was backwards-compatible. However, this is not guaranteed, and uv may exit with an error if it encounters a lockfile with an outdated schema version.
The schema version is considered part of the public API, and so is only bumped in minor releases, as a breaking change (see Versioning). As such, all uv patch versions within a given minor uv release are guaranteed to have full lockfile compatibility. In other words, lockfiles may only be rejected across minor releases.
An extremely fast Python package and project manager, written in Rust.
Installing Trio's dependencies with a warm cache.
- 🚀 A single tool to replace `pip`, `pip-tools`, `pipx`, `poetry`, `pyenv`, `twine`,
  `virtualenv`, and more.
- ⚡️ 10-100x faster than `pip`.
- 🐍 Installs and manages Python versions.
- 🛠️ Runs and installs Python applications.
- ❇️ Runs scripts, with support for inline dependency metadata.
- 🗂️ Provides comprehensive project management, with a universal lockfile.
- 🔩 Includes a pip-compatible interface for a performance boost with a familiar CLI.
- 🏢 Supports Cargo-style workspaces for scalable projects.
- 💾 Disk-space efficient, with a global cache for dependency deduplication.
- ⏬ Installable without Rust or Python via `curl` or `pip`.
- 🖥️ Supports macOS, Linux, and Windows.
uv is backed by Astral, the creators of Ruff.
Install uv with our official standalone installer:
=== "macOS and Linux"
```console
$ curl -LsSf https://astral.sh/uv/install.sh | sh
```
=== "Windows"
```console
$ powershell -c "irm https://astral.sh/uv/install.ps1 | iex"
```
Then, check out the first steps or read on for a brief overview.
!!! tip
uv may also be installed with pip, Homebrew, and more. See all of the methods on the
[installation page](./getting-started/installation.md).
uv manages project dependencies and environments, with support for lockfiles, workspaces, and
more, similar to `rye` or `poetry`:
$ uv init example
Initialized project `example` at `/home/user/example`
$ cd example
$ uv add ruff
Creating virtual environment at: .venv
Resolved 2 packages in 170ms
Built example @ file:///home/user/example
Prepared 2 packages in 627ms
Installed 2 packages in 1ms
+ example==0.1.0 (from file:///home/user/example)
+ ruff==0.5.4
$ uv run ruff check
All checks passed!
See the project guide to get started.
uv also supports building and publishing projects, even if they're not managed with uv. See the publish guide to learn more.
uv executes and installs command-line tools provided by Python packages, similar to `pipx`.

Run a tool in an ephemeral environment using `uvx` (an alias for `uv tool run`):
```console
$ uvx pycowsay 'hello world!'
Resolved 1 package in 167ms
Installed 1 package in 9ms
 + pycowsay==0.0.0.2
  """

  ------------
< hello world! >
  ------------
          \   ^__^
           \  (oo)\_______
              (__)\       )\/\
                  ||----w |
                  ||     ||
```
Install a tool with `uv tool install`:
$ uv tool install ruff
Resolved 1 package in 6ms
Installed 1 package in 2ms
+ ruff==0.5.4
Installed 1 executable: ruff
$ ruff --version
ruff 0.5.4
See the tools guide to get started.
uv installs Python and allows quickly switching between versions.
Install multiple Python versions:
$ uv python install 3.10 3.11 3.12
Searching for Python versions matching: Python 3.10
Searching for Python versions matching: Python 3.11
Searching for Python versions matching: Python 3.12
Installed 3 versions in 3.42s
+ cpython-3.10.14-macos-aarch64-none
+ cpython-3.11.9-macos-aarch64-none
+ cpython-3.12.4-macos-aarch64-none
Download Python versions as needed:
$ uv venv --python 3.12.0
Using CPython 3.12.0
Creating virtual environment at: .venv
Activate with: source .venv/bin/activate
$ uv run --python [email protected] -- python
Python 3.8.16 (a9dbdca6fc3286b0addd2240f11d97d8e8de187a, Dec 29 2022, 11:45:30)
[PyPy 7.3.11 with GCC Apple LLVM 13.1.6 (clang-1316.0.21.2.5)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>>
Use a specific Python version in the current directory:
$ uv python pin 3.11
Pinned `.python-version` to `3.11`
See the installing Python guide to get started.
uv manages dependencies and environments for single-file scripts.
Create a new script and add inline metadata declaring its dependencies:
$ echo 'import requests; print(requests.get("https://astral.sh"))' > example.py
$ uv add --script example.py requests
Updated `example.py`
Then, run the script in an isolated virtual environment:
$ uv run example.py
Reading inline script metadata from: example.py
Installed 5 packages in 12ms
<Response [200]>
See the scripts guide to get started.
uv provides a drop-in replacement for common `pip`, `pip-tools`, and `virtualenv` commands.

uv extends their interfaces with advanced features, such as dependency version overrides,
platform-independent resolutions, reproducible resolutions, alternative resolution strategies,
and more.

Migrate to uv without changing your existing workflows — and experience a 10-100x speedup — with
the `uv pip` interface.
Compile requirements into a platform-independent requirements file:
$ uv pip compile docs/requirements.in \
--universal \
--output-file docs/requirements.txt
Resolved 43 packages in 12ms
Create a virtual environment:
$ uv venv
Using CPython 3.12.3
Creating virtual environment at: .venv
Activate with: source .venv/bin/activate
Install the locked requirements:
$ uv pip sync docs/requirements.txt
Resolved 43 packages in 11ms
Installed 43 packages in 208ms
+ babel==2.15.0
+ black==24.4.2
+ certifi==2024.7.4
...
See the pip interface documentation to get started.
See the first steps or jump straight to the guides to start using uv.
uv supports persistent configuration files at both the project- and user-level.
Specifically, uv will search for a `pyproject.toml` or `uv.toml` file in the current directory,
or in the nearest parent directory.
!!! note
For `tool` commands, which operate at the user level, local configuration
files will be ignored. Instead, uv will exclusively read from user-level configuration
(e.g., `~/.config/uv/uv.toml`) and system-level configuration (e.g., `/etc/uv/uv.toml`).
In workspaces, uv will begin its search at the workspace root, ignoring any configuration defined in workspace members. Since the workspace is locked as a single unit, configuration is shared across all members.
If a `pyproject.toml` file is found, uv will read configuration from the `[tool.uv]` table. For
example, to set a persistent index URL, add the following to a `pyproject.toml`:

```toml
[[tool.uv.index]]
url = "https://test.pypi.org/simple"
default = true
```
(If there is no such table, the `pyproject.toml` file will be ignored, and uv will continue
searching in the directory hierarchy.)
uv will also search for `uv.toml` files, which follow an identical structure, but omit the
`[tool.uv]` prefix. For example:

```toml
[[index]]
url = "https://test.pypi.org/simple"
default = true
```
!!! note
`uv.toml` files take precedence over `pyproject.toml` files, so if both `uv.toml` and
`pyproject.toml` files are present in a directory, configuration will be read from `uv.toml`, and
`[tool.uv]` section in the accompanying `pyproject.toml` will be ignored.
uv will also discover user-level configuration at `~/.config/uv/uv.toml` (or
`$XDG_CONFIG_HOME/uv/uv.toml`) on macOS and Linux, or `%APPDATA%\uv\uv.toml` on Windows; and
system-level configuration at `/etc/uv/uv.toml` (or `$XDG_CONFIG_DIRS/uv/uv.toml`) on macOS and
Linux, or `%SYSTEMDRIVE%\ProgramData\uv\uv.toml` on Windows.

User- and system-level configuration must use the `uv.toml` format, rather than the
`pyproject.toml` format, as a `pyproject.toml` is intended to define a Python project.
If project-, user-, and system-level configuration files are found, the settings will be merged,
with project-level configuration taking precedence over the user-level configuration, and
user-level configuration taking precedence over the system-level configuration. (If multiple
system-level configuration files are found, e.g., at both `/etc/uv/uv.toml` and
`$XDG_CONFIG_DIRS/uv/uv.toml`, only the first-discovered file will be used, with XDG taking
priority.)
For example, if a string, number, or boolean is present in both the project- and user-level configuration tables, the project-level value will be used, and the user-level value will be ignored. If an array is present in both tables, the arrays will be concatenated, with the project-level settings appearing earlier in the merged array.
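These merge semantics can be sketched as follows (illustrative only; uv's actual merging handles the full settings schema, and the key names here are hypothetical):

```python
def merge_settings(higher: dict, lower: dict) -> dict:
    """Merge two configuration tables per the documented semantics: scalars from
    the higher-precedence table win outright, while arrays are concatenated with
    the higher-precedence entries first. (A sketch, not uv's implementation.)"""
    merged = dict(lower)
    for key, value in higher.items():
        if key in lower and isinstance(value, list) and isinstance(lower[key], list):
            merged[key] = value + lower[key]  # concatenate, higher-precedence first
        else:
            merged[key] = value  # scalar (or one-sided key): higher precedence wins
    return merged


# Project-level config takes precedence over user-level config.
project = {"resolution": "highest", "extra-index-url": ["https://a.example/simple"]}
user = {"resolution": "lowest-direct", "extra-index-url": ["https://b.example/simple"]}
print(merge_settings(project, user))
```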
Settings provided via environment variables take precedence over persistent configuration, and settings provided via the command line take precedence over both.
uv accepts a `--no-config` command-line argument which, when provided, disables the discovery of
any persistent configuration.

uv also accepts a `--config-file` command-line argument, which accepts a path to a `uv.toml` to
use as the configuration file. When provided, this file will be used in place of any discovered
configuration files (e.g., user-level configuration will be ignored).
See the settings reference for an enumeration of the available settings.
`uv run` can load environment variables from dotenv files (e.g., `.env`, `.env.local`,
`.env.development`), powered by the `dotenvy` crate.

To load a `.env` file from a dedicated location, set the `UV_ENV_FILE` environment variable, or
pass the `--env-file` flag to `uv run`.

For example, to load environment variables from a `.env` file in the current working directory:

```console
$ echo "MY_VAR='Hello, world!'" > .env
$ uv run --env-file .env -- python -c 'import os; print(os.getenv("MY_VAR"))'
Hello, world!
```
The `--env-file` flag can be provided multiple times, with subsequent files overriding values
defined in previous files. To provide multiple files via the `UV_ENV_FILE` environment variable,
separate the paths with a space (e.g., `UV_ENV_FILE="/path/to/file1 /path/to/file2"`).

To disable dotenv loading (e.g., to override `UV_ENV_FILE` or the `--env-file` command-line
argument), set the `UV_NO_ENV_FILE` environment variable to `1`, or pass the `--no-env-file`
flag to `uv run`.

If the same variable is defined in the environment and in a `.env` file, the value from the
environment will take precedence.
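The precedence rules above can be modeled in a few lines (a sketch; the variable names are hypothetical and this is not uv's implementation):

```python
def resolve_env(environ: dict, dotenv_files: list[dict]) -> dict:
    """Model uv run's documented precedence: later --env-file files override
    earlier ones, and the ambient environment overrides them all."""
    resolved: dict = {}
    for file_vars in dotenv_files:  # in the order the files were provided
        resolved.update(file_vars)
    resolved.update(environ)  # the real environment always wins
    return resolved


ambient = {"MY_VAR": "from-environment"}
files = [{"MY_VAR": "from-.env", "OTHER": "a"}, {"OTHER": "b"}]
print(resolve_env(ambient, files))
# MY_VAR comes from the environment; OTHER comes from the later file.
```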
A dedicated `[tool.uv.pip]` section is provided for configuring just the `uv pip` command-line
interface. Settings in this section will not apply to `uv` commands outside the `uv pip`
namespace. However, many of the settings in this section have corollaries in the top-level
namespace which do apply to the `uv pip` interface unless they are overridden by a value in the
`uv.pip` section.

The `uv.pip` settings are designed to adhere closely to pip's interface and are declared
separately to retain compatibility while allowing the global settings to use alternate designs
(e.g., `--no-build`).
As an example, setting the `index-url` under `[tool.uv.pip]`, as in the following
`pyproject.toml`, would only affect the `uv pip` subcommands (e.g., `uv pip install`, but not
`uv sync`, `uv lock`, or `uv run`):

```toml
[tool.uv.pip]
index-url = "https://test.pypi.org/simple"
```
By default, uv is installed to `~/.local/bin`. If `XDG_BIN_HOME` is set, it will be used
instead. Similarly, if `XDG_DATA_HOME` is set, the target directory will be inferred as
`XDG_DATA_HOME/../bin`.

To change the installation path, use `UV_INSTALL_DIR`:
=== "macOS and Linux"
```console
$ curl -LsSf https://astral.sh/uv/install.sh | env UV_INSTALL_DIR="/custom/path" sh
```
=== "Windows"
```powershell
powershell -ExecutionPolicy ByPass -c {$env:UV_INSTALL_DIR = "C:\Custom\Path";irm https://astral.sh/uv/install.ps1 | iex}
```
The installer may also update your shell profiles to ensure the uv binary is on your `PATH`. To
disable this behavior, use `INSTALLER_NO_MODIFY_PATH`. For example:

```console
$ curl -LsSf https://astral.sh/uv/install.sh | env INSTALLER_NO_MODIFY_PATH=1 sh
```

If installed with `INSTALLER_NO_MODIFY_PATH`, subsequent operations, like `uv self update`, will
not modify your shell profiles.

In ephemeral environments like CI, use `UV_UNMANAGED_INSTALL` to install uv to a specific path
while preventing the installer from modifying shell profiles or environment variables:

```console
$ curl -LsSf https://astral.sh/uv/install.sh | env UV_UNMANAGED_INSTALL="/custom/path" sh
```

The use of `UV_UNMANAGED_INSTALL` will also disable self-updates (via `uv self update`).
Using environment variables is recommended because they are consistent across platforms. However, options can be passed directly to the install script. For example, to see the available options:
$ curl -LsSf https://astral.sh/uv/install.sh | sh -s -- --help
By default, uv uses the Python Package Index (PyPI) for dependency resolution and package
installation. However, uv can be configured to use other package indexes, including private
indexes, via the `[[tool.uv.index]]` configuration option (and `--index`, the analogous
command-line option).

To include an additional index when resolving dependencies, add a `[[tool.uv.index]]` entry to
your `pyproject.toml`:

```toml
[[tool.uv.index]]
# Optional name for the index.
name = "pytorch"
# Required URL for the index.
url = "https://download.pytorch.org/whl/cpu"
```
Indexes are prioritized in the order in which they’re defined, such that the first index listed in the configuration file is the first index consulted when resolving dependencies, with indexes provided via the command line taking precedence over those in the configuration file.
By default, uv includes the Python Package Index (PyPI) as the "default" index, i.e., the index
used when a package is not found on any other index. To exclude PyPI from the list of indexes,
set `default = true` on another index entry (or use the `--default-index` command-line option):

```toml
[[tool.uv.index]]
name = "pytorch"
url = "https://download.pytorch.org/whl/cpu"
default = true
```
The default index is always treated as lowest priority, regardless of its position in the list of indexes.
Index names may only contain alphanumeric characters, dashes, underscores, and periods, and must be valid ASCII.
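A validator matching this rule might look like the following (an illustrative sketch, not uv's code):

```python
import re

# The documented rule: alphanumerics, dashes, underscores, and periods, ASCII only.
INDEX_NAME_RE = re.compile(r"^[A-Za-z0-9._-]+$")


def is_valid_index_name(name: str) -> bool:
    """Check an index name against the documented character set."""
    return name.isascii() and bool(INDEX_NAME_RE.fullmatch(name))


print(is_valid_index_name("pytorch-cu118"))  # True
```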
When providing an index on the command line (with `--index` or `--default-index`) or through an
environment variable (`UV_INDEX` or `UV_DEFAULT_INDEX`), names are optional but can be included
using the `<name>=<url>` syntax, as in:

```console
# On the command line.
$ uv lock --index pytorch=https://download.pytorch.org/whl/cpu

# Via an environment variable.
$ UV_INDEX=pytorch=https://download.pytorch.org/whl/cpu uv lock
```
A package can be pinned to a specific index by specifying the index in its `tool.uv.sources`
entry. For example, to ensure that `torch` is always installed from the `pytorch` index, add the
following to your `pyproject.toml`:

```toml
[tool.uv.sources]
torch = { index = "pytorch" }

[[tool.uv.index]]
name = "pytorch"
url = "https://download.pytorch.org/whl/cpu"
```
Similarly, to pull from a different index based on the platform, you can provide a list of
sources disambiguated by environment markers:

```toml
[project]
dependencies = ["torch"]

[tool.uv.sources]
torch = [
  { index = "pytorch-cu118", marker = "sys_platform == 'darwin'" },
  { index = "pytorch-cu124", marker = "sys_platform != 'darwin'" },
]

[[tool.uv.index]]
name = "pytorch-cu118"
url = "https://download.pytorch.org/whl/cu118"

[[tool.uv.index]]
name = "pytorch-cu124"
url = "https://download.pytorch.org/whl/cu124"
```
An index can be marked as `explicit = true` to prevent packages from being installed from that
index unless explicitly pinned to it. For example, to ensure that `torch` is installed from the
`pytorch` index, but all other packages are installed from PyPI, add the following to your
`pyproject.toml`:

```toml
[tool.uv.sources]
torch = { index = "pytorch" }

[[tool.uv.index]]
name = "pytorch"
url = "https://download.pytorch.org/whl/cpu"
explicit = true
```
Named indexes referenced via `tool.uv.sources` must be defined within the project's
`pyproject.toml` file; indexes provided via the command line, environment variables, or
user-level configuration will not be recognized.

If an index is marked as both `default = true` and `explicit = true`, it will be treated as an
explicit index (i.e., only usable via `tool.uv.sources`) while also removing PyPI as the default
index.
By default, uv will stop at the first index on which a given package is available, and limit
resolutions to those present on that first index (`first-index`).

For example, if an internal index is specified via `[[tool.uv.index]]`, uv's behavior is such
that if a package exists on that internal index, it will always be installed from that internal
index, and never from PyPI. The intent is to prevent "dependency confusion" attacks, in which an
attacker publishes a malicious package on PyPI with the same name as an internal package, thus
causing the malicious package to be installed instead of the internal package. See, for example,
the `torchtriton` attack from December 2022.
Users can opt in to alternate index behaviors via the `--index-strategy` command-line option, or
the `UV_INDEX_STRATEGY` environment variable, which supports the following values:

- `first-index` (default): Search for each package across all indexes, limiting the candidate
  versions to those present in the first index that contains the package.
- `unsafe-first-match`: Search for each package across all indexes, but prefer the first index
  with a compatible version, even if newer versions are available on other indexes.
- `unsafe-best-match`: Search for each package across all indexes, and select the best version
  from the combined set of candidate versions.

While `unsafe-best-match` is the closest to pip's behavior, it exposes users to the risk of
"dependency confusion" attacks.
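The difference between the three strategies can be illustrated with a toy model (the `compatible` predicate stands in for the resolver's version constraints; this is not uv's implementation):

```python
def pick_version(indexes, strategy, compatible=lambda v: True):
    """Toy model of uv's index strategies for a single package. `indexes` is an
    ordered list of candidate-version lists (an empty list means the package is
    absent from that index)."""
    if strategy == "first-index":
        # Limit candidates to the first index that contains the package at all.
        pool = next((vs for vs in indexes if vs), [])
        ok = [v for v in pool if compatible(v)]
    elif strategy == "unsafe-first-match":
        # Take the first index that has *any* compatible version.
        ok = next(([v for v in vs if compatible(v)] for vs in indexes
                   if any(compatible(v) for v in vs)), [])
    elif strategy == "unsafe-best-match":
        # Combine candidates across all indexes and take the best overall.
        ok = [v for vs in indexes for v in vs if compatible(v)]
    else:
        raise ValueError(f"unknown strategy: {strategy}")
    return max(ok) if ok else None


# The package exists on an internal index (versions 1.0, 1.1) and on PyPI (2.0).
indexes = [[(1, 0), (1, 1)], [(2, 0)]]
print(pick_version(indexes, "first-index"))        # (1, 1): never leaves the first index
print(pick_version(indexes, "unsafe-best-match"))  # (2, 0): best across all indexes
```

Note how `first-index` returns nothing when the first index lacks a compatible version, while `unsafe-first-match` falls through to later indexes, which is exactly where the confusion risk arises.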
Most private registries require authentication to access packages, typically via a username and
password (or access token).

To authenticate with a private index, either provide credentials via environment variables or
embed them in the URL.

For example, given an index named `internal-proxy` that requires a username (`public`) and
password (`koala`), define the index (without credentials) in your `pyproject.toml`:

```toml
[[tool.uv.index]]
name = "internal-proxy"
url = "https://example.com/simple"
```
From there, you can set the `UV_INDEX_INTERNAL_PROXY_USERNAME` and
`UV_INDEX_INTERNAL_PROXY_PASSWORD` environment variables, where `INTERNAL_PROXY` is the
uppercase version of the index name, with non-alphanumeric characters replaced by underscores:

```bash
export UV_INDEX_INTERNAL_PROXY_USERNAME=public
export UV_INDEX_INTERNAL_PROXY_PASSWORD=koala
```

By providing credentials via environment variables, you can avoid storing sensitive information
in the plaintext `pyproject.toml` file.
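The naming rule can be expressed as a small helper (illustrative; not part of uv):

```python
import re


def credential_env_vars(index_name: str) -> tuple[str, str]:
    """Derive the username/password variable names for a named index: the name
    is uppercased and non-alphanumeric characters become underscores."""
    key = re.sub(r"[^A-Z0-9]", "_", index_name.upper())
    return f"UV_INDEX_{key}_USERNAME", f"UV_INDEX_{key}_PASSWORD"


print(credential_env_vars("internal-proxy"))
# ('UV_INDEX_INTERNAL_PROXY_USERNAME', 'UV_INDEX_INTERNAL_PROXY_PASSWORD')
```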
Alternatively, credentials can be embedded directly in the index definition:

```toml
[[tool.uv.index]]
name = "internal"
url = "https://public:koala@example.com/simple"
```

For security purposes, credentials are never stored in the `uv.lock` file; as such, uv must have
access to the authenticated URL at installation time.
In addition to the `[[tool.uv.index]]` configuration option, uv supports pip-style `--index-url`
and `--extra-index-url` command-line options for compatibility, where `--index-url` defines the
default index and `--extra-index-url` defines additional indexes.

These options can be used in conjunction with the `[[tool.uv.index]]` configuration option, and
follow the same prioritization rules:

- The default index is always treated as lowest priority, whether defined via the legacy
  `--index-url` argument, the recommended `--default-index` argument, or a `[[tool.uv.index]]`
  entry with `default = true`.
- Indexes are consulted in the order in which they're defined, either via the legacy
  `--extra-index-url` argument, the recommended `--index` argument, or `[[tool.uv.index]]`
  entries.

In effect, `--index-url` and `--extra-index-url` can be thought of as unnamed
`[[tool.uv.index]]` entries, with `default = true` enabled for the former. In that context,
`--index-url` maps to `--default-index`, and `--extra-index-url` maps to `--index`.
Read about the various ways to configure uv:
- Using configuration files
- Using environment variables
- Configuring authentication
- Configuring package indexes
Or, jump to the settings reference which enumerates the available configuration options.
uv allows packages to be installed from Git and supports the following schemes for authenticating with private repositories.
Using SSH:

- `git+ssh://git@<hostname>/...` (e.g., `git+ssh://git@github.com/astral-sh/uv`)
- `git+ssh://git@<host>/...` (e.g., `git+ssh://git@github.com-key-2/astral-sh/uv`)
See the GitHub SSH documentation for more details on how to configure SSH.
Using a password or token:

- `git+https://<user>:<token>@<hostname>/...` (e.g., `git+https://git:github_pat_asdf@github.com/astral-sh/uv`)
- `git+https://<token>@<hostname>/...` (e.g., `git+https://github_pat_asdf@github.com/astral-sh/uv`)
- `git+https://<user>@<hostname>/...` (e.g., `git+https://git@github.com/astral-sh/uv`)
When using a GitHub personal access token, the username is arbitrary. GitHub does not support
logging in with a password directly, although other hosts may. If a username is provided without
credentials, you will be prompted to enter them.
If there are no credentials present in the URL and authentication is needed, the Git credential helper will be queried.
uv supports credentials over HTTP when querying package registries.
Authentication can come from the following sources, in order of precedence:
- The URL, e.g., `https://<user>:<password>@<hostname>/...`
- A `.netrc` configuration file
- A keyring provider (requires opt-in)
If authentication is found for a single net location (scheme, host, and port), it will be cached for the duration of the command and used for other queries to that net location. Authentication is not cached across invocations of uv.
`.netrc` authentication is enabled by default, and will respect the `NETRC` environment variable
if defined, falling back to `~/.netrc` if not.
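The `.netrc` format can be inspected with Python's standard-library parser, which is a convenient way to verify that a credentials file parses as expected before pointing uv at it:

```python
import netrc
import os
import tempfile

# Write a throwaway .netrc and read it back with the stdlib parser; the entry
# grants the example credentials from this page for example.com.
with tempfile.NamedTemporaryFile("w", suffix=".netrc", delete=False) as f:
    f.write("machine example.com login public password koala\n")
    path = f.name
os.chmod(path, 0o600)  # keep credential files private

login, _account, password = netrc.netrc(path).authenticators("example.com")
print(login, password)  # public koala
os.unlink(path)
```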
To enable keyring-based authentication, pass the `--keyring-provider subprocess` command-line
argument to uv, or set `UV_KEYRING_PROVIDER=subprocess`.
Authentication may be used for hosts specified in the following contexts:

- `index-url`
- `extra-index-url`
- `find-links`
- `package @ https://...`
See the `pip` compatibility guide for details on differences from `pip`.
By default, uv loads certificates from the bundled `webpki-roots` crate. The `webpki-roots` are
a reliable set of trust roots from Mozilla, and including them in uv improves portability and
performance (especially on macOS, where reading the system trust store incurs a significant
delay).

However, in some cases, you may want to use the platform's native certificate store, especially
if you're relying on a corporate trust root (e.g., for a mandatory proxy) that's included in
your system's certificate store. To instruct uv to use the system's trust store, run uv with the
`--native-tls` command-line flag, or set the `UV_NATIVE_TLS` environment variable to `true`.
If a direct path to the certificate is required (e.g., in CI), set the `SSL_CERT_FILE`
environment variable to the path of the certificate bundle, to instruct uv to use that file
instead of the system's trust store.

If client certificate authentication (mTLS) is desired, set the `SSL_CLIENT_CERT` environment
variable to the path of the PEM-formatted file containing the certificate followed by the
private key.
Finally, if you're using a setup in which you want to trust a self-signed certificate or
otherwise disable certificate verification, you can instruct uv to allow insecure connections to
dedicated hosts via the `allow-insecure-host` configuration option. For example, adding the
following to `pyproject.toml` will allow insecure connections to `example.com`:

```toml
[tool.uv]
allow-insecure-host = ["example.com"]
```

`allow-insecure-host` expects to receive a hostname (e.g., `localhost`) or hostname-port pair
(e.g., `localhost:8080`), and is only applicable to HTTPS connections, as HTTP connections are
inherently insecure.

Use `allow-insecure-host` with caution and only in trusted environments, as it can expose you to
security risks due to the lack of certificate verification.
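The matching behavior described here can be sketched as follows (an approximation, not uv's exact logic):

```python
from urllib.parse import urlsplit


def is_insecure_allowed(url: str, allow_list: list[str]) -> bool:
    """Approximate allow-insecure-host matching: entries are a hostname or a
    host:port pair, and only HTTPS URLs are in scope."""
    parts = urlsplit(url)
    if parts.scheme != "https":
        return False  # HTTP is out of scope; it is already insecure
    host, port = parts.hostname, parts.port
    for entry in allow_list:
        allowed_host, _, allowed_port = entry.partition(":")
        if host != allowed_host:
            continue
        # A bare hostname matches any port; host:port must match exactly.
        if not allowed_port or (port is not None and port == int(allowed_port)):
            return True
    return False


print(is_insecure_allowed("https://example.com/simple", ["example.com"]))  # True
```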
See the alternative indexes integration guide for details on authentication with popular alternative Python package indexes.
uv defines and respects the following environment variables:
Equivalent to the --break-system-packages
command-line argument. If set to true
,
uv will allow the installation of packages that conflict with system-installed packages.
WARNING: UV_BREAK_SYSTEM_PACKAGES=true
is intended for use in continuous integration
(CI) or containerized environments and should be used with caution, as modifying the system
Python can lead to unexpected behavior.
Equivalent to the `--build-constraint` command-line argument. If set, uv will use this file as
constraints for any source distribution builds. Accepts a space-separated list of files.
Equivalent to the --cache-dir
command-line argument. If set, uv will use this
directory for caching instead of the default cache directory.
Equivalent to the --compile-bytecode
command-line argument. If set, uv
will compile Python source files to bytecode after installation.
Sets the maximum number of source distributions that uv will build concurrently at any given time.
Sets the maximum number of in-flight concurrent downloads that uv will perform at any given time.
Controls the number of threads used when installing and unzipping packages.
Equivalent to the --config-file
command-line argument. Expects a path to a
local uv.toml
file to use as the configuration file.
Equivalent to the `--constraint` command-line argument. If set, uv will use this file as the
constraints file. Accepts a space-separated list of files.
Equivalent to the --custom-compile-command
command-line argument.
Used to override uv in the output header of the requirements.txt
files generated by
uv pip compile
. Intended for use-cases in which uv pip compile
is called from within a wrapper
script, to include the name of the wrapper script in the output file.
Equivalent to the --default-index
command-line argument. If set, uv will use
this URL as the default index when searching for packages.
.env
files from which to load environment variables when executing uv run
commands.
Equivalent to the --exclude-newer
command-line argument. If set, uv will
exclude distributions published after the specified date.
Equivalent to the --extra-index-url
command-line argument. If set, uv will
use this space-separated list of URLs as additional indexes when searching for packages.
(Deprecated: use UV_INDEX
instead.)
Equivalent to the --find-links
command-line argument. If set, uv will use this
comma-separated list of additional locations to search for packages.
Equivalent to the --fork-strategy
argument. Controls version selection during universal
resolution.
Equivalent to the --frozen
command-line argument. If set, uv will run without
updating the uv.lock
file.
Equivalent to the --token
argument for self update. A GitHub token for authentication.
Timeout (in seconds) for HTTP requests. (default: 30 s)
Equivalent to the --index
command-line argument. If set, uv will use this
space-separated list of URLs as additional indexes when searching for packages.
Equivalent to the --index-strategy
command-line argument. For example, if
set to unsafe-any-match
, uv will consider versions of a given package available across all index
URLs, rather than limiting its search to the first index URL that contains the package.
Equivalent to the --index-url
command-line argument. If set, uv will use this
URL as the default index when searching for packages.
(Deprecated: use UV_DEFAULT_INDEX
instead.)
Generates the environment variable key for the HTTP Basic authentication password.
Generates the environment variable key for the HTTP Basic authentication username.
Equivalent to the --allow-insecure-host
argument.
The URL from which to download uv using the standalone installer and self update
feature,
in lieu of the default GitHub Enterprise URL.
The URL from which to download uv using the standalone installer and self update
feature,
in lieu of the default GitHub URL.
The directory in which to install uv using the standalone installer and self update
feature.
Defaults to ~/.local/bin
.
Equivalent to the --keyring-provider
command-line argument. If set, uv
will use this value as the keyring provider.
Equivalent to the --link-mode
command-line argument. If set, uv will use this as
a link mode.
Equivalent to the --locked
command-line argument. If set, uv will assert that the
uv.lock
remains unchanged.
Equivalent to the --native-tls
command-line argument. If set to true
, uv will
use the system's trust store instead of the bundled webpki-roots
crate.
Equivalent to the --no-build-isolation
command-line argument. If set, uv will
skip isolation when building source distributions.
Equivalent to the --no-cache
command-line argument. If set, uv will not use the
cache for any operations.
Equivalent to the --no-config
command-line argument. If set, uv will not read
any configuration files from the current directory, parent directories, or user configuration
directories.
Ignore .env
files when executing uv run
commands.
Skip writing uv
installer metadata files (e.g., INSTALLER
, REQUESTED
, and direct_url.json
) to site-packages .dist-info
directories.
Equivalent to the --no-progress
command-line argument. Disables all progress output. For
example, spinners and progress bars.
Equivalent to the --no-sync
command-line argument. If set, uv will skip updating
the environment.
Equivalent to the --no-verify-hashes
argument. Disables hash verification for
requirements.txt
files.
Use to disable line wrapping for diagnostics.
Equivalent to the --offline
command-line argument. If set, uv will disable network access.
Equivalent to the `--override` command-line argument. If set, uv will use this file as the
overrides file. Accepts a space-separated list of files.
Equivalent to the --prerelease
command-line argument. For example, if set to
allow
, uv will allow pre-release versions for all dependencies.
Equivalent to the --preview
argument. Enables preview mode.
Specifies the path to the directory to use for a project virtual environment. See the project documentation for more details.
Don't upload a file if it already exists on the index. The value is the URL of the index.
Equivalent to the `--index` command-line argument in `uv publish`. If set, uv will use the index
with this name in the configuration for publishing.
Equivalent to the --password
command-line argument in uv publish
. If
set, uv will use this password for publishing.
Equivalent to the --token
command-line argument in uv publish
. If set, uv
will use this token (with the username __token__
) for publishing.
Equivalent to the --publish-url
command-line argument. The URL of the upload
endpoint of the index to use with uv publish
.
Equivalent to the --username
command-line argument in uv publish
. If
set, uv will use this username for publishing.
Managed PyPy installations are downloaded from
python.org. This variable can be set to a mirror URL to use a
different source for PyPy installations. The provided URL will replace
https://downloads.python.org/pypy
in, e.g.,
https://downloads.python.org/pypy/pypy3.8-v7.3.7-osx64.tar.bz2
.
Distributions can be read from a local directory by using the file://
URL scheme.
Equivalent to the --python
command-line argument. If set to a path, uv will use
this Python interpreter for all operations.
Specifies the directory to place links to installed, managed Python executables.
Equivalent to the
python-downloads
setting and, when disabled, the
--no-python-downloads
option. Whether uv should allow Python downloads.
Specifies the directory for storing managed Python installations.
Managed Python installations are downloaded from the Astral
python-build-standalone
project.
This variable can be set to a mirror URL to use a different source for Python installations.
The provided URL will replace https://github.com/astral-sh/python-build-standalone/releases/download
in, e.g.,
https://github.com/astral-sh/python-build-standalone/releases/download/20240713/cpython-3.12.4%2B20240713-aarch64-apple-darwin-install_only.tar.gz
.
Distributions can be read from a local directory by using the file://
URL scheme.
Equivalent to the --python-preference
command-line argument. Whether uv
should prefer system or managed Python versions.
Timeout (in seconds) for HTTP requests. Equivalent to UV_HTTP_TIMEOUT
.
Equivalent to the --require-hashes
command-line argument. If set to true
,
uv will require that all dependencies have a hash specified in the requirements file.
Equivalent to the --resolution
command-line argument. For example, if set to
lowest-direct
, uv will install the lowest compatible versions of all direct dependencies.
Use to increase the stack size used by uv in debug builds on Windows.
Equivalent to the --system
command-line argument. If set to true
, uv will
use the first Python interpreter found in the system PATH
.
WARNING: UV_SYSTEM_PYTHON=true
is intended for use in continuous integration (CI)
or containerized environments and should be used with caution, as modifying the system
Python can lead to unexpected behavior.
Specifies the "bin" directory for installing tool executables.
Specifies the directory where uv stores managed tools.
Used in ephemeral environments, such as CI, to install uv to a specific path while preventing the installer from modifying shell profiles or environment variables.
uv also reads the following externally defined environment variables:
Used for trusted publishing via uv publish
. Contains the OIDC request token.
Used for trusted publishing via uv publish
. Contains the OIDC token URL.
General proxy for all network requests.
Used to detect Bash shell usage.
Use to control color via anstyle
.
Used to determine if an active Conda environment is the base environment or not.
Used to detect an activated Conda environment.
Used to detect Fish shell usage.
Forces colored output regardless of terminal support.
See force-color.org.
Used for trusted publishing via uv publish
.
The standard HOME
env var.
Proxy for HTTPS requests.
Proxy for HTTP requests.
Timeout (in seconds) for HTTP requests. Equivalent to UV_HTTP_TIMEOUT
.
Avoid modifying the PATH
environment variable when installing uv using the standalone
installer and self update
feature.
Used to detect when running inside a Jupyter notebook.
Used to detect Ksh shell usage.
Used to look for Microsoft Store Python installations.
Used with --python-platform macos
and related variants to set the
deployment target (i.e., the minimum supported macOS version).
Defaults to 12.0
, the least-recent non-EOL macOS version at time of writing.
Use to set the .netrc file location.
Disables colored output (takes precedence over FORCE_COLOR
).
See no-color.org.
Used to detect NuShell
usage.
The standard PAGER
posix env var. Used by uv
to configure the appropriate pager.
The standard PATH
env var.
Used to detect the use of the Windows Command Prompt (as opposed to PowerShell).
The standard PWD
posix env var.
The validation modes to use when run with --compile
.
See PycInvalidationMode
.
Adds directories to Python module search path (e.g., PYTHONPATH=/path/to/modules
).
If set, uv will use this value as the log level for its --verbose
output. Accepts
any filter compatible with the tracing_subscriber
crate.
For example:

- `RUST_LOG=uv=debug` is the equivalent of adding `--verbose` to the command line
- `RUST_LOG=trace` will enable trace-level logging
See the tracing documentation for more.
The standard SHELL
posix env var.
Custom certificate bundle file path for SSL connections.
If set, uv will use this file for mTLS authentication. This should be a single file containing both the certificate and the private key in PEM format.
Path to system-level configuration directory on Windows systems.
Use to create the tracing durations file via the tracing-durations-export
feature.
Used to detect an activated virtual environment.
If set to 1
before a virtual environment is activated, then the
virtual environment name will not be prepended to the terminal prompt.
Path to directory where executables are installed.
Path to cache directory on Unix systems.
Path to system-level configuration directory on Unix systems.
Path to user-level configuration directory on Unix systems.
Path to directory for storing managed Python installations and tools.
Used to determine which .zshenv
to use when Zsh is being used.
Used to detect Zsh shell usage.
uv supports managing Python projects, which define their dependencies in a pyproject.toml
file.
You can create a new Python project using the uv init
command:
$ uv init hello-world
$ cd hello-world
Alternatively, you can initialize a project in the working directory:
$ mkdir hello-world
$ cd hello-world
$ uv init
uv will create the following files:
.
├── .python-version
├── README.md
├── hello.py
└── pyproject.toml
The hello.py
file contains a simple "Hello world" program. Try it out with uv run
:
$ uv run hello.py
Hello from hello-world!
A project consists of a few important parts that work together and allow uv to manage your project.
In addition to the files created by uv init
, uv will create a virtual environment and uv.lock
file in the root of your project the first time you run a project command, i.e., uv run
,
uv sync
, or uv lock
.
A complete listing would look like:
.
├── .venv
│ ├── bin
│ ├── lib
│ └── pyvenv.cfg
├── .python-version
├── README.md
├── hello.py
├── pyproject.toml
└── uv.lock
The pyproject.toml
contains metadata about your project:
[project]
name = "hello-world"
version = "0.1.0"
description = "Add your description here"
readme = "README.md"
dependencies = []
You'll use this file to specify dependencies, as well as details about the project such as its
description or license. You can edit this file manually, or use commands like uv add
and
uv remove
to manage your project from the terminal.
!!! tip
See the official [`pyproject.toml` guide](https://packaging.python.org/en/latest/guides/writing-pyproject-toml/)
for more details on getting started with the `pyproject.toml` format.
You'll also use this file to specify uv configuration options in a
[tool.uv]
section.
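For instance, a hedged illustration of a pyproject.toml carrying a [tool.uv] section (the exclude-newer option shown here is one of many available settings; see the settings reference for the full list):

```toml
[project]
name = "hello-world"
version = "0.1.0"
dependencies = []

[tool.uv]
# Limit resolution to distributions published before this date
exclude-newer = "2023-10-16T00:00:00Z"
```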
The .python-version
file contains the project's default Python version. This file tells uv which
Python version to use when creating the project's virtual environment.
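The file itself contains just a version string, e.g.:

```text
3.12
```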
The .venv
folder contains your project's virtual environment, a Python environment that is
isolated from the rest of your system. This is where uv will install your project's dependencies.
See the project environment documentation for more details.
uv.lock
is a cross-platform lockfile that contains exact information about your project's
dependencies. Unlike the pyproject.toml
which is used to specify the broad requirements of your
project, the lockfile contains the exact resolved versions that are installed in the project
environment. This file should be checked into version control, allowing for consistent and
reproducible installations across machines.
uv.lock
is a human-readable TOML file but is managed by uv and should not be edited manually.
See the lockfile documentation for more details.
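As an illustration only (the schema is managed by uv and may change between versions), an excerpt of a lockfile for this project might look like:

```toml
version = 1
requires-python = ">=3.12"

[[package]]
name = "hello-world"
version = "0.1.0"
source = { editable = "." }
```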
You can add dependencies to your pyproject.toml
with the uv add
command. This will also update
the lockfile and project environment:
$ uv add requests
You can also specify version constraints or alternative sources:
$ # Specify a version constraint
$ uv add 'requests==2.31.0'
$ # Add a git dependency
$ uv add git+https://github.com/psf/requests
To remove a package, you can use uv remove
:
$ uv remove requests
To upgrade a package, run uv lock
with the --upgrade-package
flag:
$ uv lock --upgrade-package requests
The --upgrade-package
flag will attempt to update the specified package to the latest compatible
version, while keeping the rest of the lockfile intact.
See the documentation on managing dependencies for more details.
uv run
can be used to run arbitrary scripts or commands in your project environment.
Prior to every uv run
invocation, uv will verify that the lockfile is up-to-date with the
pyproject.toml
, and that the environment is up-to-date with the lockfile, keeping your project
in-sync without the need for manual intervention. uv run
guarantees that your command is run in a
consistent, locked environment.
For example, to use flask
:
$ uv add flask
$ uv run -- flask run -p 3000
Or, to run a script:
# Require a project dependency
import flask
print("hello world")
$ uv run example.py
Alternatively, you can use uv sync
to manually update the environment then activate it before
executing a command:
$ uv sync
$ source .venv/bin/activate
$ flask run -p 3000
$ python example.py
!!! note
The virtual environment must be active to run scripts and commands in the project without `uv run`. Virtual environment activation differs per shell and platform.
See the documentation on running commands and scripts in projects for more details.
uv build
can be used to build source distributions and binary distributions (wheel) for your
project.
By default, uv build
will build the project in the current directory, and place the built
artifacts in a dist/
subdirectory:
$ uv build
$ ls dist/
hello-world-0.1.0-py3-none-any.whl
hello-world-0.1.0.tar.gz
See the documentation on building projects for more details.
To learn more about working on projects with uv, see the projects concept page and the command reference.
Or, read on to learn how to publish your project as a package.
Many Python packages provide applications that can be used as tools. uv has specialized support for easily invoking and installing tools.
The uvx
command invokes a tool without installing it.
For example, to run ruff
:
$ uvx ruff
!!! note
This is exactly equivalent to:
```console
$ uv tool run ruff
```
`uvx` is provided as an alias for convenience.
Arguments can be provided after the tool name:
$ uvx pycowsay hello from uv
-------------
< hello from uv >
-------------
\ ^__^
\ (oo)\_______
(__)\ )\/\
||----w |
|| ||
Tools are installed into temporary, isolated environments when using uvx
.
!!! note
If you are running a tool in a [_project_](../concepts/projects/index.md) and the tool requires that
your project is installed, e.g., when using `pytest` or `mypy`, you'll want to use
[`uv run`](./projects.md#running-commands) instead of `uvx`. Otherwise, the tool will be run in
a virtual environment that is isolated from your project.
If your project has a flat structure, e.g., instead of using a `src` directory for modules,
the project itself does not need to be installed and `uvx` is fine. In this case, using
`uv run` is only beneficial if you want to pin the version of the tool in the project's
dependencies.
When uvx ruff
is invoked, uv installs the ruff
package which provides the ruff
command.
However, sometimes the package and command names differ.
The --from
option can be used to invoke a command from a specific package, e.g. http
which is
provided by httpie
:
$ uvx --from httpie http
To run a tool at a specific version, use command@<version>
:
$ uvx [email protected] check
To run a tool at the latest version, use command@latest
:
$ uvx ruff@latest check
The --from
option can also be used to specify package versions, as above:
$ uvx --from 'ruff==0.3.0' ruff check
Or, to constrain to a range of versions:
$ uvx --from 'ruff>0.2.0,<0.3.0' ruff check
Note the @
syntax cannot be used for anything other than an exact version.
The --from
option can be used to run a tool with extras:
$ uvx --from 'mypy[faster-cache,reports]' mypy --xml-report mypy_report
This can also be combined with version selection:
$ uvx --from 'mypy[faster-cache,reports]==1.13.0' mypy --xml-report mypy_report
The --from
option can also be used to install from alternative sources.
For example, to pull from git:
$ uvx --from git+https://github.com/httpie/cli httpie
You can also pull the latest commit from a specific named branch:
$ uvx --from git+https://github.com/httpie/cli@master httpie
Or pull a specific tag:
$ uvx --from git+https://github.com/httpie/[email protected] httpie
Or even a specific commit:
$ uvx --from git+https://github.com/httpie/cli@2843b87 httpie
Additional dependencies can be included, e.g., to include mkdocs-material
when running mkdocs
:
$ uvx --with mkdocs-material mkdocs --help
If a tool is used often, it is useful to install it to a persistent environment and add it to the
PATH
instead of invoking uvx
repeatedly.
!!! tip
`uvx` is a convenient alias for `uv tool run`. All of the other commands for interacting with
tools require the full `uv tool` prefix.
To install ruff
:
$ uv tool install ruff
When a tool is installed, its executables are placed in a bin
directory in the PATH
which allows
the tool to be run without uv. If it's not on the PATH
, a warning will be displayed and
uv tool update-shell
can be used to add it to the PATH
.
After installing ruff
, it should be available:
$ ruff --version
Unlike uv pip install
, installing a tool does not make its modules available in the current
environment. For example, the following command will fail:
$ python -c "import ruff"
This isolation is important for reducing interactions and conflicts between dependencies of tools, scripts, and projects.
Unlike uvx
, uv tool install
operates on a package and will install all executables provided by
the tool.
For example, the following will install the http
, https
, and httpie
executables:
$ uv tool install httpie
Additionally, package versions can be included without --from
:
$ uv tool install 'httpie>0.1.0'
And, similarly, for package sources:
$ uv tool install git+https://github.com/httpie/cli
As with uvx
, installations can include additional packages:
$ uv tool install mkdocs --with mkdocs-material
To upgrade a tool, use uv tool upgrade
:
$ uv tool upgrade ruff
Tool upgrades will respect the version constraints provided when installing the tool. For example,
uv tool install 'ruff>=0.3,<0.4'
followed by uv tool upgrade ruff
will upgrade Ruff to the latest
version in the range >=0.3,<0.4
.
To instead replace the version constraints, re-install the tool with uv tool install
:
$ uv tool install 'ruff>=0.4'
To instead upgrade all tools:
$ uv tool upgrade --all
To learn more about managing tools with uv, see the Tools concept page and the command reference.
Or, read on to learn how to work on projects.
uv supports building Python packages into source and binary distributions via uv build
and
uploading them to a registry with uv publish
.
Before attempting to publish your project, you'll want to make sure it's ready to be packaged for distribution.
If your project does not include a [build-system]
definition in the pyproject.toml
, uv will not
build it by default. This means that your project may not be ready for distribution. Read more about
the effect of declaring a build system in the
project concept documentation.
!!! note
If you have internal packages that you do not want to be published, you can mark them as
private:
```toml
[project]
classifiers = ["Private :: Do Not Upload"]
```
This classifier causes PyPI to reject the package at upload time. It does not affect
security or privacy settings on alternative registries.
We also recommend generating only per-project tokens: without a PyPI token matching the project,
it cannot be accidentally published.
Build your package with uv build
:
$ uv build
By default, uv build
will build the project in the current directory, and place the built
artifacts in a dist/
subdirectory.
Alternatively, uv build <SRC>
will build the package in the specified directory, while
uv build --package <PACKAGE>
will build the specified package within the current workspace.
!!! info
By default, `uv build` respects `tool.uv.sources` when resolving build dependencies from the
`build-system.requires` section of the `pyproject.toml`. When publishing a package, we recommend
running `uv build --no-sources` to ensure that the package builds correctly when `tool.uv.sources`
is disabled, as is the case when using other build tools, like [`pypa/build`](https://github.com/pypa/build).
Publish your package with uv publish
:
$ uv publish
Set a PyPI token with --token
or UV_PUBLISH_TOKEN
, or set a username with --username
or
UV_PUBLISH_USERNAME
and password with --password
or UV_PUBLISH_PASSWORD
. For publishing to
PyPI from GitHub Actions, you don't need to set any credentials. Instead,
add a trusted publisher to the PyPI project.
!!! note
PyPI no longer supports publishing with a username and password; instead, you need to
generate a token. Using a token is equivalent to setting `--username __token__` and using the
token as the password.
If you're using a custom index through [[tool.uv.index]]
, add publish-url
and use
uv publish --index <name>
. For example:
[[tool.uv.index]]
name = "testpypi"
url = "https://test.pypi.org/simple/"
publish-url = "https://test.pypi.org/legacy/"
!!! note
When using `uv publish --index <name>`, the `pyproject.toml` must be present, i.e. you need to
have a checkout step in a publish CI job.
Even though uv publish
retries failed uploads, it can happen that publishing fails in the middle,
with some files uploaded and some files still missing. With PyPI, you can retry the exact same
command, existing identical files will be ignored. With other registries, use
--check-url <index url>
with the index URL (not the publish URL) the packages belong to. When
using --index
, the index URL is used as check URL. uv will skip uploading files that are identical
to files in the registry, and it will also handle raced parallel uploads. Note that existing files
need to match exactly those previously uploaded to the registry; this avoids accidentally
publishing source distributions and wheels with different contents for the same version.
Test that the package can be installed and imported with uv run
:
$ uv run --with <PACKAGE> --no-project -- python -c "import <PACKAGE>"
The --no-project
flag is used to avoid installing the package from your local project directory.
!!! tip
If you have recently installed the package, you may need to include the
`--refresh-package <PACKAGE>` option to avoid using a cached version of the package.
To learn more about publishing packages, check out the PyPA guides on building and publishing.
Or, read on for guides on integrating uv with other software.
A Python script is a file intended for standalone execution, e.g., with python <script>.py
. Using
uv to execute scripts ensures that script dependencies are managed without manually managing
environments.
!!! note
If you are not familiar with Python environments: every Python installation has an environment
that packages can be installed in. Typically, creating [_virtual_ environments](https://docs.python.org/3/library/venv.html) is recommended to
isolate packages required by each script. uv automatically manages virtual environments for you
and prefers a [declarative](#declaring-script-dependencies) approach to dependencies.
If your script has no dependencies, you can execute it with uv run
:
print("Hello world")
$ uv run example.py
Hello world
Similarly, if your script depends on a module in the standard library, there's nothing more to do:
import os
print(os.path.expanduser("~"))
$ uv run example.py
/Users/astral
Arguments may be provided to the script:
import sys
print(" ".join(sys.argv[1:]))
$ uv run example.py test
test
$ uv run example.py hello world!
hello world!
Additionally, your script can be read directly from stdin:
$ echo 'print("hello world!")' | uv run -
Or, if your shell supports here-documents:
uv run - <<EOF
print("hello world!")
EOF
Note that if you use uv run
in a project, i.e. a directory with a pyproject.toml
, it will
install the current project before running the script. If your script does not depend on the
project, use the --no-project
flag to skip this:
$ # Note, it is important that the flag comes _before_ the script
$ uv run --no-project example.py
See the projects guide for more details on working in projects.
When your script requires other packages, they must be installed into the environment that the script runs in. uv prefers to create these environments on-demand instead of using a long-lived virtual environment with manually managed dependencies. This requires explicit declaration of dependencies that are required for the script. Generally, it's recommended to use a project or inline metadata to declare dependencies, but uv supports requesting dependencies per invocation as well.
For example, the following script requires rich
.
import time
from rich.progress import track
for i in track(range(20), description="For example:"):
time.sleep(0.05)
If executed without specifying a dependency, this script will fail:
$ uv run --no-project example.py
Traceback (most recent call last):
File "/Users/astral/example.py", line 2, in <module>
from rich.progress import track
ModuleNotFoundError: No module named 'rich'
Request the dependency using the --with
option:
$ uv run --with rich example.py
For example: ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 100% 0:00:01
Constraints can be added to the requested dependency if specific versions are needed:
$ uv run --with 'rich>12,<13' example.py
Multiple dependencies can be requested by repeating the --with option.
Note that if uv run
is used in a project, these dependencies will be included in addition to
the project's dependencies. To opt-out of this behavior, use the --no-project
flag.
Python recently added a standard format for
inline script metadata.
It allows for selecting Python versions and defining dependencies. Use uv init --script
to
initialize scripts with the inline metadata:
$ uv init --script example.py --python 3.12
The inline metadata format allows the dependencies for a script to be declared in the script itself.
uv supports adding and updating inline script metadata for you. Use uv add --script
to declare the
dependencies for the script:
$ uv add --script example.py 'requests<3' 'rich'
This will add a script
section at the top of the script declaring the dependencies using TOML:
# /// script
# dependencies = [
# "requests<3",
# "rich",
# ]
# ///
import requests
from rich.pretty import pprint
resp = requests.get("https://peps.python.org/api/peps.json")
data = resp.json()
pprint([(k, v["title"]) for k, v in data.items()][:10])
uv will automatically create an environment with the dependencies necessary to run the script, e.g.:
$ uv run example.py
[
│ ('1', 'PEP Purpose and Guidelines'),
│ ('2', 'Procedure for Adding New Modules'),
│ ('3', 'Guidelines for Handling Bug Reports'),
│ ('4', 'Deprecation of Standard Modules'),
│ ('5', 'Guidelines for Language Evolution'),
│ ('6', 'Bug Fix Releases'),
│ ('7', 'Style Guide for C Code'),
│ ('8', 'Style Guide for Python Code'),
│ ('9', 'Sample Plaintext PEP Template'),
│ ('10', 'Voting Guidelines')
]
!!! important
When using inline script metadata, even if `uv run` is [used in a _project_](../concepts/projects/run.md), the project's dependencies will be ignored. The `--no-project` flag is not required.
uv also respects Python version requirements:
# /// script
# requires-python = ">=3.12"
# dependencies = []
# ///
# Use some syntax added in Python 3.12
type Point = tuple[float, float]
print(Point)
!!! note
The `dependencies` field must be provided even if empty.
uv run
will search for and use the required Python version. The required Python version will be
downloaded if it is not installed — see the documentation on Python versions for
more details.
uv supports an exclude-newer
field in the tool.uv
section of inline script metadata to limit uv
to only considering distributions released before a specific date. This is useful for improving the
reproducibility of your script when run at a later point in time.
The date must be specified as an RFC 3339 timestamp
(e.g., 2006-12-02T02:07:43Z
).
# /// script
# dependencies = [
# "requests",
# ]
# [tool.uv]
# exclude-newer = "2023-10-16T00:00:00Z"
# ///
import requests
print(requests.__version__)
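A timestamp can be sanity-checked in Python before adding it to the metadata. This sketch normalizes the trailing Z because datetime.fromisoformat only accepts it from Python 3.11 onward:

```python
from datetime import datetime

def parse_rfc3339(value: str) -> datetime:
    # Python < 3.11 `datetime.fromisoformat` rejects a trailing "Z",
    # so normalize it to an explicit UTC offset first
    return datetime.fromisoformat(value.replace("Z", "+00:00"))

cutoff = parse_rfc3339("2023-10-16T00:00:00Z")
assert cutoff.tzinfo is not None  # RFC 3339 timestamps carry a UTC offset
print(cutoff.isoformat())  # 2023-10-16T00:00:00+00:00
```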
uv allows arbitrary Python versions to be requested on each script invocation, for example:
import sys
print(".".join(map(str, sys.version_info[:3])))
$ # Use the default Python version, may differ on your machine
$ uv run example.py
3.12.6
$ # Use a specific Python version
$ uv run --python 3.10 example.py
3.10.15
See the Python version request documentation for more details on requesting Python versions.
On Windows, uv will run scripts ending with a .pyw extension using pythonw:
from tkinter import Tk, ttk
root = Tk()
root.title("uv")
frm = ttk.Frame(root, padding=10)
frm.grid()
ttk.Label(frm, text="Hello World").grid(column=0, row=0)
root.mainloop()
PS> uv run example.pyw
Similarly, it works with dependencies as well:
import sys
from PyQt5.QtWidgets import QApplication, QWidget, QLabel, QGridLayout
app = QApplication(sys.argv)
widget = QWidget()
grid = QGridLayout()
text_label = QLabel()
text_label.setText("Hello World!")
grid.addWidget(text_label)
widget.setLayout(grid)
widget.setGeometry(100, 100, 200, 50)
widget.setWindowTitle("uv")
widget.show()
sys.exit(app.exec_())
PS> uv run --with PyQt5 example_pyqt.pyw
To learn more about uv run
, see the command reference.
Or, read on to learn how to run and install tools with uv.
Check out one of the core guides to get started:
- Installing Python versions
- Running scripts and declaring dependencies
- Running and installing applications as tools
- Creating and working on projects
- Integrate uv with other software, e.g., Docker, GitHub, PyTorch, and more
Or, explore the concept documentation for comprehensive breakdown of each feature.
It is considered best practice to regularly update dependencies to avoid exposure to vulnerabilities, to limit incompatibilities between dependencies, and to avoid complex upgrades from versions that have fallen far behind. A variety of tools can help you stay up-to-date by creating automated pull requests. Several of them support uv, or have work underway to support it.
uv is supported by Renovate.
!!! note
Updating `uv pip compile` outputs such as `requirements.txt` is not yet supported. Progress can
be tracked
at [renovatebot/renovate#30909](https://github.com/renovatebot/renovate/issues/30909).
Renovate uses the presence of a uv.lock
file to determine that uv is used for managing
dependencies, and will suggest upgrades to
project dependencies,
optional dependencies and
development dependencies.
Renovate will update both the pyproject.toml
and uv.lock
files.
The lockfile can also be refreshed on a regular basis (for instance to update transitive
dependencies) by enabling the
lockFileMaintenance
option:
{
$schema: "https://docs.renovatebot.com/renovate-schema.json",
lockFileMaintenance: {
enabled: true,
},
}
Renovate supports updating dependencies defined using script inline metadata.
Since it cannot automatically detect which Python files use script inline metadata, their locations
need to be explicitly defined using
fileMatch
, like so:
{
$schema: "https://docs.renovatebot.com/renovate-schema.json",
pep723: {
fileMatch: [
"scripts/generate_docs\\.py",
"scripts/run_server\\.py",
],
},
}
Support for uv is not yet available. Progress can be tracked at:

- dependabot/dependabot-core#10478 for uv.lock output
- dependabot/dependabot-core#10039 for uv pip compile outputs
!!! tip
Check out the [`uv-docker-example`](https://github.com/astral-sh/uv-docker-example) project for
an example of best practices when using uv to build an application in Docker.
A Docker image is published with a built version of uv available. To run a uv command in a container:
$ docker run ghcr.io/astral-sh/uv --help
uv provides a distroless Docker image including the uv
binary. The following tags are published:
- ghcr.io/astral-sh/uv:latest
- ghcr.io/astral-sh/uv:{major}.{minor}.{patch}, e.g., ghcr.io/astral-sh/uv:0.5.11
- ghcr.io/astral-sh/uv:{major}.{minor}, e.g., ghcr.io/astral-sh/uv:0.5 (the latest patch version)
In addition, uv publishes the following images:
- Based on alpine:3.20:
    - ghcr.io/astral-sh/uv:alpine
    - ghcr.io/astral-sh/uv:alpine3.20
- Based on debian:bookworm-slim:
    - ghcr.io/astral-sh/uv:debian-slim
    - ghcr.io/astral-sh/uv:bookworm-slim
- Based on buildpack-deps:bookworm:
    - ghcr.io/astral-sh/uv:debian
    - ghcr.io/astral-sh/uv:bookworm
- Based on python3.x-alpine:
    - ghcr.io/astral-sh/uv:python3.13-alpine
    - ghcr.io/astral-sh/uv:python3.12-alpine
    - ghcr.io/astral-sh/uv:python3.11-alpine
    - ghcr.io/astral-sh/uv:python3.10-alpine
    - ghcr.io/astral-sh/uv:python3.9-alpine
    - ghcr.io/astral-sh/uv:python3.8-alpine
- Based on python3.x-bookworm:
    - ghcr.io/astral-sh/uv:python3.13-bookworm
    - ghcr.io/astral-sh/uv:python3.12-bookworm
    - ghcr.io/astral-sh/uv:python3.11-bookworm
    - ghcr.io/astral-sh/uv:python3.10-bookworm
    - ghcr.io/astral-sh/uv:python3.9-bookworm
    - ghcr.io/astral-sh/uv:python3.8-bookworm
- Based on python3.x-slim-bookworm:
    - ghcr.io/astral-sh/uv:python3.13-bookworm-slim
    - ghcr.io/astral-sh/uv:python3.12-bookworm-slim
    - ghcr.io/astral-sh/uv:python3.11-bookworm-slim
    - ghcr.io/astral-sh/uv:python3.10-bookworm-slim
    - ghcr.io/astral-sh/uv:python3.9-bookworm-slim
    - ghcr.io/astral-sh/uv:python3.8-bookworm-slim
As with the distroless image, each image is published with uv version tags as
ghcr.io/astral-sh/uv:{major}.{minor}.{patch}-{base}
and
ghcr.io/astral-sh/uv:{major}.{minor}-{base}
, e.g., ghcr.io/astral-sh/uv:0.5.11-alpine
.
For more details, see the GitHub Container page.
Use one of the above images with uv pre-installed or install uv by copying the binary from the official distroless Docker image:
FROM python:3.12-slim-bookworm
COPY --from=ghcr.io/astral-sh/uv:latest /uv /uvx /bin/
Or, with the installer:
FROM python:3.12-slim-bookworm
# The installer requires curl (and certificates) to download the release archive
RUN apt-get update && apt-get install -y --no-install-recommends curl ca-certificates
# Download the latest installer
ADD https://astral.sh/uv/install.sh /uv-installer.sh
# Run the installer then remove it
RUN sh /uv-installer.sh && rm /uv-installer.sh
# Ensure the installed binary is on the `PATH`
ENV PATH="/root/.local/bin/:$PATH"
Note this requires curl
to be available.
In either case, it is best practice to pin to a specific uv version, e.g., with:
COPY --from=ghcr.io/astral-sh/uv:0.5.11 /uv /uvx /bin/
Or, with the installer:
ADD https://astral.sh/uv/0.5.11/install.sh /uv-installer.sh
If you're using uv to manage your project, you can copy it into the image and install it:
# Copy the project into the image
ADD . /app
# Sync the project into a new environment, using the frozen lockfile
WORKDIR /app
RUN uv sync --frozen
!!! important
It is best practice to add `.venv` to a [`.dockerignore` file](https://docs.docker.com/build/concepts/context/#dockerignore-files)
in your repository to prevent it from being included in image builds. The project virtual
environment is dependent on your local platform and should be created from scratch in the image.
Then, to start your application by default:
# Presuming there is a `my_app` command provided by the project
CMD ["uv", "run", "my_app"]
!!! tip
It is best practice to use [intermediate layers](#intermediate-layers) separating installation
of dependencies and the project itself to improve Docker image build times.
See a complete example in the
uv-docker-example
project.
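The intermediate-layer approach can be sketched as follows (assuming a committed uv.lock; the --no-install-project flag makes uv sync install only the dependencies, so the dependency layer stays cached until the lockfile changes):

```dockerfile
FROM python:3.12-slim-bookworm
COPY --from=ghcr.io/astral-sh/uv:latest /uv /uvx /bin/
WORKDIR /app

# Install dependencies first, without the project itself
COPY pyproject.toml uv.lock ./
RUN uv sync --frozen --no-install-project

# Then copy and install the project
COPY . .
RUN uv sync --frozen
```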
Once the project is installed, you can either activate the project virtual environment by placing its binary directory at the front of the path:
ENV PATH="/app/.venv/bin:$PATH"
Or, you can use uv run
for any commands that require the environment:
RUN uv run some_script.py
!!! tip
Alternatively, the
[`UV_PROJECT_ENVIRONMENT` setting](../../concepts/projects/config.md#project-environment-path) can
be set before syncing to install to the system Python environment and skip environment activation
entirely.
To use installed tools, ensure the tool bin directory is on the path:
ENV PATH=/root/.local/bin:$PATH
RUN uv tool install cowsay
$ docker run -it $(docker build -q .) /bin/bash -c "cowsay -t hello"
_____
| hello |
=====
\
\
^__^
(oo)\_______
(__)\ )\/\
||----w |
|| ||
!!! note
The tool bin directory's location can be determined by running the `uv tool dir --bin` command
in the container.
Alternatively, it can be set to a constant location:
```dockerfile title="Dockerfile"
ENV UV_TOOL_BIN_DIR=/opt/uv-bin/
```
While uv installs a compatible Python version if there isn't one available in the image, uv does not yet support installing Python for musl-based distributions. For example, if you are using an Alpine Linux base image that doesn't have Python installed, you need to add it with the system package manager:
apk add --no-cache python3~=3.12
When developing, it's useful to mount the project directory into a container. With this setup, changes to the project can be immediately reflected in a containerized service without rebuilding the image. However, it is important not to include the project virtual environment (`.venv`) in the mount, because the virtual environment is platform-specific and the one built for the image should be kept.

Bind mount the project (in the working directory) to `/app` while retaining the `.venv` directory with an anonymous volume:

```console
$ docker run --rm --volume .:/app --volume /app/.venv [...]
```
!!! tip

    The `--rm` flag is included to ensure the container and anonymous volume are cleaned up when the
    container exits.

See a complete example in the [uv-docker-example](https://github.com/astral-sh/uv-docker-example) project.

When using Docker Compose, more sophisticated tooling is available for container development. The `watch` option allows for greater granularity than is practical with a bind mount and supports triggering updates to the containerized service when files change.
!!! note

    This feature requires Compose 2.22.0, which is bundled with Docker Desktop 4.24.

Configure `watch` in your Docker Compose file to mount the project directory without syncing the project virtual environment and to rebuild the image when the configuration changes:
```yaml title="compose.yaml"
services:
  example:
    build: .

    # ...

    develop:
      # Create a `watch` configuration to update the app
      #
      watch:
        # Sync the working directory with the `/app` directory in the container
        - action: sync
          path: .
          target: /app
          # Exclude the project virtual environment
          ignore:
            - .venv/

        # Rebuild the image on changes to the `pyproject.toml`
        - action: rebuild
          path: ./pyproject.toml
```
Then, run `docker compose watch` to run the container with the development setup.

See a complete example in the [uv-docker-example](https://github.com/astral-sh/uv-docker-example) project.
Compiling Python source files to bytecode is typically desirable for production images as it tends to improve startup time (at the cost of increased installation time).
To enable bytecode compilation, use the `--compile-bytecode` flag:

```dockerfile
RUN uv sync --compile-bytecode
```

Alternatively, you can set the `UV_COMPILE_BYTECODE` environment variable to ensure that all commands within the Dockerfile compile bytecode:

```dockerfile
ENV UV_COMPILE_BYTECODE=1
```
A cache mount can be used to improve performance across builds:

```dockerfile
ENV UV_LINK_MODE=copy
RUN --mount=type=cache,target=/root/.cache/uv \
    uv sync
```

Changing the default `UV_LINK_MODE` silences warnings about not being able to use hard links, since the cache and sync target are on separate file systems.

If you're not mounting the cache, image size can be reduced by using the `--no-cache` flag or setting `UV_NO_CACHE`.
!!! note

    The cache directory's location can be determined by running the `uv cache dir` command in the
    container.

    Alternatively, the cache can be set to a constant location:

    ```dockerfile title="Dockerfile"
    ENV UV_CACHE_DIR=/opt/uv-cache/
    ```
If you're using uv to manage your project, you can improve build times by moving your transitive dependency installation into its own layer via the `--no-install` options.

`uv sync --no-install-project` will install the dependencies of the project but not the project itself. Since the project changes frequently, but its dependencies are generally static, this can be a big time saver.
```dockerfile title="Dockerfile"
# Install uv
FROM python:3.12-slim
COPY --from=ghcr.io/astral-sh/uv:latest /uv /uvx /bin/

# Change the working directory to the `app` directory
WORKDIR /app

# Install dependencies
RUN --mount=type=cache,target=/root/.cache/uv \
    --mount=type=bind,source=uv.lock,target=uv.lock \
    --mount=type=bind,source=pyproject.toml,target=pyproject.toml \
    uv sync --frozen --no-install-project

# Copy the project into the image
ADD . /app

# Sync the project
RUN --mount=type=cache,target=/root/.cache/uv \
    uv sync --frozen
```
Note that the `pyproject.toml` is required to identify the project root and name, but the project contents are not copied into the image until the final `uv sync` command.

!!! tip

    If you're using a [workspace](../../concepts/projects/workspaces.md), then use the
    `--no-install-workspace` flag which excludes the project _and_ any workspace members.

    If you want to remove specific packages from the sync, use `--no-install-package <name>`.
By default, uv installs projects and workspace members in editable mode, such that changes to the source code are immediately reflected in the environment.
`uv sync` and `uv run` both accept a `--no-editable` flag, which instructs uv to install the project in non-editable mode, removing any dependency on the source code.

In the context of a multi-stage Docker image, `--no-editable` can be used to include the project in the synced virtual environment from one stage, then copy the virtual environment alone (and not the source code) into the final image.
For example:
```dockerfile title="Dockerfile"
# Install uv
FROM python:3.12-slim AS builder
COPY --from=ghcr.io/astral-sh/uv:latest /uv /uvx /bin/

# Change the working directory to the `app` directory
WORKDIR /app

# Install dependencies
RUN --mount=type=cache,target=/root/.cache/uv \
    --mount=type=bind,source=uv.lock,target=uv.lock \
    --mount=type=bind,source=pyproject.toml,target=pyproject.toml \
    uv sync --frozen --no-install-project --no-editable

# Copy the project into the intermediate image
ADD . /app

# Sync the project
RUN --mount=type=cache,target=/root/.cache/uv \
    uv sync --frozen --no-editable

FROM python:3.12-slim

# Copy the environment, but not the source code
COPY --from=builder --chown=app:app /app/.venv /app/.venv

# Run the application
CMD ["/app/.venv/bin/hello"]
```
If uv isn't needed in the final image, the binary can be mounted in each invocation:
```dockerfile
RUN --mount=from=ghcr.io/astral-sh/uv,source=/uv,target=/bin/uv \
    uv sync
```
The system Python environment is safe to use in this context, since a container is already isolated. The `--system` flag can be used to install in the system environment:

```dockerfile
RUN uv pip install --system ruff
```

To use the system Python environment by default, set the `UV_SYSTEM_PYTHON` variable:

```dockerfile
ENV UV_SYSTEM_PYTHON=1
```
Alternatively, a virtual environment can be created and activated:

```dockerfile
RUN uv venv /opt/venv
# Use the virtual environment automatically
ENV VIRTUAL_ENV=/opt/venv
# Place entry points in the environment at the front of the path
ENV PATH="/opt/venv/bin:$PATH"
```

When using a virtual environment, the `--system` flag should be omitted from uv invocations:

```dockerfile
RUN uv pip install ruff
```
To install requirements files, copy them into the container:
```dockerfile
COPY requirements.txt .
RUN uv pip install -r requirements.txt
```
When installing a project alongside requirements, it is best practice to separate copying the requirements from the rest of the source code. This allows the dependencies of the project (which do not change often) to be cached separately from the project itself (which changes very frequently).
```dockerfile
COPY pyproject.toml .
RUN uv pip install -r pyproject.toml
COPY . .
RUN uv pip install -e .
```
FastAPI is a modern, high-performance Python web framework. You can use uv to manage your FastAPI project, including installing dependencies, managing environments, running FastAPI applications, and more.
!!! note

    You can view the source code for this guide in the
    [uv-fastapi-example](https://github.com/astral-sh/uv-fastapi-example) repository.
As an example, consider the sample application defined in the FastAPI documentation, structured as follows:
```
project
└── app
    ├── __init__.py
    ├── main.py
    ├── dependencies.py
    ├── routers
    │   ├── __init__.py
    │   ├── items.py
    │   └── users.py
    └── internal
        ├── __init__.py
        └── admin.py
```
To use uv with this application, inside the `project` directory run:

```console
$ uv init --app
```

This creates a project with an application layout and a `pyproject.toml` file.

Then, add a dependency on FastAPI:

```console
$ uv add fastapi --extra standard
```
You should now have the following structure:

```
project
├── pyproject.toml
└── app
    ├── __init__.py
    ├── main.py
    ├── dependencies.py
    ├── routers
    │   ├── __init__.py
    │   ├── items.py
    │   └── users.py
    └── internal
        ├── __init__.py
        └── admin.py
```
And the contents of the `pyproject.toml` file should look something like this:

```toml title="pyproject.toml"
[project]
name = "uv-fastapi-example"
version = "0.1.0"
description = "FastAPI project"
readme = "README.md"
requires-python = ">=3.12"
dependencies = [
    "fastapi[standard]",
]
```
From there, you can run the FastAPI application with:

```console
$ uv run fastapi dev
```

`uv run` will automatically resolve and lock the project dependencies (i.e., create a `uv.lock` alongside the `pyproject.toml`), create a virtual environment, and run the command in that environment.
Test the app by opening http://127.0.0.1:8000/?token=jessica in a web browser.
To deploy the FastAPI application with Docker, you can use the following `Dockerfile`:

```dockerfile title="Dockerfile"
FROM python:3.12-slim

# Install uv.
COPY --from=ghcr.io/astral-sh/uv:latest /uv /uvx /bin/

# Copy the application into the container.
COPY . /app

# Install the application dependencies.
WORKDIR /app
RUN uv sync --frozen --no-cache

# Run the application.
CMD ["/app/.venv/bin/fastapi", "run", "app/main.py", "--port", "80", "--host", "0.0.0.0"]
```
Build the Docker image with:

```console
$ docker build -t fastapi-app .
```

Run the Docker container locally with:

```console
$ docker run -p 8000:80 fastapi-app
```
Navigate to http://127.0.0.1:8000/?token=jessica in your browser to verify that the app is running correctly.
!!! tip

    For more on using uv with Docker, see the [Docker guide](./docker.md).
================================================ File: /docs/guides/integration/alternative-indexes.md
While uv uses the official Python Package Index (PyPI) by default, it also supports alternative package indexes. Most alternative indexes require various forms of authentication, which requires some initial setup.
!!! important

    Please read the documentation on [using multiple indexes](../../pip/compatibility.md#packages-that-exist-on-multiple-indexes)
    in uv — the default behavior is different from pip to prevent dependency confusion attacks, but
    this means that uv may not find the versions of a package as you'd expect.
uv can install packages from Azure DevOps Artifacts. Authenticate to a feed using a Personal Access Token (PAT) or interactively using the `keyring` package.

If there is a PAT available (e.g., `$(System.AccessToken)` in an Azure pipeline), credentials can be provided via the "Basic" HTTP authentication scheme. Include the PAT in the password field of the URL. A username must be included as well, but can be any string.
For example, with the token stored in the `$ADO_PAT` environment variable, set the index URL with:

```console
$ export UV_EXTRA_INDEX_URL=https://dummy:$ADO_PAT@pkgs.dev.azure.com/{organisation}/{project}/_packaging/{feedName}/pypi/simple/
```
If there is not a PAT available, authenticate to Artifacts using the `keyring` package with the `artifacts-keyring` plugin. Because these two packages are required to authenticate to Azure Artifacts, they must be pre-installed from a source other than Artifacts.

The `artifacts-keyring` plugin wraps the Azure Artifacts Credential Provider tool. The credential provider supports a few different authentication modes, including interactive login; see the tool's documentation for information on configuration.

uv only supports using the `keyring` package in subprocess mode. The `keyring` executable must be on the `PATH`, i.e., installed globally or in the active environment. The `keyring` CLI requires a username in the URL, so the index URL must include the default username `VssSessionToken`.
```console
$ # Pre-install keyring and the Artifacts plugin from the public PyPI
$ uv tool install keyring --with artifacts-keyring

$ # Enable keyring authentication
$ export UV_KEYRING_PROVIDER=subprocess

$ # Configure the index URL with the username
$ export UV_EXTRA_INDEX_URL=https://VssSessionToken@pkgs.dev.azure.com/{organisation}/{project}/_packaging/{feedName}/pypi/simple/
```
uv can install packages from Google Artifact Registry. Authenticate to a repository using password authentication or the `keyring` package.

!!! note

    This guide assumes the `gcloud` CLI has previously been installed and set up.

Credentials can be provided via the "Basic" HTTP authentication scheme. Include the access token in the password field of the URL. The username must be `oauth2accesstoken`, otherwise authentication will fail.
For example, with the token stored in the `$ARTIFACT_REGISTRY_TOKEN` environment variable, set the index URL with:

```bash
export ARTIFACT_REGISTRY_TOKEN=$(gcloud auth application-default print-access-token)
export UV_EXTRA_INDEX_URL=https://oauth2accesstoken:$ARTIFACT_REGISTRY_TOKEN@{region}-python.pkg.dev/{projectId}/{repositoryName}/simple
```
You can also authenticate to Artifact Registry using the `keyring` package with the `keyrings.google-artifactregistry-auth` plugin. Because these two packages are required to authenticate to Artifact Registry, they must be pre-installed from a source other than Artifact Registry.

The `keyrings.google-artifactregistry-auth` plugin wraps the gcloud CLI to generate short-lived access tokens, securely store them in the system keyring, and refresh them when they expire.

uv only supports using the `keyring` package in subprocess mode. The `keyring` executable must be on the `PATH`, i.e., installed globally or in the active environment. The `keyring` CLI requires a username in the URL, and it must be `oauth2accesstoken`.
```bash
# Pre-install keyring and the Artifact Registry plugin from the public PyPI
uv tool install keyring --with keyrings.google-artifactregistry-auth

# Enable keyring authentication
export UV_KEYRING_PROVIDER=subprocess

# Configure the index URL with the username
export UV_EXTRA_INDEX_URL=https://oauth2accesstoken@{region}-python.pkg.dev/{projectId}/{repositoryName}/simple
```
uv can install packages from AWS CodeArtifact. The authorization token can be retrieved using the `awscli` tool.

!!! note

    This guide assumes the AWS CLI has previously been authenticated.
First, declare some constants for your CodeArtifact repository:

```bash
export AWS_DOMAIN="<your-domain>"
export AWS_ACCOUNT_ID="<your-account-id>"
export AWS_REGION="<your-region>"
export AWS_CODEARTIFACT_REPOSITORY="<your-repository>"
```

Then, retrieve a token from the `awscli`:

```bash
export AWS_CODEARTIFACT_TOKEN="$(
    aws codeartifact get-authorization-token \
    --domain $AWS_DOMAIN \
    --domain-owner $AWS_ACCOUNT_ID \
    --query authorizationToken \
    --output text
)"
```

And configure the index URL:

```bash
export UV_EXTRA_INDEX_URL="https://aws:${AWS_CODEARTIFACT_TOKEN}@${AWS_DOMAIN}-${AWS_ACCOUNT_ID}.d.codeartifact.${AWS_REGION}.amazonaws.com/pypi/${AWS_CODEARTIFACT_REPOSITORY}/simple/"
```
If you also want to publish your own packages to AWS CodeArtifact, you can use `uv publish` as described in the publishing guide. You will need to set `UV_PUBLISH_URL` separately from the credentials:

```bash
# Configure uv to use AWS CodeArtifact
export UV_PUBLISH_URL="https://${AWS_DOMAIN}-${AWS_ACCOUNT_ID}.d.codeartifact.${AWS_REGION}.amazonaws.com/pypi/${AWS_CODEARTIFACT_REPOSITORY}/"
export UV_PUBLISH_USERNAME=aws
export UV_PUBLISH_PASSWORD="$AWS_CODEARTIFACT_TOKEN"

# Publish the package
uv publish
```
uv is also known to work with JFrog's Artifactory.
An official pre-commit hook is provided at [astral-sh/uv-pre-commit](https://github.com/astral-sh/uv-pre-commit).

To make sure your `uv.lock` file is up to date even if your `pyproject.toml` file was changed via pre-commit, add the following to the `.pre-commit-config.yaml`:

```yaml
- repo: https://github.com/astral-sh/uv-pre-commit
  # uv version.
  rev: 0.5.8
  hooks:
    - id: uv-lock
```
To keep your `requirements.txt` file updated using pre-commit:

```yaml
- repo: https://github.com/astral-sh/uv-pre-commit
  # uv version.
  rev: 0.5.8
  hooks:
    - id: uv-export
```
To compile requirements via pre-commit, add the following to the `.pre-commit-config.yaml`:

```yaml
- repo: https://github.com/astral-sh/uv-pre-commit
  # uv version.
  rev: 0.5.11
  hooks:
    # Compile requirements
    - id: pip-compile
      args: [requirements.in, -o, requirements.txt]
```
To compile alternative files, modify `args` and `files`:

```yaml
- repo: https://github.com/astral-sh/uv-pre-commit
  # uv version.
  rev: 0.5.11
  hooks:
    # Compile requirements
    - id: pip-compile
      args: [requirements-dev.in, -o, requirements-dev.txt]
      files: ^requirements-dev\.(in|txt)$
```
To run the hook over multiple files at the same time:
```yaml
- repo: https://github.com/astral-sh/uv-pre-commit
  # uv version.
  rev: 0.5.11
  hooks:
    # Compile requirements
    - id: pip-compile
      name: pip-compile requirements.in
      args: [requirements.in, -o, requirements.txt]
    - id: pip-compile
      name: pip-compile requirements-dev.in
      args: [requirements-dev.in, -o, requirements-dev.txt]
      files: ^requirements-dev\.(in|txt)$
```
Learn how to integrate uv with other software:
- Using in Docker images
- Using with Jupyter
- Using with pre-commit
- Using in GitHub Actions
- Using in GitLab CI/CD
- Using with alternative package indexes
- Installing PyTorch
- Building a FastAPI application
Or, explore the concept documentation for a comprehensive breakdown of each feature.
The PyTorch ecosystem is a popular choice for deep learning research and development. You can use uv to manage PyTorch projects and PyTorch dependencies across different Python versions and environments, even controlling for the choice of accelerator (e.g., CPU-only vs. CUDA).
!!! note

    Some of the features outlined in this guide require uv version 0.5.3 or later. If you're using an
    older version of uv, we recommend upgrading prior to configuring PyTorch.
From a packaging perspective, PyTorch has a few uncommon characteristics:
- Many PyTorch wheels are hosted on a dedicated index, rather than the Python Package Index (PyPI). As such, installing PyTorch often requires configuring a project to use the PyTorch index.
- PyTorch produces distinct builds for each accelerator (e.g., CPU-only, CUDA). Since there's no
  standardized mechanism for specifying these accelerators when publishing or installing, PyTorch
  encodes them in the local version specifier. As such, PyTorch versions will often look like
  `2.5.1+cpu`, `2.5.1+cu121`, etc.
- Builds for different accelerators are published to different indexes. For example, the `+cpu`
  builds are published on https://download.pytorch.org/whl/cpu, while the `+cu121` builds are
  published on https://download.pytorch.org/whl/cu121.
As such, the necessary packaging configuration will vary depending on both the platforms you need to support and the accelerators you want to enable.
To start, consider the following (default) configuration, which would be generated by running `uv init --python 3.12` followed by `uv add torch torchvision`.
In this case, PyTorch would be installed from PyPI, which hosts CPU-only wheels for Windows and macOS, and GPU-accelerated wheels on Linux (targeting CUDA 12.4):
```toml title="pyproject.toml"
[project]
name = "project"
version = "0.1.0"
requires-python = ">=3.12"
dependencies = [
  "torch>=2.5.1",
  "torchvision>=0.20.1",
]
```

!!! tip "Supported Python versions"

    At time of writing, PyTorch does not yet publish wheels for Python 3.13; as such projects with
    `requires-python = ">=3.13"` may fail to resolve. See the
    [compatibility matrix](https://github.com/pytorch/pytorch/blob/main/RELEASE.md#release-compatibility-matrix).
This is a valid configuration for projects that want to use CPU builds on Windows and macOS, and CUDA-enabled builds on Linux. However, if you need to support different platforms or accelerators, you'll need to configure the project accordingly.
In some cases, you may want to use a specific PyTorch variant across all platforms. For example, you may want to use the CPU-only builds on Linux too.
In such cases, the first step is to add the relevant PyTorch index to your `pyproject.toml`:

=== "CPU-only"

    ```toml
    [[tool.uv.index]]
    name = "pytorch-cpu"
    url = "https://download.pytorch.org/whl/cpu"
    explicit = true
    ```

=== "CUDA 11.8"

    ```toml
    [[tool.uv.index]]
    name = "pytorch-cu118"
    url = "https://download.pytorch.org/whl/cu118"
    explicit = true
    ```

=== "CUDA 12.1"

    ```toml
    [[tool.uv.index]]
    name = "pytorch-cu121"
    url = "https://download.pytorch.org/whl/cu121"
    explicit = true
    ```

=== "CUDA 12.4"

    ```toml
    [[tool.uv.index]]
    name = "pytorch-cu124"
    url = "https://download.pytorch.org/whl/cu124"
    explicit = true
    ```

=== "ROCm6"

    ```toml
    [[tool.uv.index]]
    name = "pytorch-rocm"
    url = "https://download.pytorch.org/whl/rocm6.2"
    explicit = true
    ```
We recommend the use of `explicit = true` to ensure that the index is only used for `torch`, `torchvision`, and other PyTorch-related packages, as opposed to generic dependencies like `jinja2`, which should continue to be sourced from the default index (PyPI).

Next, update the `pyproject.toml` to point `torch` and `torchvision` to the desired index:
=== "CPU-only"

    ```toml
    [tool.uv.sources]
    torch = [
      { index = "pytorch-cpu" },
    ]
    torchvision = [
      { index = "pytorch-cpu" },
    ]
    ```

=== "CUDA 11.8"

    PyTorch doesn't publish CUDA builds for macOS. As such, we gate on `platform_system` to instruct uv to ignore
    the PyTorch index when resolving for macOS.

    ```toml
    [tool.uv.sources]
    torch = [
      { index = "pytorch-cu118", marker = "platform_system != 'Darwin'" },
    ]
    torchvision = [
      { index = "pytorch-cu118", marker = "platform_system != 'Darwin'" },
    ]
    ```

=== "CUDA 12.1"

    PyTorch doesn't publish CUDA builds for macOS. As such, we gate on `platform_system` to instruct uv to ignore
    the PyTorch index when resolving for macOS.

    ```toml
    [tool.uv.sources]
    torch = [
      { index = "pytorch-cu121", marker = "platform_system != 'Darwin'" },
    ]
    torchvision = [
      { index = "pytorch-cu121", marker = "platform_system != 'Darwin'" },
    ]
    ```

=== "CUDA 12.4"

    PyTorch doesn't publish CUDA builds for macOS. As such, we gate on `platform_system` to instruct uv to ignore
    the PyTorch index when resolving for macOS.

    ```toml
    [tool.uv.sources]
    torch = [
      { index = "pytorch-cu124", marker = "platform_system != 'Darwin'" },
    ]
    torchvision = [
      { index = "pytorch-cu124", marker = "platform_system != 'Darwin'" },
    ]
    ```

=== "ROCm6"

    PyTorch doesn't publish ROCm6 builds for macOS or Windows. As such, we gate on `platform_system` to instruct uv to
    ignore the PyTorch index when resolving for those platforms.

    ```toml
    [tool.uv.sources]
    torch = [
      { index = "pytorch-rocm", marker = "platform_system == 'Linux'" },
    ]
    torchvision = [
      { index = "pytorch-rocm", marker = "platform_system == 'Linux'" },
    ]
    ```
As a complete example, the following project would use PyTorch's CPU-only builds on all platforms:
```toml title="pyproject.toml"
[project]
name = "project"
version = "0.1.0"
requires-python = ">=3.12.0"
dependencies = [
  "torch>=2.5.1",
  "torchvision>=0.20.1",
]

[tool.uv.sources]
torch = [
  { index = "pytorch-cpu" },
]
torchvision = [
  { index = "pytorch-cpu" },
]

[[tool.uv.index]]
name = "pytorch-cpu"
url = "https://download.pytorch.org/whl/cpu"
explicit = true
```
In some cases, you may want to use CPU-only builds in one environment (e.g., macOS and Windows), and CUDA-enabled builds in another (e.g., Linux).
With `tool.uv.sources`, you can use environment markers to specify the desired index for each platform. For example, the following configuration would use PyTorch's CPU-only builds on Windows (and macOS, by way of falling back to PyPI), and CUDA-enabled builds on Linux:
```toml title="pyproject.toml"
[project]
name = "project"
version = "0.1.0"
requires-python = ">=3.12.0"
dependencies = [
  "torch>=2.5.1",
  "torchvision>=0.20.1",
]

[tool.uv.sources]
torch = [
  { index = "pytorch-cpu", marker = "platform_system == 'Windows'" },
  { index = "pytorch-cu124", marker = "platform_system == 'Linux'" },
]
torchvision = [
  { index = "pytorch-cpu", marker = "platform_system == 'Windows'" },
  { index = "pytorch-cu124", marker = "platform_system == 'Linux'" },
]

[[tool.uv.index]]
name = "pytorch-cpu"
url = "https://download.pytorch.org/whl/cpu"
explicit = true

[[tool.uv.index]]
name = "pytorch-cu124"
url = "https://download.pytorch.org/whl/cu124"
explicit = true
```
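The `platform_system` marker in these sources corresponds to Python's `platform.system()` value ("Windows", "Linux", or "Darwin"). A rough sketch of how such a marker is evaluated; the `marker_matches` helper is illustrative, not uv's actual implementation:

```python
import platform

def marker_matches(required_system: str) -> bool:
    """Evaluate a marker like `platform_system == '<required_system>'`."""
    return platform.system() == required_system

# On Linux, the cu124 entry's marker is true and the CUDA index is used;
# on macOS, both markers are false, so resolution falls back to PyPI.
print(platform.system())
print("cpu index selected:", marker_matches("Windows"))
print("cu124 index selected:", marker_matches("Linux"))
```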
In some cases, you may want to use CPU-only builds in some environments, but CUDA-enabled builds in others, with the choice toggled by a user-provided extra (e.g., `uv sync --extra cpu` vs. `uv sync --extra cu124`).

With `tool.uv.sources`, you can use extra markers to specify the desired index for each enabled extra. For example, the following configuration would use PyTorch's CPU-only builds for `uv sync --extra cpu` and CUDA-enabled builds for `uv sync --extra cu124`:
```toml title="pyproject.toml"
[project]
name = "project"
version = "0.1.0"
requires-python = ">=3.12.0"
dependencies = []

[project.optional-dependencies]
cpu = [
  "torch>=2.5.1",
  "torchvision>=0.20.1",
]
cu124 = [
  "torch>=2.5.1",
  "torchvision>=0.20.1",
]

[tool.uv]
conflicts = [
  [
    { extra = "cpu" },
    { extra = "cu124" },
  ],
]

[tool.uv.sources]
torch = [
  { index = "pytorch-cpu", extra = "cpu" },
  { index = "pytorch-cu124", extra = "cu124" },
]
torchvision = [
  { index = "pytorch-cpu", extra = "cpu" },
  { index = "pytorch-cu124", extra = "cu124" },
]

[[tool.uv.index]]
name = "pytorch-cpu"
url = "https://download.pytorch.org/whl/cpu"
explicit = true

[[tool.uv.index]]
name = "pytorch-cu124"
url = "https://download.pytorch.org/whl/cu124"
explicit = true
```
!!! note

    Since GPU-accelerated builds aren't available on macOS, the above configuration will fail to install
    on macOS when the `cu124` extra is enabled.
While the above examples are focused on uv's project interface (`uv lock`, `uv sync`, `uv run`, etc.), PyTorch can also be installed via the `uv pip` interface.

PyTorch itself offers a dedicated interface to determine the appropriate pip command to run for a given target configuration. For example, you can install stable, CPU-only PyTorch on Linux with:

```console
$ pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cpu
```

To use the same workflow with uv, replace `pip3` with `uv pip`:

```console
$ uv pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cpu
```
Astral provides Docker images with uv preinstalled. Select a variant that is suitable for your workflow.
```yaml
variables:
  UV_VERSION: 0.5
  PYTHON_VERSION: 3.12
  BASE_LAYER: bookworm-slim

stages:
  - analysis

uv:
  stage: analysis
  image: ghcr.io/astral-sh/uv:$UV_VERSION-python$PYTHON_VERSION-$BASE_LAYER
  script:
    # your `uv` commands
```
!!! note

    If you are using a distroless image, you have to specify the entrypoint:

    ```yaml
    uv:
      image:
        name: ghcr.io/astral-sh/uv:$UV_VERSION
        entrypoint: [""]
      # ...
    ```
Persisting the uv cache between workflow runs can improve performance.
```yaml
uv-install:
  variables:
    UV_CACHE_DIR: .uv-cache
  cache:
    - key:
        files:
          - uv.lock
      paths:
        - $UV_CACHE_DIR
  script:
    # Your `uv` commands
    - uv cache prune --ci
```
See the GitLab caching documentation for more details on configuring caching.
Using `uv cache prune --ci` at the end of the job is recommended to reduce cache size. See the uv cache documentation for more details.
If using the `uv pip` interface instead of the uv project interface, uv requires a virtual environment by default. To allow installing packages into the system environment, use the `--system` flag on all uv invocations or set the `UV_SYSTEM_PYTHON` variable.

The `UV_SYSTEM_PYTHON` variable can be defined at different scopes. You can read more about how variables and their precedence work in the GitLab documentation.
Opt in for the entire workflow by defining it at the top level:

```yaml
variables:
  UV_SYSTEM_PYTHON: 1
# [...]
```

To opt out again, the `--no-system` flag can be used in any uv invocation.
When persisting the cache, you may want to use `requirements.txt` or `pyproject.toml` as your cache key files instead of `uv.lock`.
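For example, mirroring the cache configuration above but keyed on `requirements.txt` (a sketch; adapt the paths to your layout):

```yaml
uv-install:
  variables:
    UV_CACHE_DIR: .uv-cache
  cache:
    - key:
        files:
          - requirements.txt
      paths:
        - $UV_CACHE_DIR
```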
The Jupyter notebook is a popular tool for interactive computing, data analysis, and visualization. You can use Jupyter with uv in a few different ways, either to interact with a project, or as a standalone tool.
If you're working within a project, you can start a Jupyter server with access to the project's virtual environment via the following:

```console
$ uv run --with jupyter jupyter lab
```

By default, `jupyter lab` will start the server at http://localhost:8888/lab.
Within a notebook, you can import your project's modules as you would in any other file in the project. For example, if your project depends on `requests`, `import requests` will import `requests` from the project's virtual environment.
If you're looking for read-only access to the project's virtual environment, then there's nothing more to it. However, if you need to install additional packages from within the notebook, there are a few extra details to consider.
If you need to install packages from within the notebook, we recommend creating a dedicated kernel for your project. Kernels enable the Jupyter server to run in one environment, with individual notebooks running in their own, separate environments.
In the context of uv, we can create a kernel for a project while installing Jupyter itself in an isolated environment, as in `uv run --with jupyter jupyter lab`. Creating a kernel for the project ensures that the notebook is hooked up to the correct environment, and that any packages installed from within the notebook are installed into the project's virtual environment.
To create a kernel, you'll need to install `ipykernel` as a development dependency:

```console
$ uv add --dev ipykernel
```

Then, you can create the kernel for `project` with:

```console
$ uv run ipython kernel install --user --name=project
```

From there, start the server with:

```console
$ uv run --with jupyter jupyter lab
```
When creating a notebook, select the `project` kernel from the dropdown. Then use `!uv add pydantic` to add `pydantic` to the project's dependencies, or `!uv pip install pydantic` to install `pydantic` into the project's virtual environment without persisting the change to the project `pyproject.toml` or `uv.lock` files. Either command will make `import pydantic` work within the notebook.
If you don't want to create a kernel, you can still install packages from within the notebook. However, there are a few caveats to consider.
Though `uv run --with jupyter` runs in an isolated environment, within the notebook itself, `!uv add` and related commands will modify the project's environment, even without a kernel.

For example, running `!uv add pydantic` from within a notebook will add `pydantic` to the project's dependencies and virtual environment, such that `import pydantic` will work immediately, without further configuration or a server restart.
However, since the Jupyter server is the "active" environment, `!uv pip install` will install packages into Jupyter's environment, not the project environment. Such dependencies will persist for the lifetime of the Jupyter server, but may disappear on subsequent `jupyter` invocations.
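If you're unsure which environment a given notebook cell would install into, you can inspect the interpreter from Python itself; a minimal stdlib check:

```python
import sys

# The interpreter backing this code; in a notebook, this is the
# kernel's interpreter, which determines the "active" environment.
print(sys.executable)

# Inside a virtual environment, `sys.prefix` points at the environment
# while `sys.base_prefix` points at the base installation.
in_venv = sys.prefix != sys.base_prefix
print("virtual environment:", in_venv)
```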
If you're working with a notebook that relies on pip (e.g., via the `%pip` magic), you can include pip in your project's virtual environment by running `uv venv --seed` prior to starting the Jupyter server. For example, given:

```console
$ uv venv --seed
$ uv run --with jupyter jupyter lab
```

Subsequent `%pip install` invocations within the notebook will install packages into the project's virtual environment. However, such modifications will not be reflected in the project's `pyproject.toml` or `uv.lock` files.
If you ever need ad hoc access to a notebook (i.e., to run a Python snippet interactively), you can start a Jupyter server at any time with `uv tool run jupyter lab`. This will run a Jupyter server in an isolated environment.

If you need to run Jupyter in a virtual environment that isn't associated with a project (e.g., has no `pyproject.toml` or `uv.lock`), you can do so by adding Jupyter to the environment directly. For example:

```console
$ uv venv --seed
$ uv pip install pydantic
$ uv pip install jupyterlab
$ .venv/bin/jupyter lab
```

From here, `import pydantic` will work within the notebook, and you can install additional packages via `!uv pip install`, or even `!pip install`.
You can also engage with Jupyter notebooks from within an editor like VS Code. To connect a uv-managed project to a Jupyter notebook within VS Code, we recommend creating a kernel for the project, as in the following:
```console
# Create a project.
$ uv init project

# Move into the project directory.
$ cd project

# Add ipykernel as a dev dependency.
$ uv add --dev ipykernel

# Open the project in VS Code.
$ code .
```
Once the project directory is open in VS Code, you can create a new Jupyter notebook by selecting "Create: New Jupyter Notebook" from the command palette. When prompted to select a kernel, choose "Python Environments" and select the virtual environment you created earlier (e.g., `.venv/bin/python`).

!!! note

    VS Code requires `ipykernel` to be present in the project environment. If you'd prefer to avoid
    adding `ipykernel` as a dev dependency, you can install it directly into the project environment
    with `uv pip install ipykernel`.
If you need to manipulate the project's environment from within the notebook, you may need to add uv as an explicit development dependency:

```console
$ uv add --dev uv
```

From there, you can use `!uv add pydantic` to add `pydantic` to the project's dependencies, or `!uv pip install pydantic` to install `pydantic` into the project's virtual environment without updating the project's `pyproject.toml` or `uv.lock` files.
For use with GitHub Actions, we recommend the official `astral-sh/setup-uv` action, which installs uv, adds it to `PATH`, (optionally) persists the cache, and more, with support for all uv-supported platforms.
To install the latest version of uv:

```yaml
name: Example

jobs:
  uv-example:
    name: python
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4

      - name: Install uv
        uses: astral-sh/setup-uv@v5
```
It is considered best practice to pin to a specific uv version, e.g., with:

```yaml
name: Example

jobs:
  uv-example:
    name: python
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4

      - name: Install uv
        uses: astral-sh/setup-uv@v5
        with:
          # Install a specific version of uv.
          version: "0.5.11"
```
Python can be installed with the `uv python install` command:

```yaml
name: Example

jobs:
  uv-example:
    name: python
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4

      - name: Install uv
        uses: astral-sh/setup-uv@v5

      - name: Set up Python
        run: uv python install
```
This will respect the Python version pinned in the project.
Alternatively, the official GitHub `setup-python` action can be used. This can be faster, because GitHub caches the Python versions alongside the runner.

Set the `python-version-file` option to use the pinned version for the project:
```yaml
name: Example

jobs:
  uv-example:
    name: python
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4

      - name: Install uv
        uses: astral-sh/setup-uv@v5

      - name: "Set up Python"
        uses: actions/setup-python@v5
        with:
          python-version-file: ".python-version"
```
Or, specify the `pyproject.toml` file to ignore the pin and use the latest version compatible with the project's `requires-python` constraint:
```yaml
name: Example

jobs:
  uv-example:
    name: python
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4

      - name: Install uv
        uses: astral-sh/setup-uv@v5

      - name: "Set up Python"
        uses: actions/setup-python@v5
        with:
          python-version-file: "pyproject.toml"
```
When using a matrix to test multiple Python versions, set the Python version using `astral-sh/setup-uv`, which will override the Python version specification in the `pyproject.toml` or `.python-version` files:
```yaml
jobs:
  build:
    name: continuous-integration
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version:
          - "3.10"
          - "3.11"
          - "3.12"
    steps:
      - uses: actions/checkout@v4

      - name: Install uv and set the python version
        uses: astral-sh/setup-uv@v5
        with:
          python-version: ${{ matrix.python-version }}
```
If not using the `setup-uv` action, you can set the `UV_PYTHON` environment variable:
```yaml
jobs:
  build:
    name: continuous-integration
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version:
          - "3.10"
          - "3.11"
          - "3.12"
    env:
      UV_PYTHON: ${{ matrix.python-version }}
    steps:
      - uses: actions/checkout@v4
```
Once uv and Python are installed, the project can be installed with `uv sync` and commands can be run in the environment with `uv run`:
```yaml
name: Example

jobs:
  uv-example:
    name: python
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4

      - name: Install uv
        uses: astral-sh/setup-uv@v5

      - name: Set up Python
        run: uv python install

      - name: Install the project
        run: uv sync --all-extras --dev

      - name: Run tests
        # For example, using `pytest`
        run: uv run pytest tests
```
!!! tip

    The
    [`UV_PROJECT_ENVIRONMENT` setting](../../concepts/projects/config.md#project-environment-path) can
    be used to install to the system Python environment instead of creating a virtual environment.
It may improve CI times to store uv's cache across workflow runs.

The `astral-sh/setup-uv` action has built-in support for persisting the cache:
```yaml
- name: Enable caching
  uses: astral-sh/setup-uv@v5
  with:
    enable-cache: true
```
You can configure the action to use a custom cache directory on the runner:
```yaml
- name: Define a custom uv cache path
  uses: astral-sh/setup-uv@v5
  with:
    enable-cache: true
    cache-local-path: "/path/to/cache"
```
Or invalidate it when the lockfile changes:
```yaml
- name: Define a cache dependency glob
  uses: astral-sh/setup-uv@v5
  with:
    enable-cache: true
    cache-dependency-glob: "uv.lock"
```
Or when any requirements file changes:
```yaml
- name: Define a cache dependency glob
  uses: astral-sh/setup-uv@v5
  with:
    enable-cache: true
    cache-dependency-glob: "requirements**.txt"
```
Note that `astral-sh/setup-uv` will automatically use a separate cache key for each host architecture and platform.
Alternatively, you can manage the cache manually with the `actions/cache` action:
```yaml
jobs:
  install_job:
    env:
      # Configure a constant location for the uv cache
      UV_CACHE_DIR: /tmp/.uv-cache

    steps:
      # ... set up Python and uv ...

      - name: Restore uv cache
        uses: actions/cache@v4
        with:
          path: /tmp/.uv-cache
          key: uv-${{ runner.os }}-${{ hashFiles('uv.lock') }}
          restore-keys: |
            uv-${{ runner.os }}-${{ hashFiles('uv.lock') }}
            uv-${{ runner.os }}

      # ... install packages, run tests, etc ...

      - name: Minimize uv cache
        run: uv cache prune --ci
```
The `uv cache prune --ci` command is used to reduce the size of the cache and is optimized for CI. Its effect on performance is dependent on the packages being installed.
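The `hashFiles('uv.lock')` expression above keys the cache to the lockfile's content, so changing any locked dependency produces a new key and a fresh cache. Conceptually, building such a key looks like the following sketch (GitHub's actual `hashFiles` combines per-file SHA-256 hashes; this simplified version hashes a single file):

```python
import hashlib
from pathlib import Path


def cache_key(runner_os: str, lockfile: str) -> str:
    """Build a key like `uv-Linux-<sha256 of uv.lock>` (illustrative only)."""
    digest = hashlib.sha256(Path(lockfile).read_bytes()).hexdigest()
    return f"uv-{runner_os}-{digest}"
```

Because the digest changes whenever the lockfile changes, a stale cache is never restored under an exact-match key; the `restore-keys` prefixes then allow falling back to the most recent partial match.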
!!! tip

    If using `uv pip`, use `requirements.txt` instead of `uv.lock` in the cache key.
!!! note

    [post-job-hook]: https://docs.github.com/en/actions/hosting-your-own-runners/managing-self-hosted-runners/running-scripts-before-or-after-a-job

    When using non-ephemeral, self-hosted runners the default cache directory can grow unbounded.
    In this case, it may not be optimal to share the cache between jobs. Instead, move the cache
    inside the GitHub Workspace and remove it once the job finishes using a
    [Post Job Hook][post-job-hook].

    ```yaml
    install_job:
      env:
        # Configure a relative location for the uv cache
        UV_CACHE_DIR: ${{ github.workspace }}/.cache/uv
    ```

    Using a post job hook requires setting the `ACTIONS_RUNNER_HOOK_JOB_STARTED` environment
    variable on the self-hosted runner to the path of a cleanup script such as the one shown below.

    ```sh title="clean-uv-cache.sh"
    #!/usr/bin/env sh
    uv cache clean
    ```
If using the `uv pip` interface instead of the uv project interface, uv requires a virtual environment by default. To allow installing packages into the system environment, use the `--system` flag on all `uv` invocations or set the `UV_SYSTEM_PYTHON` variable.

The `UV_SYSTEM_PYTHON` variable can be defined at different scopes.
Opt-in for the entire workflow by defining it at the top level:

```yaml
env:
  UV_SYSTEM_PYTHON: 1

jobs: ...
```
Or, opt-in for a specific job in the workflow:

```yaml
jobs:
  install_job:
    env:
      UV_SYSTEM_PYTHON: 1
    ...
```
Or, opt-in for a specific step in a job:

```yaml
steps:
  - name: Install requirements
    run: uv pip install -r requirements.txt
    env:
      UV_SYSTEM_PYTHON: 1
```
To opt-out again, the `--no-system` flag can be used in any uv invocation.
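Environment-variable flags like `UV_SYSTEM_PYTHON` act as simple on/off switches. A minimal sketch of how such a flag is typically read (an illustration of the pattern, not uv's actual implementation; the accepted truthy strings are an assumption):

```python
import os


def env_flag(name: str) -> bool:
    """Return True if the named environment variable looks truthy."""
    value = os.environ.get(name, "").strip().lower()
    return value in {"1", "true", "yes", "on"}
```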
If Python is already installed on your system, uv will detect and use it without configuration. However, uv can also install and manage Python versions for you.
!!! tip

    uv will [automatically fetch Python versions](#automatic-python-downloads) as needed — you don't need to install Python to get started.
To install the latest Python version:

```console
$ uv python install
```
This will install a uv-managed Python version even if there is already a Python installation on your system. If you've previously installed Python with uv, a new version will not be installed.
!!! note

    Python does not publish official distributable binaries. As such, uv uses distributions from the Astral [`python-build-standalone`](https://github.com/astral-sh/python-build-standalone) project. See the [Python distributions](../concepts/python-versions.md#managed-python-distributions) documentation for more details.
Once Python is installed, it will be used by `uv` commands automatically.
!!! important

    When Python is installed by uv, it will not be available globally (i.e. via the `python` command).
    Support for this feature is in _preview_. See [Installing Python executables](../concepts/python-versions.md#installing-python-executables)
    for details.

    You can still use
    [`uv run`](../guides/scripts.md#using-different-python-versions) or
    [create and activate a virtual environment](../pip/environments.md) to use `python` directly.
To install a specific Python version:

```console
$ uv python install 3.12
```

To install multiple Python versions:

```console
$ uv python install 3.11 3.12
```
To install an alternative Python implementation, e.g., PyPy:

```console
$ uv python install pypy@3.10
```
See the `python install` documentation for more details.

To view available and installed Python versions:

```console
$ uv python list
```

See the `python list` documentation for more details.
Note that Python does not need to be explicitly installed to use uv. By default, uv will automatically download Python versions when they are required. For example, the following would download Python 3.12 if it was not installed:
```console
$ uv run --python 3.12 python -c "print('hello world')"
```
Even if a specific Python version is not requested, uv will download the latest version on demand. For example, the following will create a new virtual environment and download a managed Python version if Python is not found:
```console
$ uv venv
```
!!! tip

    Automatic Python downloads can be [easily disabled](../concepts/python-versions.md#disabling-automatic-python-downloads) if you want more control over when Python is downloaded.
uv will use existing Python installations if present on your system. There is no configuration necessary for this behavior: uv will use the system Python if it satisfies the requirements of the command invocation. See the Python discovery documentation for details.

To force uv to use the system Python, provide the `--python-preference only-system` option. See the Python version preference documentation for more details.
To learn more about `uv python`, see the Python version concept page and the command reference.

Or, read on to learn how to run scripts and invoke Python with uv.
The `--help` flag can be used to view the help menu for a command, e.g., for `uv`:

```console
$ uv --help
```

To view the help menu for a specific command, e.g., for `uv init`:

```console
$ uv init --help
```

When using the `--help` flag, uv displays a condensed help menu. To view a longer help menu for a command, use `uv help`:

```console
$ uv help
```

To view the long help menu for a specific command, e.g., for `uv init`:

```console
$ uv help init
```

When using the long help menu, uv will attempt to use `less` or `more` to "page" the output so it is not all displayed at once. To exit the pager, press `q`.
When seeking help, it's important to determine the version of uv that you're using; sometimes the problem is already solved in a newer version.

To check the installed version:

```console
$ uv version
```

The following are also valid:

```console
$ uv --version      # Same output as `uv version`
$ uv -V             # Will not include the build commit and date
$ uv pip --version  # Can be used with a subcommand
```
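In CI scripts it can be useful to assert a minimum uv version. The first token after `uv` in the `uv version` output is the version number, which can be parsed as in this sketch (the exact output format, including the build commit and date, may vary between releases):

```python
def parse_uv_version(output: str) -> tuple[int, int, int]:
    """Parse output like `uv 0.5.11 (abc123 2024-12-20)` into a version tuple."""
    version = output.split()[1]
    major, minor, patch = version.split(".")[:3]
    return (int(major), int(minor), int(patch))
```

Tuples compare lexicographically, so `parse_uv_version(out) >= (0, 5, 0)` expresses a minimum-version check.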
The issue tracker on GitHub is a good place to report bugs and request features. Make sure to search for similar issues first, as it is common for someone else to encounter the same problem.
Astral has a Discord server, which is a great place to ask questions, learn more about uv, and engage with other community members.
After installing uv, you can check that uv is available by running the `uv` command:

```console
$ uv
An extremely fast Python package manager.

Usage: uv [OPTIONS] <COMMAND>

...
```
You should see a help menu listing the available commands.
Now that you've confirmed uv is installed, check out an overview of features, learn how to get help if you run into any problems, or jump to the guides to start using uv.
uv provides essential features for Python development — from installing Python and hacking on simple scripts to working on large projects that support multiple Python versions and platforms.
uv's interface can be broken down into sections, which can be used independently or together.
Installing and managing Python itself.
- `uv python install`: Install Python versions.
- `uv python list`: View available Python versions.
- `uv python find`: Find an installed Python version.
- `uv python pin`: Pin the current project to use a specific Python version.
- `uv python uninstall`: Uninstall a Python version.
See the guide on installing Python to get started.
Executing standalone Python scripts, e.g., `example.py`.

- `uv run`: Run a script.
- `uv add --script`: Add a dependency to a script.
- `uv remove --script`: Remove a dependency from a script.
See the guide on running scripts to get started.
Creating and working on Python projects, i.e., with a `pyproject.toml`.

- `uv init`: Create a new Python project.
- `uv add`: Add a dependency to the project.
- `uv remove`: Remove a dependency from the project.
- `uv sync`: Sync the project's dependencies with the environment.
- `uv lock`: Create a lockfile for the project's dependencies.
- `uv run`: Run a command in the project environment.
- `uv tree`: View the dependency tree for the project.
- `uv build`: Build the project into distribution archives.
- `uv publish`: Publish the project to a package index.
See the guide on projects to get started.
Running and installing tools published to Python package indexes, e.g., `ruff` or `black`.

- `uvx` / `uv tool run`: Run a tool in a temporary environment.
- `uv tool install`: Install a tool user-wide.
- `uv tool uninstall`: Uninstall a tool.
- `uv tool list`: List installed tools.
- `uv tool update-shell`: Update the shell to include tool executables.
See the guide on tools to get started.
Manually managing environments and packages, intended to be used in legacy workflows or cases where the high-level commands do not provide enough control.

Creating virtual environments (replacing `venv` and `virtualenv`):

- `uv venv`: Create a new virtual environment.
See the documentation on using environments for details.
Managing packages in an environment (replacing `pip` and `pipdeptree`):

- `uv pip install`: Install packages into the current environment.
- `uv pip show`: Show details about an installed package.
- `uv pip freeze`: List installed packages and their versions.
- `uv pip check`: Check that the current environment has compatible packages.
- `uv pip list`: List installed packages.
- `uv pip uninstall`: Uninstall packages.
- `uv pip tree`: View the dependency tree for the environment.
See the documentation on managing packages for details.
Locking packages in an environment (replacing `pip-tools`):

- `uv pip compile`: Compile requirements into a lockfile.
- `uv pip sync`: Sync an environment with a lockfile.
See the documentation on locking environments for details.
!!! important

    These commands do not exactly implement the interfaces and behavior of the tools they are based on. The further you stray from common workflows, the more likely you are to encounter differences. Consult the [pip-compatibility guide](../pip/compatibility.md) for details.
Managing and inspecting uv's state, such as the cache, storage directories, or performing a self-update:

- `uv cache clean`: Remove cache entries.
- `uv cache prune`: Remove outdated cache entries.
- `uv cache dir`: Show the uv cache directory path.
- `uv tool dir`: Show the uv tool directory path.
- `uv python dir`: Show the uv installed Python versions path.
- `uv self update`: Update uv to the latest version.
Read the guides for an introduction to each feature, check out concept pages for in-depth details about uv's features, or learn how to get help if you run into any problems.
To help you get started with uv, we'll cover a few important topics:
Read on, or jump ahead to another section:
- Get going quickly with guides for common workflows.
- Learn more about the core concepts in uv.
- Use the reference documentation to find details about something specific.
Install uv with our standalone installers or your package manager of choice.
uv provides a standalone installer to download and install uv:
=== "macOS and Linux"

    Use `curl` to download the script and execute it with `sh`:

    ```console
    $ curl -LsSf https://astral.sh/uv/install.sh | sh
    ```

    If your system doesn't have `curl`, you can use `wget`:

    ```console
    $ wget -qO- https://astral.sh/uv/install.sh | sh
    ```

    Request a specific version by including it in the URL:

    ```console
    $ curl -LsSf https://astral.sh/uv/0.5.11/install.sh | sh
    ```

=== "Windows"

    Use `irm` to download the script and execute it with `iex`:

    ```console
    $ powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
    ```

    Changing the [execution policy](https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.core/about/about_execution_policies?view=powershell-7.4#powershell-execution-policies) allows running a script from the internet.

    Request a specific version by including it in the URL:

    ```console
    $ powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/0.5.11/install.ps1 | iex"
    ```
!!! tip

    The installation script may be inspected before use:

    === "macOS and Linux"

        ```console
        $ curl -LsSf https://astral.sh/uv/install.sh | less
        ```

    === "Windows"

        ```console
        $ powershell -c "irm https://astral.sh/uv/install.ps1 | more"
        ```
Alternatively, the installer or binaries can be downloaded directly from [GitHub](#github-releases).
See the documentation on installer configuration for details on customizing your uv installation.
For convenience, uv is published to PyPI.

If installing from PyPI, we recommend installing uv into an isolated environment, e.g., with `pipx`:

```console
$ pipx install uv
```

However, `pip` can also be used:

```console
$ pip install uv
```
!!! note

    uv ships with prebuilt distributions (wheels) for many platforms; if a wheel is not available for a given
    platform, uv will be built from source, which requires a Rust toolchain. See the
    [contributing setup guide](https://github.com/astral-sh/uv/blob/main/CONTRIBUTING.md#setup)
    for details on building uv from source.
uv is available via Cargo, but must be built from Git rather than crates.io due to its dependency on unpublished crates.

```console
$ cargo install --git https://github.com/astral-sh/uv uv
```

uv is available in the core Homebrew packages.

```console
$ brew install uv
```

uv is available via winget.

```console
$ winget install --id=astral-sh.uv -e
```

uv provides a Docker image at `ghcr.io/astral-sh/uv`.

See our guide on using uv in Docker for more details.
uv release artifacts can be downloaded directly from GitHub Releases.

Each release page includes binaries for all supported platforms as well as instructions for using the standalone installer via `github.com` instead of `astral.sh`.

When uv is installed via the standalone installer, it can update itself on-demand:

```console
$ uv self update
```
!!! tip

    Updating uv will re-run the installer and can modify your shell profiles. To disable this
    behavior, set `INSTALLER_NO_MODIFY_PATH=1`.
When another installation method is used, self-updates are disabled. Use the package manager's upgrade method instead. For example, with `pip`:

```console
$ pip install --upgrade uv
```
To enable shell autocompletion for uv commands, run one of the following:
=== "Linux and macOS"

    ```bash
    # Determine your shell (e.g., with `echo $SHELL`), then run one of:
    echo 'eval "$(uv generate-shell-completion bash)"' >> ~/.bashrc
    echo 'eval "$(uv generate-shell-completion zsh)"' >> ~/.zshrc
    echo 'uv generate-shell-completion fish | source' >> ~/.config/fish/config.fish
    echo 'eval (uv generate-shell-completion elvish | slurp)' >> ~/.elvish/rc.elv
    ```

=== "Windows"

    ```powershell
    Add-Content -Path $PROFILE -Value '(& uv generate-shell-completion powershell) | Out-String | Invoke-Expression'
    ```
To enable shell autocompletion for uvx, run one of the following:
=== "Linux and macOS"

    ```bash
    # Determine your shell (e.g., with `echo $SHELL`), then run one of:
    echo 'eval "$(uvx --generate-shell-completion bash)"' >> ~/.bashrc
    echo 'eval "$(uvx --generate-shell-completion zsh)"' >> ~/.zshrc
    echo 'uvx --generate-shell-completion fish | source' >> ~/.config/fish/config.fish
    echo 'eval (uvx --generate-shell-completion elvish | slurp)' >> ~/.elvish/rc.elv
    ```

=== "Windows"

    ```powershell
    Add-Content -Path $PROFILE -Value '(& uvx --generate-shell-completion powershell) | Out-String | Invoke-Expression'
    ```
Then restart the shell or source the shell config file.
If you need to remove uv from your system, follow these steps:

1. Clean up stored data (optional):

    ```console
    $ uv cache clean
    $ rm -r "$(uv python dir)"
    $ rm -r "$(uv tool dir)"
    ```

    !!! tip

        Before removing the binaries, you may want to remove any data that uv has stored.

2. Remove the uv and uvx binaries:

    === "macOS and Linux"

        ```console
        $ rm ~/.local/bin/uv ~/.local/bin/uvx
        ```

    === "Windows"

        ```powershell
        $ rm $HOME\.local\bin\uv.exe
        $ rm $HOME\.local\bin\uvx.exe
        ```
!!! note

    Prior to 0.5.0, uv was installed into `~/.cargo/bin`. The binaries can be removed from there to uninstall. Upgrading from an older version will not automatically remove the binaries from `~/.cargo/bin`.
See the first steps or jump straight to the guides to start using uv.