25 Comments

Enjoyed the article; it helps confirm my choice of uv over poetry for our greenfield project. Speaking of uv as a project management tool, you might be interested in the issues that I recently filed: https://github.com/astral-sh/uv/issues?q=is%3Aissue%20state%3Aopen%20author%3Amatthewadams

We're in a polylingual dev environment (kotlin, java, javascript, typescript, python, and likely more coming) employing a git monorepo, and, similar to your assertion about Python coders not knowing the command line (with which I agree), we've noticed that some data sciencey folks aren't familiar with git, git branching strategies, version control principles & semver, the software development lifecycle, build tools (maven/gradle, make, grunt/gulp, etc), dependency injection and inversion of control, automated testing, issue tracking systems and how they affect how you incrementally add features or fix bugs, monorepos/polyrepos, etc. Basically, they're mad scientists, off working in their secret, isolated laboratory on ad-hoc tasks, and haven't participated in releases & everything that goes along with them.

uv could step in here to really help these types of folks (and me) out.


You will need to describe more precisely what you need in the ticket, because there are many ways to organize in and around a monorepo. That way the team may be able to generalize features that can help in different setups.


Great overview, thanks.

I just reviewed uv for my team and there is one more reason against it, which isn't negligible for production-grade projects: GitHub Dependabot doesn't handle the uv lock file yet. Supply chain management and vulnerability detection are so important that this prevents the use of uv until it sees more adoption.


As a workaround, maybe you can `uv export` to requirements.txt and point Dependabot to it.
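
Something like this, roughly (a sketch; the directory and schedule in the Dependabot config are just placeholders):

```bash
# Re-export the uv lock file as a requirements.txt (e.g. in CI or a pre-commit hook)
uv export --format requirements-txt -o requirements.txt
```

```yaml
# .github/dependabot.yml -- point the pip ecosystem at the exported file
version: 2
updates:
  - package-ecosystem: "pip"
    directory: "/"
    schedule:
      interval: "weekly"
```

You'd still have to commit the exported file and keep it in sync with the lock file, though.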


That's right, but you lose some of the benefits of uv. In any case, GitHub is planning to address this in Q1 2025: https://github.com/dependabot/dependabot-core/issues/10478#issuecomment-2578570442


There's a pretty good list of reasons not to switch pdm to using uv as the backend: https://pdm-project.org/latest/usage/uv/#limitations

I have made the switch, but I could understand why others haven't.

I switched to pdm to avoid most of those same complaints you raise. What were the pitfalls? You mentioned pdm once in the article; could you say more specifically?


The article https://www.bitecode.dev/p/why-not-tell-people-to-simply-use is not specific to pdm, but the arguments hold for it as well.


Goodness. Those are the kinds of reasons I was evangelizing switching /to/ pdm from the tools mentioned in TFA. I recently did move to uv as my pdm backend, but I didn't see a big QoL improvement. I'm curious, did you spend much time with pdm, or did you mostly make the jump from the aforementioned tools straight over to uv?


Fabulous piece, thanks. I've been wanting to try it.

Got a wee typo (double negative):

"Number 2 is not something you can't do much about, so the point is moot."


Thanks


> Finally, uvx (and so uv tool install) suffers from a similar problem then pipx, in that it encourages you to install some tools outside of your project. This makes sense for things like yt-dlp or httpie which are self-contained independent tools. But it's a trap for dev tools that care about syntax or libs, like mypy that will be installed in a certain Python version, but then used on a project with another potentially incompatible Python version. They will break spectacularly and many users won't understand why.

I think it's specifically meant for tools like `yt-dlp` and not for tools like ruff or type checkers. Add those directly to your project, and you can run them from inside the venv, or using `uv run`.
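
For example (a sketch; mypy and ruff are just the usual suspects):

```bash
# Pin the dev tools to the project itself instead of a global uvx install
uv add --dev mypy ruff

# They now run against the project's own Python version and dependencies
uv run mypy .
uv run ruff check .
```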


Indeed, but most people probably don't realize it.


Is there a conda to uv migration tutorial written by anyone?

I have installed Miniconda system-wide. For any Python package that I use a lot, like ipython, I install it in the base environment and in other environments.

For every new project, I create a conda environment, and install everything in it. Upon finishing/writing my patch, I remove that environment and clean the caches. For my own projects, I create an environment.yaml and move on.

Everything works just fine. These days, dependency solving with mamba is fast. I can just hand someone the code and environment.yaml, and it runs on other platforms.

Can someone say why using uv is a good idea? Has anyone written a migration guide for such use cases?

I am mightily impressed by the one-line dependency declaration in a file. But I don't know (yet) where the caches are stored, how to get rid of them later, etc.


I don't know of any such guide, and I suspect one reason is that the typical Anaconda setup doesn't exist. Anaconda has 4 different package managers: conda, miniconda, mamba and anaconda-project. It has several possible configuration files as well, and you can't know what combination of non-Python tools, conda channels or Anaconda Cloud features the user is dependent on.

This makes writing such a guide very challenging.
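
That said, two of the specific questions above have short answers: `uv cache dir` shows where the cache lives and `uv cache clean` wipes it, and the one-line declaration presumably refers to PEP 723 inline script metadata, which uv can run directly. A minimal sketch, with httpx as a placeholder dependency:

```python
# demo.py -- dependencies declared inline (PEP 723), run with: uv run demo.py
# /// script
# requires-python = ">=3.12"
# dependencies = ["httpx"]
# ///
import httpx

# uv sets up a throwaway environment with httpx before running the script
print(httpx.get("https://example.org").status_code)
```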


I'm guessing calling this a "project management tool" is likely to confuse readers. https://en.wikipedia.org/wiki/Project_management I know they do this on the uv site, but it's "package management" really.


uv does more than packaging, since it's doing all the provisioning, bookkeeping, isolating, command running and abstracting parts as well.

You can, of course, consider that all of that is part of package management in a way, but if you say "a Python package manager", uv is not what people will picture either.

Yet I agree "project management" is not ideal either.


Great article. You may want to run it through a spell check, though:

- developping

- maintening

- signaling

- independant

- contraints


Thanks a bunch.


Another blocker in my current gig is native JetBrains support. It hasn't come out yet in PyCharm. I don't use JetBrains stuff anymore, but I have team members who do.


PyCharm now supports uv: https://www.jetbrains.com/help/pycharm/uv.html

Crazy how fast things are moving.


If you use Pixi by prefix.dev you can have conda and PyPI too. It uses uv under the hood! (same lockfile)
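
Roughly (a sketch; the package names are just examples):

```bash
pixi init myproject && cd myproject
pixi add python=3.12 qt   # conda-forge packages
pixi add --pypi httpx     # PyPI packages, resolved by uv under the hood
```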


Mixing conda channels and PyPI is a path to pain.


In my work, there is one other "edge case" that has kept me with Conda (for a few specific projects): the Python project has a non-Python dependency, where 1) I don't want to install that non-Python dependency system-wide, and 2) there isn't some Python library that ships a platform-specific wheel with the non-Python dependency bundled in it.

In some of these cases, there is an installer for the library on the conda-forge channel (e.g. qt).
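
A conda environment file for such a project looks roughly like this (a sketch; the package and version are just examples):

```yaml
# environment.yaml -- the non-Python dependency comes from conda-forge
# instead of a system-wide install or a bundled wheel
name: myproject
channels:
  - conda-forge
dependencies:
  - python=3.12
  - qt   # the non-Python library; swap in whatever the project actually needs
```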


Although PyQt has a wheel now, your use case is true for many non-Python dependencies, like R deps, and is probably one of the last things for which Anaconda will always have a moat on Windows.

Indeed, on Linux people will use their package manager, and macOS users tend to fire up Homebrew for this, but nuget/winget/choco never really took off on Windows.

I haven't hit the problem myself, so I didn't mention it. Thanks for taking the time to warn about this; this is a workflow that is unlikely to ever be served by uv.


> Warning, this is a long article. I got carried away.

Lmao this is so me when preaching about uv
