What's up Python? UV disrupts packaging, 2023 community stats, namespaces in Pypi...
August 2024
Summary
uv's latest update raises high hopes for Python packaging.
The JetBrains 2023 survey results are in, and what happens next will not surprise you that much.
PEP 752, a very conservative package repository namespaces proposal: no new syntax, and you probably can't request one anyway.
UV disrupts the Python packaging story
I have tried all the Python project management tools under the sun and eventually settled on using pip/venv for most things, even advocating for it in "Why not tell people to "simply" use pyenv, poetry, pipx or anaconda".
Yet, when Astral came out with uv, I was quite optimistic because:
They had a good track record with ruff, showing they could not only build quality tools, but also pragmatic ones. They understand what the community needs and uses, and develop their vision inside the constraints of real use cases, practices, and existing stacks.
They wrote the whole thing in Rust, removing by design half of the bootstrapping problems cursing Python tooling. To install uv, you don't need to worry about which Python you use, where your venv is, or how you are going to deal with the PATH.
They started with a fast and robust pip-compatible API, demonstrating their desire to serve the community by providing a good migration story.
I was expecting them to eventually come up with their own toolbox to deal with Python projects and was delighted when they released, this month, uv 0.3 which does exactly that.
So what's the big deal?
uv can now find direct dependencies from a pyproject.toml file, create a cross-platform lock file, add & remove deps from the conf, and sync the whole project instantly. This turns it into a serious alternative to pip-tools, Poetry, or Pipenv (a few of these commands are sketched below).
uv can install Python itself, on Windows, Mac and Linux. This makes it a replacement for Pyflow, Pyenv or Rye.
uv can run scripts with inline dependencies, like hatch. This means all my stand-alone scripts can now use Pypi deps without worries.
uv provides uvx, a pipx equivalent.
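A quick, illustrative sketch of those subcommands (assuming a project already initialized with uv; exact output omitted):
$ uv add requests           # declares the dep in pyproject.toml, updates uv.lock, syncs the venv
$ uv remove requests        # the reverse
$ uv lock                   # (re)generates the cross-platform lock file
$ uv python install 3.12    # fetches and manages a Python interpreter
$ uvx ruff check .          # runs a tool in a throwaway environment, pipx-style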
It has taken a bit from all the Python project management tools, removed most of what makes them hard to use, and improved on speed and reliability.
uv does a lot automatically, transparently, with sane defaults. This makes starting up a project a fast and reliable experience:
$ cd /tmp
$ curl -LsSf https://astral.sh/uv/install.sh | sh
$ uv init -p "<3.8,>=3.7" # forcing 3.7 as I know I don't have it
Initialized project `tmp`
$ uv run hello.py
Using Python 3.7.9
Creating virtualenv at: .venv
Hello from tmp!
This:
Installs uv globally.
Creates a project with a hello.py and a pyproject.toml file.
Installs Python with the required version.
Creates a venv with this Python.
Activates the venv.
Runs the script.
If you like this as much as I do, you can check out our one-hour interview with Charlie Marsh, where he shares his outlook on Python's tooling future and offers a delicious technical dive into a few of the design decisions they came up with.
I insist I'm not recommending uv just yet because it's still so young, and I need more time experimenting with it, on my projects and with my clients, to see how it fares IRL, outside of the beautiful land of theory.
I still have to check how beginners deal with not having curl, PowerShell users having the wrong permissions, or installing compiled extensions when wheels are not available.
Packaging is complicated.
But I am actively migrating code bases to it to give it a fair try, and I have high hopes for it. Especially since Astral swiftly released a v0.4 that is a very explicit and direct reply to community feedback.
JetBrains 2023 results are in
Every year JetBrains organizes a Python-related survey and publishes the results the year after. This week we got the data from 2023, from more than 25,000 respondents.
While the usual warnings about sampling bias apply, it can give a few insights and show trends going one way or another. Here are a bunch of general things that caught my eye:
25% of users have picked up Python in less than a year and 33% have less than one year of coding experience. I’ve been observing for a long time that Python is a popular language among beginners, as it is taught in many schools and is a de facto solution for data analysis which is very cross-domain. It is also a language of choice for many non-programmers when they need to automate something just once. It's important to remember this when targeting Python devs: making things beginner-friendly is going to hit a large part of the user base.
37% of Python developers reported contributing to open-source projects last year, 77% of them with code. I didn't expect it to be that high, honestly; those are beautiful numbers and they made my day.
While most people get information from docs and YouTube, 42% still get it from Stack Overflow. And here I was, thinking ChatGPT killed the site. There is no mention of LLMs in the survey though.
The main Python use cases are, unsurprisingly, still data analysis, web dev, and machine learning.
I always have a look at the tech stack section, which this year says:
Python versions cluster around 3.10 to 3.12, with a majority staying one version behind the latest, as recommended. And they mostly come from Python.org.
Python 2 is rocking a non-insignificant 6% of installs!
Flask, Django, and FastAPI are still the top 3 web dev frameworks.
Machine learning is 67% scikit-learn which is not at all what I see in my bubble. Either I'm completely removed from the practical application of ML because I spend too much time in LLM circles, or people inventing the future of AI have no time to answer a survey. I'll have to ask around to see how much I'm biased on this one.
55% use Linux, 55% use Windows, and only 29% use macOS. Yes, it doesn't add up to 100%: people use several OSes. Personally, I dual boot. I'm going to repeat myself, we should emphasize this one. Too many people forget how many coders are in MS land, where the terminal is underused.
Like last year, VSCode (41%) and PyCharm (31%) are way ahead in user share, by an order of magnitude. The next contender, Vim, barely makes 2%. This of course shows how popular those editors are, and that full-life-in-the-terminal is more niche than it seems. But also, maybe, this tells us a lot of power users do not answer those surveys, as they are busy coding. Matches my own sample though.
More than half of all respondents use the venv command to create a virtual environment, two-thirds use requirements.txt to save deps, and 77% use pip to install them. On one hand, I'm happy that people follow Relieving Python packaging pain. On the other hand, I have the feeling that in 6 months, there will be an overwhelmingly better experience and we will have to do a whole lot of communication to migrate. I don't know, a hunch.
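For reference, that standard workflow looks roughly like this (Unix shell shown; Windows activates the venv differently):
$ python -m venv .venv
$ source .venv/bin/activate
$ pip install -r requirements.txt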
For some reason, I particularly appreciated the Favorite Python-related resources section. I don't know why. It's a mystery.
You can get your hands on the data if you want to play with it.
PEP 752 – Package repository namespaces
I love namespaces, they are one honking great idea, and we got them all over the place in Python. Imports, modules, attributes, inner classes...
They help organize complexity, give tooling context to provide completion, limit the risk of name collisions, and help you lose weight.
There is one place, though, where we haven't adopted them: in packaging repositories. For example, if you want to install the types for the package requests, you have to install the package types-requests, despite the fact that a namespace requests for all the packages related to it would make a lot of sense.
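Concretely, the status quo for that example:
$ pip install requests        # the library
$ pip install types-requests  # its type stubs, under a separate flat name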
In the JS world, npm has namespaces, and if you need to install, for example, packages for the library babel, you could do:
npm install @babel/core @babel/preset-env
Which installs the core of the Babel JS compiler and some conf preset that allows you to use the latest JavaScript. Both are under the @babel namespace, because they belong to the same org, but they are different packages.
As you probably understand by now, PEP 752 is a proposal to add namespaces to Pypi.
However:
Namespacing means support requests, and Pypi doesn't have a lot of resources for that.
Because @ is special in PowerShell, the namespace can't use that as a prefix.
Because / and // are special in file systems and build systems, the namespace can't use those as a prefix or separator.
$ has meaning in bash.
: is part of URIs.
_ and - are already equivalent in Pypi's API.
You get the idea.
So the PEP proposal is actually very limited in scope:
Only organizations (not regular users) can reserve a namespace, as they are the biggest motivation for namespacing.
The namespace will be separated by a hyphen, keeping the ordinary syntax that works with flat package names. So basically, no new syntax is introduced and you can't distinguish a namespaced package from a flat one just by looking at the name.
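In practice, installing a package from a reserved namespace would look exactly like installing any flat package today (reusing the microsoft-aurora example that comes up below):
$ pip install microsoft-aurora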
So what would change for you?
Well, not much.
The only difference would be that an entity like Microsoft:
Can reserve a namespace instead of being forced to hold a placeholder.
Will have a dedicated page listing all the microsoft-* packages instead of an empty shell.
Would get an indicator in the Pypi UI emphasizing that packages such as microsoft-aurora officially belong to them.
Will (maybe?) be able to ask packages such as microsoft-teams to choose a different name.
The goal is first and foremost to introduce the concept gently while improving the defenses Pypi staff have against phishing. Adversarial naming attacks have been increasing in the last few years, and impersonating popular orgs and project names is a common deception vector, so it makes sense.
So it's a pretty mild proposal, but one that will likely allow migrating, step by step, toward more namespacing. It's often like that in the Python world: you have to go forward to stay relevant, you can't move fast, and yet you will still likely break things anyway.
Comments
Minor typo: uv 0.3, not 3.0. And of course it's now 0.4.1. Blazingly fast, that makes so much difference. I'm expanding my experimentation with it daily.
Not sure I'm quite on board with how UV manages dependencies for single scripts. 'Now, you can use uv add to automatically embed the dependency declarations within the script itself'
I guess this makes sense if the script is never going to leave the UV ecosystem. Otherwise it's just a comment taking up space for other users.
'You can also pull arbitrary dependencies into a uv run invocation with the --with flag, as in: uv run --with "requests<3" --with rich main.py.'
Better, but clunky, and prone to errors by having to manually type the dependencies each time.
Seems like it would make more sense to have a flag that instructs uv run to pull the dependencies directly from the import statements.
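For context, the inline declaration being discussed is the PEP 723 metadata block that uv writes at the top of the script. A rough sketch, assuming a script named main.py and the deps from the quote above (uv may also add a requires-python entry):
$ uv add --script main.py "requests<3" rich
$ head main.py
# /// script
# dependencies = [
#     "requests<3",
#     "rich",
# ]
# ///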