Summary
PEP 771, "Default Extras for Python Software Packages", is getting an update
PEP 772 proposes a Python Packaging Council with broad authority over packaging standards, tools, and implementations.
The Debug Adapter Protocol makes debugging Python easier in VSCode and beyond
PEP 771: the new version looks good
PEP 771 received an update that makes more realistic claims about backward compatibility and avoids a terrible key naming choice, which means it is now ready for you to read about.
And hope for.
This proposal adds a new feature to Python packaging: default extra dependencies.
Now, extra dependencies have existed for a long time.
For example, ipython defines 6 groups of optional dependencies: "black", "doc", "test", "test_extra", "matplotlib" and "all".
This means you can `pip install ipython` to get ipython, but you could also `pip install "ipython[doc,test]"` to install it plus all the documentation and testing packages required to develop it.
Extras are therefore a way to describe a group of dependencies that you don't think most of your users want, but that you offer as an opt-in.
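For reference, this is roughly what declaring extras looks like in pyproject.toml today (a minimal sketch: the group names echo ipython's, but the dependency lists are invented for illustration):

```toml
[project]
name = "your_package"
dependencies = ["base_dep"]

[project.optional-dependencies]
doc = ["sphinx"]
test = ["pytest", "testpath"]
```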
The problem comes when the user needs to install at least one of them for your package to work.
The typical example is fastapi, which tells you in the docs to `pip install "fastapi[standard]"` because that includes templating, file upload, email validation, an HTTP client and the ASGI server, since this is what most users want.
But it doesn't include all that in the base install, because fastapi is also used as a dependency of other libs that don't want all that stuff.
Of course, some users will inevitably not read the docs, `pip install fastapi` without the standard packages because that makes sense intuitively, and get very confused.
This is where PEP 771 comes in, letting package authors declare:
```toml
[project]
default-optional-dependency-keys = [
    "common_default_extra_deps"
]

[project.optional-dependencies]
minimal = []
common_default_extra_deps = ["package1", "package2"]
```
So that people can `pip install your_package` and get the same result as `pip install "your_package[common_default_extra_deps]"`. The rare users needing the minimal install would `pip install "your_package[minimal]"` (which is empty) and be happy.
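To make the semantics concrete, here is a toy model in plain Python of how I read the PEP: requesting no extras means the default groups apply, while explicitly requesting any extra (even the empty `minimal` one) replaces them. This is an illustration of the logic, not the actual resolver:

```python
def resolve_extras(requested, defaults):
    """Toy model of PEP 771 default-extras resolution.

    - No extras requested: the package's default extras apply.
    - Any extra requested (even an empty group like 'minimal'):
      only the requested extras apply, defaults are dropped.
    """
    return set(requested) if requested else set(defaults)


# pip install your_package -> the default group kicks in
print(resolve_extras([], ["common_default_extra_deps"]))

# pip install "your_package[minimal]" -> only 'minimal', which is empty
print(resolve_extras(["minimal"], ["common_default_extra_deps"]))
```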
There is one drawback to this approach: `pip freeze > requirements.txt` and `pip install -r requirements.txt` would no longer perfectly mirror each other unless you use `--no-deps`. Most people don't know this flag exists.
I'm not sure the breaking change is worth it for pip users.
But you know who is not affected? Users of systems with proper lock files. Like uv's.
PEP 772: a more official body for packaging matters
PEP 772 proposes a Python Packaging Council with broad authority over packaging standards, tools, and implementations.
Historically, the Python core devs stayed far away from packaging issues. I interviewed Brett Cannon recently and asked why. You'll get his quite pragmatic answer as soon as I'm done with the editing.
The consequence of this is that the Python packaging ecosystem has always felt disconnected from the life of the Python project itself.
Now, you may think the Python Packaging Authority or the Packaging-WG already were what the Python Packaging Council proposes to be, but that never happened, although they did contribute a lot to the community.
The Packaging Council would be an elected team focused on establishing standard processes, but exercising its authority with the same light touch as the Steering Council does for the CPython project: coordination, a global vision, plus a court of last appeal for decisions that are stuck in the mud.
I like the few but clear entries in its mandate and responsibilities, and I appreciate that they want to prevent conflicts of interest with "No more than two Packaging Council members should be employed by or significantly affiliated with the same entity", which includes PSF employees.
I was pleasantly surprised by how well the CPython governance model turned out to work, and I'm looking forward to seeing how this new council is going to play out if it's formed.
Easier debugging in VSCode
VSCode is now by far the most popular editor on the planet. I see it in every company I'm hired by, and in most videos or tutorials out there.
Unsurprisingly, it is the de facto choice for a bazillion Python devs, but using the debugger with it has always been a hassle: as soon as you had something more complicated than a simple script, you had to create a configuration file to make it work.
This is why I mostly use and teach PDB (there is a nice article on that): it works fine anywhere.
Well, Microsoft has solved the problem for good in the latest release by providing the `debugpy` command.
Now if you want to debug something in VSCode, open the terminal in the IDE, prefix your Python command with `debugpy`, and voilà, you are debugging.
`debugpy` is automatically available in VSCode if you installed the Python extension, but it is an open source tool you can use elsewhere.
This is possible thanks to the DAP (Debug Adapter Protocol), the debugging counterpart of the LSP (Language Server Protocol), which abstracts away language-specific debugging peculiarities.
This means this improved debugging experience can be made available in many other editors (e.g., here is a Vim adaptation of it) and for many languages. Great news for everybody, especially with PEP 768 on the starting blocks.
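To give an idea of what DAP looks like on the wire: like LSP, it frames JSON messages with a `Content-Length` header. A minimal sketch of that framing (the `initialize` request fields come from the DAP spec; the helper functions are illustrative, not part of debugpy):

```python
import json


def encode_dap(message: dict) -> bytes:
    """Frame a DAP message: UTF-8 JSON body prefixed with a
    Content-Length header, exactly like LSP."""
    body = json.dumps(message).encode("utf-8")
    return b"Content-Length: %d\r\n\r\n" % len(body) + body


def decode_dap(data: bytes) -> dict:
    """Parse one framed DAP message back into a dict."""
    header, _, body = data.partition(b"\r\n\r\n")
    length = int(header.split(b":")[1])
    return json.loads(body[:length])


# The first request a client (the editor) sends to an adapter
request = {
    "seq": 1,
    "type": "request",
    "command": "initialize",
    "arguments": {"adapterID": "debugpy"},
}
assert decode_dap(encode_dap(request)) == request
```

Because the framing and the message shapes are standardized, any editor that speaks DAP can drive any adapter, which is exactly why the same debugpy-powered experience can show up in Vim.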
MongoDB shows off a native Django backend
An interesting thing popped up in my RSS feed this month: a tutorial from the MongoDB team that demonstrates how to connect their database to Django, not through manual use of pymongo, but using their new native backend.
By native I mean you define the connection in settings.py using:

```python
DATABASES = {
    "default": ...,
}
```
and use (I'm copying their example, I haven't tried it yet) regular Django models:

```python
from django.db import models
from django.conf import settings
from django_mongodb_backend.fields import EmbeddedModelField, ArrayField
from django_mongodb_backend.models import EmbeddedModel


class Award(EmbeddedModel):
    wins = models.IntegerField(default=0)
    nominations = models.IntegerField(default=0)
    text = models.CharField(max_length=100)


class Movie(models.Model):
    title = models.CharField(max_length=200)
    plot = models.TextField(blank=True)
    runtime = models.IntegerField(default=0)
    released = models.DateTimeField("release date", null=True, blank=True)
    awards = EmbeddedModelField(Award, null=True, blank=True)
    genres = ArrayField(models.CharField(max_length=100), null=True, blank=True)

    class Meta:
        db_table = "movies"
        managed = False

    def __str__(self):
        return self.title


class Viewer(models.Model):
    name = models.CharField(max_length=100)
    email = models.CharField(max_length=200)

    class Meta:
        db_table = "users"
        managed = False

    def __str__(self):
        return self.name
```
The queries are made with the usual Django ORM API; it works with forms and migrations, but above all, the admin!
And although, by the nature of things, you'll get neither the whole list of Mongo features nor all of Django's ("Django's transaction management APIs are not supported."), this is pretty significant if you use this product.
Of course, it's not stable yet.
PyCharm now supports uv
You don't really need editor support to use uv, and therefore many PyCharm users have been enjoying the tool from the command line while pointing their editor at the venv directly.
But it's always nice to have QoL features from your favorite IDE available for your new workflow, and this is now the case with PyCharm 2024.3.2.
Pretty straightforward: when you create a virtual env or import a project, you can choose uv among the traditional virtualenv, poetry, conda, etc.: