Summary
Python without the GIL, for good
LPython: a new Python Compiler
Pydantic 2 is getting usable
PEP 387 defines "Soft Deprecation", getopt and optparse soft deprecated
Cython 3.0 released with better pure Python support
PEP 722 – Dependency specification for single-file scripts
Python VSCode support gets faster
Paint in the terminal
Python without the GIL, for good
We saw last month that the Global Interpreter Lock was once again the center of attention. This month the debate carried on, to the point that even Meta, Facebook’s parent company, pitched in:
If PEP 703 is accepted, Meta can commit to support in the form of three [engineer years on landing] nogil CPython
It's nice to see Python getting more and more contributions from the big companies that built their success on it, a huge contrast with the 2010s.
The discussion culminated in an internal debate among the core devs, which ended with an official announcement that PEP 703, the proposal that relit the fire, would be accepted once some details are figured out.
This means in the coming years, Python will have its GIL removed.
Here is the plan:
Short term, an unsupported, experimental version of Python without the GIL will be published in parallel to the regular one. The target is 3.13/3.14.
Mid-term, the no-GIL version will be marked as officially supported, but still only as an alternative to Python with the GIL. A target date for making it the default will be announced. This will happen only once the community has shown enough support for it, and will take several years.
Long-term, no-GIL becomes the default. Before this happens, the core devs can still reverse the decision and abort the no-GIL project if it proves to have a bad ROI.
Note that if a program imports a single C extension that uses the GIL on the no-GIL build, the interpreter is designed to switch back to the GIL automatically. So this is not a 2=>3 situation where incompatible code breaks.
The main reason for the two different builds is to manage the unknown unknowns. Nobody expects no-GIL to break things, but with such a big project, you can never be sure. ABI compatibility is tricky, and new extensions need to be compiled explicitly against the no-GIL build for it to work, so the community needs to embrace it.
Also, no-GIL compatible extensions will still work on the old interpreter, so you won't end up in a situation like Python 3 code not working on Python 2.
In fact, Python code itself should not be affected and will run seamlessly on either build, albeit with threads limited to a single core on the GIL one.
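If you want to feel what the GIL costs today, here is a minimal, stdlib-only sketch of the classic CPU-bound threading test (nothing here is specific to the no-GIL build):

import time
from threading import Thread

def countdown(n: int) -> None:
    # Pure CPU work: no I/O, so the threads never release the GIL for long.
    while n:
        n -= 1

start = time.perf_counter()
threads = [Thread(target=countdown, args=(50_000_000,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# On a GIL build, two threads take roughly as long as one thread doing all
# the work, because only one can execute Python bytecode at a time.
# A no-GIL build could spread them across two cores.
print(f"2 threads: {time.perf_counter() - start:.2f}s")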
LPython: a new Python Compiler
That's the news I didn't see coming. In "What's the deal with CPython, Pypy, MicroPython, Jython...?" we talked about Python compilers, and I thought I did a pretty good job of listing everything that mattered. Well, the team behind LPython decided to take this list and .append() to it.
LPython is a new BSD-3-licensed compiler that takes Python code and translates it to LLVM, C, C++ or WASM. It doesn't aim to compile entire programs, although it can, but rather, like numba and cython, to let you speed up numerical bottlenecks. The benchmarks are very promising, and the ability to switch between Ahead-of-Time and Just-in-Time compilation is very convenient, although you will still need the entire compilation chain installed on the machine. LPython likes raw Python code, so if you call a regular Python function inside your snippet, you must explicitly mark it as such with a decorator. Most people will therefore likely use it for very specific snippets.
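For illustration, here is roughly what LPython-flavored code looks like, based on the examples the project publishes (the lpython module providing C-like types such as i32 is their API; treat this as a sketch whose details may evolve):

from lpython import i32

def fib(n: i32) -> i32:
    # Regular Python syntax, but the explicit type annotations let
    # LPython compile this function to fast native code via LLVM.
    if n < 2:
        return n
    return fib(n - 2) + fib(n - 1)

print(fib(30))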
Pydantic 2 is getting usable
I've been pitching the coming of Pydantic version 2 for some time, because I, like many people, use it a lot for data validation / schema definition, and the new version is much faster.
Yes, it came out as stable last month, but if you read "Relieving your Python packaging pain" you know I don't encourage people to use the latest version of anything except for testing or having fun.
Indeed, even a brand-new stable major version is guaranteed to need refinement, and it still has little community support.
But now two things have happened:
Pydantic 2.1 has been released, and the first wave of nasty bugs has been eradicated.
FastAPI announced support for Pydantic 2. Since it's the biggest driver of Pydantic usage, that's a milestone.
I will now proceed with giving it a try in one personal project, and if it works, move it into professional projects in a few months.
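For reference, the core of the new API looks like this (v2 renames a few methods, e.g. parse_obj() becomes model_validate() and dict() becomes model_dump()):

from pydantic import BaseModel, ValidationError

class User(BaseModel):
    name: str
    age: int

user = User.model_validate({"name": "Alice", "age": "30"})  # "30" coerced to int
print(user.model_dump())  # {'name': 'Alice', 'age': 30}

try:
    User.model_validate({"name": "Bob", "age": "not a number"})
except ValidationError as e:
    print(e)  # explains which field failed and why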
PEP 387 defines "Soft Deprecation", getopt and optparse soft deprecated
If you haven't read Victor Stinner's blog yet, I encourage you to do so. It's technical and raw, with zero BS, and gives you a good view of what the contribution life of a core dev looks like. His latest article mentions something I missed last month: soft deprecation has been added to PEP 387 – Backwards Compatibility Policy.
This document, created in 2009, states how the Python project deals with deprecation, and it will now contain the following:
A soft deprecation can be used when using an API which should no longer be used to write new code, but it remains safe to continue using it in existing code. The API remains documented and tested, but will not be developed further (no enhancement). The main difference between a “soft” and a (regular) “hard” deprecation is that the soft deprecation does not imply scheduling the removal of the deprecated API.
Basically, a soft-deprecated API is in a zombie state: kept alive forever, but it will never see any more work, and you are explicitly advised against using it.
optparse and getopt, two modules that were each the de facto solution for parsing script arguments in their time, are now marked as "soft-deprecated". You can use them forever, but you probably should not.
First, argparse is the more modern stdlib solution (sketched below), and we have a good article on it.
Second, 3rd party projects like typer and click exist.
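For the record, here is a minimal argparse version of what you would have written with getopt or optparse:

import argparse

parser = argparse.ArgumentParser(description="Greet someone")
parser.add_argument("name", help="who to greet")
parser.add_argument("--shout", action="store_true", help="print in upper case")
args = parser.parse_args()

greeting = f"Hello, {args.name}!"
print(greeting.upper() if args.shout else greeting)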
Cython 3.0 released with better pure Python support
Cython, the most famous Python compiler, released version 3.0. While the release comes with all sorts of improvements, one particularly stands out. Cython always had one notable limitation: it relied on a superset of Python to express some of its features, so annotated Cython code was no longer valid Python.
This is no longer the case. As the release notes state: "it should now be possible to express all Cython code and use all features in regular Python syntax".
This means you should now be able to take any pure Python code base, Cython it all, and see what happens.
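Here is a small sketch of what that pure Python mode looks like, using the documented cython shadow module. The same file runs unmodified under CPython, and the cython.* annotations become real C types when compiled (e.g. with cythonize -i on the file; the function here is just my example):

import cython

def mean(values: list) -> cython.double:
    # Valid Python as-is; when compiled by Cython 3, these annotations
    # turn into C doubles instead of staying plain annotations.
    total: cython.double = 0.0
    for v in values:
        total += v
    return total / len(values)

print(mean([1, 2, 3, 4]))  # 2.5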
PEP 722 – Dependency specification for single-file scripts
While the no-GIL topic was certainly still alive and well, the proposal of PEP 722 really heated things up.
The idea is to formalize a comment syntax that, similar to Groovy’s, would allow expressing the dependencies of a single script. Taking the example from the PEP itself:
# In order to run, this script needs the following 3rd party libraries
#
# Requirements:
#    requests
#    rich

import requests
from rich.pretty import pprint

resp = requests.get("https://peps.python.org/api/peps.json")
data = resp.json()
pprint([(k, v["title"]) for k, v in data.items()][:10])
The important lines are:
# Requirements:
#    requests
#    rich
These would now be officially formalized so they can be parsed by third-party tools. The concept is not new, and tools like pip-run already support running a script whose deps are described in such comments:
$ pip uninstall rich requests
WARNING: Skipping rich as it is not installed.
WARNING: Skipping requests as it is not installed.
$ pip-run dah_script.py
[
│ ('1', 'PEP Purpose and Guidelines'),
│ ('2', 'Procedure for Adding New Modules'),
│ ('3', 'Guidelines for Handling Bug Reports'),
│ ('4', 'Deprecation of Standard Modules'),
│ ('5', 'Guidelines for Language Evolution'),
│ ('6', 'Bug Fix Releases'),
│ ('7', 'Style Guide for C Code'),
│ ('8', 'Style Guide for Python Code'),
│ ('9', 'Sample Plaintext PEP Template'),
│ ('10', 'Voting Guidelines')
]
Packages are installed in a temporary virtual env and deleted after the run, like npx used to do for the JS world.
The PEP doesn't imply Python or pip are going to integrate such a feature; it's only about formalizing the syntax for now. But I have good hopes for this one, as I have several lone Python scripts lying around that would really benefit from it, especially if you can keep the env around in the future. Such a proposal could demonstrate demand, and years later, lead to pip adoption. E.g., npx influenced the addition of npm create, which lets you fetch a project template from specific packages. Indeed, that was the most common use case for npx.
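To make the "parsed by third-party tools" part concrete, here is a toy parser for that comment block. This is my own naive sketch, not the PEP's reference algorithm (which is more precise about where the block may appear):

def read_requirements(path: str) -> list[str]:
    # Naively collect the names listed under a "Requirements:" comment
    # block at the top of a script. Illustration only.
    deps = []
    in_block = False
    with open(path) as f:
        for line in f:
            if not line.startswith("#"):
                break  # the block must live in the leading comments
            content = line.lstrip("#").strip()
            if in_block:
                if content:
                    deps.append(content)
                else:
                    break  # an empty comment line ends the block
            elif content.lower() == "requirements:":
                in_block = True
    return deps

print(read_requirements("dah_script.py"))  # ['requests', 'rich']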
Python VSCode support gets faster
If you use VSCode, you may have noticed that using a lot of linters makes the IDE slower. Mypy is particularly at fault, as the mypy command is slow to start, and its daemon mode is not used by VSCode.
That changes now: a new official mypy extension is available, which uses the dmypy daemon. The speedup is such that the editor can now offer checking the entire code base, not just the current file.
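If you don't use VSCode, you can get the same effect manually with mypy's bundled daemon, dmypy, which keeps a warm process between runs:

$ dmypy run -- src/   # the first call starts the daemon, later calls are fast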
On top of that, pylance, the official Microsoft extension for Python support, will now persist all the indexing work it performs on 3rd party libs. This will result in a lighter startup and, for big projects, a speedier experience, as indexing can take some time on slow machines.
I personally have to work on corporate clients’ laptops I can't modify, and they come with a ton of security software that slows them down to a crawl, with process inspection and network calls to check file signatures every time you click on anything. So this is a lifesaver.
Paint in the terminal
This is just so cool:
It's a version of Paint that runs in the terminal, thanks to the Python lib textual.
It's not going to change your life or anything, but WOW.
I installed it, and it's damn responsive. It even handles Ctrl-Z, and features a file selector when you try to save your work.
"As I have several lone Python scripts lying around that would really benefit from this, especially if you can keep the env around in the future."
This is possible with github.com/daylinmorgan/viv. Can be used similar to pip-run. But additionally can be used via curl or as a standalone function in your seldom one-off scripts.
I like to use pipx (https://pypa.github.io/pipx/) to run python programs in an isolated environment. It basically does what you said, create a virtualenv for a single software. As for the linter in VSCode, I am really enjoying Ruff (https://marketplace.visualstudio.com/items?itemName=charliermarsh.ruff). It is insanely fast.