Summary
Most new Python projects are still about AI
Flask's author's new project, a Python installer, made waves
Python 3.12 improves language performance by 4% so far
Urllib3 v2 is out, removing support for loads of things
Textual, the TUI library, runs in a web browser now
Pydantic 2 alpha shows at least a 5x increase in speed
PyCon US is over, now wait for the videos
AI is still the star of the show
We have been inundated with the LLM craze for a few months now, and I must say I don't get tired of it. It's so damn fun, and it feels like the 2000s again, when the internet was new, full of promises and silly things.
GitHub's trending page reflects that very well, with AI-related repositories taking up most of the listing's real estate. A good chunk of them didn't exist last month, and who knows if they will exist next month?
Twitter still has a few punchlines about this.
And if you haven't given it a try yet, just calling the OpenAI API from Python is a ridiculously entertaining way of draining your wallet.
I personally coded the most expensive Telegram bot ever.
Worth it.
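If you want to try your hand at it, a call looks roughly like this with the openai package as it shipped at the time of writing (pre-1.0 interface; the API key, model, and prompt are placeholders, and every call costs real money):

```python
import openai  # pip install openai

openai.api_key = "sk-..."  # your own key: this is where the wallet-draining happens

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",  # pick whatever model you can afford
    messages=[
        {"role": "user", "content": "Explain Python packaging like I'm five."},
    ],
)

print(response.choices[0].message.content)
```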
Python packaging is bubbling
It's true, large language models are kinda omnipresent, but for a brief moment in time, packaging took over the spotlight.
You may have noticed that this blog recently talked about packaging quite a lot, and there was plenty of debate elsewhere as well. This surfaced the work of Trio's author: posy and pybi, respectively a Python installer and a standardized Python interpreter distribution format.
Meanwhile, Flask's author released a tool with a similar goal, one he had so far only used internally: rye.
The announcement made a big wave, one that may lead to quite a positive turn of events.
So why is it a big deal?
Well, those projects are trying to solve the problem that "Relieving your packaging pain" addresses in its first half: bootstrapping Python safely.
A huge number of Python packaging problems come from bootstrapping Python, not packaging per se.
But those projects target the pain with tooling, and that's much better than a procedure.
How do they do this?
In short, they directly attack the issues you may have read about in our previous article, "Explaining why the Python installation process is such a mess".
TL;DR:
they make the Python installation experience the same on all machines (no more fiddling with the dependencies, the PATH, command names, etc.);
they provide precompiled binaries of Python on all platforms (no more pyenv builds failing on missing headers);
the binaries are standardized (no more missing tk);
they promote the use of virtual environments.
Now, those projects are not ready for production, as they have plenty of rough edges, but they pack the potential to bring a new era to the language.
I'm not exaggerating.
Python is massively successful despite those pain points, so imagine what it could become with a cargo-like experience.
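To give you an idea, bootstrapping a project with rye currently looks roughly like this (commands taken from its README at the time of writing; the tool is experimental, so expect details to change):

```bash
# Bootstrap a project with its own interpreter and virtualenv, no system Python needed
rye init my-project && cd my-project
rye pin 3.11       # pin a standalone, precompiled CPython build
rye add flask      # declare a dependency in pyproject.toml
rye sync           # fetch the interpreter, create the venv, install everything
rye run flask --version
```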
Python 3.12 teasing perf improvements
Before ChatGPT, there was only one thing in the community more subject to debate than packaging, and that's Python performance. So it's no surprise that when Guido van Rossum announced he was going to work on making CPython faster, people got excited.
The plan was to incrementally gain about 50% in performance for four releases in a row (starting with Python 3.10), which was, at the time, hoped to compound into a 5x increase in the general speed of the language.
The transition to 3.11 brought a very honorable 1.22x, but couldn't reach the 1.5x that many were hyped about.
The latest reports on 3.12 are only talking about a 1.04x so far, or a 4% improvement. Given that this release was going to introduce the foundation of a JIT in CPython, it's easy to find that underwhelming.
But let's remember the release is scheduled for October, and a lot can happen in 6 months. Especially since the internal representation of integers is touted to have been changed with performance in mind, hinting that the team is laying the groundwork for a just-in-time compiler to do a better job.
My absolute favorite improvement of the last two releases has been the incredibly better error messages, and Python 3.12 continues that excellent trend.
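For instance, forgetting self. in a method now gets you a targeted suggestion, something along these lines (output paraphrased from the 3.12 release notes):

```python
class Greeter:
    def __init__(self, name):
        self.name = name

    def greet(self):
        # Oops: forgot the "self." prefix.
        return f"Hello, {name}!"

Greeter("world").greet()
# Python 3.12's traceback now points at the likely fix, roughly:
#     NameError: name 'name' is not defined. Did you mean: 'self.name'?
```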
However, we all would like a pony, wouldn't we?
Urllib3 2.0.0 has been released
Urllib3 is the backbone of the Python ecosystem; it's the curl of the language, it's everywhere. It's virtually the most downloaded package on PyPI. OK, technically it's boto3, but that one is constantly auto-installed by AWS servers, so it's cheating.
This lib is why "requests" can even exist; it is vendored into pip, so it ships with every machine that has "pip" on it.
This new release bumps the major version because it breaks things, removing a long list of deprecated features, starting with support for Python 2.7, 3.5, and 3.6, all of which have reached end of life.
So what should you do?
Well, if you are a library author that touches HTTP with even a 10-foot pole, now is the time to fire up your testing machinery.
And if you are not, now is the time to slow down your upgrades. Give the PyPI crowd time to catch up.
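The good news is that basic usage is unchanged: the snippet below behaves the same on 1.26 and 2.0, so a smoke test for your own code can start this small (the URL is just an example):

```python
import urllib3

# Explicit pool manager: the same API on urllib3 1.26 and 2.0.
http = urllib3.PoolManager()
resp = http.request("GET", "https://example.org")

print(resp.status)     # 200
print(len(resp.data))  # raw body, as bytes
```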
Textual demo running in the browser
Textual, everyone's favorite Python TUI library of 2022, keeps improving. We just had a peek at it running in the browser (yes, a terminal application that also runs natively in the browser), and it is now playing inside a Jupyter notebook as well.
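As a reminder, a Textual app is just a few lines of Python; where it ends up rendering, terminal, browser tab, or notebook cell, is increasingly a deployment detail. A minimal sketch (not one of the official demos):

```python
from textual.app import App, ComposeResult
from textual.widgets import Header, Static


class HelloApp(App):
    """A tiny Textual app: a header and a line of text."""

    def compose(self) -> ComposeResult:
        yield Header()
        yield Static("Hello from Textual, wherever you are running this.")


if __name__ == "__main__":
    HelloApp().run()
```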
Pydantic 2 alpha is raising hopes
I used to be a great fan of Marshmallow, but Pydantic won me over with type hints and the FastAPI integration.
The lib has always been slow, though. I'm currently working for a client that receives several 10k-line JSON files as input. Validating them can slow down program startup by up to 2 seconds. Not great, not terrible.
However, these days, thanks to the amazing Maturin project, Python and Rust are BFFs, and lots of projects are getting a boost through oxidation.
Pydantic V2 was expected to follow this trend, with the hope of shedding some weight.
And it did, with a minimum of 5x perf gain, and up to 50x.
Of course, things are going to break. Don’t jump in just yet unless you want to prepare your library for it.
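For the curious, your models barely change; it's the validation engine underneath that got oxidized. A minimal sketch using the v2 method names (which differ from v1, hence the breakage warning above):

```python
from pydantic import BaseModel


class Record(BaseModel):
    id: int
    name: str
    tags: list[str] = []


raw = '{"id": 1, "name": "example", "tags": ["python", "rust"]}'

# v2 spelling; the v1 equivalent was Record.parse_raw(raw).
record = Record.model_validate_json(raw)
print(record.id, record.tags)
```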
PyCon US is over
All PyCons are a bit special, but PyCon US is specialer. You can read the most famous Pythonic mustache on Twitter to get an idea of how many people from cool projects got to hang out together.
If you didn't attend, I encourage you to check out the official YouTube page as they will publish the talks in the near future.
Though with those videos, sometimes the future is nearer and sometimes it is mucher lesser nearer, so be patient.