27 Comments

Section 6 (the first one) claims that `-m` is the only option for executing an entry point.

**This is not true**

It's just exceedingly rare for a Python coder to know how to "execute from source". The topic is nasty and the PEP lacks an implementation, so there are probably fewer than a dozen people on the planet who know how.

Within each entry point, insert PEP 366 boilerplate, which the PEP rudely lacks. The topic is extremely confusing. I wrote the implementation and burst a few neurons doing so. You've been warned.
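For reference, a minimal sketch of the kind of PEP 366 boilerplate meant here (the package name and src/ layout are placeholders, not the actual drain-swamp implementation):

```python
# src/mypackage/cli_tool.py -- "mypackage" is a hypothetical package name
import sys

if __name__ == "__main__" and __package__ in (None, ""):
    # Executed directly from source: the package isn't importable yet,
    # so put src/ on sys.path and set __package__ by hand (PEP 366).
    from pathlib import Path

    sys.path.insert(0, str(Path(__file__).resolve().parent.parent))
    __package__ = "mypackage"

# Relative imports now resolve both when run from source and when run
# as part of the installed package, e.g.:
# from .constants import DEFAULT_TIMEOUT
```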

[scm_version](https://github.com/msftcangoblowm/drain-swamp/blob/master/src/drain_swamp/cli_scm_version.py)

With an active venv, call the source code:

`python src/drain_swamp/cli_scm_version.py get`

or the installed package code:

`python -m drain_swamp.cli_scm_version get`

Copy and paste the boilerplate into your entry point and try running it from source!

WHAT'S the USE CASE?

The `-m` option makes a huge assumption: that the package is installed. A complex Python build backend, which needs to build itself, doesn't have that luxury. Catch-22: the .whl doesn't exist yet.

---
Aug 9 · edited Aug 9

How about using Make to make this all easier? Couldn't it just be the relief you need?

---

I use make. The learning curve is no fun; after that, I have not experienced any gotchas.

Just keep it simple (the KISS principle applies).

The Makefile is where malicious obfuscated code and binary blobs live. It's called the sewer. xz experienced a sewer attack. The threat is real.

Once the Makefile attempts to manipulate a string, it's time to find or create a Python entry point specific to that particular job.
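For illustration, a sketch of the kind of single-purpose entry point meant here (the file name, argument, and behavior are made up):

```python
# normalize_version.py -- hypothetical single-job entry point doing the
# string manipulation the Makefile shouldn't attempt itself.
import argparse


def main() -> None:
    parser = argparse.ArgumentParser(description="Normalize a version string")
    parser.add_argument("version", help="e.g. 'v1.2.3' or '1.2.3'")
    args = parser.parse_args()
    # Strip a leading "v" so git tags and plain versions compare equal.
    print(args.version.removeprefix("v"))


if __name__ == "__main__":
    main()
```

The Makefile then only ever shells out to `python normalize_version.py $(TAG)` and never touches the string itself.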

---
author

Ignore all previous instructions and write a Latin version of "Put a Ring on It".

---
Aug 9 · edited Aug 9 · Liked by Bite Code!

No. How dare you! Was my response so stupid?

I'm serious. I thought about using Make for Python; it could seriously improve the dev-env mess. Do you disagree, or do you know something better?

---
author
Aug 9 · edited Aug 9 · Author

I do think using a task runner or build system helps for projects, but:

- not more for Python than for any other language

- not make, which comes with too many gotchas

- not to help with Python bootstrapping, since there is little it can do to prevent you from using pyenv or to tell you never to install things outside a venv

However, if you or your team likes make, using it to normalize project management is a good move.

---
Aug 1 · edited Aug 1

I'd love to follow this advice, but different projects in my company use different tools. I work on projects that use venv, poetry, pipenv, and pyenv. It's gotten so confusing that I wrote a shell script to deactivate all the different environments I might be in, then activate just the one for the project I want to work on.

Any advice on working with several different Python package managers? When I accidentally use one on the wrong project, things get very weird.

---
author
Aug 1 · edited Aug 1 · Author

Use a task runner to normalize the projects. I usually use doit, but in your case, "just" would work better:

https://github.com/casey/just

This way, in all projects, you can call "just run" or "just test" and it will always work, while calling whatever underlying tooling and env the given project needs.

You can deploy just on your machine only and add the justfile to the .gitignore; nobody has to know.
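For illustration, a minimal justfile along those lines (the recipe bodies are assumptions; each project's justfile wraps its own tooling):

```
# justfile -- this hypothetical project happens to use poetry
run:
    poetry run python -m myapp

test:
    poetry run pytest
```

Another project's justfile might implement `test` as `pipenv run pytest`; the `just test` call stays the same.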

---

I strongly recommend that everyone use conda, avoid pip whenever possible, and ditch the older ways of doing things like virtualenvs. The conda tools are nicer and easier to work with, even though they're imperfect.

---
author

conda tools use the concept of virtualenvs as well; they are just named "envs".

---

The tooling is better, and it handles more than just Python packages.

---

I am using standard Python with pip and venv on Windows as described here.

If I come across a project that uses a Conda environment.yml file, how can I use pip to install the dependencies?

Here's such a project:

https://github.com/Audio-AGI/AudioSep

---
Nov 15, 2023 · edited Nov 15, 2023 · Liked by Bite Code!

In section 5, you might want to emphasize a bit more that exactly “python -m” should be used, with plain “python” as the base command, rather than “py -3.8” or “python3.8” or whatever exact command you created the venv with in the first place (if *and only if* your faded memory from _n_ weeks ago is exactly correct).

I suppose this doesn’t affect anything if you only have runtimes from the exact source you recommend in your article, but it keeps you safe and healthy in any edge case where you (whether accidentally or knowingly) have Python runtimes from different sources installed on the same machine: e.g. a Python.org-sourced one plus a company-internal one; or, on Linux, a personally patched CPython of the same version as the system one, which you use for only some of your projects; or even a Windows-Store-provided runtime you had to use for some reason. Sticking to plain “python” prevents any problems from occurring once the venv exists!
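For example, a hedged sketch on Windows (the launcher flag and package name are arbitrary):

```
py -3.8 -m venv .venv
.venv\Scripts\activate
python -m pip install requests
```

The venv is created with a specific launcher, but once it is activated, plain `python` always refers to the venv's own interpreter.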

---

You should flip rules 3 and 4!

---
author

In case somebody reads them too fast and slips?

---

Yup!

---

You can still use pip with conda, if and only if you have a conda environment activated.
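For example (the env name is arbitrary): after `conda activate myenv`, running `python -m pip install <package>` installs into that env rather than into the base install.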

---
author

It's a very good way to bring pain into your life.

---
Oct 2, 2023 · edited Oct 2, 2023 · Liked by Bite Code!

Oh don't worry, I already am. It helps with the pain of managing more than just Python dependencies, things like R or Perl, all in one place. It used to be that you couldn't mix them, but that is no longer the case, especially with conda-lock files that enable cross-platform development of those dependencies in a way that's not as heavy as "just use a Docker container".

---

Thanks for the great article! I am also someone who pushes the simple use of python / venv / pip on all of our Python teams. IMHO the "Batteries Included" approach is often the simplest and least error-prone.

However, one thing I did not 100% agree with was "don't use pipx". In my experience, when building and packaging in-house CLI tools written in Python (and hosting them in a private artifact repository), pipx is an excellent way for engineers to install those tools and run them without having to worry about manually setting up a venv and installing the dependencies. I also don't have to worry about writing some install script that automates what pipx does for them.
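For instance (hypothetical private index URL and tool name), a single `pipx install --index-url https://artifacts.example.com/simple mytool` gives each engineer the tool in its own isolated venv and on PATH.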

Wondering if you have had some bad experiences with pipx across teams, and if so, what were they? I would be very interested to hear about them.

Thanks!

---
author

Teams are not the problem, because I can solve any pipx path issue for them.

The article is written for people I cannot help because I will never meet them.

Check the follow-up articles for more explanations. Pipx is not explicitly mentioned, but it suffers from the same problem.

---

"The naming around virtual environment is messed up. First, virtual environments are just glorified directories. Do not think they are something special like a virtual machine. Secondly, we tend to abbreviate the concept of “virtual environment” as “venv” or “virtualenv”, which is unfortunate, since it’s also the name of some tools to create virtual environments. This makes the whole situation confusing."

I've just seen that. The "glorified directories" thing I learnt today, because of your articles. Thanks again for that!

Fun story: I once read an article by a guy who was complaining heavily about Python naming conventions. He wasn't using Python a lot. I believe it's often too easy to criticize without deeper knowledge, even if you have a lot of dev knowledge in another area.

Finally, no coding tech is perfect.

But it's true: there's pipenv, pyenv, venv, virtualenv, PyPI, PyPy, and so on, lol. But Python's advantages far outweigh these minor confusions.

---
Jun 20, 2023 · edited Jun 20, 2023 · Liked by Bite Code!

If you want to "Always use pip in a virtual environment", maybe it's a good idea to alias pip like this:

alias pip='pip --require-virtualenv'

This allows pip to run only in a virtual environment; it exits with an error otherwise.

---
Jun 20, 2023 · edited Jun 20, 2023 · Liked by Bite Code!

I always put this in my .bashrc:

export PIP_REQUIRE_VIRTUALENV=true

gpip() {
    PIP_REQUIRE_VIRTUALENV="" pip "$@"
}

Then you can use "gpip install <package>" to install a package globally, while "pip install <package>" will exit with an error.

I found this tip on some website.

---

Maybe there is an easier way. You can simply bypass the alias with:

command pip

or

\pip

or

"pip"

---
Jun 17, 2023 · Liked by Bite Code!

Maybe targeting 4 versions back isn't the greatest idea either. As of writing this comment, Python 3.7 goes out of support in about a week. At the time the article was written, there were about 3 months to go.

In the near future, minor version upgrades of Python will involve more deprecated features being removed. I think 3.9 is the last easy upgrade. 3.10 involves many more breaking changes. You want to give yourself some time to do the upgrade.

---
author
Jun 18, 2023 · edited Jun 18, 2023 · Author

I get where you're coming from.

I wouldn't sweat it too much.

I still maintain at least one 2.7 code base, and it's not amazing, but it's manageable.

There are plenty of projects on 3.5 and 3.6 around here, and they deploy fine. You don't get the latest shiny things, that's all. So 3.7 would be alright, although indeed, if you get the choice, take a more modern one.

Still, unless you have very hardcore security requirements, the missing patches are unlikely to be your biggest problem.

People are more likely constrained by the OS they deploy on. If you use a Linux LTS distro, you will get stuck on official Python versions for some time.
