Thanks for that article. Though I've only skimmed it so far, I've found a lot of great points.
I've always used pipenv for my projects, and the "only" problem I sometimes have is with "pipenv lock". But your article made me aware that this is actually not good: when I one day have to port all of my projects to a new computer, I might deeply regret not having working Pipfile.lock files for all of them. I must admit I haven't taken this seriously enough, but I've already figured out a workaround.
I'll probably use venv from now on. No more hassle with "pipenv lock" not working.
But one thing puzzles me: how do you create venvs via "python -m venv name_of_venv" with a specific Python version when not using pyenv? Do you install the desired version globally and then select that Python binary when creating the venv? I'm not sure it's even possible or advisable to install multiple Python versions globally. I'm on macOS and always use the official installer from the Python website to install Python globally, but I've never tried installing more than one. Is that good, or even possible?
EDIT: Ah ok, reading parts of your article again: on Mac you should do something like "python3.8 -m venv .venv". But does that mean I can safely install several Python versions using the official installer?
I believe it's worth reading your article in full. I'll do that soon.
Yes, you can safely install several Python versions on the same machine using the official installer. I have an article about installing Python for this very purpose: https://www.bitecode.dev/p/installing-python-the-bare-minimum
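For instance (a minimal sketch; the python.org installers on macOS install side by side, and each adds a version-suffixed command):

```bash
# After installing, say, 3.8 and 3.11 from python.org, both commands coexist:
python3.8 --version     # Python 3.8.x
python3.11 --version    # Python 3.11.x

# Pick the interpreter per project when creating the venv:
python3.8 -m venv .venv
.venv/bin/python --version   # confirms this venv is pinned to 3.8
```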
Thanks for your answer!
I must say I really like your blog! I've already found some very interesting articles with great in-depth explanations.
I've been coding in Python since 2019, and it has become my favorite language. I always try to learn new things in small chunks, but as your blog title already acknowledges, there is only limited time for that.
I knew that venv is still the standard way to handle virtual environments, but I've been fine using pipenv, or at least I believed so until now. I think I'll switch completely to venv; that will also solve some other issues for me. And installing pyenv and other tools is always a bit of work, e.g. when setting up a new Linux VPS, so now I can save that time.
I also like the TL;DR at the beginning of your posts. That way it's possible to grab the essential info quickly and read the details later.
Thanks again! Great work!
P.S. I'm wondering what your opinion is on pipx. I use it almost exclusively for installing global packages that provide CLI tools. But I recently ran into some strange issues with it, though it wasn't clear whether they were caused by pipx or by something else.
I should have included pipx in the article, because you are not the first person to ask about this.
Basically, I think it was a good concept, but it brought as many issues as it solved:
- you need to install it first, and it's not a standalone executable, so there are many ways for people to mess that up.
- the way it patches the PATH is imperfect, so sometimes you just don't get the intended effect. That is, when people read the docs and actually patch the PATH in the first place.
- when you install pipx, there is a subtle trap: you install it with a certain version of Python. That version conditions all the packages you are going to install with it, which means you may install mypy or black for Python 3.7, then run them on a 3.11 codebase and get an error because of the new syntax. Most people don't know this (see the sketch after this list).
- as you said, it has random issues, so I eventually stopped using it because I need something more reliable.
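To illustrate that version trap (a minimal sketch; the --python flag exists in recent pipx versions, check yours):

```bash
# pipx itself was installed with, say, Python 3.7, so by default
# every tool it installs runs on that same interpreter:
pipx install black
black my_module.py   # may fail on a codebase using 3.11-only syntax

# Pinning the interpreter explicitly avoids the mismatch:
pipx install --python python3.11 black
```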
All in all, it's the same problem as with other tools: it solves some issues, but brings more modes of failure.
I can deal with modes of failure, it's my job.
But most people don't want to.
Thanks again for your reply! You seem to be a true Python expert. I read in one of your blog posts that you started out with Python 2.4 (I hope I remember that right), so you must have more than a decade of experience, plus a lot of teaching experience.
It was really difficult for me to find answers to my pipx problem a few months ago. A user on Reddit gave me a fairly plausible explanation, but yours here is far better.
Maybe you could write a dedicated article about pipx? Just a suggestion. I believe a lot of people would be interested.
Or you could just add a pointer in your article to this comment regarding pipx, because I think you've already explained it pretty well here.
So you're just using a standard global pip installation for CLI packages (like "pip install <package> --user")?
I'm the kind of person who always tries to collect the best news and knowledge resources, and I must say your blog is by far the best I've come across in the Python realm in a long time. Thanks again!
You should install every tool in the virtualenv of each project. It will take more space on disk and seems like a waste, but eventually it will save you a lot of trouble.
If you need something at the system level (which, again, should be very, very rare: black, mypy or pylint should not be there, and apps should be distributed in another format than wheels), you can fall back on one big venv for all your system tools and little scripts.
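A minimal sketch of that fallback (the paths and the example tool are hypothetical; adapt them to your setup):

```bash
# One big venv for system-wide tools and little scripts:
python3 -m venv ~/.venvs/tools
~/.venvs/tools/bin/pip install httpie   # httpie stands in for any CLI tool here

# Entry points work without activating the venv:
~/.venvs/tools/bin/http --version
# Or put the venv's bin/ on PATH once (e.g. in ~/.bashrc) to call them directly:
export PATH="$HOME/.venvs/tools/bin:$PATH"
```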
Hi again,
I've played with your methods a bit, but unfortunately I believe I've discovered a weakness.
I took the opportunity to read a bit about venv in the Python docs (though not strictly in depth). It looks like venv basically adds the virtual environment's Python interpreter to the beginning of the PATH variable (speaking of Linux here, by the way; I've tested this on a Linux VPS). And "source venv_name/bin/activate" is just a bash script that makes that change to the PATH variable (along with some other stuff I don't fully understand, because my bash skills are mediocre).
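As far as I can tell, the core of it is roughly this (a simplified sketch, not the full script):

```bash
# activate saves the old PATH so deactivate() can restore it later:
_OLD_VIRTUAL_PATH="$PATH"
VIRTUAL_ENV="/path/to/venv_name"
export VIRTUAL_ENV
# ...then prepends the venv's bin/ so its python and pip shadow the system ones:
PATH="$VIRTUAL_ENV/bin:$PATH"
export PATH
```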
I'm into web3 hacking, where I use, for example, a Python tool called "solc-select", which is meant to be installed globally via "pip install solc-select --user". That tool provides a CLI command: for example, you can run "solc-select install 0.8.15", which installs version 0.8.15 of the Solidity compiler and puts it in a folder in your home directory, "~/.solc-select".
It's useful to run something like "solc-select install all", which installs all existing Solidity compilers; that's better for daily work. All these compilers together are around 750 MB (growing with each newly released Solidity version). I've found out that the "solc-select" CLI command is just another script, stored at "~/.local/bin/solc-select", which is in your PATH.
Now, when I follow your suggestion and set up everything fresh in a venv for each project, the solc-select script is stored in "venv_name/bin". You can then call it normally via the "solc-select" command as long as the virtual environment is activated.
But when you then run "solc-select install all", the Solidity compilers won't be downloaded to "~/.solc-select", but to "venv_name/.solc-select" instead.
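To make the difference concrete (these are the locations I observed in my own tests, with the solc-select version I have; a sketch, not behavior I can guarantee in general):

```bash
# Global --user install: command and compilers live under $HOME
pip install solc-select --user
which solc-select            # -> ~/.local/bin/solc-select
solc-select install all      # -> compilers end up in ~/.solc-select

# Per-project venv install: everything is scoped to the venv instead
source venv_name/bin/activate
pip install solc-select
which solc-select            # -> venv_name/bin/solc-select
solc-select install all      # -> compilers end up in venv_name/.solc-select
```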
All Solidity compilers together are currently 775 MB, and the size is continuously increasing, because development in that space is fast paced.
That means I would have to download 775 MB for each new project and keep it stored. That's a bit of overkill.
My conclusion: while your method makes sense, the problem is that the Python community is already focused on approaches like global installation via pip (or pipx; I know projects where pipx is explicitly recommended for installation). Global tools in particular are designed as global tools; their devs don't expect people to install them in a virtual environment.
Maybe I could install global tools into a global venv as you've suggested. But you can only activate one venv at a time. And while you can permanently add that global venv's path to your PATH to be able to call scripts like "solc-select", the next problem is that I also use other global Python tools like "slither", which rely on a globally installed solc-select and will check whether it is installed.
At this point, it's getting too complicated for me. Going against the flow of the Python community sometimes comes at too high a price.
Nevertheless, I've learnt a lot today. I might try to find some compromises.
Ok cool. There are some further questions popping up in my mind, but I believe I know enough now to figure out the rest by myself.
You have some really interesting takes on all this; I've never come across anything similar on the web before. I also totally agree with the "keep it simple" and "keep it to the basics" mindset. Nothing is worse or more time consuming than surprising errors that you have to sink tons of time into just to figure out what's going on.
Python is largely about "keep it simple" anyway, so that mindset suits the language as well.