We've all done it:{% raw %}
~~~bash
pip install numpy
# run into permissions issues
sudo pip install numpy # or "sudo !!" for the power users ;)
~~~{% endraw %}
End of story: it works, and it's what the README says, so what's wrong with it?
You see, when you install a library through `sudo pip`, you are executing {% raw %}`setup.py`{% endraw %} with root permissions. That file can harbor arbitrary, possibly malicious code. You are also messing with the system's Python libraries, which invariably leads to issues down the line.
Relevant XKCD (thx u/truh):
![xkcd](https://imgs.xkcd.com/comics/python_environment.png)
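To see why this matters, here is a minimal sketch of what a malicious `setup.py` could do (the file contents and print are made up for illustration): pip imports and executes the file, so any top-level code runs at install time, with whatever privileges pip has.

~~~python
# sketch of a setup.py's top-level code (illustrative, not a real package):
# pip executes this file during installation, so anything at module level
# runs immediately -- as root, if you installed with "sudo pip install".
import getpass

user = getpass.getuser()
print(f"setup.py executing as: {user}")  # could just as well run os.system(...)

# a legitimate setup.py would then call setuptools.setup(name=..., version=...)
~~~

Under `sudo`, that `getuser()` call would report `root`, and so would anything nastier in its place.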
A much better and more secure way would be:{% raw %}
~~~bash
pip install --user numpy # libs will be installed in ~/.local/lib
~~~{% endraw %}
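If you're curious where those per-user packages actually land, Python will tell you directly (the exact path varies by OS and Python version):

~~~python
# print the per-user directories used by "pip install --user"
import site

print(site.USER_BASE)  # e.g. ~/.local on Linux
print(site.USER_SITE)  # e.g. ~/.local/lib/pythonX.Y/site-packages
~~~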
This is better, and works fine for installing applications, but it doesn't solve the problem of different projects needing different versions of the same library. Enter [pipenv](https://pipenv.readthedocs.io/en/latest/). {% raw %}`pipenv` is to Python what `composer` is to PHP: it lets you **easily** install and use libraries per project. It's not the only tool that does this, but it's the one I use, so it's the one I'm going to show you. Example:
~~~bash
pipenv install numpy matplotlib pandas
# to start your program
pipenv run ./crunch-data.py
# to install libs from another machine, after a git pull:
pipenv sync
# to get a shell in the env (like `source myenv/bin/activate` for venv)
pipenv shell
~~~{% endraw %}
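The commands above revolve around two files pipenv maintains: a `Pipfile` listing what you asked for, and a `Pipfile.lock` pinning every dependency to an exact version plus hashes. That lock file is what `pipenv sync` replays on another machine. A minimal sketch of its shape (the lock data below is a made-up stand-in, not a real lock file):

~~~python
import json

# made-up excerpt mimicking the "default" section of a Pipfile.lock
lock_text = """
{"default": {"numpy": {"version": "==1.16.2", "hashes": ["sha256:deadbeef"]}}}
"""
lock = json.loads(lock_text)

for name, info in lock["default"].items():
    # pipenv sync installs exactly these pinned versions, nothing newer
    print(f"{name} pinned at {info['version']}")
~~~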
This gives you a reproducible environment for your program, without resorting to Docker and without messing up user or system libraries. Save yourself some future bugs and start using pipenv, venv, conda, or virtualenv right away! It's much better than {% raw %}`requirements.txt`{% endraw %} + pip. :)
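And if you'd rather stay with the standard library, `venv` gets you most of the way there. A minimal sketch, using a temporary directory as an arbitrary location:

~~~python
# create an isolated environment with the stdlib venv module
import os
import tempfile
import venv

env_dir = os.path.join(tempfile.mkdtemp(), "myenv")  # arbitrary location
venv.create(env_dir, with_pip=False)  # with_pip=True also bootstraps pip

# every venv contains a pyvenv.cfg file marking it as an environment
print(os.path.exists(os.path.join(env_dir, "pyvenv.cfg")))
~~~

After that, `source myenv/bin/activate` drops you into the environment, just like `pipenv shell` does.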
Cheers,
~Nico