In this post we will take a look at using Pipenv, a dependency manager for Python, to boost your Python workflow.
A virtual environment is an isolated environment in which the dependencies for a Python project are contained. Why should you care about isolating your project environments?
When you install a Python package, it’s installed in a folder called “site-packages” that is located somewhere on your system, depending on your Python installation. You can find this directory by running:
import sys
sys.prefix
For me, the displayed directory is: '/home/stefan/.pyenv/versions/3.8.0'.
So when you install a package:
pip install toolz
And then look at its installation path:
import toolz
toolz.__path__
You’ll find that the installation directory is: ['/home/stefan/.pyenv/versions/3.8.0/lib/python3.8/site-packages/toolz-0.10.0-py3.8.egg/toolz'].
Now, when you’re developing multiple projects, you may have to install different versions of the same package, and that’s where virtual environments turn out to be very handy.
There are tons of virtual environment tools out there: venv, virtualenv, conda, pipenv, pyenv-virtualenv and virtualenvwrapper, just to name a few. In this post we will take a look at Pipenv, which is also the tool used in the tutorial on managing Python dependencies that is referenced after you create a new PyPI account.
The big difference between Pipenv and other virtual environment tools is the Pipfile.lock file it generates, which ensures that deterministic builds are produced. When you install dependencies from a requirements.txt file, pip doesn’t always produce the same environment. This means that you can’t replicate the exact environment of your development machine in production (the “it works on my machine” problem).
A solution to this problem would be to pip freeze the requirements, which pins down the installed dependencies. But that introduces the problem of having to manually keep the package versions up to date, while you shouldn’t necessarily care which precise version is installed.
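For example, freezing the environment from earlier would write pinned versions to a requirements file, roughly like this (the exact contents depend on what you have installed):
pip freeze > requirements.txt
cat requirements.txt
toolz==0.10.0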
As opposed to Pipenv, Pyenv is a tool for managing multiple Python installations. Pipenv is capable of using it in the background to create and activate virtual environments that require different Python versions.
Take a look at my previous post on how to install and use Pyenv. We will also use Pyenv in this post.
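For example, assuming you have Pyenv set up as described in that post, you can make the interpreter used below available with:
pyenv install 3.8.0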
Install Pipenv on your system:
pip install pipenv
Creating a virtual environment is quite easy:
pipenv install toolz --python 3.8.0
After creating the environment, you have to activate it. You can do this by running the following command:
pipenv shell
You can see that it simply executes the .venv/bin/activate script that you often have to run manually with other virtual environment tools.
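To convince yourself that the environment is active, you can check which virtualenv and interpreter are being used (the exact paths will differ on your machine):
pipenv --venv  # prints the location of the project’s virtual environment
which python   # points inside that virtual environment while the shell is active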
To deactivate your virtual environment, run deactivate, or simply exit the subshell:
deactivate
You can also install development dependencies by using the --dev flag, in which case the dependencies won’t be installed in production:
pipenv install -d toolz --python 3.8.0
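On a production machine you would then typically install only the default packages; a minimal sketch, assuming you deploy with the generated Pipfile.lock in place:
pipenv install --deploy  # install from the lock file, abort if it is out of date with the Pipfile
pipenv sync              # install exactly the versions pinned in Pipfile.lock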
We created a virtual environment that uses Python version 3.8.0. By default, Pipenv places its virtual environments in ~/.local/share/virtualenvs/<your-virtualenv>. I prefer to place the folder containing the virtual environment in the project itself. This way you always delete the virtual environment along with your project when you’re done, and the Python installation can easily be selected in editors using a relative path: ./.venv/bin/python.
Remove the virtual environment:
pipenv --rm
Then export a variable called PIPENV_VENV_IN_PROJECT and set it to 1:
export PIPENV_VENV_IN_PROJECT=1
Create the virtual environment again:
pipenv install toolz --python 3.8.0
A .venv/ folder is now created inside your project, alongside the Pipfile and Pipfile.lock.
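For illustration, listing the project directory at this point shows something like the following (assuming an otherwise empty project):
ls -A
Pipfile  Pipfile.lock  .venv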
You should never edit the Pipfile.lock file manually. Also, take a look at the hashes: they ensure that the exact same versions are installed on other machines.
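To get a feeling for what that looks like, you can peek at the entry for toolz in the lock file; it contains the pinned version together with hashes of the package files, roughly like this (hash values elided here):
grep -A 4 '"toolz"' Pipfile.lock
        "toolz": {
            "hashes": [
                "sha256:..."
            ],
            "version": "==0.10.0"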
Your Python application may depend on environment variables, which you may not want to share with others (secret tokens, for example). This is an example of a basic .gitignore file, specifying the files and folders you want to keep out of your remote repository:
.venv/ # your virtual environment is pretty huge
.vscode/ # you may want to share debug configurations sometimes
.mypy_cache/
.pytest_cache/
__pycache__/
*.egg-info
.env # contains your secrets
How do your colleagues know which environment variables are expected when the .env file is not included? You can create a .env.example file that other developers can copy as a template, without sharing the secret values themselves:
echo 'SECRET=' >> .env.example
In your personal .env file, you can fill in the secret values:
echo 'SECRET=P@ssW0rd4' >> .env
echo '.env' >> .gitignore # prevent secrets from being committed
If you are using pipenv, your environment variables are automatically loaded:
➜ pipenv shell
Loading .env environment variables…
Launching subshell in virtual environment…
. /c/Users/StefanSchenk/Desktop/demo-project/.venv/bin/activate
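The same goes for pipenv run, which also loads the .env file; for example, you can read the secret from the earlier example back without activating the shell at all:
pipenv run python -c 'import os; print(os.environ["SECRET"])'
P@ssW0rd4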
If you are already using another virtual environment tool, switching is quite easy. If you run pipenv install, it automatically detects the requirements.txt file:
requirements.txt found, instead of Pipfile! Converting…
Warning: Your Pipfile now contains pinned versions, if your requirements.txt did.
We recommend updating your Pipfile to specify the "*" version, instead.
Or you can explicitly pass the requirements.txt file as an argument, which may be useful if you have put development dependencies in a separate file:
pipenv install -r dev-requirements.txt --dev
And if you want to switch back to using requirements.txt files, you can run:
pipenv lock -r > requirements.txt
pipenv lock -r -d > dev-requirements.txt
Let me know which virtual environment tool you prefer in the comments below!