Source: @amontalenti
There's a zoo of tools available for defining and managing Python environments, and for maintaining code quality. Here's one common and durable mix of third-party tools that does a good job with packaging, deployment, testing, and linting.
For local development and local dependency environments, pyenv with its pyenv-virtualenv plugin is very future-proof. Plus, pyenv has support for miniconda, Python 2 vs 3 issues, PyPy, etc. for situations where you need that. It's also a good choice for simple Python "environment-based" deployments to remote servers. See this StackOverflow entry on why this is a solid choice. If you stick with pyenv and its built-in virtualenv/venv plugin, you'll be very future-proofed and very standard.
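As a sketch of that workflow, assuming pyenv and the pyenv-virtualenv plugin are already installed (the version number and project name here are illustrative):

```shell
# Install an interpreter and create a named virtualenv from it
pyenv install 3.12.3
pyenv virtualenv 3.12.3 myproject-3.12

# Record the environment in .python-version so it activates
# automatically whenever you cd into the project directory
cd myproject/
pyenv local myproject-3.12

# Verify which interpreter is now active
python --version
```

The same `pyenv install` / `pyenv local` pair works on a remote server, which is what makes it suitable for simple environment-based deployments.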
For actually installing dependencies, you'll probably want to avoid the debates going on in the community around pipenv and poetry, and stick with pip, layering on pip-tools (if you need version pinning) later on.
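The pip-tools layer amounts to two commands on top of pip. A minimal sketch, with requests standing in for whatever your real dependencies are:

```shell
# requirements.in lists only your direct dependencies, unpinned
echo "requests" > requirements.in

# pip-compile (from pip-tools) resolves the full dependency tree
# and writes requirements.txt with exact version pins
pip install pip-tools
pip-compile requirements.in

# pip-sync makes the current environment match the pinned file exactly,
# installing what's missing and removing what's extra
pip-sync requirements.txt
```

You commit both files; collaborators and servers install from the pinned requirements.txt, while you keep editing the short requirements.in.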
Creating the setup.py boilerplate and using setuptools is a good idea if you're publishing your library to a private or public PyPI server. It's generally a "set-it-and-forget-it" thing, so you don't need to overthink it.
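That boilerplate really is short. A minimal sketch of a setup.py, where the project name and metadata are placeholders for your own:

```python
# setup.py -- minimal setuptools boilerplate; "myproject" and the
# metadata below are placeholders, not a real published package
from setuptools import setup, find_packages

setup(
    name="myproject",
    version="0.1.0",
    description="An example package",
    packages=find_packages(),
    python_requires=">=3.8",
    install_requires=[
        "requests",  # runtime dependencies go here
    ],
)
```

With this in place, `pip install -e .` gives you a local editable install, and the standard build/upload tools can produce wheels for your PyPI server.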
flake8 is a good choice for linting, as it combines pycodestyle (formerly named pep8) with pyflakes, and that's usually all you need on the linting side. It's very common in the community these days to use the black formatter, which is similar in principle to Go's gofmt code formatter. But this is optional.
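Both tools are run the same way from the command line. A sketch, assuming your code lives under a src/ directory:

```shell
pip install flake8 black

# Lint: reports style violations (pycodestyle) and likely bugs
# such as unused imports and undefined names (pyflakes)
flake8 src/

# --check only reports which files black would reformat;
# drop the flag to rewrite the files in place
black --check src/
black src/
```

A common setup is flake8 in CI to fail the build, and black run locally (or via a pre-commit hook) so formatting never comes up in code review.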
pytest is totally fine and very popular, but you also won't get any odd looks for sticking with unittest and doctest in the stdlib.
Wheels are now future-proof: they are enshrined by approved PEPs in the core community, and the PyPA maintains the wheel project.
However, there's quite a lot of history behind packaging, which sends confusing messages to new users. Python started with setuptools and easy_install, then later added pip, which definitely improved over easy_install. But then, later, people realized pinning was useful for the way Python was deployed, so someone built pip-tools, and Anaconda worked on conda after GvR said packaging was "uninteresting" to the core team. Then in the last couple of years, a couple of well-known Python F/OSS folks built poetry and pipenv. It's just the free-wheeling nature of a very open F/OSS community, especially since the Python core team has decided not to "bless" any one packaging/installer tool (except to ratify pyproject.toml and wheels, and that only happened somewhat recently).
When one really thinks about it, though, the only "schism" in the community is between PyPI and Conda. PyPI definitely holds the npmjs.com and maven.org position in the Python community, whereas Conda is an "alternative packaging ecosystem" that focuses on more complex setup/deployment scenarios. For example, Apache Spark with pyspark support is out of scope for PyPI (which is just for "Python packages") but is in scope for conda-forge.