Managing Python Environments

python
install
fast.ai
Author

Christian Wittmann

Published

October 24, 2025

I have a confession to make… Ever since I set up my Mac, I’ve been happily pip-installing all the Python packages I ever needed into the base environment. Foreseeably, this resulted in a dependency nightmare, followed by a dependency deadlock. The honest analysis of my setup was that I didn’t understand what I was doing: I had simply followed the recommendations and focused on learning how to build my projects. I have no regrets, but now it was time to take a step back and learn more about managing what I had installed.

In the spirit of what Jeremy was teaching regarding the initial fast.ai setup, I wanted to build a system that allows me to continue working in an iterative and explorative way, but which is more stable and manageable in the long term. I ended up doing a fresh Miniforge installation and setting up a few environments in an automated way. This blog post is a summary of my activities and of the background information I learned along the way.

Installations vs. Environments

Let’s start with some basic terminology and an as-is analysis. If you’re familiar with the basics, feel free to skip ahead to the next section.

On any system running Python, there is at least one installation and one environment, but what is the difference between the two? In a nutshell, the installation is the base layer that includes the Python interpreter, on which you can have one or more environments. The installation contains all the tooling for creating and running environments, and this installation is specific to the operating system or underlying hardware it is running on: For example, Windows or macOS (on Intel or Apple Silicon).

To understand how these pieces fit together, let’s first examine the installation layer. Then we will see how environments build upon it.

What is a Python installation?

More specifically, a Python installation is a self-contained directory tree that provides:

  • a Python executable (the interpreter)
  • its standard library (e.g., os, sys, json, etc.)
  • tools to manage packages or environments (e.g., pip, conda, venv)
  • a way to compile or locate C-extensions for that platform: This means it can either compile C code using the system’s compiler, or download pre-built binaries (wheels) that match your specific architecture (e.g., Apple Silicon vs Intel).

An installation is tied to the operating system:

  • It’s built for a specific CPU architecture and OS layout.
  • The OS can “see” it via a path like /opt/homebrew/bin/python3 or /Users/chrwittm/miniforge3/bin/python.

So, an installation is a real physical presence on disk, known to the OS, with binaries that can be executed directly.
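
To make this concrete, you can ask the shell which installations it can see. The exact output depends on your machine; these are just example commands:

which -a python3                            # every python3 on your PATH, in order of precedence
python3 -c "import sys; print(sys.prefix)"  # the installation directory that python3 resolves to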

What is a Python Environment (in conda/mamba)?

Building on an installation, a Python environment is a sandboxed workspace with its own Python interpreter and packages. In a Miniforge installation, environments are created and managed with conda or mamba (a faster drop-in replacement for conda, especially for package resolution). Each environment is represented by a directory that contains:

  • its own Python interpreter binary: With conda/mamba, Python itself is treated just like any other package, so each environment can have its own version (unlike venv, which always uses the parent installation’s Python version)
  • its own site-packages folder for installed libraries (e.g. numpy or pandas)
  • its own activation script: When you run conda activate myenv it changes PATH and sets some environment variables to tell the shell: “Use this Python and these packages.”

In short: environments share the underlying toolchain while maintaining complete independence for Python versions and packages.
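
On disk this is easy to see. For example, with a hypothetical environment called myenv (the paths and the Python version folder will differ on your machine):

conda env list                                            # all environments conda knows about
ls ~/miniforge3/envs/myenv/bin/python                     # the environment's own interpreter
ls ~/miniforge3/envs/myenv/lib/python3.12/site-packages   # the environment's own packages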

As-is analysis: How many installations are there?

There is not just one Python installation on any given Mac; there are usually several.

In my case, there were (and still are) 3 installations. Here is where to find them and what they are good for; a quick way to check each one follows the list.

  • Apple’s Python stub at /usr/bin/python3

    • It’s tied to Xcode/Command Line Tools and can change with macOS updates.
    • Do not change this installation.
  • Homebrew Python at /opt/homebrew/bin/python3

    • It’s tied to Homebrew and is installed via Homebrew (either explicitly or as a dependency).
    • Do not change this installation.
  • Your own Miniforge installation, e.g. at ~/miniforge3

    • This is your own Python installation, fully under your control.
    • Its base environment is the default environment used in the terminal.
    • This base environment should be clean - a rule I broke, which we’re going to fix.
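
To check what each of these installations is, you can call them by their full paths. The versions will differ, and the Homebrew one will only exist if you (or a Homebrew package) installed it:

/usr/bin/python3 --version              # Apple's stub (Command Line Tools)
/opt/homebrew/bin/python3 --version     # Homebrew's Python (if installed)
~/miniforge3/bin/python --version       # your own Miniforge installation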

Before starting with the installation task, let’s dive a bit deeper and understand what the base environment actually is and does.

What is the base environment?

When you launch the Terminal on macOS (after installing Miniforge), the base environment is typically activated as the default environment.

The base environment lives directly in ~/miniforge3/ while all other environments are created under ~/miniforge3/envs/<name>/.

When you activate the base environment, conda prepends its bin directory to the shell’s PATH variable, like this:

export PATH="/Users/chrwittm/miniforge3/bin:$PATH"

This ensures that when you type python, pip, or conda, the shell finds and executes the versions from the base environment, not from macOS or Homebrew.

To reiterate a subtle point: Activation doesn’t start Python (no binaries are executed), it just changes your shell’s environment variables so that all Python-related commands now point to the environment’s specific directory.
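
You can see this for yourself by comparing what the shell resolves before and after activation (the paths in the comments are examples; yours will differ):

conda deactivate            # start without any active environment
which python3               # e.g. /usr/bin/python3 or Homebrew's python3
conda activate base
which python3               # now e.g. /Users/chrwittm/miniforge3/bin/python3
echo $CONDA_DEFAULT_ENV     # prints the name of the active environment: base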

Now that we understand what base is and how it works, the question becomes: what should we actually use it for?

The goal: A clean base environment

The only installation you should modify is your own, in my case, Miniforge. The other installations are managed by macOS or Homebrew.

In the Miniforge installation, the base environment should remain clean. Its only purpose is to create and manage other environments. In other words, it should be a clean, ARM-native Python ‘factory’ on macOS to manage all your environments.

My initial setup contains 3 environments:

  • A playground environment which takes the role of what base was for me previously, an environment where I can freely experiment with packages, but which I can re-create whenever necessary
  • A quarto-blogging environment for writing this blog
  • A fast.ai environment (fastai-latest), which also contains pytorch, so that I can run all the fast.ai projects I have built in the past and want to continue building in the future.

Re-installing Miniforge

After diving into the theory, let’s (re-)install Miniforge, which is a quick and easy process. First, I’ll describe how to remove an existing Miniforge installation (my case), followed by the installation itself. If you are starting fresh, just skip the optional steps.

Back up existing environments that you want to keep (if you have any)

If you want to keep specific environments, export each one first (replace myenv with the environment’s name and myenv_backup with a descriptive file name):

conda env export -n myenv --no-builds > myenv_backup.yml
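
If you have several environments to back up, a small loop can export them all in one pass. This is just a sketch, assuming your environments live under the default envs/ directory and were created with names rather than prefixes:

# export every named environment located under .../envs/ (the base environment is skipped)
for env in $(conda env list | awk '/\/envs\// {print $1}'); do
  conda env export -n "$env" --no-builds > "${env}_backup.yml"
done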

Remove Old Installation (if you have one)

Delete the old Miniforge directory, including all environments:

conda deactivate 2>/dev/null || true    # deactivate, suppressing error messages
rm -rf ~/miniforge3                     # deleting Miniforge

Download the Latest Miniforge Installer (Apple Silicon in my case)

The Miniforge repo has installers for different platforms. For Apple Silicon Macs, we need the ARM64 version (Miniforge3-MacOSX-arm64.sh).

In the terminal, run the following commands:

cd ~/Downloads
curl -LO https://github.com/conda-forge/miniforge/releases/latest/download/Miniforge3-MacOSX-arm64.sh

Run the Installer

To start the installation, run:

bash Miniforge3-MacOSX-arm64.sh

Accept the terms and conditions, the default path, and also answer “yes” to the question “Do you wish to update your shell profile to automatically initialize conda?”. This will update your ~/.zshrc file to initialize conda any time you launch a terminal.

Either restart the terminal (recommended) or run source ~/.zshrc. As a result, you should see that the base environment is active, indicated by (base) in front of your terminal prompt.

Afterwards you can verify the installation by running:

which python        # should return the miniforge3 path
python --version    # should return the Python version, e.g. Python 3.12.11
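
You can also confirm that the package management tooling itself comes from the new installation (recent Miniforge installers ship both conda and mamba; the exact versions will differ):

conda --version     # conda from the new installation
mamba --version     # mamba ships with Miniforge
conda info --base   # should return the miniforge3 path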

Creating the playground

We want the base environment to stay clean. Instead, let’s create a playground environment that we can use as the default.

mamba create -n playground python=3.12 -c conda-forge

This command creates a new isolated environment named playground inside the Miniforge installation and installs Python 3.12 and its dependencies from the conda-forge community channel (a trusted source of packages).

To activate the playground, use this command:

conda activate playground
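
To double-check that you are really working inside the new environment, you can ask Python where it lives (the path should end in envs/playground):

python -c "import sys; print(sys.prefix)"   # e.g. /Users/chrwittm/miniforge3/envs/playground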

Automatic activation of playground environment

Since the goal is to keep the base environment clean, let’s activate the playground by default when you start a new terminal.

To achieve this, we first need to disable the automatic activation of the base environment:

conda config --set auto_activate_base false

To automatically activate the playground, add the following line to your ~/.zshrc:

conda activate playground

To see the effect, either restart the terminal (recommended) or run source ~/.zshrc. As a result, you should see that the playground environment is active, indicated by (playground) in front of your terminal prompt.
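
If you prefer to do this from the command line, the following snippet appends the activation line only if it is not already present (a small convenience sketch, assuming zsh and the default ~/.zshrc location):

# add the line to ~/.zshrc only once, then reload the shell configuration
grep -qxF 'conda activate playground' ~/.zshrc || echo 'conda activate playground' >> ~/.zshrc
source ~/.zshrc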

Creating environments with configuration files

Our playground isn’t complete yet. It would be helpful to have basic packages like numpy or pandas preinstalled, so that when exploring new ideas, I don’t have to install them each time. Additionally, to help prevent a future dependency nightmare, it would be great to be able to reproduce a fresh playground with minimal effort: if the playground ever gets messed up, I can simply refresh it without a second thought.

The way to do this is to create the environment from a YAML configuration file. I started with a basic collection of packages, which will grow over time. (More on how to manage that evolution in the next section.)

name: playground

channels:
  - conda-forge     # primary open-source channel with Apple Silicon builds

dependencies:
  - python=3.12
  - ipython         # enhanced REPL
  - jupyterlab      # notebook / lab environment
  - ipywidgets      # interactive widgets for Jupyter
  - numpy           # numerical computing
  - pandas          # data analysis
  - matplotlib      # plotting library

To (re-)create the playground environment, you can simply run:

mamba env create -f playground.yml
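
As a side note, if the environment already exists and you only want to bring it in line with the YAML file, you can update it in place instead of recreating it (how thoroughly --prune removes unlisted packages has varied between conda versions, so treat it as a convenience rather than a guarantee):

conda env update -f playground.yml --prune   # sync an existing environment to the YAML file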

Managing environments via a GitHub repo

To keep the YAML files for creating the playground and other environments consistent across multiple machines, I created a GitHub repo called python-environments.

Additionally, I implemented a creation script for each environment to fully automate the process. The script also takes care of tasks like making sure the respective environment is not active during the refresh.

Here is the playground creation script as an example.

#!/usr/bin/env bash
set -e  # exit on first error

echo "🔄 Rebuilding Playground environment..."

# Source shared safety check
source ../helpers/ensure_no_env.sh  # Check that no environment is currently active

# Remove old environment if it exists
conda remove -n playground --all -y || true

# Recreate from YAML
mamba env create -f playground.yml

echo "✅ Playground environment rebuilt successfully."
echo ""
echo "To activate it, run:"
echo "conda activate playground"

Assuming you are in the root directory of the repo, you can run the script like this:

cd playground
bash rebuild_playground.sh
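
The shared helper script is not shown above, but a minimal version of such a safety check might look like this (a sketch: it refuses to run while an environment other than base is active, and since the file is sourced, exiting from it aborts the calling rebuild script):

# helpers/ensure_no_env.sh - minimal sketch of the safety check
# Abort if a conda environment other than base is active, so a rebuild
# script never tries to delete the environment it is currently running in.
if [[ -n "${CONDA_DEFAULT_ENV:-}" && "${CONDA_DEFAULT_ENV}" != "base" ]]; then
  echo "❌ Please deactivate '${CONDA_DEFAULT_ENV}' first (run: conda deactivate)."
  exit 1
fi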

Final Thoughts

Just like any good project, resolving the dependency nightmare took a bit longer than initially expected. But I’m very happy with the result, because I have transitioned from following setup instructions to creating my own end-to-end Python environment workflow, one that supports my experimental working style.

Infrastructure projects like this one can feel like a distraction from building, but they’re actually investments for the future. This new setup not only solved my initial problem (the dependency nightmare), but it will make me a lot more productive in the future.

As a side note, I first encountered Python environments when deploying services to the cloud as I have described in a previous blog post. With my new setup, my local workflow now mirrors the cloud workflow, which should make future cloud projects smoother.

One last reflection: This experience reinforced my view that you should learn skills when you need them. For me, this was exactly the right time to learn about managing Python environments. Had I spent time on it earlier, my enthusiasm would most likely have been much lower. I find learning to solve a specific problem much more effective than learning something just in case I might need it in the future.