# Deployment with Docker
Using Docker is the recommended way to run AIDE ML in a consistent and isolated environment, ensuring that all dependencies and system requirements are met without conflicting with your local setup.
## Dockerfile Overview

The project includes a multi-stage `Dockerfile` that optimizes the final image size.
- **Builder Stage:**
  - Starts from a `python:3.10-slim` base image.
  - Installs build tools like `gcc`.
  - Copies only the necessary files (`requirements.txt`, `setup.py`, source code).
  - Creates a virtual environment inside `/opt/venv`.
  - Installs all Python dependencies from `requirements.txt` and the `aideml` package itself into this virtual environment.
- **Runtime Stage:**
  - Starts from a fresh `python:3.10-slim` image.
  - Installs only essential runtime dependencies like `unzip`.
  - Creates a non-root user `aide` for enhanced security.
  - Copies the pre-built virtual environment from the `builder` stage.
  - Copies the application source code.
  - Sets up `logs` and `workspaces` directories and assigns ownership to the `aide` user.
  - Switches to the non-root `aide` user.
  - Sets the `ENTRYPOINT` to `aide`, making the container directly executable as the CLI tool. A sketch of this two-stage layout is shown below.
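To make the two stages concrete, here is a minimal sketch of such a multi-stage Dockerfile. It is not the repository's actual `Dockerfile`; the exact copy paths, package names, and user-creation details are assumptions based on the description above.

```dockerfile
# ---- Builder stage: install dependencies into an isolated virtual env ----
FROM python:3.10-slim AS builder

# Build tools needed to compile native extensions
RUN apt-get update && apt-get install -y --no-install-recommends gcc \
    && rm -rf /var/lib/apt/lists/*

# Virtual environment that will be copied into the runtime image
RUN python -m venv /opt/venv
ENV PATH="/opt/venv/bin:$PATH"

WORKDIR /app
# Copy only what is needed to install the package (illustrative file layout)
COPY requirements.txt setup.py ./
COPY aide ./aide
RUN pip install --no-cache-dir -r requirements.txt \
    && pip install --no-cache-dir .

# ---- Runtime stage: minimal image that only runs the installed CLI ----
FROM python:3.10-slim

# Essential runtime dependencies only
RUN apt-get update && apt-get install -y --no-install-recommends unzip \
    && rm -rf /var/lib/apt/lists/*

# Non-root user for enhanced security
RUN useradd --create-home aide

# Reuse the pre-built virtual environment from the builder stage
COPY --from=builder /opt/venv /opt/venv
ENV PATH="/opt/venv/bin:$PATH"

WORKDIR /app
COPY --chown=aide:aide aide ./aide

# Writable directories for experiment logs and agent workspaces
RUN mkdir -p /app/logs /app/workspaces \
    && chown -R aide:aide /app/logs /app/workspaces

USER aide
ENTRYPOINT ["aide"]
```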
## Building the Image

To build the Docker image, navigate to the root of the repository and run:

```bash
docker build -t aide .
```

Alternatively, you can use the provided `Makefile`:

```bash
make docker-build
```
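Once the build finishes, you can confirm that the image is available locally (the `aide` tag matches the build command above):

```bash
# List the freshly built image; the REPOSITORY column should show "aide"
docker image ls aide
```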
## Running the Container

To run AIDE ML inside the container, you need to mount local directories for data, logs, and workspaces, and provide your LLM API key as an environment variable.

### Basic Run Command

This command runs the `house_prices` example:
```bash
docker run -it --rm \
  -v "${LOGS_DIR:-$(pwd)/logs}:/app/logs" \
  -v "${WORKSPACE_BASE:-$(pwd)/workspaces}:/app/workspaces" \
  -v "$(pwd)/aide/example_tasks:/app/data" \
  -e OPENAI_API_KEY="your-actual-api-key" \
  aide data_dir=/app/data/house_prices goal="Predict price" eval="RMSE"
```
### Command Breakdown

- `docker run -it --rm`: Runs the container in interactive mode (`-it`) and removes it after it exits (`--rm`).
- `-v "$(pwd)/logs:/app/logs"`: Mounts your local `./logs` directory to `/app/logs` inside the container. All experiment logs will be saved here.
- `-v "$(pwd)/workspaces:/app/workspaces"`: Mounts your local `./workspaces` directory. The agent's temporary files and sandboxed environments will be created here.
- `-v "$(pwd)/aide/example_tasks:/app/data"`: Mounts a local data directory (in this case, the example tasks) to `/app/data` inside the container.
- `-e OPENAI_API_KEY="..."`: Passes your API key into the container as an environment variable.
- `aide`: The name of the image to run.
- `data_dir=/app/data/house_prices ...`: The arguments passed directly to the `aide` CLI entrypoint inside the container. Notice that the `data_dir` path now refers to the path inside the container where you mounted the data.
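The same pattern works for your own tasks: mount the directory containing your data to a path inside the container and point `data_dir` at that path. The following sketch uses a hypothetical local directory name and placeholder `goal`/`eval` values:

```bash
# Run AIDE ML on your own dataset; ./my_task_data is an illustrative path
docker run -it --rm \
  -v "$(pwd)/logs:/app/logs" \
  -v "$(pwd)/workspaces:/app/workspaces" \
  -v "$(pwd)/my_task_data:/app/data" \
  -e OPENAI_API_KEY="your-actual-api-key" \
  aide data_dir=/app/data goal="Predict the target column" eval="accuracy"
```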
## Using the Makefile

The `Makefile` simplifies this process. First, ensure your `OPENAI_API_KEY` is exported in your shell, then run:

```bash
make docker-run
```
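For example (the `LOGS_DIR` and `WORKSPACE_BASE` overrides are optional, and assume the Makefile target uses the same variables as the `docker run` command shown earlier):

```bash
# Make the API key visible to make and, through it, to the container
export OPENAI_API_KEY="your-actual-api-key"

# Optionally choose where logs and workspaces are written on the host
export LOGS_DIR="$HOME/aide-logs"
export WORKSPACE_BASE="$HOME/aide-workspaces"

make docker-run
```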