Deployment with Docker
Using Docker is the recommended way to run AIDE ML in a consistent and isolated environment, ensuring that all dependencies and system requirements are met without conflicting with your local setup.
Dockerfile Overview
The project includes a multi-stage Dockerfile that optimizes the final image size.
- Builder Stage:
  - Starts from a `python:3.10-slim` base image.
  - Installs build tools like `gcc`.
  - Copies only the necessary files (`requirements.txt`, `setup.py`, source code).
  - Creates a virtual environment inside `/opt/venv`.
  - Installs all Python dependencies from `requirements.txt` and the `aideml` package itself into this virtual environment.
- Runtime Stage:
  - Starts from a fresh `python:3.10-slim` image.
  - Installs only essential runtime dependencies like `unzip`.
  - Creates a non-root user `aide` for enhanced security.
  - Copies the pre-built virtual environment from the `builder` stage.
  - Copies the application source code.
  - Sets up the `logs` and `workspaces` directories and assigns ownership to the `aide` user.
  - Switches to the non-root `aide` user.
  - Sets the `ENTRYPOINT` to `aide`, making the container directly executable as the CLI tool.
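For orientation, here is a condensed sketch of what such a two-stage Dockerfile can look like. It is illustrative only: the exact package lists, `COPY` targets, and directory layout are assumptions and may differ from the file shipped in the repository.

```dockerfile
# Illustrative sketch of the two-stage layout described above; not the repository's exact Dockerfile.

# --- Builder stage: install dependencies into an isolated virtual environment ---
FROM python:3.10-slim AS builder

# Build tools needed to compile native Python extensions
RUN apt-get update && apt-get install -y --no-install-recommends gcc \
    && rm -rf /var/lib/apt/lists/*

# Virtual environment that will be copied into the runtime image
RUN python -m venv /opt/venv
ENV PATH="/opt/venv/bin:$PATH"

WORKDIR /app
COPY requirements.txt setup.py ./
COPY aide ./aide
RUN pip install --no-cache-dir -r requirements.txt \
    && pip install --no-cache-dir .

# --- Runtime stage: minimal image containing only what the CLI needs ---
FROM python:3.10-slim

RUN apt-get update && apt-get install -y --no-install-recommends unzip \
    && rm -rf /var/lib/apt/lists/*

# Non-root user for the container process
RUN useradd --create-home aide

# Reuse the pre-built virtual environment from the builder stage
COPY --from=builder /opt/venv /opt/venv
ENV PATH="/opt/venv/bin:$PATH"

WORKDIR /app
COPY --chown=aide:aide aide ./aide
RUN mkdir -p /app/logs /app/workspaces && chown -R aide:aide /app/logs /app/workspaces

USER aide
ENTRYPOINT ["aide"]
```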
Building the Image
To build the Docker image, navigate to the root of the repository and run:
docker build -t aide .
Alternatively, you can use the provided Makefile:
make docker-build
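To confirm the image exists locally after the build, you can list it with the standard Docker CLI:

```bash
# Lists images in the "aide" repository built above
docker images aide
```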
Running the Container
To run AIDE ML inside the container, you need to mount local directories for data, logs, and workspaces, and provide your LLM API key as an environment variable.
Basic Run Command
This command runs the `house_prices` example:
docker run -it --rm \
-v "${LOGS_DIR:-$(pwd)/logs}:/app/logs" \
-v "${WORKSPACE_BASE:-$(pwd)/workspaces}:/app/workspaces" \
-v "$(pwd)/aide/example_tasks:/app/data" \
-e OPENAI_API_KEY="your-actual-api-key" \
aide data_dir=/app/data/house_prices goal="Predict price" eval="RMSE"
Command Breakdown
- `docker run -it --rm`: Runs the container in interactive mode (`-it`) and removes it after it exits (`--rm`).
- `-v "${LOGS_DIR:-$(pwd)/logs}:/app/logs"`: Mounts your local `./logs` directory (or `$LOGS_DIR`, if set) to `/app/logs` inside the container. All experiment logs will be saved here.
- `-v "${WORKSPACE_BASE:-$(pwd)/workspaces}:/app/workspaces"`: Mounts your local `./workspaces` directory (or `$WORKSPACE_BASE`, if set). The agent's temporary files and sandboxed environments will be created here.
- `-v "$(pwd)/aide/example_tasks:/app/data"`: Mounts a local data directory (in this case, the bundled examples) to `/app/data` inside the container.
- `-e OPENAI_API_KEY="..."`: Passes your API key into the container as an environment variable.
- `aide`: The name of the image to run.
- `data_dir=/app/data/house_prices ...`: The arguments passed directly to the `aide` CLI entrypoint inside the container. Notice that the `data_dir` path refers to the path inside the container where you mounted the data.
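To run the agent on your own dataset, mount the directory containing it and point `data_dir` at the corresponding path inside the container. The host path, goal, and eval strings below are placeholders for illustration:

```bash
# Hypothetical custom task: replace the mounted path, goal, and eval with your own
docker run -it --rm \
  -v "$(pwd)/logs:/app/logs" \
  -v "$(pwd)/workspaces:/app/workspaces" \
  -v "/path/to/my_task:/app/data" \
  -e OPENAI_API_KEY="your-actual-api-key" \
  aide data_dir=/app/data goal="Predict churn" eval="AUC"
```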
Using the Makefile
The Makefile simplifies this process. First, ensure your `OPENAI_API_KEY` is exported in your shell, then run:
make docker-run
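For example, in a POSIX shell:

```bash
# Export the key so the Makefile target can pass it into the container
export OPENAI_API_KEY="your-actual-api-key"
make docker-run
```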