# Installation Guide
This guide covers various methods to install and set up AIDE ML, from a simple pip
install to running it within a Docker container.
## Prerequisites
- Python 3.10 or higher.
- An API key for an LLM provider (e.g., OpenAI, Anthropic, Gemini).
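If you are unsure whether your environment meets the Python requirement, a quick check before installing can save a failed setup. This is just a sanity check; on some systems the interpreter may be named `python3.10` or `python` instead of `python3`:

```bash
# Confirm the interpreter is 3.10 or newer
python3 --version
```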
## Standard Installation (PyPI)
The easiest way to get started with AIDE ML is to install it from PyPI.
```bash
pip install -U aideml
```

This single command installs the `aide` CLI tool and all its core dependencies.
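To confirm the install succeeded, you can ask pip for the package metadata and check that the package imports. The import check below assumes the import name is `aide` (matching the CLI name); adjust if your installed version differs:

```bash
# Show the installed version and location of the distribution
pip show aideml

# Sanity-check that the package imports cleanly
python -c "import aide; print('aide imported OK')"
```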
## Development Installation
If you plan to contribute to AIDE ML or want to experiment with the source code, you should perform a development installation from a cloned repository.
- Clone the repository:

  ```bash
  git clone https://github.com/WecoAI/aideml.git
  cd aideml
  ```

- Create and activate a virtual environment (recommended):

  ```bash
  python3.10 -m venv .venv
  source .venv/bin/activate
  ```

- Install in editable mode:

  ```bash
  pip install -e .
  ```

  This command installs the package and its dependencies. The `-e` (editable) flag ensures that any changes you make to the source code are immediately reflected when you run the `aide` command.
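A quick way to confirm the editable install is wired up correctly is to check where Python resolves the package from; the path should point into your clone rather than into `site-packages`. This again assumes the import name is `aide`:

```bash
# Should print a path inside the cloned aideml directory,
# not inside site-packages
python -c "import aide; print(aide.__file__)"
```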
### Using the Makefile
The project includes a `Makefile` that automates the development setup.

```bash
# This command will create a venv, activate it, and install dependencies
make install
```
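If you later open a new shell, re-activate the virtual environment before running `aide`. Assuming the Makefile uses the same `.venv` path as the manual steps above, that looks like:

```bash
# Re-activate the development environment in a fresh shell
source .venv/bin/activate
```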
## Docker Installation
For a completely isolated and reproducible environment, you can use Docker. The `Dockerfile` provided in the repository sets up a container with Python 3.10 and all necessary dependencies.
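If you have not used Docker on this machine before, it is worth confirming that the Docker CLI is installed and can reach the daemon before building:

```bash
# Confirm the Docker CLI is installed and the daemon is reachable
docker --version
docker info
```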
- Build the Docker image:

  From the root of the repository, run:

  ```bash
  docker build -t aide .
  ```

  You can also use the `Makefile` shortcut:

  ```bash
  make docker-build
  ```

- Run the Docker container:

  The `README.md` and `Makefile` provide a `docker run` command that mounts local directories for logs, workspaces, and data into the container. This allows the agent inside the container to access your data and save its results to your local machine.

  ```bash
  docker run -it --rm \
    -v "$(pwd)/logs:/app/logs" \
    -v "$(pwd)/workspaces:/app/workspaces" \
    -v "$(pwd)/aide/example_tasks:/app/data" \
    -e OPENAI_API_KEY="your-actual-api-key" \
    aide data_dir=/app/data/house_prices goal="Predict price" eval="RMSE"
  ```

  The `Makefile` provides a convenient shortcut for this command:

  ```bash
  make docker-run
  ```
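To run the agent on your own dataset instead of the bundled examples, you can change the data mount in the command above. The following sketch only swaps in a hypothetical host-side path and placeholder task description; everything else mirrors the command shown earlier:

```bash
# Replace /path/to/your/dataset, the goal, and the eval metric with your own.
docker run -it --rm \
  -v "$(pwd)/logs:/app/logs" \
  -v "$(pwd)/workspaces:/app/workspaces" \
  -v "/path/to/your/dataset:/app/data" \
  -e OPENAI_API_KEY="your-actual-api-key" \
  aide data_dir=/app/data goal="Describe your prediction goal" eval="Your evaluation metric"
```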
## API Key Configuration
AIDE ML requires an API key to communicate with an LLM provider. You must set this as an environment variable.
For OpenAI models:
```bash
export OPENAI_API_KEY=<your-key>  # Get one from https://platform.openai.com/api-keys
```
For other providers like Anthropic, Gemini, or OpenRouter, set the corresponding environment variable (`ANTHROPIC_API_KEY`, `GEMINI_API_KEY`, or `OPENROUTER_API_KEY`). See the Configuration page for more details.
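For example, using the variable names listed above, the exports look like this; set only the one for the provider you actually use:

```bash
# Anthropic models
export ANTHROPIC_API_KEY=<your-key>

# Gemini models
export GEMINI_API_KEY=<your-key>

# OpenRouter models
export OPENROUTER_API_KEY=<your-key>
```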