Core Data Structures & Concepts
While Hollama is a user-facing application and not a library, understanding its core data structures is essential for contributors. This document outlines the key TypeScript interfaces that model the application's state.
Server Connections
Defines the configuration for connecting to an LLM server.
Source: src/lib/connections.ts
export enum ConnectionType {
  Ollama = 'ollama',
  OpenAI = 'openai',
  OpenAICompatible = 'openai-compatible'
}

export interface Server {
  id: string;
  baseUrl: string;
  connectionType: ConnectionType;
  isVerified: Date | null;
  isEnabled: boolean;
  label?: string;
  modelFilter?: string;
  apiKey?: string;
}
id: A unique identifier for the server connection.
baseUrl: The API endpoint of the LLM server.
connectionType: The type of server, which determines the chat strategy to use.
isVerified: A timestamp indicating when the connection was successfully verified, or null if it has not been.
isEnabled: Whether to use models from this server.
label: A user-defined name for the connection.
modelFilter: A prefix used to filter the list of models from this server.
apiKey: The API key, used for OpenAI and OpenAI-compatible servers.
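To make the shape concrete, the sketch below constructs two Server objects: one for a local Ollama instance and one for OpenAI. The import path, ID generation, and field values are illustrative assumptions, not taken from Hollama's code.

// Illustrative example; the import path and ID scheme are assumptions.
import { ConnectionType, type Server } from './connections';

// A verified, enabled local Ollama server.
const localOllama: Server = {
  id: crypto.randomUUID(), // Hollama's actual ID generation may differ
  baseUrl: 'http://localhost:11434',
  connectionType: ConnectionType.Ollama,
  isVerified: new Date(),
  isEnabled: true,
  label: 'Local Ollama'
};

// An OpenAI connection: not yet verified, model list filtered to GPT models.
const openAi: Server = {
  id: crypto.randomUUID(),
  baseUrl: 'https://api.openai.com/v1',
  connectionType: ConnectionType.OpenAI,
  isVerified: null,
  isEnabled: true,
  apiKey: 'sk-…', // placeholder
  modelFilter: 'gpt-'
};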
Sessions
A session represents a single, continuous conversation.
Source: src/lib/sessions.ts
export interface Session {
  id: string;
  messages: Message[];
  systemPrompt: Message;
  options: Partial<OllamaOptions>;
  model?: Model;
  updatedAt?: string;
  title?: string;
}

export interface Message {
  role: 'user' | 'assistant' | 'system';
  content: string;
  knowledge?: Knowledge;
  context?: number[];
  reasoning?: string;
  images?: { data: string; filename: string }[];
}
Session:
id: A unique ID for the session.
messages: An array of Message objects representing the conversation history.
systemPrompt: A special Message object with role: 'system'.
options: A partial OllamaOptions object containing any advanced parameters for this session.
model: The Model object used for the session.
updatedAt: Timestamp of the last modification.
title: An optional title for the session.
Message:
role: Who sent the message: 'user', 'assistant', or 'system'.
content: The text content of the message.
knowledge: An attached Knowledge item.
context: A token context array returned by the Ollama completion API, used to continue a previous generation.
reasoning: Reasoning or thought process from the model, extracted from tags like <think>.
images: An array of attached images, each with base64 data and a filename.
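As a rough sketch (the import path, option values, and message contents are assumptions for illustration), a minimal session with a system prompt and one exchange could look like this:

// Illustrative only; import paths and values are assumptions.
import type { Session, Message } from './sessions';

const systemPrompt: Message = {
  role: 'system',
  content: 'You are a concise assistant.'
};

const session: Session = {
  id: crypto.randomUUID(),
  systemPrompt,
  messages: [
    { role: 'user', content: 'Summarize the README in one sentence.' },
    {
      role: 'assistant',
      content: 'Hollama is a minimal web UI for chatting with Ollama and OpenAI models.',
      reasoning: 'The user wants a single-sentence summary…' // extracted from <think> tags, when present
    }
  ],
  options: { temperature: 0.7 }, // Partial<OllamaOptions>: only the parameters the user changed
  updatedAt: new Date().toISOString()
};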
Knowledge
Represents a reusable piece of content, often used as a system prompt.
Source: src/lib/knowledge.ts
export interface Knowledge {
  id: string;
  name: string;
  content: string;
  updatedAt: string;
}
id: A unique identifier.
name: A user-defined name for the knowledge item.
content: The text content of the item.
updatedAt: Timestamp of the last modification.
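For example, a knowledge item can be attached to a message or reused as a session's system prompt. The wiring below is a hypothetical sketch; import paths and values are assumptions:

// Hypothetical sketch; import paths are assumptions.
import type { Knowledge } from './knowledge';
import type { Message } from './sessions';

const styleGuide: Knowledge = {
  id: crypto.randomUUID(),
  name: 'Team style guide',
  content: 'Always answer in bullet points and cite sources.',
  updatedAt: new Date().toISOString()
};

// Reusing the knowledge item as a session's system prompt.
const systemPrompt: Message = {
  role: 'system',
  content: styleGuide.content,
  knowledge: styleGuide
};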
Chat Strategy
The application uses a strategy pattern to communicate with different types of LLM servers.
Source: src/lib/chat/index.ts
export interface ChatStrategy {
  chat(
    payload: ChatRequest,
    abortSignal: AbortSignal,
    onChunk: (content: string) => void
  ): Promise<void>;

  getModels(): Promise<Model[]>;

  pull?(
    payload: PullRequest,
    onChunk: (progress: ProgressResponse | StatusResponse | ErrorResponse) => void
  ): Promise<void>;
}
chat: Sends a chat request and streams the response.
getModels: Fetches the list of available models from the server.
pull: (Optional) For Ollama, pulls a new model from the library.
Implementations of this interface exist in src/lib/chat/ollama.ts and src/lib/chat/openai.ts.
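A plausible, simplified use of the pattern is to pick a strategy from the server's connectionType and stream the response through onChunk. Only the ChatStrategy interface above is taken from the source; the class names, constructor signatures, and the ChatRequest import here are assumptions:

// Simplified sketch; class names and constructor signatures are assumptions.
import type { ChatRequest } from 'ollama'; // ChatRequest is assumed to come from the ollama package
import type { ChatStrategy } from './chat';
import { ConnectionType, type Server } from './connections';
import { OllamaStrategy } from './chat/ollama';
import { OpenAIStrategy } from './chat/openai';

function getChatStrategy(server: Server): ChatStrategy {
  // Ollama gets its own strategy; OpenAI and OpenAI-compatible servers share one.
  return server.connectionType === ConnectionType.Ollama
    ? new OllamaStrategy(server)
    : new OpenAIStrategy(server);
}

async function streamChat(server: Server, payload: ChatRequest): Promise<string> {
  const strategy = getChatStrategy(server);
  const controller = new AbortController();
  let completion = '';

  // Each streamed chunk of content is appended as it arrives.
  await strategy.chat(payload, controller.signal, (content) => {
    completion += content;
  });

  return completion;
}

The benefit of the pattern is that session code stays unaware of which API it is talking to; supporting a new server type only requires a new ChatStrategy implementation.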