LiiADA Augmented Intelligence Framework
A modular, plug-and-play Python framework for building production-grade AI agents. Hexagonal architecture, multi-LLM support, and native HCP context integration — without the lock-in.
Why another agent framework?
Most frameworks give you tools to call LLMs. LiiADA gives you a system to build agents that stay coherent across sessions, recover from errors deterministically, and audit every decision they make.
Built for teams that can't afford hallucinations in production. Every agent module is independently testable, swappable, and observable.
Plug-and-play modules
13 independent modules. Swap LLM providers, memory backends, or RAG engines without touching business logic.
Persistent context via HCP
Native integration with HCP protocol. Agents read and update structured context automatically — no more session amnesia.
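The HCP protocol itself is not yet public, so the following is only a minimal sketch of the general idea: structured context stored on disk so that a fresh session reads back what a previous one recorded. The `FileContextStore` class and the JSON layout are illustrative assumptions, not LiiADA's actual API; only the `.procontext/` vault path appears elsewhere on this page.

```python
import json
from pathlib import Path


class FileContextStore:
    """Toy persistent context: structured state that survives restarts."""

    def __init__(self, vault: str) -> None:
        # Hypothetical layout: one JSON document inside the vault directory.
        self.path = Path(vault) / "context.json"
        self.path.parent.mkdir(parents=True, exist_ok=True)

    def load(self) -> dict:
        if self.path.exists():
            return json.loads(self.path.read_text())
        return {"decisions": [], "constraints": []}

    def record_decision(self, decision: str) -> None:
        ctx = self.load()
        ctx["decisions"].append(decision)
        self.path.write_text(json.dumps(ctx, indent=2))


store = FileContextStore(vault=".procontext/")
store.record_decision("use JWT for auth")
# A brand-new session loads the same file — no amnesia.
print(store.load()["decisions"][-1])  # → use JWT for auth
```

The point is the contract, not the storage: any backend that can load and merge structured context between sessions would serve the same role.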
Observable by default
Every agent decision is logged with full traceability. Built-in metrics and OpenTelemetry-compatible tracing.
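LiiADA's tracing internals aren't published, so here is a minimal sketch of the decision-audit idea using only the standard library: every decision call gets a trace id, a duration, and a logged outcome. The `audited` decorator and `choose_tool` function are hypothetical names for illustration; a real setup would emit OpenTelemetry spans instead of log lines.

```python
import functools
import logging
import time
import uuid

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("agent.audit")


def audited(fn):
    """Log each call with a trace id, duration, and result status."""

    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        trace_id = uuid.uuid4().hex[:8]
        start = time.perf_counter()
        log.info("trace=%s decision=%s start", trace_id, fn.__name__)
        result = fn(*args, **kwargs)
        elapsed_ms = (time.perf_counter() - start) * 1000
        log.info("trace=%s decision=%s ok in %.1fms", trace_id, fn.__name__, elapsed_ms)
        return result

    return wrapper


@audited
def choose_tool(task: str) -> str:
    # Toy decision: route lookup tasks to search, everything else to code.
    return "search" if "find" in task else "code"


print(choose_tool("find the docs"))  # → search
```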
13 Independent Modules
Each module is a ports & adapters unit — replaceable without changing the rest of the system.
Base framework and DI container
Ollama, OpenAI, Anthropic, Gemini
Episodic + semantic memory
Multi-domain RAG system
Structured prompt engine
Intelligent caching layer
Metrics, tracing, logging
MCP protocol integration
Tool registration and execution
External system connectors
Session lifecycle management
Decision audit trail
HCP context integration
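Since the module code isn't public yet, the sketch below shows only the ports & adapters shape the list describes: business logic depends on a small protocol (the port), and any concrete provider (the adapter) can be swapped in at construction time. `LLMPort`, `EchoLLM`, and `Agent` are illustrative names, not LiiADA classes.

```python
from typing import Protocol


class LLMPort(Protocol):
    """Port: the only LLM surface the business logic ever sees."""

    def complete(self, prompt: str) -> str: ...


class EchoLLM:
    """Toy adapter standing in for Ollama, OpenAI, etc."""

    def complete(self, prompt: str) -> str:
        return f"echo: {prompt}"


class Agent:
    """Depends on the port, never on a concrete provider."""

    def __init__(self, llm: LLMPort) -> None:
        self.llm = llm

    def run(self, task: str) -> str:
        return self.llm.complete(task)


# Swapping providers is a one-line change where the agent is built.
agent = Agent(llm=EchoLLM())
print(agent.run("hello"))  # → echo: hello
```

Because `LLMPort` is a structural protocol, any class with a matching `complete` method satisfies it without inheriting from anything, which is what makes each module independently replaceable and testable.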
Minimal Agent in 10 Lines
Swap any module without touching the rest.
from liiada.core import AgentBuilder
from liiada.llm import OllamaProvider
from liiada.memory import EpisodicMemory
from liiada.hcp import HCPContextLoader
agent = (
    AgentBuilder()
    .with_llm(OllamaProvider(model="llama3"))
    .with_memory(EpisodicMemory())
    .with_context(HCPContextLoader(vault=".procontext/"))
    .build()
)
response = await agent.run("Implement the user auth endpoint")
# Agent reads .procontext/ → knows stack, decisions, constraints
Any LLM. Your Infrastructure.
No vendor lock-in. Run fully local or hybrid.
Ollama (local)
OpenAI
Anthropic
Google Gemini
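One way "no lock-in" typically works in practice is config-driven provider selection: the provider name lives in configuration, so moving from local to hosted (or back) changes no business logic. The stub classes and `provider_from_config` factory below are a sketch under that assumption, not LiiADA's real adapters.

```python
# Stub adapters — placeholders for the real Ollama/OpenAI/Anthropic/Gemini ones.
class OllamaProvider:
    def __init__(self, model: str) -> None:
        self.name = f"ollama:{model}"


class OpenAIProvider:
    def __init__(self, model: str) -> None:
        self.name = f"openai:{model}"


PROVIDERS = {"ollama": OllamaProvider, "openai": OpenAIProvider}


def provider_from_config(cfg: dict):
    """Look the adapter up by name; callers never import it directly."""
    return PROVIDERS[cfg["provider"]](model=cfg["model"])


# Fully local today, hybrid tomorrow — only the config dict changes.
local = provider_from_config({"provider": "ollama", "model": "llama3"})
print(local.name)  # → ollama:llama3
```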
Launching May 2026
Star the repo to get notified when LiiADA goes public.