every token counts
Infrastructure for AI agents that waste less and remember more.
Context windows are finite.
Most tokens are wasted on navigation — reading files to find files, scanning docs to find sections, re-learning what was known yesterday.
We build the tools to eliminate that waste.
Three problems. Three libraries. One orchestrator.
attention-matters
Geometric recall on the S³ hypersphere. Quaternion drift, phasor interference, Kuramoto coupling. One brain per developer, not per project. (Coupling sketch below.)
frontmatter-matters
Structural metadata for source files. Auto-generated sidecars via tree-sitter. LLMs know what a file does without reading it.
mdcontext
Structural intelligence for markdown. Hybrid search with BM25 and semantic embeddings. Section-level indexing. (Fusion sketch below.)
Multi-agent orchestrator wrapping Claude Code CLI. Hub-and-spoke coordination with token budgets. Compiles context. Learns from outcomes.
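The attention-matters card leans on the Kuramoto model, the standard description of coupled oscillators pulling one another into phase. How the library ties those phases to quaternion memories on S³ is internal to it, so what follows is only a minimal TypeScript sketch of the textbook update rule; `kuramotoStep` and `orderParameter` are illustrative names, not the library's API.

```ts
// Textbook Kuramoto update: dθ_i/dt = ω_i + (K/N) · Σ_j sin(θ_j − θ_i).
// theta: current phases, omega: natural frequencies, K: coupling strength.
function kuramotoStep(theta: number[], omega: number[], K: number, dt: number): number[] {
  const N = theta.length;
  return theta.map((ti, i) => {
    let coupling = 0;
    for (const tj of theta) coupling += Math.sin(tj - ti);
    const next = ti + dt * (omega[i] + (K / N) * coupling);
    return ((next % (2 * Math.PI)) + 2 * Math.PI) % (2 * Math.PI); // wrap to [0, 2π)
  });
}

// Order parameter r ∈ [0, 1]: how synchronized the population is.
function orderParameter(theta: number[]): number {
  const re = theta.reduce((s, t) => s + Math.cos(t), 0) / theta.length;
  const im = theta.reduce((s, t) => s + Math.sin(t), 0) / theta.length;
  return Math.hypot(re, im);
}
```

With K large enough relative to the spread of ω, repeated steps drive r toward 1. Related memories locking into phase is one plausible reading of "phasor interference," though that mapping is an assumption, not something the card states.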
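Likewise, the mdcontext card names hybrid search over BM25 and semantic embeddings without saying how the two score lists are merged. A min-max-normalized weighted sum is one common fusion; this sketch assumes that choice, and `hybridRank`, `alpha`, and the `Scored` shape are hypothetical rather than mdcontext's API.

```ts
// One entry per retrieved section, carrying both retrieval scores.
interface Scored { id: string; bm25: number; cosine: number; }

// Min-max normalize each score list, then blend: alpha weights BM25 vs. embeddings.
function hybridRank(hits: Scored[], alpha = 0.5): (Scored & { score: number })[] {
  const norm = (xs: number[]) => {
    const lo = Math.min(...xs), hi = Math.max(...xs);
    return xs.map((x) => (hi > lo ? (x - lo) / (hi - lo) : 0));
  };
  const b = norm(hits.map((h) => h.bm25));
  const c = norm(hits.map((h) => h.cosine));
  return hits
    .map((h, i) => ({ ...h, score: alpha * b[i] + (1 - alpha) * c[i] }))
    .sort((x, y) => y.score - x.score);
}
```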
all open source. all composable.
npx -y attention-matters serve                # memory
npx -y frontmatter-matters serve              # code
npx -y --package mdcontext mdcontext-mcp      # docs
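To make the frontmatter-matters idea concrete: tree-sitter's Node bindings can list a file's top-level declarations without a model ever reading the body. The `Sidecar` shape below (`path`, `exports`) is invented for illustration; it is not frontmatter-matters' actual sidecar format.

```ts
import Parser from "tree-sitter";
import JavaScript from "tree-sitter-javascript";

// Hypothetical sidecar shape: enough for an LLM to route without reading the file.
interface Sidecar { path: string; exports: string[]; }

function buildSidecar(path: string, source: string): Sidecar {
  const parser = new Parser();
  parser.setLanguage(JavaScript);
  const tree = parser.parse(source);
  const exports: string[] = [];
  for (const node of tree.rootNode.children) {
    // Collect top-level function and class names as a cheap structural summary.
    if (node.type === "function_declaration" || node.type === "class_declaration") {
      const name = node.childForFieldName("name");
      if (name) exports.push(name.text);
    }
  }
  return { path, exports };
}
```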