Cortex vs.

How Cortex compares.

Honest, factual comparisons between HangarX Cortex and other memory layers for AI agents. We lead with the shared ground, focus on the differences that actually matter, and tell you when the other tool is the better pick.

At a glance

How Cortex stacks up against the most-evaluated memory layers across the dimensions buyers care about most. Click any column for a full comparison.

Dimension: Cortex | Mem0 | Zep | Cognee | Letta | LangMem | LlamaIndex | Pinecone
Corpus-grounded memory (docs, notes, code)
Knowledge graph
Hybrid retrieval pipeline (BM25+vector+graph+rerank)
Claims with provenance / citations
MCP server (cross-tool memory)
Self-host full stack
100% local (no cloud)
Native Obsidian plugin
Legend: Yes, fully supported; Partial / possible with workaround; Not a primary capability
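To make the "hybrid retrieval pipeline" row concrete: hybrid pipelines typically merge a lexical (BM25) ranking and a vector ranking before reranking. The sketch below shows a generic reciprocal rank fusion (RRF) step, a common way to do that merge. It is an illustration of the technique, not Cortex's documented implementation, and the document IDs are made up.

```python
# Illustrative rank-fusion step for a hybrid retrieval pipeline.
# This is generic reciprocal rank fusion (RRF), not Cortex's actual code;
# all document IDs below are hypothetical.

def rrf_fuse(rankings, k=60):
    """Combine several ranked lists into one via reciprocal rank fusion."""
    scores = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            # Each list contributes 1 / (k + rank); documents that rank
            # well in multiple lists accumulate the highest scores.
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

bm25_hits = ["doc_a", "doc_b", "doc_c"]    # lexical (keyword) ranking
vector_hits = ["doc_c", "doc_a", "doc_d"]  # embedding (semantic) ranking
fused = rrf_fuse([bm25_hits, vector_hits])
print(fused)  # documents found by both retrievers rise to the top
```

In a full pipeline, the fused list would then go to a graph-expansion step and a cross-encoder reranker; RRF is popular for the merge because it needs no score normalization across retrievers.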

Direct memory-layer competitors

Tools most often shortlisted alongside Cortex when teams are choosing AI memory infrastructure.

What about these?

Tools that come up in evaluations but solve different problems. Quick takes on each — without a full comparison page.

OpenAI Assistants API memory

OpenAI is rolling out memory features inside the Assistants API. It's tightly coupled to OpenAI's models, opaque about storage, and offers no MCP server — meaning the memory only works with OpenAI agents. Cortex is provider-agnostic and exposes the same memory to every MCP-compatible agent on your machine.

ChatGPT memory

ChatGPT's built-in memory is a consumer feature for personal use, not infrastructure for building agents. It can't be queried by other AI tools, can't ingest your documents at scale, and offers no provenance or auditability. Cortex is the developer-facing infrastructure equivalent.

Anthropic Memory tool

Anthropic has shipped early memory primitives in the Claude API. They're useful for stateful conversations within Claude itself, but they don't extend to other AI tools and aren't built around a knowledge graph or corpus ingestion. Cortex serves the same Claude agent (via MCP) plus every other agent simultaneously.

Notion AI / Notion Q&A

Notion AI is great if your notes live in Notion and you only need answers inside Notion. It's not exposed to external agents and isn't designed as memory infrastructure. Cortex is the equivalent for Obsidian users — and unlike Notion AI, the memory is callable by Claude, Cursor, Cline, Windsurf, and any MCP client.

Smart Connections / Obsidian Copilot / Khoj

These are Obsidian-specific RAG plugins that work inside Obsidian only. Cortex's Obsidian plugin works inside Obsidian too — but the same memory is also exposed over MCP to every other AI tool. The differentiator is cross-tool serving plus the deeper retrieval pipeline (knowledge graph, claims, contradictions).
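"Exposed over MCP" has a concrete meaning: any MCP client queries the server with the same standard JSON-RPC `tools/call` request, so every agent sees the same memory. The sketch below shows that request shape per the MCP spec; the tool name `search_memory` and its arguments are hypothetical placeholders, not Cortex's documented API.

```python
import json

# Hypothetical sketch of what an MCP client sends when it queries a
# memory server's search tool. The envelope (jsonrpc / method / params)
# follows the MCP spec's "tools/call" request; the tool name and
# arguments are assumptions for illustration only.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search_memory",  # hypothetical tool name
        "arguments": {"query": "What did we decide about auth tokens?"},
    },
}
print(json.dumps(request, indent=2))
```

Because Claude, Cursor, Cline, and Windsurf all speak this same protocol, a single server can serve them all without per-tool integration code.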

Want grounded memory across every AI tool?

Skip the build-it-yourself stack. Point Cortex at your corpus and your agents start citing sources — over MCP, across every AI tool you use.