Open Brain — The infrastructure layer for your thinking. One database, one AI gateway, one chat channel — any AI plugs in. No middleware, no SaaS.
The memory layer for AI-native development — giving AI persistent understanding of your software projects.
Human-like memory for AI agents — semantic, episodic & procedural. Experience-driven procedures that learn from failures. Free API, Python & JS SDKs, LangChain, CrewAI & OpenClaw integrations.
Shared Memory Storage for Multi-Agent Systems
The Developer Brain for AI coding agents
TypeScript agent memory layer: semantic vector recall + SQLite-backed storage, Chroma or in-memory vectors, REST API, MIT.
Benchmark suite for evaluating retrieval quality and latency of AI agent context systems
The lightest universal AI memory layer. One SQLite file, any LLM, zero cloud. MCP + HTTP + CLI. Smart Recall, Knowledge Evolution, Auto-Capture, Interactive Dashboard.
Synapse Layer — Continuous Consciousness Infrastructure for AI Systems. Persistent. Secure. 1-line integration.
Unified memory layer for AI coding agents: incremental transcript sync, ranked search, archive/restore.
Vendor-neutral memory layer for AI agents. Give ChatGPT, Claude, Cursor, Gemini, and Grok shared persistent memory. TypeScript SDK, MCP server, REST API.
🧠 Persistent memory for AI agents. SQLite for agent state. Zero cloud dependencies. Local embeddings. MCP-native integration with Claude Desktop/Code, Cursor, Windsurf & more.
Maximem Synap is the memory layer that makes AI agents remember. #1 on LongMemEval (90.2%). Works natively with LangChain, LlamaIndex, CrewAI, Google ADK, AutoGen, OpenAI Agents, Semantic Kernel, Haystack, and Pydantic AI.
Persistent identity and memory across AI tools - mcp-native, local-first, framework-agnostic, production-ready.
Ultimate memory layer for local LLMs: mitigates the limitations of the context window.
SQL Native Memory Layer for LLMs, AI Agents & Multi-Agent Systems
Memory and recall layer for AoA: explicit memory objects, provenance threads, temporal relevance, salience, and reviewable recall contracts.
Go bindings for Honcho - Persistent memory layer for AI applications. Type-safe API client for contextual conversation memory.
A memory layer for LLMs that extracts entities, relationships, and decisions from conversations and stores them in a semantic knowledge graph, giving any AI persistent, structured context across sessions.
Accurate context. Lowest cost. Your data. — vault-centric knowledge, memory consolidation, and cross-topic insights for humans + agents, with CLI, MCP, and Hub.
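Nearly every project above implements the same core loop: persist facts as embedded vectors, then recall the most semantically similar ones at query time. A minimal, hypothetical TypeScript sketch of that pattern — the names (`MemoryLayer`, `remember`, `recall`) and the toy bag-of-letters embedding are illustrative only, not the API of any repository listed here:

```typescript
// Hypothetical sketch of the store/recall pattern shared by memory-layer projects.
// A real implementation would use an embedding model and a vector store
// (SQLite, Chroma, etc.) instead of this toy in-memory version.

type Memory = { text: string; vector: number[] };

// Cosine similarity between two equal-length vectors.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb) || 1);
}

// Toy embedding: letter-frequency vector, standing in for a real model.
function embed(text: string): number[] {
  const v = new Array(26).fill(0);
  for (const ch of text.toLowerCase()) {
    const i = ch.charCodeAt(0) - 97;
    if (i >= 0 && i < 26) v[i] += 1;
  }
  return v;
}

class MemoryLayer {
  private memories: Memory[] = [];

  // Persist a fact alongside its embedding.
  remember(text: string): void {
    this.memories.push({ text, vector: embed(text) });
  }

  // Rank stored memories by similarity to the query; return the top k.
  recall(query: string, k = 3): string[] {
    const qv = embed(query);
    return [...this.memories]
      .sort((a, b) => cosine(b.vector, qv) - cosine(a.vector, qv))
      .slice(0, k)
      .map((m) => m.text);
  }
}

const brain = new MemoryLayer();
brain.remember("user prefers TypeScript over Python");
brain.remember("project uses SQLite for storage");
console.log(brain.recall("which language does the user prefer?", 1));
```

The features these repos advertise on top of this loop — provenance threads, temporal relevance, knowledge evolution, MCP integration — are refinements of the same store-then-rank core.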