Codebase Memory MCP
High-performance code intelligence MCP server for AI coding agents. Indexes a codebase into a queryable knowledge graph in milliseconds, with 14 MCP tools spanning structural search, call-chain tracing, impact analysis, dead-code detection, and Cypher queries. Single static C binary, 66 languages via tree-sitter, zero runtime dependencies.
“The most active server in this batch and one of the highest-momentum projects in the entire directory: 100 commits in the last 30 days, 31 releases shipped in the 9 weeks since first commit, and an arXiv preprint describing the architecture. It indexes the Linux kernel (28M LOC, 75K files) in 3 minutes, answers structural queries in under 1ms, and ships as a single static C binary with zero dependencies. SLSA Level 3 attestation bundles ship with every release. It is designed for AI coding agents that need persistent code intelligence across sessions: tree-sitter AST parsing across 66 languages plus LSP-style hybrid type resolution for Go, C, and C++. The arXiv preprint (2603.27277) reports the trade-off: 83% answer quality versus 92% for a file-exploration agent, achieved with 10x fewer tokens across 31 real-world repositories. The pitch is the cost-quality curve: slightly lower raw quality at dramatically lower token cost, which matters when agents work in large repositories session after session.”
INSTALL THIS SERVER
{
  "mcpServers": {
    "codebase-memory": {
      "command": "codebase-memory-mcp"
    }
  }
}
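Once the config above registers the binary, an MCP client launches it and speaks JSON-RPC 2.0 over stdio. A minimal sketch of the message shapes involved, assuming the standard MCP lifecycle (`initialize`, `tools/list`, `tools/call`); the tool name `find_callers` and its arguments are hypothetical placeholders, not confirmed names from this server — consult the server's actual `tools/list` response:

```python
import json

def jsonrpc(method, params=None, id=None):
    """Build a JSON-RPC 2.0 message of the kind MCP clients send over stdio."""
    msg = {"jsonrpc": "2.0", "method": method}
    if params is not None:
        msg["params"] = params
    if id is not None:
        msg["id"] = id
    return json.dumps(msg)

# Handshake, then ask the server which tools it exposes.
init = jsonrpc("initialize", {
    "protocolVersion": "2025-03-26",  # placeholder spec revision
    "capabilities": {},
    "clientInfo": {"name": "demo-client", "version": "0.1"},
}, id=1)
list_tools = jsonrpc("tools/list", id=2)

# Hypothetical call-chain tool; real tool names come from tools/list.
call = jsonrpc("tools/call", {
    "name": "find_callers",
    "arguments": {"symbol": "parse_config"},
}, id=3)
```

In normal use the coding agent performs this exchange itself; the sketch is only meant to show what "14 MCP tools" means at the wire level.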
9 TOOLS AVAILABLE
OUR ASSESSMENT
- 100 commits in the last 30 days with 31 releases shipped; the most actively developed server in this batch.
- Single static binary with zero runtime dependencies (no Docker, no Node, no Python).
- 66 languages via vendored tree-sitter grammars compiled into the binary.
- LSP-style hybrid type resolution for Go, C, and C++.
- Auto-detects 11 coding agents (Claude Code, Codex CLI, Gemini CLI, Zed, OpenCode, Antigravity, Aider, KiloCode, VS Code, OpenClaw, Kiro) on install.
- SLSA Level 3 attestation bundles per release; binaries are signed and checksummed.
- Optional 3D graph UI for human exploration of the knowledge graph.
- Infrastructure-as-code indexing: Dockerfiles, Kubernetes manifests, Kustomize overlays as graph nodes with cross-references.
- Local-only processing; codebase data stays on the operator machine.
- C codebase requires C build toolchain for source-level audits; most operators consume the binary.
- 9-week project age means edge-case behaviour is still maturing despite the high commit count.
- The arXiv paper has not been peer-reviewed; the published benchmarks come from the maintainer's own evaluation harness.
All processing happens locally; codebase data stays on the host machine. Release binaries are signed and checksummed; SLSA Level 3 attestation bundles ship alongside each binary asset. The install command writes to agent configuration files (Claude Code, VS Code, etc.) and pre-tool hooks; review the install script before running on locked-down hosts. OpenSSF Scorecard score: 5.8/10. Source is fully open at github.com/DeusData/codebase-memory-mcp for audit.
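Since releases ship with checksums, a downloaded binary can be verified before first run. A minimal sketch of that pattern, assuming the common `<hex>  <filename>` format produced by `sha256sum` (the checksums filename and layout for this project's releases are not confirmed here):

```python
import hashlib
import os

def sha256_of(path):
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(binary_path, checksums_path):
    """True if the binary's digest matches its entry in the checksums file."""
    name = os.path.basename(binary_path)
    want = None
    with open(checksums_path) as f:
        for line in f:
            parts = line.split()
            if len(parts) == 2 and parts[1].lstrip("*") == name:
                want = parts[0]
    return want is not None and want == sha256_of(binary_path)
```

For the SLSA attestation bundles themselves, a tool such as `slsa-verifier` or `gh attestation verify` is the appropriate check; the snippet above covers only the checksum half.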
Best suited to AI coding agents working in large repositories where token cost matters, teams operating polyglot codebases (66 supported languages), and engineers who want structural impact analysis (caller graphs, HTTP route dependencies, blast-radius queries).
TECHNICAL DETAILS
ADOPTION METRICS
// 1,929 stars and 216 forks in the first 9 weeks of the project. The arXiv preprint and SLSA Level 3 attestations elevate the trust signal.
// First-ranked in the AI/ML category; near top-5 globally by recent activity and editorial weight.
SOURCES & VERIFICATION
We don't take any single directory's word for it. Before scoring, we cross-reference 5 public MCP sources, install the server ourselves against the clients we cover, and record when we last re-verified.
The same server, 5 different lenses. We reconcile these signals into our editorial score, which is why our number sometimes diverges from a directory-aggregate star count.
| Source | Their rating | Their star count | Their downloads | Last synced |
|---|---|---|---|---|
| AutomationSwitch (this page) | 4.6 (editorial) | 1,929 | — | APR 29, 2026 |
| PulseMCP | — unrated | unavailable | unavailable | APR 29, 2026 |
| MCP.so | — unrated | unavailable | unavailable | APR 29, 2026 |
| Glama | — unrated | unavailable | unavailable | APR 29, 2026 |
| Smithery | — unrated | unavailable | unavailable | APR 29, 2026 |
| Official MCP Registry | — unrated | unavailable | unavailable | APR 29, 2026 |
// Counts are directory-reported; we don't adjust them. Discrepancies usually come from different snapshot times or star-caching.
OTHER AI / ML MCP SERVERS
Qdrant MCP Server
Official Qdrant vector database MCP server. Acts as a semantic memory layer on top of Qdrant: store information with metadata, retrieve via similarity search. Two tools, very small surface area, exceptionally maintained by the Qdrant team. Configurable embedding provider (fastembed default), supports remote and local Qdrant clusters.
ElevenLabs MCP
Official ElevenLabs MCP server. Wraps the full ElevenLabs API surface: text-to-speech, voice cloning, speech-to-text, dubbing, sound effect generation, audio isolation, voice design. MIT-licensed, distributed via PyPI as elevenlabs-mcp. Free tier with 10,000 credits per month.
Pinecone Developer MCP
Official Pinecone MCP for developer workflows. Lets coding assistants search Pinecone documentation, list and configure indexes, generate code informed by index data, and upsert/query records during dev iteration. Apache-2.0, npm-distributed as @pinecone-database/mcp.
DISCUSS YOUR MCP REQUIREMENTS
Whether you are evaluating a server, scoping an internal deployment, or working out whether MCP is the right fit at all, start the conversation and we will point you at the right piece of the ecosystem.