Give your AI the context it needs. OrbitalMCP stores, indexes, and retrieves knowledge via REST API and the Model Context Protocol.
// Ingest knowledge into OrbitalMCP
await fetch('https://api.orbitalmcp.com/v1/ingest', {
  method: 'POST',
  headers: {
    'Authorization': 'Bearer your_api_key',
    'Content-Type': 'application/json'
  },
  body: JSON.stringify({
    content: 'Your documentation, code, or any text...',
    source: 'my-project',
    metadata: { version: '1.0' }
  })
});
// Query with semantic search
const response = await fetch('https://api.orbitalmcp.com/v1/query', {
  method: 'POST',
  headers: {
    'Authorization': 'Bearer your_api_key',
    'Content-Type': 'application/json'
  },
  body: JSON.stringify({ query: 'How does authentication work?' })
});
const results = await response.json();

Built for developers who want to give their AI agents persistent memory.
Powered by PostgreSQL + pgvector. Your data stays in a battle-tested database, not a black box.
First-class Model Context Protocol support. Works seamlessly with Claude, Cursor, Gemini, and other MCP clients.
Strict tenant separation. Your data is never mixed with others or used for training.
HNSW indexing for sub-millisecond semantic search. Built on Fastify for minimal latency.
Two endpoints: ingest and query. No complex configuration or ML expertise required.
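Because both endpoints accept the same POST-plus-JSON shape, a thin wrapper covers the whole API. The sketch below is a hypothetical convenience helper, not an official SDK: the endpoint paths come from the examples above, while the function names and error handling are illustrative assumptions.

```javascript
// Hypothetical wrapper around OrbitalMCP's two endpoints (not an official SDK).
const BASE_URL = 'https://api.orbitalmcp.com/v1';

// Both /ingest and /query take the same POST + JSON request shape.
function buildRequest(apiKey, payload) {
  return {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${apiKey}`,
      'Content-Type': 'application/json'
    },
    body: JSON.stringify(payload)
  };
}

// Store content in the knowledge base.
async function ingest(apiKey, content, source, metadata = {}) {
  const res = await fetch(`${BASE_URL}/ingest`, buildRequest(apiKey, { content, source, metadata }));
  return res.json();
}

// Semantic search over previously ingested content.
async function query(apiKey, q) {
  const res = await fetch(`${BASE_URL}/query`, buildRequest(apiKey, { query: q }));
  return res.json();
}
```

With this in place, an agent's memory loop reduces to `await ingest(...)` after each session and `await query(...)` before answering.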
Track queries, monitor token usage, and optimize your AI's memory consumption.
Connect any MCP-compatible AI to your knowledge base
search_knowledge_base: Search for relevant documents by semantic similarity. Returns the most relevant chunks from your knowledge base.
add_memory: Store new content in the knowledge base. Content is automatically chunked, embedded, and indexed.
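MCP clients are usually pointed at a server through a JSON configuration file. The sketch below shows the general shape of such an entry; the command, package name (`orbitalmcp-server`), and environment variable are assumptions for illustration — the actual connection details depend on how OrbitalMCP distributes its MCP server.

```json
{
  "mcpServers": {
    "orbitalmcp": {
      "command": "npx",
      "args": ["-y", "orbitalmcp-server"],
      "env": { "ORBITALMCP_API_KEY": "your_api_key" }
    }
  }
}
```

Once connected, the client can call search_knowledge_base and add_memory like any other MCP tools.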