This MCP server implements Retrieval-Augmented Generation (RAG) over documents stored in a local directory, letting AI assistants query and interact with Git repositories and text files. Built in TypeScript on LlamaIndex with Google's Gemini embeddings, it exposes tools for listing indexed documents, running RAG queries against the document collection, and adding new content by cloning Git repositories or downloading text files. A Gemini API key is required for document indexing and querying. The server is designed for integration with Claude Desktop, making it well suited to users who need to reference and analyze documentation, code repositories, or text collections during conversations with AI assistants.
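Registering the server with Claude Desktop would follow the standard MCP pattern of adding an entry to `claude_desktop_config.json`. A minimal sketch is below; the package name, server key, and `DOCS_PATH` variable are placeholders, since the source does not state the actual package name or configuration options:

```json
{
  "mcpServers": {
    "docs-rag": {
      "command": "npx",
      "args": ["-y", "@kazuph/mcp-docs-rag"],
      "env": {
        "GEMINI_API_KEY": "<your-gemini-api-key>",
        "DOCS_PATH": "/path/to/your/docs"
      }
    }
  }
}
```

On startup, Claude Desktop would launch the server as a child process over stdio and pass the environment variables shown, so the Gemini API key never needs to appear in conversation context.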
kazuph