
RagDocs MCP Server: AI-Powered Document Search & Workflow Optimization

RagDocs MCP Server provides RAG-based document search and management, delivering AI-driven insights and streamlined workflows for modern teams.


About RagDocs MCP Server

What is RagDocs MCP Server: AI-Powered Document Search & Workflow Optimization?

RagDocs MCP Server is an advanced middleware solution designed to streamline document management and semantic search capabilities using AI-driven techniques. Built on the RAG (Retrieval-Augmented Generation) framework, it integrates with Qdrant vector databases and supports Ollama or OpenAI embeddings to enable contextual document retrieval and analysis. This server empowers teams to organize, search, and optimize workflows across diverse document repositories, offering scalable and flexible tools for metadata management, real-time querying, and automated text processing.

How to Use RagDocs MCP Server: AI-Powered Document Search & Workflow Optimization?

Installation begins with a single npm command. Configure the server by selecting your preferred embedding provider (Ollama for open-source models or OpenAI for enterprise-grade precision) and setting up Qdrant via Docker or a cloud service. Launch the server, then use its four core tools: add_document to upload files with metadata, search_documents for semantic queries, list_documents to catalog assets, and delete_document for cleanup. Environment variables allow customization of models and API keys for seamless integration.

RagDocs MCP Server Features

Key Features of RagDocs MCP Server: AI-Powered Document Search & Workflow Optimization

  • **Contextual Search:** Leverages embeddings to retrieve documents based on semantic meaning rather than keywords alone.
  • **Metadata Integration:** Organize documents with custom tags for advanced filtering and categorization.
  • **Scalable Storage:** Works with Qdrant to handle large datasets and distributed environments.
  • **Model Flexibility:** Choose between Ollama’s lightweight models or OpenAI’s robust commercial solutions.
  • **Automation:** Auto-chunks large documents into manageable segments for efficient processing.
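The auto-chunking step can be pictured as a sliding window over the document text. The sketch below is illustrative only — the function name chunkText, the chunk size, and the overlap are assumptions, not the server's actual implementation:

```typescript
// Illustrative sliding-window chunker: split text into fixed-size pieces
// with a small overlap so context is preserved across chunk boundaries.
// Sizes here are hypothetical; RagDocs' real chunking parameters may differ.
function chunkText(text: string, chunkSize = 512, overlap = 64): string[] {
  const chunks: string[] = [];
  let start = 0;
  while (start < text.length) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break; // last chunk reached
    start += chunkSize - overlap; // step forward, keeping an overlap
  }
  return chunks;
}
```

Each chunk is then embedded individually, so a long document yields several vectors in Qdrant rather than one.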

Use Cases of RagDocs MCP Server: AI-Powered Document Search & Workflow Optimization

Teams in technical writing, customer support, and research benefit from RagDocs’ capabilities:

  • **Enterprise Documentation:** Centralize technical manuals and product specs for fast internal queries.
  • **Developer Ecosystems:** Maintain up-to-date API references with real-time semantic search.
  • **Customer Support:** Analyze and resolve queries using ticket history and knowledge bases.
  • **Academic Research:** Accelerate literature reviews by surfacing semantically related papers.

RagDocs MCP Server FAQ

Q: How do I choose between Ollama and OpenAI?
A: Use Ollama for open-source efficiency in development environments. Opt for OpenAI when enterprise-grade accuracy or proprietary models are required.
Q: Can I run Qdrant locally?
A: Yes, deploy via Docker for lightweight local testing or scale to cloud instances for production use.
Q: What document formats are supported?
A: Supports plain text, PDF, and markdown natively. Extend via custom preprocessors for specialized formats.


RagDocs MCP Server

A Model Context Protocol (MCP) server that provides RAG (Retrieval-Augmented Generation) capabilities using Qdrant vector database and Ollama/OpenAI embeddings. This server enables semantic search and management of documentation through vector similarity.

Features

  • Add documentation with metadata
  • Semantic search through documents
  • List and organize documentation
  • Delete documents
  • Support for both Ollama (free) and OpenAI (paid) embeddings
  • Automatic text chunking and embedding generation
  • Vector storage with Qdrant

Prerequisites

  • Node.js 16 or higher
  • One of the following Qdrant setups:
    • Local instance using Docker (free)
    • Qdrant Cloud account with API key (managed service)
  • One of the following for embeddings:
    • Ollama running locally (default, free)
    • OpenAI API key (optional, paid)

Available Tools

1. add_document

Add a document to the RAG system.

Parameters:

  • url (required): Document URL/identifier
  • content (required): Document content
  • metadata (optional): Document metadata
    • title: Document title
    • contentType: Content type (e.g., "text/markdown")

2. search_documents

Search through stored documents using semantic similarity.

Parameters:

  • query (required): Natural language search query
  • options (optional):
    • limit: Maximum number of results (1-20, default: 5)
    • scoreThreshold: Minimum similarity score (0-1, default: 0.7)
    • filters:
      • domain: Filter by domain
      • hasCode: Filter for documents containing code
      • after: Filter for documents after date (ISO format)
      • before: Filter for documents before date (ISO format)

3. list_documents

List all stored documents with pagination and grouping options.

Parameters (all optional):

  • page: Page number (default: 1)
  • pageSize: Number of documents per page (1-100, default: 20)
  • groupByDomain: Group documents by domain (default: false)
  • sortBy: Sort field ("timestamp", "title", or "domain")
  • sortOrder: Sort order ("asc" or "desc")
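For instance, to list the newest documents grouped by domain, a client might pass:

```json
{
  "page": 1,
  "pageSize": 20,
  "groupByDomain": true,
  "sortBy": "timestamp",
  "sortOrder": "desc"
}
```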

4. delete_document

Delete a document from the RAG system.

Parameters:

  • url (required): URL of the document to delete
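Deletion takes only the document's URL/identifier (illustrative value below):

```json
{
  "url": "https://example.com/docs/getting-started"
}
```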

Installation

npm install -g @mcpservers/ragdocs

MCP Server Configuration

{
  "mcpServers": {
    "ragdocs": {
      "command": "node",
      "args": ["@mcpservers/ragdocs"],
      "env": {
        "QDRANT_URL": "http://127.0.0.1:6333",
        "EMBEDDING_PROVIDER": "ollama"
      }
    }
  }
}

Using Qdrant Cloud:

{
  "mcpServers": {
    "ragdocs": {
      "command": "node",
      "args": ["@mcpservers/ragdocs"],
      "env": {
        "QDRANT_URL": "https://your-cluster-url.qdrant.tech",
        "QDRANT_API_KEY": "your-qdrant-api-key",
        "EMBEDDING_PROVIDER": "ollama"
      }
    }
  }
}

Using OpenAI:

{
  "mcpServers": {
    "ragdocs": {
      "command": "node",
      "args": ["@mcpservers/ragdocs"],
      "env": {
        "QDRANT_URL": "http://127.0.0.1:6333",
        "EMBEDDING_PROVIDER": "openai",
        "OPENAI_API_KEY": "your-api-key"
      }
    }
  }
}

Local Qdrant with Docker

docker run -d --name qdrant -p 6333:6333 -p 6334:6334 qdrant/qdrant

Environment Variables

  • QDRANT_URL: URL of your Qdrant instance
  • QDRANT_API_KEY: API key for Qdrant Cloud (required when using cloud instance)
  • EMBEDDING_PROVIDER: Choice of embedding provider ("ollama" or "openai", default: "ollama")
  • OPENAI_API_KEY: OpenAI API key (required if using OpenAI)
  • EMBEDDING_MODEL: Model to use for embeddings
    • For Ollama: defaults to "nomic-embed-text"
    • For OpenAI: defaults to "text-embedding-3-small"
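As an example, a shell environment for a local Qdrant instance with OpenAI embeddings might look like this (the API key is a placeholder; EMBEDDING_MODEL can be omitted to use the default):

```shell
# Example environment: local Qdrant + OpenAI embeddings
export QDRANT_URL="http://127.0.0.1:6333"
export EMBEDDING_PROVIDER="openai"
export OPENAI_API_KEY="your-api-key"
export EMBEDDING_MODEL="text-embedding-3-small"
```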

License

Apache License 2.0
