
Nest Llm Aigent: Effortless Deployment | RESTful Integration

Nest Llm Aigent turns stdio-based MCP servers into a RESTful service: deploy effortlessly and integrate with web apps through an MCP gateway. Seamless, powerful, and ready to future-proof your stack.

Developer Tools

About Nest Llm Aigent

What is Nest Llm Aigent?

Nest Llm Aigent is a middleware solution that bridges the Model Context Protocol (MCP) with enterprise-ready RESTful integrations. Designed to address the inherent client-server limitations of MCP, it acts as an AI agent that aggregates multiple MCP servers into a unified HTTP service. This abstraction enables seamless integration into existing web architectures—particularly NestJS-based systems—while maintaining the portability and scalability of MCP components.

How to Use Nest Llm Aigent?

Deployment follows three core steps: package integration, configuration, and API invocation. MCP servers are encapsulated as private NPM packages, installed via standard npm workflows. A configuration file (e.g., mcp.config.json) specifies server paths and parameters. REST endpoints like /api/mcp/agent then expose these resources to your application logic, handling LLM interactions through standardized JSON payloads.

{
    "mcpServers": {
      "server1": {
        "name": "example-server",
        "path": "./servers/server1/",
        "args": ["server.js"]
      }
    },
    "mcpClient": {
      "name": "mcp-client",
      "version": "1.0.0"
    }
  }
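
For illustration, a minimal client call to the /api/mcp/agent endpoint might look like the sketch below. The base URL, port, and exact response shape are assumptions made for this example; consult the project for the actual contract.

// Minimal sketch (TypeScript, Node 18+): send a chat message to the
// aggregated agent endpoint. Assumes the gateway listens on
// http://localhost:3000 and returns the resulting messages as JSON.
async function askAgent(question: string): Promise<unknown> {
  const res = await fetch("http://localhost:3000/api/mcp/agent", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      messages: [{ role: "user", content: question }],
    }),
  });
  if (!res.ok) {
    throw new Error(`Agent call failed with status ${res.status}`);
  }
  return res.json(); // expected to contain the updated messages array
}

askAgent("What tools are available?").then(console.log).catch(console.error);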

Nest Llm Aigent Features

Key Features of Nest Llm Aigent

  • Protocol Abstraction: Converts MCP's client-server architecture into REST APIs, enabling drop-in integration for legacy or microservice-based systems.
  • Modular Packaging: Private NPM distribution ensures version-controlled, environment-agnostic MCP server management.
  • Unified Interface: Standardized endpoints like /api/mcp/all aggregate tools, resources, and prompts into a single response format (see the sketch after this list).
  • Broad Compatibility: Supports virtually all LLM providers through MCP's extensible design.
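
As a sketch of the unified interface, a client can load the aggregated catalog from /api/mcp/all as shown below. Only the top-level response shape ({tools:[],resources:[],prompts:[]}) is documented; the element types are deliberately left open here.

// Sketch (TypeScript): fetch the aggregated catalog from /api/mcp/all.
// Element types are left as unknown because only the top-level shape
// { tools, resources, prompts } is documented.
interface McpCatalog {
  tools: unknown[];
  resources: unknown[];
  prompts: unknown[];
}

async function loadCatalog(baseUrl: string): Promise<McpCatalog> {
  const res = await fetch(`${baseUrl}/api/mcp/all`, { method: "POST" });
  if (!res.ok) throw new Error(`Catalog request failed: ${res.status}`);
  return (await res.json()) as McpCatalog;
}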

Use Cases of Nest Llm Aigent

Primarily designed for:

  • Legacy system modernization: Retrofit existing NestJS services with AI capabilities without rewriting core architecture.
  • Multi-tenant LLM management: Centralize access to diverse MCP servers through a single API gateway.
  • CI/CD pipeline integration: Embed MCP server updates into automated deployment workflows using NPM package versioning.
  • Dynamic prompt orchestration: Leverage the /api/mcp/prompts endpoint to programmatically adjust LLM inputs at runtime, as sketched below.
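
A rough sketch of runtime prompt orchestration with these endpoints follows. The shape of individual prompt entries is an assumption made for illustration; only the endpoint paths are documented.

// Sketch (TypeScript): pull the prompt list at runtime and feed a chosen
// prompt into the agent endpoint. The prompt entry shape ({ name, content })
// is an assumption; adapt it to the gateway's actual response.
async function runWithPrompt(baseUrl: string, promptName: string, userInput: string) {
  const promptsRes = await fetch(`${baseUrl}/api/mcp/prompts`, { method: "POST" });
  const prompts: Array<{ name: string; content?: string }> = await promptsRes.json();
  const selected = prompts.find((p) => p.name === promptName);

  const agentRes = await fetch(`${baseUrl}/api/mcp/agent`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      messages: [
        ...(selected?.content ? [{ role: "system", content: selected.content }] : []),
        { role: "user", content: userInput },
      ],
    }),
  });
  return agentRes.json();
}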

Nest Llm Aigent FAQ

FAQ about Nest Llm Aigent

Does the solution support real-time streaming responses?
Currently limited to HTTP/1.1. Future roadmap includes WebSocket encapsulation for chat-streaming scenarios.
How are context variables managed across layers?
Context sharing is supported at three levels: API interfaces, MCP client instances, and server modules. Explicit variable scoping is required in configurations.
What’s the recommended approach for production environments?
Deploy MCP servers as Dockerized NPM packages for isolation. Use Kubernetes secrets to manage API keys referenced in mcp.config.json.
Can I extend this for non-MCP LLMs?
While built for MCP compatibility, the adapter pattern allows wrapping non-MCP models via custom server implementations. A planned extension will automate this process.

Content

nest-llm-aigent

  1. Background
    With the release of the MCP (Model Context Protocol), the protocol is now unified, making it possible to develop a large-model extension (an MCP server) once and run it anywhere. However, because MCP is inherently oriented toward a client-server architecture, it is difficult to integrate conveniently into existing company services. Against that background I wrote this forwarding solution. It not only solves that problem; the project can also be understood as an AI agent: configure multiple MCP servers for it, and it exposes a single, unified HTTP service for large-model calls.

The overall architecture is as follows:

  2. Goals
  • Easy integration: the adapter layer can quickly connect to existing web services (such as services built on NestJS).
  • Seamless extension: MCP servers are distributed as private NPM packages for fast integration and unified management while staying highly portable.
  • Convenient deployment: private NPM packages provide centralized management and version control of MCP servers and plug into the existing release process, making deployment straightforward.
  • LLM support: almost all large models can be called.
  3. Interface definitions
    To meet the functional needs of different scenarios, the following standardized interfaces are defined:

  • Get all tools: POST /api/mcp/tools, returns the list of all tools
  • Get function-call tools: POST /api/mcp/functools, returns OpenAI function call definitions
  • Call an MCP tool: POST /api/mcp/tools/call, invokes an MCP tool (see the sketch after this list)
  • Get the resource list: POST /api/mcp/resources, returns all resources
  • Get all prompts: POST /api/mcp/prompts, returns all prompts
  • Get all tools, resources, and prompts: POST /api/mcp/all, response format: {"tools":[],"resources":[],"prompts":[]}
  • Call the LLM: POST /api/mcp/agent, input format: {"messages":[{"role":"user","content":"your question"}]}, returns messages
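
A hedged example of invoking a tool through /api/mcp/tools/call is sketched below. The request body ({ name, arguments }) follows MCP's usual call-tool convention and is an assumption here, since only the method and path are specified above.

// Sketch (TypeScript): invoke a single MCP tool through the gateway.
// The { name, arguments } body mirrors MCP's call-tool convention and is
// an assumption; the list above only fixes the method and path.
async function callTool(baseUrl: string, name: string, args: Record<string, unknown>) {
  const res = await fetch(`${baseUrl}/api/mcp/tools/call`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ name, arguments: args }),
  });
  if (!res.ok) throw new Error(`Tool call failed: ${res.status}`);
  return res.json();
}
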
  4. Example
    .....

  5. MCP server deployment
    Deploying MCP servers as an integrated part of the gateway is recommended for flexibility and unified management. The suggested approach:

  • Private NPM packages: package each MCP server as an NPM package and distribute it through the company's private NPM registry.
  • Installation and configuration: install MCP server packages directly with npm install and integrate them quickly through a configuration file (such as mcp.config.json).

Example configuration file mcp.config.json:

{
  "mcpServers": {
    "server1": {
      "name": "example-server",
      "args": ["server.js"],
      "path": "./servers/server1/"
    }
  },
  "mcpClient": {
    "name": "mcp-client",
    "version": "1.0.0"
  }
}
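
The path and args fields above presumably tell the gateway how to launch each server over stdio. A minimal sketch of that wiring, assuming plain Node child processes, is shown below; the project's actual startup logic may differ.

import { spawn } from "node:child_process";
import { readFileSync } from "node:fs";

// Sketch (TypeScript): read mcp.config.json and start each configured MCP
// server as a child process speaking stdio. This only illustrates how
// "path" and "args" could be consumed; it is not the gateway's own code.
interface McpConfig {
  mcpServers: Record<string, { name: string; path: string; args: string[] }>;
}

const config: McpConfig = JSON.parse(readFileSync("mcp.config.json", "utf8"));

for (const [id, server] of Object.entries(config.mcpServers)) {
  const child = spawn("node", server.args, {
    cwd: server.path, // e.g. ./servers/server1/
    stdio: ["pipe", "pipe", "inherit"],
  });
  console.log(`started ${server.name} (${id}) with pid ${child.pid}`);
}
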
  6. Future work
  • Only the stdio transport is implemented for MCP servers so far; the SSE transport is not yet supported.
  • Request-scoped context and global variables can be shared across the API layer, the MCP client layer, and the MCP server layer.
  • Only HTTP endpoints are supported at the moment; ideally these interfaces would also be wrapped in a socket in the future, which suits chat scenarios better.
  • After finishing this, another idea emerged: build a common base class on top of the raw MCP protocol so that an MCP server written against it can, after compilation, be called both by an MCP client and, when imported as a module, via direct function calls. That may become a separate project.
  • The project took roughly one day to complete; time was tight, and it will be polished further as time allows.
