ProtoLinkAI: Smart Mods & Seamless AI Automation - MCP Implementation

ProtoLinkAI: Smart Mods & Seamless AI Automation

ProtoLinkAI 🚀: Effortlessly bridge MCP servers with AI: smarter mods, seamless automation, and zero coding headaches. Your server’s new secret weapon. 🛠️✨

Research And Data
4.7(56 reviews)
84 saves
39 comments

Users create an average of 45 projects per month with this tool

About ProtoLinkAI

What is ProtoLinkAI: Smart Mods & Seamless AI Automation?

ProtoLinkAI is a modular AI tool framework that achieves seamless automation through the MCP (Model Context Protocol) framework. Its core design lets users flexibly integrate third-party tools such as ElizaOS and the Twitter API, building automated workflows quickly through predefined interfaces, with no complex environment configuration or repetitive coding.

How to Use ProtoLinkAI: Step-by-Step Configuration

  1. Environment setup: clone the repository and install dependencies (Docker or a local virtual environment)
  2. Tool selection: enable the required modules (e.g. the time/weather tools) in the MultiToolAgent configuration
  3. Running:
    • Containerized deployment: run docker run mcp-framework
    • Local launch: use the run.sh script with a configured .env file

ProtoLinkAI Features

Key Features of ProtoLinkAI

  • Dynamic module loading: enable or disable feature modules at runtime without restarting the service
  • Zero-configuration integration: automatically adapts third-party components such as ElizaOS through standardized interfaces
  • Environment isolation: supports both Docker containerized and local-process deployment modes
  • Extension interfaces: open API documentation makes custom tool development straightforward
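The dynamic-loading idea above can be sketched in a few lines. Note that the `ModuleRegistry` class and its `register`/`unregister` methods are illustrative assumptions for this sketch, not part of the documented ProtoLinkAI API:

```python
# Illustrative sketch of dynamic module loading (hypothetical API,
# not taken from the ProtoLinkAI codebase).
class ModuleRegistry:
    def __init__(self):
        self._modules = {}

    def register(self, name, module):
        # Enable a feature module at runtime; no service restart needed.
        self._modules[name] = module

    def unregister(self, name):
        # Disable a module on the fly.
        self._modules.pop(name, None)

    def active(self):
        # Names of currently enabled modules, sorted for stable output.
        return sorted(self._modules)


registry = ModuleRegistry()
registry.register("time", object())
registry.register("weather", object())
registry.unregister("weather")
print(registry.active())  # ['time']
```

The point of the pattern is that enabling or disabling a tool is a dictionary operation on a live registry rather than a config change plus restart.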

Use Cases & Real-World Applications

Scenario 1: Automated data monitoring

Configure the time module and a weather API to fetch and analyze meteorological data automatically at scheduled times, producing visualized reports.

Scenario 2: Faster development setup

Use the run.sh script to initialize a development environment quickly, integrating ElizaOS's natural language processing module.

Scenario 3: Multi-framework collaboration

When integrating with Claude Desktop, a preset JSON configuration file enables seamless interoperation.

ProtoLinkAI FAQ

FAQ: Common Questions

Q: How do I select specific tools?
Modify the module registration list in the MultiToolAgent constructor.
Q: Is a persistent network connection required?
Basic modules can run offline; tools that depend on third-party APIs need a network connection.
Q: How do I update components?
Run the ./update.sh script, which performs automated version syncing and dependency checks.
Q: Which operating systems are supported?
Tested on Ubuntu 20+, macOS 11+, and Windows 10+ (WSL required).

Content

ProtoLinkAI 🚀

ProtoLink AI is a standardized tool wrapping framework for implementing and managing diverse tools in a unified way. It is designed to help developers quickly integrate and launch tool-based use cases.

Key Features

Tech Stack 🛠️

  • Python : Core programming language
  • MCP Framework: Communication protocol
  • Docker : Containerization

🤔 What is MCP?

The Model Context Protocol (MCP) is a cutting-edge standard for context sharing and management across AI models and systems. Think of it as the language AI agents use to interact seamlessly. 🧠✨

Here’s why MCP matters:

  • 🧩 Standardization : MCP defines how context can be shared across models, enabling interoperability.
  • Scalability : It’s built to handle large-scale AI systems with high throughput.
  • 🔒 Security : Robust authentication and fine-grained access control.
  • 🌐 Flexibility : Works across diverse systems and AI architectures.
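Concretely, MCP messages are JSON-RPC 2.0 objects. The sketch below builds a `tools/call` request asking a server to invoke a tool; the `get_time` tool name and its `timezone` argument are hypothetical examples, not part of any specific server:

```python
import json

# MCP traffic is JSON-RPC 2.0. This sketches a "tools/call" request
# asking a server to run a (hypothetical) "get_time" tool.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_time",
        "arguments": {"timezone": "America/New_York"},
    },
}

# Serialize for the wire (e.g. stdio transport), then decode as a
# server would.
wire = json.dumps(request)
decoded = json.loads(wire)
```

Because every client and server speaks this same envelope, any MCP-aware model can call any MCP server's tools without bespoke glue code.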

[MCP architecture diagram]

Installation 📦

Install via PyPI

pip install ProtoLinkai

Usage 💻

Run Locally

ProtoLinkai --local-timezone "America/New_York"

Run in Docker

  1. Build the Docker image: docker build -t ProtoLinkai .

  2. Run the container: docker run -i --rm ProtoLinkai


Twitter Integration 🐦

ProtoLinkAI offers robust Twitter integration, allowing you to automate tweeting, replying, and managing Twitter interactions. This section provides detailed instructions on configuring and using the Twitter integration, both via Docker and via .env + scripts/run_agent.sh.

Docker Environment Variables for Twitter Integration

When running ProtoLinkAI within Docker, it's essential to configure environment variables for Twitter integration. These variables are divided into two categories:

1. Agent Node Client Credentials

These credentials are used by the Node.js client within the agent for managing Twitter interactions.

ENV TWITTER_USERNAME=
ENV TWITTER_PASSWORD=
ENV TWITTER_EMAIL=

2. Tweepy (Twitter API v2) Credentials

These credentials are utilized by Tweepy for interacting with Twitter's API v2.

ENV TWITTER_API_KEY=
ENV TWITTER_API_SECRET=
ENV TWITTER_ACCESS_TOKEN=
ENV TWITTER_ACCESS_SECRET=
ENV TWITTER_CLIENT_ID=
ENV TWITTER_CLIENT_SECRET=
ENV TWITTER_BEARER_TOKEN=
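A startup check that fails fast when any of the Tweepy credentials above are unset can save a confusing runtime error later. This is a hedged sketch: the variable names come from the list above, but the `missing_credentials` helper is ours, not part of ProtoLinkAI:

```python
import os

# Tweepy (Twitter API v2) credentials listed in the section above.
REQUIRED_TWEEPY_VARS = [
    "TWITTER_API_KEY",
    "TWITTER_API_SECRET",
    "TWITTER_ACCESS_TOKEN",
    "TWITTER_ACCESS_SECRET",
    "TWITTER_BEARER_TOKEN",
]


def missing_credentials(env=os.environ):
    # Return the names of required variables that are unset or empty.
    return [name for name in REQUIRED_TWEEPY_VARS if not env.get(name)]


missing = missing_credentials()
if missing:
    print(f"Warning: missing Twitter credentials: {', '.join(missing)}")
```

Running such a check before the agent starts turns a mid-session API failure into an immediate, actionable message.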

Running ProtoLinkAI with Docker

  1. Build the Docker image:

    docker build -t ProtoLinkai .

  2. Run the container:

    docker run -i --rm ProtoLinkai

Running ProtoLink with .env + scripts/run_agent.sh

Setting Up Environment Variables

Create a .env file in the root directory of your project and add the following environment variables:

ANTHROPIC_API_KEY=your_anthropic_api_key
ELIZA_PATH=/path/to/eliza
TWITTER_USERNAME=your_twitter_username
TWITTER_EMAIL=your_twitter_email
TWITTER_PASSWORD=your_twitter_password
PERSONALITY_CONFIG=/path/to/personality_config.json
RUN_AGENT=True

# Tweepy (Twitter API v2) Credentials
TWITTER_API_KEY=your_twitter_api_key
TWITTER_API_SECRET=your_twitter_api_secret
TWITTER_ACCESS_TOKEN=your_twitter_access_token
TWITTER_ACCESS_SECRET=your_twitter_access_secret
TWITTER_CLIENT_ID=your_twitter_client_id
TWITTER_CLIENT_SECRET=your_twitter_client_secret
TWITTER_BEARER_TOKEN=your_twitter_bearer_token
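For reference, parsing a .env file of this shape takes only a few lines of standard-library Python; in practice a library such as python-dotenv handles quoting and edge cases more robustly. The `parse_env` helper here is an illustrative sketch, not ProtoLinkAI code:

```python
# Minimal sketch of parsing a .env file with the standard library only.
# A library like python-dotenv is the more robust choice in practice.
def parse_env(text):
    values = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue  # skip blank lines, comments, and malformed lines
        key, _, value = line.partition("=")
        values[key.strip()] = value.strip()
    return values


sample = "RUN_AGENT=True\n# Tweepy credentials\nTWITTER_USERNAME=alice\n"
config = parse_env(sample)
print(config["TWITTER_USERNAME"])  # alice
```

Values split only on the first `=`, so tokens containing `=` characters survive intact.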

Running the Agent

  1. Make the script executable:

    chmod +x scripts/run_agent.sh

  2. Run the agent:

    bash scripts/run_agent.sh

Summary

You can configure ProtoLink to run with Twitter integration either using Docker or by setting up environment variables in a .env file and running the scripts/run_agent.sh script.

This flexibility allows you to choose the method that best fits your deployment environment.


ElizaOS Integration 🤖

1. Directly Use Eliza Agents from ProtoLink

This approach allows you to use Eliza Agents without running the Eliza Framework in the background. It simplifies the setup by embedding Eliza functionality directly within ProtoLink.

Steps:

  1. Configure ProtoLink to Use Eliza MCP Agent: In your Python code, add Eliza MCP Agent to the MultiToolAgent:

    from ProtoLink.core.multi_tool_agent import MultiToolAgent
    from ProtoLink.tools.eliza_mcp_agent import eliza_mcp_agent

    multi_tool_agent = MultiToolAgent([
        # ... other agents
        eliza_mcp_agent
    ])

Advantages:

  • Simplified Setup: No need to manage separate background processes.
  • Easier Monitoring: All functionalities are encapsulated within ProtoLinkAI.
  • Highlight Feature: Emphasizes the flexibility of ProtoLinkAI in integrating various tools seamlessly.

2. Run Eliza Framework from ProtoLinkai

This method involves running the Eliza Framework as a separate background process alongside ProtoLinkAI.

Steps:

  1. Start Eliza Framework: bash src/ProtoLinkai/tools/eliza/scripts/run.sh

  2. Monitor Eliza Processes: bash src/ProtoLinkai/tools/eliza/scripts/monitor.sh

  3. Configure ProtoLinkAI to Use Eliza Agent: In your Python code, add Eliza Agent to the MultiToolAgent:

    from ProtoLink.core.multi_tool_agent import MultiToolAgent
    from ProtoLink.tools.eliza_agent import eliza_agent

    multi_tool_agent = MultiToolAgent([
        # ... other agents
        eliza_agent
    ])

Tutorial: Selecting Specific Tools

You can configure ProtoLink to run only certain tools by modifying the agent configuration in your server or by updating the server.py file to only load desired agents. For example:

from ProtoLinkai.tools.time_agent import TimeAgent
from ProtoLinkai.tools.weather_agent import WeatherAgent
from ProtoLinkai.core.multi_tool_agent import MultiToolAgent

multi_tool_agent = MultiToolAgent([
    TimeAgent(),
    WeatherAgent()
])

This setup will only enable the **Time** and **Weather** tools.

Integration Example: Claude Desktop Configuration

You can integrate ProtoLinkAI with Claude Desktop using the following configuration (claude_desktop_config.json); note that the local ElizaOS repo mount is an optional argument:

{
    "mcpServers": {
        "mcpagentai": {
            "command": "docker",
            "args": ["run", "-i", "-v", "/path/to/local/eliza:/app/eliza", "--rm", "mcpagentai"]
        }
    }
}

Development 🛠️

  1. Clone this repository:

    git clone https://github.com/StevenROyola/ProtoLink.git
    cd mcpagentai

  2. (Optional) Create a virtual environment:

    python3 -m venv .venv
    source .venv/bin/activate

  3. Install dependencies:

    pip install -e .

  4. Build the package:

    python -m build



License : MIT
Enjoy! 🎉
