
Inbox Zero AI MCP: Prioritizes Replies, Flags Follow-Ups

Inbox Zero AI MCP: Your smart email copilot. It prioritizes replies, flags follow-ups, and slays inbox chaos, going well beyond Gmail's built-in tricks. Finally, a tool that works as hard as you do.

Communication
4.1 (91 reviews)
136 saves
63 comments

Ranked in the top 10% of all AI tools in its category

About Inbox Zero AI MCP

What is Inbox Zero AI MCP: Prioritizes Replies, Flags Follow-Ups?

Inbox Zero AI MCP is an advanced email management solution designed to streamline communication workflows. By leveraging artificial intelligence, it prioritizes incoming replies, identifies critical follow-ups, and automates routine tasks. The platform integrates with Gmail and Resend APIs to ensure real-time notifications and seamless synchronization, enhancing productivity through intelligent email classification and action suggestions.

How to Use Inbox Zero AI MCP: Prioritizes Replies, Flags Follow-Ups?

  1. Configure APIs: Set up Google OAuth credentials with necessary scopes (Gmail Modify, Contacts Access) via the Google Cloud Console.
  2. Deploy Infrastructure: Configure Google Pub/Sub topics and subscriptions to enable real-time email updates. Use ngrok for local development endpoints.
  3. Select LLM Models: Choose between supported AI providers (Anthropic, OpenAI, AWS Bedrock) or connect Ollama locally via environment variables.
  4. Enable Automation: Schedule cron jobs to maintain email watches and generate follow-up summaries at predefined intervals.
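
As a rough sketch of step 3, provider selection can key off which environment variables are present. Only `OLLAMA_BASE_URL` appears in this project's docs; the API-key variable names below are illustrative assumptions, not the project's actual configuration:

```typescript
// Hypothetical sketch: choose an LLM provider from environment variables.
// OLLAMA_BASE_URL comes from the docs below; the other names are assumptions.
type Provider = "anthropic" | "openai" | "bedrock" | "ollama";

function resolveProvider(env: Record<string, string | undefined>): Provider {
  if (env.OLLAMA_BASE_URL) return "ollama"; // prefer a local model when configured
  if (env.ANTHROPIC_API_KEY) return "anthropic";
  if (env.OPENAI_API_KEY) return "openai";
  return "bedrock"; // fall back to AWS Bedrock credentials
}
```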

Inbox Zero AI MCP Features

Key Features of Inbox Zero AI MCP: Prioritizes Replies, Flags Follow-Ups

  • Smart Prioritization: AI-driven sorting of emails based on urgency and relevance.
  • Follow-Up Tracking: Automated flags for missed deadlines and unresolved threads.
  • Real-Time Sync: Instant updates via Google Pub/Sub and Push notifications.
  • Multi-Model Support: Compatibility with Anthropic Claude, OpenAI, and Ollama-based Phi models.
  • Analytics Integration: Tinybird pipelines for email activity tracking and performance metrics.
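
The prioritization idea can be pictured with a toy scoring function. The real classifier is LLM-driven; the fields and weights here are invented purely for illustration:

```typescript
// Toy urgency score: the product uses an LLM for this, so treat this
// as a conceptual sketch only. All field names are illustrative.
interface EmailMeta {
  fromKnownContact: boolean;
  awaitingReply: boolean;
  ageHours: number;
}

function urgencyScore(e: EmailMeta): number {
  let score = 0;
  if (e.fromKnownContact) score += 2;
  if (e.awaitingReply) score += 3;
  if (e.ageHours > 48) score += 1; // stale threads bubble up
  return score;
}

function prioritize(emails: EmailMeta[]): EmailMeta[] {
  // Sort descending by score without mutating the input array.
  return [...emails].sort((a, b) => urgencyScore(b) - urgencyScore(a));
}
```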

Use Cases for Inbox Zero AI MCP

Businesses utilize this platform for:

  • Customer Support Teams: Prioritizing high-priority tickets and tracking resolution timelines.
  • Project Managers: Monitoring stakeholder communications and ensuring follow-up actions.
  • Remote Teams: Centralized email management with real-time collaboration alerts.
  • Freelancers: Automating client communication workflows to focus on core tasks.

Inbox Zero AI MCP FAQ

FAQ: Common Questions About Inbox Zero AI MCP

How do I secure Google OAuth credentials?

Create a project in the Google Cloud Console, enable the Gmail API, and generate OAuth 2.0 credentials with appropriate scopes.

Can I use custom LLM models with Ollama?

Yes. Configure the OLLAMA_BASE_URL environment variable to point to your local Ollama server and specify the model via NEXT_PUBLIC_OLLAMA_MODEL (see the setup section below).

What triggers the follow-up flagging system?

Emails with unresolved threads older than the configured deadline (default 48 hours) are flagged, with notifications sent via the integrated messaging system.
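
A minimal sketch of that rule, assuming a configurable threshold that defaults to 48 hours (function and constant names are illustrative, not the project's actual code):

```typescript
// Sketch of the follow-up check described above.
const DEFAULT_DEADLINE_HOURS = 48;

function needsFollowUp(
  lastMessageAt: Date,
  now: Date,
  deadlineHours: number = DEFAULT_DEADLINE_HOURS,
): boolean {
  // Age of the thread in hours since its last message.
  const ageHours = (now.getTime() - lastMessageAt.getTime()) / 3_600_000;
  return ageHours > deadlineHours;
}
```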


Inbox Zero - Your AI Email Assistant
====================================

Open source email app to reach inbox zero fast.

[Website](https://www.getinboxzero.com/)
·
[Discord](https://www.getinboxzero.com/discord)
·
[Issues](https://github.com/elie222/inbox-zero/issues)

About

There are two parts to Inbox Zero:

  1. An AI email assistant that helps you spend less time on email.
  2. Open source AI email client.

If you're looking to contribute to the project, the email client is the best place to do so.

Deploy with Vercel

Thanks to Vercel for sponsoring Inbox Zero in support of open-source software.

Features

  • AI Personal Assistant: Manages your email for you based on a plain text prompt file. It can take any action a human assistant can take on your behalf (Draft reply, Label, Archive, Reply, Forward, Mark Spam, and even call a webhook).
  • Reply Zero: Track emails that need your reply and those awaiting responses.
  • Smart Categories: Categorize everyone that's ever emailed you.
  • Bulk Unsubscriber: Quickly unsubscribe from emails you never read in one-click.
  • Cold Email Blocker: Automatically block cold emails.
  • Email Analytics: Track your email activity with daily, weekly, and monthly stats.
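
The daily stats can be pictured as a simple aggregation over message timestamps. The actual analytics run in Tinybird pipelines; this is only a conceptual sketch:

```typescript
// Sketch: count emails per calendar day (UTC). The real aggregation
// happens in Tinybird; this only illustrates the idea.
function countPerDay(timestamps: Date[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const t of timestamps) {
    const day = t.toISOString().slice(0, 10); // YYYY-MM-DD
    counts.set(day, (counts.get(day) ?? 0) + 1);
  }
  return counts;
}
```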

Learn more in our docs.

Feature Screenshots

Screenshots: AI Assistant · Reply Zero · Gmail client · Bulk Unsubscriber

Demo Video

Inbox Zero demo

Built with

Feature Requests

To request a feature open a GitHub issue. If you don't have a GitHub account you can request features here. Or join our Discord.

Getting Started for Developers

We offer a hosted version of Inbox Zero at https://getinboxzero.com. To self-host follow the steps below.

Contributing to the project

You can view open tasks in our GitHub Issues. Join our Discord to discuss tasks and check what's being worked on.

ARCHITECTURE.md explains the architecture of the project (LLM generated).

Requirements

Setup

Here's a video on how to set up the project. It covers the same steps as this document but goes into greater detail on setting up the external services.

The external services that are required are:

You also need to set up an LLM, though a local one works too:

  • Anthropic
  • OpenAI
  • AWS Bedrock Anthropic
  • Google Gemini
  • Groq Llama 3.3 70B
  • Ollama (local)

To enable Bulk Unsubscriber, Analytics and Smart Categories you will also need to set:

We use Postgres for the database. For Redis, you can use Upstash Redis or set up your own Redis instance.

You can run Postgres & Redis locally using docker-compose:

```bash
docker-compose up -d # -d will run the services in the background
```

Create your own .env file:

```bash
cp apps/web/.env.example apps/web/.env
cd apps/web
pnpm install
```

Set the environment variables in the newly created .env. You can see a list of required variables in: apps/web/env.ts.

The required environment variables:

  • NEXTAUTH_SECRET -- can be any random string (try using openssl rand -hex 32 for a quick secure random string)
  • GOOGLE_CLIENT_ID -- Google OAuth client ID. More info here
  • GOOGLE_CLIENT_SECRET -- Google OAuth client secret. More info here
  • GOOGLE_ENCRYPT_SECRET -- Secret key for encrypting OAuth tokens (try using openssl rand -hex 32 for a secure key)
  • GOOGLE_ENCRYPT_SALT -- Salt for encrypting OAuth tokens (try using openssl rand -hex 16 for a secure salt)
  • UPSTASH_REDIS_URL -- Redis URL from Upstash. (can be empty if you are using Docker Compose)
  • UPSTASH_REDIS_TOKEN -- Redis token from Upstash. (or specify your own random string if you are using Docker Compose)
  • TINYBIRD_TOKEN -- (optional) Admin token for your Tinybird workspace (be sure to create an instance in the GCP us-east4 region; the region can be changed via your .env if you prefer a different one). You can also disable Tinybird, in which case the analytics and bulk unsubscribe features will be disabled; set NEXT_PUBLIC_DISABLE_TINYBIRD=true to do so.
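
The actual validation of these variables lives in apps/web/env.ts; as an illustrative stand-in (not the project's real code), a startup check might look like:

```typescript
// Illustrative stand-in for the env validation done in apps/web/env.ts.
const REQUIRED = [
  "NEXTAUTH_SECRET",
  "GOOGLE_CLIENT_ID",
  "GOOGLE_CLIENT_SECRET",
  "GOOGLE_ENCRYPT_SECRET",
  "GOOGLE_ENCRYPT_SALT",
] as const;

// Return the names of required variables that are unset or empty.
function missingEnv(env: Record<string, string | undefined>): string[] {
  return REQUIRED.filter((name) => !env[name]);
}
```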

When using Vercel with Fluid Compute turned off, you should set MAX_DURATION=300 or lower. See Vercel limits for different plans here.

To run the migrations:

```bash
pnpm prisma migrate dev
```

To run the app locally:

```bash
pnpm run dev
```

Or from the project root:

```bash
turbo dev
```

Open http://localhost:3000 to view it in your browser. To upgrade yourself to admin visit: http://localhost:3000/admin.

Supported LLMs

For the LLM, you can use Anthropic, OpenAI, or Anthropic on AWS Bedrock. You can also use Ollama by setting the following environment variables:

```sh
OLLAMA_BASE_URL=http://localhost:11434/api
NEXT_PUBLIC_OLLAMA_MODEL=phi3
```

Note: If you need to access Ollama hosted locally while the application is running in Docker, you can use http://host.docker.internal:11434/api as the base URL. You might also need to set OLLAMA_HOST to 0.0.0.0 in the Ollama configuration file.

You can select the model you wish to use in the app on the /settings page of the app.

Setting up Google OAuth and Gmail API

You need to enable these scopes in the Google Cloud Console:

```plaintext
https://www.googleapis.com/auth/userinfo.profile
https://www.googleapis.com/auth/userinfo.email
https://www.googleapis.com/auth/gmail.modify
https://www.googleapis.com/auth/gmail.settings.basic
https://www.googleapis.com/auth/contacts
```

Setting up Tinybird

Follow the instructions here to set up the pipes and datasources.

Optional: If you want to store AI usage stats in Tinybird too, then do the same in /packages/tinybird-ai-analytics.

Set up push notifications via Google PubSub to handle emails in real time

Follow instructions here.

  1. Create a topic
  2. Create a subscription
  3. Grant publish rights on your topic

Set env var GOOGLE_PUBSUB_TOPIC_NAME. When creating the subscription select Push and the url should look something like: https://www.getinboxzero.com/api/google/webhook?token=TOKEN or https://abc.ngrok-free.app/api/google/webhook?token=TOKEN where the domain is your domain. Set GOOGLE_PUBSUB_VERIFICATION_TOKEN in your .env file to be the value of TOKEN.
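
The webhook can reject push requests whose token query parameter does not match GOOGLE_PUBSUB_VERIFICATION_TOKEN. A sketch of that check (the handler shape is an assumption, not the project's actual code):

```typescript
// Sketch: verify the Pub/Sub push token before processing a message.
function isAuthorizedWebhook(
  url: string,
  expectedToken: string | undefined,
): boolean {
  if (!expectedToken) return false; // refuse if no token is configured
  const token = new URL(url).searchParams.get("token");
  return token === expectedToken;
}
```

A production check would typically also use a constant-time comparison to avoid timing side channels.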

To run in development, ngrok can be helpful:

```sh
ngrok http 3000
# or with an ngrok domain to keep your endpoint stable (set `XYZ`):
ngrok http --domain=XYZ.ngrok-free.app 3000
```

And then update the webhook endpoint in the [Google PubSub subscriptions dashboard](https://console.cloud.google.com/cloudpubsub/subscription/list).

To start watching emails visit: `/api/google/watch/all`

### Watching for email updates

Set a cron job to run these endpoints. The Google watch is required; the Resend summary is optional.

```json
"crons": [
  { "path": "/api/google/watch/all", "schedule": "0 1 * * *" },
  { "path": "/api/resend/summary/all", "schedule": "0 16 * * 1" }
]
```

[Here](https://vercel.com/guides/how-to-setup-cron-jobs-on-vercel#alternative-cron-providers) are some easy ways to run cron jobs. Upstash is a free, easy option. I could never get the Vercel `vercel.json` crons to work. Open to PRs if you find a fix for that.
