
NextChat

NextChat (formerly ChatGPT Next Web) is one of the most popular open-source AI chat interfaces, with over 80,000 GitHub stars. You can deploy it to Vercel with a single click, run it in Docker, or self-host it on any infrastructure. MCP support transforms it from a simple chat wrapper into an extensible AI assistant.


NextChat
TRANSPORT
Streamable HTTP ✓
PLATFORM
Web · Desktop · Mobile
MCP VIA
Settings Panel

Deploy Anywhere, Extend with MCP

NextChat's zero-dependency architecture means you can stand up a fully functional AI chat interface in minutes. One-click Vercel deploy, a single docker run command, or a simple npm start — your instance is ready. Adding MCP servers gives every user on your instance access to external tools with no per-user configuration.
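As a sketch of the Docker route: `yidadaa/chatgpt-next-web` is NextChat's published image, but check the project README for the current tag, and substitute your own API key and password.

```shell
# Run NextChat on port 3000.
# OPENAI_API_KEY authenticates against your LLM provider;
# CODE (optional) sets an access password for the instance.
docker run -d \
  --name nextchat \
  -p 3000:3000 \
  -e OPENAI_API_KEY=sk-your-key-here \
  -e CODE=your-access-password \
  yidadaa/chatgpt-next-web
```

Once the container is up, the UI is available at http://localhost:3000 and every visitor shares the instance-level configuration.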

For teams, this creates a shared AI workspace where the same MCP tools are available to everyone. Self-hosted instances keep all conversation data on your infrastructure while still reaching out to Vinkius Cloud for tool access.

Highlights:

  • One-click deploy — Vercel, Railway, Docker, bare metal
  • Multi-platform — responsive web UI that works on phones, tablets, and desktops
  • Custom masks — create reusable chat presets with specific system prompts
  • Markdown rendering — code blocks, LaTeX, tables, and diagrams
  • Multi-provider — OpenAI, Anthropic, Google, Azure, and custom endpoints
  • 80k+ stars — one of the most battle-tested open-source AI UIs

Setting Up MCP

1. Get Your Token

In Vinkius Cloud, go to your server → Connection Tokens → Create. Copy the URL.

2. Add to NextChat

Open Settings → MCP Servers → paste your URL:

https://edge.vinkius.com/{TOKEN}/mcp

If self-hosting, you can also configure MCP servers via environment variables for all users.
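A minimal sketch of that instance-wide setup for a Docker deployment. `ENABLE_MCP` is the flag NextChat uses to turn the feature on; the server-URL variable name below is illustrative — check your NextChat version's documentation for the exact variables it reads.

```shell
# Enable MCP for every user on this instance (read at startup).
# ENABLE_MCP toggles the feature; the server URL is the token
# URL from step 1 (MCP_SERVER_URL is a placeholder name here).
docker run -d \
  --name nextchat \
  -p 3000:3000 \
  -e OPENAI_API_KEY=sk-your-key-here \
  -e ENABLE_MCP=true \
  -e MCP_SERVER_URL=https://edge.vinkius.com/{TOKEN}/mcp \
  yidadaa/chatgpt-next-web
```

Because these are environment variables, the same approach carries over to Vercel project settings or a docker-compose `environment:` block.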

3. Chat with Tools

Start a conversation. MCP tools are listed in the toolbar and invoked automatically when the model decides they're relevant.


FAQ

Can I pre-configure MCP for all users on a self-hosted instance? Yes. Set MCP server URLs via environment variables in your deployment config. All users on that instance will have access to the configured tools.

Does NextChat work offline? The UI is cached locally (it's a PWA), but conversations require a connection to the LLM provider and to any MCP servers.

What are masks in NextChat? Masks are chat presets — a saved combination of system prompt, model, temperature, and now MCP servers. Share them with your team for consistent AI workflows.

Is NextChat free? Fully open-source under MIT. Self-host at no cost. You provide your own API keys.