chatmcp

chatmcp is one of the few applications built from the ground up as an MCP client. While other tools add MCP support to an existing chat interface, chatmcp makes the protocol its core feature — offering dedicated panels for tool discovery, resource browsing, and prompt template management.


chatmcp
Open-Source · github.com/daodao97
TRANSPORT: Streamable HTTP ✓
PLATFORM: Windows · macOS · Linux
MCP VIA: Native MCP Client

Built for MCP, Not Retrofitted

Most MCP clients started as general-purpose chat apps that added tool support later. chatmcp takes the opposite approach: the entire UI is organized around MCP primitives. When you connect a server, you see its tools listed with schemas, its resources browsable in a sidebar, and its prompt templates available as one-click starters.

This makes chatmcp especially valuable for MCP server development. You can connect a server under development, inspect its tool definitions, test individual calls, and see how the LLM uses them — all through a graphical interface instead of raw JSON.
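When chatmcp tests an individual call, it is sending ordinary JSON-RPC 2.0 messages with MCP method names under the hood. A minimal sketch of those message envelopes, following the MCP specification's `tools/list` and `tools/call` methods (the tool name and arguments shown are hypothetical examples):

```python
import itertools
import json

# Monotonically increasing request IDs, as JSON-RPC requires
# distinct IDs for in-flight requests.
_ids = itertools.count(1)

def mcp_request(method, params=None):
    """Build a JSON-RPC 2.0 request envelope for an MCP method."""
    msg = {"jsonrpc": "2.0", "id": next(_ids), "method": method}
    if params is not None:
        msg["params"] = params
    return msg

# Ask the server for its tool catalog.
list_req = mcp_request("tools/list")

# Invoke one tool directly, as the catalog panel's "test" action would.
# "search_docs" and its arguments are hypothetical; real arguments must
# match the tool's declared input schema.
call_req = mcp_request("tools/call", {
    "name": "search_docs",
    "arguments": {"query": "usage"},
})

print(json.dumps(call_req, indent=2))
```

A GUI client like chatmcp wraps exactly this exchange in forms and result panels, which is why it can replace hand-written JSON during server development.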

What chatmcp provides:

  • Tool catalog — browse all available tools with parameter schemas and descriptions
  • Resource browser — navigate MCP resources like files in a file manager
  • Prompt templates — use server-provided prompts to start focused conversations
  • Visual tool calls — see tool inputs, outputs, and timing in the chat thread
  • Multi-server — connect several MCP servers and see their tools side by side
  • Open-source — cross-platform desktop app
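The tool catalog above is derived from the server's `tools/list` response, where each entry carries a name, description, and JSON Schema for its parameters. A sketch of how a client can turn that result into display lines (the sample payload is illustrative, shaped like an MCP `tools/list` result):

```python
# Illustrative tools/list result: each tool has name, description,
# and an inputSchema describing its parameters.
sample_result = {
    "tools": [
        {
            "name": "get_weather",
            "description": "Fetch current weather for a city.",
            "inputSchema": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        }
    ]
}

def catalog_lines(result):
    """Format each tool as 'name(params) - description' for display."""
    lines = []
    for tool in result["tools"]:
        props = tool.get("inputSchema", {}).get("properties", {})
        params = ", ".join(
            f"{p}: {s.get('type', 'any')}" for p, s in props.items()
        )
        lines.append(f"{tool['name']}({params}) - {tool['description']}")
    return lines

print("\n".join(catalog_lines(sample_result)))
```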

Getting Started

1. Create a Token

In Vinkius Cloud, go to your server → Connection Tokens → Create. Copy the URL.

2. Add the Server

Open chatmcp → Add Server → paste:

https://edge.vinkius.com/{TOKEN}/mcp

The app immediately probes the server and displays its capabilities.
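That probe is the MCP `initialize` handshake: the client sends its protocol version and capabilities, and the server's reply advertises which primitives (tools, resources, prompts) it supports. A sketch of the exchange, assuming the field names from the MCP specification (the protocol version and client version shown are examples):

```python
# Client-side initialize request. "2025-03-26" is one published MCP
# protocol revision, used here as an example; the clientInfo version
# is hypothetical.
init_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",
        "capabilities": {},
        "clientInfo": {"name": "chatmcp", "version": "0.0.0"},
    },
}

def supported_features(init_result):
    """List which MCP primitives the server advertised in its reply."""
    caps = init_result.get("capabilities", {})
    return sorted(k for k in ("tools", "resources", "prompts") if k in caps)

# Illustrative server reply advertising tools and prompts only.
reply = {
    "protocolVersion": "2025-03-26",
    "capabilities": {"tools": {}, "prompts": {}},
}
print(supported_features(reply))
```

This is the information chatmcp uses to decide which panels (catalog, resource browser, prompt templates) to populate for a given server.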

3. Explore and Chat

Browse the tool catalog to see what's available. Start a conversation — the LLM will invoke tools as needed — or test individual tools directly from the catalog panel.


FAQ

Is chatmcp useful for MCP server developers? Very. Its tool catalog displays schemas, descriptions, and response types for every exposed tool. You can test calls individually without writing a chat prompt.

Does chatmcp support resources and prompts? Yes. Unlike many clients that only support tools, chatmcp has dedicated UI panels for MCP resources and prompt templates.

Which LLM models does it support? chatmcp supports any OpenAI-compatible API, including Claude, GPT, and local models via Ollama.
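"OpenAI-compatible" means every backend accepts the same chat-completions request shape, so switching providers only changes the base URL and model name. A sketch of that request, assuming a hypothetical helper; `http://localhost:11434/v1` is Ollama's default OpenAI-compatible endpoint:

```python
def chat_request(base_url, model, prompt):
    """Build the endpoint URL and request body for a chat completion.

    Only base_url and model change between providers; the body shape
    is the same OpenAI chat-completions format everywhere.
    """
    return (
        f"{base_url}/chat/completions",
        {"model": model, "messages": [{"role": "user", "content": prompt}]},
    )

# Local model via Ollama's OpenAI-compatible endpoint.
url, body = chat_request("http://localhost:11434/v1", "llama3", "hello")
print(url)
```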

Is chatmcp free? Yes. chatmcp is open-source and free to use; you provide your own LLM API key.