Cherry Studio

Cherry Studio is a polished desktop AI client that unifies multiple LLM providers under one interface. Instead of switching between ChatGPT, Claude, and Gemini in separate tabs, you manage all your conversations in a single app — with a shared knowledge base, conversation folders, and now MCP tools that work across every provider.


Cherry Studio
Open-Source · github.com/kangfenmao
TRANSPORT
Streamable HTTP ✓
PLATFORM
Windows · macOS · Linux
MCP VIA
Settings Panel

One Interface for Every Model, Extended by MCP

Cherry Studio consolidates OpenAI, Anthropic, Google, DeepSeek, Ollama, and custom OpenAI-compatible endpoints into a unified conversation view. You can compare responses across providers, organize chats into folders, and build a knowledge base from your documents.

MCP adds a new dimension to this: instead of the model being limited to its training data and your documents, it can now reach out to live systems. Check a production database, query a CI pipeline, or pull the latest pricing data — all within the same conversation thread.
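Under the hood, each of those actions is an MCP tool call: a JSON-RPC `tools/call` request that the client sends to the server on the model's behalf. A minimal sketch of the message shape (the tool name and arguments here are hypothetical; the real tools depend on the MCP server you connect):

```python
import json

def tool_call_request(request_id: int, tool: str, arguments: dict) -> str:
    """Build the JSON-RPC `tools/call` message an MCP client sends."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Example: a hypothetical database-query tool
msg = tool_call_request(1, "query_database", {"sql": "SELECT 1"})
```

The model decides when to invoke a tool; Cherry Studio constructs and dispatches messages like this for it, then feeds the result back into the conversation.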

Standout features:

  • Provider hub — manage API keys and quotas for multiple providers in one place
  • Knowledge base — index documents for retrieval-augmented generation
  • Conversation folders — organize chats by project, client, or topic
  • Translation mode — real-time language translation within conversations
  • Themes — dark / light modes with customizable accent colors
  • Open-source — community-maintained with frequent updates

How to Connect

1. Generate a Token

In Vinkius Cloud, navigate to your server → Connection Tokens → Create. Copy the URL.

2. Add in Cherry Studio

Open Settings → MCP → Add Server. Enter a name (e.g. "vinkius") and paste the URL:

https://edge.vinkius.com/{TOKEN}/mcp
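The token is embedded in the URL path. As a quick sanity check, assuming the `edge.vinkius.com/{TOKEN}/mcp` pattern above, you can build the endpoint and the JSON-RPC `initialize` request that a Streamable HTTP client sends first (the token value and protocol version string below are placeholders):

```python
import json

EDGE_BASE = "https://edge.vinkius.com"  # base from the URL pattern above

def mcp_endpoint(token: str) -> str:
    """Expand the {TOKEN} placeholder into a full Streamable HTTP endpoint."""
    return f"{EDGE_BASE}/{token}/mcp"

def initialize_request(client_name: str) -> str:
    """First message a Streamable HTTP MCP client POSTs to the endpoint.
    The protocol version is an assumption; client and server negotiate it."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": 1,
        "method": "initialize",
        "params": {
            "protocolVersion": "2025-03-26",
            "capabilities": {},
            "clientInfo": {"name": client_name, "version": "0.1"},
        },
    })

url = mcp_endpoint("YOUR_TOKEN")
```

Cherry Studio performs this handshake for you when you save the server entry; if the URL or token is wrong, the server will simply fail to appear as connected.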

3. Use Across Providers

Start a conversation with any provider: Claude, GPT, Gemini, or a local model. The same MCP tools are available to all of them; Cherry Studio handles the plumbing.


FAQ

Do MCP tools work with all providers in Cherry Studio? Yes. MCP operates at the application level, not the provider level. All configured providers can call the same MCP tools.

Can I use Cherry Studio's knowledge base alongside MCP? Absolutely. The AI can reference your indexed documents and call MCP tools in the same message. Document context and live external data complement each other.

Does Cherry Studio support local models? Yes, through Ollama or any OpenAI-compatible local endpoint. Note that remote MCP servers still require internet access.

Is Cherry Studio free? Yes, it's open-source. Bring your own API keys for cloud providers, or use local models at no cost.