OpenCode

OpenCode is a terminal coding agent with a rich TUI (text user interface) — think full-screen panels, syntax highlighting, and keyboard shortcuts, all inside your terminal. It works with any LLM provider and treats MCP servers as first-class tool sources, making it easy to extend AI coding sessions with external data.


  • Transport: Streamable HTTP ✓
  • Platform: Windows · macOS · Linux
  • MCP via: opencode.toml

A Full-Screen TUI That Supports MCP

Unlike minimal chat-style CLIs, OpenCode presents a split-pane interface — one panel for the conversation, another for file diffs, and a toolbar showing connected MCP tools. This makes it straightforward to watch the agent iterate on code while MCP tools provide context from databases, APIs, or CI pipelines.

Because OpenCode is provider-agnostic, you can pair MCP with Claude, GPT, Gemini, DeepSeek, or local Ollama models. The MCP layer stays the same regardless of which LLM is driving the session.

Key MCP-related capabilities:

  • TOML config — declare MCP servers in opencode.toml, versioned with your repo
  • Tool sidebar — MCP tools appear alongside built-in file/terminal tools in the TUI
  • Real-time collaboration — multiple users can connect to the same session
  • Session history — browse and resume past conversations that used MCP tools
  • Open-source — 122k+ GitHub stars, MIT license

How to Connect

1. Grab Your Vinkius URL

Head to Vinkius Cloud, pick your server, and generate a Connection Token. Copy the resulting URL.

2. Edit opencode.toml

Create or open opencode.toml at the root of your project and add:

```toml
[mcp.vinkius]
type = "remote"
url = "https://edge.vinkius.com/{TOKEN}/mcp"
```

3. Run OpenCode

```bash
opencode
```

Your MCP tools appear in the TUI sidebar on launch. The agent uses them automatically when relevant — for instance, if a tool provides database schema lookups, OpenCode will invoke it when you ask about a migration.


FAQ

Which LLM providers work with OpenCode? Any provider supported by OpenCode — OpenAI, Anthropic, Google, Groq, DeepSeek, or local models via Ollama. MCP works independently of the model.

Can I add more than one MCP server? Yes. Add multiple [mcp.<name>] sections to opencode.toml. Each appears as a separate tool source in the sidebar.
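For example, two remote servers declared side by side, following the same `[mcp.<name>]` shape as the setup step above. This is a sketch: the section names and the second URL are illustrative placeholders, not real endpoints.

```toml
# Two MCP servers; each [mcp.<name>] section shows up as its own
# tool source in the sidebar. "vinkius" and "ci" are example names,
# and the second URL is a placeholder.
[mcp.vinkius]
type = "remote"
url = "https://edge.vinkius.com/{TOKEN}/mcp"

[mcp.ci]
type = "remote"
url = "https://ci.example.com/mcp"
```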

How do I scope MCP servers to a single project? Place opencode.toml in the project directory. OpenCode reads the nearest config file on startup.
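A minimal sketch of that scoping, assuming (as stated above) that OpenCode reads the nearest opencode.toml on startup; the `my-app` directory name is just an example:

```shell
# Create a project-local config. Only sessions started inside my-app/
# will see this Vinkius server (directory name is illustrative).
mkdir -p my-app
cat > my-app/opencode.toml <<'EOF'
[mcp.vinkius]
type = "remote"
url = "https://edge.vinkius.com/{TOKEN}/mcp"
EOF
# Running `opencode` from inside my-app/ picks up this config;
# running it from another directory does not.
```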

Is OpenCode free? Yes. OpenCode is open-source under the MIT license; you bring your own API key for whichever LLM provider you use.