Continue
Continue is the leading open-source AI code assistant, and its core principle is "bring your own everything." Choose any LLM — GPT-4, Claude, Mistral, models served through Ollama, or self-hosted models — configure it in a JSON file, and Continue connects it to your IDE. Context providers pull project-specific data (files, Git history, documentation, web pages) into every conversation. MCP tools add external system access on top of this flexible foundation.
The Open-Source Alternative with Total Control
Continue gives you control other AI assistants don't. The entire configuration lives in a config.json file: models, context providers, slash commands, MCP servers — all editable. Switch models mid-conversation. Add custom context providers that pull data from Jira, Confluence, or internal wikis. Share configs across your team via Git.
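For illustration, a minimal models section of config.json might look like the sketch below. The titles, model identifiers, and key placeholders are assumptions; check Continue's documentation for the exact schema your version expects.

```json
{
  "models": [
    {
      "title": "GPT-4o",
      "provider": "openai",
      "model": "gpt-4o",
      "apiKey": "YOUR_OPENAI_KEY"
    },
    {
      "title": "Claude 3.5 Sonnet",
      "provider": "anthropic",
      "model": "claude-3-5-sonnet-latest",
      "apiKey": "YOUR_ANTHROPIC_KEY"
    }
  ]
}
```

With multiple entries like these, you can switch between models from the chat model picker without editing the file again.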
Context providers are Continue's superpower. Before MCP, Continue already solved the "bring external data to AI" problem through providers like @Docs (documentation), @Git (commit history), @Web (live web search), and @Folder (directory context). MCP extends this with any tool server — the two systems complement each other.
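Context providers are also enabled in config.json. A sketch of a contextProviders section might look like this (the provider names shown are common built-ins; treat the exact list and options as assumptions to verify against Continue's docs):

```json
{
  "contextProviders": [
    { "name": "docs" },
    { "name": "web" },
    { "name": "folder" },
    { "name": "diff" }
  ]
}
```

Each enabled provider becomes an @-mention in chat (for example @Docs or @Web), so you can pull that data source into a conversation on demand.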
Features:
- Any LLM — OpenAI, Anthropic, Mistral, Ollama, LM Studio, self-hosted
- Context providers — @File, @Git, @Docs, @Web, @Folder, custom
- VS Code + JetBrains — both major IDE families supported
- Tab autocomplete — local or cloud-based completions
- Inline editing — highlight code → describe changes in natural language
- JSON config — complete control via config.json
- Team sharing — share configurations through version control
- Open-source — Apache 2.0, community-driven development
Configuration
1. Create a Token
In Vinkius Cloud, go to your server → Connection Tokens → Create. Copy the URL.
2. Edit config.json
Add to ~/.continue/config.json:
```json
{
  "mcpServers": [{
    "name": "vinkius",
    "url": "https://edge.vinkius.com/{TOKEN}/mcp"
  }]
}
```
3. Use in Chat
MCP tools are available alongside context providers in chat and inline editing sessions.
FAQ
How do MCP tools work with context providers? They complement each other. Context providers (@Docs, @Git) bring project data. MCP tools bring external system data. Both are available in every conversation.
Can I use Continue with local models? Yes. Ollama, LM Studio, and any OpenAI-compatible local server work natively.
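As a sketch, pointing Continue at a local Ollama model takes one models entry; no API key is needed. The model tag is an assumption — use whatever you have pulled locally:

```json
{
  "models": [
    {
      "title": "Llama 3 (local)",
      "provider": "ollama",
      "model": "llama3"
    }
  ]
}
```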
Is configuration shareable? Yes. config.json can be committed to your repo for team-wide settings.
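One hypothetical sharing workflow, since Continue reads ~/.continue/config.json: keep the canonical config in the repo and have each developer copy it into place. All paths and the placeholder content below are assumptions, not a Continue-prescribed mechanism.

```shell
# Keep the shared config in the repo (placeholder content for illustration)
mkdir -p .continue
echo '{ "models": [] }' > .continue/config.json

# Each developer copies it to where Continue looks for it
mkdir -p ~/.continue
cp .continue/config.json ~/.continue/config.json
```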
Is Continue free? Yes. It is open-source under Apache 2.0; you bring your own API keys.