
LibreChat

LibreChat is a self-hosted, open-source alternative to ChatGPT that supports multiple AI providers simultaneously. It includes an agent builder, file search, code interpreter, and conversation presets. Since it is fully self-hosted, your conversations never leave your infrastructure — MCP bridges the gap to external services on your terms.


LibreChat — Open-Source · librechat.ai
Transport: Streamable HTTP ✓
Platform: Web · Docker · Self-hosted
MCP via: Config File

A Self-Hosted ChatGPT with MCP Extensibility

LibreChat provides an experience similar to ChatGPT but on your own server. You manage users, set model permissions, and configure which tools are available — useful for organizations that need data sovereignty. MCP adds tool extensibility without compromising the self-hosted model.

The agent builder lets non-technical users create custom assistants with specific model settings, system prompts, and tool access. MCP servers provide the tools — database queries, API calls, monitoring checks — that make those assistants useful for real work.

Key features:

  • Multi-model — OpenAI, Anthropic, Google, Azure, and any OpenAI-compatible endpoint
  • Agent builder — create custom assistants with specific tool sets
  • File search — upload and search across documents in conversations
  • Code interpreter — execute Python code in sandboxed environments
  • User management — admin panel with registration, quotas, and permissions
  • Conversation presets — save and share model/tool configurations
  • Docker deployment — single docker-compose up for the full stack
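The Docker deployment mentioned above is typically a clone-and-compose affair; a minimal sketch, assuming the upstream repository layout (the repository URL and `.env.example` file follow the upstream project — verify against the current installation guide):

```shell
# Clone the upstream repository
git clone https://github.com/danny-avila/LibreChat.git
cd LibreChat

# Copy the example environment file, then set provider API keys inside it
cp .env.example .env

# Start the full stack in the background
docker compose up -d
```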

Configuration

1. Create a Token

In Vinkius Cloud, go to your server → Connection Tokens → Create. Copy the URL.

2. Add to Config

In your LibreChat configuration, add the MCP server:

```yaml
mcpServers:
  vinkius:
    url: "https://edge.vinkius.com/{TOKEN}/mcp"
```
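If your LibreChat version expects an explicit transport, the same entry can carry additional fields; a hedged sketch (the `type`, `timeout`, and `headers` options are as I recall them from LibreChat's `librechat.yaml` MCP reference, and the `X-Example` header is purely illustrative — confirm field names against your version's documentation):

```yaml
mcpServers:
  vinkius:
    type: streamable-http   # transport; stdio and sse are also supported
    url: "https://edge.vinkius.com/{TOKEN}/mcp"
    timeout: 30000          # per-request timeout, in milliseconds
    headers:                # optional extra headers, e.g. for an auth proxy
      X-Example: "value"    # hypothetical header, shown for illustration only
```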

3. Restart and Chat

Restart LibreChat. MCP tools are available in all conversations and custom agents.
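With the Docker deployment, the restart in step 3 is typically a single command; a sketch assuming the upstream compose file (the service name `api` is the upstream default and may differ in your setup):

```shell
# Recreate the app container so it re-reads librechat.yaml
docker compose restart api
```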


FAQ

Can I restrict MCP tool access per user in LibreChat? Yes. LibreChat's admin panel lets you manage user permissions, including which tools and models are available to each user.

Does LibreChat work with local models? Yes. Connect Ollama or any OpenAI-compatible local endpoint alongside cloud providers.
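A local Ollama instance is wired in as a custom OpenAI-compatible endpoint in `librechat.yaml`; a minimal sketch, assuming a default local Ollama install (the `baseURL` is Ollama's standard OpenAI-compatible path, and the model name is a placeholder for whatever you have pulled):

```yaml
endpoints:
  custom:
    - name: "Ollama"
      apiKey: "ollama"                      # Ollama ignores the key, but the field is required
      baseURL: "http://localhost:11434/v1"  # Ollama's OpenAI-compatible API
      models:
        default: ["llama3.1"]               # placeholder; list the models you have pulled
        fetch: true                         # ask the endpoint for its available models
```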

Are file search and the code interpreter available with MCP? Yes. Both work alongside MCP tools in the same conversation.

Is LibreChat free? Yes. It is open-source under the MIT license, and self-hosting costs nothing beyond your infrastructure. You provide your own API keys.