Open WebUI
Open WebUI (formerly Ollama WebUI) is the most popular self-hosted chat interface for local LLMs. It started as a frontend for Ollama but expanded to support any OpenAI-compatible API. Its pipeline architecture lets you plug in custom middleware — including MCP servers — that process messages between the user and the model. Collaborative workspaces make it suitable for teams.
Pipelines, RAG, and Collaborative AI
Open WebUI's pipeline system is its extensibility engine. Pipelines are middleware functions that intercept and transform messages. An MCP pipeline connects the model to external tools — when the model decides it needs data from an API or database, the pipeline routes the request through MCP.
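The middleware idea can be sketched in a few lines. This is a minimal illustration of the intercept-and-route pattern, not the real Open WebUI Pipelines API, which defines its own `Pipeline` class and `pipe()` signature; `needs_tool` and `mcp_pipeline` are hypothetical names.

```python
def needs_tool(message: str) -> bool:
    """Naive stand-in for the model deciding it needs external data."""
    return message.lower().startswith("lookup:")

def mcp_pipeline(user_message: str) -> str:
    """Intercept a message: route tool requests, pass everything else through."""
    if needs_tool(user_message):
        # A real pipeline would forward this to the MCP server and
        # return the tool's result to the model.
        return f"[routed to MCP] {user_message}"
    return user_message

print(mcp_pipeline("lookup: weather in Oslo"))  # routed through MCP
print(mcp_pipeline("hello"))                    # passed through unchanged
```

The pattern is the same regardless of what the middleware does: every message flows through the pipeline, and the pipeline decides whether to transform it, route it, or hand it straight to the model.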
The platform also includes built-in document RAG, model management (pull, delete, and configure Ollama models from the UI), and collaborative workspaces where teams share conversations and tool configurations.
Highlights:
- Pipeline architecture — middleware for message processing, tool routing, and custom logic
- Ollama integration — pull, manage, and configure local models from the UI
- Document RAG — upload documents and query them in conversations
- Collaborative workspaces — team-based conversation sharing and tool access
- Model management — download, configure, and switch models from the interface
- Admin panel — user management, permissions, and usage analytics
- Docker one-liner —
docker run -d -p 3000:8080 -v open-webui:/app/backend/data ghcr.io/open-webui/open-webui:main
Getting Connected
1. Create a Token
In Vinkius Cloud, go to your server → Connection Tokens → Create. Copy the URL.
2. Configure MCP
In Open WebUI, go to Admin Panel → Settings → MCP or configure via the pipeline system. Add your Vinkius URL:
https://edge.vinkius.com/{TOKEN}/mcp
3. Chat with Tools
MCP tools are available across all models and workspaces. The model calls them when your questions require external data.
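The URL pattern above is mechanical enough to build programmatically, for example when provisioning several workspaces. A small sketch, assuming only the `https://edge.vinkius.com/{TOKEN}/mcp` format shown in step 2 (`mcp_url` is a hypothetical helper; `TOKEN` is a placeholder, not a real credential):

```python
def mcp_url(token: str) -> str:
    """Build the Vinkius MCP endpoint URL for a connection token."""
    return f"https://edge.vinkius.com/{token}/mcp"

print(mcp_url("TOKEN"))  # https://edge.vinkius.com/TOKEN/mcp
```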
FAQ
What is the pipeline architecture? Pipelines are middleware functions that process messages between the user and the model. MCP tool routing, content filtering, and custom logic are all implemented as pipelines.
Can I use Open WebUI with cloud models? Yes. Open WebUI supports any OpenAI-compatible API endpoint alongside Ollama local models. MCP tools work with all of them.
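One way to point Open WebUI at a cloud provider is through its `OPENAI_API_BASE_URL` and `OPENAI_API_KEY` environment variables at container start. A sketch of that setup (the endpoint URL and `sk-...` key are placeholders for your provider's values):

```shell
# Run Open WebUI with an OpenAI-compatible cloud endpoint configured
# alongside any local Ollama models it discovers.
docker run -d -p 3000:8080 \
  -e OPENAI_API_BASE_URL=https://api.openai.com/v1 \
  -e OPENAI_API_KEY=sk-... \
  -v open-webui:/app/backend/data \
  ghcr.io/open-webui/open-webui:main
```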
Does Open WebUI support team collaboration? Yes. Workspaces let teams share conversations, model configurations, and tool access with permission controls.
Is Open WebUI free? Open-source under MIT. Self-host with Docker. No licensing costs.