oterm
oterm is a terminal UI (TUI) for Ollama that brings a rich chat experience to your terminal. Built with Textual, it offers mouse-clickable conversation management, multi-model tabs, and formatted markdown output — all without leaving the command line. MCP tools extend your local Ollama models with remote data access.
Rich Terminal Chat with Ollama and MCP
oterm sits between a bare CLI and a full desktop app. The Textual-based TUI gives you clickable buttons, scrollable panes, and formatted output — but runs entirely in your terminal. Conversation history persists in a local SQLite database, and you can switch between Ollama models mid-conversation.
MCP tools work alongside local models. The model runs on your machine, but when it needs external data, MCP tool calls go out to Vinkius Cloud. You get the privacy of local inference with the reach of remote tools.
Features:
- Textual TUI — rich terminal interface with mouse support
- SQLite persistence — conversations stored locally
- Multi-model tabs — switch between Ollama models in tabs
- Formatted Markdown — syntax-highlighted code and table rendering in terminal
- Image support — multimodal conversations with vision models
- Customizable — model parameters adjustable per conversation
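Because conversations persist in a local SQLite database, you can inspect them directly with the sqlite3 CLI. The path below is an assumption for Linux; oterm's data directory varies by platform and version, so check your app-data folder.

```shell
# List the tables in oterm's local conversation store
# (path is an assumption -- adjust for your platform)
sqlite3 ~/.local/share/oterm/store.db ".tables"
```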
Terminal Setup
1. Create a Token
In Vinkius Cloud, go to your server → Connection Tokens → Create. Copy the URL.
2. Configure MCP
Add your Vinkius URL to the oterm configuration file.
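As a sketch, an MCP server entry often follows the common `mcpServers` convention; the exact file location, key names, and URL format here are assumptions — consult oterm's documentation for your version, and substitute the URL you copied in step 1.

```json
{
  "mcpServers": {
    "vinkius": {
      "url": "https://YOUR-SERVER.vinkius.cloud/mcp?token=YOUR-TOKEN"
    }
  }
}
```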
3. Launch
Run oterm. Chat with local Ollama models; MCP tools extend conversations with remote data access.
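Installation and launch might look like the following; this assumes a Python environment and a running Ollama server, so verify the install method against the project's README.

```shell
# Install oterm (assumes pip is available; requires a running Ollama server)
pip install oterm

# Start the TUI
oterm
```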
FAQ
Is oterm only for Ollama models? Primarily yes. oterm is designed as an Ollama frontend with rich terminal UI features.
Do conversations persist? Yes. All conversations are stored in a local SQLite database and accessible across sessions.
Can I use keyboard-only navigation? Yes. The Textual framework provides full keyboard navigation alongside mouse support.
Is oterm free? Yes. oterm is open-source and free.