Elia
Elia is a keyboard-centric TUI (text user interface) for chatting with LLMs, built on Python's Textual framework. It's designed for developers who prefer navigating with keyboard shortcuts rather than mouse clicks. Conversations, models, and MCP tools are all accessible without leaving the terminal or reaching for a trackpad.
Keyboard-First AI Chat with MCP
Elia takes the approach that an LLM chat should feel like a native terminal application — panels, scrollable history, syntax highlighting, and composable key bindings. It stores conversations in a local SQLite database, so you can search and resume past chats without cloud syncing.
MCP tools slot into this workflow naturally. When you ask a question that requires external data, the model calls the appropriate MCP tool and displays the result inline — complete with syntax highlighting if it's code or JSON. You never leave the keyboard-driven flow.
Why Elia suits MCP users:
- Textual TUI — rich panels, borders, and color in any terminal emulator
- Keyboard shortcuts — navigate conversations, switch models, search history
- SQLite storage — conversations persist locally, searchable and exportable
- Ollama-native — first-class support for local model inference
- Multi-model — switch between cloud and local Ollama models mid-session
- Lightweight — install with pipx install elia-chat, no heavy dependencies
How to Set Up
1. Create a Token
In Vinkius Cloud, go to your server → Connection Tokens → Create. Copy the URL.
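The connection URL simply embeds your token in the path. A tiny helper, purely illustrative (the function name is ours; the URL shape follows the {TOKEN} placeholder shown in the config in step 2):

```python
def mcp_url(token: str) -> str:
    """Build the Vinkius MCP endpoint URL for a connection token."""
    # Same shape as the {TOKEN} placeholder in Elia's config below
    return f"https://edge.vinkius.com/{token}/mcp"

print(mcp_url("abc123"))  # → https://edge.vinkius.com/abc123/mcp
```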
2. Add MCP Server
Configure the Vinkius server in Elia's config:
mcp_servers:
  vinkius:
    url: "https://edge.vinkius.com/{TOKEN}/mcp"

3. Launch Elia
elia

MCP tools are now available in your chat sessions. Conversations and tool outputs are stored in the local SQLite database for future reference.
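Because that store is an ordinary SQLite file, you can explore it with Python's standard sqlite3 module. A minimal sketch that assumes nothing about Elia's schema (the database path varies by platform, so locate yours first; list_tables is our own helper, not part of Elia):

```python
import sqlite3

def list_tables(db_path: str) -> list[str]:
    """Return the names of all tables in a SQLite database file."""
    with sqlite3.connect(db_path) as conn:
        rows = conn.execute(
            "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name"
        ).fetchall()
    return [name for (name,) in rows]
```

Point it at Elia's database (under your user data directory) to see which tables hold conversations and tool calls, then query them with ordinary SQL.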
FAQ
Does Elia store MCP tool results? Yes. Tool call inputs and outputs are stored alongside the conversation in Elia's local SQLite database. You can replay past sessions with full context.
Can I use Elia with Ollama? Absolutely. Elia has first-class Ollama integration for fully local, offline LLM conversations. MCP servers still need network access.
How do I install Elia? The recommended method is pipx install elia-chat. It works on any system with Python 3.10+.
Is Elia free? Yes. Elia is open-source under the MIT license; no subscription or license key is required.