
AnythingLLM

AnythingLLM is a desktop application that bundles everything you need for AI document chat: a built-in embedding engine, vector database, RAG pipeline, and multi-model support — no separate services required. Agent skills let you extend the built-in agent with custom abilities, and MCP servers are one way to add those skills.


AnythingLLM
Mintplex Labs · anythingllm.com

Transport: Streamable HTTP ✓
Platform: Windows · macOS · Linux · Docker
MCP via: Agent Skills

All-in-One Document Chat with Agent Skills

AnythingLLM embeds everything in a single install: LanceDB for vector storage, a built-in embedding engine, a RAG pipeline, and an agent framework. Drag files into a workspace and they are indexed for conversation immediately; there is no separate Pinecone, Weaviate, or ChromaDB setup.
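The retrieval half of that pipeline boils down to ranking stored chunk embeddings by similarity to a query embedding. A minimal sketch of the idea, using hand-written three-dimensional vectors in place of a real embedder and LanceDB (the chunk names and vectors below are invented for illustration):

```python
# Toy sketch of RAG retrieval: rank document chunks by cosine
# similarity to the query vector. A real system swaps the fake
# vectors for learned embeddings; the ranking step is the same idea.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Hypothetical pre-computed embeddings for three document chunks.
chunks = {
    "invoice terms": [0.9, 0.1, 0.0],
    "api rate limits": [0.1, 0.9, 0.2],
    "holiday schedule": [0.0, 0.2, 0.9],
}

def retrieve(query_vec, k=1):
    """Return the k chunk names most similar to the query vector."""
    ranked = sorted(chunks, key=lambda c: cosine(chunks[c], query_vec), reverse=True)
    return ranked[:k]

print(retrieve([0.2, 0.95, 0.1]))  # → ['api rate limits']
```

The retrieved chunks are then stuffed into the LLM prompt as context, which is the "generation" half of RAG.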

Agent skills extend this beyond document chat. MCP tools become skills your agent can use — call an API, check a status, execute a query — alongside document retrieval decisions. The agent decides whether to search your uploaded docs or call an MCP tool.
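Mechanically, both document search and each MCP tool are exposed to the model as callable functions, and the model's tool-call response is dispatched to the matching skill. This sketch is illustrative only (the skill names and return values are invented; in AnythingLLM the LLM itself makes the choice via tool calling):

```python
# Illustrative dispatch loop for agent skills: a registry of callables
# (document search plus a hypothetical MCP tool) and a dispatcher that
# routes whichever call the model chose.
def search_documents(query: str) -> str:
    return f"top chunks for {query!r}"        # stand-in for vector retrieval

def check_service_status(name: str) -> str:   # hypothetical MCP tool
    return f"{name}: operational"

SKILLS = {
    "search_documents": search_documents,
    "check_service_status": check_service_status,
}

def dispatch(tool_call: dict) -> str:
    """tool_call mimics an LLM function-call message:
    {"name": "...", "arguments": {...}}."""
    fn = SKILLS[tool_call["name"]]
    return fn(**tool_call["arguments"])

print(dispatch({"name": "check_service_status",
                "arguments": {"name": "billing-api"}}))  # → billing-api: operational
```

The tool's result is fed back to the model, which either answers or issues another call.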

Features:

  • All-in-one — embedding, vector DB, RAG, and agent built in
  • Workspaces — isolated document collections for different projects
  • Multi-user — user management with workspace permissions
  • Agent skills — extend agents with MCP tools and custom functions
  • Multi-model — OpenAI, Anthropic, Ollama, LM Studio, and more
  • File support — PDF, DOCX, TXT, CSV, and web page ingestion
  • Docker and desktop — run as desktop app or self-host with Docker

Adding MCP Skills

1. Create a Token

In Vinkius Cloud, go to your server → Connection Tokens → Create. Copy the URL.

2. Add as Agent Skill

Go to Settings → Agent → MCP Skills → Add. Paste your Vinkius URL.
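Recent AnythingLLM builds can also load MCP server definitions from a JSON file (`anythingllm_mcp_servers.json` in the storage directory's `plugins` folder). The exact location and schema vary by version, so treat this as a sketch and check AnythingLLM's MCP documentation; an HTTP entry might look like:

```json
{
  "mcpServers": {
    "vinkius": {
      "url": "https://YOUR-VINKIUS-TOKEN-URL",
      "headers": {}
    }
  }
}
```

Replace the placeholder URL with the connection-token URL you copied in step 1.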

3. Chat in a Workspace

Create or open a workspace. The agent uses MCP tools alongside document RAG to answer questions.
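Self-hosted instances can also drive a workspace programmatically through AnythingLLM's developer API. The endpoint path and body shape below are assumptions based on recent versions; verify them against your instance's API docs before relying on them:

```python
# Minimal sketch of building a chat request for AnythingLLM's developer
# API (assumed endpoint: POST /api/v1/workspace/{slug}/chat with a
# bearer API key). The request is only constructed here, not sent.
import json

def build_chat_request(base_url: str, api_key: str, slug: str, message: str):
    url = f"{base_url}/api/v1/workspace/{slug}/chat"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"message": message, "mode": "chat"})
    return url, headers, body

url, headers, body = build_chat_request(
    "http://localhost:3001", "MY-API-KEY", "docs",
    "What does the contract say about renewals?",
)
print(url)  # → http://localhost:3001/api/v1/workspace/docs/chat
```

Send the request with any HTTP client; the agent behind the workspace uses the same skills, MCP tools included, as the chat UI.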


FAQ

Can MCP tools and document RAG work together? Yes. The agent decides whether to search your workspace documents or call an MCP tool based on the question.

Does AnythingLLM require external services? No. Embedding, vector storage, and RAG are built in; you only configure an LLM provider (such as OpenAI, Anthropic, or a local Ollama) for inference.

Can multiple users share MCP tools? Yes. Agent skills, including MCP tools, are available to every user with access to the workspace.

Is AnythingLLM free? Yes. AnythingLLM is open source and the desktop app is free; an enterprise version adds additional management features.