mcp-client-for-ollama — MCP Server — 40/100 via "local-first execution with no cloud dependencies"
A text-based user interface (TUI) client for interacting with MCP servers using Ollama. Features include agent mode, multi-server support, model switching, streaming responses, tool management, human-in-the-loop approval, thinking mode, model parameter configuration, MCP prompts, a custom system prompt, and saved preferences.
Unique: Implements a completely local-first architecture, using Ollama for inference and local MCP servers for tools, with zero cloud dependencies — fundamentally different from cloud-based LLM clients, which require API keys and internet connectivity.
vs others: Provides fully local execution, unlike cloud-based LLM clients, enabling offline use, full privacy, and cost savings while retaining full tool-use capability through local MCP servers.
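As a rough sketch of how such a local-first flow can work (this is an illustration, not the project's actual code): the client translates each MCP server's tool definitions into the function-call schema that Ollama's local `/api/chat` endpoint accepts, so inference and tool use both stay on the machine. The `build_chat_request` helper and the example tool below are assumptions for illustration.

```python
import json

def build_chat_request(model, user_message, mcp_tools):
    """Assemble an Ollama /api/chat payload that advertises local MCP tools.

    Each MCP tool (name, description, JSON-schema input) is exposed as an
    OpenAI-style function spec, the format Ollama's tool-calling API accepts.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "tools": [
            {
                "type": "function",
                "function": {
                    "name": t["name"],
                    "description": t.get("description", ""),
                    "parameters": t.get("inputSchema", {"type": "object"}),
                },
            }
            for t in mcp_tools
        ],
        "stream": False,  # a TUI client might instead stream token-by-token
    }

# Hypothetical MCP tool, e.g. from a local filesystem server
payload = build_chat_request(
    "llama3.1",
    "What files are in the project root?",
    [{"name": "list_files",
      "description": "List directory contents",
      "inputSchema": {"type": "object",
                      "properties": {"path": {"type": "string"}}}}],
)
print(json.dumps(payload, indent=2))
```

The payload would then be POSTed to the local Ollama endpoint (by default `http://localhost:11434/api/chat`); any tool calls in the response are dispatched back to the corresponding MCP server, so no request ever leaves the machine.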