I started with Ollama, a local language model runner, but hit a significant limitation: no Model Context Protocol (MCP) support. While bridges existed in Go and TypeScript, there was no clean, Python-native, API-first solution that prioritised data privacy.

In August 2025, nine months after MCP's release, I built a working prototype in a weekend. The core principle was uncompromising: my data stays with me. Unlike cloud AI solutions, which require data to transit external infrastructure, this approach runs both the models and the tools locally.

The project is released free on GitHub as ollama-fastmcp-wrapper under the MIT licence. It addresses a gap that has persisted: GitHub issue #7865, requesting native MCP support in Ollama, remains open.

Practical applications

The bridge enables several use cases:

  • Private research assistant — local filesystem and web search integration for document summarisation
  • Local coding agent — Git and filesystem connections without subscription costs
  • Data analyst — SQLite/DuckDB integration for local query execution
  • Offline operations — shell execution and automation on air-gapped systems
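All of these use cases rest on the same mechanism: the bridge exposes MCP tool descriptors to Ollama's tool-calling interface. A minimal sketch of that translation step is below. The function and variable names (`to_ollama_tool`, `read_file`) are illustrative, not the project's actual API; the only assumptions are the MCP tool descriptor shape (`name`, `description`, `inputSchema`) and the OpenAI-style function schema that Ollama's chat API accepts.

```python
# Hypothetical sketch: converting an MCP tool descriptor into the
# OpenAI-style "function" tool schema accepted by Ollama's chat API.
# Names here are illustrative, not the project's actual API.

def to_ollama_tool(mcp_tool: dict) -> dict:
    """Map one MCP tool descriptor onto Ollama's tool format."""
    return {
        "type": "function",
        "function": {
            "name": mcp_tool["name"],
            "description": mcp_tool.get("description", ""),
            # MCP's inputSchema is already JSON Schema, so it can be
            # passed through as the function's parameter schema.
            "parameters": mcp_tool.get(
                "inputSchema", {"type": "object", "properties": {}}
            ),
        },
    }

# An MCP filesystem server advertises tools roughly like this:
read_file = {
    "name": "read_file",
    "description": "Read a file from the local filesystem.",
    "inputSchema": {
        "type": "object",
        "properties": {"path": {"type": "string"}},
        "required": ["path"],
    },
}

tools = [to_ollama_tool(read_file)]
```

The resulting `tools` list can then be passed alongside the messages in a chat request to a local Ollama instance; when the model emits a tool call, the bridge routes it back to the MCP server and returns the result, so nothing ever leaves the machine.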