I built a little Kagi MCP Server last week. It helps language models like Claude search the web and summarize web pages using Kagi’s excellent search and AI capabilities.
I’ve been using Kagi as my personal search engine for a while now. It’s privacy-focused, the results are high quality, and it’s not cluttered with ads. Since I now work with LLMs frequently, I wanted to bring that same search experience to them. Kagi itself has an official MCP server (written in Python), but I wanted something that compiled neatly into a single binary.
Kagi MCP Server is a simple Model Context Protocol (MCP) server that implements two main functions:

- **Search**: web search through Kagi’s Search API
- **Summarize**: web page summarization through Kagi’s Universal Summarizer

That’s it.
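Under the hood, each tool call boils down to a plain HTTPS request to Kagi. Here’s a minimal sketch of how the search tool might build its request; the endpoint path and the `Authorization: Bot …` scheme follow Kagi’s public API docs, while the helper function itself is hypothetical:

```go
package main

import (
	"fmt"
	"net/http"
	"net/url"
)

// newKagiSearchRequest builds the HTTP request the search tool would send.
// Hypothetical helper; endpoint and auth scheme per Kagi's search API (beta).
func newKagiSearchRequest(apiKey, query string) (*http.Request, error) {
	u := url.URL{
		Scheme:   "https",
		Host:     "kagi.com",
		Path:     "/api/v0/search",
		RawQuery: url.Values{"q": {query}}.Encode(),
	}
	req, err := http.NewRequest(http.MethodGet, u.String(), nil)
	if err != nil {
		return nil, err
	}
	// Kagi uses a "Bot" token rather than the usual "Bearer" prefix.
	req.Header.Set("Authorization", "Bot "+apiKey)
	return req, nil
}

func main() {
	req, err := newKagiSearchRequest("YOUR_API_KEY", "model context protocol")
	if err != nil {
		panic(err)
	}
	fmt.Println(req.Method, req.URL.String())
	fmt.Println("Authorization:", req.Header.Get("Authorization"))
}
```

The summarize tool works the same way, just against a different endpoint with a `url` parameter instead of `q`.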
MCP (Model Context Protocol) is an emerging standard for connecting AI models to external tools. I like open standards - they tend to last longer than proprietary solutions. By building on MCP, this server should work with any compliant LLM platform, not just with specific vendors.
If you have a Kagi subscription, you can grab your API key and run this little server alongside your favorite LLM. It works in two modes, matching the standard MCP transports:

- **stdio**, for local clients that spawn the server as a subprocess
- **SSE**, for serving over HTTP to remote clients
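For the stdio mode, wiring the server into a client is just a matter of pointing the client at the binary. A Claude Desktop entry might look something like this (the binary path, server name, and `KAGI_API_KEY` variable name are placeholders, not guaranteed to match the repo):

```json
{
  "mcpServers": {
    "kagi": {
      "command": "/path/to/kagi-mcp-server",
      "env": {
        "KAGI_API_KEY": "YOUR_API_KEY"
      }
    }
  }
}
```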
The code is up on GitHub, and it’s written in Go, so it’s fast and easy to deploy. The only caveat is that the Kagi search API is currently in beta, so you might need to request access if you don’t have it yet. It’s also fairly expensive. 😕