Is normal search more expensive than AI?

As Colin says, the APIs for all these LLMs are probably heavily subsidized. Even self-hosting models is likely cheaper at scale than the many search APIs Kagi calls. Running models on your own servers is actually feasible, too, whereas running your own search engine is enormously expensive in R&D and storage terms, even though Kagi does do it with Tinygem/Teclis for a subset of results. Chat tokens are really cheap, even with substantial context.

For context: you could make heavy use of premium ChatGPT models through OpenAI's API over the course of several months and still spend less than a single month of the $20 Plus plan.
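A back-of-envelope sketch of that comparison. The per-token prices and usage figures below are purely illustrative assumptions, not current OpenAI rates:

```python
# Rough API-vs-subscription cost comparison.
# All prices and usage numbers are illustrative assumptions.
PRICE_IN_PER_M = 2.50    # assumed $ per 1M input tokens
PRICE_OUT_PER_M = 10.00  # assumed $ per 1M output tokens

# Suppose 20 chats/day, ~2,000 input and ~500 output tokens each,
# over three months of use.
days = 90
input_tokens = 20 * 2_000 * days    # 3.6M input tokens
output_tokens = 20 * 500 * days     # 0.9M output tokens

cost = (input_tokens / 1e6) * PRICE_IN_PER_M \
     + (output_tokens / 1e6) * PRICE_OUT_PER_M
print(f"~${cost:.2f} for {days} days of API use vs. $20/month for Plus")
```

Under these made-up numbers, three months of fairly heavy API use comes out below a single month of the subscription, which is the point being made above.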

Kagi published that blog post shortly after the Bing API pricing changes were announced, and in it they included more searches in their plans for the same price. It seems likely they dropped Bing and saved a lot of money as a result.

Unfortunately, Kagi removed the search APIs they use from their sources page on their help site a few years ago, so it’s anyone’s guess what they’re using now. I know they use Yandex too.

Further context for other readers: Kagi killed their Search API and replaced it with an API that returns AI summaries. This was discussed in a previous thread.

Microsoft added: “Customers may want to consider Grounding with Bing Search as part of Azure AI Agents. Grounding with Bing Search allows Azure AI Agents to incorporate real-time public web data when generating responses with an LLM.”

This alternative would necessarily be more expensive, as it runs an additional LLM process over Bing results, which can no longer be accessed directly…

The Register article referenced in the thread mentions Mojeek’s API by name as an alternative, so that’s a small win :slight_smile:

I split a question I had about the Mojeek API into a separate topic.
