
Bing Search API is Dead. Now What?

>_ Hirsh P

Bing Search API background cover

What happened: Microsoft discontinued the Bing Search API on August 11th, 2025, forcing developers to migrate to Azure AI Agents at a steep $35 per 1,000 queries.

If you built something using the Bing API, you already know. What's wild is how quietly this rug pull happened. Just a bunch of emails and a tweet-sized blog post. "Hey, that thing you've been relying on? Gone. Here's something completely different instead."

The Replacement: Azure AI Agents Complexity

Their suggested replacement is "Grounding with Bing Search as part of Azure AI Agents." Here's what migration actually requires:

Setup requirements:

  • Create Azure AI Agents
  • Configure resource groups
  • Navigate complex enterprise Azure dashboard

Cost structure:

  • $35 per 1,000 queries
  • No free tier

Worse still, it is not a standalone search API: it is "designed to function as an add-on feature for select Microsoft products".

When all you want is an API key to get started or migrate with minimal hassle, it's the kind of complexity that makes you question your life choices.

Why Traditional Search APIs Fail for AI: A Technical Breakdown

Building good search infrastructure is genuinely one of the harder technical problems out there. You're dealing with crawling at scale, real-time indexing, content parsing, and storage systems that need to serve results in milliseconds.

Traditional search APIs were optimized for:

  • Ad revenue models requiring multiple page views
  • Human-readable HTML responses
  • Keyword-based ranking algorithms
  • SEO-influenced results

What AI applications actually need:

  • Structured JSON/XML data formats for use inside context windows
  • A search index built on deeper semantics, not just keyword matching
  • The most relevant context for a given query
  • Built for tool calling and AI use
  • Better query and intent handling; AI-generated queries are far more involved
  • Multimodal content (text, images, documents) in parseable formats
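To make the "structured formats for context windows" point concrete, here is a minimal sketch of what an AI-ready result looks like and how it drops into model context. All field names here are illustrative assumptions, not any provider's actual schema:

```typescript
// Hypothetical shape of an AI-ready search result (field names are
// assumptions for illustration, not a specific provider's schema).
interface AISearchResult {
  title: string;
  url: string;
  content: string; // parsed, citation-ready text, not raw HTML
  relevance_score: number; // semantic relevance, not keyword match count
}

// Flatten results into a numbered context block for an LLM prompt.
function toContext(results: AISearchResult[]): string {
  return results
    .map((r, i) => `[${i + 1}] ${r.title} (${r.url})\n${r.content}`)
    .join("\n\n");
}

const sample: AISearchResult[] = [
  {
    title: "Quantum computing",
    url: "https://example.org/qc",
    content: "Quantum computers use qubits...",
    relevance_score: 0.92,
  },
];
console.log(toContext(sample));
```

Compare that with an HTML results page: the model gets clean text it can cite directly, instead of markup it has to be parsed out of first.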

The web has also become flooded with AI-generated SEO spam, making old ranking methods less effective. The difference now is we can optimize for how models actually consume information as part of their context windows.

Real Cost Analysis: Microsoft Azure vs. Alternatives

Let's talk about that $35 per 1,000 queries number. Here's what this means for actual applications:

| Application Type  | Monthly Query Volume | Microsoft Azure Cost | Valyu Cost | Monthly Savings |
|-------------------|----------------------|----------------------|------------|-----------------|
| Small AI Tool     | 10,000               | $350                 | $15        | $335 (95.7%)    |
| Growing Startup   | 100,000              | $3,500               | $150       | $3,350 (95.7%)  |
| Scale Application | 1,000,000            | $35,000              | $1,500     | $33,500 (95.7%) |
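The savings figures follow from straight per-query arithmetic; a quick sketch, using the per-1,000-query rates from the comparison:

```typescript
// Per-1,000-query rates from the comparison above.
const AZURE_RATE = 35; // $ per 1,000 queries
const VALYU_RATE = 1.5; // $ per 1,000 queries

function monthlyCost(queries: number, ratePer1k: number): number {
  return (queries / 1000) * ratePer1k;
}

const queries = 100_000; // the "Growing Startup" tier
const azure = monthlyCost(queries, AZURE_RATE); // 3500
const valyu = monthlyCost(queries, VALYU_RATE); // 150
const savingsPct = ((azure - valyu) / azure) * 100; // 95.7%
console.log(azure, valyu, savingsPct.toFixed(1));
```

Because both providers price linearly per query, the percentage saved is the same at every volume tier.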

This is exactly backwards from where the industry is heading. LLM costs are dropping fast—GPT-5 is cheaper than GPT-3.5 was a year ago. But access to the knowledge those models need? Getting more expensive.

Valyu's Approach: AI-Native Search Infrastructure

At Valyu, we provide search infrastructure specifically designed for AI applications. Our API starts at $1.50 per 1,000 queries—not because we're trying to undercut anyone, but because this is what reasonable pricing looks like when you build for AI from the ground up.

Core capabilities:

  • Web search with AI-optimized formatting; you get back usable content for your context windows, not raw search-result listings
  • 10+ million academic papers with full citations
  • Real-time financial market data as structured JSON
  • Proprietary textbook and research content
  • Natural language query interpretation

Technical specifications:

  • Response time: <500ms average
  • Multimodal: text, images, and documents
  • Integration: 4 lines of code
  • Frameworks: LangChain, LlamaIndex, AI SDK compatible
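The "built for tool calling" point can be sketched with a generic tool definition in the JSON-schema style most chat-completions APIs use. The tool name, description, and dispatcher below are my own illustrative choices, not a prescribed Valyu integration:

```typescript
// Minimal sketch of exposing search as a model tool. The schema wording
// and names here are illustrative assumptions.
type SearchFn = (query: string) => Promise<unknown>;

const webSearchTool = {
  name: "web_search",
  description: "Search the web and return AI-ready context snippets.",
  parameters: {
    type: "object",
    properties: {
      query: { type: "string", description: "Natural-language query" },
    },
    required: ["query"],
  },
} as const;

// Route a model-issued tool call to the actual search client
// (e.g. the valyu.search call shown in the implementation examples).
async function handleToolCall(
  search: SearchFn,
  name: string,
  args: { query: string }
): Promise<unknown> {
  if (name !== webSearchTool.name) throw new Error(`Unknown tool: ${name}`);
  return search(args.query);
}
```

Because the search call already returns context-window-ready content, the tool result can be handed straight back to the model without an HTML-scraping step in between.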

Implementation Example

Basic integration requires minimal code:

```typescript
import { Valyu } from "valyu";

const valyu = new Valyu("your-valyu-api-key");
const response = await valyu.search("What is quantum computing?");

console.log(response);
```


Advanced Search:

```typescript
import { Valyu } from "valyu";

const valyu = new Valyu("your-valyu-api-key");
const response = await valyu.search(
  "Implementation details of agentic search-enhanced large reasoning models",
  {
    search_type: "proprietary",
    max_num_results: 10,
    max_price: 30,
    relevance_threshold: 0.5,
    category: "agentic retrieval-augmented generation",
    included_sources: ["valyu/valyu-arxiv"],
    is_tool_call: true,
  }
);

console.log(response);
```

What This Means for AI Development

When you're building an AI app, avoiding hallucinations isn't a nice-to-have feature—it's essential. Your models need access to real, current, accurate information to be useful. That shouldn't cost more than the models themselves.

Current development patterns show:

  • Average AI application makes 50,000-200,000 search queries monthly
  • Search costs now exceed LLM inference costs for many applications
  • Developers spending more time on data pipeline than core features
  • Hallucination rates directly correlate with search quality

The community building on accessible search infrastructure at Valyu has been really encouraging. Developers are creating answer engines that cite academic sources, research tools that understand complex datasets, and financial applications that pull from multiple data streams to analyse companies. We are here to support your applications: finding the right context should not be hard, so give Valyu a try.

Looking Forward

Change like this is never fun when you're in the middle of it, but it often leads to better solutions. The search infrastructure we have today was built for a different era. We're finally at a point where we can build something designed specifically for AI applications from the ground up.

The key is finding something that fits how you actually want to build, not just what happens to be available. Check out Valyu if you're looking for an alternative that understands what AI developers actually need.


Get Started

👉 Documentation: docs.valyu.network

Need help? Drop us a message: founders@valyu.network



Cover photo by Deepmind