OpenAI WebSearch MCP Server

Connect your AI agents to the live web with OpenAI WebSearch MCP Server, ensuring real-time, accurate, and location-aware responses for your users.

What does the “OpenAI WebSearch” MCP Server do?

The OpenAI WebSearch MCP Server enables AI assistants to access OpenAI’s web search functionality via the Model Context Protocol (MCP). By acting as a bridge between AI models and real-time web information, this server allows assistants to retrieve up-to-date data that may not be present in their training corpus. Developers can integrate this server with platforms like Claude or Zed, equipping their AI agents with the ability to perform live web searches during conversations. This significantly enhances use cases such as answering current events questions, enriching context with recent data, and providing a more dynamic, informed AI development workflow.

List of Prompts

No prompt templates are listed in the repository or documentation.

List of Resources

No explicit resources are listed in the repository or documentation.

List of Tools

  • web_search
    Allows the AI to call OpenAI’s web search as a tool (see the example call after this list).
    • Arguments:
      • type (string): Must be “web_search_preview”.
      • search_context_size (string): Controls how much search context is used—“low”, “medium” (default), or “high”.
      • user_location (object or null): Optional location info (type, city, country, region, timezone) for tailoring search results.
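
For illustration, a call to web_search might carry arguments like the sketch below. Only type, search_context_size, and user_location are documented above; the nested user_location values (“approximate”, the sample city, and so on) are placeholders, and the actual tool schema may also include additional fields such as the search query itself:

    {
      "type": "web_search_preview",
      "search_context_size": "medium",
      "user_location": {
        "type": "approximate",
        "city": "Berlin",
        "country": "DE",
        "region": "Berlin",
        "timezone": "Europe/Berlin"
      }
    }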

Use Cases of this MCP Server

  • Answering Current Events:
    Enables AI assistants to provide up-to-date answers by searching the web for recent information rather than relying solely on static training data.
  • Research Assistance:
    Offers live web search capabilities for users seeking detailed, real-time facts or summaries on a wide range of topics.
  • Context Enrichment:
    Supplements LLM responses with fresh web data, enhancing the relevance and accuracy of outputs.
  • Location-Aware Search:
    Utilizes user-provided location details to tailor search results, making answers more contextually appropriate.
  • Debugging and Development:
    Inspect and debug the MCP server using the MCP Inspector tool, streamlining integration and troubleshooting (see the example command after this list).
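
For example, if Node.js and uv are available, the MCP Inspector can typically be launched against this server as shown below; this exact invocation is an assumption based on the standard Inspector workflow rather than a command taken from the documentation summarized here:

    OPENAI_API_KEY=your-api-key-here npx @modelcontextprotocol/inspector uvx openai-websearch-mcp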

How to set it up

Windsurf

Coming soon (no steps currently provided in the documentation).

Claude

  1. Obtain your OpenAI API key from OpenAI’s platform.
  2. Run the following command to install and auto-configure the server:
    OPENAI_API_KEY=sk-xxxx uv run --with uv --with openai-websearch-mcp openai-websearch-mcp-install
    
  3. Alternatively, install uv (which provides the uvx command) and edit your Claude settings:
    "mcpServers": {
      "openai-websearch-mcp": {
        "command": "uvx",
        "args": ["openai-websearch-mcp"],
        "env": {
            "OPENAI_API_KEY": "your-api-key-here"
        }
      }
    }
    
  4. Or install via pip:
    pip install openai-websearch-mcp
    
    And update settings:
    "mcpServers": {
      "openai-websearch-mcp": {
        "command": "python",
        "args": ["-m", "openai_websearch_mcp"],
        "env": {
            "OPENAI_API_KEY": "your-api-key-here"
        }
      }
    }
    
  5. Save your configuration and restart Claude if necessary (a complete example configuration is shown below).
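
Putting the pieces together, a complete Claude Desktop configuration file (claude_desktop_config.json) using the uvx variant could look like the sketch below; the API key value is a placeholder:

    {
      "mcpServers": {
        "openai-websearch-mcp": {
          "command": "uvx",
          "args": ["openai-websearch-mcp"],
          "env": {
            "OPENAI_API_KEY": "your-api-key-here"
          }
        }
      }
    }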

Securing API Keys:
Store API keys using the env field in your configuration.
Example:

"env": {
  "OPENAI_API_KEY": "your-api-key-here"
}

Cursor

Coming soon (no steps currently provided in the documentation).

Cline

No setup instructions provided in the documentation.

Zed

  1. Obtain your OpenAI API key.
  2. Using uvx, add to your Zed settings.json:
    "context_servers": [
      "openai-websearch-mcp": {
        "command": "uvx",
        "args": ["openai-websearch-mcp"],
        "env": {
            "OPENAI_API_KEY": "your-api-key-here"
        }
      }
    ],
    
  3. Or with pip installation:
    "context_servers": {
      "openai-websearch-mcp": {
        "command": "python",
        "args": ["-m", "openai_websearch_mcp"],
        "env": {
            "OPENAI_API_KEY": "your-api-key-here"
        }
      }
    },
    
  4. Save your configuration and restart Zed (a minimal example is shown below).
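
For reference, a minimal settings.json containing only the MCP entry might look like the sketch below; in practice this block sits alongside your other Zed settings, and the API key value is a placeholder:

    {
      "context_servers": {
        "openai-websearch-mcp": {
          "command": "uvx",
          "args": ["openai-websearch-mcp"],
          "env": {
            "OPENAI_API_KEY": "your-api-key-here"
          }
        }
      }
    }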

Securing API Keys:
Use the env field as shown above.

How to use this MCP inside flows

Using MCP in FlowHunt

To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:

[Image: FlowHunt MCP flow]

Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:

{
  "openai-websearch-mcp": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}

Once configured, the AI agent can use this MCP server as a tool with access to all of its functions and capabilities. Remember to change “openai-websearch-mcp” to the actual name of your MCP server and to replace the URL with your own MCP server URL.


Overview

Section | Availability | Details/Notes
Overview | ✅ | Found in README.md
List of Prompts | ⛔ | No prompt templates listed
List of Resources | ⛔ | No explicit resources listed
List of Tools | ✅ | web_search tool described
Securing API Keys | ✅ | Detailed use of env fields in JSON configs
Sampling Support (less important in evaluation) | ⛔ | Not mentioned

Taking the two tables together:
This MCP server is focused and well-documented for its core use case (web search access for LLMs), but lacks advanced MCP features such as custom prompts, explicit resources, or sampling/roots support. Overall, it’s robust for its intended scenario, but limited in extensibility. Rating: 5/10


MCP Score

Has a LICENSE | ✅ (MIT)
Has at least one tool | ✅
Number of Forks | 10
Number of Stars | 43

Frequently asked questions

What does the OpenAI WebSearch MCP Server do?

It enables AI assistants to perform live, real-time web searches using OpenAI’s web search API, allowing them to access up-to-date information and answer questions about current events, recent facts, and more.

What platforms can use this MCP server?

It can be integrated with platforms like FlowHunt, Claude, Zed, and any environment that supports the Model Context Protocol (MCP).

Is API key security supported?

Yes. API keys are set via environment variables in your configuration for all supported platforms, keeping them secure.

What are the main use cases?

Current events Q&A, research assistance, enriching AI context with fresh web data, and tailoring responses based on user location.

Does it support location-aware search?

Yes. You can provide user location details in the tool arguments to get more relevant, localized search results.

What tools does the server provide?

It provides a 'web_search' tool, allowing AIs to query the web in real time, with options for context size and location.

Supercharge AI with Real-Time Web Search

Give your AI agents in FlowHunt real-world knowledge with the OpenAI WebSearch MCP Server. Start now to unlock current events, research assistance, and more.
