Wikidata MCP Server

AI Knowledge Graph Wikidata MCP

Contact us to host your MCP Server in FlowHunt

FlowHunt provides an additional security layer between your internal systems and AI tools, giving you granular control over which tools are accessible from your MCP servers. MCP servers hosted in our infrastructure can be seamlessly integrated with FlowHunt's chatbot as well as popular AI platforms like ChatGPT, Claude, and various AI editors.

What does “Wikidata” MCP Server do?

The Wikidata MCP Server is a server implementation of the Model Context Protocol (MCP), designed to interface directly with the Wikidata API. It provides a bridge between AI assistants and the vast structured knowledge in Wikidata, allowing developers and AI agents to seamlessly search for entity and property identifiers, extract metadata (such as labels and descriptions), and execute SPARQL queries. By exposing these capabilities as MCP tools, the server enables tasks like semantic search, knowledge extraction, and contextual enrichment in development workflows where external structured data is needed. This enhances AI-driven applications by allowing them to retrieve, query, and reason about up-to-date information from Wikidata.

List of Prompts

No prompt templates are mentioned in the repository or documentation.


List of Resources

No explicit MCP resources are described in the repository or documentation.

List of Tools

  • search_entity(query: str)
    Search Wikidata for an entity ID matching the query text.
  • search_property(query: str)
    Search Wikidata for a property ID matching the query text.
  • get_properties(entity_id: str)
    Get the properties associated with a given Wikidata entity ID.
  • execute_sparql(sparql_query: str)
    Execute a SPARQL query against Wikidata.
  • get_metadata(entity_id: str, language: str = "en")
    Retrieve the label and description for a given Wikidata entity ID in the specified language (English by default).
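The repository does not document the server's internals, but the tools above map naturally onto Wikidata's public endpoints. As an illustrative sketch (the endpoint wrapping is an assumption, not taken from the source), here is how requests to the entity-search API and the SPARQL query service could be constructed:

```python
from urllib.parse import urlencode

# Public Wikidata endpoints that tools like search_entity and
# execute_sparql plausibly wrap (illustrative assumption).
WIKIDATA_API = "https://www.wikidata.org/w/api.php"
SPARQL_ENDPOINT = "https://query.wikidata.org/sparql"

def search_entity_url(query: str) -> str:
    """Build a wbsearchentities request URL, which resolves free text
    to Wikidata entity IDs (Q-numbers)."""
    params = {
        "action": "wbsearchentities",
        "search": query,
        "language": "en",
        "type": "item",
        "format": "json",
    }
    return f"{WIKIDATA_API}?{urlencode(params)}"

def execute_sparql_url(sparql_query: str) -> str:
    """Build a Wikidata Query Service request URL; results return as JSON."""
    return f"{SPARQL_ENDPOINT}?{urlencode({'query': sparql_query, 'format': 'json'})}"

print(search_entity_url("Douglas Adams"))
```

Fetching either URL with any HTTP client returns JSON that the server can relay back to the AI agent.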

Use Cases of this MCP Server

  • Semantic Data Retrieval
    Use AI assistants to search for entities or properties in Wikidata, providing users with accurate IDs for further data manipulation or exploration.
  • Automated Metadata Extraction
    Automatically extract labels and descriptions for Wikidata entities, enriching data-driven applications or projects with contextual information.
  • Programmatic SPARQL Query Execution
    Enable LLM-powered agents to formulate and execute SPARQL queries, making it possible to answer complex questions or gather structured knowledge dynamically.
  • Knowledge Graph Exploration
    Allow developers to explore relationships between entities and properties in Wikidata, supporting research, data analysis, and linked data workflows.
  • AI-Assisted Recommendations
    Build AI agents that can recommend items (e.g., movies by a certain director) by combining entity search, property retrieval, and SPARQL execution.
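For the recommendation use case above, a minimal sketch (not taken from the repository) of the SPARQL an agent might generate after resolving a director's QID with search_entity. P31 (instance of), Q11424 (film), and P57 (director) are standard Wikidata identifiers:

```python
def movies_by_director(director_qid: str, limit: int = 10) -> str:
    """Build a SPARQL query listing films (P31 -> Q11424) whose
    director (P57) is the given entity, with English labels.
    director_qid would typically come from search_entity()."""
    return f"""SELECT ?film ?filmLabel WHERE {{
  ?film wdt:P31 wd:Q11424 ;
        wdt:P57 wd:{director_qid} .
  SERVICE wikibase:label {{ bd:serviceParam wikibase:language "en". }}
}}
LIMIT {limit}"""

# The resulting string is what an agent would pass to execute_sparql().
print(movies_by_director("Q25191"))
```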

How to set it up

Windsurf

  1. Ensure you have Node.js installed.
  2. Locate your Windsurf configuration file.
  3. Add the Wikidata MCP Server to your mcpServers configuration using a JSON snippet like below.
  4. Save the configuration and restart Windsurf.
  5. Verify that the server appears in your MCP integrations.
"mcpServers": {
  "wikidata-mcp": {
    "command": "npx",
    "args": ["@zzaebok/mcp-wikidata@latest"]
  }
}

Securing API Keys (if needed):

The public Wikidata API does not require an API key. If your deployment does use one, pass it via environment variables inside the server entry rather than hardcoding it:

"mcpServers": {
  "wikidata-mcp": {
    "command": "npx",
    "args": ["@zzaebok/mcp-wikidata@latest"],
    "env": {
      "WIKIDATA_API_KEY": "your-api-key"
    }
  }
}

Claude

  1. Install Node.js if not already installed.
  2. Open Claude’s configuration file.
  3. Insert the following configuration for the Wikidata MCP Server.
  4. Save and restart Claude Desktop.
  5. Confirm the server is accessible.
"mcpServers": {
  "wikidata-mcp": {
    "command": "npx",
    "args": ["@zzaebok/mcp-wikidata@latest"]
  }
}

Securing API Keys:

The public Wikidata API does not require an API key. If your deployment does use one, pass it via environment variables inside the server entry:

"mcpServers": {
  "wikidata-mcp": {
    "command": "npx",
    "args": ["@zzaebok/mcp-wikidata@latest"],
    "env": {
      "WIKIDATA_API_KEY": "your-api-key"
    }
  }
}

Cursor

  1. Install Node.js and ensure Cursor supports MCP.
  2. Edit your Cursor configuration file.
  3. Add the Wikidata MCP Server entry as shown.
  4. Save changes and restart Cursor.
  5. Verify the server is listed.
"mcpServers": {
  "wikidata-mcp": {
    "command": "npx",
    "args": ["@zzaebok/mcp-wikidata@latest"]
  }
}

Securing API Keys:

The public Wikidata API does not require an API key. If your deployment does use one, pass it via environment variables inside the server entry:

"mcpServers": {
  "wikidata-mcp": {
    "command": "npx",
    "args": ["@zzaebok/mcp-wikidata@latest"],
    "env": {
      "WIKIDATA_API_KEY": "your-api-key"
    }
  }
}

Cline

  1. Ensure Node.js is set up.
  2. Update the Cline config file with the MCP Server details.
  3. Add the configuration as below.
  4. Save and restart Cline.
  5. Check the MCP server integration.
"mcpServers": {
  "wikidata-mcp": {
    "command": "npx",
    "args": ["@zzaebok/mcp-wikidata@latest"]
  }
}

Securing API Keys:

The public Wikidata API does not require an API key. If your deployment does use one, pass it via environment variables inside the server entry:

"mcpServers": {
  "wikidata-mcp": {
    "command": "npx",
    "args": ["@zzaebok/mcp-wikidata@latest"],
    "env": {
      "WIKIDATA_API_KEY": "your-api-key"
    }
  }
}

How to use this MCP inside flows

Using MCP in FlowHunt

To integrate MCP servers into your FlowHunt workflow, start by adding the MCP component to your flow and connecting it to your AI agent:

FlowHunt MCP flow

Click on the MCP component to open the configuration panel. In the system MCP configuration section, insert your MCP server details using this JSON format:

{
  "wikidata-mcp": {
    "transport": "streamable_http",
    "url": "https://yourmcpserver.example/pathtothemcp/url"
  }
}

Once configured, the AI agent can use this MCP as a tool with access to all its functions and capabilities. Remember to change "wikidata-mcp" to the actual name of your MCP server and to replace the URL with your own MCP server's URL.


Overview

| Section | Availability | Details/Notes |
| --- | --- | --- |
| Overview | ✅ | Overview available in README.md |
| List of Prompts | ⛔ | No prompt templates found |
| List of Resources | ⛔ | No explicit resources listed |
| List of Tools | ✅ | Tools detailed in README.md |
| Securing API Keys | ⛔ | No explicit API key requirement found |
| Sampling Support (less important in evaluation) | ⛔ | Not mentioned |

Our opinion

The Wikidata MCP Server is a simple but effective implementation, providing several useful tools for interacting with Wikidata via MCP. However, it lacks documentation on prompt templates, resources, and sampling/roots support, which limits its flexibility for more advanced or standardized MCP integrations. The presence of a license, clear tooling, and active updates make it a solid starting point for MCP use cases focused on Wikidata.

MCP Score

Has a LICENSE: ✅ (MIT)
Has at least one tool: ✅
Number of Forks: 5
Number of Stars: 18

MCP Server Rating: 6/10
Solid core functionality, but lacking in standard MCP resource/prompt support and advanced features. Good for direct Wikidata integration use cases.
