
Higress AI Search MCP

Model Context Protocol Integration

Overview

Enhances AI model responses with real-time search results from various engines through Higress ai-search, supporting internet, academic, and internal knowledge searches.

Installation Instructions


README: https://github.com/cr7258/higress-ai-search-mcp-server


Higress AI-Search MCP Server

Overview

A Model Context Protocol (MCP) server that provides an AI search tool to enhance AI model responses with real-time search results from various search engines through the Higress ai-search feature.


Demo

Cline

https://github.com/user-attachments/assets/60a06d99-a46c-40fc-b156-793e395542bb

Claude Desktop

https://github.com/user-attachments/assets/5c9e639f-c21c-4738-ad71-1a88cc0bcb46

Features

  • Internet Search: Google, Bing, Quark - for general web information
  • Academic Search: Arxiv - for scientific papers and research
  • Internal Knowledge Search: internal knowledge bases (described via INTERNAL_KNOWLEDGE_BASES) - for organization-specific information

Prerequisites

  • uv for package installation.
  • Configure Higress with the ai-search plugin and the ai-proxy plugin.

Configuration

The server can be configured using environment variables (an illustrative usage sketch follows the list):

  • HIGRESS_URL (optional): URL for the Higress service (default: http://localhost:8080/v1/chat/completions).
  • MODEL (required): LLM model to use for generating responses.
  • INTERNAL_KNOWLEDGE_BASES (optional): Description of internal knowledge bases.
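
For illustration, here is a minimal sketch of how a server like this could use these variables to forward a query to Higress. It assumes the HIGRESS_URL endpoint is OpenAI-compatible (the default path points at a chat completions API); the function name and payload shape below are illustrative and not taken from the project source.

import os

import httpx  # any HTTP client would do; httpx is used here only for illustration


def search_via_higress(query: str) -> str:
    """Hypothetical helper: send a query through the Higress ai-search endpoint."""
    url = os.environ.get("HIGRESS_URL", "http://localhost:8080/v1/chat/completions")
    model = os.environ["MODEL"]  # required, e.g. "qwen-turbo"

    # Assumes an OpenAI-compatible chat completions request; the ai-search
    # plugin on the Higress side enriches the model's answer with search results.
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": query}],
    }
    response = httpx.post(url, json=payload, timeout=30.0)
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]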

Option 1: Using uvx

Using uvx automatically installs the package from PyPI, so there is no need to clone the repository locally.

{
  "mcpServers": {
    "higress-ai-search-mcp-server": {
      "command": "uvx",
      "args": [
        "higress-ai-search-mcp-server"
      ],
      "env": {
        "HIGRESS_URL": "http://localhost:8080/v1/chat/completions",
        "MODEL": "qwen-turbo",
        "INTERNAL_KNOWLEDGE_BASES": "Employee handbook, company policies, internal process documents"
      }
    }
  }
}
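
As a usage illustration, the snippet below launches the server over stdio with the same command and environment as the configuration above and calls its search tool via the official MCP Python SDK. The tool name "ai-search" and the "query" argument are placeholders for this sketch, not names taken from the project documentation; inspect list_tools() for the real schema.

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Mirrors the Option 1 configuration: launch the server via uvx over stdio.
server = StdioServerParameters(
    command="uvx",
    args=["higress-ai-search-mcp-server"],
    env={
        "HIGRESS_URL": "http://localhost:8080/v1/chat/completions",
        "MODEL": "qwen-turbo",
    },
)


async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])  # discover the actual tool name

            # "ai-search" and "query" are assumptions; use the name and input
            # schema reported by list_tools() above.
            result = await session.call_tool("ai-search", {"query": "Latest Higress releases"})
            print(result.content)


asyncio.run(main())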

Option 2: Using uv with local development

Using uv requires cloning the repository locally and specifying the path to the source code.

{
  "mcpServers": {
    "higress-ai-search-mcp-server": {
      "command": "uv",
      "args": [
        "--directory",
        "path/to/src/higress-ai-search-mcp-server",
        "run",
        "higress-ai-search-mcp-server"
      ],
      "env": {
        "HIGRESS_URL": "http://localhost:8080/v1/chat/completions",
        "MODEL": "qwen-turbo",
        "INTERNAL_KNOWLEDGE_BASES": "Employee handbook, company policies, internal process documents"
      }
    }
  }
}

License

This project is licensed under the MIT License - see the LICENSE file for details.
