
Markdown Web Crawl MCP

Model Context Protocol Integration

Overview

A Python-based web crawler that extracts website content into markdown files, enabling efficient content aggregation and site archiving.


Installation Instructions


README: https://github.com/jmh108/md-webcrawl-mcp

MD MCP Webcrawler Project

A Python-based MCP (https://modelcontextprotocol.io/introduction) web crawler for extracting and saving website content.

Features

  • Extract website content and save as markdown files
  • Map website structure and links
  • Batch processing of multiple URLs
  • Configurable output directory

Installation

  1. Clone the repository:

     ```
     git clone https://github.com/yourusername/webcrawler.git
     cd webcrawler
     ```

  2. Install dependencies:

     ```
     pip install -r requirements.txt
     ```

  3. Optional: Configure environment variables:

     ```
     export OUTPUT_PATH=./output  # Set your preferred output directory
     ```

Output

Crawled content is saved in markdown format in the specified output directory.

Configuration

The server can be configured through environment variables:

  • OUTPUT_PATH: Default output directory for saved files
  • MAX_CONCURRENT_REQUESTS: Maximum parallel requests (default: 5)
  • REQUEST_TIMEOUT: Request timeout in seconds (default: 30)
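At startup, these variables can be read with their documented defaults. A minimal sketch of such a loader — the `load_config` helper and its parsing logic are illustrative assumptions, not the server's actual code:

```python
import os

def load_config(env=None):
    """Illustrative loader for the documented variables (not the server's actual code)."""
    env = os.environ if env is None else env
    return {
        "output_path": env.get("OUTPUT_PATH", "./output"),
        "max_concurrent_requests": int(env.get("MAX_CONCURRENT_REQUESTS", "5")),
        "request_timeout": int(env.get("REQUEST_TIMEOUT", "30")),
    }

# Documented defaults apply when a variable is unset; environment strings are parsed to int.
print(load_config({}))
print(load_config({"REQUEST_TIMEOUT": "60"})["request_timeout"])
```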

Claude Set-Up

Install with FastMCP:

```
fastmcp install server.py
```

or use custom settings to run with fastmcp directly:

```json
"Crawl Server": {
  "command": "fastmcp",
  "args": [
    "run",
    "/Users/mm22/Dev_Projekte/servers-main/src/Webcrawler/server.py"
  ],
  "env": {
    "OUTPUT_PATH": "/Users/user/Webcrawl"
  }
}
```

Development

Live Development

```
fastmcp dev server.py --with-editable .
```

Debug

The MCP Inspector (https://modelcontextprotocol.io/docs/tools/inspector) is helpful for debugging.

Examples

Example 1: Extract and Save Content

```
mcp call extract_content --url "https://example.com" --output_path "example.md"
```
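Conceptually, `extract_content` fetches a page and converts its HTML to markdown before saving. A minimal, self-contained sketch of that conversion step — the `MarkdownExtractor` class is a hypothetical illustration handling only headings and paragraph text, not the server's implementation:

```python
from html.parser import HTMLParser

class MarkdownExtractor(HTMLParser):
    """Naive HTML-to-markdown converter: headings and text only (illustrative)."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self._heading = None

    def handle_starttag(self, tag, attrs):
        # Map <h1>..<h3> to markdown heading prefixes of matching depth.
        if tag in ("h1", "h2", "h3"):
            self._heading = "#" * int(tag[1]) + " "

    def handle_data(self, data):
        text = data.strip()
        if not text:
            return
        if self._heading:
            self.parts.append(self._heading + text)
            self._heading = None
        else:
            self.parts.append(text)

def html_to_markdown(html: str) -> str:
    parser = MarkdownExtractor()
    parser.feed(html)
    return "\n\n".join(parser.parts)

print(html_to_markdown("<h1>Example</h1><p>Hello world.</p>"))
# prints "# Example", a blank line, then "Hello world."
```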

Example 2: Create Content Index

```
mcp call scan_linked_content --url "https://example.com" | \
  mcp call create_index --content_map - --output_path "index.md"
```
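The `create_index` step turns the scanned content map into a markdown index. A sketch of the kind of output it might produce, assuming the map is a URL-to-title mapping — the function body here is illustrative, not the server's code:

```python
def create_index(content_map):
    """Render a url -> title mapping as a markdown link list (illustrative)."""
    lines = ["# Content Index", ""]
    for url, title in sorted(content_map.items()):
        lines.append(f"- [{title}]({url})")
    return "\n".join(lines)

print(create_index({
    "https://example.com/a": "Page A",
    "https://example.com/b": "Page B",
}))
```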

Contributing

  1. Fork the repository
  2. Create a feature branch (`git checkout -b feature/AmazingFeature`)
  3. Commit your changes (`git commit -m 'Add some AmazingFeature'`)
  4. Push to the branch (`git push origin feature/AmazingFeature`)
  5. Open a Pull Request

License

Distributed under the MIT License. See LICENSE for more information.

Requirements

  • Python 3.7+
  • FastMCP (`uv pip install fastmcp`)
  • Dependencies listed in requirements.txt
