MCP Cursor

Enhance your development workflow with AI-powered MCP tools and extensions for Cursor IDE.


Langchain Integration MCP

Model Context Protocol Integration

Overview

Integrates Langchain with Model Context Protocol (MCP) tools.


Installation Instructions


README: https://github.com/shashwat001/mcptools-langchain-integration

Langchain TypeScript with MCP Tools Integration

A TypeScript project that integrates Langchain with Model Context Protocol (MCP) tools, allowing interaction with language models and execution of various tools through a chat interface.

Prerequisites

  • Node.js (v14 or higher)
  • npm
  • Ollama server running locally
  • MCP server running locally

Installation

  1. Clone the repository:
git clone https://github.com/shashwat001/mcptools-langchain-integration.git
cd mcptools-langchain-integration
  2. Install dependencies:
npm install

Configuration

Ollama Configuration

The project uses Ollama for LLM integration. Configure Ollama settings in src/llm.js:

export const ollamaConfig = {
    baseUrl: "http://localhost:11434",
    model: "llama3.1:8b-instruct-q6_K",
    temperature: 0.1,
    maxRetries: 2
};
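As a minimal sketch of how such a config might be sanity-checked before use (the `OllamaConfig` interface and `validateOllamaConfig` helper are illustrative, not part of the repository):

```typescript
// Hypothetical helper: checks the ollamaConfig shape before the client
// is constructed, returning a list of human-readable problems.
export interface OllamaConfig {
  baseUrl: string;
  model: string;
  temperature: number;
  maxRetries: number;
}

export function validateOllamaConfig(cfg: OllamaConfig): string[] {
  const problems: string[] = [];
  if (!/^https?:\/\//.test(cfg.baseUrl)) {
    problems.push("baseUrl must be an http(s) URL");
  }
  if (cfg.model.trim() === "") {
    problems.push("model must be a non-empty model tag");
  }
  if (cfg.temperature < 0) {
    problems.push("temperature must be >= 0");
  }
  if (!Number.isInteger(cfg.maxRetries) || cfg.maxRetries < 0) {
    problems.push("maxRetries must be a non-negative integer");
  }
  return problems;
}
```

Failing fast on a malformed config is cheaper than debugging a silent connection error against the Ollama server later.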

MCP Server Configuration

MCP server settings can be configured in src/llm.js:

export const mcpConfig = {
    serverUrl: 'http://localhost:7000/sse',
    clientInfo: {
        name: 'ollama-client',
        version: '1.0.0'
    }
};
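The `serverUrl` ends in `/sse` because the server speaks the Server-Sent Events wire format. The MCP SDK handles this transport for you, but as an illustration of what arrives on that stream, here is a minimal parser for the SSE format (the field names `event` and `data` come from the SSE specification; the sample `endpoint` event mirrors what an SSE-based MCP transport typically sends, though the exact payloads are server-specific):

```typescript
// Illustrative only: parse a chunk of Server-Sent Events text into events.
// Events are separated by blank lines; multiple data: lines are joined with \n.
export interface SseEvent {
  event: string;
  data: string;
}

export function parseSseChunk(chunk: string): SseEvent[] {
  const events: SseEvent[] = [];
  for (const block of chunk.split(/\n\n+/)) {
    let event = "message"; // default event type per the SSE spec
    const data: string[] = [];
    for (const line of block.split("\n")) {
      if (line.startsWith("event:")) event = line.slice(6).trim();
      else if (line.startsWith("data:")) data.push(line.slice(5).trimStart());
    }
    if (data.length > 0) events.push({ event, data: data.join("\n") });
  }
  return events;
}
```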

System Prompt

The system prompt for tool interactions can be modified in src/llm.js:

export const systemPromptForTools = "In this environment you have access to a set of tools you can use to answer the user's question.\n Don't ask user to execute the functions and decide yourself whether to call the tool or not.\nNever call more than one tool at a time.";
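A system prompt like this is typically prepended to the message list on every turn, ahead of the conversation history. A sketch of that assembly (the `ChatMessage` type and `buildMessages` helper are illustrative names, not the repository's API):

```typescript
// Sketch: place the tool-use system prompt at the head of the messages
// sent to the model, followed by prior turns and the new user input.
type Role = "system" | "user" | "assistant";
interface ChatMessage { role: Role; content: string; }

export function buildMessages(
  systemPrompt: string,
  history: ChatMessage[],
  userInput: string
): ChatMessage[] {
  return [
    { role: "system", content: systemPrompt }, // tool instructions always lead
    ...history,
    { role: "user", content: userInput },
  ];
}
```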

Running the Application

  1. Start the Ollama server (make sure it's running on http://localhost:11434)
  2. Start the MCP server (make sure it's running on http://localhost:7000)
  3. Run the application:
node src/index.js

Features

  • Interactive chat interface with LLM
  • Integration with MCP tools
  • Tool execution through chat
  • Support for SSE (Server-Sent Events) based MCP server
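The "tool execution through chat" flow can be sketched as a registry dispatch: the model emits a tool call, and the name is looked up in a registry of handlers (the `ToolCall` and `ToolRegistry` names are hypothetical, not the repository's API; in the real project each handler would forward the call to the MCP server over SSE rather than run in-process):

```typescript
// Hypothetical sketch of dispatching a model-emitted tool call.
interface ToolCall { name: string; args: Record<string, unknown>; }
type ToolHandler = (args: Record<string, unknown>) => string;
type ToolRegistry = Map<string, ToolHandler>;

export function executeToolCall(registry: ToolRegistry, call: ToolCall): string {
  const handler = registry.get(call.name);
  if (!handler) {
    // Return the failure as text so the model can read it and recover.
    return `error: unknown tool "${call.name}"`;
  }
  return handler(call.args);
}
```

Returning errors as plain strings, rather than throwing, keeps the chat loop alive and lets the model retry with a different tool.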

Important Notes

MCP Server Caution

The project currently uses an SSE-based MCP server. Exercise caution: the MCP server has write permissions, so a tool call could make unintended changes to your system. Always review a tool's permissions before allowing it to execute.
