
How to Connect MCP Servers to Any AI Client — Claude, Cursor, VS Code, Windsurf, ChatGPT, and More

The complete guide to connecting MCP servers to every major AI client in 2026. Covers Claude Desktop, Cursor, VS Code, Windsurf, JetBrains, Claude Code, Cline, Goose, ChatGPT — plus framework SDKs for LangChain, CrewAI, Vercel AI, OpenAI Agents, and more.

Engineering Team
April 14, 2026

You installed Claude Desktop. You heard about MCP. You want your AI assistant to manage your Jira tickets, read your Notion docs, and search your codebase — not just talk about them.

This guide covers every major AI client: Claude Desktop, Cursor, VS Code (Copilot), Windsurf, JetBrains, Claude Code, Cline, Goose, Lovable, and ChatGPT — plus framework SDKs for programmatic agents.

We’ll show two approaches for each: the local method (manual config) and the managed method (one URL through the Vinkius AI Gateway).

The two ways to connect an MCP server

Before we cover each client, understand the fundamental choice:

Local MCP servers run on your machine. You install them with npx or pip, configure environment variables, manage API keys in plaintext config files, and restart your client every time you change something. This works for experimentation. It doesn’t work for production — because there’s no DLP, no audit trail, and your API keys sit in a JSON file on your laptop.

Managed MCP servers run behind a gateway. You subscribe to a server, get a URL, and paste it into your client. The gateway handles authentication, transport negotiation, credential isolation, and payload inspection. Your config file contains a single URL. Nothing else.

Here’s the difference in practice:

Local config (5 minutes, manual key management):

{
  "mcpServers": {
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": {
        "GITHUB_PERSONAL_ACCESS_TOKEN": "ghp_xxxxxxxxxxxxxxxxxxxx"
      }
    }
  }
}

Your GitHub token sits in plaintext. If that file gets committed to a repo, leaked, or accessed by malware, your entire GitHub account is exposed.

Managed config via Vinkius (30 seconds, zero key management):

{
  "mcpServers": {
    "github": {
      "url": "https://edge.vinkius.com/YOUR_VINKIUS_TOKEN/github-mcp"
    }
  }
}

No API key in the config. No npx. No environment variables. The Vinkius AI Gateway holds the credential in an encrypted vault, auto-negotiates the transport protocol (Streamable HTTP or SSE — depending on what your client supports), and scans every payload through the DLP pipeline. One URL. Zero configuration.
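Transport negotiation is invisible to you, but conceptually it works like this (a simplified, hypothetical sketch — not the gateway’s actual code): the gateway looks at the client’s first request and answers in whichever dialect that client speaks.

```python
# Hypothetical sketch of transport auto-negotiation. A modern MCP client
# POSTs JSON-RPC messages (Streamable HTTP); a legacy client opens a GET
# request expecting a Server-Sent Events stream.
def negotiate_transport(method: str, accept_header: str) -> str:
    accepts = {v.strip() for v in accept_header.split(",")}
    if method == "POST" and "application/json" in accepts:
        return "streamable-http"  # modern client: JSON-RPC over POST
    if method == "GET" and "text/event-stream" in accepts:
        return "sse"  # legacy client: long-lived SSE stream
    raise ValueError("unsupported client transport")
```

Because the gateway makes this decision per request, the same URL works unchanged whether your client was built last week or last year.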

You pick a server from the App Catalog, subscribe with one click, copy the URL, and paste it. Every server is hardened and governed from day one — there is no raw mode.


Claude Desktop

Claude Desktop is the most popular MCP host. Here’s how to connect.

Step 1. Open your config file:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json

(Shortcut: In Claude Desktop, go to Settings → Developer → Edit Config.)

Step 2. Add your server. For a managed Vinkius server:

{
  "mcpServers": {
    "github": {
      "url": "https://edge.vinkius.com/YOUR_VINKIUS_TOKEN/github-mcp"
    }
  }
}

For a local server:

{
  "mcpServers": {
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": {
        "GITHUB_PERSONAL_ACCESS_TOKEN": "ghp_your_token"
      }
    }
  }
}

Step 3. Save the file and restart Claude Desktop completely (quit from the system tray, not just close the window).

Step 4. Verify: click the “+” button in the chat input and look for “Connectors.” You’ll see your server and the tools it exposes.


Cursor

Cursor has the deepest MCP integration of any IDE. You can connect via the UI, a config file, or a one-click deep link.

One-click install (Vinkius)

If you’re using a managed server, the fastest path is a deep link. The Vinkius dashboard generates these automatically:

cursor://anysphere.cursor-deeplink/mcp/install?name=github&config=eyJ1cmwiOiJodHRwczovL2VkZ2Uudmlua2l1cy5jb20vWU9VUl9WSU5LSVVTX1RPS0VOL2dpdGh1Yi1tY3AifQ==

Click the link, Cursor opens, confirms the server, and you’re connected. No file editing.
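The deep link is nothing magic: it’s just the server’s config JSON, base64-encoded into the `config` query parameter of a cursor:// URL. If you want to build one yourself, a sketch in Python (using the example name `github` and URL from above):

```python
import base64
import json

# Build a Cursor MCP install deep link: compact-encode the config JSON,
# then base64-encode it into the `config` query parameter.
config = {"url": "https://edge.vinkius.com/YOUR_VINKIUS_TOKEN/github-mcp"}
encoded = base64.b64encode(
    json.dumps(config, separators=(",", ":")).encode()
).decode()
link = f"cursor://anysphere.cursor-deeplink/mcp/install?name=github&config={encoded}"
print(link)
```

Decoding the `config` parameter of the link shown above yields exactly this JSON object.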

Config file

Global (all projects): ~/.cursor/mcp.json
Project-specific: .cursor/mcp.json in your project root.

{
  "mcpServers": {
    "github": {
      "url": "https://edge.vinkius.com/YOUR_VINKIUS_TOKEN/github-mcp"
    }
  }
}

Settings UI

  1. Open Settings (⌘, / Ctrl+,) → scroll to Features → MCP Servers.
  2. Click “+ Add new MCP Server”.
  3. Set Type to “SSE”, enter a name, and paste your endpoint URL.
  4. A green indicator confirms a successful connection.

VS Code (GitHub Copilot)

VS Code supports MCP natively through the Copilot extension. Make sure Agent Mode is selected in your Copilot Chat window.

One-click install (Vinkius)

Like Cursor, VS Code supports deep links:

vscode:mcp/install?%7B%22name%22%3A%22github%22%2C%22type%22%3A%22http%22%2C%22url%22%3A%22https%3A%2F%2Fedge.vinkius.com%2FYOUR_VINKIUS_TOKEN%2Fgithub-mcp%22%7D
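Here the config JSON is percent-encoded rather than base64-encoded, but it’s the same idea. A Python sketch (again using the example name and URL from above):

```python
import json
import urllib.parse

# Build a VS Code MCP install link: compact-encode the config JSON, then
# percent-encode every reserved character (hence safe="").
config = {
    "name": "github",
    "type": "http",
    "url": "https://edge.vinkius.com/YOUR_VINKIUS_TOKEN/github-mcp",
}
link = "vscode:mcp/install?" + urllib.parse.quote(
    json.dumps(config, separators=(",", ":")), safe=""
)
print(link)
```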

Command Palette

  1. Open Command Palette (⌘⇧P / Ctrl+Shift+P) → type “MCP: Add Server”.
  2. Select “HTTP” as the server type.
  3. Paste your endpoint URL when prompted.
  4. Choose where to save the config (workspace or user-level).

Config file

Workspace: .vscode/mcp.json
User-level: open via Command Palette → MCP: Open User Configuration

{
  "servers": {
    "github": {
      "type": "http",
      "url": "https://edge.vinkius.com/YOUR_VINKIUS_TOKEN/github-mcp"
    }
  }
}

Windsurf

Windsurf uses MCP through its Cascade AI system.

  1. Open the Cascade panel in the right sidebar.
  2. Click the hammer icon (MCPs) → then “View raw config”.
  3. This opens ~/.codeium/windsurf/mcp_config.json.
  4. Add your server:
{
  "mcpServers": {
    "github": {
      "serverUrl": "https://edge.vinkius.com/YOUR_VINKIUS_TOKEN/github-mcp"
    }
  }
}

(Note: Windsurf uses serverUrl instead of url.)

  5. Save the file, then click “Refresh” in the Cascade MCP panel.

JetBrains (IntelliJ, WebStorm, PyCharm)

JetBrains IDEs support MCP through the AI Assistant plugin.

  1. Open Settings (⌘ , / Ctrl+Alt+S) → Tools → AI Assistant → Model Context Protocol (MCP).
  2. Click the “+” button to add a new server.
  3. Select “SSE” as the transport type.
  4. Paste your endpoint URL.
  5. Click OK — the tools will appear in the AI Assistant chat.

Claude Code (Terminal)

Claude Code is Anthropic’s terminal-based agent. It uses CLI commands to manage MCP servers.

claude mcp add --transport sse github "https://edge.vinkius.com/YOUR_VINKIUS_TOKEN/github-mcp"

(Important: all flags must come before the server name.)

Verify the connection:

# Inside a Claude Code session:
/mcp

This lists all connected servers and their available tools.


Cline

Cline is a VS Code extension with its own MCP configuration.

  1. Open the Cline panel and click the MCP Servers icon (top-right).
  2. Go to the “Configure” tab → click “Configure MCP Servers”.
  3. This opens cline_mcp_settings.json.
  4. Add your server:
{
  "mcpServers": {
    "github": {
      "url": "https://edge.vinkius.com/YOUR_VINKIUS_TOKEN/github-mcp",
      "disabled": false
    }
  }
}
  5. Save — the server will auto-connect.

Goose

Goose supports both GUI and config-file setup.

GUI: Open Goose Desktop → Settings → Extensions → “Add custom extension” → choose Remote/SSE → paste your URL.

Config file: Edit ~/.config/goose/config.yaml:

extensions:
  github:
    type: remote
    uri: https://edge.vinkius.com/YOUR_VINKIUS_TOKEN/github-mcp

Restart your Goose session to activate.


ChatGPT

ChatGPT supports MCP through its Developer Mode and Connectors.

  1. Go to Settings → Connectors (or Advanced Settings).
  2. Enable Developer Mode.
  3. Create a new connector → input your MCP server URL.
  4. In a new chat, click “+” → More → Developer Mode → select your server.

Framework SDKs

If you’re building programmatic agents, here’s how to connect to a Vinkius-managed MCP server from every major framework.

Vercel AI SDK (TypeScript)

import { createMCPClient } from "@ai-sdk/mcp";
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

const client = await createMCPClient({
  transport: {
    type: "sse",
    url: "https://edge.vinkius.com/YOUR_VINKIUS_TOKEN/github-mcp",
  },
});

const { text } = await generateText({
  model: openai("gpt-4o"),
  tools: await client.tools(),
  prompt: "List all open pull requests in the main repository.",
});

OpenAI Agents SDK (Python)

from agents import Agent
from agents.mcp import MCPServerStreamableHttp

async with MCPServerStreamableHttp(
    url="https://edge.vinkius.com/YOUR_VINKIUS_TOKEN/github-mcp"
) as server:
    agent = Agent(
        name="GitHub Assistant",
        instructions="Help the user manage their GitHub repositories.",
        mcp_servers=[server],
    )

LangChain (Python)

from langchain_mcp_adapters.client import MultiServerMCPClient

async with MultiServerMCPClient({
    "github": {
        "url": "https://edge.vinkius.com/YOUR_VINKIUS_TOKEN/github-mcp",
    }
}) as client:
    tools = client.get_tools()
    # Use tools with any LangChain agent

CrewAI (Python)

from crewai import Agent, Task, Crew

agent = Agent(
    role="DevOps Engineer",
    goal="Monitor and manage GitHub repositories",
    backstory="An expert agent.",
    mcps=[{
        "url": "https://edge.vinkius.com/YOUR_VINKIUS_TOKEN/github-mcp",
    }],
)

Google ADK (Python)

from google.adk.agents import Agent
from google.adk.mcp import McpToolset, SseServerParams

mcp_tools = McpToolset.from_server(
    server_params=SseServerParams(
        uri="https://edge.vinkius.com/YOUR_VINKIUS_TOKEN/github-mcp",
    )
)

agent = Agent(
    model="gemini-2.0-flash",
    tools=mcp_tools,
)

Anthropic SDK (TypeScript)

import Anthropic from "@anthropic-ai/sdk";
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import {
  StreamableHTTPClientTransport,
} from "@modelcontextprotocol/sdk/client/streamableHttp.js";

const mcp = new Client({ name: "github", version: "1.0.0" });
await mcp.connect(
  new StreamableHTTPClientTransport(
    new URL("https://edge.vinkius.com/YOUR_VINKIUS_TOKEN/github-mcp")
  )
);

const anthropic = new Anthropic();
const { tools } = await mcp.listTools();

// Illustrative: hand the MCP tools to Claude (MCP's inputSchema maps to
// the Messages API's input_schema field).
const response = await anthropic.messages.create({
  model: "claude-sonnet-4-5",
  max_tokens: 1024,
  tools: tools.map((t) => ({
    name: t.name,
    description: t.description,
    input_schema: t.inputSchema,
  })),
  messages: [{ role: "user", content: "List all open pull requests." }],
});

Semantic Kernel (C#)

using Microsoft.SemanticKernel;
using ModelContextProtocol.Client;
using ModelContextProtocol.Client.Transport;

var client = await McpClientFactory.CreateAsync(
    new SseClientTransport(
        new Uri("https://edge.vinkius.com/YOUR_VINKIUS_TOKEN/github-mcp")
    )
);

var tools = await client.ListToolsAsync();

var kernel = Kernel.CreateBuilder()
    .AddOpenAIChatCompletion("gpt-4o", apiKey)
    .Build();

foreach (var tool in tools)
    kernel.Plugins.AddFromFunctions(
        "github", [tool.AsKernelFunction()]);

AutoGen (Python)

from autogen_agentchat.agents import AssistantAgent
from autogen_ext.models.openai import OpenAIChatCompletionClient
from autogen_ext.tools.mcp import McpWorkbench, SseServerParams

async with McpWorkbench(
    server_params=SseServerParams(
        url="https://edge.vinkius.com/YOUR_VINKIUS_TOKEN/github-mcp"
    )
) as workbench:
    agent = AssistantAgent(
        name="assistant",
        model_client=OpenAIChatCompletionClient(model="gpt-4o"),
        workbench=workbench,
    )

Why managed servers change the equation

Look at every code example and config snippet above. Notice the pattern: one URL, no API keys, no environment variables, no transport configuration.

That’s because our AI Gateway handles everything that local MCP servers force you to manage yourself:

  • Transport auto-negotiation — The gateway detects whether your client supports Streamable HTTP or SSE and adapts automatically. You never specify a transport type.
  • Credential isolation — Your API keys for GitHub, Jira, Stripe, Salesforce — they never appear in any config file. They live in an encrypted vault behind the gateway.
  • DLP pipeline — Every outbound payload is scanned for PII, financial data, SSH keys, and credentials in real time before it leaves the perimeter.
  • Semantic classification — Every tool call is categorized as QUERY, MODIFY, or DESTRUCTIVE before execution. Destructive mutations require explicit approval.
  • Cryptographic audit trail — Every execution is hash-chained and cryptographically signed. Immutable. Forensic-grade.
  • Emergency kill switch — Revoke all tokens, terminate all connections, lock all servers to inactive with one click.
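To make the classification step concrete, here is a deliberately simplified sketch of the idea (hypothetical — a real classifier works semantically on the tool’s schema and arguments, not on a verb lookup):

```python
# Toy illustration of QUERY / MODIFY / DESTRUCTIVE classification based
# on a tool name's leading verb. A production classifier also inspects
# the call's arguments and the tool's declared schema.
DESTRUCTIVE = {"delete", "drop", "remove", "destroy", "purge", "force"}
MODIFY = {"create", "update", "write", "push", "merge", "close", "set"}

def classify(tool_name: str) -> str:
    verb = tool_name.lower().split("_")[0]
    if verb in DESTRUCTIVE:
        return "DESTRUCTIVE"
    if verb in MODIFY:
        return "MODIFY"
    return "QUERY"  # reads (list, get, search, ...) are safe by default
```

Under this scheme, a `list_issues` call flows straight through, while a `delete_repository` call is held for explicit approval.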

You don’t build any of this. You don’t configure any of this. You pick an MCP server from the App Catalog, subscribe, copy the URL, and paste it. Hardened and governed from day one.


Troubleshooting

Server not appearing after save? You must fully restart the client (Claude Desktop, Cursor, etc.) — closing the window is not enough. Quit from the system tray.

JSON syntax error? The most common issue is trailing commas. Use a JSON validator or paste your config into VS Code and look for red underlines.

Connection timeout? If using a local server, verify that npx, node, or python are in your system PATH. If using a managed server, verify your token is valid in the Vinkius dashboard.

Tools not showing in chat? Some clients require you to explicitly enable tools. In VS Code, click the “Select Tools” icon in Copilot Chat. In ChatGPT, enable the connector in Developer Mode.

Need logs? Check your client’s log files:

  • Claude Desktop (macOS): ~/Library/Logs/Claude/mcp-server-*.log
  • Claude Desktop (Windows): %APPDATA%\Claude\logs\
  • Cursor/VS Code: Command Palette → “Developer: Show Logs”

Ready to connect your first server? Browse the App Catalog to find the MCP servers your workflow needs, or create a free account and connect to any of our governed MCP servers in under two minutes.


Hardened & governed from day one

Your agents need tools. We make them safe.

Pick an MCP server from the catalog. Subscribe. Copy the URL. Paste it into Claude, Cursor, or any client. One URL — DLP, audit trail, and kill switch included.

V8 sandbox isolation · Semantic DLP · Cryptographic audit trail · Emergency kill switch
