
Using your MCP

This document covers how to use LiteLLM as an MCP Gateway with the Responses API, Cursor IDE, and the OpenAI SDK.

Use on LiteLLM UI

Follow this walkthrough to use your MCP on the LiteLLM UI.

Use with Responses API

Replace http://localhost:4000 with your LiteLLM Proxy base URL.

Demo video: Using the Responses API with LiteLLM Proxy.

cURL Example
curl --location 'http://localhost:4000/v1/responses' \
--header 'Content-Type: application/json' \
--header "Authorization: Bearer sk-1234" \
--data '{
  "model": "gpt-5",
  "input": [
    {
      "role": "user",
      "content": "give me TLDR of what BerriAI/litellm repo is about",
      "type": "message"
    }
  ],
  "tools": [
    {
      "type": "mcp",
      "server_label": "litellm",
      "server_url": "litellm_proxy",
      "require_approval": "never"
    }
  ],
  "stream": true,
  "tool_choice": "required"
}'
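
Because the proxy exposes an OpenAI-compatible /v1/responses endpoint, the same request can be sent with the OpenAI Python SDK. The following is a minimal sketch, assuming the same base URL (http://localhost:4000) and key (sk-1234) as the cURL example above:

OpenAI Python SDK Example (sketch)
from openai import OpenAI

# Assumes the proxy runs at http://localhost:4000 with the virtual key
# sk-1234 used in the cURL example above.
client = OpenAI(base_url="http://localhost:4000/v1", api_key="sk-1234")

stream = client.responses.create(
    model="gpt-5",
    input=[{"role": "user", "content": "give me TLDR of what BerriAI/litellm repo is about", "type": "message"}],
    tools=[{
        "type": "mcp",
        "server_label": "litellm",
        "server_url": "litellm_proxy",
        "require_approval": "never",
    }],
    stream=True,
    tool_choice="required",
)

# Print streamed events as they arrive
for event in stream:
    print(event)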

Specifying MCP Tools

You can specify which MCP tools are available by using the allowed_tools parameter. This allows you to restrict access to specific tools within an MCP server.

To get the list of allowed tool names when using the LiteLLM MCP Gateway, navigate in the LiteLLM UI to MCP Servers > MCP Tools, click the tool, and copy the tool name.

cURL Example with allowed_tools
curl --location 'http://localhost:4000/v1/responses' \
--header 'Content-Type: application/json' \
--header "Authorization: Bearer sk-1234" \
--data '{
  "model": "gpt-5",
  "input": [
    {
      "role": "user",
      "content": "give me TLDR of what BerriAI/litellm repo is about",
      "type": "message"
    }
  ],
  "tools": [
    {
      "type": "mcp",
      "server_label": "litellm",
      "server_url": "litellm_proxy/mcp",
      "require_approval": "never",
      "allowed_tools": ["GitMCP-fetch_litellm_documentation"]
    }
  ],
  "stream": true,
  "tool_choice": "required"
}'
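
The same restriction can be expressed through the OpenAI Python SDK. A minimal sketch, reusing the client from the earlier example and the tool name copied from the LiteLLM UI:

OpenAI Python SDK Example with allowed_tools (sketch)
# Restrict the MCP server to a single tool; assumes the `client` from the
# previous sketch and a tool name copied from the LiteLLM UI.
response = client.responses.create(
    model="gpt-5",
    input=[{"role": "user", "content": "give me TLDR of what BerriAI/litellm repo is about", "type": "message"}],
    tools=[{
        "type": "mcp",
        "server_label": "litellm",
        "server_url": "litellm_proxy/mcp",
        "require_approval": "never",
        "allowed_tools": ["GitMCP-fetch_litellm_documentation"],
    }],
    stream=True,
    tool_choice="required",
)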

Use with Cursor IDE

Use tools directly from Cursor IDE with LiteLLM MCP:

Setup Instructions:

  1. Open Cursor Settings: Use ⇧+⌘+J (Mac) or Ctrl+Shift+J (Windows/Linux)
  2. Navigate to MCP Tools: Go to the "MCP Tools" tab and click "New MCP Server"
  3. Add Configuration: Copy and paste the JSON configuration below, then save with Cmd+S or Ctrl+S
Basic Cursor MCP Configuration
{
  "mcpServers": {
    "LiteLLM": {
      "url": "litellm_proxy",
      "headers": {
        "x-litellm-api-key": "Bearer $LITELLM_API_KEY"
      }
    }
  }
}

How it works when server_url="litellm_proxy"

When server_url="litellm_proxy", LiteLLM bridges non-MCP providers to your MCP tools.

  • Tool Discovery: LiteLLM fetches MCP tools and converts them to OpenAI-compatible definitions
  • LLM Call: Tools are sent to the LLM with your input; LLM selects which tools to call
  • Tool Execution: LiteLLM automatically parses arguments, routes calls to MCP servers, executes tools, and retrieves results
  • Response Integration: Tool results are sent back to LLM for final response generation
  • Output: Complete response combining LLM reasoning with tool execution results

This enables MCP tool usage with any LiteLLM-supported provider, regardless of native MCP support. The flow is sketched below.
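
Conceptually, the bridge runs a standard function-calling loop. The sketch below is illustrative only, not LiteLLM's actual internals; fetch_tools, to_openai_tool, and call_tool are hypothetical stand-ins for LiteLLM's MCP discovery and execution machinery.

Bridge Flow Sketch (Python)
import json

# Illustrative sketch of the litellm_proxy bridge flow, not LiteLLM's
# actual internals. fetch_tools, to_openai_tool, and call_tool are
# hypothetical helpers passed in by the caller.
def bridge(client, model, messages, fetch_tools, to_openai_tool, call_tool):
    # Tool Discovery: convert MCP tools to OpenAI-compatible definitions
    tools = [to_openai_tool(t) for t in fetch_tools()]

    # LLM Call: the model selects which tools to call
    resp = client.chat.completions.create(model=model, messages=messages, tools=tools)
    msg = resp.choices[0].message

    if msg.tool_calls:
        # Tool Execution: parse arguments and route each call to the MCP server
        messages.append(msg)
        for call in msg.tool_calls:
            result = call_tool(call.function.name, json.loads(call.function.arguments))
            messages.append({"role": "tool", "tool_call_id": call.id, "content": str(result)})

        # Response Integration: send tool results back for the final answer
        resp = client.chat.completions.create(model=model, messages=messages)

    # Output: final response combining LLM reasoning with tool results
    return resp.choices[0].message.content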

Auto-execution for require_approval: "never"

Setting require_approval: "never" triggers automatic tool execution, returning the final response in a single API call without additional user interaction.
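
For example, with the client from the earlier SDK sketch, a single non-streaming call is enough; the proxy executes any tool calls and returns the finished response (a sketch, assuming the same proxy and key as above):

Auto-execution Example (sketch)
# Single call with require_approval "never": tools run automatically and
# the final answer comes back in one response (assumes the earlier `client`).
response = client.responses.create(
    model="gpt-5",
    input="give me TLDR of what BerriAI/litellm repo is about",
    tools=[{
        "type": "mcp",
        "server_label": "litellm",
        "server_url": "litellm_proxy",
        "require_approval": "never",
    }],
)
print(response.output_text)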