The fastest way to get started is connecting to the hosted server. No installation required. Below are setup instructions for each supported AI tool.
Documentation Index
Fetch the complete documentation index at: https://docs.li.fi/llms.txt
Use this file to discover all available pages before exploring further.
Claude Desktop
Add to your Claude Desktop config:
- macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
- Windows: %APPDATA%\Claude\claude_desktop_config.json
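Claude Desktop's config file describes stdio servers, so a common pattern for connecting to a hosted URL is bridging through the `mcp-remote` npm package. A minimal sketch (the server name `lifi` is arbitrary, and the bridge approach is an assumption — check the official docs for native remote-server support in your Claude Desktop version):

```json
{
  "mcpServers": {
    "lifi": {
      "command": "npx",
      "args": ["mcp-remote", "https://mcp.li.quest/mcp"]
    }
  }
}
```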
Claude Code
Add via the CLI, or create a .mcp.json file at your project root (shareable via git):
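A sketch of what the .mcp.json file might contain, assuming Claude Code's HTTP transport (the server name `lifi` is arbitrary):

```json
{
  "mcpServers": {
    "lifi": {
      "type": "http",
      "url": "https://mcp.li.quest/mcp"
    }
  }
}
```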
Cursor
Add to .cursor/mcp.json (project-level) or ~/.cursor/mcp.json (global):
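A sketch of the Cursor config entry, assuming Cursor's URL-based remote server format (the server name `lifi` is arbitrary):

```json
{
  "mcpServers": {
    "lifi": {
      "url": "https://mcp.li.quest/mcp"
    }
  }
}
```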
Windsurf
Add to ~/.codeium/windsurf/mcp_config.json:
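A sketch of the Windsurf entry, assuming the same shape as the other clients but with Windsurf's serverUrl key (the server name `lifi` is arbitrary):

```json
{
  "mcpServers": {
    "lifi": {
      "serverUrl": "https://mcp.li.quest/mcp"
    }
  }
}
```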
Windsurf uses serverUrl instead of url.
VS Code (GitHub Copilot)
Add to .vscode/mcp.json in your project:
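A sketch of the VS Code entry, assuming VS Code's top-level servers key and HTTP transport type (the server name `lifi` is arbitrary):

```json
{
  "servers": {
    "lifi": {
      "type": "http",
      "url": "https://mcp.li.quest/mcp"
    }
  }
}
```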
ChatGPT
ChatGPT supports MCP via developer mode. Add the server through the ChatGPT UI; there is no config file to edit.
- Open ChatGPT settings
- Navigate to the MCP servers section
- Add the server URL: https://mcp.li.quest/mcp
Stdio Transport (Self-Hosted / Local)
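For stdio mode, the client launches the server binary itself. A sketch of such an entry — the package name below is a placeholder, not the real distribution name; check the GitHub repository for the actual command:

```json
{
  "mcpServers": {
    "lifi": {
      "command": "npx",
      "args": ["-y", "<lifi-mcp-package>"],
      "env": { "LIFI_API_KEY": "your-api-key" }
    }
  }
}
```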
For local MCP clients running the binary directly:
In stdio mode, the API key is read from the LIFI_API_KEY environment variable. Without it, the server uses the public rate limit (200 requests / 2 hours).
Self-Hosted HTTP Transport
If you’re running your own instance of the MCP server:
GitHub Repository
Full source code, Docker setup, and self-hosting documentation
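When pointing a client at a self-hosted instance, the config has the same shape as the hosted setup, just with your own URL. A sketch — the host and port below are assumptions, not the server's documented defaults:

```json
{
  "mcpServers": {
    "lifi": {
      "url": "http://localhost:3000/mcp"
    }
  }
}
```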
Testing Your Setup
Use the MCP Inspector to interactively test the server. The Inspector serves a web UI at http://localhost:6274 where you can browse and test all available tools.
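The Inspector is typically launched via npx; once the UI opens, connect it to the server by selecting the HTTP transport and entering https://mcp.li.quest/mcp (exact transport naming in the UI may differ by Inspector version):

```shell
# Launch the MCP Inspector; it serves its web UI on http://localhost:6274
npx @modelcontextprotocol/inspector
```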
You can also verify your setup by asking your AI tool a simple question like:
“What chains does LI.FI support?”
If the MCP server is connected correctly, the tool will call get-chains and return a list of supported blockchains.
