
mcp-use Tunnel: Test Your MCP Server on ChatGPT and Claude Before Deploying

Enrico Toniato, CTO

Building an MCP server locally is fast. Testing it against a real AI client is where things slow down.

ChatGPT and Claude can't reach localhost. At some point you need a public URL, and the usual options each come with a catch.

ngrok: on the free tier, HTTP traffic passes through an interstitial warning page that breaks the MCP handshake. You upgrade to skip it, but now your URL changes every time you restart the tunnel, so you go back to Claude, open the connector settings, paste the new URL, and reconnect. Every session.

cloudflared tunnel: no interstitial, but the same URL-churn problem. Subdomains are tied to your account, not to your project. Still reconnecting every session.

What you actually want is a stable public URL that stays the same across sessions and requires zero configuration after the first run.

How it works

@mcp-use/tunnel creates a WebSocket connection from your local server to the mcp-use relay. Requests to your public URL are forwarded over that connection to your local port; responses come back the same way.
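To make the forwarding concrete, the sketch below frames an HTTP request as a JSON message for transport over the WebSocket and unframes it on the local side. The `TunnelRequest`/`TunnelResponse` shapes and field names are assumptions for illustration; the actual wire protocol of @mcp-use/tunnel is not documented here.

```typescript
// Hypothetical wire format: an HTTP request framed as a JSON message
// over the relay's WebSocket. The real protocol may differ.
interface TunnelRequest {
  id: string;                       // correlates a response with its request
  method: string;                   // e.g. "POST"
  path: string;                     // e.g. "/mcp"
  headers: Record<string, string>;
  body: string;
}

interface TunnelResponse {
  id: string;
  status: number;
  headers: Record<string, string>;
  body: string;
}

// Relay side: frame an incoming public request for transport.
function frameRequest(
  id: string,
  method: string,
  path: string,
  headers: Record<string, string>,
  body: string,
): string {
  const msg: TunnelRequest = { id, method, path, headers, body };
  return JSON.stringify(msg);
}

// Local side: unframe before replaying the request against the local port.
function unframeRequest(wire: string): TunnelRequest {
  return JSON.parse(wire) as TunnelRequest;
}
```

Responses travel the opposite direction with the same `id`, which is how the relay matches them back to the waiting public request.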

The subdomain is generated from your project and written to dist/mcp-use.json on first run. Every subsequent run requests the same subdomain. Your URL stays the same.
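A minimal sketch of that persist-and-reuse logic, assuming the config file holds a JSON object with a `subdomain` field (the real layout of dist/mcp-use.json and the real slug derivation are not shown here):

```typescript
import { existsSync, mkdirSync, readFileSync, writeFileSync } from "node:fs";
import { createHash } from "node:crypto";
import { dirname } from "node:path";

// Return a stable subdomain for this project: reuse the one recorded in
// the config file if present, otherwise derive one and write it back.
// The file shape ({ subdomain: string }) is an assumption for illustration.
function getOrCreateSubdomain(configPath: string, projectName: string): string {
  if (existsSync(configPath)) {
    const saved = JSON.parse(readFileSync(configPath, "utf8"));
    if (typeof saved.subdomain === "string") return saved.subdomain;
  }
  // Derive a short, deterministic slug from the project name.
  const hash = createHash("sha256").update(projectName).digest("hex").slice(0, 8);
  const slug = projectName.toLowerCase().replace(/[^a-z0-9]+/g, "-");
  const subdomain = `${slug}-${hash}`;
  mkdirSync(dirname(configPath), { recursive: true });
  writeFileSync(configPath, JSON.stringify({ subdomain }, null, 2));
  return subdomain;
}
```

Because the derivation is deterministic and the result is written to disk, every later run asks the relay for the same subdomain, which is what keeps the public URL stable.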


Three ways to use it

1. Standalone: any MCP server

npx @mcp-use/tunnel 3000
# ✓ Tunnel established: https://proposed-rose.local.mcp-use.run/mcp

Point it at any local port. Install the URL in Claude or ChatGPT once. Leave it there; the same subdomain comes back on the next run.

2. Built into the CLI: one flag, no extra terminal

If your server runs with mcp-use start or mcp-use dev, pass --tunnel and the tunnel starts alongside the server:

mcp-use start --port 3000 --tunnel

The ready output includes the tunnel URL:

Local:  http://localhost:3000
MCP:    http://localhost:3000/mcp
Tunnel: https://proposed-rose.local.mcp-use.run/mcp

Give ChatGPT or Claude that tunnel URL. Next session, it still works.

3. Inspector UI: no CLI needed

The mcp-use Inspector has a Start Tunnel button in the header. Click it, and after a few seconds the button label changes to Tunnel. A popover shows you the URL and the exact steps to install it in ChatGPT.

Screenshot: the mcp-use Inspector header with the Start Tunnel button, always one click away.
Screenshot: the idle state, tunnel not yet started.

After clicking, the button shows a countdown while the connection is being established:

Screenshot: the connecting state, with a spinner and countdown; after a few seconds it is ready.

Once active, the button turns purple and clicking it opens the URL popover:

Screenshot: the tunnel URL popover with the full public local.mcp-use.run URL and the ChatGPT connection steps, ready to paste into ChatGPT or Claude.

The development loop

The Inspector is where the tunnel toggle makes the most sense. The whole iteration cycle lives in one place: call your tools, inspect responses, see live widget output; when you're ready to test against a real LLM, flip the tunnel without leaving the tab.

With the mcp-apps template, the tool runner renders your MCP widget directly in the response panel:

Screenshot: the search-tools widget (a fruit carousel) rendered live in the Inspector's response panel, so you can test visually before going to ChatGPT.

The typical flow:

Scaffold

npx create-mcp-use-app my-app --template mcp-apps
cd my-app && npm install

Start the dev server

npm run dev

The Inspector opens at http://localhost:3000/inspector, already connected to your server.

Iterate locally

Use the Tools panel to call your tools, inspect JSON responses, and preview widgets. No real LLM needed at this stage.

Open the tunnel

Click Start Tunnel in the Inspector header. After a few seconds, the URL appears in the popover.

Install in ChatGPT or Claude

Enable developer mode in ChatGPT settings, go to Apps & Connectors, click Create, and paste the tunnel URL. One time.

Test against the real client

Chat with your MCP server through ChatGPT or Claude. Widgets render, tools execute, the loop closes.

Next time you run npm run dev, the tunnel subdomain comes back from dist/mcp-use.json. Your ChatGPT connector still works.
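For reference, the persisted file might look something like this (the exact keys are an assumption for illustration; inspect your own dist/mcp-use.json to see what the tool actually writes):

```json
{
  "subdomain": "proposed-rose"
}
```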

Get started

If you're starting from scratch, scaffold with the mcp-apps template, which includes a product search tool with a widget so you can see the full flow immediately:

npx create-mcp-use-app my-app --template mcp-apps
cd my-app && npm install && npm run dev

The Inspector opens with the tunnel toggle in the header.

If you already have a server running, point the standalone package at your port:

npx @mcp-use/tunnel <your-port>

Full documentation: mcp-use.com/docs/tunneling.

Tunnels expire 24 hours after creation, and tunnels are removed after 1 hour of inactivity.

Rate limits: up to 10 tunnel creations per IP per hour, and up to 5 active tunnels per IP at once.
