Inspectr Logo

Stop guessing how LLMs use your MCP server.

No more scrolling through logs to reconstruct MCP usage.

Inspectr makes Model Context Protocol tool calls, prompts, resources, and tokens observable across Claude, OpenAI, and other MCP clients with one command.

Inspectr MCP insights view showing MCP request details

Understanding real MCP usage is harder than it looks

An MCP server can be functionally correct during development and still be difficult to understand once it’s used by real LLM clients.

In practice, teams rely on logs to understand what happened, but logs make it hard to reconstruct actual LLM behavior.

Questions like these are difficult to answer from logs alone:

  • Which MCP calls are actually made by the LLM/Agents?
  • What MCP tools, prompts, or resources were requested, and what outputs were returned?
  • Which flow or sequence of tool calls did the model follow?
  • How many tokens were consumed during a conversation?

The challenge is no longer whether the MCP server works, but understanding what the LLM actually did.
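To make this concrete, the unit of traffic behind those questions is an MCP JSON-RPC message. The sketch below writes out one such request as a client would send it; the `tools/call` method name follows the Model Context Protocol spec, while the tool name `search_docs` and its arguments are invented for illustration:

```shell
# One example MCP request, as a client would send it over the wire.
# "tools/call" is a real MCP method; the tool name and arguments below
# are hypothetical.
cat <<'EOF' > sample-mcp-request.json
{
  "jsonrpc": "2.0",
  "id": 42,
  "method": "tools/call",
  "params": {
    "name": "search_docs",
    "arguments": { "query": "token usage" }
  }
}
EOF
cat sample-mcp-request.json
```

Reconstructing even one of these from interleaved server logs is tedious; reconstructing a whole multi-call conversation is what observability tooling is for.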

Inspectr provides MCP insights


Inspectr runs as a transparent, pluggable proxy in front of your MCP server. It captures and understands MCP traffic in real time without requiring any code changes.
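Because the proxy is transparent, the only change is on the client side: point the MCP client at Inspectr's address instead of the server's. A minimal sketch, assuming the proxy listens on localhost:8080 and your client accepts a URL-based server entry (both are assumptions — the actual port and config shape depend on your Inspectr setup and your MCP client):

```shell
# Hypothetical client config: the server entry points at the Inspectr
# proxy (assumed to listen on localhost:8080) rather than the MCP server
# directly. Traffic then flows client -> Inspectr -> MCP server unchanged.
cat > mcp-client-config.json <<'EOF'
{
  "mcpServers": {
    "my-server": {
      "url": "http://localhost:8080"
    }
  }
}
EOF
```

The MCP server itself needs no modification, which is what keeps the setup to a single command.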

MCP insights in practice

MCP tool list

Inspectr showing MCP tool list from an OpenAI MCP client

MCP tracing flow

Inspectr tracing view showing MCP flow and tool calls

MCP Tool call

Inspectr showing MCP call details

Token usage

Inspectr showing MCP token usage totals and breakdown

MCP request details

Inspectr showing MCP request details with tool metadata

Get MCP visibility in seconds


Works with Claude, OpenAI, and other MCP clients. No SDKs, no instrumentation, and no server changes.

$ npx @inspectr/inspectr --backend=https://your-mcp-server:3000

Drop-in for local development or remote testing with a single command.


Build MCP servers with confidence

Understand real MCP usage before it reaches production and gain insights into LLM behavior.