Feature idea: expose web-check's checks as an MCP server for local/AI-assisted debugging
Hey — love what's been built here. The breadth of checks (SSL chain, DNS records, redirect chain, open ports, headers, TLS config, traceroute, etc.) is genuinely impressive, and the /api structure makes it clear a lot of thought went into making these checks modular.
That got me thinking: what if those same checks were accessible as an MCP (Model Context Protocol) server? Basically, wrapping the existing /api jobs so that an AI assistant (Claude, Cursor, etc.) could call them as tools during a debugging session — without the developer needing to open a browser, paste a URL, and relay results manually.
The use case
Right now, if I'm debugging a local service — say I'm not sure why my CORS headers aren't landing right, or whether my redirect chain looks clean before a deploy — I have to context-switch: open web-check, type the URL, wait, read, and paste the relevant bit back into my terminal or chat. It's not painful, but it breaks flow.
With an MCP layer, the model I'm working with could just call check_headers or trace_redirects directly against localhost:3000 and reason over the results inline. No copy-paste, no switching tabs.
What this could look like
Given the existing /api endpoints, a thin MCP server wrapper seems feasible without touching the core check logic. Something like:
```javascript
// wrapping existing /api jobs as MCP tools
check_ssl({ url })          // → /api/ssl
check_headers({ url })      // → /api/headers
trace_redirects({ url })    // → /api/redirects
check_dns({ url })          // → /api/dns
check_ports({ url })        // → /api/ports
check_tls_config({ url })   // → /api/tls-security-config
```
The MCP server could run alongside the existing server.js, or be packaged as a separate entry point. The heavy lifting is already done — this is just surfacing it differently.
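To make the "thin wrapper" idea concrete, here's a minimal sketch of the dispatch layer. The tool names and the `resolveToolCall` helper are hypothetical (nothing here is in the repo yet); the endpoint paths mirror the mapping listed above, and the core check logic stays untouched:

```javascript
// Hypothetical mapping of MCP tool names onto web-check's existing /api endpoints.
const TOOL_ENDPOINTS = {
  check_ssl: '/api/ssl',
  check_headers: '/api/headers',
  trace_redirects: '/api/redirects',
  check_dns: '/api/dns',
  check_ports: '/api/ports',
  check_tls_config: '/api/tls-security-config',
};

// Resolve a tool call into the URL the wrapper would fetch
// from a locally running web-check instance.
function resolveToolCall(baseUrl, toolName, args) {
  const endpoint = TOOL_ENDPOINTS[toolName];
  if (!endpoint) throw new Error(`Unknown tool: ${toolName}`);
  const query = new URLSearchParams({ url: args.url });
  return `${baseUrl}${endpoint}?${query}`;
}

console.log(resolveToolCall('http://localhost:3000', 'check_headers', { url: 'http://localhost:3000' }));
// → http://localhost:3000/api/headers?url=http%3A%2F%2Flocalhost%3A3000
```

The MCP handler for each tool would then just `fetch` the resolved URL and return the JSON body as the tool result — no duplication of check logic.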
Why local debugging specifically
The hosted web-check.xyz is great for public URLs, but can't reach localhost or services on a private network. A locally-running MCP server would fill that gap — especially useful during development, before anything is publicly deployed.
Things worth figuring out
- Should this live in the main repo or as a separate companion package?
- What's the right security boundary for allowing MCP tool calls to hit private/local addresses?
- Which subset of the existing checks are most valuable to expose first?
- Could we expose this as an SSE/HTTP-streamable service, or publish an npx package for a STDIO-based approach?
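On the transport question: MCP's STDIO transport is newline-delimited JSON-RPC 2.0, so the npx entry point is mostly a read-request/write-response loop. Below is an illustrative sketch (the tool list is a placeholder, and a real implementation would use the official MCP SDK rather than hand-rolling the protocol):

```javascript
// Illustrative JSON-RPC 2.0 handler for a STDIO-based MCP entry point.
// Only 'tools/list' is stubbed here; everything else gets the standard
// JSON-RPC "Method not found" error (-32601).
function handleRequest(raw) {
  const req = JSON.parse(raw);
  if (req.method === 'tools/list') {
    return JSON.stringify({
      jsonrpc: '2.0',
      id: req.id,
      result: { tools: [{ name: 'check_headers' }, { name: 'trace_redirects' }] },
    });
  }
  return JSON.stringify({
    jsonrpc: '2.0',
    id: req.id,
    error: { code: -32601, message: `Method not found: ${req.method}` },
  });
}

// To wire it up, read newline-delimited JSON from stdin and write to stdout:
//   const rl = require('readline').createInterface({ input: process.stdin });
//   rl.on('line', (line) => process.stdout.write(handleRequest(line) + '\n'));
```

An SSE/HTTP-streamable variant would keep the same `handleRequest` core and swap the stdin/stdout wiring for an HTTP server, which is part of why factoring the dispatch out of the transport seems worthwhile.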
I'd love to contribute if this direction makes sense. I could put together a rough proof-of-concept, a design doc, or just a proposed tool schema to get the conversation going — whatever's most useful. Totally understand if this feels out of scope, though; just wanted to raise it.