iPhone-mcp

MCP server that lets AI agents control real iPhones and iPhone simulators on macOS. Works with Claude Code, Cursor, Codex, OpenCode, and any other MCP-compatible AI agent.

Demo video: demo.mp4

Requirements

  • macOS
  • Xcode (install from the App Store; building WebDriverAgent and running simulators need the full Xcode, not just the Command Line Tools from xcode-select --install)
  • Node.js 18+
  • Homebrew

Installation

npm install @blitzdev/iphone-mcp

Quick Start

Global (use in any project)

npx @blitzdev/iphone-mcp --setup-all

This installs dependencies and configures @blitzdev/iphone-mcp for all your AI agents: Claude Code, Cursor, Codex, and OpenCode are set up automatically.

NOTE: For Cursor, you need to enable the blitz-iphone MCP server in Cursor Settings

Project-scoped (one project only)

cd <your project>
npx @blitzdev/iphone-mcp --setup-here

This prompts you to choose which AI agents to configure (Claude Code, Cursor, Codex, OpenCode) and writes the config files into your project directory. @blitzdev/iphone-mcp will only be available when you open an agent inside that directory.

Then just ask

Open a new AI agent session and ask:

> scan the simulator screen and tell me what you see
> connect to my iPhone and test the login flow
> find bugs in my app — tap around, try edge cases, report anything weird

What can it do?

| Capability | Simulator | Physical iPhone |
| --- | --- | --- |
| Tap, swipe, type text | Yes | Yes |
| Press buttons (Home, Lock, Siri) | Yes | Yes |
| Scan UI elements with coordinates | Yes | Yes |
| Take screenshots | Yes | Yes |
| List & launch installed apps | Yes | Yes |
| Auto-detect devices | Yes | Yes |
| Live screen viewer in browser | -- | Yes |

iPhones

Connect your iPhone via USB, or make sure it's on the same Wi-Fi network as your Mac (home and office networks work; public cafe Wi-Fi usually isolates clients from each other, so it won't).

First-time setup — your iPhone needs:

  1. Developer Mode enabled (Settings > Privacy & Security > Developer Mode)
  2. Trust this computer when prompted on the phone

Then ask your agent:

> connect to my iPhone

The agent will build and install WebDriverAgent on your phone (takes 1-3 minutes the first time), then give you a URL like http://localhost:5152?udid=... — open it in your browser to see a live view of your phone screen.
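Because the first WebDriverAgent build can take a few minutes, it can be handy to poll the viewer URL and only open the browser once it answers. A minimal sketch in Python, assuming the viewer responds to plain HTTP on the port shown above (the helper name and timeout values are our own, not part of the package):

```python
import time
import urllib.error
import urllib.request


def wait_for_viewer(url: str, timeout: float = 180.0, interval: float = 2.0) -> bool:
    """Return True once `url` answers an HTTP request, False on timeout."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=5):
                return True
        except urllib.error.HTTPError:
            return True  # the server answered, even if with an error page
        except OSError:
            pass  # not listening yet; keep polling
        time.sleep(interval)
    return False
```

With a generous timeout (the first build takes 1-3 minutes), you could call `wait_for_viewer("http://localhost:5152")` before opening the live view.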

Simulators

Boot any iPhone simulator, then ask your AI agent to interact with it. No extra setup needed beyond --setup-all or --setup-here.

> what apps are installed on the simulator?
> launch Safari and go to example.com
> take a screenshot

Example session

You:    test the checkout flow in my app
Agent:  Let me scan the simulator to see what's on screen.

        [calls scan_ui]

        I can see the home screen. Let me launch your app first.

        [calls launch_app with bundleId "com.myapp"]
        [calls get_screenshot]

        The app is open on the main screen. I see a "Shop" tab at the bottom.
        Let me tap it and walk through the checkout flow.

        [calls device_action: tap on "Shop" tab]
        [calls scan_ui]

        I see a list of products. Let me add one to cart...

MCP Tools reference

These are the tools your AI agent can call:

| Tool | What it does |
| --- | --- |
| get_execution_context | Find available simulators and iPhones |
| scan_ui | Find tappable elements (buttons, links, text fields) with their coordinates |
| describe_screen | Full UI element hierarchy (more detail than scan_ui) |
| device_action | Tap, swipe, press buttons, type text, press keys |
| device_actions | Run multiple actions in sequence |
| get_screenshot | Save a screenshot and return the file path |
| list_devices | List all simulators and physical devices |
| launch_app | Launch an app by bundle ID |
| list_apps | List installed apps |
| setup_device | Build & install WebDriverAgent on a physical iPhone |
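On the wire, your agent invokes each of these through the standard MCP `tools/call` JSON-RPC 2.0 method. A sketch of what such a request might look like for a tap; note the argument names (`action`, `x`, `y`) are illustrative guesses, since the real schemas are reported by the server's `tools/list` response:

```python
import json


def make_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Serialize an MCP `tools/call` request as a JSON-RPC 2.0 message."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })


# Hypothetical arguments for illustration only.
request = make_tool_call(1, "device_action", {"action": "tap", "x": 180, "y": 640})
```

In practice your MCP client library builds these messages for you; the sketch just shows what "the agent calls device_action" means at the protocol level.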

Manual MCP configuration

If you'd rather configure things yourself:

Claude Code — add to ~/.claude.json (global) or .mcp.json (project):

{
  "mcpServers": {
    "blitz-iphone": {
      "command": "npx",
      "args": ["@blitzdev/iphone-mcp"]
    }
  }
}

Cursor — add to ~/.cursor/mcp.json (global) or .cursor/mcp.json (project):

{
  "mcpServers": {
    "blitz-iphone": {
      "command": "npx",
      "args": ["@blitzdev/iphone-mcp"]
    }
  }
}

Codex — add to ~/.codex/config.toml (global) or .codex/config.toml (project):

[mcp_servers.blitz-iphone]
command = "npx"
args = ["@blitzdev/iphone-mcp"]

OpenCode — add to opencode.json in your project root:

{
  "$schema": "https://opencode.ai/config.json",
  "mcp": {
    "blitz-iphone": {
      "type": "local",
      "command": ["npx", "-y", "@blitzdev/iphone-mcp"],
      "enabled": true
    }
  }
}

Troubleshooting

"No booted simulator found" — Open Simulator.app or run xcrun simctl boot "iPhone 16" first.

Physical device not detected — Make sure Developer Mode is on, the phone is connected via USB, and you've tapped "Trust" on the phone.

WDA build fails — Open Xcode > Settings > Accounts and make sure an Apple ID is signed in. Xcode needs a signing identity to build WDA.

"Connection refused" errors — The idb companion may have crashed. Run npx @blitzdev/iphone-mcp --setup-all again to re-initialize.

License

MIT
