
Plenum Usage Examples

This document provides comprehensive examples for using Plenum's three commands: connect, introspect, and query. Examples show the command invocation and, where applicable, the expected JSON output.

Table of Contents

  • Connection Management
  • Schema Introspection
  • Query Execution
  • Common Workflows
  • Error Handling
  • Advanced Examples
  • Tips for AI Agents
  • Configuration Files
  • Auto-Discovery Examples
  • Performance Considerations

Connection Management

Interactive Connection Picker

Launch the interactive connection picker when no arguments are provided:

plenum connect

This will:

  1. Display existing named connections
  2. Allow selection via numbered input
  3. Include "--- Create New ---" option
  4. Prompt for connection details interactively
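
A hypothetical session might look like this (illustrative only; the exact prompt wording is not specified by this document):

```text
$ plenum connect
Saved connections:
  1) prod-db      (postgres @ db.production.example.com)
  2) local-mysql  (mysql @ localhost)
  3) --- Create New ---
Select [1-3]: 3
Name: reporting-db
Engine (postgres/mysql/sqlite): postgres
Host: reporting.example.com
...
```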

Create PostgreSQL Connection

plenum connect \
  --name prod-db \
  --engine postgres \
  --host db.production.example.com \
  --port 5432 \
  --user readonly_user \
  --password secretpassword \
  --database app_production \
  --save global

Success Output:

{
  "ok": true,
  "engine": "postgres",
  "command": "connect",
  "data": {
    "connection_name": "prod-db",
    "engine": "postgres",
    "saved_to": "global",
    "message": "Connection 'prod-db' saved successfully"
  },
  "meta": {
    "execution_ms": 245
  }
}

Create MySQL Connection

plenum connect \
  --name local-mysql \
  --engine mysql \
  --host localhost \
  --port 3306 \
  --user root \
  --password rootpass \
  --database test_db \
  --save local

Success Output:

{
  "ok": true,
  "engine": "mysql",
  "command": "connect",
  "data": {
    "connection_name": "local-mysql",
    "engine": "mysql",
    "saved_to": "local",
    "message": "Connection 'local-mysql' saved successfully"
  },
  "meta": {
    "execution_ms": 198
  }
}

Create SQLite Connection

plenum connect \
  --name app-cache \
  --engine sqlite \
  --file /var/lib/app/cache.db \
  --save local

Success Output:

{
  "ok": true,
  "engine": "sqlite",
  "command": "connect",
  "data": {
    "connection_name": "app-cache",
    "engine": "sqlite",
    "saved_to": "local",
    "message": "Connection 'app-cache' saved successfully"
  },
  "meta": {
    "execution_ms": 42
  }
}

Validate Existing Connection

plenum connect --name prod-db

This re-validates an existing connection and displays connection metadata.
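
Example Output (illustrative; this mirrors the shape of the create examples above, and the exact fields may differ):

```json
{
  "ok": true,
  "engine": "postgres",
  "command": "connect",
  "data": {
    "connection_name": "prod-db",
    "engine": "postgres",
    "message": "Connection 'prod-db' validated successfully"
  },
  "meta": {
    "execution_ms": 120
  }
}
```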


Schema Introspection

Introspect Using Named Connection

plenum introspect --name prod-db

Success Output:

{
  "ok": true,
  "engine": "postgres",
  "command": "introspect",
  "data": {
    "tables": [
      {
        "name": "users",
        "schema": "public",
        "columns": [
          {
            "name": "id",
            "data_type": "integer",
            "nullable": false,
            "default": "nextval('users_id_seq'::regclass)"
          },
          {
            "name": "email",
            "data_type": "character varying",
            "nullable": false,
            "default": null
          },
          {
            "name": "created_at",
            "data_type": "timestamp with time zone",
            "nullable": false,
            "default": "CURRENT_TIMESTAMP"
          }
        ],
        "primary_key": ["id"],
        "foreign_keys": [],
        "indexes": [
          {
            "name": "users_pkey",
            "columns": ["id"],
            "unique": true
          },
          {
            "name": "idx_users_email",
            "columns": ["email"],
            "unique": true
          }
        ]
      }
    ]
  },
  "meta": {
    "execution_ms": 523
  }
}

Introspect Specific Schema

plenum introspect --name prod-db --schema analytics

Filters introspection to only the analytics schema (PostgreSQL/MySQL).

Introspect with Explicit Connection

plenum introspect \
  --engine postgres \
  --host localhost \
  --port 5432 \
  --user admin \
  --password adminpass \
  --database myapp

Use explicit connection parameters without saving a named connection.

SQLite Introspection

plenum introspect --engine sqlite --file ./app.db

Success Output:

{
  "ok": true,
  "engine": "sqlite",
  "command": "introspect",
  "data": {
    "tables": [
      {
        "name": "products",
        "schema": null,
        "columns": [
          {
            "name": "id",
            "data_type": "INTEGER",
            "nullable": false,
            "default": null
          },
          {
            "name": "name",
            "data_type": "TEXT",
            "nullable": false,
            "default": null
          },
          {
            "name": "price",
            "data_type": "REAL",
            "nullable": true,
            "default": "0.0"
          }
        ],
        "primary_key": ["id"],
        "foreign_keys": [],
        "indexes": []
      }
    ]
  },
  "meta": {
    "execution_ms": 38
  }
}

Query Execution

Read-Only Query (Default)

plenum query \
  --name prod-db \
  --sql "SELECT id, email, created_at FROM users WHERE id = 1"

Success Output:

{
  "ok": true,
  "engine": "postgres",
  "command": "query",
  "data": {
    "columns": ["id", "email", "created_at"],
    "rows": [
      {
        "id": 1,
        "email": "admin@example.com",
        "created_at": "2024-01-15T10:30:00Z"
      }
    ],
    "rows_affected": null
  },
  "meta": {
    "execution_ms": 87,
    "rows_returned": 1
  }
}

Query with Row Limit

plenum query \
  --name prod-db \
  --sql "SELECT * FROM large_table" \
  --max-rows 1000

Limits the result set to 1000 rows, even if more exist.

Query with Timeout

plenum query \
  --name prod-db \
  --sql "SELECT * FROM expensive_view" \
  --timeout-ms 5000

Sets a 5-second timeout for query execution.

Attempted Write Query (REJECTED)

Plenum is strictly read-only. Write operations are rejected:

plenum query \
  --name prod-db \
  --sql "INSERT INTO logs (level, message) VALUES ('INFO', 'Application started')"

Error Output:

{
  "ok": false,
  "engine": "postgres",
  "command": "query",
  "error": {
    "code": "CAPABILITY_VIOLATION",
    "message": "Plenum is read-only and cannot execute this query. Please run this query manually:\n\nINSERT INTO logs (level, message) VALUES ('INFO', 'Application started')"
  }
}

Agent Workflow: When a write operation is needed:

  1. Use Plenum to introspect schema and read current data
  2. Construct the SQL statement
  3. Present it to the user for manual execution
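
The three steps above can be sketched as a small shell fragment (illustrative; the statement and messaging are placeholders, not Plenum output):

```shell
# Steps 1-2 would use `plenum introspect` and `plenum query` to read the
# current schema/data and build the statement. Step 3: instead of executing
# the write through Plenum (it would be rejected), surface it to the user.
SQL="INSERT INTO logs (level, message) VALUES ('INFO', 'Application started')"
printf 'Plenum is read-only. Please run this statement manually:\n\n%s\n' "$SQL"
```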

Attempted DDL Query (REJECTED)

DDL operations are also rejected:

plenum query \
  --name prod-db \
  --sql "CREATE TABLE temp_results (id SERIAL PRIMARY KEY, value TEXT)"

Error Output:

{
  "ok": false,
  "engine": "postgres",
  "command": "query",
  "error": {
    "code": "CAPABILITY_VIOLATION",
    "message": "Plenum is read-only and cannot execute this query. Please run this query manually:\n\nCREATE TABLE temp_results (id SERIAL PRIMARY KEY, value TEXT)"
  }
}

Query from File

plenum query \
  --name prod-db \
  --sql-file ./queries/report.sql

Reads SQL from a file instead of command line.

Complex Query with JOIN

plenum query \
  --name prod-db \
  --sql "
    SELECT u.id, u.email, COUNT(p.id) as post_count
    FROM users u
    LEFT JOIN posts p ON u.id = p.user_id
    GROUP BY u.id, u.email
    ORDER BY post_count DESC
    LIMIT 10
  "

Common Workflows

Workflow 1: Setup and Validate Connection

# Step 1: Create connection
plenum connect \
  --name analytics-db \
  --engine postgres \
  --host analytics.example.com \
  --port 5432 \
  --user readonly \
  --password secret123 \
  --database analytics \
  --save global

# Step 2: Validate it works
plenum introspect --name analytics-db

# Step 3: Run a test query
plenum query --name analytics-db --sql "SELECT version()"

Workflow 2: Explore Unknown Database

# Step 1: Connect and save
plenum connect --name unknown-db --engine mysql --host localhost \
  --port 3306 --user root --password rootpass --database mystery_db --save local

# Step 2: Discover schema
plenum introspect --name unknown-db > schema.json

# Step 3: Examine table contents
plenum query --name unknown-db --sql "SELECT * FROM table_name LIMIT 10"

Workflow 3: Data Migration Check

# Check source row count
plenum query --name source-db \
  --sql "SELECT COUNT(*) as count FROM users"

# Check destination row count
plenum query --name dest-db \
  --sql "SELECT COUNT(*) as count FROM users"

# Verify data integrity
plenum query --name dest-db \
  --sql "SELECT id, email FROM users WHERE id NOT IN (SELECT id FROM source_users)"

Workflow 4: Read-Only Analysis

# Plenum is read-only - complex queries should be constructed and run manually
# Use Plenum to understand the schema first
plenum introspect --name work-db > schema.json

# Then construct your analysis query
# Present this to the user for manual execution:
# CREATE TEMPORARY TABLE analysis_tmp (id INT, score REAL);
# INSERT INTO analysis_tmp SELECT user_id, AVG(rating) FROM reviews GROUP BY user_id;

# After the user runs it manually, use Plenum to query results
plenum query --name work-db \
  --sql "SELECT * FROM analysis_tmp WHERE score > 4.5"

Workflow 5: Database Discovery (Wildcard Mode)

The wildcard database feature allows agents to discover available databases before selecting one.

MySQL Discovery

# Step 1: Create connection with wildcard database
plenum connect \
  --name mysql-discovery \
  --engine mysql \
  --host localhost \
  --port 3306 \
  --user root \
  --password rootpass \
  --database "*" \
  --save local

# Step 2: Discover available databases
plenum query --name mysql-discovery \
  --sql "SHOW DATABASES"

Example Output:

{
  "ok": true,
  "engine": "mysql",
  "command": "query",
  "data": {
    "columns": ["Database"],
    "rows": [
      {"Database": "information_schema"},
      {"Database": "mysql"},
      {"Database": "performance_schema"},
      {"Database": "sys"},
      {"Database": "app_production"},
      {"Database": "app_staging"},
      {"Database": "app_development"}
    ],
    "rows_affected": null
  },
  "meta": {
    "execution_ms": 12,
    "rows_returned": 7
  }
}

# Step 3: Query a specific database using fully qualified table names
plenum query --name mysql-discovery \
  --sql "SELECT * FROM app_production.users LIMIT 10"

# Or use USE statement to select a database
plenum query --name mysql-discovery \
  --sql "USE app_production"

# Then query tables in that database
plenum query --name mysql-discovery \
  --sql "SELECT COUNT(*) as user_count FROM users"

PostgreSQL Discovery

# Step 1: Create connection with wildcard database
plenum connect \
  --name pg-discovery \
  --engine postgres \
  --host localhost \
  --port 5432 \
  --user postgres \
  --password pgpass \
  --database "*" \
  --save local

# Step 2: Discover available databases
plenum query --name pg-discovery \
  --sql "SELECT datname FROM pg_catalog.pg_database WHERE datistemplate = false"

Example Output:

{
  "ok": true,
  "engine": "postgres",
  "command": "query",
  "data": {
    "columns": ["datname"],
    "rows": [
      {"datname": "postgres"},
      {"datname": "app_production"},
      {"datname": "app_staging"},
      {"datname": "app_development"}
    ],
    "rows_affected": null
  },
  "meta": {
    "execution_ms": 8,
    "rows_returned": 4
  }
}

Note: PostgreSQL requires reconnecting to switch databases. Create separate connections for each database:

# After discovering databases, create specific connections
plenum connect \
  --name app-prod \
  --engine postgres \
  --host localhost \
  --port 5432 \
  --user postgres \
  --password pgpass \
  --database app_production \
  --save local

plenum connect \
  --name app-dev \
  --engine postgres \
  --host localhost \
  --port 5432 \
  --user postgres \
  --password pgpass \
  --database app_development \
  --save local

Introspection with Wildcard Database

When using wildcard mode, you must specify the --schema parameter for introspection:

MySQL:

# Introspect specific database by name
plenum introspect --name mysql-discovery --schema app_production

PostgreSQL:

# Introspect specific schema in the default "postgres" database
plenum introspect --name pg-discovery --schema public

Error without schema:

# This will fail with wildcard database
plenum introspect --name mysql-discovery

Error Output:

{
  "ok": false,
  "engine": "mysql",
  "command": "introspect",
  "error": {
    "code": "ENGINE_ERROR",
    "message": "No database selected. When using wildcard database (\"*\"), you must specify --schema parameter."
  }
}

Error Handling

Read-Only Violation (Write Operation)

Plenum rejects all write operations:

plenum query --name prod-db \
  --sql "UPDATE users SET email = 'test@example.com' WHERE id = 1"

Error Output:

{
  "ok": false,
  "engine": "postgres",
  "command": "query",
  "error": {
    "code": "CAPABILITY_VIOLATION",
    "message": "Plenum is read-only and cannot execute this query. Please run this query manually:\n\nUPDATE users SET email = 'test@example.com' WHERE id = 1"
  }
}

Read-Only Violation (DDL Operation)

Plenum rejects all DDL operations:

plenum query --name prod-db \
  --sql "DROP TABLE users"

Error Output:

{
  "ok": false,
  "engine": "postgres",
  "command": "query",
  "error": {
    "code": "CAPABILITY_VIOLATION",
    "message": "Plenum is read-only and cannot execute this query. Please run this query manually:\n\nDROP TABLE users"
  }
}

Query Failed (Syntax Error)

plenum query --name prod-db \
  --sql "SELCT * FROM users"

Error Output:

{
  "ok": false,
  "engine": "postgres",
  "command": "query",
  "error": {
    "code": "QUERY_FAILED",
    "message": "Failed to prepare query: syntax error at or near \"SELCT\""
  }
}

Query Failed (Table Not Found)

plenum query --name prod-db \
  --sql "SELECT * FROM nonexistent_table"

Error Output:

{
  "ok": false,
  "engine": "postgres",
  "command": "query",
  "error": {
    "code": "QUERY_FAILED",
    "message": "Failed to execute query: relation \"nonexistent_table\" does not exist"
  }
}

Connection Failed

plenum connect --name bad-connection \
  --engine postgres --host invalid.host.example.com \
  --port 5432 --user test --password test --database test

Error Output:

{
  "ok": false,
  "engine": "postgres",
  "command": "connect",
  "error": {
    "code": "CONNECTION_FAILED",
    "message": "Failed to connect to database: connection refused"
  }
}

Invalid Input (Missing Required Parameter)

plenum query --name prod-db

Error Output:

{
  "ok": false,
  "engine": "",
  "command": "query",
  "error": {
    "code": "INVALID_INPUT",
    "message": "Either --sql or --sql-file is required"
  }
}

Advanced Examples

Using Environment Variables for Passwords

Create a connection that references an environment variable:

export DB_PASSWORD="super_secret_password"

plenum connect \
  --name secure-db \
  --engine postgres \
  --host db.example.com \
  --port 5432 \
  --user appuser \
  --password-env DB_PASSWORD \
  --database production \
  --save global

The password is read from $DB_PASSWORD at runtime, not stored in the config file.

Parameterized Queries (Agent Responsibility)

Important: Plenum does NOT provide SQL parameterization. The calling agent MUST sanitize inputs:

// Agent-side example (pseudo-code)
const userId = sanitize(userInput); // Agent must sanitize/validate!
const sql = `SELECT * FROM users WHERE id = ${userId}`;
// Pass the SQL as a single argument-vector element rather than interpolating
// it into a shell command string, which would break on quotes and invite injection.
execFile("plenum", ["query", "--name", "db", "--sql", sql]);

Plenum assumes all SQL passed to it is safe and validated by the agent.

Combining with JSON Processing

Using jq to extract specific fields from Plenum output:

# Get just the row count
plenum query --name db --sql "SELECT COUNT(*) as count FROM users" \
  | jq -r '.data.rows[0].count'

# Extract error code for conditional logic
RESULT=$(plenum query --name db --sql "SELECT * FROM users")
ERROR_CODE=$(echo "$RESULT" | jq -r '.error.code // empty')

if [ "$ERROR_CODE" = "CAPABILITY_VIOLATION" ]; then
  echo "Write operation rejected: Plenum is read-only"
fi

Schema Diff Workflow

# Introspect source
plenum introspect --name source-db > source-schema.json

# Introspect destination
plenum introspect --name dest-db > dest-schema.json

# Compare (using external tool)
diff source-schema.json dest-schema.json

Tips for AI Agents

  1. Always check the ok field first - Don't assume success
  2. Match on error.code for programmatic error handling
  3. Parse JSON output - All Plenum output is valid JSON
  4. Sanitize user inputs - Plenum passes SQL verbatim to the database
  5. Use named connections - Avoid repeating connection parameters
  6. Plenum is read-only - Use it to introspect and query, then present write/DDL SQL to users for manual execution
  7. Set --max-rows for unknown queries - Prevent accidentally fetching millions of rows
  8. Use --timeout-ms for expensive queries - Protect against long-running operations
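
Tips 1 and 2 can be combined into a small shell guard. A minimal sketch, assuming jq is available (the envelope here is canned; in practice it would come from a plenum invocation):

```shell
# Canned envelope standing in for: RESULT=$(plenum query --name db --sql "...")
RESULT='{"ok":false,"engine":"postgres","command":"query","error":{"code":"CAPABILITY_VIOLATION","message":"Plenum is read-only"}}'

# 1. Check the ok field first; 2. match on error.code for programmatic handling.
if [ "$(printf '%s' "$RESULT" | jq -r '.ok')" = "true" ]; then
  printf '%s' "$RESULT" | jq -r '.data.rows'
else
  CODE=$(printf '%s' "$RESULT" | jq -r '.error.code // empty')
  case "$CODE" in
    CAPABILITY_VIOLATION) echo "write rejected - present the SQL to the user" ;;
    QUERY_FAILED)         echo "fix the SQL and retry" ;;
    *)                    echo "unexpected error: $CODE" ;;
  esac
fi
```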

Configuration Files

Local Configuration (.plenum/config.json)

Connections are organized by project path, with named connections and an explicit default pointer:

{
  "projects": {
    "/home/user/myproject": {
      "connections": {
        "local": {
          "engine": "sqlite",
          "file": "./app.db"
        },
        "staging": {
          "engine": "postgres",
          "host": "staging.example.com",
          "port": 5432,
          "user": "readonly",
          "password_env": "STAGING_DB_PASSWORD",
          "database": "staging"
        },
        "prod": {
          "engine": "postgres",
          "host": "db.example.com",
          "port": 5432,
          "user": "readonly",
          "password_env": "PROD_DB_PASSWORD",
          "database": "production"
        }
      },
      "default": "local"
    }
  }
}

Global Configuration (~/.config/plenum/connections.json)

The global config uses the same structure, allowing you to store connections for multiple projects:

{
  "projects": {
    "/home/user/project1": {
      "connections": {
        "dev": {
          "engine": "postgres",
          "host": "localhost",
          "port": 5432,
          "user": "postgres",
          "password": "localdev",
          "database": "project1_dev"
        }
      },
      "default": "dev"
    },
    "/home/user/project2": {
      "connections": {
        "dev": {
          "engine": "mysql",
          "host": "localhost",
          "port": 3306,
          "user": "root",
          "password": "rootpass",
          "database": "project2_dev"
        },
        "prod": {
          "engine": "mysql",
          "host": "prod.example.com",
          "port": 3306,
          "user": "readonly",
          "password_env": "MYSQL_PASSWORD",
          "database": "project2_prod"
        }
      },
      "default": "dev"
    }
  }
}

Auto-Discovery Examples

When running commands from within a project directory, Plenum automatically uses the current project path:

# From /home/user/myproject directory:
# Uses the connection specified by the project's default pointer
plenum query --sql "SELECT * FROM users"

# Use a different named connection within the same project
plenum query --name staging --sql "SELECT * FROM users"

# Query a different project's connection
plenum query --project-path /home/user/project2 --sql "SELECT * FROM products"

# Query a specific named connection in a different project
plenum query --project-path /home/user/project2 --name prod --sql "SELECT * FROM products"

Performance Considerations

  • Connection overhead: Each command opens and closes a connection (stateless design)
  • Large result sets: Use --max-rows to limit memory usage
  • Complex introspection: Databases with 100+ tables may take several seconds to introspect
  • Network latency: Remote database connections will be slower than local
  • SQLite is fastest: No network overhead, synchronous driver

Run benchmarks to measure performance on your workload:

cargo bench --features sqlite