Give Claude Code the Power BI skills it needs. Install once, then just ask Claude to work with your semantic models.
Get Started • Skills • All Commands • REPL Mode • Contributing
pbi-cli gives Claude Code (and other AI agents) the ability to manage Power BI semantic models. It ships with 7 skills that Claude discovers automatically: you ask in plain English, and Claude runs the right pbi commands.
```mermaid
graph LR
    A["<b>You</b><br/>'Add a YTD measure<br/>to the Sales table'"] --> B["<b>Claude Code</b><br/>Uses Power BI skills"]
    B --> C["<b>pbi-cli</b>"]
    C --> D["<b>Power BI</b><br/>Desktop"]
    style A fill:#1a1a2e,stroke:#f2c811,color:#fff
    style B fill:#16213e,stroke:#4cc9f0,color:#fff
    style C fill:#0f3460,stroke:#7b61ff,color:#fff
    style D fill:#1a1a2e,stroke:#f2c811,color:#fff
```
Fastest way: just give Claude the repo URL and let it handle everything:

```
Install and set up pbi-cli from https://github.com/MinaSaad1/pbi-cli.git
```
Or install manually (two commands):

```bash
pipx install pbi-cli-tool   # 1. Install (handles PATH automatically)
pbi connect                 # 2. Auto-detects Power BI Desktop and installs skills
```

That's it. Open Power BI Desktop with a .pbix file, run `pbi connect`, and everything is set up automatically. Open Claude Code and start asking.

You can also specify the port manually: `pbi connect -d localhost:54321`

Requires: Windows with Python 3.10+ and Power BI Desktop running.
Using pip instead of pipx?

```bash
pip install pbi-cli-tool
```

On Windows, pip install often places the pbi command in a directory that isn't on your PATH.

Fix: add the Scripts directory to PATH.

Find the directory:

```bash
python -c "import site; print(site.getusersitepackages().replace('site-packages','Scripts'))"
```

Add the printed path to your system PATH:

```bash
setx PATH "%PATH%;C:\Users\YourName\AppData\Roaming\Python\PythonXXX\Scripts"
```

Then restart your terminal. We recommend pipx instead to avoid this entirely.
After running pbi connect, Claude Code discovers 7 Power BI skills. Each skill teaches Claude a different area of Power BI development. You don't need to memorize commands. Just describe what you want.
```mermaid
graph TD
    YOU["You: 'Set up RLS for<br/>regional managers'"] --> CC["Claude Code"]
    CC --> SK{"Picks the<br/>right skill"}
    SK --> S1["Modeling"]
    SK --> S2["DAX"]
    SK --> S3["Deployment"]
    SK --> S4["Security"]
    SK --> S5["Documentation"]
    SK --> S6["Diagnostics"]
    SK --> S7["Partitions"]
    style YOU fill:#1a1a2e,stroke:#f2c811,color:#fff
    style CC fill:#16213e,stroke:#4cc9f0,color:#fff
    style SK fill:#0f3460,stroke:#7b61ff,color:#fff
    style S1 fill:#1a1a2e,stroke:#f2c811,color:#fff
    style S2 fill:#1a1a2e,stroke:#4cc9f0,color:#fff
    style S3 fill:#1a1a2e,stroke:#7b61ff,color:#fff
    style S4 fill:#1a1a2e,stroke:#06d6a0,color:#fff
    style S5 fill:#1a1a2e,stroke:#ff6b6b,color:#fff
    style S6 fill:#1a1a2e,stroke:#ffd166,color:#fff
    style S7 fill:#1a1a2e,stroke:#a0c4ff,color:#fff
```
"Create a star schema with Sales, Products, and Calendar tables"

Claude creates the tables, sets up relationships, marks the date table, and adds formatted measures. Covers tables, columns, measures, relationships, hierarchies, and calculation groups.

Example: what Claude runs behind the scenes

```bash
pbi table create Sales --mode Import
pbi table create Products --mode Import
pbi table create Calendar --mode Import
pbi relationship create --from-table Sales --from-column ProductKey --to-table Products --to-column ProductKey
pbi relationship create --from-table Sales --from-column DateKey --to-table Calendar --to-column DateKey
pbi table mark-date Calendar --date-column Date
pbi measure create "Total Revenue" -e "SUM(Sales[Revenue])" -t Sales --format-string "$#,##0"
```

"What are the top 10 products by revenue this year?"
Claude writes and executes DAX queries, validates syntax, and creates measures with time intelligence patterns like YTD, previous year, and rolling averages.
Example: what Claude runs behind the scenes

```bash
pbi dax execute "
EVALUATE
TOPN(
    10,
    ADDCOLUMNS(VALUES(Products[Name]), \"Revenue\", CALCULATE(SUM(Sales[Amount]))),
    [Revenue], DESC
)
"
```

"Export the model to Git for version control"
Claude exports your model as TMDL files for version control and imports them back. Handles transactions for safe multi-step changes.
Example: what Claude runs behind the scenes

```bash
pbi database export-tmdl ./model/
# ... you commit to git ...
pbi database import-tmdl ./model/
```

"Set up row-level security so regional managers only see their region"
Claude creates RLS roles with descriptions, sets up perspectives for different user groups, and exports the model for version control.
Example: what Claude runs behind the scenes

```bash
pbi security-role create "Regional Manager" --description "Users see only their region's data"
pbi perspective create "Executive Dashboard"
pbi perspective create "Regional Detail"
pbi database export-tmdl ./model-backup/
```

"Document everything in this model"
Claude catalogs every table, measure, column, and relationship. Generates data dictionaries, measure inventories, and can export the full model as TMDL for human-readable reference.
Example: what Claude runs behind the scenes

```bash
pbi --json model get
pbi --json model stats
pbi --json table list
pbi --json measure list
pbi --json relationship list
pbi database export-tmdl ./model-docs/
```

"Why is this DAX query so slow?"
Claude traces query execution, clears caches for clean benchmarks, checks model health, and verifies the environment.
Example: what Claude runs behind the scenes

```bash
pbi dax clear-cache
pbi trace start
pbi dax execute "EVALUATE SUMMARIZECOLUMNS(...)" --timeout 300
pbi trace stop
pbi trace export ./trace.json
```

"Set up partitions for incremental refresh on the Sales table"
Claude manages table partitions, shared M/Power Query expressions, and calendar table configuration.
Example: what Claude runs behind the scenes

```bash
pbi partition list --table Sales
pbi partition create "Sales_2024" --table Sales --expression "..." --mode Import
pbi expression create "ServerURL" --expression '"https://api.example.com"'
pbi calendar mark Calendar --date-column Date
```

22 command groups covering the full Power BI Tabular Object Model. You rarely need these directly when using Claude Code, but they're available for scripting, CI/CD, or manual use.
| Category | Commands |
|---|---|
| Queries | dax execute, dax validate, dax clear-cache |
| Model | table, column, measure, relationship, hierarchy, calc-group |
| Deploy | database export-tmdl, database import-tmdl, database export-tmsl, transaction |
| Security | security-role, perspective |
| Connect | connect, disconnect, connections list, connections last |
| Data | partition, expression, calendar, advanced culture |
| Diagnostics | trace start, trace stop, trace fetch, trace export, model stats |
| Tools | setup, repl, skills install, skills list |
Use --json for machine-readable output (for scripts and AI agents):

```bash
pbi --json measure list
pbi --json dax execute "EVALUATE Sales"
```

Run `pbi <command> --help` for full options.
For interactive work, the REPL keeps a persistent connection alive between commands:
```
$ pbi repl
pbi> connect --data-source localhost:54321
Connected: localhost-54321
pbi(localhost-54321)> measure list
pbi(localhost-54321)> dax execute "EVALUATE TOPN(5, Sales)"
pbi(localhost-54321)> exit
```
Tab completion, command history, and a dynamic prompt showing your active connection.
pbi-cli connects directly to Power BI Desktop's Analysis Services engine via pythonnet and the .NET Tabular Object Model (TOM). No external binaries or MCP servers needed. Everything runs in-process for sub-second command execution.
```mermaid
graph TB
    subgraph CLI["pbi-cli (Python)"]
        A["Click CLI"] --> B["tom_backend / adomd_backend"]
        B --> C["pythonnet"]
    end
    C -->|"in-process .NET"| D["Bundled TOM DLLs"]
    D -->|"XMLA"| E["Power BI Desktop<br/>msmdsrv.exe"]
    style CLI fill:#16213e,stroke:#4cc9f0,color:#fff
    style D fill:#0f3460,stroke:#7b61ff,color:#fff
    style E fill:#1a1a2e,stroke:#f2c811,color:#fff
```
Why a CLI? When an AI agent uses an MCP server directly, each tool's schema consumes roughly 4,000 tokens in the context window. A pbi command costs about 30 tokens. Same capabilities, 100x less context.
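The rough arithmetic behind that claim, using the ballpark per-tool and per-command figures quoted above (estimates, not measurements):

```python
# Ballpark context cost, per the figures quoted in the text above.
mcp_tokens_per_tool = 4000   # tool schema held in the context window
cli_tokens_per_command = 30  # a short `pbi ...` invocation

ratio = mcp_tokens_per_tool / cli_tokens_per_command
print(f"~{ratio:.0f}x")  # roughly two orders of magnitude, hence "100x"
```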
Configuration details
All config lives in ~/.pbi-cli/:

```
~/.pbi-cli/
  config.json       # Default connection preference
  connections.json  # Named connections
  repl_history      # REPL command history
```
Bundled DLLs ship inside the Python package (pbi_cli/dlls/):
- Microsoft.AnalysisServices.Tabular.dll
- Microsoft.AnalysisServices.AdomdClient.dll
- Microsoft.AnalysisServices.Core.dll
- Microsoft.AnalysisServices.Tabular.Json.dll
- Microsoft.AnalysisServices.dll
```bash
git clone https://github.com/MinaSaad1/pbi-cli.git
cd pbi-cli
pip install -e ".[dev]"
```

```bash
ruff check src/ tests/   # Lint
mypy src/                # Type check
pytest -m "not e2e"      # Run tests
```

Contributions are welcome! Please open an issue first to discuss what you'd like to change.
- Fork the repository
- Create a feature branch
- Make your changes with tests
- Open a pull request
MIT License