The aim of this project is to create a robot arm that can play poker or act as a dealer in a poker game. In dealer mode, the robot arm will be able to pick up cards, shuffle them, deal them to players, pick up chips, handle the pot, and perform the general actions of the game. In player mode, the robot arm will be able to pick up cards, play them, and handle chips as needed against other players.
- Python 3.12 (Required)
- We recommend using pyenv to manage Python versions.
- Hardware: Raspberry Pi 5 (8GB RAM) running Raspberry Pi OS (Bookworm).
- Cameras: OAK-D Lite (birdseye card detection) + Logitech C925e (chip segmentation).
- Robot Arm: SO101 6-DOF servo arm (optional, for arm control).
- Clone the repository:

  Using HTTPS:

  ```bash
  git clone https://github.com/jalliet/prap-25-26.git
  cd prap-25-26
  ```

  Using SSH:

  ```bash
  git clone git@github.com:jalliet/prap-25-26.git
  cd prap-25-26
  ```

- Create a Virtual Environment:

  Using standard Python (ensure it is 3.12):

  ```bash
  python3.12 -m venv venv
  ```

  OR using pyenv:

  ```bash
  pyenv install 3.12
  pyenv local 3.12
  python -m venv venv
  ```

- Activate the Virtual Environment:

  ```bash
  source venv/bin/activate
  ```

- Install Dependencies:

  ```bash
  pip install -r requirements.txt
  ```
A PySide6 graphical interface for monitoring the game state and camera feeds. See the GUI Documentation for details.
Core game logic in `poker/` — card/deck management, chip stacks, player state, betting actions, and game phase transitions (Pre-Flop → Showdown). Hand evaluation uses the phevaluator library (`poker/evaluator.py`); side pots are computed via a capped-contribution algorithm at showdown.
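The capped-contribution idea can be illustrated with a short sketch (ignoring folds for brevity; this is not the exact code in `poker/`): each pot is capped at the smallest remaining all-in level, and every player still contributing pays up to that cap.

```python
def side_pots(contributions):
    """Illustrative capped-contribution side-pot computation.

    contributions: {player: total chips committed}. Returns a list of
    (pot_amount, eligible_players) tuples, from the smallest cap upward.
    """
    pots = []
    remaining = dict(contributions)
    while any(v > 0 for v in remaining.values()):
        live = {p: v for p, v in remaining.items() if v > 0}
        cap = min(live.values())          # smallest all-in level still unpaid
        pot = cap * len(live)             # everyone contributes up to the cap
        pots.append((pot, sorted(live)))
        for p in live:
            remaining[p] -= cap
    return pots
```

For example, with contributions `{"A": 100, "B": 50, "C": 100}` this yields a 150-chip main pot (A, B, C eligible) and a 100-chip side pot (A, C eligible).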
Dual-camera computer vision pipeline using YOLOv8 models:
- Card Detector (OAK-D Lite) — identifies playing cards using `vision/models/Card_detection_large_best.pt`. Detection can be toggled via "Toggle Card Detection" in the dashboard, which overlays bounding boxes and labels on the primary feed.
- Chip Segmentor (Logitech C925e) — counts chips by colour using `vision/models/Chip_segmentation_large_best.pt`. Runs on the dedicated secondary camera with event-driven inference: YOLO only triggers after betting actions (call/bet/raise/all-in) or during showdown, while the live preview always streams. The chip stack total is shown in the right panel.
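The event-driven gating described above can be sketched as follows; `ChipSegmentor`, `on_frame`, and the event names here are illustrative stand-ins, not the project's actual classes:

```python
# Events that should trigger chip segmentation (names are assumptions).
TRIGGER_EVENTS = {"call", "bet", "raise", "all_in", "showdown"}

class ChipSegmentor:
    """Sketch of event-driven inference: frames always pass through to the
    preview, but the (expensive) model only runs on qualifying events."""

    def __init__(self):
        self.inference_runs = 0

    def on_frame(self, frame, event=None):
        if event in TRIGGER_EVENTS:
            self.inference_runs += 1   # stand-in for a YOLO forward pass
        return frame                   # live preview always streams
```

The point of the design is that per-frame cost stays near zero; inference cost is paid only when the chip counts could actually have changed.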
Model weights are gitignored. Place them in `vision/models/`:

```
vision/models/Card_detection_large_best.pt
vision/models/Chip_segmentation_large_best.pt
```
If weights are missing, detectors run in dummy mode (no inference, no errors).
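A minimal sketch of the dummy-mode fallback, assuming a detector wrapper that checks for the weights file at construction (the class and method names are illustrative, not the actual `vision/` code):

```python
from pathlib import Path

class CardDetector:
    """Falls back to a no-op 'dummy mode' when the weights file is absent."""

    def __init__(self, weights="vision/models/Card_detection_large_best.pt"):
        self.weights = Path(weights)
        self.dummy = not self.weights.exists()

    def detect(self, frame):
        if self.dummy:
            return []   # no inference, no errors — matches the behaviour above
        # With weights present, a YOLOv8 forward pass would run here.
        raise NotImplementedError("real inference path not shown in this sketch")
```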
ROS 2 workspace in `src/` for the SO-101 robot arm:
- `poker_bringup` — master launch file with 4 modes (`sim`, `pc_hardware`, `pi_hardware`, `pi_hardware_headless`)
- `poker_control` — LQR trajectory controller with CasADi inverse kinematics, action servers
- `poker_interfaces` — custom ROS 2 messages (`TargetPose`, `TargetJoints`, `MotorFeedback`) and actions (`MovePose`, `MoveJoints`)
- `lerobot_description` — SO-101 URDF/xacro, STL meshes, Gazebo launch
- `scservo_driver` — C++ driver for STS3215 servos
- `poker_dashboard` — ROS 2 dashboard node (alternative to the PySide6 GUI)
`services/arm_ros_bridge.py` — Qt-compatible bridge connecting the main app to the ROS 2 arm controller. Gracefully degrades when ROS 2 is not installed.
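The degrade-gracefully pattern might look like this sketch (the class name, `send_pose`, and the flag are hypothetical, not the bridge's real API):

```python
try:
    import rclpy                  # only importable when ROS 2 is installed
    ROS_AVAILABLE = True
except ImportError:
    ROS_AVAILABLE = False

class ArmBridge:
    """Sketch: commands become harmless no-ops when ROS 2 is missing."""

    def __init__(self):
        self.connected = ROS_AVAILABLE

    def send_pose(self, pose):
        if not self.connected:
            return False          # no ROS 2: drop the command, don't crash
        raise NotImplementedError("real rclpy publisher would go here")
```

The GUI can then run unmodified on machines without a ROS 2 installation, with arm commands simply reported as unsent.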
```bash
# Start the dashboard (camera feed, card detection, game state)
python main.py

# Or use the helper script (checks Python version)
bash scripts/start_game.sh
```

- Toggle Card Detection — enables live YOLOv8 card detection with bounding box overlays on the primary OAK-D feed. Detections are logged in the game log. Two camera feeds are visible in the right panel: the primary OAK-D birdseye feed (top) and the compact C925e chip feed (bottom).
- Start/Stop Simulation — launches or stops the ROS 2 Gazebo simulation (`ros2 launch poker_bringup poker_arm.launch.py mode:=sim`) directly from the GUI.
- Start Hand / Test Bet — manual triggers for testing game state transitions.
- Debug Inference (keyboard) — press B to run one-shot card detection on the OAK-D feed and save the annotated frame to `debug_inference/birdseye/`. Press C for chip segmentation on the C925e feed, saved to `debug_inference/chip_seg/`. Timestamped PNGs are saved and the path is logged.
The left panel exposes operator-facing controls for the active player:
- Action row — Fold, Check, Call, Bet, Raise, All-In. Buttons enable or disable based on the current `GameState` (phase, current bet, player status). Invalid actions are routed through the `on_action_rejected` signal and surfaced in the game log.
- Sizing row — a `QSpinBox` (clamped to `[min_raise, current_player.stack]`) plus four preset buttons that write into the spin box: 1/2 pot, pot, 2x pot, all-in.
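The preset-to-spin-box clamping can be sketched as a pure function (the function name is illustrative; the real logic lives in the GUI widget code):

```python
def preset_amounts(pot, min_raise, stack):
    """Compute the four sizing presets, each clamped to [min_raise, stack]
    to match the spin box range, except all-in which is always the stack."""
    def clamp(x):
        return max(min_raise, min(int(x), stack))
    return {
        "1/2 pot": clamp(pot * 0.5),
        "pot": clamp(pot),
        "2x pot": clamp(pot * 2),
        "all-in": stack,
    }
```

With a 100-chip pot, a 20-chip minimum raise, and a 150-chip stack, this gives 50 / 100 / 150 / 150, since the 2x-pot preset is capped by the stack.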
Existing test buttons (Test Bet, Toggle Card Detection, Start/Stop Simulation) live in a Debug group below the real controls.
Under `mode:=pi_hardware` and `mode:=pi_hardware_headless`, the launch file spawns `pump_test` (a ROS 2 GPIO node in `poker_control`) for the table-side button and suction pump:
- Table button (GPIO 27) — a RISING-edge interrupt increments a seat counter and publishes `Int32` on `/button_count`. The dashboard's `services/table_io_bridge.py` subscribes and calls `GameState.next_turn()` to advance the turn.
- Suction pump (GPIO 17) — `pump_test` subscribes to `/pump_control` (`Int32`) and toggles GPIO 17 HIGH or LOW based on `msg.data`. `ArmChoreographer` builds pump-on and pump-off steps into its `pick_up_deck` and `deal_card_to_seat` sequences, emitting `pump_requested(bool)` at each pump step. `MainWindow` routes that signal to `TableIoBridge.set_pump`, which publishes `Int32(1)` or `Int32(0)` on `/pump_control` and signals `pump_state_set(bool)` once the `pump_duration_s` settle delay (default 0.05) elapses. The choreographer waits on `pump_state_set` before moving past the pump step. Physical chip movement stays manual.
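The pump handshake can be sketched without Qt or ROS 2 as plain callbacks; `PumpBridge` and `Choreographer` below are simplified stand-ins for `TableIoBridge` and `ArmChoreographer`, and the step names are invented for the example:

```python
import time

class PumpBridge:
    """Stand-in bridge: records the Int32 it would publish on /pump_control,
    then confirms the state after the settle delay."""

    def __init__(self, pump_duration_s=0.05):
        self.pump_duration_s = pump_duration_s
        self.published = []              # values that would go to /pump_control
        self.on_pump_state_set = None    # callback standing in for the Qt signal

    def set_pump(self, on: bool):
        self.published.append(1 if on else 0)
        time.sleep(self.pump_duration_s)          # settle delay before confirming
        if self.on_pump_state_set:
            self.on_pump_state_set(on)

class Choreographer:
    """Runs a step sequence; pump steps block until the bridge confirms."""

    def __init__(self, bridge):
        self.bridge = bridge
        self.bridge.on_pump_state_set = self._confirm
        self._confirmed = None

    def _confirm(self, state: bool):
        self._confirmed = state

    def run(self, steps):
        executed = []
        for step in steps:
            if step in ("pump_on", "pump_off"):
                want = step == "pump_on"
                self._confirmed = None
                self.bridge.set_pump(want)        # pump_requested -> set_pump
                assert self._confirmed == want    # i.e. wait on pump_state_set
            executed.append(step)
        return executed
```

The key property mirrored here is that the choreographer never advances past a pump step until the bridge has reported the commanded state back.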
Requires ROS 2 Jazzy and a built workspace.
```bash
# Source ROS 2 and build the workspace (once)
source /opt/ros/jazzy/setup.bash
colcon build
source install/setup.bash
```
```bash
# Option 1: Launch simulation from the GUI
python main.py
# Then click "Start Simulation" in the dashboard

# Option 2: Launch simulation manually
ros2 launch poker_bringup poker_arm.launch.py mode:=sim
```

All launch modes:

```bash
ros2 launch poker_bringup poker_arm.launch.py mode:=sim                  # Gazebo simulation only
ros2 launch poker_bringup poker_arm.launch.py mode:=pc_hardware          # PC + real servos + digital twin
ros2 launch poker_bringup poker_arm.launch.py mode:=pi_hardware          # Raspberry Pi + real servos + digital twin
ros2 launch poker_bringup poker_arm.launch.py mode:=pi_hardware_headless # Pi headless (no Gazebo, no dashboard)
ros2 launch poker_bringup poker_arm.launch.py dashboard_only:=true       # Dashboard only (remote control)
```

```bash
# Terminal 1: Start mock arm server
python scripts/mock_arm_server.py

# Terminal 2: Start dashboard
python main.py
```

Run the test suite:

```bash
python -m pytest tests/ -v
```

Mermaid diagrams in `docs/diagrams/`, one per subfolder:
| Diagram | Description |
|---|---|
| system-architecture | High-level component map (GUI, services, vision, poker, ROS 2) |
| vision-pipeline | Camera → detectors → dedup → signals → display |
| game-state-fsm | Poker phase transitions (Pre-Flop → Showdown) |
| ros2-node-graph | ROS 2 nodes, topics, and action servers |
| gui-signals | Qt/custom signal connections between components |
| launch-modes | Which nodes spawn per ROS 2 launch mode |
| class-relationships | Class diagram with inheritance and composition |
Render all diagrams:

```bash
bash docs/diagrams/render.sh      # all diagrams
bash docs/diagrams/render.sh -a   # app diagrams only
bash docs/diagrams/render.sh -r   # ROS 2 diagrams only
```

Requires `npx` (Node.js). Output goes to `docs/diagrams/output/`.
| Colour | Value |
|---|---|
| Red | 1 |
| Blue | 5 |
| White | 20 |
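Given the table above, totalling a stack from per-colour counts (for example, the chip segmentor's output) is a short helper; `stack_total` is a hypothetical name, not necessarily the project's function:

```python
# Chip values from the table above.
CHIP_VALUES = {"red": 1, "blue": 5, "white": 20}

def stack_total(counts):
    """Sum a chip stack from {colour: count}; an unknown colour raises a
    KeyError rather than being silently ignored."""
    return sum(CHIP_VALUES[colour] * n for colour, n in counts.items())
```

So three red, two blue, and one white chip total 3 + 10 + 20 = 33.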
The dashboard surfaces choreographer state through two read-only labels at the top of the right (camera) panel:
- `sequence_status_label` shows the active sequence name and the current step index (for example `Sequence: pick_up_deck (step 2)`), or is empty when no sequence is running.
- `sequence_rejection_label` displays the most recent rejection reason from the choreographer (for example `Last rejection: pick_up_deck`) so operators can see why a sequence was refused.
Manual triggers live in the Debug `QGroupBox` on the left panel for integration testing without `GameState` driving the queue:
- Home button calls `choreographer.home()`.
- Pick Up Deck button calls `choreographer.pick_up_deck()`.
- Deal to Seat button with a seat-index `QSpinBox` calls `choreographer.deal_card_to_seat(seat)`.
- Flip Card button with a community-index `QSpinBox` (0..4) calls `choreographer.flip_card(num_players + i)`.
- Collect Pot button calls `choreographer.collect_pot()`.
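The `num_players + i` call suggests a slot layout where seat targets occupy indices `0..num_players-1` and the five community-card targets follow. A hypothetical helper making that mapping explicit (the function name is mine, not the project's):

```python
def community_slot(num_players: int, community_index: int) -> int:
    """Map a community-card index (0..4) to a choreographer target slot,
    assuming seat slots come first, then the five community slots."""
    if not 0 <= community_index <= 4:
        raise ValueError("community index must be in 0..4")
    return num_players + community_index
```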
This means the entry point was built with the system Python shebang. Always use `./build.sh` instead of `colcon build` — it patches the shebangs automatically after every build.

If you already built with `colcon build`, just run `./build.sh` once to fix it:

```bash
cd ~/poker_arm_ws
./build.sh
source install/setup.bash
```

Make sure rosdep has been initialised:

```bash
sudo rosdep init
rosdep update
```

Quick fix (current session only): Grant access to the port immediately without logging out:

```bash
sudo chmod 666 /dev/ttyACM0
```

This resets on reboot or when the device is unplugged. You'll need to re-run it each session.

Permanent fix: Add your user to the dialout group so the device is always accessible:

```bash
sudo usermod -a -G dialout $USER
```

Log out and log back in for the group change to take effect. After that, `chmod` is no longer needed.
If you modify DH parameters, delete old models and regenerate:

```bash
rm -rf install/poker_control/share/poker_control/models/*.casadi
./build.sh
source install/setup.bash
ros2 run poker_control generate_kinematics
```

Symptom: Gazebo opens a window but the simulation never loads. The terminal repeatedly prints `[ros_gz_sim]: Requesting list of world names.` and the controller spawners time out.
Cause: Gazebo Harmonic (shipped with ROS 2 Jazzy) defaults to the Ogre2 renderer, which requires OpenGL 4.3+. On machines without a dedicated GPU — integrated Intel/AMD graphics, VMs, WSL2 — Ogre2 stalls silently during initialisation, blocking the server from ever starting. This is a known Gazebo upstream compatibility issue, not a misconfiguration.
Fix: This project's launch file already applies `--render-engine ogre` (Ogre1, which requires only OpenGL 2.1) and sets `GZ_IP=127.0.0.1` (pinning gz-transport to loopback). No action needed — the launch file handles it automatically.
If you have forked or modified the launch files and see this issue, add these two lines to `so101_gazebo.launch.py`:

```python
SetEnvironmentVariable(name="GZ_IP", value="127.0.0.1")
("gz_args", [" -v 4 -r empty.sdf --render-engine ogre"])
```

If Gazebo crashes immediately with an `Ogre::UnimplementedException` or `GL3PlusTextureGpu` error, it is due to WSL2's virtual graphics driver not supporting the required OpenGL features.
Force software rendering before launching:
```bash
export LIBGL_ALWAYS_SOFTWARE=1
```