Follow the steps below to set up your environment:

- Create a new conda environment with Python 3.8:

  ```
  conda create -n PCP python=3.8
  ```

- Activate the newly created environment:

  ```
  conda activate PCP
  ```

- Install PyTorch based on your system (see the PyTorch website).
- Install the remaining dependencies from the `requirements.txt` file:

  ```
  pip install -r requirements.txt
  ```

- Install the local `dynamics` package:

  ```
  pip install -e dynamics
  ```

The pipeline enables EMG- and force-aware control of the PSYONIC Ability Hand prosthesis.
The workflow consists of:
- Patient folder setup
- Connecting the EMG board
- Data acquisition (calibration, free-space, interaction)
- Preprocessing
- Model training
- Inference / real-time control
- Copy an existing `<person_id>` folder from `data/` as a template.
- This folder contains a `configs/` directory with:
  - `modular_fs.yaml` (free-space controller config)
  - `modular_inter.yaml` (interaction controller config)
- Rename the folder to the patient's name, pseudonym, or ID.
- These config files will be automatically updated during preprocessing.
Two options exist:
With GUI (Linux/macOS only): Use the external Biomech_EMG_USB tool (not included in this repo). This shows live EMG readings and helps verify connection quality.
Without GUI (cross-platform, including Windows): Run:
```
python s0_emgInterface.py
```

Check that the printed timestamps are chronological. If they jump around, the connection is unstable and the EMG board must be restarted.
In the GUI, misaligned timestamps also indicate lost connection.
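The chronological-timestamp check above is easy to automate. A minimal sketch, assuming the samples arrive as a plain list of timestamps (the function name is illustrative and not part of `s0_emgInterface.py`):

```python
def timestamps_chronological(timestamps):
    """Return True if every timestamp is >= the one before it.

    A non-monotonic sequence suggests dropped or reordered packets,
    i.e. the EMG board connection is unstable and should be restarted.
    """
    return all(t0 <= t1 for t0, t1 in zip(timestamps, timestamps[1:]))

print(timestamps_chronological([0.00, 0.01, 0.02, 0.03]))  # True
print(timestamps_chronological([0.00, 0.02, 0.01, 0.03]))  # False
```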
- Run the calibration script:
```
python s1.5_collect_calib_data.py --person_id <person_id> --movement Calibration --hand_side <Left/Right> --calibrate_emg
```

- Step 1: Relaxed baseline EMG for 10s (noise calibration).
- Step 2: Maximal voluntary contraction (MVC) for 10s (muscle activation range).
- Results are stored in `recordings/calibration/`.
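The two calibration steps supply the constants for the usual per-channel EMG normalization: the relaxed recording gives the noise floor, the MVC recording gives the activation ceiling. A hedged sketch of that formula (function and variable names are illustrative, not the repo's API):

```python
def normalize_emg(sample, baseline, mvc):
    """Map a rectified EMG sample into [0, 1] using the calibration values.

    baseline: mean rectified EMG from the relaxed recording (noise floor)
    mvc:      mean rectified EMG from the maximal-contraction recording
    """
    span = mvc - baseline
    if span <= 0:  # guard against a bad calibration
        return 0.0
    activation = (abs(sample) - baseline) / span
    return min(max(activation, 0.0), 1.0)  # clip to [0, 1]

print(normalize_emg(55.0, baseline=10.0, mvc=100.0))  # 0.5
```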
- Next, record free-space hand poses:

  ```
  python s1.5_collect_calib_data.py --person_id <person_id> --movement <hand_pose> --hand_side <Left/Right>
  ```

  `<hand_pose>` options: `indexFlEx`, `mrpFlEx`, `fingersFlEx`, `handClOp`, `thumbAbAd`, `thumbFlEx`, `indexDigitsFlEx`, `pinchClOp`
During the `sync` iterations (nothing is stored), the patient aligns their movement with the prosthesis trajectory. During the `rec` iterations, data is recorded and stored.
Run:
```
python s1.5_collect_force_data.py --person_id <person_id> --grip <grip_type> --hand <Left/Right> --gui
```

`<grip_type>` options: `hook`, `power_grip`, `tripod`, `pinch`
Default: 60s trial, PID update frequency 100–200 Hz.
Workflow:
- Hand slowly closes into grip (approach phase).
- Place a rigid or soft object into the hand.
- When contact is detected, the PID controller follows the target force trajectory.
- Patient modulates grip force via muscle co-contraction, following the moving "tail" in the GUI.
Interaction trials are saved with an `_interaction` suffix in the movement name.
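The contact phase above hinges on a PID loop tracking the target force. A minimal discrete-time sketch of the idea, assuming a textbook PID and a toy first-order "hand" model (gains, model, and numbers are invented for illustration, not the repo's tuning):

```python
class PID:
    """Textbook discrete PID controller (illustrative gains, not the repo's)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, target, measured):
        error = target - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy simulation at 100 Hz: grip force responds proportionally to the command.
pid = PID(kp=0.8, ki=2.0, kd=0.01, dt=0.01)
force = 0.0
for _ in range(2000):  # 20 s of simulated control
    force += 0.05 * pid.step(5.0, force)  # crude first-order plant
print(f"settled at {force:.2f} N (target 5.0 N)")
```

In the real pipeline the measured force comes from the prosthesis sensors and the target follows the moving "tail" shown in the GUI; the loop structure is the same.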
After acquisition, process all recordings:
```
python s2.5_process_all_data.py --person_id <person_id> --hand_side <Left/Right>
```

Optionally restrict to one movement:

```
python s2.5_process_all_data.py --person_id <person_id> --hand_side <Left/Right> --movement indexFlEx
```

This step:
- Filters and aligns EMG, force, and kinematics
- Updates `modular_fs.yaml` and `modular_inter.yaml` with the EMG channels, movements, etc. that were used.
- Config files can be edited manually to adjust the model architecture.
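One common way to align streams recorded at different rates (EMG, force, kinematics) is nearest-timestamp matching. A minimal pure-Python sketch of that idea; `align_nearest` and the toy numbers are illustrative, not the repo's implementation:

```python
import bisect

def align_nearest(ref_times, stream_times, stream_values):
    """For each reference timestamp, pick the stream sample whose
    timestamp is closest. Assumes stream_times is sorted ascending."""
    aligned = []
    for t in ref_times:
        i = bisect.bisect_left(stream_times, t)
        # candidate neighbours: the samples just before and just after t
        candidates = [j for j in (i - 1, i) if 0 <= j < len(stream_times)]
        best = min(candidates, key=lambda j: abs(stream_times[j] - t))
        aligned.append(stream_values[best])
    return aligned

# Toy example: kinematics timestamps matched against a faster EMG stream
emg_t = [0.000, 0.004, 0.008, 0.012, 0.016]
emg_v = [0.1, 0.3, 0.2, 0.5, 0.4]
print(align_nearest([0.000, 0.010], emg_t, emg_v))  # [0.1, 0.2]
```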
Train models using:
```
python s4_train.py --person_dir <person_id> --intact_hand <Left/Right> --controller_mode <free_space|interaction|both> -t -s
```

Use `-v` instead of `-t -s` to only visualize the data.
Modes:
- `--controller_mode free_space` → train the free-space model only
- `--controller_mode interaction` → train the interaction model only
- `--controller_mode both` → train both models
Models are saved under `data/<person_id>/models/`.
Run the online controller:
```
python s5_inference.py -e --person_dir <person_id> \
    --free_space_model_name <free_space_model_name> \
    --interaction_model_name <interaction_model_name>
```

- Requires trained models from step 5.
- Uses EMG + force feedback in real time to control the PSYONIC Ability Hand.
- `s0_emgInterface.py` — connect to the EMG board (with/without GUI).
- `s0_emgFilter.py` — filter, rectify, and normalize the EMG stream.
- `s1.5_collect_calib_data.py` — collect calibration + free-space trajectories.
- `s1.5_collect_force_data.py` — collect interaction force trials with PID + GUI.
- `s2.5_process_all_data.py` — preprocess and align all data, update configs.
- `s4_train.py` — train free-space/interaction/both models.
- `s5_inference.py` — run trained models for real-time control.
- `psyonicControllers.py` — controller class for running trained models on the prosthesis.
- `EMGClass.py` / `BCI_Data_Receiver.py` — EMG acquisition and streaming utilities.
- `predict_utils.py` / `models.py` — training, datasets, neural network architectures.
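As a rough illustration of what a filtering stage like `s0_emgFilter.py` does (rectification plus an envelope), here is a minimal sketch; the function name, window size, and moving-average choice are assumptions, not the repo's actual filter:

```python
from collections import deque

def emg_envelope(stream, window=5):
    """Rectify a raw EMG stream and smooth it with a moving average."""
    buf = deque(maxlen=window)
    out = []
    for sample in stream:
        buf.append(abs(sample))          # full-wave rectification
        out.append(sum(buf) / len(buf))  # moving-average envelope
    return out

print(emg_envelope([1.0, -1.0, 2.0, -2.0], window=2))  # [1.0, 1.0, 1.5, 2.0]
```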