> **Note**: This project replaces the previous `emager-py` repository with improved architecture, enhanced features, and better maintainability.
A comprehensive toolbox for working with the EMaGer v1 and v3 EMG acquisition devices, featuring real-time gesture recognition and prosthetic hand control capabilities.
- Overview
- Key Features
- How It Works
- Installation
- Quick Start
- Configuration
- Available Commands
- Testing
- Development Workflows
- Project Structure
- Documentation
- Troubleshooting
- Contributing
- License
- Acknowledgments
- Citation
- Contact
## Overview

EMaGerLib is a Python-based framework for electromyographic (EMG) signal processing, machine-learning-based gesture recognition, and prosthetic hand control. Built on top of libemg, it provides a complete pipeline from data collection to real-time control of prosthetic devices.
## Key Features

- **Real-time Gesture Recognition**: Advanced CNN-based models with quantization support for efficient inference
- Prosthetic Hand Control: Native support for Psyonic and other prosthetic hands via serial communication
- Data Collection & Training: Screen-guided training sessions with configurable gesture sets
- Visualization Tools: Real-time 64-channel EMG visualization and monitoring
- Flexible Configuration: Python, YAML, and JSON-based configuration system
- EMaGer Hardware Support: Compatible with EMaGer v1 and v3 devices
## How It Works

EMaGerLib provides a complete workflow from data collection to prosthetic control, built on top of libemg with custom extensions for EMaGer hardware.
The library integrates with libemg's core components and our custom EMaGer Streamer to provide a 6-step pipeline: Configuration → Visualize → Data Collection → Train Model → Real-time Prediction → Hand Control. See the Quick Start section for details on each step.
EMaGerLib sits between libemg (the foundation) and your application: simply import `emagerlib` and add your custom configuration.
## Installation

Requirements:

- Python >= 3.8
- Windows/Linux/MacOS
- EMaGer v1/v3 hardware (for real data acquisition)
**Step 1: Clone the repository**

```bash
git clone https://github.com/SBIOML/EMaGerLib.git
cd EMaGerLib
```

**Step 2: Create and activate a virtual environment (recommended)**

```bash
# Windows
python -m venv venv
venv\Scripts\activate

# Linux/MacOS
python3 -m venv venv
source venv/bin/activate
```

**Step 3: Install the package**

```bash
pip install -e .
```

This installs the package with all dependencies and makes the console commands available globally. Alternatively, install only the dependencies:

```bash
pip install -r requirements.txt
```

Note: For troubleshooting and advanced setup, see the Troubleshooting Guide.
Tip: After installing with `pip install -e .`, you can use the unified `emager` command (e.g., `emager train-cnn`), individual commands (e.g., `emager-train-cnn`), or run scripts directly (e.g., `python examples/training/train_cnn.py`).
Command-Line Arguments: All commands support extensive configuration options. See the CLI Guide for details.
## Quick Start

Copy the example configuration file `base_config_example.py` to your project directory and modify it for your needs (paths, gesture classes, etc.). See the Configuration section for more details.
**Visualize**: Monitor real-time 64-channel EMG data:

```bash
emager live-64ch
```

**Collect data**: Run a screen-guided training session:

```bash
emager screen-training
```

**Train model**: Train a CNN on collected EMG data:

```bash
emager train-cnn
```

**Real-time prediction**: Test gesture recognition in real time:

```bash
emager realtime-predict
```

**Hand control**: Run real-time control with a connected prosthetic:

```bash
emager realtime-control
```

## Configuration

EMaGerLib uses a flexible configuration system supporting multiple formats:
- Python (`.py`): Most flexible, allows code execution and computed values
- YAML (`.yaml`): Human-readable, ideal for version control
- JSON (`.json`): Machine-readable, good for automation (used for automatic config saving)
1. Copy the example configuration: use `base_config_example.py` as a template
2. Edit for your project: modify paths, gesture classes, and parameters
3. Use with any command:

```bash
emager train-cnn -c my_config.py
```
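For illustration, a minimal `my_config.py` might look like the following sketch. The parameter names and example values are taken from the parameter table in this README; treat them as placeholders to adapt to your own setup, not as the library's required schema.

```python
# my_config.py -- illustrative example configuration (placeholder values)
from pathlib import Path

BASE_PATH = Path("./Datasets/")   # Root directory for datasets
SESSION = "D1"                    # Session identifier
CLASSES = [2, 3, 30, 14, 18]      # Gesture class IDs to collect/train on
WINDOW_SIZE = 200                 # EMG window size, in samples
SAMPLING = 1008                   # Sampling rate, in Hz
MODEL_NAME = None                 # Model filename, or None to auto-detect
```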
Note: If you change path parameters (like `BASE_PATH` or `MEDIA_PATH`), update `.gitignore` accordingly to avoid committing large data files. See the Configuration Guide for details.
| Parameter | Description | Example |
|---|---|---|
| `BASE_PATH` | Root directory for datasets | `Path("./Datasets/")` |
| `SESSION` | Session identifier | `"D1"` |
| `CLASSES` | Gesture class IDs | `[2, 3, 30, 14, 18]` |
| `WINDOW_SIZE` | EMG window size (samples) | `200` |
| `SAMPLING` | Sampling rate (Hz) | `1008` |
| `MODEL_NAME` | Model filename (or `None` to auto-detect) | `None` |
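Together, `WINDOW_SIZE` and `SAMPLING` determine the minimum control latency: with the example values, one window spans 200 / 1008 ≈ 198 ms of signal. The sketch below is plain Python (not EMaGerLib's own API) showing how a continuous recording could be cut into such windows:

```python
# Illustrative sketch only -- not EMaGerLib's actual windowing code.
WINDOW_SIZE = 200   # samples per window (example value from the table)
SAMPLING = 1008     # sampling rate in Hz (example value from the table)

def segment_windows(signal, window_size, step=None):
    """Split a sequence of samples into fixed-size windows.

    A `step` smaller than `window_size` yields overlapping windows;
    by default, windows do not overlap.
    """
    step = step or window_size
    return [signal[i:i + window_size]
            for i in range(0, len(signal) - window_size + 1, step)]

# One second of (fake) single-channel data at 1008 Hz:
samples = list(range(SAMPLING))
windows = segment_windows(samples, WINDOW_SIZE)
print(len(windows))                          # -> 5 non-overlapping windows
print(round(WINDOW_SIZE / SAMPLING * 1000))  # -> 198 (ms of signal per window)
```

Smaller windows respond faster but give the classifier less context, which is why the Troubleshooting section suggests reducing `WINDOW_SIZE` when real-time control lags.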
- Configuration Guide - Complete configuration documentation
- base_config_example.py - Python configuration template
- base_config_example.yaml - YAML configuration template
## Available Commands

After installing with `pip install -e .`, the following commands are available system-wide.

The `emager` command provides a unified interface to all EMaGer commands:
```bash
# Show all available commands
emager
emager --help

# Run any command using the unified interface
emager train-cnn --config my_config.py
emager screen-training
emager realtime-predict --log-level DEBUG
```

For backward compatibility, all commands are also available with their full names (e.g., `emager-train-cnn`):
| Command | Description |
|---|---|
| `emager screen-training` | Screen-guided data collection |
| `emager train-cnn` | Train CNN model |
| `emager realtime-predict` | Real-time gesture prediction |
| `emager realtime-control` | Real-time prosthetic control |
| `emager live-64ch` | Live 64-channel EMG visualization |
| `emager test-hand` | Test hand control interface |
| `emager test-psyonic` | Test Psyonic hand |
| `emager test-wave` | Test hand wave gestures |
| `emager visualize-libemg` | Visualize with libemg |
| `emager run-tests` | Run complete test suite |
```bash
# Use default configuration
emager train-cnn
# or
emager-train-cnn

# Use custom configuration
emager train-cnn -c my_config.py

# With debug logging
emager realtime-predict --log-level DEBUG

# Save configuration after run
emager train-cnn --save-config-name experiment_01

# Get help for a specific command
emager train-cnn --help
```

Note: All commands support the same set of command-line arguments for configuration, logging, and config saving. See the CLI Guide for complete details.
## Testing

Run the complete test suite:

```bash
emager-run-tests
```

The test suite includes:
- Configuration system (loading/saving Python, JSON, YAML)
- Utility functions (majority voting, decimation)
- Model discovery and sorting
- Gesture definitions and constants
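To give a feel for two of the utilities exercised by the test suite, the snippet below sketches generic majority voting and decimation in plain Python; the library's actual implementations and signatures may differ.

```python
from collections import Counter

def majority_vote(predictions):
    """Return the most frequent label in a window of predictions.

    Smooths a stream of per-window gesture predictions so that a single
    misclassification does not flip the control output.
    """
    return Counter(predictions).most_common(1)[0][0]

def decimate(samples, factor):
    """Keep every `factor`-th sample (naive decimation, no anti-alias filter)."""
    return samples[::factor]

print(majority_vote([3, 3, 14, 3, 18]))  # -> 3
print(decimate(list(range(10)), 2))      # -> [0, 2, 4, 6, 8]
```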
```bash
# Test hand control
emager-test-hand

# Test Psyonic hand
emager-test-psyonic

# Visualize EMG signals
emager-live-64ch
```

## Project Structure

```
EMaGerLib/
├── emagerlib/            # Core library modules
│   ├── config/           # Configuration management
│   ├── control/          # Prosthetic hand control interfaces
│   ├── models/           # Neural network models (EmagerCNN)
│   ├── utils/            # Utility functions and helpers
│   └── visualization/    # Real-time GUI and plotting tools
├── examples/             # Executable scripts
│   ├── data_collection/  # Data collection scripts
│   ├── hand_control/     # Hand control tests
│   ├── realtime/         # Real-time prediction and control
│   ├── training/         # Model training scripts
│   └── visualisation/    # Visualization tools
├── config_examples/      # Example configuration files
├── tests/                # Unit tests
└── docs/                 # Documentation
```
## Documentation

Comprehensive documentation is available in the `docs/` directory:
- Configuration Guide - Complete configuration system documentation
- CLI Guide - Command-line arguments and usage examples
- Troubleshooting Guide - Installation, hardware, development, and solutions to common problems
## Development Workflows

**Best for: Standard EMG projects with custom configurations only**

Install EMaGerLib and use the provided commands with your configuration. No code modification needed.

```bash
pip install -e .
emager-screen-training -c my_config.py
```

**Best for: Adding new features (controllers, models, visualizations)**

Clone and install in editable mode, then add custom modules:

```bash
git clone https://github.com/SBIOML/EMaGerLib.git
cd EMaGerLib
pip install -e .
```

Add your modules to:

- `emagerlib/control/`: New prosthetic interfaces
- `emagerlib/models/`: Custom models
- `emagerlib/visualization/`: Visualization tools
- `emagerlib/utils/`: Utility functions
**Best for: Changing how EMaGerLib interacts with libemg**
You can either:
- Fork and clone libemg locally and install it in editable mode, then install EMaGerLib on top
- Modify EMaGerLib's wrapper code in files that interact with libemg
For complete development guidelines, see the Troubleshooting Guide.
## Troubleshooting

**EMaGer device not detected**

- Check USB connection and drivers
- Verify the device appears in the system (Windows: Device Manager; Linux: `lsusb`)
- Try different USB ports

**Import errors after installation**

- Reinstall in editable mode: `pip install -e .`
- Check the Python version: `python --version` (must be >= 3.8)
- Verify all dependencies are installed: `pip list`

**Model training crashes**

- Reduce `BATCH_SIZE` in the config if out of memory
- Ensure sufficient training data has been collected
- Check log files for detailed error messages

**Real-time control lag**

- Reduce `WINDOW_SIZE` for faster response
- Close unnecessary applications
- Consider model quantization for performance

For comprehensive troubleshooting, see the Troubleshooting Guide. You can also check GitHub Issues for known problems and solutions.
## Contributing

We welcome contributions! Whether it's bug reports, feature requests, or code contributions:
- Report bugs via GitHub Issues
- Request features by opening an issue with the "enhancement" label
- Submit code via pull requests (see Troubleshooting Guide)
Before contributing code:

- Read the Troubleshooting Guide
- Run tests with `emager-run-tests`
- Follow the existing code style
- Document your changes
## License

This project is licensed under the MIT License - see the LICENSE file for details.
## Acknowledgments

- Built on top of libemg
- Developed by the Smart Biomedical Microsystems Laboratory (SBIOML)
## Citation

If you use this software in your research, please cite:

```bibtex
@software{emagerlib2025,
  title        = {EMaGerLib: EMG Signal Processing and Prosthetic Control Toolbox},
  author       = {Michaud, Étienne},
  year         = {2025},
  organization = {Smart Biomedical Microsystems Laboratory},
  license      = {MIT}
}
```

## Contact

- Author: Étienne Michaud
- Email: etmic6@ulaval.ca
- Organization: Smart Biomedical Microsystems Laboratory
- GitHub: SBIOML/EMaGerLib

Need help? Check the documentation or open an issue.

