Microsecond motion radar for rotating objects
Event Horizon transforms event cameras into microsecond-precision motion radars for ultra-fast rotating objects. Using only asynchronous event data, we achieve real-time propeller blade tracking and RPM measurement with performance optimized for ARM devices.
- Propeller blade tracking with microsecond temporal resolution
- RPM estimation using multiple methods (FFT, autocorrelation, zero-crossing analysis)
- Real-time clustering with DBSCAN-based temporal tracking
- ARM SIMD optimization achieving 1.71x speedup using Neon intrinsics
- Near real-time performance: 10 seconds of event data processes in ~10 seconds
Traditional frame-based cameras are fundamentally limited by exposure time and frame rate. Event cameras capture every brightness change with microsecond precision, enabling us to:
- Track individual propeller blades rotating at thousands of RPM
- Measure rotation frequency to predict drone thrust and future positions
- Detect periodic patterns from various angles using FFT analysis
- Deploy on mobile ARM devices for field use (not just lab PCs)
This is critical for autonomous drone interception, no-fly zone enforcement, collision avoidance, and high-speed tracking applications.
Implemented SIMD parallelization using ARM Neon intrinsics in Rust:
- 1.71x speedup on ARM processors
- Optimized operations: mean, variance, distance calculations, batch processing
- Enables mobile deployment for real-time tracking in the field
- Near real-time: 10-second event file processes in ~10 seconds
Implementation: Following ARM's SIMD on Rust guide, we parallelized statistical calculations and clustering operations critical for blade tracking.
Deployed hyperparameter optimization on Vultr cloud infrastructure:
- Memory-optimized compute instances for processing large event datasets
- Distributed parameter search for optimal DBSCAN clustering thresholds
- Fast iteration on detection algorithms
Event Stream (.dat)
→ Hot Pixel Filtering
→ Temporal Windowing
→ DBSCAN Clustering
→ Blade Tracking
→ Angle/Width Analysis
→ RPM Estimation (FFT/Autocorrelation/Zero-Crossing)
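The first clustering stage of this pipeline can be sketched with scikit-learn's DBSCAN (an assumption for illustration; `eps`/`min_samples` here are example values, not the tuned ones from `config/tracking_config.yaml`):

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Sketch of the initial clustering stage: one DBSCAN pass over the
# (x, y) coordinates of a single temporal window of events.
# eps/min_samples are illustrative, not the project's tuned values.
def cluster_window(x, y, eps=6.0, min_samples=20):
    pts = np.column_stack([x, y]).astype(float)
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(pts)
    return labels  # label -1 marks noise events
```

Each subsequent stage then consumes the per-cluster event groups rather than the raw stream.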
- Event Stream Processing (`evio` library)
  - Memory-mapped `.dat` file reading (Prophesee Metavision format)
  - Zero-copy decoding of packed event data
  - Real-time playback pacing
- Hot Pixel Filtering
  - Identifies and removes noisy pixels with excessive events
  - Improves clustering quality
- Temporal Blade Clustering
  - Initial DBSCAN clustering on time windows (10-20ms)
  - Rolling window tracking (0.5-0.75ms) with grace periods
  - Event-to-cluster assignment with distance thresholds
  - Periodic re-initialization to detect new blades
- Blade Statistics & Analysis
  - Center position tracking with history
  - Angle calculation via linear regression
  - Width measurement (blade extent)
  - Confidence scoring
- RPM Estimation Methods
  - FFT Analysis: Frequency-domain periodicity detection
  - Autocorrelation: Time-domain period detection
  - Angle Zero-Crossing: Track sin(angle) zero crossings
  - Width Maxima: Detect peaks in blade width cycles
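Two of the estimators above can be demonstrated on a synthetic signal with a known rotation rate (`rpm_fft` and `rpm_zero_crossing` are illustrative helpers, not the project's functions):

```python
import numpy as np

def rpm_fft(signal, dt):
    """Dominant non-DC frequency of a periodic signal, converted to RPM."""
    mags = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(len(signal), d=dt)
    return freqs[mags[1:].argmax() + 1] * 60  # skip the DC bin

def rpm_zero_crossing(signal, dt):
    """Mean spacing of upward zero crossings, converted to RPM."""
    s = signal - signal.mean()
    ups = np.where((s[:-1] < 0) & (s[1:] >= 0))[0]
    return 60.0 / (np.diff(ups).mean() * dt)

# Synthetic check: a 3000 RPM (50 Hz) rotation sampled at 10 kHz
t = np.arange(0, 1.0, 1e-4)
sig = np.sin(2 * np.pi * 50.0 * t)
print(rpm_fft(sig, 1e-4), rpm_zero_crossing(sig, 1e-4))  # both ≈ 3000
```

Cross-checking independent estimators like this is what lets the pipeline flag inconsistent RPM readings.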
Multi-stage filtering ensures robust blade detection:
- Size constraints: Min/max cluster events (100-10000)
- Spatial filtering: Max Y-spread to reject diffuse patterns
- Temporal consistency: Grace periods for intermittent blade visibility
- Statistical validation: Confidence scores based on linear fit quality
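The size and spatial stages of this filter can be sketched as follows (threshold values are illustrative defaults, not the project's tuned ones):

```python
import numpy as np

# Sketch of the size and Y-spread filter stages described above.
# Thresholds are illustrative; the real pipeline tunes them per preset.
def keep_cluster(xs, ys, min_events=100, max_events=10000, max_y_spread=100.0):
    if not (min_events <= len(xs) <= max_events):
        return False  # size constraint: reject too-small/too-large clusters
    if np.ptp(ys) > max_y_spread:
        return False  # spatial filter: reject diffuse (tall) patterns
    return True
```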
evio/
├── pyproject.toml # Python package configuration
├── config/
│ └── tracking_config.yaml # Hyperparameters and presets
├── scripts/
│ ├── play_dat.py # Event stream visualizer
│ ├── simulate_propeller.py # Synthetic event generation
│ ├── temporal_blade_tracking.py # Core tracking implementation
│ ├── create_tracking_video_configurable.py # Video generation with RPM
│ ├── analyze_blade_periodicity.py # FFT & autocorrelation analysis
│ ├── detect_blade_periods.py # Period detection methods
│ ├── test_clustering_robustness.py # Parameter optimization (Python)
│ ├── test_rpm_robustness.py # RPM consistency testing
│ ├── visualize_raw_events.py # Event data exploration
│ └── detection/
│ ├── detect_drone.py # Drone detection & propeller tracking
│ ├── detect_drone_v2.py # Enhanced detection pipeline
│ └── compare_detection_methods.py # Method comparison
├── rust_tools/
│ ├── Cargo.toml
│ ├── src/
│ │ ├── main.rs # Clustering robustness (Rust)
│ │ ├── simd_ops.rs # ARM Neon SIMD optimizations
│ │ ├── benchmark.rs # SIMD vs scalar benchmarks
│ │ └── bin/
│ │ ├── benchmark.rs # Benchmark executable
│ │ └── create_tracking_video.rs # Video generation (Rust)
└── src/evio/
├── core/
│ ├── recording.py # .dat file abstraction
│ ├── mmap.py # Memory-mapped I/O
│ ├── pacer.py # Real-time playback
│ └── index_scheduler.py # Event indexing
└── source/
└── dat_file.py # .dat decoder
Install `uv` for Python package management.
For Rust tools (optional):
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh

# Clone repository
git clone <repository-url>
cd evio
# Install Python dependencies
uv sync
# Build Rust tools (optional, for SIMD benchmarks)
cd rust_tools
cargo build --release
cd ..

# Visualize event stream
uv run scripts/play_dat.py drone_idle.dat
# Track propeller blades and estimate RPM
uv run scripts/create_tracking_video_configurable.py \
--config config/tracking_config.yaml \
--preset drone_idle
# Analyze blade periodicity with FFT
uv run scripts/analyze_blade_periodicity.py \
--config config/tracking_config.yaml \
--preset drone_idle
# Detect drones with propeller tracking
uv run scripts/detection/detect_drone.py drone_moving.dat --detect-propellers
# Test clustering parameter robustness
uv run scripts/test_clustering_robustness.py \
--config config/tracking_config.yaml \
--preset drone_idle

Generate tracking video with comprehensive RPM analysis:
uv run scripts/create_tracking_video_configurable.py \
--config config/tracking_config.yaml \
--preset drone_idle

Output:
- Tracking video (`blade_tracking_video.mp4`) with annotated blades
- Analysis plots showing:
- Blade angles over time
- Confidence scores
- Cluster widths
- FFT frequency spectrum
- Autocorrelation periodicity
- RPM calculations
Deep dive into rotation frequency detection:
uv run scripts/analyze_blade_periodicity.py \
--config config/tracking_config.yaml \
--preset drone_idle

Methods:
- FFT: Identifies dominant frequencies in angle/position signals
- Autocorrelation: Detects repeating patterns in time series
- Angle Jump Analysis: Measures intervals between large angular changes
Output: Comprehensive plots showing all three methods with RPM estimates.
Test clustering robustness across parameters:
Python version:
uv run scripts/test_clustering_robustness.py \
--config config/tracking_config.yaml \
--preset drone_idle \
--output results.csv

Rust version (with SIMD optimization):
cd rust_tools
cargo run --release -- \
--config ../config/tracking_config.yaml \
--preset drone_idle \
--output ../clustering_robustness_results.csv
cd ..

Exports CSV with metrics across parameter combinations:
- eps (DBSCAN distance threshold)
- min_samples (minimum cluster size)
- window_us (temporal window duration)
- Cluster counts, sizes, spreads, noise percentages
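One way to consume the exported CSV is to select the parameter row minimizing a metric. This is a sketch; the `noise_pct` column name is an assumption, so check the actual header of your results file:

```python
import csv

# Sketch: pick the parameter row with the lowest value of a metric column.
# "noise_pct" is an assumed column name, not a confirmed one.
def best_params(path, metric="noise_pct"):
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    return min(rows, key=lambda r: float(r[metric]))
```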
Full drone detection with optional propeller tracking:
# Basic detection
uv run scripts/detection/detect_drone.py drone_moving.dat
# With propeller tracking and video output
uv run scripts/detection/detect_drone.py drone_moving.dat \
--detect-propellers \
--output-video detection.mp4 \
--window 50 \
--speed 1.0

Visualization:
- 🟩 Green boxes: Detected drones
- 🟧 Orange ellipses: Tracked propellers (P1-P4)
- HUD: Detection counts, timing info
Compare SIMD vs scalar performance:
cd rust_tools
cargo run --release --bin benchmark
cd ..

Tests mean/variance calculations, distance computations, and batch operations across data sizes.
All scripts use config/tracking_config.yaml with preset support:
- `drone_idle`: Stationary drone with sparse propeller events
- `drone_moving`: Moving drone with denser event patterns
Data:
- `input_file`: Path to `.dat` file
- `start_time_sec`: Start time offset
- `duration_sec`: Analysis window duration
- `polarity`: Event polarity filter (-1, 1, or null)
Clustering:
- `eps`: DBSCAN distance threshold (pixels)
- `min_samples`: Minimum events per cluster
- `window_us`: Initial clustering window (microseconds)
- `assignment_distance`: Max distance for event-to-cluster assignment
- `grace_period_us`: Time to keep sparse clusters alive
Temporal Tracking:
- `window_duration_us`: Rolling window size
- `reinit_interval_us`: Period for re-running DBSCAN
- `min_cluster_size`: Minimum events to keep a cluster active
Edit config/tracking_config.yaml or add new presets for your datasets.
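For orientation, an illustrative preset fragment; the grouping and values here are assumptions based on the parameter lists above, not the shipped schema:

```yaml
# Hypothetical preset layout; check config/tracking_config.yaml for the real schema
presets:
  my_dataset:
    input_file: my_dataset.dat
    start_time_sec: 0.0
    duration_sec: 10.0
    polarity: null
    eps: 6.0                  # pixels
    min_samples: 20
    window_us: 15000          # 15ms initial clustering window
    assignment_distance: 10.0
    grace_period_us: 10000
    window_duration_us: 600   # 0.6ms rolling window
    reinit_interval_us: 50000
    min_cluster_size: 30
```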
Event cameras output asynchronous events when pixel brightness changes:
Event = (x, y, timestamp, polarity)
- Microsecond timestamps: Far exceeding traditional frame rates
- Sparse representation: Only changing pixels generate events
- Polarity: ON (+1) for brightness increase, OFF (-1) for decrease
Prophesee Metavision binary format:
Header:
% Width 1280
% Height 720
% Format EVT3
Binary Events (8 bytes each):
- Bits 0-13: X coordinate (14 bits)
- Bits 14-27: Y coordinate (14 bits)
- Bits 28-31: Polarity (4 bits)
- Bits 32-63: Timestamp in microseconds (32 bits)
Memory-mapped I/O enables zero-copy processing of millions of events.
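The packed layout above can be decoded through a zero-copy memory map. This sketch assumes little-endian 64-bit words and omits header parsing (pass the header size via `header_bytes`):

```python
import numpy as np

# Sketch of decoding the 8-byte packed events via numpy memmap,
# following the bit layout listed above (assumed little-endian words).
def decode_events(path, header_bytes=0):
    raw = np.memmap(path, dtype="<u8", mode="r", offset=header_bytes)
    m14 = np.uint64(0x3FFF)
    x = (raw & m14).astype(np.uint16)                               # bits 0-13
    y = ((raw >> np.uint64(14)) & m14).astype(np.uint16)            # bits 14-27
    p = ((raw >> np.uint64(28)) & np.uint64(0xF)).astype(np.uint8)  # bits 28-31
    t_us = (raw >> np.uint64(32)).astype(np.uint32)                 # bits 32-63
    return x, y, p, t_us
```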
1. Initialization:
- Load events in initial window (10-20ms)
- Apply DBSCAN clustering on (x, y) coordinates
- Filter clusters by size, Y-spread (reject diffuse patterns)
2. Temporal Tracking:
- Process events in rolling window (0.5-0.75ms)
- Assign new events to nearest cluster within threshold
- Update cluster statistics: center, angle (via linear regression), width
- Remove old events from rolling window
- Re-run DBSCAN periodically to detect new blades
3. Grace Periods:
- Keep clusters alive during sparse event periods (5-20ms)
- Critical for tracking fast-rotating blades that intermittently generate events
4. Statistics Calculation:
# Linear regression for blade angle
slope, intercept = np.polyfit(x_coords, y_coords, 1)
angle = np.arctan(slope)

# Width = extent along the blade axis
axis = np.array([np.cos(angle), np.sin(angle)])
proj = np.column_stack([x_coords, y_coords]) @ axis
width = proj.max() - proj.min()

# Confidence from the R² of the linear fit

Method 1: FFT (Frequency Domain)
angles_over_time = angles - angles.mean()  # remove DC offset
fft_magnitudes = np.abs(np.fft.rfft(angles_over_time))
freqs = np.fft.rfftfreq(len(angles_over_time), d=sample_dt)
dominant_freq = freqs[fft_magnitudes[1:].argmax() + 1]  # skip the DC bin
rpm = dominant_freq * 60

Method 2: Autocorrelation (Time Domain)
autocorr = np.correlate(signal, signal, mode='full')[len(signal) - 1:]
period = lag_of_first_peak(autocorr) * sample_dt  # first non-zero-lag peak
rpm = (1 / period) * 60

Method 3: Zero-Crossing (Phase Tracking)
sin_angles = np.sin(angles)
upward = np.where((sin_angles[:-1] < 0) & (sin_angles[1:] >= 0))[0]
periods = np.diff(times[upward])
rpm = (1 / periods.mean()) * 60

Method 4: Width Maxima
# Width cycles twice per rotation (blade appears/disappears)
peaks, _ = scipy.signal.find_peaks(widths)
period = np.diff(times[peaks]).mean()
rpm = (1 / period) * 60 / 2  # divide by 2 for half-rotation cycles

Key operations parallelized with ARM Neon:
Mean Calculation:
// Process 2 f64 values per iteration (128-bit Neon registers)
let mut sum = vdupq_n_f64(0.0);
for chunk in values.chunks_exact(2) {
    let v = vld1q_f64(chunk.as_ptr());
    sum = vaddq_f64(sum, v);
}
// Horizontal add of both lanes, then fold in any odd trailing element

Variance Calculation:
let vmean = vdupq_n_f64(mean);
let diff = vsubq_f64(values, vmean);
let squared = vmulq_f64(diff, diff);
// Horizontal reduction for final sum

Distance Computation:
// Batch calculate distances from point to multiple targets
// Vectorized subtraction, multiplication, addition

Speedup: 1.71x on representative blade tracking workloads.
✅ Rolling window clustering with grace periods handles intermittent blade visibility
✅ Multiple RPM methods provide cross-validation (FFT most robust)
✅ SIMD optimization enables mobile deployment on ARM devices
✅ YAML configuration with presets enables easy parameter tuning
✅ Linear regression for blade angle works well for thin, straight blades
- Intermittent blade visibility → Solution: long grace periods (20ms), periodic re-initialization
- Sensitivity to clustering parameters → Solution: hyperparameter search tools (Python + Rust)
- Diffuse, non-blade event patterns → Solution: Y-spread filtering, temporal consistency checks
- Real camera integration: Test with live Prophesee devices (currently validated on recordings)
- 3D trajectory estimation: Combine RPM with spatial tracking
- Multi-drone scenarios: Handle multiple quadcopters simultaneously
- Machine learning: Neural networks for event-based detection
- Improved blade models: Handle curved or flexible blades
- Better tree/foliage rejection for outdoor scenarios
Extend the `evio` library to support additional cameras:
- Implement an async stream in `src/evio/source/`
- Yield standardized packets: `x_coords`, `y_coords`, `timestamps`, `polarities`
- All algorithms work with any source automatically
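A hypothetical new-source sketch: the packet field names come from the list above, but the device object and its `next_batch()` method are invented for this example.

```python
# Hypothetical camera source yielding standardized packets.
# "device" and its next_batch() method are invented for illustration;
# a real source would wrap a camera SDK under src/evio/source/.
async def camera_source(device, n_batches):
    for _ in range(n_batches):
        batch = await device.next_batch()
        yield {
            "x_coords": batch["x"],
            "y_coords": batch["y"],
            "timestamps": batch["t"],
            "polarities": batch["p"],
        }
```

Downstream algorithms only see the standardized packet dict, so they stay source-agnostic.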
Generate synthetic event data:
uv run scripts/simulate_propeller.py \
--output output_quadcopter.dat \
--rpm 5500 \
--num-blades 4 \
--duration 10.0 \
--fps 1000

Parameters: rotation angle, position, noise level, etc.
# Test RPM consistency across time points
uv run scripts/test_rpm_robustness.py \
--config config/tracking_config.yaml \
--preset drone_idle \
--num-time-points 20
# Compare detection methods
uv run scripts/detection/compare_detection_methods.py

- Magrini et al. (2025) - Drone Detection with Event Cameras - Comprehensive survey of event-based vision for drone detection, tracking, trajectory forecasting, and propeller signature analysis. arXiv:2508.04564 [cs.CV]
- ARM SIMD on Rust
- Prophesee Metavision SDK
- Event Cameras: Principles and Applications
- DBSCAN Clustering
@arm 📱 - SIMD optimization resources enabling 1.71x speedup on ARM processors
@Vultr ☁️ - Cloud infrastructure for hyperparameter optimization
Special thanks to the event camera community and Prophesee for the Metavision SDK.
MIT License - see LICENSE for details
We welcome contributions! Priority areas:
- Real-time camera integration (Prophesee, Inivation)
- Additional RPM estimation methods
- Performance optimization (GPU, more SIMD)
- Documentation improvements
- Dataset contributions
Event Horizon - Tracking the future at microsecond precision 🎯⚡
If you use this work, please cite:
Event Horizon: Microsecond Motion Radar for Rotating Objects
https://github.com/<your-repo>/evio