An end-to-end crowd analysis project that combines:
- People counting from video (YOLO)
- Density heatmap generation (for hotspots/spikes)
- Nearby BLE device analysis on Arduino/ESP32
- FastAPI backend for processing and integration
- Web frontend for simulation and visualization
The goal is to help teams monitor crowd conditions in near real-time and make safer decisions.
This project is useful for places where crowd behavior changes quickly (events, campuses, transport hubs, malls, public spaces).
- Early risk detection: spot crowd-density spikes before they become unsafe.
- Security support: detect unusual surges, persistent clustering, or unexpected activity patterns.
- Operational awareness: understand where people are concentrated and how patterns evolve over time.
- Faster response: improve staffing, queue routing, and emergency handling with data-driven signals.
- Low-friction deployment options: camera-based analytics plus BLE environment sensing for complementary insights.
Counts people in video streams/files and returns statistics such as:
- frames processed
- max people in frame
- number of unique tracked person IDs
Optional annotated video output is supported.
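The statistics above can be reduced from per-frame tracker output in a few lines. This is an illustrative sketch only (the actual script lives in pc-model/), assuming each frame yields a list of integer track IDs from the YOLO tracker; the function name is hypothetical:

```python
def summarize_tracks(frames):
    """Reduce per-frame lists of tracker IDs to summary statistics.

    `frames` is a list where each element holds the track IDs
    detected in one frame, e.g. [[1, 2], [1, 2, 3], [2, 3]].
    """
    unique_ids = set()
    max_in_frame = 0
    for ids in frames:
        unique_ids.update(ids)                       # IDs persist across frames
        max_in_frame = max(max_in_frame, len(ids))   # peak simultaneous count
    return {
        "frames_processed": len(frames),
        "max_people_in_frame": max_in_frame,
        "unique_people": len(unique_ids),
    }
```

Because tracker IDs persist across frames, the unique-ID count approximates distinct people rather than summing per-frame detections.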
Builds persistent density maps from detections and renders:
- raw heatmap matrix (.npy)
- colored heatmap image
- overlay heatmap on source image
Useful for identifying repeated congestion zones.
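A minimal sketch of persistent density accumulation, assuming detections arrive as (x, y) pixel centers; the real module in heatmap-model/ may smooth and normalize differently:

```python
import numpy as np

def accumulate_density(shape, detections, radius=2):
    """Accumulate detection centers into a persistent density matrix.

    shape      -- (height, width) of the map
    detections -- iterable of (x, y) pixel centers
    radius     -- half-size of the square splat around each center
    """
    heat = np.zeros(shape, dtype=np.float32)
    h, w = shape
    for x, y in detections:
        # Clip the splat window to the map bounds.
        y0, y1 = max(0, y - radius), min(h, y + radius + 1)
        x0, x1 = max(0, x - radius), min(w, x + radius + 1)
        heat[y0:y1, x0:x1] += 1.0
    return heat
```

Repeated congestion zones accumulate the largest values over time; `np.save("heatmap.npy", heat)` persists the raw matrix for later rendering.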
ESP32 Arduino sketch that scans nearby BLE advertisers and tracks:
- unique devices seen
- RSSI (signal strength) trends (last/min/max)
- how frequently each device is seen
- strongest currently observed device
This can provide additional environmental context in areas where camera coverage is limited.
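The sketch's bookkeeping can be modeled in a few lines of Python for clarity; class and method names here are illustrative, and the authoritative Arduino code is in ble-model/ble_model.ino:

```python
class BleStats:
    """Track per-device RSSI trends (last/min/max) and sighting counts."""

    def __init__(self):
        self.devices = {}  # MAC address -> {"last", "min", "max", "seen"}

    def record(self, mac, rssi):
        d = self.devices.get(mac)
        if d is None:
            self.devices[mac] = {"last": rssi, "min": rssi, "max": rssi, "seen": 1}
        else:
            d["last"] = rssi
            d["min"] = min(d["min"], rssi)
            d["max"] = max(d["max"], rssi)
            d["seen"] += 1

    def strongest(self):
        """MAC of the device with the highest last-seen RSSI (closest to 0 dBm)."""
        if not self.devices:
            return None
        return max(self.devices, key=lambda m: self.devices[m]["last"])
```

Note that RSSI is negative in dBm, so the "strongest" device is the one whose value is closest to zero.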
Provides HTTP APIs for processing uploaded media:
- GET /health
- POST /simulation/process
Routes files to either YOLO counting (video) or heatmap generation (image).
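The routing decision can be sketched as a pure function; the extension sets below are assumptions for illustration, and the authoritative logic lives in backend/:

```python
from pathlib import Path

VIDEO_EXTS = {".mp4", ".avi", ".mov", ".mkv"}   # assumed supported video formats
IMAGE_EXTS = {".jpg", ".jpeg", ".png", ".bmp"}  # assumed supported image formats

def detect_mode(filename: str) -> str:
    """Map an uploaded file to a processing mode by its extension."""
    ext = Path(filename).suffix.lower()
    if ext in VIDEO_EXTS:
        return "people_counting"
    if ext in IMAGE_EXTS:
        return "density_heatmap"
    raise ValueError(f"Unsupported media type: {ext}")
```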
React + TypeScript + Vite interface for interacting with the backend and visual outputs.
- Input media is uploaded from frontend (video/image).
- Backend detects mode:
- video → people counting
- image → density heatmap
- Model output is returned as stats and optional output artifacts.
- BLE analyzer can run separately on ESP32 to provide nearby device analytics.
.
├── backend/ # FastAPI service
├── ble-model/ # Arduino BLE analyzer sketch + README
├── heatmap-model/ # Heatmap module + tests + outputs
├── pc-model/ # YOLO people counting scripts + outputs
├── website/ # React/Vite frontend
└── preview_assets/ # sample media/assets
From repository root:
poetry install
poetry run python -m uvicorn backend.app:app --host 0.0.0.0 --port 8000 --reload

Backend docs are in backend/README.md.
From website/:
npm install
npm run dev

(Or use your preferred package manager.)
Open ble-model/ble_model.ino in Arduino IDE, select ESP32 board, upload, and open Serial Monitor at 115200 baud.
Details: ble-model/README.md.
- Event venues: detect crowding near stages/exits
- Transit stations: identify queue buildup and unusual flow changes
- Campus security: observe spikes around gates/corridors
- Retail/public facilities: improve staffing and crowd routing
- Vision outputs depend on camera angle, lighting, and occlusion.
- BLE signals are noisy and do not map directly to exact person counts.
- Best results come from combining multiple signals (video + heatmap + BLE context).
See LICENSE.