"The All-Seeing Eye" - Indigenous AI-Powered Airspace Defense Grid
PROJECT SUDARSHAN is a next-gen drone defense operating system designed to detect, track, and neutralize hostile aerial threats. Unlike traditional radar systems, SUDARSHAN fuses Computer Vision (AI), Electronic Warfare (EW), and Kinetic Interception into a single-pane-of-glass dashboard.
It is built to handle the modern "Drone Swarm" threat vector, where a single operator needs to manage multiple detection and kill-chains simultaneously.

"The Ultimate Weapon - Precision, Power & Destiny" The Kinetic Missile Defense Grid. When non-kinetic methods fail, BRAHMASTRA authorizes the launch of high-speed interceptors.
- 🛡️ Feature: 3D Missile Interception with manual launch authority.
- 💡 Innovation: Authentic Ballistic Simulation (Not just animation).
- 🧮 Algorithm: Adaptive Bezier Curves ($$B(t)$$) that recalculate the intercept path 20 times/sec based on target velocity.
- 🤖 Terminal Guidance: Kalman Filter prediction locks onto the drone's future position at 80% of the flight path.
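The adaptive-curve idea above can be sketched in a few lines of Python. This is a minimal illustration, not the project's actual code: the function name `replan_intercept`, the 30 m upward arc, and the 32-point sampling are all assumptions, and in practice this would run on a 20 Hz timer so the path adapts as the target manoeuvres.

```python
import numpy as np

def bezier_point(p0, p1, p2, t):
    """Quadratic Bezier: B(t) = (1-t)^2 p0 + 2(1-t)t p1 + t^2 p2."""
    return (1 - t) ** 2 * p0 + 2 * (1 - t) * t * p1 + t ** 2 * p2

def replan_intercept(missile_pos, target_pos, target_vel, flight_time):
    """Re-plan the intercept curve toward the target's predicted position.

    Called repeatedly (e.g. 20 times/sec) so the curve tracks a
    manoeuvring target rather than a fixed aim point.
    """
    aim = target_pos + target_vel * flight_time            # lead the target
    # Hypothetical control point: midpoint lifted 30 m to arc the missile up.
    control = (missile_pos + aim) / 2 + np.array([0.0, 30.0, 0.0])
    return [bezier_point(missile_pos, control, aim, t)
            for t in np.linspace(0.0, 1.0, 32)]
```

The endpoints of the sampled curve are the missile's current position and the predicted intercept point; only the control point shapes the arc between them.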
"The Thunderbolt Shield - Invisible & Impenetrable" The RF Jamming & Spectrum Analysis Suite. Designed to sever the link between a drone and its operator.
- 🛡️ Feature: Real-time Radio Frequency (RF) Spectrum Analysis.
- 💡 Innovation: Gamified Frequency Matching - The operator must manually align the jamming wave with the target's signal spike.
- 🔮 Visualization: A Holographic Signal Scope renders the invisible radio waves using React Three Fiber.
- ✅ Result: Safe neutralization (Auto-Land) for forensic capture.
"The Unescapable Trap - Lure, Confuse & Destroy" The Advanced Signal Spoofing & Trap System. Instead of destroying the drone, CHAKRAVYUH hijacks its navigation.
- 🛡️ Feature: GPS Signal Override & Trajectory Redirection.
- 💡 Innovation: "Honey Pot" Phantom Node - Broadcasting fake "Home Base" coordinates to lure the drone.
- ⚡ Trap Mechanic: Once the drone reaches the safe zone, a localized EMP Shockwave is triggered.
- 💻 Tech: Uses Geofencing algorithms to create the "Safe Box".
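The "Safe Box" geofence presumably reduces to an axis-aligned bounds test; a minimal sketch (the function name and 3-tuple position format are assumptions):

```python
def in_safe_box(pos, box_min, box_max):
    """Axis-aligned geofence check: True once the lured drone is inside
    the box, at which point the EMP shockwave trigger would fire."""
    return all(lo <= p <= hi for p, lo, hi in zip(pos, box_min, box_max))
```

Real deployments would use geodetic coordinates and possibly polygonal fences; the box form matches the "Safe Box" described above.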
"The Eye of Rudra - Vigilant & Sharp" The Computer Vision Neural Engine. The eyes of the system.
- 🛡️ Feature: Real-time Object Detection & Threat Classification.
- 🧠 Model: YOLOv8s (You Only Look Once - Small), fine-tuned on aerial datasets.
- ⚙️ Architecture: Producer-Consumer Threading decoupling Inference (AI) from Rendering (Video).
- 🚀 Performance: Achieves 60 FPS Video with 5 FPS Inference on standard i5 CPUs (No GPU required).
- 🔬 Triple-Stream Fusion: Simultaneously processes 3 discrete camera feeds (Day/Night/Thermal).
Most student projects lag when running AI. We solved this by implementing a Producer-Consumer Queue Architecture.
- The Problem: Running YOLO inference blocks the video thread, causing 200ms lag.
- Our Solution: The video thread pushes frames to a Queue. A separate "Worker Brain" processes them in the background and updates a shared Detection State variable. The result? Silky-smooth 60fps video with real-time AI overlays.
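The queue architecture above can be sketched with Python's standard `queue` and `threading` modules. This is an illustrative skeleton, not the project's code: `worker_brain`, `push_frame`, and the drop-stale-frame policy (maxsize=1) are assumptions about how the decoupling might be done.

```python
import queue
import threading

frame_q = queue.Queue(maxsize=1)   # keep only the freshest frame
detection_state = {"boxes": []}    # shared state read by the render loop

def worker_brain(infer):
    """Consumer: runs inference off the video thread."""
    while True:
        frame = frame_q.get()
        if frame is None:          # sentinel -> shut down
            break
        detection_state["boxes"] = infer(frame)

def push_frame(frame):
    """Producer: never blocks the 60 fps video loop; drops stale frames."""
    try:
        frame_q.put_nowait(frame)
    except queue.Full:
        try:
            frame_q.get_nowait()   # discard the unprocessed old frame
        except queue.Empty:
            pass
        frame_q.put_nowait(frame)
```

The key property is that `push_frame` never waits on inference: when the worker is busy, the stale frame is replaced rather than queued, so the overlay always reflects the most recent detection.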
We don't just show where the drone is; we show where it will be.
- Formula: $$ P_{future} = P_{current} + (Velocity \times \Delta t) + (0.5 \times Acceleration \times \Delta t^2) $$
- Visual: A Green Laser Line projects 5 seconds into the future, helping operators time their interceptor launches perfectly.
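The projection formula above is one line of kinematics per axis; a minimal sketch (the function name `predict_position` is illustrative):

```python
def predict_position(p, v, a, dt):
    """P_future = P_current + v*dt + 0.5*a*dt^2, applied per axis."""
    return tuple(pi + vi * dt + 0.5 * ai * dt * dt
                 for pi, vi, ai in zip(p, v, a))
```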
To prevent operator fatigue, we engineered a state-aware audio system (J.A.R.V.I.S.). It tracks every individual drone ID and camera source. It will announce "Hostile Detected on Cam 1" exactly once, ensuring critical alerts aren't drowned out by noise.
We moved beyond "animations" to implement Industry-Standard Mathematical Models directly in the browser.
Instead of hardcoding paths, we use a Discrete Zero-Latency Kalman Filter to predict the drone's future position.
- State Vector: $$ X_k = [x, y, v_x, v_y]^T $$
- Prediction Model: $$ X_{k+1} = F_k X_k + B_k u_k $$
- Covariance Update: $$ P_{k+1} = F_k P_k F_k^T + Q_k $$
- Why it matters: This allows the interceptor missile to lead the target dynamically, accounting for sudden turns.
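The prediction and covariance equations above map directly onto a few lines of NumPy. A sketch under the constant-velocity model implied by the state vector $[x, y, v_x, v_y]^T$; the `dt` and `Q` values are illustrative, and the control term $B_k u_k$ is omitted (taken as zero):

```python
import numpy as np

def kalman_predict(x, P, F, Q):
    """Prediction step: x' = F x,  P' = F P F^T + Q (control term omitted)."""
    return F @ x, F @ P @ F.T + Q

dt = 0.05                                     # illustrative 20 Hz update
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)    # constant-velocity transition
x = np.array([0.0, 0.0, 20.0, 5.0])           # [x, y, vx, vy]
P = np.eye(4)                                 # state covariance
Q = 0.01 * np.eye(4)                          # process noise (assumed)
x, P = kalman_predict(x, P, F, Q)
```

Each predict call advances the position estimate by `v*dt`, which is what lets the interceptor aim at where the drone *will* be.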
The spoofing module solves the Pseudorange Equation in reverse to calculate the required time-offset injection.
- The Formula: $$ \rho = r + c(dt_r - dt_s) + S_{offset} + \epsilon $$
- $\rho$: Pseudorange
- $r$: True Geometric Range
- $c$: Speed of Light ($299{,}792{,}458\ m/s$)
- $dt_r, dt_s$: Clock Bias (Receiver / Satellite)
- $S_{offset}$: The Spoofing Attack Vector (nanoseconds)
Signal Visualization:
- I/Q Scatter Plot: Visualizes BPSK/QPSK constellation corruption.
- Allan Variance ($$\sigma^2(\tau)$$): Measures the induced clock instability to confirm "lock loss".
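Solving the pseudorange equation in reverse for the injected offset reduces to simple arithmetic once the desired fake range is chosen. A hedged sketch — the function name, the zero default clock bias, and ignoring the noise term $\epsilon$ are all assumptions:

```python
C = 299_792_458.0  # speed of light, m/s

def spoof_offset(true_range, fake_range, clock_bias=0.0):
    """Invert rho = r + c*dt + S_offset: given the range we want the
    receiver to compute, return the injected delay in nanoseconds."""
    s_offset = fake_range - true_range - C * clock_bias   # metres
    return s_offset / C * 1e9                             # nanoseconds
```

For example, making the receiver believe a satellite is 300 m farther away requires roughly a 1 microsecond (≈1000 ns) timing injection.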
"The One Who Hears All - Detection Beyond Sight" The Acoustic Warfare & Audio Fingerprinting Module. Vision fails in fog; Radar fails against stealth. Sound never lies.
- 🛡️ Feature: Multi-Source Audio Analysis (Simulated Array & Live Microphone).
- 💡 Innovation: Inverse Doppler Solver - Calculates drone velocity purely from pitch shift, without Radar.
- 🧬 Fingerprinting: Class-4 Drone Signature Lock - Filters out bird calls/wind by strictly matching the 2nd Harmonic of rotor blade noise.
- 🌊 Visual: 3000% Scale 3D Spectrogram - Massive, Hollywood-grade visualization of the audio spectrum.
We implemented a real-time Physics Engine that runs on the raw audio buffer.
Calculates relative velocity (v) from frequency shift (Δf).
- Formula: $$ v = c \times \frac{F_{observed} - F_{source}}{F_{source}} $$
- $v$: Target Velocity (m/s)
- $c$: Speed of Sound ($343\ m/s$)
- $F_{observed}$: Detected FFT Peak Frequency
- $F_{source}$: Known Drone Rotor Frequency (e.g., 400 Hz)
- Live Validated: Whistling into the mic actually changes the calculated speed on screen!
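The inverse Doppler solver above is a one-line computation once the FFT peak has been found; a minimal sketch (function name assumed):

```python
SPEED_OF_SOUND = 343.0  # m/s

def doppler_velocity(f_observed, f_source):
    """v = c * (F_observed - F_source) / F_source.
    Positive result = closing target (pitch shifted up)."""
    return SPEED_OF_SOUND * (f_observed - f_source) / f_source
```

A rotor known to hum at 400 Hz but detected at 420 Hz implies a closing speed of about 17 m/s.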
Converts Time-Domain audio (Microphone) into Frequency-Domain data (Spectrogram).
- Formula: $$ X[k] = \sum_{n=0}^{N-1} x[n] \cdot e^{-i 2\pi k n / N} $$
- Implementation: 2048-point Fast Fourier Transform running at 60 FPS via Web Audio API.
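The DFT sum above can be written out directly for one frequency bin. This naive form is for illustration only — the project uses the Web Audio API's FFT in the browser, and an offline pipeline would use `numpy.fft` — but it shows why a pure tone concentrates its energy in a single bin:

```python
import numpy as np

def dft_bin(x, k):
    """X[k] = sum_n x[n] * exp(-i 2*pi*k*n / N) - the textbook DFT for
    a single bin (O(N) per bin; real code uses an O(N log N) FFT)."""
    n = np.arange(len(x))
    return np.sum(x * np.exp(-2j * np.pi * k * n / len(x)))

# A sine at exactly bin 17 of a 2048-point window (48 kHz assumed rate):
fs, n_fft = 48_000, 2048
t = np.arange(n_fft) / fs
tone = np.sin(2 * np.pi * (fs / n_fft) * 17 * t)
```

Because the tone completes a whole number of cycles in the window, its magnitude lands entirely in bin 17 (value N/2 for a unit sine) with essentially nothing elsewhere.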
To prevent false positives (like human voice), we use a strict "Rotor Check":
- Rule: Trigger MATCH if and only if:
  - Fundamental Freq $\in [350\,Hz, 650\,Hz]$
  - Harmonic Strength ($2 \times F$) $> 30\%$ of Fundamental.
- Why? Mechanical rotors produce "Sawtooth" waves (strong even harmonics). Human voice produces "Sine-like" or chaotic waves.
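The MATCH rule is a two-condition predicate on the spectrum. A minimal sketch, assuming the fundamental frequency and the amplitudes at $F$ and $2F$ have already been extracted from the FFT:

```python
def is_rotor(fundamental_hz, fundamental_amp, second_harmonic_amp):
    """MATCH iff the fundamental sits in the rotor band [350, 650] Hz
    AND the 2nd harmonic carries > 30% of the fundamental's strength."""
    in_band = 350.0 <= fundamental_hz <= 650.0
    strong_harmonic = second_harmonic_amp > 0.30 * fundamental_amp
    return in_band and strong_harmonic
```

A human voice at 400 Hz with a weak 2nd harmonic fails the second check; a rotor's sawtooth-like spectrum passes both.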
This module does not just "animate" points. It solves the Reynolds' Boids Flocking Equations in real-time.
To predict the leader's position, we calculate the weighted centroid of the swarm cluster.
- Formula: $$ \mu = \frac{1}{N} \sum_{i=1}^{N} P_i $$
- Projection: $$ P_{ghost} = \mu + 0.4 \times (\mu - P_{base}) $$
- Why? Leaders typically stay behind the swarm relative to the target (safety offset).
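The centroid and ghost-node projection above take two lines of NumPy; a sketch (function name assumed):

```python
import numpy as np

def ghost_node(swarm_positions, base_position):
    """mu = mean of swarm positions; the leader estimate is pushed
    0.4 * (mu - P_base) behind the centroid, away from the target base."""
    mu = np.mean(swarm_positions, axis=0)
    return mu + 0.4 * (mu - np.asarray(base_position))
```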
The "Kill Shot" is modeled as a signal propagation wave.
- Formula: $$ I = \frac{P}{4\pi r^2} $$
- Implementation: The hack signal ($I$) degrades over distance ($r$). We simulate this by sorting target drones by distance from the Ghost Node and applying a temporal delay ($t_{delay} = k \times r$).
- Result: A realistic "Domino Effect" where drones near the leader fall first, followed by outer layers.
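The sort-and-delay scheduling can be sketched in a few lines; `hack_schedule` and the delay constant `k` are illustrative names, not the project's actual values:

```python
import math

def hack_schedule(ghost, targets, k=0.02):
    """Sort targets by distance from the Ghost Node and assign each a
    delay t_delay = k * r, so nearer drones drop first (domino effect)."""
    return sorted(((k * math.dist(ghost, p), p) for p in targets),
                  key=lambda item: item[0])
```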
Voice & Text Interaction: The system accepts natural language commands. Below are the 23 Core Commands recognized by the Neural Engine:
- REPORT / STATUS / SITREP → "Sector Scan Complete. 5 Hostiles Detected."
- INTERCEPT / SCRAMBLE / ENGAGE / KILL → "Target Locked. Scrambling Interceptors."
- WEATHER / MET → "Wind Shear at 20,000ft. Analysis: Safe to Fly."
- ANALYSIS / ANALYSE / THREAT / IDENTIFY → "Critical Alert: Class-4 Threat Detected."
- BASE / HQ / DEFENSE → "Command Post: SAM Batteries Armed."
- SYSTEM / JARVIS / DIAGNOSTIC → "CPU Load: 12%. Uplink Active."
- HELLO / WHO ARE YOU → "I am J.A.R.V.I.S. At your service."
- HELP / COMMANDS → "Displaying Protocol List..."
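The command table above suggests a simple keyword dispatcher. A minimal sketch — only three of the command groups are shown, and the matching strategy (substring search on the uppercased utterance) is an assumption about how the recognizer might work:

```python
RESPONSES = {
    ("REPORT", "STATUS", "SITREP"):
        "Sector Scan Complete. 5 Hostiles Detected.",
    ("INTERCEPT", "SCRAMBLE", "ENGAGE", "KILL"):
        "Target Locked. Scrambling Interceptors.",
    ("HELP", "COMMANDS"):
        "Displaying Protocol List...",
}

def dispatch(utterance):
    """Return the canned response for the first keyword group matched."""
    text = utterance.upper()
    for keywords, reply in RESPONSES.items():
        if any(kw in text for kw in keywords):
            return reply
    return "Command not recognized."
```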
"The Master Strategist - Understanding the Hive Mind" The Advanced Swarm Analysis & Neural Interface. Designed to predict, visualise, and systematically dismantle coordinated drone swarms.
- 🛡️ Feature: Multi-Spectral Vision System (Optical / Thermal / LiDAR).
  - Optical: Standard visual feed.
  - Thermal: Inverted backdrop-filter for heat signature tracking ($T > 40^{\circ}C$).
  - LiDAR: Point-cloud simulation using hard-light mix-blend modes.
- ☠️ Kill Shot Mechanic: "The Viral Hack"
  - Trigger: Dual-Key Auth (SHIFT + SPACE) initiates a cascading cyber-attack.
  - Physics: Propagates from the "Ghost Node" (Leader) to neighbors based on Inverse Square Distance ($1/r^2$).
- ❤️ Pilot Vitals: Heartbeat Flatline Monitor.
  - Visualises the pilot's stress levels (Sine Wave).
  - Effect: Upon successful neutralization, the monitor FLATLINES with a continuous audio tone, confirming the kill.
- 📺 Kill Feed: Drone POV Intercept.
  - Hacks the enemy's video feed.
  - Displays "SIGNAL LOST" static noise instantly upon hack execution.
- Node.js (v18+)
- Python 3.10+ (for Vision)
```bash
git clone https://github.com/ankan123basu/DRONE-X.git
cd DRONE-X
```

```bash
cd server
npm install
npm run dev
# Runs on Port 3001
```

```bash
cd client
npm install
npm run dev
# Runs on Port 5173
```

```bash
cd vision-service
pip install -r requirements.txt
python main.py
# Runs on Port 5000
```

Team Antigravity - Building the future of Autonomous Defense.
"Nabhah Sprsham Diptam" (Touch the Sky with Glory)
© 2025 PROJECT SUDARSHAN