
🇮🇳 PROJECT SUDARSHAN (सुदर्शन)

"The All-Seeing Eye" - Indigenous AI-Powered Airspace Defense Grid



📜 Mission Statement

PROJECT SUDARSHAN is a Next-Gen drone defense operating system designed to detect, track, and neutralize hostile aerial threats. Unlike traditional radar systems, SUDARSHAN fuses Computer Vision (AI), Electronic Warfare (EW), and Kinetic Interception into a single glass-pane dashboard.

It is built to handle the modern "Drone Swarm" threat vector, where a single operator needs to manage multiple detection and kill-chains simultaneously.


🛡️ THE DEFENSE SUITE (Modules)

1. BRAHMASTRA (ब्रह्मास्त्र) - Kinetic Interception

"The Ultimate Weapon - Precision, Power & Destiny" The Kinetic Missile Defense Grid. When non-kinetic methods fail, BRAHMASTRA authorizes the launch of high-speed interceptors.

  • 🛡️ Feature: 3D Missile Interception with manual launch authority.
  • 💡 Innovation: Authentic Ballistic Simulation (Not just animation).
  • 🧮 Algorithm: Adaptive Bezier Curves ($$B(t)$$) that recalculate the intercept path 20 times per second based on target velocity.
  • 🤖 Terminal Guidance: Kalman Filter prediction locks onto the drone's future position at 80% of the flight path.
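As a hedged illustration of the adaptive re-aiming described above, here is a minimal Python sketch of a quadratic Bezier intercept path rebuilt from the target's predicted lead position. The function names, the fixed arc offset on the control point, and the 20-point sampling are illustrative assumptions, not the project's actual code:

```python
import numpy as np

def bezier_point(p0, p1, p2, t):
    """Quadratic Bezier: B(t) = (1-t)^2 P0 + 2(1-t)t P1 + t^2 P2."""
    return (1 - t) ** 2 * p0 + 2 * (1 - t) * t * p1 + t ** 2 * p2

def intercept_path(launcher, target_pos, target_vel, flight_time, steps=20):
    """Re-aim the curve at the target's predicted (led) position."""
    aim = target_pos + target_vel * flight_time           # lead the target
    control = (launcher + aim) / 2 + np.array([0.0, 0.0, 40.0])  # arc upward
    return [bezier_point(launcher, control, aim, t)
            for t in np.linspace(0.0, 1.0, steps)]
```

In the real module this recomputation would run on every guidance tick (20 times per second per the spec), replacing the previous curve.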

2. VAJRA-KAVACH (वज्र कवच) - Electronic Warfare

"The Thunderbolt Shield - Invisible & Impenetrable" The RF Jamming & Spectrum Analysis Suite. Designed to sever the link between a drone and its operator.

  • 🛡️ Feature: Real-time Radio Frequency (RF) Spectrum Analysis.
  • 💡 Innovation: Gamified Frequency Matching - The operator must manually align the jamming wave with the target's signal spike.
  • 🔮 Visualization: A Holographic Signal Scope renders the invisible radio waves using React Three Fiber.
  • ✅ Result: Safe neutralization (Auto-Land) for forensic capture.

3. CHAKRAVYUH (चक्रव्यूह) - GPS Spoofing

"The Unescapable Trap - Lure, Confuse & Destroy" The Advanced Signal Spoofing & Trap System. Instead of destroying the drone, CHAKRAVYUH hijacks its navigation.

  • 🛡️ Feature: GPS Signal Override & Trajectory Redirection.
  • 💡 Innovation: "Honey Pot" Phantom Node - Broadcasting fake "Home Base" coordinates to lure the drone.
  • ⚡ Trap Mechanic: Once the drone reaches the safe zone, a localized EMP Shockwave is triggered.
  • 💻 Tech: Uses Geofencing algorithms to create the "Safe Box".
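The "Safe Box" reduces to an axis-aligned geofence test; a minimal sketch, with illustrative function and parameter names:

```python
def inside_safe_box(pos, box_min, box_max):
    """Axis-aligned geofence: True once the drone enters the trap volume."""
    return all(lo <= p <= hi for p, lo, hi in zip(pos, box_min, box_max))
```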

4. RUDRAKSHA (रुद्राक्ष) - AI Vision Sentinel

"The Eye of Rudra - Vigilant & Sharp" The Computer Vision Neural Engine. The eyes of the system.

  • 🛡️ Feature: Real-time Object Detection & Threat Classification.
  • 🧠 Model: YOLOv8s (You Only Look Once - Small), fine-tuned on aerial datasets.
  • ⚙️ Architecture: Producer-Consumer Threading decoupling Inference (AI) from Rendering (Video).
  • 🚀 Performance: Achieves 60 FPS Video with 5 FPS Inference on standard i5 CPUs (No GPU required).
  • 🔬 Triple-Stream Fusion: Simultaneously processes 3 discrete camera feeds (Day/Night/Thermal).

🔬 Technical Innovations (For Judges)

🚀 1. Hybrid Threading Architecture (Rudraksha)

Most student projects lag when running AI. We solved this by implementing a Producer-Consumer Queue Architecture.

  • The Problem: Running YOLO inference blocks the video thread, causing 200ms lag.
  • Our Solution: The video thread pushes frames to a Queue. A separate "Worker Brain" processes them in the background and updates a shared Detection State variable. The result? Silky smooth 60fps video with real-time AI overlays.
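The pattern above can be sketched in a few lines of Python. The queue size, the lock, and the time.sleep stand-in for YOLO inference are illustrative assumptions:

```python
import threading, queue, time

frame_q = queue.Queue(maxsize=1)   # hold only the freshest frame
detections = {"boxes": []}         # shared state, updated by the worker
lock = threading.Lock()

def worker():
    """Consumer: runs (slow) inference off the render thread."""
    while True:
        frame = frame_q.get()
        if frame is None:          # sentinel -> shut down
            break
        time.sleep(0.05)           # stand-in for YOLO inference
        with lock:
            detections["boxes"] = [("drone", frame)]

def push_frame(frame):
    """Producer: never blocks - skip this frame if the worker is still busy."""
    try:
        frame_q.put_nowait(frame)
    except queue.Full:
        pass
```

put_nowait plus the Full handler is what keeps the video thread at full frame rate: frames the worker cannot keep up with are dropped instead of queued.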

🔮 2. Predictive Trajectory Engine (Kalman Filter)

We don't just show where the drone is; we show where it will be.

  • Formula: $$ P_{future} = P_{current} + (Velocity \times \Delta t) + (0.5 \times Acceleration \times \Delta t^2) $$
  • Visual: A Green Laser Line projects 5 seconds into the future, helping operators time their interceptor launches perfectly.
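The projection is plain constant-acceleration kinematics; a one-line sketch (the function name is illustrative):

```python
def predict_position(p, v, a, dt):
    """Constant-acceleration kinematics: p + v*dt + 0.5*a*dt^2."""
    return p + v * dt + 0.5 * a * dt * dt
```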

⚡ 3. The "One-Time" Audio Logic

To prevent operator fatigue, we engineered a state-aware audio system (J.A.R.V.I.S.). It tracks every individual drone ID and camera source. It will announce "Hostile Detected on Cam 1" exactly once, ensuring critical alerts aren't drowned out by noise.
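A minimal sketch of this state-aware de-duplication, assuming alerts are keyed by (drone ID, camera) pairs as described; the names and the speak callback are illustrative:

```python
announced = set()  # every (drone_id, camera) pair already called out

def maybe_announce(drone_id, cam, speak=print):
    """Fire the alert only the first time this (drone, camera) pair is seen."""
    key = (drone_id, cam)
    if key in announced:
        return False
    announced.add(key)
    speak(f"Hostile Detected on Cam {cam}")
    return True
```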

📐 4. MATH & PHYSICS KERNELS (V2.0 Core)

We moved beyond "animations" to implement Industry-Standard Mathematical Models directly in the browser.

A. BRAHMASTRA - Kalman Filter State Estimation

Instead of hardcoding paths, we use a Discrete Zero-latency Kalman Filter to predict where the drone will be in $t+5s$.

  • State Vector: $$ X_k = [x, y, v_x, v_y]^T $$
  • Prediction Model: $$ X_{k+1} = F_k X_k + B_k u_k $$
  • Covariance Update: $$ P_{k+1} = F_k P_k F_k^T + Q_k $$
  • Why it matters: This allows the interceptor missile to lead the target dynamically, accounting for sudden turns.
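The predict equations above can be sketched with NumPy. The constant-velocity transition matrix $F$ and scalar process noise $Q = qI$ are simplifying assumptions for illustration:

```python
import numpy as np

def kalman_predict(x, P, dt, q=0.1):
    """One predict step for the state X = [x, y, vx, vy]^T."""
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], dtype=float)   # constant-velocity model
    Q = q * np.eye(4)                            # process noise
    return F @ x, F @ P @ F.T + Q                # X_{k+1}, P_{k+1}
```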

B. CHAKRAVYUH - GPS Spoofing Equation

The spoofing module solves the Pseudorange Equation in reverse to calculate the required time-offset injection.

  • The Formula: $$ \rho = r + c(dt_r - dt_s) + S_{offset} + \epsilon $$
    • $\rho$: Pseudorange
    • $r$: True Geometric Range
    • $c$: Speed of Light ($299,792,458 m/s$)
    • $dt$: Clock Bias (Receiver/Satellite)
    • $S_{offset}$: The Spoofing Attack Vector (Nanoseconds)
  • Signal Visualization:
    • I/Q Scatter Plot: Visualizes BPSK/QPSK constellation corruption.
    • Allan Variance ($$\sigma^2(\tau)$$): Measures the induced clock instability to confirm "lock loss".
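A hedged sketch of the forward pseudorange model, assuming the spoofing term is an injected time offset given in nanoseconds that contributes $c$ times that offset in metres; the function name, signature, and that unit interpretation are assumptions:

```python
C = 299_792_458.0  # speed of light, m/s

def pseudorange(true_range, dt_receiver, dt_satellite,
                spoof_offset_ns=0.0, noise=0.0):
    """rho = r + c*(dt_r - dt_s) + S_offset + eps, S_offset injected in ns."""
    s_offset = C * spoof_offset_ns * 1e-9   # injected time offset -> metres
    return true_range + C * (dt_receiver - dt_satellite) + s_offset + noise
```

Solving this in reverse, as the module describes, means choosing spoof_offset_ns so the victim computes a position inside the trap.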

5. KARNA (कर्ण) - Acoustic Triangulation

"The One Who Hears All - Detection Beyond Sight" The Acoustic Warfare & Audio Fingerprinting Module. Vision fails in fog; Radar fails against stealth. Sound never lies.

  • 🛡️ Feature: Multi-Source Audio Analysis (Simulated Array & Live Microphone).
  • 💡 Innovation: Inverse Doppler Solver - Calculates drone velocity purely from pitch shift, without Radar.
  • 🧬 Fingerprinting: Class-4 Drone Signature Lock - Filters out bird calls/wind by strictly matching the 2nd Harmonic of rotor blade noise.
  • 🌊 Visual: 3000% Scale 3D Spectrogram - Massive, Hollywood-grade visualization of the audio spectrum.

🔬 Technical Innovations (For Judges) - Update

⚡ 5. MATH KERNEL: KARNA (Audio Physics)

We implemented a real-time Physics Engine that runs on the raw audio buffer.

A. The Inverse Doppler Solver

Calculates target velocity (v) from the frequency shift (Δf).

  • Formula: $$ v = c \times \frac{F_{observed} - F_{source}}{F_{source}} $$
    • $v$: Target Velocity (m/s)
    • $c$: Speed of Sound ($343 m/s$)
    • $F_{obs}$: Detected FFT Peak Frequency
    • $F_{src}$: Known Drone Rotor Frequency (e.g., 400Hz)
  • Live Validated: Whistling into the mic actually changes the calculated speed on screen!
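The solver itself is a one-liner; a sketch with illustrative names (positive output means the source is closing):

```python
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 C

def doppler_velocity(f_observed, f_source):
    """v = c * (F_obs - F_src) / F_src."""
    return SPEED_OF_SOUND * (f_observed - f_source) / f_source
```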

B. The Fourier Transform (FFT)

Converts Time-Domain audio (Microphone) into Frequency-Domain data (Spectrogram).

  • Formula: $$ X[k] = \sum_{n=0}^{N-1} x[n] \cdot e^{-i 2\pi k n / N} $$
  • Implementation: 2048-point Fast Fourier Transform running at 60 FPS via Web Audio API.
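The project runs this in-browser through the Web Audio API; as an illustrative offline equivalent, here is a NumPy sketch of a 2048-point FFT peak-pick (function name and DC-bin handling are assumptions):

```python
import numpy as np

def peak_frequency(samples, sample_rate, n_fft=2048):
    """Dominant frequency of one audio buffer via a 2048-point FFT."""
    spectrum = np.abs(np.fft.rfft(samples[:n_fft], n=n_fft))
    spectrum[0] = 0.0                  # ignore the DC bin
    k = int(np.argmax(spectrum))
    return k * sample_rate / n_fft     # bin index -> Hz
```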

C. Harmonic Fingerprinting Logic

To prevent false positives (like human voice), we use a strict "Rotor Check":

  • Rule: Trigger MATCH IF AND ONLY IF:
    1. Fundamental Freq $\in [350Hz, 650Hz]$
    2. Harmonic Strength ($2 \times F$) $> 30\%$ of Fundamental.
  • Why? Mechanical rotors produce "Sawtooth" waves (strong even harmonics). Human voice produces "Sine-like" or chaotic waves.
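The rule translates directly into a predicate (names are illustrative):

```python
def is_rotor(fundamental_hz, fundamental_mag, second_harmonic_mag):
    """MATCH iff fundamental in [350, 650] Hz AND 2nd harmonic > 30% of it."""
    in_band = 350.0 <= fundamental_hz <= 650.0
    strong_harmonic = second_harmonic_mag > 0.30 * fundamental_mag
    return in_band and strong_harmonic
```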

🛡️ 6. MATH KERNEL: CHANAKYA (Swarm Physics)

This module does not just "animate" points. It solves the Reynolds' Boids Flocking Equations in real-time.

A. Swarm Center Calculation (Ghost Node)

To predict the leader's position, we calculate the weighted centroid of the swarm cluster.

  • Formula: $$ \mu = \frac{1}{N} \sum_{i=1}^{N} P_i $$
  • Projection: $$ P_{ghost} = \mu + 0.4 \times (\mu - P_{base}) $$
    • Why? Leaders typically stay behind the swarm relative to the target (safety offset).
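Both formulas fit in a few lines of NumPy; the 0.4 offset factor comes from the text, while the function name and signature are assumptions:

```python
import numpy as np

def ghost_node(positions, base):
    """Centroid of the swarm, pushed 40% further away from the defended base."""
    mu = np.mean(positions, axis=0)    # swarm centre, mu = (1/N) * sum(P_i)
    return mu + 0.4 * (mu - base)      # P_ghost = mu + 0.4 * (mu - P_base)
```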

B. The Viral Cascade (Inverse Square Law)

The "Kill Shot" is modeled as a signal propagation wave.

  • Formula: $$ I = \frac{P}{4\pi r^2} $$
  • Implementation: The hack signal ($I$) degrades over distance ($r$). We simulate this by sorting target drones by distance from the Ghost Node and applying a temporal delay ($t_{delay} = k \times r$).
  • Result: A realistic "Domino Effect" where drones near the leader fall first, followed by the outer layers.
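A sketch of the distance-sorted delay schedule described above; the constant k and the names are illustrative assumptions:

```python
import numpy as np

def cascade_schedule(drone_positions, ghost, k=0.01):
    """Hack-arrival delay per drone: t_delay = k * r, nearest to leader first."""
    r = np.linalg.norm(drone_positions - ghost, axis=1)
    order = np.argsort(r)              # inner ring falls before outer rings
    return [(int(i), k * float(r[i])) for i in order]
```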

🗣️ J.A.R.V.I.S. Command Reference

Voice & Text Interaction: The system accepts natural language commands. Below are the 23 Core Commands recognized by the Neural Engine:

  • REPORT / STATUS / SITREP → "Sector Scan Complete. 5 Hostiles Detected."
  • INTERCEPT / SCRAMBLE / ENGAGE / KILL → "Target Locked. Scrambling Interceptors."
  • WEATHER / MET → "Wind Shear at 20,000ft. Analysis: Safe to Fly."
  • ANALYSIS / ANALYSE / THREAT / IDENTIFY → "Critical Alert: Class-4 Threat Detected."
  • BASE / HQ / DEFENSE → "Command Post: SAM Batteries Armed."
  • SYSTEM / JARVIS / DIAGNOSTIC → "CPU Load: 12%. Uplink Active."
  • HELLO / WHO ARE YOU → "I am J.A.R.V.I.S. At your service."
  • HELP / COMMANDS → "Displaying Protocol List..."

🚀 6. CHANAKYA (चाणक्य) - Swarm Intelligence & Kill Switch

"The Master Strategist - Understanding the Hive Mind" The Advanced Swarm Analysis & Neural Interface. Designed to predict, visualise, and systematically dismantle coordinated drone swarms.

  • 🛡️ Feature: Multi-Spectral Vision System (Optical / Thermal / LiDAR).
    • Optical: Standard visual feed.
    • Thermal: Inverted backdrop-filter for heat signature tracking ($T > 40^{\circ}C$).
    • LiDAR: Point-cloud simulation using hard-light mix-blend modes.
  • ☠️ Kill Shot Mechanic: "The Viral Hack"
    • Trigger: Dual-Key Auth (SHIFT + SPACE) initiates a cascading cyber-attack.
    • Physics: Propagates from the "Ghost Node" (Leader) to neighbors based on Inverse Square Distance ($1/r^2$).
  • ❤️ Pilot Vitals: Heartbeat Flatline Monitor.
    • Visualises the pilot's stress levels (Sine Wave).
    • Effect: Upon successful neutralization, the monitor FLATLINES with a continuous audio tone, confirming the kill.
  • 📺 Kill Feed: Drone POV Intercept.
    • Hacks the enemy's video feed.
    • Displays "SIGNAL LOST" static noise instantly upon hack execution.

🛠️ Installation & Setup

Prerequisites

  • Node.js (v18+)
  • Python 3.10+ (for Vision)

1. Clone & Install

git clone https://github.com/ankan123basu/DRONE-X.git
cd DRONE-X

2. Start Command Server (The Brain)

cd server
npm install
npm run dev
# Runs on Port 3001

3. Start Dashboard (The Interface)

cd client
npm install
npm run dev
# Runs on Port 5173

4. Start Vision Engine (Rudraksha)

cd vision-service
pip install -r requirements.txt
python main.py
# Runs on Port 5000

👨‍💻 Contributors

Team Antigravity - Building the future of Autonomous Defense.

"Nabhah Sprsham Diptam" (Touch the Sky with Glory)


© 2025 PROJECT SUDARSHAN

About

Next-gen airspace defense grid prototype combining AI vision, RF warfare, GPS spoofing, swarm dynamics, and interception physics
