Contact Info: Yan Miao (yanmiao2@illinois.edu)
This repository provides FalconGym, a photorealistic simulation environment consisting of three tracks: circle, U-turn, and lemniscate (figure-8). Each track consists of 4 gates arranged in different shapes.
This repository is intended for students working with FalconGym, specifically in ECE 484, to develop and evaluate drone control policies in a photorealistic simulation environment.
For more details on FalconGym, please refer to our paper and video. The following is the demo video of the circle track using a vision-based controller: demo video.
To cite our work, you can use
@InProceedings{Miao:IROS2025,
author = {Yan Miao and Will Shen and Sayan Mitra},
title = {Zero-Shot Sim-to-Real Visual Quadrotor Control with Hard Constraints},
booktitle = {IEEE/RSJ International Conference on Intelligent Robots and Systems},
year = {2025},
keywords = {robotics, NeRF, aerial},
address = {Hangzhou, China},
month = {October}
}
Update 5/12/2025: We updated the 3 tracks, originally in NeRF, to have a GSplat version, which renders a frame in ~0.005 s (a ~50x speedup over NeRF, with better image quality). The gate positions are yet to be determined. If you wish to use the latest GSplat setup, use this GSplat Link rather than the link below in Step 3.
- Follow the tutorial on NeRFStudio to install both Conda and NeRFStudio
- Clone this repository using Git:
conda activate nerfstudio
git clone https://github.com/IllinoisReliableAutonomyGroup/FalconGym.git
cd FalconGym
pip install -r requirements.txt
- Download the track configuration files from Yan's Google Drive and place them in the following folder hierarchy:
- FalconGym/
  - scripts/ (from this GitHub repo)
  - circle/
  - uturn/
  - lemniscate/
  - outputs/
    - FalconGym/
- To visually inspect a track, you can run
source ~/miniconda3/bin/activate
conda activate nerfstudio
ns-viewer --load-config outputs/circle/nerfacto/circle/config.yml
Then open the web GUI using the link printed in the terminal
NOTE: To visualize the lemniscate or U-turn track, update the command as follows
ns-viewer --load-config outputs/'TRACK'/nerfacto/'TRACK'/config.yml
Replace TRACK with either lemniscate or uturn as needed.
This section describes the scripts available in FalconGym. Each script has specific functionality for drone simulation, control, and evaluation.
- Purpose: `drone_dynamics.py` simulates the drone dynamics.
- Input:
  - State: `(x, y, z, vx, vy, vz, yaw)`
  - Control: `(ax, ay, az, yaw_rate)`
- Output: Next state
- Notes: Keep `dt = 0.05` s.
- Example Usage: `python3 drone_dynamics.py`
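As a rough sketch of what one dynamics step computes, here is a forward-Euler double-integrator model with a yaw-rate input. This is an assumption for illustration only; the provided `drone_dynamics.py` may integrate differently, so treat it as a mental model rather than a reimplementation.

```python
DT = 0.05  # simulation step used throughout FalconGym

def step(state, control, dt=DT):
    """One forward-Euler step of a double-integrator quadrotor model.

    state   = (x, y, z, vx, vy, vz, yaw)
    control = (ax, ay, az, yaw_rate)
    """
    x, y, z, vx, vy, vz, yaw = state
    ax, ay, az, yaw_rate = control
    return (x + vx * dt, y + vy * dt, z + vz * dt,   # positions advance with old velocity
            vx + ax * dt, vy + ay * dt, vz + az * dt,  # velocities advance with acceleration
            yaw + yaw_rate * dt)

# Hovering drone commanded with 1 m/s^2 along x for one step
s = step((0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0), (1.0, 0.0, 0.0, 0.0))
```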
- Purpose: `ns_renderer.py` gives an environment overview. Use it to get a 3D view of each environment; you can also use it to create a training dataset for gate detection.
- Input: Camera pose `(x, y, z, roll, pitch, yaw)`
- Output: RGB image (640x480x3)
- Notes: Modify the track path accordingly.
- Example Usage: `python3 scripts/ns_renderer.py --track_name TRACK` (note the file hierarchy; TRACK = circle, lemniscate, or uturn)
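NeRF renderers work with a camera-to-world matrix rather than raw Euler angles, so a pose `(x, y, z, roll, pitch, yaw)` is typically converted first. A minimal sketch of that conversion, assuming a ZYX (yaw-pitch-roll) Euler convention; check `ns_renderer.py` for the convention it actually uses:

```python
import numpy as np

def pose_to_c2w(x, y, z, roll, pitch, yaw):
    """Build a 4x4 camera-to-world matrix from position + Euler angles.

    ZYX convention assumed here: R = Rz(yaw) @ Ry(pitch) @ Rx(roll).
    """
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    c2w = np.eye(4)
    c2w[:3, :3] = Rz @ Ry @ Rx   # rotation block
    c2w[:3, 3] = [x, y, z]       # translation block
    return c2w

M = pose_to_c2w(1.0, 2.0, 3.0, 0.0, 0.0, 0.0)
```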
- TODO: Write your gate detection algorithm here.
- Purpose: `ece484-gate-detection.py` detects gates and generates a segmented binary mask of the gate.
- Input: RGB image (640x480x3)
- Output: Mask (640x480)
- Example Usage: `python3 ece484-gate-detection.py`
- Notes: The functions in this file will be used by `ece484-vision-controller.py` and `ece484-vision-closed-loop.py`.
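A minimal baseline to start from is a pure color threshold. This assumes the gates are distinctly red in the rendered images, which you should verify in the viewer; the threshold values below are placeholders to tune per track, and a learned or HSV-based detector will likely be more robust.

```python
import numpy as np

def detect_gate_mask(rgb):
    """Return a binary gate mask (H, W) from an RGB image (H, W, 3).

    Naive color-threshold sketch: keep pixels that are strongly red.
    Cast to int16 first so channel differences cannot wrap around uint8.
    """
    r = rgb[..., 0].astype(np.int16)
    g = rgb[..., 1].astype(np.int16)
    b = rgb[..., 2].astype(np.int16)
    mask = (r > 120) & (r - g > 40) & (r - b > 40)
    return mask.astype(np.uint8) * 255

# Synthetic 480x640 frame with a red square standing in for a gate
img = np.zeros((480, 640, 3), dtype=np.uint8)
img[100:200, 200:300] = (200, 30, 30)
mask = detect_gate_mask(img)
```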
- TODO: Write your state controller algorithm here. `ece484-state-controller.py` steers the drone by commanding acceleration and yaw rate based on the drone's current state.
- Input: State `(x, y, z, vx, vy, vz, yaw)` + gate poses
- Output: Control `(ax, ay, az, yaw_rate)`
- Notes: The controller function in this file will be used by `ece484-state-closed-loop.py`.
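One simple starting point is a PD controller that accelerates toward the next gate center and yaws to face it. A sketch, where the gains and the `(gx, gy, gz)` gate input format are illustrative assumptions, not tuned values:

```python
import numpy as np

def state_controller(state, gate, k_p=2.0, k_d=1.5, k_yaw=1.0):
    """Toy PD controller toward the next gate center.

    state = (x, y, z, vx, vy, vz, yaw); gate = (gx, gy, gz).
    Returns (ax, ay, az, yaw_rate). Gains are illustrative only.
    """
    x, y, z, vx, vy, vz, yaw = state
    gx, gy, gz = gate
    ax = k_p * (gx - x) - k_d * vx   # proportional pull, derivative damping
    ay = k_p * (gy - y) - k_d * vy
    az = k_p * (gz - z) - k_d * vz
    # Yaw the nose toward the gate, with the error wrapped to [-pi, pi]
    yaw_err = np.arctan2(gy - y, gx - x) - yaw
    yaw_err = (yaw_err + np.pi) % (2 * np.pi) - np.pi
    return ax, ay, az, k_yaw * yaw_err

u = state_controller((0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0), (2.0, 0.0, 1.0))
```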
- TODO: Write your vision controller algorithm here. `ece484-vision-controller.py` steers the drone by commanding acceleration and yaw rate based on the drone's first-person-view (FPV) image.
- Input:
  - RGB image (640x480x3)
  - Binary mask (640x480), taken from `ece484-gate-detection.py`
- Output: Control `(ax, ay, az, yaw_rate)`
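A crude starting point is centroid servoing on the gate mask: steer so the gate stays centered in the image while accelerating forward. A sketch where the gains are illustrative placeholders; note that the constant forward acceleration here is along the body x-axis, which you would still need to rotate into the frame expected by the dynamics:

```python
import numpy as np

def vision_controller(mask, k_yaw=0.002, k_z=0.01, a_fwd=0.5):
    """Map a (480, 640) binary gate mask to (ax, ay, az, yaw_rate).

    The horizontal centroid offset drives yaw rate, the vertical offset
    drives vertical acceleration, and a constant forward acceleration
    keeps the drone moving. If no gate pixels are visible, coast.
    """
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return 0.0, 0.0, 0.0, 0.0
    cx, cy = xs.mean(), ys.mean()
    yaw_rate = -k_yaw * (cx - mask.shape[1] / 2)  # center the gate horizontally
    az = -k_z * (cy - mask.shape[0] / 2)          # center the gate vertically
    return a_fwd, 0.0, az, yaw_rate

# Gate already centered in the image: fly straight ahead
m = np.zeros((480, 640), dtype=np.uint8)
m[240, 320] = 1
ctrl = vision_controller(m)
```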
- Purpose: `ece484-state-closed-loop.py` runs `ece484-state-controller.py` in closed loop. It simulates the drone's dynamics and saves the trajectory to a text file.
- Input:
  - State: `(x, y, z, vx, vy, vz, yaw)`, taken from `drone_dynamics.py`
  - Control: `(ax, ay, az, yaw_rate)`, taken from `ece484-state-controller.py`
- Output: A trajectory.txt file containing (x, y, z, yaw)
- Example Usage: `python3 ece484-state-closed-loop.py --track-name TRACK` (TRACK = Circle_Track, Uturn_Track, or Lemniscate_Track)
- Notes:
  - Use this trajectory file to evaluate your controller with `ece484_evaluate.py`.
  - No edits required; run this after finishing `ece484-state-controller.py`.
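The closed loop itself is just controller → dynamics → log, repeated. A self-contained skeleton of that structure; the trivial placeholder controller, the Euler dynamics, and the space-separated `x y z yaw` line format are all assumptions, so match whatever the provided scripts actually use:

```python
DT = 0.05  # one frame per step, so lap time = DT * number of frames

def step(state, control, dt=DT):
    """Placeholder Euler dynamics standing in for drone_dynamics.py."""
    x, y, z, vx, vy, vz, yaw = state
    ax, ay, az, yaw_rate = control
    return (x + vx * dt, y + vy * dt, z + vz * dt,
            vx + ax * dt, vy + ay * dt, vz + az * dt,
            yaw + yaw_rate * dt)

def controller(state):
    """Placeholder standing in for your state controller."""
    return (1.0, 0.0, 0.0, 0.0)  # constant forward acceleration

state = (0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0)
rows = []
for _ in range(100):  # 100 steps = 5 simulated seconds
    state = step(state, controller(state))
    rows.append(f"{state[0]:.4f} {state[1]:.4f} {state[2]:.4f} {state[6]:.4f}")

with open("trajectory.txt", "w") as f:
    f.write("\n".join(rows))
```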
- Purpose: `ece484-vision-closed-loop.py` runs `ece484-vision-controller.py` in closed loop, using the gate detection algorithm from `ece484-gate-detection.py` and the vision controller from `ece484-vision-controller.py`.
- Input:
  - State: `(x, y, z, vx, vy, vz, yaw)`, taken from `drone_dynamics.py`
  - Control: `(ax, ay, az, yaw_rate)`, taken from `ece484-vision-controller.py`
- Output:
  - An image folder
  - A trajectory.txt file containing (x, y, z, yaw)
  - An MP4 video
- Example Usage: `python3 ece484-vision-closed-loop.py --track-name TRACK` (TRACK = Circle_Track, Uturn_Track, or Lemniscate_Track)
- Notes:
  - Use this trajectory file to evaluate your controller with `ece484_evaluate.py`.
  - No edits required; run this after finishing `ece484-vision-controller.py`.
- Purpose: `ece484_evaluate.py` evaluates controller performance on each track using the metrics MGE, LT, and SR, plus a trajectory visualization.
- Input: The trajectory.txt file generated by `ece484-vision-closed-loop.py` or `ece484-state-closed-loop.py`
- Output:
  - A metrics.json file
  - A plot of the trajectory
- Example Usage: `python3 ece484_evaluate.py --track-name Circle_Track --trajectory-path circle_traj.txt --visflag True --metricsflag True` (takes the track name and trajectory txt file as arguments; the two flags are optional)
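For intuition about the metrics, here is a sketch of how LT and a simplified gate error could be computed from a logged trajectory. The real `ece484_evaluate.py` may use gate-crossing planes rather than the nearest-point distance used here, so treat this only as an approximation of MGE:

```python
import numpy as np

def lap_time(n_frames, dt=0.05):
    """LT = dt * number of frames in the logged trajectory."""
    return dt * n_frames

def gate_errors(traj_xyz, gates_xyz):
    """For each gate, distance from the gate center to the closest
    trajectory point (simplified stand-in for MGE)."""
    traj = np.asarray(traj_xyz, dtype=float)
    return [float(np.min(np.linalg.norm(traj - g, axis=1)))
            for g in np.asarray(gates_xyz, dtype=float)]

# Three logged waypoints, one gate slightly off the path
errs = gate_errors([[0, 0, 1], [1, 0, 1], [2, 0, 1]], [[1, 0.1, 1]])
lt = lap_time(220)
```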
- Purpose: `ece484_videogenerator.py` generates a video from images of the scene.
- Input: Folder containing PNG images
- Output: MP4 video file
- Example Usage: `python3 ece484_videogenerator.py --input ./closed_loop/images --output ./track_vision_trajectory.mp4 --fps 20`
- Notes:
  - `--input`: Path to the image folder.
  - `--output`: Path to save the video file.
  - `--fps`: Frames per second (default: 20).
- Task 1: State-Based Controller
  - Implement `ece484-state-controller.py` to navigate 2 laps (8 gates) in each of the three tracks.
  - Use `gates_pos.txt` for the gate locations.
  - Run `ece484-state-closed-loop.py` to generate a trajectory file.
  - Evaluate using `ece484_evaluate.py`, reporting:
    - SR (Success Rate): % of gates successfully crossed.
    - MGE (Mean Gate Error): average distance from the gate center.
    - LT (Lap Time): `0.05 * #frames`.
  - Benchmark: see below.
- Task 2: Gate Detection
  - Implement `ece484-gate-detection.py`.
  - Collect an image dataset using `ns_renderer.py`.
  - You should demonstrate around 100 images of different gates across the tracks (obtained by sampling with `ns_renderer.py`) on which your gate detection is perfect under visual inspection.
  - For Yan's benchmark, check `gate-detect-Yan-example/`.
- Task 3: Localization / SLAM
  - You can freestyle: create or modify anything. The goal is to build on Task 2 to achieve:
    - Input: RGB image
    - Output: gate pose relative to the camera
  - Reference: GateNet
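For the gate-relative pose, one lightweight option (short of full SLAM or a learned GateNet-style regressor) is a pinhole-model range/bearing estimate from the detected mask. A sketch in which the gate width, focal length, and principal point are placeholder values you would need to measure or calibrate; a fuller solution would locate the four gate corners and use a PnP solver:

```python
import numpy as np

def gate_range_bearing(mask, gate_width_m=1.0, fx=320.0, cx=320.0):
    """Rough relative pose of the gate from a binary gate mask.

    Pinhole model: an object of width W appearing w_px pixels wide at
    focal length fx sits at depth Z = fx * W / w_px. The bearing is the
    horizontal angle from the optical axis to the mask centroid.
    Returns (depth_m, bearing_rad), or None if no gate is visible.
    """
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    w_px = xs.max() - xs.min()                  # apparent gate width in pixels
    depth = fx * gate_width_m / w_px
    bearing = np.arctan2(xs.mean() - cx, fx)    # horizontal angle to gate
    return depth, bearing

# Synthetic mask: a 200-px-wide gate centered in a 640x480 frame
m = np.zeros((480, 640), dtype=np.uint8)
m[200:300, 220:421] = 1
d, b = gate_range_bearing(m)
```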
- Task 4: Vision-Based Control
  - Implement `ece484-vision-controller.py` using insights from Tasks 2 & 3.
  - Run `ece484-vision-closed-loop.py` to generate a trajectory file and images.
  - Use `ece484_videogenerator.py` to generate a video from the output images.
  - Evaluate using `ece484_evaluate.py`, reporting SR, MGE, and LT.
| Track | SR | MGE (cm) | LT (s) |
|---|---|---|---|
| Circle | 100% | 2.47 | 11 |
| Lemniscate | 100% | 5.11 | 15 |
| Uturn | 100% | 3.42 | 7 |
| Track | SR | MGE (cm) | LT (s) |
|---|---|---|---|
| Circle | 100% | 6.25 | 11 |
| Lemniscate | 100% | 5.13 | 15 |
| Uturn | 100% | 10.1 | 7 |
Please include the following information in your final submission.
- TASK1: State Controller
  - CIRCLE: trajectory.txt, metrics.json
  - UTURN: trajectory.txt, metrics.json
  - LEMNISCATE: trajectory.txt, metrics.json
- TASK4: Vision Controller
  - CIRCLE: trajectory.txt, metrics.json, and MP4 video file
  - UTURN: trajectory.txt, metrics.json, and MP4 video file
  - LEMNISCATE: trajectory.txt, metrics.json, and MP4 video file
NOTE:
- trajectory.txt is generated by `ece484-state-closed-loop.py` or `ece484-vision-closed-loop.py`.
- metrics.json is generated by `ece484_evaluate.py`.
- The MP4 video file is generated by `ece484_videogenerator.py`.

