[IEEE/RSJ IROS 25] This repository is the official code for MARSCalib: Multi-robot, Automatic, Robust, Spherical Target-based Extrinsic Calibration in Field and Extraterrestrial Environments.
Seokhwan Jeong, Hogyun Kim, Younggun Cho†
- 🛠️ Prerequisites
- 📷 Data Acquisition
- ✏️ Introduction
- ✉️ Contact
🛠️ Prerequisites
- PCL
- OpenCV
- nlohmann-json
  ```
  sudo apt-get install nlohmann-json3-dev
  ```
- Segment Anything (SAM)
  ```
  pip install git+https://github.com/facebookresearch/segment-anything.git
  ```
Sample dataset
- https://drive.google.com/drive/u/2/folders/1BitH-WxJ1EHX8qE2x8W38zRt9uzFAeM7
- There are three datasets, each recorded with a different LiDAR: OS1-32, Mid-360, and MLX-120.
📷 Data Acquisition
- The camera's intrinsic parameters must be known in advance.
- Two topics are required in the bag: one for images (sensor_msgs/Image) and one for point clouds (sensor_msgs/PointCloud2).
- Place the spherical target within about 30 cm, ensuring it is visible to both sensors. (If placed too far, the sphere may not be captured by the LiDAR.)
- Keep the two sensors (camera and LiDAR) and the spherical target ❗stationary❗ while data is collected for more than 40 seconds.
- After acquiring the data, arrange the bags as follows (recommended: at least 10 samples).
  ```
  📂 Dataset
   ┣ 1.db3  (any file name is fine)
   ┣ 2.db3
   ┣ 3.db3
   ┣ 4.db3
   ...
  ```
0. Create workspace & download SAM model weight
- Create a ROS 2 workspace and build the package:
  ```
  cd ~/ros2_ws/src
  git clone https://github.com/sparolab/MARSCalib
  cd .. && colcon build && source install/setup.bash
  ```
- Download the SAM model weight
  - Default ViT-H model: https://dl.fbaipublicfiles.com/segment_anything/sam_vit_h_4b8939.pth
  - To use another model, refer to the relevant page: https://github.com/facebookresearch/segment-anything
- Place the downloaded model (.pth) file in `./marscalib/segment_anything/model/`.
1. Preprocess
- Extract images and an accumulated point cloud from the rosbags.
- There are two ways:
  - 1️⃣ To automatically detect the image and point cloud topics:
    ```
    ros2 run marscalib preprocess <dataset location> -a
    ```
  - 2️⃣ To manually specify topics when multiple are present:
    ```
    ros2 run marscalib preprocess <dataset location> \
      --image_topic <image_topic> \
      --points_topic <points_topic>
    ```
- Example:
  ```
  ros2 run marscalib preprocess ~/data/sphere \
    --image_topic /camera/color/image_raw \
    --points_topic /ouster/points
  ```
- ❗Fill in the camera's intrinsic parameters in the generated preprocess JSON file.
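The exact schema of the preprocess JSON file is not shown here; as an illustration only, a pinhole intrinsics block might look like the following (all field names and values are hypothetical placeholders — use the keys your generated file actually contains):

```json
{
  "fx": 615.0,
  "fy": 615.0,
  "cx": 320.0,
  "cy": 240.0,
  "distortion": [0.0, 0.0, 0.0, 0.0, 0.0]
}
```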
2. SAM
- Extract mask images from the raw images.
  ```
  ros2 run marscalib amg.py \
    --checkpoint <model checkpoint location> \
    --model-type <model type> \
    --input <preprocess folder location>
  ```
- Example:
  ```
  ros2 run marscalib amg.py \
    --checkpoint ~/ros2_ws/src/marscalib/segment_anything/model/sam_vit_h_4b8939.pth \
    --model-type vit_h \
    --input ~/data/sphere_preprocess
  ```
3. Camera ellipse center detection
- Detect ellipses in the mask images, then extract the center of each ellipse.
  ```
  ros2 run marscalib camera <preprocess folder location>
  ```
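The repository's own detector is not reproduced here, but conceptually the center of a filled elliptical mask can be recovered from the mask pixels alone. A minimal NumPy sketch (the centroid of a filled ellipse coincides with the ellipse center, so it stands in for a full ellipse fit):

```python
import numpy as np

def mask_center(mask: np.ndarray) -> tuple:
    """Approximate the ellipse center as the centroid of the mask pixels.

    For a filled ellipse, the pixel centroid coincides with the
    geometric center, so this is a reasonable stand-in for a full
    ellipse fit on clean masks.
    """
    ys, xs = np.nonzero(mask)
    return float(xs.mean()), float(ys.mean())

# Synthetic mask: a filled ellipse centered at (60, 40), semi-axes 20 and 10.
h, w = 80, 120
yy, xx = np.mgrid[0:h, 0:w]
mask = ((xx - 60) / 20.0) ** 2 + ((yy - 40) / 10.0) ** 2 <= 1.0

cx, cy = mask_center(mask)
print(round(cx, 1), round(cy, 1))  # 60.0 40.0
```

On real SAM masks an ellipse fit (e.g., via OpenCV) is more robust to partial occlusion than a plain centroid.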
4. Range image generation & Hough transform
- Generate a range image from the accumulated point cloud and search for a circle in the range image. The points inside the detected circle constitute the sphere. This process may take some time.
- Enter the LiDAR type:
  - o : ouster
  - m : mid360
  - s : mlx
  ```
  ros2 run marscalib hough.py <preprocess folder location> <LiDAR type>
  ```
- Example:
  ```
  ros2 run marscalib hough.py ~/data/sphere_preprocess o
  ```
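For intuition, a range image is a spherical projection of the point cloud: each pixel bins points by azimuth and elevation and stores their distance. A minimal sketch (the image size and vertical field of view are assumptions — they depend on the LiDAR model, which is why the tool asks for the LiDAR type):

```python
import numpy as np

def range_image(points: np.ndarray, h: int = 32, w: int = 1024,
                v_fov=(-22.5, 22.5)) -> np.ndarray:
    """Project an (N, 3) point cloud into an H x W range image.

    Each pixel stores the range of the point falling into that
    azimuth/elevation bin (0 where no point lands).
    """
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    r = np.linalg.norm(points, axis=1)
    az = np.arctan2(y, x)                                   # [-pi, pi]
    el = np.arcsin(np.clip(z / np.maximum(r, 1e-9), -1.0, 1.0))
    col = ((az + np.pi) / (2 * np.pi) * (w - 1)).astype(int)
    lo, hi = np.radians(v_fov)
    row = ((hi - el) / (hi - lo) * (h - 1)).astype(int)     # top row = max elevation
    img = np.zeros((h, w))
    valid = (row >= 0) & (row < h)
    img[row[valid], col[valid]] = r[valid]
    return img

# A single point 5 m straight ahead at 0 deg elevation.
img = range_image(np.array([[5.0, 0.0, 0.0]]))
print(img.max())  # 5.0
```

In this representation the spherical target appears as a filled circle of similar range values, which is what the Hough transform then searches for.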
5. LiDAR sphere center detection
- Detect the center of the sphere.
- Enter the LiDAR type:
  - o : ouster
  - m : mid360
  - s : mlx
- Enter the target's radius.
- To observe the output of every stage, add `-v` to the command line.
  ```
  ros2 run marscalib lidar <preprocess folder location> <LiDAR type> <target's radius>
  ```
- Example:
  ```
  ros2 run marscalib lidar ~/data/sphere_preprocess o 0.1 -v
  ```
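The repository's estimator is not reproduced here, but the underlying idea — recovering a sphere center from surface points — has a standard linear least-squares form, sketched below with synthetic data:

```python
import numpy as np

def fit_sphere(points: np.ndarray):
    """Least-squares sphere fit (algebraic form).

    For points p on a sphere with center c and radius r:
        |p|^2 = 2 c . p + (r^2 - |c|^2),
    which is linear in c and in the scalar s = r^2 - |c|^2.
    """
    A = np.hstack([2 * points, np.ones((len(points), 1))])
    b = (points ** 2).sum(axis=1)
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = w[:3]
    radius = float(np.sqrt(w[3] + center @ center))
    return center, radius

# Synthetic test: points sampled on a 0.1 m sphere centered at (1, 2, 3).
rng = np.random.default_rng(0)
d = rng.normal(size=(200, 3))
d /= np.linalg.norm(d, axis=1, keepdims=True)
pts = np.array([1.0, 2.0, 3.0]) + 0.1 * d
c, r = fit_sphere(pts)
print(np.round(c, 3), round(r, 3))  # center ~ (1, 2, 3), radius ~ 0.1
```

Knowing the target's true radius (the command-line argument) lets the real pipeline reject fits whose estimated radius disagrees with the physical target.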
6. [R|t] calculation
- Calculate the transformation matrix from the 2D-3D center pairs.
  ```
  ros2 run marscalib rt <preprocess folder location>
  ```
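The solver itself is not shown here, but the constraint it exploits is the pinhole reprojection model: a correct [R|t] maps each LiDAR sphere center onto its matching image ellipse center. A minimal sketch of that projection (the intrinsic values below are hypothetical placeholders):

```python
import numpy as np

def project(points_lidar: np.ndarray, R: np.ndarray,
            t: np.ndarray, K: np.ndarray) -> np.ndarray:
    """Project 3D points from the LiDAR frame into the image using [R|t].

    points_lidar: (N, 3) sphere centers in the LiDAR frame.
    R, t: extrinsic rotation (3x3) and translation (3,), LiDAR -> camera.
    K: 3x3 camera intrinsic matrix.
    """
    p_cam = points_lidar @ R.T + t       # LiDAR frame -> camera frame
    uvw = p_cam @ K.T                    # pinhole projection
    return uvw[:, :2] / uvw[:, 2:3]      # normalize to pixel coordinates

# Placeholder intrinsics and an identity extrinsic, for illustration.
K = np.array([[600.0, 0.0, 320.0],
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.zeros(3)
uv = project(np.array([[0.5, 0.0, 5.0]]), R, t, K)
print(uv)  # [[380. 240.]]
```

Minimizing the distance between these projections and the detected 2D ellipse centers over all samples yields the extrinsic calibration.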
✉️ Contact
- Seokhwan Jeong (eric5709@inha.edu)
- Hogyun Kim (hg.kim@inha.edu)

