manishdhakal/GFT

GFT: Graph Feature Tuning for Efficient Point Cloud Analysis

Accepted at: WACV 2026

ArXiv: arxiv:2511.10799

Abstract

Parameter-efficient fine-tuning (PEFT) significantly reduces computational and memory costs by updating only a small subset of the model's parameters, enabling faster adaptation to new tasks with minimal loss in performance. Previous studies have introduced PEFTs tailored for point cloud data, as general approaches are suboptimal. To further reduce the number of trainable parameters, we propose a point-cloud-specific PEFT, termed Graph Features Tuning (GFT), which learns a dynamic graph from initial tokenized inputs of the transformer using a lightweight graph convolution network and passes these graph features to deeper layers via skip connections and efficient cross-attention modules. Extensive experiments on object classification and segmentation tasks show that GFT operates in the same domain, rivalling existing methods, while reducing the trainable parameters.

Table of Contents

Methodology

GFT

Setup

Environment Setup

Please refer to the IDPT repo for environment setup.

Script for additional libraries:

pip install -r requirements.txt

Note: If any libraries are missing, please install them accordingly.

Pretrained Model Setup

Download pretrained models from:

ACT: here
Point-BERT: here
Point-MAE: here

Save those models to follow the structure given below:

pretrained/
├── ACT/
│   └── pretrained.pth
├── Point-BERT/
│   └── pretrained.pth
└── Point-MAE/
    └── pretrained.pth
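The layout above can be prepared ahead of time with a short shell snippet. This is only a sketch: it assumes you have already downloaded the checkpoints and will rename each one to `pretrained.pth` yourself.

```shell
# Create the expected directory layout for the pretrained checkpoints.
mkdir -p pretrained/ACT pretrained/Point-BERT pretrained/Point-MAE

# After downloading, move each checkpoint into place, e.g. (hypothetical filename):
# mv ~/Downloads/act_checkpoint.pth pretrained/ACT/pretrained.pth
```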

Dataset Setup

Please follow the instructions from DATASET.md.

Finetuning

Baselines

Baselines are reproduced from DAPT and IDPT with the stronger augmentation strategy of ACT.

Ours

All of the training scripts for classification and segmentation are in the scripts/ directory.

E.g., the script for reproducing all of the OBJ_BG results with the Point-MAE checkpoint:

bash scripts/objbg.sh

To run experiments with other pretrained models, change `--ckpts` in the scripts; `--exp_name` can be any name for your log directory. Seeds are integers in the range [0, 9].

OR, run the command given below with needed changes.

CUDA_VISIBLE_DEVICES=0 \
python main.py \
    --config cfgs/gft/finetune_scan_objbg.yaml \
    --ckpts pretrained/Point-MAE/pretrained.pth \
    --finetune_model \
    --exp_name point_mae \
    --seed 0
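Since seeds range over [0, 9], the command above can be wrapped in a loop to sweep all ten seeds for one checkpoint. This is a hedged sketch, not part of the provided scripts: it echoes each command as a dry run (drop the leading `echo` to actually launch the runs), and the per-seed `--exp_name` values are made-up examples.

```shell
# Dry-run sweep over seeds 0..9 for the Point-MAE checkpoint on OBJ_BG.
# Remove "echo" to execute the runs instead of printing them.
for seed in $(seq 0 9); do
  echo CUDA_VISIBLE_DEVICES=0 \
    python main.py \
    --config cfgs/gft/finetune_scan_objbg.yaml \
    --ckpts pretrained/Point-MAE/pretrained.pth \
    --finetune_model \
    --exp_name "point_mae_seed${seed}" \
    --seed "${seed}"
done
```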

Results

GFT-Results

BibTeX Citation

@inproceedings{dhakal2026gft,
  title = {GFT: Graph Feature Tuning for Efficient Point Cloud Analysis},
  author = {Dhakal, Manish and Dasari, Venkat R. and Sunderraman, Raj and Ding, Yi},
  booktitle = {Proceedings of the  Winter Conference on Applications of Computer Vision (WACV)},
  month = {March},
  year = {2026},
}

Acknowledgement

Many thanks to the following repositories, which provided an established training pipeline and other utility code: DAPT, IDPT, Point-BERT, Point-MAE, ACT, and Pointnet2_PyTorch.
