Representation Shift

[Figure: Representation Shift overview]

Official PyTorch implementation of the ICCV 2025 paper "Representation Shift: Unifying Token Compression with FlashAttention".

1. Setup

conda create -n rep_shift python=3.10
conda activate rep_shift
conda env update -n rep_shift -f rep_shift.yml

2. Evaluation

python main.py --data_path /path/to/data --eval --model deit_base --batch-size-eval 200 --use_flash True --drop_r "[0.2,0,0,0.2,0,0,0.2,0,0,0,0,0]"
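The `--drop_r` flag appears to set a per-block token drop ratio (12 entries matching DeiT-Base's 12 transformer blocks). The core idea of the paper is to score tokens by how much their representation changes through a block, which, unlike attention-based scores, stays compatible with FlashAttention. Below is a minimal NumPy sketch of that scoring-and-dropping step; the function names and CLS-token handling are illustrative assumptions, not the repository's actual API, which uses PyTorch.

```python
import numpy as np

def representation_shift_scores(x_in, x_out):
    # Score each token by the L2 norm of its change through a block
    # (illustrative stand-in for the paper's representation-shift metric).
    return np.linalg.norm(x_out - x_in, axis=-1)

def drop_tokens(x_in, x_out, drop_ratio):
    # Drop a `drop_ratio` fraction of the patch tokens with the smallest
    # representation shift; token 0 (assumed CLS) is always kept.
    n = x_out.shape[0]
    scores = representation_shift_scores(x_in, x_out)
    n_keep = n - int(round(drop_ratio * (n - 1)))      # never count CLS as droppable
    order = np.argsort(-scores[1:]) + 1                # patch tokens, descending shift
    keep = np.concatenate(([0], np.sort(order[:n_keep - 1])))
    return x_out[keep]

# Example: 5 tokens, 4-dim features; dropping half the patch tokens keeps 3 rows.
x_in = np.zeros((5, 4))
x_out = np.arange(20, dtype=float).reshape(5, 4)
kept = drop_tokens(x_in, x_out, drop_ratio=0.5)
```

A drop ratio of 0 for a block (most entries in the example command) leaves that block's tokens untouched, so compression cost is only paid at the selected layers.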

Citation

@inproceedings{choi2025representation,
  title={Representation Shift: Unifying Token Compression with FlashAttention},
  author={Choi, Joonmyung and Lee, Sanghyeok and Ko, Byungoh and Kim, Eunseo and Kil, Jihyung and Kim, Hyunwoo J.},
  booktitle={Proceedings of the IEEE/CVF International Conference on Computer Vision},
  year={2025}
}
