We evaluated several model compression techniques: pruning, knowledge distillation (KD), KD combined with pruning, and dynamic KD (varying the distillation temperature). The models were compared on accuracy, parameter count, FLOPs, model size, training time, average inference time, and CO2 emissions.
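As a minimal sketch of the distillation objective underlying the KD and dynamic KD variants, the snippet below computes the temperature-scaled KD loss (KL divergence between teacher and student soft targets, scaled by T² as in Hinton et al.). It is illustrative only; the function and variable names are assumptions, not taken from this repository's code.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; a higher T produces a softer distribution
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def kd_loss(student_logits, teacher_logits, T=4.0):
    # KL divergence between teacher and student soft targets at
    # temperature T, scaled by T^2 so gradients stay comparable
    # across temperatures (dynamic KD varies T during training)
    p = softmax(teacher_logits, T)  # teacher soft targets
    q = softmax(student_logits, T)  # student soft predictions
    return float(T * T * np.sum(p * (np.log(p) - np.log(q))))
```

In practice this soft-target loss is mixed with the ordinary cross-entropy on the hard labels; dynamic KD simply changes `T` over the course of training rather than keeping it fixed.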
Satyajeet2000/Pruning_and_Knowledge_Distillation