This script lets you run predefined experiments for training and pruning several neural network models (LeNet-300, ResNet-50, VGG-19) on MNIST, CIFAR-10, CIFAR-100, and ImageNet, with a focus on exploring different model sparsities. Code is also included to train networks from scratch on CIFAR-10/100.
- Python 3.x
- Required dependencies: make sure all necessary packages are installed; they are listed in `requirements.txt`.
- PyTorch dependencies are not included in `requirements.txt`; install them from the official website.
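Since PyTorch is installed separately from `requirements.txt`, a quick sanity check like the following sketch can confirm the installation before launching an experiment (the `torch_status` helper is illustrative, not part of the repository):

```python
import importlib

def torch_status():
    """Report whether PyTorch is importable and whether a CUDA build is usable."""
    try:
        torch = importlib.import_module("torch")
    except ImportError:
        return "missing"  # torch was not installed from the official website
    return "cuda" if torch.cuda.is_available() else "cpu"

if __name__ == "__main__":
    print(torch_status())
```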
The script is controlled via command-line arguments to select and run specific experiments.
```shell
python entry.py --experiment <experiment_name> [--sparsity <percentage>]
```

Running the bottleneck experiment:
```shell
python entry.py --experiment lenet300_bottleneck
```

Running the convergence MNIST experiment:
```shell
python entry.py --experiment lenet300_convergence
```

Running ResNet-50 on CIFAR-100 at 98% sparsity:
```shell
python entry.py --experiment resnet50_cifar100 --sparsity 98
```

Running VGG-19 on CIFAR-10 at 95% sparsity:
```shell
python entry.py --experiment vgg19_cifar10 --sparsity 95
```

For the ImageNet experiments, we created predefined files to run for each sparsity, as these runs are more sensitive to regrowth and pruning. The commands are the following:
For 96.5% sparsity:
```shell
python run_experiments.py --experiment resnet50_imagenet_96.5
```

For 95% sparsity:
```shell
python run_experiments.py --experiment resnet50_imagenet_95
```

For 90% sparsity:
```shell
python run_experiments.py --experiment resnet50_imagenet_90
```
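Based on the commands above, the command-line interface of `entry.py` can be sketched as follows. This is a minimal illustration using `argparse`; the `EXPERIMENTS` registry and the runner functions are assumptions for the sketch, not the repository's actual code:

```python
import argparse

# Hypothetical registry mapping experiment names (taken from the commands
# above) to runner functions; the real entry.py may be organized differently.
EXPERIMENTS = {
    "lenet300_bottleneck": lambda s: "lenet300 bottleneck",
    "lenet300_convergence": lambda s: "lenet300 convergence",
    "resnet50_cifar100": lambda s: f"resnet50_cifar100 at {s}%",
    "vgg19_cifar10": lambda s: f"vgg19_cifar10 at {s}%",
}

def main(argv=None):
    parser = argparse.ArgumentParser(description="Run a predefined experiment.")
    parser.add_argument("--experiment", required=True, choices=sorted(EXPERIMENTS))
    parser.add_argument("--sparsity", type=float, default=None,
                        help="target sparsity in percent (optional)")
    args = parser.parse_args(argv)
    # Dispatch to the selected experiment, forwarding the sparsity target.
    return EXPERIMENTS[args.experiment](args.sparsity)

if __name__ == "__main__":
    print(main())
```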