Hello,
Thank you for sharing the code for Degree-Quant!
I'm interested in applying Degree-Quant to heterogeneous graphs, specifically using RGCNs. I've replaced the standard Linear layers with your provided LinearQuantized layers and applied PyTorch's dynamic quantization after training.
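For concreteness, here is a minimal sketch of the kind of setup I mean. The layer below is a hypothetical stand-in for an RGCN layer (one Linear per relation plus a self-loop transform), not your actual implementation or PyG's RGCNConv; it just shows where the per-relation Linear layers sit and how post-training dynamic quantization is applied to them:

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for one RGCN layer: a separate Linear per relation
# plus a self-loop transform (names and structure are illustrative only).
class TinyRGCNLayer(nn.Module):
    def __init__(self, in_dim, out_dim, num_relations):
        super().__init__()
        self.rel_lins = nn.ModuleList(
            nn.Linear(in_dim, out_dim) for _ in range(num_relations)
        )
        self.self_lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, edge_index_per_rel):
        # x: [num_nodes, in_dim]; one [2, num_edges] edge index per relation
        out = self.self_lin(x)
        for lin, edge_index in zip(self.rel_lins, edge_index_per_rel):
            src, dst = edge_index
            # Aggregate relation-specific messages into the destination nodes.
            out = out.index_add(0, dst, lin(x[src]))
        return out

model = TinyRGCNLayer(16, 16, num_relations=3)
# Post-training dynamic quantization: every nn.Linear is replaced by a
# dynamically quantized int8 linear; activations are quantized on the fly.
qmodel = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)
```

In my actual experiments the Linear layers are your LinearQuantized modules inside the RGCN; the snippet above only mirrors the overall structure.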
While testing on the OGB-MAG dataset, accuracy appears to be preserved. However, I've observed a significant increase (~25%) in both inference time and memory usage.
Is this expected behavior for heterogeneous graphs with Degree-Quant? Any insights or suggestions you might have would be greatly appreciated.
If this approach is a viable option for heterogeneous graphs, I'd be happy to contribute code to support RGCNs in your repository.