Hi,
can you please also provide a whl for flash attention 2.8.2 + cuda129 for python 3.12?
At the moment you only have one for python 3.13; it would be wonderful.
Thank you
ps: reason: the WanVideoWrapper plugin requires flash attention between 2.7.1 and 2.8.2, inclusive
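For anyone hitting the same issue, here is a minimal sketch (not part of WanVideoWrapper itself) to check whether the flash-attn build already installed falls inside that inclusive 2.7.1 to 2.8.2 range before troubleshooting further:

```python
from importlib.metadata import version, PackageNotFoundError
from packaging.version import Version

# Check the installed flash-attn against the range WanVideoWrapper
# reportedly requires: >= 2.7.1 and <= 2.8.2, inclusive.
try:
    installed = Version(version("flash-attn"))
    ok = Version("2.7.1") <= installed <= Version("2.8.2")
    print(f"flash-attn {installed}: {'OK' if ok else 'outside required range'}")
except PackageNotFoundError:
    print("flash-attn is not installed")
```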