
RTX 5070 needs CUDA 12.8+ and torch 2.7.0+ #40

@Midnight7777777

Description


I use an RTX 5070, which needs CUDA 12.8+ and torch 2.7.0+. I cannot find a usable version of flash-attention, since the prebuilt wheels only support cu122 and below. I also get the following error when configuring the environment:

ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
vla-adapter 0.0.1 requires torch==2.2.0, but you have torch 2.7.0+cu128 which is incompatible.
vla-adapter 0.0.1 requires torchaudio==2.2.0, but you have torchaudio 2.7.0+cu128 which is incompatible.
vla-adapter 0.0.1 requires torchvision==0.17.0, but you have torchvision 0.22.0+cu128 which is incompatible.
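Not an official fix, but a sketch of the usual workaround: install the cu128 builds of torch/torchvision/torchaudio from the PyTorch wheel index, then build flash-attention from source against them instead of using a prebuilt wheel. This assumes the `torch==2.2.0` / `torchvision==0.17.0` / `torchaudio==2.2.0` pins in the vla-adapter package metadata can be loosened (or the resulting conflict warning ignored); whether vla-adapter itself actually works on torch 2.7.0 is untested here.

```shell
# Install torch 2.7.0 built for CUDA 12.8 from the cu128 wheel index.
pip install torch==2.7.0 torchvision==0.22.0 torchaudio==2.7.0 \
    --index-url https://download.pytorch.org/whl/cu128

# Build flash-attention from source against the torch just installed.
# --no-build-isolation makes the build use the already-installed torch
# and your local CUDA 12.8 toolkit instead of a fresh build environment.
pip install flash-attn --no-build-isolation
```

The source build requires a local CUDA toolkit matching the torch build (12.8 here) and can take a long time; `MAX_JOBS=4 pip install flash-attn --no-build-isolation` limits parallel compile jobs if the machine runs out of memory.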
