
【flash-attention】Building wheel for flash-attn (pyproject.toml) did not run successfully


Error

Building wheel for flash-attn (pyproject.toml) did not run successfully

Solution

Method 1

git clone git@github.com:Dao-AILab/flash-attention.git
cd flash-attention
python setup.py install
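
If compilation runs out of memory on a machine with many CPU cores, limiting ninja's parallel jobs may help. The MAX_JOBS environment variable is respected by the ninja-based extension build; the value 4 below is just an illustrative choice:

MAX_JOBS=4 python setup.py install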

Note: the build may fail at this point with an error that flash-attention/csrc/cutlass cannot be found, because git failed to download cutlass during the clone. In that case, cd flash-attention/csrc/ and run git clone git@github.com:NVIDIA/cutlass.git

Re-run python setup.py install and the compilation will succeed. An equivalent fix using git submodules is sketched below.
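
Alternatively, cutlass is vendored as a git submodule of the flash-attention repository, so fetching submodules from the repository root should achieve the same result (a minimal sketch using standard git commands):

cd flash-attention
git submodule update --init --recursive
python setup.py install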
Method 2 (recommended)

Find the prebuilt wheel matching your configuration (see the version check below), for example:
cuda:12.2
torch:2.2
python:3.10
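
These versions can be read from the current environment before picking a wheel (a minimal sketch; it assumes torch is already installed, and torch.compiled_with_cxx11_abi() determines the cxx11abi tag in the filename):

python --version
python -c "import torch; print(torch.__version__, torch.version.cuda)"
python -c "import torch; print(torch.compiled_with_cxx11_abi())"

Then install the wheel whose filename tags match: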

pip install https://github.com/Dao-AILab/flash-attention/releases/download/v2.4.2/flash_attn-2.4.2+cu122torch2.2cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
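
After installation, a quick import check verifies that the wheel matches the environment (assuming the flash_attn package exposes __version__, which recent releases do):

python -c "import flash_attn; print(flash_attn.__version__)"

If the import fails with an undefined-symbol error, the wheel's torch/CUDA/ABI tags most likely don't match the installed torch.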