New software installed: flash-attention version 2.6.3-CUDA-12.1.1

Dear users, we have installed new software: flash-attention 2.6.3-CUDA-12.1.1. The module details are shown below, followed by a short note on loading and using it:


---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  flash-attention: flash-attention/2.6.3-CUDA-12.1.1
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
    Description:
      Fast and memory-efficient exact attention.


    You will need to load all module(s) on any one of the lines below before the "flash-attention/2.6.3-CUDA-12.1.1" module is available to load.

      GCC/12.3.0  OpenMPI/4.1.5
 
    Help:
      Description
      ===========
      Fast and memory-efficient exact attention.
      
      
      More information
      ================
       - Homepage: https://github.com/Dao-AILab/flash-attention
      
      
      Included extensions
      ===================
      flash-attention-2.6.3
      
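To use the new module, first load the toolchain modules listed above and then flash-attention itself, for example:

      module load GCC/12.3.0 OpenMPI/4.1.5
      module load flash-attention/2.6.3-CUDA-12.1.1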
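Once the module is loaded, the flash_attn Python package should be importable. Below is a minimal sketch of calling its fused attention kernel; it assumes a CUDA-enabled PyTorch is also available in your environment (for example via a separate module or your own virtual environment), and the tensor shapes are illustrative only:

      import torch
      from flash_attn import flash_attn_func  # provided by flash-attention/2.6.3-CUDA-12.1.1

      # Q, K, V in half precision on the GPU, shaped (batch, seqlen, num_heads, head_dim).
      q = torch.randn(2, 1024, 8, 64, device="cuda", dtype=torch.float16)
      k = torch.randn(2, 1024, 8, 64, device="cuda", dtype=torch.float16)
      v = torch.randn(2, 1024, 8, 64, device="cuda", dtype=torch.float16)

      # Exact (not approximate) attention computed with the fused FlashAttention-2 kernel.
      out = flash_attn_func(q, k, v, causal=True)
      print(out.shape)  # torch.Size([2, 1024, 8, 64])

Please contact us if you run into problems with the module.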
Best,
HPC team
