Support Utilities¶
dnallm.utils.support ¶
Functions¶
is_flash_attention_capable ¶
is_flash_attention_capable()
Check if Flash Attention has been installed.

Returns:

| Type | Description |
|---|---|
| `bool` | `True` if Flash Attention is installed and the device supports it, `False` otherwise |
Source code in dnallm/utils/support.py
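The collapsed source listing is not reproduced here. As a rough sketch of logic that would satisfy the documented contract (the `flash_attn` package is installed and the CUDA device supports it), something like the following could work; the import probe and the Ampere-or-newer compute-capability threshold are assumptions, not taken from `dnallm/utils/support.py`:

```python
# Sketch only: the real implementation in dnallm/utils/support.py may differ.
import importlib.util

import torch


def is_flash_attention_capable() -> bool:
    """Return True if flash-attn is installed and the GPU can use it (assumed logic)."""
    # Flash Attention needs the flash_attn package to be importable.
    if importlib.util.find_spec("flash_attn") is None:
        return False
    # It also needs a CUDA device; the SM 8.0 (Ampere) threshold below is an
    # assumption about "the device supports it", not taken from the source.
    if not torch.cuda.is_available():
        return False
    major, _ = torch.cuda.get_device_capability()
    return major >= 8
```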
is_fp8_capable ¶
is_fp8_capable()
Check if the current CUDA device supports FP8 precision.
Returns:

| Type | Description |
|---|---|
| `bool` | `True` if the device supports FP8 (compute capability >= 9.0), `False` otherwise |
Source code in dnallm/utils/support.py
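The rendered page omits the function body, but the documented contract (compute capability >= 9.0, i.e. Hopper-class GPUs or newer) suggests a check along these lines; this is a sketch consistent with the docstring, not the actual implementation in `dnallm/utils/support.py`:

```python
# Sketch only: mirrors the documented behaviour, not the verbatim source.
import torch


def is_fp8_capable() -> bool:
    """Return True if the current CUDA device supports FP8 (compute capability >= 9.0)."""
    if not torch.cuda.is_available():
        return False
    major, minor = torch.cuda.get_device_capability()
    # Hopper (e.g. H100) and newer devices report compute capability 9.0 or higher.
    return (major, minor) >= (9, 0)
```

A typical call site would gate an FP8 code path on this check, falling back to a lower-precision default such as bfloat16 when it returns `False`.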