Hugging Face Forums
Enabling Flash Attention 2
🤗Transformers
saireddy
July 3, 2024, 2:41pm
@varadhbhatnagar Were you able to figure out the difference?
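For anyone landing on this thread later, a minimal sketch of how Flash Attention 2 is typically enabled in recent Transformers releases via the `attn_implementation` argument to `from_pretrained`. The model id, dtype, and device setup below are illustrative assumptions; the `flash-attn` package must be installed and the model run on a supported GPU in fp16 or bf16:

```python
# Minimal sketch: loading a model with Flash Attention 2 enabled in 🤗 Transformers.
# Assumptions: flash-attn is installed, a supported GPU is available, and the
# model id below is only a placeholder for whatever checkpoint you are using.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-7b-hf"  # placeholder model id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,                # FA2 requires fp16 or bf16 weights
    attn_implementation="flash_attention_2",   # request the Flash Attention 2 backend
    device_map="auto",
)

# Inspect which attention backend was actually selected (internal attribute,
# present in recent versions): expect "flash_attention_2" here.
print(model.config._attn_implementation)
```

Comparing this against the default (`attn_implementation="sdpa"` or `"eager"`) is one way to see where the two implementations differ in speed and memory use.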