[Compatibility] LLaVA-OneVision-1.5-8B-Instruct cannot run on transformers>=5.0.0
#5 opened about 22 hours ago by hua-zi
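For anyone blocked by this, a minimal guard sketch, assuming the breakage comes from transformers v5 API changes and that a 4.x release still works; neither is confirmed in the thread:

```python
# Hypothetical compatibility guard (assumption: the model's custom code
# breaks on transformers v5 and runs on a 4.x release, per the thread title).
import transformers
from packaging import version

if version.parse(transformers.__version__) >= version.parse("5.0.0"):
    raise RuntimeError(
        "LLaVA-OneVision-1.5-8B-Instruct is reported not to run on "
        'transformers>=5.0.0; install a 4.x release, e.g. pip install "transformers<5".'
    )
```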
fix: import `flash_attn_varlen_func` from `flash_attn` instead of `transformers.modeling_flash_attention_utils`
#4 opened 3 months ago by wincentIsMe
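The thread title already describes the change; a minimal sketch of what the proposed fix likely looks like (the exact patch in #4 may differ):

```python
# Sketch of the import change proposed in #4 (assumption: the actual diff may differ).
# transformers.modeling_flash_attention_utils is a private module that only
# conditionally re-exports the varlen kernel, so the re-export can disappear
# across transformers releases.

# Before (fragile, relies on a private transformers re-export):
# from transformers.modeling_flash_attention_utils import flash_attn_varlen_func

# After (import from the package that actually provides the kernel):
from flash_attn import flash_attn_varlen_func
```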
vLLM support?
#2 opened 7 months ago by TahirC
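For reference, a hypothetical sketch of what vLLM serving could look like if support lands; the repo id and the `trust_remote_code` flag are assumptions, and the thread is a feature request, not confirmation that this works today:

```python
# Hypothetical vLLM usage sketch (assumptions: vLLM supports this architecture,
# and the model lives at this repo id; neither is confirmed in the thread).
from vllm import LLM, SamplingParams

llm = LLM(
    model="lmms-lab/LLaVA-OneVision-1.5-8B-Instruct",  # assumed repo id
    trust_remote_code=True,  # custom modeling code would require this
)
outputs = llm.generate("Describe the image.", SamplingParams(max_tokens=64))
print(outputs[0].outputs[0].text)
```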