# Core
torch>=2.0.0
transformers>=4.35.0
datasets>=2.14.0
accelerate>=0.24.0

# Metrics
scikit-learn>=1.3.0
scipy>=1.11.0

# Optional but recommended
trackio>=0.1.0

# FlashAttention (requires CUDA)
# flash-attn>=2.3.0

# For longer context / efficiency
# xformers>=0.0.22