Improve model card: add pipeline tag, links and usage instructions

#1 opened by nielsr (HF Staff)

This PR improves the model card for the Flux Attention Qwen3-8B model by:

  • Adding the text-generation pipeline tag for better discoverability.
  • Adding the library_name: transformers metadata.
  • Including links to the official paper, GitHub repository, and project website.
  • Adding a "Quick Start" usage section, with code derived from the GitHub repository, to help users load and run the custom hybrid attention architecture.
  • Including the official BibTeX citation.
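The "Quick Start" section added by this PR covers loading a model with a custom architecture. A minimal sketch of that pattern is below; the repo id is a placeholder (the real one is on the model page), and `trust_remote_code=True` is what lets `transformers` pick up the custom modeling code shipped in the repository:

```python
# Hedged sketch: loading a Hub model whose architecture code lives in its repo.
# MODEL_ID is a placeholder, not the actual repository id.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "your-org/flux-attention-qwen3-8b"  # placeholder repo id

def load_model(model_id: str = MODEL_ID):
    """Load tokenizer and model; trust_remote_code is required for
    custom architectures not built into the transformers library."""
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        trust_remote_code=True,
        torch_dtype="auto",   # use the dtype stored in the checkpoint
        device_map="auto",    # place weights on available devices
    )
    return tokenizer, model

if __name__ == "__main__":
    tokenizer, model = load_model()
    inputs = tokenizer("Hello, world!", return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=32)
    print(tokenizer.decode(out[0], skip_special_tokens=True))
```

Without `trust_remote_code=True`, loading would fail because the hybrid attention classes are defined in the repository rather than in the installed library.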
QQTang1223 changed pull request status to merged
