framework support via use_bidirectional_attention - Cheers from your friends at Baseten

#3

Marking it with "use_bidirectional_attention" would be helpful. I also added it to the nvidia and gemma3 models, and it helps integration in popular frameworks. https://huggingface.co/nvidia/llama-embed-nemotron-8b/commit/1acaf42b890bafa464ef9a58d1c0db0dd26120d4
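A minimal sketch of how a downstream framework could consume the flag, assuming it reads the model's `config.json` (the config contents here are illustrative, not this model's actual config):

```python
import json

# Hypothetical config.json contents carrying the proposed flag.
config = json.loads('{"model_type": "llama", "use_bidirectional_attention": true}')

# A framework can fall back to causal attention when the flag is absent,
# and disable the causal mask when it is set.
is_causal = not config.get("use_bidirectional_attention", False)
print(is_causal)  # False: bidirectional attention requested
```

The point of a plain boolean in the config is that integrations can detect embedding-style bidirectional attention without model-specific special-casing.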

michaelfeil changed pull request title from Cheers from your friends at Baseten to framework support via use_bidirectional_attention - Cheers from your friends at Baseten
Voyage AI org

Great point! Thank you for pointing it out!

hongliu9903 changed pull request status to merged
