facebook/m2m100_418M converted to ONNX for the project: https://github.com/lopatnov/translate

The model was exported with:

```bash
conda create -n model_export python=3.10 -y
conda activate model_export
pip install "optimum[onnxruntime,export,openvino]" transformers sentencepiece
optimum-cli export onnx --model facebook/m2m100_418M --task seq2seq-lm --dtype fp32 --opset 18 ./m2m100_418M
```
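
The export can be sanity-checked by loading it back with Optimum's `ORTModelForSeq2SeqLM` and driving it like a regular `transformers` seq2seq model. The snippet below is a minimal sketch: it assumes the `./m2m100_418M` directory produced by the command above (the `lopatnov/m2m100_418M-onnx` Hub repo is expected to hold the same files), and the en→fr language pair is only an example.

```python
from transformers import AutoTokenizer
from optimum.onnxruntime import ORTModelForSeq2SeqLM

# Path to the export directory created above; the Hub repo
# lopatnov/m2m100_418M-onnx is assumed to contain the same files.
model_dir = "./m2m100_418M"

tokenizer = AutoTokenizer.from_pretrained(model_dir)
model = ORTModelForSeq2SeqLM.from_pretrained(model_dir)

# M2M100 needs the source language set on the tokenizer and the
# target language forced as the first generated token.
tokenizer.src_lang = "en"
inputs = tokenizer("Hello, how are you?", return_tensors="pt")
generated = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.get_lang_id("fr"),
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```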