Config File Issue due to Fine-tuning
The config file is updated after fine-tuning and no longer includes required parameters such as model_type. Because of this I am unable to convert the model to CTranslate2 format, which makes offline (cloud-free) and edge-device deployment difficult, since the library that supports WhisperX needs a CTranslate2-converted version for offline use.
Not sure if I understood the problem here. Are you referring to the model not being converted to work with whisperX: https://huggingface.co/inesc-id/WhisperLv3-PT-All/tree/main?
Since this model aligned with my use case, I tried to deploy it on a mobile device. While converting its bin file into an ONNX quantized version of the weights, I got a ValueError. When I tried to debug it with an LLM, I learned that the config file does not have a parameter called model_type, which the onnxruntime library needs for the conversion.
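If it helps, here is a minimal sketch of the workaround I was attempting: restoring the missing model_type key in the downloaded config.json before running the converter. The file path is illustrative, and I am assuming the correct value is "whisper", since that is what transformers normally writes for Whisper checkpoints.

```python
import json
import os
import tempfile

def add_model_type(config_path, model_type="whisper"):
    """Restore a missing "model_type" key in a fine-tuned model's
    config.json so converters (CTranslate2 / ONNX exporters) can
    identify the architecture. Leaves the key alone if already set."""
    with open(config_path) as f:
        config = json.load(f)
    # transformers normally writes this key; add it only if missing
    config.setdefault("model_type", model_type)
    with open(config_path, "w") as f:
        json.dump(config, f, indent=2)
    return config

# Demo with a throwaway config that lacks "model_type", standing in
# for the downloaded checkpoint's config.json (path is hypothetical)
tmp = os.path.join(tempfile.mkdtemp(), "config.json")
with open(tmp, "w") as f:
    json.dump({"architectures": ["WhisperForConditionalGeneration"]}, f)

patched = add_model_type(tmp)
print(patched["model_type"])  # whisper
```

After patching the file, I re-ran the conversion pointing at the same model directory.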
If you want, I can also share the detailed error message and the command I used during conversion.
Hopefully that makes the problem clearer.
Thanks for your response, as well as the work you did.
