Error with transformers library
#16
by ewinge - opened
I am trying to use this model with HuggingFacePipeline from LangChain:
from langchain_huggingface import HuggingFacePipeline
llm = HuggingFacePipeline.from_model_id(
    model_id='mistralai/Ministral-3-8B-Instruct-2512',
    task='text-generation',
    device=device,
)
I get this error message:

ValueError: Unrecognized configuration class <class 'transformers.models.mistral3.configuration_mistral3.Mistral3Config'> for this kind of AutoModel: AutoModelForCausalLM.
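For context on why this error appears: the Mistral 3 architecture is multimodal, and (on recent transformers versions, an assumption here) it is registered under the image-text-to-text auto class rather than under AutoModelForCausalLM, which is what the 'text-generation' task asks HuggingFacePipeline to instantiate. A small sketch that inspects the auto-model registries without downloading any weights:

```python
import transformers.models.auto.modeling_auto as auto_mod

# Mapping of model-type name -> model class for AutoModelForCausalLM.
causal_lm = auto_mod.MODEL_FOR_CAUSAL_LM_MAPPING_NAMES

# The image-text-to-text mapping only exists on recent transformers
# versions, so fall back to an empty dict on older installs.
image_text = getattr(auto_mod, "MODEL_FOR_IMAGE_TEXT_TO_TEXT_MAPPING_NAMES", {})

# 'mistral3' is absent from the causal-LM registry, which is exactly
# why AutoModelForCausalLM rejects Mistral3Config.
print("mistral3 under AutoModelForCausalLM:", "mistral3" in causal_lm)
print("mistral3 under AutoModelForImageTextToText:", "mistral3" in image_text)
```

If this diagnosis holds, one possible workaround (not verified against this exact checkpoint) is to build a transformers pipeline with task 'image-text-to-text' yourself and wrap it via HuggingFacePipeline(pipeline=...), instead of using from_model_id with 'text-generation'.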