Error while running Llava-1.5-7b on Amazon SageMaker

#17
by Mkaggie23 - opened

Hi,
I'm facing an issue when running Llava-1.5-7b on Amazon SageMaker. I deployed the model to a SageMaker endpoint, but each time I call predictor.predict(data)
I get the following error:
...
"code": 400,
"type": "InternalServerException",
"message": "\u0027llava\u0027"
}
This is the structure of my data:

data = {
    "image": url,
    "question": "Focus on the person, describe what he is wearing and his environment. Don't exceed 30 words."
}

Can anyone help?

Llava Hugging Face org

Hi @Mkaggie23
Thanks for the issue!
Unfortunately I'm not familiar enough with SageMaker to help directly, but I found a similar issue here: https://huggingface.co/Salesforce/blip2-opt-2.7b/discussions/25#65c9e629f8e9ec7fd01a67ab. Does that help?

Hi @ybelkada

Thanks for the support. I ended up using RunPod to deploy the model, but the similar issue you linked gave me a clear idea of how to structure the input to LLava through SageMaker.
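For anyone hitting the same error, here is a minimal sketch of how the payload could be wrapped before sending it to the endpoint. The "image" and "question" fields follow the structure from the original post; the "inputs" wrapper is an assumption based on the linked discussion, and the exact key the serving container expects may differ (the image URL is a placeholder):

```python
import json

# Hypothetical image URL; replace with your own.
url = "https://example.com/person.jpg"

# Original payload shape from the post above.
data = {
    "image": url,
    "question": (
        "Focus on the person, describe what he is wearing and his "
        "environment. Don't exceed 30 words."
    ),
}

# Assumption: the serving container expects the fields wrapped under
# an "inputs" key, as suggested by the similar BLIP2 discussion.
payload = {"inputs": data}

# predictor.predict() serializes the payload via its configured
# serializer; json.dumps shows the body that would go over the wire.
body = json.dumps(payload)
print(body)
```

With this shape you would call predictor.predict(payload) rather than predictor.predict(data); if the endpoint still returns a 400, inspecting the container's CloudWatch logs usually shows which key it failed to find.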

Thanks!

Mkaggie23 changed discussion status to closed
