# Hugging Face Endpoint Template: mluke-store-judge

This folder contains a custom Inference Endpoint handler for running studio-ousia/mluke-large as a lightweight term-typing judge for Rosetta experiments.
## Why this exists

- The raw Hugging Face deployment for mLUKE exposes a `fill-mask` interface; for Rosetta we need a typed classification interface for preserve experiments.
- This wrapper converts a candidate term plus store context into scored label outputs.
## Files

- `handler.py`: custom endpoint handler implementing `EndpointHandler`
- `requirements.txt`: endpoint dependencies
## Expected request body

```json
{
  "inputs": {
    "candidate": "Third Najm",
    "context": "Third Najm line: Syria-focused, spans hoodie/crewneck/tee/long-sleeve in black and white. Quoted poetic titles are used as named design concepts.",
    "labels": [
      "brand_name",
      "product_line",
      "named_work",
      "person_reference",
      "cultural_item",
      "nationality",
      "modifier"
    ]
  }
}
```
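A request body in this shape can be assembled in Python before POSTing it to the endpoint. This is a minimal sketch; the field names follow the schema above, and the example values are taken from this README.

```python
import json

def build_request(candidate: str, context: str, labels: list[str]) -> str:
    """Serialize a term-typing request matching the endpoint's expected schema."""
    payload = {
        "inputs": {
            "candidate": candidate,
            "context": context,
            "labels": labels,
        }
    }
    return json.dumps(payload)

body = build_request(
    "Third Najm",
    "Third Najm line: Syria-focused, spans hoodie/crewneck/tee/long-sleeve in black and white.",
    ["brand_name", "product_line", "named_work"],
)
print(body)
```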
## Expected response body

```json
{
  "model_id": "studio-ousia/mluke-large",
  "candidate": "Third Najm",
  "top_label": "named_work",
  "results": [
    {
      "label": "named_work",
      "descriptor": "title",
      "raw_score": -2.13,
      "confidence": 0.44
    }
  ]
}
```
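Downstream code can read `top_label` directly, or re-rank the `results` list by `confidence`. A minimal parser over a response shaped like the example above:

```python
def top_result(response: dict) -> tuple[str, float]:
    """Return the highest-confidence (label, confidence) pair
    from an endpoint response shaped like the example above."""
    best = max(response["results"], key=lambda r: r["confidence"])
    return best["label"], best["confidence"]

# Sample response copied from this README.
sample = {
    "model_id": "studio-ousia/mluke-large",
    "candidate": "Third Najm",
    "top_label": "named_work",
    "results": [
        {"label": "named_work", "descriptor": "title",
         "raw_score": -2.13, "confidence": 0.44}
    ],
}
print(top_result(sample))  # ('named_work', 0.44)
```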
## Deployment steps

1. Create a small Hugging Face repo under your account, for example: `new-account/mluke-store-judge`
2. Upload `handler.py` and `requirements.txt` from this folder to that repo.
3. Create an Inference Endpoint from that repo using the default container type with task `Custom`.
4. Set the endpoint env var: `MODEL_ID=studio-ousia/mluke-large`
5. After deploy, save the endpoint URL locally as: `HF_MLUKE_ENDPOINT_URL=https://...your-endpoint-url...`
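Once the endpoint URL is saved, a caller can be sketched with the standard library alone. `HF_MLUKE_ENDPOINT_URL` comes from the step above; `HF_TOKEN` is an assumed env var holding a Hugging Face access token, not something this folder defines.

```python
import json
import os
import urllib.request

def build_call(payload: dict) -> urllib.request.Request:
    """Prepare an authenticated POST to the deployed endpoint.
    HF_MLUKE_ENDPOINT_URL is set during deployment; HF_TOKEN is an
    assumed env var for a Hugging Face access token."""
    url = os.environ["HF_MLUKE_ENDPOINT_URL"]
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {os.environ.get('HF_TOKEN', '')}",
    }
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers=headers,
        method="POST",
    )

# Actually sending the request is a network call, so it is left commented:
# with urllib.request.urlopen(build_call({"inputs": {"candidate": "Third Najm"}})) as resp:
#     print(json.load(resp))
```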
## Notes

- This is an experiment path only. It is not wired into the main Rosetta pipeline.
- The handler uses masked-LM pseudo-log-likelihood over label descriptors to score label fit.
- Default descriptor map:
  - `brand_name` -> `brand`
  - `product_line` -> `collection`
  - `named_work` -> `title`
  - `person_reference` -> `person`
  - `cultural_item` -> `garment`
  - `nationality` -> `origin`
  - `modifier` -> `descriptor`
- You can override that map by sending `inputs.label_map` in the request payload.
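The scoring loop and the `label_map` override can be sketched as follows. This is a toy illustration, not the real handler: `masked_logprob` is a stand-in for mluke-large's fill-mask head and just returns a fixed value, and the template sentence is an assumption about how descriptors are scored in context.

```python
import math

# Default label -> descriptor map from this README.
DEFAULT_LABEL_MAP = {
    "brand_name": "brand",
    "product_line": "collection",
    "named_work": "title",
    "person_reference": "person",
    "cultural_item": "garment",
    "nationality": "origin",
    "modifier": "descriptor",
}

def masked_logprob(tokens: list[str], masked_index: int, target: str) -> float:
    """Stand-in for the model's fill-mask head: log P(target | context).
    A real implementation would run mluke-large; this stub returns a
    fixed toy value so the scoring loop is runnable."""
    return -2.0

def pll_score(candidate: str, context: str, descriptor: str) -> float:
    """Pseudo-log-likelihood: mask each descriptor token in a template
    sentence and average the log-probability of recovering it."""
    prefix = candidate.split() + ["is", "a"]
    desc_tokens = descriptor.split()
    tokens = prefix + desc_tokens + context.split()
    logps = [masked_logprob(tokens, len(prefix) + j, tok)
             for j, tok in enumerate(desc_tokens)]
    return sum(logps) / len(logps)

def judge(candidate, context, labels, label_map=None):
    """Score each label via its descriptor; inputs.label_map overrides the defaults."""
    mapping = {**DEFAULT_LABEL_MAP, **(label_map or {})}
    results = [
        {"label": lab,
         "descriptor": mapping.get(lab, lab),
         "raw_score": pll_score(candidate, context, mapping.get(lab, lab))}
        for lab in labels
    ]
    # Softmax over raw scores gives the per-label confidence field.
    exps = [math.exp(r["raw_score"]) for r in results]
    total = sum(exps)
    for r, e in zip(results, exps):
        r["confidence"] = e / total
    return sorted(results, key=lambda r: r["confidence"], reverse=True)
```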