Problems running on mlx-lm
#3
by Galathana - opened
Is this runnable outside of the Inferencer app? When I try to spin it up on mlx-lm I get this error:
```
ValueError: Expected shape (64, 512, 4) but received shape (64, 512, 2) for parameter model.layers.0.self_attn.embed_q.scales
```

It also fails to load on other inference engines.