Model type deepseek_v4 not supported.

#2
by linxin111 - opened

I have installed mlx-lm 0.31.3, but it still reports an error.

python 3.12.13
mlx 0.31.2
mlx-lm 0.31.3
mlx-metal 0.31.2

error message:
mlx_lm.chat --model "mlx-community/DeepSeek-V4-Flash-4bit"
Fetching 39 files: 100%|████████████████████████████████████████████████████████████████████████████| 39/39 [00:00<00:00, 11001.27it/s]
Download complete: 0.00B [00:00, ?B/s]
Traceback (most recent call last):
File "/opt/homebrew/Caskroom/miniconda/base/envs/jarvis/lib/python3.12/site-packages/mlx_lm/utils.py", line 188, in _get_classes
arch = importlib.import_module(f"mlx_lm.models.{model_type}")
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/Caskroom/miniconda/base/envs/jarvis/lib/python3.12/importlib/__init__.py", line 90, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "<frozen importlib._bootstrap>", line 1387, in _gcd_import
File "<frozen importlib._bootstrap>", line 1360, in _find_and_load
File "<frozen importlib._bootstrap>", line 1324, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'mlx_lm.models.deepseek_v4'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/opt/homebrew/Caskroom/miniconda/base/envs/jarvis/bin/mlx_lm.chat", line 6, in <module>
sys.exit(main())
^^^^^^
File "/opt/homebrew/Caskroom/miniconda/base/envs/jarvis/lib/python3.12/site-packages/mlx_lm/chat.py", line 110, in main
model, tokenizer = load(
^^^^^
File "/opt/homebrew/Caskroom/miniconda/base/envs/jarvis/lib/python3.12/site-packages/mlx_lm/utils.py", line 491, in load
model, config = load_model(model_path, lazy, model_config=model_config)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/Caskroom/miniconda/base/envs/jarvis/lib/python3.12/site-packages/mlx_lm/utils.py", line 334, in load_model
model_class, model_args_class = get_model_classes(config=config)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/Caskroom/miniconda/base/envs/jarvis/lib/python3.12/site-packages/mlx_lm/utils.py", line 191, in _get_classes
raise ValueError(msg)
ValueError: Model type deepseek_v4 not supported.
Download complete: 0.00B [00:00, ?B/s]

MLX Community org

Known state

  • mlx_lm.chat launches successfully.
  • Failure is at model architecture import for deepseek_v4.
  • Next action: verify whether the invoked executable and installed package are the same artifact (path consistency check).

Step 2 — Verify active executable vs installed package (no assumptions)

A common failure mode is that pip upgraded the package in one environment while the mlx_lm.chat entry point resolves from another.

Run each check separately and compare the outputs.

2.1 Confirm which executable is running

which mlx_lm.chat

Expected evidence: the output should point to:

/opt/homebrew/Caskroom/miniconda/base/envs/jarvis/bin/mlx_lm.chat

(Your traceback suggests this, but verify.)

2.2 Confirm which Python is running

which python

The output should be the same environment's .../envs/jarvis/bin/python
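Checks 2.1 and 2.2 can also be done from inside Python itself. A minimal sketch (the entry-point name mlx_lm.chat comes from the thread above; shutil.which returns None if it is not on PATH):

```python
import os
import shutil
import sys

# sys.executable is the interpreter actually running this script;
# shutil.which resolves the mlx_lm.chat entry point the shell would invoke.
chat_path = shutil.which("mlx_lm.chat")  # None if not on PATH
print("python executable:", sys.executable)
print("mlx_lm.chat found at:", chat_path)

# Both should live in the same environment's bin/ directory.
if chat_path is not None:
    same_env = os.path.dirname(chat_path) == os.path.dirname(sys.executable)
    print("same environment:", same_env)
```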

2.3 Confirm package path actually imported

python -c "import mlx_lm; print(mlx_lm.__file__)"

Record exact path.

2.4 Confirm whether deepseek_v4 module exists

python -c "import importlib.util; print(importlib.util.find_spec('mlx_lm.models.deepseek_v4'))"

Interpretation:

  • None → the module is absent from the installed package (strong evidence the architecture is unsupported)
  • a ModuleSpec is returned → the module exists, so the failure has a different cause
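The semantics of the 2.4 probe can be demonstrated with stdlib modules alone. This sketch wraps find_spec so a missing parent package (e.g. if mlx_lm itself were absent) is reported the same way as a missing submodule; the helper name module_present is illustrative:

```python
import importlib.util

def module_present(name: str) -> bool:
    # find_spec locates a module without importing it, but for a dotted
    # name it imports the parent packages first; a missing parent raises
    # ModuleNotFoundError, which we fold into "absent".
    try:
        return importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:
        return False

print(module_present("json"))             # stdlib control: True
print(module_present("no_such_pkg.sub"))  # missing parent: False
```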

2.5 Inspect installed model modules

python -c "import mlx_lm.models as m, pkgutil; print(sorted([x.name for x in pkgutil.iter_modules(m.__path__)]))"

Check whether deepseek_v4 appears in the list.
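The 2.5 one-liner relies on pkgutil.iter_modules walking a package's __path__. A self-contained illustration, using the stdlib email package as a stand-in for mlx_lm.models:

```python
import pkgutil
import email  # stdlib package standing in for mlx_lm.models

# iter_modules yields one entry per submodule or subpackage found on
# the package's __path__ — the same mechanism as the 2.5 check.
names = sorted(m.name for m in pkgutil.iter_modules(email.__path__))
print("mime" in names)  # email.mime is a known stdlib subpackage
```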

Decision logic (based only on evidence):

If:

  • paths all align, and
  • find_spec(...) returns None, and
  • deepseek_v4 absent from module list,

then the evidence strongly confirms that mlx-lm 0.31.3 does not include this model architecture.

If the paths differ, environment/path skew is the issue.
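The whole decision procedure can be folded into one script. This is a sketch, not an official mlx-lm tool: TARGET is the architecture name from this thread, and the printed verdicts mirror the branches just described.

```python
import importlib.util
import os
import shutil
import sys

TARGET = "mlx_lm.models.deepseek_v4"  # architecture probed in this thread

def module_present(name: str) -> bool:
    # Treat a missing parent package the same as a missing submodule.
    try:
        return importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:
        return False

chat = shutil.which("mlx_lm.chat")
aligned = chat is not None and os.path.dirname(chat) == os.path.dirname(sys.executable)

if not aligned:
    print("paths differ: environment/path skew is the issue")
elif not module_present(TARGET):
    print("paths align but module absent: this mlx-lm build lacks the architecture")
else:
    print("architecture present: the failure has a different cause")
```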
