How to use inclusionAI/LLaDA2.0-Uni with Transformers:
```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained(
    "inclusionAI/LLaDA2.0-Uni",
    trust_remote_code=True,
    dtype="auto",
)
```
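The snippet above only loads the weights; to run anything you also need the tokenizer. A minimal sketch, assuming the repo exposes a standard tokenizer through the Auto API (the actual inference entry point comes from the model's custom code loaded via `trust_remote_code`, so check the model card for the exact generation call):

```python
from transformers import AutoTokenizer

# Assumed: the repo ships a tokenizer loadable via the standard Auto API.
tokenizer = AutoTokenizer.from_pretrained(
    "inclusionAI/LLaDA2.0-Uni", trust_remote_code=True
)

# Tokenize a prompt; generation itself goes through the model's custom
# code, so consult the model card for the exact call and its arguments.
inputs = tokenizer("Hello, world!", return_tensors="pt")
```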
Impressive model! Could you please provide some GGUF versions that can run on low-VRAM hardware?
Hello, we have provided an FP8 version for this purpose.
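A minimal loading sketch, assuming the FP8 checkpoint is published as a separate Hugging Face repo; the repo id below is hypothetical (the thread does not name it), so check the inclusionAI organization page for the actual checkpoint, and note that loading FP8 weights may require an additional quantization backend depending on how the checkpoint was produced:

```python
from transformers import AutoModel

# Hypothetical repo id -- substitute the actual FP8 checkpoint name
# from the inclusionAI organization page.
model = AutoModel.from_pretrained(
    "inclusionAI/LLaDA2.0-Uni-FP8",
    trust_remote_code=True,
    dtype="auto",
)
```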