copied from https://huggingface.co/google/flan-t5-xxl

Converted to bfloat16 with the decoder blocks removed, then packaged for InvokeAI.

To install in Invoke, enter `skunkworx/FLAN-T5xxl::bfloat16` as the HuggingFace repo ID in the model manager.

Visit the Discord for more information: https://discord.com/channels/1020123559063990373/1050123398342250526/1381203753918664774

In case you want to run the conversion yourself, or on another model, the code used to convert the original google/flan-t5-xxl model can be found in the root folder: https://huggingface.co/skunkworx/FLAN-T5xxl/blob/main/convert-bf16-enc.py
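The core of the conversion is straightforward: load only the encoder, cast its weights to bfloat16, and save the result. Here is a minimal sketch of that casting step using a toy stand-in module (the real script linked above operates on google/flan-t5-xxl, e.g. via a transformers encoder-only load; the module below is just an illustration, not the actual model):

```python
import torch
from torch import nn

# Toy stand-in for the encoder stack; the real conversion loads
# google/flan-t5-xxl and keeps only the encoder blocks.
encoder = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 16))

# Cast every parameter to bfloat16, halving the checkpoint size
# relative to float32 while keeping float32's dynamic range.
encoder = encoder.to(torch.bfloat16)

# All weights are now bfloat16 and ready to be saved for InvokeAI.
assert all(p.dtype == torch.bfloat16 for p in encoder.parameters())
torch.save(encoder.state_dict(), "encoder-bf16.pt")
```

bfloat16 keeps the same 8-bit exponent as float32, which is why this cast is generally safe for T5-style encoders, whereas float16's narrower range can overflow.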
