Tiny-lamina is an extremely lightweight language model, designed specifically for children and everyday users on simple phone CPUs. With only 202k parameters, Tiny-lamina is ultra-compact, allowing fine-tuning on CPUs and use on low-end phones.
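To see how small 202k parameters is, here is a minimal sketch of the standard GPT-2 parameter-count formula (tied token embeddings and LM head assumed). The example hyperparameter values below are illustrative guesses in the right size range, not Tiny-lamina's actual configuration, which is not published here.

```python
def gpt2_param_count(vocab, positions, d_model, layers):
    """Approximate parameter count for a GPT-2 style model
    with the LM head tied to the token embeddings."""
    # token + position embedding tables
    embed = vocab * d_model + positions * d_model
    # each transformer block: 12*d^2 weights (attention + MLP)
    # plus 13*d biases and LayerNorm parameters
    block = 12 * d_model**2 + 13 * d_model
    # final LayerNorm: weight + bias
    final_ln = 2 * d_model
    return embed + layers * block + final_ln

# Hypothetical tiny configuration in the ~200k range:
print(gpt2_param_count(vocab=5000, positions=64, d_model=32, layers=4))
```

Nearly all of the budget at this scale goes to the embedding table, which is why such tiny models are limited to producing word fragments rather than coherent text.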

Tiny-lamina aims to generate English word fragments, for any use case that requires unstructured English words or fragments of English prepositions.

Tiny-lamina, like all the models on the Clem-CPu account, aims to democratize the use of AI models by children on CPUs.

Model details:
- Format: GGUF
- Model size: 202k params
- Architecture: gpt2
- Downloads last month: 8