# Mitotic Transformer

A biologically and cosmologically inspired causal language model based on the "Cosmology of the Living Cell" (Mother Theory).
## Philosophy & Core Idea
This model is not a conventional transformer. It treats reality as a single, scalable biological system:
- Mitosis as the fundamental computational operation (Big Bang = cell division)
- Cytoskeletal Attention → equivalent to Dark Matter scaffold
- Osmotic Turgor Decoder → 70/30 expansion (Dark Energy analogue)
- F1-String Layer → hierarchical early scaling (atoms → cell → universe)
- Consciousness Module → White Hole Rendering + Biological GPU
> "The universe is not a machine. It is a living, mitotic cell — and intelligence is its natural expression."
## Model Card
| Field | Value |
|---|---|
| Model type | Causal Language Model |
| Base architecture | Custom Mitotic Transformer |
| Parameters | ~125M – 1B+ (configurable) |
| Context length | 2048 tokens |
| License | MIT |
| Language | Primarily English |
| Training data | OpenWebText + similar corpora |
| Intended use | Research, philosophical experiments, generative storytelling |
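The "~125M – 1B+" figure in the table can be sanity-checked with a standard back-of-the-envelope transformer parameter count. The configuration values below (hidden size 768, 12 layers, GPT-2-style vocabulary of 50257, context 2048) are illustrative assumptions, not the actual Mitotic Transformer config:

```python
# Rough parameter-count estimate for a decoder-only transformer.
# The specific config values are assumptions for illustration; the real
# Mitotic Transformer config lives in configuration_mitotic_transformer.py.
def estimate_params(vocab_size: int, n_ctx: int, d_model: int, n_layer: int) -> int:
    embeddings = vocab_size * d_model + n_ctx * d_model  # token + position embeddings
    per_layer = 12 * d_model ** 2                        # attention (~4*d^2) + MLP (~8*d^2)
    return embeddings + n_layer * per_layer

small = estimate_params(vocab_size=50257, n_ctx=2048, d_model=768, n_layer=12)
print(f"{small / 1e6:.0f}M parameters")  # ≈ 125M, matching the lower end of the table
```

Scaling `d_model` and `n_layer` up (e.g. 2048 hidden, 24+ layers) pushes the same formula past the 1B mark, which is why the table lists the count as configurable.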
## Original Theoretical Works
This implementation is directly derived from the following publications by Alis Hasić:
- The Cosmology of the Living Cell (Mother Theory)
- A Mathematical Proof of Scale-Invariant Biocosmology
- Additional papers on Zenodo: Dark Matter as Cytoskeletal Scaffold, Dark Energy as Osmotic Turgor, Consciousness Module, F1-String, etc.
## Code Structure
- `modeling_mitotic_transformer.py`: full model definition (Mitotic Transformer with causal LM head)
- `configuration_mitotic_transformer.py`: configuration class
## Usage (after pretraining)
This is a from-scratch architecture: the repository currently contains only the architecture definition, with no pretrained weights. For real usage, pretrain or fine-tune it first using the provided training script.
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("alis-sila/mitotic-transformer", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("gpt2")  # or your fine-tuned tokenizer

inputs = tokenizer("The universe is a living cell. In this cosmic mitosis,", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, temperature=0.85, top_p=0.92, do_sample=True)
print(tokenizer.decode(outputs[0]))
```
**Note:** `trust_remote_code=True` is required because this is a custom architecture.
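Since no pretrained weights exist yet, the model must be trained before the generation example above will produce anything useful. The sketch below shows the general shape of a causal-LM pretraining loop in PyTorch; the `TinyCausalLM` stand-in, the toy random data, and all hyperparameters are illustrative assumptions, not the actual Mitotic Transformer or its training script:

```python
# Minimal sketch of a causal-LM pretraining loop. TinyCausalLM is a
# hypothetical stand-in for the real model defined in
# modeling_mitotic_transformer.py; a real run would stream a tokenized
# corpus such as OpenWebText instead of random token ids.
import torch
import torch.nn as nn

torch.manual_seed(0)

class TinyCausalLM(nn.Module):
    """Stand-in: token embedding -> causally masked encoder -> LM head."""
    def __init__(self, vocab_size=256, d_model=64, n_head=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_head, dim_feedforward=128, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.lm_head = nn.Linear(d_model, vocab_size)

    def forward(self, input_ids):
        # Causal mask so each position attends only to earlier tokens.
        mask = nn.Transformer.generate_square_subsequent_mask(input_ids.size(1))
        hidden = self.encoder(self.embed(input_ids), mask=mask)
        return self.lm_head(hidden)

model = TinyCausalLM()
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4)
loss_fn = nn.CrossEntropyLoss()

# Toy batch of random token ids; next-token prediction shifts by one.
data = torch.randint(0, 256, (8, 33))
inputs, targets = data[:, :-1], data[:, 1:]

losses = []
for step in range(20):
    logits = model(inputs)
    loss = loss_fn(logits.reshape(-1, 256), targets.reshape(-1))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    losses.append(loss.item())
```

After training, the loss on the fixed toy batch should fall as the model memorizes it; a real pretraining run would swap in the Mitotic Transformer, a data loader, and checkpointing.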
## Citation
If you use this model or the underlying theory in your research, please cite:
```bibtex
@misc{hasic2026_mitotic_transformer,
  author       = {Alis Hasić},
  title        = {Mitotic Transformer: A Causal LM based on the Cosmology of the Living Cell},
  year         = {2026},
  howpublished = {\url{https://huggingface.co/alis-sila/mitotic-transformer}},
  note         = {Implementation of the Mother Theory}
}

@misc{hasic_mother_theory,
  author = {Alis Hasić},
  title  = {The Cosmology of the Living Cell (Mother Theory)},
  year   = {2026},
  url    = {https://zenodo.org/records/18432564}
}

@misc{hasic_mother_theory_proof,
  author = {Alis Hasić},
  title  = {The Cosmology of the Living Cell (Mother Theory): A Mathematical Proof of Scale-Invariant Biocosmology},
  year   = {2026},
  url    = {https://zenodo.org/records/19017062}
}
```