BartlebyGPT Dead Letter Office (DLO-Base)

The BartlebyGPT Dead Letter Office (DLO-Base) is a continued-pretraining (CPT) checkpoint of Qwen/Qwen3.5-2B-Base, trained with TRL for 1 epoch on ~62M tokens of Melvillian prose.

It is not trained for instruction following or conversation; it is intended as a base for further fine-tuning.
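A minimal loading and sampling sketch, assuming the `transformers` library and the repo id `staeiou/bartleby-dlo-qwen3.5-2b-base-cpt` from the model tree below. Because this is a base model, use plain completion-style prompts rather than a chat template:

```python
MODEL_ID = "staeiou/bartleby-dlo-qwen3.5-2b-base-cpt"

def load_dlo(model_id: str = MODEL_ID):
    """Load the tokenizer and BF16 weights of the CPT checkpoint."""
    # Heavy dependencies imported lazily so the module can be inspected
    # without triggering a model download.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype=torch.bfloat16
    )
    return tokenizer, model

def continue_text(prompt: str, max_new_tokens: int = 64) -> str:
    """Completion-style generation: no chat template, since the model
    is not instruction-tuned."""
    tokenizer, model = load_dlo()
    inputs = tokenizer(prompt, return_tensors="pt")
    out = model.generate(**inputs, max_new_tokens=max_new_tokens, do_sample=True)
    return tokenizer.decode(out[0], skip_special_tokens=True)
```

The same `load_dlo` helper is a natural starting point for a fine-tuning run, since the returned model is an ordinary `transformers` causal LM.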

Downloads last month: 300
Format: Safetensors
Model size: 2B params
Tensor type: BF16

Model tree for staeiou/bartleby-dlo-qwen3.5-2b-base-cpt

Finetunes of this model: 29 models
Quantizations: 1 model