πŸ“œ OWoTGPT-1.3

OWoTGPT-1.3 is a small GPT-2 model fine-tuned on chat logs from the website Our World of Text. It can generate new messages based on prior chat history.

Model Details

Model Description

OWoTGPT-1.3 is a 135M parameter GPT-2 model fine-tuned on chat logs from the website Our World of Text. Given prior chat history, it predicts and generates the messages that might come next. OWoTGPT-1.3 can also mimic a specific user if a username and ID are specified, attempting to imitate their tone, speaking style, and interests. If no user information is supplied for the next message, the model will pick a user on its own and mimic them.

Uses

This model was built for entertainment purposes and is not meant to be used in any serious applications.

You can either tell the model to imitate a specific user or let it choose a user on its own. To have it pick a user, supply a random ID (a number, usually from 1 up to 9999) in square brackets as the input, with an asterisk placed in front of the ID. The model will then generate a username (for non-anonymous messages) after it.

For example, putting [*1201] may result in [*1201] KCUCiznob:, while [*719] may result in [*719] CarlBatman:.

If you want to generate an anonymous message, use a random ID in square brackets without the asterisk, followed immediately by a colon, like this: [2481]:.

To demonstrate, putting [3465]: may result in the message [3465]: your training data already exists.

The model will then continue generating further messages after the one requested.
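As a minimal sketch of the prompt format described above (the helper name, the exact whitespace conventions, and the generation settings are assumptions, not something this card specifies), a prompt could be assembled like this and then passed to the model via the Hugging Face transformers library:

```python
from typing import Optional


def build_prompt(history: str, user_id: int,
                 username: Optional[str] = None,
                 anon: bool = False) -> str:
    """Append a next-message prefix in OWoTGPT-1.3's chat-log format.

    Anonymous message:               "[2481]: "
    Mimic a specific named user:     "[*1201] KCUCiznob: "
    Let the model pick the username: "[*1201] "
    """
    if anon:
        prefix = f"[{user_id}]: "
    elif username is not None:
        prefix = f"[*{user_id}] {username}: "
    else:
        prefix = f"[*{user_id}] "
    return history + prefix


# Generation itself (requires downloading the model) would look roughly like:
# from transformers import pipeline
# generate = pipeline("text-generation", model="Pomni/OWoTGPT-1.3")
# prompt = build_prompt("[3465]: your training data already exists\n", 719)
# print(generate(prompt, max_new_tokens=40)[0]["generated_text"])
```

The helper only concatenates strings, so the square-bracket conventions stay in one place instead of being hand-typed into each prompt.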

Out-of-Scope Use

This model will not work for any purpose other than generating OWOT chat logs.

Bias, Risks, and Limitations

  • Because the chat logs it was trained on include spam, the model may generate spam
  • It may generate offensive content (including racial slurs), as OWOT has few moderation rules
  • Generated messages are often only loosely relevant to the surrounding context
  • Messages may be incoherent at times
  • The model cannot be relied on for factual knowledge of any sort
  • Generated messages may not properly address the prior chat history

Recommendations

Follow the instructions and take note of the limitations above, and you'll be fine.

Model size: 0.1B params (Safetensors, F32 tensors)