---
datasets:
- Kamisori-daijin/email-datasets-20k
language:
- en
pipeline_tag: text-generation
tags:
- small
- cpu
- email
- mail
- e-mail
- tiny
- nanogpt
- gpt
- open
- opensource
---

# Welcome to eMail-LM

This is a 44.64M-parameter GPT model which generates e-mails.

## How to use

To use this model, download `model.pt` and `use.py` from this repo and run `use.py`.

## Sample

Input: `Write an email to my boss about a server outage.`
Output:

```plaintext
SUBJECT: Regarding the Server Outage - Seriously?

BODY:

[Name],

Let's be blunt. The server went down last night, and now a bad timing – on a Friday night. Frankly, it was unacceptable. I'm not going to sugarcoat it. I'm not interested in excuses about how this happened, and I'm fixing it. I expect a full report by [Time] today.

[Your Name]
```

## Training code

The full training code can be found as `train.ipynb` in this repo. Have fun :D

## Limitations

As this is a very small model and it wasn't trained for very long, it still generates garbage - but always emails. It's definitely more for research purposes.
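## Appendix: checkpoint loading pattern

`use.py` in this repo contains the actual loading and sampling code. Purely as an illustration of the `torch.save`/`torch.load` round-trip that nanoGPT-style scripts typically use with a `model.pt` file, here is a minimal, self-contained sketch. The state-dict contents below are made up for the example; only the filename `model.pt` comes from this card.

```python
import torch

# Stand-in state dict so this sketch runs on its own; the real model.pt
# holds the trained 44.64M-parameter weights.
state = {"tok_emb.weight": torch.zeros(4, 8)}
torch.save(state, "model.pt")

# Load on CPU -- a model this small does not need a GPU for inference.
checkpoint = torch.load("model.pt", map_location="cpu")
print(sorted(checkpoint.keys()))  # the parameter names stored in the file
```

In practice you would pass the loaded state dict to `model.load_state_dict(...)` on a GPT module with a matching architecture, which is what a script like `use.py` is expected to do.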