GPT OSS 20B pruned from 32 experts down to 20, based on HuiHui's abliterated base. I recommend running this model with a larger context cap, as it seems to perform worse at short context lengths.
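For the curious, expert pruning of this kind can be sketched roughly as follows: rank the experts by some utilization statistic gathered on calibration data, keep the top 20, and slice the router projection to match. This is an illustrative sketch with toy NumPy arrays, not the exact procedure used for this checkpoint; the function name, shapes, and usage statistic are all placeholders.

```python
import numpy as np

def prune_experts(router_w, expert_weights, usage, keep=20):
    """Keep the `keep` most-used experts and slice the router to match.

    router_w:       (hidden, n_experts) router projection matrix
    expert_weights: list of per-expert weight arrays
    usage:          (n_experts,) activation counts from a calibration pass
    """
    # Top-`keep` experts by usage, restored to their original order
    keep_idx = np.sort(np.argsort(usage)[::-1][:keep])
    return router_w[:, keep_idx], [expert_weights[i] for i in keep_idx]

# Toy example: 32 experts pruned down to 20
rng = np.random.default_rng(0)
router = rng.standard_normal((8, 32))
experts = [rng.standard_normal((8, 8)) for _ in range(32)]
usage = rng.random(32)

new_router, new_experts = prune_experts(router, experts, usage, keep=20)
print(new_router.shape, len(new_experts))  # (8, 20) 20
```

Slicing the router columns alongside the expert list keeps the gating logits aligned with the surviving experts, which is why both are pruned with the same index set.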