Interested in your fine-tuning work — possible collaboration?
Hi there, I'm with the ERNIE team. I noticed that you fine-tuned an ERNIE model and I'm very interested in your use case. I'm wondering if you'd be open to a brief conversation about your work and the possibility of featuring it as a community case, or about other developer-related collaboration.
If you’re open to it, could you share a preferred way to connect, or just reply here?
Thanks for your contribution and for sharing your model. 😄
Hey, good to hear from you.
I'm open to both a "feature" and/or a "dev collab".
This model features the 20x Brainstorm adapter, the tamest / most stable of the Brainstorm adapters.
There are sizes from 3x to 60x.
I call it an adapter because it acts more like a plug-in than a fine-tune (i.e., one done via Unsloth, TRL, etc.).
It does, however, make permanent changes to the model, though it does not add any new data to it.
This is what I would call a "level 1" implementation.
It can also be tuned in Unsloth, and several different sizes/configs are available.
The adapter works on all model architecture types and has been tested on over 300 models.
RE: contact.
Replying here is great, or Discord via "drawless111" / "David_AU".