Be careful! The model and Anthropic's policy

#75
by SylvainVerdy - opened

Can we use the model as-is for commercialization?
Anthropic's policy prohibits using Claude outputs to train ANY model without written permission, regardless of commercial intent.

The datasets used (nohurry/Opus-4.6-Reasoning-3000x-filtered, Jackrong/Qwen3.5-reasoning-700x) explicitly contain Claude 4.6 Opus reasoning trajectories, which you acknowledge in your own model card.

The Apache 2.0 license you applied covers the Qwen3.5 base weights, not the legality of your training data provenance.

Source: https://support.claude.com/en/articles/12326764-can-i-use-my-outputs-to-train-an-ai-model
other link: https://www.anthropic.com/news/detecting-and-preventing-distillation-attacks
A disclaimer along these lines would be appropriate:
This model is derived from training data sourced from Claude API outputs.
Use is permitted for personal and research purposes only.
Commercial use is prohibited.
Redistribution is prohibited.
Users assume all legal risk regarding Anthropic's Terms of Service.

Can the model be used as-is? That's a legitimate legal question.

I would like to find a legal solution that would allow me to use these kinds of models for commercial purposes...

So what, I can't even use it to write code?
What a bunch of bollocks.

@minyor25
The real issue here is the Apache 2.0 label applied to this model: it's misleading. Apache 2.0 normally allows commercial use, but that only covers the base Qwen3.5 weights. Since the fine-tuning data comes from Claude outputs, Anthropic's terms of service (ToS) effectively override that for any commercial usage. So in practice, this model should be considered non-commercial, regardless of what the license badge says.

Nobody is saying that you can’t use it for personal use.

Ah, so you mean: be careful, don't serve this model to people and don't take money for it, like some kind of self-hosted provider?
Sure, fine, but I somehow doubt that many people download it to become providers.
Does personal use include making commercial software with it locally on a video card?
Thanks

Show me established legal precedent that says Anthropic can restrict you from using text that their model generates to train an LLM. I will be surprised, and then point you at Harry Potter.

"Show me established legal precedent that says Anthropic can restrict you from using text that their model generates to train an LLM. I will be surprised, and then point you at Harry Potter."

The precedent is that the source using an embedded API gets banned from using their service. If Anthropic determines that their API tripped a flag for a ToS violation, it will more than likely happen. Lesson of the day: don't use a SOTA API in your workflow if you intend to use it for commercial gain.

"The precedent is that the source using an embedded API gets banned from using their service. If Anthropic determines that their API tripped a flag for a ToS violation, it will more than likely happen. Lesson of the day: don't use a SOTA API in your workflow if you intend to use it for commercial gain."

I don't see how that applies to this. Besides, there are plenty of ways to launder your API access so they can't link it to you. As for providing access to a model that makes use of the distilled material? It seems a bit too late to stop that by turning off the spigot of training material. You could refuse to sell books to Anthropic, but it's not going to untrain Harry Potter.

Then you're too naive to understand the point of view. Advocating for avoiding bans from an API doesn't help your case. Was the damage already done? Sure. But you fail to understand that the person in question would have to start training that model all over again, since the work tied to the former API connection was lost in the process of the ban. For some people, that's not worth the time or the energy. Now that Gemma 4 is open source and can be run locally, there is no point in using a closed-source solution like Anthropic for API calls.

I suppose you must be right, since I don't understand why you would need to start training over. It seems to me as if we're talking about two separate things. Such different paradigms. Speaking with you is a waste of my time, I believe. Bye.
