The brunt of the cost is borne by the public, whose intellectual property has been expropriated. After all, the worth of our combined data would be at least equal to the worth of the entire AI industry.
Anthropic has already admitted to heavily monitoring user requests to protect against distillation. They have everything in place; turning on learning from user data would literally take just a couple of lines of code at this point. Anyone who trusts them not to do it is a fool.
Absolutely. Plus, as these companies grow hungrier for revenue and try to escape the commodity market they are in, they will only get more aggressive in their (ab)use of customer data.
Such a thing can’t be enforced, and it can be flipped on a dime.
You should play around with local LLMs and system prompts to see this for yourself.