We shouldn't even be giving them defensive weapons, because that only enables them to wage war without consequence. In this specific case it's a moot point, since we joined this war in the most direct way possible, but in general, every time we shoot down one country's missiles but not the other's, we are participating in the war, especially when the side we protect is the aggressor.
>If github trained models on the contents of your private repos, that would be a violation.
Really don't see why that should change anything. Surely you'd want your gift to the Microsoft corporation to appreciate in value! Why would we ever withhold this boon from somebody on the basis that they gifted their source exclusively to microslop!?
I think the most important problem here is that this is an ambulance, not a monster truck. It never ceases to amaze me how people on this site will always insist that the onus should be on society to deal with the fallout from silicon valley's poorly-tested and poorly-designed bullshit. In a truly just world we'd be able to charge Google's leadership as accessories to homicide for this.
Not an expert, but I think the goal is to get the ambulance and its occupants to a specific location and then make an egress to a nearby medical facility? Also, I'm not confident ambulances are designed to execute the PIT maneuver.
>I am sure Google wouldn't have sued for damages.
Oh well, if that's the case I guess it's all alright.
>First responders can put the vehicle into a manual mode to move it if needed.
I really feel like you're missing the point of why you're supposed to pull over and yield the right-of-way to emergency vehicles.
One nuance I've noticed: Anthropic's statement specifically says that the use of their products for these purposes was not included in the contract with the DoD, but it stops short of saying it was prohibited by the contract.
Maybe it's just a weak choice of words in Anthropic's statement, but the way I read it, I get the impression that Anthropic assumes it retains discretion over how its products are used for any purpose not outlined in the contract, while the DoD sees it more along the lines of a traditional sale, in which the seller relinquishes all rights to the product by default and has to enumerate in the contract any rights it will retain.
It's also unprecedented for a contractor to suddenly announce that their products will, from now on, be able to refuse to function based on the product's own evaluation of what it perceives to be an ethical dilemma. Just because silicon valley gets away with bullying the consumer market with mandatory automatic updates and constantly-morphing EULAs doesn't mean they're entitled to take that attitude with them when they try to join the military-industrial complex. Actually, they shouldn't even be entitled to take that attitude to the consumer market, but sadly that battle was lost a long time ago.
>for _any_ purpose
They're allowed to use it for any purpose not related to a government contract.
> it's also unprecedented for a contractor to suddenly announce their products will, from now on, be able to refuse to function based on the product's evaluation of what it perceives to be an ethical dilemma
That is a deeply deceptive description of what happened. Anthropic was clear from the beginning of the contract about the limitations of Claude; the military reneged; and beyond cancelling the contract with Anthropic (fair enough), they are retaliating in an attempt to destroy its business by threatening any other company that does business with Anthropic.
It's not clear to me that the AI itself will refuse. You could build a system where the AI is asked whether an image matches a pattern, and the true/false result is fed to a different system that fires a missile. Building such a system would violate the contract, but the contract doesn't prevent such a thing from being built if you don't mind breaking it.