They do have a camp, considered illegal by most of the world yet still in operation, holding Middle Eastern people, plenty of them innocent, and the island even has a McDonald's. Putting a McDonald's on their illegal prison island is literally the most American thing imaginable...
I suspect Dorsey is coming back to run it and Elon just bought it. Given Dorsey’s comments about how toxic the board was I think this is the ideal situation.
> No user receives any CSAM photo, not even in encrypted form. Users receive a data structure of blinded fingerprints of photos in the CSAM database. Users cannot recover these fingerprints and therefore cannot use them to identify which photos are in the CSAM database.
I'm sorry, but this is the most ridiculous thing I've read today. Hashes have never been, and probably never will be, used to "smear" someone the US doesn't like. We can speculate about them planting evidence, but trying to prosecute based on hashes baked into an OS used by millions? That's absurd.
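For what it's worth, the "blinded fingerprints" idea in the quoted passage can be sketched roughly like this (a toy illustration, not Apple's actual construction, which uses elliptic curves and private set intersection; the modulus, function names, and exponent here are all made up for the example): the server exponentiates each fingerprint with a secret key, so the device holds values it can match against but cannot reverse into the underlying hashes.

```python
# Toy sketch of "blinding" fingerprints with a server-side secret exponent.
# Without k, a client cannot take an arbitrary photo, fingerprint it, and
# test it against the blinded database on its own.
import hashlib
import secrets

P = 2**127 - 1  # toy prime modulus, NOT a production parameter
k = secrets.randbelow(P - 2) + 2  # server-side secret exponent (hypothetical)

def fingerprint(data: bytes) -> int:
    """Map raw bytes to an element of the toy group."""
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % P

def blind(fp: int) -> int:
    """Server-side blinding: raise the fingerprint to the secret exponent."""
    return pow(fp, k, P)
```

The key property is that blinding is deterministic (the same photo always blinds to the same value, so matching still works) while recovering `fp` from `blind(fp)` requires knowing `k`.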
I'm pretty sure this is a non-technical way of saying "a machine learning model" (or some other set of parameters), which is not a particularly useful form for this database.
There is a minimum number of hash matches required; only then are the images made available to Apple, who manually checks that they are actual CSAM and not just collisions. That's what the 9to5Mac story about this says: https://9to5mac.com/2021/08/05/apple-announces-new-protectio...
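The threshold step described above amounts to something like this (a minimal sketch; the threshold value and function names are assumptions for illustration, as the linked article doesn't give the exact number):

```python
# Sketch of threshold-gated escalation: one collision is not enough;
# an account only reaches human review once matches cross a minimum.
MATCH_THRESHOLD = 30  # hypothetical value, chosen for illustration only

def should_escalate(match_count: int, threshold: int = MATCH_THRESHOLD) -> bool:
    """True once an account's fingerprint matches cross the minimum."""
    return match_count >= threshold

def review_queue(accounts: dict[str, int]) -> list[str]:
    """Only accounts over the threshold are surfaced for manual review."""
    return [acct for acct, n in accounts.items() if should_escalate(n)]
```

The design point is that a handful of accidental collisions never reach a human reviewer at all.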
With a broader rollout to all accounts, and scanning in iMessage rather than just photos, there's one possible scenario if you could generate images that were plausibly real photos: spam them to someone before an election, let friendly law enforcement talk about the investigation, and let the target discover how hard it is to prove they didn't delete the original image that was used to generate the fingerprint. Variations abound: target that teacher who gave you a bad grade, etc. The idea would be credibility laundering: "Apple flagged their phone" sounds more like there's something there than, say, a leak to the tabloids or a police investigation run by a political rival.
This is technically possible now but requires you to actually have access to seriously illegal material. A feasible collision process would make it a lot easier for someone to avoid having something which could directly result in a jail sentence.
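To see why collisions against a perceptual fingerprint are plausible at all, here's a deliberately tiny "average hash" (a standard perceptual-hashing technique, not Apple's NeuralHash): it keeps only one bit per pixel, so many visually different inputs map to the same fingerprint.

```python
# Toy average hash: each pixel contributes one bit (above/below the mean).
# Because so much information is discarded, quite different pixel data
# can share a fingerprint -- the root of the collision concern above.
def average_hash(pixels: list[int]) -> int:
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

# Two different "images" that nonetheless hash identically:
img_a = [10, 200, 10, 200]
img_b = [40, 250, 60, 220]
# average_hash(img_a) == average_hash(img_b)
```

Real perceptual hashes are far more sophisticated, but the same information loss that makes them robust to resizing and recompression is what makes deliberately crafted collisions feasible.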
So you could upload the colliding images to iCloud and get yourself reported for having child porn. Then, after the law comes down on you, you prove that you never had any, and sue Apple for libel, falsely reporting a crime, or whatever else applies. It would be a clever bit of tech activism.