Hacker News | grahamj's comments

Apple software is already largely written by Americans.

Somehow I don't think fealty will change its quality.


My first thought was payment to avoid sanctions for being "woke" (read: anti-discrimination)


Apple should start prompting users to enable it.


probably avoiding the support headaches of users who lose their recovery keys and get locked out


It only works when the gun nuts aren’t on the side of the oppressors.


> They undermined their own "worldwide" claim, as ADP still works everywhere else, and the UK has no access.

Disagree. There is a difference between ADP being unavailable in one country and it working differently in that country. Implementing a backdoor would mean changing the way ADP works.


IMO the only thing you can have a high level of trust in is your own *nix server. Back up those devices to it, then encrypt there before sending anything to the cloud.
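To sketch the "encrypt locally, then upload" idea: the toy stream cipher below (SHA-256 in counter mode) is illustrative only, not something to use in production; a real setup would reach for an audited tool like gpg or age instead.

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a keystream from SHA-256 in counter mode (toy construction)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # Fresh random nonce per backup so the same key never reuses a keystream.
    nonce = secrets.token_bytes(16)
    ks = keystream(key, nonce, len(plaintext))
    return nonce + bytes(p ^ k for p, k in zip(plaintext, ks))

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ct = blob[:16], blob[16:]
    ks = keystream(key, nonce, len(ct))
    return bytes(c ^ k for c, k in zip(ct, ks))
```

Only the ciphertext (nonce + XORed bytes) would ever leave the server; the key stays local.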


> your own *nix server

Just be sure it's pre-Intel Management Engine / pre-AMD Platform Security Processor!


Handling the encryption yourself is the way to go, but for maximum security, don't send that encrypted data to the cloud. Keep it all on your own server(s).

That doesn't help people who aren't technically capable, of course. But at least those who are can protect themselves.


Why couldn't the government just get a warrant and take your local servers? At that point there doesn't seem to be much of a difference with respect to this threat model; at least the cloud is convenient.


It is much more effort than sending a data request to a cloud provider, and it can't be done without you knowing.


Depends what kind of security. Local doesn't help if your house burns down or is robbed.


This is why, while I applaud what Apple is doing here, they need to allow us to supply our own E2E encryption keys.


That’s literally what the feature they’re removing did.


Not exactly. It generates the keys for you and stores them on device in the Secure Enclave. You cannot "bring your own" encryption key, but the primary benefit of doing so--that Apple does not have access to it--is intentionally accomplished anyway by the implementation.


I’m not sure I appreciate the value of literally bringing your own keys. My device generating them on my behalf as part of a setup process seems sufficient. You’d use openssl or something and defer to software to actually do keygen no matter what.
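For what "bring your own key" amounts to in practice, a minimal sketch using Python's secrets module (the rough openssl equivalent would be `openssl rand -hex 32`):

```python
import secrets

# A 256-bit symmetric key drawn from the OS CSPRNG,
# generated independently of any vendor's on-device keygen.
key = secrets.token_bytes(32)
print(key.hex())  # 64 hex characters
```

Either way you are deferring to software for the actual randomness; the difference is only whose software you trust.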


I agree it seems sort of academic at first blush, but I'm going to venture a guess it's the idea that you own them, instead of Apple.

So you can e.g. keep a backup on your own (secure) infrastructure. Transfer them when switching devices, or even mirror them on two different ones*. Extract your own Secure Enclave contents. Improve confidence they were generated securely. And depending on implementation, perhaps reduce the ease with which Apple might "accidentally" vacuum the keys up as a result of an update / order.

*Not sure how much those two make sense in the iOS ecosystem. I know on the Android side I'd absolutely love to maintain a "hot standby" phone that is an exact duplicate of my daily driver, so if I drop it in the ocean I can be up and running again in a heartbeat with zero friction (no need to restore backups, no reliance on nerfed backup APIs outside the ones Google uses, no re-setting up 2FA, and without ever touching Google's creepy-feeling cloud).


You would need to have a completely trusted software and hardware stack to actually own the keys. And that is already hard enough to get on a PC where ownership still means something, it is not going to happen on most mobile devices. To whatever extent you trust any of the stack already, the Secure Enclave is a better bet than BYOK. The real risk, as you imply, is if Apple is able to compromise the security coprocessor with an OTA firmware update, but they can definitely already push a regular OS update that exfiltrates any key you type in.


Just make an airgapped Linux device on a DIY FPGA CPU. That part is not that difficult compared to persuading commercial vendors to let you use your own cloud and your own encryption/backup mechanisms.


Yeah... unfortunately it ought to be the other way around. They should have a hard time persuading us to trust them enough to use theirs.

If your phone company asked you to give them the key to your house, in perpetuity, how would you feel about that? (Particularly if they insisted you sign a 15 page Terms of Use first that disclaims all their liability if anything goes missing).


It depends what kind of backdoor the UK is asking for, but "encryption backdoor" sounds like cryptographic compromise. I don't know if that's what it means, but either way the only way to be sure your keys are secure is to generate them yourself.


BYOK does not provide any additional security over the Secure Enclave (and similar security coprocessors). In fact, unless the Secure Enclave were to directly accept your input and bypass the OS, BYOK is worse because the software can just upload your key to a server as soon as you type it in. Whereas, a key generated on the Secure Enclave stays there, because there exists no operation to export it.


I don't believe it's the SE itself that encrypts user data, so it must already be the case that the key is generated outside the SE, sent to it for storage, and retrieved once the user is authenticated.

So the difference between Apple generating the key on device and storing it in the SE and the user generating it and storing it in the SE is that the user can use a known-secure key generation algo. If Apple generates the key you can't be sure it's cryptographically secure and doesn't have a backdoor.


The SE’s AES engine encrypts and decrypts data inline on its way to flash, and the SEP is responsible for generating all keys.

At this point, the people who claim they can’t trust Apple’s key generation should also distrust Intel or AMD or any other vendor’s key generation as well. Might as well generate keys by hand.


But if you don't trust Apple, how do you get the key into the Secure Enclave to begin with? Doesn't Apple control the software on your device that provides the interface into the Secure Enclave from outside of it?


Yes Apple controls the device so you're right, you can never be sure what it's doing. My thinking is that an encryption backdoor means the key generation algo is compromised. In that case you want to bypass that by generating the key yourself.

If the backdoor is some other method of getting your key off the device then all bets are off.


I’m old enough that, growing up, rotaries would have worked on our phone line, but not old enough to have actually had a rotary phone. It did use to cost more for touch tone service though (incredibly), and my dad was cheap, so our digital phones were all set to pulse dial mode.

All the “high tech” of a digital phone with the “fun” of having to wait for the digits to pulse out! Also I figured out at a young age that pulse dial was just toggling the rocker switch so I got pretty good at dialing by tapping the switch myself.
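For context on why tapping the hook switch works: pulse dialing encodes each digit as that many make/break pulses on the line (ten for zero), at roughly 10 pulses per second with a pause between digits. A rough timing sketch; the 100 ms and 700 ms figures are approximate North American values, my assumption rather than anything from the thread:

```python
PULSE_MS = 100        # one make/break cycle (~60 ms break + ~40 ms make)
INTERDIGIT_MS = 700   # pause so the exchange knows the digit has ended

def pulses_for(digit: int) -> int:
    """Digit d is sent as d pulses; 0 is sent as 10."""
    return 10 if digit == 0 else digit

def dial_time_ms(number: str) -> int:
    """Approximate time to pulse-dial a number, ignoring any trailing pause."""
    digits = [int(c) for c in number if c.isdigit()]
    pulse_time = sum(pulses_for(d) for d in digits) * PULSE_MS
    return pulse_time + INTERDIGIT_MS * (len(digits) - 1)
```

This is also why long numbers full of 9s and 0s were so tedious to dial, and why fast, even taps on the rocker switch read as legitimate digits to the exchange.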


> Also I figured out at a young age that pulse dial was just toggling the rocker switch

I saw a Cathode Ray Dude video that talked about this recently. I definitely would have enjoyed playing with this and annoying my parents if I had known that that was how it worked.


I’ve been on the lookout for an old analog phone for something similar but a bit different: Home Assistant can tap into a VOIP box and provide voice AI to it so I want to have a phone that you can just pick up and have a conversation with an AI. Maybe use varying system prompts to have it emulate historical figures or celebs or something. Will be fun for my tween I think :)


I would not recommend applying a heat lamp to your eyes.

