Hacker News | petedoyle's comments

Wow, I'd love to do this. Any tips on how to build this (or how to help an LLM build this), specifically for ./gradlew?


The U.S. Missile Defense Agency (MDA), in cooperation with U.S. Space Force and U.S. Northern Command, conducted a flight test on June 23, 2025, in which the Long Range Discrimination Radar (LRDR) at Clear Space Force Station, Alaska, successfully acquired, tracked and reported missile target data to the Command and Control Battle Management and Communications (C2BMC). This was the radar’s first flight test tracking a live Intercontinental Ballistic Missile (ICBM) representative target. More: https://www.mda.mil/news/25news0006.html


> The absolute nightmare is about giving Google the root signing key of your application

I wish more people talked about this. At Amazon, I helped with the early threat modeling around adoption of "App Signing by Google Play", which requires sending your app's root signing key to Google (and is now required for new apps, with no publicly available opt-out). It would have added some nice things for Android devs: app bundles, smaller downloads, instant apps, etc.

That said, we imagined the following scenario, and were unable to find a reasonable mitigation at the time:

It seems plausible the US government could send an NSL (or similar) to Google and force them to distribute modified APKs for apps like Signal (ex: to exfiltrate keys). This would be nearly impossible to detect, especially if the modified APK were distributed to only an individual user, or a small group. A few people raised concerns [1], but I don't recall Google ever giving a reasonable response.

[1] https://commonsware.com/blog/2020/09/23/uncomfortable-questi...

Edit: clarify no opt out applies to new apps


Well, this is one of those HN comments that I will never forget. Someone once wrote a stylometry analyzer for HN comments (and then removed it after a buyer purchased it and required its takedown). A supposedly senior Googler lambasted some Snowden slides, commenting that such things were impossibly unimaginable inside Google (this was before it became widely accepted that internal services at such companies should, of course, be using some transport security). I got in some silly fight with someone ... 13+ years ago? These are the specific things I remember. And now, probably, your comment.

I didn't trust stock Android before, and I felt the sinking-gut feeling as soon as I realized where "upload root signing key" was going, but spelling it out here puts a ... fine point on things.

Thanks for the comment.




> > The absolute nightmare is about giving Google the root signing key of your application

> It seems plausible the US government could send a NSL (or similar) to Google and force them to distribute modified APKs for apps like Signal

Since when do you have to hand over your signing keys to Google? I seem to remember the Signal devs saying that they preferred publishing their app on Google Play as opposed to F-Droid because in the former case they control the signing keys. Has this changed?


> Since when do you have to hand over your signing keys to Google?

Since Google requires App Bundles, which become mandatory as soon as you support Android TV, for example.

https://android-developers.googleblog.com/2022/11/app-bundle...

See https://dev.to/npomepuy/vlc-for-android-updates-on-the-play-...


Apologies / small correction:

Apps first published to the Play store before August 2021 are not required to upload their keys [1]. This likely includes Signal.

[1] https://developer.android.com/guide/app-bundle


Unless they use Android TV, for example: See https://dev.to/npomepuy/vlc-for-android-updates-on-the-play-...


Google Play also limits APKs to 100MB maximum size while AABs have a higher limit.


Thanks. TIL.


Just for completeness: for reproducible builds, F-Droid can now distribute builds signed by the developer.


This has been the case for a few years now, and you could always distribute whatever you wanted from your own repo.


They require the private key? When they could just ask for the certificate and cross-sign? Can't imagine any valid reason for that...

Would be nice to get a confirmation of this as it sounds wild.


The valid reason, for them, is that they would have to spend money on supporting and maintaining cross-signing. I can imagine it is much, much cheaper to just store the private key.

So if they can get away with it they just do it, no one is there to stop them.


> Can't imagine any valid reason for that...

Depends on your paranoia level: either because of laziness or because of evil intentions...


> Depends on your paranoia level: either because of laziness or because of evil intentions...

They disposed of the "Don't be evil" promise in a very active and energetic manner, seems like we have rational grounds for deciding, without paranoia :)


Somewhat off-topic: Does anyone know the underlying strength of the keys used as the "root of trust" behind passkey synchronization on Android/iOS? I can't find a lot of documentation on this.

It seems like they're synced between devices using client-side encryption, with keys derived from your phone's lock code (typically only 4-6 digits). Is it possible that the passkeys are fully random, but then encrypted with far less than 128/256 bits of actual entropy while being synchronized between devices?

Could it be possible to brute force the keys server-side (IIUC, derived from 4-6 digit pins) with non-excessive amounts of compute? What am I missing?


A confidential channel can be established over an insecure medium using e.g. a Diffie-Hellman key exchange. To protect against MITM, an out-of-band channel (QR code, Bluetooth) can be used.
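As a sketch of that first step, here's a toy finite-field Diffie-Hellman exchange in Python. The prime and generator are demo values chosen for readability, far too small for real use; real systems use vetted groups or authenticated curves like X25519.

```python
# Toy Diffie-Hellman over an insecure channel. Only A and B cross the wire;
# an eavesdropper who sees them cannot feasibly recover the shared secret
# (with real-world parameters; these demo values are deliberately small).
import hashlib
import secrets

p = 2 ** 127 - 1   # a Mersenne prime; fine for a demo, too small for production
g = 3

# Each party picks a random private exponent and publishes g^x mod p.
a = secrets.randbelow(p - 2) + 1   # Alice's private value
b = secrets.randbelow(p - 2) + 1   # Bob's private value
A = pow(g, a, p)                   # Alice -> Bob (public)
B = pow(g, b, p)                   # Bob -> Alice (public)

# Both sides arrive at the same secret without it ever crossing the wire.
alice_secret = pow(B, a, p)
bob_secret = pow(A, b, p)
assert alice_secret == bob_secret

# Hash the shared secret down to a 256-bit symmetric session key.
session_key = hashlib.sha256(alice_secret.to_bytes(16, "big")).digest()
```

This is exactly where the MITM caveat bites: an attacker who can swap A and B in transit defeats the exchange, which is what the out-of-band QR/Bluetooth verification is for.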


Typically you see symmetric encryption keys (AES-256 is the most common), derived from a Password KDF. I don't know what Google or Apple do specifically, but that'd be my first guess.
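A minimal sketch of that pattern with Python's standard library, using PBKDF2 as a stand-in (the exact KDF and iteration counts Apple or Google use are not established in this thread):

```python
# Derive a 256-bit symmetric key from a low-entropy secret via a password KDF.
# Stretching slows down each guess; it does not add entropy to the PIN itself.
import hashlib
import os

pin = b"482916"        # hypothetical 6-digit screen-lock PIN
salt = os.urandom(16)  # random per-user salt
key = hashlib.pbkdf2_hmac("sha256", pin, salt, 600_000)  # 600k iterations

assert len(key) == 32  # 32 bytes = AES-256 key size
```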


I didn't know him, but followed his work on bufferbloat closely. I've never seen anyone work so diligently, for so many years, to fix a problem most people will never know even existed. And yet, that work will be felt by almost everyone on the internet. I'm sad knowing he's passed, and thankful to have seen his work. Rest in peace.


https://xkcd.com/2347/ (which is, in my opinion, highest praise in this line of work)


Small FYI that I couldn't see them in Chrome 133.0.6943.142 on MacOS. Firefox works.


It's the complete opposite for me — there are no animations in Firefox even with uBlock Origin disabled, but Brave shows them fine.

The browser console spams this link: https://react.dev/errors/418?invariant=418

edit: looks like it's caused by a userstyles extension injecting a dark theme into the page; React doesn't like it and the page silently breaks.


Ohhh interesting! Obviously not ideal, but I guess just an extension issue?


Interesting. Running any chrome extensions that might be messing with things? Alternatively, if you can share any errors you're getting in the console lmk.


Oh, looks like it. I disabled extensions one by one til I found it was reflect.app's extension. Edit: reported on their discord.

False alarm :) Amazing work!!


If Apple were to add new APIs, it might be possible to use personal cloud storage (NAS, Decentralized Web Nodes, etc.) with the same UX as iCloud with E2EE.


> it might be possible to use personal cloud storage [...] with E2EE

Which would quickly become illegal if UKGOV is set on getting access to people's iOS backups / cloud storage / etc. Hell, it's already a legal requirement to hand over your keys if UKGOV demands them[0].

[0] "Regulation of Investigatory Powers Act 2000 part III (RIPA 3) gives the UK power to authorities to compel the disclosure of encryption keys or decryption of encrypted data by way of a Section 49 Notice." https://wiki.openrightsgroup.org/wiki/Regulation_of_Investig...


Scale matters. Police don't have the time to go through everyone's computers. It is much easier to scan everyone's conversations, notes, or photos. Cloud storage invites this kind of mass surveillance by being high-value targets with little capacity to resist.


I would be less pissed with this if the UK actually kept the data to the UK.


You'd be fine with _domestic surveillance_ as long as it's kept within country? The average jurisprudence of a UK citizen is mind blowing to me.


I'm not british. I would be fine under their government. Not too thrilled but fine


Parent said "less pissed", not "fine"


I don't negotiate with terrorists.


Bit more complicated than that. iCloud isn't passive storage. A fair bit of the logic exists on the server.


Ah, so in the UK or China this could go through a proxy that steals all the keys.

Half the computer crimes in the UK involve illegal access to the PNC (police national computer), how exactly do we think this would go.

For all the checks you put on people who can access this stuff, the temptation is too big - just look at the intelligence analysts using these systems to stalk exes etc.

For any system like this to exist you must ask yourself if you would be happy with the worst person you know having a job where they have access to it.


Maybe they wanted some cached data to get invalidated if users change their passwords?


Then use some other data which can act as a proxy for that, like the date of the last credential change. Using the password itself is a terrible security smell.


Would be fun if this turned into a huge recruiting event :)


This has been bothering me, a lot. Google talks [1] about how Passkey replication is e2e encrypted between devices, but AFAICT they're just using a pin + key derivation. A six digit pin is like 20 bits of entropy before a KDF. [2]

Has anyone seen any docs that might help characterize how much entropy the keys have for e2e encryption (Android/iOS)?

I must be missing something, because I can't see how Google would call something e2e encrypted if the keys only have like 30-35 bits of "effective" entropy after a KDF. But that seems like it's the case??

> "From the user's point of view, this means that when using a passkey for the first time on the new device, they will be asked for an existing device's screen lock in order to restore the end-to-end encryption keys" [1]

[1] https://security.googleblog.com/2022/10/SecurityofPasskeysin...

[2] https://www.omnicalculator.com/other/password-entropy?c=SGD&...
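For reference, the arithmetic behind those entropy figures is just log2 of the number of PIN combinations (a KDF slows each guess but cannot add entropy):

```python
# Entropy of an n-digit numeric PIN: log2(10^n) bits.
import math

for digits in (4, 6):
    combos = 10 ** digits
    print(f"{digits}-digit PIN: {combos:,} combos ≈ {math.log2(combos):.1f} bits")
# 4-digit PIN: 10,000 combos ≈ 13.3 bits
# 6-digit PIN: 1,000,000 combos ≈ 19.9 bits
```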


Talking about Apple here because it's what I'm more familiar with, and their security whitepapers are more widely available.

The PIN and key derivation wraps the actual encryption key that's stored locally in the device or secure enclave, not the actual secrets that are stored in the provider's cloud. The actual wrapping keys are random 256 bit AES-GCM keys. This approach works because the secure enclave provides measures against bruteforcing and tampering.

There is some controversy that I can't find an explanation for in any whitepaper, specifically here: https://support.apple.com/en-us/HT202303 where it reads "(...) this data remains secure even in the case of a data breach in the cloud. If you lose access to your account, only you can recover this data, using your device passcode or password, recovery contact, or recovery key." because that implies off-device use of the PIN, so those measures are lost. There's no further explanation that I could find about that. Some previous discussion about that particular point here: https://news.ycombinator.com/item?id=33897793&p=2#33900540


Thank you! I'm trying to understand more deeply, so I appreciate it. :)

> This approach works because the secure enclave provides measures against bruteforcing and tampering.

That's interesting!

> because that implies off-device use of the PIN, so those measures are lost

This link from your previous thread is interesting: https://support.apple.com/en-sg/guide/security/sec3e341e75d/...

Uses SRP to let the device prove to iCloud HSMs that the user entered the correct pin, without ever sending it over the wire. The HSMs have similar protections for brute forcing, etc.

From the docs I have a fairly high confidence entropy is 256 bits for iCloud Keychain. I have much less confidence on Android, but I'm still researching... :)


Sure, but if that key derivation function is protected by a "you get 10 attempts, then we wipe the keys" safeguard, the effective entropy is much higher. The question shouldn't really surround the effective entropy of the PIN, but rather the systems in place to protect against bypassing the safeguards around the key derivation function, which render the actual entropy of the PIN irrelevant. There is probably no way around that safeguard, and as more of this gets moved into trusted-compute silicon, the level of sophistication required to breach it goes up; though it is always one hardware revision or operating-system update away from being made moot again.
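The lockout safeguard described above can be sketched as an attempt counter that wipes the wrapped key material after N failures (a hypothetical illustration, not Apple's or Google's actual implementation):

```python
# Sketch of a PIN lockout guard: after MAX_ATTEMPTS wrong guesses the wrapped
# key is wiped, collapsing a 6-digit PIN's 10^6 guess space to at most
# MAX_ATTEMPTS online guesses.
import secrets
from typing import Optional

class PinGuard:
    MAX_ATTEMPTS = 10  # assumed limit; real enclaves tune this and add delays

    def __init__(self, pin: str):
        self._pin = pin
        # Stand-in for the wrapped 256-bit key a secure enclave would protect.
        self._wrapped_key: Optional[bytes] = secrets.token_bytes(32)
        self._failures = 0

    def try_unlock(self, guess: str) -> Optional[bytes]:
        if self._wrapped_key is None:
            raise RuntimeError("key material wiped")
        if not secrets.compare_digest(guess, self._pin):
            self._failures += 1
            if self._failures >= self.MAX_ATTEMPTS:
                self._wrapped_key = None  # irreversible wipe
            return None
        self._failures = 0
        return self._wrapped_key

guard = PinGuard("482916")
assert guard.try_unlock("000000") is None       # one wrong guess
assert guard.try_unlock("482916") is not None   # correct PIN still unlocks
```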

This thread really smells like https://xkcd.com/538/. Three things you have to remember, that are far more important than any of the concerns you have:

1) The effective entropy of the current system (passwords) is "shrugs shoulders fuck it not our problem". Services can enforce password entropy requirements. They cannot effectually require users to use a unique password. They also cannot forbid users from writing the password they use in a .txt file on their desktop or post-it note or throwing it in Apple Notes (EVERYONE does this outside of our bubble. Apple Notes and Excel are the #1 and #2 password managers on the planet). A six digit pin + hardware TPM key derivation is, at best, the same thing that was guarding how most people store their passwords anyway, and in many cases far better than the current state (if a user's device has no E2EE, or if they're syncing their passwords.xlsx file with Dropbox, etc).

2) Passkeys do not, and are not designed to, protect against nation-state-level attackers. Passwords weren't either. They also don't protect well against the "grab a hammer and beat it out of him" threat vector; you're going to give up your password, and tomorrow they'll probably have your iPhone and your passkeys will be disclosed as well. Passkeys are designed to protect against unsophisticated (and even moderately sophisticated) attackers: phishing, data breaches, etc.

3) If you want higher tiers of entropy guarding your passkeys, you can do that. 1Password, as an example, already has this [1]. They store passkeys, and encrypt those passkeys with their two-level account & master password keys. Done! If you don't like 1Password, you can roll your own, and I'm sure OSS password managers like gopass/keepass/etc will eventually add this. Passkeys/WebAuthn don't prescribe to anyone how you store the private keys; Apple will do their thing, Google will do their thing, you don't have to use them, many people will, and they'll be better off (see point 1).

[1] https://future.1password.com


> Sure, but if that key derivation function is protected by a "you get 10 attempts then we wipe the keys" safeguard, the effective entropy is much higher.

Thank you. 100% agree.

> Passkeys do not and are not designed to protect against nation-state level attackers

I've been mulling over some use-cases where this is important, hence the deep consideration over entropy. 100% not a huge deal for the passkeys case for many 9's of people.

