
Right, but wasn’t high effort the default before? So ultrathink is gone in all but name.

They don’t want to officially disclose the reality because, while some users will understand the realities of protecting a product while innovating, many will just realize it means one can go looking for Claude 4.5 performance elsewhere.

I’m going in circles. Let me take a step back and try something completely different. The answer is a clean refactor.

Wait, the simplest fix is the same hack I tried 45 minutes ago but in a different context. Let me just try that.

Wait,


Wait, the linter re-ordered the file. Let me restore it to the previous state.

whisper: There is no linter.


Those test failures are pre-existing. We're all done!

Wait, I should check if they pre-exist on master.

    < 1,000 prompts for compound cd && git commands that can't be safely auto-accepted >

The job of execs/middle managers often seems to be dual parenting: 1) coordinate the capable, well-parented employees below them, and 2) pander to the usefully myopic spoiled brats above.

I have personally met the most spoiled brats while dealing with big tech employees.

If this were true and you were their manager you would not have to deal with it. You could coach or remove them.

I don't think I get to remove employees for being from privileged families and having unrealistic expectations.

I view people like him as large satellites or small planets that have been crashing through everything mindlessly, creating this wake of wreckage. The only way to keep going is to not look back, or in this framing, not look in.

In single-minded pursuit of a simple goal, and with early success, they reduce their own humanity so that their repeated actions can maintain their simple function.

Looking anywhere behind/within has become so overwhelming and so painful that they will construct elaborate narratives and even seek medical assistance (e.g. ketamine) to avoid the consequences of integration.


This is an overstatement of the protection that blanks provide. As it says, they only (potentially) provide insight into contamination caused during the extraction process.

Yeah it’s called a regex. With a lot of human assistance it can do less but fits in smaller spaces and doesn’t break down.

It’s also deterministic, unlike LLMs…
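
For what it’s worth, the determinism point is trivial to demonstrate (a hypothetical example, not taken from the thread above):

    # The same pattern on the same input yields the same matches every run.
    import re

    pattern = re.compile(r"\b\d{4}-\d{2}-\d{2}\b")   # e.g. pull ISO dates out of text
    text = "released 2023-11-07, patched 2024-01-15"

    runs = {tuple(pattern.findall(text)) for _ in range(1000)}
    assert runs == {("2023-11-07", "2024-01-15")}    # identical result every time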

Is this true for anything beyond the simplest LLM architectures? It seems like as soon as you introduce something like CoT this is no longer the case, at least in terms of mechanism, if not outcome.

Those products aren’t typically described as having been “hyped” though — just successful or viral. Hyped has a sort of derogatory/schadenfreude subtext.

This was published right before people started experimentally validating the Landauer limit. I am not sure why it hasn’t been taken down at some point as the evidence has accumulated:

2012 — Bérut et al. (Nature) — They used a single colloidal silica bead (2 μm) trapped in a double-well potential created by a focused laser. By modulating the potential to erase the bit, they showed that mean dissipated heat saturates at the Landauer bound (k_B T ln 2) in the limit of long erasure cycles.

https://www.physics.rutgers.edu/~morozov/677_f2017/Physics_6...

2014 — Jun et al. (PRL) — A higher-precision follow-up using 200 nm fluorescent particles in an electrokinetic feedback trap. Same basic physics, tighter error bars.

https://pmc.ncbi.nlm.nih.gov/articles/PMC4795654/

2016 — Hong et al. (Science Advances) — First test on actual digital memory hardware. Used arrays of sub-100 nm single-domain Permalloy nanomagnets and measured energy dissipation during adiabatic bit erasure using magneto-optic Kerr effect magnetometry. The measured dissipation was consistent with the Landauer limit within 2 standard deviations, in the medium that is the actual basis of magnetic storage.

https://www.science.org/doi/10.1126/sciadv.1501492

2018 — Gaudenzi et al. (Nature Physics) — Opens with:

The erasure of a bit of information is an irreversible operation whose minimal entropy production of k_B ln 2 is set by the Landauer limit. This limit has been verified in a variety of classical systems, including particles in traps and nanomagnets. Here, we extend it to the quantum realm by using a crystal of molecular nanomagnets as a quantum spin memory and showing that its erasure is still governed by the Landauer principle.

https://www.nature.com/articles/s41567-018-0070-7

The Landauer limit is not conjecture.


I haven't finished reading this yet, but I don't think the author is saying that the Landauer limit for erasure is wrong. They're saying that there are other limits in computing beyond erasure. I think this makes sense; although reversible computing should be possible at zero temperature and infinite precision, realistic computers need some way to remove entropy that accumulates during the computation.

So I don't think their claim is in tension with any of the papers that you cite.


I'm not sure, but isn't 2 standard deviations a bit low? Especially so for something that can be done in a lab. It seems that 2 SD is the minimum threshold for getting published. Can we be sure that these are properly reviewed?

Could it be that you’re confusing this with the number of standard deviations one needs to falsify something? For instance, if two things are different we want them to be as many SD apart as we can show. Here, on the other hand, the data agree _within_ 2 SD.

That was the limit of just one experimental approach that was peer reviewed and published in a major journal. As you can see there are many experiments validating the limit and none invalidating it.

The reality is that the Landauer limit is vanishingly small. I would encourage you to review the experiment methodology and see if you can come up with better, fundable methods.
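
To put a number on just how small (a back-of-the-envelope sketch; T = 300 K is an assumed room temperature):

    import math

    k_B = 1.380649e-23                      # Boltzmann constant, J/K
    T = 300.0                               # assumed room temperature, K

    E = k_B * T * math.log(2)               # minimum heat to erase one bit
    print(f"{E:.2e} J")                     # ~2.9e-21 J
    print(f"{E / 1.602176634e-19:.3f} eV")  # ~0.018 eV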


Is the focus on the erasure of a bit, rather than writing a bit, just conventional or is there a significant difference between the processes?

Erasure is logically irreversible; writing a bit is not. When you erase a bit you compress the logical phase space of the closed system, which means the missing information has to go somewhere: in this case, as a couple of very low-energy phonons dumped into the larger environment.
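
A minimal version of that counting argument, using the Boltzmann entropy S = k_B ln Ω (the standard textbook step between “compress the phase space” and the k_B T ln 2 figure quoted above):

    \text{before erasure: } \Omega = 2, \qquad \text{after erasure: } \Omega = 1
    \Delta S_\text{bit} = k_B \ln 1 - k_B \ln 2 = -k_B \ln 2
    \Delta S_\text{env} \ge k_B \ln 2 \quad \text{(second law: total entropy cannot decrease)}
    Q_\text{min} = T \, \Delta S_\text{env} = k_B T \ln 2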

Ah, I thought writing a bit was irreversible, because after writing say 1, the previous state could have been a 0 or a 1. But in fact writing a bit should be thought of as the whole process "0 to 1" or "1 to 1", including the initial bit, so that the process is logically reversible. Is that right? Then what I had in mind as an irreversible process of writing would be equivalent to first erasing the bit and then writing the new one.

Yes exactly, your conception of write includes their conception of erase.

Bennett’s reversible computing is essentially just the way to avoid erasure: whenever you write, keep all the previous states around.
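
A toy sketch of that idea (a hypothetical illustration only, not Bennett’s actual construction):

    # A "register" that never overwrites: every write appends, so every write
    # can be undone, and nothing is erased until you deliberately drop the
    # history (which is where the k_B T ln 2 per bit eventually gets paid).
    class ReversibleBit:
        def __init__(self, value: int = 0):
            self.history = [value]        # keep all previous states around

        def write(self, value: int) -> None:
            self.history.append(value)    # logically reversible

        def unwrite(self) -> int:
            return self.history.pop()     # recover the previous state exactly

        @property
        def value(self) -> int:
            return self.history[-1]

    b = ReversibleBit(0)
    b.write(1)                            # "0 -> 1", the old 0 is still there
    assert b.unwrite() == 1 and b.value == 0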

