irdc's comments

It's been shown in other fields that training models on the output of other models produces subtly broken models, not a flattening to the statistical mean. Why would science be different?

Signetics was first with their 25120 Fully Encoded, 9046xN, Random Access Write-Only-Memory[0].

0. https://web.archive.org/web/20120316141638/http://www.nation...


As a Dutch person, what’s interesting to me is how this exact rule applies to Dutch. Maybe that’s why I didn’t notice it while reading the article…


Ideally the test should include the number of bit errors that were corrected using on-disc ECC. This could then also be used to estimate disc lifetime (preferably using multiple samples).
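
To sketch what I mean by the lifetime estimate (purely illustrative: the linear degradation model, the numbers and the "correctable limit" threshold are all made up here):

    #include <stdio.h>

    /* Toy lifetime estimate from corrected-error counts across samples.
     * Assumes linear degradation and an invented correction limit;
     * real media won't behave this neatly. */
    int main(void) {
        /* (age in years, corrected bit errors per MB) - hypothetical data */
        double age[]  = { 0.5, 1.0, 2.0, 3.0, 5.0 };
        double errs[] = { 1.2, 2.0, 4.1, 6.3, 10.8 };
        int n = 5;

        /* Least-squares fit: errs = a * age + b */
        double sx = 0, sy = 0, sxx = 0, sxy = 0;
        for (int i = 0; i < n; i++) {
            sx += age[i]; sy += errs[i];
            sxx += age[i] * age[i]; sxy += age[i] * errs[i];
        }
        double a = (n * sxy - sx * sy) / (n * sxx - sx * sx);
        double b = (sy - a * sx) / n;

        double limit = 50.0;  /* assumed point where the ECC stops coping */
        printf("trend: %.2f errors/MB per year\n", a);
        printf("estimated years until unreadable: %.1f\n", (limit - b) / a);
        return 0;
    }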


Thus making humanity an ever-receding area of AI-incompetence.


> Why do we even have linear physical and virtual addresses in the first place, when pretty much everything today is object-oriented?

But what happens when the in-memory size of objects approaches 2⁶⁴? How would you even map such a thing without multi-level page tables?
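
For scale, a quick sketch (the 9/9/9/9+12 split is just the standard x86-64 4-level layout; the address is made up):

    #include <stdint.h>
    #include <stdio.h>

    /* How a 48-bit virtual address splits into 4-level page table
     * indices plus a 4 KiB page offset. An object anywhere near 2^64
     * bytes would need on the order of 2^51 leaf entries - petabytes
     * of page tables at 8 bytes each. */
    int main(void) {
        uint64_t va = 0x00007f1234567abcULL;  /* hypothetical address */

        unsigned pml4 = (va >> 39) & 0x1ff;   /* level 4 index */
        unsigned pdpt = (va >> 30) & 0x1ff;   /* level 3 index */
        unsigned pd   = (va >> 21) & 0x1ff;   /* level 2 index */
        unsigned pt   = (va >> 12) & 0x1ff;   /* level 1 index */
        unsigned off  =  va        & 0xfff;   /* offset in 4 KiB page */

        printf("PML4=%u PDPT=%u PD=%u PT=%u offset=0x%x\n",
               pml4, pdpt, pd, pt, off);
        return 0;
    }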


What field do you work in that you’re mapping objects of size 2⁶³? Databases? When I see anything that size, it’s a bug.


Regions, like [0], for example? Multi-level page tables kinda suck.

[0] https://web.archive.org/web/20250321211345/https://www.secur...


16-bit programming kinda sucked. I caught the tail end of it, but my first project was using Win32s, so I just had to cherry-pick what I wanted to work on to avoid having to learn it at all. I was fortunate that a Hype Train with a particularly long track was about to leave the station, and it was 32-bit. But everyone I worked with or around would wax poetic about what a pain in the ass 16-bit was.

Meanwhile, though, the PC memory model really did sort of want memory to be divided into at least a couple of classes, and in that era we had to jump through a lot of hoops to deal with it. Even if I wasn't coding in 16-bit, I was still consuming 16-bit games with boot disks.


I was recently noodling around with a retrocoding setup. I have to admit that I did grin a silly grin when I found a set of compile flags for a DOS compiler that caused sizeof(void far*) to return 6 - the first time I'd ever seen it return a non-power-of-two value in my life.
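
A minimal way to poke at it (the near/far keywords and the exact sizes are compiler-specific - this assumes a DOS-era compiler such as Open Watcom or Borland, and the 6-byte case is a 16-bit selector plus a 32-bit offset on a 32-bit target):

    #include <stdio.h>

    /* Non-standard C: near/far only exist in DOS-era compilers.
     * On a 16-bit real-mode target a far pointer is usually 4 bytes
     * (segment + offset); with the right flags on a 32-bit target it
     * becomes 6 (selector + 32-bit offset). */
    int main(void) {
        printf("default pointer: %u bytes\n", (unsigned)sizeof(void *));
        printf("far pointer:     %u bytes\n", (unsigned)sizeof(void far *));
        return 0;
    }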


I believe Multics allowed multiple segments to be laid out contiguously. When you overflowed the offset, you got into the next object/segment.


Where sublimation is the redirection of socially unacceptable impulses or desires into socially acceptable actions, desublimation refers to the acceptance of these impulses and desires, removing the energies otherwise available for higher goals.


Anything more complicated than this was just too difficult with the early HTML standards (there was no CSS).


I ran this at one time but it was a bit unstable. I remember corresponding with one of the authors who remarked that it was also attempting to emulate the stability of Windows 95. This was ... oh gawd ... back in 1997 or 1998 I think.


Note that this new stepping fixes the notorious E9 erratum, which caused GPIOs to misbehave.


Additionally, for those integrating the chip into retrocomputing addons/mods, "RP2350 is now officially 5V tolerant".

One odd thing in the post is the mention of a test A3 variant, 30,000 of which will be put on random Pico 2 and Pico 2W boards.


The A3 stepping is documented in the updated datasheet, which has a really nice "Hardware Revision History" in Appendix C.

It has all the hardware changes and most of the bootrom changes of A4.

