Hacker News | connicpu's comments

It would be funny as a project, but there are better low-speed backup options, like a Starlink dish in standby mode (500kbps).

Is that publicly available? And how much would that cost?

You have to get a dish on a regular plan first, but then you can switch to standby mode for $5/mo. It might not be available if you rent the hardware, though; you may have to buy the dish outright.

A fork bomb can refer to any process that recursively spawns multiple child processes. You don't need fork() to spawn more copies of yourself; that's merely the classic Unix implementation.
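To make the point concrete, here's a deliberately bounded sketch (hypothetical code, not from the comment): each process launches two fresh copies of a tiny program via exec-style spawning, with no fork() call anywhere, yet the process count still grows exponentially. The depth limit is what keeps this harmless; a real fork bomb has none.

```python
import subprocess
import sys

# The self-replicating program. Each copy prints one line, then, if under
# the depth limit, execs two more copies of itself via the interpreter.
PROGRAM = """
import subprocess, sys
print("spawned at depth", sys.argv[1])
depth, limit, src = int(sys.argv[1]), int(sys.argv[2]), sys.argv[3]
if depth < limit:
    for _ in range(2):  # every copy launches two more copies
        subprocess.run([sys.executable, "-c", src,
                        str(depth + 1), str(limit), src])
"""

def spawn_tree(limit: int) -> int:
    """Run the bounded bomb; return how many processes actually ran."""
    out = subprocess.run(
        [sys.executable, "-c", PROGRAM, "0", str(limit), PROGRAM],
        capture_output=True, text=True, check=True,
    )
    # Children inherit the captured stdout pipe, so every process's
    # single print line lands in out.stdout.
    return len(out.stdout.splitlines())

print(spawn_tree(2))  # 1 + 2 + 4 = 7 processes
```

With the cap removed, the same exec-only structure would exhaust the process table just as effectively as the classic `fork()` version.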

Year of the Linux Desktop began for me last year. All the games I play work, and that's mostly what I do at home. Work is also a Linux desktop, because our build system runs there anyway, so I may as well use it directly (though some people still work primarily from Windows/Mac laptops and ssh into their desktops). The only Windows machine I have left is my work laptop, because IT doesn't offer Linux laptops, but it's basically just a thin client for accessing my desktop away from the office.


Not hidden from nation states with access to real-time satellite imagery, but more rustic guerrilla operations usually don't have such sophisticated access.


Just poor ones - how much could it cost to get a scan of the oceans once weekly or daily? 10 million dollars?


Actually, it's probably even cheaper: one generic scan to spot all the ships, and after that you only need images around each ship's last known location. You could probably use something like the Planet API.


Unless your lookup table is small enough to only use a portion of your L1 cache and you're calling it so much that the lookup table is never evicted :)
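As a concrete illustration of the small-LUT case, here's a sketch of a classic 256-entry popcount table (Python for brevity; the L1-residency effect being discussed only really shows up in compiled code, and the table contents are just the standard bit counts, nothing from the thread itself):

```python
# A 256-byte lookup table mapping each byte value to its popcount.
# At 256 bytes it occupies a tiny fraction of any L1 data cache, so in a
# hot loop it effectively never gets evicted.
POPCOUNT = bytes(bin(i).count("1") for i in range(256))

def popcount32(x: int) -> int:
    """Population count of a 32-bit value via four byte-table lookups."""
    return (POPCOUNT[x & 0xFF]
            + POPCOUNT[(x >> 8) & 0xFF]
            + POPCOUNT[(x >> 16) & 0xFF]
            + POPCOUNT[(x >> 24) & 0xFF])

print(popcount32(0xDEADBEEF))  # -> 24
```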


Even that is not necessarily needed; I have gotten major speedups from LUTs as large as 1MB because the lookup distribution was not uniform. Modern CPUs have high cache associativity and fast transfers between L1 and L2.

L1D caches have also gotten bigger, as large as 128KB. A Deflate/zlib implementation, for instance, can use a brute-force full 32K-entry LUT for the 15-bit Huffman decoding on some chips, no longer needing the small fast table.
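The direct-indexed table mentioned above can be sketched like this. This is toy code with a made-up 4-symbol code, not a real Deflate tree, and `MAX_BITS` is shrunk from Deflate's 15 down to 4 so the table has 16 entries instead of 32K; the construction is the same either way:

```python
MAX_BITS = 4  # Deflate would use 15, giving a 2**15 = 32K-entry table

# (symbol, code, code_length): a toy canonical Huffman code
CODES = [("a", 0b0, 1), ("b", 0b10, 2), ("c", 0b110, 3), ("d", 0b111, 3)]

def build_table(codes, max_bits):
    """Direct-indexed decode table: every max_bits-wide input window whose
    top bits match a code maps straight to (symbol, code_length), so
    decoding one symbol is a single array lookup."""
    table = [None] * (1 << max_bits)
    for sym, code, length in codes:
        # Fill every entry that has this code as its top `length` bits.
        for low in range(1 << (max_bits - length)):
            table[(code << (max_bits - length)) | low] = (sym, length)
    return table

TABLE = build_table(CODES, MAX_BITS)
print(TABLE[0b1011])  # top bits 10 -> ('b', 2)
```

The decoder then consumes `code_length` bits from its stream and repeats; the two-level small-table scheme exists precisely to avoid paying for the full `2**15` entries when cache is tight.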


It's still less space for other things in the L1 cache, isn't it?


I've never written a check, but I have had to deposit the occasional check. In the last 6 years the only checks I've received were first paychecks at a new job (before direct deposit was set up) and my covid stimulus checks.


In Washington voting is free. My ballot comes in the mail, I fill it out, I drop it in the outgoing mail. It's pre-stamped. I don't mind full citizenship verification at the time of registration, as that can be done months before it's actually time to vote.


A couple weeks ago someone on my team tried using the experimental "vibe-lint" that someone else had added to our CI system and the results were hilariously bad. It left 10 plausible sounding review comments, but was anywhere from subtly to hilariously wrong about what's going on in 9/10 of them. If a human were leaving comments of that quality consistently they certainly wouldn't receive maintainer privileges here until they improved _significantly_.


Black box data doesn't need that crazy throughput either, though. Traditional RF is much easier to get right, and it keeps working even when the aircraft loses track of where it is and can no longer point its laser at the satellite.


I write algorithms that operate on predictable amounts of data. It's very easy to work out the maximum number of things we need to hold and then allocate it all in fixed-size arrays. If you allocate all your memory at startup you can never OOM at runtime. Some containers need over 100GB, but like the parent comment said, we've already bought the RAM.
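The allocate-everything-at-startup pattern described above can be sketched roughly like this (hypothetical names and sizes; the comment doesn't describe its actual containers):

```python
import array

MAX_EVENTS = 1_000_000  # worst case worked out in advance, not guessed

class FixedPool:
    """A fixed-capacity buffer: all memory is allocated in __init__,
    so the processing path can never trigger an allocation or an OOM."""

    def __init__(self, capacity: int):
        # One allocation, up front: `capacity` doubles, zero-initialized.
        self.values = array.array("d", bytes(8 * capacity))
        self.count = 0

    def push(self, v: float) -> None:
        if self.count >= len(self.values):
            # Hitting this means the worst-case analysis was wrong,
            # which is a bug to fix, not a condition to grow past.
            raise RuntimeError("exceeded the precomputed worst case")
        self.values[self.count] = v
        self.count += 1

pool = FixedPool(MAX_EVENTS)
pool.push(3.14)
print(pool.count)  # -> 1
```

The key design choice is that overflow is treated as a hard error rather than a trigger to reallocate: the capacity bound is part of the algorithm's contract.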


I write algorithms that operate on less predictable amounts of data.


If you operate over all of your data every time it's a lot more predictable ;)


The data I operate on comes in from the outside world; I can't operate on all of it because most of it doesn't exist yet. I can't process an event that hasn't happened yet.

