For some definition of work, yes, but not every definition. Their product is not without flaw, leaving room for improvement, and room for improvement by more than just other AI.
> There are no vulnerabilities
That's just not true. There are loads of vulnerabilities, just as there are plenty in human-written code. Try it: point a vuln-hunting AI at the output of an AI that's been through the highest-intensity, highest-scrutiny workflow, even code that has already been AI-reviewed for vulnerabilities.
Reading around a bit, yes to Netflix adding anti-piracy measures, maybe to folks recording HDMI/DisplayPort.
Apparently, Netflix is using steganography/content watermarks in their 4K content itself to trace users who are pirating. This is from a totally unsourced Reddit thread[0], but they do reference a real company which claims to do this watermarking[1]. The claim is that in addition to Netflix requiring 4K content to be available only on platforms with Trusted Execution Environments[2], Netflix also encodes each ~10 second "chunk" of the video stream into at least 2 different versions: a Y and a Z version. Then, they serve each customer a unique series of chunks when that customer streams their content, e.g. YYZYZZZYZYYZYZYYZZYZYYZ. Then when content leaks, Netflix can examine each chunk of the leaked content to extract the ID of the user who streamed the content. Apparently, Netflix can encode a lot more than just the user ID; they can also encode stuff like the individual device ID, the TEE key ID, etc.
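To make the scheme concrete, here's a toy sketch of how an ID could be mapped onto a Y/Z chunk sequence and recovered from a leak. This is purely illustrative: the function names are made up, and the real system presumably adds error correction and other layers that aren't public.

```python
# Toy model of the two-variant chunk watermark: each ~10s chunk is served
# as either a "Y" or "Z" encode, and the sequence of variants spells out
# the bits of an identifier. All names here are hypothetical.

def encode_watermark(user_id: int, num_chunks: int) -> str:
    """Map a user ID's bits onto a per-chunk variant sequence (Y=0, Z=1)."""
    bits = format(user_id, "b").zfill(num_chunks)[-num_chunks:]
    return "".join("Y" if b == "0" else "Z" for b in bits)

def decode_watermark(chunk_sequence: str) -> int:
    """Recover the ID by reading which variant each leaked chunk came from."""
    bits = "".join("0" if c == "Y" else "1" for c in chunk_sequence)
    return int(bits, 2)

# A two-hour movie at ~10s per chunk gives ~720 chunks, so even with
# redundancy there's plenty of room for user ID, device ID, key ID, etc.
seq = encode_watermark(45, num_chunks=24)
assert decode_watermark(seq) == 45
```

Note that with one bit per chunk, an attacker only needs to identify which variant each chunk is, which is why the intercutting attack mentioned below-thread is plausible against this naive version.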
I know you might be thinking "I could do something to defeat that" and you're probably right (e.g. take streams from multiple users and intercut them so that the bits of the watermark through time are being constantly shuffled), but I'll also bet that there's many layers of steganography we don't know about, and unless you get them all, you'll not escape scot-free.
But the only real world impact is that the device that was used to stream that 4K content gets blacklisted at the hardware level.
To work around this, piracy groups try to batch 4K rips because they know that the device will be burned soon after they upload the content. They then acquire another device, and the game of whack-a-mole continues.
Not that I would ever pirate a movie because I'm a good boy, but I remember the Cinavia DRM that affected Blu-ray players thirteen years ago.
I'm not 100% sure how it worked, but I guess it could do a similar kind of steganography-style thing to the audio track, where they would embed keys silently and the blu-ray player would check against that.
I'm not sure if anyone actually ever managed to defeat it, I think they just stopped implementing it in streaming boxes.
What are they gonna do? Ban your account? You don't need to go through KYC to get a netflix account, so what's preventing you from using a prepaid card to sign up for another account?
Yes, when they work overtime they get paid more for that overtime than regular time.
The money doesn't somehow make it sustainable for the people burning out their lives. Working 7 days a week, including overnight shifts, for 20 years to collect a pension seems like WELL-earned compensation.
That seems unrelated to "we have so few" and "we immiserate the ones we do have".
I think rahimnathwani's point was not that they get extra pay so it's fine, but that it seems economically irrational to overwork fewer staff if it's actually more expensive.
Here in Norway it's similar with doctors: they get paid a lot because they work crazy hours. But the doctors' association is fighting to keep it that way, as the old-timers who didn't burn out along the way enjoy the high pay more than their spare time.
It's hard to argue you're underpaid if, as a result of short staffing, you're getting paid more (both in absolute terms and per unit of effort) than you signed up for.
Mostly yes. Upright fridge and freezer designs trade off efficiency for convenience (rooting around in a chest fridge/freezer can be annoying). https://youtu.be/CGAhWgkKlHI
The video downthread shows that the internal food supports are all wire meshes with big gaps. The cold air is not squirted up and out like a syringe; it's more like the food is kept in a birdcage that's lowered into and raised out of a pool of cold air.
Fly is not saying "just ignore SOC2 compliance". Fly is saying "yes, get SOC2, we had to become SOC2 compliant, and also, you can work with your auditor to achieve SOC2 compliance in a more sane way than if you just do whatever is recommended upfront."
Basically, they are saying that you should tailor your SOC2 implementation so that it's actually useful without being a horrible overbearing process, that you have that option and should take it.
Who can know what the world will look like as we "transition"? I sure don't, but I'm thankful the author here has taken a stab at it. I feel like this is one of the first stories I've seen to try to imagine this post-transition world in a way that isn't so gonzo as to be unrelatable. It was so relatable (the human-ness shining all the brighter in a machine-driven world) that I cried as I finished reading. I've felt very anxious about my own future, and to see one possible future painted so vividly, with such human and emotionally focused themes, triggered quite an emotional reaction. I think the feeling was:
> If the world must change, I hope at least we still tell such stories and share how we feel within that change. If so, come what may, that's a future I know I can live in.
Thank you for this comment, I'm so glad it made you feel a little bit better about the future, if even for a little while!
This is really the whole idea behind this project with Near Zero. I think there's a lot of anxiety out there around AI and the future; I was there for a while too. Ultimately I've ended up pretty optimistic about it all, and, inspired by what the group at Protocolized is doing, I found science fiction a great way to help express that.
You're right that this isn't some groundbreaking revelation. If you're using AI enough to be feeling it, you're feeling/seeing what they're talking about. The purpose of a paper/retreat like this is to get it all together and written down on paper, then to disseminate it to the wider world. I think the paper does a good job of collecting info that isn't wrong, and which has enough detail to help guide folks making decisions.
Don't downvote, it certainly did for me. My first computer was an MBP 13-inch from 2009, as I was Apple-obsessed like the person in the parent article. Time passed, and I realized I really didn't like either Windows or Mac, and for the past 10 years I've been Linux-only. It really does happen, even if rarely.
Good on you for rising up to the ranks of Linux/BSD.
You just need to recognize that not everybody aspires to be competent with the lower levels of hardware and software, which Apple makes that much more difficult to reach. Most Apple users are content to use apps written by others, and that is as far as their interest goes.
An analogy is the car market. Most people don't care how the car works, etc. They just want to get to places. If you only need to drive to the shops and do minimal errands, you don't even need a truck - a sedan will do just fine. Same with computers, lots of different market segments with distinct needs and expectations.
> You just need to recognize that not everybody aspires to be competent with the lower levels of hardware and software
You don't really need that to use Linux.
People should stop copy/pasting urban myths or stories from the late '90s. We are in 2026, and one can perfectly well buy a laptop with Linux preinstalled and full support, and just find the apps they need in an "app store", which in this case is just a frontend for the Flatpak and package managers. Picking up an app from Gnome Software is no different than installing an app from the Play/Apple/Microsoft store.
Yep everyone has their preference. A lot of us have done both. I’ve run multiple distros. I’ve played with low level software. I have used and continue to use open source tools in places.
And I prefer my Mac to this day as my main machine.
"Consumer user or Linux hacker" is a false dichotomy people sometimes like to slot others into (not accusing you, GianFabien).
My first computer was a Compaq my parents got during that peak era of home PC mass adoption in the late 90s. I immediately played a ton of games, got on AOL, learned VBScript, C++, HTML, etc.
This was such a natural and common thing that I never even questioned if others were having a different experience with computers. This sounds crazy now, but it felt as if everyone was either going to learn to program or already had, not as a career choice but as an essential form of literacy. I mean even the calculators were programmable!
To me, Macs were just "the boring computers" we had at school and what my grandparents bought. They seemed locked down and weird like an appliance. I have no idea what my life would be like now if I had grown up in a different time and with a Mac.
This isn't to hate on Macs, but to tell the story of the dominance of Microsoft at the time and how much culture shifted towards more "dumb" consumerism. By the time the first iPod came out I realized the adults had no interest in any of this more progressive future. Then the iPhone and Windows Vista confirmed it.
I installed Ubuntu on the ThinkPad I had in high school and never really looked back. To this day, I am still baffled by the obsessions people have with AI "replacing jobs" and Apple devices as status symbols. I think those people miss the point entirely, and I worry about their incomplete worldview being passed down to younger generations. What I see is the masses refusing to participate and technofeudalists taking advantage of them.