henry700's comments

The paper security backup "d'oh" equivalent to this would naturally be storing the encrypted PaperAge QR codes in the same physical location as the unencrypted QRkey paper containing the decryption key. Which would be hilarious to witness.


It's a shame that every new cool product/data format/cable/CPU/whatever researched by Apple has very little (or no) public documentation. Sure, there are lots of hackers who can test and reverse engineer those pretty quickly, but it's just unnecessary work. I don't know why Apple is so revered in hacker circles, to be honest. Not even Microsoft does this shit anymore; they've been open sourcing a lot of research this decade, yet they're still viewed with extreme distrust. Whereas Apple has always been secretive and used underhanded tactics, but it is still loved.


Here's the link to Darwin, the underlying Unix layer of macOS, which is open source [1].

Go for it!

[1]: https://github.com/apple-oss-distributions/distribution-macO...


You're joking, right? You can see at a glance that this mess is only published for compliance reasons. There is no documentation at all, and most of the code consists of numerous versions of open-source libraries that have been subtly modified due to their licensing requirements, which necessitate disclosure of modifications. Good luck building any flavor of that! See: https://news.ycombinator.com/item?id=35197308


>For a while, the lack of generics surpassed complaints about error handling, but now that Go supports generics, error handling is back on top

>The Go team takes community feedback seriously

It feels like reading satire, but it's real.


They clearly are wrestling with these issues, which to me seems like taking the feedback seriously. Taking feedback seriously doesn’t imply you have to say yes to every request. That just gives you a design-by-committee junk drawer language. We already have enough of those, so personally I’m glad the Go team sticks to its guns and says no most of the time.


How is Go not a design-by-committee language? They don't have a single lead language developer or benevolent dictator, and as this blog demonstrates, they're very much driven by consensus.


True, but they’re very picky as committees go. But yeah, maybe not the best use of that expression…


I find it peculiar how, in a language so riddled with simple concurrency architectural issues, the approach is to painstakingly fix every library after fixing the runtime, instead of just using some better language. Why does the community insist on such a bad language when literally even fucking JavaScript has a saner execution model?


I think the opposite. Every language has flaws. What's impressive about Python is their ongoing commitment to work on theirs, even the deepest-rooted. It makes me optimistic that this is a language to stick with for the long run.


I agree about using other languages that have better concurrency support if concurrency is your bottleneck.

But changing the language in a brownfield project is hard. I love Go, and these days I don’t bother with Python if I know the backend needs to scale.

But Python’s ecosystem is huge, and for data work, there’s little alternative to it.

With all that said, JavaScript ain’t got shit on any language. The only good thing about it is Google’s runtime, and that has nothing to do with the language. JS doesn’t have true concurrency and is a mess of a language in general. Python is slow, riddled with concurrency problems, but at least it’s a real language created by a guy who knew what he was doing.


> instead of just using some better language

Python the language is pretty bad. Python the ecosystem of libraries and tools has no equal, unfortunately.

Switching a language is easy. Switching a billion lines of library less so.

And the tragic part is that many of the top “python libraries” are just Python interfaces to a C library! But if you want to switch to a “better language” that fact isn’t helpful.


I wonder if we'll get automatic LLM translation of codebases from one language to another soon; this could close the library gap and diminish the language lock-in factor.


i find it peculiar how tribal people are about languages. python is fantastic. you're not winning anyone over with comments like this. just go write your javascript and be happy, bud.


Amazing to do this on a weekend and keep enough track of it to write a great article within three weeks' time.


Has implications for competitive FPS gaming on modern Linux — one more problem to fix. For example "The Finals" allows Linux players, but imagine having this much input delay and having to revert to classic x11 to play games, lol.


It's actually just cursor latency specific to the windowed environment you're running. From what I've experienced (with a 4060 Ti) it doesn't seem to impact FPS games at all.

I haven't tried any games that use a cursor with Wayland yet so I don't know if it would have an impact there.

I think it has to do with whether or not the game in question is reading the mouse device directly (e.g. through SDL) or via the compositor. If it's reading the device directly, it stands to reason that there would be less latency.
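
As a rough illustration of the "reading the device directly" path, here's a minimal sketch using the python-evdev bindings to pull relative-motion events straight from the kernel's evdev interface, bypassing the compositor entirely (the device path is just a placeholder, and it needs read access to /dev/input):

  import evdev

  # Placeholder path; use evdev.list_devices() to find your actual mouse.
  dev = evdev.InputDevice("/dev/input/event3")

  for event in dev.read_loop():
      # EV_REL events are relative motion reports (REL_X / REL_Y deltas).
      if event.type == evdev.ecodes.EV_REL:
          print(evdev.categorize(event))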


> Has implications for competitive FPS gaming

That has not been proven in the article. Input handling follows different paths for full screen games.


> having to revert to classic x11 to play games, lol.

It would be more of a problem the other way around, if we had to resort to Wayland to get low latency. I think most of us using Linux for gaming and casual stuff are happy to stick with X11 for now and the foreseeable future. It has good support in software, its failure modes are well documented, and it doesn't add one more layer to the pile of complexity that desktop Linux already is; at least, not one that we users have to consciously keep in mind as a common source of issues.


If anything, Wayland removes a layer: it replaces the insanely huge monolith full of binary blobs with bog-standard Linux kernel APIs.


As I type this, I'm playing "the finals" on Linux via Proton on Wayland. I won't pretend I'm any kind of competitive gamer type, but it's perfectly fine and I don't feel like I'm being held back by input lag. So this is very much a niche issue to have.


On CS2, Wayland gave a major performance boost, but it's being held back by a regression introduced by a change in the device input layer.

https://github.com/ValveSoftware/csgo-osx-linux/issues/3856

From the outside it's hard to tell whether it's truly protocol differences or just the age of the X11 implementations, but when Wayland came out every project claimed improvements over the old X11 stack. Also, presentations from the early Wayland days bashed the X11 protocol as something that couldn't be fixed without a rework that was never going to happen, due to the dead weight of backwards compatibility and awful older hardware.

As a user, I find that applications running on Wayland have consistently improved how nice things feel, as long as you don't miss your latency deadlines. It's easy to perceive in apps, and outright obvious in games.


I got to diamond in Apex Legends on Linux using Wayland on KDE Plasma. Didn't feel any noticeable difference between Wayland, X, and Windows in my Apex playing.


It doesn't. This article doesn't measure that, and full screen works differently. This article also only measures GNOME.


I have trouble believing that 6.5ms of increased latency would be perceptible to any more than a fraction of a percent of the most elite gamers. Most of the people claiming that this level of difference is impacting their gameplay are victims of confirmation bias.


David Eagleman has done some work with drummers. Granted, the auditory system might be a bit more accurate than the visual one, or maybe drummers are just weird. On the other hand, vim taking 30 milliseconds to start (ugh) and having sluggish cursor motions is why I'm on vi now. Haven't tried Wayland. Maybe in some number of years, once it's more portable and more developed? (And how many years has it already been out?)

> “I was working with Larry Mullen, Jr., on one of the U2 albums,” Eno told me. “ ‘All That You Don’t Leave Behind,’ or whatever it’s called.” Mullen was playing drums over a recording of the band and a click track—a computer-generated beat that was meant to keep all the overdubbed parts in synch. In this case, however, Mullen thought that the click track was slightly off: it was a fraction of a beat behind the rest of the band. “I said, ‘No, that can’t be so, Larry,’ ” Eno recalled. “ ‘We’ve all worked to that track, so it must be right.’ But he said, ‘Sorry, I just can’t play to it.’ ”

> Eno eventually adjusted the click to Mullen’s satisfaction, but he was just humoring him. It was only later, after the drummer had left, that Eno checked the original track again and realized that Mullen was right: the click was off by six milliseconds. “The thing is,” Eno told me, “when we were adjusting it I once had it two milliseconds to the wrong side of the beat, and he said, ‘No, you’ve got to come back a bit.’ Which I think is absolutely staggering.”


It doesn't need to be perceptible to cause a difference in a game.

Suppose two players notice each other at the same time (e.g. as would naturally happen when walking around a corner in a shooter), first to shoot wins, and their total latencies are identical Gaussians with a standard deviation of 100ms. Then a 6.5ms reduction in latency is worth an additional 2.5% chance of winning the trade. Maybe you won't notice this on a moment by moment basis, but take statistics and its impact should be measurable.

In ELO terms a 2.5% gain in win rate is around a 10 point increase (simplifying by assuming that single Gaussian is the entire game). That's small, but if you were a hardcore player and all it took to raise your ELO by 10 points was using a better monitor/mouse/OS... why not? Doing that is cheap compared to the time investment required to improve your ELO another 10 points with practice (unless you're just starting).
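
A quick Monte Carlo sketch of that back-of-the-envelope model (the mean, the standard deviation, and the assumption that both players' latencies vary are purely illustrative, so the exact percentage and Elo figure depend on them):

  import math
  import random

  random.seed(0)
  TRIALS = 1_000_000
  ADVANTAGE_MS = 6.5   # latency shaved off for player A

  wins = 0
  for _ in range(TRIALS):
      # Both total latencies drawn from the same Gaussian (illustrative parameters).
      a = random.gauss(250, 100) - ADVANTAGE_MS
      b = random.gauss(250, 100)
      if a < b:        # lower total latency fires first and wins the trade
          wins += 1

  p = wins / TRIALS
  # Elo difference corresponding to an expected score p: d = 400 * log10(p / (1 - p))
  elo_edge = 400 * math.log10(p / (1 - p))
  print(f"win rate: {p:.3f} ({(p - 0.5) * 100:+.1f} points vs. even)")
  print(f"equivalent Elo edge: ~{elo_edge:.0f} points")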

Also, I think you'd be surprised what people can perceive in a context where they are practiced. Speed runners hit frame perfect tricks in 60FPS games. That's not reaction time but it does intimately involve consistent control latency between practice and execution.


Slightly off topic…

> Suppose two players notice each other at the same time (e.g. as would naturally happen when walking around a corner in a shooter)

This is not true for third person games. Depending on a left-sided or right-sided peek and your angle of approach, players see asymmetrically.

For example, Fortnite is a right side peek game. Peeking right is safer than peeking left as less of your body is exposed before your camera turns the corner.

I believe distance also plays a part in the angles.


Yeah, network latency and client side prediction and accuracy will also play huge roles. The actual distributions will be very complex, but in general reacting faster is going to be better.


Do people not play deathmatches at LAN parties anymore these days? 2.5% is huge if the game lasts long enough that someone would be leading with 200. ;)


I would postulate that 100% of professional (i.e. elite) competitive gamers would be able to tell the difference. See this old touchscreen demonstration: https://www.youtube.com/watch?v=vOvQCPLkPt4


As for the difference between a 60Hz and a 120Hz monitor: it's instantly noticeable just by moving the mouse in Windows (just look at the distance between cursor positions as it moves). Would you argue that all gaming monitors are placebo?
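
Rough arithmetic behind that observation, with a made-up flick speed just for illustration:

  # Distance between consecutive cursor positions during a fast flick.
  CURSOR_SPEED_PX_PER_S = 3000   # illustrative flick speed

  for hz in (60, 120, 240):
      frame_ms = 1000 / hz
      gap_px = CURSOR_SPEED_PX_PER_S / hz
      print(f"{hz:>3} Hz: new frame every {frame_ms:4.1f} ms, cursor jumps ~{gap_px:3.0f} px per frame")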


Just to nitpick, that difference is still above 6.5 ms.


I actually would. Gaming monitors are the equivalent of fancy audiophile gear. It's a way to fleece people by making them think they can perceive a difference that isn't really there.


But the difference between 60hz and 120hz is instantly noticeable in a blind test, without even opening a game. That's by definition not placebo.


There have been actual tests showing that players have better accuracy on displays of up to 360 Hz.

https://www.youtube.com/watch?v=OX31kZbAXsA


Guess you think all speakers are the same.


Those sorts of latencies actually are noticeable! As an example, 6.5ms latency between a virtual instrument and its UI is definitely noticeable.

I didn’t think it was. But it is. I promise!

It’s not necessarily a reaction-time game-winning thing. It’s a feel.

With virtual instruments, my experience is that when you get down to ~3ms you don’t notice the latency anymore… but!, when you go below 3ms, it starts feeling more physically real.


You may think 6.5 ms of input latency is imperceptible. But combine it with the rest of the stack (monitor refresh rate, local network latency, RTT between client and server, time for the server to register input from the client and calculate the "winner"), and it becomes the diff between an L and a W. In the case of pros, the diff between a multimillion dollar cash prize and nil.
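
A toy end-to-end budget makes the point; every figure below is made up purely for illustration:

  # Hypothetical latency budget; none of these numbers are measured.
  budget_ms = {
      "compositor input handling": 6.5,
      "game simulation + render queue": 10.0,
      "display scan-out (avg wait at 144 Hz)": 3.5,
      "client/server round trip": 20.0,
  }

  for stage, ms in budget_ms.items():
      print(f"{stage:<38} {ms:5.1f} ms")
  print(f"{'total':<38} {sum(budget_ms.values()):5.1f} ms")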


There are noticeability thresholds where this could push you over. For fighting games, if you have the reactions to whiff-punish moves with N frames of recovery, this may push you to only being able to punish moves with N+1 frames of recovery, which can really impact your ranking. 6.5 ms is a little over a third of a 60Hz frame.
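
The frame math, with a made-up reaction time just to show how a fraction of a frame can tip a punish window:

  FRAME_MS = 1000 / 60            # one frame at 60 Hz ~= 16.7 ms
  EXTRA_LATENCY_MS = 6.5
  reaction_ms = 233.0             # hypothetical reaction time, about 14 frames

  print(f"6.5 ms = {EXTRA_LATENCY_MS / FRAME_MS:.2f} of a frame")
  print(f"reaction: {reaction_ms / FRAME_MS:.2f} frames "
        f"-> {(reaction_ms + EXTRA_LATENCY_MS) / FRAME_MS:.2f} frames with the extra latency")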


Now download Mixxx and try DJing and using the waveforms for cueing


"Competitive FPS gaming" stops allowing anything besides fresh Windows installs after a certain point. It's a diminutive distinction, like pointing out that emulating GameCube games won't let Smash Melee players fight competitively; nobody playing on either side actually cares.


A number is also a string. Crazy, right?


The following code will only validate when gameServerVersion is a hex string between 9 and 32 characters:

  #!/usr/bin/env python
  import sys
  import re
  from typing import Annotated
  from pydantic import BaseModel, AfterValidator
  from pydantic_yaml import parse_yaml_raw_as
  
  
  def validate_commit_id(s: str) -> str:
    assert re.search(r"^[0-9a-f]{9,32}$", s), f"{s} is not a commit id"
    return s
  
  
  CommitId = Annotated[str, AfterValidator(validate_commit_id)]
  
  
  class Model(BaseModel):
    gameServerVersion: CommitId
  
  
  def main():
    text = sys.stdin.read()
    conf = parse_yaml_raw_as(Model, text)
    print(conf)
  
  
  if __name__ == "__main__":
    main()
    
It would have prevented the Git Hash Bug originally described:

  $ cat conf.yaml
  gameServerVersion: 556474e378
  $ python parse.py < conf.yaml
  pydantic_core._pydantic_core.ValidationError: 1 validation error for Model
  gameServerVersion
    Input should be a valid string [type=string_type, input_value=inf, input_type=float]
  
  $ cat conf.yaml
  gameServerVersion: "556474e378"
  $ python parse.py < conf.yaml
  gameServerVersion='556474e378'
  
  $ cat conf.yaml
  gameServerVersion: "Infinity"
  $ python parse.py < conf.yaml
  pydantic_core._pydantic_core.ValidationError: 1 validation error for Model
  gameServerVersion
    Assertion failed, Infinity is not a commit id [type=assertion_error, input_value='Infinity', input_type=str]

It's just good practice to validate things on the way in. Even if they were using JSON as their config file, they should still validate it.
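
For instance, the same Model sketched above would also validate a JSON config via pydantic v2's model_validate_json (assuming the class definition from the earlier snippet is in scope):

  from pydantic import ValidationError

  # Valid: in JSON a quoted hash is always a string, and the commit-id check passes.
  print(Model.model_validate_json('{"gameServerVersion": "556474e378"}'))

  # Invalid: fails the commit-id check, just like the YAML case.
  try:
      Model.model_validate_json('{"gameServerVersion": "not-a-hash"}')
  except ValidationError as err:
      print(err)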


Keep the bug generators going, we will need the jobs


Cloudflare doesn't cost money, you pay the price with your soul.


You can personalize the CSS of any website with extensions. The correct comparison here would be talking about disabling the capability to do this.

