Hacker News | praseodym's comments

ZFS on Linux has had many bugs over the years, notably with ZFS-native encryption and especially sending/receiving encrypted volumes. Another issue is that using swap on ZFS is still guaranteed to hang the kernel in low memory scenarios, because ZFS needs to allocate memory to write to swap.

The swap issue isn't ZFS's fault though; it works just fine on FreeBSD and illumos... it's an issue with how the Linux kernel handles things.

The zero-copy feature that copied unencrypted blocks onto encrypted filesystems was genius. It's almost like they don't test.

This is exactly why I've replaced my home server with a low-power x86 NUC instead. No custom build needed to run NixOS, and idle power consumption turns out to be slightly lower than the Raspberry Pi 5.


Idle consumption is truly horrid on the Pi 5. Even with all the hacks, turning absolutely everything off and hobbling the SoC to 500 MHz, it's impossible to get it under 2W. I'm convinced that the Pi Foundation doesn't think battery-powered applications are, like, a thing that physically exists.


Allow me to ask: which NUC computer are you using?


I'm using an ASUS NUC 14 Essential Kit N355. It's a bit more expensive than the Pi 5, but also more powerful (8 cores and a decent GPU). There is also a more affordable N150 model. Even lower budget are the N150 mini PCs from Chinese manufacturers, but they often mess up things like cooling in a later hardware revision (compared to the favorable review that you'd read).

And I forgot to mention this before: Intel CPUs with built-in GPUs have very performant and energy-efficient hardware video codecs, whereas the Raspberry Pi 5 is limited and lacks software support.


And what is the idle power draw that you're seeing on the NUC? Out of the box or did you have to mess around with BIOS and powertop?


I get 3-5W, mostly 4W, on my N100 NUC. Wi-Fi is disabled through the BIOS, and I ran powertop and made the suggested changes. One stick of 16GiB LPDDR5, one NVMe SSD, one 4TB SATA SSD. Under full CPU load usage goes up to 8-12W. When the GPU is also busy with encoding, consumption grows to 20-24W. This is with turbo clock enabled. With it disabled, power draw stays around 4W, but it is annoyingly slow, so I enabled turbo again and am content with the odd power peak.


I'm seeing 4-4.5W idle. I've disabled Wi-Fi in the BIOS (using wired Ethernet) and ran `powertop --auto-tune`, but not much else.


I am not the OP, but I got a $150 (at the time) fanless quad-core Celeron box on AliExpress about 5 years ago, and it just runs with zero problems with openmediavault and Docker containers. Attached is an external HDD over USB 3; it's still fast enough (and the HDD is the bottleneck, not the USB interface).


A few months ago it was possible to get an Intel N100-based mini PC (i5-6400 performance at much lower power) with 8GB RAM and a 256GB SSD for 100-120 USD on sale. Unfortunately, the 'rampocalypse' happened.


I wonder if I can run this on a 2-year-old Celeron laptop.


You can run this on a 10-year-old Celeron laptop.


They sent out an email to clarify that there will be no age verification and that the '18 or older' rule was introduced to avoid that:

> You must be at least 18 years old to use the Service (Zed’s AI-enabled software-as-a-service offering, including features like account creation/sign in, Zed Free and Zed Pro, and collaboration). See https://zed.dev/terms#21-eligibility. We set the threshold at 18 due to children's data privacy obligations under COPPA, equivalent international frameworks, and an increasing number of state and regional laws that extend protections to anyone under 18. Those regulations require parental consent verification, age-gated data handling, and separate retention policies for minors. Building and maintaining that infrastructure is a real cost for a small team, and getting it wrong carries regulatory risk. Setting the line at 18 lets us maintain a single privacy framework for all account holders without carve-outs.


The hardware is great, but the software is lacking. macOS only supports resolution-based scaling, which makes anything but the default 200% pixel scaling mode look bad. For example, with a 27" 4K display many users will want to use 150% or 175% scaling to get enough real estate, but the image will look blurry because macOS renders at a higher resolution and then downscales to the 4K resolution of the screen.

Both Windows and Linux (Wayland) support scaling the UI itself, and with their support for sub-pixel anti-aliasing (which macOS also lacks) this makes text look a lot crisper.
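
Roughly, the arithmetic behind the blurriness (a quick sketch, assuming the usual macOS behaviour of rendering the "looks like" resolution at 2x and then downscaling to the panel; the numbers are just illustrative):

  # 27" 4K panel with a "looks like 2560x1440" scaled mode selected
  panel = (3840, 2160)                              # native panel resolution
  looks_like = (2560, 1440)                         # the scaled mode many people pick
  backing = (looks_like[0] * 2, looks_like[1] * 2)  # macOS renders at 2x: 5120x2880
  ratio = backing[0] / panel[0]                     # 1.333..., a non-integer downscale
  print(backing, ratio)
  # only when looks_like is exactly half the panel (1920x1080 here) is the
  # downscale factor 1.0, which is why only the default 200% mode stays pixel-perfect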


I would love to see examples of this. I have an MBP and a 24" 4K Dell monitor connected via HDMI. I use all kinds of scaled resolutions and I've never noticed anything being jagged or blurry.

Meanwhile in Linux the scaling is generally good, but occasionally I'll run into some UI element that doesn't scale properly, or some application that has a tiny mouse cursor.

And then Windows has serious problems with old apps - blurry as hell with a high DPI display.

Subpixel antialiasing isn't something I miss on macOS because it seems pointless at these resolutions [0]. And I don't think it would work with OLED anyway, because the subpixels are arranged differently than on a typical LCD.

[0] I remember being excited by ClearType on Windows back in the day, and I did notice a difference. But there's no way I'd be able to discern it on a high DPI display; the conventional antialiasing macOS does is enough.


I'm more surprised that you're using a 24" display at any resolution. Of course, everyone has different preferences, but that just seems ridiculously small considering how available larger displays are at the same PPI and, probably, the same refresh rate.

I'm personally on the old 30" 16:10 2560x1600 form factor, and it's wildly better visually than the 27" 1440p screen by the same brand (all of them Dell) I use at the office.


> I'm more surprised that you're using a 24" display at any resolution

I have a 24" 4K Dell I bought when big 4K screens with good (measured) colors were still expensive. It's a very pleasant screen to use. Sure, it has less real estate than a bigger one, but this is somewhat mitigated by the fact that I can keep it closer to my eyes, so I can use smaller text.

I find it makes me more "focused" in a way. Can't have multiple windowfuls of crap visible at the same time. It's very practical for TWMs. It also works well in a dual screen scenario, for stronger separation when you need it, but I'm still not sure if a single bigger screen is better than two smaller ones for things like having docs up next to code for example.

I find I can't use two 27" or larger screens; they're just too big and I need to turn my head way too much for comfort. At work we have a 2x27" 4K setup, and I basically only use the screen in front of me. Lately I've been experimenting with pushing them very far away, but then I just need to increase text size and lose actual real estate.

> but that just seems ridiculously small considering how available larger displays are for the same ppi and refresh rate probably

I don't particularly care about refresh rates above 60 Hz (my laptop does 120 Hz, I can see the difference, I don't care). But I do care about PPI. Which screens are easily available with the PPI of a 4K 24"? I'd expect something like 5K 27" or 6K 32". These are very expensive (>1000 € for a crappy 27" Samsung, 2000 for a 32" Dell) and not that common, at least in France.


> I don't particularly care about refresh rates above 60 Hz (my laptop does 120 Hz, can see the difference, don't care). But I do care about PPI.

I feel basically the same way, and I don't like excessively wide screens or even 16:9. I've always preferred 16:10, and have wavered between 1,2,3 screens over time. 16:9 27" 1440p is not a pleasant form factor, but it's fine in vertical mode.

I tend to prefer PPI, but not at the cost of screen real estate, and I tend to prefer 120 Hz, but not at the cost of PPI or picture quality. So the Dell UltraSharp 30" series from years ago, with IPS at 60 Hz and 2560x1600, is perfect for now, and it also lets me run games without investing substantially in brand new gaming PC hardware. The picture quality is great, the price on the used market is great, screen real estate is great; it's just not as sharp or fast as my Mac screen.

I've got my eyes on 32" 6K displays, but since they're so ungodly expensive, I'd really prefer them to have 120 Hz and good HDR, even though those aren't priority attributes for me. I'd keep one of the 30" displays next to it in vertical mode for documentation or log files.


> I'm personally on the old 30" 16:10 2560x1600 form factor

I sorta wish that form factor had taken off instead of 27" 1440p. The extra vertical space is really nice, and that seems to be the ideal PPI for 100% scaling IMHO.

I keep telling myself I'd like to get a 4K OLED display at the same PPI, but 40" seems to be conspicuously missing in every monitor lineup... at least at a price that will convince me to buy three of them, anyway.


Agreed. I'm hoping that some more decent 6K 32" screens come out this year, but they're still all 16:9, which just sucks IMO.


Agree! I still have several (now discontinued) Philips 40-inch monitors, and that is the perfect size for programming work. Very little scrolling needed while you work. But I would love to have a 40-inch in 4K+ instead of 2560x1600; why is no one making these? (I did get a Samsung 8K 50-inch, but that's too large for a multi-screen setup.)


Any other requirements? I noticed this one recently, but 40" is a bit big for my taste: https://www.dell.com/en-ca/shop/dell-ultrasharp-40-curved-th...


Yeah, I worked on that one. It's passable, but I don't like the aspect ratio very much; it's too wide. I'd rather have 40" at 16:9.


Ya idk what people are getting from ultrawides tbh. They're not great for video, not great for my neck, not enough vertical space, and can be disorienting for gaming. I can certainly imagine scenarios that would make them effective, but I'd just rather have more vertical space


I took one of my dual 24" office monitors during Covid WFH and ended up keeping it when I quit that job. I use it as a second display alongside the MacBook which is on a stand.

I think the largest I would want at my current desk is 27". 30 is way too big for me. But more importantly I want something that matches the crispness of the MBP display, and 1440p and 1600p are too low res.


Look at how many people only use their 14-inch laptop screen; it's ridiculous and terribly unergonomic.


This [1] has good examples. 24" 4K is on the smaller side, so the blurriness is less noticeable than on larger displays like 27" or 32".

[1] https://bjango.com/articles/macexternaldisplays2/


I have a MacBook Pro and a Linux machine attached to my dual 4K monitors.

Fonts on Linux (KDE Plasma on Wayland) look noticeably sharper than the Mac. I don't use subpixel rendering either. I hate that I have to use the Mac for work.


This is correct and also increasingly affecting me as my eyes age. I had to give my Studio Display to my wife because my eyes can't focus at a reasonable distance anymore, and if I moved back further the text was too small to read. I ran the 5K Studio Display at 4K scaled for a bit but it was noticeably blurry.

This would've been easily solved with non-integer scaling, if Apple had implemented that.

(I now use a combo of a 48" 4K TV from ~1.5-2 metres back as well as a 27" 4K screen from 1 m away, depending on which room I want to work in. The angular resolution works out similarly (115 pixels per degree).)
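
For the curious, the pixels-per-degree figure can be estimated like this (my own quick sketch, using the sizes and distances mentioned above and assuming 16:9 panels):

  import math
  def pixels_per_degree(diagonal_in, res_h, res_v, distance_m):
      # physical width of the panel, derived from its diagonal and aspect ratio
      width_m = diagonal_in * 0.0254 * res_h / math.hypot(res_h, res_v)
      fov_deg = 2 * math.degrees(math.atan(width_m / 2 / distance_m))
      return res_h / fov_deg
  print(pixels_per_degree(48, 3840, 2160, 1.75))  # 48" 4K TV at ~1.75 m -> ~114 ppd
  print(pixels_per_degree(27, 3840, 2160, 1.0))   # 27" 4K at 1 m -> ~115 ppd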


All through the 2000s Apple developed non-integer scaling support in various versions of Mac OS X under the banner of “resolution independence” - the idea was to use vectors where possible rather than bitmaps so the OS UI would look good at any resolution, including non-integer scaling factors.

Some indie Mac developers even started implementing support for it in anticipation of it being officially enabled. The code was present in 10.4 through 10.6 and possibly later, although not enabled by default. Sadly, Apple gave up on the idea, and integer scaling is where we are.

Here’s a developer blog from 2006 playing with it:

> https://redsweater.com/blog/223/resolution-independent-fever

There was even documentation for getting ready to support resolution independence on Apple’s developer portal at one stage, but I sadly can’t find it today.

Here’s a news post from all the way back in 2004 discussing the in-development feature in Mac OS X Tiger:

> https://forums.appleinsider.com/discussion/45544/mac-os-x-ti...

Lots of folks (myself included!) in the Mac software world were really excited for it back then. It would have permitted you to scale the UI to totally arbitrary sizes while maintaining sharpness, etc.


Yep, I played with the User Interface Resolution app myself back then at uni. The impact of Apple's choice to skip non-integer scaling didn't hit me until a few years ago when my eyes started to fail...


> This is correct and also increasingly affecting me as my eyes age. I had to give my Studio Display to my wife because my eyes can't focus at a reasonable distance anymore, and if I moved back further the text was too small to read.

> (I now use a combo of 4K TV 48" from ~1.5-2 metres back as well as a 4K 27" screen from 1 m away, depending on which room I want to work in. Angular resolution works out similarly (115 pixels per degree).)

The TV is likely a healthier distance to keep your eyes focused on all day regardless, but were glasses not an option?


Glasses would have been the "normal person" fix, but my eyes are great otherwise (better than 20/20 distance vision). So I could focus closer with glasses, but the optical quality through the lenses was worse than just sitting farther back.


If you can get used to using it (which really just requires some practice), the screen magnifier on Mac is fantastic and most importantly it’s extremely low latency (by this I mean, it reacts pretty much instantly when you want to zoom in or out).

Once you get used to flicking in and out of zoom instead of leaning into the monitor it’s great.

As an aside, Windows and Linux share this property too nowadays. Using the screen magnifiers is equally pleasant on any of these OSes. I game on Linux these days and the magnifier there even works within games.


Oh man... I'm in the same situation wrt eyesight. Are you coding on the 4K TV? I have enough space to make that configuration work. TIA


Yep, 4K is plenty of resolution for me running Sequoia. But I'm running it at a simulated 1920x1080@2x, as text at native 4K would be way too small.


Thank you!


> For example, with a 27" 4K display many users will want to use 150% or 175% scaling to get enough real estate, but the image will look blurry

I use a Mac with a monitor with these specs (a Dell of some kind, I don't know the model number off the top of my head), at 150% scaling, and it's not blurry at all.


I also feel it's just fine. Not as amazing as the Apple displays, but I'd have to sit really close to make out the difference for text.


I just tested on my 4K display, and 150% and 175% were not blurry at all. I'm on a 32-inch 4K monitor. Is it possible this information is out of date and this was fixed in more recent versions of macOS?


Absolutely not fixed. Try looking at black text on a white background. It's not very obvious, but it's still a little annoying.


Interesting; maybe it just doesn't bother me, because I do not notice it at all. I was looking at black text on a white background. Maybe it has less of an impact on QD-OLEDs with their pixel layout? I just checked, and I actually run my ultra-wide monitor at 125% scaling and the text looks crisp. That one is a regular LED display, but it does have really high pixel density (5120x2160; I run it at 3360x1418).


> For example, with a 27" 4K display

4K is not enough pixels at 27" for Retina scaling.

Apple uses 5K panels in their 27" displays for this reason.

There are several very good 27" 5K monitors on the market now around $700 to $800. Not as cheap as the 4K monitors but you have to pay for the pixel density.
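
A quick back-of-the-envelope comparison (my own sketch; the ~218 PPI figure is what's commonly cited for Apple's 27" 5K panels):

  import math
  def ppi(diagonal_in, res_h, res_v):
      # diagonal pixel count divided by diagonal size in inches
      return math.hypot(res_h, res_v) / diagonal_in
  print(ppi(27, 3840, 2160))  # ~163 PPI: needs fractional scaling for comfortable UI sizes
  print(ppi(27, 5120, 2880))  # ~218 PPI: a clean 2x of 2560x1440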

There are also driver boards that let you convert 27" 5K iMacs into external monitors. I don't recommend this lightly because it's not an easy mod, but it's within reason for the motivated Hacker News audience.


If your Mac goes bad it can be worthwhile. My friend gave me his pre-Retina 27" iMac, part of the circa-2008 generation of Macs whose GPUs all failed.

I removed all the computing hardware but kept the Apple power supply, instead of using the cheapo one that came with the LCD driver board I bought. I was able to find the PWM specs for the panel, and installed a cheap PWM module with its own frequency & duty-cycle display to drive it and control brightness.

The result is my daily desktop monitor. Spent way too much time on it, but it works great!


Apple still uses an ancient 450nm panel though; nowadays everyone and their dog has moved to 455-460nm ones. 450nm is considerably harsher on my eyes.


Wayland supports it (and Chrome supports it very well) but GTK does not. I run my UI at 200% scaling because graphical Emacs uses GTK to draw text, and that text would be blurry if I ran at my preferred scaling factor of 150% or 175%.


GTK uses Pango/HarfBuzz and some other components to draw text, all of which are widely used in other Linux GUI stacks. GTK/GDK do not draw text themselves, so your complaints are not with them directly.


I'm not asserting that text is being rendered incorrectly. I'm asserting that after rendering, the text is being downsampled.


This works with GTK for me at least. I've been using Gnome+Wayland with 150% scaling for almost 4 years now, and I haven't noticed any issues with GTK. Actually, my experience is essentially backwards from yours—anything Electron/Chromium-based needed a bunch of command-line flags to work properly up until a few months ago, whereas GTK apps always just worked without any issues.


If you're using a high-DPI monitor, you might not notice the blurriness. I use a standard 110-DPI monitor (at 200% scaling in Gnome) and I notice it when the scaling factor is not an integer.

Or more precisely, I noticed it eventually as a result of my being primed to notice it after people on this site insisted that GTK cannot handle fractional scaling factors.

Compared to the contents of a browser's viewport, Emacs and the apps that come with Gnome are visually simple, so it took me a year or 2 to notice (even on a standard 110-DPI monitor used at 150% and 175% scaling) any blurriness in those apps since the app I'm most conditioned to notice blurriness is my browser, and Chrome's viewport is resolution independent except when rendering certain image formats -- text is always non-blurry.

Yes, Chrome's entire window can be quite blurry if Xwayland is involved, but it now talks to Wayland by default and for years before that could be configured to talk Wayland, so I don't consider that worth talking about. If Xwayland is not involved, the contents of Chrome's viewport is non-blurry at all scaling factors except for the PNGs, JPGs, etc. For a long time, when run at a fractional scaling factor under Gnome (and configured to talk Wayland) the only part of Hacker News that was blurry was the "Y" logo in the top left corner, then about 2 years ago, that logo's PNG file was replaced with an SVG file and the final bit of blurriness on HN went away.


> If you're using a high-DPI monitor [...] I use a standard 110-DPI monitor (at 200% scaling in Gnome)

FWIW, I'm using a 184 DPI monitor with 150% scaling.

> you might not notice the blurriness. [...]

> Compared to the contents of a browser's viewport, Emacs and the apps that come with Gnome are visually simple, so it took me a year or 2 to notice

I'm pretty sensitive to font rendering issues—to the point where I've complained to publishers about their PDFs having unhinted fonts—so I think that I would have noticed it, but if it's really as subtle as you say, then maybe I haven't.

I do have a somewhat unusual setup though: I'm currently using

  $ gsettings set org.gnome.mutter experimental-features "['scale-monitor-framebuffer','xwayland-native-scaling']"
although that might not be required any more with recent versions. I've also enabled full hinting and subpixel antialiasing with Gnome Tweaks, and I've set the following environment variables:

  MOZ_ENABLE_WAYLAND=1
  QT_QPA_PLATFORM=wayland
  GDK_BACKEND=wayland,x11,*
  CLUTTER_BACKEND=gdk,wayland
  SDL_VIDEODRIVER=wayland
  SDL_VIDEO_DRIVER=wayland
  ECORE_EVAS_ENGINE=wayland_egl
  ELM_ENGINE=wayland_egl
  QT_AUTO_SCREEN_SCALE_FACTOR=1
  QT_ENABLE_HIGHDPI_SCALING=1
So maybe one of those settings would improve things for you? I've randomly accumulated most of these settings over the years, so I unfortunately can't really explain what (if anything) any of them do.

> Yes, Chrome's entire window can be quite blurry if Xwayland is involved, but it now talks to Wayland by default

Ah, good to hear that that's finally the default; that probably means that I can safely remove my custom wrapper scripts that forced those flags on.


Do you notice blurriness on macOS when the Settings app (name?) has been used to change the scaling factor to a fractional value?


Sorry, but I haven't ever used a Mac, so I unfortunately can't answer that. I've used Windows with fractional scaling, and most programs aren't blurry there, but the few that don't support fractional scaling are really blurry.


That's an accurate summary of my experience with Windows, too.


> macOS renders at a higher resolution and then downscales to the 4K resolution

That seems weird to me. I remember 20 years ago one of the whole points of macOS version 10 was Display PDF, i.e. a vector-based UI.


While the original OS X display model, Quartz, evolved from Display PDF via NextStep, I believe that it shifted back to pixel rasterization to offload more of the display stack onto the GPU.

Quartz Extreme?

John Siracusa, Ars Technica:

It's possible that existing consumer video cards could be coerced into doing efficient vector drawing in hardware. Apple tried to do just that in Tiger [note], but then had to back off at the last minute and disable the feature in the shipping version of the OS. It remains disabled to this day.

[note] https://arstechnica.com/reviews/os/macosx-10.4.ars/14

https://arstechnica.com/staff/2006/04/3720/


Have you ever seen a MacBook Air's screen? Those use fractional scaling and look fine.


Yeah this is correct, I don't know why you're being downvoted. The decisions Apple made when pivoting their software stack to high-DPI resulted in Macs requiring ultra-dense displays for optimal results - that's a limitation of macOS, not an indictment of less dense displays, which Windows and Linux accommodate much better.


Microsoft MS-DOS and Windows supported this in the 90s with DriveSpace, and modern file systems like btrfs and zfs also support transparent compression.


Issues caused by restoring from backups were super common in the early iOS days. It makes me wonder how many weird bugs can be fixed these days by starting from scratch instead of migrating years of cruft through backup/restore.


Quite a lot, probably. I wasn't able to use the Health app on two different iPhones - it would consistently crash after tapping "Get Started" - and Apple told me to just start from scratch instead of restoring from backup; there was no other way to fix it. Not a very satisfying answer.


They might not be dependent on ad revenue, but they are a greedy company that will not leave any money on the table. Next year, more ads are coming to the App Store, which already generates a profit of over $10 billion/year: https://9to5mac.com/2025/12/17/apple-announces-more-ads-are-...


macOS doesn't have gatekeeper status under the Digital Markets Act (DMA), so Apple doesn't need to provide it there. This shows that they only provide the SDK because of regulatory pressure, and try to maintain their vendor lock-in where possible.


Not necessarily. Since its 2015 launch, NAN (Wi-Fi Aware) has been vaporware outside Android; nobody else supports it. Windows does not do so today either [1].

On Linux, iw and the new cfg80211 NAN module have support for some hardware. There are few chips in the desktop/laptop ecosystem that have the feature, and it is hard to know which ones today; it is more common not to have support than to have it.

AFAIK no major distros include UI-based support that regular users can use. Most Chromebooks do not have the hardware to support it, and ChromeOS [2] did not have support out of the box, so even Google does not implement it for all of their devices in the first place.

For Apple, implementing this is easier than for Microsoft or Google given their vertical control, but it is not simple even if they wanted to. They may still need a hardware update/change, and they typically roll out a few versions of the hardware before they announce support, so that most people have access to it; given the hardware refresh cycle, that is important for the basic user experience, which is why people buy Apple. What is the point if you cannot share with most users because they don't have the latest hardware? The average user will try a couple of times and never use it again because it doesn't "work".

Sometimes competing standards / lack of compliance are a political play for control of the standards, not about vendor lock-in directly. Developers are the usual casualties in these wars, rather than end users directly. Web devs have been learning that since JScript in the mid-'90s.

All this to say: as evidence goes, this is weak support for selective compliance due to regulatory pressure.

[1] https://learn.microsoft.com/en-us/answers/questions/2284386/...

[2] I haven't checked recently


Look, you might be right. But you might be wrong. We don't know for sure.

One of my first jobs was in infosec, and there was a sign above one of the senior consultants' doors quoting Hanlon's Razor: "Never attribute to malice that which is adequately explained by stupidity". That quote is right.

There's so much going on at any medium-to-large organisation, from engineering to politics and personalities. All that, multiplied across hundreds of thousands of people in thousands of teams. It's possible you're right. Apple might have provided an iOS-only SDK for Wi-Fi Aware because of regulatory pressure. It's also possible they want to provide it on all platforms, but just started with an iOS-only version because of who works on it, or which business unit they're part of, or politics, or because they think it's more useful on iOS than on macOS. We just don't know.

Whenever I've worked in large organisations, I'm always amazed how much nonsense goes on internally that is impossible to predict from the outside. Like, someone emails us about something important. It makes the rounds internally, but the person never gets emailed back. Why? Maybe because nobody inside the company thought it was their job to get back to them. Or Steve should really have replied, but he was away on paternity leave or something and forgot about it when he got back to work. Or Sally is just bad at writing emails. Or there's some policy that PR needs to read all emails to the public, and nobody could be bothered. And so on. From the outside you just can't know.

I don't know if you're right or wrong. Apple isn't all good or all bad. And the probability isn't 100%, and it's not 0%. Take off the tin foil hat and have some uncertainty.


Your reply makes sense in a vacuum, but in reality we have the context of having seen Apple comply with regulation maliciously before, so we do know for sure that there's no macOS in the SDK because they weren't forced to include it by regulation.


> we do know for sure that there's no macOS in the SDK because they weren't forced to include it by regulation.

Unless you have insider knowledge, we don't know anything for sure here. Apple isn't a person. Apple doesn't have a single, consistent opinion when it comes to openness and EU regulation. (And even a person can change their mind.) All we know is that some teams at Apple responded in the past to some EU regulation with malicious compliance. That doesn't tell us for sure what Apple will do here.

Apple is 165,000 people. That's a lot of people. A lot more people than comment regularly on HN, and look at us! We don't agree about anything. I'm sure plenty of Apple's employees hate EU regulation. And plenty more would love to open-source everything Apple does.

That sort of inconsistency is exactly what we see across Apple's product line. The Swift programming language is open source. But SwiftUI is closed source. WebKit and FoundationDB are open source. But almost everything on iOS is closed source. Apple sometimes promotes open standards - like pushing FireWire, USB and more recently USB-C - which they helped to design. But they also push plenty of proprietary standards that they keep under lock and key. Like the old 30-pin iPod connector, which companies had to pay Apple to be allowed to use in 3rd-party products. Or AirDrop. Or iMessage. AFS (Apple filesystem) is closed source. But it's also incredibly well documented. My guess is the engineers responsible want to support 3rd-party implementations of AFS but for some reason they're prohibited from open-sourcing their own implementation.

We don't know anything for sure here. For my money, there's even odds that in a year or two this API quietly becomes available on macOS, watchOS and tvOS as well. If you "know for sure" that won't happen, let's make a bet at 100-1 odds. If you're sure, it's free money for you.


I largely agree with you but want to highlight a few points.

> Apple doesn't have a single, consistent opinion when it comes to openness and EU regulation.

But it does have a greedy leader who can and does override everyone else.

https://techcrunch.com/2025/02/24/apple-exec-phil-schiller-t...

> Apple is 165,000 people. That's a lot of people. A lot more people than comment regularly on HN

How do you know the HN numbers? I’m not doubting you, I’m curious about the data.

> and look at us! We don't agree about anything.

At the same time, anyone can join HN. There’s no “culture fit” or anything like that. It is possible to have a larger difference of ideas in a smaller pool of people.

> AFS (apple filesystem)

APFS, not AFS.

https://en.wikipedia.org/wiki/Apple_File_System


It’s a few million page views on the front page and a small fraction commenting.


Again, what’s the source of the data? Anyone can throw around vague numbers. “A few million” and “a small fraction” provide no useful information for the context.



Multipoint connectivity is part of the spec but apparently AirPods only support it if you pretend to be an Apple device.


Part of which spec revision? What date did it come out? And what date did AirPods come out? Compare the two dates. I'll wait...


The multipoint spec was added to Bluetooth 4.0 in 2010:

https://www.bluetooth.com/specifications/specs/core-specific...

The Battery Service 1.0 spec was officially adopted in 2011:

https://www.bluetooth.com/specifications/specs/battery-servi...

The first airpods were released in 2016...

Please consider that simping for a trillion dollar company might actually not be in your best personal interests...


What do you mean by "multipoint spec"? I have written a few BT stacks (you might have even used one I wrote at one point or another) and I have no idea what you mean by that phrase. Please cite a section of the spec or proper name of what you are talking about.


If you're such an expert then you could have likely found this even easier than I did:

https://www.bluetooth.com/specifications/specs/multi-profile...


AirPods already do all of this...

They support HFP, A2DP and AVRCP, and properly, including all of those features working on Android phones and proper switching between them as needed...


The argument (which I assume you deliberately ignored) is that those features, like battery reporting and multi-device pairing, are being arbitrarily restricted by Apple to maintain a proprietary ecosystem.

How you could argue that this is a good thing tells me you're either too drunk on the corporate Kool-Aid or have some financial incentive to ignore the obvious problems with these facts.

Either way this is my last message in this thread as googling things for you is a bore.


So show me in the spec where one BT device, as seen by the host, can report three different battery levels - i.e. the case and the two earpieces.


The obvious solution would be to report the lowest number, as multiple replies to you have already proposed, but which you again chose to ignore because it doesn't serve your agenda.

This entire thread started with you claiming Apple was somehow trying to prevent issues by hiding these features, and you've twice tried to move the goalposts to irrelevant points when given evidence to the contrary.

If you can't even defend your original position then I have no interest in continuing a discussion with Apple's most useful idiot.


The lowest level of the three is not a useful number. The case serves as a battery pack to recharge the headphones (something I did earlier today while on an international flight).

The reality is that the sort of compatibility being talked about is a new feature with design choices, not just existing functionality that hasn't been wired up.

I'd rather them work on features to report charging time or expected playback time on iOS, or write their own app for Android, than try to arbitrarily increase their bluetooth profile compatibility checklist.


“The lowest number” would be completely useless. What would that tell me? Do I need to charge the case? Do I need to put my left pod in the fully charged case? The right one?


The average between the two buds, then, and rely on the LED for the case. This isn't that complicated, guys; other earbud manufacturers somehow figured it out, and I'm sure Apple can too.

Hyper-fixating on an issue with one part of the spec doesn't dismiss the larger problem being discussed. It's baffling (and kind of sad) how hard you guys feel the need to defend a trillion-dollar company making obviously anti-consumer decisions.


How does the “average” help? It still isn’t actionable. It doesn’t tell me whether I need to charge my case, put my left AirPod in a fully charged case or put the right AirPod in a fully charged case.

If you know the average of those three, what does it tell you?

What other manufacturers have figured out how to report three devices that present to a Bluetooth host as one device, in a standards-conforming way that will work across multiple hosts?

It's not that I am defending a trillion-dollar company - the idea of averaging three completely different devices is nonsensical and provides absolutely no benefit to the end user. If you want to complain about anyone, complain about the standards body.


You, like the person this thread started with, are (deliberately?) missing the fire in the forest because you want to talk about the state of one tree. Yes, the standard should be improved to support multi-part devices, nobody here is arguing against that.

This entire thread started with someone trying to claim that Apple was not in the wrong by restricting these features, of which battery reporting is A SINGLE ONE.

No, I don't have a perfect solution for this one specific part of the problem, but that's also not been my focus the entire time. Getting dragged into the weeds only serves to distract from the actually important point here, which is that what Apple is doing is anti-consumer.

Let's first agree that Apple should play on even ground with everyone else, and then we can whinge over how to correctly report the battery of three components over a single connection. Yeesh.


Yes, because it's easy as long as you ignore the details. Speaking of which, how do you surface all of the other features of the AirPods using the Bluetooth protocol?

You claimed other manufacturers have “figured this out” - how?

Every single thing that you say Apple should do is about how Apple could do it in a way that conforms to the spec - you kind of have to "fixate" upon the spec if you claim that Apple isn't conforming to the spec.

The battery reporting is the one thing you brought up, and you had only horrible ideas for it.


The FAQ mentions there will be only 12 days of puzzles and no global leaderboard: https://adventofcode.com/2025/about#faq_num_days

The AoC author also posted about this on Reddit: https://www.reddit.com/r/adventofcode/comments/1ocwh04/chang...

