I very much used to agree with this, but some time this summer the ChatGPT iOS app started to change this for me. I've definitely had days where I felt as coding-creative as I ever do on a laptop, just texting my AI interns to handle the execution while I'm out for a walk.
I don't understand why this article invents and explains a phony ranged-float fix when the real fix from the footnotes would have been just as simple to explain. The deception needlessly undermines the main point of the article, which I completely agree with.
That fix has limited applicability. x * x is also a non-negative float, but abs(x * x) is not optimized. Nor is abs(abs(x) + 1).
GCC, for example, does know that.
For a good long while at least, this flag was a signal for the browser to use CPU rendering, because the overhead of GPU setup for rendering rapidly changing content was too high.
My knowledge is dated and second hand though. New GPU APIs hopefully changed this!
I completely agree that technology in the last couple years has genuinely been fulfilling the promise established in my childhood sci-fi.
The other day, alone in a city I'd never been to before, I snapped a photo of a bistro's daily specials hand-written on a blackboard in Chinese, copied the text right out of the photo, translated it into English, learned how to pronounce the menu item I wanted, and ordered some dinner.
Two years ago this story would have been: notice the special board, realize I don't quite understand all the characters well enough to choose or order, and turn wistfully to the menu to hopefully find something familiar instead. Or skip the bistro and grab a pre-packaged sandwich at a convenience store.
> I snapped a photo of a bistro's daily specials hand-written on a blackboard in Chinese, copied the text right out of the photo, translated it into English, learned how to pronounce the menu item I wanted, and ordered some dinner.
> Two years ago
This functionality was available in 2014, on either an iPhone or an Android phone. I ordered specials in Taipei well before Covid. Here's the blog post celebrating it:
This is, after all, a post about AI, hype, and skepticism. In my childhood sci-fi, the idea of people working multiple jobs and still not being able to afford rent was written as shocking, or seen as dystopian. All this incredible technology is a double-edged sword: it doesn't solve the problems of the day, only the problems of business efficiency, which exacerbates the problems of the day.
>The other day, alone in a city I'd never been to before, I snapped a photo of a bistro's daily specials hand-written on a blackboard in Chinese, copied the text right out of the photo, translated it into English, learned how to pronounce the menu item I wanted, and ordered some dinner.
To be fair, dedicated apps like Pleco have supported things like this for 6+ years, but the spread of modern language models has made it more accessible.
My preferred way to compare floats as being interchangeably equivalent in unit tests is
bool equiv(float x, float y) {
return (x <= y && y <= x)
|| (x != x && y != y);
}
This handles things like ±0 and NaNs (while NaNs can't be IEEE-754-equal per se, they're almost always interchangeable), and convinces -Wfloat-equal you kinda know what you're doing. Also everything visually lines up real neat and tidy, which I find makes it easy to remember.
Outside unit tests... I haven't really encountered many places where float equality is actually what I want to test. It's usually some < or <= condition instead.
In C89, it was implementation-defined. In C99, it was made expressly legal, but it was erroneously included in the list of undefined behavior annex. From C11 on, the annex was fixed.
> but UB in C++
C++11 adopted "unrestricted unions", which introduced the concept of an active member: it is UB to access any member other than the active one. Except making a member active relies on constructors and destructors, which primitive types don't have, so the standard isn't particularly clear on what happens here. The current consensus is that it's UB.
C++20 added std::bit_cast which is a much safer interface to type punning than unions.
> punning through incompatible pointer casting was UB in both
There is a general rule that accessing an object through an 'incompatible' lvalue is illegal in both languages. As exceptions, changing the const or volatile qualification of the object is legal, as is reading via the corresponding signed or unsigned type, and character pointers can read anything.
> In C99, it was made expressly legal, but it was erroneously included in the list of undefined behavior annex.
In C99, union type punning was put under Annex J.1, which is unspecified behavior, not undefined behavior. Unspecified behavior is basically implementation-defined behavior, except that the implementor is not required to document the behavior.
You can, but in the context of the standard, you'd be wrong to do so. Undefined behavior and unspecified behavior have specific, different, meanings in context of the C and C++ standards.
> You can, but in the context of the standard, you'd be wrong to do so. Undefined behavior and unspecified behavior have specific, different, meanings in context of the C and C++ standards.
> Conflate them at your own peril.
I think that ryao was not conflating them, but literally just pointing out, as a joke, that "UB" can stand for "undefined behavior" or "unspecified behavior." Taking advantage of this is inviting dangerous ambiguity, which is why ryao's suggestion ended with ":)," but I think that saying that it's wrong is an overstatement.
There has been plenty of misinformation spread on that. One of the GCC developers told me explicitly that type punning through a union was UB in C, but defined by GCC when I asked (after I had a bug report closed due to UB). I could find the bug report if I look for it, but I would rather not do the search.
From a draft of the C23 standard, this is what it has to say about union type punning:
> If the member used to read the contents of a union object is not the same as the member last used to store a value in the object the appropriate part of the object representation of the value is reinterpreted as an object representation in the new type as described in 6.2.6 (a process sometimes called type punning). This might be a non-value representation.
In past standards, it said "trap representation" rather than "non-value representation," but in none of them did it say that union type punning was undefined behavior. If you have a PDF of any standard or draft standard, just doing a search for "type punning" should direct you to this footnote quickly.
So I'm going to say that if the GCC developer explicitly said that union type punning was undefined behavior in C, then they were wrong, because that's not what the C standard says.
> (11) The values of bytes that correspond to union members other than the one last stored into (6.2.6.1).
So it's a little more constrained in the ramifications, but the outcomes may still be surprising. It's a bit unfortunate that "UB" aliases to both "Undefined behavior" and "Unspecified behavior" given they have subtly different definitions.
From section 4 we have:
> A program that is correct in all other aspects, operating on correct data, containing unspecified behavior shall be a correct program and act in accordance with 5.1.2.4.
I actually might, although not now. Thanks for the link. I'm surprised he directly contradicted the C standard, rather than it just being a misunderstanding.
It doesn't. That commenter is saying that in C99, it was unspecified behavior. Since C11 onward, it's been removed from the unspecified behavior annex and type punning is allowed, though it may generate a trap/non-value representation. It was never undefined behavior, which is different.
Edit: no, it's still in the unspecified behavior annex, that's my mistake. It's still not undefined, though.
I am a member of the standards committee and a GCC maintainer. The C standard supports union punning. (You are right though that relying on godbolt examples can be misleading.)
This was my instinct too, until I got this little tickle in the back of my head that maybe I remembered that Clang was already acting like this, so maybe it won't be so bad. Notice 32-bit wzr vs 64-bit xzr:
Ah, I can confirm what I see elsewhere in the thread, this is no longer true in Clang. That first clang was Apple Clang 17 (who knows what upstream version that actually is), and here is Clang 20:
$ /opt/homebrew/opt/llvm/bin/clang-20 -O1 -c union.c -o union.o && objdump -d union.o
union.o: file format mach-o arm64
Disassembly of section __TEXT,__text:
0000000000000000 <ltmp0>:
0: f900001f str xzr, [x0]
4: d65f03c0 ret
0000000000000008 <_create_d>:
8: f900001f str xzr, [x0]
c: d65f03c0 ret