I'd like to apply, but the job pages just offer 'see open positions' in a loop, with no clear way to reach an application form, and the 'contact us' page doesn't accept a Gmail address. Am I missing something?
Thanks for flagging this! You’re right — the job page flow can be a bit confusing. You can apply directly by sending your resume to the email address listed in the last section of the job ad. We’re also adding a form shortly to make applying more straightforward.
I'm seeing quite a few people on this site recently talking about their Kagi subscriptions, claiming it is sufficiently better than Google to be worth the money.
I hope so, because I actively want seconds absent from the system tray. Attention is a scarce resource; the fewer things on the screen constantly changing and thereby consuming my attention, the better. If saving power means we remain free from that anti-feature, great.
I, for one, love it for casual and incidental benchmarking. Of everything - not just a process I run, but also how long between bird chirps outside my window. But I find it very easy to ignore, too. Glad it’s optional.
Beware of concentrated benefit and diffuse cost. Sure, let a seconds clock be available to call up the 0.1% of the time when you want it. But it shouldn't be in the system tray presenting a small but ongoing attention drain the other 99.9% of the time.
As a horologist, I want seconds. It annoys me not to have them. I wouldn't care if it isn't the default, as long as I can set it, similarly to how I currently have to set 24-hour time separately on all my machines because the US locale defaults to 12-hour time. That's fine, and understandable. But I'm constantly annoyed, for instance, by Apple's long-running absolute refusal to allow the iOS clock to display seconds.
The attention drain sadly can't really be measured properly, as it's a subjective thing.
I'm one of those freaks who have this on and I honestly like it a lot. It gives me a feeling of certainty, grounding, and precision.
Primary driver for turning it on was their redesign of the clock flyout to be, uhh, nonexistent with Windows 11, which I'd previously use on demand for seconds information. I was also worried about this being a nonsolution and a distraction initially, but it ended up being fine.
I leave a handful of actually important notifications on, like the bank's 'someone just made a purchase with your account; confirm it was you' alert, but most of them I do indeed turn off.
Approximately zero people in the world care if you join a meeting at 1:00 or at 1:01. It's good to aim to be punctual, but if you're off by a minute there is no consequence.
That is definitely not true. It's very dependent on the culture, the company, the specific group.
I've met managers who literally lock the conference room door when it hits :00.
That's a little crazy in my view, but there are definitely places where it's the norm.
There are basically two ways of managing expectations around meeting times. The first is that it's acceptable for meetings to run late: it's normal and tolerated for people to be late to their next meeting, meetings often start something like 5 minutes late, and you try to make sure nothing really important gets discussed until 10 minutes in. The other is that it's unacceptable for meetings to start late, so people always leave the previous meeting early to make sure they have time for the bathroom, emergency emails, etc. In that case, important participants wind up leaving before a decision gets made, which is a whole problem of its own.
I'm curious how you came to such a universally sweeping conclusion. At any rate, it's incorrect as I have personally observed counterexamples in my professional career.
Ideally the clock display should be customisable to display whatever level of precision you want; I believe at least one Linux application lets you specify it via a strftime() format string.
That's already more customisation than most software will allow, but to paraphrase an old saying, "those who don't understand strftime() are doomed to reinvent it poorly".
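To illustrate the idea (a sketch, not any particular clock application's API - the widget name and function are hypothetical), a clock that accepts a strftime()-style format string lets the user decide whether seconds appear at all:

```python
import time

def render_clock(fmt, t=None):
    """Format the current (or a fixed) time with a user-supplied
    strftime() format string, e.g. "%H:%M:%S" or "%H:%M"."""
    if t is None:
        t = time.localtime()
    return time.strftime(fmt, t)

# A fixed time for demonstration: 2024-01-02 13:05:09
fixed = time.struct_time((2024, 1, 2, 13, 5, 9, 1, 2, -1))
print(render_clock("%H:%M:%S", fixed))  # 13:05:09 - seconds shown
print(render_clock("%H:%M", fixed))     # 13:05    - seconds hidden
```

The same "%H vs %I%p" choice also covers the 24-hour/12-hour preference mentioned above, which is part of why a single format string is such a compact customisation surface.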
Looking through the latest 'Who's Hiring' thread last week, I noticed that a higher percentage of the remote jobs seemed to specify Remote (US). Could one of the causes be that employers read ahead and made decisions on the assumption that this bill might pass? Or is there some other reason? Or does it just fluctuate randomly from month to month, and I am trying to read a pattern into random noise?
Eh. I mean, I don't have a piano, but I think a world in which no one ever does anything that can't be done on a laptop computer would not be a better place.
I think it's an interesting approach. With, yes, caveats as people have pointed out, but there are caveats to everything, and this approach removes a huge barrier to 'try them and see'. If you want to talk specifics, contact info is in my profile.
ChatGPT allows you to download an archive of all the conversations you have had with it, so you can do things like search for past conversations with your choice of search tool. The archive is in JSON format, not quite usable directly, so I wrote a program to unpack it to plain text.
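A minimal sketch of what such an unpacker might look like. The field names here ("title", "mapping", "author", "content", "parts") are assumptions about the export's conversations.json schema, which may differ between export versions; the sample data is synthetic, standing in for one entry of a real archive:

```python
def conversation_to_text(conv):
    """Flatten one exported conversation dict into plain text,
    one 'role: message' line per message."""
    lines = [f"# {conv.get('title', 'Untitled')}"]
    for node in conv.get("mapping", {}).values():
        msg = node.get("message")
        if not msg:
            continue  # some nodes are structural, with no message
        role = msg.get("author", {}).get("role", "?")
        parts = msg.get("content", {}).get("parts", [])
        text = "\n".join(p for p in parts if isinstance(p, str)).strip()
        if text:
            lines.append(f"{role}: {text}")
    return "\n".join(lines)

# Synthetic sample standing in for one conversation from the archive:
sample = {
    "title": "Demo",
    "mapping": {
        "a": {"message": {"author": {"role": "user"},
                          "content": {"parts": ["Hello"]}}},
        "b": {"message": {"author": {"role": "assistant"},
                          "content": {"parts": ["Hi there"]}}},
    },
}
print(conversation_to_text(sample))
```

In a real run you'd json.load() the conversations file from the archive and apply this to each entry; the point is just that once it's plain text, any search tool works on it.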
Right. To be clear, the purpose of the article is not 'zounds, C and C++ have footguns!' but 'C and C++ have footguns – yeah, this is not exactly breaking news – but here is a hopefully helpful summary of where they are, why they exist, and what you can do to avoid them'.
If you are already satisfied you know how to avoid them, and you don't need any more help with that, then you are not the target audience, and should by all means ignore the article.
But tyfighter has a point. People take such articles and use them to beat up anyone writing C++, arguing that they are stupid to use such an undependable tool.
So, yes, people who know what they're doing can ignore such articles, as a first-order effect. But there are second-order effects from such articles, and while they don't change anything, they are rather unpleasant. Hence tyfighter's anger: he gets tired of being on the receiving end of the fallout from such articles.
I do actually sympathize with that! I tried to keep a level tone, and maximize the ratio of useful information to flame-war ammunition, but that ratio unfortunately has an upper bound well short of infinity.
Not the first discussion of this topic, by any means. In this case, I've tried to boil it down to the essential points a practical programmer needs to know, but the article still ended up longer than I initially aimed for.
One hopefully constructive comment… I didn’t find this a motivating example as intended:
int foo(void);
int bar(void);

int foo_or_bar(int which) {
    // Assumes you don't mind both functions being called
    int x = foo();
    int y = bar();
    // UB for which != 0: nothing guarantees y sits at &x + 1
    return *(&x + which);
}
The argument being (if I understood right) that if x has to have an address, it can’t be put in a register, so either that must be UB or we can’t use registers. Well, how about the rule is that if I take the address of x, it can’t be put in a register? That seems like an obvious rule, and I seem to remember that was a safe assumption before the “great UBification” of compilers.
I’m sure there’s a better example of why UB helps optimization, but this one didn’t work for me.
Ohhh. I misunderstood the example. That would not only depend on a lack of register usage, but also on where the variables are stored on the stack, which I don’t think anyone could reasonably demand even in 1976.
So I still feel there must be a more reasonable and therefore motivating example of optimizations one would want that are enabled by surprising you with UB. (Which I think was the idea behind this example.)
> depend on where the variables are stored on the stack, which I don’t think anyone could reasonably demand even in 1976
You'd be surprised. A typical compiler of that era would be single-pass, and allocating variables on the stack in the order in which they were declared was not uncommon. Don't forget, we're talking about a language that actually had a "register" keyword solely to tell the compiler to enregister a variable!