None of the tools that you mentioned except for LibreOffice and OpenOffice are free-as-in-freedom, and if you’re using Linux on the desktop, then Microsoft Office and the Apple iWork suite are unavailable as desktop applications.
I interpreted the clause “two poor alternatives in a row” as Biden + Harris in the 2024 presidential election, not Clinton + Harris: Clinton was the 2016 nominee and Harris was the 2024 nominee after Biden dropped out, while the 2020 nominee was Biden, who did defeat Trump that year.
In my opinion, Clinton’s and Harris’ losses had less to do with their gender and more to do with the candidates themselves:
1. Clinton faced strong anti-establishment headwinds, and she is a very establishment politician. Many people in 2016 were hopping mad at establishment politicians. Trump was able to win the GOP nomination on a platform of “draining the swamp” and pursuing an aggressively right-wing agenda compared to more moderate Republicans, and Sanders, who also ran an anti-establishment campaign, proved to be a formidable opponent to Clinton. Despite Clinton’s loss, she still won the popular vote. Had there been less anti-establishment sentiment, perhaps it would have been a Clinton vs. Jeb Bush election, and I believe Clinton would have won that race.
2. Harris never won a presidential primary. The only reason she became the nominee is that Biden dropped out of the race after his disastrous debate performance against Trump, which occurred after the primaries. Since it was too late for voters to decide on a replacement, the Democratic Party selected one: Harris. She had only a few months to campaign, whereas Trump had been campaigning for virtually his entire time out of office.
3. Let’s not forget the Trump factor in 2024. During Biden’s entire presidency, Trump was able to consolidate his hold on the GOP and his voting base, and in some ways he even expanded his base. The conservative media was filled with defenses of January 6, and Trump was able to convince enough Americans that he and his supporters were persecuted in the aftermath of the 2020 election and January 6.
I believe Trump would have won in 2020 had the COVID pandemic not happened. Things were very chaotic in 2020 America, and Biden, with his extensive experience in the federal government, looked reassuring to a lot of Americans. Biden would have had a tougher time against Trump had 2020 been more like 2019. I believe Biden would also have had a tougher time against Bernie Sanders in the primaries had COVID not happened, though a counterargument is that Super Tuesday fell on March 3, before shelter-in-place policies were in effect in California.
A big reason for Trump's success despite his polarizing nature is the polarizing effects of the platforms of our two parties, which distinguish themselves on "culture war" issues such as abortion, gun rights, immigration, LGBT+ rights, and race relations. There are many Americans who love the MAGA agenda, and there are also many Americans who are not in 100% agreement with MAGA but who'd never vote for a Democrat since they feel that a candidate with the opposite cultural views is anathema. If third parties were more viable in America, the latter group of voters could vote for a candidate that is more to their temperament instead of voting for whomever the GOP nominee is.
Had COVID not happened, Trump might not have gone batshit crazy with a vendetta against the entire concept of independent federal agencies. Having to actively reject the advice coming from Fauci et al. seems to be a large part of what sensitized him to the larger pattern, rather than letting him write each instance off as an interpersonal issue.
(by "Trump" and "him" I mean the person himself plus his symbiotic ecosystem of enablers and followers)
As much as I love alluring designs such as the NeXT Cube (which I have), the Power Mac G4 Cube (which I wish I had), and the 2013 Mac Pro (which I also have), sometimes a person needs a big, hulking box of computational power with room for internal expansion, and from the first Quadra tower in the early 1990s until the 2012 Mac Pro was discontinued, and again from 2019 until today, Apple delivered this.
Even so, the ARM Mac Pro felt more like a halo car rather than a workhorse. The ARM Mac Pro may have been more compelling had it supported GPUs. Without this support, the price premium of the Mac Pro over the Mac Studio was too great to justify purchasing the Pro for many people, unless they absolutely needed internal expansion.
I’d love a user-upgradable Mac like my 2013 Mac Pro, but it’s clear that Apple has long since moved on with its ARM Macs, and I’ve moved on to the PC ecosystem. On one hand, ARM Macs are quite powerful and energy-efficient; on the other hand, they’re very expensive in non-base RAM and storage configurations, though with today’s crazy prices for DDR5 RAM and NVMe SSDs, Apple’s upgrade prices don’t look so bad by comparison.
I believe this is the first time since the introduction of the Macintosh II in 1987 that no Mac in Apple's lineup offers any combination of upgradeable RAM, upgradeable storage, and internal expansion slots. The 2013 Mac Pro lacked internal expansion slots but still had DIMM slots and an SSD slot. The 2019 Mac Pro brought back expansion slots, though the 2023 Mac Pro took away DIMM slots in favor of the unified memory architecture found in all ARM Macs.
I have mixed feelings about this. On one hand I miss being able to upgrade RAM at a later date without having to pay up-front for all of the RAM I'm expected to use for the lifetime of the machine. This is especially painful in 2026 with today's sky-high RAM prices caused by intense demand. On the other hand, the memory bandwidth in Apple's ARM Macs is tremendous, especially in higher-end Macs, due to the tight integration of the design. This matters greatly in memory-intensive applications such as generative AI. I feel less bad about non-expandable RAM given the tradeoffs, though it still makes for quite expensive computing, especially at 2026 RAM prices.
I guess Apple has finally achieved Steve Jobs' original Macintosh vision of closed-off appliances, though (thankfully) the NeXT Cube and the NeXTstation were not like that. RIP to Jean-Louis Gassée's vision of expandable, upgradeable Macs, starting with the Macintosh II in 1987 and leading to other fine Macs such as the Macintosh IIfx, the Quadra lineup, high-end Power Macs (8100, 8500, 9500, 8600, 9600, G3, G4, G5), and the Mac Pro.
Indeed. It seems, at least in America (I’m less familiar with the situation abroad) that computer science researchers who want to do longer-term work are getting squeezed. Less funding means fewer research positions in academia. Industry has many opportunities, especially in AI, but industry tends to favor shorter-term, product-focused research as opposed to longer-term work with fewer immediate prospects for productization. This is a great environment for many researchers, but researchers who want to work on longer-term, “blue-skies” projects might not find a suitable position in industry these days.
I wholeheartedly agree. Computing professions such as software engineering used to feel like, "Wow, they're paying me to do this!" Yes, there was real work involved, but for many of us it never felt like drudgery, and we produced, shipped, and made our customers, managers, and other stakeholders happy. I remember a time (roughly 20 years ago) when zealous enthusiasts would proudly profess that they'd work for companies like Apple or Google for free if they could work on their dream projects.
Times have changed. The field has become much more serious about making money; fantasies about volunteering at Apple have been replaced with fantasies about very large salaries and RSU grants. Simultaneously (and I don't think coincidentally), the field has become less fun. I recognize how privileged this sounds, talking about "fun," given that for most of humanity, work isn't about having fun and personal fulfillment but about making the money required to house, feed, and clothe themselves and their loved ones. Even with the drudgery of corporate life, it beats the working conditions and the abuse that many other occupations endure.
Still, let's pour one out for a time when the interests and passions of computing enthusiasts did line up with the interests of the corporate world.
My take is that there used to be significant overlap between hobbyist-style exploration and coding and what industry wanted, especially during the PC revolution, when companies like Apple and Microsoft were started by hobbyists selling their creations to other people. This continued through the 1990s and 2000s; we know the story of how Mark Zuckerberg started Facebook from his Harvard dorm room. I am a '90s kid who was inspired by the stories of Steve Jobs and Bill Gates to pursue a computing career. I was also inspired by the researchers at Bell Labs and Xerox PARC.
The “hacker-friendliness” of software industry employment has been eroding in the past decade or so, and generative AI is another factor that strengthens the position of business owners and managers. Perhaps this is the maturing of the software development field. Back when computers were new and when there were few people skilled in computing, employment was more favorable for hobbyists. Over time the frontiers of computing have been settled, which reduced the need for explorers, and thus explorers have been sidelined in favor of different types of workers. LLMs are another step; while I’m not sure that LLMs could do academic research in computer science, they are already capable of doing software engineering tasks that undergraduates and interns could do.
I think what some of us are mourning is the closing of a frontier, of our figurative pastures being turned into suburban subdivisions. It’s bigger than generative AI; it’s a field that is less dependent on hobbyists for its future.
There will always be other frontiers, and even in computing there are still interesting areas of research and areas where hobbyists can contribute. But I think much of the software industry has moved in a direction where its ethos is different from the ethos of enthusiasts.
I’ve come to the same conclusion, though my line of work was research rather than software engineering. “He who pays the piper calls the tune.” It was fun as long as I enjoyed the tunes being called, but the tunes changed, and I became less interested in playing.
I am now a tenure-track community college professor, evaluated entirely on my teaching and service. While teaching a full course load is intense, and while my salary is nowhere near what a FAANG engineer makes, I get three months of summer break and one month of winter break every year to rejuvenate and work on personal projects, with nobody telling me what research projects to work on, how frequently to publish, or how fast to ship code.
This quote from J. J. Thomson resonates with me, and it’s more than 100 years old:
"Granting the importance of this pioneering research, how can it best be promoted? The method of direct endowment will not work, for if you pay a man a salary for doing research, he and you will want to have something to point to at the end of the year to show that the money has not been wasted. In promising work of the highest class, however, results do not come in this regular fashion, in fact years may pass without any tangible results being obtained, and the position of the paid worker would be very embarrassing and he would naturally take to work on a lower, or at any rate a different plane where he could be sure of getting year by year tangible results which would justify his salary. The position is this: You want this kind of research, but, if you pay a man to do it, it will drive him to research of a different kind. The only thing to do is to pay him for doing something else and give him enough leisure to do research for the love of it." (from https://archive.org/details/b29932208/page/198/mode/2up).
That was the original strategy for universities: teaching was the job, and research was the by-product of having some very smart people with free time. Then some "genius" decided it was better to have professors compete for money to pay directly for their research, which transformed a noble and desirable profession into just another money-chasing activity.
I remember when I first learned about GNUstep in 2004 when I was in high school. It's a shame GNUstep never took off; we could have had an ecosystem of applications that could run on both macOS and Linux using native GUIs.