I'm sorry, that's absolutely bullshit. In fact, I wish we had left everyone who complained behind—the python community would have been happier and healthier for it. Absolute crybabies who wanted to be catered to without caring for how intractable the problems with python2 were—e.g. dealing with unicode was a royal pain in the ass, and the bytes/string divide completely fixed it. IMO, it was the best-executed breaking change I've ever witnessed in a language.
In comparison, e.g. Scala 2 -> Scala 3 was an absolute nightmare—it just didn't have the same vocal wailing from maintainers in the community (or, I suppose, a fraction of Python's popularity to begin with).
Being too aggressive in breaking stuff gets you a shitshow like Node.js or Ruby. Long-term source code compatibility is a very useful feature for open source and a sign of a mature ecosystem. Feel free to add stuff, but once it's part of a stable release it has to be maintained long after a "better" way to do it comes along.
Javascript would heavily benefit from breaking changes. The reason why it still sucks ass to use today is because this won't ever happen.
I can't speak for node.js specifically but who gives a shit
> Long-term source code compatibility is a very useful feature for open source
Sure, until you need affordable maintainers. Maintainability has to be balanced against the patience required to keep bad software running. Cf. the insane maintenance cost of perl scripts
nodejs itself doesn't have very many breakages; i have plenty of code that is unchanged from 0.12 to 24. npm is a whole other kettle of fish but I don't think you can blame the core project for the sins of everyone that publishes to the package manager. Python2 -> Python3 on the other hand had a lot of breakage in "standard" code.
Sure, but it's pretty trivial to generate a CLI application that talks to that API.
That's how I let agents access my database too. Letting them access psql is a recipe for disaster, but a CLI executable that contains the credentials, and provides access to a number of predefined queries and commands? That's pretty convenient.
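A minimal sketch of that pattern, with Python's built-in sqlite3 standing in for whatever the real database is, and entirely hypothetical query names: the agent only ever gets an allow-list of named queries, never raw SQL, and the connection details live in the executable rather than in the agent's context.

```python
import sqlite3
import sys

# Hypothetical allow-list: the agent may only invoke these named queries.
# It never sees credentials and never gets to write its own SQL.
QUERIES = {
    "recent-orders": "SELECT id, total FROM orders ORDER BY id DESC LIMIT 5",
    "order-count": "SELECT COUNT(*) FROM orders",
}

def run_query(conn, name):
    # Refuse anything not on the allow-list.
    if name not in QUERIES:
        raise SystemExit(f"unknown query: {name!r}; allowed: {sorted(QUERIES)}")
    return conn.execute(QUERIES[name]).fetchall()

if __name__ == "__main__" and len(sys.argv) > 1:
    # In the real tool this would open the production database with
    # credentials baked into the binary/config; "app.db" is a placeholder.
    conn = sqlite3.connect("app.db")
    for row in run_query(conn, sys.argv[1]):
        print(*row)
```

The agent then calls something like `./dbtool order-count` via its shell tool, and the worst it can do is run one of the predefined queries.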
Yes. But are you letting your agent make the decision of when and how to call that CLI? And presumably you're invoking it via the Bash tool. In which case your agent is free to write ad-hoc bash orchestration around your CLI calls. And perhaps you don't just have one such CLI but rather N for N different services.
And so we've arrived at the world of ad-hoc on-the-fly bash scripting that teams writing backend agentic applications in more "traditional"/conservative companies are not going to want.
Don't get me wrong, it's great for claude-code-type local computer automation use cases -- I do the same as you there.
Often in movies you have the scrappy character that rises to the occasion by making a great speech, winning everybody over. I used to love those scenes.
Now, I've realized, in real life they wouldn't have let them finish their first sentence.
stuff like this. if i enjoy a movie but the script simply doesn't check out from a rational perspective (plot holes, implausible behavior, inconsistencies etc.) then i sometimes decide to switch to a fairy tale mental mode where those issues are excused magically. only works with some movies. kingdom of heaven comes to mind.
Project: Hail Mary, a fantasy world where geopolitics are trivially simple and every state in the world collectively agrees how great it would be to cede power and work together. (And therefore enable a genuinely fun and amazing science story which was the actual focus of the book to begin with, 10/10).
This is not quite correct. If a dividend happens, the market capitalisation drops by the amount of the dividend, the number of shares remains constant, so the share price dips by the amount of the dividend per share. All investors get the dividend.
If a buyback happens, the market capitalisation drops by the amount of the buyback, and the number of shares drops by the same ratio, keeping the share price initially constant. The money goes to the investors who sell.
Buybacks are nevertheless good for investors who hold. They now have shares in a company whose market cap is 100% growing enterprise, instead of 90% enterprise and 10% bag of money. That means that if the company keeps doing well, the share price will increase faster than it would have done otherwise (it will also drop faster - it's no longer anchored to an inert pile of cash).
The investors who sell are wealthier by amount $X because now they have fewer shares and more dollars.
The investors who don't sell are wealthier by the same amount $X because the shares they kept are worth more, because prices go up.
> keeping the share price initially constant.
This statement is definitely incorrect, unless you're being very technical and pedantic about "initially". You can think about it theoretically or you can look at empirical evidence. It is well-supported empirically that share prices go up after buybacks, and in fact they do so quantitatively by exactly the amount necessary for the equation implied above to hold.
No, this is incorrect. Investors like buybacks, so when the buyback is announced, share prices may rise, but certainly not by the amount of the buyback. They don't go up when the buyback gets executed, unlike dividends, which decrease the share price at the moment when they get distributed.
The equations are:
nr_shares * share_price = cash_of_company + value_of_company_excluding_cash.
In a buyback, cash_of_company decreases by the buyback, and nr_shares decreases by buyback / share_price.
Consider the extreme case, a lemonade stand with a bank account with $1M. 1000 shares outstanding, share price $1000. After a buyback of $900K is announced, 900 shares are sold for $1000. $100K remains in the company's bank account, 100 shares remain outstanding, at ... $1000 per share.
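Plugging the lemonade-stand numbers into the equation above (assuming, as the example does, that the stand itself is worth nothing beyond its cash):

```python
# nr_shares * share_price = cash_of_company + value_of_company_excluding_cash
cash = 1_000_000
value_excl_cash = 0          # the stand itself is assumed worthless
nr_shares = 1000
share_price = (cash + value_excl_cash) / nr_shares   # $1000

buyback = 900_000
cash -= buyback                                      # $100K left in the bank
nr_shares -= buyback / share_price                   # 900 shares retired

new_price = (cash + value_excl_cash) / nr_shares
print(new_price)  # 1000.0, unchanged by the buyback
```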
That's why they should be phased, and not too steeply.
If they're phased, e.g. at 30% (for every additional €1 you earn, benefits decrease by €0.30), you have the problem that when you are eligible for several of them (e.g. for children, child care, chronic illness, etc.), the benefit reductions add up as well, so you're quickly back at an effective marginal tax rate (EMTR) of 90% or even over 100%.
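With made-up but plausible numbers, the stacking looks like this: a 40% marginal income tax plus two benefits each phased out at 30 cents per extra euro earned already eats the entire extra euro.

```python
# Hypothetical numbers: marginal income tax plus two independently
# phased-out benefits. The phase-out rates stack on top of the tax.
tax_rate = 0.40
phase_out_rates = [0.30, 0.30]   # e.g. child benefit + housing benefit

emtr = tax_rate + sum(phase_out_rates)
print(f"EMTR: {emtr:.0%}")   # 100%: an extra euro earned gains you nothing
```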
You'd think that it wouldn't be beyond the capability of our society to declare that "the EMTR shall be at most 70% at any point in the income curve", and do the math to make it work, but apparently not.
For those that want to stick with thermodynamics, imagine an organism that stores 1% of consumed calories as fat, and uses the other 99%, and that cannot - for whichever reason - turn fat back into calories.
Completely in accordance with thermodynamics, and yet, "just eat less" doesn't work.
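The arithmetic for that hypothetical organism: eating less slows the accumulation but can never reverse it, since the stored fraction is irrecoverable by assumption.

```python
# Hypothetical organism from the comment: 1% of every day's intake is
# stored as fat and can never be turned back into usable calories.
def fat_after_days(calories_per_day, days, fat=0.0):
    for _ in range(days):
        fat += 0.01 * calories_per_day   # stored, never recoverable
    return fat

print(round(fat_after_days(2000, 365)))  # a year at 2000 kcal/day: 7300 kcal of fat
print(round(fat_after_days(1500, 365)))  # "just eat less": still 5475 kcal of fat
```

Energy is conserved at every step; the fat still only ever goes up.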
Software development is a bit like chess. 1. e4 is an abstraction available to all projects, 3. Nc3 is available to 20% of projects, while 15. Nxg5 is unique to your own project.
Or, abstractions in your project form a dependency tree, where the nodes near the root are universal, e.g. C, Postgres, JSON, while the leaf nodes are abstractions peculiar to just your own project.
The possible chess moves are already known ahead of time. Just because an AI can't make up a move like Np5 the way a human could, that doesn't mean AI can't play chess. It will be fine just using the existing moves that have been found so far. The idea that we still need humans to come up with new chess moves is not a requirement for playing chess.
So bizarre! It really shook my belief in Philips' competence at the time.
I mean, take a 100 minute movie, sliced into 1-second clips. 8kB is not even enough to store all possible orders you could put those clips in. I would hate to think so ill of any of my friends or colleagues to think that they could believe such an obvious fraud.
> I mean, take a 100 minute movie, sliced into 1-second clips. 8kB is not even enough to store all possible orders you could put those clips in.
Using a low hurdle to show it still failing is a good rhetorical technique, but you lowered your hurdle too far here. Yes technically specifying the order of 6000 segments takes more than 8KB. Because it takes 8.14KB. That's a rounding error. What could have been a useful argument is now a nitpick. And what if the movie was only 98 minutes, now it fits? What a mixed message.
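The 8.14 KB figure checks out: a 100-minute movie is 6000 one-second clips, a specific ordering is one of 6000! permutations, so writing one down takes log2(6000!) bits.

```python
import math

# 100 minutes = 6000 one-second clips; specifying one particular
# ordering of them takes log2(6000!) bits.
bits = math.lgamma(6001) / math.log(2)   # lgamma(n + 1) = ln(n!)
kib = bits / 8 / 1024
print(f"{kib:.2f} KiB")  # ≈ 8.14
```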
It's a good reference point, but I'd replace "is not even enough" with "would only be enough".
Is a sort of reversible pseudo-hashing function even possible? Or something like a seed in a deterministic procedural generator. You could store arbitrary data in a few bits. 8kB for all the redundancies and metadata even.
On second thought, the compression alone would destroy information. NVM.