The tool used in construction for releasing trapped air bubbles from poured concrete is called a concrete vibrator (SFW, if anyone cares to Google it). A vibrating ... ahem, personal toy is actually rather a clever substitute for a small-scale project like this.
It's obviously untrue that technology can't fundamentally alter human communication in a few years. For example, the advent of film, then radio, and finally television caused a convergence of culture at the national and even global level. Characters like Mickey Mouse and the cast of Star Trek are instantly recognized internationally, even by people who have never seen any of the works they star in. There likely isn't anyone here who doesn't remember some catchy commercial jingle from their youth, or a catchphrase from media that entered the national lexicon. And yes, it also affected reasoning: Walter Cronkite, a TV journalist of a past era, was labelled "the most trusted man in America" for the integrity of his reporting. The internet caused a second wave of transformation because it was many-to-many communication instead of unidirectional broadcasting, which allowed the coalescence of subcultures — various fandoms and, infamously, 4chan being examples.
This sort of thing kills stone dead the AI advocates' argument that the transition to LLMs is no different from the transition to using compilers. If output quality can vary significantly because of underlying changes to the model, without warning or recourse, it's a roulette wheel instead of a reliable tool.
That list cherry picks all the successful cases where the technology improved while ignoring the many, many others where it didn't and the technology improved no further. That's dishonest.
It isn't even a good job of cherry picking: we never got mainstream supersonic passenger aircraft after the Concorde because aerospace technology hasn't advanced far enough to make it economically viable, and the slowdown in progress and massive cost increases for cutting-edge semiconductor processes are very well known.
You're not factoring in the list of constraints I provided.
There's no broad social acceptance of supersonic flight because it creates incredibly loud sonic booms that the public doesn't want to deal with. And despite that, it's still a bad counterexample, as companies continue to innovate in this area e.g. Boom Supersonic.
At best you can say, "It's taking longer than expected," but my point was never that it will happen on any specific schedule. It took 400 years for guns to advance from the primitive fire lances of China to weapons with lock mechanisms in the 1400s. Those long time frames only prove my point more strongly. Progress WILL happen when there is appetite and acceptance and incentive and room to grow, and time is no obstacle. It's one of the more certain things in human history, and the forces behind it have been well studied.
Just as certain: the people whose jobs are obsoleted by these new technologies often remain in denial until they are forgotten.
If code quality (whatever that definition happens to be) only stops mattering in 400 years, then the prediction is worthless in terms of what you should do today. You use it to argue that code quality is unimportant and we should just deal with that, but if it's a 400-year payoff, you've made the wrong bet.
It’s really hard to predict where exponential progress will freeze. I was reading the other day that the field seems to have stagnated again: no really meaningful ideas to overcome the bottlenecks we’ve now hit in terms of diminishing returns from scaling. I’m not a pessimist or an unbridled optimist, but I think it’s fundamentally difficult to predict, and the law of averages suggests someone will end up crowing about being right.
> "no conceivable word where the agent will be taken away"
LLM access is a paid service. HN concerns itself with inequality constantly and it's not inconceivable that some individuals get ahead because they can afford to pay for more tokens and better models than those who are poorer.
It was a trade-off for a very long time (late 1960s to late 1990s, IMO): the output of early compilers was much less efficient than hand-written assembly language, but it enabled less skilled programmers to produce working programs. Compilers pulled ahead when processor ISAs eventually evolved to optimize the execution of compiler-generated code (e.g. the CISC -> RISC transition) and optimizing compilers became practical thanks to more powerful hardware. It definitely was not an overnight transformation.
That's the problem though: programmers who become the equivalent of McDonald's workers will be paid poorly like McDonald's workers and be treated as disposable like McDonald's workers.
The calculator analogy is wrong for the same reason. Knowing and internalizing arithmetic, algebra, the shapes of curves, etc. are the rungs you climb to reach higher mathematics and become a mathematician or physicist. You can't plug-and-chug your way there with a calculator and no understanding.
The people who make the calculator analogy are already victims of the missing rung problem and they aren't even able to comprehend what they're lacking. That's where the future of LLM overuse will take us.