Meta-critique: I abhor "highlights" or summaries like this:
> We identify key issues causing digital transformation failure.
> Our findings explain how researchers and practitioners should identify certain pitfalls leading to digital transformation failure.
And what were those issues and findings?
Don't give me a summary of the summary. That's what LLMs are for!
Unfortunately, the abstract is equally unenlightening:
> Our findings indicate a widespread tendency to categorise the DT ecosystem using terms like ‘technology’, ‘information system’, and ‘management’, among others. However, this approach neglects an in-depth examination of the specific and novel aspects of DT that have contributed to its failures.
I'm not going to waste more time on such a badly written research paper. I have no idea what they are trying to say in the abstract, which doesn't bode well for the rest.
For the same reason that nobody publishes a random student's reasoning and exam answers as a textbook: if the reasoning is not guaranteed to be right, why would you want to use it as training material?
We can empirically measure how often the reasoning model is correct. At, say, 95% empirical accuracy, it should still help the model directionally. No training data set needs to be 100% accurate. No?
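The "directionally useful despite noise" claim can be sketched with a toy simulation (pure Python; the 95% accuracy figure and the learning setup are illustrative assumptions, not a claim about any real model): fit the simplest possible one-parameter model to data whose labels are wrong 5% of the time and check that it still recovers the true rule.

```python
import random

random.seed(0)

# True rule: label = 1 if x > 0. Flip 5% of labels to simulate
# training data generated by a "teacher" that is right 95% of the
# time (a toy stand-in for an imperfect reasoning model).
n = 10_000
xs = [random.uniform(-1, 1) for _ in range(n)]
labels = [1 if x > 0 else 0 for x in xs]
noisy = [1 - y if random.random() < 0.05 else y for y in labels]

# Simplest possible learner: pick the threshold t that best fits
# the noisy labels.
def accuracy(t):
    return sum((1 if x > t else 0) == y for x, y in zip(xs, noisy)) / n

best_t = max((t / 100 for t in range(-100, 101)), key=accuracy)
# best_t lands near the true boundary 0.0 despite the 5% label noise
```

The point of the sketch: because the noise is uncorrelated with the inputs, the best fit to the noisy labels is still (in expectation) the true rule; the noise costs sample efficiency, not correctness of the learned direction.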
Why optimize for efficiency though? Why not human flourishing or planetary health, whichever way you wish to define that?
Efficiency sounds to me like an absolutely awful goal for running any society: it turns individuals into disposable cogs of a machine that must be operated smoothly for, well, no obvious reason other than a fetish for seeing the machine run smoothly, no matter the human cost.
Until you manage to eliminate, or at least limit, competition between groups, efficiency is a very important metric to optimize for, lest you get out-competed by others: at minimum your group's influence is reduced, and it may shrink or disappear altogether.
I agree it's not perfect, but I think you are over-dramatizing.
My gut says that planetary health would be wildly improved if we worked to build a more efficient society. Finding meaning in, and caring about, a well-running world would hopefully give us some grounds to flourish on: pride and effort and will to drive us towards something meaningful, beyond the grab-as-much-money-as-you-can state of things today. A collective future worth caring about.
The article talks about deliberately building a social world for people. It seems like they had some care, wanted to try to improve people's social lives too. If anything I think the Technocracy people understood somewhat better that these decisions about how we treat people are not instantaneous questions: maybe we get more productivity out of some people by treating them like crap & working them to collapse, but then we have decades of them being a social and perhaps economic drain on all of society. That short-term exploitation is what capital does to people today already! But no scientist worth their salt is going to create such imbalanced, wasteful systems!
Efficiency can mean a lot of different things. It depends on what you are trying to make efficient, doesn't it? An efficient society, in my view, would be focused on happiness indexes, on Gini coefficients. It would be trying to make our footprint more modest, trying to make goods repairable & sustainable long term. Modern eco-concern has a lot of overlap with many of the basics here.
I'm speculating a lot. But if you don't want to nibble at this food for thought, what other morsels are worth trying? This very much is a case where, again, I find the anti-willpower striking and concerning, the leaping into the negative, over something strange and weird and a bit fanciful and naive. But at least they were working for something to believe in. At least they had a will to do better, one that seems fundamentally, resoundingly, kind of true and kind of needed: a view that is long term, that focuses on building a maintainable long-running order, that doesn't consume until exhaustion.
I'd characterize it as cybernetic: observing & attending to the inputs and outputs of systems, and interested in improving the results by watching how those systems function.
Your characterization feels like one of those seeing-only-evil takes that I'd characterize as anti-willpower.