Complexity can only ever be moved, never removed. This means a complex problem is impossible to make easy; you can only make one part of it easier at an equal or worse cost to the other parts.
This is best seen in abstractions: they make certain functions much easier to do on a routine basis, but they are less flexible than the functions they represent. So you end up creating more and more abstractions and special cases for them, and in the end you have the same or more chaos than you started with.
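A minimal Python sketch of that accretion (names and scenario are hypothetical, just for illustration): a helper is written for one routine case, then each new requirement becomes another parameter and branch inside it, until the "simplification" mirrors the complexity it was meant to hide:

```python
# Version 1: the abstraction starts life doing one routine thing well.
def render_report(data):
    return "\n".join(data)

# Version 2: each new scenario becomes another parameter and branch.
# The helper is now its own small problem space to learn.
def render_report_v2(data, fmt="txt", header=None, sort=False):
    if sort:
        data = sorted(data)
    if header is not None:
        data = [header] + list(data)
    sep = "," if fmt == "csv" else "\n"
    return sep.join(data)
```

The complexity hasn't gone away; it has moved from the call sites into the helper's growing parameter list.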
This results in what I'd call futile work: fighting it without even realising. You can't do something complex with less, or it wasn't that complex in the first place.
To make a truly malleable system is to make a fractal, which is by definition impossible to solve. A problem (fractal) always requires a problem area and an accuracy in order to be solved, to that accuracy. There is _always_ an equal or worse tradeoff.
Abstractions in software are just that: we write a piece of code to do what we want to do, and then we only supply parameters. Once we want to change what we are doing, we either have to go back to basics and come back with a new piece of code, or somehow piggyback on the existing code to fit the new scenario. The problem is that after we've written the first piece of code, we consider the thing solved. The first piece plus parameters becomes the way of doing things, and we are very reluctant to go back to basics again. So we start to pile up abstractions. But we don't have to. We can totally go back to basics and reduce the number of abstractions.
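A hypothetical Python sketch of the two paths (the functions and scenario are invented for illustration): piggybacking the existing abstraction with yet another parameter, versus going back to the underlying operations and composing them fresh for the new scenario:

```python
# Path 1: piggyback. The original abstraction grows a flag for
# every scenario it was never designed for.
def total(prices, discount=0.0, tax=0.0, round_cents=False):
    t = sum(prices) * (1 - discount) * (1 + tax)
    return round(t, 2) if round_cents else t

# Path 2: go back to basics. Keep small primitives and compose
# them per scenario instead of parameterising one blob.
def apply_discount(t, d):
    return t * (1 - d)

def apply_tax(t, r):
    return t * (1 + r)

# The new scenario is just a fresh composition of primitives:
def total_with_tax(prices, rate):
    return apply_tax(sum(prices), rate)
```

The second path means occasionally rewriting a composition, but each piece stays simple enough to actually go back to.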
We cannot overcome the inherent complexity of the task (The Law of Requisite Variety) but it's not this complexity that bites us: it's the complexity of added abstractions.
I'm not sure I get what you mean. It's just that it's incredibly difficult to signal anything with a tool that exists for the sole purpose of utility. It's like trying to sell cotton with signaling; it doesn't really evoke any kind of emotion.
>We do things to convince ourselves of our qualities
This just seems like basic logic, though? If you want to be X and Y provides the means, then it makes sense to do it, right? There can't be a hidden agenda if you're actively pursuing it.
It's understandable to be surprised; it's not every day that everyone needs resources at the same time. Although some foresight a month earlier couldn't have hurt.
Not really. The majority of cases of the common cold are caused by rhinoviruses, not coronaviruses. Then again I would gladly take a 10% cut in the common cold...
We're talking about whether or not Amazon should ship things like clothes on a 2-day or a 4-week schedule during a global pandemic. I think a 4-week schedule is fine because Amazon needs to concentrate on shipping essentials, and clothes that people buy from Amazon aren't essentials. If you're suggesting that Amazon should prioritise clothes in case the buyer has literally no other wearable clothes, then you're going to need to justify that argument.
Anyone can make spurious points about extremely unlikely edge cases, but suggesting Amazon should plan their logistics around them is not a helpful contribution.
Not everyone can afford multiple sets of clothing for themselves. A lot of people live in poverty, day to day, with what they've got. Denying access to all clothing will undoubtedly cause distress.
Automation really is all progress made by humans. Chainsaws replace hand-saws; they are 'automated' sawing.
The less work required to produce a good, the better it competes on price. This means you _don't have to_ work as much to achieve the same quality of life.