The idea is that if storing a base-r digit costs proportionally to r, then base 3 (or base e on a continuous scale) turns out to be the most efficient. Of course, there's no a priori reason to think a 3-level gate costs exactly 1.5x as much as a 2-level gate, so this is mostly of theoretical interest.
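A quick sketch of the standard measure (radix economy): cost b per digit, times the digit count. The asymptotic per-value cost is b / ln(b), which is minimized at b = e; among integer bases, 3 wins (2 and 4 tie exactly, since 4/ln 4 = 2/ln 2):

```python
import math

def economy(b, n):
    """Classic radix economy: b * (number of base-b digits of n)."""
    digits = 0
    while n > 0:
        n //= b      # exact integer arithmetic, no float log issues
        digits += 1
    return b * digits

# Per-digit cost b / ln(b), minimized at b = e (~2.718)
per_digit = {b: b / math.log(b) for b in (2, 3, 4, 10)}
```

For example, economy(3, 10**6) = 3 * 13 = 39, which narrowly beats economy(2, 10**6) = 2 * 20 = 40.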
I'm thinking about how this would apply to the human psychology of reading and writing numbers. There it doesn't make sense to measure economy as b * floor(log_b(n) + 1), because adding more symbols doesn't increase complexity linearly for a person reading or writing numbers. Maybe something like E(b, n) = f(b) * g(floor(log_b(n) + 1)), where f stays roughly constant up to 10 or 20 symbols and then increases, and g grows faster than linearly, since shorter numbers are easier to read than longer ones.
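To make that concrete, here's a sketch with purely illustrative choices of f and g (the comment only constrains their shapes, not their exact forms — the flat-until-20 threshold, the 0.1 slope, and the 1.5 exponent below are all made-up parameters):

```python
def num_digits(n, b):
    """Number of base-b digits of n (exact integer arithmetic)."""
    d = 0
    while n > 0:
        n //= b
        d += 1
    return d

def f(b):
    """Hypothetical symbol cost: flat up to ~20 symbols, rising after."""
    return 1.0 if b <= 20 else 1.0 + 0.1 * (b - 20)

def g(d):
    """Hypothetical length cost: superlinear, since longer strings are harder to read."""
    return d ** 1.5

def E(b, n):
    return f(b) * g(num_digits(n, b))
```

Under these guesses the optimum shifts well above base 3: for n = 10**6, base 16 (5 digits, still under the symbol threshold) scores better than both base 2 (20 digits) and base 10 (7 digits).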
I work in the Aus data space. I've met a lot of intelligent PhD data scientists, but it's very rare to meet one who can scale their work efficiently across the org. The amount of time wasted on models that never make it to production is ridiculous.
I always assumed this line uses the traditional order of the planets from the sun and references the common trope that "Men are from Mars, women are from Venus".