Bohmian mechanics is based on the idea that we perceive stuff to be in a certain position in a single reality because there is a correspondence to stuff being actually there. That's nice. If the particles are surfing a wave and not impacting it, so be it.
It is also rather nice to think of the particles as just being points in space with nothing else associated with them; an electron is just an electron because the portion of the wave function that is relevant and guiding it is the electron portion; see a paper from 2004 entitled "Are all particles identical?" [1] (I am a coauthor on that). If one thinks about it, we only know about particles through their motion so having things like mass and charge linked to the object guiding the particle seems perfectly reasonable. Points are not only not labelled by numbers (particle 1, 2, etc) but also not labelled by mass and charge.
The nondeterminism of not knowing the initial conditions is fine; the point was to have a theory with well-defined objects that give some plausible story and connection to our experiences, such as stuff existing and being somewhere. The fact that non-relativistic Bohmian mechanics happens to be deterministic is just happenstance for many of its supporters. In some QFT versions, the dynamics of creation is not deterministic and there is no reason for that to be a problem. But it is well-specified without having to invoke some special magic action called "observation".
As for QFT, the biggest problem for Bohmian mechanics is the need to have an actually well-defined evolution of the wave function. The idea of particles being created and annihilated is not particularly hard. And, in fact, recent work has shown that if one takes that seriously and respects probability leaking from the n-particle sector to the (n+1)- and (n-1)-particle sectors, then at least some of the divergence problems go away. See [2].
> Bohmian mechanics is based on the idea that we perceive stuff to be in a certain position in a single reality because there is a correspondence to stuff being actually there. That's nice. If the particles are surfing a wave and not impacting it, so be it.
At that point it's very obviously a violation of Occam's razor though. It's like positing that the content of my field of vision is an objectively real thing, that the reason the universe looks like a video projection is that there really is a video projection going on, even though that video projection has no physical effect.
> If one thinks about it, we only know about particles through their motion so having things like mass and charge linked to the object guiding the particle seems perfectly reasonable.
Indeed. But if one thinks a little more, what's the point of positing a particle at all, if all of the physics is in the pilot wave?
The definite positions underlying your brain states evolve in correlation with the positions of all the other stuff. The other particles do have an effect on your evolution, and there is a "you" set of particles one can talk about. Remember that the wave function is a function on configuration space, so evaluating its guiding effect on the particles requires knowing which point in configuration space the system is at; this is actually the troubling bit and leads to the nonlocality concerns, but that problem is common to any quantum theory in which definite results happen.
The physics, therefore, is not all in the pilot wave. If you take as the point of a particle theory that there should be particles with positions changing in time, then that is what is being given in Bohmian mechanics.
Also, ask yourself, if the wave function is on configuration space, what constitutes a configuration? In Bohmian mechanics, it is clear, but if the wave function is all there is, then why are we talking about configuration space at all? It is just this abstract vector in Hilbert space evolving and many different representations can happen. Why do we not perceive reality in terms of these other representations?
If it helps, you can think of the wave function a bit like a dynamical law. In [1], the authors suggest thinking of log(psi) analogously to the Hamiltonian H on phase space in classical mechanics. There is no back action on H, and most of it is irrelevant to the evolution of a particular particle system in that framework, and yet everyone recognizes it as just a convenient way of describing the dynamics.
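For concreteness (this is the standard Bohmian notation, not something spelled out in this thread): the guiding law is literally a gradient of Im log psi, much as H generates the classical flow:

```latex
% guiding equation for particle k
\frac{dQ_k}{dt} \;=\; \frac{\hbar}{m_k}\,\nabla_k\,\mathrm{Im}\,\log\psi(Q_1,\dots,Q_N,t)

% compare Hamilton's equations, where H generates the dynamics
\dot{q} = \frac{\partial H}{\partial p}, \qquad \dot{p} = -\frac{\partial H}{\partial q}
```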
The difference is that psi evolves, but even that may only be true from a subsystem point of view. It is theoretically possible to have a static universal wave function which, when the particular particle positions of the environment are plugged in, nonetheless gives evolving subsystem wave functions.
Occam's razor is difficult to apply here without prejudice. If you want to minimize the number of equations, then sure, "the wave function is everything" works, but it comes at the cost of what could be considered an infinite number of "you"s and everything else, all slightly different, along with entire other expressions of the universe with no connection to us. If you want collapse somewhere, then you have to posit that mechanism.
On the other hand, by adding in particles and the guiding equation, one gets a singular "you" and everything that we experience is, more or less, definite and singular. So the "existing" stuff is dramatically reduced.
Which of these is truly simpler is a matter of taste, I would say. In terms of communicating with people, the Bohmian version of "there is this universal wave and the positions of stuff are guided by it" is pretty simple. The guiding law itself is so trivially a part of the Schrödinger equation that it could easily have been derived before the Schrödinger equation itself. Contrast this with the other versions: "reality collapses to a definite state when we look at it" or "there are infinitely many different universes". Neither seems as simple.
> there is a "you" set of particles one can talk about
We know that particles don't have identity though - exchange of identical particles is a symmetry and physics would be very different if it wasn't. I won't claim it's compelling, but to me that suggests that a particle is more like a pattern or a field excitation than a thing with its own concrete existence.
> Why do we not perceive reality in terms of these other representations?
What would be different if we did? I mean obviously at a macroscopic level particles moving through space is a model that gives a good approximation and is easy to think in, but that doesn't mean they're any more physically real than e.g. temperature.
In physics, particles not being labelled by anything other than their trajectories is a very natural starting point. When one uses the natural configuration space, one without labels in which a configuration is a set of n points in physical space rather than an ordered n-tuple, the complex-valued wave functions on that space are exactly those of boson type. To get the fermions, one replaces the value space with a one-dimensional complex vector bundle over the configuration space, one which twists in the right way. A paper I coauthored explores this in a general context: [1]
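In symbols (my sketch of the setup, not notation lifted from [1]): the unlabelled configuration space consists of the n-point subsets of physical space, and a complex-valued wave function on it corresponds to a permutation-symmetric, i.e. bosonic, function on the ordered space:

```latex
% unordered configuration space of n points in R^3
{}^{n}\mathcal{Q} \;=\; \{\, S \subset \mathbb{R}^3 : |S| = n \,\}

% a function on this space pulls back to a symmetric function on (R^3)^n:
\psi(x_{\sigma(1)},\dots,x_{\sigma(n)}) \;=\; \psi(x_1,\dots,x_n)
\qquad \text{for all } \sigma \in S_n
```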
The "you" is then a rough set of particles whose trajectories roughly coincide with your macroscopic trajectory. Their identity is just given by where they are.
As for representations, I can easily understand how to get momentum or temperature from particles with their time evolution (trajectories), but I do not see how to get, say, the positions of particles just from knowing their momenta and time evolution.
But we don't have even a set of definite trajectories. If we see e.g. an electron coming towards a hydrogen atom and then an electron moving away from it, not only do we not know whether the incoming electron "bounced off" or whether it settled into the orbital and "kicked" the electron that was already there out, but in a fundamental physics sense what occurred is some weighted average of both (in the same way that we don't merely "not know" which of the two slits an electron went through but in an important physical sense it partially went through both).
It depends on the theory. The Bohmian theory, which is what I have been using, is one in which electrons have actual positions that change over time along trajectories. We may not have access to that data, but that is fine. Certainly in simulations one would be able to see which scenario happened. For some, it might be the same electron moving away, for others it would be kicking one out. One could definitively say which one is happening in the simulation. In experiments, we cannot say that because our access to the knowledge is limited by quantum equilibrium. The quantum formalism is very much like thermodynamics in that regard; the individual details are missing, but the larger picture can be computed. Nevertheless, in a Bohmian world, the electrons have their distinct identities as distinguished by, and only by, their trajectories.
Imagine you are already a great writer but want to learn more about asking questions and coming up with interesting angles. Then collaborating with an AI that does the grunt work seems a natural fit. You may also want to improve editing skills rather than writing skills. By saving the time and energy of writing, editing may become something you have the time to really get good at.
In other courses, curiosity rather than mastery may be what is relevant. So again, asking questions and getting somewhat reliable answers, to be treated with skepticism, could be of great benefit. Obviously, if you want to get good at something the AI is doing, then you need to do the work first, though the AI could be a great questioner of your work. The current unreliability could actually be an asset for those wishing to learn in partnership with it, much as working with peers is helpful precisely because they may not be right either, in contrast to working with someone who has already mastered the subject. Both have their places, of course.
I think Norm Wildberger's videos are very useful to think about and enjoy, though certainly one should reflect carefully on what he says. His videos inspired my work on a new definition of real numbers. The basic idea is that a real number is the set of all rational intervals that contain it. Since that is circular as stated, the actual definition consists of the properties that characterize such a set of rational intervals. This approach is equivalent to Dedekind cuts and does not address Wildberger's concern.
But there is another definition, which I call oracles, that does a better job. It is much more constructive: it is about a procedure that one can ask whether a fuzzy version of a given interval contains the real number. There are various properties such a procedure must satisfy and, if it does, it will generate the set of intervals that contain the real number when taken out to infinite precision.
So basically, it is a two-part definition. There is a theoretically perfect version and then another that yields to the practical problem of not being able to actually specify a real number entirely.
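As a toy illustration (my own sketch, not code from the papers, and with the "fuzzy" aspect omitted): an oracle for sqrt(2) that answers interval-membership queries exactly, which bisection can then use to narrow in on the number:

```python
from fractions import Fraction

def sqrt2_oracle(a, b):
    """Toy oracle for sqrt(2): does the rational interval [a, b]
    contain sqrt(2)?  Decided exactly by comparing squares, so no
    floating point is involved.  (Assumes 0 <= a <= b.)"""
    a, b = Fraction(a), Fraction(b)
    return a * a <= 2 <= b * b

# Bisection driven only by yes/no answers from the oracle produces
# ever smaller rational intervals containing sqrt(2).
lo, hi = Fraction(1), Fraction(2)
for _ in range(20):
    mid = (lo + hi) / 2
    if sqrt2_oracle(lo, mid):
        hi = mid
    else:
        lo = mid
print(float(lo), float(hi))  # both endpoints ≈ 1.41421...
```

Querying the oracle on successive intervals is all that is needed; the real number itself never has to be written down in full.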
If interested, the papers are hosted on GitHub [1]. The most recent version going over what I just said is Real Numbers as Rational Betweenness Relations [2].
That is a great paper. I came up with my own definition of real numbers [1] and using the simplified cut property of that paper was a pleasant pathway to establishing completeness.
As far as I can tell, they are unrelated beyond the relation all definitions of real numbers would have with it. There are superficially similar Left and Right boundaries defining an interval that contains the number, but there does not seem to be a family of intervals for a given real number in that approach.
I do not get the feeling that there is a narrowing down to a given number in the surreal numbers, but I know very little about them.
I also do not immediately see how my construction could be extended to infinitesimals or the infinite numbers.
I have been working on a new definition of real numbers which I think is a better foundation for real numbers and seems to be a theoretical version of what you are doing practically. I am currently calling them rational betweenness relations. Namely, it is the set of all rational intervals that contain the real number. Since this is circular, it is really about properties that a family of intervals must satisfy. Since real numbers are messy, this idealized form is supplemented with a fuzzy procedure for figuring out whether an interval contains the number or not. The work is hosted at (https://github.com/jostylr/Reals-as-Oracles) with the first paper in the readme being the most recent version of this idea.
The older and longer paper of Defining Real Numbers as Oracles contains some exploration of these ideas in terms of continued fractions. In section 6, I explore the use of mediants to compute continued fractions, as inspired by the old paper Continued Fractions without Tears ( https://www.jstor.org/stable/2689627 ). I also explore a bit of Bill Gosper's arithmetic in Section 7.9.2. In there, I square the square root of 2 and the procedure, as far as I can tell, never settles down to give a result as you seem to indicate in another comment.
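A minimal sketch of the mediant idea (my own illustration, not Gosper's arithmetic): repeatedly replacing an endpoint of a bracketing interval with the mediant walks down the Stern-Brocot tree toward sqrt(2), and the continued-fraction convergents 3/2, 7/5, 17/12, ... appear as intermediate endpoints along the way:

```python
from fractions import Fraction

def mediant(p, q):
    """Mediant of fractions p and q: add numerators and denominators.
    (For Stern-Brocot neighbours the result is already in lowest terms.)"""
    return Fraction(p.numerator + q.numerator,
                    p.denominator + q.denominator)

# Keep a bracketing interval [lo, hi] around sqrt(2) and tighten
# whichever endpoint the mediant improves on.
lo, hi = Fraction(1), Fraction(2)
for _ in range(40):
    m = mediant(lo, hi)
    if m * m < 2:
        lo = m
    else:
        hi = m
print(lo, hi)  # two large-denominator fractions tightly bracketing sqrt(2)
```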
For fun, I am hoping to implement a version of some of these ideas in Julia at some point. I am glad to see a version in Python and I will no doubt draw inspiration from it and look forward to using it as a check on my work.
It is equivalent to Dedekind cuts as one of my papers shows. You can think of Dedekind cuts as collecting all the lower bounds of the intervals and throwing away the upper bounds. But if you think about fleshing out a Dedekind cut to be useful, it is about pairing with an upper bound. For example, if I say that 1 and 1.1 and 1.2 are in the Dedekind cut, then I know the real number is above 1.2. But it could be any number above 1.2. What I also need to know is, say, that 1.5 is not in the cut. Then the real number is between 1.2 and 1.5. But this is really just a slightly roundabout way of talking about an interval that contains the real number.
Similarly with decimals and Cauchy sequences, what is lurking around to make those useful is an interval. If I tell you the sequence consists of a trillion approximations to pi, to within 10^-20 precision, but I do not tell you anything about the tail of the sequence, then one has no information. The next term could easily be -10000. It is having that criterion about all the rest of the terms being within epsilon that matters and that, fundamentally, is an interval notion.
If one embraces rational intervals throughout, they can be the computational foundation, and the UX could have the option of displaying the interval for the complete truth or, to give an intuitive sense, a number picked from the interval, such as the median or mediant. Presumably this would be a user choice in any given context.
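A hypothetical sketch of such a display option (the helper and its mode names are made up): keep exact rational intervals internally and let the user pick the rendering:

```python
from fractions import Fraction

def display(lo, hi, mode="interval"):
    """Hypothetical UI helper: render a rational interval either in
    full or as a single representative point inside it."""
    lo, hi = Fraction(lo), Fraction(hi)
    if mode == "interval":
        return f"[{lo}, {hi}]"
    if mode == "median":   # arithmetic midpoint of the interval
        return str((lo + hi) / 2)
    if mode == "mediant":  # (a+c)/(b+d) of the reduced endpoints
        return str(Fraction(lo.numerator + hi.numerator,
                            lo.denominator + hi.denominator))
    raise ValueError(mode)

# The mediant often gives a strikingly simple representative:
print(display(Fraction(314, 100), Fraction(315, 100), "mediant"))  # → 22/7
```

The mediant tends to pick out the fraction with the smallest denominator consistent with the interval, which is why it reads so naturally.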
EPR demonstrated that if you insist on locality and experiments having results in accordance with quantum predictions, then there must be pre-existing elements of reality that determine the results. Bell demonstrated that pre-existing elements of reality and the quantum results of experiments are incompatible with locality. The two together imply that if you assume experiments have results when we say they do, in the way quantum mechanics predicts, then there is something nonlocal going on. Hidden variable theories neither add to nor remove the problem of nonlocality, though Bohmian mechanics makes it very clear what the mechanism is.
Many worlds gets around this since experiments do not have definite results in that theory. Instead, the experimenter splits into multiple copies, each of which thinks there is a result of the experiment, but that is an illusion.
No, that is only the case if you assume the experimenter is free to choose the experiment. Bell shows that in that case either locality or realism is violated.
However, there is a simpler explanation, namely superdeterminism. You are in fact not free to choose the experiment, this choice also has physical causes.
Guesstimation of the numbers is not promising. The article mentions 1 to 3 million as targets, let's say 2 million as an asset at retirement. At 4% withdrawal, that gives 80k a year. Median wage is about that.
There are about 4 million people in each one-year age cohort below, say, 80 (80 * 4 million = 320 million, roughly the US population). Let's say a generation is 20 years. So for Gen X to all retire at this comfortable level, that would be 4 million * 20 * 2 million = 160 trillion dollars in assets. This is staggered, but still.
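The back-of-the-envelope arithmetic above can be checked in a few lines (the inputs are the rough guesses from the text, not data):

```python
# Rough numbers from the text above (guesses, not data).
nest_egg = 2_000_000         # assumed assets at retirement
withdrawal_rate = 0.04
annual_income = nest_egg * withdrawal_rate
print(f"{annual_income:,.0f} per year")   # 80,000 per year

cohort_per_year = 4_000_000  # people per one-year age cohort
generation_years = 20
total_assets = cohort_per_year * generation_years * nest_egg
print(f"{total_assets / 1e12:.0f} trillion in assets")  # 160 trillion
```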
The total US stock market value was recently about 55 trillion [1] and 110 trillion for global stock market [2]
These numbers seem a bit off from each other. Looking at the chart for US growth, the value doubled from 1998 to 2014, then doubled again from 2014 to now. The latter doubling is presumably due to the massive injection of money (inflation) over the past 5 years, which seems supported by the massive jump in value in 2020 on the chart.
And here is some analysis of GDP relative to total market value [3]. Also not a promising conclusion: it has market value at 60 trillion at the moment and GDP at 30 trillion.
1225: ten years earlier, Magna Carta starting to limit monarchs and the seed of individual freedom
1681: eight years later was the Glorious Revolution with a Bill of Rights, marking individual freedoms
1764: ten years later, beginning of American Revolution and being free of monarchs
1849: ten years-ish later, start of the US Civil War; this was also the era of British attempts to end slavery around the world
1936: ten years later, colonial empires were being dismantled, UN established to attempt global cooperation, US in the ascendancy with a seed of ties being established more by economics than military force, great economic upswing lifting people out of poverty (60% in poverty then, 10% now) while the global population blossoms
2035: Majority of the global population in middle class or better, triumph of individuals over technocrats, bureaucrats, and corporatists :)
1: https://arxiv.org/abs/quant-ph/0405039 2: https://arxiv.org/abs/1809.10235