Problems with Bohmian mechanics
(also called the de Broglie-Bohm interpretation of quantum
mechanics)
This is the most popular interpretation of quantum physics in terms of hidden
variables, the hidden variables being the "positions of particles". These hidden
variables behave in a "deterministic" way which, in order to fit the EPR paradox,
must admit non-local (instantaneous, faster-than-light) interdependence between these
positions throughout the universe, with respect to a supposed absolute time, i.e. a
simultaneity relation (a partition of the universe into space-like 3D slices), which is itself hidden too.
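For reference, here is the standard non-relativistic statement of these dynamics (added here for concreteness; it is the textbook form for spinless particles, not a formula taken from elsewhere on this page). The configuration Q = (Q_1, ..., Q_N) of all "particle positions" is guided by the universal wavefunction Ψ, and the non-locality is explicit since the velocity of each particle depends on the instantaneous positions of all the others:

\[ \frac{dQ_k}{dt} = \frac{\hbar}{m_k}\,\operatorname{Im}\!\left(\frac{\nabla_k \Psi}{\Psi}\right)\!(Q_1,\dots,Q_N,t), \qquad i\hbar\,\frac{\partial\Psi}{\partial t} = \hat H\,\Psi . \]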
There are several big problems with this interpretation.
Problem 1 : breaking relativistic invariance
Compatibility troubles with General Relativity
Bohmian mechanics breaks relativistic invariance by requiring the
choice of an absolute time, relative to which its laws are
processed. This gap is well known, but I have not read much that
takes the full measure of how big it is. Merely assuming an
initial choice of an absolute frame at the beginning of time, as
would fit a description in the framework of Special Relativity,
does not suffice, because physical space-time does not
(approximately) fit Special Relativity but General Relativity.
And in General Relativity there is no natural way, even once a
space-like slice of space-time has been chosen as the definition
of the "initial time" (a slice that has no reason to be flat), to
determine how space-time will have to be sliced into further
"simultaneous" classes of events in the future. If we try the
"simplest solution", namely taking as time parameter the age (the
time elapsed since the Big Bang), so as to define simultaneity as
equality of age, this cannot work because it runs into lots of
singularities (especially at the centers of planets and stars).
On the other hand, quantum physics itself has no fundamental
incompatibility with the symmetry principles of General
Relativity, as explained by Carlo Rovelli, one of the founders of
Loop Quantum Gravity, a theory built with the care of fully
keeping both the founding principles of quantum physics and those
of general relativity.
(They may reply that non-locality and the violation of Lorentz
invariance are not just a problem with their interpretation, but a
general problem with quantum physics, and more precisely with any
interpretation that accepts actual randomness and the selection of
an actual outcome of measurements, i.e. any interpretation that is
not many-worlds.)
It only interprets the Schrödinger equation (i.e.
non-relativistic quantum mechanics), not Quantum Field Theory
The Stanford Encyclopedia of Philosophy article about Bohmian
mechanics tells at length what a wonderful interpretation it is of
the (non-relativistic) Schrödinger wave equation, as if quantum
mechanics and the Schrödinger equation were the same thing. I am
afraid its authors are not taking the full measure of the fact that
the core of their argumentation is beside the point. As they only
mention later, in a last section, the real big issue is where this
interpretation fails :
The theory of quantum physics that is currently well established as
the proper description of physical reality, and that begs for an
interpretation, is NOT the Schrödinger wave equation, but the full
quantum theory, that is, Quantum Field Theory. Indeed, without this
extended framework it is not even possible to account for the possibility for
an atom to absorb or emit a photon (while switching between energy levels),
an event which actually occurs quite often ! Seriously, what a terribly small subset
of known physics is this little Schrödinger equation they are so proud of
better explaining or visualizing in their way !
And adapting Bohmian mechanics to interpret quantum field theory is
far from obvious. As replied to a question: De Broglie-Bohm Quantum Theory.
This article
(Jul 2004) admits in its conclusion that "We leave open, however,
three considerable gaps: the question of the process associated with
the Klein–Gordon operator, the problem of removing cut-offs, and the
issue of Lorentz invariance."
A detailed review of the situation can be found in Wallace's article
The Quantum Measurement Problem: State of Play, section 7
(Relativistic quantum physics); there is also a more recent
article supporting Bohmian quantum field theory (2011).
Some work has been done trying to fill this gap : see for instance a
report I stumbled on.
And even if a candidate Bohmian Quantum Field Theory is offered, it
remains to check that it will avoid the trouble with the treatment
of randomness that is explained below.
Problem 2 : the nonsense of deterministic
randomness (like in classical deterministic chaos)
Of course the unfalsifiable character of this theory (the
ineffectiveness of assuming deterministic causes from hidden
variables that cannot be checked) is clearly on the table, in the
sense that it is "only an interpretation" without any different
prediction (the prediction is the same : pure randomness). However,
I will comment here in detail on how deep I see this problem going.
In short : Bohmian mechanics claims to be a deterministic theory,
but I see no way in which this supposed quality of "determinism"
can make any meaningful sense : even if true it would still just
be an empty box that does not provide any effective answer about
whether the world is deterministic, probabilistic, or even...
divinely guided.
The same remark goes for classical "deterministic chaos",
which many philosophers take to be a form of determinism
(strangely, they still often refer to it when discussing
determinism, though classical mechanics is outdated as a
fundamental description of the universe).
Indeed, classical mechanics is essentially of the same kind as
Bohmian mechanics, in the sense that their claimed
"determinism" is illusory for the same reason, which is explained
below.
Why the "determinism" of classical mechanics (as a purely
theoretical concept disconnected from this world) is an empty
concept
Even a hypothetical universe totally described by some
"deterministic" classical physics can be said to produce absolutely
random phenomena too, though leaving an explanatory gap as to the
nature of this randomness.
Giving a real number between 0 and 1 is essentially the same as
giving an infinity of binary digits, namely the binary
expansion of that number.
The computation of a continuous function of such a number can be
carried out by successive approximations, as computations from the
unlimited data of its binary digits (taken in finite amount, as many
as needed for the required accuracy of the result).
In this way, a reasoning about a continuous variable can
equivalently be expressed as if it were about a potential infinity of
discrete variables.
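To illustrate this point, here is a minimal Python sketch of my own (the function f(x) = x(1-x) and the value x = 1/3 are arbitrary choices for the example): any desired accuracy on f(x) is reached from a finite prefix of the binary digits of x.

    from fractions import Fraction

    def truncate(x, n):
        """Keep only the first n binary digits of x in [0,1): a lower bound for x."""
        return Fraction(int(x * 2**n), 2**n)

    f = lambda t: t * (1 - t)    # an arbitrary continuous function
    x = Fraction(1, 3)           # stands for a real number given digit by digit

    for n in (4, 8, 16, 32):
        lo = truncate(x, n)              # what the first n digits tell us
        hi = lo + Fraction(1, 2**n)      # x is pinned inside [lo, hi)
        print(n, float(f(lo)), float(f(hi)))   # the bounds on f(x) converge as n grows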
The reality assumed by classical mechanics, as well as the hidden
variables assumed by Bohmian mechanics, consist of continuous variables.
Thus, they are equivalently expressible as infinite series of
discrete variables (to be progressively explored).
Consider a physical system whose initial state is described by some
quantity x=5.7843.....
After some chaotic process, it arrives at a final state described
by another quantity y that we can measure up to 5 decimals.
The problem is that determining these first 5 decimals of y
from the exact value of x must take into account, say, the
first 1,000,010 decimals of x, as even a change in the
millionth decimal of x may modify the value of y by
several units.
In this situation, how can you meaningfully claim that "the first 5
measured decimals of y are not fundamentally random" ?
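A minimal sketch (my own, using the doubling map T(x) = 2x mod 1 as a stand-in for a generic chaotic process) makes this dependence on deep digits literal: each step discards one leading binary digit, so after n steps the visible digits of the state are exactly the digits of the initial condition from rank n+1 onward.

    from fractions import Fraction
    import random

    def bits(x, n):
        """First n binary digits of x in [0, 1)."""
        out = []
        for _ in range(n):
            x *= 2
            out.append(int(x))
            x -= int(x)
        return out

    random.seed(0)
    x0 = Fraction(random.getrandbits(64), 2**64)   # initial condition, 64 known digits

    x, steps = x0, 50
    for _ in range(steps):
        x = (2 * x) % 1        # doubling map, in exact arithmetic (no rounding error)

    print(bits(x, 10))                     # leading digits of the final state...
    print(bits(x0, steps + 10)[steps:])    # ...are digits 51 to 60 of the initial one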
Indeed, even if a continuous variable is conceived as
"well-determined", in practice this only means that its first few
digits are well-determined. For all practical purposes, beyond a
certain scale of detail, all the following digits down to infinity
behave pretty much like purely random data.
So, as an excellent physical approximation, we can "really"
consider the first decimals of the final y as "absolutely
random", in the sense that, whatever the first thousand decimals of x
are, it remains an excellent physical approximation
to qualify its next thousands of decimals as "absolutely random".
Finally, a classical mechanistic chaotic system quickly fills up with
"absolute" randomness, because the parameters of the initial state
contain an infinity of digits of precision which, after the first few
ones, necessarily turn out to be "absolutely random" for all practical purposes; and
these random digits intervene in macroscopic behavior very quickly.
One might react by assuming that there is a fundamental difference
between the "practical" and the "essential" versions of randomness.
Indeed there are cases where the expression "for all practical
purposes" refers to subjective, relative human criteria,
so that these "practical properties" can be
dismissed as irrelevant when discussing what is essential. But there
are other cases, such as the one discussed here, where this
distinction between the "essential" and the "emergent" or
"practical" properties is effectively blurred - not just hidden but
really broken. Such phenomena, where essentialism
fails because the difference between what is "essential" and
what is "practical" actually vanishes, can be found in other
concepts as well.
A risk of trouble with Bohmian Quantum Field theories: the
divergence of behavior in finite time
This problem with classical "deterministic" chaos might even be
worsened by the possibility of a fractal process in which the first
decimal of a quantity depends on the 2nd decimal of its value 1
second before, which depends on the 3rd decimal 0.5 second before
that, which depends on the 4th decimal of its value 0.3 second
before that, and so on, so that the number of decimals it depends on
reaches infinity within a finite time interval.
And if I am not mistaken, this is how things actually happen in fluid
mechanics, in the macroscopic formalization by partial differential
equations (which ignores the behavior of individual molecules).
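A minimal way to formalize this scenario (my own sketch, with an arbitrary choice of the time steps): if tracing the dependence one digit deeper only requires going back a further Δt_n = 2^{-n} seconds, then the total time needed to reach an infinite depth of digits is finite,

\[ \sum_{n=1}^{\infty} \Delta t_n = \sum_{n=1}^{\infty} 2^{-n} = 1 \ \text{second}, \]

so that the state at time t = 1 s already depends on infinitely many digits of the initial data, i.e. on an infinite amount of information.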
Now for Bohmian mechanics: even if such a divergence does not occur
for the Schrödinger equation, how can we seriously expect to stay
safe from such a divergence in any candidate Bohmian interpretation
of quantum field theory, including compatibility with
renormalization (which is itself a sort of fractal process) ?
In addition to this risk of divergence coming from the infinitely
small, with the fractal structures of Feynman diagrams, comes the risk of
divergence coming from the infinitely big: from the influence of the rest of the
Universe, since according to Bohmian mechanics the evolution of the
"position" of a particle cannot be dissociated from what happens in
the whole Universe - a form of "determinism" which does not begin to
make sense before you completely specify a cosmological model (is
the Universe infinite, spherical, or something else ?).
A reference on the topic : Hall, M. J. W., Incompleteness of trajectory-based
interpretations of quantum mechanics,
J. Phys. A Math. Gen. 37 (2004) 9549 and
quant-ph/0406054
How quantum mechanics resolves the nonsense of classical
deterministic randomness - a lesson of meaningfulness that Bohmian
mechanics tries to reject, but why ?
One
"scientific skeptic" from AFIS, having no clue about physics, claimed
to embody modern scientific rationality by proclaiming this very stupid argument:
"For all practical purposes, quantum effects can be neglected in
macroscopic systems, so the world is classical, thus deterministic".
Yes, but classical (macroscopic) chaotic systems have the butterfly
effect (sensitivity to very small differences in initial
conditions). You can describe the butterfly effect as a consequence of
a classical physics which "is deterministic"; however, giving good
approximate descriptions of general chaotic behaviors by
mathematical models is one thing, while the "reality" of
an underlying determinism in our universe is another.
The physical fact in this universe, as shown by classical physics
itself, is that the visible random results of such phenomena depend
on smaller and smaller details of previous states as you trace
the causes back to earlier and earlier states of the system.
Ultimately, these details come from microscopic fluctuations, which are
subject to quantum indeterminacy. Classical determinism is only an
intermediate medium of communication across scales,
which displays macroscopic random effects coming from microscopic
fluctuations of some earlier time, themselves coming in fact from quantum
randomness.
Whatever the theoretical possibility of deducing the
properties of statistical mechanics from deterministic assumptions,
the actual physical reality of our universe is that the macroscopic
randomness displayed by classical deterministic chaos, as well as
the particular thermal randomness with the probabilities described by
statistical mechanics (with its specific probability law, the
Boltzmann distribution), has a quantum origin, so that its
randomness (and its obedience to the Boltzmann distribution in the
thermal case) is exactly as pure and physically "absolute" as
quantum randomness is. Indeed, a careful examination of the
macroscopic consequences of quantum mechanics (with its emergent
process of entropy creation, which proceeds quite quickly) shows that
the actually exact (not just the practically knowable or computable)
probability law deduced from quantum theory for any possible
measurement quickly converges to the Boltzmann distribution at the
given temperature.
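For reference, the Boltzmann distribution mentioned here gives the probability for a system in thermal equilibrium at temperature T to be found in a microstate of energy E_i:

\[ p_i = \frac{e^{-E_i/k_B T}}{\sum_j e^{-E_j/k_B T}}, \]

where k_B is Boltzmann's constant.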
Indeed : how could you even dream of things going otherwise in a
continuous world ? How could you ever dream of conceiving a
mathematically well-defined causality law that describes an effect
as depending on an infinity of details (an infinite amount of
information) contained in the system at an earlier time ? How could
such a mathematical law ever be an effective causality law
at all, either in reality or as a meaningful mathematical prediction
tool ?
The answer of quantum mechanics, which avoids the nonsense of a
causality from an infinity of causes (such as the infinity of
digits of continuous variables), is this one : there is an absolute
limit to the amount of information physically present in a
given local system and able to causally (physically) influence its
future behavior. If you want to measure the details of the system
beyond this amount of information that it physically contains, all
you will get is newly created, purely random data that did not exist
in the initial state of the system.
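A standard textbook illustration of this limit (my addition, not part of the original text): a spin-1/2 system prepared along the x axis is completely specified by one direction on the Bloch sphere and can carry at most one classical bit (the Holevo bound); yet a measurement of the spin along z yields a fresh, perfectly random bit that was not stored anywhere in the preparation:

\[ |{+}x\rangle = \tfrac{1}{\sqrt{2}}\bigl(|{\uparrow}z\rangle + |{\downarrow}z\rangle\bigr), \qquad P({\uparrow}z) = P({\downarrow}z) = \tfrac{1}{2}. \]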
Still, while the amount of information in a quantum system is
limited (finite), it is not a discrete pack of information but a
continuous one (fitting the continuous symmetries of
geometry). This paradoxical combination is accomplished by the fact
that this continuity is only a continuity in the values of the
probabilities of results when the system is measured, not the
continuity of any "physically present quantity" that could be measured
with unlimited accuracy. Thus physical realism is rejected: the
continuity of the transition between the possibilities for 2 states to
be identical or distinct means that there is no physical fact as to
which state a system exactly is in. (As I once commented,
a third option would be to break the continuous symmetries and opt
for a digital universe, but this would leave us with no way to make sense
of the effective fundamental role of continuous symmetries in the
formulation of theoretical physics.)
Then, Bohmian mechanics reinterprets quantum randomness as an
unverifiable comeback of the previous concept of randomness, that
is, as a classical deterministic chaos. We thus end up with a
"deterministic" randomness that is both pure (with exact
probabilities and no means of prediction from any prior
measurement) and of purely speculative, unobservable origin.
D. Wallace commented on the trouble with this in section 6.6 of his article ; I
will comment on it in my own way.
The oddity of an intractable deterministic randomness : did you
ever see it anywhere else ?
Do you know of any other "random" process that once seemed so genuinely
random that a specific probability law was formulated for it, apparently
well confirmed by observations but without explanation,
until deterministic causes were found that made the exact behavior
effectively deterministic, i.e. predictable (or at least provided
much better prediction tools than the former probability law) ? I
cannot think of any.
Of course there are some obvious not-really-examples :
- Classical deterministic chaos generates probabilities of
outcome in the long term. But the observation and understanding
of short-term deterministic laws comes first ; long term
probabilistic laws only come as very sophisticated abstract
deductions, and appear much more complicated than their
deterministic explanations.
- The determination of the sex of embryos was eventually
understood; the sex of the baby can now be "predicted" by
tests on the embryo (which already has it) ; it can even be
chosen in vitro by selection technologies, but it still cannot be
predicted where it remains random; in any case there is a specific
"choosing process" whose analysis cannot provide better
forecasting tools.
- The weather for tomorrow has never been given a seriously
precise probability; improvements to weather forecasts have been
painstaking but straightforward, obtained by improving observations
and means of computation, while predictions anyway become less and less
reliable as they are applied to longer time intervals.
- Some refinements of formulas used by insurance companies and
traders
- The practice of insider trading
- Improvements in the prediction of earthquakes remain slow and
gradual.
- The spread of illnesses was not given probability
laws in the past, and is still not very predictable now.
- Pseudo-randomness. Pseudo-random generators, with the double
property of fitting a given simple probability law and being
classically determined (and, unlike the butterfly effect,
staying isolated from any quantum source of randomness), are
man-made, using algorithms designed for this explicit
purpose. No such randomness comes from nature. (Even the
decimals of pi, for example, while coming from the "nature" of
mathematics, require some design to be physically computed.)
A minimal sketch of such a generator is given just below.
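Here is that sketch (my own, using the classic "minimal standard" linear congruential constants): a generator that is completely deterministic and reproducible from its seed, yet whose output looks uniformly random to naive checks.

    # A minimal linear congruential generator: completely deterministic,
    # yet its output passes naive uniformity checks.
    M, A = 2**31 - 1, 48271          # classic "minimal standard" constants

    def lcg(seed, n):
        """Yield n pseudo-random values in [0, 1) from a deterministic recurrence."""
        x = seed
        for _ in range(n):
            x = (A * x) % M
            yield x / M

    sample = list(lcg(seed=12345, n=100000))
    print(round(sum(sample) / len(sample), 3))          # close to 0.5, like uniform noise
    print(list(lcg(12345, 5)) == list(lcg(12345, 5)))   # True: same seed, same output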
Anything more striking than this ?
Illustrating the problem by a tale
One day I saw an advertising panel for a new product, announced
in this way :
BomBox
The Perfect Random Generator
that Works in a
Completely Deterministic Manner
produced by BomBox, Inc.
Intrigued by the amazing claim that perfectly random data can be
produced deterministically, I went to a BomBox shop and asked the
seller to explain this new product.
He proudly explained to me why it was so valuable to have made
this technological breakthrough, guaranteeing that the numbers given by this random
generator were indeed produced in a deterministic manner: the reason was that, if the
process generating these numbers were not deterministic, then we could not be
exactly sure where the generated data came from, so the guarantee of its
perfectly unbiased random character would remain doubtful. The deterministic
character of the process ensured that no uncontrolled bias could occur, and
thus that the result indeed had the exact desired randomness property.
But I wanted to understand more about how such a seemingly
paradoxical combination of qualities could actually be implemented
in a device. He refused to answer, as this was an industrial
secret that could not be disclosed. In the face of my
skepticism, he offered me the following guarantee : I would pay
a regular monthly price to use it ; if ever I discovered any flaw
in the working of this box, showing that it did not have the advertised
qualities, he would soon come and fix it for free, or pay
me back everything I had paid from the start if he couldn't.
I accepted the deal and took the box with me. At home I started
to test it, using a program designed to test randomness and look
for patterns in series of digits. I discovered that the output
indeed behaved randomly when I was there, but started displaying
patterns after I had been absent for some time. I noticed that the box
had a little camera that took data from the surrounding room, to be
mixed with its internal computations so as to renew the
randomness of its behavior. Indeed, once this camera was
completely hidden, the defects in its randomness came back.
I went back to the shop and reported this flaw. The seller
offered to exchange the box, and as compensation he also
offered me a solar panel to put on my roof, to provide the
electricity needed for the box and for other uses if I
wished.
I accepted the deal, and the new box indeed appeared to offer a
cleaner and more systematic randomness than the previous one.
However I was still puzzled, and undertook to examine the new
system in more detail. It turned out that the solar panel was
hiding a little microphone recording the noise from the street
and sending it to the BomBox as a means to reshuffle the
internal data of its randomness computations.
Again I came back to the shop and complained that this was
still an imperfect method of randomness generation, because
the output of such a process could not be completely random:
the box did not really produce its own randomness but only
indirectly transmitted (after some reshuffling) the imperfect
randomness of the outside events that ultimately determined it, so
that the detailed features of these events might still be a
source of bias for the resulting so-called randomness.
Again he understood, and offered to once more replace the last
BomBox with a new one that would also provide Internet access.
It worked quite well once I had disabled the flow of data from
the solar panel, but I was still wondering how these random
numbers could really be generated, so I decided to check the
details of the Internet connections it was making. I discovered
that, while I was not supposed to be using the Internet
connection, the BomBox was still connecting to the
servers of the BomBox company, which answered in some visibly
random manner. I understood that, once again, the BomBox was not
really producing these random numbers on its own, but
only transmitting and reshuffling those provided by the Web
servers of the BomBox company.
I went again to ask for explanations. The seller
explained that this procedure did not jeopardize the quality of
the randomness, because the Web servers of the BomBox company
were themselves perfectly reliable in producing random numbers
in a deterministic manner, thanks to the full implementation of
the real secret technology of the BomBox company, which the
BomBoxes themselves did not contain, for fear that the full
secret technology might be leaked to competitors.
I replied that I still needed to check how the BomBox servers
were working, to verify both the deterministic and the unbiased
nature of their random behavior. I was invited into their server
room and could inspect the hardware of their servers. They
indeed all used standard hardware known to behave
deterministically. Still, it wasn't clear whether these servers were
producing this random output all on their own or were only
reshuffling random data from somewhere else; in the
latter case I needed to check that other source too. They
promised I could check all this the next week.
The next week, I went back and noticed that these servers
themselves received random data from another machine in another
room, which I inspected too, and which itself was receiving
random data from still another machine, which was also receiving
data from somewhere else. I still wasn't satisfied, and said I
wanted to continue my inspection further, without any limit. But
I had to wait one more week before continuing the inspections.
Meanwhile, they undertook to install a continuous assembly line
of electronic devices, each one connected by a wire to the next
one being produced. The last machine I had inspected the previous
time was connected to the first produced device coming off a
conveyor belt, which was connected to the next one behind, itself
connected to a wire further along the belt whose end I could
not see. Later, as they approached, the last wire appeared
connected to a next device, and, as they all approached, this
kept being repeated over and over again, with more and more
such devices, each connected to the next one coming off the
belt.
They told me : here you are ! You wanted the right to keep
inspecting the devices producing the random data without limit.
Now you see where the data comes from: it comes from a device,
itself receiving its data from the next device, and so on. We
are ready to let you inspect all these devices one after the
other without any limit, as they arrive here in this chain from
our production factory.
That sounded great, but I was still wondering : how can the last
produced device in this chain provide its own random numbers
before receiving any further random data from the next device
that will later be added to the chain ? I wanted to inspect the
inside of the factory, to check how they were doing it. But they
refused.
Instead, they gave up. They decided to cancel all transactions
and give me back my previous payments.
But not only that: they were visibly afraid that other people
might also inquire too much, eventually discover a loophole
in their technology, and come to complain in the same way.
So, as a last resort to prevent any such questions about the
reliability of their technology from popping up again in the
near future, they issued this bold job opening
Superman
Wanted
for Super Mission
Contact BomBox, Inc
Then Superman came for this job, and was assigned the following
task. He had to study the whole production chain of this series
of devices, each connected to the next one so as to generate a
flow of random numbers towards the first one. His task would
then be to operate this production chain at a divergently
increasing speed: he would produce the first copy of the device
in one hour, the next one in half an hour, the third one in 15
minutes and so on, so as to finally get an infinity of them
produced by the end of the second hour. Of course there
was not enough room to collect that infinity of produced
devices in the factory, but Superman would be able to complete
his mission in outer space.
He accepted this mission, and carried out this production inside
the factory until the last second of the second hour. At that
last second, he completed his mission while skyrocketing up to
the end of the Universe, leaving the seemingly infinite chain
of devices hanging from the sky behind him.
Millions of people were amazed by this spectacular operation.
Then, as an effect of this demonstration of how an endless
generation of perfectly random digits could indeed purely come
out of that (infinite) chain of devices all behaving in a
completely deterministic manner (without help from any other
input), the profits of the BomBox Company started skyrocketing
too. But I decided to forget about this random
generation service from then on, letting other people believe in
this method if they liked.
Ten years later, I happened to see in the news that the BomBox
Company had gone bankrupt. Intrigued by this news, I searched for
further information, to understand how such a pitiful end could
happen to this powerful company after the spectacular success it
previously had. Here is what happened.
After Superman completed his mission, rumors started
circulating about where he might have finally ended up at the very
last moment, when reaching the end of the Universe. Some people
simply assumed that the Universe
was actually infinite, so that he could really complete
the production of this chain of devices to an infinite length,
while other people speculated that he instead
finally reached a border of the Universe very far away and met
God there.
Based on the latter idea, a new cult emerged, claiming that, by
performing some sorts of ceremonies, God might respond by
carefully designing the data He would send to the very last link
of this chain of devices, the one which was under His control,
and this way finally influencing the supposedly random data of
some BomBoxes around the world so as to serve some specified
purposes.
Some time later, this cult claimed to have achieved some success
in its activities, pointing out, for example, that some of
its members had won huge amounts in lottery games. Though it
was not exactly clear, after detailed examination, whether all
these winners had already belonged to the cult before their gain,
or whether some of them had only enrolled afterwards to get some
fame (and also to benefit from the cult leader's offer to let them
meet many female members of the cult) by pretending to have
been members before, the rumor could not be stopped : a wind of
panic started blowing among BomBox users who, afraid that, after
all, the data generated by their BomBox might not be as reliably
deterministic and unbiased as they had been told, decided to
give it back and stop using it.
Problem 3 : Is it really worth saving Physical Reality at the
expense of real physics ?
In short : the Bohmian picture does not fit with the proper,
simple understanding of QM (especially the spin) and its link to
classical physics.
Reference :
Guest
post on Bohmian Mechanics, by Reinhard F. Werner (from which I copied
the above title), with a long discussion.
Are Bohmian trajectories real? On
the dynamical mismatch between de Broglie-Bohm and classical dynamics in
semiclassical systems, Matzkin, A. and Nurock, V. (2007)
Bohmian
mechanics, a ludicrous caricature of Nature by Luboš Motl
Metaphysically,
a similarity with Last
Thursdayism can be found. More precisely, the idea that
the Universe can be suddenly destroyed and re-created... with a
difference, and with no way to know that it differs from what
the given memories suggest. Here is how : according to Bohmian
mechanics, when 2 wavelets in a wavefunction meet, the
"particle" can switch from one wavelet to the other. Technically
this may not be a likely risk of re-writing memories, since one can
argue that memories are more likely stored as positions of
particles rather than momenta, but... what if we imagine
artificial intelligence systems with bits of memory
temporarily stored as momenta ?
This switch does not leave any trace because
- Since the position is a hidden variable, assuming the
wavefunction to be known, the precise position on the side of
the second wavelet after the event, coming from the first
wavelet, is not a physically relevant or easily measurable
piece of data;
- The coexistence of both wavelets is not a physically
relevant piece of data either: looking at things from the many-worlds
perspective, the different worlds do not interact, so that
when the particle is in one wavelet, the presence (in the full
wavefunction) of the other wavelet, to which the position can
switch (in the past or in the future), is not relevant to the
understanding of the physical behavior.
After a little bit of analysis, if I deduce correctly, the situation turns out to be much
worse than it might have seemed at first glance. Consider a very ordinary kind of system:
a gas in thermal equilibrium in a box. Whatever the exact initial state, taking the
ordinary phenomenon of stabilization of gases towards thermal equilibrium and
looking at its interpretation by Bohmian mechanics, it turns out that the wavefunction keeps
continuously meeting parts of itself with different momenta at the same positions, so that the stop-and-bounce
of the hidden variable from one wavelet to another keeps happening all the time. The Bohmian position of
each gas molecule thus follows a Brownian motion, like the molecules of a gas, but with much shorter elementary moves than the
mean free path of the classical description of gases, and therefore almost stands still. If I deduce correctly (to be confirmed), the effective
mean free path of the Bohmian motion even approaches zero, making the positions of molecules
stand still, as the number of gas molecules in the box increases.
Related remarks :
- Quantum mechanics offers a jump in computational complexity
compared to a classical kind of law for large-scale systems
described by a grid of data playing the role of a state with some given
resolution (such as one bit per atom); in other words, a quantum computer
can be "much more powerful" than a classical computer for given sizes of memory
and numbers of operations per second. However, a hidden-variables
interpretation of a quantum computer requires a still higher
jump in the computational power of the hidden process of evolution of the
hidden variable underlying this quantum computation.
- Bohmian mechanics violates Occam's razor, by adding extra
stuff that is not needed when working in QM and cannot be
tested. This may also be seen as part of the next problem:
Problem 4 : Under-determinations of the theory
In short : there is not one Bohmian mechanics, but several possible
versions, and no way to choose between them, except as a matter of
taste.
See in D. Wallace's
report "The Quantum Measurement Problem: State of Play",
section 6.5 (p. 60)
The last two problems are also addressed in this preprint : The Bohmian
interpretation of quantum mechanics : a pitfall for realism,
Matzkin, A. and Nurock, V. (2004)
Problem 5 : a many-worlds
interpretation in disguise - a terrible definition of
"existence"
In the Artificial Intelligence theory of consciousness, the
condition for an intelligent being to "really exist" as a
conscious being is the condition of "being effectively computed
by physical processes". In Bohmian Mechanics, the introduction of
a pointer to select which world is supposed to "exist", while the
other worlds remain "non-existing", does not change the fact that
the continuing evolution of the wavefunction without collapse
still constitutes a real physical computation of the alternative
worlds with all the brains they may contain, and therefore a way of
still giving "real existence" to the minds they contain.
We can further expand this argument as follows :
How could a melody exist, not just as a succession of sounds but
indeed as a melody, without somebody to hear it ?
The famous "hard problem of consciousness" is of the same kind :
how can a thought exist, not just as a computation but as actually
feeling something, in the absence of a non-physical soul inside
the brain to actually feel what the brain is computing ?
By itself, the physical presence of a brain making some
computations is nothing more than a mathematical pointer
"physically given" to that specific computation, giving it the
quality of being "physically computed" as opposed to other
possible computations. But the other possible computations, which do
not receive this physical pointer, still mathematically exist,
don't they ? How does the event of "physically" putting a pointer
on a specific mathematical computation give this computation a
quality of "existing" any more than any other possible computation ?
You can arbitrarily decide to take this pointer as a definition of
"conscious existence" for the whole computation (for this
computation by a physically existing brain to constitute a mind),
assuming it makes sense to speak of the "physical existence of
a global computation", that is, the event of a long series of
elementary computations happening "together", while every
elementary step of this computation is repeated lots of times here
or there (but "not together") in the physical world.
But if this pointer is the definition of what "existence" means,
then how can this name of "existence" still also mean what it was
supposed to mean ?
In the same way, Bohmian mechanics introduces the assumption of the
presence of the arbitrary data of a mathematical pointer, namely
the hidden variables, to point to a specific world inside the
many-worlds landscape. And then it claims : this pointer is the
definition of "physical existence" for a specific world inside the
many-worlds landscape. But if all we have is an arbitrary
mathematical pointer to a specific mathematical structure (a world)
inside a mathematical landscape of possible such structures (the
many-worlds), then how can this pointer constitute the definition
of "existence" for this specific world ? Especially since the physical law
governing this specific world (the formula of evolution of the
hidden variables) is expressed as depending on the wavefunction,
which is the many-worlds landscape, and thus requires this
many-worlds landscape to already exist.
Reference : Solving
the measurement problem: de Broglie-Bohm loses out to Everett
Problem 6 : Hidden Variables just cannot be a philosophically best interpretation
This is a new argument, logical consequence of Problem 3, which I wrote
in 2025 in another text
(simply explaining and defending the mind makes collapse interpretation),
before copying it here (follow the link to see it in context):
Whether decoherence improves the situation (the oddity of Bohmian trajectories)
remains unclear, as an
analysis focused on a particular case leaves the general
question open. Indeed, by the fuzziness of decoherence, the claim at any
given time that the hidden variable has selected one (measurement) outcome remains
ill-defined. Thus, no clear philosophical principle ensures this
selection to be preserved in time. Whether it is preserved is a
hard technical question dependent on the choice of hidden variables
theory. But it is superseded by the following philosophical
argument.
Imagine two hidden variables theories A and B were found matching
our best physics, where outcomes cannot switch anymore after
conscious observation according to A, but can still switch according
to B. How should a proponent of the hidden variables interpretation
react to this ?
- If this is an advantage of A over B, this means the viewpoint
of conscious observers should matter to the resolution of the
measurement problem. This contradicts the claimed preference for
hidden variables, and generally any physicalist interpretation,
over a mind makes collapse interpretation. Indeed, if there is a
philosophical need for outcomes to remain fixed after conscious
observation, then this feature would more naturally come as an
explicit fundamental principle than as an empirically empty,
sophisticated coincidence.
- But if this difference does not matter, then, for correctly
assessing the hidden variables interpretation against
many-worlds, one should directly assume that, in the style of
Last Thursdayism, selections keep switching many times after
conscious observation. In these terms, many-worlds
seems to fare better.
Either way, the hidden variables family of interpretations appears
philosophically unmotivated.
Synthesis of all the above : the Universe looks like
a conspiracy
By which metaphysical accident could
the laws of Nature happen to take, as Bohmian mechanics tells us,
the exact shape of this sort of
incredible conspiracy, endorsing in so exact a manner some
very specific effective properties that "miraculously" and absolutely
hide some radically different kinds of underlying ontological causes ?
Let us list these extraordinary discrepancies between effective and underlying
behaviors, in the same order as they were
listed above as "problems":
- The world everywhere perfectly obeys relativistic invariance
in all its many effective processes : no possible experiment can
ever detect any fault in this invariance (measuring "which is
the right frame") ; this invariance turns out to be one of the
main pillars of the understanding of all physics, from
gravitation to particle physics -- while no such invariance
exists in the deep causes of all this.
- Despite the supposed fundamental necessity for the Universe to be
deterministic, there appears precisely one clearly best
predictive theory (ordinary Quantum Mechanics), expressed in the
form of a very elegant and convenient mathematical structure of
probability law (more elegant than that of the underlying causes), which
cannot be circumvented by any means (the source of randomness
is absolutely hidden from any possible investigation).
- The effectively relevant concepts to conveniently understand
practical phenomena (the diverse observables of quantum physics
and how concepts of classical physics come as their
approximations) look very different from the shape of their real
causes ("trajectories"....).
- Specifications of the exact structure of the deep causality
laws cannot be investigated by experiments.
- The only thing that looks real for all practical purposes (the
wavefunction, which has to be part of the equations governing the
evolution of the "particle positions" but is not affected by them in
return) is in fact a "totally unreal" thing (all the other
worlds of the many-worlds picture it contains are "totally
unreal"), while only the particle positions (which behave as
mere hidden variables, totally unreal for all practical
purposes) have the metaphysical quality of "real existence".
Would the Universe be schizophrenic or what ?
In several ways, Bohmian mechanics and its motivations conflict with a fundamental
principle of modern science in general, and of modern physics especially :
logical positivism. Problem : if it can be acceptable for
a scientist to dismiss idealism as antithetical to
quantum mechanics, in the name of the idea that idealism would be antithetical to the principles
of logical positivism which are the very foundation of the proper
understanding of quantum mechanics (and science in general), while, in fact,
idealism fits perfectly with logical positivism, then what should one think of those scientists
who insist on praising Bohmian mechanics for what they see as crucial or perhaps required
qualities of scientific meaningfulness, when these supposed "qualities" actually mean or imply
the very antithesis of logical positivism ?
Links to other arguments and articles about Bohm's
interpretation
Sites supporting Bohmian mechanics:
Bohmian Mechanics by Luke Bovard, mentioning how cumbersome Bohmian quantum field theory is
A blog article with discussion : Quick
Impressions of Bohmian Mechanics
Why isn't every physicist a Bohmian? reviewing some common objections
In Physics Stackexchange:
Why
do people still talk about Bohmian mechanics/hidden variables
What is wrong with the De
Broglie–Bohm theory a.k.a “Causal Interpretation” of quantum theory?
Quora: Why don't more physicists subscribe to pilot wave theory?
A former proponent's view : "Note that I spent myself a lot of time with the Bohmian interpretation before I rejected it as superficial, essentially for the reasons given by Werner. It didn't add any understanding but wasted a lot of my time"
On some early
objections to Bohm’s theory
Bohmian
mechanics listed among "lost causes in theoretical physics" by R.
F. Streater
How is quantum wave collapse a more reasonable concept than pilot waves?
An
article in Wired followed by a discussion
Bohm's Ontological
Interpretation and Its Relations to Three Formulations of Quantum
Mechanics
A
discussion focusing on Bohmian mechanics (and secondarily, on
many-worlds)
Paul
Dirac's forgotten quantum wisdom. In this article, Luboš Motl
speaks about "deluded pseudoscientists – from David Bohm to dozens
of nameless crackpots".
It's
been a tough week for hidden variable theories
Farewell to determinism
Reddit discussion
Other hidden variables interpretations
Stochastic
interpretation
Solipsistic hidden
variables interpretation
Related pages
Introduction to
quantum physics, which provides a quick and clear initiation,
useful for those not yet familiar with it
Main page of
arguments on quantum physics interpretations
The Many-Worlds
interpretation
Mind Makes
Collapse interpretation of quantum physics