Subject: Re: more questions
On Sun, 27 Oct 2002, Kathryn Blackmond Laskey wrote:
> Henry,
>
> >I have read ... statements ... that measurement involves interaction
> >of a quantum system with its environment, and is (it is asserted)
> >therefore "nothing but" Schrodinger evolution on a larger system.
> >
> >This is wishful thinking not backed up by adequate supporting math.
> >
> >...It is indeed widely advertised that the interaction with the
> >environment solves the measurement problem... I believe that the
> >details have not been worked out
> >satisfactorily, and that the gaps are significant, and are
> >sufficient to undercut any strong claim that the Schroedinger equation
> >alone (and this includes the environmental decoherence) is
> >actually sufficient, by itself, to tie the evolving state vector
> >to well-defined probabilities for human experiences, which is
> >what the orthodox (Copenhagen and von Neumann) formulations
> >achieve by explicitly introducing a second process tied to experiential
> >realities.
> >
> >...if the universe has been evolving
> >since the big bang solely under the influence of the Schroedinger
> >equation then every object and every human brain would now be... represented
> >by a smeared out cloud, by an amorphous continuum. But in order to
> >extract from a quantum state a set of probabilities pertaining to
> >human experiences, and hence to give well defined empirical
> >meaning to the quantum state, one must specify a basis... there is
> >no detailed suggestion as to how
> >a set of particular orthogonal projection operators P are to be
> >specified on the basis of the amorphous state of the brain and
> >the continuous action of the Schroedinger evolution.
>
> The Copenhagen interpretation requires specification of a basis too,
> right? There isn't any theory for what projection operator gets
> applied and when, is there?
Yes, there is for the Copenhagen interpretation. See pages 52-53 of my MM&QT.
The point is that this crucial mind-theory correspondence is determined
essentially by trial and error, which, as I emphasize in my new
book, is what we do in real life, from infancy on,
in learning the correspondence between the feel of the effort and the
resulting feedback. Von Neumann theory is in this respect a direct
generalization of Copenhagen. As for the timing associated with the
quantum Zeno Effect experiments, the experimental conditions combined with
theory do give a connection between how the experimenter sets up the
experiment (how he acts) and when and how frequently the probing action
occurs.
>
> >...a (proper) subspace is a set
> >of zero measure... how can a continuous amorphous structure distinguish
> >one direction from those lying arbitrarily close to it? ... Moreover,
> >the projection operator P cannot be local (confined to a point,
> >as contrasted with a smeared-out region).
>
> In other words, any Wigner type theory needs extra assumptions
> specifying what projection operators govern the experience of
> observers in the many worlds. But so does your theory, right? Is
> your claim that you state upfront that the theory requires
> specification of measurement operators and time of operation, whereas
> they sweep this requirement under the rug?
>
No! My claim is much stronger than that! And this is a main point.
I claim that many-worlds is constitutionally UNABLE TO DO THE JOB, but
that von Neumann QT is constructed so that it can. Many worlds theories
CANNOT work, but vN QT CAN.
I have written a new chapter in my book, new chapter nine, that explains
this in detail. I attach it at the end of this reply.
> >...Any claim to have resolved
> >this glaring problem of principle without bringing in another
> >process (besides the Schroedinger equation) needs to be
> >spelled out in detail. But this has not been done. Rather,
> >the environmental decoherence effect has been pointed to as
> >some sort of panacea. But that effect does not resolve the
> >problem at issue, but rather heightens it, by making it
> >effectively impossible, or nearly impossible, to use
> >empirical data to shed any light on the matter. The
> >environmental decoherence effect quickly reduces the density
> >matrix of macroscopic systems to NEAR diagonal form, but the
> >slightly off-diagonal elements in coordinate space hold
> >the continuum of diagonal states together in a continuous interlocked
> >structure. This structure does not break up purely dynamically into
> >a set of discrete regions. Any rule that breaks up this linked amorphous
> >structure in coordinate space into a set of discrete parts associated
> >with distinguishable experiences would seem very difficult if
> >not impossible to achieve solely by the dynamical process specified
> >by the Schroedinger equation alone.
>
> In quantum mechanics texts they discuss both discrete and continuous
> observables. The position operator, for example, has a continuous
> >spectrum. Textbooks talk as if observing position can result in any
> of a continuum of values. Yet in von Neumann's book on mathematical
> foundations of quantum theory, he assumes observables have discretely
> many values -- finitely many, if I remember correctly. Can there be
> continuous-valued observables? Would that help the "it's all
> Schrodinger evolution" camp?
>
Yes, it might help a lot. There may be mathematical work in this direction
that could be brought to bear. I am hewing to von Neumann's treatment of
the measurement problem.
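
For reference, the spectral theorem does handle continuous spectra, by
attaching projection operators not to single values but to ranges \Delta
of values:

  \mathrm{Prob}(\lambda \in \Delta) = \langle\psi| E(\Delta) |\psi\rangle ,

but note that one still obtains a discrete set of probabilities only after
partitioning the spectrum into countably many intervals \Delta_1, \Delta_2,
..., so the discreteness issue reappears at the level of the partition.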
> >In any case this extraction of discrete subspaces from the
> >amorphous evolving quantum state needs to be described by those who
> >claim that the Schroedinger equation alone is enough.
>
> But in fairness, don't the vN/W/S advocates have to specify where
> their projection operators come from? The wavefunctions in quantum
> field theory arise from a few fundamental symmetries. But where do
> the projection operators come from?
>
The point is that von Neumann brings in Process I, which is explicitly
designed to solve this problem in an economical and highly constrained
way. There is a big difference between bringing in a process specifically
designed to solve the problem, and claiming that it can be done without
bringing in any other process. In one case the experiential aspect of
nature is given a job, and in the other the experiential aspect of nature
is again dynamically ignorable, or impotent.
> >...I once asked Bohm how he answered Einstein's charge that his model
> >was "too cheap". He said that he agreed!
>
> As I understand what I've read of Bohm's writings, he never actually
> believed his model was right in any absolute sense. He put it
> forward to counter the claim that a hidden variable theory consistent
> with the empirical data could not be constructed. It was an existence
> proof that a class of theories was possible, not the theory he
> thought was a correct description of Nature. He argued passionately
> for a respectful dialogue among differing viewpoints, in contrast to
> the often acrimonious attempts to get one or another viewpoint
> accepted as "correct" purely on the force of the personality of the
> person advocating it. I always liked that about Bohm. :-)
>
I have always strongly recommended that physicists should know about
Bohm's theory: it is an important philosophical tool.
But in the present context we need an ontologically viable
candidate, and in this context Bohm's model fails.
> >...quantum theory explains well how information is
> >transferred to measuring devices. But those clean descriptions
> >are the BASIS of the measurement problem, not the solution.
> >They do not explain how some object whose location is represented by a
> >(center-of-mass) wave function that is spread out over meters is
> >experienced as being located at nearly a point, and with some well
> >defined associated probability. If the device location is smeared out
> >then for each of a large continuum of device locations the device will
> >record the object as being "here." [Suppose the device has in addition to
> >its detecting ability also the capacity to determine and record its
> >location, and to correlate that information to the "detection" event.]
> >
> >It's the same problem as before. Orthodox theory allows the detector to be
> >placed in some particular location corresponding to some particular
> >experience because the observer places it there, but an observer governed
> >solely by the Schroedinger equation has nothing definite about him: the
> >entire situation is a continuous smear with no dynamically defined
> >dividing lines.
>
> But wouldn't each well-defined "here" correspond to a well-localized
> set of possible measurement outcomes and associated probabilities?
Please see the appended chapter 9.
>
> >Some additional principle connected to the mind-brain
> >connection is needed.
> >...The whole von Neumann approach involves assuming that the entire
> >physical universe, without Process I interventions, evolves in
> >accordance with Process II. That is the starting point. This will
> >certainly generate continuous evolution of "measurement" type
> >processes in subsystems. That is all automatic, and essentially trivial.
>
> Why trivial?
One just solves the equations, which is often very simple, but in any case
straightforward in principle. Many cases have been worked out,
and they always yield the results that follow easily from the basic ideas.
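To illustrate (a textbook schematic, not tied to any particular system):
Process II merely correlates the pointer states |D_i> of a device with the
system states |s_i>,

  \Big( \sum_i c_i |s_i\rangle \Big) \otimes |D_0\rangle
    \;\longrightarrow\; \sum_i c_i |s_i\rangle \otimes |D_i\rangle ,

and that correlation is what I mean by automatic: it follows from linearity
alone, and it leaves the superposition fully intact.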
>
> >The problem, if no Process I interventions are included,
> >is to tie the amorphous structure generated by Process II to human
> >experience.
>
> Why aren't measurement-type processes in subsystems enough? Why isn't
> the sensory subsystem a perfectly good subsystem, and why isn't
> Schrodinger evolution of measurement processes in sensory subsystems
> good enough?
>
Please see the appended Chapter 9.
> >...As regards tests, it is certainly the case that living
> >systems that exploit the Quantum Zeno Effect in the way I
> >am suggesting would tend to be more stable and directed than
> >a system evolving under the action of the Process II (Schr. Eq.)
> >alone. So there are, in principle, important empirical differences.
>
> You have challenged the many-worlders to work out the details of
> their claim that environmental decoherence solves the measurement
> problem. Have you worked out in detail the mathematics that backs up
> your claim that QZE exploiters are "certainly" more stable and
> directed?
>
I have worked out the detailed mathematics that shows how a
suitable exploitation of the freedom of choice associated with
Process I CAN make the system more stable and directed. Since the laws
of nature ALLOW this, it is plausible, in view of the way that we see
biological systems exploit available possibilities that will enhance their
survival chances, that this advantage will be used by biological agents.
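
As a minimal numerical sketch of that stabilizing effect (the two-level
model and the rate parameter here are my illustrative assumptions, not a
model of any actual brain process):

  # Quantum Zeno Effect sketch: a two-level system drifts away from its
  # "on-task" state at rate omega; each probing projects it back.
  import math

  omega = 1.0   # drift rate away from the on-task state (rad per unit time)
  T = 1.0       # total time interval

  for n in [1, 10, 100, 1000]:               # number of probings during T
      p_one = math.cos(omega * T / n) ** 2   # survive a single probe
      p_survive = p_one ** n                 # survive all n probes
      print(f"n = {n:5d}   P(still on task) = {p_survive:.4f}")

As the probing rate grows, the survival probability approaches one: rapid
probing holds the state in place.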
> >Actually, a properly constructed many world theory COULD be
> >essentially equivalent to the von Neumann theory. The point is that
> >a properly constructed many-worlds (actually one-world, many-minds)
> >theory needs, I believe, to specify orthogonal projection operators
> >corresponding to different distinguishable experiences, for only in this
> >way can the theory be adequately connected to experience.
>
> In the quantum computing literature, the "computational basis" is
> used. If we have a quantum computer of n qubits, then the
> computational basis consists of the 2^n vectors { |x1> x |x2> x ... x |xn> },
> where each xi can be either zero or 1. That is, the eigenvalues of a
> complete observation are n-element bit vectors.
>
> A quantum algorithm as described in that literature applies a
> sequence of unitary transformations to an input assumed to be a bit
> vector, and then takes a measurement to observe the output. In other
> words, a measurement is assumed to have been applied at the
> beginning, so that "we know" the input, and a measurement is applied
> at the end, so that "we know" the output. Sometimes internal
> measurements are applied to insert random steps in the algorithm;
> otherwise the algorithm proceeds purely by unitary evolution. But
> note that measurement enters in an essential way, in specifying the
> inputs and obtaining the outputs.
>
Yes, the problems described in Chapter 9 do not seem to be helped by this
specialization of the set of allowed basis states: the discreteness
problem that I describe is still present.
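
For concreteness, here is a bare-bones sketch of the pattern Kathryn
describes (a one-qubit numpy toy; the choice of gate is mine and purely
illustrative): measurement fixes the input, a unitary step evolves it, and
measurement extracts the output.

  import numpy as np

  rng = np.random.default_rng()
  H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate (unitary)

  psi = np.array([1.0, 0.0])   # input fixed by an initial measurement: |0>
  psi = H @ psi                # purely unitary (Process II) step

  probs = np.abs(psi) ** 2     # Born-rule probabilities, computational basis
  outcome = rng.choice(2, p=probs)   # the final measurement enters here
  print(f"P(0) = {probs[0]:.2f}, P(1) = {probs[1]:.2f}, observed |{outcome}>")

Note where the non-unitary steps sit: at the preparation and at the readout,
exactly as Kathryn says.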
> Most quantum computing people, I am told, are many-worlders.
I wonder how many, if pressed, would really say that the human
beings who design the systems and implement the designs are
like the systems they design, and have thought about it and have
answers to the basic discreteness problem that I raise in Chapter 9.
Human beings can make discrete choices about where to place their
measuring devices, but the quantum state of the human being, in the
amorphous state of the universe evolved according to Process II
from the big bang, cannot.
> But
> there is a fundamental assumption underlying the entire field of
> quantum computing that the designer of the algorithm specifies when
> measurements are taken. The engineering challenge is to make this a
> reality by creating controllable quantum systems of more than a few
> qubits entangled over macroscopic ranges. It seems, though, that this
> essential dependence on a strikingly Copenhagen-like measurement
> process is rarely stated explicitly as an assumption.
RIGHT!
> It is just
> assumed and applied without comment. I guess the implicit idea in
> the minds of most quantum computing folks is that the other copies of
> the computer in other worlds will be doing the computation on
> different inputs and obtaining the corresponding outputs, which the
> copies of the user in other worlds will be observing?
>
These "other copies" are, in the many-worlds scenario, all
continuously smeared out. There is nothing around like the
discretely-position device that in the Copenhagen QT is
part of the discretely-described observer with particular
thoughts. Every "outside" physical observing system is itself
smeared out. The "relative state" idea becomes mathematically
obscure when every state is ontologically on a par with a full
neighborhood of neighboring states.
> So could we construct an actual example of a quantum computer
> algorithm that would demonstrate your hypothesis that a system that
> adjusted its reduction policy in response to the outcome of previous
> reductions was more "stable and directed" than one for which the
> sequence of reductions was independent of the previous history?
>
I have no doubt that given sufficient time and energy one could
show both theoretically and experimentally that if one gave a system
of quantum computers sufficient capacity to design their own connections,
and sorted the designs out by trial and error and survival of the fittest,
then the system would eventually become able to exploit the Quantum Zeno
Effect and keep its motor controls focussed on tasks evaluated as
"Good," by high-level evaluators, by means of rapid probings of the
motor control sub-system by a monitoring sub-system.
> It's a trivial matter, of course, to demonstrate that if you allow
> the timing of measurements to depend on the outcome of previous
> measurements, then you get different observable behavior. Just
> consider a simple one-qubit algorithm that repeatedly applies a
> rotation around the real axis. We can push the probability of
> observing |1> arbitrarily close to either 1 or zero, or anywhere in
> between, just by knowing the rate of rotation and the current state
> (equivalently, state at last observation and time since last
> observation), and observing the system at times when the probability
> of observing a |1> is what we want it to be. Although this is
> trivial to demonstrate mathematically, I have never seen that point
> made explicitly in the quantum computing literature.
>
> Might this simple mathematical fact be the basis for a scientific
> definition of "sentient" or "adaptive" systems? A system is adaptive
> (or sentient) if the time of next reduction depends on its current
> state?
>
I do not think one can truly "define" sentience in terms of behavior.
But probably no system can be sentient without this control of
timings of probings by prior states. So this property might be taken
to be a necessary condition of sentience for legal purposes.
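
Kathryn's one-qubit example can be made concrete in a few lines (a sketch
under her stated assumptions: a known rotation rate and measurement in the
|0>, |1> basis; the particular numbers are mine):

  # With a known rotation rate, choosing WHEN to measure sets the
  # outcome probability to any desired value.
  import math

  omega = 2.0        # known rotation rate (radians per unit time)
  p_target = 0.9     # desired probability of observing |1>

  # Starting in |0>, rotation about the x-axis gives P(|1>, t) = sin^2(omega*t/2).
  # Invert that to find the measurement time:
  t = 2.0 * math.asin(math.sqrt(p_target)) / omega
  print(f"measure at t = {t:.3f}: P(|1>) = {math.sin(omega * t / 2) ** 2:.2f}")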
> >Suppose the
> >aspects of the brain associated with these P's are being periodically
> >monitored/observed by another part of the brain (e.g., the prefrontal
> >cortex). This monitoring would act like a measurement, and be represented
> >by a von Neumann Process I within the part of the space associated with
> >the original set of P's. And perhaps the monitoring action could itself
> >be associated with other experiences, for example, with a feeling of effort
> >or of high-level control of attention.
> >
> >The point here is that I have continually stressed that Process I
> >is not controlled by any KNOWN law. But further developments MIGHT
> >tie it into Process II, plus some identifications of the connection
> >of the P's with experiences. But in the absence of any specific
> >suggestion as to how this would work it seems to me that von Neumann's
> >formulation is the ONLY CURRENTLY AVAILABLE ADEQUATE THEORY.
>
> But if there is no theory for where the P's come from, and no theory
> for how often they are applied except "more effort gives rise to more
> frequent applications," how is that any more a theory than
> "environmental decoherence will fix the problem?"
>
Process I CAN fix the discreteness problem, even if we do not know the
details of how it is activated. But environmental decoherence CANNOT solve
the discreteness problem. It works in the wrong way. It works on the
off-diagonal elements whereas the discreteness problem concerns primarily
the diagonal elements of the density matrix.
> Kathy
>
9. OTHER INTERPRETATIONS.
Some physicists are dissatisfied with von Neumann's formulation of
quantum theory, and have put forth alternative proposals. The origin
of their dissatisfaction is the entry of the minds of human beings---our
streams of conscious thoughts---into physics. I consider this step to
be a decisive advance in our understanding of nature, but some
critics think that science ought to return to the nineteenth century
ideal, which excluded our thoughts from the dynamical workings of
the physical universe.
In this connection Kathryn Blackmond Laskey of George Mason
University wrote:
"I would appreciate your answering a question I have.
There is much disagreement in the literature about the reduction
process and how it works, including controversy over whether there is
any such thing as reduction. I have read numerous statements from
physicists that measurement involves interaction of a quantum
system with its environment, and is (it is asserted) therefore "nothing
but" Schrodinger evolution on a larger system."
It has, indeed, sometimes been claimed that the interaction with the
environment solves the measurement problem. However, the
principal protagonists of this notion (e.g., W. Zurek, H. D. Zeh, & E.
Joos) do not, I believe, claim that all of the essentials of that proposal
have really been worked out. I have argued ["The basis problem in
many-worlds theories," Can. J. Phys. 80, 1043-1052 (2002)] that important
aspects have not been worked out, and that the gaps are sufficiently
serious to block, at this point, the claim that the Schroedinger
equation alone (and this includes the environmental decoherence) is
actually sufficient, by itself, to tie the theory to well-defined predictions
pertaining to human experiences. Such predictions are required for
the theory to be scientifically meaningful, and they are obtained in the
von Neumann formulation only by introducing (Process I) dynamical
interventions that are explicitly tied to our thoughts.
The reason, in brief, why an extra process is needed is this: If the
universe has been evolving since the big bang solely under the
influence of the Schroedinger equation then every object and every
human brain would be by now, due to the uncertainty conditions on
the original positions and velocities, represented in quantum theory
by an amorphous continuum: the center-point of each object would not
lie at a particular point, or even be confined to a small region, but
would be continuously spread out over a huge region; and, likewise,
the state of the brain of every observer of this object would be a
smeared out conglomeration of many different classically conceivable
components, one for each of the allowed center-points in this big
region. That is, if a human person were observing an object whose
center-point, as specified by its quantum state, were spread out over
a region several meters in diameter, then the state of the brain of that
person would have, for each of these different locations, a part
corresponding to the observer's seeing the object in that location. If
each of these parts of the brain were accompanied by the
corresponding experience, then there would exist not just one
experience corresponding to seeing the object in just one place, but a
continuous aggregation of experiences, with one experience for each
of the possible locations in the large region. Thus this theory is often
called, quite rightly, a "many-minds" interpretation.
In order to extract from quantum theory a set of predictions pertaining
to human experiences, and hence to give empirical meaning to the
theory, this smeared out collection of different brain structures must
be resolved in a very special way into a collection of discrete parts,
each corresponding to one possible experience. This discreteness
condition is a technical point, but it constitutes the essential core of
the measurement problem. Hence I must explain it!
Evolution according to the Schroedinger equation (Process II)
generates in general, as I have just explained, a state of the brain of
an observer that is a smeared out continuum of component parts,
each corresponding to a different possible experience. One cannot
assign a nonzero probability to each one of such a continuum of
possibilities, because the total probability would then be infinity,
instead of one (unity). However, the mathematical rules of quantum
theory have a well-defined way to deal with this situation: they
demand that the space of possibilities be divided in a certain very
restrictive way into a countable set of alternative possibilities, where a
countable set is a set that can be numbered (i.e., placed in one-to-one
correspondence with the integers 1, 2, 3, ...). The need
to specify a particular countable set of parts is the essential problem
in the construction of a satisfactory quantum theory. But then the
technical problem for the dissenters is this: How does one specify a
satisfactory particular countable set of discrete possibilities from
Process II alone, when Process II is a continuous local process that
generates a structure that continuously connects components that
correspond to very different experiences, and hence must belong to
different members of the countable set?
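
Stated symbolically, the quantum rules demand a countable resolution of
the identity,

  \sum_k P_k = I , \qquad P_j P_k = \delta_{jk} P_k ,

with the probability of alternative k given by p_k = \langle\Psi| P_k
|\Psi\rangle, so that \sum_k p_k = 1. The technical question is where such
a family {P_k} is supposed to come from if Process II is all there is.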
In the Copenhagen formulation of quantum theory this selection of a
preferred set of discrete states is achieved by a choice on the part of
the experimenter. The measuring device, set in a particular place by
the experimenter, selects some particular part of the state of the
observed system that corresponds to some particular kind of
experience. In this simple case the countable set has just two
elements, one specified by the projection operator P, the other
specified by the projection operator (I-P). In this way the basic
problem of specifying a countable set of discrete parts is solved by
bringing into the theory a choice on the part of the experimenter. Von
Neumann solves this discreteness problem in the same way.
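
In von Neumann's notation, Process I acts on the state (density matrix
\rho) in this two-element case as

  \rho \;\longrightarrow\; P \rho P + (I-P) \rho (I-P) ,

with the two alternatives occurring with probabilities Tr(P\rho) and
Tr((I-P)\rho).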
Einstein posed essentially the same problem in a clear way. Suppose
a pen that draws a line on a moving scroll is caused to draw a blip
when a radioactive decay is detected by some detector. If the only
process in nature is Process II, then the state of the scroll will be a
blurred out state in which the blip occurs in a continuum of alternative
possible locations. Correspondingly, the brain of a person who is
observing the scroll will be in a smeared out state containing a
continuously connected collection of components, with one
component corresponding to each of the possible locations of the blip
on the scroll. But how does this smeared out continuously connected
state of the brain get divided by Process II alone into distinct
components corresponding to different experiences?
A key feature of the orthodox approach is the "empirical fact" that
experimenters can have definite thoughts, and that they can therefore
place the devices in definite locations. Thus it is the discreteness of
the choice made by the experimenter that resolves the discreteness
problem. But an experimenter represented by a state governed solely
by the Schroedinger equation has nothing discrete about him: his
brain is a continuous smear with no dynamically defined dividing
lines.
The founders of quantum theory (and von Neumann) recognized this
basic problem of principle, and in order to resolve it went to a radical
and revolutionary extreme: they introduced human experimenters
with efficacious free choices into the physical theory. This was a giant
break from tradition. But the enormity of the problem demanded
drastic measures. Because such powerful thinkers as Wolfgang Pauli
and John von Neumann found it necessary to embrace this
revolutionary idea, anyone who claims that this unprecedented step was
wholly unnecessary certainly needs to spell out the details. But this has
not been done. Rather, the environmental decoherence effect has
been taken to be a panacea. However, that well understood effect
has virtually no impact on the discreteness problem.
The environmental decoherence effect merely reduces the matrices
representing macroscopic systems to near diagonal form. [Recall that
each physical system is represented by a matrix M(l,l'), where l
specifies a location for every particle in the classical conception of
the system, and so does l'. The `diagonal' elements are those for
which l = l', but the slightly off-diagonal elements remain generally
nonzero, and they lock the whole near-diagonal structure together.]
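In the standard models (e.g., the Joos-Zeh scattering analysis) the effect
of the environment takes, schematically, the form

  M(l,l') \;\longrightarrow\; M(l,l') \, e^{-\Lambda t |l-l'|^2} ,

which crushes the far-off-diagonal elements but leaves the diagonal a
continuum: nothing in it picks out a countable set of parts.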
The region where M(l,l') is significantly different from zero remains
large, even after the effects of interaction with the environment are
taken into account. It is not broken up by the continuous action of
Process II into a collection of different, isolated regions that could be
associated with different experiences. But then the way in which the
countable set of discrete states is singled out evidently depends on
something besides Process II, and the quantum state whose
evolution it generates. In any case, the way that particular
experiences are assigned finite probabilities, given only Process II,
needs to be worked out and described in detail by anyone who claims
that the Schroedinger evolution alone is sufficient.
Actually, the problem is technically much more difficult than the above
brief sketch indicates. The real situation involves a space of an
infinite number of dimensions, but the discreteness problem can be
illustrated in a simple model having just two dimensions. Take a
sheet of paper and put a point on it. (Imagine that your pencil is
infinitely sharp, and can draw a true point, and perfectly straight lines
of zero width.) Start drawing straight lines out from the point in
different directions. With an infinitely sharp pencil you could draw
lines in different directions for billions of years, at one line a second,
and not come even close to using up the set of all possible directions.
However, the rules of quantum theory demand in this two-
dimensional case that some one particular direction (together with
the one perpendicular to it) be picked out from this continuous infinity
of possible directions as preferred to all the others. But how is such
an incredibly precise choice determined by this continuous Process
II?
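Explicitly: for each angle \theta with 0 \le \theta < \pi the pair

  |e_1(\theta)\rangle = \cos\theta \, |0\rangle + \sin\theta \, |1\rangle ,
  \qquad
  |e_2(\theta)\rangle = -\sin\theta \, |0\rangle + \cos\theta \, |1\rangle

is a perfectly good orthogonal basis, and the continuous Process II
evolution supplies no rule that singles out one value of \theta from this
continuum.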
This is the famous "basis problem," which was solved by the
founders, and by von Neumann, by invoking the choice on the part of
the experimenter. Radical as this step might seem to physicists
trained first in classical physics, the notion that our streams of
consciousness play some important dynamical role in the
determination of our behavior is not outlandish: it is what almost
anyone would naturally expect.
Kathryn went on to say:
"Bohm and Hiley say this (that there is no collapse or reduction) in
describing their hidden variable theory."
Bohm's pilot-wave model is another attempt to add onto the raw
theory an extra process, in order to tie the raw theory to human
experiences in a quantitative way.
The main objection to that theory is that, in spite of many years of
intensive effort, it has not been generalized to cover relativistic cases
involving particle creation and annihilation.
I once asked Bohm how he answered Einstein's charge that his
model was "too cheap". He said that he agreed! And notice that the
last two chapters of his book with Hiley try to go beyond this model.
David Bohm, like myself, saw the need to deal more adequately with
consciousness, and he wrote several papers on the subject. At the
present time Hiley is working on ideas that go far beyond the
concepts used in the old pilot-wave model. I do not think any physicist
actually working in the area would claim that the pilot-wave model
exists today in the relativistic domain.
Kathryn continued:
"Others also say this, including people who don't subscribe to the
Bohm pilot wave + particle ontology, such as Carver Mead in
"Collective Electrodynamics," who gives a fairly well worked-out
example of a quantum oscillator jumping an energy level, and how
this can be explained by systems that briefly cross phases, exchange
energy, then go out of phase again."
Quantum theory explains very well how information is continuously
transferred to measuring devices. But those beautiful descriptions are
the basis of the measurement problem, not the solution. They do not
explain how some object whose location is represented by a wave
function that is spread out over meters is experienced as being
located at nearly a point, and with some well defined probability.
Kathryn continues:
"R. Mirman says "Wavefunctions don't collapse, oversimplifications
do... Perhaps what collapses is not the statefunction, but common
sense... Discontinuity cannot be true, and it is not. But carelessness
unfortunately can be true and too often is, and certainly can make
discontinuity appear true." He goes on to amplify: "If for example we
consider an object striking a screen forming a spot, the statefunction
of the system after the formation, the product of that of the struck
atom plus all objects attracted to it and the scattered object, is found
from the initial one using Schrodinger's equation, and if so found
would be seen to vary continuously. In principle it is possible to
calculate final (perhaps extremely complicated) statefunctions from
initial ones, and the entire transformation from one statefunction to
another is completely continuous. Never is there a sudden change or
collapse. Any such appearances result from ignoring the
(continuous) intermediate stages by regarding these as happening
instantaneously." "
Quite true! If Process II is the whole story then there never is a
sudden change or collapse! That's the problem! The Schroedinger
equation generates only continuous changes. But the continuousness
of that Process II evolution is closely tied to the fact that in a universe
evolving exclusively via the Schroedinger Equation, (i.e., Process II)
ever since the big bang, the detector is everywhere, instead of
somewhere, and the observer's brain is a smeared out continuum
encompassing all possibilities. The continuousness stressed by
Mirman is the problem, not the solution.
Once, long ago, I characterized the many-worlds solution as shifting
the whole measurement problem onto the mind-brain problem, about
which it says nothing. For the theory to be empirically meaningful, it
must be tied to probabilistic statements about alternative possible
human experiences. But the smeared-out state of the brain does not
cleanly separate vectors from other vectors that differ from them by
very tiny amounts. But then what principle, involving nothing but the
evolving amorphous state of the universe, can separate the space of
brain states into orthogonal subspaces, such as those defined by P
and (I-P), associable with different experiences?
I do not claim that this problem has no solution. But Mirman's
observation that a world evolving according to the Schroedinger
equation alone is evolving continuously does not solve the
measurement problem: it creates the measurement problem.
Certainly, Heisenberg and Pauli, and von Neumann, understood very
well that a world evolving according to a universally valid
Schroedinger equation would evolve continuously. And they also
realized that this did not solve the measurement problem. I have
absolutely no doubt that von Neumann understood very well also the
essential features of environmental decoherence: the basic ideas are
all clearly displayed in his work. Yet in order to get an empirically
meaningful theory he brought in Process I.