Date: Mon, 21 Oct 2002 17:41:24 -0700 (PDT)
From: stapp@thsrv.lbl.gov
Reply-To: hpstapp@lbl.gov
To: Kathryn Blackmond Laskey
Subject: Re: Quantum Theory of the Human person (fwd)
On Fri, 18 Oct 2002, Kathryn Blackmond Laskey wrote:
> Dear Henry,
>
> I hope all is well with you.
>
> Thanks very much for sending me this abstract.
>
> Regarding the book "The Mindful Universe" you sent recently, where is
> that being published?
It is still under construction. I've already had several offers
to publish it, but I'm holding off on deciding until the manuscript
is more readable to more people, and perhaps addresses more completely
the questions you ask below.
> I would appreciate your answering a question I have.
>
> There is much disagreement in the literature about the reduction
> process and how it works, including controversy over whether there is
> any such thing as reduction. I have read numerous statements from
> physicists that measurement involves interaction of a quantum system
> with its environment, and is (it is asserted) therefore "nothing but"
> Schrodinger evolution on a larger system.
This is wishful thinking, not backed up by adequate supporting math.
This is essentially the Everett approach. I have published my reasons for
claiming that this approach, after a half century of dedicated effort by
very many physicists, has not yet succeeded, technically.
[Can. J. Phys. 80, 1043-1052 (2002), "The basis problem in many-worlds
theories"]
It is indeed widely advertised that the interaction with the
environment solves the measurement problem, but I do not believe
that the principal workers in that area (Zurek, Zeh, Joos)
actually make the claim that the whole theory has really been
worked out. I believe that the details have not been worked out
satisfactorily, and that the gaps are significant, and are
sufficient to undercut any strong claim that the Schroedinger equation
alone (and this includes the environmental decoherence) is
actually sufficient, by itself, to tie the evolving state vector
to well-defined probabilities for human experiences, which is
what the orthodox (Copenhagen and von Neumann) formulations
achieve by explicitly introducing a second process tied to experiential
realities.
The reason, in brief, that a second process tied to experiential
realities is needed is that if the universe has been evolving
since the big bang solely under the influence of the Schroedinger
equation then every object and every human brain would now be, due
to the uncertainty conditions on the original atoms, represented
by a smeared out cloud, by an amorphous continuum. But in order to
extract from a quantum state a set of probabilities pertaining to
human experiences, and hence to give well-defined empirical
meaning to the quantum state, one must specify a basis: one must
separate the space of the observing system into a set of discrete
orthogonal subspaces corresponding to different "observations";
i.e., into orthogonal subspaces corresponding to different distinguishable
experiences. But there is no detailed suggestion as to how
a set of particular orthogonal projection operators P is to be
specified on the basis of the amorphous state of the brain and
the continuous action of the Schroedinger evolution.
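The role of the basis can be made concrete with a small numerical sketch
(an editorial illustration in plain Python, not part of the letter; the
state and the two bases below are arbitrary choices): the same state
yields different Born probabilities relative to different choices of
orthogonal subspaces, and the unitary evolution by itself never selects
one choice.

```python
import math

# A normalized state in a 2-D real space, written in one reference basis.
psi = (1 / math.sqrt(2), 1 / math.sqrt(2))

def probs(state, basis):
    """Born probabilities |<e_i|state>|^2 for an orthonormal basis."""
    return [sum(e * s for e, s in zip(vec, state)) ** 2 for vec in basis]

basis_z = [(1.0, 0.0), (0.0, 1.0)]      # one choice of orthogonal subspaces
r = 1 / math.sqrt(2)
basis_x = [(r, r), (r, -r)]             # a rotated choice

print(probs(psi, basis_z))   # ~[0.5, 0.5] : probability evenly spread
print(probs(psi, basis_x))   # ~[1.0, 0.0] : probability fully concentrated
```

Nothing in the state itself, or in its unitary evolution, distinguishes
the first decomposition from the second; that is the basis problem in
miniature.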
The problem, basically, is that a (proper) subspace is a set
of zero measure. For example, a subspace of dimension one in
a visualizable space of three dimensions consists of a single
line in the three-dimensional space. The definition of this
subspace must distinguish the vectors that lie along that line
from vectors that deviate from that direction by the tiniest
amount. But how can a continuous amorphous structure distinguish
one direction from those lying arbitrarily close to it?
[For a higher-dimension example one can think of a plane in a three
dimensional space: again almost all points arbitrarily close
to any point in the plane will lie outside the plane.] Moreover,
the projection operator P cannot be local (confined to a point,
as contrasted to a smeared-out region). So how can a nonlocal
P be specified by a local process (the Schroedinger evolution
operator) acting upon a completely amorphous structure, the
evolved state? The founders (and von Neumann) seemed to recognize
clearly this basic problem of principle, and they introduced
a second process to resolve it. Any claim to have resolved
this glaring problem of principle without bringing in another
process (besides the Schroedinger equation) needs to be
spelled out in detail. But this has not been done. Rather,
the environmental decoherence effect has been pointed to as
some sort of panacea. But that effect does not resolve the
problem at issue; rather it heightens it, by making it
effectively impossible, or nearly impossible, to use
empirical data to shed any light on the matter. The
environmental decoherence effect quickly reduces the density
matrix of macroscopic systems to NEAR diagonal form, but the
slightly off-diagonal elements in coordinate space hold
the continuum of diagonal states together in a continuous interlocked
structure. This structure does not break up purely dynamically into
a set of discrete regions. Any rule that breaks up this linked amorphous
structure in coordinate space into a set of discrete parts associated
with distinguishable experiences would seem very difficult if
not impossible to achieve solely by the dynamical process specified
by the Schroedinger equation alone.
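The near-diagonal (but never exactly diagonal) structure can be sketched
numerically (an illustrative toy model of my own, assuming a Gaussian
packet and a Gaussian decoherence factor exp(-(x-x')^2/l^2); the grid
and the length l are arbitrary choices):

```python
import math

xs = [0.1 * i for i in range(-20, 21)]   # coarse position grid
def psi(x):                              # a broad (smeared-out) wave packet
    return math.exp(-x * x / 2.0)

l = 0.05                                 # short decoherence length
# Position-space density matrix rho(x, x') after decoherence damping:
rho = [[psi(x) * psi(y) * math.exp(-((x - y) ** 2) / l ** 2)
        for y in xs] for x in xs]

i, j = 20, 21                            # neighbouring grid points
print(rho[i][i])          # diagonal element, order 1
print(rho[i][j])          # off-diagonal: strongly suppressed...
print(rho[i][j] > 0.0)    # True: ...but not zero; the continuum stays linked
```

The damping makes neighbouring off-diagonal elements small, but no rule
in the dynamics sets any of them to zero or draws a boundary between
discrete blocks.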
In any case this extraction of discrete subspaces from the
amorphous evolving quantum state needs to be described by those who
claim that the Schroedinger equation alone is enough.
> Bohm and Hiley say this in
> describing their hidden variable theory.
Bohm's pilot-wave model is another way to add onto the raw
theory an extra process that ties it to human
experiences in a quantitative way.
The main objection to that theory is that, in spite of
long-term intensive effort, it has not been generalized
to cover relativistic cases involving particle
creation and annihilation. Also, the connection of that theory
to experiment was based on the presumption that when the macroscopic
level of "pointers" was reached, the experience of the observer would
correspond to the branch of the pointer wave function that was "occupied"
by the "trajectory". (Notice that this involves a second process, which
is explicitly linked to consciousness.) But this linkage is cast into
doubt by examples in which the trajectory goes through one detector but
it is the faraway detector that fires. [Dewdney, Hardy, and Squires,
Phys. Lett. A 184 (1993) 6-11]
I once asked Bohm how he answered Einstein's charge that his model
was "too cheap". He said that he agreed! And notice that the last two
chapters of his book with Hiley try to go beyond this model. And
he, like me, saw the need to deal more adequately with consciousness,
and wrote several late papers on the subject. And Hiley
is working on ideas that seem quite different from the old pilot-
wave model. I do not think any physicist actually working in the area
claims that the pilot-wave model really exists in the relativistic
domain.
> Others also say this,
> including people who don't subscribe to the Bohm pilot wave +
> particle ontology, such as Carver Mead in "Collective
> Electrodynamics," who gives a fairly well worked-out example of a
> quantum oscillator jumping an energy level, and how this can be
> explained by systems that briefly cross phases, exchange energy, then
> go out of phase again.
This book is checked out at UCB and is being ordered for me from UCSC.
But in any case quantum theory explains well how information is
transferred to measuring devices. But those clean descriptions
are the BASIS of the measurement problem, not the solution.
They do not explain how some object whose location is represented by a
(center-of-mass) wave function that is spread out over meters is
experienced as being located at nearly a point, and with some well
defined associated probability. If the device location is smeared out
then for each of a large continuum of device locations the device will
record the object as being "here." [Suppose the device has, in addition to
its detecting ability, also the capacity to determine and record its
location, and to correlate that information with the "detection" event.]
It's the same problem as before. Orthodox theory allows the detector to be
placed in some particular location corresponding to some particular
experience because the observer places it there, but an observer governed
solely by the Schroedinger equation has nothing definite about him: the
entire situation is a continuous smear with no dynamically defined
dividing lines. Some additional principle connected to the mind-brain
connection is needed.
> Dorit Aharonov in her review article on quantum
> computing also says that measurement involves Schrodinger evolution
> of the system and its environment. R. Mirman says "Wavefunctions
> don't collapse, oversimplifications do... Perhaps what collapses is
> not the statefunction, but common sense... Discontinuity cannot be
> true, and it is not."