Under Our Noses: Existing Theories of Everything

Trevor Pitts
tpitts@pobox.com
2015-01-31
Copyright: Trevor Pitts, 31 January 2015

Abstract
The essay's subject is the dynamics of the creation, and the operation in time, of any and all possible universes capable of or presently harboring intelligent beings. It attempts to demonstrate that there can be exactly one such universe of any practical relevance, by gathering together existing experimental results and accepted theories to form a summary of a totally quantized and symmetric physical model of reality. It contains a reinterpretation of an existing candidate for a theory of quantum gravity in order to explain time symmetry, and "quantum quasi-teleology", a speculative extension of the concept of the sum-of-all-paths version of quantum mechanics in order to explain fine-tuning and, by extension, the lack of need for theories of supersymmetry. Much of the material involves neglected topics in conventional physics and their philosophical implications. The results suggest significant consequences in moral philosophy.

Introduction
Assumptions
I trust that the professional physicists who built all the separate moving parts that I am
gathering together to form a summary of a totally quantized and symmetric physical
model of reality are correct in their experiments, mathematics, and accepted theories.
They already have provided the conceptual tools to eliminate any purported
contradictions and build this model. However, if we can't put something into comprehensible verbal form (according to Freeman Dyson), we don't clearly understand it. This is an attempt to do so.

What Does this Essay Offer?


It is not a new proposed Theory of Everything (TOE); it is a proposal to assemble and connect various neglected parts of late 20th- and 21st-century science to provide an empirical, testable approach to explain large-scale perceived reality and its creation from nothing. This may be a revelation of how much we already know about the supposed mysteries listed below that a new TOE might be expected to resolve. Perhaps it is a "Theory of Most" (TOM).
This essay seeks to demonstrate that everything (space, energy, mass and time) is discontinuous and quantized. Consider the quantization of mass:
An atom or molecule is a discrete local unit of matter separated by gaps of empty space; atomic or sub-atomic particle numbers can only exist as whole numbers (integers) or change by a limited number of integer amounts. The same is true of all particles: every member of each type is indistinguishable from every other of that type, and each mass/energy state is separated by an empty gap. That gap can be, for example, a gap between two energy levels of rotation in a molecule, or of vibration in a molecule, or of
an electron's distance from the nucleus in an atom. There can be no in-between values, and no time exists between the two levels (or, more generally, states). Change is "instantaneous." The quantum change is a jump of a fixed quantity, or "quantum", of energy.
The quotes around the word "instantaneous" are necessary because the word may wrongly imply continuous time. Time itself is quantized according to this essay. Capitalized NOW in this essay means the postulated universe-wide single moment of the present.
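To make the idea of those empty gaps between energy levels concrete, here is a minimal numerical sketch in Python. It assumes the textbook rigid-rotor formula E_J = B*J(J+1) and a rotational constant of roughly 1.93 cm^-1 (approximately that of carbon monoxide), purely as illustrative values; the point is only that the allowed energies form a ladder with gaps, not a continuum.

# Discrete rotational energy levels of a diatomic molecule (rigid-rotor model).
# Illustrative only: B is assumed to be ~1.93 cm^-1 (roughly carbon monoxide).
h = 6.62607015e-34        # Planck's constant, J*s
c = 2.99792458e10         # speed of light in cm/s (so B*h*c gives joules)
B = 1.93                  # rotational constant in cm^-1 (assumed value)

def rotational_energy(J):
    """Allowed energy of rotational state J = 0, 1, 2, ... in joules."""
    return B * h * c * J * (J + 1)

for J in range(5):
    gap = rotational_energy(J + 1) - rotational_energy(J)
    print(f"J={J}: E = {rotational_energy(J):.3e} J, "
          f"gap to J={J+1}: {gap:.3e} J")
# Only these energies exist; no value in between is allowed.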
One pervasive theme of this essay is the "sum of all paths" version of quantum mechanics, which is generally agreed to be fundamental to the operation of our universe and, I will argue, to its formation. This, combined with the second theme, the symmetries, especially general relativity, can already produce consistent answers to all of the truly basic issues listed below, if properly understood. Remarkably, physics, following contradictory philosophical/formalistic assumptions, has no widely accepted explanations for any of the issues below, nor do philosophers:
- Einstein's relativity supposedly implies determinism, which is in philosophical and mathematical-formalist conflict with quantum indeterminacy, basic to quantum mechanics, the most successful theory of all. With implied determinism, we cannot reconcile the two most basic theories of reality.
- The irreversible forward arrow of time and the uniqueness of the present moment: we do not understand time.
- The collapse of the wave function: we do not understand how events arise.
- Causality, and the entanglement of distant quantum states with each other, or "spooky action at a distance": Einstein spent much of his life on this, with no resolution.
- The imbalance of matter versus antimatter in the universe; equivalently, why was matter not completely annihilated at the origin by the required equal amounts of antimatter? We have no accepted explanation for the existence of matter.
- Why there is something rather than nothing, and how our particular spacetime background could be constructed consistently with known physics.
- How did our universe arise, whose most fundamental constants and parameters appear to have been "tweaked" to be perfect for life, against astronomical odds: the Fine-Tuning Problem.
- The accepted, symmetry-based Standard Model of particle physics requires 19 unrelated constants of arbitrary magnitudes. Further, the possibility of neutrino mass seems likely to require at least six more, in the view of most specialists in the field: another fine-tuning problem?
- Roger Penrose's concern about the anomalously extreme low-entropy state apparently required at the origin.
- Free will and responsibility versus the jazz song lyric "I'm depraved because I'm deprived."

(Note that, in an effort to be understandable by non-scientists, the references cited are often not the original ones, but ones more easily accessible and/or comprehensible to non-physicists.)

Physical Science, and Why this Essay Matters


Logical philosophy and mathematics are pure, self-consistent symbolic abstractions.
Mathematicians, and many physicists such as Roger Penrose, will often argue the
Platonist position that mathematical concepts are purely abstract, having no necessary
causal or descriptive relevance to the actual world. However, much of mathematics is
highly relevant to real world issues. Modern physics is essentially extremely advanced
mathematics equipped with hands, ears and eyes. Yet, in many ways modern physics
promotes the abstractions of mathematics and philosophy instead of physical world
experimental results as the basis of science. This is especially true of the essay's subject:
the dynamics of the creation, and the operation in time, of any possible universes capable
of or presently harboring intelligent beings.
A key problematic example is the assumption of a continuum in Nature, which permeates
our language. A continuum is anything that changes gradually from one condition to a
different one, with no gaps. The Universe itself has often been described as The
Continuum. In mathematics, the continuum is formally defined as the line of real
numbers, all of which can be defined by a decimal expression of indefinite length, with
no gaps between any real numbers, requiring infinite divisibility of the line. This formal
continuum concept is pervasive in modern physics. I argue, in agreement with
experiment, that our universe requires the opposite. In experimental fact, our universe is
discontinuous, or quantized. To be quantized is to be able to change only in discrete, tiny,
fixed quantities, with no intermediate values or time between these instantaneous
quantum jumps. Determinist philosophy and mathematical formalist prejudices have
interacted with each other over time to confuse what should otherwise be clear.
This essay agrees with the Platonists that not all of mathematics operates in the real
world. Specifically: First, the continuum concept and its implication, infinite divisibility,
have no role in absolute reality. Second, the fact that a mathematical expression
accurately calculates some quantity by a deterministic method does not imply that a
series of microscopic events is necessarily deterministic by a philosophical extension of
that formalism.
Physical mathematical equations are not reality; they describe a portion of reality. If
reality behaves differently than expectations based on a mathematical formalism, then the
formalism is being philosophically extended beyond its scope. The formalism is not
necessarily wrong; only the extended, essentially philosophical, interpretation is wrong
and unjustified.
Classical determinism is the belief that given sufficient information as to the laws and the
current state of the world, one could perfectly predict all future states. There would be only one possible future state at any time, given any past state. 19th-century (Classical) physics assumed determinism as basic. Strangely, most physicists today are still Classical determinists at heart.
Science began by a carefully observed and practical approach to real events. It became
dedicated to testing theories based on experimental or observational, as opposed to
argumentative, linguistic evidence. This led to the goal of skeptical demolition of any and every hypothesis, theory or conclusion (to the extent that is practicable), with a view to the survival of only the fittest descriptions of reality and to the elimination of the rest. Science has evolutionary logic, starting with evident reality as its basis, proceeding temporarily with some concrete and usually useful experimental conclusion, then iterating more and more exact experimentation to eliminate failed theories and reach better-fitting theories. This essay argues that, from the late 19th Century onward, existing
philosophical and mathematical formalist prejudices have had a negative influence on
clarity in physics by fostering specious arguments against new, philosophically
uncomfortable experimental evidence.
Currently, in physics there is a common view that the theory explaining gravity, general
relativity, and the theory explaining change, quantum mechanics, cannot be reconciled. It
is true that the problems listed above cannot be solved without using both sets of insights.
This essay, however, proposes that they cannot and need not be combined into a new
theory of quantum gravity. The supposed problem is that Classical determinism (the idea that any and every situation inevitably has only one possible future) is thought to contradict quantum mechanical indeterminacy, preventing their combination into some final theory of Quantum Gravity. But there is zero actual experimental evidence for such determinism in physics. At root, physical law is completely defined by symmetry and operates via quantum mechanics. Symmetry is a constraint on possible events; it both allows and enables different outcomes, as long as its rules are obeyed. The great mathematician Emmy Noether derived the laws of nature, the conservation laws, from the symmetries in 1915. The (usually assumed to be deterministic) laws of nature are not
basic, being entirely derived from the truly basic symmetries. The unnoticed distinction
was that symmetries are like the rules of chess: vastly different outcomes are possible
within the rules. Since special and general relativity are both examples of these universal
symmetries, the same argument applies. There is no conflict, because neither is in fact
determinist in the Classical sense. So the future is open, subject to the symmetries, not
ineluctable fate. (Symmetries, quantum mechanics and quantum indeterminacy will be
defined and considered later). Relativity is perhaps among the most basic symmetries, but
it is no different in essence. The key concept to unraveling supposed contradictions is to
accept that symmetries provide both the rules and the particle actors themselves while
quantum mechanics is the mechanism which moves the universe from one state to the
next. Symmetries provide the three-dimensional structure of the world and create the particle actors that quantum mechanics animates in time, while quantum indeterminacy
allows script variations in the universal movie. This essay argues that we can simply
accept that both relativity and quantum mechanics are true, mutually interacting, entirely
separate descriptions of the relevant parts of reality. We resolve the unanswered questions above by using both together. We do not cripple ourselves intellectually by unnecessarily insisting that it is impossible to do so.
The assumption of Classical determinism casts doubt on quantum indeterminacy, a
fundamental component of quantum mechanics, the most successful physical theory of
all, so far. However, there is no experimental evidence purporting to support
determinism that the symmetries cannot wholly explain. The symmetries behind special
and general relativity, and the theories behind quantum mechanics have survived every
theoretical challenge and experimental test of their predictions, no matter how apparently
bizarre their predictions may seem. Neither can be fundamentally wrong. Adding
consideration for some late 20th and 21st century physical insights refutes unwarranted
reservations as to the true reality and meaning of quantum mechanics and the symmetries.
Additionally, this essay fully accepts special and general relativity and quantum
mechanics as experimental fact and submits that they already have been successfully
combined by Roger Penrose's decoherence-by-mass hypothesis, if certain recent solutions of the Einstein equations of general relativity are also accepted.
The foregoing statements set the stage for addressing the repercussions of physical
deterministic assumptions, highlighted by the following comparison: Both quantum
mechanics and relativity theories arose at the turn of the twentieth century as a result of
experimentally driven crises in Classical Maxwellian and Newtonian theories. Now, in
their aftermath, the crisis of the twenty-first century is philosophical, not physical.
Indeed, the assumed determinism in physics via philosophy and mathematical formalism
produces assumptions corrosive to morality, politics and diplomacy. If science shows
that everything is ineluctable fate, who can be individually guilty, even of the worst
atrocity? Are the depraved guiltless if they were deprived? Yet, who is not deprived of
something? And how can biological evolution by random mutations be explained if the
physical world we live in is completely determined moment to moment from the Big
Bang, forever? The historic fact is that the contribution of fatalist cultures to human
advancement has been negligible versus that made by cultures emphasizing individual
freedom and responsibility. So determinism is unlikely to prove positive
sociologically/economically.
The theme here is universal rule by the symmetries, which function as constraints, but not
absolute controls, of events, in contrast to the usual deterministic assumptions as to their
derivatives, the conservation laws of nature. I propose that we already have the means to
solve all the mysterious issues in the above list, using the four basic tools of modern
physics: microscopically, the "sum of all paths" version of quantum mechanics, working subject to the 248 symmetries of the E8 Lie group. These lead, macroscopically, to two more tools. These two emergent phenomena are Chaos (Lorenz 1993) and Self-Limiting Criticality (Bak 1996, Jensen 1998). I prefer to call the latter "avalanche theory", and will
define both later. One fundamental process, quantum mechanics, one basic principle,
symmetry, and these two emergent phenomena, working subject to a basket of a half
dozen universal constants or parameters unique to our universe (Rees 2000), plus up to
26 less well-known, but vital, parameters inherent in the Standard Model of particle
physics, can explain the entire basic nature of reality. This is not to say that there are no unexplained phenomena, especially on the cosmological level, but we can sufficiently address the list of issues in the Introduction above. All four of the above fundamentals are
experimentally confirmed and theoretically sound, but philosophically unappreciated.
This is due to Classical deterministic assumptions and the overly revered formalism of
the mathematics. Indeed, most of these fundamentals have not yet permeated the
philosophy of science sufficiently to change the persistent default deterministic stance,
that of 19th Century Classical physics.

Author's Background
I should point out that I am a chemist, not a physicist. I have an advantage in that all of
chemistry from the nuclei of the elements to the geology of the planet is entirely a
quantum phenomenon. I have to be comfortable with it. Very few of us truly understand quantum mechanics or all of the symmetries; even fewer understand both. But that is unnecessary. We simply have to accept the latest physical evidence as the
theoretical/experimental reality on which to base ontological or epistemological
philosophy rather than the other way around.

Quantum Mechanics
The central fact of any quantum phenomenon is the quantum jump. Every quantum transition is very different from anything we perceive in the usual big, warm, heavy world. If you climb a staircase, you gradually move from one step to another, using energy apparently continuously the whole way, more or less evenly between the steps. The atom's electrons are similarly at a series of levels of energy, approximately corresponding to stepwise average distances from the nucleus. Approximately, the electron can move from one step to another one, if it is vacant. The difference is that there is no time or space occupied by the electron between the two states. Rather, it can jump "instantaneously" to a lower position, closer to the nucleus, by emitting a precise frequency of light (or similar electromagnetic energy such as X-rays) as it does so. It can jump back up, to its vacant position, equally instantaneously, if exactly the same energy/frequency is absorbed by it. So in this respect, reality is discontinuous: all microscopic changes occur in specific increments, instantaneously, not gradually. (We shall explore later what an "instant" is.) This will be of supreme importance in our transition from the comfortable, illusory Classical clockwork view to the quantum/symmetric real world.
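As a numerical illustration of this staircase picture, the following sketch uses the textbook Bohr formula for hydrogen, E_n = -13.6 eV / n^2, purely as an approximation: only whole-number levels exist, and a jump between two of them corresponds to exactly one emitted or absorbed frequency.

# Hydrogen energy levels in the simple Bohr model (illustrative sketch).
h = 6.62607015e-34   # Planck's constant, J*s
eV = 1.602176634e-19 # joules per electron-volt

def level(n):
    """Allowed energy of level n = 1, 2, 3, ... in joules (Bohr formula)."""
    return -13.6 * eV / n**2

# A jump from n=3 down to n=2 emits a photon of one exact frequency:
dE = level(3) - level(2)            # energy released by the jump
frequency = dE / h                  # E = h*f
wavelength_nm = 2.99792458e8 / frequency * 1e9
print(f"3 -> 2 jump: {dE/eV:.2f} eV, {wavelength_nm:.0f} nm (the red Balmer line)")
# There is no state, and no time spent, "between" n=3 and n=2.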
The facts of quantum mechanics are contradictory to ordinary, naïve observation of large
objects on a huge warm planet. They only become demonstrable at a tiny scale, at low
temperatures and in isolation from uncontrolled interaction with the world of large
objects. But if we consider them together with general relativity, we can extract the
nature of the present moment itself, and propose answers to all the conundrums listed
above. I believe that we already have theories of all the issues in physics that can fix the supposed irreconcilability of quantum mechanics and relativity. This section intends to present only the bare bones of huge subjects, in order to extract the key points.

Quantum mechanics. Why use this form of words rather than quantum theory? Quantum
science has proceeded beyond mere theory. It is now an exact science, able to calculate
physical quantities accurate to a dozen decimal places, having withstood the determined
attacks of many of the greatest minds in physics for over a century. It is empirical fact,
like the thermodynamics of steam engines, no matter how weird it seems. So now, the
genuine weirdness is the refusal to whole-heartedly accept quantum mechanics as
experimental fact. At root, modern quantum mechanics is the sum-of-all-paths approach,
which is the means by which all events occur. This, as I discuss below, includes the most
important event, the origin of our universe, optimized for the existence and flourishing of
interplanetary technical civilizations.
Quantum mechanics (in its modern form - happily, fully compatible with special
relativity, thanks to Dirac and Feynman) is the fundamental process by which all the
particles get from one event to another. These particles altogether produce all
macroscopic events. The current, supremely successful approach to quantum mechanical
calculation is variously titled the "sum of all paths" or "path integral" or "sum of all histories" method (Feynman 1985, Feynman, Hibbs et al. 2010). All are different names
for the approach begun by Richard Feynman with his Feynman diagrams in quantum
electrodynamics (Feynman 1985). Essentially, every particle explores all the ways to get
to the next state of reality, however bizarre, unlikely, or apparently impossible each such
path may be.
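To make the sum-over-paths idea tangible, here is a deliberately crude numerical toy (a discretization for illustration, not Feynman's full formalism): every path from a fixed start point to a fixed end point contributes a complex amplitude exp(iS/ħ) built from its action S, and the contributions are added before anything is squared. The grid, mass and time step are arbitrary illustrative choices.

# Toy "sum over all paths": a free particle crosses 3 time steps on a small
# 1-D grid. Every possible sequence of intermediate positions is a "path";
# each contributes exp(i*S/hbar_toy), and the contributions are summed.
import itertools, cmath

m, dt = 1.0, 1.0          # arbitrary toy mass and time step
hbar_toy = 1.0            # natural units for readability, not SI
grid = [i * 0.5 for i in range(-6, 7)]   # allowed intermediate positions
x_start, x_end = 0.0, 1.0

def action(path):
    """Discrete free-particle action: sum of (m/2)*v^2*dt over the steps."""
    return sum(0.5 * m * ((b - a) / dt) ** 2 * dt
               for a, b in zip(path, path[1:]))

total = 0 + 0j
for x1, x2 in itertools.product(grid, repeat=2):        # two intermediate slices
    path = (x_start, x1, x2, x_end)
    total += cmath.exp(1j * action(path) / hbar_toy)    # each path's amplitude

print("summed amplitude:", total)
print("relative probability ~ |amplitude|^2 =", abs(total) ** 2)
# Wild, zig-zag paths are included too; their rapidly rotating phases largely
# cancel, leaving the "reasonable" paths to dominate the sum.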
Is every tiny particle so smart as to be able to explore all imaginable and unimaginable
paths? No, but if the universe as a whole is a giant, massively entangled quantum
computer, then it is smart enough to perform the only actual task in the universe, creating
the next moment, otherwise known as Reality. Entanglement will be explored later.
Everything in Nature at the microscopic level proceeds by quantum mechanical
processes. They are the means by which all change occurs at the fundamental particle
level. Experimentally, such processes always end with an interaction involving some
entity of sufficient mass, generating some aspect of perceptible reality. This universal
end/interaction is the collapse of the wave function, sometimes called decoherence, which
I consider to be the most common phenomenon in Nature. Indeed, as we see later, it is the
only basic actual event in Nature. This collapse cannot be excused away if we are to
construct a testable model of reality consistent with the fundamental, universal
experience known as the present moment.
This essay does not address interpretations of quantum mechanics because there is
nothing to interpret. The quantum phenomena discussed here are simply experimental
and experiential facts or testable predictions. Intellectual philosophical discomfort is not
a valid reason to try to explain away the facts of experiments. Now we come to many physicists' pet hate, the collapse of the wave function. The Schrödinger wave equation calculates the probabilistic evolution of particle states in time, as a wave function, in an abstract deterministic mathematical form. It deals with "superpositions" between different possible particle positions and velocities. But in the end, it can only produce
probabilities of the final position and velocity, via the squares of the summed probability amplitudes. Many of these amplitudes involve i, the square root of minus one, an imaginary number. Such terms, being negative when squared, cancel out some positive squared components of the wave equation when summed, with excellent results in terms of describing actual reality. One can barely imagine the shock and horror of most scientists a century ago when this was first proposed, all reared on the Newtonian clockwork universe, the Maxwell Equations of electromagnetism and the Laws of Thermodynamics, when confronted with negative, squared probability amplitudes and indeterminacy. Their shock prefigures the incredulity and reluctance to accept the overwhelming experimental evidence, a reluctance that persists to this day.
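The cancellation described above can be shown in a couple of lines: two individually non-zero complex amplitudes can sum to nothing at all, which is why the combined amplitude, not the separate probabilities, must be squared. The numbers are arbitrary illustrative values.

# Interference of complex probability amplitudes (illustrative values only).
a = 0.5 + 0.5j          # amplitude for one route
b = -0.5 - 0.5j         # amplitude for another route, opposite phase

p_separate = abs(a) ** 2 + abs(b) ** 2   # what classical intuition expects
p_quantum  = abs(a + b) ** 2             # what the Born rule actually gives

print("classical-style sum of probabilities:", p_separate)          # 1.0
print("quantum probability of the combined amplitude:", p_quantum)  # 0.0
# The two routes cancel completely: interference, the heart of quantum mechanics.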
There are apparent problems with the approach initiated by Bohr, Schrödinger, and Heisenberg, in which one starts with some original state and calculates some physical change over time. The physical system proceeds through a variety of superposed states at once (superposed on top of each other, always remembering that only one state will be real when the stack collapses into a single real state), in different places, with different velocities, etc. These individual superposed quasi-events are each calculated to proceed deterministically. Only when "measured" by interacting with some macroscopic apparatus does this superposition of different likely, unlikely or "impossible" routes and states collapse probabilistically into an actual, factual reality.
This led to huge (and absurd) issues: What is a measurement? Is it the motion of an instrument needle? The observation of this needle movement by a conscious entity? Or even the publication of the result in Nature magazine? This chain of observers is potentially endless. The diabolical Schrödinger's Cat thought experiment, in which an outside observer cannot measure whether the cat inside the box is dead or not, was Schrödinger's masterpiece of sarcasm on this subject: the cat supposedly must be both dead and alive at once if not observed! Schrödinger's cat was far too large and warm to be quantum superposed, as Schrödinger intuited and as experiments based on Roger Penrose's suggestions are about to confirm. The error was that, because everything discussed was an experiment, there was always an observer; but what of the real macroscopic world, where, famously, and contrary to Bishop Berkeley's subjective idealism, the physics is believed to be independent of the presence of observers?
Experimentally, all this measurement argument is utterly pointless. It is extraordinarily
difficult to maintain any quantum system in a superposition of states. High vacuum,
extreme cold, very low mass, and total isolation from other masses and outside influences are required, and even then the achieved elapsed superposition time is very short. A cat is
way too hot, heavy and complicated to be in a superposition as a whole. It will be dead or
alive, in a box or not. "Measurement" has nothing to do with collapse of the wave function: any sufficiently large object, including Schrödinger's cat itself, in or out of a
box, will do it. Clearly, when the universe was simply dumb rocks, radiation etc., it
managed to collapse itself into reality for billions of years, by itself. Why believe it had to
have conscious observers to convert all those billions of years of previous
superpositions of "maybe" history into actual history? Would a dinosaur or a caveman be
enough? Or was a Ph.D. necessary?

In regard to this matter, Roger Penrose (Karolhazy, Frenkel et al. 1986, Penrose and
Isham 1986) has a theory, which is testable (with difficulty and at significant expense). If
this theory survives it will revolutionize physical understanding of quantum mechanics
by explaining wave function collapse. Crucially, according to Penrose (Penrose 1996),
a macroscopic quantum superposition of two differing mass distributions is unstable
and would decay after a characteristic time T, into one or another of the two states. T
is approximately of the order of the Planck time, of which much more later.
Roger Penrose has thereby suggested a very profound solution. All superpositions
gravitationally collapse when an interaction with a sufficient mass occurs. Of equal
importance, he claims that if an isolated state has a sufficient mass, it will collapse itself
(self-decohere) into a fixed reality at a certain point in spacetime. Thus he gives us an
"objective collapse" theory. How? Penrose suggests that spacetime is intolerant of superpositions of different amounts and positions of a particle's mass, since they would warp spacetime gravitationally, according to the symmetry of general relativity, in
slightly different ways. The idea here is that there is a strict limit to the tolerable
distortion by the different mass distributions in spacetime of different superposed states
in order to avoid a problematic superposition of slightly differently curved spacetimes.
That would be like trying to jam a slightly bent and distorted part into a precision
machine. The machine of proceeding quantum superpositions in time would stop
proceeding and collapse to a fixed immobile state. This is the point where general
relativity and quantum mechanics gravitationally cooperate to collapse the superpositions
to create the precise momentary static state we measure in the present moment. One
could say that Penrose has therefore already found a form of quantum gravity.
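Penrose's estimate for the lifetime of such a superposition is often quoted as τ ≈ ħ/E_G, where E_G is the gravitational self-energy of the difference between the two superposed mass distributions. The sketch below is only an order-of-magnitude illustration under crude assumptions (a uniform lump displaced by roughly its own radius, so E_G ~ Gm²/r); the real experimental proposals are far more careful.

# Order-of-magnitude sketch of Penrose's gravitational collapse time,
# tau ~ hbar / E_G, for a small lump superposed "here" and "one radius over".
# Crude assumption: E_G ~ G * m^2 / r for that displacement.
G    = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
hbar = 1.055e-34     # reduced Planck constant, J*s

def collapse_time(mass_kg, radius_m):
    e_grav = G * mass_kg ** 2 / radius_m   # rough gravitational self-energy
    return hbar / e_grav

# A fullerene-sized molecule vs. a speck of dust (illustrative numbers):
print("C60-ish molecule:", collapse_time(1.2e-24, 0.5e-9), "s")   # ~ millions of years
print("1-microgram speck:", collapse_time(1e-9, 1e-5), "s")       # ~ tens of picoseconds
# The heavier the superposed object, the faster it should settle into one reality.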
In other words, quantum mechanics and general relativity are complementary, not
contradictory. They are essential to create, using Penroses insight, an absolutely real
moment by collapse/decoherence of the wave function. Why absolute? Because, as I
will argue later, the moment of the collapse is fixed and immobile in all four dimensions: an absolute rest frame, as we shall see, a 3D slice of spacetime, minimal in thickness along the time axis.
For example, a photon from a distant galaxy hits (interacts with) the Rock of Gibraltar.
The wave function, in order to continue evolving, would now require the rock/particle
combo wave function to be inexact in position. The universe, as we all know, will not
tolerate something as big as that rock + particle, being partially in two places at once,
certainly not millions of light years apart. So, the particle now has to be where the rock is,
because the rock, due to its mass, has a lot less indeterminacy of position than the
particle (see the de Broglie wavelength below). Thus, the particle/rock superposition has to collapse into reality, there and then, changing the rock's state slightly. How big does the
rock or whatever the photon hits have to be? Nobody knows, but until recently the
biggest objects maintained in a superposition (to my knowledge) were 60-carbon
buckminsterfullerene molecules, totaling 720 hydrogen atom masses. Interestingly,
someone (Iskhakov, Agafonov et al. 2012) has recently maintained as many as 100,000
photons in a superposition; but then, they are massless! Crucially, experiments are ongoing, attempting to superpose larger masses to determine the upper limits to such
superpositions before self-decoherence occurs (Brooks, M. 2015).
Why does this matter? The current theory for the creation of the universe is by means of a quantum fluctuation, by which some very small mass very, very briefly came into "virtual" existence. Penrose's theory suggests that if such a virtual mass were large enough, it might be sufficient to collapse its own wave function into actual reality, causing the Big Bang. Now we have a Creation mechanism. Ongoing and planned experiments will test this theory: not to attempt to create a new universe, but to see if this self-collapse, better known as the self-decoherence mechanism, is real.
Today, Feynman's sum-of-all-paths approach is used throughout physics. It is indeed, as a practical matter, indeterminate, in so far as it cannot exactly predict individual microscopic events, even while making highly accurate calculations of quantities and explanations of macroscopic events, for example light reflection by mirrors (Feynman 1985). Again, this is not a contradiction. The universe contains many fixed quantities and parameters, such as the gravitational constant, that are unchanging, everywhere affecting events in predictable ways, but not absolutely defining all possible events. By analogy, if your car has a chassis 11 inches above road level, you can't drive over a rock 13 inches high, but your choice of route and destination is only limited, not eliminated, by that restriction. This analogy also holds for every Law of Nature derived from invariances/symmetries, with the caveat that the car, the road, the boulder and the driver are also products of the entirety of the symmetries/invariances.
Parenthetically, any tiny amplitude component consisting of an "impossible" path improves a quantum path-integral calculation's accuracy. The impossible components include tiny amplitudes to move backward in time. This means that the final probabilistic answer to the calculation, via the sum of the squares of all amplitudes, must contain some contribution from this strange amplitude. Richard Feynman concluded that all particles have an amplitude to move backward in time. This is unavoidable in the path-integral approach to quantum mechanics. Later, we will see that this concept is crucial to my time-symmetry approach.
This essay accepts as indisputable experimental fact that Penrose's objective mechanism for wave function collapse is real, ubiquitous, and the foundation of reality. The most popular alternative view that has survived is the Many Worlds Hypothesis, invented to avoid the philosophical angst many scientists and philosophers feel about wave function collapse, and/or randomness and indeterminacy. Many Worlds disciples propose that every interaction, or any change in the whole universe, even one change in a single tiny neutrino's state, no matter how minuscule or macroscopically insignificant, splits off an entire new universe. This would produce huge numbers of new universes every femtosecond. All of these hypothesized universes are considered fully deterministic. As none of these universes (including our own) is any more real than the others, or detectable by the others, this notion is incapable of disproof according to its own terms. So it cannot be science, according to Sir Karl Popper's criterion. I regard the Many Worlds Hypothesis as the most extreme violation of Occam's Razor, the Principle of Minimal Assumption, in all
history. It raises far more questions than it answers. As we see later, wave function collapse is the only mechanism that can provide a unique moment of time and a means to cement evolutionary changes into reality, the process that allowed us to eventually exist. The Many Worlds Hypothesis is the opposite of an explanation. It is merely an excuse to allow denial of wave function collapse into a single universal state, and thereby to "solve" the illusory measurement problem. Hopefully, it is a sign of such philosophical desperation that an imminent philosophical capitulation is predictable: acceptance of the reality of the wave function collapsing into the momentary unique reality, according to Penrose's gravitational decoherence of the wave function.
The Ancient Greeks Leucippus and Democritus concluded that the act of dividing masses could not go on forever; there was a minimum size, the atom, separated from others by the void. We now know the universe is almost all vacuum (the Greeks' "void"). If physical reality somehow failed to keep the electrons in atoms so very far from the nucleus, our planet and everything upon it would collapse into a brightly shining ball of neutrons less than a mile in diameter. Consider, then, that until recently few seem to have considered spacetime itself in the same atomized or quantized way as matter.
The resulting assumption that space is continuous predicts that a singularity should
exist at the center of a black hole, where the mass is concentrated in zero space at infinite
density. This ought to give us pause. When infinity appears in physics it means you are
wrong somewhere. Similarly, continuous spacetime implies that the Big Bang started at a
singularity, despite the fact that physics fails at singularities.
This essay, conservatively, relies on known physics being valid everywhere, at the origin
or inside black holes. That favors one official candidate for quantum gravity, Loop Quantum Gravity Theory (LQGT), which avoids singularities by providing a minimum
interval of spacetime. Consequently, some form of a discontinuous structure of spacetime
arguably is real, hopefully explained in detail in future by extension or development of
this theory. As we shall see later, a clearly quantized version of time is consistent with
the Einstein equations of spacetime, and can be integrated with LQGT to provide a
symmetry for time and a mechanism for the present moment.
In a continuous spacetime there is no lower limit to size. I suggest this is a core
philosophical error: using apparently continuous deterministic mathematical formalism
(which assumes such infinite divisibility) to describe the discontinuous universe that
LQGT proposes. Calculus assumes that the quantities integrated are extremely small at
the limit, beyond measurement, yet finite enough to still retain their characteristics such
as dimensions like mass, velocity and position. I would argue that these so-called
infinitesimal quantities must be finite to preserve any useful characteristics, because a
true infinitesimal (anything finite divided by infinity) is indistinguishable from zero
values of magnitude, dimensions and indeed, all characteristics. True infinitesimals then,
are indistinguishable from nothing, unsurprising since we arrive at infinity by dividing by
zero. Yet summing many such mathematical, calculus infinitesimals results in a
meaningful integral. Integrals are ubiquitous in mathematical physics and they work
beautifully.

The problem here is clear. We have an assumption of continuous spacetime, but we calculate by what is, in a fundamental way, discontinuous mathematical logic, by using a finite but extremely tiny limit to spacetime subdivision in calculus. An interval on the
inconceivably tiny Planck scale (to be explained later) is certainly a small enough break
to be indistinguishable at our normal macroscopic level from a continuous reality.
Calculus, then, is conceptually a true representation of the world, summing real
spacetime Planck intervals, not an approximation. No wonder calculus works so well in
physics! Discontinuous spacetime has profound implications, not least in elucidating the
origin and nature of the present moment and the arrow of spacetime. The Planck scale
will be very important later.
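The point about calculus summing finite rather than truly infinitesimal pieces can be illustrated directly: a plain numerical sum of very small but finite steps reproduces the analytic integral to any practical accuracy. The function and the step counts below are arbitrary illustrative choices.

# A finite-step (Riemann) sum converging on the "continuous" integral of x^2
# from 0 to 1, whose exact analytic value is 1/3.
def finite_sum(n_steps):
    dx = 1.0 / n_steps
    return sum(((k + 0.5) * dx) ** 2 * dx for k in range(n_steps))  # midpoint rule

for n in (10, 1_000, 1_000_000):
    print(f"{n:>9} finite steps: {finite_sum(n):.12f}")
print("analytic value:            0.333333333333")
# Finitely small steps are indistinguishable, in practice, from a continuum.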

Symmetries, or equivalently, Invariances


The basic principle controlling the possible range of events is usually known as
symmetry, perhaps better understood in the alternative terms, a series of invariances,
constraints that require that some abstract physical operation be indistinguishable
between before and after states. That is, the physics is the same (indistinguishable,
invariant) before, during and after the operation. This is a very subtle notion, implying
universal and permanent rules. To say there can be no detectable change in the rules
during and after a physical operation is to say that those rules apply for all times and
places- leading to the Laws of Nature. These symmetries/invariances create both the
particles and the rules they obey, as properties of the vacuum (Icke 1995, Lederman and
Hill 2004). This apparently strange concept is all-powerful.
Symmetries such as general and special relativity are two of the perhaps 248 symmetries that provide the laws and forces of nature and also produce the particle actors upon the universal stage. Ironically, Einstein's relative fame is the only operational distinction between his two new symmetries, special and general relativity, and all the other symmetries. This point of relative fame ironically serves as a means of introduction: Einstein was world-renowned and won the prize, but his largely ignored greatest female contemporary in the field, Emmy Noether, despite massive, entrenched anti-female prejudice in the German academia of her day, generalized all the symmetries to produce the Conservation Laws, otherwise known as the Laws of Nature. Her accomplishment should have given her name similar renown today.
The first thing we need to consider in the misunderstanding of the symmetries is the
survival of the conceptual apparatus of Classical physics as a pervasive metaphor despite
a current physics that comprehensively denies the fundamental Classical precepts. Using
19th-century assumptions has delayed full understanding of 21st-century physics. Classical determinism is Einstein's view that, with sufficient information on the laws and the current physical state, one could perfectly predict all future states, so that events form a solid block in time. There would be only one future outcome from any initial physical state. It also includes David Hume's view that causality requires coincidence in both time and space. The first of these, Einstein's assumption of determinism (but certainly not his theories, his symmetries!), has been experimentally and theoretically refuted as described
below. We will agree with Hume in a certain manner, expanding beyond the classical
representation of the collision and rebound of billiard balls. We will simply demonstrate
that all causes precede or are simultaneous with effects, equivalent to an outward arrow
of time from the origin.
Why are Classical physics prejudices so powerful, even today? The end of the 19th C was
felt to be the pinnacle of scientific achievement, with only a few issues remaining
unresolved. However, those issues became the main substance of 20th C science, and led
to the comprehensive demolition of the Classical worldview. We must try to imagine the
mist of ignorance and nonsense that had been dispelled by the end of the 19th C and the
turn of the next century. The planetary orbits, the operation of steam engines, electricity
and magnetism had been codified, and even biology, chemistry and medicine, perennially
full of magical notions, had been put into an initially scientific and effective form. All
were based on deterministic models, after centuries of struggle against superstitious,
superficial, or magical interpretations of reality. Deterministic triumphalism seemed
justified at last.
But, Classical physics is dead. We persist in thinking of particles and the laws of nature
although these ideas are simply 19th century classical descriptive survivals.
Macroscopically, classical deterministic ideas are a useful approximation in the limit of
large masses, big distances and ambient to high temperatures. This is of no significance
fundamentally or philosophically, because the universe operates moment to moment
entirely on a microscopic level, even to create macroscopic emergent effects. We must
not allow these ancient ideas to continue to distort our perspective. The modern view to
which I subscribe is that the macroscopic, seemingly Classical world of large warm
objects which we inhabit is emergent from an underlying quantum reality. It is this
transition upward in scale from quantum reality we will explore.
First, consider the symmetries. All experimentally known particles are both produced by,
and obey, the rules created by the Standard Model (Elert 1998-2014), which is a non-abelian gauge theory with the symmetry group U(1)×SU(2)×SU(3). This is the truly revolutionary concept of the Standard Model in particle physics. Moreover, as we will see later, the whole corpus of relevant universal symmetries is likely to lie within and, I suggest, completely fill the Lie group of E8 symmetries (Lisi 2007, Lisi 2008). As
described above, a physical symmetry, or alternatively, an invariance, constrains the
possibilities available in reality. We need to remove ourselves from the ordinary world of
visible objects, where we make naïve philosophical analogies between some of them and
the fundamental elements of physics. We need to think upward from the micro-world to
understand the macro-world, not the traditional reverse.
We think of a particle, but it is only a manifestation of certain symmetries/constraints
in the Standard Model, as part of the E8 symmetry group, that constrain its nature and
behavior. It is a spinning, multiply attributed, twisted knot, part of, and not separable from, our very peculiar vacuum. This will be significant later. Different constraints give a particle entity that behaves differently; that is all we can know. It is not a tiny billiard
ball with exotic properties; it is only and always a manifestation of universal rules of
behavior, neither entirely a wave nor a particle. All we can know are the invariances,
which we persist in trying to imagine as waves/particles. The wavicle picture is a
Classically-derived hybrid idea, a metaphorical image only.
For a simpler example, consider rotational symmetry, which among other things means
that a uniform unmarked cylinder when rotated about its axis is invariant in that we
cannot distinguish its final position from its initial position after an arbitrary amount of
rotation. Noether's Theorem showed that these symmetries create conservation laws, which are, broadly, the laws of nature. Hence, when a spinning ice-skater brings her arms or legs closer to her body and thereby speeds up her rotation, she is obeying the law of conservation of angular momentum, which is Noether's result from rotational symmetry. "Obeying" is probably a poor choice of words, since she has no alternative to that result, once she chooses to move her arms or legs at all.
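A minimal numerical sketch of the skater example, with made-up moments of inertia: the conservation law fixes the product I*omega, not the skater's decision about when, or whether, to pull her arms in.

# Conservation of angular momentum L = I * omega (illustrative numbers only).
I_arms_out = 4.0     # skater's moment of inertia, arms extended (kg*m^2, assumed)
I_arms_in  = 1.5     # moment of inertia with arms pulled in (assumed)
omega_out  = 2.0     # initial spin rate, revolutions per second

L = I_arms_out * omega_out           # fixed by rotational symmetry (Noether)
omega_in = L / I_arms_in             # the spin rate she must end up with

print(f"angular momentum held constant: {L} kg*m^2*rev/s")
print(f"spin speeds up from {omega_out} to {omega_in:.2f} rev/s")
# She chooses whether and when to pull her arms in; the symmetry only dictates
# that, whatever she does, I * omega stays the same.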
Imagine that the Ten Commandments or the rules of soccer were not mere moral
imperatives or rules of the game about which personal decisions could be made whether
to obey them or not, but absolute, unbreakable laws of nature. The particular sins or fouls
could not exist - a force would prevent them. These rules themselves could be derived
by deterministic logic, but vast possibilities of life choices or sports events would still be
possible within these much narrower constraints; some would argue that better soccer
games might result. The hundreds of universal symmetries enable, constrain and allow
the possibilities, but they do not absolutely determine the resulting course of events in
time. There is ZERO experimental evidence for determinism beyond this limited sense.
The symmetries supply the mechanical design and particles, allowing the quantum
changes to create the course of events in time, but like a mechanical phonograph the
choice of actual tune played is not fixed.
The experimental fact is that symmetries/invariances merely allow or disallow certain
actions; they do not completely determine when, how or whether they occur. The
invariances both constrain particle interactions and allow/create their existence. They
enable or prevent certain classes of particle behavior, not the everyday details of life.
The distinction from the ordinary Ten Commandments is that invariance sins are
impossible. The spinning ice-skater is free to decide whether or not or in what manner to
extend or retract her arms or legs in order to win or lose the Olympic Gold Medal, but the
invariances will keep her angular momentum constant, whether she earns a ten or a one
from the judges. This means that the Laws of Nature (really, derivative from the
symmetries by Noethers Theorem) do not absolutely determine events, they provide the
rules and also the active entities (particles) by which only events compatible with
these rules may proceed. This state of affairs is not Classical determinism as usually
understood, where only one future state after a given time interval is possible from a
given starting state, because, in the same way as the Ten Commandments, vast crowds of
future possibilities may occur, equally compatible with these rules. The E8 symmetries
operate as if the Commandments, the world and the humans who were supposed to obey
them were all part and parcel of each other. Moses, like scientists, got to announce the
existence of the rules, not negotiate their terms. We must never forget the vital
distinction between the symmetry view of constraining events within a range of
possibilities, and at the same time enabling them by providing the particles and forces,
versus the Classical view of absolutely determining events by mysteriously provided laws
and particles.
Rockets fly through empty space, but they may as well be trains on rails in reality, gripped tightly by gravity, curving around the planets and the sun, in orbits calculable to inches, the consequence of a symmetry: general relativity (technically, diffeomorphism invariance). If an astronaut chooses to do so, he can expend some fuel, boosting himself to join another orbit, obeying similarly operating, but different, invariance rules from those governing the ice-skating Olympian. Why should we accept that such symmetries completely eliminate his or her choices?
So what is the role of quantum mechanics in our picture of the universe? Invariances
constrain reality - only certain kinds of particles can exist and invariances both create
them and limit their range of behaviors both microscopically and macroscopically (the
Laws of Nature). The theatre for this universal movie is the volume, in one time dimension and three space dimensions, plus mass-energy, in which all this may happen, but need not happen. If the characters, the particles, are to act on this universal stage, by what means are they to "strut about", as Shakespeare might put it? They will find themselves in certain states of position, velocity, spin, quantum excitation, etc., and their next state will be found by a quantum mechanical summation of all paths to any next state of the particles. It will be strongly related to, but not precisely determined by, the input state(s).
Last, the exact magnitudes of certain observed but inexplicably sized constants and free parameters determine whether technologically advanced life is or is not possible in a universe, in terms of the four other components of reality above. The macroscopic, emergent, deterministic but unpredictable mechanisms, chaos theory and self-limiting criticality, amplify quantum indeterminacy to produce an open future for reality as a whole. How the extremely narrow and precise magnitudes of these constants and parameters were "chosen" from a hypothetically huge range of life-hostile variants, against huge odds, to make us, the observers, possible, is called the Fine-Tuning Problem (Leslie 1989).

Spacetime and the Present Moment


Next, consider space, time and mass-energy. Einstein showed us that none of these exists individually. There is no time without space, only space-time; no mass separate from energy, only alternate forms of mass-energy; and in fact no space-time without mass-energy, and vice versa. So we need to think of space-time-mass-energy, or STME for short. If mass appears, so does spacetime. So there can be no mere "arrow of time". All mass-energy spreads outward in space and time in four dimensions from the origin. We know that space is expanding (increasing in three dimensions). But there is no space without time, so time must also increase, outward from the origin, in its dimension. All STME expands outward from the origin. We ride upon this arrow of space-time-mass-energy. I believe we steer this arrow into the future, partially and locally, via free-ish will, as we shall see below.
Einstein was clear that STME is a unity, but was surprised when it was discovered that
the universe is expanding in space, even though his equations in their original form
predicted that. His view was that spacetime was a solid four-dimensional block, with no
distinction as to past, present or future, totally determined forever along the axis of time.
This picture is fine for all moments at least one tiny Planck time (see later) before NOW.
The suggested difference in this essay (and, I argue, in reality) from Einstein's view is the utter non-existence of the past or future; existence is only the present, the result of the past. While there is only one past history, consistent with Einstein's image, this does not imply there can be only one future. Simply, all futures must be consistent with that past,
and the past and future must be consistent with the symmetries. Every moment is the seed
of the next; the quantum processes, amplified by chaos and self-limiting criticality allow
a range of possibilities as to how that seed shapes the future.
What if we accept the experimental evidence of the intertwining of the symmetries and
quantum mechanics at every level? Then there must be a deeper explanation for any
significant asymmetries in Nature. Physicists are right; time must somehow be
symmetric. But we can't deny facts, such as the glaringly obvious forward arrow of time,
and the unique present moment, in any such explanation.
This was the basis of Einstein's great misunderstanding. He assumed determinism, and he assumed that there was no absolute rest frame or special present moment. These assumptions were untrue; but both were conceptually useful, and mathematically simplifying, for his discoveries. The fact is, Lorentz Invariance (Special Relativity) and Diffeomorphism
Invariance (General Relativity) were two new invariances. They are constraints, not
controls, just like all other invariances in physics. There appear to be no reasons to
especially elevate them as incompatible with quantum mechanical indeterminacy. Since
symmetries are universal, either all invariances are in conflict with quantum mechanics,
or none of them are.
The great confusion here is to believe that symmetries, deterministic in mathematical
formalism, thereby compel reality to be so. To calculate this skeleton from which reality
is built is far from determining the fate of the body of reality as a whole. Symmetry
allows or disallows limited aspects of events that Classical physicists and philosophers
believed, wrongly, were completely determined. Total, Classical determinism was and is
impossible as a practical matter, as we shall see below. It is not a paradox to be able to
use deterministic, logical mathematics to correctly calculate quantities in a
microscopically indeterminate universe. It is the genius of great physicists like Dirac and
Feynman to be able to do so. The logical error is to assume that a deterministic
mathematical process correctly calculating an isolated quantity is tantamount to proving
that the sequence of events (i.e. reality) to which the quantity relates is also deterministic.
A "quantity", for this discussion, is defined as any measurable magnitude that can be expressed in a mathematical formula.

Gravitation has so far resisted traditional methods of quantization because Einstein was
right; it is purely a result of one of the symmetries. It does not need the graviton, the
theoretical quantum gravitational force-carrying particle, to do its job. Einstein's curving of spacetime by mass is entirely sufficient. Interestingly, even if the graviton, an entirely theoretical massless spin-2 boson, behaved as believed, there is no feasible way to detect it experimentally, or to disprove its existence. This failure by Sir Karl Popper's criterion (the requirement in science of the possibility of disproof) would consign the
graviton to fantasy, not science. Absent a graviton, and with general relativity working
entirely satisfactorily, why do we need the elusive quantum gravity theory as it is usually
considered?
In answer to that question, I say we do not need a new theory of quantum gravity, if we
consider spacetime itself as quantized. In that light, gravity does not need to be quantized.
Gravity is simply the result of Diffeomorphism Invariance, the constancy (invariance) of
physical law even when spacetime geometry is curved by the presence of mass-energy.
There is no need for a theory of quantum gravity, only a quixotic desire to fix a non-existent philosophical determinism problem. The genuine question, to me, is whether
spacetime itself is continuous or discontinuous, i.e. quantized, being composed of distinct
minimal quanta of spacetime intervals. This will be explored later.

Toward a Consistent 21st Century Physics Free of 19th Century Philosophical Prejudice

Let's explore the two emergent phenomena on the macroscopic scale: Chaos and Self-Limiting Criticality. Please keep in mind that, for this essay, "determined" means strictly Classical, absolute, inevitable fate, Einstein's opinion.
First, Chaos is a consequence of sensitive dependence on initial conditions. All Chaotic
systems deterministically amplify tiny differences to create large, unpredictable effects.
So we get mathematically deterministic, but actually unpredictable phenomena.
(Remember that determinism works fine as an approximation, for most large, warm,
unintelligent objects rumbling around in the world.) The famous example is a single butterfly's wing-flaps changing the weather. The Chaos effect is in fact immensely more powerful than that. Consider the tiniest imaginable disturbance to our planet: removing one electron's mass from some mass at the visible edge of the universe, 13 thousand million light years away. It has been calculated (Ruelle 1991) that on arrival
here, this unimaginably weak gravitational effect on our atmosphere would measurably
alter the weather in two weeks, though the actual nature and magnitude of the effect
could not be predicted! Chaos is a phenomenon where however tiny and precisely known
the input of a change to a state of the world, the eventual resulting larger-scale changes
cannot be precisely calculated. Chaos Theory means that many important macroscopic
systems are both deterministic in mathematical formalism and unpredictable no matter
how accurately we know the initial conditions or the input change and how exact are our
attempted calculations of the result. The effect is to massively amplify tiny effects of
quantum indeterminacy, to a greater and greater extent over time, to defeat deterministic
projections of past events and to create significant differences in alternative possible futures.
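To make that sensitivity concrete, here is a small illustrative sketch in Python. The logistic map is a standard textbook example of Chaos, not anything specific to this essay, and the starting values are arbitrary choices of mine. Two trajectories that begin a mere trillionth apart soon bear no resemblance to one another, even though every step is computed by the same deterministic rule.

def logistic(x, r=4.0):
    # One step of the chaotic logistic map: x -> r * x * (1 - x)
    return r * x * (1.0 - x)

x_a, x_b = 0.400000000000, 0.400000000001  # initial difference of one part in a trillion
for step in range(1, 61):
    x_a, x_b = logistic(x_a), logistic(x_b)
    if step % 10 == 0:
        print(f"step {step:2d}:  A={x_a:.6f}  B={x_b:.6f}  |A-B|={abs(x_a - x_b):.2e}")

After a few dozen steps the two runs differ by an amount as large as the values themselves, which is the whole point: determinism in the formalism, unpredictability in practice.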
The second, Self-Organized Criticality, is clumsily named. For short and for better illustration I prefer avalanche theory. It is as ubiquitous as Chaos Theory and similarly deterministic mathematically. If grains of sand, slightly different as they inevitably are in size, shape, density, etc., are added to a pile at a certain rate, at some point the pile will collapse in an avalanche. It is impossible to calculate or predict exactly when, or in what size, such an avalanche will occur. Instead, avalanche sizes follow a power law: one ten times as big as avalanche X occurs roughly ten times less often than X-sized ones. Such power
laws, not necessarily integer-valued, apply for stock market declines, earthquakes and
many other large and small-scale phenomena. Criticality is the point at which collapse
is imminent, when any tiny addition will start the avalanche or a last added stress can
cause the earthquake fault to slip. It is mathematically distinct from Chaos, but the two can work together. In contrast with Chaos, it is the result of macroscopic amplification of small but somewhat variable sequential additions to a state, causing an
incalculable, vastly larger change. A fine example is the effect of a loud sound starting a
massive snow slide by just one or a few snowflakes being dislodged and falling
downslope, quickly snowballing into an avalanche, with possibly huge practical
consequences.
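A toy numerical sketch can illustrate the power-law rule just described. The 1/size-squared distribution used below is a common sandpile-like choice and is purely an assumption for illustration, not a measured exponent; what it shows is that each decade of avalanche size occurs roughly ten times less often than the decade below it.

import random

random.seed(1)
# Draw 100,000 avalanche sizes from a 1/size^2 density (all sizes >= 1).
sizes = [1.0 / (1.0 - random.random()) for _ in range(100_000)]

# Count how many avalanches fall in each decade of size: 1-10, 10-100, and so on.
decade_counts = {}
for s in sizes:
    decade = 0
    while s >= 10.0:
        s /= 10.0
        decade += 1
    decade_counts[decade] = decade_counts.get(decade, 0) + 1

for d in sorted(decade_counts):
    print(f"sizes 10^{d} to 10^{d+1}: {decade_counts[d]} avalanches")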
Avalanche theory is another unpredictable and extraordinarily sensitive (but
deterministic in its mathematical formalism) emergent large-scale phenomenon. Like
Chaos, it amplifies any tiny quantum indeterminacy, driving a massive proliferation of
alternative paths for quantum mechanics to eventually collapse into one unique reality at
the present moment.
So Einstein must be wrong about prediction. But determinists would argue that everything is
still absolutely inevitable despite the impossibility of prediction. Their problem is
Heisenberg Quantum Indeterminacy, not Heisenberg Uncertainty - an outmoded
terminology. It is not that we have an uncertain measurement; the actual reality is inexact
and to some degree indeterminate. The smaller the object the greater is the indeterminacy
as a percentage of the energy and time, or of position and velocity. This much is
experimental fact. So David Hume's billiard balls, even if impossibly perfectly spherical
and built of impossibly identical numbers of identical atoms in identical relative
positions, would still have extremely tiny inexactitudes in initial and continuing position
and velocity, and after bouncing and re-colliding many times would deviate from
Newtonian paths according to Chaos Theory plus de Broglie/Heisenberg Indeterminacy.
The significant increase in indeterminacy for small objects is seen via the de Broglie
wavelength, lambda, which can be thought of as the combined indeterminacy of position
and velocity.
λ = h/mv
Here h is Planck's constant of action, and the momentum is the mass, m, times the velocity, v. Planck's constant is extremely tiny (about 7x10^-34 Joule-seconds; energy multiplied by
time, or action, to a physicist). It is divided by the momentum, by comparison a very large number for heavy objects such as billiard balls. However, for liquid molecules, and small
particulate solids like tiny sperm cells, their momentum is small enough to lead to
significant indeterminacy of position and velocity, especially over increasing time.
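A rough numerical comparison makes the scale of the effect plain. The masses and speeds below are illustrative values I have chosen, not measurements; only the formula λ = h/mv comes from the text above.

h = 6.626e-34  # Planck's constant, joule-seconds

# (name, mass in kg, speed in m/s) -- illustrative values only
objects = [
    ("billiard ball",  0.17,    1.0),
    ("bacterium",      1e-15,   2e-5),
    ("water molecule", 3.0e-26, 600.0),
]

for name, mass, speed in objects:
    wavelength = h / (mass * speed)  # de Broglie: lambda = h / (m * v)
    print(f"{name:14s}  lambda = {wavelength:.2e} m")

The billiard ball's wavelength comes out absurdly smaller than any atom, while the molecule's is comparable to its own size, which is why the indeterminacy matters only for the small and light.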
The single, unique sperm cell that fertilized your mother's egg, like that of every one of
the billions of our ancestors, was buffeted along the way by quadrillions of moving liquid
molecules, (Brownian motion, earlier explained in this way by Einstein!), and each
collision was significantly de Broglie indeterminate in the amount of its path-disturbing
effects. So did the best men or women win? Each of the billions of fertilization events
making our ancestors was like a horserace through a swamp, in a whirlwind, with a few
hundred million participants of differing abilities starting at slightly different places and
times. Would Einstein wager deterministically on who wins that race to conception?
Calculated this way, how likely were you to exist, a priori? On the level of a miracle, I
would suggest. Avalanche theory therefore makes things even worse when attempting to
calculate particle paths forever, now multiplied by chaos and minuscule quantum
indeterminacy. Although at the macroscopic level for short time periods, indeterminacy is
exceedingly small, it is not zero. As we saw above, Chaos and Avalanche Theory cause huge amplification of any, however tiny, quantum inexactitude over time.
The problem is that the Laws of Nature do not change; they are the same now as at the
origin, in the macro and micro world. So, if we were following Classical determinism, we
must be able to reverse a completely deterministic history, if only we knew exactly all
details of today's particle positions and motion. That is, if Einstein were right about this, he would be equally right about any point in the past, so that to predetermine, say, his life, or indeed any known series of events after the origin, every particle at the
Big Bang would have had to be in a position fixed to less than a quintillion quintillion
quintillionth etc. of their tiny diameter to determine everything up to his birth and
beyond. Why? Because the universe had to expand, massively amplifying any particle
positional errors at the origin, and every moment since. Additionally, Chaos and
avalanche theory multiply any original and continuing indeterminacy. So determinism
requires absolutely exact identification, selection and positioning of every particle at the
origin to produce our world as we see it.
To the contrary, we know that such a feat of precise selection at or near the origin, or
anywhere else, to determine Einsteins or your existence is impossible according to
quantum mechanical indeterminacy at the origin and in every moment since. It is even
more ludicrous when we actually contemplate organizing the swirling, turbulent mass
of identical quarks and gluons at billions of degrees in temperature near the origin of
spacetime. How could a Maxwell Demon-type God select, segregate and herd the
necessarily separate, individual (but indistinguishable) quarks into exactly correct
positions to determine the entire future path toward Einstein's existence and precise life
history thirteen thousand million years later? Seriously?
There is no physics for that type of outcome, and I confidently predict that there never
will be. Einstein was neither an accident nor an inevitability; he was just one of a vast
number of possibilities at the origin. So are we all. Darwinian evolution is the only
credible means of arriving at an Einstein, you or me, from a bunch of small molecules on
Earth three and a half thousand million years ago, or from the extreme temperature soup
of quarks and gluons at the origin. That means social, economic, artistic, political, and
technical evolution, all in a modified-by-human-activity, quasi-Darwinian sense, not just
the usual biological sense. This requires a multiplicity of possible paths to the future to be
available at the origin, and every moment since, in order to multiply the effects of
improvements, over time. Even God, if He obeys His own rules of physics, could not
manipulate the quark/gluon soup in so precisely the right way to make Einstein or you,
the reader, inevitably exist 13 thousand million years after the origin. All of us were
absurdly unlikely, and are miracles in that sense, and so were all our ancestors.
Determinism must argue that we were inevitable, essentially predestined. Nothing here
denies that there is now only one single historical path from the origin to now, for every
particle in the universe. Nor does it imply Leibniz's or the fictional Candide's idea of a best of all possible worlds. It is simply that vastly varied other events were inherently
possible, but did not happen. This is necessary to allow Evolution in the broadest sense.
Determinism = total predestination from the Big Bang to now = impossibility

Digression: A Philosophical Application of 21st Century Physics



Many believe that the brain is a giant avalanche system constantly in criticality at
some very high fraction of its constituent neurons (De Arcangelis, Perrone-Capano et al. 2006). Because each of these billions of neurons on average is
connected to thousands of others, we must consider an arrangement of
sandpiles. Many sandpiles are connected to thousands of others, like hugely
multidimensional dominoes. Every domino's topple can topple many others. So
each criticality incident at an individual neuron will propagate widely, tipping
many other connected neurons up to or beyond criticality. Further, all of the
billions of neurons are awash in a multitude of signal molecules and chemical
stimulants or depressants. These chemicals and their cellular receptors are subject
to quantum indeterminacy and quantum tunneling which affect the chemical
reactions, and hence the timing and scope, of every neuronal critical incident.
(Quantum tunneling is a quantum effect whereby a reaction can occur more easily
or often despite a high energy barrier otherwise hindering that reaction. It is a
form of especially gross chemical indeterminacy). Consciousness is likely to be
closely related to all these phenomena in a brain structure that in this way cannot
be deterministic in the Classical sense.
If physics drove philosophy, we could define free-ish will as incompletely
deterministic decision-making, which is inclusive of, but not fully determined by,
outside operative inputs. This could tend to improved outcomes from the
perspective of the individual or group. Free-ish will has obvious survival value in
encouraging social and cultural evolution. For evidence of the evolutionary utility
of free-ish will, just look at the historical under-performance of societies,
religions, or politico-economic systems that attempt to force people to think alike or not at all.
The vital issue here is to accept the massive contribution of genetic, cultural, and
personal history in a decision, but to always be aware of the final role of the
individual consciousness. If one were trying to design a biological machine for a
ghost to inhabit, the human brain could not be beaten. No matter how
ephemeral the hands on the steering wheel of consciousness, or how truly weak
the will may be in proportion to brain inputs, the brain is the ideally sensitive
vehicle to drive due to quantum indeterminacy and avalanche theory. Doesn't
the massive contribution of genetic, cultural, and personal history above simply
define the individual? Why would it absolutely restrict his/her free(-ish)dom to
decide, given the biophysics above?
We have to consider that consciousness is an integrative phenomenon that
requires a finite and variable time to input, process and summarize sensory and
intentional data. This would make a psychological present. For example, one
could process hearing the sound of one's bare toe painfully tripping over a bell in
about 2 milliseconds. Seeing the bell, in the sense of integrating and interpreting
the visible stimulus as a bell in a dangerous position would take about 200
milliseconds. The pain or touch signals would arrive via the nerves more slowly.
The reflexive motor reaction, too late to avoid kicking the bell, but trying to avoid
falling over, might start before the pain signal arrived and was processed at the
brain. One would experience this entire sequence as simultaneous, and would then
perhaps decide to tidy up the area more effectively in future. Our effective
moment of the present is in fact quite wide and varies from about 500
milliseconds to 3 seconds as measured in different experiments. Benjamin Libet's
measurement that the brain begins to form the motor signal for an action before
the conscious intent is aware in consciousness does not deny free-ish will,
because this readiness potential was present regardless of the decision
(Ananthaswamy 2013). It can also be argued that the 200 or so millisecond gap is well within the brain's window of subjective simultaneity. Neither can be said to be "first" (Spinney 2015).
So much, then, for Classical determinism versus Chance and Evolution. What about
Hume and causality? When two quantum entities are produced in some interaction, they
are entangled. They do not have their own separate quantum states; they are interrelated,
sharing at least two quantum states between them, only provided that neither of their
wave functions has collapsed yet. Essentially, they share a combined wave function. The
basic example is two entangled, physically separated photons and their opposite spin
states. If one entangled photon has one of its only two possible quantum spin states when
measured at laboratory A, the other at laboratory B will instantaneously have the other
available spin state no matter how large the distance between labs A and B. Nobody had
any prior knowledge of which spin state As photon would turn out to have when
measured - it would be absolutely random, as expected according to quantum mechanics.
Yet lab B's photon would then always, instantly, have the other spin direction. All this
has been confirmed repeatedly over great distances and is not in dispute. Scientists have
been unnecessarily struggling with this "spooky" quantum action at a distance, as Einstein
put it, for most of a century. Why that struggle was unnecessary is the subject of the next
section.

21st Century Physics and the Present Moment


So what is causality? It must be extended so that it is not limited by Hume's proximity-in-space criterion, but only for entangled quantum objects. The two entangled photons
coincided in time, not necessarily in space. Because their constraint to have different spin
quantum numbers was instantaneous, there was no difference in time. Once one of the
pair was fixed in one spin orientation by its interaction with some mass, the other was
immediately forced into the opposite spin, no matter how distant from the other. The
cause did not precede the effect, but was simultaneous. So a cause can precede or be
simultaneous with an effect, but it cannot be later. This will be of great significance
below when considering the universal versus the subjective present moment.
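A deliberately minimal sketch in Python can illustrate the bare correlation being discussed. It is a classical toy, not a simulation of the full quantum statistics of a Bell test: each outcome at lab A is genuinely 50/50, yet lab B's outcome is always the opposite, whatever the separation between the labs.

import random

random.seed(0)
for pair in range(5):
    spin_a = random.choice(["up", "down"])        # lab A: a genuinely 50/50 outcome
    spin_b = "down" if spin_a == "up" else "up"   # lab B: instantly the opposite
    print(f"pair {pair}: A measures {spin_a:>4}, B measures {spin_b:>4}")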
This profound quantum-based alteration in how we define the real basis of causality is
key to reconciling all sorts of apparent contradictions. How? By explaining the present
moment and the arrow of spacetime. Einstein once attempted to console the widow of his
friend Besso (Fölsing 1997) by saying she should not grieve because her husband was
alive in the past! This past was supposedly as real as the present she felt trapped in. This
is bizarre. We might consider the past to seem similar to such a fixed four-dimensional
solid. Still, it is nothing like the present moment; Besso's widow cannot revisit her past
and touch her formerly living husband. In fact, Einstein's equations allow a very different solution, called a foliation of 4D spacetime (Marsden and Tipler 1980; Lockwood
2005) to produce a unique, universe-wide present moment. This means that the past
appears as a stack of 3D slices along the time dimension of 4D spacetime. It looks like
Einstein's block time, unless looked at very closely. It is made of very thin sequential 3D
slices, each about one extremely tiny Planck time thick in the time dimension,
with gaps of a similar width between them. This clarifies spooky quantum action at a
distance as happening at a defined moment of time, one Planck time wide, regardless of
distance, across the foliation, orthogonal to time's arrow. There is no time across the foliation; it is a frozen instant. So entangled quantum states are not quite instantaneous
in the old sense, they occur within a minute slice of time. Then time increases by one
more slice, and so on. More on this and Planck units later. It is a paradox that physicists
still think in terms of continuous Nature on the one hand, yet have accepted the
minuscule Planck units as minimal sizes of mass, length and time on the other.
We need also to consider that this foliation extends beyond the visible edge of the
universe from our position. Obviously, a physicist beyond our horizon might believe that
different physics pertains in our area, or vice-versa we may believe the same. All would
be wrong. No matter how big the universe may be, there is the same physics and the same
moment of time, because the foliation is everything real, everywhere and the entangled
particles link every part of it together across the foliation. In a very real sense, two
entangled particles are a joint particle, bridging any distance across foliations until one
decoheres with the other. There will be more on foliated time and the Planck time later.
Another of Einsteins assumptions was that there was no absolute rest frame in space, so
we could only consider relative motion between bodies. Einstein did not live to see that
he was wrong. According to J.S. Bell, (Bell 1987) Einstein's theory assumes, but does not
require, that the laws of physics will look the same to all observers in uniform motion,
with no fixed rest frame. This permitted a very concise and elegant form of the theory.
H.A. Lorentz preferred the view that there is indeed a state of "real" rest, defined by the
aether. On this subject, Bell said, "the facts of physics do not oblige us to accept one
philosophy rather than the other". The facts of physics as known today include the cosmic
microwave background radiation (CMBR), unknown to both Lorentz and Einstein, and it
is indeed a suitable universal rest frame corresponding to the mythical aether.
The universe is permeated everywhere by electromagnetic radiation left over from the
Big Bang. Cosmic microwave background radiation is Big Bang electromagnetic
radiation made longer and longer in wavelength as the universe expanded and stretched
it. It is now microwave radiation, called 3 degree Kelvin radiation because it matches the
radiation characteristics of bodies at that temperature, about 3 degrees above absolute
zero. This radiation is stationary in the sense that it has no differential motion relative to
itself or the point of origin. It radiated equally outward everywhere, from everywhere in
the universe, as soon as the relatively tiny universe became transparent to
electromagnetic radiation (about 380,000 years ABB - after the Big Bang). It turns out
that all we have to do is measure the frequency of this radiation in all directions in order
to calculate our absolute velocity in space. The radiation is bluer (higher frequency) if
we are heading more in one direction, redder (lower frequency) if we are moving away.
The bluest reading gives our absolute direction, and the amount of this Doppler shift to
the blue (higher frequency) gives our absolute speed. This is true for everywhere and
everywhen. Our solar system's absolute speed and direction have been determined in this
way.
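The arithmetic is simple enough to sketch. Using the commonly quoted CMBR mean temperature and dipole amplitude (my numbers, not the essay's), our absolute speed follows directly from the Doppler relation v ≈ c x (dT/T).

c = 2.998e8           # speed of light, m/s
T_cmb = 2.725         # mean CMBR temperature, kelvin
dT_dipole = 3.36e-3   # observed dipole amplitude, kelvin (bluest direction minus mean)

v = c * dT_dipole / T_cmb
print(f"Our speed relative to the CMBR: about {v / 1000:.0f} km/s")  # roughly 370 km/s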

Digression: Philosophical Misapplications of 20th Century Physics


There is an interesting parallel here to the moral damage done by determinism.
The popular conclusion from Einstein's relativity was that "everything is relative." This leads to the kind of nihilism realized in the mid-20th Century Post-Structuralist view wherein moral value distinctions between, say, a gangster and a
saint are disdained. Einstein lived at a time when the aether wind hypothesis
had recently been disproved, and the only imaginable way to measure velocity
was relative to some other object. So there seemed a rootless and drifting aspect
to the new conception of motion. This unfortunate name has led to
misunderstandings, especially since relativity became a worldwide sensation
when it was proved by astronomical observations. A tendency to moral relativism
ensued. Relativity itself could not be more specific in physical meaning or less
relevant to moral meaning. Ironically, the assumption that there was no other
means of measuring velocity is wrong, as we see below. As with determinism, the
physics works, but the assumption was wrong, so these attitude changes were
doubly baseless. The idea of quantum uncertainty also contributed to
postmodernist radical antinomian nihilism where truth itself is shrugged off as
optional, derivative and unknowable.
No red or blue shift in any direction, (absent a significant gravitational field or any
acceleration) means an absolute universal rest frame has been reached. So we have a
defined absolute space rest frame. If the present could be defined in that frame, we would
have an absolute time. If we have both, we have an absolute spacetime, or the present
moment. This is where the foliation solution to the Einstein equations of general relativity
(Marsden and Tipler 1980; Lockwood 2005) comes in again. To foliate spacetime, we cut
it into very thin sequential slices, like foliage or leaves. Here, we ask how thin are the
slices. Einstein, I imagine, would prefer they were infinitely thin, but that would not be
any different from his continuous, four dimensional block spacetime. They are not.
Here we need to look back to the controversy over quantum jumps when quantum
effects were first explored. All quantum changes have no intermediate state or states that
a quantum entity passes through on the way from one state to another. Worse for
Classical prejudices, this jump happens instantly. This was a major issue in the early 20th
century. Quantum jumps did not fit Classical continuous time or motion, but they
happened anyway. They were the very basis of Planck's original answer to the problem
of the anomalous (in Classical terms) spectrum of light emission from hot bodies. He
gave us the famous Planck relation:
E = hν
E, the light energy, equals a constant multiplied by ν (nu), the frequency of the light. The constant h, above, is Planck's constant of action. As shown above, its dimensions are energy multiplied by time. As Planck's constant is extremely small in magnitude relative to everyday objects, quantum phenomena are noticeable on only the tiniest of scales - atomic and molecular scales. Planck said light was emitted according to this equation in discrete lumps of energy, quanta, which was later described as fueled by instant jumps from one electron's energy level in an atom to a lower one, without traversing the
energy gap between them. The Sun heats up atoms by nuclear fusion. Counterintuitively, it is when they cool down at the surface that the light of the Sun is sent to us
as the electrons fall inward closer to the nuclei, losing light energy in the process.
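As a small worked example of the Planck relation (the frequency chosen is simply that of green light, an illustrative value, not anything required by the argument):

h = 6.626e-34      # Planck's constant, joule-seconds
nu_green = 5.6e14  # frequency of green light (~540 nm), hertz

E = h * nu_green                       # Planck relation: E = h * nu
print(f"One green photon: {E:.2e} J")  # about 3.7e-19 joules
print(f"A 1-watt green lamp emits about {1.0 / E:.1e} photons per second")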
If we propose to quantize everything, what is the detailed mechanism by which the
universe gets from one Planck time's state of the universe to the next? This is as hard to
imagine as the symmetries. It is the earlier referenced sum of all paths. It is as if a
particle tries out all the ways of getting from state A to state B, no matter how complex,
exotic or outright impossible they may be in macroscopic terms. The more different crazy
routes are added, the more precise our calculation becomes. Everything adds, subtracts
and cancels out to give us an answer, not the answer. By similar means we can
calculate physical quantities to extraordinary accuracy. Some quantities can be calculated
with such precision that a sniper with similar accuracy could hit a bull's-eye on the moon,
given enough power in the ammunition. So the universe sniffs around a vast horde of
maybes, maybe-nots, and no-ways-in-Hell to give us NOW. Effectively there are indeed
many worlds branching off from every changing particle and moment, but they inhabit
the narrow undecided cracks between sequential Planck-scale STME quantum jumps to
the next real moment, and are never entirely real. The next moment of reality partakes to
some extent in all of them, emerging as a unique summation.
The summation of the wave functions of all the momentarily relevant particles in the
entire universe collapses into reality, instant by separate instant. (It should be noted that
only about one particle in the universe per ten billion is interacting, i.e. relevant, at any
moment. All the particles in your chair are constantly interacting with each other, which
is why it is solidly, constantly real.) Then the next moment arises from the newly existing
reality plus those previously superposed particles that are now relevant due to potential
interaction with components of immediately past reality. This creates the next moment
and so on, forever. The universe-wide present moment is NOW - the collapse of the
quantum state of the whole universe into the only momentary reality there has ever been,
from the first moment to this, expanding in four dimensions outward from the origin in
tiny (Planck interval) quantum jumps of spacetime.
Maybe it should long ago have been obvious that the instantaneous quantum jumps
appear because there is no time, and therefore no possible real events, between the prior
state and the present one, just a tiny interval for the universe to calculate what happens
next by summing the relevant superpositions/virtual particle paths. So, I suggest, the
separation and thickness of each proposed foliation of spacetime should be on the order
of the Planck time, extremely thin, about 10^-44 seconds, the thinnest time interval possible with
quantum physics. This means that time is quantized; it moves forward in tiny quantum
jumps with no time and therefore no real events in that inconceivably brief interval
between one real moment and the next. So this would be the Planck interval, the no-time
gap across which quanta jump in order to participate in a moment. This idea makes
complete sense of quantum jumps and gives us an interval, call it a placetime, for all
the not-yet-real but relevant paths to be summed and give us the next foliation. The next
foliation is the next moment of reality. So it is better to think of a Planck interval as the
minimum placetime between events. An event (4D) is a sum over placetime of the related parts of effectively 3D states (foliations) of the whole universe in this model. Add
the Planck intervals sequentially and you get the flow of time, but the flow is a
quantum movie. Reality is always and only a single 3D momentary slice, one frame of a 3D movie at about 2x10^43 frames per second (the number of Planck times per second) and very high definition, nearly 10^100 potential Planck volumes per cubic inch. (A Planck volume is a cube one Planck length, about 1.6x10^-35 meters, on a side.)
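A quick back-of-envelope check of those two figures, using the standard Planck values (my arithmetic, not the essay's):

planck_time = 5.39e-44     # seconds
planck_length = 1.616e-35  # meters
inch = 0.0254              # meters

frames_per_second = 1.0 / planck_time
planck_volumes_per_cubic_inch = inch**3 / planck_length**3

print(f"Planck times per second:       {frames_per_second:.1e}")              # ~1.9e43
print(f"Planck volumes per cubic inch: {planck_volumes_per_cubic_inch:.1e}")  # ~3.9e99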
Analogously to a movie film, a frame of reality would stop to be illuminated in the
projector gate for one Planck time, then quantum sprockets - decoherence - move the
next frame/moment into place during the Planck interval. The general opinion in physics
is that the Planck time and space scale is the point where quantum gravity supposedly
enters the picture. But we can see here that the whole universe's quantum path integral
gives us the next moment of reality. So who needs a new theory of quantum gravity since
we already have one, right there at the Planck scale? The actual phenomenon within the
Planck interval is simply Penrose self-decoherence, due to the mass of various parts of
the universe collapsing local parts of the universal wave function, resulting in the unique
nature of the 3D foliated present moment and the exponentially open future that expands
beyond it, Planck interval by Planck interval.
Clearly, we should use the Cosmic Microwave Background Radiation as the reference
frame for this foliation. If we slice four dimensional spacetime this way, we get a frozen,
motionless, effectively 3D space relative to the CMBR with only one moment of time
across the whole universe. Then the universal movie advances one frame by Penrose self-decoherence of relevant superpositions, and so on forever. There is a total coincidence in
time anywhere across the universal foliation, so all then-current quantum correlations,
like the measurement of one of the two experimental entangled photons above, being
instantaneous, are causal by the new definition, regardless of separation distance. There
is no spookiness in instantaneous action at a distance, therefore, because the zero
distance in time, during the Planck interval, by multiplication, means zero distance in
spacetime. Such action must be instantaneous, i.e. occur during a Planck interval, since it is a quantum entanglement ending when one of the particles has interacted with the rest
of the universe (by being measured in the above case). Time is the sequence of moments
in which the whole universe collapses forward to create the expanding 3D front of
Reality.
One might ask what is the rate of time? Newton would have said, one second per second,
flowing majestically, unchanging, everywhere, forever. Einstein showed that as measured
from one Earth-based twin's rest frame (Alice's), the flow of time as measured by some reliably steady internal flow of events, such as a clock, would be somewhat slower in another's (Bob's) moving at very high speed relative to Alice. So Bob would be younger
than Alice on returning from his journey. The rate of time is fastest in the rest frame of
the CMBR (assuming no significant gravitational field), relative to any object moving
relative to it, which is every object. So, there, in the absolute rest frame of the CMBR, is
the absolute rate of time. Time exists only as the present moment, about one Planck time
separated from the next. Any motion of a particle appears as a quantum jump into the
next slice of spacetime. History is the physical addition over time of the quantum
changes by which all events proceed, one Planck interval at a time. The sum of all prior
Planck intervals is the image of the past, forever gone, Einstein's friend Besso and all.
The only time, in the sense of a place in time where events occur, that ever has existed
was and is no more than approximately a Planck time wide: the expanding edge of time
in space, the outermost 3D slice of a 4D sphere.
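For concreteness, a small sketch of the twin effect with illustrative speeds (the Lorentz factor formula is standard special relativity; the specific velocities are arbitrary choices of mine):

import math

for v_over_c in (0.1, 0.9, 0.99):
    gamma = 1.0 / math.sqrt(1.0 - v_over_c**2)   # Lorentz factor
    print(f"v = {v_over_c:.2f} c: Bob ages {1.0 / gamma:.3f} year per Alice-year")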
The speed of light is the maximum distance a massless particle such as a photon can
travel from its emission point in vacuum to its absorption point in vacuum in a given sum
of Planck times. This Planck time is one of the natural units called Planck units; it is the
time required for a photon to travel a Planck length, about 5.4x10^-44 seconds. The Planck length is about 1.6x10^-35 meters. These Planck units are calculated from basic quantities such as the gravitational constant, Planck's constant of action, h, and the speed of light.
These three are parameters typical of this universe, but we have no clue why they should
be precisely (or even vaguely approximately) the size they are relative to our standard
measures of time, space etc. - that is, relative to the scale of our world. This is the Fine-Tuning Problem, because if they were even slightly different, life could not have evolved.
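The standard defining formulas are easy to sketch numerically. These are the conventional definitions in terms of the reduced constant ħ = h/2π, with the usual rounded values for the constants; nothing here is specific to this essay.

import math

G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
hbar = 1.055e-34  # reduced Planck constant (h / 2*pi), joule-seconds
c = 2.998e8       # speed of light, m/s

planck_length = math.sqrt(hbar * G / c**3)  # ~1.6e-35 m
planck_time = math.sqrt(hbar * G / c**5)    # ~5.4e-44 s
planck_mass = math.sqrt(hbar * c / G)       # ~2.2e-8 kg

print(f"Planck length: {planck_length:.2e} m")
print(f"Planck time:   {planck_time:.2e} s")
print(f"Planck mass:   {planck_mass:.2e} kg")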
Consider that a quantum computer is expected to be able to do enormous calculations,
and then give a result when it collapses into a real state. The scope and rapidity of
computation is a rapidly escalating function of the number of superposed particle states
(qubits) usable in the computation. The usual estimate is that every massive real particle is outnumbered about 10^10 to one by particles in superpositions at any time, mostly photons and neutrinos. A computer with so many potential qubits per real particle could, I submit,
easily calculate the fate of those that were previously real, but are momentarily part of the
universal superposition plus a relative few that will be newly real. It could then self-decohere to give an answer, the next moment, with everything fixed as a 3D slice of 4D
spacetime, waiting for the next quantum jump to the next moment.
Why are the laws, parameters, and constants the same everywhere? These undecided
superposed particles, like non-superposed, presently real particles, are simply
individual embodiments of the totality of our universe's basic physics. They are non-localized to varying extents. Real particles, part of NOW, are local. Because some were
entangled with others everywhere, in many cases even beyond the visible horizon of the
universe, they transmitted the same physics non-locally throughout spacetime with
them, during each Planck interval in which they decohered.
Why would anyone question the direction of the arrow of time, when we know space is
expanding outward from a minuscule origin point in both time and space, and we know
from Einstein there is no separate time, only space and time inextricably together? So
obviously the expansion must be that of space and time together, outward from the
origin? The answer is that no matter how obvious the direction of time is, we have a
problem of symmetry. Everything arises from symmetries. The laws and equations work
perfectly well in either direction in time, they are time-symmetric, but our world is not.
Most physicists think that it is perfectly feasible to consider reversing events in time
because they are committed to determinism and symmetry and assume continuous time.
As a chemist, I am aware of the absurdity of this as a practical matter; aware of the
lawful mechanisms where identical atoms of each element interact to produce life, but
unable to imagine a lawful mechanism whereby a plant leaf beams exactly the same
photons with zero losses back to exactly the same point on the Sun where they came
from, to somehow reverse the fusion of hydrogen to helium nuclei there while turning
itself back into a seed! The philosophical error is that time reversal invariance in physics
relates only to the mathematical equations of motion (+t or -t are algebraically
equivalent), not to actual motion, which is dependent on initial conditions. Time reversal
of events would require starting from different conditions than the original motion
(Sachs 1987). It seems clear that there would also have to be a mysterious expenditure of energy, of order mv^2 per particle, to reverse a particle in time, and therefore its direction in space. To reverse a vector like velocity requires applying an equal but opposite momentum, at a corresponding cost in energy. What
God-like entity could do that to every particle everywhere, and with what source of
energy? Time is irreversible.

Entropy and Time's Arrow


A question included on the mysteries listed above involves entropy. Why did I not
include thermodynamics, including entropy, as fundamental along with the other four?
Because, if we have a theory of spacetimes arrow, the set of universal symmetries, and
quantum mechanics, then none of thermodynamics is fundamental. Spacetime is
expanding, filled with different particle species, each of whose members is
indistinguishable from one another, and with energy, which distributes itself through all
degrees of freedom. One can think of entropy in thermal terms, or statistical terms. First,
thermally: the First Law of Thermodynamics is a consequence of the symmetry which
creates the conservation of mass-energy. It states that the mass-energy of an isolated
system does not change, i.e. is conserved or invariant. The Second Law states that
everything that happens increases entropy overall, and conversely, nothing can happen
spontaneously that decreases it, overall. Living entities can add energy to decrease local
entropy to their advantage, but only by increasing total entropy when all the results of
their intervention are included. Entropy is the only macroscopic abstract quantity in
physics that requires, as opposed to works fine with, an arrow of time. Heat will flow
from a hotter to a cooler body spontaneously. It will not do so in reverse, as you discover
if you unplug a refrigerator from its energy source. Similarly, if you dump salt in a barrel
of water it will spontaneously dissolve, diffuse, and eventually become uniform in
concentration throughout the barrel. Taking that salt out again to return the water to its
prior state requires entirely different, humanly directed, not spontaneous, activities and
the expenditure of a great deal of energy. If a process emits light, that light can travel
until it is absorbed, never coming back. If an explosion produces sound, that sound will
spread until it makes some air molecules a little faster (warmer). Warmer air will not send
back the sound. These are the familiar rules. So entropy is, no more and no less, simply
the philosophically scary arrow of time that has been dignified with the useful quantity,
S, entropy, to measure change and direct it in time. Spontaneity simply means that time's
arrow will do the job of spreading mass-energy through spacetime, unassisted by
intention. This essay asserts that intention is the evolutionary purpose of consciousness,
using volition to expand and utilize favorable possibilities. Biophysics/biochemistry are
the tools to find the mechanisms of intention and volition.
Next, the statistical version: is Penrose justified in his worry about justifying entropy
being very, very low at the origin? Statistically, entropy is a function of the potential
availability of near-time mass-energy states to future occupation. So, to say that entropy
is minimal at the origin is a tautology: clearly in an enormously hot, tiny ball stuffed with
quarks and gluons there are far fewer state-places to be than later, in a huge, cold, mostly
empty universe. Entropy simply increases as time increases; energy and matter becoming
more and more diffuse, potentially occupying more quantum states of energy, space, and
time as spacetime expands from the origin. Entropy tends to maximize as the universe
eventually becomes vast, nearly absolutely cold and nearly empty, the Heat Death of the
Universe, the opposite of the Big Bang. Entropy, seen as a consequence of the expansion
arrow, is minimal when elapsed time in the universe is minimal. So Boltzmann's version of entropy will increase, according to his equation, S = k log W, inscribed upon his tomb in Vienna.
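As a tiny numerical illustration of that equation (log here being the natural logarithm; the two W values below are chosen purely for illustration): more available states means more entropy.

import math

k_B = 1.381e-23  # Boltzmann's constant, joules per kelvin

for label, W in (("few available states", 1e10), ("vastly more states", 1e80)):
    S = k_B * math.log(W)  # Boltzmann entropy, S = k * ln(W)
    print(f"{label:22s} W = {W:.0e}  ->  S = {S:.2e} J/K")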
This is really the core meaning of entropy: the eternal, spontaneous net spreading out and
cooling of everything, unless prevented by even more expenditure of un-randomness in
the form of locally accumulated energy for life, plus by intention, in the case of
intelligence. Mass is concentrated by gravity, after which the heat and light resulting
from nuclear fusion or gravitational energy will then spread out everywhere from stars.
So net entropy is positive, even though matter has been concentrated into a star. In the
same way, plant life spreads heat energy, being a quantum middleman increasing local
order by indirectly turning locally captured energy-rich starlight into unusable, diffused
away waste heat.
The philosophical problem here is in saying that increases in entropy drive the universe
forward in time, yet refusing to credit the role of time's arrow. Why believe that at the
origin there was some kind of strange empty bank of entropy into which one can deposit
entropy, the balance increasing forever to drive the universe, despite that bank perhaps
being dubious according to Penrose? Why not believe the utterly obvious experimental
evidence instead:
spacetime expansion → time's arrow → entropy increase

Conclusions about Reality


There is no past. Einstein's friend Besso is dead; subsequent to his death he is beyond the
reach of his widow. All we have left are the physical traces of the past that survive into
the present moment. There is no 4D block of the past. The future is always undefined to a
finite extent, even as little as a Planck time ahead of now. There can be no time travel.
There is no absolutely fixed future. Most macroscopic events proceed deterministically,
including the apparent stability of your chair, or the Rock of Gibraltar, although there will
be innumerable tiny quantum changes within each. A microscopic event at the quantum
level may provide an opportunity for a shift in the later consequences of emergent
phenomena like chaos and avalanche processes on the weather, your consciousness or
elsewhere. Opportunities exist, say for a biological mutation, or by a process still
unknown in detail, for an original thought to arise in a human. That thought can then be
acted upon, if considered worthwhile, and the universe will be forever changed.
Since the vast majority of particles in the universe are in a superposition of states at any
time, actual collapsed reality is a minuscule fraction of the entire universal wave
function. Physicists isolate tiny areas of spacetime to make calculations, but the present,
Planck time thick slice of the universe is one complete quantum system of all that is
currently real plus a very few of the great majority (10 billion to one) of undecided
particles together having created the next moment. To emphasize, this is what the much-
anticipated quantum computer is supposed to do - use superpositions of "maybe" states to finally collapse into one "right answer" state.
So what we see is just the tiny currently realized fraction of the universes contents. Our
cars and chairs and cats are massive and warm enough to be persistently real, and time
appears to be continuous, because the Planck interval is so inconceivably short. Does
space indeed only seem to be continuous because the Planck length is also very short? I
would be willing to wager very heavily that it is discontinuous, and, thus, there are no
singularities (where particle density is infinite). Spacetime is necessarily quantized at
close to the Planck scale, because the foliation means that time is quantized. If I am right,
STME is quantized at the Planck scale. I do not suppose that infinitely tiny singularities
of infinite mass density have ever existed or can exist, whether at the center of black
holes or anywhere else, because such singularities make a nonsense of physics, like all
infinities. So space is quantized, too, and has a minimal Planck interval of space, close to
the Planck length. There is no way to make sense of the present moment and its
completely obvious uniqueness without some scheme practically indistinguishable from
the one detailed above. The real mystery, if you disagree with the above scheme, is what
else could possibly explain the present moment within known, testable, physics?
The interesting point here is that physicists expect these Planck units to be the minima at which current physics applies, and below these limits a quantum gravity theory will be needed. But this is exactly the idea I have been advocating, that
within the time minimum, the Planck interval, a quantum process, Penrose self-decoherence, occurs to create the next moment of spacetime/mass-energy. I suggest this
is exactly the same mechanism as ordinary quantum mechanics because I deny that there
is any space or event below the Planck scale, as opposed to quantum changes within it.
So I suggest that Penrose has solved the problem of quantum gravity by removing the
necessity to invent it. He has shown that, if one also accepts the foliation solutions to the
Einstein equations, they cooperate with quantum mechanics to create reality by
gravitational decoherence of the wave function, moment by moment.
So, it seems clear that quantizing time is a big help to answering the big remaining
questions listed in the Introduction. Yet we still need to quantize space and time together.
My favorite candidate for this task of quantizing spacetime is Loop Quantum Gravity
Theory. We will consider LQGT below, when we look for symmetry in Time's Arrow by
postulating a second universe on the other side of the Big Bang, which seems to be
calculable from this theory. Briefly, space is pictured as discontinuous, knitted as a
network from tiny loops of the Planck length in size. This quantization of space is more
important for me than any quixotic attempt to use this concept to quantize gravity. Again,
we leave the mathematics to the geniuses and search for the meaning of the physics.

Time's Double-headed Arrow


Spacetime must be quantized. The foregoing statement has been obvious to me since
1998, when I first saw the references about foliating spacetime to create a universe-wide
present moment in non-contradiction to the Einstein equations. Equally obvious was that
it would eventually be possible to find an approach to a theory that supports quantized
spacetime. I now think this will arise via a Loop Quantum Gravity Theory approach,
partly because of its predictions about the time "before" the Big Bang. Of course there is no time before the Big Bang in the traditional sense - without mass, energy and
expanding space, there are no events and no time to order them in. But nothing in physics
prevents a universe running in either direction in time. I contend that we live in one half
of a dual universe, and each half moves in opposite time directions away from the origin.
This means that before and after the Beginning are meaningless terms in this context.
They are as arbitrary as having the past, or negatives in general, being on the left of all
graphs.
So, I concluded in 1998 (Pitts 1998) on pure symmetry grounds, that there was another
universe on the other side of the Big Bang in time, and that it was expanding, but
backwards in time. Thus the Big Bang is in the center of time; the universe expands in
time in both directions from the origin, symmetrically, with space expanding in all three
available dimensions outward from the origin point. The physics is identical in both semi-universes. And, sure enough, everything works exactly the same as here. Babies come out of the womb and grow up, not somehow get sucked back in. Everybody there thinks
they are moving outward in time from the Big Bang. They are. So are we.
Yet our universe is full of matter, although equal amounts of matter and antimatter are
always created from energy. So where is the antimatter? It fills the other side of the Big
Bang in time (Pitts 1998). Further development of Loop Quantum Gravity Theory may
now support this hypothesis. So we have an answer for what exists "before" the Big Bang - an antimatter version of ours, using Feynman's point that ordinary matter reversed (i.e. moving backward in time) is indistinguishable, that is invariant, from
antimatter. So perhaps there is something called Feynman Invariance, or Time Symmetry.
This would state that: physics is invariant in positive or negative time, provided that the
zero point of both positive and negative time is at the origin of the universe. This also
removes the apparent issue of antimatter versus matter asymmetry. If the universe is in
fact bifurcated across the origin as above, there should be equal amounts of antimatter
and matter in the universe as a whole. An actual symmetry theory is a better explanation
of time symmetry than pretending that macroscopic events can be reversed in time.
Therefore we can say that this dual universe operates effectively as two independent,
giant quantum computers expanding on opposite sides of the Big Bang. Each calculates its universal present moment as one of the right answers to the only truly cosmic question - what happens next? There were vastly many other possible but unrealized right
answers over time, all consistent with the symmetries. To some extent, every wave
function collapse prunes the probability of available wave functions of future states that
would be inconsistent with the newly established reality, so only to that extent is it
determining and limiting the future.
One of the distinguished practitioners of LQGT is Martin Bojowald (Bojowald 2008). In
2007 he published a Big Bounce theory derived from LQGT. There is a long aesthetic
tradition in cosmology of hoping to find some bouncing mechanism to allow an eternity of expansions and contractions of the universe in order to avoid/postpone the need of
thinking of a way to create a universe from nothing. There has never been any sign of
such a bounce mechanism in our universe; indeed the latest evidence is the reverse:
expansion is accelerating. Roger Penrose has pointed out that if something like our
universe did contract back to near zero spacetime it would be totally unlike the prior
expansion. It would be full of black holes, neutron stars, dead stars, abandoned
spaceships and worn out garbage in general: an ugly, messy unsymmetrical prospect.
Bojowald's idea was that the universe in the past before the Big Bang was collapsing in
terms of our positive time, and had created our universe in a bounce by crashing
down to the Planck scale and bouncing off this minimal volume. I reject this
interpretation, preferring to see an expansion backwards in time rather than a collapse
forwards in time. Thankfully, Bojowald has provided a useful LQGT mechanism for
existence on both sides of the Big Bang in time.
A 4D expansion backwards and forwards in time on each side of the Big Bang is a
far more elegant and symmetric explanation of the meaning of Bojowald's results than his version, which sees merely a collapse in our time direction. (In 1998 I did
not realize that it was unnecessary to argue that antimatter in our semi-universe moves
backward in time to make this point and no longer make that argument.) My proposed
anti-twins are symmetric - both semi-universes go from hot quark/gluon soup to possible
intelligent life via evolutionary processes, both have a time's arrow (in opposite
directions as required by time-symmetry). Equal amounts of antimatter and matter are
expected to exist in the dual universe; so that terrible asymmetry problem is solved. Of
course, all the old bounce idea ever does is postpone the big question of how, why and
when the first universe formed.
Both of the semi-universes are foliated; both have expanded for the same 13 thousand
million years from the origin. Very near the origin, within overlapping quantum
uncertainty in space and time for the two semi-universes' contents, interesting exchanges
could have occurred. Perhaps these allowed quantum spacetime indeterminacy to
distribute between, and then segregate, matter and antimatter between the anti-twin
universes as time expanded, away in spacetime too far for indeterminacy to reverse the
separation (Pitts 1998). However, regardless of what the mechanism was that gave a very
slight preponderance of one sign of matter over the other in each separate semi-universe,
(about one part in 10 billion - the rest is radiation) so long as it works symmetrically on
both sides of the Big Bang in time, the overall matter/antimatter symmetry is preserved.
Consequently, each would be dark matter to the other. We ought to be able to detect
them by gravitation since by hypothesis they may occupy the same space, but in time
foliations an equal 13 thousand million years from the origin, i.e. 26 thousand million
years apart. Since no mutual mass concentrations can exist except at the present space
points of these two time foliations, I would expect them to interact gravitationally at any
such points due to the coincidence in spatial position mutually warping the spacetime
background, but in no other way, since the necessary time coincidence for quantum
interactions (say, electromagnetic, such as light) or indeed, mutual annihilation of both
halves of the dual universe, would be 26 thousand million years apart. This is a
consequence of Einsteins view that gravitation is purely of non-quantum, symmetric
origin. Therefore it is not necessarily time-coincidence dependent like quantum
interactions such as light emission or antimatter-matter annihilation. The two moments,
anti-presents of each other, are the only real mass concentrations anywhere in 4D space,
in this description of time symmetry. So they might interact gravitationally, but otherwise
be mutually invisible, dark matter. Space is so empty that there would be few direct
collisions between the gravitational fields of ghostly anti-objects in the anti-twin, dual
universes. However, the tidal effects of such collisions would be very disruptive on
galactic or, very occasionally, stellar scales. So this hypothesis is testable. We should
look for galaxies being disturbed by an invisible collider, and gravitational effects of
mysterious concentrations of invisible mass.

How to Build Our Twin Universes From Nothing


Before we describe how the universe came to be so perfect a place to evolve to our
present state of civilization (or perhaps vastly beyond our civilization state elsewhere in
the cosmos), we need to explain how and why the basic geometry arose to give us our
spacetime background. Those of my readers who have become sufficiently quantized in
their ontological views will not be surprised that the most fascinating approach is
analogous to the sum of all paths in the rest of the quantum approach to reality. It is the
sum of all possible geometries. We may hypothesize that this process occurred within the
original Planck interval during which appeared the first spacetime-mass-energy quantum
fluctuation of adequate size to collapse itself into ongoing reality. We have only one
theoretical quantum mechanism to create our universe's type of STME from nothing
using a sum of all possible geometries, specifically provided that these geometries meet
certain constraints. These constraints happen to be consistent with the arguments in this
essay. That theory is Causal Dynamical Triangulations (CDT).
If the work of Renate Loll and collaborators (Loll 2008; Ambjorn, Jurkiewicz et al. 2009)
using Causal Dynamical Triangulations is true, then spacetime arises through a quantum
traverse through all available geometries. This method has a unique feature of huge
importance to philosophy - it requires an arrow of time, which is the source of the word
causal in the theory title. You can only build our familiar 4D spacetime from scratch
using this type of theory if causes precede or are simultaneous with effects. This is truly
remarkable; spacetime's geometric properties from the Planck length to intergalactic scales plus spacetime's arrow can be derived from basic quantum principles.
Loll's approach avoids any preferred geometric background at the outset, and uses the
sum over histories approach, also known as the gravitational path integral. According
to the authors it uses equilateral four-dimensional analogues of triangles, equilateral 4-simplices, conceived as building blocks glued together to make geometrically distinct
constructs to be included in the path integral over all geometries in an abstract space of
all spacetimes directly, with no need for specific coordinate systems. It turns out that the
only way to arrive at anything resembling our existing spacetime is to restrict the
triangulated histories, and hence the gluing rules allowed in the sum to be only those in
which causality (as a future-directed arrow of time) is present. Numerical iterative
computer experiments on this restricted form of the gravitational path integral have
derived the structure of such a synthetic quantum universe on large scales, that is, stable,
smooth, expanding, extended and four dimensional, like ours. Very significantly, this
approach only works with a global proper time foliation. This foliation - NOW - is a
Planck-scale slice of time through 4D spacetime to give a universe-wide 3D instant as
described earlier.
Quoting Loll and collaborators (emphasis added):
The four-dimensional CDT model of quantum gravity is extremely simple.
It is the path integral over the class of causal geometries with a global time
foliation. In order to perform the summation explicitly, we introduce a grid
of piecewise linear geometries, much in the same way as when defining the
path integral in quantum mechanics.
The importance of this approach to cosmology, if it survives challenge, is immense. It
explicitly requires global foliation of spacetime to give a single universal now, if we are
to construct anything resembling our spacetime from scratch. Most of all, there is a
forward arrow of time according to this approach. Modern physics is founded on
symmetries, including assumed time reversal invariance. Time symmetry must be
fundamental; but time clearly cannot be reversed, which is why the concept of two
balanced expanding universes on each side of the origin of time is so useful.
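To give a concrete feel for what a restricted, foliated sum over geometries looks like, here is a minimal toy sketch in Python, loosely inspired by the 1+1 dimensional version of CDT that Loll describes as runnable on a desktop. It is not the authors' code: the strip-counting formula, the coupling value and the slice-size cap are illustrative assumptions. The point is only that once the sum is restricted to causal gluings between successive time slices, a well-behaved partition function over geometries can be accumulated slice by slice.

```python
# A minimal toy sketch (not the CDT authors' code) of a 1+1 dimensional,
# foliated sum over geometries: sequences of spatial slices l_1 ... l_T are
# glued strip by strip, each strip weighted by the number of causal gluings
# and by exp(-lambda * triangles).  The strip count binom(l+l', l) and the
# bare coupling are illustrative assumptions only.
from math import comb, exp

LAMBDA = 0.8          # toy cosmological-constant-like coupling (assumed value)
MAX_L = 12            # cap on spatial slice size, keeps the sum finite
T = 8                 # number of time slices (the global proper-time foliation)

def strip_weight(l, l_next):
    """Toy amplitude for gluing a slice of size l to one of size l_next."""
    triangles = l + l_next                    # up-triangles plus down-triangles
    ways = comb(l + l_next, l)                # causal gluings of the strip
    return ways * exp(-LAMBDA * triangles)

# Dynamic programming over foliated (causal) histories only.
amplitude = {l: 1.0 for l in range(1, MAX_L + 1)}   # initial slice
for _ in range(T - 1):
    nxt = {l2: 0.0 for l2 in range(1, MAX_L + 1)}
    for l1, a in amplitude.items():
        for l2 in nxt:
            nxt[l2] += a * strip_weight(l1, l2)
    amplitude = nxt

Z = sum(amplitude.values())
mean_l = sum(l * a for l, a in amplitude.items()) / Z
print(f"toy partition function Z = {Z:.3e}, mean final slice size = {mean_l:.2f}")
```

Dropping the foliation and allowing arbitrary gluings is precisely what, on this approach, must not be done if the result is to resemble our spacetime.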

A Description of Universe-building
Now we enter a more speculative area. Many years ago, at a cocktail party in Berkeley,
California, I asked a physicist the question, why does everything in the universe rotate?
His immediate answer was that energy distributes itself to some degree through any and
all degrees of freedom, and rotation is one of them. This profound concept has haunted
me ever since. One can see that the sum-of-all-paths approach, in exploring all the
possibilities, is an extension of the concept. What if the same is true of the symmetries? Is
it conceivable that our universe fills all available symmetries?
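The physicist's answer is, in essence, the classical equipartition theorem: in thermal equilibrium each quadratic degree of freedom (each independent way of storing energy, such as one axis of rotation) carries on average (1/2)kT. A hedged numerical illustration, using only standard constants:

```python
# Equipartition illustration: average thermal energy per quadratic degree of
# freedom is (1/2) k_B T.  A rigid diatomic molecule has 3 translational and
# 2 rotational quadratic degrees of freedom at room temperature.
K_B = 1.380649e-23      # Boltzmann's constant, J/K
T = 300.0               # room temperature, K

per_dof = 0.5 * K_B * T
print(f"energy per degree of freedom : {per_dof:.3e} J")
print(f"mean energy, rigid diatomic  : {5 * per_dof:.3e} J  (3 translation + 2 rotation)")
```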
As mentioned above, all known particles, even the newly discovered Higgs boson, are
part of the Standard Model. This symmetry group, U(1)×SU(2)×SU(3), was slowly
established, explaining all the conservation laws observed in sub-atomic particle
experiments. These symmetries are Lie groups, discovered by Sophus Lie in the 1890s.
The conservation laws are derived from symmetries according to Noether's Theorem.
Symmetries are therefore fundamental and constrain all phenomena. Note the word
"constrain", not "control". Remember the ice skater, constrained by the conservation of
angular momentum (and very many other symmetries) but performing innumerable
variations of twirling and arm movements as she wills. The constraints allow and enable
the beauty of her art, as Nietzsche's concept of beauty in art would agree.
A.G. Lisi published "An Exceptionally Simple Theory of Everything" in 2007 (Lisi
2007, Lisi 2008). The theory uses our 4D spacetime as a base upon which the enormously
complex E8 symmetry group operates to enforce 248 symmetries, including all the known
ones and predicting a few others. This theory fits our emerging general picture of the
universe: energy spreads through all available degrees of freedom (e.g. vibration,
rotation, motion, massive or massless particle species, etc.). E9, the next larger Lie
group, is infinite-dimensional, therefore in my estimation meaningless for physics. Think of this: if
there are an infinite number of constraints on the action under E9, then no change is
possible. Forget Lie group E9 (and any above). It can't happen; or rather, nothing much
can happen consistent with it.
If this theory of E8 is true, and my opinion of E9, etc. is correct, then E8 is the most
inclusive and largest possible finite law structure upon which to build a universe.
Accordingly our universe might have the most complete, finite, self-consistent set of
possible rules, utilizing the available stock of laws completely, as mass-energy spreads
through all degrees of freedom. This is truly suspicious; this makes the 6 mystery
constants and parameters that make life possible (the Fine-Tuning Problem) look like no
big deal. We get the widest possible set of laws to play with? Remember, these are not
like human laws and regulations, most of which simply limit possibilities - these
symmetries do that while also giving us more particles and more interactions, more
possible futures. In fact more of something I call intricacy, with a specific meaning.
Let's define "initial intricacy" as a measure of the sum of all potential particle paths in all
conceivable futures of a single possible, very specifically defined, universe immediately
prior to the moment of its self-collapse into reality, if that were to happen. (More on
intricacy later.) Suffice it to note for now that I think maximal intricacy universes will
preferentially self-create. Intricacy as a measure of the sum of all conceivable futures of
all particles also applies to the future of all spacetime moments, including this one. It can
potentially serve as a moral index today, as we shall see later.
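As a purely illustrative toy (an assumption for illustration, not part of the definition above), intricacy can be pictured as a branching count: if each particle faces a number of distinguishable options at each of a series of steps into the future, the number of potential paths grows exponentially with both.

```python
# Toy "intricacy" counter: potential futures grow as (options per step)^steps.
# A world that adds even a few extra options per step (life, tools, travel)
# dwarfs a simpler one over the same number of steps.  Numbers are invented.
def intricacy(options_per_step: int, steps: int) -> int:
    """Count of distinct potential paths under the toy branching model."""
    return options_per_step ** steps

dead_world   = intricacy(2, 100)    # assumed: few options per step
living_world = intricacy(3, 100)    # assumed: slightly more options per step
ratio_digits = len(str(living_world // dead_world)) - 1
print(f"living/dead ratio of potential paths ~ 10^{ratio_digits}")
```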
Why is this E8 idea so compelling? The old ad hoc picture of discrete conservation laws
had no unifying driving principle, which E8 provides. Classically, massive objects move
in fixed, predictable trajectories determined by initial conditions. In the accepted sum-of-all-paths/path-integral approach in quantum mechanics, a particle deterministically
explores all imaginable paths between consecutive measurements, each at some level of
probability amplitude. These potential paths when superposed add to or subtract from the
probability amplitude, giving the position and velocity of the particle at the next
measurement, subject to a minimum joint indeterminacy in these measurements. Between
measurements the particle has no defined straight-line path, even if that is the final result
- it is part of the multiplicity of superposed, not-yet-real particles. So, analogous to E8 and
available laws, particles spread through all available paths to some degree. Similarly,
Heisenberg (Heisenberg 1958) felt that the wave function represented tendencies or
Aristotle's potentia, not some predetermined outcome. Many particle physicists were
disappointed that the Higgs boson fit so well into the Standard Model prediction of its
characteristics - they wanted new physics! I say, let's fill in any missing entities into the
E8 symmetry group and be happy! Unfortunately, we may search forever for any particles
beyond that, trying to prove a negative.
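A hedged numerical sketch of the sum-over-paths picture just described: hold the endpoints of a free particle's path fixed and add the complex amplitudes exp(iS/ħ) for families of paths that wiggle ever farther from the classical straight line (all units, masses and step counts below are illustrative assumptions). Paths near the classical one add nearly in phase, while distant ones increasingly cancel:

```python
# Sum-over-paths toy: amplitudes exp(i*S/hbar) from randomly wiggled paths
# between fixed endpoints.  As the typical wiggle grows, the actions spread
# and the summed amplitude's magnitude shrinks (destructive interference),
# leaving the neighbourhood of the classical path to dominate.
import cmath, random

HBAR, MASS, DT, STEPS = 1.0, 1.0, 0.1, 20
X_START, X_END = 0.0, 1.0
random.seed(0)

def action(path):
    """Discretized free-particle action: sum of (1/2) m v^2 dt over segments."""
    return sum(0.5 * MASS * ((b - a) / DT) ** 2 * DT
               for a, b in zip(path, path[1:]))

def mean_amplitude(wiggle, n_paths=5000):
    total = 0j
    for _ in range(n_paths):
        path = [X_START + (X_END - X_START) * k / STEPS +
                (random.uniform(-wiggle, wiggle) if 0 < k < STEPS else 0.0)
                for k in range(STEPS + 1)]
        total += cmath.exp(1j * action(path) / HBAR)
    return abs(total / n_paths)

for w in (0.0, 0.1, 0.3, 1.0):
    print(f"typical wiggle {w:4.1f} -> |mean amplitude| = {mean_amplitude(w):.3f}")
```

That cancellation is the sense in which the particle spreads through all available paths to some degree, yet the classical trajectory dominates what is finally measured.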
Now we come to perhaps the greatest mysteries. First, why is there something rather than
nothing? Here again we are looking at the macroscopic world to interpret the microscopic
world. We should always do the reverse, instead. In the visible world, everything is here,
or not. Rocks or elephants neither appear from nothing nor disappear into nothing. Not so
the micro-world. So-called ephemeral virtual particles appear from and disappear into
the quantum vacuum constantly. This is experimentally detectable by the Casimir effect,
for example, which can be explained by a relatively low abundance of virtual particles
appearing between extremely closely spaced parallel plates, allowing the more common
outside virtual particles on each side to push the plates together. Virtual particles, or
quantum fluctuations, are momentary pop-ups from the vacuum, basic to quantum
indeterminacy, occurring constantly, quickly disappearing back into the vacuum. Let us
assume the usual quantum processes were relevant at the origin. If a quantum fluctuation
is large enough, according to Penrose, it will become real by collapsing its own wave
function, alternatively called self-decoherence (Castagnino and Lombardi 2004).
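For scale, the standard idealized result for the attraction between two perfectly conducting parallel plates is a pressure P = π²ħc/(240d⁴), where d is the separation. A quick hedged calculation (idealized geometry, no finite-conductivity or temperature corrections):

```python
# Idealized Casimir pressure between perfectly conducting parallel plates:
# P = pi^2 * hbar * c / (240 * d^4).  Real experiments need corrections, but
# the orders of magnitude below match what is actually measured.
import math

HBAR = 1.054571817e-34   # J*s
C = 2.99792458e8         # m/s

def casimir_pressure(d_m: float) -> float:
    return math.pi**2 * HBAR * C / (240.0 * d_m**4)

for d_nm in (1000, 100, 10):
    print(f"separation {d_nm:5d} nm -> pressure {casimir_pressure(d_nm * 1e-9):.3e} Pa")
```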
How could reality arise from a quantum fluctuation? Equivalently, why is there
something rather than nothing? Each quantum fluctuation is a superposition of future
paths of particles. Imagine, at the Big Bang, the empty set of possible universes: no
space, no time, no matter, no energy. Its quantum amplitude is zero, and zero squared is
zero, so it makes no contribution to the sum of the quantum amplitudes in the quantum
fluctuation that created reality. So the probability of no universe is zero. Therefore the
probability of at least one universe is one, or certainty. If we think classically, "nothing"
is a reasonable notion: a vacuum is emptiness. If we think symmetry/quantum
mechanically, the vacuum is a seething boil of
virtual particles, and everything real, momentarily or long term, is entirely a set of
complex local and nonlocal characteristics of the vacuum. Clearly, in a bifurcated
quantized and symmetric universe expanding in spacetime on each side of the Big Bang,
there has literally never been "nothing". There is no emptiness in our universe; it is merely
a useful philosophical, mathematical (zero) and linguistic concept for which we have no
direct physical evidence. So there is no reason to pretend that "nothing" is a valid way to
describe whatever gave rise to reality as we can know it. My skepticism regarding
infinity's physical relevance is partly based on how we arrive at it by dividing any
number by a non-existent phenomenon, nothingness.
So it is no surprise that there must be something popping into reality, but what? In order
to turn out to be real at the point of origin, that superposition, by analogy with current
ones, should meet the same criterion of maximal probability amplitude of conceivable
particle fates. Quantum mechanics involves virtual particles popping into existence and
then disappearing. Larger masses are far less likely to appear. Any such mass that pops
into existence must be sufficiently large to continue to exist via this type of quantum
fluctuation and decoherence by the gravity of its own mass. Mass is inseparable from,
and must be part of, an entire set of parameters and symmetries, if our present universe is
any guide. It would not be simply mass, but mass-energy-space-time. In other words, if a
mass self-creates it means a whole universe self-creates/self-collapses/self-decoheres into
one specific set of possible future histories. The inflation theory is believed to take over
at that point and blow up spacetime faster than light and fill it with mass-energy. That
theory is beyond the scope of this essay. I include it to show that a universe deriving from
a small mass popping into existence once is conceptually different from vast numbers of
entire huge universes, billions of years in apparent age, full of uncounted copies of us,
popping into existence constantly according to Many Worlds.
Therefore universes with more particles of more types, with more possible future
interactions or fates and, as we shall see, with an optimal selection of constants and
parameters, would maximize that original quantum amplitude. Since symmetries are the
skeletons of reality, no potential universe that is not maximally symmetric can be
optimal. Such a sub-optimal potential universe would not have sufficiently complex
available particle paths to preferentially self-collapse versus the more symmetric ones.
Note that, applying this essay's description of time symmetry, there is no time or space
before the Big Bang, because Creation occurs in the Planck interval separating the
anti-twin universes, + (ours?) and − (theirs?), at the point at which time and anti-time
begin. The anti-twin universes will literally have existed for all time, both negative and
positive, from the origin. So, then, given that there must be twin universes, if ever there
is a quantum fluctuation of some tiny mass (perhaps a few thousand hydrogen atom
masses, somewhat more than the 720 hydrogen masses of Buckminsterfullerene), what
type of actual twin universes has highest quantum amplitude as here considered? From
the earlier arguments, the E8 Lie group seems to give the biggest usable list of cosmic
ingredients in the form of symmetries, giving maximal degrees of freedom for mass-energy to fill. The Causal Dynamical Triangulations approach to the gravitational path
integral cancels out all except causal, foliated, expanding universes, so we are most of the
way to the winning recipe for current reality, at least conceptually, despite some missing
detail. What remains is Fine-Tuning.

Fine-Tuning
Fine-Tuning is a term that references a great many parameters or other fixed
characteristics of our universe that have very precise values deemed necessary for the
emergence of intelligent life, and particularly the kind of technical civilization wherein
these beings can ponder their own unlikelihood. The range of possible values for these
characteristics, relative to the exceedingly narrow window allowing us to exist, is such
that, a priori, one would conclude that our existence is so unlikely as to be essentially
impossible. Yet here we are. Moreover, life started incredibly early in Earth's history,
basically as early as geologically possible. So the universe must have been hugely
optimized in favor of life? I am proposing an extreme form of what has been referred to
as the Strong Anthropic Principle, meaning that the universe was by some process
optimized for minds like ours to question Fine-Tuning. (The Weak Anthropic Principle
merely notes that only universes so optimized will contain such minds, relying on the
obvious argument from hypothesized infinitely many universes.)
Far greater minds than mine have explored the Fine-Tuning Problem in detail (Barrow
and Tipler 1988). Again I trust their conclusions and seek the meaning. Our cosmic home
has been referred to as the "Goldilocks Universe"; but this is not a choice among three
bowls of food, but from a number of bowls equal to 1 followed by 10^123 zeros, for one
characteristic alone, multiplied by at least 5 other huge improbabilities. We have hit a
cosmic jackpot. The apparent a priori odds against our universe's exact parameters and
characteristics being chosen at random are so large that we have to imagine that trillions of slot machines on
every inhabited planet from the date of foundation of Las Vegas to the Heat Death of the
Universe would have to unfailingly hit the jackpot, every nanosecond, to match a tiny
fraction of these odds. Again, would Pascal have wagered on that? The easy way out is
by the hand of a Creator of theistic type, but I propose a quantum alternative explanation
that may generate a quantized version of God ex nihilo as a bonus.
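To see why the slot-machine image is, if anything, an understatement, compare orders of magnitude. The planet count, machine count and heat-death timescale below are assumed round numbers for illustration only; the comparison survives any reasonable choice, because the odds quoted above are of order 1 in 10^(10^123):

```python
# Hedged back-of-envelope comparison: total jackpot draws vs. the fine-tuning odds.
from math import log10

machines_per_planet = 1e12     # "trillions of slot machines" (assumed)
inhabited_planets   = 1e20     # assumed, deliberately generous
years_to_heat_death = 1e100    # assumed order of magnitude
ns_per_year         = 3.154e16

log10_draws = (log10(machines_per_planet) + log10(inhabited_planets)
               + log10(years_to_heat_death) + log10(ns_per_year))
print(f"total nanosecond jackpot draws ~ 10^{log10_draws:.0f}")
print("odds for one characteristic    ~ 1 in 10^(10^123)")
print("log10 of those odds is ~10^123, so the draws cover a vanishing fraction")
```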
One can argue that if there are infinite numbers of universes, then not only are we
possible, but there are uncountable numbers of universes that only differ by such details
as the color choice of one woman's nail polish on a planet, or even by the path of one
inconceivably tiny neutrino. I regard infinity as a mathematical artifact with no relevance
to physical reality. Inserting infinity into an argument makes absolutely anything not
merely possible, but infinitely repeated - this is absurd as an argument. I suggest that the
concept of a lack of a predetermined limitation of a quantity is far from the same concept
as infinity, but this is outside the scope of this discussion.
If the quantum summation of all particle paths of all possible universes is not infinite, but
vast in size, what type of universe contributes the greatest variety and number of possible
particle path amplitudes to that summation and therefore, I claim, starts to exist at the
first Planck time? I answer the question with an argument for a twin universe, one half on
each side of the Big Bang. Each is identically and almost unimaginably perfectly physically
optimized, even to the finest, most obscure details, for the eventual opportunity for the
evolution of not merely life, but intelligent life, and not merely intelligent life, but living
technical civilizations. These can potentially expand to an apparently unlimited extent in
spacetime and with the largest capacity for possible events. These optimizations are
needed for a complete explanation of our existence. By Causal Dynamical Triangulations
the summation yields universes with three space and one time dimension, causal, foliated
in time and with an arrow of time. According to Ehrenfest, only three space and one time
dimensional universes can have stable planetary orbits and other vital characteristics, so
other types of universes are inimical to life anyway (see Barrow and Tipler 1988, p. 260).
"Causal" and "with an arrow of time" are simply aspects of the same thing
and are obviously necessary for Darwinian and quasi-Darwinian civilization evolution in
its widest, worldwide or even galactic sense. Time must be foliated because that is the
only means by which quantum mechanics can provide both open futures to allow
evolution, and a fixation by means of the collapse to solidify into reality any positive or
negative evolutionary change at some point in time for future generations to build upon
or eliminate.
How the exact values of the half-dozen fundamental universal constants fell within
extremely narrow limits, which happen to make our universe ideal for life to evolve
needs further explanation, since these constants appear to be independent of the
laws/symmetries. All the discussions I have read on this issue seem to ignore that the
Standard Model needs between 19 and 26 (25 or 26 if neutrinos have mass) additional
independent parameters of arbitrary values unrelated to the universal constants in order to
work. To avoid this embarrassment, supersymmetric particles corresponding to known
particles, but of much greater mass, have been hypothesized. None have been found. One
of the best discussions I have found says the usual list of 6 generally accepted universal
constants should be expanded to 19 by including such things as Boltzmann's constant
(Artigas 2004, a recommended web-available review presentation). I therefore
prefer to generalize the discussion to at least 45 arbitrary parameters, so far, and thus shift
the probabilities far further toward impossibility. Paradoxically, using a concept I call
Quantum Quasi-Teleology, more such parameters give more scope for what might be
called Extreme Fine-Tuning. So quantum quasi-teleology, if it existed at the origin,
would have unlimited scope in the fineness of tuning of all these parametric dials. This
is important because the theoretical biochemical requirements for de novo production of
life from non-life appear ever more improbable as more is known.

Optimizing our Bi-Universe by Quantum Quasi-Teleology


To summarize, this essay has agreed that it is plausible that our universe originated in a
quantum process involving a sum over all those geometries consistent with a foliated,
causal, arrowed time to produce our bifurcated space-time/anti-time. Call these
Feynman/Loll Universes. A different type of total, the largest finite symmetry group (E8),
determines and assembles the inhabitants of the subatomic particle zoo and also sets
the natural laws that define their permissible interactions. Call Feynman dual universes
operating in both Loll and Lie Group terms Feynman-Loll-Lie Universes. This is a truly
beautiful solution as to how the universe arose and took its basic form. The universal
wave function/state vector filled all the finite symmetries at the origin, much as energy
fills all the available degrees of freedom in the present universe. Nobody needs to put
these key features of the universe in "by hand", as physicists say; they flow
automatically from the model.
I suggest that a subset of such Feynman-Loll-Lie Universes had a maximal sum over all
potential particle paths for all potential events (maximal intricacy, defined as maximal
future historical options). In these universes these constants have to be infinitesimally
close to the ideal Goldilocks levels for not merely life, but technically advanced,
ubiquitous civilized life. Such a selection of potential values of the universal
constants/parameters might statistically bias the origin process. This is because the
quantity of potential particle paths in a universe containing conscious life is many orders of
magnitude higher than that of lifeless ones. Imagine a hunter-gatherer catching a fish
from a coral reef and then sailing an outrigger canoe far away to a new island. He has
captured and transported countless particles in his life. Having metabolized them, he will
finally deposit the remaining ones in Hawaii. Then imagine an astronaut on vacation who
captures a fish in the same place, eats it, and takes some of its particles to Mars. Now
imagine a dead world where all that the same particles can do is erode from rocks and roll
with a river to the ocean and sink to the bottom as sediment. The motion and metabolism
of the fish and all their respective ancestors and future predators have provided far more
particle fates than the dead river/dead world scenario. Obviously the astronaut has far
more options for particle fates than the Polynesian fisherman, hence the bias toward
technical civilizations in maximizing the intricacy sum.
Just imagine all the unnatural particle paths it took to build your car and drive the last
10,000 miles. Imagine if you had bought a different model built in a different country or
had driven different routes. Every displaced air molecule as you drive or iron atom in the
car adds to (in fact was always included in) the giant superposition at the origin needed
for you to exist and live the life you chose. Lifeless universes, as contributors to the sum
of all possibilities at the origin, are negligible by comparison. As mentioned above, Roger
Penrose proposes that a sufficiently large entity can collapse itself out of superposition
into reality by the spacetime distortion caused by its mass, in an objective wave function
collapse or self-decoherence. If we adapt this to the case of the original quantum
fluctuation that created our double universe, and accept the summation ideas above, we
have ex nihilo creation of a very comfortable universe for life, with no need for a prior
screenplay by God. Such a sum of all paths will give a universe extremely close to the
optimum for us and others like us to exist.
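Penrose's proposal comes with a simple order-of-magnitude estimate: a superposed mass should self-collapse on a timescale of roughly τ ≈ ħ/E_G, where E_G is the gravitational self-energy of the difference between the superposed mass distributions, here crudely approximated as Gm²/R for a uniform ball of radius R. The sketch below is a hedged toy estimate with an assumed density, not Penrose's own calculation; it simply shows how steeply the collapse time falls as the fluctuating mass grows:

```python
# Toy Penrose self-collapse timescale: tau ~ hbar / E_G, with E_G ~ G m^2 / R
# and R taken from an assumed ordinary solid density.  Illustrative only.
import math

HBAR = 1.054571817e-34   # J*s
G = 6.674e-11            # m^3 kg^-1 s^-2
DENSITY = 2000.0         # kg/m^3, assumed

def collapse_time(mass_kg: float) -> float:
    radius = (3 * mass_kg / (4 * math.pi * DENSITY)) ** (1 / 3)
    e_grav = G * mass_kg**2 / radius
    return HBAR / e_grav

m_c60 = 720 * 1.6605e-27          # roughly the Buckminsterfullerene mass
for m in (m_c60, 1e-18, 1e-14):   # molecule scale, large-virus scale, dust-speck scale
    print(f"mass {m:.2e} kg -> toy collapse time {collapse_time(m):.2e} s")
```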
Imagine a huge proliferation of different potential, not actual, universes: some that would
collapse back to nothing in little time, some that would be filled with black holes, not stars,
some that would be close to ours but empty of possibility for life, each with a certain sum of
potential particle paths. Imagine this as a giant dice table in imaginary space in which
possible futures can be pictured. If we are to choose at random, let's throw about 60
different multidimensional, heavily loaded dice with trillions of minutely different face
values at this expanded probability landscape. Surely we will produce one of a sheaf of
only minutely different initial universes completely dominating this landscape of
probability amplitudes, those which are eventually to be filled with long-lived,
widespread technical civilizations. No matter how absurdly unlikely any combination of
the approximately sixty minutely precise choices of parameters may appear to us as an
outcome, that was not true at the Origin. When the dice were thrown, the relative
quantum amplitudes of the most intricate futures were decisive, and remember, ALL
improbabilities are explored, squared and summed, no matter how "impossible" we might
say they were, if we were asked for our opinion. I call this idea Quantum Quasi-Teleology at the
origin. Because there is no time at the point at which this phenomenon is hypothesized, it
is not ordinary teleology where, contrary to causality, future events supposedly affect the
past, because past and future do not yet exist. It is, instead, a weighting mechanism to bias
the Creation mechanism overwhelmingly into one extremely narrow class of futures.
Contrary to ordinary logic, the more fine-tuned parameters there are available to be tuned, the
more effective the process of optimization, and the more likely a result favorable to
advanced life is to eventually appear.
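A toy numerical sketch of the weighting just described, with everything invented for illustration: a single tunable "dial", an assumed intricacy function sharply peaked at an ideal setting, and a selection probability taken proportional to the squared amplitude. Uniform random throws of the dial almost never land near the ideal value, but the amplitude-squared weighting picks essentially nothing else:

```python
# Toy "quasi-teleological" weighting: candidate universes differ in one dial;
# intricacy (an invented, assumed functional form) is sharply peaked at an
# ideal setting; selection probability is taken proportional to amplitude^2.
import math, random

random.seed(1)
IDEAL, WIDTH = 0.007297, 1e-4      # pretend Goldilocks setting and tolerance (assumed)

def intricacy_amplitude(dial: float) -> float:
    return math.exp(-((dial - IDEAL) / WIDTH) ** 2)

candidates = [random.uniform(0.0, 0.02) for _ in range(200_000)]
weights = [intricacy_amplitude(d) ** 2 for d in candidates]     # |amplitude|^2

print("plain random throws :", [round(d, 5) for d in candidates[:5]])
print("amplitude^2-weighted:",
      [round(d, 5) for d in random.choices(candidates, weights, k=5)])
```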
I predict that the commonly known Goldilocks fundamental physical constants like the
strength of gravity or the electron/proton mass ratio will be succeeded by more and more
discoveries of improbable coincidences, and acceptance of more of the less well-known
tunable parameters that make civilization and technological progress more likely. For
example, the coefficients of thermal expansion of iron and concrete are very close. So
iron-reinforced concrete can tolerate temperature changes from greater than +45 °C to −45 °C
without cracking. It is made from some of the commonest elements synthesized in stars,
especially on rocky planets with life, where two of the ingredients, limestone and coal,
will be very common. The others (iron, sand and gravel) should be ubiquitous, their
constituent elements being synthesized in stars in relative abundance, making
iron-reinforced concrete an extremely cheap and convenient material to build large-scale
structures on a wet, rocky planet. There will be vastly many other coincidences,
especially in biophysical chemistry related to the origin of life. There must be a truly
massive and detailed series of them, on this hypothesis, since the origin of life appears to
be so extraordinarily improbable biologically, even given the right available chemicals on
suitable planets and suitable symmetries. This argument does not imply that this is the
best of all possible worlds, simply that it is one of the ones that, at the origin, offered the
best potential of allowing eventual, cosmically widespread technical civilizations.
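As a hedged worked example of the iron/concrete coincidence, using typical handbook values (steel roughly 12 x 10^-6 per °C, concrete roughly 10-12 x 10^-6 per °C; the exact figures vary with mix and alloy):

```python
# Differential thermal strain between rebar and concrete over a 90 degree C swing.
ALPHA_STEEL = 12e-6       # 1/degC, typical handbook value (assumed here)
ALPHA_CONCRETE = 10e-6    # 1/degC, low end of the usual range (assumed here)
DELTA_T = 90.0            # degC, roughly -45 to +45
BEAM_LENGTH = 10.0        # m, illustrative

mismatch_strain = abs(ALPHA_STEEL - ALPHA_CONCRETE) * DELTA_T
print(f"differential strain           : {mismatch_strain:.1e} (dimensionless)")
print(f"free mismatch over a 10 m beam: {mismatch_strain * BEAM_LENGTH * 1000:.1f} mm")
print(f"with a 10x larger mismatch    : {10 * mismatch_strain * BEAM_LENGTH * 1000:.0f} mm")
```

The second figure gives a feel for how much less forgiving reinforced concrete would be if the two coefficients were not so closely matched.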
A more subtle prediction of this proposed quantum quasi-teleological theory is that we
should no longer regard the 19 or 26 arbitrary parameters needed to make the Standard
Model work as an embarrassment, but rather as a benefit. Therefore there is no need to
hypothesize the existence of supersymmetric particles. This quantum quasi-teleological
theory predicts supersymmetric particles will never be found, however much effort is
spent looking. It therefore meets Sir Karl Popper's criterion: a testable theory.

Quantizing Religion and Ethics?


Does this mean there is no God? Is it valid to ask who or what threw the
quasi-teleological dice? What if we admit that the dice threw themselves: the 60 or so
Creation quantum dice, each with its vast, unimaginable number of multidimensional
faces, as part of the heavily biased process? Then we may no longer have the need for this
God assumption to explain Creation, but could God exist? Let's look again at how we
have described the universe. Let us define a god as something that is aware of (really
part of) our every thought and action, that pervades and rules everything, that controls the
physics yet provides us free-ish will, and that produces a fundamentally benign Creation. Then
the Universe I described above meets a reasonable definition of a god, but in the form of
a universal quantum computer. We are always one with it at every moment, since we and
it exist only as an integral part of the wave function of everything everywhere creating
the next absolute moment. Is such a god real? Obviously, yes. Is it conscious? Eventually,
yes. We and all the other intelligent species in both universes on each side of the Big
Bang are the intelligence of God. I find the thought that I am an integral part of such an
overwhelming, beautiful, fascinating and perfectly physically optimized universe/god
very comforting. Of course, there should be only one, capital-G God, the Universal,
unique, quantum hacker of everything, on each side of the Big Bang.
When did such bifurcated quantum Gods, as defined above, arise? I assume a God has to
be self-aware. Clearly, as the two halves of the complete universe separate in time,
intelligence has arisen. So such a quantum God will at last be able to gaze, via creatures
like ourselves, upon His creation with an increasing level of understanding in each semi-universe. It seems clear, though, that the original quantum, quasi-teleological Creation
was the result of the massive potential intricacy of both halves added together. The two
demi-Gods are the eventual creators of all of the intricacy (since the universal quantum
computers create everything, eventually). But monotheism is preserved in each half of
our dual reality.
Can we construct an ethics on this basis? What is this God's will? If the universe was
created by maximal future opportunity for intricacy, that is, the most open particle future
paths attainable, then expansion of responsible freedom is Godly. Truthful, efficient and
effective education is Godly. Space travel and extra-planetary colonization is a moral
imperative, in order to spread and preserve intelligent civilization. It is un-Godly to
enslave anyone or to dominate women and keep them ignorant. It is wrong to kill one another
except, for example, in self-defense or for the preservation of a culture more dedicated to
intricacy maximization (basically, freedom and self-actualization for the maximum
proportion of a population) against another less so or violently opposed to the very idea.
It is wrong to unnecessarily simplify ecosystems or societies.
As we consider this ethical model conceptually, it can be expanded to an entire
Humanistic (actually Universalistic, including aliens) moral philosophy, in which
Humanity's/civilized intelligences' caring development would be paramount. I would
suggest it is most consistent with this God's will to continue with the principle which
built his Creation, as a moral imperative, whether God was there at the Beginning in
self-aware form or not. This tentative Humanistic Intricacy Maximization Ethics, on that
basis, is written in the universal constants, the symmetries, and the geometry of our
beautiful, enormous Universal home, which popped out of nothing, for free. This ethic is
simply the preservation of the most open, unrestrained, civil future for conscious entities,
for as long as possible. Perhaps it is best summarized as maximizing the opportunities for
everyone to ascend Maslow's Hierarchy of Needs. I myself find it spiritually uplifting,
knowing myself to be part of the mind and hands of God, helping the most powerful of
physically imaginable Gods to create the next moment of reality. I can choose to follow
God's own creative imperative, future intricacy maximization, as best I can judge, to
decide my next course of action. So can everyone, but I am not suggesting that everyone,
or every society, will conform to this ethic or try to. I suggest that societies that do so are
likely to become powerful and dominant, as history has shown so far.

Acknowledgements
This essay would not have been possible without the encouragement and patient guidance
of Melvyn Melrose, James Benford, and Michael Lockwood. Any remaining errors are of
my own creation.
References
Ambjorn, J., Jurkiewicz, J., Loll, R. (2009) Quantum gravity as sum over
spacetimes, arxiv.org/abs/0906.3947v2.
Ananthaswamy, A. (2013). http://www.newscientist.com/article/dn22144-brain-might-not-stand-in-the-way-of-free-will.html#.VM2DinYdS0Y
Artigas, M. (2004). Lecture, "The anthropic principle: science, philosophy or
guesswork?" http://www.unav.es/cryf/veneciama.html#title21
Bak, P. (1996). How nature works: the science of self-organized criticality. New
York, NY, USA, Copernicus.
Barrow, J. D. and F. J. Tipler (1988). The Anthropic Cosmological Principle.
Oxford, New York, Oxford University Press.
Bell, J. S. (1987). Speakable and unspeakable in quantum mechanics : collected
papers on quantum philosophy. Cambridge, New York, Cambridge University
Press.
Bojowald, M. (2008) Follow the bouncing universe, Scientific American
libserver.wlsh.tyc.edu.tw/sa/pdf.file/en/e081/e081p036.pdf.
Brooks, M. (2015) The secret life of reality, New Scientist, 225 (302) P.26.
Castagnino, M. and O. Lombardi (2004). Self-Induced Decoherence and the
Classical Limit of Quantum Mechanics. PhilSci Archive, http://philsci-archive.pitt.edu/1883/.
De Arcangelis, L., et al. (2006) Self-Organized Criticality model for Brain
Plasticity. DOI: 10.1103/PhysRevLett.96.028107
Elert, G. (1998-2014). The Standard Model. The Physics Hypertextbook.
http://physics.info/standard/
Feynman, R. P. (1985). QED: the strange theory of light and matter. Princeton,
N.J., Princeton University Press.
Feynman, R. P., et al. (2010). Quantum mechanics and path integrals. Mineola,
N.Y., Dover Publications.
Fölsing, A. (1997). Albert Einstein: a biography. New York, N.Y., U.S.A., Viking.
Heisenberg, W. (1958). Physics and philosophy: the revolution in modern
science. New York, Harper.
Icke, V. (1995). The force of symmetry. Cambridge, New York, Cambridge
University Press.
Iskhakov, T., et al. (2012). "Polarization-Entangled Light Pulses of 10^5
Photons." Phys. Rev. Lett. 109 (15).
Jensen, H. J. (1998 ). Self-Organized Criticality: Emergent Complex Behavior in
Physical and Biological Systems. Cambridge, Cambridge University Press.
Karolhazy, F., et al. (1986). On the possible role of gravity in quantum state
reduction. In R. Penrose and C. J. Isham (eds.), Quantum concepts in space and
time. Oxford, New York: Clarendon Press; Oxford University Press.
Lederman, L. M. and C. T. Hill (2004). Symmetry and the beautiful universe.
Amherst, N.Y., Prometheus Books.
Leslie, J. (1989). Universes. London, New York, Routledge.
Lisi, A. G. (2007). An Exceptionally Simple Theory of
Everything. http://arxiv.org/abs/0711.0770
Lisi, A. G. (2008). TED talks: Garrett Lisi's Theory of everything.
http://www.ted.com/talks/garrett_lisi_on_his_theory_of_everything?language=en
Lockwood, M. (2005). The labyrinth of time: introducing the universe. New York,
Oxford University Press.
Loll, R. (2008) The Emergence of Spacetime, or, Quantum Gravity on Your
Desktop, arxiv.org/abs/0711.0273v2.
Lorenz, E. N. (1993). The essence of chaos. Seattle, University of Washington
Press.
Marsden, J. E. and F. J. Tipler (1980). "Maximal hypersurfaces and foliations of
constant mean curvature in general relativity." Physics Reports 66: 109-139.
Penrose, R. (1996). "On Gravity's Role in Quantum State Reduction " General
Relativity and Gravitation 28(5).
Penrose, R. and C. J. Isham, eds. (1986). Quantum concepts in space and time.
Oxford, New York: Clarendon Press; Oxford University Press.
Pitts, T. (1998) Dark matter, antimatter and time symmetry,
arxiv.org/html/physics/9812021.
Rees, M. J. (2000). Just six numbers : the deep forces that shape the universe.
New York, Basic Books.
Ruelle, D. (1991). Chance and chaos. Princeton, N.J., Princeton University
Press.
Sachs, R. G. (1987). The physics of time reversal. Chicago, University of
Chicago Press.
Spinney, L. (2015). http://www.newscientist.com/article/mg22530030.500-the-time-illusion-how-your-brain-creates-now.html?full=true#.VM2HB3YdS0Y
