
Randomness

Plamen Valkov
Amsterdam University College
Big Questions in History (Big History)
Essay
Instructor: Dr. Fred Spier

The concept of randomness is one filled with questions, assumptions, and indefinite answers. Let us begin by considering an example of a random occurrence in
our everyday lives. It is often assumed that the toss of a coin can be considered to
have a random outcome. Initially this claim seems completely reasonable as the
probability of it landing as heads or tails seems equally likely, 50 percent for each
outcome. However, upon closer examination of the possibly infinite factors involved
during a coin toss, one can begin to see that the outcome of this toss may in fact be
anything but random. For instance, if one examines the coin under a powerful microscope and extensively catalogues the imperfections on each of the coin's sides, along with their effects on the coin's aerodynamic performance, it becomes fairly certain that the probability of either heads or tails will shift slightly higher or
lower. This process can be repeated with as many factors as observable, such as wind
speed and direction, atmospheric pressure, force distribution of the finger flipping the
coin, and even the total mass of the particles the coin will displace along its trajectory while being flipped. The point of all of this is that with each factor taken into consideration,
the certainty of either outcome changes. As new factors are considered and accounted for, the probability assigned to one particular outcome grows closer and closer to 1. However, due to the infinite number of potential influencing factors, this probability approaches 1 only asymptotically and will likely never reach it. Just like the example with
the coin flip, this factor-by-factor analysis can be conducted with any event considered to be
random, and similar effects on the probabilities of its potential outcomes will be
observed. This ultimately raises the following question: Is there any event that can be
considered truly random? This essay will not attempt to definitively answer this
question but instead give thought to an assumption that forms the basis of so many scientific observations: that natural phenomena can occur randomly, or on the basis of chance. In order to investigate this question further, however, it is important to set definitions for these terms. Henceforth, the word random will be used to
describe events that occur seemingly independently and do not exhibit a pattern,
meaning predictions cannot be made about future occurrences. Chance will be used
to describe events that occur out of a set of other possible events that all seemingly have the same probability of occurring.
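To make this asymptotic convergence concrete, consider the following toy model, sketched in Python. It assumes, purely for illustration, that each newly measured factor resolves a fixed fraction (here one half) of the remaining uncertainty about the outcome; the numbers are arbitrary and carry no physical meaning.

    # Toy model: each additional measured factor resolves a fixed
    # fraction of the remaining uncertainty about a coin toss.
    # The resolved fraction of 0.5 is an arbitrary illustrative choice.

    def outcome_certainty(n_factors, resolved_fraction=0.5):
        """Probability assigned to the predicted outcome after
        accounting for n_factors influencing factors."""
        uncertainty = 0.5  # a fair coin: 50 percent to begin with
        for _ in range(n_factors):
            uncertainty *= (1 - resolved_fraction)
        return 1 - uncertainty

    for n in (0, 1, 5, 10, 20):
        print(n, outcome_certainty(n))
    # the certainty climbs toward 1 but never reaches it exactly

Under these assumptions, the certainty after n factors equals 1 - 0.5^(n+1), which approaches 1 asymptotically, mirroring the behaviour described above.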
This essay will attempt to examine the role of random or chance occurrences
throughout the course of Big History and thus draw attention to the question of
whether or not any occurrence can be considered to be truly random at all. This
examination will begin with the Big Bang and the subsequent events leading up to the
formation of stars and galaxies. After this, the evolution of life on Earth will be
considered, starting from its emergence and focusing on the mechanism of evolution: random mutations resulting from errors in the copying of DNA. Finally, human
attempts to create devices for random outcomes will be considered.
According to our current scientific paradigm, the universe emerged over 13.7 billion years ago in a single, violent event: the Big Bang. The moment our universe materialized, an immense amount of matter and energy, previously packed together at unimaginable density, was released. Since the second law of thermodynamics dictates that the entropy of an isolated system can only increase over time, the universe at that moment was in the lowest state of entropy it has ever been in. The matter and energy packed together at the Big Bang can be said to have been the most ordered our universe has ever been, gradually decreasing in order (increasing in entropy) as they expanded in every direction. Since we have yet to
figure out where all of this undifferentiated matter and energy came from, supposing
it just appeared, it could be argued that the trajectory of its expansion was the most
random event to have ever occurred. Following that, if one were able to track the trajectory of every single particle of matter and energy, along with the temperature of the universe, events such as the formation of the first protons, neutrons and, later, chemical elements might become entirely predictable.
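The version of the second law invoked here can be stated compactly in standard notation: for an isolated system, of which the universe as a whole is the prime example, the entropy S never decreases over time,

    \frac{dS}{dt} \geq 0,

so the entropy at the moment of the Big Bang is a lower bound on the entropy of the universe at every later time.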
Another question worth exploring is the baryon asymmetry of our universe, in other
words, the supposed excess of matter, as opposed to antimatter. During its emergence,
it is estimated that the quantity of matter exceeded that of antimatter by one extra
particle for every 10 billion pairs. Due to this occurrence, the universe was able to
exist in its material form instead of all matter and antimatter annihilating back into leftover energy. The reasons for this slight asymmetry are not yet known; however, research has made it apparent that if this process had been completely random, no such imbalance would have existed. Particles oscillating between states of matter and
antimatter have been observed to decay into either state and should have decayed
equally into either state in the early universe. This, however, was not the case, leading
researchers to believe that unknown factors or entities may have affected this process
of particle decay in the early universe, creating the circumstances for the emergence
of greater complexity. The baryon asymmetry of our universe can be seen as an example of how, even at this early stage, pure randomness had become impossible.
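The scale of this imbalance is commonly expressed through the baryon-to-photon ratio. Using the approximate observational value (figures vary slightly between measurements),

    \eta = \frac{n_B - n_{\bar{B}}}{n_\gamma} \approx 6 \times 10^{-10},

which is consistent with the one-extra-particle-per-ten-billion-pairs figure quoted above: had \eta been exactly zero, essentially all matter would have annihilated with antimatter into radiation.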
Following the emergence of matter and antimatter, in the form of baryons,
circumstances became favourable for electrons to emerge too. This was essential to
the creation of chemical elements due to the role of electrons in surrounding atomic nuclei, making atoms electrically neutral. These electrons also made it possible to interlink the nuclei of multiple chemical elements, thus giving rise to molecules and hence greater
complexity. The formation of the first galaxies is thought to have occurred through the spontaneous clumping of these primordial materials under the force of gravity, possibly aided by dark matter. The fact that the exact mechanisms behind galaxy formation are still largely unknown can give us an idea of the sheer number of factors that need to be taken into consideration when modeling this clumping of matter. It is
likely that this clumping was not spontaneous at all but rather a result of hundreds of
thousands of years of expansion during which matter and energy constantly interacted
through thermodynamics, electromagnetism, radiation and gravity. It is even possible
to conceive that if all factors behind the clumping of matter are taken into
consideration, one may be able to calculate the point of origin for all matter that is
part of a certain galaxy, solar system, star or planet. When all factors are examined,
where can there be room for randomness?
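The intuition that such a calculation is possible in principle rests on the fact that classical gravitational dynamics is deterministic and time-reversible. The following Python sketch illustrates this with a deliberately minimal toy system: two point masses in one dimension, in arbitrary units, integrated forward with a time-reversible (leapfrog) scheme and then run backward to recover their starting positions. None of the numbers are meant to be physical.

    # Toy illustration: gravitational motion is deterministic and
    # time-reversible, so a trajectory can be traced back to its origin.
    G = 1.0                 # gravitational constant, toy units
    x1, x2 = 0.0, 1.0       # initial positions of two unit masses
    v1, v2 = 0.0, 0.0       # initially at rest
    dt = 1e-4               # time step, toy units

    def accel(x1, x2):
        r = x2 - x1
        a = G / (r * r)     # attraction per unit mass
        return a, -a        # the bodies pull toward each other

    def step(x1, x2, v1, v2):
        # leapfrog (velocity Verlet): a time-reversible integrator
        a1, a2 = accel(x1, x2)
        v1 += 0.5 * dt * a1; v2 += 0.5 * dt * a2
        x1 += dt * v1;       x2 += dt * v2
        a1, a2 = accel(x1, x2)
        v1 += 0.5 * dt * a1; v2 += 0.5 * dt * a2
        return x1, x2, v1, v2

    for _ in range(1000):                   # run forward in time
        x1, x2, v1, v2 = step(x1, x2, v1, v2)
    v1, v2 = -v1, -v2                       # reverse the velocities
    for _ in range(1000):                   # run backward in time
        x1, x2, v1, v2 = step(x1, x2, v1, v2)
    print(x1, x2)                           # back near 0.0 and 1.0

Reality is incomparably harder, involving billions of interacting particles, radiation, and possibly dark matter, but the example captures why, in principle, complete knowledge of the factors would leave no room for randomness.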
Within the first billion years after the Earth formed, some 4.5 billion years ago, the first signs of life began to emerge. Since then, the process of evolution has ensured a constant increase in variety
and complexity. Beginning with single celled organisms and branching forward and
out into the vast number of kingdoms, phyla, classes, orders, families, genera and
species that inhabit the Earth today, evolution has been the driving force behind the
ability of living organisms to survive, in relative terms, the ever-changing conditions
on planet Earth. Organisms pass on their genetic information from generation to
generation through DNA and RNA, code-storing molecules found in the cells of a given organism. The codes stored in DNA and RNA are like
complete sets of instructions for the growth and development of the organism that
houses them. These codes dictate the function and location of cells, the way a particular organism metabolizes, how to make all of the proteins necessary for its survival, and, perhaps most importantly, how to pass the genetic information itself on to the organism's descendants. During the lifetime of an organism, DNA needs to be constantly
replicated due to the demand for new cells to replace old ones. New cells arise through cell division, during which a complex process called mitosis takes place, in which the cell's nucleus, along with the genetic information it houses, is split in two. Before this split occurs, however, the genetic information of the cell
(housed in tightly wound strands of DNA, forming chromosomes) needs to be
replicated so that the new cell contains the exact same amount of it as the old one.
The replication of DNA is an extensive process, which involves unwinding the entire
molecule and copying each strand of the double-stranded helix separately. This
process is not without errors, however. Occasionally, small errors occur during the
process of DNA replication, which give rise to mutations in an organism. These
mutations are thought to be equally likely to be either beneficial or harmful to the
organism, depending on the external environment. If the mutation is beneficial, the
organism will have a better chance of passing on its genetic material (containing this
mutation) and if it is not, the mutation will not be passed on. This is what drives the entire process of evolution: seemingly random errors in the replication of an organism's genetic code, which give it either an advantage or a disadvantage. Errors in DNA replication can be argued to be anything but random, though. Logically, errors
in replication make sense for the long-term evolution of a species, because they allow the genetic code to change enough for the species to adapt to a changing environment. To illustrate this point, consider brown bears before the
beginning of the last ice age. If brown bears had an impeccable DNA replication
mechanism, which minimized the probability of errors, it is likely the first white bear
would never have existed. The error in the coding for the protein that controls the pigmentation of their fur would never have occurred. This means that brown bears would never have evolved white fur to better camouflage themselves in their new surroundings (snow) and might have gone extinct due to the inability to stalk prey undetected.
On the other hand, if errors in the replication of DNA occurred too frequently,
organisms may have faced the problem of too many non-beneficial mutations
(supposing the probability of a mutation being non-beneficial is 50 percent) occurring
during their lifetime, which would have negatively affected their ability to pass on
their genetic information. Therefore, it could be argued that part of the reason for the
evolution of sexual reproduction, as opposed to asexual reproduction, is to select against DNA replication mechanisms that let through too many errors. As has been
observed, when it comes to sexual selection, an organism's ability to thrive in its
surrounding environment is not the only important factor. Phenotype also plays a vital
role in attracting a mate, due to the fact that external signs showcase the state of an
organism's health. Taking this into consideration, an organism with a DNA replication mechanism that allows more errors than average to occur may exhibit a phenotype that deviates too far from the norm. Thus, said organism would be less
likely to attract a mate and its mutations, regardless of whether they are beneficial or
not, along with its error-prone DNA replication mechanism, will not be passed on.
Considering the pressures of both natural and sexual selection, it can therefore be argued that the likelihood of errors in DNA replication has, in fact, been kept in balance throughout the timespan of life's existence on Earth.
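A back-of-the-envelope calculation suggests the scale on which this balance operates. The figures below are commonly cited approximate values, and they vary by organism and by study, but the arithmetic itself is simple: the expected number of new errors per genome copy is the per-base error rate multiplied by the genome size.

    # Approximate, commonly cited figures; exact rates vary by
    # organism and by study. Expected errors per genome copy =
    # per-base error rate * genome size.
    genome_size = 3.2e9           # human genome, base pairs (approx.)
    raw_error_rate = 1e-5         # polymerase alone, per base (approx.)
    repaired_error_rate = 1e-9    # after proofreading and repair (approx.)

    print(genome_size * raw_error_rate)       # ~32,000 errors per copy
    print(genome_size * repaired_error_rate)  # ~3 errors per copy

Without proofreading and repair, every cell division would introduce tens of thousands of errors; with them, only a handful slip through, enough to fuel evolution without overwhelming the organism.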
Recent findings suggest the errors themselves are also not completely random, as was
previously thought to be the case. Changes in the genetic material that occur at the molecular level are now thought to be guided by the physical properties of the genetic code, as well as by the necessity to preserve the functions of essential proteins. Current research suggests that errors in DNA replication occur most often at so-called mutation hotspots, making the probability of beneficial mutations higher than the probability of non-beneficial ones.
When it comes to human history, there is no shortage of devices created with
the aim of providing random outcomes. Most of these devices have been made in
connection to some sort of games involving gambling. This underlines human
fascination with the concept of chance, which has been explored by philosophers and
scientists alike for thousands of years. Although countless such devices exist, one of
the earliest can be said to be the die (dice in the plural). Despite the fact that the mass
manufacturing of dice only began during the industrial revolution, they have been
around since before recorded history, with some of the oldest being estimated to date
back more than 5000 years. A die is typically a cube with rounded edges and six faces, each of which contains a number of dots ranging from one to six. Although multiple variants
of the die exist, mainly non-cubic or non-numeric ones, the six-faced numeric die
seems to be the most popular of them all. Early dice were made of materials such as
wood, bone or ivory and were hand-carved. Due to the imperfect nature of human
craftsmanship, it is reasonable to assume that these early dice were a lot more biased than their modern counterparts, with imperfections giving certain outcomes a higher probability than others. As technology has advanced, so has our ability to make maximally random dice. With the
onset of the industrial revolution, new machines and techniques for plastic molding assisted the crafting of uniform sets of dice, which now exhibited a much more equal probability distribution across their different outcomes than their older counterparts. Even
today, however, the manufacturing of dice remains an imperfect process. Casinos, for
example, purchase dice at premium rates from manufacturers who ensure maximum precision in their production process in order to achieve the most random dice possible. However precise these processes become, though, dice will never be truly random, due to the possibly infinite number of factors affecting their outcome during any given throw.
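How such residual imperfections reveal themselves is easy to sketch. The following Python snippet simulates a die whose sixth face is given a small, entirely hypothetical 2 percent extra weight; over enough throws, the bias separates cleanly from the uniform expectation of 1/6.

    # Simulating a slightly biased die; the 2 percent extra weight
    # on face six is a hypothetical illustration, not measured data.
    import random

    faces = [1, 2, 3, 4, 5, 6]
    weights = [1, 1, 1, 1, 1, 1.02]   # face six slightly favoured

    n_throws = 600_000
    counts = {f: 0 for f in faces}
    for _ in range(n_throws):
        counts[random.choices(faces, weights)[0]] += 1

    for f in faces:
        print(f, counts[f] / n_throws)
    # face six lands near 0.1694 and the others near 0.1661,
    # versus the 0.1667 expected of a perfectly fair die

With enough throws, even a tiny bias becomes statistically visible, which is why no finite manufacturing precision can ever make a die perfectly random.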
In conclusion, the universe seems to be exhibiting a general trend of going
from the purest form of randomness (the initial trajectory of particles and energy
immediately after the Big Bang) to processes increasingly controlled by other factors. It can be argued that with increasing complexity, the degree of randomness
of events seems to decrease. Similarly, the evolutionary path of life on our own planet is a lot more controlled than we may previously have thought. Seemingly random errors in the coding of DNA, which drive the entire evolutionary process, are surrounded by systematic constraints and may not be random at all. Although we have been
consciously aware of our inability to predict certain outcomes throughout human
history, we have been constantly trying to understand the countless factors, and their
interplay, that can exert influence on any given event. Although a full understanding
will likely never be achieved, striving towards it is an essential characteristic of our species. Consequently, the following can be claimed: if all factors from the beginning of time are taken into consideration, which may very well be impossible, the rest of Big History may turn out to be entirely predictable.
