Eventors
By: Doron Shadmi
A part of the history of our universe is written by light.
According to the Special Theory of Relativity (SRT), this history has a cone-like shape, because the speed of light in a vacuum has a finite value (299,792,458 m/s), which is considered one of Nature's constants.
Let an eventor be any event that has a lightcone.
An eventor can be a single particle, a cluster of galaxies or any event between them.
Each eventor has two bodies:
a) The TimeBody, which is the eventor's timeline existence.
b) The SpaceBody, which is the eventor's past lightcone.
By SRT, there cannot be any relationship between two eventors if their lightcones do not meet.
Furthermore, the most up-to-date information about some eventor cannot lie beyond the intersection between any two SpaceBodies.
Any information that is not interfered with is actually isolated by the gap between any two eventors' SpaceBodies:
Gravity can be understood as an anti-expansion phenomenon that slows down the expansion process of our universe:
Gravity is the tendency of some eventor to return to its basis along its TimeBody.
From this point of view, a black hole is an eventor that has the tendency to return to the basis of its space/time body, along its TimeBody.
The TimeBody of some eventor is actually its most integrated state that has a constancy property, which is recognized by us as rest mass.
Particles like photons have no rest mass, and they exist along the side of the lightcone, which determines the SpaceBody of any eventor that has a rest mass.
Complexity can progress where there is some equilibrium between gravity and expansion along the TimeBody of some eventor.
For example, our planet is an eventor that has this equilibrium, which dramatically increases the probability of the existence of complex systems.
These systems have the ability to replicate themselves in a way that also includes slight deviations, called mutations.
Self-replication and mutations are the proper conditions for life-form progression, and together with proper environmental conditions, there can be an evolution process that leads to the existence of intelligent life forms.
An intelligent life form has the ability to be aware of the connection between its TimeBody and its SpaceBody.
This connection progresses along a curve that actually describes the self-aware states of some eventor.
The next pages describe briefly (in a non-technical way) an idea called the CyberneticKernel, which tries to formulate the progression of self-aware states along the TimeBody (timeline) of an eventor.
About Life
This paper is meant to contribute something to our understanding of the Life phenomenon.
My point of view is based on a new concept of the Language of Mathematics, where instead of the common Natural numbers I use a new information form called the "Organic Number", or ON.
Let redundancy be the case where more than one copy of the same entity can be found.
Let uncertainty be the case where more than one unique name is related to an entity.
ONs are based on a logical reasoning called "Complementary Logic", where redundancy and uncertainty are used as fundamental properties.
From a global point of view, the fundamental concept is Symmetry, which is measured by its internal redundancy_AND_uncertainty fundamental properties.
The standard set is the case where redundancy_AND_uncertainty cannot be found as fundamental properties, because we deal with broken symmetry, which is recognized as an information form where both cardinal and ordinal are already well-known.
A multiset (in the case of a finite collection) is the case where the cardinal value is well-known but the ordinal is unknown; therefore multisets' existence is based on less well-known conditions, and they can be understood as more fundamental than the "normal" sets.
But from a global point of view, where symmetry is the fundamental concept, the system is the fading transition between a "pure" multiset (for example: {x,x}) and a "pure" set (for example: {{x},x}), and vice versa.
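The set/multiset distinction above can be sketched with Python's built-in types (the encoding is a hypothetical illustration, not part of the paper's notation): a set collapses redundancy entirely, while a Counter behaves like a finite multiset, keeping the cardinal (how many) but not the ordinal (which copy is which).

```python
from collections import Counter

items = ['x', 'x', 'x', 'x']

as_set = set(items)           # "normal" set: redundancy is collapsed
as_multiset = Counter(items)  # multiset: cardinality survives

print(len(as_set))                # 1
print(sum(as_multiset.values()))  # 4
```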
Von Neumann modeling is no more than the particular case that is based on infinitely many nested levels of the {{x},x} information form, which is actually a non-finite binary tree:

0 = { }
1 = {{ }} = {0}
2 = {{ },{{ }}} = {0,1}
3 = {{ },{{ }},{{ },{{ }}}} = {0,1,2}
4 = {{ },{{ }},{{ },{{ }}},{{ },{{ }},{{ },{{ }}}}} = {0,1,2,3}

[The original diagrams, which drew each level as a binary tree with False (F) and True (T) branches, were lost in extraction.]

This binary tree stands at the basis of the natural numbers, because the information form of a 2-valued excluded-middle logic is based on False_XOR_True, the fundamental building block of the standard mathematical logical system.
If we envision a collection from this point of view, we immediately realize that ZF or Peano's axiomatic systems are no more than the particular case where redundancy_AND_uncertainty are not used as fundamental properties of these axiomatic systems.
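The von Neumann nesting shown above can be sketched in a few lines of Python; this is a minimal illustration (using frozensets as the set representation, my own choice) of the construction 0 = { }, n+1 = n ∪ {n}.

```python
def von_neumann(n):
    """Return the von Neumann ordinal n as nested frozensets."""
    ordinal = frozenset()
    for _ in range(n):
        ordinal = ordinal | frozenset([ordinal])
    return ordinal

# Each ordinal contains exactly its predecessors, so it has n elements.
for n in range(5):
    assert len(von_neumann(n)) == n

# 2 = {0, 1}:
assert von_neumann(2) == frozenset([von_neumann(0), von_neumann(1)])
```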
An ON is an ordered information form between a set and a multiset, where its exact place in some collection is determined by its internal symmetrical degree, measured by redundancy_AND_uncertainty.
Let us draw a partial case of ON 4 for better understanding.
If a, b, c, d are used to define uniqueness, then we get:
[The original diagrams were lost in extraction. They plotted the forms of ON 4 along two axes, Redundancy and Uncertainty, ordered by symmetrical degree from the "pure" multiset to the "pure" set:

{x,x,x,x}
{{x,x},x,x}
{{{x},x},x,x}
{{x,x},{x,x}}
{{{x},x},{x,x}}
{{{x},x},{{x},x}}
{{x,x,x},x}
{{{x,x},x},x}
{{{{x},x},x},x}

Labeled examples such as {a,a,a,a}, {a,b,a,a}, {a,b,a,b}, {a,a,a,d}, {a,a,c,d} and {a,b,c,d} marked the uniqueness of each form. Standard mathematical language uses only the no-redundancy_no-uncertainty symmetry {{{{x},x},x},x}.]
Symmetry
Let x be a general notation for a singleton.
When a finite collection of singletons has the same color, it means that all singletons are identical, or have the maximum symmetrical degree.
When each singleton has its own unique color, it means that each singleton in the finite collection is unique, or the collection has the minimum symmetrical degree.
Multiplication can be operated only among identical singletons, whereas addition is operated among unique singletons.
Each natural number is used as some given quantity, where within this given quantity we can order several different sets that have the same quantity of singletons but differ by their symmetrical degrees.
In a more formal way, within the same quantity we can define several possible states, which exist between a multiset and a "normal" set, where the complete multiset and the complete "normal" set are included too.
As this example of transformations between multisets and "normal" sets shows, the internal structure of n+1 > 1 ordered forms is constructed by using all previous n >= 1 forms:

1
(+1) = {x}

2
(1*2) = {x,x}
((+1)+1) = {{x},x}

3
(1*3) = {x,x,x}
((1*2)+1) = {{x,x},x}
(((+1)+1)+1) = {{{x},x},x}

4
(1*4) = {x,x,x,x}   <-- Maximum symmetrical degree, minimum information clarity degree (no uniqueness)
((1*2)+1*2) = {{x,x},x,x}
(((+1)+1)+1*2) = {{{x},x},x,x}
((1*2)+(1*2)) = {{x,x},{x,x}}
(((+1)+1)+(1*2)) = {{{x},x},{x,x}}
(((+1)+1)+((+1)+1)) = {{{x},x},{{x},x}}
((1*3)+1) = {{x,x,x},x}
(((1*2)+1)+1) = {{{x,x},x},x}
((((+1)+1)+1)+1) = {{{{x},x},x},x}   <-- Minimum symmetrical degree, maximum information clarity degree (uniqueness)
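The nine forms of quantity 4 can be written as nested tuples and checked to carry the same quantity whatever their symmetrical degree; a minimal sketch (the tuple encoding is my own choice, not part of the original notation):

```python
def leaves(form):
    """Count the leaf singletons ('x') of a nested form."""
    if form == 'x':
        return 1
    return sum(leaves(part) for part in form)

forms = [
    ('x', 'x', 'x', 'x'),             # {x,x,x,x}
    (('x', 'x'), 'x', 'x'),           # {{x,x},x,x}
    ((('x',), 'x'), 'x', 'x'),        # {{{x},x},x,x}
    (('x', 'x'), ('x', 'x')),         # {{x,x},{x,x}}
    ((('x',), 'x'), ('x', 'x')),      # {{{x},x},{x,x}}
    ((('x',), 'x'), (('x',), 'x')),   # {{{x},x},{{x},x}}
    (('x', 'x', 'x'), 'x'),           # {{x,x,x},x}
    ((('x', 'x'), 'x'), 'x'),         # {{{x,x},x},x}
    (((('x',), 'x'), 'x'), 'x'),      # {{{{x},x},x},x}
]

# Every form carries the same quantity, 4.
assert all(leaves(f) == 4 for f in forms)
```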
This ordered collection allows us to use ONs in order to define a time-dependent model called the Cybernetic Kernel.
In my opinion, the Cybernetic Kernel model can be used in order to improve our insights about the transition between organic and inorganic chemistry.
Today we know that there were tiny irregularities in the Big Bang's space/time fabric, and these irregularities are maybe the fundamental conditions which allowed the existence of galaxies and clusters of galaxies, which have a foam-like shape when observed from a great distance.
This foam-like shape is the result of opposite tendencies of Energy/Matter integration/differentiation fluctuations.
These fluctuations and their dynamic results can be found at any observed scale of our universe.
From the second law of Thermodynamics we know that there is a global tendency in the observed universe which actually eliminates the difference between integration and differentiation, until these fluctuations no longer express the clear and highly ordered Energy/Matter phenomena that can be observed in the early and present universe.
If this is the destiny of our universe, then we can ask: how did the original fluctuation, whose thermodynamic death we observe, come into being?
Another question is: do we interpret correctly the Energy/Matter integration/differentiation fluctuations in the observed universe?
Let us examine a different model of these observed fluctuations:
By using the inflation model of the Big Bang, we say that the first fluctuation had a very high correlation, which allowed the very early universe to "speak" the same fundamental "language", called the laws of nature.
If we describe the front of the first fluctuation in terms of information, then this front was characterized by a highly symmetrical degree that had a very high redundancy degree of the first information structures, which were created at the very early stage.
But we must not forget that these identical information structures hold an elastic-like "memory" of several different nonlinear degrees of space/time curvature, which are "aspiring" to a singular state from different "points of view".
These different "points of view" of different nonlinear degrees of space/time curvature actually prevent a smooth return of each information structure to the singular state.
The result of this non-smooth return is the diversity of different information structures that can be observed in our universe.
The Organic Number is the model that describes the progression of the Cybernetic Kernel along a timeline, as a result of the return of these information structures to their singular state.
Cybernetic kernels are information structures based on a self-reference property.
There is a direct ratio between the smoothness of a Cybernetic Kernel with a high degree of the self-referential property and the complexity of the information structures which are based on this Cybernetic Kernel.
There is also a direct ratio between a Cybernetic Kernel and the self-aware states that can be found in non-trivial complex systems like living creatures.
At this stage most of the observed information structures have the tendency to become "Cybernetically flat" in the long term (which is recognized as entropy), but by this model there is the possibility that in the very long term there will be more information structures that are based on "smooth" Cybernetic Kernels, and life phenomena, which we are a part of, will be the main phenomenon that shapes our universe.
Organic Numbers and Cognition
Let us examine this situation:
On a table there is a finite unknown quantity of identical beads > 1.
We have:
A) To find their sum.
B) To be able to identify each bead.
Limitation: we are not allowed to use our memory after we count a bead.
By trying to find the amount of the beads (representing Locality) without using our memory (representing Nonlocality), we find ourselves stuck at 1, so we need an interaction between Nonlocality and Locality if we wish to be able to find the beads' sum. By canceling the limitation mentioned above, condition (A) is satisfied and the beads' amount is known; for example, the value 3. Let us try to identify each bead; but they are identical, so we will identify each of them by its location on the table.
But this is an unstable solution, because if someone takes the beads, puts them between his hands, shakes them and puts them back on the table, we have lost track of the beads' identities. Each identical bead can be the bead that was identified by us before it was mixed with the other beads.
We shall represent this situation by:
((a , b , c),(a , b , c),(a , b , c))
By notating a bead as, let's say, 'c', we get:
((a , b),(a , b),c)
and by notating a bead as 'b' we get:
(a,b,c)
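The collapse of identities above can be sketched in Python; in this hypothetical encoding (my own, not the paper's) each bead is either a fixed name or a frozenset of still-possible names, and fixing one name shrinks every other bead's candidate set.

```python
def fix_name(state, index, name):
    """Fix `name` on the bead at `index`; remove it from the other
    beads' candidate sets, resolving any bead left with one candidate."""
    new_state = []
    for i, bead in enumerate(state):
        if i == index:
            new_state.append(name)
        elif isinstance(bead, frozenset):
            rest = bead - {name}
            new_state.append(next(iter(rest)) if len(rest) == 1 else rest)
        else:
            new_state.append(bead)
    return new_state

# ((a,b,c),(a,b,c),(a,b,c)): every bead may carry any name.
state = [frozenset('abc')] * 3
state = fix_name(state, 2, 'c')   # ((a,b),(a,b),c)
state = fix_name(state, 1, 'b')   # (a,b,c): the last name is forced
print(state)                      # ['a', 'b', 'c']
```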
We satisfy condition (B), but through this investigation we define a universe which is the result of Nonlocality\Locality interaction. This result can be used for further mathematical development.

Through this simple test we have found that ZF or Peano axioms "leap" straight to the "end of the story", where cardinal and ordinal values are well-known. As a result, many forms that have many clarity degrees are not researched. Organic Mathematics uses Distinction as a first-order property, and as a result superposition of identities is one of its fundamentals.

Through this simple test we get the insight that any mathematical concept is first of all the result of memory\object (abstract or non-abstract) interactions (it is not totally ideal and not totally real [1]; for further details please see pages 15-16). Let us examine what kind of ONs we get if each information form is based on memory\object interactions. Since each ON is at least an association between our memory and some object(s), its form is based on interactions between at least two opposite properties: Nonlocality (memory) \ Locality (objects).
By using memory\object interactions as the basis of Organic Numbers, the researcher is basically educated to be aware of himself during research. This fundamental attitude enables us to define and develop the bridging between Ethics and Formal Logic. An example of such development can be shown by the idea of Cybernetic Kernels:
There are 6 different CKs in this diagram, which are ordered by the number of their self-interferences. If we give an "elastic" property to CKs, then CK1 is changed to CK6, and the level of ON5 Cybernetic Efficiency is increased at each step. When the Cybernetic Efficiency is increased, the ONs' redundancy and uncertainty levels are reduced, which enables complexity and self-awareness to develop. We think that both Ethics and Formal Logic have a common principle, which is: to develop the bridging between the simple and the complex under one comprehensive framework that is aware of its results (it is naturally responsible).
The Complementary Space/Time (CoST) model:
Connectivity, or integration, is the property that is recognized by us as time, or correlation among different entities. A timeline of some universe is the most connected state, where no discrete phenomenon exists and all we have is a smooth connectivity without space (no measured place).
Non-connectivity, or differentiation, is the property that is recognized by us as space, or non-correlation among entities. A space of some universe is the most disconnected state, where no smooth phenomenon exists and all we have is discreteness without time (no timing) or correlation (no measured flow).
Our universe is both time_AND_space, and this complementary relation can be found at any researched level within and without us. A cone and a sphere are two separate models of a universe, where a sphere is a closed universe (it has a "start", a "middle" and an "end" along a timeline) and a cone is an open universe (it has a "start" but no "middle" and no "end" along a timeline).
A timeline in both models is like the "spine" of a universe, where any space/time phenomena change relative to it. Space/Time is a complementary fading transition between "pure" time (the timeline) and "pure" space (the surface). In other words, time and space are the polarities of the same phenomenon called a universe, whose history exists as a complementary space/time environment that has common "laws of nature" determined by the timeline, which is actually the attractor of a universe.
This timeline can be a single timeline, which is the attractor of a single universe (closed or open):
A timeline can also be one branch that belongs to a tree-like attractor that has a fractal-like property:
"A graphical representation of the multiple inflationary bubbles of Andrei Linde's "self-reproducing inflationary universe". Linde's theory is one attempt to generate a "world ensemble," or ensemble of varying universes (within a larger Universe) in which the physical laws and properties may differ from one universe to the next. Changes in shading represent "mutations" in basic physical properties from "parent" to "offspring" universes. (Figure after Andrei Linde, "The Self-Reproducing Inflationary Universe," Scientific American 271 [November 1994]; p. 55.)"
If we produce a cross-section and examine an arbitrary slice of a universe, whose space/time fabric is the result of integration/differentiation tendencies between "pure" time and "pure" space, then a natural equilibrium between these "purities" has the shape of an Archimedean-like curve, for example:
This Archimedean-like curve is considered the optimal zone, where complex phenomena like life, for example, can find stable and rich enough conditions in order to progress to self-aware non-trivial complex systems.
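An Archimedean spiral, r = a·θ, keeps a constant radial gap of 2πa between successive turns, which is one way to read the "natural equilibrium" between the center (pure time) and the surface (pure space). A minimal sketch of sampling such a curve (the constant a is an arbitrary choice here):

```python
import math

def archimedean_spiral(a, thetas):
    """Sample the curve r = a*theta as (x, y) points."""
    return [(a * t * math.cos(t), a * t * math.sin(t)) for t in thetas]

a = 1.0
points = archimedean_spiral(a, [i * math.pi / 8 for i in range(64)])

# The radial distance between one turn and the next (theta vs.
# theta + 2*pi) is constant: 2*pi*a.
t = 3.0
r1 = math.hypot(*archimedean_spiral(a, [t])[0])
r2 = math.hypot(*archimedean_spiral(a, [t + 2 * math.pi])[0])
assert abs((r2 - r1) - 2 * math.pi * a) < 1e-9
```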
Summary:
This paper expands the wave-particle duality to a whole universe, where a universe is a complementary phenomenon that exists between two opposite properties, which are integration and differentiation.
Integration is understood as gravity, and differentiation is understood as expansion.
The most integrated state is understood as the fourth dimension, which is time, or the timeline.
The 3 other dimensions are the observed space, which is ordered relative to the timeline that is considered its attractor.
So the history of a universe is the story of space/time complementary associations along the timeline.
Without this timeline, no fundamental conditions can appear as natural laws of a universe.
With this model we can examine the idea of rich enough conditions in the space/time fabric, which could explain the origin and progression of life phenomena along the timeline.
The next part of this research is to develop a new fundamental mathematical language, using insights coming from Quantum Mechanics, where redundancy and uncertainty are fundamental properties of its axiomatic system.
By doing this, we actually re-examine the whole scientific cosmological research in a new light, where the researcher himself is both observer_AND_participator.
From this point of view any result at any level (and not just at the QM level) is influenced by the researcher, and the researcher has to include this influence as an inseparable part of his results.
By using the word 'result' we mean that by this model, ethical results must also be considered an organic part of scientific research and development, where 'development' has two bases, which are our technical skills and our ethical skills, combined into one comprehensive scientific method that can help us to survive the power of our developed technology along the timeline.
The Ideal and the Real
OM's development is possible because we determine the limits of the researchable by using the weak limit (Emptiness) and the strong limit (Fullness). Cantor distinguished three levels of existence:
1) In the mind of God (the Intellectus Divinum)
2) In the mind of man (in abstracto)
3) In the physical universe (in concreto)
By using Fullness as "that which has no successor", we show that Cantor's in abstracto Transfinite system is not an actual infinity. We also show how Distinction is a first-order property of any collection. These developments are based on a cognitive approach to mathematical science. In "On the Reality of the Continuum" [1] (page 124) we find this sentence:
"From the realist standpoint, numbers and other real things do not need admitting or legitimating by humans to come into existence."
From the idealist standpoint, numbers and other real things do need admitting or legitimating by humans to come into existence. In both cases the term "real thing" has to be understood.

According to the realist, if "real things" are "real" iff they are totally independent of each other, then no collection is a "real thing" (total independency does not allow things to be gathered). According to the idealist, if "real things" are "real" iff they are totally dependent on each other, then no collection is a "real thing" (total dependency does not allow things to be identified). No collection exists in terms of total dependency (total connectivity) or total independency (total isolation). Since totalities are not researchable on their own, any research cannot avoid the existence of collections, where collections are the only researchable "real things". Actually, we find that a researchable realm is both ideal (has relations) and real (has elements).

We have to notice that there is no symmetry in using concepts like the "realist standpoint" in order to understand "real things", because if the requested result is "real things", then we actually give a privilege to the realist standpoint over the idealist standpoint concerning the requested "real thing". This asymmetry can be avoided by changing the requested results to "researchable things" instead of "real things". In that case the concept of Collection is researchable exactly because it is not totally real and not totally ideal.
Here is the last part of the quote from [1]:
"Furthermore, real objects are always legitimate objects of study in the sciences, even if they are not fully understood or known."
We agree with this quote because "real objects" are valuable for science iff they are researchable, or in other words, iff they are both real and ideal.
Reference
[1] Anne Newstead & James Franklin, "On the Reality of the Continuum", Philosophy 83 (2008), pp. 117-27. http://web.maths.unsw.edu.au/~jim/newsteadcontinuum.pdf