## July 30, 2014

### Old family photo posts have been moved to a new page.


## July 19, 2014

### A Curious Way to Represent Numbers: Ternary Factor Tree Representation

This post is only likely to be of interest to those interested in the hidden backwaters of math, and maybe not too many of them. It also has little to do with the other posts here so far.

The conventional system of number representation cannot exactly represent most numbers. Fractions whose denominators contain prime factors not found in the base (for base 10, any factor other than 2 and 5) have infinite repeating decimal representations. Square roots and other irrational numbers have infinite, non-repeating representations.

In the late 1990s I came up with an alternative system that can exactly represent any rational or irrational number as well as most transcendental numbers using only a finite number of bits. This type of representation is based upon the idea of factored representations of integers extended in a logical way to a nearly universal system of describing number structure.
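
As a small illustration of the underlying idea (a sketch of factored integer representation only, not the full ternary factor tree scheme, which this post does not spell out), a positive rational number can be written exactly as a finite map from primes to integer exponents, even when its decimal expansion is infinite:

```python
from fractions import Fraction

def factor_exponents(q):
    """Represent a positive rational exactly as {prime: exponent}.
    Negative exponents come from the denominator."""
    f = Fraction(q)
    exps = {}
    for n, sign in ((f.numerator, 1), (f.denominator, -1)):
        d = 2
        while d * d <= n:
            while n % d == 0:
                exps[d] = exps.get(d, 0) + sign
                n //= d
            d += 1
        if n > 1:
            exps[n] = exps.get(n, 0) + sign
    return exps

# 1/3 has an infinite decimal expansion (0.333...), but a finite
# factored representation:
print(factor_exponents(Fraction(1, 3)))    # {3: -1}
print(factor_exponents(Fraction(360, 7)))  # {2: 3, 3: 2, 5: 1, 7: -1}
```
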


## July 2, 2014

### Universe, Physics and Simulation


The chief business of science, particularly physics, is modeling the universe's phenomena. Modeling phenomena is also a definition of simulation. The universe may not be information, but all we can know of it is information. The universe may not be a simulation, but all the theories that we can make about it can only be tested through simulation and comparison of the information from the simulation with that from the universe.

The universe may in fact have characteristics of a simulation; it seems likely that models designed to resemble the universe will do so, and therefore that the universe will resemble the models just as well as the reverse – sometimes in unforeseen ways. Some of the characteristics of models that are commonly thought to be artificial or mere approximations may be capable of telling us secrets of how the universe really works.

Physicists spend a great deal of time with equations, but it is only when actual numbers representing a given situation are plugged in that these equations can be said to be a representation of anything in the physical world; in fact, to represent any kind of field in any but the simplest situations demands iterating the equations with numbers for every point in space and all velocities or wavelengths, which means plugging astronomical numbers of coefficients into the equations even for a crude approximation.

How to minimize the number of computations for a given level of accuracy and complexity is the central concern of simulation. For instance, space is often divided into a mesh which is sparse where the situation is simple and dense in more complicated regions. Another commonly used technique is, rather than storing enormous matrices with most entries zero, to store only the few entries containing informative numbers. Other types of compression are also used whenever possible, and compression itself is a rich subject.
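
The zero-skipping idea can be sketched in a few lines; this dict-of-keys class is a toy illustration, not any particular library's format:

```python
# A minimal dict-of-keys sparse matrix: only nonzero entries are stored,
# so a mostly-zero N x N field costs memory proportional to the nonzeros.
class SparseMatrix:
    def __init__(self, rows, cols):
        self.shape = (rows, cols)
        self.data = {}  # (row, col) -> value; zeros are simply omitted

    def __setitem__(self, key, value):
        if value == 0:
            self.data.pop(key, None)  # writing a zero keeps it sparse
        else:
            self.data[key] = value

    def __getitem__(self, key):
        return self.data.get(key, 0)  # missing entries read as zero

m = SparseMatrix(1_000_000, 1_000_000)
m[3, 7] = 2.5
print(m[3, 7], m[0, 0], len(m.data))  # 2.5 0 1
```

A trillion-entry matrix here costs one stored number, because everything else is implicit.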

If the universe resembles a simulation, it too should show evidence of compression techniques. One obvious one is to only store one copy of identical items, and just use a pointer to that copy wherever another such item appears. A bit more advanced is to only store the differences between near-identical items, together with a single prototype copy as above. This is essentially a programmer's sort of platonism. Even beyond that is compression of analogous structures more generally, which can quickly become quite complex.
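
The prototype-and-differences scheme can be sketched directly; the items and fields below are invented for illustration:

```python
# Prototype-plus-differences storage: identical items share one stored
# copy; near-identical items store only their deltas from a prototype.
prototype = {"shape": "snowflake", "arms": 6, "size_mm": 2.0}

def compress(item, proto):
    """Store only the fields where `item` differs from the prototype."""
    return {k: v for k, v in item.items() if proto.get(k) != v}

def expand(delta, proto):
    """Reconstruct the full item from the prototype plus its delta."""
    return {**proto, **delta}

flake_a = {"shape": "snowflake", "arms": 6, "size_mm": 2.0}  # identical
flake_b = {"shape": "snowflake", "arms": 6, "size_mm": 3.1}  # near-identical

print(compress(flake_a, prototype))  # {} -- a bare pointer to the prototype
print(compress(flake_b, prototype))  # {'size_mm': 3.1}
print(expand(compress(flake_b, prototype), prototype) == flake_b)  # True
```
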

This compression effect would also potentially seem able to account for some of the observations that led Rupert Sheldrake to propose the existence of morphic fields and morphic resonance. Once a prototype structure exists, it takes much less computation and storage for the universe to support similar structures, assuming the universe is simulation-like in compressing form. Crystals of new substances should be easier to create again once they have first formed elsewhere. Biological structures, behaviors, and for want of better words what I'll call “plots” and “tropes” should be similarly primed if the compression algorithm can handle such subtle and complex analogies.

More speculatively, if the universe's resemblance to a simulation is not mere appearance, then teleological questions arise of who is running the simulation and for what purpose. The biggest potential increment of efficiency for such an entity in simulating a universe would come from accurately simulating only the regions and events of interest, with sparse and approximate methods used in other regions. This could lead to glitches in the simulation such as are often seen in video games: lag and other time discontinuities, failure to load sections of the simulation properly, changes in item prototypes, non-player characters not performing when the simulation does not register that an NPC's inaction is perceptible to the player characters, conflicts between versions of the simulation when different PCs' high-detail regions come into contact, more radically different simulations coming into contact, continuity errors, objects and characters failing to load or loading twice, violations of physical law, cheat codes, hacking, and so on. All these, or similar effects, have been reported numerous times by different people. (See accounts on the “Glitch in the Matrix” subreddit.) It is often the case that the reporters' brains are glitching rather than the exterior simulation, but this sometimes seems to be ruled out by corroboration from other witnesses or by physical evidence. Sometimes these glitches seem purposeful, as when avoiding certain death or when missing items reappear in response to a request. Often, though, they seem to be true glitches, either mistakes or with no apparent purpose other than perhaps revealing the simulated nature of things.

It could also be possible that the universe is natural (more-or-less), with the simulation-like aspects being not artifacts but implicit in the universe's necessary informational self-consistency. Nevertheless, conscious beings arising in the natural universe could learn to hack it from the inside, causing glitches and intimations of purposefulness for other, less adept residents of the universe. The general rule of self-consistency is likely only relative to a given branch of implications of occurrences; inconsistencies define other branches of possibilities. (Perhaps in the beginning was the inconsistency 0=1 : the big bang followed because all propositions and their opposites can be derived from a single contradiction -- but there are branching patterns in the successive derivations of implications from that initial seed.)

Also see Daniel Burfoot's quite readable book on ArXiv, “Notes on a New Philosophy of Empirical Science”, particularly pages 8 to 29. (arXiv:1104.5466 [cs.LG], version 1 April 2011)

*Another post from the archives, this time from late March 2014. I overlooked it or I would have posted it earlier.*

## June 30, 2014

### A Hand-waving Introduction to Geometric Algebra and Applications

Geometric Algebra (GA; real-valued Clifford algebras, a.k.a. hypercomplex numbers) gives the only mostly-comprehensible-to-me account not only of higher spatial / temporal dimensions, but of physics in general. I have been studying GA now for over ten years. One of the best things about it is that nearly every paper using GA explains it from first principles before going on to use it for physics or computer science. Most physics papers in other fields seem to take a positive joy in obscure math and impenetrable jargon. I'll try here to give an even less mathematically difficult account of some of GA's implications than most GA papers do.

Given a set of n mutually orthogonal basis vectors, one vector for each independent dimension, a space of 2^n quantities results from considering all possible combinations of these basis vectors multiplied together. For instance, taking pairs of vectors from a 5D space gives 10 possible planes of rotation, a 4D space 6 planes of rotation, while in 3D there are only 3 independent planes of rotation. (The numbers of combinations of each size for n dimensions go as the n-th row of Pascal's triangle, the binomial coefficients.)

Sums of all the 2^n elements, each weighted by a different scale factor, give "multivectors", which are generalizations of complex numbers.
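
These counts can be checked directly; this sketch just evaluates the binomial coefficients for each grade of element:

```python
from math import comb

def grade_counts(n):
    """Number of basis elements of each grade (scalar, vector, bivector,
    ...) in an n-D geometric algebra: row n of Pascal's triangle,
    summing to 2**n."""
    return [comb(n, k) for k in range(n + 1)]

for n in (3, 4, 5):
    counts = grade_counts(n)
    # grade-2 elements (bivectors) are the independent planes of rotation
    print(n, counts, "planes of rotation:", counts[2],
          "total elements:", sum(counts))
```

For 3D this prints 3 planes of rotation out of 8 elements; for 5D, 10 planes out of 32, matching the counts above.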

Each of the basis vectors will have a positive or negative square. (Vectors' squares are always scalars, that is, real numbers.) In conventional relativity the signs of the basis vectors' squares, together called the "signature", are (+ - - -) or (+ + + -), with the sign that differs from the others belonging to time. When plugged into the Pythagorean theorem, the square of time can cancel out the squares of the spatial dimensions, giving a distance of zero when the spatial distance equals the time interval (time multiplied by c so that all units are in meters). This happens for anything moving at the speed of light. The zero interval is the amount of perceived or "proper" time for a light wave traveling between any two points. This light-speed type of path is also called a "null geodesic". To the photons of the microwave background, no time has passed since they were emitted, supposedly shortly after the universe began.
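
A quick numerical check of the null-interval claim, using the (+ - - -) signature with all quantities in meters:

```python
# Spacetime interval with signature (+ - - -):
#   s^2 = (ct)^2 - x^2 - y^2 - z^2
# The interval is exactly zero (a "null geodesic") when the spatial
# distance equals the time interval, i.e. for light-speed motion.
def interval_squared(ct, x, y, z):
    return ct**2 - x**2 - y**2 - z**2

print(interval_squared(5.0, 3.0, 4.0, 0.0))  # 0.0: light can connect these events
print(interval_squared(5.0, 1.0, 2.0, 0.0))  # 20.0: timelike separation
```
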

Now it is possible and actually quite useful for computer graphics to add a pair of dimensions with signature (+ -) to the usual spatial ones (+ + +). The sum and difference of the extra dimensions give an alternate basis for these two dimensions, but with the basis vectors squaring to zero (0 0). These "null dimensions" are called "origin" and "infinity". A projection from this augmented space down to 3D allows many other structures besides points and directions to be represented by vectors in the 5D space. For instance, multiplying 3 points gives a circle passing through those points, 4 points gives a sphere. If one of those points is the point at infinity, then the product is a line or a plane respectively. The other advantages of this way of doing things are too many to list here. This "conformal" scheme is actually quite easy to visualize and learn to use without getting into abstruse math by using the free GAViewer visualization software and its tutorials.
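
The null property of "origin" and "infinity" can be verified numerically. The normalization chosen here is one common convention; other authors scale these vectors differently:

```python
import numpy as np

# Basis vectors e+ (squaring to +1) and e- (squaring to -1) are the
# extra pair added to 3D space. Work just in that (+ -) plane, taking
# dot products under the metric diag(+1, -1).
metric = np.diag([1.0, -1.0])
dot = lambda a, b: a @ metric @ b

e_plus, e_minus = np.array([1.0, 0.0]), np.array([0.0, 1.0])
n_origin = 0.5 * (e_minus - e_plus)  # "origin"   (one common normalization)
n_infinity = e_minus + e_plus        # "infinity"

print(dot(e_plus, e_plus), dot(e_minus, e_minus))  # 1.0 -1.0
print(dot(n_origin, n_origin))                     # 0.0  (null)
print(dot(n_infinity, n_infinity))                 # 0.0  (null)
print(dot(n_origin, n_infinity))                   # -1.0
```

The sum and difference of a (+) and a (-) vector each cancel their own squares, which is exactly what makes the two null directions possible.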

One fellow at Intel extended this to having three pairs of extra dimensions, for a total of nine, so that general ellipsoids rather than just spheres could be specified, but the idea has not become popular since each multivector in it has 2^9 = 512 parts. The 32 parts of regular conformal 3D / 5D multivectors are hard enough to convince people to use. The 11 dimensions of superstring theory are not so well defined as conformal dimensions, since seven of the string dimensions are said to be curled up small, "compactified" in some complicated and unspecified fashion.

An interesting thing about the ( +++, +- ) signature algebra is that it is the same as one that has been proposed by José B. Almeida as an extension of the usual 3D+t (+++-) "Minkowski space" of relativity, augmenting the usual external time (-) with a second sort of time having positive square and describing internal or "proper" time (which in relativity will be measured differently by a moving external observer). But if it is assumed that everything in the universe is about the same age, then everything has comparable proper time coordinates, so proper time can be used as a universal coordinate corresponding to the universe's temporal radius. This gives a sort of preferred reference frame for the universe, which is ordinarily considered impossible. In this 5D scheme, not just light but also massive particles follow null geodesics, and from that single assumption can be deduced relativity, quantum mechanics, and electromagnetism; in addition, dark matter, the big bang, and the spatial expansion of the universe appear to be illusions.

The math is also easier than the usual warped-space general relativity, instead using flat Euclidean space and having light, etc., move more slowly near mass - that is, treating gravitational fields as regions of higher refractive index than regular space. This is also the case in gauge theory gravity (GTG), which likewise uses Geometric Algebra, though sticking to the usual 4D Minkowski space. GTG is the only alternative to general relativity that is in agreement with experiment, but it is also more consistent, easier, deals with black holes correctly (unlike GR), and is much easier to reconcile with quantum mechanics, which is itself much easier to visualize using GA. For instance, the behavior of the electron can be described fully by treating it as a point charge moving in a tight helix at light speed around its average path (a "jittery motion", in German "Zitterbewegung"). The handedness of the helix is the electron's spin, the curvature of the helix is the mass, and the angle of the particle around the helix is the phase.
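
The zitterbewegung picture can be sketched numerically. The drift speed, frequency, and units below are illustrative stand-ins, not the electron's actual Compton-scale values:

```python
import numpy as np

# Toy zitterbewegung: a point moving at light speed on a tight helix
# around a straight average path.
c = 1.0          # units where the speed of light is 1
v_avg = 0.6      # average (drift) speed along the axis
omega = 50.0     # angular frequency of the circular component (illustrative)

# For the total speed to be c, the circular speed must satisfy
# v_circ^2 + v_avg^2 = c^2, which fixes the helix radius r = v_circ / omega.
v_circ = np.sqrt(c**2 - v_avg**2)
r = v_circ / omega

t = np.linspace(0.0, 1.0, 2001)
x = v_avg * t                 # the average path
y = r * np.cos(omega * t)     # helical motion around it
z = r * np.sin(omega * t)

# The numerical speed along the trajectory is c everywhere (to grid accuracy).
vel = np.gradient(np.c_[x, y, z], t, axis=0)
speeds = np.linalg.norm(vel, axis=1)
print(round(speeds[1000], 3))  # ~1.0
```

A tighter helix (larger omega, smaller r) at the same drift speed corresponds in this picture to a larger curvature and hence a larger mass.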

Geometric Algebra is useful in all areas of physics and computer modelling of physics. GA has been successfully applied to robot path planning, electromagnetic field simulation, image processing for object recognition and simulation, signal processing, rigid body dynamics, chained rotations in general and many other applications. It gives very clear, terse and generally applicable, practically useful descriptions in diverse areas using a single notation and body of techniques.


### Outline of Relation of Quantum Mechanics and Information Theory

*[From an outline in an October, 2004 email to one of the members of the Ultranet@topica.com email list. Connections to the ideas and terminology of Christopher Langan's "Cognitive-Theoretic Model of the Universe" (CTMU) have been edited out for clarity, as have some mathematical speculations.]*

- Quantum Mechanics requires information theory

- Theories, measurements are just information
- Distinguishable states must differ by >=1 bit
- No outside agency besides the 2 minimally differing states themselves can do the distinguishing.
  - Otherwise the theory would have to explain not only how the 3rd thing distinguishes the 2 original entities from each other but also how it distinguishes itself from the other two.
- This requisite ability to distinguish is logically part of every distinguishable entity.
- This logical nature, this ability to distinguish information, is not just the basis for consciousness but a basic form of consciousness itself.

- Every system starts with a pattern reducible to some number of bits and must in any frame of reference be seen through interaction to have the same or greater number of bits as time goes on. (Thermodynamics.)

- The distinguishing of one thing from another creates a further entity, the relation between them, which is a new subspace embedded within the two and which will in turn embed itself.
- The increasing dimensions of the binary information vectors' phase space are expanding into an implicit unfilled possibility space which is dynamically infinite in extent.
- The informational expansion into the larger possibility space may show why branches are more frequent than rejoins at the quantum level and thus why time seems to run from a big bang through increasingly entropic (information-containing or perceiving) developments.

### How Big Is a Photon? The Conceptual Foundation of Quantum Mechanics

[From an October 2004 post on the Ultranet@topica.com email list.]

Daniel asked:

> Finally, here's the question: what's the width, if any, of a photon of visible light?

I wondered myself in classical electrodynamics what the physical location of the Poynting vector was within a wave - how is the energy distributed within the wave? In experiments the energy interaction happens at one spot, which can be localized to any degree required so long as the phenomenon used to register the interaction propagates with a small enough wavelength. When the "screen" used to register interactions has more resolution than the waves of the particles it is measuring, the individual interactions are each well localized on the screen but appear randomly within the region defined by the impinging waves.

Photons, like all quanta, are not objects but interactions. Elementary entities are waves when they are going someplace and particles when they get there. All individual observations are of particles; the wave propagation can only be inferred statistically from the distribution of particle interactions. Interactions, since they are not objects, do not themselves have a size. Everything about a given interaction is specified by what happened in a single instant and therefore what happened at a single point*, and the amount of this knowledge is restricted by the initial information content of the interacting particles and the fundamentally limited information conveyed by the particles resulting from the interaction.

The informational limits imposed by the necessity that two states must either be distinguishable or indistinguishable with respect to any given interaction lead to the necessity of quantization - things that differ must do so by at least one bit. This limited information conveyed by the particles resulting from the interaction also requires that a given system must have a maximum information capacity, and if one attempts to get more information out of an interaction with the system then the results of the interaction become fundamentally unpredictable. For some reason this unpredictability varies regularly in complex waves and their interference.

Since a system may have multiple parts, it is possible for the information content of some multi-part system with respect to some interaction to still be only one bit. Independent, separated interactions with different parts of the composite system with respect to that bit will logically be interdependent - this is entanglement.

The bits transferred by an interaction are all there is ever to be known about it.** One cannot go back to a given individual interaction and measure it again; such measurements would be interactions distinct from the original. Therefore observers cannot compare independent measurements of a given interaction, and the notion of its size is undefined beyond whatever bits were gained from the particles resulting from the interaction.

_________

*The point-junctions of the particle world-lines in Feynman graphs are being replaced by junctions with extent in more dimensions, similar to pipe junctions, for example. I think the gradualness of the separation of two such pipes (superstring world-sheets) is equivalent to the rate of decoherence of a state into two incompatible possibilities. This might be regarded as the "size" of the particle interaction.

**See "A Foundational Principle for Quantum Mechanics" by Anton Zeilinger


### Zitterbewegung Interpretation of Quantum Mechanics

*[From an April 2013 comment on an article about "time crystals" at (billionaire mathematician James Harris Simons') Simons Foundation: Perpetual Motion Test Could Amend Theory of Time]*

Continuous periodic motion is implied by basic quantum mechanics. The simplest interpretation of the quantum numbers of an electron in a hydrogen atom is that the electron really does orbit the proton. De Broglie’s matter waves were conceived as circular motions of a point particle, with the frequency found by noticing that energy is equivalent to both mass and frequency, the former scaled by c-squared and the latter by Planck’s constant. Schrodinger worked out the implications of the Dirac (electron) equation, calling the phenomenon “zitterbewegung”, meaning “trembling motion”. It is of very high frequency – 1.6E21 Hz = 1.6 zettahertz, or billion trillion cycles per second, double that of the De Broglie wave of an electron.
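These figures can be recomputed directly from the constants. A minimal sketch (note one assumption: the 1.6E21 value is taken here as the angular frequency 2mc²/ħ in radians per second; the corresponding cycles-per-second values differ by a factor of 2π):

```python
import math

# Electron rest energy and the frequencies derived from it (CODATA constants).
m_e = 9.1093837015e-31   # electron mass, kg
c = 2.99792458e8         # speed of light, m/s
h = 6.62607015e-34       # Planck constant, J*s
hbar = h / (2 * math.pi)

E = m_e * c**2               # rest energy, ~8.19e-14 J
f_de_broglie = E / h         # De Broglie frequency, ~1.24e20 cycles/s
omega_zitter = 2 * E / hbar  # zitter angular frequency, ~1.55e21 rad/s

print(f"{f_de_broglie:.3e}", f"{omega_zitter:.3e}")
```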

As Oersted Medal winner David Hestenes worked out (using his marvelously clear applied, real-valued Clifford algebras, or “Geometric Algebra”, a lingua franca for mathematical physics), zitterbewegung at its simplest is a helical, light-speed motion of a point charge around its average path.* Further, he found that the orientation of the helix is the electron spin, the curvature of the helix is the electron mass, the angle of the particle around the helix is the electron phase, and the helical motion creates a static magnetic dipole and a rotating electric dipole. This is far more comprehensible than the usual explanations (insofar as there are any usual explanations!). His interpretation was borne out by the discovery of an absorption of 81.1 MeV electrons in silicon crystals, due to the spatial zitter frequency and its electric dipole lining up at that speed with the spatial period of the crystal lattice. Before Hestenes’ explanation, the experimental results were so unexpected as to be implausible to most of the reviewers at the journal Physical Review Letters.

So the eternal and intrinsic helical motion of electrons in any state, including the ground state, is an established fact. How does that differ from the proposed time crystals?

*[It seems to me that this actually makes time simpler - if both light and electrons are constrained to move at c, then time for either sort of particle, rather than being some mysterious quantity with a square opposite in sign to the other dimensions (x^2 + y^2 + z^2 - (ct)^2 = -(ct)^2(1 - (v/c)^2)), becomes instead simply a distance, the hypotenuse in x^2 + y^2 + z^2 = (ct)^2. Also it should be noted that electrons can move in more complicated ways than single simple helices, and superpositions are possible.]* See Hestenes’ essay Electron time, mass and zitter at FQXi for more information.

## June 28, 2014

### Compression, Entanglement and a Possible Basis for Morphic Fields

*[From a June 26, 2014 draft of a letter to Rupert Sheldrake:]*

So the universe is analogous to a class of computational processes, some more efficient than others,

... at which point I'd like to pause and point out that this doesn't mean that the universe is a computation, or that it isn't, but that it obeys certain rules of consistency that are just like those in some computations, and equally that some computations are also exactly analogous to the rules of the universe, so that if the most efficient way of doing the computational process has certain methods or characteristics, then the operation of the universe will also have analogous characteristics. Compression is the essence of the operation of the computational processes that are analogous to the universe.

So the universe is analogous to a class of computational processes, some more efficient than others, with the most efficient being heavily favored as representations, which compress natural patterns of evolution of matter and fields so that required resources are minimized to model or instantiate the universe. These compressed representations of patterns have a supra-physical, informational component which is encoded in the thermal radiations of all matter and fields, which cause a cascade of entanglements which in turn have the history of the universe's changing patterns encoded within them. The entanglement of the particles in new patterns with those of past patterns requires the new pattern to be consistent with all the quantum informational constraints of the past patterns. The only consistent universes are those where all the past information from all past patterns is still implicit in each and every new pattern, sub-pattern and interaction. So the past patterns can serve as templates for later patterns, with a size-dependent degree of clarity, as with parts of a hologram, and allow effective compression of all similar situations in the past to each local region of the universe. The thermal radiation information field compresses __all__ similar past situations because it is not truly possible to erase information, but only to turn it into “heat”, which is basically just information that one has decided to ignore. Everything in the universe that “stays happened” (as opposed to quantum eraser-type situations) is on the permanent, ineradicable record.

These templates are patterns in both space and time, allowing for example the progressive elaboration of structures in the development of embryos, and so can most effectively be modeled by generative programs which produce the evolving state of the simulation or instantiation, rather than just static data, that is, efficiency implies not just compressibility but minimizing the Kolmogorov complexity of the computational processes analogous to the physical situations. This allows not just physical structures but patterns of behavior and modes of development to be optimized for their analogous computational processes' equivalents of memory space and processing power, and thus gives not just a memory but a super pattern-recognition capability in every part of the universe, which can read a developing situation and compare it with everything in the past light-cone, thus compressing it to effectively require only the new, original information content it embodies to be added to the thermal motions and radiations that communicate past interactions and patterns among the parts of the universe through quantum phases and entanglements' implications. The past patterns it embodies are already in the information field, but each repetition and close variation makes them “stronger”, or more compressed and efficient.
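The idea that each repetition of a pattern makes its representation "more compressed" can be illustrated with any dictionary-based compressor; a minimal sketch using Python's zlib, with arbitrary byte strings standing in for patterns:

```python
import zlib, os

motif = b"pattern-of-development-" * 100   # one motif, repeated many times
noise = os.urandom(len(motif))             # same length, no repeated structure

# Repetitions of an already-seen pattern are nearly free to encode;
# patternless data costs roughly its full length.
print(len(zlib.compress(motif)))   # a few dozen bytes for 2300 bytes of input
print(len(zlib.compress(noise)))   # ~2300 bytes: nothing to reuse
```

The compressor effectively uses what it has already seen as a template for what comes next, which is the analogy being drawn in the paragraph above.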

Effectively this is like compression with unlimited side information available. The information capacity of thermal radiation is enormous given that it has about 10^19 to 10^21 photons per joule. Even the milli-atto-joules characteristic of the smallest molecular motions give rise to photons. To see the potential power of this sort of compression, movies would be very easy to “compress” to send over a wire if the sender and all viewers already had a copy of every movie ever made as “side information” – only a serial number or tag code would need to be transmitted to “transmit” gigabytes of movie. (But in such a large data set as the universe's information field there probably is a shortage of short tag codes, codes shorter than the patterns they represent, even though the codes be context-dependent.)
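The 10^19 to 10^21 figure can be sketched by assuming a typical thermal photon carries energy on the order of k_B·T:

```python
# Rough count of thermal photons per joule: a typical thermal photon
# carries energy on the order of k_B * T, so ~1/(k_B * T) photons per joule.
k_B = 1.380649e-23   # Boltzmann constant, J/K

for T in (30.0, 300.0, 3000.0):
    photons_per_joule = 1.0 / (k_B * T)
    print(f"{T:6.0f} K  ~{photons_per_joule:.1e} photons/J")
```

Room temperature (300 K) lands near 2.4e20 photons per joule, in the middle of the quoted range.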

The information field, being in its heat diffusion the same as the wave equation with time replaced by imaginary time, implies that its dynamics occur in imaginary time, which is like a small cylindrical manifold with a particle that changes phase as it spirals along it helically, as in electron zitter motion, rather than staying at one angle on the cylinder as in normal time. (See Zitterbewegung comment on the article on “time crystals” on the Simons Foundation site, reposted here.) It is recurrent time, cyclical time, perhaps not time but eternity. And among the compressed patterns in the information field are all the people who ever lived and every thought and action they ever had or did. Not just the dead ones, either, nor just the distant past, but the past that starts a nanosecond ago, even a yoctosecond ago. In fact, the parts of the future that are implied by the past are already in the field, so it's really somewhat atemporal or eternal.

So the afterlife, precognition, remote viewing and telepathy are implications of this view. It even suggests how it is possible to give a remote viewing target with only an arbitrary code number. The code and the target are physically associated on the envelope or in the computer and the target information is sent via the code in the same way that the movies were “sent” in the example of compression with unlimited side information.

See Daniel Burfoot's book “Notes on a New Philosophy of Empirical Science”
(arXiv:1104.5466v1 [cs.LG], 28 Apr 2011) for more down-to-earth applications
of the idea of treating science as a data compression problem, compressing
vast quantities of experience and experiment down to pithy theories and
equations.

### Notes Toward a Theory of Eternity

*Yet more from November 2013, this time some unfinished notes I intend to come back to later:*

Notes for expansion:

- Shell of thermal radiation expanding from Earth during and following a life.
- Rotating searchlight of a life's radiation as the Earth rotates,
- Holes in the radiation pattern, shell due to absorption by Sun, planets, stars; re-emission, scattering, gravitational lensing;
- Transactional interpretation of QM demands both an absorber for every emitted photon and a potential for re-emission, thus literal eternity.
- *“Omnia sunt, lumina sunt.”* “All that is, is light” - All that can be known is information.
- Probability is always and only a measure of ignorance.
- Choices must be real to be real choices – morality cannot coexist with determinism, but requires multiple potential realities, universes.
- Internal dimensions for each entity are rotated by relative motion (Lorentz boosts), but quantum-correlated through all occurring in the same overall 4-space (pseudoscalar).

### Heat as Sound, Neural Impulses as Sound

*Another bit from November 2013:*

Bill Beatty wrote a thought-provoking
essay showing the unity of sound, heat, and electromagnetic
radiation. Heat is a form of sound, of very high frequency and wide
bandwidth. This is acknowledged in the theory explaining
superconductivity, which treats heat and sound as being composed of
quantum pseudo-particles called phonons. Heimburg et al. showed that neural
impulses are primarily sound-like, accounting for their low speed and,
because of their solitonic form, their lack of energy dissipation.
due to the thickening of the neural membrane during the pulse
increasing the separation between the charges on the inside and
outside of the membrane, resulting in a transient decrease in
capacitance which increases the voltage across the membrane.
Tree-like structures such as neurons have a rich spectrum of
mechanical resonances, largely due indirectly to the form of the
cytoskeleton, which determines the form and stiffness of the neuron.
Neural pulses also resonate with parts of the cytoskeleton and may
change its form, as the microtubules together with the layers of
ordered water surrounding them have non-linear ferroelectric and
topological quantum properties which are linked to discrete shape
changes of the microtubules which in turn affect the shape of axons,
dendrites and thus the neurons' mechanical resonances.
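The capacitance mechanism described above can be sketched with the parallel-plate formula C = εA/d; all numbers below (patch area, separated charge, permittivity, thickness change) are illustrative assumptions, not measured membrane values:

```python
# Membrane patch as a parallel-plate capacitor: C = eps * A / d.
# At (approximately) constant separated charge Q, V = Q / C, so a transient
# thickening of the membrane (larger d) lowers C and raises V.
eps = 8.854e-12 * 3.0   # vacuum permittivity x illustrative relative permittivity
A = 1.0e-9              # patch area, m^2 (illustrative)
Q = 1.0e-12             # separated charge, C (illustrative)

def voltage(d):
    C = eps * A / d     # capacitance of the patch at thickness d
    return Q / C        # voltage across it at fixed charge

v_rest = voltage(4.0e-9)    # resting thickness ~4 nm
v_pulse = voltage(4.4e-9)   # ~10% thicker during the pulse
print(v_pulse / v_rest)     # ratio ~1.10: voltage rises in proportion to thickness
```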

### Thermodynamics, Information and the Afterlife

*This was written around the end of November 2013:*

The 2nd law of thermodynamics states that in any closed system, entropy never decreases. The two apparent loopholes in this law, that entropy can decrease in open systems and can remain the same in either type of system, are not of interest here.

Entropy is equivalent to both information and disorder. Attempts to define information as *negative* entropy are wrong, as is defining information as order. The more orderly the arrangement, no matter what the context, the less information that can be embodied in the arrangement. Information is a measure of improbability, as is entropy. The term “entropy” is often abused to mean not the information embodied in a specific arrangement, but the class of all possible arrangements which look similar from a distance, or averaged together by coarse measurements. Nevertheless, all real cups of hot tea, even those indistinguishable to any macroscopic measurements and composed of absolutely identical constituent molecules, are at the molecular level entirely different in their components' positions and velocities, just as much as two identical pieces of paper with identical amounts of ink, one showing a humorous picture of a cat and the other a budget summary for the Wolverhampton waterworks.
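The notion of information as a measure of improbability can be made quantitative with Shannon's self-information, -log2(p); a minimal sketch:

```python
import math

def self_information_bits(p):
    """Shannon self-information: bits carried by an event of probability p."""
    return -math.log2(p)

# A fair coin flip carries exactly 1 bit;
# a one-in-a-million event carries ~19.9 bits.
print(self_information_bits(0.5))
print(self_information_bits(1e-6))
```

The less probable the arrangement, the more bits it takes to specify, which is why highly ordered (highly constrained) arrangements embody less information.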

Every event at the molecular (or any other) level which has lasting consequences creates information. That information is almost always encoded in heat. It gradually diffuses, becoming more and more entangled with other bits of information, so that one would have to know about more and more to have any hope of determining the original causes behind the motions. Creation and transmission of information require no energy dissipation in general, but erasing information does. The entropy of the information “erased” is not really destroyed, but moved to the outside environment, in the same way as heat is moved to the outside of a refrigerator. Even dropping information into a black hole does not destroy it; instead it is very gradually re-emitted in a scrambled form as the black hole evaporates. All molecular events with consequences that are in principle distinguishable from some other hypothetical course of events leave a permanent but increasingly scrambled record in thermal motions and thermal radiation. This permanent record is the physical substrate for what has been called the “Akashic Records”. There is a potential for long-range correlations to emerge in the detailed patterns of thermal motion, which in turn could lead to macroscopic correlations of pattern through a type of chaotic sensitivity to initial conditions which leads not just to variations in the location of systems on a given attractor, but to correlations of the type of attractor. [*Perhaps, but I now think a sort of consistency filter arising from quantum entanglements is a more likely mechanism.*]

Fourier is now best known for his “Fourier series”, which allow representing anything as a sum of sinusoidal waves of varying frequencies, amplitudes, and phases, and which are the basis of essentially all digital audio-visual techniques, but during his life he was known best for his work on heat diffusion. It turns out that the equations for heat diffusion are exactly the same as those for quantum mechanics, except that heat diffuses in Euclidean time while relativistic quantum mechanics demands a Minkowski space, meaning that time has a square opposite in sign to the squares of the spatial dimensions. The two can be converted by using “imaginary time”, that is, time multiplied by the square root of -1, *i*. (Or some other entity that squares to -1, of which it turns out there are several in Geometric Algebra / real-valued Clifford Algebras.) This procedure, known as a “Wick rotation”, converts the Schrodinger equation to the heat (diffusion) equation. This only makes sense for massive particles – for light-like particles, time has no independent meaning apart from the distance traveled; time is the hypotenuse in the Pythagorean theorem (x^2 + y^2 + z^2 = (ct)^2). (There are experimental results indicating that massive particles, or at least electrons, have no real mass but move in light-speed helices which give them the appearance of having mass. This is David Hestenes' Zitterbewegung interpretation of QM.)

The relation between the eternal time
of the thermal record and the Minkowski time of everyday experience
is thus a Wick rotation.
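Written out for the free-particle case, the rotation is explicit (a sketch, with the potential term omitted):

```latex
% Free-particle Schrodinger equation, Wick-rotated via t = -i*tau:
\[
i\hbar\,\frac{\partial\psi}{\partial t} = -\frac{\hbar^{2}}{2m}\,\nabla^{2}\psi
\quad\xrightarrow{\;t\,=\,-i\tau\;}\quad
\frac{\partial\psi}{\partial\tau} = \frac{\hbar}{2m}\,\nabla^{2}\psi ,
\]
% the heat equation with diffusion constant D = hbar/(2m).
```

Substituting t = -iτ turns the oscillatory factor e^{-iEt/ħ} into the decaying factor e^{-Eτ/ħ}, which is why the rotated picture behaves like diffusion.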

### A First Approximation to Mindspace

*The first few posts will be from past writings of mine. Here is most of a letter I wrote to a friend who had moved from being a pillar of the counterculture to espousing ultra-traditional Christian / Calvinist theology and young-Earth Creationism*:

Imagine Indra's net, filling all space and time with a web whose intersections are jewels, each reflecting all of the others. The jewels may also be seen through other schemata – Leibnizian monads, vertexes in Feynman diagrams, atomic perceptions/ perceivers of varied potentialities and probabilities. What seem from the physical point of view as particles (interactions) are seen from another point of view as perceptions whose collective patterns are thoughts. God is immanent in the totality of the net, these atomic perceptions are collectively a basis (in the mathematical sense) for the power set of all their possible permutations. This power set can be viewed in turn as the total mindspace, which has all possible perceptions and thoughts implicit in it. Parts of it are human thoughts (all the possible human thoughts), thoughts of particular groups, individuals, etc. Most of it is outside the region of human thought. Most of it is far less than God, the totality of the mindspace (plus parts of which we cannot speak) – yet also far more than human. A given jewel/interaction may be parts of beings at all levels: God, gods, angels, humans, animals, plants and cells. Because mindspace is not like physical space, these entities which are categorizations of sets and power sets of jewels may overlap like Venn diagrams.

Viewing the net as Feynman diagrams expressing physical reality, in particular thermodynamics/information theory (which are the same thing), a direction of time emerges from the tendency of information (entropy) to accumulate in the form of heat (2nd law). Viewed as a process in time, the net expands from a low-entropy state to fill more and more of the implicit unfilled space of possibilities. It never contracts again, it can only expand. It seems in physics that information cannot be destroyed, only scrambled, turned into heat and diffused across the universe, but any event which has had even a microscopic effect is on the permanent record of the universe. This heat/information diffusion is mathematically the same as Schrodinger's equation, but with ordinary time replaced by imaginary time (turning t into *i*t is called a Wick rotation). The equivalence works both ways. By rotating our view of the imaginary time of the universe considered as heat diffusion into the ordinary time of the wave equation, the scrambled record of the universe, including all the people who have ever lived, is translated back into ordinary time. This is the basis for the reality of the afterlife. The permanence of information in the heat diffusion / imaginary time view is eternity.

Morality does not enter on a particle level but in the aggregates. Morality, ethics, emotions, thoughts and so forth exist in the mindspace schema, not the physical schema, in the same way that my words exist in the application layer of the OSI model; on the physical layer there are only patterns of charges and spins suitable for interpretation by the higher layers. The difference is that in the physical view of the net, interactions/particles are not only atomic perceptions but equally atomic acts of will. The universe is alive down to the lowest level, not only alive but perceptive and willful. From these more complicated and subtle patterns are composed which are the thoughts and wills of more complicated and subtle beings. Their perceptions and intentions are of varying likeness to other, more general regions of human mindspace which we label imperfectly as “good”, “evil”, etc. These regions of mindspace itself are not changed by our labels on our maps of it, they exist objectively, yet our perceptions of them are necessarily subjective. Our perceptions of them are not the things themselves, though the “things” are ultimately made of atomic perceptions. Other levels of being have their own accordingly larger or smaller regions of mindspace that they perceive as good or evil. There are other categories beyond the basic two, and higher beings are more able to discern them as well.

God, I suspect, finds these limited regions of mindspace that humans regard as good and evil as being not particularly more interesting than the rest of Himself, and scarcely a drop in the bucket of his All. On the other hand, God the absolute can have no experience in the way we do; being All, with nothing outside himself, there can be no separation of subject and object. To have such experiences, he must limit his point of view, and to that extent he becomes less than absolute. The full range of experiences demands all possible types of splits between self/selves and other(s). Thus all our and others' views of the net and each other are a consequence of God's need to limit himself so that he can be not only everything but each thing. No matter what we do or don't do as our limited selves, for God nothing is undone, all is complete.

### Welcome to Mindspace & Minds' Basis

Welcome! This blog is to share some ideas I have been kicking around for a long time.

The first is "mindspace": mindspace is the mathematical space of all possible minds, which has all possible perceptions and thoughts implicit in it.

Parts (subspaces) of mindspace are human thoughts (all the possible human thoughts), thoughts of particular groups, individuals, qualia, motifs, tropes and so on. Most of it is outside the region of human thought. I wish to explore its nature, dimensions, units, operations, and applications.

The second idea that I'll be exploring here is that the second law of thermodynamics implies that information from any event cannot be destroyed, it just becomes more and more entangled among vast numbers of thermal photons and phonons (molecular vibrations), and that these may form a basis for mindspace as well as a potential physical and informational mechanism behind most other "paranormal" phenomena such as "morphic resonance", remote viewing, precognition, out-of-body experiences and the afterlife.

A third area of interest is in "Geometric Algebra" (GA), as reintroduced by David Hestenes. GA is also known as real-valued Clifford Algebra, but goes beyond mathematics by adding physical geometrical interpretations. GA is a fantastically flexible and concise language which unifies most areas of physics using a single notation.

I'll also likely post miscellaneous opinions and items about technology, economics, and whatever else of interest I happen to find.

