Harmonic Continuum Theory, Part 2

PART 2: The New Paradigm
Whatever truths come upon us, whatever future we have in store, we must gear ourselves up for a change of perspective.
We have been exposed to the nonsense of science for so long that it will be a hard effort to replace what “we think we know” with reality. The 'Conspiracy' against chaos theory has seen to that.
A spook once said, at a meeting of a society allegedly pledged to comparing philosophy, religion and science, that relativism was the enemy. Yet it is only by the power of analogy that we can reshape our world.
I wish to assert that our solar system is a mini model, a microcosm of the greater superstructure in the bigger cosmos.
I have reasons for saying this – as our sun and planets conform to harmonic principles that are somewhat similar to atomic behaviour.
Of course the establishment has sat on harmonic principles for centuries not wishing the human race to escape.
The big Masonic motto of the 33rd degree, 'order and chaos', is a dead giveaway that chaos science is the second ordering principle in the Cosmos – for, as we have known for millennia and not been told, order emerges for free out of chaos, causing fusion and free lunches.
Also, at the heart of the matter, amongst the transitional elements of a transitional planet in a tertiary cluster of recycled stardust, there emerged life almost spontaneously by the non-Darwinian law of Emergence – a gift of chaos, a free lunch.
The ease of emergence of self-assembling DNA has been wholly or partially modelled by Goodwin, Kauffman and Langton, e.g. [www.santafe.edu], 'self-assembling autocatalytic polymers'.
In our solar system, the nine or twelve planets circle the Sun at harmonic intervals. Their distribution of elements tends to be most diverse in the centre of the chain of planets, at planet Earth – non-metallic towards the exterior fringe, heavier and metallic towards the star – and in many ways mirrors the periodic table of elements.
Complexity and Life are always in the middle – at the heart of all energy and chemical interactions.
In the central transitional elements of the periodic table of chemical elements – exactly where Earth sits in this analogy – complex interactions, in both our biochemistry and the chemical elements of our solar system, can be sustained by a self-regulating equilibrium.

On the one side the entropy we know – on the other side the free emergence that we are banned from knowing.
In the centre of the traditional periodic table's square arrangement of chemical elements are the transitional elements, and also iron and carbon – the facilitative platform upon which biological life is based.
In the centre of the solar system we have planet Earth, home to carbon- and iron-based biological life taking up diverse chemical themes, whilst noble gases accrue in Jupiter and Saturn and the more metallic planets are parked closer to the sun.
By a law of mass and ratios it is no random chance that the planets are so aligned; for as the elements of chemistry have harmonic signatures, distributions and properties, so do the planetary orbits and the tendencies of planetary geology.
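The kind of harmonic spacing claimed here for the planets is conventionally illustrated by the Titius–Bode relation, which generates approximate orbital radii from a simple doubling series. The sketch below is an illustrative aside using that textbook relation, not a formula taken from this theory:

```python
def titius_bode(n):
    """Approximate semi-major axis (AU) of the n-th body in the classical
    Titius-Bode relation: a = 0.4 + 0.3 * 2**(n-1), with Mercury as the
    special first term a = 0.4."""
    if n == 0:
        return 0.4
    return 0.4 + 0.3 * 2 ** (n - 1)

names = ["Mercury", "Venus", "Earth", "Mars", "Ceres", "Jupiter", "Saturn"]
actual = [0.39, 0.72, 1.00, 1.52, 2.77, 5.20, 9.58]  # observed values, AU

for i, (name, a) in enumerate(zip(names, actual)):
    print(f"{name:8s} predicted {titius_bode(i):5.2f} AU, observed {a:5.2f} AU")
```

The fit is notoriously loose for the outermost planets, which is one reason mainstream astronomy treats the relation as a curiosity rather than a law.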
Our solar system therefore looks like the periodic table of chemistry, I posit – but it doesn't stop there, for so does the galaxy.
The new 2005 'chemical galaxy' periodic table, conforming to Newlands' 1865 Law of Octaves, shows the elements arranged by harmonic intervals, spiralling out from the centre like the archetypal spiral galaxy, each lined up at a different harmonic.
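Newlands' octave scheme itself can be sketched numerically: order the elements known to him (the noble gases were not yet discovered) and wrap with a period of seven, so that the eighth element repeats the character of the first, like the octave of a musical scale. The element list and grouping below are a minimal illustrative reconstruction, not his exact table:

```python
# Elements in atomic order, hydrogen to calcium, noble gases omitted
# (they were unknown in Newlands' day).
elements = ["H", "Li", "Be", "B", "C", "N", "O",
            "F", "Na", "Mg", "Al", "Si", "P", "S",
            "Cl", "K", "Ca"]

# "The eighth element is a kind of repetition of the first, like the eighth
# note of an octave of music" -- i.e. the pattern repeats with period 7.
groups = {}
for position, symbol in enumerate(elements):
    groups.setdefault(position % 7, []).append(symbol)

for column in sorted(groups):
    print(column, groups[column])
# Column 1 gathers the alkali metals Li, Na, K; column 2 gathers Be, Mg, Ca.
```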
It was Ray Tomes who, in 1992, spotted from red-shift calculations that the stars and galaxies move apart from one another at harmonic intervals.
The cosmos therefore is a harmonically based chaos system. 'Order and Chaos' – the watchword of a secret society pledged to stifle the truth.
Chaos structures such as the vortex – a spiral weather system performing harmonic mathematics at scales from the subatomic ether up to the supercluster – show that the cosmos is based on the rules of harmony and chaos.
The chaos spiral or vortex of the 'chemical galaxy' could very well be the model for the archetypal atom, which can facilitate no more than an octet of eight electrons in a shell [cf. the Pauli Exclusion Principle].
Everywhere we look in the microcosm, we can begin to see the macrocosm – very much `as above, so below'.
Another free gift of chaos is free energy or emergence.
Also noted as a rewarming after the Big Bang, and as a rewarming after quantum particles tear away from one another, the ‘Unruh Effect’ illustrates that the chaos law of emergence supplies the material we can see with the substance of the ether that we cannot. When we tear the particle or mass – the top off the fire hydrant – away from its locality, we release the powers that supply it.
Particles are watery harmonic 3D swirls in the turbulence of the ether – they are not isolated billiard balls – and when they are scooped out, nature abhors a vacuum and fills it for free. [The Unruh Effect]

Such fluid and particulate turbulence has been noted in the flame-like properties of electromagnetic discharges. For as fire burns in air and oxygen, following the air's turbulence, so too does etheric turbulence act upon travelling particles in an electromagnetic discharge – producing fire in the atmosphere of the ether.
Chaos and turbulence models at all scales show that the universe is unified with itself and is a chaos and fluid system operating on harmonic principles.
From the many diverse analogies that can be derived from 'as above, so below', I would like to assert that the superstructure of our own cosmic bubble in the greater cosmic foam conforms to the structural analogy of a solar system – in particular, our solar system.
Superclusters tend to form helices and spirals like the harmonic ingredients of DNA; so, to take the analogy of the solar system further, let superclusters in the cosmos be equivalent to biological life on planet Earth.
An arithmetic array that subtracts rows and columns recycling 1–13 – comprising all possible harmonic intervals on the scale – produces helical, DNA-like patterns called 'the loom of Maya'.
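The exact construction of the 'loom of Maya' array is not given here, so the following is a hedged sketch of one plausible reading: rows and columns each cycle 1–13, and every cell holds their difference, recycled back into the 1–13 range, which lays each value down a repeating diagonal strand:

```python
# Rows and columns each cycle through 1..13; each cell holds their
# difference, recycled back into the 1..13 range (an assumed construction).
N = 13
loom = [[((row - col) % N) + 1 for col in range(N)] for row in range(N)]

for line in loom:
    print(" ".join(f"{value:2d}" for value in line))
# Every value runs down the array along a diagonal, so the printed grid
# reads as a weave of repeating strands -- the helical appearance at issue.
```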
Helices are common to harmonics and harmonics are common to the ether and particles.
In this analogy, each biological life form is a supercluster on the planet's surface.
The structure, geography and distribution of biological life on planet Earth can be used as an analogy in this way because the Earth itself is the effect of a harmonic distribution, in both a physical and a chemical sense.
By also using an anthropocentric analogy, we can further suggest that some rainforest canopies of biology/DNA – our superclusters – have sandwiched layers of primary and secondary supercluster elements and properties, whilst some have tertiary qualities and life forms, like the elements of the Milky Way and our own cluster.
We can say that intelligent life as we know it is anthropoid, and that it is a tertiary product.
We can assert that anthropoid life is relatively rare in comparison to, e.g., the phylum Insecta.
It may be, for instance, that in Cosmic life the insectoid formula is more prevalent than the anthropomorphic.
In nature on Earth this formula holds, whilst bacterial and viral life is in even greater abundance.
It would follow that anthropomorphic life, and its relativity, is rare in our cosmic bubble, and perhaps in other bubbles too.
By using the structural model of the Earth as our bubble, and the distribution of DNA as like the helical patterns of the superclusters themselves, we can predict that the superclusters tend to be distributed around a big planetary sphere or bubble, within which is dense, low-frequency material.
There will be some 'biological', life-bearing constellations and clusters inside the geology and core of the supercluster bubble, but these may not be life as we know it – perhaps inviting the inference of a dark underworld in need of light or energy.
With superclusters arrayed around the periphery of the cosmic bubble, on the surface of the sphere, there would be a central sun – the light of God – holding our planet-like cosmic bubble in place; and almost certainly, side by side in a harmonic pattern, there would be a whole solar system of adjacent planes or planet-bubbles to which we are adjoined – perhaps some containing supercluster forms, biology and life as we would recognize it.
This analogy would suggest that the Cosmos and the cosmic foam is a vast place held together by the bright, life-giving love of God – at the very least by the physical life-force that drives the ecosystem of superclusters, is in it, and binds every form together in the light.
Using the organic analogy further, and knowing what we know about the sun's output and global warming, it may be that the life force of the beings in the supercluster-coated bubble will, in some future epoch, be driven to the challenge of growth and spiritual and material evolution – to collaborate to travel the vast distances to the next cosmic portal and bubble, beyond the little set of bubbles in our local solar-system cosmos.
An upcoming publication on quantum entanglement by I. Fuentes-Schuller and R. B. Mann will appear in Physical Review Letters in 2005.
It shows, amongst other things, that the acceleration of related or entangled particles away from their origins creates a 'thermal bath'. In my words, movement between related particles tears the subatomic ether that supports and feeds the particles, causing the 'thermal bath' – free energy.
There was a thermal bath after the Big Bang, and there is a thermal bath at the quantum level: today's physicists and cosmologists are literally bathing in free energy.
From the Perimeter Institute for Theoretical Physics, under the title 'Alice falls into a black hole: Acceleration and quantum entanglement': 'What would happen to their entanglement if Alice fell into a black hole and Bob stayed safely outside? We can model this situation by considering Alice to be stationary and Rob (formerly Bob) to be uniformly accelerated with respect to Alice. We found that although the entanglement between them is reduced due to Rob's acceleration, it remains nonzero as long as Rob's acceleration is not infinite.

It has long been known that an accelerated observer detects a thermal bath of particles whereas an observer at rest sees only a vacuum.
Known as the Unruh effect, it is this that causes the degradation in the entanglement measured by Alice and Rob.'
The tearing apart of related and entangled particles across a field produces a `thermal bath'.
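For comparison, the Unruh 'thermal bath' does have a standard quantitative form in the physics literature: a uniformly accelerated observer sees a temperature T = ħa / (2πck_B). A quick calculation with that textbook formula (not a figure from the papers quoted above) shows how small the bath ordinarily is:

```python
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J s
C = 2.99792458e8         # speed of light, m/s
K_B = 1.380649e-23       # Boltzmann constant, J/K

def unruh_temperature(acceleration):
    """Temperature (K) of the thermal bath seen by a uniformly
    accelerated observer: T = hbar * a / (2 * pi * c * k_B)."""
    return HBAR * acceleration / (2 * math.pi * C * K_B)

print(unruh_temperature(9.81))   # one Earth gravity: about 4e-20 K
print(unruh_temperature(1e20))   # ~1e19 g is needed to reach about 0.4 K
```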
Hennessey's Harmonic Continuum Theory (www.whale.to/b/hennessey_b1.html) predicts that:
particles are continually emerging standing waves – emerging like a weather system at the top of a pyramid of relativity. The energy that is fed into maintaining the particle in its locality is bled, released and regenerated by tearing the particle from its locality.
This tearing effect produces the 'thermal bath' effect known in quantum entanglement.
Two spinning magnets would therefore release a great deal of free energy, because the field relationship between the upper- and lower-plate atoms is promoted by high voltages which, when torqued by the physical device, produce the thermal bath of the Unruh Effect. I think this paper – and the fact of the Unruh effect in quantum entanglement – is corroborated as a universal law in the macrocosm by the finding that the Universe has been rewarming itself after the gravitational tearing and motion of the Big Bang.
The Unruh Effect as a universal law for matter, gravity and time would substantiate and corroborate the free energy ideas of Faraday, Tesla, Townsend Brown, Frank Searle, Bruce De Palma, etc.
In free energy devices - the thermal bath idea is tapped into by the rotating mechanics of the device.
Big Bang researchers have recently conceded a major flaw in the theory, in that the Cosmos has been measured to have reheated itself by obtaining free energy from somewhere.
This, I postulate, is the same 'Unruh Effect', at massive scales of gravitational entanglement and tearing, as is being observed in the quantum microcosm with its entanglement and tearing. Massive movement and tearing in the macrocosm has produced the same thermal bath previously observed at the quantum level. The Cosmos has been reheated after the Big Bang. This cosmic rewarming is an Unruh effect – the operation of a universal physical law on a massive scale. It is Free Energy.
Academics have been sitting on the proof of free energy – right there, already published in plain view – though they apparently have no reason to believe that particles are externally supplied and refreshed.
Superstrings and Einstein do not cover the law of Emergence and the total interconnection within a fluid environment. Theirs is the pristine billiard ball that cannot be a particle and a wave at the same time.
The bit they don't get is that all particles are continually fed and supplied harmonic standing waves, like galaxies or vortices in the eye of an ether storm. This ether energy continually supplies, feeds and augments the particulate form with fresh energy, by the chaos law of emergence.
That is why spinning two magnets tears away the relationships between the particles in the upper plate and the particles in the lower plate – creating a rewarming, or thermal bath, or Unruh Effect.
The recent 'chemical galaxy' vortex model for the periodic table of chemical elements is construed as a harmonic chaos form, because the elements align themselves at harmonic intervals. This fact was known as far back as Newlands' Law of Octaves in 1865 CE.
Lord Kelvin's Vortex Theory of Ether and Atoms, in 1901 CE, was a theory born in fluid dynamics and turbulence, and at least 100 years ahead of Einstein. We now, in 2005 CE, have a vortex theory of atoms for the periodic table.
The hidden extra in reality that physicists allegedly cannot grasp is that a particle is both a wave and a particle at the same time, because the particle is intimately connected with, and derived within, the fluids of the ether. A particle is a standing wave made out of harmonic folds of ether – a storm system like the Red Spot in the eye of Jupiter.
Move a particle or mass by breaking its field relativity away from the pyramid of energies that continually feeds it, and free energy to the value of the energy in the core of the particle will pour through the gap. This is called the Unruh Effect … we just call it free energy.
There is a long list of ‘fringe heretics’ no longer with us to guide us on this new journey into the future possibilities of Interstellar space.

E.g. 1. William R. Lyne, 'Nikola Tesla, Occult Ether Physics', pub. Creatopia Productions, ISBN 0-9637467-6-6, p. 67.
‘Tesla emphasised that his concept of alternating current, AC, using rotating magnetic fields involved new principles rather than refinements of pre-existing work ...
But was Tesla the first to conceive of a rotating magnetic field? The answer is no.
The first workable rotating magnetic field similar to Tesla's 1882 revelation was conceived three years before him by Walter Baily, who demonstrated the principle before the Physical Society of London on June 28, 1879 ...
Two years later, at the Paris Exposition of 1881, came the work of Marcel Deprez, who calculated "that a rotating magnetic field could be produced ... by energising electromagnets with two Out-of-step AC currents."

A question that remains unanswered was whether or not Tesla knew of Baily's work. It is quite possible that he had read Baily's paper, although no one at the time, including Baily, comprehended the importance of the research or understood how to turn it into a practical invention ... pp. 24-25’
According to Tesla's lecture prepared for the Institute of Immigrant Welfare (May. 12, 1938), his "Dynamic Theory of Gravity" was one of two far reaching discoveries, which he

"...worked out in all details", in the years 1893 and 1894. The
1938 lecture was less than five years before his death.
In his 1938 lecture, Tesla said he was progressing with the work,
and hoped to give the theory to the world "very soon", so it was
clearly his intent to "give it to the world", as soon as he had
completed his secret developments.
The "two great discoveries" to which Tesla referred, were:
1. The Dynamic Theory of Gravity - which assumed a field of
force which accounts for the motions of bodies in space;
assumption of this field of force dispenses with the concept of
space curvature (ala Einstein); the ether has an indispensable
function in the phenomena (of universal gravity, inertia,
momentum, and movement of heavenly bodies, as well as all
atomic and molecular matter); and,
2. Environmental Energy - the Discovery of a new physical Truth:
there is no energy in matter other than that received from the
environment.
The usual Tesla birthday announcement - on his 79th birthday
(1935), Tesla made a brief reference to the theory saying it
applies to molecules and atoms as well as to the largest
heavenly bodies, and to "...all matter in the universe in any
phase of its existence from its very formation to its ultimate
disintegration".
In an article, "Man's Greatest Achievement", Tesla outlined his
Dynamic Theory of Gravity in poetic form (as paraphrased
by me [Lyne]):
• That the luminiferous ether fills all space.
• That the ether is acted upon by the life-giving creative force.
• That the ether is thrown into "infinitesimal whirls".
("micro helices") at near the speed of light, becoming
ponderable matter.
• That when the force subsides and motion ceases, matter
reverts to the ether (a form of "atomic decay")
That man can harness these processes, to:
• Precipitate matter from the ether.
• Create whatever he wants with the matter and energy
derived.
• Alter the Earth's size.
• Control Earth's seasons (weather control).
• Guide Earth's path through the Universe, like a space ship
• Cause the collisions of planets to produce new suns and
stars, heat, and light.
• Originate and develop life in infinite forms.
Tesla was referring to unlimited energy, derived from the
environment.
Some of Tesla's unusual conceptualisation of the ether had been
nonetheless expounded piecemeal, in his preceding 1890's
lectures. He later railed against the limited and erroneous
theories of Maxwell, Hertz, Lorentz, and Einstein.
Tesla's ether was neither the "solid" ether with the "tenuity of
steel" of Maxwell and Hertz, nor the half-hearted, entrained,
gaseous ether of Lorentz. Tesla's ether consisted of "carriers
immersed in an insulating fluid", which filled all space. Its
properties varied according to relative movement, the presence
of mass, and the electric and magnetic environment.
Tesla's ether was rigidified by rapidly varying electrostatic
forces, and was thereby involved in gravitational effects, inertia,
and momentum, especially in the space near Earth, since, as
explained by Tesla, the Earth is "...like a charged metal ball
moving through space", which creates the enormous, rapidly
varying electrostatic forces which diminish in intensity with the
square of the distance from Earth, just like gravity. Since the
direction of propagation radiates from the Earth, the so-called
force of gravity is toward Earth.’
William R. Lyne, ‘Occult Ether Physics’, ISBN 0-9637467-6-6

In constructing Tripartite Essentialism [Hennessey 1991], the octal/harmonic arithmetic is generated, Boolean-fashion, around the validity of two systems in exchange or relationship through a common medium.
[A basic exchange in 3 dimensions with 3 components at time t1.]
There are, essentially, eight Logically Real models of that transaction at any given time t1, and 64 at time t2.
A Logically Complete and closed set of 64 can describe all the varieties of logically real activity and exchange within the three-dimensional Cosmos between t1 and t2. This closed set is the language [T].
This closed set of 64 'activity states' is analogous to the activity relationships within the periodic table of Chemistry by Fajans' Rules.
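The counting above can be sketched directly: three components each taken as Boolean give 2³ = 8 logically real states at t1, and an ordered pair of such states, one at t1 and one at t2, gives 8 × 8 = 64 activity states. The Boolean reading of the three components is an assumption for illustration:

```python
from itertools import product

# Three components of one transaction -- system A, common medium C,
# system B -- each taken (as an illustrative assumption) to be either
# active (1) or inactive (0) at a given instant.
components = ("A", "C", "B")

states_t1 = list(product((0, 1), repeat=len(components)))
print(len(states_t1))    # 8 logically real models at time t1

# A transaction across t1 -> t2 is an ordered pair of such states:
transitions = list(product(states_t1, repeat=2))
print(len(transitions))  # 64 activity states between t1 and t2
```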
Current beliefs held by the 20th-century Science paradigm have it that atoms of the same kind are 'absolutely' equivalent, subject to the paradoxes of Superstrings and Quantum Physics, e.g. the ‘collapsing wave paradox’ or the 'particle-wave duality' paradox.
i.e. is a particle a particle or a wave? For although it appears to be both in the field, as it were, when one tries to nail it down in a laboratory using a belief system called ‘scientific reductionism’, it cannot be both at once.
However, these 20th-century theories do not perform in their 11 or 26 dimensions, e.g. Superstrings, and in terms of their internal consistency they cannot unify the forces they represent with the labels they possess, e.g. gravity and the electroweak force.
Nevertheless, certain of the atomic processes that have been observed and quantified possess consistent mechanics from which predictions in new data models can be made.
These observed properties perform around certain fundamentals of reality [e.g. the 3-part quark, the 8 electrons per shell of the ‘Pauli exclusion principle’, ‘Regge resonances’, or harmonic representations of the periodic table of chemistry, e.g. Newlands’ ‘Law of Octaves’ of 1865].
With the famous gas-law physicist Lord Kelvin attempting to introduce a theory of the ether in 1901, based on his knowledge of fluids and turbulence, the scene was certainly set to deploy constructs around the harmony of the spheres.
Kepler's harmonic law of planetary motion indicated orbits at harmonic intervals around the star, and indeed Ray Tomes in 1991 published his Harmonic Theory, based on analysis of cosmological red-shift data, which demonstrated that there were harmonic components amongst the expanding star fields.
In Paris in 1992, university research using chaos theory demonstrated that the perceived regularity of planetary motion and harmony was a metaphysical illusion, as the minute microscopic deviations within planetary gravimetric interaction and displacement could be more precisely accounted for using chaos theory.
The current physics theories extemporising from the facts do not perform well enough to attain a final physical theory without reference to paradox, or to extremely dimensional, complicated, yet strangely inadequate mathematics – e.g. the quantum ‘principle of non-locality’.
The reality, however, is that the three dimensions of this Cosmos – all the matter, all the space and all the time in which we exist, from photons to stars; i.e. its energy, matter and time – are absolutely non-linear and upheld as a field of emergent energy. [Cf. mathematical Chaos]

An archetypal property of Chaotic Systems is their capacity to emerge new 'ordered' states. e.g. from the weather systems of Jupiter emerges the 'Red Spot' - the spot is inseparable from the weather system.
It is hypothesised that from the sub-atomic energies of the Cosmos, emerges a particle in like manner to the red spot.
The continuous nature of the particle's existence is upheld in its continuous and complex relationship to its sub-atomic patronage.
The mathematical process which describes this particle relativity can only be, at its most basic, a relativity of eightness, derived from three dimensions of space and one of time [which maximizes the efficacy of Occam's Razor on 20th-century physics].
The observed particles appear isolated to our methods of observation, but the most apt analogy would be that the subatomic background is a continuously bowed violin, and the emerged note – which is the particle – is in a state invisible to our methods of observation. i.e. we cannot 'see' musical sounds, but the relativity of the invisible sound to the visible musical instrument is consistent.
A Particle is in effect a three-dimensional standing wave - in a different state from its continuous source of supply from the emergent properties of the ether.
All the mathematics of wave theory, resonance and harmony would apply to its physicality, and would fit with the ‘eightness’ and harmonic tendencies in the observed data, e.g. the Fourier and Laplace mathematical transforms.
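The standing-wave picture can be made concrete with elementary wave mathematics: summing two equal waves travelling in opposite directions yields a stationary pattern, 2·sin(kx)·cos(ωt), whose nodes never move even though the medium is in constant motion. This one-dimensional sketch stands in for the three-dimensional case described above:

```python
import math

def standing_wave(x, t, k=1.0, w=1.0):
    """Sum of two equal counter-propagating waves:
    sin(kx - wt) + sin(kx + wt) = 2 sin(kx) cos(wt)."""
    return math.sin(k * x - w * t) + math.sin(k * x + w * t)

# The node at k*x = pi stays put at all times -- a persisting "shape"
# through which energy continually flows:
for t in (0.0, 0.7, 1.9):
    print(abs(standing_wave(math.pi, t)) < 1e-12)  # prints True each time

# Midway between nodes the amplitude peaks at 2:
print(standing_wave(math.pi / 2, 0.0))
```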
The Quantum ‘particle-wave duality’ which is currently a physically unexplained paradox, now has a rational model that makes the laboratory observations a realistic event.
Taking the implications of non-linearity further, it can be said that atoms belong to observed groups with similarly classified properties [Class Atomism: Hennessey 1991], and that no two similarly classified 'atoms' are absolutely identical, due to their Chaos Ontology.
This theory predicts that a particle is analogous to a note, and that on a cosmic scale the qualities and properties within and between classes of atoms, or notes, will vary in a fluidic way; e.g. the relative textures, tones, attack and decay [cf. music in vivo] of each particle/note will mean that some matter can be relatively more 'audible' in the local cosmic energy mix.
i.e. some matter may not co-operate with expectations – may have a different perspective to that of the relative observer in time and space.
This matter, at some higher harmonic interval, may be able to drift through other matter, and that would hold whether the states were relatively similar, e.g. metals, or different.

These properties are dependent on the state of the invisible subatomic weather system from which the particle emerges. e.g. 'transparent' iron could pass through dense iron. [See below]
There would, apart from an expected migration of electrons, also be a migration of protons and even neutrons, as the intrusion of larger packets of emergent energy, inconsistent with the local equilibrium of material interaction, drew upon the local a priori atomic reservoir, by Fajans' Rules, to create new particles.
Fajans in essence said that big atoms donate to small atoms across a common medium.
The same transference criteria apply to other field-behaviour definitions, such as Ohm's Law and voltages in electricity, osmosis in Biology, and transference in Lewin's field theory in Psychology: i.e. A to B through some common C, where in reality, depending on the scale of the transaction or donation, C would be of varying impedance.
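The common A-to-B-through-C pattern listed above has, in its simplest linear form, the shape of Ohm's law: flow equals potential difference divided by the impedance of the common medium. The names below are generic placeholders for illustration, not terms from the cited theories:

```python
def flow(potential_a, potential_b, impedance_c):
    """Generic A-to-B transfer through a common medium C:
    flow = (difference in potential) / (impedance of the medium).
    With volts and ohms this is Ohm's law, I = (V_A - V_B) / R;
    the same linear shape fits gradient-driven transfer generally."""
    if impedance_c <= 0:
        raise ValueError("the common medium must present some impedance")
    return (potential_a - potential_b) / impedance_c

print(flow(12.0, 0.0, 4.0))   # 12 V across 4 ohms drives 3.0 A
print(flow(12.0, 0.0, 8.0))   # doubling the impedance halves the flow: 1.5
```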
The sizes of the ‘new particles’ would be dictated by the atoms at their point of entry and also by the size of the unfilled energy packet.
A big emerged energy packet [EP] of size 2 at time 1, newly intruding amongst local smaller atoms, might produce noticeable material degradation and inconsistency in atomic behaviour, and would create new and larger atoms as the energies from the new packet augmented the locality.
Whereas a big emerged energy packet of size 1 amongst larger atoms at time 1 might have a less noticeable effect.
The intrusion of new energies into this cosmos is a continual process and has been noted in observations of dark matter.
At time 2, after intrusion and diffusion, however, in all cases there is more matter and more energy to repair any entropic degradation.
This constant creation and augmentation, though, upsets the medieval applecart, as it contradicts the classical conservation law which states that ‘matter can neither be created nor destroyed’.
The transference gradients, the ‘a priori’ size of the atoms, and their relative capacity to contribute from within their atomic environment – which may consist of other atomic types in aggregate – would influence the sizes and types of new particles discovered.

There may be 3 classes of these atomic properties:
1. Dense Matter, which would be relatively sub-atomically stable;
2. Transparent Matter, which would be relatively sub-atomically unstable;
3. Opaque Matter – a relative state between 1 and 2.
20th-century science appears to have designated its perceptions of this reality using counter-intuitive neologisms, however.

I shall designate that our current Cosmos is Dense Matter [DM], and that some place in, around and in between our reality, which we cannot see, supplies its energy needs by emergence.
This supplier is Transparent Matter [TM] [called by science 'dark matter'], and we may be able to detect its activity by taking note of certain hybrid, interstitial states of material that may have unusual and unexpected properties in any given context [e.g. the well-documented 'particle zoo' of theoretical physicists]. This hybrid, interstitial – analogously, wild and unregulated – type of matter I shall designate Opaque Matter [OM].
As the physical Cosmos has no absolutely fixed properties, due to its ontological non-linearity and innate chaos, the status of 20th-century ‘constants’ must be challenged – e.g. it is predicted that the speed of light [Einstein’s ‘c’] is not an absolute Universal constant, nor would the relative behaviour of time be constant and ontologically homogenous either.
Similarly, Planck's constant – and with it the idea that electron shells sit at fixed distances apart – can be challenged by this new atomic model.
The pressures from the continual intrusion of newly emerging energy from the ether, into and around atomic packets, create new electrons, protons and neutrons. This pressure is driven by local cosmic chaos and is variable; consequently the distance ratios between the electron shells in atoms are variable, not fixed as Planck suggested.
Distances between electron shells are dictated by the underpinning frequency of vibration and emergence in the local ether. This pressure compresses the atom into shape: a three-dimensional standing wave with resonant shells, continually supplied and created from events in the emergent ether outside the nucleus.
Emergence from Transparent Matter drives the continual formation and reformation of new electrons, protons and neutrons, in Dense Matter, offsetting local entropy.
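For contrast, the textbook picture being challenged here fixes the shell spacing: in the Bohr model the n-th shell radius is r_n = n²·a₀, so the ratios between successive shells are constants (4, 9/4, 16/9, …) set by Planck's constant. A minimal sketch of that standard model:

```python
A0 = 5.29177210903e-11   # Bohr radius, metres

def shell_radius(n):
    """Bohr-model radius of the n-th shell: r_n = n**2 * a0."""
    return n * n * A0

ratios = [shell_radius(n + 1) / shell_radius(n) for n in (1, 2, 3)]
print(ratios)   # roughly [4.0, 2.25, 1.78]: fixed ratios, whatever the atom
```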
(Dense Matter, as I call it, is classified by 20th-century science as ‘light matter’, and Transparent Matter, as I call it, is designated by 20th-century science as ‘dark matter’.
The intuitive approach, for the purposes of the public construction of this new paradigm, however, would have it that we in this material Universe are in the dark, and that light should emerge from the Logos, or from other bubbles of foam in our Universe, to illuminate and substantiate our material reality.
As this emergent matter clearly upholds the grosser fabric of our own reality, I would ask that we appreciate that it is the free donation of Light from the architecture of the Universe, or indeed from some greater Architect than we are, that keeps our fabric intact.
I would humbly submit, therefore, that the Cosmos we are currently agreeing to measure be denoted ‘dark matter’, and that the infinite amounts of immeasurable light and energies – continually reported as arbitrary and big percentages within the papers of ‘New Scientist’ and ‘Scientific American’ (under the pretext of statistically and logically irrational Bayesian reasoning) – be denoted ‘light matter’.)
In the meantime, and for the purposes of this discourse, I shall denote the 3 material classes of energy and matter as Transparent [TM], Opaque [OM] and Dense [DM], the latter being our reality.
In conclusion, a new Paradigm is possible which has no pathologies and paradoxes, and which completely revises the beliefs of the current cosmologists in Physics and Maths.
It is not a waste of physics research funding to re-use older experimental data on better and more functional models. That, after all, was the basis of rational scientific practice as outlined by, e.g., Karl Popper in his 1963 publication ‘Conjectures and Refutations’.
The famous two atomic clock experiment created by the Physics establishment to substantiate Einstein’s relativity, in truth, substantiates the alternate hypothesis of time as a radiated effect of mass and gravity.
With two allegedly identical atomic clocks, set at identical times, one on the ground and one in a very high altitude jet aircraft, it was found that the high altitude clock had experienced less time when the jet aircraft landed and became relative to the ground.
The empirical results and data from that experiment, although used to support the constructs of Einsteinian relativity, when used on the Tesla and Hennessey model substantiate the idea that time is a field effect of mass. The very high altitude aircraft, under conditions of lesser gravity than the surface value of e.g. 9.81 m/s², was also, by this new model, travelling through different atomic conditions for time.
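Whichever interpretation one prefers, the size of the measured clock offset can be estimated from the same two empirical terms: a gravity term g·h/c² and a velocity term v²/2c². A minimal sketch, using illustrative altitude and speed figures of my own rather than the original flight data:

```python
# Rough estimate of the fractional rate offsets measured in a
# two-atomic-clock experiment.  Illustrative cruise figures, not
# the original flight data.

C = 299_792_458.0        # speed of light, m/s
G_SURF = 9.81            # surface gravitational acceleration, m/s^2

def gravity_rate_offset(altitude_m: float) -> float:
    """Fractional rate offset of the high clock from the gravity term g*h/c^2."""
    return G_SURF * altitude_m / C**2

def velocity_rate_offset(speed_ms: float) -> float:
    """Fractional rate offset of the moving clock from the velocity term v^2/(2c^2)."""
    return speed_ms**2 / (2 * C**2)

altitude = 10_000.0      # m, typical jet cruise altitude
speed = 250.0            # m/s, typical jet cruise speed

# Net fractional offset: which term dominates decides the sign of the result.
net = gravity_rate_offset(altitude) - velocity_rate_offset(speed)
flight_seconds = 40 * 3600           # a 40-hour flight, for scale
offset_ns = net * flight_seconds * 1e9
print(f"net fractional rate offset: {net:.3e}")
print(f"accumulated over 40 h: {offset_ns:.1f} ns")
```

The point of the sketch is only that the offsets in question are of the order of nanoseconds over many hours of flight, whichever model is used to interpret them.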
Atoms - [mass etc] are the emerged product of other chaotic 'sub-atomic particles/energies' and are themselves dynamic standing waves.
They produce a time wave in relation to the observer but this time and its time-scales are entirely local, and somewhat arbitrarily imposed.
Mass will also produce harmonic images or dimensions of itself, not necessarily as discernible to the observer as the passage of time. e.g. resonant material impressions of dense matter perhaps enlivened by emergent energies and matter – ideologies that we may designate as Mythic or Theological e.g. Heavens, Planes, Spheres, Continuum, Nirvana etc
Time is a wave propagated through the relativity of local mass - and a 'time-scale' may be 'imposed' in relation to some standard of lowest common denominator - LCD - of mass - common to all observed systems. We choose the photon - but make the mistake in assuming that the photon is a Universally Constant LCD for mass and time.

In much the same way that water is the LCD and standard for life on Earth - we know that it does not flow about from place to place at the same rate.
i.e. The cycling of water on this planet is not 'universally constant' or consistent due to gradients, climate, weather, metabolism, rate of uptake etc and by the same token, the cycling of photons through the Cosmos cannot be constant.
Olbers pointed that fact out long before Einstein (the paradox dates to c.1823): if there were no ether, as Einstein claimed and asserted in order to manufacture his ‘artificial constant c’, then the sky would be white at night, for there would be nothing to hold up and impede all the photons on their journey to our eyes.
The only argument to bolster Einstein and contradict Olbers was the invented notion that our Universe is finite. Given the academic contradictions even to that, namely the suggestion that 80% of it is immeasurable dark matter (when 80% of infinity is classically zero), statistics used like this are totally irrational and meaningless. The absurd claim that the Universe is finite seems to get constantly pushed under the coffee table at the CERN Institute in Switzerland during break-time discussions of who’s who in the world of chocolate fireguards.
Humanity, however, utilises a photon-based interpretation of time - i.e. it is what we 'see'. Though there may be a better standard particle that we haven't detected yet.
Given that a body of cosmic mass or rock is also a lump of dynamic standing waves - an analogy for the rock would be a serenade by a symphony.
A spinning mass that creates the observed Lense-Thirring effect, using Earth as a model, would distort the relativity of space-time in relation to the observer.
If space-time relativity is the hearing of this symphony, the Lense-Thirring effect in relation to time and gravity could be modelled using a hi-fi speaker sitting on a turntable. If the table were still, there would be no disruption of the observed relativity of sound for a hearing listener at time1 and locality1; but if the turntable were set in motion, the appreciation of sound waves, e.g. 'time and gravity', would be disrupted by inconsistencies, as Einstein predicted.
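The turntable model above can be made quantitative: a speaker rotating on a turntable has a radial velocity toward a distant listener that oscillates once per revolution, so the received pitch wobbles by the ordinary Doppler formula. A minimal sketch, with speaker and turntable parameters assumed by me for illustration:

```python
import math

# Doppler modulation heard from a speaker mounted on a spinning turntable.
# Illustrative parameters; the listener is assumed far away in the plane
# of the turntable, so the radial velocity is v * sin(omega * t).

SOUND_SPEED = 343.0           # m/s in air
f_source = 440.0              # Hz, tone played by the speaker
radius = 0.15                 # m, speaker's distance from the spin axis
rpm = 78.0                    # turntable speed

omega = 2 * math.pi * rpm / 60.0     # angular speed, rad/s
v_tangential = omega * radius        # speaker's speed along its circle

def received_frequency(t: float) -> float:
    """Instantaneous received frequency for a distant in-plane listener."""
    v_radial = v_tangential * math.sin(omega * t)   # toward (+) or away (-)
    return f_source * SOUND_SPEED / (SOUND_SPEED - v_radial)

# Stationary turntable: no shift at all.
f_still = f_source * SOUND_SPEED / SOUND_SPEED
# Spinning: the pitch wobbles between these extremes once per revolution.
f_max = f_source * SOUND_SPEED / (SOUND_SPEED - v_tangential)
f_min = f_source * SOUND_SPEED / (SOUND_SPEED + v_tangential)
print(f"still: {f_still:.2f} Hz, spinning: {f_min:.2f}..{f_max:.2f} Hz")
```

With the table still, the listener's 'relativity of sound' is undisturbed; once it spins, the received frequency is periodically modulated, which is the disruption the analogy appeals to.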
Although Tesla and others disagreed with Einstein, the data gathered for the Einstein model would, in this case also, illustrate a case for the distortion of time observations and of the ontological linearity of time within such time-field measurements.
The chaotic and non-linear aspects of the material Universe and its behaviour would point to time as being a variable field that may experience gradients of impedance along its lines of transmission.

i.e. A to B through C where C is the common medium of e.g. ether

Two parallel plates of equal size and mass, one spinning clockwise and the other counter-clockwise with a massive potential difference in Volts between them would disrupt the field and wave relativity in the gap between the plates. There, the integrity of the structure and mechanics of; gravity, time and electromagnetic wave relativity would be compromised.
The matter and space-time fabric between the two plates is rather like a physical dam, behind which is the subatomic reservoir of the ether.
The sub-atomic reservoir has a natural pressure, caused by the energy of the sub-atomic 'waters' pushing out and creating the 'concrete' atoms. This is called the law of emergence.
Emergence was studied at the Santa Fe Institute, where Chris Langton found it to contravene the second law of thermodynamics within the context of performance in biological systems.
[Levy S, ‘Artificial Life’ pub. Penguin 1993, ISBN 0-14-023105-6]
There the biological models intimated that entropy and emergence were in collaboration to produce virtually automatic self-regulation within the processes of Life, e.g. Kauffman’s research models on autocatalytic polymer evolution. [Levy S]
Emergence, the constant input of new energy constructs into that which is observable causing constant construction, deconstruction and reconstruction, suggests a non-static and non-fixed atomic model.
With the study of half-life atomic decay being part of an examination of entropy, it would follow that physics researchers should observe a relative ‘particle zoo’ of new-looking particle constructs.
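For reference, half-life decay in the standard treatment is a simple exponential, N(t) = N0·e^(−λt) with λ = ln 2 / half-life. A minimal sketch of that bookkeeping, with a generic nuclide and an illustrative half-life of my own choosing:

```python
import math

# Standard exponential model of radioactive decay, N(t) = N0 * exp(-lambda*t),
# with lambda = ln(2) / half_life.  Generic illustrative numbers.

def remaining(n0: float, half_life: float, t: float) -> float:
    """Amount of a nuclide left after time t (same units as half_life)."""
    decay_const = math.log(2) / half_life
    return n0 * math.exp(-decay_const * t)

n0 = 1000.0
half_life = 5.0   # arbitrary time units
for k in range(4):
    t = k * half_life
    print(f"after {t:>4.1f}: {remaining(n0, half_life, t):8.1f}")
```

Each elapsed half-life halves what remains, which is the entropic baseline against which any 'particle zoo' of new constructs would have to be observed.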
This has indeed been the case, as research into Super strings has entered a surreal mathematical unfalsifiability: new and ever more challenging permutations of energy chaos are entered into the linear mathematical descriptions which, given sufficient computing, time and abundant money, would no doubt start to look like an enormous fractal map in the bigger picture.
In mathematical topology, the 'pathological' form known as 'Schiffler's Horns' displays a 'natural and healthy' truth - that the Cosmos is physically non-linear.
In the alleged pathology, the 'perfect circle' has become deformed into a homeomorph that appears to break rules and be paradoxical by demonstrating fractal properties.
In reality, there is no pathology or paradox with this topology - because:

There is no absolutely perfect circle delineated in any physical reality; from the mathematically unsatisfying imperfection of pi to the fractal nature of any physical rendering of the circle, whether electronic or more physical.
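That no finite rendering closes on a perfect circle can be illustrated with Archimedes' polygon construction: each doubling of the side count narrows the gap to pi but never closes it. A minimal sketch:

```python
import math

# Perimeters of regular polygons inscribed in a circle of unit diameter.
# Each doubling of the side count gets closer to pi but never reaches it.

def inscribed_perimeter(sides: int) -> float:
    """Perimeter of a regular n-gon inscribed in a circle of diameter 1."""
    return sides * math.sin(math.pi / sides)

for n in (6, 12, 24, 48, 96):
    p = inscribed_perimeter(n)
    print(f"{n:3d} sides: perimeter {p:.6f}, gap to pi {math.pi - p:.6f}")
```

Every finite construction of the 'circle' is really a polygonal or fractal approximation; the gap shrinks but is never zero.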
Unfortunately, the Paradox called Schiffler’s Horns has been discarded from some University mathematical textbooks. e.g. the Concise Encyclopaedia of Mathematics (CRC Press, London, NY, Washington DC, 1999, Weisstein EW ed., p.1597) cites a ‘Schiffler Point’ but no ‘Schiffler Horns’.
The most perfect line Mankind can draw whether electronically or more 'physically' has a close-up edge resembling the coastline of Norway [i.e. Fractal].
Matter [the dam] is a product of sub-atomic activity and emergence.
The two spinning plates and ultra high voltage are opening a sluice gate like a turbine in the 'concrete' of the dam - and the energy being forced through the electromagnetic hole drives the matter of the dam/world into a state of high energy.
These plates negate the properties of the 'concrete' in that area, and the water or ‘energies’ push through the new artificial equilibrium caused by this process - generating an excess of energy.
For people living inside and part of the concrete [dense matter] of the dam - that may be a good thing - hotter concrete.
The amount of excess energy that becomes available is directly related to the mechanics and magnitude of the 'sluice gate'.
Spinning plates of high voltage - create a potential difference between the Earth and the 'sub atomic reservoir' and at some point the bubbling free energy emerging from the reservoir of the sub atomic cosmos would be pushed down the gradient through the 'electromagnetic sluice' created by the spinning/rotating fields.
This theory fully explains the behaviour of the practical apparatus of Faraday and DePalma as detailed below by Richard Walters for The People Magazine / Energy/New Ideas section:
Subtitle: A promising new alternative energy source, neglected in the U.S., advances in the Far East.
‘Physicist Bruce DePalma has a 100 kilowatt generator, which he invented, sitting in his garage. It could power his whole house, but if he turns it on, the government may confiscate it. Harvard-educated DePalma, who taught physics at the Massachusetts Institute of Technology for 15 years, claims that his electrical generator can provide a cheap, inexhaustible, self-sustaining and non-polluting source of energy, using principles that flout conventional physics and are still not fully understood. His N machine, as it is called, is said to release the "free energy" latent in the space all around us.

DePalma views his device as an innovation that could help to end the world's dangerous dependence on finite supplies of oil, gas, and other polluting fossil based fuels.
The DePalma generator is essentially a simple magnetized flywheel, i.e. a magnetised cylindrical conductor rotating at high speed with the help of a motor. His astonishing claim is that the present versions of the N machine can generate up to five times more power than it consumes. This, of course, defies the basic law of the conservation of energy, which says that the output of energy cannot be more than the input. Most physicists simply
refuse to look at DePalma's findings and dismiss his theories out
of hand.
Yet "proof of principle" for his invention was apparently provided when a large N machine, dubbed the Sunburst, was built in 1978 in Santa Barbara California. Dr. Robert Kincheloe, professor emeritus of electrical engineering at Stanford University, independently tested the Sunburst machine. In his 1986 report (presented to the Society for Scientific Exploration, San Francisco, 6/21/86), Kincheloe noted that the drag of the
rotating magnetised gyroscope is only 13 to 20 percent of a conventional generator operating at an ideal 100 percent efficiency; the DePalma N machine could produce electricity at around 500 percent efficiency.
In Kincheloe's cautious summary: "DePalma may have been right in that there is indeed a situation here whereby energy is being obtained from a previously unknown and unexplained source. This is a conclusion that most scientists and engineers would reject out of hand as being a violation of accepted laws of physics and if true has incredible implications".
DePalma described his N machine and outlined a theory to explain its workings in a paper, "On the Possibility of Extraction of Electrical Energy Directly From Space", published in the British science journal, Speculations in Science and Technology
(Sept. 1990, Vol. 13 No 4). So far, the scientific establishment either has ignored DePalma's controversial claims or remains unaware of them.
No one has ever obtained a patent for an N machine in the U.S., although in the San Francisco area alone, there are some 200 patent applications relating to such devices. The U.S. Patent office automatically denies a patent to any gizmo that purports to produce more energy than it consumes, on the grounds that its personnel are not equipped to evaluate such claims. DePalma is quick to point out that the N machine is not a perpetual motion machine, that mythical contraption long sought by frustrated inventors. "The perpetual motion machine is only supposed to
run itself. It could never put out five times more power than is put into it. Perpetual motion schemes used conventional energy sources, whereas the N machine is a new way of extracting energy from space".

Tewari, a senior engineer with India's Department of Atomic Energy-Nuclear Power Corporation, also directs the Kaiga Project, India's largest atomic power facility, in Karnataka. He freely acknowledges his debt to DePalma, who has shared his experimental results with Tewari for many years. "One day man will connect his apparatus to the very wheelwork of the universe... and the very forces that motivate the planets in their orbits and cause them to rotate will rotate his own machinery," predicted Nikola Tesla, the Croatian-born American electrical genius.
"Electrical engineering took a wrong turn 160 years ago," according to Tewari, referring to English scientist Michael Faraday's pioneering work of the world's first dynamo. In 1831, Faraday performed a series of experiments that led to the modern electric induction generator, having two moving parts—a rotor and a stator part. Faraday moved a wire near the pole of a magnet, producing an electrical potential across the ends of the
wire. This induction principle is used in all the electrical generators we use today. And that's precisely what Tewari means by a "wrong turn."
In that same year, 1831, Faraday also performed a simple yet ingenious experiment with a rotating magnetised conductor. The resulting phenomenon (free energy?) has yet to be explained in terms of conventional scientific theory.
By cementing a copper disc on top of a cylinder magnet, and rotating the magnet and disc together, Faraday created an electrical potential. After pondering this phenomenon for many years, he concluded that when a magnet is rotated, its magnetic field remains stationary. Thus, he reasoned, the metal of the magnet moves through its own field, and the relative motion is translated into electrical potential.
Faraday's experiments led him to the revolutionary conclusion that a magnetic field is a property of space itself, not something attached to the magnet, which merely serves to induce or evoke the field.
Known for over 150 years, the ‘Faraday homopolar generator’, as his contraption is called, has been viewed by a handful of visionary inventors as a basis for evoking the free energy latent in space.
They see it as the prototype for a generator capable of providing its own motive power with additional energy to spare. When the world embraced Faraday's two-piece induction generator, whose drawbacks include mechanical friction and electrical losses, the enormous potential of the Faraday homopolar generator was abandoned, in the opinion of free-energy proponents.
Following in Faraday's footsteps, DePalma in 1978 speculated that free energy could be tapped from the matrix of space simply by magnetising a gyroscope. "I reasoned that the metal of the magnetised gyroscope moving through its own magnetic field, when rotated would produce an electrical potential between the axle and the outer edge of the rotating magnetised flywheel," he explains.
This insight led to his N machine, essentially a one-piece rotating magnetised flywheel. "Instead of having a rotor and a stator, as do conventional generators, the N machine only has a rotor. Half of the flywheel is the North Pole, the other half is the South Pole. One electrical contact is put on the axle, another contact is placed on the outer edge of the gyroscope, and presto, electricity is taken directly out of the magnet itself."
For 150 years after Faraday's controversial experiment, no one bothered to see whether or not a rotating magnet generator would have to do the same amount of work as a conventional induction generator in order to produce an identical power output. Then, in 1978, the aforementioned Sunburst homopolar generator was built. Tests determined that its output power greatly exceeded the input needed to run the machine, and that it was much more efficient than an induction generator.’
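Two of the figures in the article above can at least be checked arithmetically in conventional terms: the textbook axle-to-rim potential of a Faraday disc, V = ½·B·ω·r², and the output/input ratio implied by the quoted drag fractions. A minimal sketch with bench-scale values assumed by me, reproducing the article's arithmetic rather than endorsing the physics:

```python
import math

# 1. Textbook EMF of a Faraday homopolar disc: V = 0.5 * B * omega * r^2
#    between axle and rim.  Illustrative bench-scale values.

def homopolar_emf(b_tesla: float, rpm: float, radius_m: float) -> float:
    """Axle-to-rim potential of a disc spinning in an axial field B."""
    omega = 2 * math.pi * rpm / 60.0
    return 0.5 * b_tesla * omega * radius_m ** 2

volts = homopolar_emf(b_tesla=1.0, rpm=3000.0, radius_m=0.1)
print(f"homopolar disc: {volts:.3f} V")   # low-voltage, high-current machines

# 2. The Sunburst claim: if producing a given output drags the rotor with
#    only a fraction of an ideal conventional generator's torque, the
#    implied output/input ratio is the reciprocal of that fraction.
#    This reproduces the article's claim only; it does not validate it.

def implied_efficiency(drag_fraction: float) -> float:
    """Output/input ratio (as a percentage) implied by a drag fraction."""
    return 100.0 / drag_fraction

for frac in (0.13, 0.20):
    print(f"drag {frac:.0%} -> implied {implied_efficiency(frac):.0f}% efficiency")
```

The quoted 'around 500 percent' corresponds to the 20 percent drag figure; the 13 percent figure would imply still more, which is precisely why most engineers reject the claim out of hand.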
This would generally mean that any free energy generated in amounts massively larger than the natural and local ones would create a disturbance.
The problems inherent in a discontinuity of time and local matter may result in both ‘anti-gravity and anti-mass’ effects and also ‘anti-time’ effects.
As in the ‘two atomic clocks’ experiment, a lessening of gravity and time may occur, but this time at a more significant scale of difference between the apparatus and a relatively massive body of gravity, e.g. a planet.
Each moment in time in our 3D of space is unique and the past can never be recreated - as the relative factors that comprise ‘one moment’ are infinitesimally complex.
Therefore the notion of going back in time and creating extreme problems is probably not a physical prospect as we understand it, since all of our moments are continually and uniquely replenished by emergence from transparent matter donation.
E.g. Time travel is essentially faster than local mass travel.
We may have certain parameters concerning an area of the past which we could visit, but without specific knowledge of those massively complex and local and unique processes of emergence it would be extremely difficult to overcome the inertia written into that time space system in a way which would again influence its overall and global evolution.
Forward time travel may not be precise enough to be technologically reliable, as there would need to be an appreciation of the local fields and forces within the materials and physics that we inhabit that could predict persistence in; form and physical structure, molecular ratios in bodies of atomic aggregates, and the relative rates of; entropy, emergence and recombination of atomic forces relative to our space-time.
Predicting matter and energy parameters that may yet occur, using such computation and data as are available, we may then create a temporal standing wave inside an apparatus that structurally matches a predicted future of local space-time. We may gather, collect or generate opaque matter to project into the ‘local future’, i.e. congruent zones of transparent matter, which would seed and mould the evolution of the specifics within our future worlds.
These 'strange attractors' of emerging opaque matter that currently seed our reality may be sufficient to buffer any device subsequently introduced into this non-static envelope against any radical shift in the local aggregates of opaque and transparent matter that would have influenced their emergence gradient. It may be that relatively short jumps forward are possible, in the expectation that the kinds of opaque and transparent weather clouds we send are more static and durable in the winds of time than the denser matter that we inhabit. There may, however, be drastic demands on structural integrity even upon opaque matter at any random time. In principle, therefore, given that we could empirically obtain opaque matter that creates a stable context for the projection of a more materially and temporally anchored device, we may send a complex sensory device from our own time zone, possessing more operable properties, that could return a result whilst it was being substantiated by the opaque matter envelope.
The keys to our future are the simple constructs that have given our life its most immediate meaning. There is a University education within a country walk or city park, had we but the self-confidence and self-respect to acquire it.
Tripartite Essentialism is based upon the field theory notion that there is a continuous relation between one system and another through a common medium. i.e. A to B through some common C.
To add to this idea is the notion of Emergence, where one less sophisticated system, by its more massive scales of chaos (ether) and in a higher energy state, feeds into and [emerges] another, more sophisticated system in our time space.
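The A-to-B-through-some-common-C relation can be written down as a minimal data structure. This is a purely illustrative, editorial formalization; the class and field names are mine, not part of the theory:

```python
from dataclasses import dataclass

# A minimal, purely illustrative encoding of the tripartite relation:
# a transaction from system A to system B through a common medium C.
# Names and fields here are editorial, not part of the source theory.

@dataclass
class Transaction:
    source: str        # A - the higher-energy, less sophisticated system
    sink: str          # B - the emerged, more sophisticated system
    medium: str        # C - the common medium (e.g. 'ether')

    def describe(self) -> str:
        return f"{self.source} -> {self.sink} through {self.medium}"

t = Transaction(source="transparent matter",
                sink="dense matter",
                medium="opaque matter")
print(t.describe())
```

The point of the encoding is only that every transaction in the scheme names three terms, never two: the medium C is always explicit.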

A New Cosmology.
To analogize using the cycles observed in the material creation and recreation of stars: first-born or primary stars are hot and e.g. blue or ultra-blue, or indeed may not be visible at all in some ways, although they may be substantial stars relative to some other part of another bubble universe in some other foam. New stars have a short life and eventually go supernova and explode, releasing the products of their cycle of physical synthesis as relatively more processed debris.
This is because they are newly emerged matter, comprised of unstable and chaotic processes.

At the start of the lifecycle of a star where aggregates of debris are impelled and constrained by gravity to coalesce, an emergence/fusion reaction kicks in if there is a sufficiently steep emergence gradient from transparent matter, to opaque matter at the locality of the dense matter mass.
These highly temporal and chaotic hot blue star fields are the product of emergent intrusion from a physically transparent adjacent bubble in the foam and may incessantly froth and explode and remain in a chaotic material state. However, they may eventually emerge slightly cooler green stars by a process of fusion and by merging such assets as green star materials together. The processes of gravity and opaque matter gradients and local material relativity, as eventually supplied, may for example provide a bit of local temporal debris that could serve as a seed around which to grow new emergence products.
Under these conditions of fusion and emergence, cluster galaxies may form, fed into by a central fusion reaction or ‘white hole’.
The spherical galaxy may continue creating and recreating in a variable way, but that would not automatically preclude the cessation of the conditions of emergence and donation from the other bubble in the foam that was supplying it. The status of supply of structures may drastically change in these unstable conditions. The emergence gradient from the adjacent bubble having dropped off, core stars in this galaxy may re-enter into adjacent chaos, whilst other stars on the periphery of the sphere
may drift off into this cosmos being excluded from the new equilibrium between emergence and entropy around the changing white hole.
In other aspects of our own universe bubble, energy may be passing outwith our universe down a variable gradient into another bubble of variably lesser energy than our own. [a black hole or an extended tear or worn patch of dissipation in our cosmos]
Dissipating energy will create a tear or rift in local dense and opaque matter.
If a galaxy is in this region the densest area of the galaxy may start to donate matter to the other adjacent lower energy cosmic bubble down a concentration gradient.
This stream of local matter will generate large amounts of mass and energy and will extract this from local stars and galaxies.
This toll material from the latter may also coalesce under great pressure, sometimes being unable to exit simultaneously due to the variable gradient conditions imposed by chaos.
Under these circumstances, where matter is queuing under pressure to exit this cosmic bubble, such matter may react, creating explosions from the source of the black hole. E.g. the ejection of tertiary stars [as noted 2005].

Such exits are not necessarily at the centre of a galaxy, and not necessarily relatively black in terms of a gravity well; they may be small, widely spread, homogenous patches of relatively smaller brown pores, comprised of leaks in the matter and opaque-matter ether of this bubble, such that the fabric of the bubble is more resilient to the local demands in this locality.
This is rather like a more resistant but porous brown membrane spread over a galactic cluster than a single black gravity well at the heart of a galaxy.
There may also be instances of relatively brown holes at the heart of a galaxy.
This state of affairs may come upon a relatively hot galaxy or upon a relatively cold galaxy.
Eventually, such compacted material of inappropriate relativity, kept there by bubble exit pressures (basically ‘suction’) and now exchanging energy across very steep differentials between new and unstable aggregates, under the new conditions of re-emergence and reconstruction from its original sources, explodes in a state of combustion, releasing the stellar material.
If conditions for black hole formation occur in a formerly fusion fed hot galaxy, there may be green or yellow stars and star components produced that tend to be fusionable which would further recombine.
This explosion may also produce enough material for yellow or green stars that operate by combustion and fission.
There may be however green or yellow stars that are wholly endowed with processes of fusion.
If conditions for black hole formation occur in a formerly fission fed cooler galaxy, then its subsequent explosion could also produce both types of green and yellow stars that use both fission and fusion.
End products of fission etc are red, brown and black dwarves that eventually recreate the same physical processes and conditions operating within a black hole that lead to a further explosion and the creation of more sophisticated atomic (tertiary) stardust.
The various transitional states from red dwarf to black suggest by analogy that a black hole entropy gradient could be observed in some forms as the ultimate phase in their cycle.
It may be observed from; the descriptions of black holes, white holes and emergence gradients between cosmic states of transparent, opaque and dense matter, and between aggregates of our own dense matter atoms by using Fajans’ Rules, that the processes in the macrocosm look identical to those in the microcosm except for the labelling notions that the psychology of magnitude attaches to them.
The macrocosm by analogy is a very slowly moving weather system, galaxies being the eyes of the energy storm of chaotic emergence.

The subsequent migration of electrons, protons and neutrons, and the migration and recycling of fissionable and fusionable material between black and white holes of relatively opposite polarity and behaviour, indicate that the simple transaction model of A to B through some common C, i.e. from big scale to small scale across a variable and changing common C, is the simple cornerstone principle of the Universe.
In keeping with this idea, from some other context D, some very large pocket of a different and even greater magnitude of vast transparency or opaqueness could deliver a very large newly formed pocket as a small bubble directly adjacent to our doorstep. As suggested in the atomic-scale model, protons, electrons and even neutrons would migrate to fill it, and this process may give us a clue as to the eternal creativity of this Multiverse that we all stay in to make new, baby Universes.
Therein lies the possibility of eternal recombination and strange new introductions to strange new properties, for much of the material could come from bubbles adjacent that possessed incredibly different material behaviour than the properties of matter currently within our own Universe.
The key to all of this analogy and modeling comes from the simple deductions about material properties made by Fajans, called Fajans’ Rules, i.e. big to small through a common medium, as presented in a book called Physical Chemistry by GI Smith.
A very large 15+ volume set on Physical Chemistry from a technical University bookset, however, only produced ‘Soddy-Fajans’ in the index.
Although it was clearly not from this planet in its editorial style, it was very clear that we were not merely talking about a principle of transfer from high-energy atoms to low-energy atoms.
It remains to be seen, however, what Dr Soddy contributed to mankind along with Dr Rutherford when in 1902 and 1903, they produced the theory of 'radioactivity'.
With Einstein's 'special theory of relativity' in 1905, Planck's 'Quantum Theory' in 1900 and again in 1920, and Bohr's atomic theory in 1913, by the time 1927 came around Heisenberg couldn't find out which particle anybody was referring to, because particles consistently refused to be found: 'the principle of indeterminacy', 1927.
With Olbers' paradox (dating to c.1823) claiming that something was holding them all up somewhere or the sky would be white at night, despite Michelson-Morley's 1887 dismissal of the theory of ether, Einstein's 1905 pronouncement that light ossified at a constant called 'c' was still upheld in the 1930s and beyond. It was left to Schwarz and Green in 1984 to lasso Galileo's 1610 'sidereal messenger', utilizing the methods and lexicon of Sir Isaac Newton's 'Opticks' (written in 1704), with their theory of 'Super strings'.

Green and Schwarz's 10- and 26-dimensional string theory, however, even today continues to be inconsistent in its own terms as a sufficient and logical explanation of 'everything'.
A veritable particle zoo of arbitrary; size, naming, properties, behaviour of classes ... 'even conventional elementary particle physics has problems when it comes to mass. The basic quantum mechanical rules give no reason as to why mass should be fixed at all. There seems to be no rule why the mass of the electron, for example, should not have a whole range of values. Moreover the various [super strings] symmetry schemes that were created over the last two decades work best when the particle masses are zero.' [Peat DF, 'Super strings', p.230]
Symmetry and labeling ensued that began, even in its nomenclature, to mimic the mathematics, behaviour and torque of turbulence and chaos systems.
Linear mathematical topology began to be twisted and pulled into 'twister networks' and 'spin networks' [Peat DF, 'Super strings' pub.1988, Cardinal, ISBN 0-7474-0583-2, P.231-3].
By creating linear algebraic models of particle behaviour in this turbulence, the mathematics of particle relativity collapsed in places called 'gauge fields' between the different particles.
There, the 'collapsing wave function' could only be solved in mathematical topology by adding a small appendage onto the ends of the particle model like a 'weather vane'. E.g. Penrose. p.270. 'This measurement problem has been around for 50 years now ..' p.271.
It remains to be seen, however, in their study of the torus and twistor mathematical topological models, drawn from e.g. the 19th-century topological algebra of Grassmann HG and Clifford WK, whether the super strings mathematicians remember the basic paradox of the torus presented by Schiffler.
Schiffler's Horns paradox explains that a joined torus is impossible and that in fact, all is non-linear chaos.
Green, Schwarz and Gross's super strings are built out of 'compactified' space that allegedly [Peat p.289] compresses 10 and more torus 'structures' together. Also, to make the theory more workable, the edges of torus particles somehow need to connect. In the mathematics of topological chaos, however, a structure such as a torus doesn't logically and formally exist.
With even Green and Schwarz now suggesting that some of these empirical dimensions are 'not really dimensions at all', [Peat, p.320] the underlying assumptions and mathematical models used to constrain chaotic empiricism into various regular torus layers are being torn apart.
Super strings, therefore, is a dark, knotted, unfalsifiable cul-de-sac of arbitrary labels and fragmented mathematical models.
Chaos is evident everywhere in particle energies, even from the linear mathematical investigations, where a particle model performed better because it had mathematical appendages.

These ideas are best seen illustrated in another diagrammatic representation of Schiffler's Horns that uses not a torus to illustrate fractal reality but a representation of a triangle with appendages that are fractal growths upon it, called a Koch Curve.
The side of the triangle is only a point of view from another point of chaos somewhere in its fractals in the same way that the incomplete torus of Schiffler is only a point or curve from within the fractal relativities of the unjoined horns.
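The Koch construction can be stated numerically: each iteration replaces every segment with four segments one third as long, so the length grows by 4/3 per step and the fractal dimension is log 4 / log 3 ≈ 1.26. A minimal sketch:

```python
import math

# Length growth and fractal dimension of the Koch curve: each iteration
# replaces every segment with four segments one third as long.

def koch_length(iterations: int, base: float = 1.0) -> float:
    """Total length after n iterations, starting from a unit segment."""
    return base * (4.0 / 3.0) ** iterations

koch_dimension = math.log(4) / math.log(3)   # ~1.2619: between line and plane

for n in (0, 1, 2, 5):
    print(f"iteration {n}: length {koch_length(n):.4f}")
print(f"fractal dimension: {koch_dimension:.4f}")
```

The length diverges as iterations accumulate while the figure stays bounded, which is the sense in which the 'side of the triangle' is only a point of view on an endlessly detailed edge.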
Schiffler’s Horns is a direct refutation of the super strings torus model.
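The Koch curve's construction can be made concrete numerically: each iteration replaces every segment with four segments a third as long, so the boundary's length grows without bound while the figure stays finite. A minimal sketch in Python (the function name is my own illustration, not from the source):

```python
from math import log

def koch_counts(iterations):
    """Each iteration replaces every segment with 4 segments, each 1/3 as long."""
    segments, length = 1, 1.0
    for _ in range(iterations):
        segments *= 4
        length *= 4 / 3
    return segments, length

segs, length = koch_counts(5)
print(segs, round(length, 4))   # 1024 4.214

# The curve's fractal (Hausdorff) dimension, log 4 / log 3:
print(round(log(4) / log(3), 4))   # 1.2619
```

The unbounded length of a finite figure is the sense in which any 'side of the triangle' is only a point of view from within the fractal.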
Within the empirically measured energies of the Cosmos there are many kinds of models and data that would substantiate the ontology of chaos theory.
This would include a universal and 'natural' objectivism and field theory of emergence based on natural turbulence.
With energy emerging and forcing its way into our 'dense matter', it was left to the most ancient and rejected and presumably dejected school of scientific ghosts to rattle the chains behind the brick walls of those lost academic cupboards of the 19th Century.
Hooper WG, 'Aether and Gravitation', pub. 1903, Chapman and Hall, London, writes on page 63: '... the atomicity of the aether has already been suggested by such scientists as Clerk Maxwell, Lord Kelvin, Dr. Larmour, and Professors Lodge and J.J. Thompson.' Clerk Maxwell, in an article on 'Action at a Distance' in the collected works by Niven, referring to the atomicity of the aether, writes: 'its minute parts may have rotatory as well as vibratory motions, and the axes of rotation may form those lines of magnetic force which extend in unbroken continuity into regions which no eye has seen'. Lord Kelvin, in several articles on 'Vortex Motion' in the Philosophical Magazines of recent years (c.1903), has mathematically dealt with the aether from the atomic standpoint, and has endeavoured to prove that the aether medium is composed of vortex rings, but he was unable to come to any mathematical conclusion.
Presumably the earliest version of Microsoft Windows in 1903, called Microsoft Windmill, had been infected with the 'annelid worm' and had not lived up to the Great Expectations needed for the massive chaos computations later required for vortices in the sophisticated academic halls of the third millennium.
Of the field theory that ties the aether together with the compression of continual emergence, the 1903 data available to Hooper gave him enough perspective to deduce for physical properties what, over the coming 100 years, would be seen to apply to Biology, Psychology, Chemistry, Cosmology, Time, etc.
'The law of inverse squares which governs not only the law of gravitation attraction, but also electricity and light, is equally applicable to the phenomena of heat, so that the intensity of heat varies inversely as the square of the distance. Thus, if we double the distance of any body from the source of the heat, the amount of heat which such a body receives at the increased distance is one-quarter of the heat compared with its original position. If the distance were trebled, then the intensity of the heat would be reduced to one-ninth; while if the distance were four times as great, the intensity of the heat would only be one-sixteenth of what it would receive in its first position.' [Hooper WG, 1903]
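Hooper's worked figures follow directly from the inverse-square relation; a minimal sketch reproducing them (the function name is illustrative):

```python
def relative_intensity(distance_ratio):
    """Intensity received relative to the original position: 1 / d**2."""
    return 1 / distance_ratio ** 2

# Hooper's worked examples: doubled, trebled and quadrupled distance.
for d in (2, 3, 4):
    print(d, relative_intensity(d))   # 2 -> 1/4, 3 -> 1/9, 4 -> 1/16
```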
Across this chaotic medium, the ether, spinning, rotating and vibrating, the waves of light and electricity propagate, attracted and deflected by larger gravitational masses. Clerk Maxwell, in his paper on 'Action at a Distance' (collected works, by Niven) writes, ' .. in its infinite continuity .. it extends unbroken from star to star, and when a molecule of hydrogen vibrates in the Dog Star, the medium receives the impulses of those vibrations, and
transmits them to distant worlds.' [Hooper WG, 1903, p.59.]
'Lord Kelvin in giving an address to the British Association, 1901, on 'Clustering of Gravitational Matter in any part of the Universe.' said: 'we are convinced with our President (Professor Rucker) that Aether is matter. Aether we relegate to a distinct species of matter which has inertia, rigidity, elasticity, compressibility, but not heaviness.'
One hundred years later, in the third millennium, Professor Higgs, the Scientific Community and the 'Massless Vector Boson' that may solve the mathematical chaos of super strings and the quest for interplanetary resources, are still beyond help.
Faraday in [Hooper, 1903] writes, (Exp. Res., vol. ii.); 'The view now stated of the composition of matter would seem to involve the conclusion that matter fills all of space, or at least all space to which Gravitation extends, including the sun and its system, for Gravitation is a property of matter dependable on a certain force, and it is this force which constitutes matter. Aether must also be matter.'
Hooper in 1903, an objectivist to the end states 'for example, the laws which govern the light and heat of the sun are the same which govern the light and heat of a candle or a glow-worm; and the laws which govern a planet or world are the same as those which govern an atom. Thus a planet or world, which is simply an agglomeration of atoms, may reveal to us in its motions and laws, what are the motions and laws which govern the atomic
world.'
The laws of attraction and repulsion as stated by Newton, the laws of planetary attraction, the conservation of momentum, 'for every action there is an equal and opposite reaction', etc., belong to an era bereft of massive computational analysis.
Numerous empirical regimes came from the famous Lord Kelvin and from Boyle, whose gas laws and studies of turbulence would have been major scientific advances had they had a version of Windows 95; yet the main theories on aether, although very feasible, were doomed without a driving and causal engine: the engine of emergence.

Factors noted by [Hooper, 1903] as centripetal and centrifugal forces would, without a chaos-driven emergence model to drive the 'antigravity' or out-throw of the centrifugal forces being measured, fail in 'gravitational collapse'.
'Here, then, is presented to us a kind of order of celestial phenomena for whose well-being and effectual working the centripetal force or the attraction of gravitation cannot possibly count. In their case another force is demanded which shall be the exact complement and counterpart of the centripetal force.
There needs therefore a force, not an imagined one ... a force existing in each world just like the attraction of gravitation, only the reverse of gravitation, a repellant, repulsive force, acting in the reverse mode and way, to universal attraction. This force must be governed by the same rules and laws that govern centripetal force if it is to work in harmony with the same.'
[Hooper, 1903, p.31.]
Beyond Kelvin's ideas, though, was Professor Hill's 'Spherical Vortex Atom' [Phil. Trans., 1894; Hooper, p.62]: '... in the conception there put forward, and mathematically worked out, Professor Hill showed that his spherical vortex atom possessed similar properties and characteristics to the vortex rings of Kelvin ... that atom would; rotate, be a magnet, possess elasticity, compressibility, inertia, ... and ... a certain amount of mass.'
The background to these late 19th century theories, however, fell apart without the tools of massive computations in turbulence and complexity needed to precisely measure the material and its behaviour.

Newton's 'unifying theory of gravity' did not hold the unity of the universe together. Hooper predicts emergence as a counterbalance to gravity, but believes that stars such as the sun were its source, on the basis of the results published by Michelson and Morley in Phil. Mag., December 1887.
Hooper on numerous occasions predicts truly, but unfortunately he went with Lord Kelvin's 'smoke rings' and 'linkages' instead of Hill's 'sphere'.
Without the insight of 'compression' from emergence and accountability for particle recombination and formation, he did fail to produce unity, but, to date, he has been the most advanced particle physicist that the 20th Century has ever seen.
Hennessey's Harmonic Continuum Theory of 2004, first collated in 1991, however, takes a different approach to scaling and internal processes within ether. Hooper places emphasis on an analogy of a Kelvin vortex ring atom surrounded by an elliptical cloud of ether, as he had allegedly seen Michelson-Morley publish about Earth's ether envelope in 1887.
His unifying force that formed the inside of the vortex was Newton's gravity, which he called a centripetal force.
If I were using his terminology to explain my theory to him, I would have stated the exact opposite of his findings.

It should have been the chaos force that caused the particles. His 'centrifugal' force was the work of chaos on the atom at time1, throwing energy out into turbulence.
It is emergence that drives the atomic compaction that we discern as 'gravity' and that would make Hooper wrong about the way that he interpreted atomic gravity and also about the arrangement of his ideas about physical extremes as perceived by Newton. e.g. spectra.

This pressure compresses the ether into bigger particles and pockets that resonate their etheric substrate at time1 with the activity of transverse waves. This causes electron shells or 'Quantum numbers'. Rather than a fixed number of quantum shells, however, there are relatively variable empirical results for the distances between the energy states of these internal waves.
This and other motions, spins and relative displacements have caused the paradox of non-locality observed by Heisenberg [1927], and also by recent physicists who, using more precise technology, were able to manufacture, destroy and enable whole series of arbitrary particles in a 'particle zoo'.
These particles, e.g. charm Quarks, Hadrons, Mesons, Gluons etc., became very difficult to classify or utilize.
The chosen classification system for the smaller-scale atomic components was made counterproductive by the non-intuitive Red Green Blue or RGB colour scheme used to classify 'quarks': it does not easily and accurately predict symmetry within complementary colours, and was therefore difficult to analogize with before attempting to interpret the quark results.
The other difficulty with Quantum Electrodynamics was that it was not possible to contain and restrict particle sizes within the theory model. The scaling issue would have required some 'glue' or 'charm' to keep it all stuck together. This has been practiced in QED and QCD [Quantum Chromodynamics] using the laws of Boolean Algebra, which, as you may see from the mathematical discourse in this work, do not all add up.
The law of adding things together, A and B, to get B and A produces a set containing A and B for the purposes of calling A and B a particle class, i.e. commutation, e.g. Abelian sets. Non-Abelian grouping in gauge theories will produce no rational standard of relativity whatsoever unless either the laws of association or of distribution are applied. No other Boolean rules provide any rational alternative, i.e. sum, product, absorption.
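For reference, the rules named here, commutation, association, distribution and absorption, are standard Boolean lattice identities. A short Python check of them over ordinary sets (the example sets are my own illustration):

```python
A, B, C = {1, 2}, {2, 3}, {3, 4}

# Commutation: order of combination does not matter.
assert A | B == B | A and A & B == B & A

# Association: grouping does not matter.
assert (A | B) | C == A | (B | C)
assert (A & B) & C == A & (B & C)

# Distribution: intersection distributes over union (and vice versa).
assert A & (B | C) == (A & B) | (A & C)
assert A | (B & C) == (A | B) & (A | C)

# Absorption, listed in the text alongside sum and product.
assert A | (A & B) == A and A & (A | B) == A

print("all Boolean lattice identities hold")
```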
Also, although Planck's Constant is directly related to the frequency of emissions, and even though it is also chained to the Einsteinian light speed, it is insufficient to account for all the basic factors involved in the energy exchange.
'.. unlike the halfpenny, however, the value of the quantum is not fixed, but is related to the frequency of radiation which, by its emission or absorption, causes the change in energy…’

[Brown GI, 'Introduction to Physical Chemistry SI Edition', pub. Longmans 1975, ISBN 0-582-32121-X, page 105.]
Planck [1900], in not measuring the rates of emergence of newly created material, had omitted a second construct from his equation. Hooper of 1903 had, in fact, a more sophisticated grasp of the problems within physics that were to continue for the next 100 years.
The relative turbulences observed within particle interaction include, e.g., 'Jet particles': '.. a system of particles produced during particle reactions at high energies. The jets are interpreted as fragments of elementary objects such as quarks and gluons.'
Fritzsch H, 'Quarks, the stuff of matter' pub. Pelican 1982. ISBN
0-14-022470-X.
The various theories of the weak and strong electromagnetic interactions and their 'invariant symmetry transformations whose effects vary from point to point in space time' [Fritzsch, 1982 p.217.] are called Gauge Theories.
The field theory as it operates and diminishes by power law between quarks has been noted in terms of degrees of 'asymptotic freedom.'
Hooper in 1903 p.221. had already noted the value of Kepler's Third law in this respect using a holistic planetary analogy.
'.. Whewell on this matter in his Inductive Sciences states that 'Kepler assumed that a certain force or virtue resided in the sun by which all bodies within his influence were carried round him. He illustrated the nature of the force in various ways, comparing it to light, and to the magnetic power that it resembles in the circumstances of operating at a distance, and also of exercising a feebler influence as the distance increases.
Another image to which he referred suggested a much more conceivable kind of mechanical action by which the celestial motions might be produced, viz, a current of fluid matter circulating round the sun, and carrying the planets with it like a boat in a stream.' Whewell adds: 'A vortex fluid constantly whirling round the sun, kept in this whirling motion by the sun itself, and carrying the planets round the sun by its revolution, as a whirlpool carries straws, could be readily understood, and though it appears to have been held by Kepler that this current and Vortex were immaterial, he ascribes to it the power of overcoming the inertia of bodies, and of putting them and keeping them in motion,' [Hooper, 1903, p.221-222.]
Kepler's Third Law as stated by Hooper p.37 and 33 'gives the relation between the (orbit, or) periodic time of a planet and its distance from its star as: the squares of the periodic times of planets are proportional to the cubes of their mean distance. i.e. p.38 .. if we have the periodic time (orbit) of any two planets, and the mean distance of either, we can find out the mean distance of the other by simple proportion.'
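Hooper's 'simple proportion' can be sketched directly. Working in Earth units (periods in years, distances in astronomical units), Kepler's third law reduces to T² = a³, so either quantity yields the other. A minimal sketch (the Mars period is a standard modern value, not a figure from Hooper):

```python
def mean_distance_au(period_years):
    """Kepler's third law in Earth units: T**2 = a**3, so a = T**(2/3)."""
    return period_years ** (2 / 3)

# Mars orbits in about 1.881 Earth years; the law recovers its mean distance.
print(round(mean_distance_au(1.881), 3))   # 1.524 (AU)
```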
In the bubble chambers of the creative particle physicists, however, the Jet and Charm particles gradually eroding the tenacity of quantum numbering were producing an assortment of 'Flavour' particles locked into 'infrared slavery', with no particular reason to be going or staying. In the absence of the local and temporal emergence-gradient figures for that year as some sort of constant to put into Planck's equation - an additional burden on the already [a priori] etherically burdened speed of light that would additionally impact on existing matter - they were going to need 'Glueballs' [Fritzsch H] and 'Spaceballs' [Brooks M] to keep it hanging together.

In Cosmology, there would also be cosmic bleeding to account for where large tracts of our cosmic bubble would leech through an opaque membrane by osmosis - as opposed to the singularity of transfer created by a black hole.
Andrew Pickering in his 'Constructing Quarks - a sociological history of particle physics', pub 1986. Edinburgh University Press. ISBN 0-85224-535-1 page 413 refers .. 'Twentieth-century science has a grand and impressive story to tell. Anyone framing a view of the world has to take account of what it has to say ... it is a non-trivial fact about the world that we can understand it and that mathematics provides the perfect language for physical science: that, in a word, science is possible at all. (Polkinghorne
(1983))
Such assertions about science are commonplace in our culture. In many circles they are taken to be incontestable. But the history of HEP (high-energy physics) suggests that they are mistaken. It is unproblematic that scientists produce accounts of the world that they find comprehensible: given their cultural resources, only singular incompetence could have prevented members of the HEP community producing an understandable
version of reality at any point in their history. And, given their extensive training in sophisticated mathematical techniques, the preponderance of mathematics in particle physicists' accounts of reality is no more hard to explain than the fondness of ethnic groups for their native language.'
This problem also extends into logic, where the brick wall of the arbitrary has held up successful evolution in computing, e.g. Turing's recursion and Gödel's incompleteness-of-logic paradoxes. W.V. Quine argues that 'the traditional concept of linguistic meaning should be rooted out of respectable, scientific thinking and enquiry ..' ['Theories of meaning: after the use theory', p.50; Copeland BJ and Stoothoff RH, criticising.]
It was easy to see their point of view in this work.
'Radically translating' to my own analogy ...
e.g. Mr Quine goes to Africa, deep in the jungle, and does something very, very bad to the chief's daughter - 'gouranga' - but Mr Quine asserts that he is always, absolutely going to be safe and sound every single time he does that, on the basis that since he cannot understand a word they are saying, he cannot interpret their intentions towards him and their ultimate meaning to his life, i.e. 'these observations will never narrow down the range of possible translations to just one.' [Copeland and Stoothoff]
Beyond a doubt, then, unless Mr Quine is tooled up with superior firepower in the American tradition of Charlton Heston - AND he can get to the 'Forbidden Zone' in time in possession of a mean-looking monkey suit - he isn't going to leave that village alive, even if the natives cannot agree what the order of precedence should be for the expiation of their ritual methods of slow execution.
W.V. Quine, however, in his book 'The Philosophy of Logic', edn. 2, pub. 1970, Harvard University Press, ISBN 0-674-66563-5, on page 69 also displays a rather worrying preoccupation with the magical properties of 'new maths', namely 'Boolean Algebra', though he doesn't name the rules of sum, product and absorption as useless, or the rules of commutation, association and distribution as useful.
With the rules of semantics broken by the useless Boolean rules and also endorsed by Logicians it would become difficult to imagine how anybody could make any sense of their particle physics results whatsoever.
Instead of Rutherford, Soddy and Planck c.1900 - 1903 with billiard balls held together by c.1980's 'gluons' whilst radiating quantum shells - there is instead a deluge of energy compressing into various packets of various sizes between which and within which are gradients of varying velocities and impedances.
The frequency of transaction across the density and resistance within each packet is relatively driven in time with emergence and its field strength in any temporal locality.
At high frequency and low resistance gradients there are many atomic shells.
The Structural and scalar persistence and ongoing integrity of atomic localities are due to resonance between; emergence and frequency driven, pattern-based structural interdependencies.
These aggregates of simples and complexes are continually fed by the compaction energies of emergence.
Highly facilitative, rigorous and highly structural postponements amongst driven interstitial elements, e.g. in diminishing order of scale neutrons, protons and electrons, incorporate self-sufficiency and relative immunity to emergent driving. At that point they begin to mirror and more accurately reflect the gradient products and by-products that they have initially been attracting by difference in energy potential [i.e. from high to low energy in this particular packet of ether in which atoms are in growth].
This would eventually develop into a 'dance of symmetry’ that would then attract a greater scale of focus of these initial components, down an increasingly steeper gradient at time1.

If the flow continues to keep the ether pockets supplied in excess, therefore, it will be possible for inanimate matter to grow.
At this scale of atomic complexity, two symmetrically entwined systems that are; compacted, fed by steady and dependable gradient, that are interacting and mutually interfering with their integrities - may in this turbulence, be able to impart a notion of scale and symmetry to a third amount of supplied material thus creating a new and similarly-scaled particle performing to local ratios of transference.
The energies of matter interact through the physical transactions of constructive and destructive interference. The propagation of the transverse waves in variably dense and turbulent media by harmonic and vibratory motion is driven by emergence and dissipated by entropy.
The rules of Quantum Physics [QED] do not regulate particle sizes in Quantum Shells [Peat DH] so this does entail that it is the transaction that the particle facilitates at time1, rather than its scale that would classify it in a periodic table.
'In Newlands' Law of Octaves in 1864, he arranged all the elements he knew in ascending order of relative atomic mass and assigned to the elements a series of ordinal numbers which he called atomic numbers. He then noticed that elements with similar chemical properties had atomic numbers that differed by seven or some multiple of seven.
In other words, Newlands discovered that the chemical properties of elements were often similar for every eighth or sixteenth element, like the notes in octaves of music.' [Brown GI, 'Introduction to Physical Chemistry, SI Edition', pub. 1972, ISBN 0-582-32131-X]
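Newlands' ordinals can be reconstructed directly. Because the noble gases were unknown in 1864, numbering the elements in mass order without them makes chemically similar elements, e.g. the alkali metals, recur at intervals of seven, exactly as the quotation states. A sketch under that assumption (the element list is a modern reconstruction, not Newlands' own table):

```python
# The first 17 elements in ascending mass order as known in 1864: no noble
# gases, which were then undiscovered.
elements = ["H", "Li", "Be", "B", "C", "N", "O", "F",
            "Na", "Mg", "Al", "Si", "P", "S", "Cl", "K", "Ca"]

# Newlands-style ordinal numbers, 1-based in mass order.
ordinal = {symbol: i + 1 for i, symbol in enumerate(elements)}

# The chemically similar alkali metals recur at intervals of seven:
print(ordinal["Li"], ordinal["Na"], ordinal["K"])   # 2 9 16
```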
The process of emergence, drives the facilitative structure of particles by compressing and feeding the shells across and through variously dense but persistent, pockets of ether across an inconsistent but relative transaction gradient.
This will drive the formation and concretion and cohesion of the particles whilst their elasticity is being continually challenged by constant change. [Hennessey's Rules, 2004]
In these circumstances, particle opportunities are; persistent, temporary and generic and non-generic.
Resonance Hybrids of these generically similar atomic states interact 'electronically' in variously indescribable ways, e.g.:
'No single structural formula which can be written for benzene, for example, accounts for all the known properties of benzene, and this is so for many other compounds.' [Brown, p.185]. Ingold introduced the description of these chaotic interactions as mesomeric states or mesomerism, i.e. 'between the parts'.
In this chaos, atomic structure itself will shift about and elastically self-regulate, creating different atomic structures e.g. tautomerism in classical chemistry, or, also driven to mutate, grow or decay into some other element by emergence, density and turbulence gradients within the ether.
According to Bohr [1913], Rutherford's spinning electrons didn't fly away, expelled by centrifugal activity - they couldn't escape because they were locked into stationary states.
Plichta P, pub. Element 1997 'God's Secret Formula -
Deciphering the riddle of the universe and the prime number code' ISBN 1-86204-014-1, attempts to account for such disintegrity in gauge field spaces with a system of mathematical prime numbers.
He also produces from this an inference that good things come in threes e.g. egg, larvae, insect, or base, sugar, phosphate, but his tripartite theory incorporates the 'a priori' assumption of Planck and Einstein, of fixed; neutrons, protons and electrons, and that the relationships between these are governed, and regulated by concepts within definite mathematical space of a relatively planar Sierpinski Triangle. [a fractal device]
[Another equivalent of Schiffler's Horns in Super strings theory
or indeed the Koch Curve.]
To quote Plichta, 1997, page 194, '.. in reality, so many atoms collide with each other simultaneously in a gas-filled space that nobody would ever get the idea that the kinetics of colliding gas atoms in principle only involve dual collisions and thus yes-no decisions. The space they occupy thus behaves mathematically like the grid space that can be described by Pascal's Triangle.'
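Pascal's Triangle, which Plichta invokes for dual yes-no collisions, and the Sierpinski Triangle mentioned above are directly related: the binomial coefficients taken mod 2 trace the Sierpinski pattern. A minimal sketch:

```python
def pascal_row(n):
    """Row n of Pascal's triangle: counts of yes-no outcomes over n dual choices."""
    row = [1]
    for k in range(n):
        row.append(row[-1] * (n - k) // (k + 1))
    return row

print(pascal_row(4))   # [1, 4, 6, 4, 1]

# Taken mod 2, the triangle traces the Sierpinski pattern:
for n in range(8):
    print("".join("#" if c % 2 else "." for c in pascal_row(n)))
```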
With interstitial elements recombining and slewing off, e.g. Jet and Charm particles and numerous other massive unorthodox particles at any and every scale, the best natural model for scalar complexity, in its most simple of models, is not in fact the binary inference of the language [T] as Plichta here infers. By declaring a naive dualism in Fajans' Rules as a natural law, Plichta has locked particle activity into a 'monotonic' model that cannot account for the massive activities of turbulence within the temporally changing and emerging ether. These activities include factors such as differences of scale, transfer gradients and velocities, and also the new constant H, the rate of emergence.
Although appealing to a chaos ontology and being a mathematical approach to locating Heisenberg's un-locatable particle [1927], Plichta's [1997] strategy, unfortunately, like any produced in Europe in [1907], still bears the burdens of linearity.
It depicts a way with which to evaluate the effects of particle chaos rather than a methodological description of the process produced by Hennessey [1997], 'Tripartite Essentialism', page 83, pub. 1997, Outshore Multimedia, ISBN 0-9532034-0-9.
The modal Logic of the Language [A], however, allows for temporal undecidability, and set theory such as commutation, association and distribution, such that interacting scales of creative and destructive recombinants, driven and compacted by emergence, can be modeled. With emergence driving an equilibrium, there is always going to be some part of the process at some time1 where it cannot be told what the process is going to emerge into.
Although the language [A] allows for this in its most minimal partite logic, it describes the complete range of these transactions in a closed, finite and absolutely limited set of essential numbers [A] from [a1, a2 ..a729].
In using Yes, No and Don't Know, the don't know part allows for Chaos, whilst Plichta's Theory with essential numbers [T] from [t1, t2 ..t64] does not as every transaction in Plichta's model is always absolutely certain and always absolutely known.
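The two state-counts quoted here are simple combinatorics: six two-valued decisions give 2^6 = 64 states, while six three-valued (yes / no / don't know) decisions give 3^6 = 729. A quick enumeration:

```python
from itertools import product

# Six successive decisions, binary versus ternary outcomes.
binary = list(product(("yes", "no"), repeat=6))
ternary = list(product(("yes", "no", "don't know"), repeat=6))

print(len(binary), len(ternary))   # 64 729
```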
Energy is being continually forced into atomic structures, although continually mutating in a relative harmonic field by degrees of; folding, gradient and compaction into any recipient etheric pocket. The atomic field at time1 in this packet of atoms is regulated by ether density, and the transaction gradient at time1 is driven by harmonic vibration which, depending on the frequency and the substrate will have several harmonic shells.
The forces of 'gravitational' compaction at the heart of the particle may vary, but they will contribute to the degree of expulsion that gives shape to the atomic structure at time2.
The forces of attraction called gravity - a passive idea, therefore, are in truth, the forces of compaction - an active concept. Isaac Newton's apple was in fact pushed down from the sky by a process interruption in the massive scales of carbon within the tree, where the massive abnormal scaling differences between the branch and the ground enabled a massive high velocity transfer with very little impedance to convey it to the ground.
The ethnocentric vision of good things falling into our laps to uphold our lifestyle without taking care of our orchards and what shoots they may push up next year has haunted us for millennia.
Hooper's analogies and rustic charm of the straws and the whirlpools were more grounded in reality than one hundred years of pointless hitek cider.
Work done is done by entropy against new emergence at time2 and self-regulating cohesion and its innate distribution. This turbulence at time1 by entropy supplies adjacent particle systems.
This supply, however, cannot be explained by the models of the last 100 years. e.g. Planck's 'shells' are 'visualised' [Brown, p.108].
At time1, interplay between molecules is driven by an emergence gradient of created and recreated energy and matter via a mediated transaction of higher to lower, e.g. osmosis, Fajans, Ohm.
Compaction and slewing off of particles from atoms by turbulence is a self-regulating equilibrium. When and if the velocity of the emergent energies of compaction slows, the bigger particles will donate energy [Fajans, 1924]. This donation will be mediated and impeded by particle size and number of shells, and by other gradients and demands within the adjacent etheric turbulence.
The normalizing factor on scale and density of atomic structure is the relative density and consistency of the local ether.
The fundamental attributes of atomic structure will remain the same in etheric context at any scale of local etheric context.
e.g. in a state of unity of cosmological chaos, ether may be so relatively ('impossibly') and ('incredibly') distorted as to produce a carbon atom the size of a football within a restricted set of circumstances that have penetrated from some lateral plane of scale. In terms of the current Quantum Theory [QED] one could get some very large particles in the wrong place.
Stark and Zeeman noticed that energy levels, and magnitude of 'electricity' in these shells altered and varied according to the lines of force in a field. This force depended on the velocity of interaction within and between the atomic composites and aggregates. The subsequent augmentation or detraction called 'quantum yield' however, will not be universally quantifiable as a law at any given time, or all of time.
Continual turbulence will slew away excess energy and ether that the atoms or its crystalline or state structures cannot hold on to at any given time.
Under the normative forces of emergence, however, where compaction is very consistent over time, inanimate matter may through over supply of similarities of the same ratios, replicate itself.
The equilibrium and bias caused by the velocity of the emergence gradient and driven across some value of normative ether, causes harmonic intervals 'atomic shells'.
These are continually supplied by emergence to retain the elasticity and persistence within Simple Harmonic Motion [SHM].
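Simple Harmonic Motion as invoked here has one defining property worth making explicit: for x(t) = A cos(ωt) the total energy is constant over the cycle, which is the 'elasticity and persistence' the text refers to. A minimal sketch (unit mass and amplitude are my own assumptions):

```python
from math import cos, sin, pi

def shm_state(t, amplitude=1.0, omega=2 * pi, mass=1.0):
    """Displacement, velocity and total energy of x(t) = A*cos(w*t)."""
    x = amplitude * cos(omega * t)
    v = -amplitude * omega * sin(omega * t)
    k = mass * omega ** 2            # spring constant, from w = sqrt(k/m)
    energy = 0.5 * mass * v ** 2 + 0.5 * k * x ** 2
    return x, v, energy

# Total energy stays constant across the cycle:
energies = [shm_state(t / 100)[2] for t in range(100)]
print(max(energies) - min(energies))   # ~0, up to floating-point rounding
```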
Peter Plichta, the industrial chemist, noted in 1997 in 'God's Secret Formula' that sophisticated carbon-based molecules (e.g. hexagonal ring structures, pentagonal ring structures etc.) seemed a natural priority, but spoke in terms of the 'strict divisions', p.39, and absolutes, p.32, as his model would dictate.
His brilliant exposition of the chemical morphology of life, however on page 200 and 201 of his book illustrates his true genius in Chemistry.
Plichta discovers that the arithmetic of atomic harmony has a direct and formative influence on the nature and structure of matter and material evolution.
This relates the threeness in the chemistry of life between the two main classes of life form on this planet: the vertebrates and the invertebrates.
e.g. the insect evolved in three geological ages: the Carboniferous, the Permian and the Upper Cretaceous. Its life cycle, finalised 65 million years ago, takes the form of; egg, larvae, insect - in the morphology of: head, thorax and abdomen.

This insect life, upon whose activities at the bottom of the food chain the trophic pyramids of carnivorous and omnivorous vertebrates are based, is founded mainly on hexagonal carbon ring structures.
The vertebrates with: head, torso and the extremities, also have tripartite reproductive nesting in their life cycle. e.g. egg, sac, womb.
Plichta noted p.201 '.. I suspected that life on this planet could be mathematically subject to the conditions of two forms of space.
If a flying insect is compared with a flying animal .. e.g. a swallow - it can be shown that both are built to suit the space in which they happen to live.
The structure and the function of the insect incorporates three-dimensional gas-filled space..'
I would disagree with that, as the swarm model by Langton at Santa Fe simplifies this behaviour to a 2-dimensional algorithm.
It does suggest, to the contrary, that insects are incorporated within 2-dimensional space operating in relation to stimuli within 3 dimensions, as indeed is the swallow in his example.
Plichta continues, 'A swallow is constructed entirely differently.
It has no armour of hexagonal sugar - instead it has a skeleton made of inorganic carbon salts.'
The common architecture of 'swarm' refers to a large number of simple agents interacting, whether they are a swarm of bees, an ant colony, a flock of birds, or cars in city traffic. Agents - with their own internal data and rules - act by passing messages back and forth to each other. The system also provides a field object to associate the agents with co-ordinates in space. The agents can modify the environment and in turn their behaviour is dictated by the state of the environment, providing a feedback loop. 'We are attempting to capture the architecture in a general-purpose way,' says Langton. 'Then people modeling insect behaviour, the economy, the behaviour of molecules getting caught up in complex dynamics, or the evolution of populations can go to the same simulator and not worry about a lot of very subtle computer science and engineering issues.'
Langton, in [Santa Fe Institute Bulletin, Fall 1993, vol. 8, no. 2, pages 13-14].
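The architecture Langton describes - agents with internal data and rules, a field object tying them to coordinates, and a feedback loop through a modifiable environment - can be caricatured in a few lines. The classes and rules below are a hypothetical illustration, not the actual Swarm simulator's API:

```python
import random

class Agent:
    """A simple agent with its own internal data (position) and a local rule."""
    def __init__(self, x, y):
        self.x, self.y = x, y

    def step(self, field, size, rng):
        # rule: move toward the strongest neighbouring field value (ties broken randomly)
        moves = [(dx, dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)]
        rng.shuffle(moves)
        dx, dy = max(moves,
                     key=lambda m: field[(self.x + m[0]) % size][(self.y + m[1]) % size])
        self.x, self.y = (self.x + dx) % size, (self.y + dy) % size
        field[self.x][self.y] += 1.0   # modify the environment: the feedback loop

def run_swarm(n_agents=30, size=20, steps=200, seed=0):
    """Run agents over a shared field object associating them with coordinates."""
    rng = random.Random(seed)
    field = [[0.0] * size for _ in range(size)]
    agents = [Agent(rng.randrange(size), rng.randrange(size)) for _ in range(n_agents)]
    for _ in range(steps):
        for a in agents:
            a.step(field, size, rng)
        # the field decays, so only trails reinforced by many agents persist
        field = [[v * 0.95 for v in row] for row in field]
    return field, agents

field, agents = run_swarm()
print("peak field value:", max(max(row) for row in field))
```

With decay switched off the agents pile onto a single cell; with decay on, shifting trails form and dissolve, which is the generic swarm behaviour the quoted passage describes.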
His 'zoological perspective', though - the end-based, telic principle of the incremental complexity and diversity of life, and of the predation and grazing between scales of sophistication - does ring true.
Rupert Sheldrake, in his 'A New Science of Life: The Hypothesis of Formative Causation', pub. 1985, Paladin, ISBN 0-586-08583-1, p. 95: '..Time after time when atoms come into existence electrons fill the same orbitals around the nuclei; atoms repeatedly combine to give the same molecular forms; again and again molecules crystallize into the same spatial patterns; seeds of a given species give rise year after year to plants of the same appearance; generation after generation, spiders spin the same types of web. Forms come into being repeatedly, and each time each form is more or less the same. On this fact depends our ability to recognize, identify and name things. This constancy and repetition would present no problem if all forms were uniquely determined by changeless physical laws or principles.'
Sheldrake does not believe, however, that these laws are testable - as they would have to account for the prior and automatic evolution of DNA - hence his morphogenetic field hypothesis of formative causation.
[T] relativity explains the changeless laws of transaction such that DNA automatically emerges at the end of a telic emergence chain: one that compresses the ether into atoms, then drives them into complex self-regulating arrangements of transitional polymers, then further compresses them, if time allows, into self-replicating autonomous processes capable of crossing the energy barriers presented by scale to emerge into and exploit new material circumstances.
Kauffman S, in Levy S, 'Artificial Life', pub. Penguin 1992, ISBN 0-14-023105-6, p. 136, demonstrates the emergence of self-regulating long-chain polymers from such a primordial soup.
Kauffman S, 'Origins of Order: Self-Organization and Selection in Evolution', Oxford: Oxford University Press, 1992, or its adaptation 'Antichaos and Adaptation', Scientific American, August 1991, pp. 78-85, are other presentations of Kauffman's discoveries.
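Kauffman's autocatalytic-set idea can be sketched as a toy model: random ligation reactions over short polymers, each catalysed by a few randomly chosen molecules, with a 'food set' of freely supplied small molecules. The alphabet, lengths and catalysis probability here are illustrative assumptions, not Kauffman's published parameters:

```python
import itertools
import random

def toy_autocatalytic_set(max_len=4, catalysis_prob=0.05, seed=1):
    """Toy Kauffman-style model: random catalysed ligation reactions over
    binary polymers; returns the set of molecules reachable from the food set."""
    rng = random.Random(seed)
    # all polymers up to max_len over a two-letter alphabet
    polymers = [''.join(s) for n in range(1, max_len + 1)
                for s in itertools.product('AB', repeat=n)]
    # ligation reactions a + b -> ab, kept within the length bound
    reactions = [(a, b, a + b) for a in polymers for b in polymers
                 if len(a) + len(b) <= max_len]
    # each reaction is catalysed by a few randomly chosen polymers
    catalysts = {r: {p for p in polymers if rng.random() < catalysis_prob}
                 for r in reactions}
    food = {p for p in polymers if len(p) <= 2}   # freely supplied monomers/dimers
    # closure: keep adding products whose reactants AND a catalyst are present
    present = set(food)
    changed = True
    while changed:
        changed = False
        for (a, b, ab) in reactions:
            if ab not in present and a in present and b in present \
               and catalysts[(a, b, ab)] & present:
                present.add(ab)
                changed = True
    return present, food

present, food = toy_autocatalytic_set()
print(f"{len(present) - len(food)} longer polymers emerge from the food set")
```

The point of the sketch is Kauffman's: above a modest catalysis density, longer polymers bootstrap themselves out of the food set without any external design.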
In [T] relativity, the scaling attributes presented by physical chemistry would yield a created sea of emergent atoms [macro] being driven and compacted into more complex molecular activity around the transitional, hybrid and interstitial values within the periodic table of chemistry.
The diversity of the carbon ring structures, e.g. benzene, has, as previously stated [Brown GI, 1972], not yet been fully enumerated.
By analogy with biological feeding: two systems, over-fed and over-substantiated at time 1, interacting and exchanging mutual interference, then introduced to a new amount of symmetrical aggregate at time 2, will be able to influence and drive the symmetrical construction of a substantial third copy at time 3 by sympathetic resonance, if conditions amongst the atoms persist.
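The 'sympathetic resonance' step can at least be illustrated with the standard driven-oscillator result: a system driven at its natural frequency accumulates far more amplitude than one driven off resonance. A minimal sketch, with all parameters arbitrary:

```python
import math

def driven_oscillator(drive_freq, natural_freq=1.0, damping=0.02,
                      dt=0.001, steps=200_000):
    """Integrate x'' + 2*z*w0*x' + w0^2*x = sin(w_d * t) with semi-implicit
    Euler and return the peak amplitude reached: a sketch of how a driver
    near resonance pumps up a sympathetic response."""
    x, v, peak = 0.0, 0.0, 0.0
    for i in range(steps):
        t = i * dt
        a = math.sin(drive_freq * t) - 2 * damping * natural_freq * v \
            - natural_freq ** 2 * x
        v += a * dt
        x += v * dt
        peak = max(peak, abs(x))
    return peak

on = driven_oscillator(1.0)    # driven at the natural frequency
off = driven_oscillator(3.0)   # driven well off resonance
print(f"on-resonance peak {on:.2f} vs off-resonance peak {off:.2f}")
```

The on-resonance amplitude exceeds the off-resonance amplitude by roughly the quality factor of the oscillator, which is the sense in which resonance can 'drive the construction' of a matched response.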
Upscaling to carbon-based life forms - massively complex aggregates with massively complex postponements of discharge, owing to massively complex chemical topographies - there would be sufficient postponement of entropy in this model for a more perfect reproductive symmetry to develop [e.g. the chaos-driven foetus].
Information exchange is analogous to energy exchange in atoms.
It may be said that, in the interests of efficiency, intelligence appears to have a high opinion of itself - as Langton's Swarm model at Santa Fe and the many other simulations within complexity studies would suggest.

Within behaviourism, for example, the physical data that upholds the perceptions of arts, philosophy and society can be attributed to the dichotomies presented within territorial and breeding displays between and within species.
It is not to be said, however, that I am immediately looking for a pile of bananas or running away from high voltages.
It has to be suggested, therefore, that within the macroscale etheric packets emerging from myself are evolutionary spaces devoid of alcohol or other biological placebo, in which other priorities emerge that do not distort and break up social complexity.
I consider myself to have a soul with a 'fictional finalism' [Adler A] of a unified society.

JUPITER OUR BINARY STAR - A PREDICTIVE MODEL
Using the Tesla theory of grand unity called the theory of environmental energy, I will outline the mechanism by which Jupiter could become a nova.
With our current understanding of cosmology and physics, the following theory could be termed pseudoscience. However, in a world where scientists like Hawking and Dawkins regularly meet to discuss the impending defeat of the paradigm, my theory performs what the methodologist and philosopher of science Karl Popper, in his 1962 publication 'Conjectures and Refutations', calls a bold conjectural leap.
All that is needed for it to suffice as a scientific model is for it to predict and if necessary be falsified and discarded if it fails to meet its own criteria for a theory.
Such diligence could never have been applied to the Big Bang theory - a dinosaur well past its sell-by date, patently ignoring its own refutations for decades. e.g. the Great Wall, a strip of galaxies in the supercluster, contains matter far older than the age of the universe calculated by the Big Bang [Lerner, 1992].
My prediction makes several assumptions already made and practised by Tesla in his Theory of Environmental Energy of 1910: that energy pours into the cosmos from the ether, supplying new particles and mutating others into newer and bigger atoms and particles, whilst its known and approved opposite force, entropy, degrades them and creates newer and smaller atoms and particles.
The supply of new energy into the universe from the chaos of the subatomic ether is called the law of emergence. This law has been much studied and modeled at the Santa Fe Institute by Kauffman and Langton, at the Neurosciences Institute by Edelman, and at Cambridge, England by Goodwin.
All of these scientists can model how order emerges from chaos, recombines, degrades, and then rebuilds itself into some complex and structured equilibrium. This can illustrate that the universe grows and regulates complex matter, e.g. Kauffman's 'self-regulating autocatalytic polymers' model.
Emergence and entropy act synergistically to create and recreate new end based systems.
Contrary to the second law of thermodynamics that predicts that matter continually and forever breaks down into minute and homogenous pieces, emergence is the other side of the coin – it drives structural re-assembly.
Kauffman has demonstrated the recombinative power of emergence in his ‘self regulating auto catalytic polymer model’ at the Santa Fe institute. [www.santafe.edu]
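Kauffman's 'order for free' can also be illustrated with a random Boolean network: wire a handful of on/off nodes together at random, give each a random rule, and the dynamics fall onto a short attractor cycle instead of wandering through all 2^n possible states. The sketch below is a generic NK-style toy, not Kauffman's own code:

```python
import random

def random_boolean_network(n=12, k=2, seed=0):
    """Kauffman-style random Boolean network: n nodes, each reading k random
    inputs through a random Boolean function. Returns the length of the
    attractor cycle reached from a random initial state - 'order for free'
    shows up as short cycles in a huge (2**n) state space."""
    rng = random.Random(seed)
    inputs = [rng.sample(range(n), k) for _ in range(n)]          # wiring
    tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]
    state = tuple(rng.randint(0, 1) for _ in range(n))

    def step(s):
        # each node looks up its next value from its random truth table
        return tuple(tables[i][sum(s[j] << b for b, j in enumerate(inputs[i]))]
                     for i in range(n))

    seen = {}          # state -> first time seen
    t = 0
    while state not in seen:
        seen[state] = t
        state = step(state)
        t += 1
    return t - seen[state]   # attractor cycle length

print("attractor cycle length:", random_boolean_network())
```

With k = 2 inputs per node, cycle lengths stay tiny compared with the 4096 available states, which is the regulated equilibrium the paragraph above describes.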
New energies have been noted in the form of dark matter pouring into this cosmos and creating surplus [New Scientist], whilst as far back as 1823 Olbers noted that, if there were no ether, the night sky would be white, because all of the photons in the cosmos should be here by now.
It can only be ether that keeps our night sky dark by acting as an impedance to the passage of photons.
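Whatever the resolution, the arithmetic behind Olbers' observation is easy to reproduce: in a uniform, unbounded star field, every concentric shell of equal thickness contributes the same flux to an observer, so the summed sky brightness grows without limit as shells are added. A quick numerical check (unit density and luminosity assumed):

```python
import math

def shell_flux(r, dr, n=1.0, L=1.0):
    """Flux at the observer from a thin shell of radius r and thickness dr,
    filled with stars of number density n and luminosity L."""
    n_stars = n * 4 * math.pi * r ** 2 * dr    # stars in the shell
    per_star = L / (4 * math.pi * r ** 2)      # inverse-square dilution
    return n_stars * per_star                  # = n * L * dr, independent of r

total = sum(shell_flux(r, 1.0) for r in range(1, 1001))
print(total)   # grows linearly with the number of shells included
```

The r-squared growth in star numbers exactly cancels the inverse-square dimming, which is why something - absorption, expansion, a finite age, or, on this essay's view, an etheric impedance - must intervene to keep the night sky dark.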
Ether, Emergence and Wave theory, in the context of fluid dynamics were explored as early as 1901 by Lord Kelvin with his vortex theory of ether – but it was the assumption of Einstein to the impoverished throne of disunity in the early 20th Century that took mankind away from the fluid dynamics model of energy.
Recent cosmologists and high energy physicists have created ‘sub atomic ether/plasma’ by boiling up particles into a fission driven gluon soup – but the reality of natural fusion is largely ignored in a world view dominated by disintegration, fission and entropy.
Against the backdrop of emergence, which is a largely ignored and unexplored law of chaos science – new kinds of physical theories of the cosmos can come into being. When combined with fluid dynamics and turbulence studies they provide the essence of a wave based physical theory of the cosmos.
The following conjecture about the possibility of fusion within Jupiter assumes that matter is being continually created and uses Tesla’s 1910 Theory of Environmental Energy with which to predict, borrowing no paradoxes from 20th century physics.
Recent astronomy has provided evidence of distant solar systems containing hot, gas giant planets akin to Jupiter in Earth’s solar system.
I intend to speculate that, by the laws of symmetry and self-assembly within the duality of matter and energy, all physical systems tend to reproduce and grow. From this reasoning I speculate [yahoo groups grandunity3000, files section, numbers and agreements, chapter 3] that all solar systems with gas giants tend to become binary star systems.
How though could Jupiter ever become a binary star for our Sun given its presumed state of low energy as calculated by our current and prevalent physics?
I will present a trigger mechanism for Jupiter based on Tesla's physics.

Jupiter has a centre of gravity, most probably a seed rock around which the gasses and liquids have gathered. Jupiter is composed predominantly of non-metallic elements, many of which could be compressed by gravity into superconductors around the core.
Into this planetary atomic environment comes the energy of the subatomic ether, channeled through the centre of gravity at the planet's core.
Earth’s centre of gravity has been reported to be an electrostatic environment.
It has been reported by mythology that there is a crackling, smoking electrical sun at the heart of the Earth [‘The Smokey God’].
The ether is in and around all atoms and all particles, as the ether directly drives their very existence as standing waves in etheric turbulence. Much like the Great Red Spot in the eye of Jupiter, particles are harmonic storm systems directly linked to the medium from which they emerge. Harmonic attributes and arithmetic are constantly to be found, even amongst cover-ups like quantum theory and superstrings.
Most of the emergent force however would be focused at the planetary centre of gravity, more so than at the periphery of the mass of the planet.
Through the core of Jupiter comes the force and energy that drives the creation of new particles and atoms, and this force will seek to radiate out beyond the planet's surface, dissipating as a creative force as it gets beyond the planetary mass and its atomic and particulate ingredients. The force of emergence brings new energies and subatomic particles into Jupiter through its core and will alter, mutate and grow the atoms at the planet's core.
This will have created over vast time semi-metallic transitional elements around the seed rock of Jupiter, like a very porous skin comprised of several kilometers of inconsistent semi-metallic atomic aggregates.
These semi-metallic elements will act like a capacitor, holding the Sun's electrostatic charge and creating a huge potential difference, or voltage, between the core and the semi-metallic skin. What would usually happen, because of the electro-porous nature of the skin and the variation in both the force of emergence and the Sun's electromagnetic output, is that the charge held by the leaking semi-conductor would tend to dissipate.
When, however, the electromagnetic output of the Sun is unusually high - e.g. now, when most of the planets in the solar system are warming up - less of the electrostatic charge around Jupiter's core will leak away, and a massive potential difference will build up between the skin and the core.
When the charge in the semi-metallic skin reaches a critical threshold under these new solar conditions, it could cause a change of state in its constituents, overcoming the physical and atomic constraints on discharge, and forcibly release its massive voltage, triggering a cold fusion reaction in Jupiter's core.
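The trigger described above amounts to a leaky capacitor: charge accumulates from solar input, a fraction leaks through the electro-porous skin each step, and the system fires only if the balance between input and leakage settles above a threshold. A caricature of that balance, with wholly hypothetical numbers:

```python
def charge_until_discharge(input_rate, leak_fraction, threshold=100.0,
                           max_steps=10_000):
    """Leaky-capacitor caricature of the proposed trigger: charge grows by
    input_rate per step, a fixed fraction leaks away each step, and the
    system 'fires' if the stored charge ever crosses the threshold.
    Returns the step at which discharge occurs, or None if leakage wins."""
    q = 0.0
    for step in range(max_steps):
        q += input_rate              # solar / emergent input
        q *= (1.0 - leak_fraction)   # leakage through the porous skin
        if q >= threshold:
            return step
    return None                      # equilibrium sits below threshold

# ordinary solar output: leakage balances input below the threshold
print(charge_until_discharge(input_rate=1.0, leak_fraction=0.05))
# unusually high solar output: the equilibrium exceeds the threshold and fires
print(charge_until_discharge(input_rate=10.0, leak_fraction=0.05))
```

The stored charge tends toward input_rate * (1 - leak_fraction) / leak_fraction, so a roughly tenfold rise in input moves the equilibrium from well below to well above the threshold, which is the qualitative behaviour the conjecture relies on.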
Jupiter will then, I predict, follow in the footsteps of other binary star solar systems by self-assembling itself into a fusion driven binary star.
Another twist to this tale, amongst all the what-ifs, is that IF this theory predicts correctly, then there is another problem with Jupiter that needs urgent attention.
As an entity involved in fusion, Jupiter tends to the light metallic end of the spectrum, but in the year 2001 it was hit by a comet comprised wholly of plutonium - a very heavy metal. This comet was not Shoemaker-Levy 9 of 1994 but something in the order of Shoemaker-Levy 12-14. The hit of the plutonium comet was in 2001 and was announced by Peter Sissons on the BBC UK Evening News as an 'and finally' item towards the end of the programme.
With a large amount of heavy metal plutonium in the fusion mixture – Jupiter could easily get indigestion, creating not only fusion but fission reactions driven by the extreme weight of the Plutonium.
Jupiter might not only be a nova – but potentially could also explode and eject matter into the solar system creating havoc amongst the planets.
The presence of a large amount of heavy metal salts could potentially be disruptive and antagonistic, as the elements engaged in fusion would have been primarily lighter.
Some may say that maybe hostile aliens more than aware of the physics of Tesla attempted to destroy the solar system by asteroid attack.
The solution though is relatively simple too – a massive injection of transitional elements such as carbon and iron into the fusion mixture – before it fires itself into life.
This maybe gives us 10 years to fix the problem before the solar activity predicted by the Mayans creates the possibility of fusion in Jupiter.
We can treat the planets explosive indigestion with a dose of carbon – probably with about 100 – 1000 times the mass of Plutonium that went into Jupiter.
I'm sure there's plenty of pure transitional elements floating about nearby in the asteroid belt.
To conclude then, by our current world view of physics, none of the above is possible, but then there is no theory in today’s physics apart from Tesla’s that could explain the interstellar craft that currently flood into our camcorders.
Thus far I have been totally unsuccessful in finding confirmation of that Shoemaker-Levy 12-14 plutonium comet in 2001 - and I've been looking and asking the usual suspects.
Another ironic note is that the comet hit in 2001 - very Arthur C. Clarke-esque ... but in Clarke's sequel '2010', also a film, Jupiter becomes a binary star. It doesn't go supernova in the film, though - that one has a happy ending.
I think, personally, that someone is trying to tell us something ….
