Entropy, Potential Energy, Quantum Computing, Determinism, Chaos, Morphic Fields


Dear Lucian

K: Max. Entropy – the state of max. informational content

…and minimum Potential/Energy:

Energy is measured by “its ability to do work”, but it is defined as “that which does not diminish in transformation”.

Another word for energy is “potential”, as all energies are based on the principle of “polarity”. A difference of polarity provides the potential for work. That means without polarity there is no energy.
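A concrete electrical illustration of this: a charge \(q\) carried across a potential difference \(\Delta V\) can deliver work

\[ W = q\,\Delta V , \]

so with \(\Delta V = 0\) no work is available, however large the absolute potential.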

L: Yes, energy is a “currency” of change, implemented through transfer of momentum, through “forces”. This mechanical picture is complemented in electromagnetism by the more subtle concept of the vector potential, B = curl(A), in addition to the familiar electric potential, E = −grad(U).
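For reference, the standard relations between fields and potentials (U the scalar potential, A the vector potential) read

\[ \mathbf{B} = \nabla \times \mathbf{A}, \qquad \mathbf{E} = -\nabla U - \frac{\partial \mathbf{A}}{\partial t}, \]

so a time-varying vector potential contributes to the electric field even in regions where B itself vanishes.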

The vector potential was long thought to be just a convenient mathematical tool, until experiments like Maxwell-Lodge and Aharonov-Bohm revealed that even a zero-field vector potential (B = curl(A) = 0) can produce an electric field at a distance (the Maxwell-Lodge effect) and a change in the probability of quantum interference (“is the electron there or not” being the question). There are no “lines of force” between the source of the vector potential (say a coil screened by a Faraday cage) and the point of detection, only “invisible and intangible” circles of vector potential. In a dynamic regime they can be modulated, becoming carriers of information without energy (!). They can be thought of as the structure of space itself, a sort of fundamental morphic field (but not a field strength, i.e. no energy-momentum carriers), a “blueprint” of potential effects that materialize when a charge or a resonant device is present to assume the role of receiver.
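To make the Aharonov-Bohm statement quantitative: for a charge \(q\) whose two interfering paths enclose a screened solenoid carrying magnetic flux \(\Phi\), the relative phase acquired from the (locally field-free) vector potential is

\[ \Delta\varphi = \frac{q}{\hbar}\oint \mathbf{A}\cdot d\boldsymbol{\ell} = \frac{q\,\Phi}{\hbar}, \]

which shifts the interference fringes even though \(\mathbf{B} = 0\) everywhere along the paths.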

I advocated the duality between energy (as “bulk”) and information (as actual structure), and the view that quantum processes are exchanges of quantum information, in DWT v.1 & Q++ (see http://www.virequest.com/VIReQuest_Projects.htm ), starting top-down from principles; but to see that others, independently, arrived at a concrete incarnation of the idea was a really satisfying moment (see Gerard Rousseax, http://math.unice.fr/~rousseax/maxwell-lodge.htm, and Chirkov and Agreev, http://www.springerlink.com/content/9x3wk14244432277/, www.springerlink.com/content/b4m144116432v541/fulltext.pdf ).

Unfortunately they (and others) don’t yet have the “big picture” that opens the new “Infotronix Era”, as I like to call it – it’s “just” an upscaled, quantum-computing version of electronics, but it can be done with light (rings a bell?).

It also fits quite well the portrait of dark matter and dark energy; yet we know next to nothing about these “cosmological nightmares”. We do know, and even Maxwell knew back then, that the vector potential is essential for the description of electromagnetism; it fits Feynman’s description of a “real field” anyway (see the Feynman Lectures on Physics, p.?).

And it supports Rupert Sheldrake’s theory of morphogenesis (http://en.wikipedia.org/wiki/Rupert_Sheldrake), coming from a totally different “direction”; yet that is not surprising: there is only one root …

K: As we discussed, consciousness can at least be approximated by the definition that it is the opposite of matter, which is defined by its inertia/momentum. That which has no inertia is pure consciousness.

L: Quite right! Let’s think of a computer (and they’re getting smarter by the minute …): it is bulky (hardware), electrons and energy are flowing (so far a heater can do that!), but behold, it vibrates in a structured, hierarchic way (lots of software layers on top of a kernel operating system); now exponentiate, make it quantum, and add vector potentials which can couple force-effects non-locally (apparently), via global vector potentials … Luckily I’m not aware of what every other human is doing or experiencing, or am I, sometimes?

K: Taking the same approach and defining information as the opposite (or complement, if you wish) of energy, we can say:

“Pure information is that which has no polarity” (= no energy), and this of course is the definition of “max/pure entropy”.

This is a truly new definition of information, or of entropy, and many exciting new possibilities can be learned from it:

L: Indeed; information is complementary to energy, the same way wave aspects are complementary to particle aspects: the celebrated wave-particle duality (the complementarity principle, etc.). Information has been almost ignored in physics so far, with people striving to experiment at higher and higher energies (the LHC). Other experimentalists, instead, were looking at low-temperature solid-state physics (I call it low-entropy physics, because obviously that is the key issue), and many wonderful effects emerged (superconductivity, expulsion of the magnetic field from cooled conductors, granulation of space properties in type II superconductors, the so-called Abrikosov lattice, etc.).

Now information, the opposite of entropy, is a numeric measure of how complex the structure of the system is, whether we know it or not; but if we don’t, then the system becomes a source of information, and there is a difference of “information potential” between this source and the receiver, say us. Then “things” can start happening: information can be transferred, or maybe the structure itself is imitated (monkey see, monkey do?) … I don’t know yet! My belief is that if there is a good and successful way of doing it, Mother Nature found it first (She got a good head start :) ). So, when making “hypotheses” we should give Her credit and be optimistic …
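One standard way to make “numeric measure” precise, in the negentropy sense used here, is Shannon’s formula: for a system whose states occur with probabilities \(p_i\), the entropy (in bits) is

\[ H = -\sum_i p_i \log_2 p_i , \]

maximal when all states are equally likely and zero when one state is certain; the “information potential” between source and receiver can then be read as the gap between the receiver’s uncertainty before and after the transfer.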

K: As I described, information contains matter to the degree it has inertia. We know that theories and religions gain informational momentum/inertia over time, and this means that they increase in their matter content. We literally speak about “crystallized ideas/theories/beliefs”.

L: I still tend to put matter as the basis for structure-information, but the theory of morphic fields says matter fills in the “blueprint” … and if I’m not there to put the matter, how does it get there? It could be a “fall” along the potential lines of the blueprint … That’s why duality is good; you know: chicken or egg? The answer is … “chicken and egg”; Yin and Yang, etc.

K: Equally instructive is that information also contains energy to the degree it contains polarities. Polarity/energy we experience in its informational form as tension, as debate, as opposing thinking, as giving importance to some issues and none to others. The more our thoughts are loaded with these, the more energy they contain.

This way we also understand the old spiritual teaching that by being more neutral in one’s thinking, by giving less importance, by non-attachment, we release/create energy.

L: I would try to avoid “one contains the other”, since this does not stress how different the two are, and it tends to subordinate. Like Yin and Yang, information and energy are different aspects and, probably, if unbalanced (too much of one without the other) they are not productive/in harmony. Yes, there can be a vector potential with zero field and no energy; and ordinarily there are plenty of forces and energy flows without a global potential (there is a technical distinction between the two: the Hodge decomposition, or Helmholtz decomposition).
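The technical distinction alluded to: any sufficiently smooth vector field splits into a curl-free part and a divergence-free part (the Helmholtz decomposition; the Hodge decomposition generalizes this to manifolds),

\[ \mathbf{F} = -\nabla \phi + \nabla \times \mathbf{A}, \]

and in multiply connected regions a harmonic piece, both curl-free and divergence-free, can survive; that left-over piece is exactly the kind of zero-field, global potential discussed above.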

K: “Chaos/entropy is not the end-product/lower state/missing potential but the opposite: the state which is most saturated with information/hologram/the beginning/the source of information, which can itself be transformed into energy, and thus the cycle can start again.”

L: Perfectly to the point! Determinism is “boring”; an unpredictable yet compatible partner is what we all seek. Information without a “key” (decoding scheme) can seem like “chaos”; and the more I think about it, I don’t see any other cause for “chaos”: we just don’t hold the access key to interpret it … At this point, you (or the reader) can improvise better than I can, I’m sure; so I’ll stick with the bottom-up approach (science: up, up, and away :) ).

K: In conclusion: Energy – entropy = information potential – energy.

L: Yes – although I prefer my own “recipe”:

Matter-Energy is a manifestation of Potential-Information.

L: Now, I think we know in principle how things work and why. Next time maybe you can introduce me to the “secrets” of what CORE, the hardware, is made of … My guess by now: it should contain a high-potential information source (a random number generator), a resonator (quartz or semiconductor and … a coil?), probably like any radio, tuned to detect a zero-field vector potential, and … ?

PS: Now physicists have to decide what to add to the usual Lagrangian, whose action gets minimized. My guess is, not much: Lagrangian = Energy (T − U) − Entropy. When writing the Feynman path integral, the entropy log|Aut(path)| appears in the denominator of the action exponential, as a symmetry factor (of course, symmetry means: indeterminate, but beautiful!).
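Read literally (and speculatively, this is my guess, not established physics), the recipe would amount to

\[ \mathcal{L} = (T - U) - S_{\text{ent}}, \qquad Z \sim \sum_{\text{paths}} \frac{1}{|\mathrm{Aut}(\text{path})|}\, e^{\,i S_{\text{action}}/\hbar}, \]

with \(S_{\text{ent}} = \log|\mathrm{Aut}(\text{path})|\), so that the factor \(1/|\mathrm{Aut}|\) divides each path’s contribution, in analogy with the symmetry factors of Feynman diagrams.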

And after we discuss the hardware, the tricky part is: “What quantum software runs around us?” … The Living Matrix is the right direction (http://www.thelivingmatrixmovie.com/), but, as a scientist, I need to know how to isolate it, and then to reverse-engineer it!

