Sunday, January 07, 2007

Information, Entropy and the conservation of...


of what?

I guess everyone feels that entropy and information are something like two sides of the same coin, and so everyone feels that together they – somehow – fulfill the “original need” for a conserved quantity.

But all we know is that, moving along the timeline, information builds order and entropy destroys it.

Ever heard the name “Gerd Binnig”? Nobel Prize winner in Physics, 1986, for the scanning tunneling microscope? Besides that, he has developed a theory about the origin of nature...

(oh, not as fundamental as information <whisper, blink>)

but – admittedly – more specific.

In some forgotten special edition of a nice computer magazine he wrote about his theory – the first time I saw a physicist accepting randomness as an original part of reality.

And – of course – it is a theory about conserved quantities, formulated as a balancing act between two poles.

He thinks in three pairs of poles, or equilibria: “attraction” vs. “isolation” to balance “difference”, “reproduction” vs. “death” to balance “numbers”, and “mutation” vs. “selection” to balance “diversity” – and he called it “Fractal Darwinism”. In fact, he was able to deduce the laws of Relativity and Quantum Mechanics from his axioms.

(btw: do you see the relationship to information? Information does not only focus on the poles and the equilibrium; it also covers the process of balancing, which “drives” the change of values towards this particular state “equilibrium” – as long as the force behind it IS information, because otherwise the progress ends. It is not the balancing itself that is annihilated without information, nor the fact that there has to be an equilibrium – since everything that happens arises from quantum noise and so has to have its antipode – but without information there is no repeatability, no repetition, no progress or evolution.)

Diversity seems interesting regarding information and entropy – simply because information depends on distinctness. You can’t have repeatability without distinctness, because repeatability is defined over different states – and states have to be identifiable to be recognizable as “different”. And identity needs uniqueness: stable, distinct values of at least one attribute of the considered state, able to play the role of “the identity” and to “name” each state.

Diversity. Without diversity, no information, no universe.

But diversity isn’t enough to give us an idea of the conserved quantity that is so important for our understanding of physics. Why important? Because we all know the power of entropy, the Second Law of Thermodynamics.

Then there is Maxwell’s demon, and Leó Szilárd, who combined an increase of energy, entropy and information on the observer’s side with a decrease of entropy in the observed system.
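Szilárd’s thought experiment puts a number on that trade: one bit of information corresponds to at least k·T·ln 2 of energy (later sharpened into the Landauer bound). A minimal back-of-the-envelope sketch in Python – the 300 K temperature is just an assumed, illustrative value:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, joules per kelvin (CODATA value)

def szilard_energy_per_bit(temperature_kelvin):
    """Minimum energy (in joules) tied to one bit of information at the
    given temperature: k_B * T * ln(2), the Szilard/Landauer bound."""
    return K_B * temperature_kelvin * math.log(2)

# At room temperature (assumed ~300 K), the bound is tiny but nonzero:
print(f"{szilard_energy_per_bit(300.0):.3e} J per bit")  # on the order of 3e-21 J
```

Tiny per bit, but Gaia-scale information processing runs through an astronomical number of such bit operations, which is why the energy budget matters at all.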

Sounds fine – but I thought about Gaia, a highly ordered system based on intertwined information-processing cycles, dependent on a “huge amount” of information for a “huge amount” of time. Gaia needs a lot of energy input, so the increase of order as something like crystallized information is well explained, I guess – and when the energy input ends and entropy is “freed”, it will destroy that order.

So does the conservation only hold when considering the overall timeline? But what about the fact that the Earth is not an isolated system – where in the whole solar system could the increase of the Maxwell’s demon’s entropy be found? In Gaia herself? Considering Shannon’s definition of entropy as a measure of the average information content associated with a random outcome?
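Shannon’s definition, for reference, is H = −Σ pᵢ log₂ pᵢ over the outcome probabilities. A small sketch of it in Python (the example distributions are of course just illustrations):

```python
import math

def shannon_entropy(probabilities):
    """Average information content in bits: H = -sum(p * log2(p)),
    skipping zero-probability outcomes (their contribution is zero)."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(shannon_entropy([0.5, 0.5]))   # fair coin: 1.0 bit of uncertainty
print(shannon_entropy([1.0]))        # certain outcome: 0.0 bits
print(shannon_entropy([0.25] * 4))   # four equal outcomes: 2.0 bits
```

The same quantity reads two ways: as the uncertainty an observer has before the outcome, and as the average information gained once the outcome is known – the dual reading the post keeps circling around.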

Information needs energy to build differences – entropy deletes differences and transforms energy into its least useful form. Information deletes uncertainty; entropy measures it.

Then there is the Casimir effect – connecting boundaries to the “orderly” behavior of waves and the creation of energy, which doesn’t violate the conservation of energy because the system’s conditions change.
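For the idealized parallel-plate case the attractive Casimir pressure is P = π²ħc / (240·a⁴), where a is the plate separation. A quick numerical sketch – the 1 µm separation is an assumed example value:

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
C = 2.99792458e8        # speed of light in vacuum, m/s

def casimir_pressure(separation_m):
    """Attractive pressure (Pa) between ideal, perfectly conducting
    parallel plates: P = pi^2 * hbar * c / (240 * a^4)."""
    return math.pi**2 * HBAR * C / (240 * separation_m**4)

# At 1 micrometre separation the pressure is around a millipascal:
print(f"{casimir_pressure(1e-6):.3e} Pa")
```

The a⁻⁴ scaling is the point: the effect is negligible at everyday distances and only the boundaries – the imposed “order” on the vacuum modes – make it measurable at all.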

Boundaries, differences, diversity, variation, different states, different observables, different values...

+ the motor of change, the infinite action of quantum noise with its slight tendency to prefer certain states, thus creating time and timelines...

So what about states? Information prefers states and differences – entropy destroys them...

But where’s the conserved quantity? Does the sum of all values have to be zero?

How about Noether’s theorem – conserved quantities need continuous symmetry groups. I guess the quantum noise is continuous, but is it a group? At least it can create energy (see the Casimir effect)...
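To spell the theorem out a little, in the simplest mechanical setting: a continuous one-parameter symmetry of the Lagrangian yields a quantity that is constant along every solution – a sketch, not the general field-theoretic statement:

```latex
\text{If } L(q,\dot q) \text{ is invariant under } q \to q + \epsilon\,\delta q,
\quad \text{then} \quad
Q \;=\; \frac{\partial L}{\partial \dot q}\,\delta q
\quad \text{satisfies} \quad
\frac{\mathrm{d}Q}{\mathrm{d}t} \;=\; 0 .
```

Time-translation symmetry in particular gives conservation of energy – which is exactly why the question whether quantum noise forms a continuous group (and whether it respects such a symmetry) matters for hunting the conserved quantity here.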


