information, energy and time



... yesterday, on NPR, I heard an interesting segment on
information science

... I was always fascinated by the subject, and quite early
on in my studies and meditations, I started to see that almost
any problem could be solved by analyzing the total energy of the system, rather
than by a multitude of mechanical equations patched together to describe
a system's behavior

... this is my internal zen view of what is going on, so I may not
be adhering to the conventional equations used for information and entropy

... in such a view, everything is composed of energy and time, and
the rate at which energy is used per unit of time is the power
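
... written out the way a physics text would put it, with the standard symbols rather than anything particular to this essay, that is just

    power = energy / time,    P = E / t
    e.g. 100 joules spent over 10 seconds is 10 watts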

... as humans, we still don't understand Time, but we are figuring out
information

... the hardest concept to grasp in all of this is that information
is disorder itself, and this leads us to a paradox when we try to analyze
it mathematically

... a paradox where the orderly things we see are actually disorderly events on the uniform canvas of the background

... it somehow goes against our common-sense notion that things are orderly,
when what we actually measure is how many distinctions they contain, how
much disorder they hold against the background uniformity


... the information-entropy paradox is explained by saying that we need to make distinctions in our
minds in order to see differences in things, at the very minimum, existence or
non-existence .... the 1 and 0

... but a distinction is disorder; it stands out from the crowd, so information
is observing the things that stand out from the background ... disorder on an orderly canvas

... so maximum information is being aware of all the disorderly things,
everything that sticks out from the background
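
... to make that concrete, here is a small sketch in Python using Shannon's standard formula for entropy in bits; the sample messages and the function name are just made up for illustration

    import math
    from collections import Counter

    def entropy_bits_per_symbol(message):
        """Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
        counts = Counter(message)
        total = len(message)
        return -sum((n / total) * math.log2(n / total) for n in counts.values())

    # a message with no distinctions at all, every symbol the same, carries no information
    print(entropy_bits_per_symbol("aaaaaaaaaaaaaaaa"))   # zero bits per symbol

    # the more it departs from that sameness, the more bits each symbol carries
    print(entropy_bits_per_symbol("aaaaaaaabbbbbbbb"))   # 1.0 bit per symbol
    print(entropy_bits_per_symbol("abcdefghijklmnop"))   # 4.0 bits per symbol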

... the sum of all these distinctions in our minds, and the context we give them in our lives,
is what we call our reality


... this is where information gets measured against entropy, but entropy in the sense of
no information, no distinctions, the Great Uniformity, where even our thoughts go when
we are done thinking them

... in my terminology, maximum entropy means minimum information, the reverse of Shannon's usage, where
a maximally disordered source carries the most bits ... here I mean everything blended back into the
uniform background, and at that point even entropy itself has no context or meaning


... in classical thermodynamics, entropy is often pictured as the universe winding down toward the cold
and dark, but more precisely it measures how evenly energy is spread out, and the second law says that
spreading only increases ... the universe tends toward one uniform temperature, the so-called heat death,
with no differences left to do useful work
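
... the textbook way of counting that disorder is Boltzmann's formula, which counts the number of
microscopic arrangements W that all look the same from the outside

    S = k ln W        (k is Boltzmann's constant)
    second law: for an isolated system, S can only stay the same or grow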

... when we speak of entropy in the information context, it means the tendency of
any information to degrade over time into information of lower quality

... information degrades to the point of merging into the background noise, with
the passage of Time itself

... it is just mathematically obvious, in my anthropomorphic vision, that
the passage of Time constantly creates new information, diluting
the previously existing information simply by making the total information pool bigger
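
... put as bare arithmetic (my own dilution picture, not a standard equation), if an old record holds
b bits and the whole pool has grown to B bits, the old record's share is

    share = b / B        (B keeps growing, so b / B keeps shrinking toward zero)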

... information degrades that way, as time flows

... is the flow of Time actually causing entropy?


... so what would a universe be like at maximum entropy? If there were no useful
information left in it?

... let's take the chair you sit on, for example

... at maximum entropy, its atoms would be uniformly scattered across the known universe,
mixed in so well with everything else out there that no one would be able
to detect it ever was a chair on Earth, sat on by you. No information is left
about that chair

... at maximum information, it is what YOU understand it to be, NOW

... I am reminded of my favorite song lyric, from the Grateful Dead's song Ripple:
"Let it be known there is a fountain, that was not made by the hands of men"

... I'm beginning to see that fountain, the fountain of Time, bubbling out new information,
as we whip along through space-time on planet Earth

... drink deep from the fountain of the Tao, whatever its mysterious source may be






------------------------------------------

2011 by zentara