It is bizarre, isn't it: Shannon and Weaver were working on a probabilistic model.
The least probable is random and therefore entropic; the most probable carries no
information at all, which is also entropic. The trick is to use redundancy (the probable:
"Hi, hello, how are you?") to secure a channel, and then use a mix of the probable (the
English language, for example) and the less probable (someone flew a plane into the World
Trade Center) to construct messages.
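A minimal sketch of that intuition (the probabilities here are invented purely for illustration): Shannon's surprisal measures the information carried by a message as the negative log of its probability, so the routine greeting tells the receiver almost nothing, while the wildly improbable event carries a great deal.

```python
import math

def surprisal_bits(p: float) -> float:
    """Shannon surprisal: information carried by an event of probability p, in bits."""
    return -math.log2(p)

# Illustrative, made-up probabilities for the two kinds of message above.
p_greeting = 0.5      # "Hi, hello, how are you?" -- highly probable, almost pure redundancy
p_unexpected = 1e-9   # a wildly improbable event

print(surprisal_bits(p_greeting))    # 1.0 bit: tells the receiver very little
print(surprisal_bits(p_unexpected))  # ~29.9 bits: informative precisely because improbable
```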
In early information theory, especially in von Neumann and Wiener, entropy was a bad thing
indeed, to be combatted with homeostasis: Cold War paranoia, as N. Katherine Hayles argues.
Only in Maturana and Varela, second-generation information theorists, does it begin to feel
safe; and in contemporary ('emergence') information theory it's a positive.
Entropy! In information theory entropy represents the amount of independent
information contained in a system and is basically the opposite of noise.
Entropy of course implies disorder, yet Shannon used the term to describe the
amount of information present in a system (what was he thinking?). The larger
the entropy, the more information is present. If noise is added to a system,
the information it carries, and so its entropy in this sense, is reduced.
So, yes, in information theory entropy is a positive.
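A quick sketch of that definition (the distributions are invented for illustration): Shannon entropy, H = -sum(p * log2 p), is largest when all outcomes are equally likely, i.e. when the message is least predictable, and shrinks toward zero as one outcome dominates.

```python
import math

def entropy_bits(probs) -> float:
    """Shannon entropy H = -sum(p * log2 p) of a probability distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

uniform = [0.25, 0.25, 0.25, 0.25]   # maximally unpredictable: 2 bits
skewed  = [0.97, 0.01, 0.01, 0.01]   # nearly certain: close to 0 bits

print(entropy_bits(uniform))  # 2.0
print(entropy_bits(skewed))   # ~0.24
```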