Barbarian
Explain where Shannon showed information was entropy.
See above. I presented a statement from Shannon doing just that.
That's complete rubbish.
Nope. It's a fact. Would you like me to show you the math? Shannon's demonstration that information equates to entropy has been used to make the internet possible in its present form, and to send very weak signals across millions of kilometers of space with virtually no loss of data.
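(If you do want a taste of the math: below is a rough back-of-the-envelope sketch of Shannon's channel-capacity formula, C = B log2(1 + S/N), which is what lets a very weak signal get through with essentially no errors. The bandwidth and signal-to-noise numbers in it are made up purely for illustration; they aren't figures from any actual deep-space link.)

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley channel capacity in bits per second.

    C = B * log2(1 + S/N), where B is bandwidth in Hz and S/N is the
    linear (not dB) signal-to-noise ratio.
    """
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# Hypothetical numbers, for illustration only: a signal buried below the noise.
bandwidth_hz = 1.0e6               # 1 MHz of bandwidth (assumed)
snr_db = -10.0                     # signal 10 dB *below* the noise floor (assumed)
snr_linear = 10 ** (snr_db / 10.0)

capacity = shannon_capacity(bandwidth_hz, snr_linear)
print(f"Capacity: {capacity:,.0f} bits/s")   # roughly 138,000 bits/s
```

Even with the signal weaker than the noise, the capacity is greater than zero, and Shannon showed that coding schemes can approach that rate with arbitrarily low error. That's the practical payoff of treating information as entropy.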
You're trying to equate randomness and information.
Nope. Information, as Shannon said, is a measure of uncertainty in a signal.
Shannon did, however, propose that as entropy is decreased, information is increased.
See above. You've been misled. Again, here's what Shannon actually wrote:
Quantities of the form H = -Σ pi log pi (the constant K merely amounts to a choice of a unit of measure) play a central role in information theory as measures of information, choice and uncertainty.

The form of H will be recognized as that of entropy as defined in certain formulations of statistical mechanics, where pi is the probability of a system being in cell i of its phase space. H is then, for example, the H in Boltzmann's famous H theorem. We shall call H = -Σ pi log pi the entropy of the set of probabilities p1, ..., pn. If x is a chance variable we will write H(x) for its entropy; thus x is not an argument of a function but a label for a number, to differentiate it from H(y), say, the entropy of the chance variable y.

The entropy in the case of two possibilities with probabilities p and q = 1 - p, namely
H = -(p log p + q log q) ...
Notice that he equates information with entropy.
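If you'd like to see that quantity evaluated, here is a small sketch (my own illustration, not anything from Shannon's paper) of the two-outcome case he writes down above, H = -(p log p + q log q), computed in bits:

```python
import math

def binary_entropy(p):
    """Entropy of a two-outcome source with probabilities p and q = 1 - p:
    H = -(p log2 p + q log2 q), measured in bits per symbol."""
    q = 1.0 - p
    # Convention: a zero-probability outcome contributes nothing (0 * log 0 = 0).
    h = -sum(x * math.log2(x) for x in (p, q) if x > 0.0)
    return max(h, 0.0)  # guard against the floating-point artifact -0.0

for p in (0.0, 0.1, 0.5, 0.9, 1.0):
    print(f"p = {p:.1f}  ->  H = {binary_entropy(p):.3f} bits")

# p = 0 or p = 1: no uncertainty, H = 0; the signal tells you nothing new.
# p = 0.5: maximum uncertainty, H = 1 bit; each symbol carries the most information.
```

Note where the measure peaks: at p = 0.5, the point of maximum uncertainty, each symbol carries the most information, while at p = 0 or p = 1 there is no uncertainty and no information. That is the sense in which information and entropy are the same quantity.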
Again, ask your parents about it...
Regrettably, both of my parents died some years ago, having lived full and rewarding lives. Neither of them actually studied systems analysis or information theory, however; I have. One of my degrees is in systems.
or one of your stats teachers,
One of them in graduate school introduced me to the theory. I was as incredulous as you are, until I saw the numbers.