[_ Old Earth _] Imprisoned scientists jailbreak offer??

Explain where Shannon showed information was entropy.

See above. I presented a statement from Shannon doing just that.

That's complete rubbish.

Nope. It's a fact. Would you like me to show you the math? Shannon's demonstration that information equates to entropy has been used to make the internet possible in its present form, and to send very weak signals across millions of kilometers of space with virtually no loss of data.

You're trying to equate randomness and information.

Nope. Information, as Shannon said, is a measure of uncertainty in a signal.

Shannon did, however, propose that as entropy is decreased, information is increased.

See above. You've been misled. Again, here's what Shannon actually wrote:

Quantities of the form H = -K Σ p_i log p_i (the constant K merely amounts to a choice of a unit of measure) play a central role in information theory as measures of information, choice and uncertainty.

The form of H will be recognized as that of entropy as defined in certain formulations of statistical mechanics where p_i is the probability of a system being in cell i of its phase space. H is then, for example, the H in Boltzmann's famous H theorem. We shall call H = -Σ p_i log p_i the entropy of the set of probabilities p_1, ..., p_n. If x is a chance variable we will write H(x) for its entropy; thus x is not an argument of a function but a label for a number, to differentiate it from H(y), say, the entropy of the chance variable y. The entropy in the case of two possibilities with probabilities p and q = 1 - p, namely

H = -(p log p + q log q) ...


Notice that he equates information with entropy.
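Here's a quick numerical check of that two-outcome formula. This is just an illustrative Python sketch I'm adding (the function name is mine, not anything from Shannon's paper):

```python
import math

def binary_entropy(p):
    """H = -(p log p + q log q) for two outcomes with probabilities p and q = 1 - p, in bits."""
    q = 1.0 - p
    # By convention, 0 * log(0) is taken to be 0.
    return -sum(x * math.log2(x) for x in (p, q) if x > 0)

print(binary_entropy(0.5))   # 1.0 bit: two equally likely outcomes, maximum uncertainty
print(binary_entropy(0.99))  # ~0.08 bits: outcome nearly certain, very little uncertainty
```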

Again, ask your parents about it...

Regrettably, both of my parents died some years ago, having lived full and rewarding lives. Neither of them actually studied systems analysis or information theory, however, and I have. One of my degrees is in systems.

or one of your stats teachers,

One of them in graduate school introduced me to the theory. I was as incredulous as you are, until I saw the numbers.
 
Wait, you didn't mention R once?

R = H(x) - Hy(x)

R is the info, H is the entropy. H(x) is the uncertainty (entropy) before receipt, and Hy(x) is the uncertainty (entropy) after receipt of the message. The result is R. So the lower the uncertainty, or entropy, after receipt, the higher the info.

For example:

Say your uncertainty before receipt is 4 bits, and your uncertainty after receipt is 4 bits. This means you didn't receive any info:

R=4-4

R=0 bits of info

Another example:

Say your uncertainty before receipt is 4 bits, and your uncertainty after receipt is 0 bits. This means you received 4 bits of info:

R=4-0

R=4 bits of info
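If it helps, the arithmetic in those two examples takes only a few lines of code (a sketch only; the 4-bit figures are just the ones used above):

```python
def information_gained(h_before, h_after):
    """R = H(x) - Hy(x): uncertainty (entropy) removed by receiving the message, in bits."""
    return h_before - h_after

# Uncertainty unchanged by the message: no information received.
print(information_gained(4, 4))  # 0 bits
# Message removes all 4 bits of uncertainty: 4 bits of information received.
print(information_gained(4, 0))  # 4 bits
```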

See above. You've been misled.

Sad that a graduate level instructor taught you this...wow!! Amazing he was employed.


Notice that he equates information with entropy.

Rubbish, nonsense, get that out of your head...someone screwed you up big time.
 
Shannon, in fact, defined entropy as a measure of the average information content associated with a random outcome.
http://en.wikipedia.org/wiki/Information_entropy

R is the info, H is the entropy. H(x) is the uncertainty (entropy) before receipt, and Hy(x) is the uncertainty (entropy) after receipt of the message. The result is R. So the lower the uncertainty, or entropy, after receipt, the higher the info.

You've gotten it backwards again. Read this; it may be more easily understood.

An intuitive understanding of information entropy relates to the amount of uncertainty about an event associated with a given probability distribution. As an example, consider a box containing many coloured balls. If the balls are all of different colours and no colour predominates, then our uncertainty about the colour of a randomly drawn ball is maximal. On the other hand, if the box contains more red balls than any other colour, then there is slightly less uncertainty about the result: the ball drawn from the box has more chances of being red (if we were forced to place a bet, we would bet on a red ball). Telling someone the colour of every new drawn ball provides them with more information in the first case than it does in the second case, because there is more uncertainty about what might happen in the first case than there is in the second. Intuitively, if there were no uncertainty as to the outcome, then we would learn nothing by drawing the next ball, and so the information content would be zero. As a result, the entropy of the "signal" (the sequence of balls drawn, as calculated from the probability distribution) is higher in the first case than in the second.

Shannon, in fact, defined entropy as a measure of the average information content associated with a random outcome.

(same source)
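The ball-drawing example can be checked with a few lines of Python (my own sketch; the colour proportions are made up purely for illustration):

```python
import math

def entropy(probs):
    """Shannon entropy of a discrete distribution, in bits per draw."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Box 1: four colours, all equally likely -> uncertainty about the next ball is maximal.
uniform = [0.25, 0.25, 0.25, 0.25]
# Box 2: red predominates -> slightly less uncertainty about the next ball.
mostly_red = [0.70, 0.10, 0.10, 0.10]

print(entropy(uniform))     # 2.00 bits per draw
print(entropy(mostly_red))  # ~1.36 bits per draw
```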

Sad that a graduate level instructor taught you this...wow!! Amazing he was employed.

Well, he had worked at Bell Labs the same time as Shannon, so they probably figured he knew what he was talking about. Since he presented it as Shannon did, I guess they were right.

Barbarian observes:
Notice that he equates information with entropy.

Rubbish, nonsense, get that out of your head...someone screwed you up big time.

You vs. Shannon. Not much of a choice, is it?
 
You vs. Shannon. Not much of a choice, is it?

Shannon, in fact, defined entropy as a measure of the average information content associated with a random outcome.

Right. The higher the entropy after receipt, the lower the information content.

Example:

Somebody is supposed to call and give you directions to a party.
Let's say you have 10 bits of uncertainty before the call, because you don't know where in the heck it's supposed to be. Then you receive the call, and the girl giving you directions starts babbling about something unrelated to the directions...then your connection is dropped. So your uncertainty is still 10 bits, because you're just as clueless about the directions as you were before the call.

R=10-10

R=0

Info = 0

Dude. You're clueless. :o

You have no idea what Shannon's Theory is. :tongue

Let's take it one step at a time:

R= Information

Are you still with me?


Notice that he equates information with entropy.

My guess, if this guy was one of the engineers, is that you misunderstood him. Every engineer or tech type involved with communication of information knows that the higher your entropy is, the less info that's being transmitted. Of course, he may have been the janitor...who knows? :-?


You vs. Shannon. Not much of a choice, is it?

Now, just remove the Shannon part, and you'll have it right. I'm right in line with Shannon, while you babble your nonsense in defense of your silly a** philosophy. :roll: You guys crack me up big time...acting all scientific like...just as ignorant as can be. :tongue Sad and funny!! :tongue
 
I showed you Shannon's own words. And I showed you how he determined that entropy was equivalent to information.

Much as I'd love to babble insults with you, it's clear you aren't going to accept what Shannon said, regardless.

But you can at least serve as a bad example.
 
You've confused meaning with information. Two different things. Read Shannon's article to learn why that's important.

It might be helpful to think again that if the message is completely known in advance, no information is obtained, and if the message is completely unanticipated (say, a random string of 0s and 1s), the information of the message is maximized, and at the same time H is maximized. Hence the entropy of the message is the information.
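To make those two extremes concrete, here's a rough sketch of mine: it estimates per-symbol entropy from observed symbol frequencies, which is only an approximation of the source entropy, but it shows the point.

```python
import math
import random
from collections import Counter

def empirical_entropy(message):
    """Estimate the per-symbol entropy (bits) from symbol frequencies in the message."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

known_in_advance = "A" * 1000                                      # completely predictable
unanticipated = "".join(random.choice("01") for _ in range(1000))  # random string of 0s and 1s

print(empirical_entropy(known_in_advance))  # 0.0 bits per symbol: no information
print(empirical_entropy(unanticipated))     # ~1.0 bit per symbol: H (and information) maximized
```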
 
Next, we take the entropy prior to receipt:

R=H(x)

Then, finally, we subtract the entropy after receipt:

R = H(x) - Hy(x)

For example:

Suppose you knew nothing about Shannon's theory prior to talking to me. I'll be very generous and allow you 10 to the 20th bits of uncertainty. Now suppose I explained it to you, but you weren't listening, because you were more preoccupied with defending your dumb a** philosophy than seeking truth. So your uncertainty remains at 10 to the 20th bits;

therefore, the information you've gained concerning Shannon's theory:

R= (10 to the 20th bits) - (10 to the 20th bits)

R=0

So the total amount of information you possess concerning Shannon's theory is 0. Of course, the math wasn't really necessary... :tongue
 
Entropy as information content
It is important to remember that entropy is a quantity defined in the context of a probabilistic model for a data source. Independent fair coin flips have an entropy of 1 bit per flip. A source that always generates a long string of A's has an entropy of 0, since the next character will always be an 'A'.

The entropy rate of a data source means the average number of bits per symbol needed to encode it.

http://en.wikipedia.org/wiki/Information_entropy

It's kind of pointless for you to deny it.
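You can even see the "average number of bits per symbol needed to encode it" part for yourself. This is just a rough illustration using zlib as a stand-in for a good code, not a measurement of entropy, and the byte counts are approximate:

```python
import random
import zlib

# A source that always emits 'A' has (near) zero entropy; a stream of random bytes
# has about 8 bits of entropy per symbol and cannot be compressed below that.
always_a = b"A" * 10_000
random_bytes = bytes(random.getrandbits(8) for _ in range(10_000))

print(len(zlib.compress(always_a)))      # a few dozen bytes: almost 0 bits per symbol needed
print(len(zlib.compress(random_bytes)))  # ~10,000 bytes: roughly 8 bits per symbol needed
```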
 
I know you found someone to tell you that entropy is not information. But Shannon, as you see, says it is.

And since his model actually works, he wins.
 
