Yes. Because you've increased the uncertainty in the message. If you had just written "The weather today will be..", there would be less information. You're still confusing information with usefulness or meaning. That's not how it works. Information can be used that way, but incidentally, messages like the one above can be used to embed metamessages within an innocuous sentence. Some intelligence agencies do exactly that.
I add bits just like your claim. You understood by eliminating the non-informing "genes." This is often what cells do with mutations: they ignore them. But there are known mutations that reduce information. This is not disputed. In the cell, information either has a use or it is not information. You seem unable to see that biological life is very efficient, or it does not survive.
If it only survives long enough to reproduce, many of those offspring won't. Second, healthy offspring are able to reproduce just as easily as their parents. No added ability is known in humans born to healthy parents.
There are none in humans that aid in reproductive selection. There are known positive mutations, but none that confer an advantage during the reproductive years without harming subsequent offspring; that is clearly not an evolutionary edge.
As you have seen, there are a good number of even recent mutations that improve the likelihood of a human living long enough to reproduce. And as you now see, that's all that's necessary.
And speciation is a fact. Would you like to see some examples?
No, you've got that wrong. As you saw, information is just a measure of uncertainty in a message.
The more we already know about what the message contains, the fewer bits we need to represent it, because there is less uncertainty about what kind of message we are going to represent. As soon as we can narrow the possible characters in the message from the whole alphabet down to just A and H, we can use fewer bits to represent it, because we know exactly what we need to represent.
When all the communicating parties already know in advance that the message is going to be HAHAHAHAHA
instead of any other 10-character message containing the letters A and H, we can treat the whole HAHAHAHAHA message as one piece of information. If we build a binary search tree for it, the tree has zero depth: there is nothing to search for, since the set contains only one element and P(X) = {1}.
log₂ 1 = 0
Therefore, this is what we’re going to get from the Shannon entropy equation.
H(X) = −Σᵢ P(xᵢ) log₂ P(xᵢ) = −(1 × log₂ 1) = −(1 × 0) = 0
There’s no uncertainty at all in a message that’s already known to contain the string HAHAHAHAHA
so we don’t need to send anything to make it known to the receiving party anymore since there’s no new information for them to gain.
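The zero-entropy argument above can be checked numerically. A minimal sketch (the function name `shannon_entropy` is my own, not from the original discussion):

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Shannon entropy in bits per symbol, estimated from symbol frequencies."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A 50/50 mix of A and H carries 1 bit of uncertainty per symbol.
print(shannon_entropy("HAHAHAHAHA"))  # 1.0

# If the receiver already knows the whole message, there is a single
# possible outcome with P(X) = {1}, so H = -(1 * log2(1)) = 0.
print(-(1.0 * math.log2(1.0)))  # -0.0 (numerically zero)
```

A message drawn from a one-element set contributes nothing, which is exactly why there is nothing left to send.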
Shannon was trying to understand how messages like Morse code were created and transferred from one place to another. Questions like: How does the information of • • • (the letter "s") compare to the information of – – • • ("z")?
To answer this, Shannon proposed his definition of information:
Information is a measure of the surprise of a given message.
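That definition can be made concrete as self-information, I(x) = −log₂ P(x): the rarer the symbol, the more bits of surprise it carries. A sketch using rough published English letter frequencies (the exact probabilities here are assumptions; values vary by corpus):

```python
import math

def surprisal_bits(p: float) -> float:
    """Self-information of an outcome with probability p, in bits."""
    return -math.log2(p)

# Approximate relative frequencies of letters in English text.
p_s = 0.063    # "s" is common
p_z = 0.00074  # "z" is rare

print(round(surprisal_bits(p_s), 2))  # 3.99 bits
print(round(surprisal_bits(p_z), 1))  # 10.4 bits
```

So the rare "z" carries several more bits of information than the common "s", matching the Morse code comparison above.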
I'm writing a book on What Information Wants [https://www.roote.co/what-information-wants]. But nailing down a definition of information is hard. What is information? Let's look at the Wikipedia definition. That should be helpful, right? (Right?!?) > 1. Information can be thought of as the...
Let's see... starfish put out millions of eggs. Very few survive to become even young starfish. This obviously isn't very efficient. So you're demonstrably wrong.
Barbarian said:
Alleles. And as you now see, even random changes add information. That's how it works.
Show us one of those. Then we'll do the numbers and see.
In the Morse code example above, "s" is less surprising than "z", so it contains less information.
Barbarian further said:
Let's see... starfish put out millions of eggs. Very few survive to become even young starfish. This obviously isn't very efficient. So you're demonstrably wrong.
Me:
Humans have never had children in such numbers that very few survive, so you are demonstrably wrong. In fact, mammals don't. You see, I only need one example to prove you wrong, and I have lots.
But otherwise you refuse to leave the computer and look at real biology. You measure biological function by a formula and ignore living creatures that don't comply with the formula.
I'm educated in medicine, and we have known for a long time that mutations remove information. There are known mutations that prevent a baby from producing all the proteins needed for successful living. You wanted examples from real life, and then we will see whether real life matches the formula.
But oddly enough, the information that mutations can prevent information from reaching production in the cell cannot be added to your understanding. I doubt there is a mathematical formula to describe this refusal to add this information.
As you see, that's wrong. In fact, Shannon, who pioneered information theory, showed that every new mutation in a population increases information. Now, I know a few M.D.s who do understand information theory, but most of them are never trained in it at all; those who are tend to be geneticists, for obvious reasons. You're an M.D.? Where in medical school did they teach you information theory?
The problem is, you're still confusing "information" with utility. Two different things.
I add bits just like your claim. You understood by eliminating the non-informing "genes." This is often what cells do with mutations: they ignore them. But there are known mutations that reduce information.
Here's the way that works. Suppose there are ten individuals in a population, and one dies while a new one is born (I'm setting it up this way to make the math easier to see). In the old population there were alleles A (frequency 0.5) and a (frequency 0.5) for a specific gene. The new birth carries A, and the death removes an a, so the frequencies are now 0.6 and 0.4. The information for the old population was about 0.301; after the mutation it is about 0.292 (using base-10 logarithms), a decrease in information.
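The arithmetic in that example can be checked directly; the ≈0.3 and ≈0.29 figures come out when the entropy is computed with base-10 logarithms (with base 2 the same comparison gives 1.0 vs ≈0.97 bits). A sketch; `allele_entropy` is my own name, not standard terminology:

```python
import math

def allele_entropy(freqs, base=10):
    """Shannon entropy of a set of allele frequencies, in the given log base."""
    return -sum(p * math.log(p, base) for p in freqs if p > 0)

# Before: alleles A and a at 0.5/0.5. After one a-death and one A-birth: 0.6/0.4.
before = allele_entropy([0.5, 0.5])
after = allele_entropy([0.6, 0.4])

print(round(before, 3))  # 0.301
print(round(after, 3))   # 0.292
```

Skewing the frequencies away from 50/50 makes the population more predictable, so the entropy, and with it the information in Shannon's sense, goes down.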
I think you're still confusing information with utility.