Barbarian said:
Alleles. And as you now see, even random changes add information. That's how it works.
Show us one of those. Then we'll do the numbers and see.
No, you've got that wrong. As you saw, information is just a measure of uncertainty in a message.
The more we already know about what the message contains, the fewer bits we need to represent it, because there is less uncertainty about what kind of message we are going to represent. As soon as we can narrow the possible characters in the message from the whole alphabet down to just A and H, we can use fewer bits to represent it, because we know exactly what we need to represent.
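As a rough sketch of that bit count, assuming (purely for illustration, since the thread doesn't state it) a 26-letter uppercase alphabet and equally likely symbols:

```python
import math

# With equally likely symbols, each character costs log2(alphabet size) bits.
full_alphabet = 26        # assumed: plain uppercase A-Z
restricted_alphabet = 2   # we know the message uses only A and H
message_length = 10       # e.g. "HAHAHAHAHA"

print(message_length * math.log2(full_alphabet))        # ~47.0 bits
print(message_length * math.log2(restricted_alphabet))  # 10.0 bits
```

Knowing the alphabet is just {A, H} already cuts the cost from roughly 47 bits to 10.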
When all the communicating parties already know in advance that the message is going to be HAHAHAHAHA
instead of any other 10-character message containing the letters A and H, we can treat the whole HAHAHAHAHA message as a single piece of information. If we build a binary search tree for it, the tree has zero depth: there is nothing to search for, since the set contains only one element and $P(X) = \{1\}$.
$$\log_2 1 = 0$$
Therefore, this is what we’re going to get from the Shannon entropy equation.
$$H(X) = -\sum_{i=1}^{1} P(x_i)\log_2 P(x_i) = -(1 \times \log_2 1) = -(1 \times 0) = 0$$
There’s no uncertainty at all in a message that’s already known to contain the string HAHAHAHAHA
so we don't need to send anything to the receiving party anymore, since there's no new information for them to gain.
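A minimal Python sketch of the same computation; the 50/50 per-character case is an added comparison, not part of the exchange above:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H(X) = -sum of P(x) * log2(P(x))."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# Each character is A or H with equal probability: 1 bit per character,
# so a 10-character message still carries 10 bits.
print(10 * shannon_entropy([0.5, 0.5]))  # 10.0

# The receiver already knows the message is HAHAHAHAHA: one outcome
# with P = 1, so there is zero entropy and nothing left to learn.
print(shannon_entropy([1.0]))            # 0.0
```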
Shannon was trying to understand how messages, such as Morse code transmissions, were created and transferred from one place to another. Questions like: how does the information of • • • (the letter "s") compare to the information of – – • • ("z")?
To answer this, Shannon proposed his definition of information: the information carried by an outcome is its surprisal, $I(x) = -\log_2 P(x)$, so the less likely an outcome is, the more information it carries.
In the Morse code example above, "s" is less surprising than "z", so it contains less information.
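A small sketch of that comparison, using assumed, rounded English letter frequencies (the exact values are illustrative, not from the text):

```python
import math

def surprisal(p):
    """Self-information in bits: I(x) = -log2 P(x)."""
    return -math.log2(p)

p_s = 0.063    # assumed frequency of 's' in English text (common)
p_z = 0.0007   # assumed frequency of 'z' (rare)

print(round(surprisal(p_s), 1))  # ~4.0 bits: less surprising, less information
print(round(surprisal(p_z), 1))  # ~10.5 bits: more surprising, more information
```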
(See: What Information Wants, https://www.roote.co/what-information-wants, via www.rhyslindmark.com)
Barbarian further said:
Let's see... starfish put out millions of eggs. Very few survive to become even young starfish. This obviously isn't very efficient. So you're demonstrably wrong.
Me:
Humans have never had children in such numbers that very few survive, so you are demonstrably wrong. In fact, mammals in general don't. You see, I only need one counterexample to prove you wrong, and I have plenty.
But beyond that, you refuse to leave the computer and look at real biology. You measure biological function by a formula and ignore living creatures that don't comply with the formula.
I'm educated in medicine, and we have known for a long time that mutations remove information. There are known mutations that prevent a baby from producing all the proteins needed for successful living. You wanted examples from real life; then we would see whether real life matches the formula.
But oddly enough, this piece of information, that mutations can keep information from ever reaching production in the cell, cannot be added to your understanding. I doubt there is a mathematical formula to describe this refusal to add this information.
We have reached an impasse, or perhaps a wall.
Adieu