(Milk asks what information is)
It's an important question when you want to talk about what information means in biology, or in messaging of any kind. Almost no one knows what it really is, but it can be expressed mathematically.
Wrong. Information is the degree of uncertainty in a message. For example, if you are completely certain of the content of a message, it carries zero information, since you knew exactly what it would be. To the degree that you are not sure, that uncertainty is the information.
Let's put it into genetics. Say a population of organisms has two alleles at one particular gene locus, both at a frequency of 0.5 in the population (both 50%). The information in that population, for that gene, is:

H = -Σᵢ pᵢ log₁₀(pᵢ)

where H is the entropy (information) and pᵢ is the probability (frequency) of each allele. So the information in this population would be slightly more than 0.3, since -2 × 0.5 × log₁₀(0.5) = log₁₀(2) ≈ 0.301. That's how information works.
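As a quick check of the arithmetic, here is a minimal sketch in Python. The function name and the base-10 logarithm are my choices (base 10 is assumed because it reproduces the ~0.3 figure; base 2 would give 1 bit):

```python
import math

def shannon_entropy(freqs, base=10):
    """Shannon entropy H = -sum(p * log(p)) over a frequency distribution.

    Terms with p == 0 are skipped, since lim p->0 of p*log(p) is 0.
    """
    return -sum(p * math.log(p, base) for p in freqs if p > 0)

# Two alleles, each at frequency 0.5: H = log10(2), slightly more than 0.3
print(shannon_entropy([0.5, 0.5]))

# A population fixed for one allele is completely certain: zero information
print(shannon_entropy([1.0]))
```

With unequal frequencies, say 0.9 and 0.1, the entropy drops, matching the idea that a more predictable population carries less information.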
Wrong. Information has nothing to do with usefulness, or with the ability to do anything with it.
If you don't get this, you don't get information.