[Old Earth] How to measure information content of a genome?

Crying Rock said:
There would be a slight decrease in information. Our views may differ because I think that our creator transmitted the original message.
That's the point I was trying to get to.
The interpretation of information theory given in this thread right now explicitly defines the "original message" as holding the highest possible information content, and treats all changes as a loss of information by definition.

This means that even if the original message was the lyrics of a Britney Spears song and, by some absolute freak chance, the received message was a copy of the Encyclopaedia Britannica, that would still count as a loss of information under this particular interpretation of information theory.

The crux of the matter is that this interpretation of information theory does not evaluate the quality of the alteration of the original message. It simply declares any change to the original message to be unwanted by definition and doesn't bother to look at what that change actually does.
 
It makes sense that creationists claim that evolution can't produce new information, then. They are essentially arguing that any change has to be a loss of information by definition.

But then, "new information" is an oxymoron, quite apart from the problem of not knowing what the imaginary "original information" was.

And no evolutionary process would require "new information."

Rock and a hard place, it seems.
 
The Barbarian said:
It makes sense that creationists claim that evolution can't produce new information, then. They are essentially arguing that any change has to be a loss of information by definition.

But then, "new information" is an oxymoron, quite apart from the problem of not knowing what the imaginary "original information" was.

And no evolutionary process would require "new information."
Yep, exactly. This makes sense considering Shannon's background as an employee of a telecommunications company. To his employer, any change to the original message was a loss of information by definition, and that is reflected in the theoretical work that Shannon did.

This means that Shannon's information theory is not really applicable as an argument against the theory of evolution, at least when one refers to gene duplication during procreation as a case of message transmission. Any evolutionary change, even one that produces an entirely new feature without losing a previously existing one, would be a loss of information according to this interpretation of information theory. It may sound counterintuitive at first, but it makes sense after a closer look.

Otherwise I'd like to challenge the resident creationists to describe a change to a genome that would actually constitute an increase in information according to this definition. There is none.
 
Shannon's equation is used by population biologists to measure the amount of information in a population, as it pertains to single loci. The frequency of each allele becomes the input, and the total information is -1 times the sum of the frequency of each allele times the log of the frequency of each allele.

Not many people know that Claude Shannon's PhD thesis was on the application of information theory to genetics.
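
As a rough sketch of that calculation in code (the locus and the allele frequencies below are made up purely for illustration):

    import math

    def shannon_entropy(frequencies):
        # H = -sum(p * log2(p)) over the allele frequencies at one locus
        return -sum(p * math.log2(p) for p in frequencies if p > 0)

    # Hypothetical locus with three alleles at frequencies 0.5, 0.3 and 0.2
    print(shannon_entropy([0.5, 0.3, 0.2]))  # about 1.49 bits

An allele drifting to fixation (frequency 1.0) drives this value to zero, which is one way of expressing a loss of diversity at that locus.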
 
The Barbarian said:
Shannon's equation is used by population biologists to measure the amount of information in a population, as it pertains to single loci. The frequency of each allele becomes the input, and the total information is -1 times the sum of the frequency of each allele times the log of the frequency of each allele.

Not many people know that Claude Shannon's PhD thesis was on the application of information theory to genetics.

And what does Mr. Tom have to say?

http://www.lecb.ncifcrf.gov/~toms/infor ... ainty.html

Does he contradict Shannon?
 
Does he contradict Shannon?

At the very beginning. Information is indeed a measure of uncertainty. Let's suppose you have a grid of 10×20 units. At each intersection of the lines is a ball. How much information do you need to specify the position of each ball? Now imagine the arrangement as disordered as possible. How much information is necessary to specify the location of each ball now?

And suppose we were wrong about those balls being randomly distributed. Suppose they actually code for some sort of numerical sequence, depending on their relationship with each other. Which of the two examples would have more information?

Does that help?
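
To put rough numbers on the grid example, here is a sketch under one possible reading of it (the 10×20 grid has 200 intersections, and the question is which of them hold a ball):

    import math

    intersections = 10 * 20   # 200 grid intersections

    # Ordered case: "a ball at every intersection" is one short rule,
    # so essentially no further information is needed.

    # Disordered case: suppose half of the intersections hold a ball,
    # chosen as randomly as possible. Specifying which ones takes
    # roughly log2(200 choose 100) bits.
    print(math.log2(math.comb(intersections, intersections // 2)))  # about 196 bits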

In information theory, the more bits of information you have, the more uncertainty you have. The number of possible messages you can make with x bits is two to the x power (since each bit has only two possible values: on and off). Turn this same idea around, and the number of bits you need to transmit a message is the base-two logarithm of the number of possible messages.

2^x = M
x = log2(M)

The value x here was given the name "entropy" by Shannon. Usually entropy is measured in "bits per symbol" or a similar relation: if you are using a set of symbols to transmit your message, the entropy is how many bits it takes to represent one symbol. For example, the extended ASCII character set has 256 characters. The base-two logarithm of 256 is 8 (2^8 = 256), so there are 8 bits of entropy per symbol. If one symbol is more likely to follow another (as "u" often follows "q"), the math gets more complicated, but the relation is still logarithmic.
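
A quick sketch of those two relations and the ASCII example:

    import math

    # Number of distinct messages that x bits can carry: M = 2^x
    print(2 ** 8)          # 256 possible messages with 8 bits

    # Going the other way, the bits needed for M equally likely symbols is log2(M)
    print(math.log2(256))  # 8.0 bits per symbol for the extended ASCII set
    print(math.log2(4))    # 2.0 bits per symbol for the four DNA bases, if equally frequent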
 
The Barbarian said:
Shannon's equation is used by population biologists to measure the amount of information in a population, as it pertains to single loci.

What differences do you note between population statistics and info exchange between the molecular machines?
 
What differences do you note between population statistics and info exchange between the molecular machines?

At the information level, none. As Shannon noted in his thesis on genetics, the way information works is largely independent of the system itself.
 
The Barbarian said:
What differences do you note between population statistics and info exchange between the molecular machines?

At the information level, none. As Shannon noted in his thesis on genetics, the way information works is largely independent of the system itself.

That's pretty vague to me. Please continue.
 
Shannon first elucidated his ideas about information using population genetics. But the same theory is applicable to a telegraph, to radio transmissions, molecular biology, etc.

A pretty good non-technical source would be Information Theory and Molecular Biology, by Hubert Yockey. You might like him; he's a Christian, and he argues that the origin of life is not knowable by man.
 
The Barbarian said:
Shannon first elucidated his ideas about information using population genetics. But the same theory is applicable to a telegraph, to radio transmissions, molecular biology, etc.

How would you use Shannon's equation:

R = H(x) - Hy(x)

for "information" measurement using population genetics?

"The conditional entropy Hy(x) will, for convenience, be called the equivocation. It measures the average ambiguity of the received signal."

http://cm.bell-labs.com/cm/ms/what/shan ... paper.html
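
For what it's worth, here is a generic sketch of what that equation computes, using a made-up noisy binary channel rather than anything specific to population genetics: H(x) is the entropy of the sent symbols, Hy(x) is the equivocation (the uncertainty about what was sent that remains after seeing what was received), and R is their difference.

    import math
    from collections import defaultdict

    # Made-up joint distribution p(x, y): x is the sent bit, y the received bit.
    joint = {('0', '0'): 0.45, ('0', '1'): 0.05,
             ('1', '0'): 0.05, ('1', '1'): 0.45}

    p_x = defaultdict(float)
    p_y = defaultdict(float)
    for (x, y), p in joint.items():
        p_x[x] += p
        p_y[y] += p

    # Source entropy H(x)
    H_x = -sum(p * math.log2(p) for p in p_x.values())

    # Equivocation Hy(x) = -sum over x,y of p(x,y) * log2(p(x|y))
    H_equiv = -sum(p * math.log2(p / p_y[y]) for (x, y), p in joint.items())

    R = H_x - H_equiv
    print(H_x, H_equiv, R)  # 1.0, about 0.47, about 0.53 bits per symbol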
 