
[Old Earth] Dr. William Bialek

Crying Rock

Research Interests


I am interested in the interface between physics and biology, broadly interpreted. A central theme in my research is an appreciation for how well things “work” in biological systems. It is, after all, some notion of functional behavior that distinguishes life from inanimate matter, and it is a challenge to quantify this functionality in a language that parallels our characterization of other physical systems. Strikingly, when we do this (and there are not so many cases where it has been done!), the performance of biological systems often approaches some limits set by basic physical principles. While it is popular to view biological mechanisms as an historical record of evolutionary and developmental compromises, these observations on functional performance point toward a very different view of life as having selected a set of near optimal mechanisms for its most crucial tasks. Even if this view is wrong, it suggests a theoretical physicist's idealization; the construction of this idealization and the attempt to calibrate the performance of real biological systems against this ideal provides a productive route for the interaction of theory and experiment, and in several cases this effort has led to the discovery of new phenomena. The idea of performance near the physical limits crosses many levels of biological organization, from single molecules to cells to perception and learning in the brain, and I have tried to contribute to this whole range of problems.

http://www.princeton.edu/~wbialek/wbialek.html

Much of biological function is about the flow and processing of information. Examples range from bacteria to brains, with many stops in between. All of these many different biological systems, however, are constrained by common physical principles. For example, using only a limited number of molecules to transmit signals means that cells will experience some irreducible noise related to the random behavior of the individual molecules. There is a long history of experiments on signals and noise in biological systems, and in recent years these experiments have expanded to encompass a wider variety of biological processes, including the regulation of gene expression. Remarkably, many biological systems seem to operate near the relevant physical limits to their signaling performance, and this may give us a glimpse of the “design principles” which select the structure of these systems from the wide range of possibilities. In this course we'll explore the underlying physical principles, the associated mathematical tools, and (in some detail) the connection of these theoretical ideas to quantitative experiments.

http://www.princeton.edu/~wbialek/RUcourseF08/SNI.html


Our intuition from statistical mechanics suggests that the entropy S
measures Max’s uncertainty about what Allan will say in response to his
question, in the same way that the entropy of a gas measures our lack of
knowledge about the microstates of all the constituent molecules. Once Allan
gives his answer, all of this uncertainty is removed -- one of the responses
occurred, corresponding to p = 1, and all the others did not, corresponding
to p = 0 -- so the entropy is reduced to zero. It is appealing to equate this
reduction in our uncertainty with the information we gain by hearing Allan’s
answer. Shannon proved that this is not just an interesting analogy; it is
the only definition of information that conforms to some simple constraints
[Shannon 1948]

http://www.princeton.edu/~wbialek/RUcourseF08/info1.pdf



4.2 Entropy lost and information gained
Returning to the conversation between Max and Allan, we assumed that
Max would receive a complete answer to his question, and hence that all
his uncertainty would be removed. This is an idealization, of course. The
more natural description is that, for example, the world can take on many
states W, and by observing data D we learn something but not everything
about W. Before we make our observations, we know only that states of the
world are chosen from some distribution P(W), and this distribution has
an entropy S(W). Once we observe some particular datum D, our (hopefully
improved) knowledge of W is described by the conditional distribution
P(W|D), and this has an entropy S(W|D) that is smaller than S(W) if we
have reduced our uncertainty about the state of the world by virtue of our
observations. We identify this reduction in entropy as the information that
we have gained about W.

http://www.princeton.edu/~wbialek/RUcourseF08/info2.pdf
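As a concrete illustration of that last identification (information gained = S(W) - S(W|D)), here is a minimal Python sketch; the probabilities are invented purely for illustration:

import math

def entropy_bits(p):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(q * math.log2(q) for q in p if q > 0)

# Made-up example: the world W starts in one of four equally likely states.
S_W = entropy_bits([0.25, 0.25, 0.25, 0.25])      # 2.0 bits of uncertainty

# Observing a datum D rules out two of the states, leaving two equally likely.
S_W_given_D = entropy_bits([0.5, 0.5, 0.0, 0.0])  # 1.0 bit of uncertainty remains

information_gained = S_W - S_W_given_D            # 1.0 bit gained about W
print(S_W, S_W_given_D, information_gained)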
 
Would you like to try a very simple simulation to show how natural selection can optimize properties of organisms?

Did you know that Shannon's information theory is used in evolutionary science to measure how mutations increase information in a population?
 
The Barbarian said:
Would you like to try a very simple simulation to show how natural selection can optimize properties of organisms?

I have no beef with natural selection optimizing properties of organisms.


The Barbarian said:
Did you know that Shannon's information theory is used in evolutionary science to measure how mutations increase information in a population?

No, I didn't. Do you think mutations increase or decrease informational entropy? Who/ what created the original information? As a creationist, I believe God created the original information, so unless God intends for particular mutations to occur, they represent a decrease in information.

I can't speak for an atheistic or naturalistic evolutionary interpretation of this matter and don't really care to know.

If you think God has his "hands on" every aspect of evolution (basically directing it), then I can see how you conclude mutations are an increase in information. However, that increase in information is a product of intelligence (God).

CR
 
(Barbarian asks if CR knows Shannon's equation is used to measure information in a population)

No, I didn't. Do you think mutations increase or decrease informational entropy? Who/ what created the original information?

Here's a way to find out:

Imagine a population with two alleles for a specific gene, each at 50% frequency. Now, imagine a mutation that produces a new allele, which eventually reaches parity with the other two (each at 0.33 frequency). Apply Shannon's formula and see.

As a creationist, I believe God created the original information, so unless God intends for particular mutations to occur, they represent a decrease in information.

In fact, every new mutation increases information in a population, as you'll see if you do the math.

I can't speak for an atheistic or naturalistic evolutionary interpretation of this matter and don't really care to know.

No such thing as "atheistic math."

If you think God has his "hands on" every aspect of evolution

For a Christian, God is intimately involved in every subatomic particle. Nature is just God's way of making a world with enough consistency and predictability that we can live in it.

However, that increase in information is a product of intelligence (God).

Turns out we only need to recognize the rules by which He runs things here. Which is great, because science is too weak a process to explain God.
 
B wrote:

Barbarian asks if CR knows Shannon's equation is used to measure information in a population

CR wrote:

No, I didn't. Do you think mutations increase or decrease informational entropy? Who/ what created the original information?


B wrote:

Here's a way to find out:

Imagine a population with two alleles for a specific gene, each at 50% frequency. Now, imagine a mutation that produces a new allele, which eventually reaches parity with the other two (each at 0.33 frequency). Apply Shannon's formula and see.

How do I apply Shannon's formula to a population-statistics problem when his formula is meant to be applied to communication:

R = H(x) - Hy(x)

Where:

R is the information received,

H(x) is the signal and

Hy(x) is the noise (uncertainty/ entropy) in the received signal.

"The conditional entropy Hy(x) will, for convenience, be called the equivocation. It measures the average ambiguity of the received signal."

C. E. Shannon, ``A mathematical theory of communication,'' Bell System Technical Journal, vol. 27, pp. 379-423 and 623-656, July and October, 1948.

http://www.thocp.net/biographies/papers ... n_1948.pdf

Walk me through how to apply Shannon's formula to population statistics.

Specifically:

B wrote:

Imagine a population with two alleles for a specific gene, each at 50% frequency. Now, imagine a mutation that produces a new allele, which eventually reaches parity with the other two (each at 0.33 frequency). Apply Shannon's formula and see.

R = H(x) - Hy(x)

What represents R in your scenario? What represents H(x)? What represents Hy(x)?

Do Native Americans’ genes contain more information than those of other populations because they possess a novel allele:

http://www.ncjrs.gov/App/AbstractDB/Abs ... ?id=247643

And what, at the molecular level, does this allele communicate to the receiver (except blah, blah, blah…)?
 
How do I apply Shannon's formula to a population-statistics problem when his formula is meant to be applied to communication

Information theory is based on probability theory and statistics. The most important quantities of information are entropy, the information in a random variable, and mutual information, the amount of information in common between two random variables.
http://en.wikipedia.org/wiki/Information_theory

Shannon's equation for information then, is:

H= - (sum of) p(x) log(p(x))

In the case above, it would be about 0.30 for two alleles of 0.5 frequency each, and about 0.48 for three alleles of about 0.3333 each. Feel free to play with it a bit to see how any mutation will increase information in the population.

As you see, information does not have to be the result of an intelligent agent, unless you think God creating a universe in which such things happen naturally is such an example.
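A quick check of those figures, assuming base-10 logarithms (which is what yields 0.30 and 0.48; in bits the values would be 1.0 and about 1.58):

import math

def shannon_H(freqs, base=10):
    """H = -sum p*log(p) over the allele frequencies, using logs in the given base."""
    return -sum(p * math.log(p, base) for p in freqs if p > 0)

print(shannon_H([0.5, 0.5]))        # ~0.301: two alleles at 50% each
print(shannon_H([1/3, 1/3, 1/3]))   # ~0.477: three alleles after the new mutation spreads
print(shannon_H([1.0]))             # 0.0: a single fixed allele, no uncertainty, no information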
 
B wrote:

Shannon's equation for information then, is:

H= - (sum of) p(x) log(p(x))

No, that's Shannon's equation for entropy/ uncertainty prior to receipt of the signal. Shannon's equation for information is:

R = H(x) - Hy(x)

Where:

R is the information received,

H(x) is the entropy/ uncertainty prior to receipt of the signal

Hy(x) is the entropy/ uncertainty after receipt of the signal

"The conditional entropy Hy(x) will, for convenience, be called the equivocation. It measures the average ambiguity of the received signal."

C. E. Shannon, ``A mathematical theory of communication,'' Bell System Technical Journal, vol. 27, pp. 379-423 and 623-656, July and October, 1948.

http://www.thocp.net/biographies/papers ... n_1948.pdf

See page 20.

So, in your scenario:

B wrote:

Imagine a population with two alleles for a specific gene, each at 50% frequency. Now, imagine a mutation that produces a new allele, which eventually reaches parity with the other two (each at 0.33 frequency). Apply Shannon's formula and see.

What constitutes:

1. H(x)- the entropy/ uncertainty prior to receipt of the signal?

2. Hy(x)- the entropy/ uncertainty after receipt of the signal?

3. R- the information received?

B wrote:

Shannon's equation for information then, is:

H= - (sum of) p(x) log(p(x))

It appears you're trying to equate entropy/ uncertainty with information.

How can a higher amount of entropy/ uncertainty represent a gain in information?

That's completely counter to Shannon's theory.



4.2 Entropy lost and information gained

Returning to the conversation between Max and Allan, we assumed that
Max would receive a complete answer to his question, and hence that all
his uncertainty would be removed. This is an idealization, of course. The
more natural description is that, for example, the world can take on many
states W, and by observing data D we learn something but not everything
about W. Before we make our observations, we know only that states of the
world are chosen from some distribution P(W), and this distribution has
an entropy S(W). Once we observe some particular datum D, our (hopefully
improved) knowledge of W is described by the conditional distribution
P(W|D), and this has an entropy S(W|D) that is smaller than S(W) if we
have reduced our uncertainty about the state of the world by virtue of our
observations. We identify this reduction in entropy as the information that
we have gained about W.

http://www.princeton.edu/~wbialek/RUcourseF08/info2.pdf


Our intuition from statistical mechanics suggests that the entropy S
measures Max’s uncertainty about what Allan will say in response to his
question, in the same way that the entropy of a gas measures our lack of
knowledge about the microstates of all the constituent molecules. Once Allan
gives his answer, all of this uncertainty is removed -- one of the responses
occurred, corresponding to p = 1, and all the others did not, corresponding
to p = 0 -- so the entropy is reduced to zero. It is appealing to equate this
reduction in our uncertainty with the information we gain by hearing Allan’s
answer. Shannon proved that this is not just an interesting analogy; it is
the only definition of information that conforms to some simple constraints
[Shannon 1948]

http://www.princeton.edu/~wbialek/RUcourseF08/info1.pdf
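For concreteness, here is a small sketch of the quantities in R = H(x) - Hy(x) on a made-up noisy channel (a fair binary source whose symbols are flipped 10% of the time); H(x) is the source entropy and Hy(x) the equivocation averaged over the received symbols:

import math

def H(p):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(q * math.log2(q) for q in p if q > 0)

# Made-up channel: a fair binary source x whose symbols are flipped
# with probability 0.1 on the way to the receiver y.
p_x = [0.5, 0.5]
flip = 0.1

# Joint distribution P(x, y) and the marginal P(y)
p_xy = [[p_x[0] * (1 - flip), p_x[0] * flip],
        [p_x[1] * flip, p_x[1] * (1 - flip)]]
p_y = [p_xy[0][0] + p_xy[1][0], p_xy[0][1] + p_xy[1][1]]

H_x = H(p_x)  # H(x), entropy of the source: 1.0 bit

# Hy(x), the equivocation: entropy of x given the received symbol, averaged over y
Hy_x = sum(p_y[j] * H([p_xy[0][j] / p_y[j], p_xy[1][j] / p_y[j]]) for j in range(2))

R = H_x - Hy_x  # R = H(x) - Hy(x), the information received: ~0.531 bits per symbol
print(H_x, Hy_x, R)

With flip = 0 the equivocation is zero and R equals H(x); with flip = 0.5 the equivocation equals H(x) and R is zero.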
 
It appears you're trying to equate entropy/ uncertainty with information.

Right. Information is the measure of uncertainty in a signal. If you have a priori no uncertainty at all about a signal, there is no information at all in it. Receiving the signal will give you no information, since you already know what's in it.

The uncertainty is a measure of information.

In information theory, the term information is used in a special sense; it is a measure of the freedom of choice with which a message is selected from the set of all possible messages. Information is thus distinct from meaning, since it is entirely possible for a string of nonsense words and a meaningful sentence to be equivalent with respect to information content.
http://www.answers.com/topic/information-theory

Information theory is based on probability theory and statistics. The most important quantities of information are entropy, the information in a random variable, and mutual information, the amount of information in common between two random variables.
http://en.wikipedia.org/wiki/Information_theory

How can a higher amount of entropy/ uncertainty represent a gain in information?

See above. Entropy (which in signals, is uncertainty) is information.

Entropy is information in source up to a constant
which depends on precision
Constant goes to infinity for infinite precision
Entropy can be negative

http://www.tele.ucl.ac.be/EDU/ELEC2880/ ... Cours6.pdf

That's completely counter to Shannon's theory.

No, it's the heart of his theory. It's completely counter to intuition, but Shannon's insight explains how to pack so much information in a pipe with limited bandwidth.

See the calculations I showed you. Notice that the more alleles in the population (hence greater uncertainty as to which allele you will sample) the more information the sample will hold. If you have only one allele, the information will be 0.0, because it will tell you nothing you do not already know.

Think of a digitized signal over time. At each time interval, the signal is sampled, and
the value measured is one of a number of classes or digits C, so that a changing signal is
classified into a sequence of digits. Over time, we can count the occurrences of each
digit. If the number of each type i is ni, and the total number is N, we can speak of the
probability of measuring each type of digit since measurement started. It is just the fraction
of all the digits in each class pi=ni/N. Shannon then showed that the entropy could
be defined by
H = - p1 log2 p1 - p2 log2 p2 ...- pC log2 pC
where the base 2 logarithm is used so that the measurement turns out to be measured in
“bits.” This quantity has many deep and useful properties that we don’t have to go into
here. Shannon showed that the quantity H was a measure of the amount of information
in the original signal, in the sense that it measured the amount of its variation. He also
showed that it is a lower limit on the length of an equivalent message (in bits) to which
a signal can be compressed.

http://www.usenix.org/publications/logi ... urgess.pdf
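A minimal sketch of that count-based estimate; the digit sequence below is invented for illustration:

import math
from collections import Counter

def entropy_bits(samples):
    """H = -sum p_i log2 p_i, with p_i = n_i / N estimated from the observed counts."""
    counts = Counter(samples)
    N = len(samples)
    return -sum((n / N) * math.log2(n / N) for n in counts.values())

# Invented digitized signal: each entry is the digit class measured at one time step.
signal = [0, 1, 1, 2, 0, 1, 2, 1, 0, 1, 1, 2, 0, 1, 1, 2]
print(entropy_bits(signal))  # 1.5 bits for this 4/8/4 mix of the three classes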

If you've had no formal training in systems and information theory, this probably comes as a shock. But it's true, and it works in practical applications, like internet message compression, and use of very weak transmitters to send reliable messages across millions of kilometers of space.
 
B wrote:

See above. Entropy...is uncertainty...is information.

B.,

It's not worth arguing with you.

I've given you two academic sources that conclude decreased entropy = increased information and increased entropy = decreased information: Shannon himself and Princeton:


4.2 Entropy lost and information gained

Returning to the conversation between Max and Allan, we assumed that
Max would receive a complete answer to his question, and hence that all
his uncertainty would be removed. This is an idealization, of course. The
more natural description is that, for example, the world can take on many
states W, and by observing data D we learn something but not everything
about W. Before we make our observations, we know only that states of the
world are chosen from some distribution P(W), and this distribution has
an entropy S(W). Once we observe some particular datum D, our (hopefully
improved) knowledge of W is described by the conditional distribution
P(W|D), and this has an entropy S(W|D) that is smaller than S(W) if we
have reduced our uncertainty about the state of the world by virtue of our
observations. We identify this reduction in entropy as the information that
we have gained about W.

http://www.princeton.edu/~wbialek/RUcourseF08/info2.pdf



Our intuition from statistical mechanics suggests that the entropy S
measures Max’s uncertainty about what Allan will say in response to his
question, in the same way that the entropy of a gas measures our lack of
knowledge about the microstates of all the constituent molecules. Once Allan
gives his answer, all of this uncertainty is removed -- one of the responses
occurred, corresponding to p = 1, and all the others did not, corresponding
to p = 0 -- so the entropy is reduced to zero. It is appealing to equate this
reduction in our uncertainty with the information we gain by hearing Allan’s
answer. Shannon proved that this is not just an interesting analogy; it is
the only definition of information that conforms to some simple constraints
[Shannon 1948]

http://www.princeton.edu/~wbialek/RUcourseF08/info1.pdf

B wrote:

Shannon's equation for information then, is:

H= - (sum of) p(x) log(p(x))


CR wrote:

No, that's Shannon's equation for entropy/ uncertainty prior to receipt of the signal. Shannon's equation for information is:

R = H(x) - Hy(x)

Where:

R is the information received,

H(x) is the entropy/ uncertainty prior to receipt of the signal

Hy(x) is the entropy/ uncertainty after receipt of the signal

"The conditional entropy Hy(x) will, for convenience, be called the equivocation. It measures the average ambiguity of the received signal."

C. E. Shannon, ``A mathematical theory of communication,'' Bell System Technical Journal, vol. 27, pp. 379-423 and 623-656, July and October, 1948.

http://www.thocp.net/biographies/papers ... n_1948.pdf


Here's another academic source that concludes just the opposite of you:

B wrote:

See above. Entropy...is uncertainty...is information.

Information Is Not Entropy,
Information Is Not Uncertainty!

http://www.lecb.ncifcrf.gov/~toms/infor ... ainty.html

I'm Confused: How Could Information Equal Entropy?

http://www.ccrnp.ncifcrf.gov/~toms/bion ... al.Entropy

I'm Confused: How Could Information Equal Entropy?

If you are truly searching for truth, perhaps you should be asking the same question.

I’m done here.

:wave
 
Your guy seems a bit confused. Shannon himself pointed out that the entropy of the message is a measure of its information.

This probably seems wrong to you, but it has one very important virtue: it works. It is used to compress messages, and to specify the amount of redundancy necessary to obtain the desired degree of accuracy.

Perhaps this makes it clearer:
In information theory, entropy is a measure of the uncertainty associated with a random variable. The term by itself in this context usually refers to the Shannon entropy, which quantifies, in the sense of an expected value, the information contained in a message, usually in units such as bits. Equivalently, the Shannon entropy is a measure of the average information content one is missing when one does not know the value of the random variable. The concept was introduced by Claude E. Shannon in his 1948 paper "A Mathematical Theory of Communication".

Shannon's entropy represents an absolute limit on the best possible lossless compression of any communication, under certain constraints: treating messages to be encoded as a sequence of independent and identically-distributed random variables, Shannon's source coding theorem shows that, in the limit, the average length of the shortest possible representation to encode the messages in a given alphabet is their entropy divided by the logarithm of the number of symbols in the target alphabet.

A fair coin has an entropy of one bit. However, if the coin is not fair, then the uncertainty is lower (if asked to bet on the next outcome, we would bet preferentially on the most frequent result), and thus the Shannon entropy is lower. Mathematically, a coin flip is an example of a Bernoulli trial, and its entropy is given by the binary entropy function. A long string of repeating characters has an entropy rate of 0, since every character is predictable. The entropy rate of English text is between 1.0 and 1.5 bits per letter,[1] or as low as 0.6 to 1.3 bits per letter, according to estimates by Shannon based on human experiments.[2]

http://en.wikipedia.org/wiki/Information_entropy
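A short sketch of the coin example above, using the binary entropy function the quote mentions:

import math

def binary_entropy(p):
    """Entropy in bits of a coin that comes up heads with probability p."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

print(binary_entropy(0.5))  # 1.0 bit: a fair coin, maximal uncertainty per flip
print(binary_entropy(0.9))  # ~0.469 bits: a biased coin, lower Shannon entropy
print(binary_entropy(1.0))  # 0.0 bits: a fully predictable coin carries no information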

Again, to the layman, it must seem counter-intuitive. But it works.
 