Crying Rock
Member
- Oct 16, 2008
Research Interests
I am interested in the interface between physics and biology, broadly interpreted. A central theme in my research is an appreciation for how well things “work” in biological systems. It is, after all, some notion of functional behavior that distinguishes life from inanimate matter, and it is a challenge to quantify this functionality in a language that parallels our characterization of other physical systems. Strikingly, when we do this (and there are not so many cases where it has been done!), the performance of biological systems often approaches some limits set by basic physical principles. While it is popular to view biological mechanisms as an historical record of evolutionary and developmental compromises, these observations on functional performance point toward a very different view of life as having selected a set of near optimal mechanisms for its most crucial tasks. Even if this view is wrong, it suggests a theoretical physicist's idealization; the construction of this idealization and the attempt to calibrate the performance of real biological systems against this ideal provides a productive route for the interaction of theory and experiment, and in several cases this effort has led to the discovery of new phenomena. The idea of performance near the physical limits crosses many levels of biological organization, from single molecules to cells to perception and learning in the brain, and I have tried to contribute to this whole range of problems.
http://www.princeton.edu/~wbialek/wbialek.html
Much of biological function is about the flow and processing of information. Examples range from bacteria to brains, with many stops in between. All of these many different biological systems, however, are constrained by common physical principles. For example, using only a limited number of molecules to transmit signals means that cells will experience some irreducible noise related to the random behavior of the individual molecules. There is a long history of experiments on signals and noise in biological systems, and in recent years these experiments have expanded to encompass a wider variety of biological processes, including the regulation of gene expression. Remarkably, many biological systems seem to operate near the relevant physical limits to their signaling performance, and this may give us a glimpse of the “design principles” which select the structure of these systems from the wide range of possibilities. In this course we'll explore the underlying physical principles, the associated mathematical tools, and (in some detail) the connection of these theoretical ideas to quantitative experiments.
http://www.princeton.edu/~wbialek/RUcourseF08/SNI.html
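A minimal sketch of the "irreducible noise" point in the course description, under the common assumption (mine, not stated in the quote) that molecule counts are Poisson distributed: if a signal is carried by an average of N molecules, the fractional fluctuation in the count falls off as 1/sqrt(N), so small numbers of molecules set a floor on signaling precision.

import numpy as np

rng = np.random.default_rng(0)
for mean_count in (10, 100, 1000, 10000):
    # Draw many independent Poisson counts with this mean,
    # mimicking repeated "readings" of a molecular signal.
    counts = rng.poisson(mean_count, size=100_000)
    relative_noise = counts.std() / counts.mean()
    print(f"mean N ~ {mean_count:6d}   dN/N ~ {relative_noise:.4f}"
          f"   1/sqrt(N) = {1 / np.sqrt(mean_count):.4f}")

The printed relative noise tracks 1/sqrt(N), which is the sense in which the noise is irreducible: it does not go away no matter how carefully the cell "measures" the count, only by using more molecules.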
Our intuition from statistical mechanics suggests that the entropy S
measures Max’s uncertainty about what Allan will say in response to his
question, in the same way that the entropy of a gas measures our lack of
knowledge about the microstates of all the constituent molecules. Once Allan
gives his answer, all of this uncertainty is removed: one of the responses
occurred, corresponding to p = 1, and all the others did not, corresponding
to p = 0, so the entropy is reduced to zero. It is appealing to equate this
reduction in our uncertainty with the information we gain by hearing Allan’s
answer. Shannon proved that this is not just an interesting analogy; it is
the only definition of information that conforms to some simple constraints
[Shannon 1948]
http://www.princeton.edu/~wbialek/RUcourseF08/info1.pdf
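A small numerical illustration of the identification made in that passage, using a made-up distribution over Allan's possible answers (the probabilities are illustrative only, not from the notes): the entropy S = -sum_i p_i log2 p_i quantifies Max's uncertainty before the answer, and once one answer is known the entropy drops to zero, so the information gained equals the entropy that was removed.

import numpy as np

# Hypothetical probabilities for Allan's possible answers (illustrative only).
p = np.array([0.5, 0.25, 0.125, 0.125])

# Shannon entropy in bits: S = -sum_i p_i log2 p_i
S_before = -np.sum(p * np.log2(p))

# After hearing the answer, one outcome has p = 1 and the rest p = 0,
# so the remaining entropy is zero and the information gained is S_before.
S_after = 0.0
info_gained = S_before - S_after
print(f"entropy before answer: {S_before:.3f} bits")
print(f"information gained:    {info_gained:.3f} bits")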
4.2 Entropy lost and information gained
Returning to the conversation between Max and Allan, we assumed that
Max would receive a complete answer to his question, and hence that all
his uncertainty would be removed. This is an idealization, of course. The
more natural description is that, for example, the world can take on many
states W, and by observing data D we learn something but not everything
about W. Before we make our observations, we know only that states of the
world are chosen from some distribution P(W), and this distribution has
an entropy S(W). Once we observe some particular datum D, our (hopefully
improved) knowledge of W is described by the conditional distribution
P(W|D), and this has an entropy S(W|D) that is smaller than S(W) if we
have reduced our uncertainty about the state of the world by virtue of our
observations. We identify this reduction in entropy as the information that
we have gained about W.
http://www.princeton.edu/~wbialek/RUcourseF08/info2.pdf
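A sketch that makes the bookkeeping in this paragraph concrete, using an invented joint distribution P(W, D) (the numbers are illustrative, not from the notes): compute the prior entropy S(W), the conditional entropy S(W|D) for each possible datum, and the average reduction S(W) - <S(W|D)>, which is the average information the data provide about the state of the world (the mutual information).

import numpy as np

def entropy(p):
    """Shannon entropy in bits, ignoring zero-probability entries."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical joint distribution P(W, D): rows index world states W,
# columns index observed data D (numbers are illustrative only).
P_WD = np.array([[0.30, 0.05],
                 [0.10, 0.25],
                 [0.05, 0.25]])

P_W = P_WD.sum(axis=1)   # marginal over world states, P(W)
P_D = P_WD.sum(axis=0)   # marginal over data, P(D)

S_W = entropy(P_W)       # prior uncertainty about W

# Average conditional entropy <S(W|D)>, weighting each datum by P(D).
S_W_given_D = sum(P_D[d] * entropy(P_WD[:, d] / P_D[d])
                  for d in range(len(P_D)))

info = S_W - S_W_given_D  # average information gained about W
print(f"S(W)      = {S_W:.3f} bits")
print(f"<S(W|D)>  = {S_W_given_D:.3f} bits")
print(f"I(D;W)    = {info:.3f} bits")

For a single observed datum the reduction S(W) - S(W|D) can even be negative (a surprising datum can leave us more uncertain), but the average over P(D), computed above, is always non-negative.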