Christian Forums


[_ Old Earth _] Imprisoned scientists jailbreak offer??


MrVersatile48

Guest
From http://www.breakpoint.org

Featured Article

Patterns that Point to Design
Mark Earley


A Meaningful World

Note: This commentary was delivered by Prison Fellowship President Mark Earley.

Chuck Colson spoke yesterday about Benjamin Wiker and Jonathan Witt's interesting new book, A Meaningful World: How the Arts and Sciences Reveal the Genius of Nature.

Wiker and Witt's argument, as you may recall, is that the wonders of our world point to a universe full of meaning - meaning that, in turn, points to a Designer.

The uniqueness of their work is that they focus on the arts and sciences to make their case.

Chuck already touched on the book's argument about Shakespeare. The authors show that all efforts to force his works into a tidy little box have come up short.

Materialism and deconstructionism insist that all people and all literature are basically the same - that "everybody is a genius, so no one is."

Shakespeare's beautiful and profound plays and poetry stand as a living rebuke of such a despairing worldview.

Wiker and Witt write, "The playwright's themes pose a profound challenge for materialism. . . . But more fundamental still is the challenge Shakespeare's genius poses to any worldview that would reduce everything, including the human mind, to the mindless flux of matter and energy. . . . Are we really to believe that natural selection moved from a single cell in a dirty pond to this?"

But that's only the beginning. From exploring the genius of one human being, the authors move on to exploring another kind of genius.

Specifically, they go on to talk about geometry and chemistry.

But they're not talking about the intelligence of the human beings who came up with the Pythagorean Theorem and the periodic table. After all, these things weren't invented by humans.

What they're pointing to is the genius that created the orderly physical and mathematical laws that govern the whole universe. Without this intentional design, there would have been no Pythagorean Theorem or periodic table to discover.

The authors ask: if the world was born out of chance, how is it that nature acts according to rational laws?

If we're all here because of random and meaningless events, it doesn't make sense that, one, there are mathematical and scientific laws that govern our world, and, two, that our efforts could discover what those laws are.

We would be fumbling in the darkness of randomness, looking for explanations that didn't even exist.

But the universe is full of patterns - patterns that extend to the smallest particles of an atom, that can be seen in the orderliness of the periodic table of elements. Furthermore, they're patterns that the human mind could discover and comprehend.

How does random chance explain all that?

Materialists look at these natural laws and patterns and say that they allow a closed and godless universe to function properly. Wiker and Witt show that this is utterly "absurd." They give the proper respect to these natural laws and patterns by showing that there's nothing random or godless about them.

The universe is saturated with evidence of the Designer's existence.

The very mathematics and science - and literature - that lead some to posit a barren and meaningless existence are actually the very keys to discovering a universe full of intelligence and meaning.

--------------------------------------------------------------------------------------

Ever seen the famous painting of Jesus, standing at the heart's door & patiently knocking?

Inspired by Revelation 3:20

Revelation 3:20 (New International Version)


20 Here I am! I stand at the door and knock. If anyone hears my voice and opens the door, I will come in and eat with him, and he with me.


But time is running out, so don't let pride, prestige, popularity or possessions make you too late to answer the call at your heart now...

"NOW is the acceptable time: behold, NOW is the day of salvation"

Hundreds of top scientists - from microbiology to astronomy - have seen so much evidence of Intelligent Design that they reject the atheistic brainwashing of schools, colleges & universities

Realise such brainwashing has been planned at least since Aldous Huxley's classic, 'Brave New World' - his brother Julian was high up in UNESCO & got them to use the book as the model for 24x7 media brainwashing

Realise we have all the technology for Antichrist's worst-ever global tyranny - as in Daniel 7 & Revelation 13

Academia prides itself on freedom of thought, so such censorship of all criticism of neo-Darwin drivel, and of all teaching about Intelligent Design, is most sinister & dangerous indeed

See & spread the comprehensive menu of learned articles & books at http://www.discovery.org/csc

Back to link freedom of speech features...

http://www.christianforums.net/viewtopi ... highlight=

That links to others, that link to others

"Set the captives free: proclaim liberty to the oppressed!"

See Isaiah 61

Isaiah 61 (New International Version)

The Year of the LORD's Favor

1 The Spirit of the Sovereign LORD is on me,
because the LORD has anointed me
to preach good news to the poor.
He has sent me to bind up the brokenhearted,
to proclaim freedom for the captives
and release from darkness for the prisoners,
2 to proclaim the year of the LORD's favor
and the day of vengeance of our God,
to comfort all who mourn,

3 and provide for those who grieve in Zion -
to bestow on them a crown of beauty
instead of ashes,
the oil of gladness
instead of mourning,
and a garment of praise
instead of a spirit of despair.
They will be called oaks of righteousness,
a planting of the LORD
for the display of his splendor.

4 They will rebuild the ancient ruins
and restore the places long devastated;
they will renew the ruined cities
that have been devastated for generations.



Altogether now...

123...



Onward, Christian students!! :multi:



God bless!

Ian :-D
 
MrVersatile48 said:
If we're all here because of random and meaningless events, it doesn't make sense that, one, there are mathematical and scientific laws that govern our world, and, two, that our efforts could discover what those laws are.

We would be fumbling in the darkness of randomness, looking for explanations that didn't even exist.

But the universe is full of patterns - patterns that extend to the smallest particles of an atom, that can be seen in the orderliness of the periodic table of elements. Furthermore, they're patterns that the human mind could discover and comprehend.
I think this has several problems in it.

1. Randomness can lead to order in large numbers. For example, temperature is a measure of the average kinetic energy of a large number of atoms. Each atom is moving with a random speed, but there is order to all of this (see the sketch after this list).

2. Randomness can also build order in other ways. Look at how water molecules align themselves in a snowflake. Snowflakes form randomly, but they are also ordered.

3. If you are talking about universal laws, then to see the randomness, you have to see what can change. We don't really know this yet, so statements about the variability of physical laws should be treated with caution. However, nothing has discounted the concept that there are an infinite number of universes with different physical laws. Life will only spring up where it is possible.
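Here is a minimal sketch of point 1, assuming a made-up speed distribution and particle counts (none of this is from the original post): each particle's speed is random, yet the average kinetic energy of a large collection is stable.

# A minimal sketch: individual speeds are random, but the mean kinetic
# energy of many particles is stable. The Rayleigh speed distribution
# and all numbers here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
mass = 1.0  # arbitrary units

for n_particles in (10, 1_000, 100_000):
    speeds = rng.rayleigh(scale=1.0, size=n_particles)   # random speeds
    kinetic = 0.5 * mass * speeds**2
    mean_ke = kinetic.mean()
    spread = kinetic.std() / np.sqrt(n_particles)         # fluctuation of the mean
    print(f"N={n_particles:>7}  mean KE={mean_ke:.3f}  +/- {spread:.3f}")

The relative fluctuation of the mean shrinks roughly as 1/sqrt(N), which is the sense in which randomness in large numbers produces order.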
 
Try this:

Entropy. Ever heard it?

It's the proven concept that as time goes on, things get LESS and LESS orderly. It's chaos. Basically it takes more energy to re-order things than to disorder things. So if things, by evolutionary points of view, are getting more complex and orderly and all around amazing, then how does it account for entropy? It DOESN'T.
 
Basically it takes more energy to re-order things than to disorder things.
...and the overall entropy in the solar system does continually increase, there is no problem with that at all.

Moreover, evolution has been directly observed, including the step from single-celled organisms to multi-celled ones. If the second law of thermodynamics posed a problem for this, then that observation would actually falsify the second law of thermodynamics. But it is not a problem.

But that of course is a matter for another thread...
 
The same argument would rule out a baby growing into an adult. Clearly, a lot of creationists have no grasp on the concept of entropy.
 
Clearly, a lot of creationists have no grasp on the concept of entropy.

From an individual who equates information with entropy. :wink:


(Barbarian demonstrates how a new allele increases information by increasing uncertainty)

Yep, but you need to be reminded, it seems. Let's get you a definition of "information" that can actually be tested...

The quantity which uniquely meets the natural requirements that one sets up for information turns out to be exactly that which is known in thermodynamics as entropy. It is expressed in terms of the various probabilities involved--those of getting to certain stages in the process of forming messages, and the probabilities that, when in those stages, certain symbols be chosen next. The formula, moreover, involves the logarithm of probabilities, so that it is a natural generalization of the logarithmic measure spoken of above in connection with simple cases.

To those who have studied the physical sciences, it is most significant that an entropy-like expression appears in the theory as a measure of information. Introduced by Clausius nearly one hundred years ago, closely associated with the name of Boltzmann, and given deep meaning by Gibbs in his classic work on statistical mechanics, entropy has become so basic and pervasive a concept that Eddington remarks, "The law that entropy always increases - the second law of thermodynamics - holds, (Charlie: This is true) I think, the supreme position among the laws of Nature."
http://www.uoregon.edu/~felsing/virtual_asia/info.html

Now, you may not like that definition, but it turns out to be the right one.
It works. This understanding of information allows us to measure information in things like genetics, data transmission, and so on, and it tells us how to make low-powered transmitters that work over millions of miles, and how to stuff the maximum amount of reliable messages into a data channel.

Given that Shannon's definition works, and yours does not, I don't think you're going to have much luck in convincing engineers and scientists to go with yours.

Barbarian on why a new allele produces new information:
Yep. All it has to be, is different than the other four. Before we go any further, perhaps you should tell us what you think "information" is, so we can clear up any misunderstandings.

Charlie:

Information does not = entropy.


Barbarian:

Yep, it does. See above.

Charlie:

Information is always a measure of the decrease of uncertainty at a receiver (or molecular machine).

Barbarian:

The quantity which uniquely meets the natural requirements that one sets up for information turns out to be exactly that which is known in thermodynamics as entropy.

Quote:
You're equating randomness with information...that's wrong.


You're confusing entropy with randomness. That's wrong. A system at maximum entropy cannot have any randomness whatever, since that would decrease entropy. Remember, in a physical sense, entropy is the lack of any useful heat to do work. And if the thermal energy in a system is randomly distributed, it is not evenly distributed, and therefore heat continues to flow.

Barbarian asks:
So, you're saying that a new allele can't appear in a population? That's demonstrably false. Would you like some examples?

Quote:
I never claimed that new "alleles" don't appear in populations, but I maintain they're mutations, which degrade the original information content.

Barbarian:

As you just learned, a new allele increases information. Always does. This is what Shannon was pointing out. Information is the measure of uncertainty in a message. The entropy, in other words.
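As a minimal sketch of the kind of Shannon calculation described here (the allele frequencies below are hypothetical, chosen only to illustrate that the entropy of the frequency distribution rises when a fifth allele appears):

# Hypothetical allele frequencies; only the formula H = -sum(p * log2 p)
# comes from the thread. The entropy rises when a new, rare allele appears.
import math

def shannon_bits(freqs):
    """Shannon entropy in bits over the nonzero frequencies."""
    return -sum(p * math.log2(p) for p in freqs if p > 0)

before = [0.25, 0.25, 0.25, 0.25]          # four alleles
after  = [0.24, 0.24, 0.24, 0.24, 0.04]    # a fifth, rare allele appears

print(f"H before: {shannon_bits(before):.3f} bits")   # 2.000
print(f"H after:  {shannon_bits(after):.3f} bits")    # about 2.162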

Charlie:

Again, you're confusing randomness with information. That's wrong.


Barbarian:

No. Rather, you've conflated entropy and randomness.

Quote:
And how do you know that "allele" is new?


Barbarian observes:
We can sometimes know, by tracing down the individual in which the mutation occurred.

Quote:
Right...a mutation.


Remember, a mutation is not an allele.

Barbarian observes:

Yes, he does. Again, you probably don't know what "information" means in population genetics and information science.

No. The second one has more information, because the uncertainty is increased.

Charlie:

Information does not = entropy.


Barbarian:

If that were so, you wouldn't be communicating over this line. Engineers built that system, with the understanding that information = entropy. That works. Yours doesn't.

Charlie:

By stating information = randomness, you are demonstrating that you don't have a grip on information theory.

Barbarian:

See above. Entropy is not randomness.


Barbarian:

One more time, just so you remember:

The quantity which uniquely meets the natural requirements that one sets up for information turns out to be exactly that which is known in thermodynamics as entropy.
_________________
"Beware of gifts bearing Greeks." - Laocoon



Information Is Not Entropy,
Information Is Not Uncertainty!

Dr. Thomas D. Schneider
National Institutes of Health
National Cancer Institute
Center for Cancer Research Nanobiology Program
Molecular Information Theory Group
Frederick, Maryland 21702-1201
toms@ncifcrf.gov
http://www.ccrnp.ncifcrf.gov/~toms/

There are many many statements in the literature which say that information is the same as entropy. The reason for this was told by Tribus. The story goes that Shannon didn't know what to call his measure so he asked von Neumann, who said "You should call it entropy ... [since] ... no one knows what entropy really is, so in a debate you will always have the advantage" (Tribus 1971).

Shannon called his measure not only the entropy but also the "uncertainty". I prefer this term because it does not have physical units associated with it. If you correlate information with uncertainty, then you get into deep trouble. Suppose that:

information ~ uncertainty

but since they have almost identical formulae:
uncertainty ~ physical entropy
so

information ~ physical entropy

BUT as a system gets more random, its entropy goes up:

randomness ~ physical entropy

so

information ~ physical randomness

How could that be? Information is the very opposite of randomness!

The confusion comes from neglecting to do a subtraction:

Information is always a measure of the decrease of uncertainty at a receiver (or molecular machine).


If you use this definition, it will clarify all the confusion in the literature.

Note: Shannon understood this distinction and called the uncertainty which is subtracted the 'equivocation'. Shannon (1948) said on page 20:
R = H(x) - Hy(x)

"The conditional entropy Hy(x) will, for convenience, be called the equivocation. It measures the average ambiguity of the received signal."

The mistake is almost always made by people who are not actually trying to use the measure.

http://www.lecb.ncifcrf.gov/~toms/infor ... ainty.html


I'm Confused: How Could Information Equal Entropy?

If someone says that information = uncertainty = entropy, then they are confused, or something was not stated that should have been. Those equalities lead to a contradiction, since entropy of a system increases as the system becomes more disordered. So information corresponds to disorder according to this confusion.

If you always take information to be a decrease in uncertainty at the receiver, you will get straightened out:

R = Hbefore - Hafter.


where H is the Shannon uncertainty:

H = - sum (from i = 1 to number of symbols) Pi log2 Pi (bits per symbol)

and Pi is the probability of the ith symbol. If you don't understand this, please refer to "Is There a Quick Introduction to Information Theory Somewhere?".

Imagine that we are in communication and that we have agreed on an alphabet. Before I send you a bunch of characters, you are uncertain (Hbefore) as to what I'm about to send. After you receive a character, your uncertainty goes down (to Hafter). Hafter is never zero because of noise in the communication system. Your decrease in uncertainty is the information (R) that you gain.

Since Hbefore and Hafter are state functions, this makes R a function of state. It allows you to lose information (it's called forgetting). You can put information into a computer and then remove it in a cycle.

Many of the statements in the early literature assumed a noiseless channel, so the uncertainty after receipt is zero (Hafter=0). This leads to the SPECIAL CASE where R = Hbefore. But Hbefore is NOT "the uncertainty", it is the uncertainty of the receiver BEFORE RECEIVING THE MESSAGE.
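As a rough illustration of the two quantities above (only H = -sum Pi log2 Pi and R = Hbefore - Hafter come from the FAQ; the symbol probabilities below are assumptions):

# Made-up probabilities for a four-symbol alphabet, before and after receipt.
import math

def H(probs):
    """Shannon uncertainty in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Before receipt: four symbols, equally likely -> maximal uncertainty.
H_before = H([0.25, 0.25, 0.25, 0.25])          # 2 bits

# After receipt over a noisy channel: some residual ambiguity remains.
H_after_noisy = H([0.91, 0.03, 0.03, 0.03])      # about 0.58 bits
# After receipt over a noiseless channel: no ambiguity at all.
H_after_clean = 0.0

print(f"R (noisy)     = {H_before - H_after_noisy:.2f} bits gained")
print(f"R (noiseless) = {H_before - H_after_clean:.2f} bits gained")   # R = Hbefore

In the noiseless special case Hafter is zero, so R collapses to Hbefore, which is exactly the special case the FAQ warns about.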

A way to see this is to work out the information in a bunch of DNA binding sites.

Definition of "binding": many proteins stick to certain special spots on DNA to control genes by turning them on or off. The only thing that distinguishes one spot from another spot is the pattern of letters (nucleotide bases) there. How much information is required to define this pattern?

Here is an aligned listing of the binding sites for the cI and cro proteins of the bacteriophage (i.e., virus) named lambda:

alist 5.66 aligned listing of:
* 96/10/08 19:47:44, 96/10/08 19:31:56, lambda cI/cro sites
piece names from:
* 96/10/08 19:47:44, 96/10/08 19:31:56, lambda cI/cro sites
The alignment is by delila instructions
The book is from: -101 to 100
This alist list is from: -15 to 15

------ ++++++
111111--------- +++++++++111111
5432109876543210123456789012345
...............................
OL1 J02459 35599 + 1 tgctcagtatcaccgccagtggtatttatgt
J02459 35599 - 2 acataaataccactggcggtgatactgagca
OL2 J02459 35623 + 3 tttatgtcaacaccgccagagataatttatc
J02459 35623 - 4 gataaattatctctggcggtgttgacataaa
OL3 J02459 35643 + 5 gataatttatcaccgcagatggttatctgta
J02459 35643 - 6 tacagataaccatctgcggtgataaattatc
OR3 J02459 37959 + 7 ttaaatctatcaccgcaagggataaatatct
J02459 37959 - 8 agatatttatcccttgcggtgatagatttaa
OR2 J02459 37982 + 9 aaatatctaacaccgtgcgtgttgactattt
J02459 37982 - 10 aaatagtcaacacgcacggtgttagatattt
OR1 J02459 38006 + 11 actattttacctctggcggtgataatggttg
J02459 38006 - 12 caaccattatcaccgccagaggtaaaatagt
^

Each horizontal line represents a DNA sequence, starting with the 5' end on the left, and proceeding to the 3' end on the right. The first sequence begins with: 5' tgctcag ... and ends with ... tttatgt 3'. Each of these twelve sequences is recognized by the lambda repressor protein (called cI) and also by the lambda cro protein.

What makes these sequences special so that these proteins like to stick to them? Clearly there must be a pattern of some kind.

Read the numbers on the top vertically. This is called a "numbar". Notice that position +7 always has a T (marked with the ^). That is, according to this rather limited data set, one or both of the proteins that bind here always require a T at that spot. Since the frequency of T is 1 and the frequencies of other bases there are 0, H(+7) = 0 bits. But that makes no sense whatsoever! This is a position where the protein requires information to be there.

That is, what is really happening is that the protein has two states. In the BEFORE state, it is somewhere on the DNA, and is able to probe all 4 possible bases. Thus the uncertainty before binding is Hbefore = log2(4) = 2 bits. In the AFTER state, the protein has bound and the uncertainty is lower: Hafter(+7) = 0 bits. The information content, or sequence conservation, of the position is Rsequence(+7) = Hbefore - Hafter = 2 bits. That is a sensible answer. Notice that this gives Rsequence close to zero outside the sites.
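The per-position calculation can be sketched directly on the twelve aligned sites from the listing above. This sketch ignores the small-sample correction used in the published method, so it only illustrates the idea Rsequence = Hbefore - Hafter:

# Rsequence per position for the lambda cI/cro sites listed above.
# Small-sample correction omitted; this is only an illustration.
import math
from collections import Counter

sites = [
    "tgctcagtatcaccgccagtggtatttatgt",
    "acataaataccactggcggtgatactgagca",
    "tttatgtcaacaccgccagagataatttatc",
    "gataaattatctctggcggtgttgacataaa",
    "gataatttatcaccgcagatggttatctgta",
    "tacagataaccatctgcggtgataaattatc",
    "ttaaatctatcaccgcaagggataaatatct",
    "agatatttatcccttgcggtgatagatttaa",
    "aaatatctaacaccgtgcgtgttgactattt",
    "aaatagtcaacacgcacggtgttagatattt",
    "actattttacctctggcggtgataatggttg",
    "caaccattatcaccgccagaggtaaaatagt",
]

H_BEFORE = 2.0  # log2(4): before binding the protein could see a, c, g or t

for offset in range(len(sites[0])):
    position = offset - 15                      # the listing runs -15 .. +15
    counts = Counter(s[offset] for s in sites)
    freqs = [n / len(sites) for n in counts.values()]
    h_after = -sum(p * math.log2(p) for p in freqs)
    r_seq = H_BEFORE - h_after
    if position == 7:                           # the column marked with ^
        print(f"position +7: Hafter = {h_after:.2f} bits, Rsequence = {r_seq:.2f} bits")

Running this prints Hafter = 0.00 bits and Rsequence = 2.00 bits for the column marked with the ^, matching the 2-bit answer in the text.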

If you have uncertainty and information and entropy confused, I don't think you would be able to work through this problem. For one thing, one would get high information OUTSIDE the sites. Some people have published graphs like this.

A nice way to display binding site data so you can see them and grasp their meaning rapidly is by the sequence logo method. The sequence logo for the example above is at http://www.lecb.ncifcrf.gov/~toms/galle ... i.fig1.gif. More information on sequence logos is in the section What are Sequence Logos?

More information about the theory of BEFORE and AFTER states is given in the papers http://www.lecb.ncifcrf.gov/~toms/paper/nano2 , http://www.lecb.ncifcrf.gov/~toms/paper/ccmm and http://www.lecb.ncifcrf.gov/~toms/paper/edmm.

http://www.ccrnp.ncifcrf.gov/~toms/bion ... al.Entropy
Barbarian:


Charlie apparently doesn't know what "information" is, much less how to calculate it.

I'm intrigued by the idea that increasing information is a "decrease" in information.

Let's turn this one backwards, Charlie. Suppose God magically created five alleles, instead of doing it the way He did. Then suppose one allele disappears in a population, and there isn't any more.

Is that then a gain in information? If not, why is a new allele not an increase in information?

I have to say, Charlie, it looks like you're just chanting phrases they taught you in YE indoctrination.



All right guys I'm hitting the sack. I'll catch up with you next weekend.

Peace 8-)
 
The difference between Charlie's private definition of "information" and that of Shannon (of Bell Labs) is that Shannon's actually works. Hence, the pipes that deliver signals across the internet do so more efficiently because engineers use algorithms based on information as entropy.

Signals from very weak radio sources are accurately received from across millions of miles of space because engineers realize that information is entropy.

That kind of information reflects the real world. And, Shannon showed, it is the result of uncertainty.
 
The difference between Charlie's private definition of "information" and that of Shannon (of Bell Labs) is that Shannon's actually works. Hence, the pipes that deliver signals across the internet do so more efficiently because engineers use algorithms based on information as entropy.

Signals from very weak radio sources are accurately received from across millions of miles of space because engineers realize that information is entropy.

That kind of information reflects the real world. And, Shannon showed, it is the result of uncertainty.

The point I'm trying to make Barb, is Shannon defined information as uncertainty after receipt. The lower the uncertainty, the greater the information. So the lower the entropy, the greater the information:

Information Is Not Entropy,
Information Is Not Uncertainty!


Information is always a measure of the decrease of uncertainty at a receiver (or molecular machine).

R = H(x) - Hy(x)

The conditional entropy Hy(x) will, for convenience, be called the equivocation. It measures the average ambiguity of the received signal.

http://www.lecb.ncifcrf.gov/~toms/infor ... ainty.html
 
The point I'm trying to make Barb, is Shannon defined information as uncertainty after receipt.
That website talks about it as the change of uncertainty due to the receipt. That change is maximal when the received message has maximal entropy.

From http://www.mdpi.org/entropy/htm/e7010068.htm, which is referenced by the website which you referenced:
Entropy measures lack of information; it also measures information. These two conceptions are complementary.
 
That website talks about it as the change of uncertainty due to the receipt. That change is maximal when the received message has maximal entropy.

From http://www.mdpi.org/entropy/htm/e7010068.htm, which is referenced by the website which you referenced:
Quote:
Entropy measures lack of information; it also measures information. These two conceptions are complementary.

Information is maximized when uncertainty (entropy), after receipt, is zero.

Entropy measures lack of information. The higher the entropy, the less information. In its inverse, entropy measures information.


Catch up with you over the weekend, bro.
 
So tell us, Charlie, why do communications engineers use Shannon's definition of entropy and information, and not yours?

I think you already know why, don't you?
 
So tell us, Charlie, why do communications engineers use Shannon's definition of entropy and information, and not yours?

I think you already know why, don't you?

All you're doing is strengthening my position. I am using Shannon's theory, by the book, to support my argument. You need to study it, Bro...you still don't understand it, evidenced by your continual comments.
 
I am using Shannon's theory, by the book, to support my argument.
Actually the barbarian did that with the allele calculation. You have yet to show any calculation which does not beg the question by presupposing a zero binding specificity which supports your position.

There was once the question whether you would consider the emergence of a new binding spot which results in a new feature in an organism to be an increase of information.

Well...actually once you said this:
I contend information has been lost, because the new binding spots are mutations.
So you're saying that it would be a loss of information because it came to be by mutation - apparently you don't even care about what it does. Is that correct?
 
Charlie's a little put out that Shannon showed that entropy is information in a message.

So he's just denying what Shannon said.
 
Must talk to my dentist about entropy

I can just hear him sing..

Altogether now...

123...


'Change & decay in all around I see..

O, Thou who changest not, abide with me'


This new book review may help a few folk to break out of the prison of doubt:- :wink:

http://www.christianforums.net/viewtopi ... 288#301288

God bless!

Ian :-D
 
Charlie's a little put out that Shannon showed that entropy is information in a message.

So he's just denying what Shannon said.

Explain to me what Shannon referred to as the "equivocation", and how it affects the information, R, in the equation:

R = H(x) - Hy(x)

This is Shannon 101.

And for the allele statistical data, explain to me how this data can instruct molecules to build proteins.

Until you can do that, Barb, there's no use continuing our debate. I'm debating against someone that doesn't understand the theory we're debating...

There's a saying in science: when you’re in a hole, stop digging...you just get deeper.

Cheers, and I hope you find wisdom in life's journey.
 
From Claude Shannon's "A Mathematical Theory of Communication":

In Appendix 2, the following result is established:
Theorem 2: The only H satisfying the three above assumptions is of the form:

H = -K sum (from i = 1 to n) pi log pi

This theorem, and the assumptions required for its proof, are in no way necessary for the present theory. It is given chiefly to lend a certain plausibility to some of our later definitions. The real justification of these definitions, however, will reside in their implications. Quantities of the form H = -Σ pi log pi (the constant K merely amounts to a choice of a unit of measure) play a central role in information theory as measures of information, choice and uncertainty.

The form of H will be recognized as that of entropy as defined in certain formulations of statistical mechanics, where pi is the probability of a system being in cell i of its phase space. H is then, for example, the H in Boltzmann's famous H theorem. We shall call H = -Σ pi log pi the entropy of the set of probabilities p1, ..., pn. If x is a chance variable we will write H(x) for its entropy; thus x is not an argument of a function but a label for a number, to differentiate it from H(y), say, the entropy of the chance variable y. The entropy in the case of two possibilities with probabilities p and q = 1 - p, namely

H = -(p log p + q log q) ...

From our previous discussion of entropy as a measure of uncertainty it seems reasonable to use the conditional entropy of the message, knowing the received signal, as a measure of this missing information. This is indeed the proper definition, as we shall see later. Following this idea, the rate of actual transmission, R, would be obtained by subtracting from the rate of production (i.e., the entropy of the source) the average rate of conditional entropy.

R = H(x) - Hy(x)

The conditional entropy Hy(x) will, for convenience, be called the equivocation. It measures the average ambiguity of the received signal.
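A small numeric illustration of the equivocation term, assuming a binary symmetric channel with equally likely 0/1 input and made-up error rates (this channel model is an assumption, not Shannon's worked example): the equivocation is the binary entropy of the error rate, and R is what remains.

# Equivocation Hy(x) and rate R = H(x) - Hy(x) for an assumed binary
# symmetric channel with uniform input and a few made-up error rates.
import math

def H2(p):
    """Binary entropy -(p log p + q log q), q = 1 - p, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    q = 1.0 - p
    return -(p * math.log2(p) + q * math.log2(q))

H_x = 1.0                          # one bit per symbol produced at the source
for error_rate in (0.0, 0.01, 0.11):
    equivocation = H2(error_rate)  # Hy(x): remaining ambiguity about what was sent
    R = H_x - equivocation         # rate of actual transmission
    print(f"error={error_rate:.2f}  equivocation={equivocation:.3f}  R={R:.3f} bits/symbol")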


So as you see, Shannon showed how information (H) is a measure of entropy in a message.

And for the allele statistical data, explain to me how this data can instruct molecules to build proteins.

Any change that produces a new allele (a mutation) will produce a novel protein, unless it's a synonymous point mutation, one whose changed triplet still codes for the same amino acid. Otherwise, the mutation will result in a different protein.
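A tiny sketch of that distinction, using a handful of codons from the standard genetic code (only these few entries are shown; the helper is hypothetical):

# Synonymous vs. non-synonymous point mutations, standard genetic code.
CODON_TABLE = {
    "GAA": "Glu", "GAG": "Glu",   # both code for glutamate
    "GTA": "Val", "GCA": "Ala",
}

def effect(old_codon, new_codon):
    before, after = CODON_TABLE[old_codon], CODON_TABLE[new_codon]
    kind = "synonymous (same protein)" if before == after else "non-synonymous (different protein)"
    return f"{old_codon} -> {new_codon}: {before} -> {after}, {kind}"

print(effect("GAA", "GAG"))   # synonymous point mutation
print(effect("GAA", "GTA"))   # changes the encoded amino acid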

Until you can do that, Barb, there's no use continuing our debate.

Done.

I'm debating against someone that doesn't understand the theory we're debating...

Well, perhaps you'll think differently, knowing what Shannon thought about it.

There's a saying in science: when you’re in a hole, stop digging...you just get deeper.

Nope. The point in science is to get deeper and deeper. That's how it works. We never stop refining it.

Again, you may not like the way Shannon defined information, but fact is, it works. And that's what counts in science.
 
So as you see, Shannon showed how information (H) is a measure of entropy in a message.

Explain where Shannon showed information was entropy. That's complete rubbish. You're trying to equate randomness and information. Shannon did, however, propose that as entropy is decreased, information is increased. Again, ask your parents about it...or one of your stats teachers, if you're in college yet. From what I've read on this forum so far, these guys defending evolution are ignorant of the facts...so I certainly wouldn't ask them.

Makes for good humor though...haven't laughed this hard in a while. :tongue

The thread about Neanderthals not being related to humans was completely laughable...I was rolling!!! :tongue
 
