
[Old Earth] When did humans separate from apes?

I also see/hear this a lot: how close we are to chimps because of the chromosomes. As humans we have 46 chromosomes while a chimp has 48 chromosomes. If it was just those 2 chromosomes that made the difference, I could almost see a reason to think about it. But the fact is these 2 chromosomes make us worlds apart, in the millions. We should also be worried about tobacco, because it has 48 chromosomes... I don't hear the evolution community comparing us to tobacco. Interesting? :confused
 
Barb wrote:

"...Actually, totally chaotic systems have more information that highly ordered ones. Do you understand what "information" is?.."

You've got it exactly backwards:

Information Is Not Entropy,
Information Is Not Uncertainty!


Dr. Thomas D. Schneider
National Institutes of Health
National Cancer Institute
Center for Cancer Research Nanobiology Program
Molecular Information Theory Group


http://www.lecb.ncifcrf.gov/~toms/infor ... ainty.html

Cite a peer-reviewed report that cites increasing chaos = increasing information, as far as DNA goes.

Barb wrote:

"...Do you understand what "information" is?.."
 
I see a lot of rebuttal, but I don't see any substance.

Evidence is not what most creationists accept. But that's how science works.

Three Israeli scientists have reported in the most recent issue of the Proceedings of the National Academy of Sciences that Au. afarensis may not be our ancestor at all. It all hinges on the jaw of these creatures. Alas, Au. afarensis has a lower jaw bone (mandible) that closely resembles that of a gorilla, not that of a human or even a chimp. The scientists conclude that this "casts doubt on the role of Au. afarensis as a modern human ancestor."

Most likely afarensis is close to the line that led to us, but not directly on it. That's been known for some time. However...

[Image: Australopithecus afarensis ("Lucy") lower jaw]

[Image: Lower jaw of Pliopithecus antiquus (Miocene)]

[Image: Human%20Evolution123_big.jpg]


Which do you think is most like a human? Which is most like an ape? What about the remaining one seems intermediate? Let us know.

This should not come as a huge surprise, since even Donald Johanson, the discoverer of the first Au. afarensis, "Lucy," conceded that its V-shaped mandible was very ape-like, and certainly nothing like that of a human.

I doubt that very much. As you see, two of them lack the "simian shelf" that internally strengthens the ape jaw, having instead the external bracing of a chin. Note also that the dentition of two of them is quite different from the third's. It's an interesting story you have to tell, but as you see, the evidence doesn't support it.

Barbarian chuckles:
You've confused the material with the plans. Why don't you revise this and try again?

No I haven't.

Yes, you have.

All life has a DNA structure.

But it isn't DNA that counts. It's the similarity of DNA that determines relationships, just as we use it for paternity and other human relationships.

The plans are certainly not the same, unless your arms reach your knees and you're covered in hair, Mr. Barbarian. ;)

Being of northern European ancestry, I have a few Neandertal traits, but the genetic evidence shows Neandertals were about as far removed from chimps as we are. They are our closest non-human relatives.

Barbarian observes:
Actually, totally chaotic systems have more information than highly ordered ones. Do you understand what "information" is?

I have little doubt that when you state "totally chaotic systems have more information than highly ordered ones" you will refer to things such as snowflakes, stalactites, sand dunes, etc...

No. Daily weather, stock market fluctuations over short runs, turbulent systems, and so on.

However, these are examples of patterns. These fall under the scientific category of "chaos and fractals" and are very well understood, and are experienced every day.

None of the things you mentioned are "chaos and fractals." I spent a good amount of time in graduate school on such issues, and you don't know much about them.

The information I am referring to is much more complicated and relevant to this discussion in relation to creation versus evolution.
Consider codes, for example... symbolic codes such as music, languages, computer programs, blueprints and DNA.

Chaos can produce patterns; it cannot produce a code or symbols my friend. If it can, then please give any examples.

The most recent result was a DNA code that gave the descendants of the man who had the mutation rather good resistance to hardening of the arteries. Another is the observed evolution of a new, irreducibly complex enzyme system in bacteria. Another is the frameshift mutation that gave bacteria the ability to metabolize nylon oligomers.

Information is a separate entity on an equal level with matter and energy.

No. Information is an epiphenomenon of matter and energy.

Now let’s look at DNA.
Is DNA a pattern? Well DNA is not simply a molecule with a pattern; it is a code, a language and an information storage mechanism.

And we know that random mutations to the code can produce beneficial changes. Which is all that evolution needs.

Every code known to mankind is created by a conscious mind.

Since the universe was created by a conscious mind, even natural things like DNA fit that category.

Therefore DNA is designed by a mind, and language and information are proof of the action of God.

It is highly disrespectful to suggest God "designs" like a mere creature. He is the Creator.

Barbarian observes:
Same old DNA. You see evolution doesn't proceed by making totally new things. It's always a modification of existing things.

OK, so please explain how a so called simple single cell which has no DNA information for an arm for example could modify in order to accomplish this task.

First, you'd need a colonial animal like a sponge. But since sponge cells very closely resemble single-celled organisms, that's not much of a change. Then you'd need to have some way of differentiation; HOX genes, clearly modified versions of genes found in organisms without tissues, do that. And then, limb buds can be mere bumps and slowly modify into appendages as they become useful. Would you like to learn the evidence for this?

With respect, you really don’t have a good grasp on the theory of genetic evolution...at least not macro evolution.

I've spent about 50 years in the study of biology. Boy, did you get a wrong number.

No one is arguing against micro evolution. We see micro evolution happening all the time.

And macroevolution from time to time. Did you know that most "creation scientists" now admit that new species evolve?

However, macro evolution (Darwinian evolution) is not found in the fossils or in DNA. Every example of macro evolution to date has either been an outright fraud or has turned out to be a false alarm.
Take the so-called whale evolution.

I can remember a time when creationists declared that the discovery of a whale with functional legs would make them into evolutionists. Then we found some. Their response? "Can't be a whale, whales don't have legs." That's the kind of dishonesty we see among them.

Evolutionist artists took some sparse bone fragments and, in their extreme desire to "prove" their preconceptions and biases, they "created" some extremely important MISSING bones (bones which were never part of the actual animal's anatomy, because it was never a transitional animal at all) which conveniently enabled their imaginary transitional animal to swim in a way that was most definitely not possible without the specific artistically added parts! That is desperate at best and fraudulent in reality.

Well, let's take a look...

[Image: Ambulocetus (ambulocetus2.jpg)]


Surprise.

The most amazing thing about Darwinian evolution, which is completely ignored by evolutionists or simply unknown, is that if we take Darwinian evolution as fact, for the sake of debate, then it would be a fact that the "transitional" intermediate fossils would constitute the vast majority of the fossil record. But as the ardent and well-known late evolution superstar Stephen J. Gould stated,

It would, if evolution was generally slow and gradual. But the evidence shows that it is more often brief periods of change, in response to environmental change, followed by long periods of stasis.

(a bit of quote mining now)

The extreme rarity of transitional forms in the fossil record persists as the trade secret of palaeontology. The evolutionary trees that adorn our textbooks have data only at the tips and nodes of their branches; the rest is inference, however reasonable, not the evidence of fossils.... We fancy ourselves as the only true students of life's history, yet to preserve our favoured account of evolution by natural selection we view our data as so bad that we never see the very process we profess to study.

- Stephen J. Gould, "Evolution's Erratic Pace," Natural History, vol. 86 (May 1977), p. 14.


And Gould's comment on such edited fakery:

"Since we proposed punctuated equilibria to explain trends, it is infuriating to be quoted again and again by creationists—whether through design or stupidity, I do not know—as admitting that the fossil record includes no transitional forms. Transitional forms are generally lacking at the species level, but they are abundant between larger groups."
--Stephen Jay Gould, Evolution as Fact and Theory, Hen's Teeth and Horse's Toes: Further Reflections in Natural History, New York: W. W. Norton & Company, 1994, p. 260

Now, I'm pretty sure you aren't the one who doctored together that quote. It's a commonly used one, by the professional creationists, who depend on ignorance to sell it. But it should teach you to be cautious about believing them.

Now I know evolutionists will become enraged at this quote and will scream out the inevitable defence mantra "quote mining!"

In fact, Gould himself takes you to task for that misrepresentation.

The fact is that this quote is exactly as stated and reflects one of the most devastating "trade secrets" of evolution theory... the graduated transitional fossils are not there, my friends.

Notice that Gould says they are. In fact, he also says that there are a number of examples of slow and gradual evolution, citing horses, ammonites, and others.

Barbarian asks:
What materials would you find in a high rise that are entirely absent from a bungalow? More importantly, what structure do you find in a human that could not have evolved from something simpler?

I’ll answer your question with another question.

A straight answer will be fine, thank you. If you can't do that, we'll note it and go on.

You can order multiple pieces of any material. You can attach any materials together. This is what you claim happens in evolution...

No. It's what they told you the theory says. The fact that we don't see "attaching any materials together" is another reason the theory is accepted.

So you can modify the material, but you cannot order any new materials that are not already in the original bungalow. You can only use the tools and equipment that were used to build the bungalow. For example, if you can build a high-rise super crane from a Bobcat or forklift, then go ahead, my friend.

Again, you're supposing that's what evolutionary theory says. As Everett Dirksen used to say, you're down on the stuff you aren't up on.

As for the biological question: you are aware that no scientist has ever created life in the lab, right?

The theory isn't about the origin of life. Darwin, for example, thought God just created the first living things.

Isn't it odd that evolutionists believe a fluke, random, non-intelligent accident produced everything...

It's odd you actually think that's what they say.

Barbarian observes:
Actually, that's deeply wrong. Gene duplication, followed by mutation seems to be a most important part of evolution. Would you like to learn about it?

Really!!??

Yep. For example, that enzyme against hardening of the arteries came about that way.

Wow that’s amazing because scientists across the board know this to be fact.

Show us.

While natural selection and beneficial mutations "may increase an organism's adaptation," no one has ever been able to point to a mutation that has actually improved the genetic code by adding new meaningful information (new genes or "instructions" for building a new physical trait).

The enzyme is both a trait and physical. So you're wrong again.

All mutations appear to scramble the already-existing information (instructions), either by the reshuffling or duplication of existing genes, or simply by damaging the genes altogether.

Yep. Most do very little. A few are harmful. A very few are useful. Natural selection sorts it out.

It's not surprising that one of the most well-known evolutionists openly criticized the traditional neo-Darwinian theory of evolution. On the faculties of Harvard and New York University, the late Stephen Jay Gould was the author of over 15 books on scientific topics and contributed monthly essays to the periodical Natural History beginning in January 1974. His essays have also appeared in other scientific periodicals and his work can be found quoted in educational textbooks at all levels. He wrote that although he had been "beguiled" by the unifying power of neo-Darwinism when he studied it as a graduate student in the 1960s, the weight of the evidence pushed him to the reluctant conclusion that neo-Darwinism "as a general proposition, is effectively dead, despite its persistence as textbook orthodoxy."

Darwin was wrong about some things. Neo-Darwinists more accurately described evolution, the New Synthesis was even more accurate, and now, with Punctuated Equilibrium, still more accurate. If you check chemistry, you'll find the same process. It doesn't mean Dalton was wrong to infer atoms and atomic mass; it just means we know more about it now, even if Dalton's theory is dead.

Today, there is a growing realization that the presently accepted concept of natural selection and mutations really explains nothing of evolutionary significance.

Even most non-scientists would know that your claim is false.

One leading creationist summarized the situation well: "All of our real world experience, especially in today's 'information age,' would indicate that to rely on accidental copying mistakes to generate real information is the stuff of wishful thinking, not science."

And yet, such processes produced a new, irreducibly complex enzyme system. Maybe your "leading creationist" should have paid attention in class, um?

In everyday experience, information never arises without an intelligent source.

So you think there's an intelligence in hurricanes? Information arises there, and we can detect it. And it's pretty easy to show that every mutation adds information to a population. Would you like to see the numbers?
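A rough sketch of the kind of numbers being referred to (Python; the allele frequencies are made up purely for illustration): a new mutant allele raises the Shannon entropy of the population's allele-frequency distribution, i.e. the population carries more information afterward.

```python
import math

def shannon_entropy(freqs):
    """Shannon entropy H = -sum(p * log2(p)) in bits, skipping zero frequencies."""
    return -sum(p * math.log2(p) for p in freqs if p > 0)

# Hypothetical allele frequencies at a single locus (numbers made up for illustration).
before = [0.7, 0.3]           # two alleles in the population
after = [0.7, 0.29, 0.01]     # a mutation introduces a rare third allele

print(round(shannon_entropy(before), 3))  # ~0.881 bits
print(round(shannon_entropy(after), 3))   # ~0.945 bits: the population carries more information
```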
 
Barbarian observes:
Actually, totally chaotic systems have more information than highly ordered ones. Do you understand what "information" is?

You've got it exactly backwards:

Well, let's take a look...

From a PhD in communications/information:
Information is a measure of uncertainty, or entropy, in a situation. The greater the uncertainty, the more the information. When a situation is completely predictable, no information is present.
http://www.shkaminski.com/Classes/Hando ... Models.htm

And...

The expressions for the two entropies are very similar. The information entropy H for equal probabilities pi = p is:

H = K ln(1/p)

where K is a constant which determines the units of entropy. For example, if the units are bits, then K = 1/ln(2). The thermodynamic entropy S, from a statistical mechanical point of view, was first expressed by Boltzmann:

S = k ln(1/p)

where p is the probability of a system being in a particular microstate, given that it is in a particular macrostate, and k is Boltzmann's constant. It can be seen that one may think of the thermodynamic entropy as Boltzmann's constant, divided by ln(2), times the number of yes/no questions that must be asked in order to determine the microstate of the system, given that we know the macrostate. The link between thermodynamic and information entropy was developed in a series of papers by Edwin Jaynes beginning in 1957.

There are many ways of demonstrating the equivalence of "information entropy" and "physics entropy", that is, the equivalence of "Shannon entropy" and "Boltzmann entropy". Nevertheless, some authors argue for dropping the word entropy for the H function of information theory and using Shannon's other term "uncertainty" instead.

http://en.wikipedia.org/wiki/Entropy

Lots more of that, if you want to see it.
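A minimal sketch of the point being argued (Python; the probability distributions are made up): a completely predictable, highly ordered source has zero Shannon entropy, while an equiprobable, maximally disordered one has the most, matching the H = K ln(1/p) relation quoted above with K = 1/ln 2.

```python
import math

def entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2(p)) in bits (i.e. K = 1/ln 2)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

ordered = [1.0, 0.0, 0.0, 0.0]         # completely predictable: one certain outcome
biased = [0.7, 0.1, 0.1, 0.1]          # highly ordered, a little uncertainty
disordered = [0.25, 0.25, 0.25, 0.25]  # equiprobable: maximum uncertainty

for name, p in [("ordered", ordered), ("biased", biased), ("disordered", disordered)]:
    print(name, round(entropy_bits(p), 3))
# ordered 0.0, biased ~1.357, disordered 2.0
# For equal probabilities p = 1/4, H = K*ln(1/p) = log2(4) = 2 bits, matching the formula above.
```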

(a biologist demurs)
Information Is Not Entropy,
Information Is Not Uncertainty!

Dr. Thomas D. Schneider
National Institutes of Health
National Cancer Institute


I'm a biologist myself. But one of my degrees is in systems. Dr. Tom doesn't know any more about it than you do. Many biologists are cognizant of information theory, however, since it has applications in population genetics. Would you be offended if I asked you to learn what "information" is?

Cite a peer-reviewed report that cites increasing chaos = increasing information, as far as DNA goes.

There is another misconception that's holding you back here. Chaotic systems frequently have a high level of order. I ran into this in doing Lotka/Volterra simulations. Turns out tiny rounding errors lead to great fluctuations.

Check out the Feigenbaum constant to learn why. You're in way over your head on this one.
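A minimal sketch of that kind of sensitivity (Python; it uses the logistic map as a stand-in, not the Lotka-Volterra runs mentioned above): a difference the size of a rounding error grows into a completely different trajectory, and the period-doubling route into that chaotic regime is where the Feigenbaum constant (about 4.669) comes from.

```python
def logistic(x, r):
    """One step of the logistic map x -> r*x*(1-x)."""
    return r * x * (1 - x)

r = 3.9                      # well inside the chaotic regime
x1 = 0.5
x2 = 0.5 + 1e-12             # differs only by a rounding-error-sized amount

for _ in range(60):
    x1, x2 = logistic(x1, r), logistic(x2, r)

print(abs(x1 - x2))          # the 1e-12 difference has grown to order 1
```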

Barb wrote:

"...Do you understand what "information" is?.."

And now, we know that you don't.
 
I also see/hear this a lot: how close we are to chimps because of the chromosomes.

No. You've been misled on that. Chromosomes are just bundles of genes. They aren't the genes. So very unrelated organisms can have the same number of chromosomes, but they will have very different genes.

As humans we have 46 chromosomes while a chimp has 48 chromosomes. If it was just those 2 chromosomes that made the difference, I could almost see a reason to think about it. But the fact is these 2 chromosomes make us worlds apart, in the millions. We should also be worried about tobacco, because it has 48 chromosomes... I don't hear the evolution community comparing us to tobacco. Interesting? :confused

Turns out that this is a very interesting story. How can humans be so close to apes genetically, but have a different number of chromosomes? The only way that could happen would be a chromosome fusion or a breakage at some point in evolutionary history. Since we are the odd one out, that suggested that a fusion took place in our ancestors. And since such a fusion should leave traces, it's a testable hypothesis. So the search began.

And there it was. The human #2 chromosome matches up very neatly with two chimp chromosomes. And in the human #2, there are remains of telomeres (ends of chromosomes) right where they'd be if there had been a fusion. And the mystery was solved.
http://www.gate.net/~rwms/hum_ape_chrom.html

Cool, um?
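A toy sketch of why the fusion is a testable prediction (Python; the sequence is invented, whereas the real test scans the actual human chromosome 2 assembly near 2q13): telomeres are tandem TTAGGG repeats, so a head-to-head fusion should leave forward repeats running straight into their reverse complement, CCCTAA.

```python
import re

# Made-up sequence with a head-to-head telomere junction in the middle.
seq = "ACGT" + "TTAGGG" * 5 + "CCCTAA" * 5 + "TGCA"

# Forward telomere repeats immediately followed by their reverse complement is the
# signature expected if two chromosome ends fused head to head.
junction = re.search(r"(?:TTAGGG){3,}(?:CCCTAA){3,}", seq)
print("fusion-like junction found:", junction is not None)
print("position:", junction.start() if junction else None)
```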
 
The Barbarian said:
[the full post above, quoted in its entirety]


So you're at odds with Center for Cancer Research Nanobiology Program's Molecular Information Theory Group's (and Shannon's) statement of what information is:

Note: Shannon understood this distinction and called the uncertainty which is subtracted the 'equivocation'. Shannon (1948) said on page 20:

R = H(x) - Hy(x)

Information is always a measure of the decrease of uncertainty at a receiver (or molecular machine).

"The conditional entropy Hy(x) will, for convenience, be called the equivocation. It measures the average ambiguity of the received signal."

The mistake is almost always made by people who are not actually trying to use the measure.

http://www.lecb.ncifcrf.gov/~toms/infor ... ainty.html
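For concreteness, a small sketch of the quantity in that passage (Python; the channel numbers are made up): R = H(x) - Hy(x), with Hy(x) the equivocation, computed for a noisy binary channel.

```python
import math

def H(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Made-up binary symmetric channel: equiprobable source bits, flip probability e = 0.1.
e = 0.1
joint = {(0, 0): 0.5 * (1 - e), (0, 1): 0.5 * e,
         (1, 0): 0.5 * e,       (1, 1): 0.5 * (1 - e)}   # p(x, y)

p_x = [sum(p for (x, _), p in joint.items() if x == xv) for xv in (0, 1)]
p_y = [sum(p for (_, y), p in joint.items() if y == yv) for yv in (0, 1)]

H_x = H(p_x)                        # uncertainty before the signal is received
H_y_x = H(joint.values()) - H(p_y)  # equivocation: H(x, y) - H(y), uncertainty left after receipt
R = H_x - H_y_x                     # Shannon's R: the decrease in uncertainty at the receiver

print(round(H_x, 3), round(H_y_x, 3), round(R, 3))   # 1.0 0.469 0.531
```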
 
Your guy just misunderstands what Shannon was saying about it. But keep in mind, this is his personal crusade; it's not the official opinion of the organization he works for. He's made the layman's mistake of assuming that the information is some kind of physical quantity, rather than a communications process. That is, he's confused the Shannon information (the ambiguity of the signal before it's decoded) with the interpretation of the signal afterward.

"The conditional entropy Hy(x) will, for convenience, be called the equivocation. It measures the average ambiguity of the received signal."

The "ambiguity" is the uncertainty. That's what it means, CR. This is what Shannon meant by "information."

[Equation (1.7): Shannon's entropy, H = -Σ pi log pi]

The expression (1.7) is famous as Shannon's entropy or measure of uncertainty.

For more characterizations of the measure (1.4) or (1.7) refer to Aczél and Daróczy (1975) [2] and Mathai and Rathie (1975) [71].


and:

The quantity -log_b p(x) is interpreted as the information content of the outcome x, and is also called the Hartley information of x. Hence the Shannon's entropy is the average amount of information contained in random variable X; it is also the uncertainty removed after the actual outcome of X is revealed.
http://planetmath.org/encyclopedia/Shan ... tropy.html

Would you like me to show you how it applies to actual biological populations? Maybe if you could see it actually work, it would be easier to understand.

There's a reason that science accepts Shannon's definition, and doesn't accept Dr. Tom's. Shannon's works. You see, his theorem, describing information as uncertainty, tells us how to pack the maximum number of bits in a channel, tells us how to use very low-powered transmitters to send accurate transmissions over millions of kilometers, and so on.

Shannon's way works. Dr. Tom's doesn't. And that's all that it takes for science to choose which one is right.
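A minimal sketch of the practical payoff being described (Python; the flip probabilities are made up): for a binary symmetric channel the maximum reliable rate is C = 1 - H(e) bits per use, which is exactly the uncertainty bookkeeping that lets weak, noisy transmissions be decoded.

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

# Capacity of a binary symmetric channel with flip probability e: C = 1 - H(e).
for e in (0.0, 0.01, 0.1, 0.5):
    print(f"flip probability {e}: capacity {1 - h2(e):.3f} bits per use")
# A noiseless channel carries 1 bit per use; at a 50% flip rate it carries nothing.
```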
 
Barb wrote:

Shannon's way works. Dr. Tom's doesn't. And that's all that it takes for science to choose which one is right.

Shannon and Schneider, National Institutes of Health, National Cancer Institute, are on the same page. You're disagreeing with both Shannon and Schneider. Will you please point us to some of your peer-reviewed reports concerning information?

I can't think of two greater authorities concerning info theory than Shannon and Schneider.

CR:

You're at odds with Center for Cancer Research Nanobiology Program's Molecular Information Theory Group's (and Shannon's) statement of what information is:

Note: Shannon understood this distinction and called the uncertainty which is subtracted the 'equivocation'. Shannon (1948) said on page 20:

R = H(x) - Hy(x)

Information is always a measure of the decrease of uncertainty at a receiver (or molecular machine).

"The conditional entropy Hy(x) will, for convenience, be called the equivocation. It measures the average ambiguity of the received signal."

The mistake is almost always made by people who are not actually trying to use the measure.

http://www.lecb.ncifcrf.gov/~toms/infor ... ainty.html
 
Shannon and Schneider, National Institutes of Health, National Cancer Institute, are on the same page.

Nope. Dr. Tom thinks they are, but as you just learned, he doesn't have any expertise in information theory. He's just misconstrued what Shannon wrote.

But if you want to show us some peer-reviewed paper in a journal of information agreeing with Dr. Tom's layman's opinion, I'm sure we'd be happy to see it. Meantime, let's see what mathematicians say about it, one more time...

"The conditional entropy Hy(x) will, for convenience, be called the equivocation. It measures the average ambiguity of the received signal."

The "ambiguity" is the uncertainty. That's what it means, CR. This is what Shannon meant by "information."

[Equation (1.7): Shannon's entropy, H = -Σ pi log pi]
The expression (1.7) is famous as Shannon's entropy or measure of uncertainty.

For more characterizations of the measure (1.4) or (1.7) refer to Aczél and Daróczy (1975) [2] and Mathai and Rathie (1975) [71].

and:

The quantity -log_b p(x) is interpreted as the information content of the outcome x, and is also called the Hartley information of x. Hence the Shannon's entropy is the average amount of information contained in random variable X; it is also the uncertainty removed after the actual outcome of X is revealed.
http://planetmath.org/encyclopedia/Shan ... tropy.html


You're disagreeing with both Shannon and Schneider.

As you see, Schneider is at odds with Shannon. Because he's never actually studied information as a science, he's misunderstood what Shannon wrote. But mathematicians understand it. Learn from them.

I can't think of two greater authorities concerning info theory than Shannon and Schneider.

I cited you two papers, which you ignored. For obvious reasons. So here is Shannon's statement from his paper:

Quantities of the form H = -Σ pi log pi (the constant K merely amounts to a choice of a unit of measure) play a central role in information theory as measures of information, choice and uncertainty.

And:

Suppose there are two events, x and y, in question with m possibilities for the first and n for the second.
Let p(i; j) be the probability of the joint occurrence of i for the first and j for the second. The entropy of the joint event is

(calculations)

with equality only if the events are independent (i.e., p(i, j) = p(i)p(j)). The uncertainty of a joint event is less than or equal to the sum of the individual uncertainties.

The Bell System Technical Journal,
Vol. 27, pp. 379–423, 623–656, July, October, 1948.

So, as you see, Shannon treats information as the uncertainty in a message.
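A quick numerical check of that last inequality (Python; the joint distributions are made up): the joint uncertainty H(x, y) never exceeds H(x) + H(y), with equality only when the events are independent.

```python
import math

def H(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

def joint_vs_sum(joint):
    """Return (H(x, y), H(x) + H(y)) for a joint distribution joint[(i, j)] = p(i, j)."""
    p_i, p_j = {}, {}
    for (i, j), p in joint.items():
        p_i[i] = p_i.get(i, 0.0) + p
        p_j[j] = p_j.get(j, 0.0) + p
    return H(joint.values()), H(p_i.values()) + H(p_j.values())

independent = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
correlated = {(0, 0): 0.40, (0, 1): 0.10, (1, 0): 0.10, (1, 1): 0.40}

print(joint_vs_sum(independent))  # (2.0, 2.0): equality, the events are independent
print(joint_vs_sum(correlated))   # (~1.722, 2.0): joint uncertainty is strictly smaller
```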
 
Note: Shannon understood this distinction and called the uncertainty which is subtracted the 'equivocation'. Shannon (1948) said on page 20:

R = H(x) - Hy(x)

What is "R"?

I'll help you: Shannon defined it as information.

What is the result of H(x) - Hy(x)?

I'll help you: Shannon defined it as information.
 
Notice that Shannon himself wrote that information was a measure of uncertainty.

You're arguing that Shannon doesn't agree with himself.

BTW, there's an easy way out of this, that explains the confusion. But you seem unable to see what it is.
 
Hint:
Because entropy can be conditioned on a random variable or on that random variable being a certain value, care should be taken not to confuse these two definitions of conditional entropy, the former of which is in more common use.
 
The Barbarian said:
Hint:
Because entropy can be conditioned on a random variable or on that random variable being a certain value, care should be taken not to confuse these two definitions of conditional entropy, the former of which is in more common use.

Will you please put that in terms of Shannon's definition for info:

R = H(x) - Hy(x)

Where-

R = info

H(x) = uncertainty before the receipt of the signal

Hy(x) = uncertainty after the receipt of the signal


http://www.ccrnp.ncifcrf.gov/~toms/bion ... al.Entropy
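One way to line the hint up with the formula (Python; the joint distribution is made up): Hy(x) is the conditional entropy averaged over the received values y, not the entropy given one particular y, and R = H(x) - Hy(x) is the information.

```python
import math

def H(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Made-up joint distribution p(x, y) for a source x observed through a noisy receiver y.
joint = {(0, 0): 0.45, (1, 0): 0.05,
         (0, 1): 0.05, (1, 1): 0.45}

p_x = [sum(p for (x, _), p in joint.items() if x == xv) for xv in (0, 1)]
p_y = {yv: sum(p for (_, y), p in joint.items() if y == yv) for yv in (0, 1)}

H_x = H(p_x)   # H(x): uncertainty about x before the signal is received

# Entropy of x given one particular received value y...
H_x_given = {yv: H([joint[(xv, yv)] / p_y[yv] for xv in (0, 1)]) for yv in (0, 1)}
# ...versus the conditional entropy averaged over y, which is Shannon's equivocation Hy(x).
Hy_x = sum(p_y[yv] * H_x_given[yv] for yv in (0, 1))

R = H_x - Hy_x   # information: uncertainty before minus uncertainty after receipt

print(round(H_x, 3), round(Hy_x, 3), round(R, 3))   # 1.0 0.469 0.531
```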
 