[Science] How evolution adds (or subtracts) information

Yes. Because you've increased uncertainty in the message. If you had just written "The weather today will be..." then there is less information. You're still confusing information with usefulness or meaning. That's not how it works.
It can be used that way, though. And by the way, messages like the one above can be used to embed metamessages within an innocuous sentence. Some intelligence agencies do things like that.
I add bits just like your claim. You understood by eliminating non-informing “genes.” This is often what cells do with mutations. They ignore them. But there are known mutations that reduce information. This is not disputed. In the cell, information has uses or it’s not information. You seem to be unable to see that biological life is very efficient or it doesn’t survive.
 
That's only part of it. Here's the theory:
1. More are born than can survive to reproduce
Not always
2. Every organism is slightly different from its parents
It’s more alike than dissimilar
3. Some of these differences affect the ability of the organism to survive long enough to reproduce.
If it only survives long enough to reproduce, many of those offspring won’t. Second, the offspring are able to reproduce as easily as their parents if healthy. No added ability is known in humans from healthy parents.
4. Those that help tend to be retained and those that harm tend to be removed by natural selection.

There are none in humans that aid in reproductive selection. There are known positive mutations but none that give an advantage in reproductive years that don’t kill subsequent offspring, clearly not an evolutionary edge.
5. This process explains speciation
We are talking humans….same species….no changes.
 
Not always
Pretty much always. Otherwise, we'd be awash in opossums or whatever.
It’s more alike than dissimilar
That's what "slightly" means. Each organism is slightly different than either parent.
If it only survives long enough to reproduce, many of those offspring won’t. Second, the offspring are able to reproduce as easily as their parents if healthy. No added ability is known in humans from healthy parents.
True of all organisms. But it remains true that those who live long enough to reproduce, leave more offspring. Go figure.
There are none in humans that aid in reproductive selection.
As you have seen, there are a good number of even recent mutations that improve the likelihood of a human living long enough to reproduce. And as you now see, that's all that's necessary.

And speciation is a fact. Would you like to see some examples?
 
I add bits just like your claim. You understood by eliminating non-informing “genes.”
Alleles. And as you now see, even random changes add information. That's how it works.

This is often what cells do with mutations. They ignore them. But there are known mutations that reduce information.

Show us one of those. Then we'll do the numbers and see.
This is not disputed. In the cell, information has uses or it’s not information.

No, you've got that wrong. As you saw, information is just a measure of uncertainty in a message.

The more we already know about what is contained in the message, the fewer bits we need to represent it, because there is less uncertainty about what kind of message we’re going to represent. As soon as we can narrow the possible characters in the message from the whole alphabet down to simply A and H, we can use fewer bits to represent it, because we know exactly what we need to represent.


When all the communicating parties already know in advance that the message is going to be HAHAHAHAHA

instead of any other 10-character message containing the letters A and H, we can consider the whole HAHAHAHAHA message as one piece of information. If we build a binary search tree for it, it will have zero depth, since there is nothing to search for: the set contains only one element, and P(X) = {1}

log₂ 1 = 0

Therefore, this is what we’re going to get from the Shannon entropy equation.


H(X) = −Σᵢ₌₁¹ P(xᵢ) log₂ P(xᵢ) = −(1 × log₂ 1) = −(1 × 0) = 0

There’s no uncertainty at all in a message that’s already known to contain the string HAHAHAHAHA

so we don’t need to send anything to make it known to the receiving party anymore since there’s no new information for them to gain.
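To make the point concrete, here is a short sketch of the entropy calculation (the alphabet sizes and probabilities are the illustrative ones from the discussion, not measured values):

```python
import math

def entropy_bits(probs):
    """Shannon entropy H(X) = -sum p*log2(p), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Any of 26 equally likely letters: log2(26), about 4.7 bits per symbol.
full_alphabet = entropy_bits([1 / 26] * 26)

# Only A and H possible, equally likely: exactly 1 bit per symbol.
two_letters = entropy_bits([0.5, 0.5])

# The message is known in advance (P = 1 for "HAHAHAHAHA"): 0 bits.
known_message = entropy_bits([1.0])

print(round(full_alphabet, 2), two_letters, known_message)
```

As the possibilities narrow, the bits-per-symbol figure drops, and when the receiver already knows the message, the entropy (and the information gained) is zero.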

Shannon was trying to understand how messages like Morse code were created and transferred from one place to another. Questions like: How does the information of • • • (the letter "s") compare to the information of – – • • ("z")?

To answer this, Shannon proposed his definition of information:

Information is a measure of the surprise of a given message.
In the Morse code example above, "s" is less surprising than "z", so it contains less information.
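Shannon's "surprise" is just −log₂ of the probability of the symbol. A quick sketch, using rough English letter frequencies for "s" and "z" (the exact frequency values are illustrative assumptions):

```python
import math

# Rough English letter frequencies; the exact values are illustrative assumptions.
freq = {"s": 0.063, "z": 0.0007}

def surprisal_bits(p):
    """Shannon's 'surprise' of an outcome with probability p: -log2(p) bits."""
    return -math.log2(p)

# The common "s" is far less surprising than the rare "z",
# so it carries fewer bits of information.
for letter, p in freq.items():
    print(letter, round(surprisal_bits(p), 1))
```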

You seem to be unable to see that biological life is very efficient or it doesn’t survive.
Let's see... starfish put out millions of eggs. Very few survive to become even young starfish. This obviously isn't very efficient. So you're demonstrably wrong.
 
Barbarian said:
Let's see... starfish put out millions of eggs. Very few survive to become even young starfish. This obviously isn't very efficient. So you're demonstrably wrong.

Me:
Humans have never had children such that very few survive, so you are demonstrably wrong. In fact, mammals don’t. You see, I only need one example to prove you wrong, and I have lots.

But otherwise you refuse to leave the computer and look at real biology. You measure biological function by a formula and ignore living creatures that don’t comply with the formula.

I’m educated in medicine, and we have known for a long time that mutations remove information. There are known mutations that prevent a baby from producing all the proteins needed for successful living. You want examples of real life, and then we will see if real life matches the formula.

But oddly enough, the information that mutations can restrict information from reaching production in the cell cannot be added to your understanding. I doubt there is a mathematical formula to describe this refusal to add this information.


We have reached an impasse or, perhaps, a wall.

Adieu
 
Humans have never had children such that very few survive
Even in medieval times, about 30 to 50 percent of children did not live to adulthood.
In fact, mammals don’t.
Lion cubs have about 48% mortality before adulthood.

As Malthus pointed out, population always tends to outstrip resources, and there will always be a limit that keeps the population down.

But otherwise you refuse to leave the computer and look at real biology.
Boy, did you get a wrong number. I've spent more evenings monitoring animal populations than I want to remember.
You measure biological function by a formula
That's how every quantity is measured. We use numbers. Genes work like that, too.

I’m educated in medicine and we have known that mutations remove information
As you see, that's wrong. In fact, Shannon, who pioneered information theory, showed that every new mutation in a population increases information. Now, I know a few MDs who do understand information theory, but most of them are never trained at all in it. Mostly geneticists, for the obvious reasons. You're an M.D.? Where in medical school did they teach you information theory?

The problem is, you're still confusing "information" with utility. Two different things.
 
I add bits just like your claim. You understood by eliminating non-informing “genes.” This is often what cells do with mutations. They ignore them. But there are known mutations that reduce information.
Here's the way that works. Suppose there are ten individuals in a population, and one dies and a new one is born (I'm doing it like this to make the math easier to see). In the old population there were alleles A (0.5 frequency) and a (0.5 frequency) for a specific gene. But the new birth was A, and the death was for allele a. Now the frequencies are 0.6 and 0.4. The information for the old population was about 0.3 (taking logarithms base 10). The information after the mutation is about 0.29, a decrease in information.
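The arithmetic in that example can be checked directly; the ~0.3 and ~0.29 figures come out if the Shannon entropy is taken with logarithms base 10 (a sketch, with the example's allele frequencies hard-coded):

```python
import math

def entropy(probs, base=10):
    """Shannon entropy -sum p*log_base(p); base 10 matches the ~0.3 figures."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

before = entropy([0.5, 0.5])  # alleles A and a at frequency 0.5 each
after = entropy([0.6, 0.4])   # after the birth/death shifts the frequencies

print(round(before, 3), round(after, 3))  # about 0.301 and 0.292
```

Any move away from equal frequencies lowers the entropy, which is why this particular change registers as a decrease in information.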

I think you're still confusing information with utility.
 