[Old Earth] Evolution / top science breakthrough 2005!

  • Thread starter: reznwerks
About snowflakes:
I am inclined to agree that without a "program", order is unlikely to emerge regularly. However, from that one cannot jump to the conclusion that an intelligent agent had to set up the program. Snowflakes are formed due to the chemical properties of water molecules. I don't see any problem with these properties having a purely natural origin: hydrogen is the simplest atom, and oxygen isn't very special either. It's a beautiful example of technically simple properties causing the emergence of countless nicely ordered structures, a bit like fractals.

Furthermore, the second law of thermodynamics does not mention any exceptions for "programs" or intelligent processes, does it?

But the formation of proteins and nucleic acids from amino acids and nucleotides not only lowers their entropy, but it removes heat energy (and entropy) from their surroundings. Pretty cool, ey!
Are you saying that endothermic chemical reactions defy the second law of thermodynamics?

I have a strong suspicion that the two instances of entropy that you were talking about use different definitions of the term.
 
NOTW said:
Evolution now claims it knows where we came from? Yeah, riiiiiiigggghhhhtttt!! Where's that, the ape? Have a closer, in-depth look into the DNA and you're missing 4%. When you're done with that, try to explain the embedded information that makes DNA tick and where it came from. BTW, there's your ID! You're brainwashed, buddy, period!

Actually, all they have to do is have a closer look at the outside instead of at genes, and they can tell the differences between animals and humans immediately. Too much analyzing of the trees obscures the forest completely! :-)
 
OK OK OK. Charlie, let me ask you one thing.
Explain to me how development (a tiny seed growing in complexity AND order and becoming a huge tree for example) does NOT go against the second law of thermodynamics.
Can you do that?

If you can, those exact reasons are why evolution doesn't go against the SLoT.
 
Oran_Taran said:
OK OK OK. Charlie, let me ask you one thing.
Explain to me how development (a tiny seed growing in complexity AND order and becoming a huge tree for example) does NOT go against the second law of thermodynamics.
Can you do that?

If you can, those exact reasons are why evolution doesn't go against the SLoT.

This is actually the biggest argument used by the popular evolutionist masses who don't understand anything on the matter of abiogenesis (no offense, just the truth). A seed is indeed fairly simple, just like a zygote is. The difference between a seed and regular matter is something called a genome. Each plant has a unique genome, and when it receives energy it already has built-in mechanisms to distribute that energy so as to develop the seed into a plant.

Nature does not have such built-in mechanisms. It just has atoms, which are made of protons, neutrons and electrons (and the strong force that binds them).

Edit: As for the chimp genome and the missing 4%:

96% similarity doesn't make us 96% chimps. The human genome and the banana genome are about 50% similar. In fact, over 55% of the proteins used in a chimp are different from those of humans. The earlier estimate of 98 percent similarity has since fallen to 95 percent and continues to drop.

This just shows that we have One Creator God, and since chimps have four limbs, a head and a heart, we can't expect them to have the same genome as a bacterium; that would suggest multiple creators.
 
"The only situation I'm aware of where complexity

increases in the universe, is when energy and a program for transferring

that energy into useful work coexist."


Protos sums up my opinion.


Peace
 
Please define complexity.

Furthermore, if you say that the 2ndLoT prevents an increase of complexity, then where in the law is that exception for programs mentioned? If there is none, then the law is violated if it indeed prevents such a thing.


By the way, I find the jumping around between entropy, order and complexity highly irritating... these are not the same after all, and the 2ndLoT is only about one of them.

Nature does not have such built-in mechanisms. It just has atoms, which are made of protons, neutrons and electrons (and the strong force that binds them).
And yet the chemical properties of water allow the emergence of snowflakes. Isn't that such a built-in mechanism?


96% similarity doesn't make us 96% chimps. The human genome and the banana genome are about 50% similar. In fact, over 55% of the proteins used in a chimp are different from those of humans. The earlier estimate of 98 percent similarity has since fallen to 95 percent and continues to drop.

This just shows that we have One Creator God, and since chimps have four limbs, a head and a heart, we can't expect them to have the same genome as a bacterium; that would suggest multiple creators.
Are you familiar with ERVs?
 
Please define complexity.

Furthermore, if you say that the 2ndLoT prevents an increase of complexity, then where in the law is that exception for programs mentioned? If there is none, then the law is violated if it indeed prevents such a thing.


By the way, I find the jumping around between entropy, order and complexity highly irritating... these are not the same after all, and the 2ndLoT is only about one of them.



Remember, the entropy decrease is temporary in living beings. When the system "dies", energy from outside can no longer be "distributed" usefully. This is because the program and feedback systems become corrupted.


Living creatures are essentially energy processing systems that cannot function unless a multitude of "molecular machines", biochemical cycles, operate synchronically in using energy to oppose second law predictions. All of the thousands of biochemical systems that run our bodies are maintained and regulated by feedback subsystems, many composed of complex substances.

Most of the compounds in the feedback systems are also synthesized internally by thermodynamically nonspontaneous reactions, effected by utilizing energy ultimately transferred from the metabolism (slow oxidation) of food.

When these feedback subsystems fail -- due to inadequate energy inflow, malfunction from critical errors in synthesis, the presence of toxins or competing agents such as bacteria or viruses -- dysfunction, illness, or death results: energy can no longer be processed to carry out the many reactions we need for life that are contrary to the direction predicted by the second law.

Frank L. Lambert, Professor Emeritus, Occidental College



Definitions of Complexity:


Information is a complexity-theoretic notion. Indeed, as a purely formal object, the information measure described here is a complexity measure (cf. Dembski, 1998, ch. 4). Complexity measures arise whenever we assign numbers to degrees of complication. A set of possibilities will often admit varying degrees of complication, ranging from extremely simple to extremely complicated. Complexity measures assign non-negative numbers to these possibilities so that 0 corresponds to the most simple and ∞ to the most complicated. For instance, computational complexity is always measured in terms of either time (i.e., number of computational steps) or space (i.e., size of memory, usually measured in bits or bytes) or some combination of the two. The more difficult a computational problem, the more time and space are required to run the algorithm that solves the problem. For information measures, degree of complication is measured in bits. Given an event A of probability P(A), I(A) = -log2P(A) measures the number of bits associated with the probability P(A). We therefore speak of the "complexity of information" and say that the complexity of information increases as I(A) increases (or, correspondingly, as P(A) decreases). We also speak of "simple" and "complex" information according to whether I(A) signifies few or many bits of information. This notion of complexity is important to biology since not just the origin of information stands in question, but the origin of complex information.


William Dembski
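
A minimal sketch of the bit-measure quoted above, assuming nothing beyond the formula I(A) = -log2 P(A) itself; the event probabilities are made up purely for illustration:

```python
import math

def information_bits(p: float) -> float:
    """Bits associated with an event of probability p,
    per the measure I(A) = -log2 P(A) quoted above."""
    if not 0 < p <= 1:
        raise ValueError("probability must be in (0, 1]")
    return -math.log2(p)

# Illustrative, made-up events: the rarer the event, the more bits.
print(information_bits(0.5))   # 1.0   (a fair coin toss)
print(information_bits(0.25))  # 2.0   (one of four equally likely outcomes)
print(information_bits(1e-6))  # ~19.9 (a one-in-a-million event)
```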


Complexity in data

In information theory, algorithmic information theory is concerned with the complexity of strings of data.

Complex strings are harder to compress. While intuition tells us that this may depend on the codec used to compress a string (a codec could theoretically be created in any arbitrary language, including one in which the very small command "X" could cause the computer to output a very complicated string like '18995316'), any two Turing-complete languages can be implemented in each other, meaning that the length of two encodings in different languages will vary by at most the length of the "translation" language, which will end up being negligible for sufficiently large data strings.

It should be noted that these algorithmic measures of complexity tend to assign high values to random noise. However, those studying complex systems would not consider randomness as complexity.
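
Kolmogorov complexity itself isn't computable, but a general-purpose compressor gives a rough upper bound and makes the point about random noise concrete. A minimal sketch using Python's standard zlib module (the example strings are arbitrary):

```python
import os
import zlib

def compressed_size(data: bytes) -> int:
    """Length of the zlib-compressed data: a crude upper bound on the
    algorithmic (Kolmogorov/Chaitin) complexity of the string."""
    return len(zlib.compress(data, 9))

ordered = b"AB" * 5000     # highly regular, 10000 bytes, very compressible
noise = os.urandom(10000)  # 10000 random bytes, essentially incompressible

print(compressed_size(ordered))  # small: the regularity is pure redundancy
print(compressed_size(noise))    # close to 10000: scored as "complex",
                                 # even though it is just noise
```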

Information entropy is also sometimes used in information theory as indicative of complexity:

Entropy is a concept in thermodynamics (see thermodynamic entropy), statistical mechanics and information theory. The concepts of information and entropy have deep links with one another, although it took many years for the development of the theories of statistical mechanics and information theory to make this apparent. This article is about information entropy, the information-theoretic formulation of entropy.

When it was discovered that heat is a measure of the average motion of a population of molecules, the notion of entropy became linked to the measure of order or disorder in a system. If this linkage of such disparate ideas as "heat," "average motion," and "order of a system" sounds confusing, you have a good idea of how nineteenth-century physicists felt. For a long time, they thought that heat was some kind of invisible fluid that was transferred from one object to another. When it was discovered that heat is a way of characterizing a substance in which the molecules are, on average, moving around faster than the molecules in a "cold" substance, a new way of looking at systems consisting of large numbers of parts (molecules, in this case) came into being. And this new way of looking at the way the parts of systems are arranged led, eventually, to the entropy-information connection.



http://www.answers.com


Peace
 
Remember, the entropy decrease is temporary in living beings. When the system "dies", energy from outside can no longer be "distributed" usefully. This is because the program and feedback systems become corrupted.
Keep in mind that at all times any decrease of entropy in living things is made up for by a massive increase of entropy elsewhere, e.g. in the sun.

I'd still like to see where the 2ndLoT ("the total entropy of any thermodynamically isolated system tends to increase over time, approaching a maximum value") makes an exception for "programs" if it generally opposes even any local decrease of entropy.
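
For what it's worth, here is a rough back-of-the-envelope version of that point, treating a bit of solar energy as heat leaving the Sun's surface (about 5800 K) and being absorbed at roughly Earth temperature (about 290 K); the amount of heat is an arbitrary illustrative number:

```python
# Heat Q leaving a hot reservoir (the Sun's surface) and arriving at a cold
# one (roughly Earth temperature) changes the total entropy by
#   dS = -Q/T_hot + Q/T_cold,
# which the second law requires to be positive overall.
Q = 1000.0       # joules transferred; arbitrary illustrative amount
T_sun = 5800.0   # approximate surface temperature of the Sun, in kelvin
T_earth = 290.0  # approximate temperature at which the energy is absorbed

dS_total = -Q / T_sun + Q / T_earth
print(dS_total)  # ~ +3.28 J/K: a large net increase, dwarfing any local
                 # entropy decrease inside a living thing
```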

PS: the definition of complexity didn't show up in your post. I take it that you're still editing.
 
Yup... still editing. I'm going to provide several def's...

Should be done in a bit. I just like to save my work, because I've spent a significant amount of time developing a post in the past... only to lose it... aaaaarrrrggggghhhhhhhh!!!!

Peace
 
The human genome and the banana genome are about 50% similar.

Come on protos, I know you're a lot more than 50% bananas!!

Lol 8-)

(just a little playful humor... ;-) )
 
For information measures, degree of complication is measured in bits. Given an event A of probability P(A), I(A) = -log2P(A) measures the number of bits associated with the probability P(A). We therefore speak of the "complexity of information" and say that the complexity of information increases as I(A) increases (or, correspondingly, as P(A) decreases). We also speak of "simple" and "complex" information according to whether I(A) signifies few or many bits of information. This notion of complexity is important to biology since not just the origin of information stands in question, but the origin of complex information.
That looks a lot like Shannon information, albeit I find it a bit odd that "simple" and "complex" information are mentioned without a threshold to distinguish them. Furthermore, any such threshold would be completely arbitrary anyway. IMHO simple and complex don't make sense as qualifiers in that context; if anything there is just "information" and "more information".

The second one is about Kolmogorov/Chaitin, completely unrelated to the above. As the article mentions, in K/C information theory the content of information is quantified by trying to compress a string: the longer the best compressed version of it is, the more information it contains.

It should be noted that these algorithmic measures of complexity tend to assign high values to random noise. However, those studying complex systems would not consider randomness as complexity.
...and therefore K/C information theory cannot really be applied when studying complex systems, as K/C does not assign any "value" to information; it's purely about redundancy. Ordered systems have very little K/C information compared to chaotic ones, as they contain redundancy, e.g. the regular grid patterns in which atoms are arranged in crystals.
If anything, K/C information theory provides a maximum measure of how much information can be coded in a DNA string of a set length. It does not, however, make any statement about how much of that "information potential" is used to the organism's benefit.
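
To put a number on that "information potential": with a four-letter alphabet, a DNA string of n bases can carry at most log2(4^n) = 2n bits, no matter how much of that capacity is actually used to the organism's benefit. A minimal sketch (the lengths are arbitrary):

```python
import math

def max_dna_bits(n_bases: int) -> float:
    """Upper bound on the information a DNA string of n_bases letters can
    hold: log2(4 ** n_bases) = 2 * n_bases bits (4-letter alphabet)."""
    return n_bases * math.log2(4)

print(max_dna_bits(1))     # 2.0 bits per base
print(max_dna_bits(1000))  # 2000.0 bits at most for a 1000-base string
```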

Information entropy is also sometimes used in information theory as indicative of complexity:

Entropy is a concept in thermodynamics (see thermodynamic entropy), statistical mechanics and information theory. The concepts of information and entropy have deep links with one another, although it took many years for the development of the theories of statistical mechanics and information theory to make this apparent. This article is about information entropy, the information-theoretic formulation of entropy.
That is about Shannon entropy. In Shannon's information theory, if applied to biology, the first replicator is by definition assigned the highest possible level of information, and any change is by definition undesirable and results in a loss thereof. Therefore it is possible to get from amoeba to humans with nothing but a massive loss of Shannon information.
In fact, without this evolution couldn't work, as then there would not be such a thing as mutations. Any mutation is by definition an increase of Shannon entropy, but that does not make any statement about that mutation's effect on the organism's abilities and reproductive success.

By the way, there is another information-related field in which entropy occurs: cryptology. There entropy is not a measure of randomness but of virtual randomness, the evenness of the distribution of characters in a ciphertext. More entropy makes it harder to decrypt a text without knowledge of the key, as there are fewer regularities in it, but that does not mean that it does not contain information.
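
That "evenness of the distribution of characters" can itself be quantified as the Shannon entropy H = -sum(p_i * log2 p_i) over the observed byte frequencies; a nearly flat distribution, as in a good ciphertext, approaches 8 bits per byte. A minimal sketch with made-up example data:

```python
import math
import os
from collections import Counter

def byte_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte, H = -sum(p * log2(p)) over the
    observed byte frequencies; a flat distribution approaches 8.0."""
    total = len(data)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(data).values())

english_like = b"the quick brown fox jumps over the lazy dog " * 200
random_like = os.urandom(8000)  # stands in for a well-encrypted ciphertext

print(byte_entropy(english_like))  # roughly 4 bits/byte: uneven frequencies
print(byte_entropy(random_like))   # close to 8 bits/byte: nearly flat
```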
 
By the way, there is another information-related field in which entropy occurs: cryptology.

Roger that... Intelligent design randomized. We dealt with it continually when handling, and being accountable for distributing and potentially guiding, the warheads on naval aircraft and ships (communications). Also any communications dealing with nuclear reactors.

I'll answer your other points tomorrow....yawn...gettin' old...damn 2nd law...!!

:wink:

Peace
 