Ruminations on Entropy and Related Topics...

by David Cavanaugh (dcavanau@RO.COM)


The Teleological argument is our oldest and strongest argument against evolution. Cast in terms of modern science, the Second Law of Thermodynamics and statistical mechanics are among our best friends. Within this context one must understand Entropy and Entropic processes. I would like to quote our friend Richard Trott, from his article "Lying for Jesus," as an eloquent example of the misunderstanding of Thermodynamics within the Evolutionary community and even within the Creationist community. Richard was bashing Inter-Varsity for sponsoring Duane Gish at Rutgers and for being grossly misinformed about the Second Law of Thermodynamics:

For example, the pamphlet states, "The Second Law of Thermodynamics states that there is a general tendency of all observed systems to go from order to disorder. . . A fundamental law of physics says that natural systems go from order to disorder; evolutionists say that these same systems will go from disorder to order." This is, of course, complete nonsense. Among other problems with this argument, the Second Law of Thermodynamics only applies to systems that are isolated and in thermal equilibrium. Living systems are not isolated systems in thermal equilibrium. Therefore, the systems that an evolutionary scientist talks about are not the same systems that the Second Law of Thermodynamics talks about. The author sums up with the outrageous claim that the "evolutionary hypothesis. . . contradicts one of the most well-established laws of science (the Second Law of Thermodynamics)." The fact that the pamphlet's creationist author knows nothing about thermodynamics doesn't keep him from trying to use it as evidence for the cause of creationism.

The First Law of Thermodynamics is a conservation of energy statement which recognizes the mechanical equivalency of heat energy; hence heat energy must be included in any statement involving mechanical work:

delta E = q + W

where delta E is the change in internal energy of the system, q is the heat added to the system and W is the work done on the system.

The Second Law of Thermodynamics was developed in the context of the 19th century studies of steam engines and was found by inquisitive Chemists to apply also to the problems of Chemical reactions and Chemical equilibria. The resulting theoretical tapestry was cast in the context of ideal machines that operated in a complete cycle, using a thermal gradient to extract mechanical energy or work. A quick review of the Citric acid cycle, which provides chemical energy in living organisms, or of the phenomenon of animal movement, will convince the interested reader that Thermodynamics is as appropriate a theoretical framework for the study of living systems as it is for man-made thermal machines.

In its main form the Second Law of Thermodynamics states that there is no free lunch, that any form of work or conversion of energy has a price in lost energy. The conversion of energy may be electrical to mechanical, mechanical to electrical, electrical to visible light, light to electrical, thermal to electrical, chemical to electrical, electrical to radio waves (electromagnetic), mechanical to mechanical, etc., etc. This statement is analogous to carrying a bucket of water from one place to another, where the bucket has leaks that cause water loss. The Second Law is why there are no perpetual motion machines, due to Entropic heat losses associated with friction. Lord Kelvin's version of the Second Law was that a thermal machine could not extract mechanical energy from two equi-temperature bodies. The Clausius statement of the Second Law was that a thermal machine couldn't pump heat from a colder to a hotter body without the use of external work (force acting through a displacement) or energy. The Carnot idealized heat engine (hence Carnot cycle) studied the maximum theoretical (limiting and always less than 100%) efficiency possible from a machine operating in a cycle to extract energy from the thermal gradient between a hot and cold body and convert that energy into useful mechanical energy or work.
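
To make the Carnot limit concrete, here is a minimal numerical sketch of mine in Python; the reservoir temperatures are assumed values chosen purely for illustration:

def carnot_efficiency(t_hot_k, t_cold_k):
    """Maximum theoretical efficiency of any heat engine operating in a cycle."""
    return 1.0 - (t_cold_k / t_hot_k)

# Illustrative reservoir temperatures in Kelvin (my assumptions, not from the text):
# a boiler at 500 K rejecting waste heat to surroundings at 300 K.
print(carnot_efficiency(500.0, 300.0))   # 0.4 -> at most 40% of the heat can become work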

The analysis of the reversible heat exchanges of the Carnot cycle, including its adiabatic (i.e. thermally isolated from the surroundings) expansion and compression steps, led to the definition of Entropy. However, it was recognized at that time that there was a difference between ideal or near ideal reversible processes and irreversible processes, where the latter is the more typical of the real world. The Entropy change for reversible processes sets a lower bound for irreversible processes. Living systems, as with machines, must extract useful energy and work from an energy gradient to survive, function and maintain system order. There is still a cost paid in that Entropy somewhere must increase. In fact the statement of Entropy is valid for any system or process. A thermal system can be defined from small physical volumes to the entire universe, as with the projected Entropic "heat death" of the Universe associated with a uniform temperature distribution. The mathematical derivations which flow from the Second Law usually impose restrictions on system conditions in order to simplify the mathematics and produce useful relations, such as constant temperature, constant pressure or thermal isolation from the surrounding environment. The common misunderstanding is therefore to restrict the Second Law to these situations, which then misleads people as to its applicability to the study of the Evolutionary framework. The formal mathematical statement of the Second Law is as follows:

delta S = q / T (reversible processes)

delta S > q / T (irreversible processes)
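
A small numerical sketch (mine, with assumed temperatures and heat flow) shows the inequality at work when heat flows irreversibly from a hot body to a cold one:

# Heat q flowing irreversibly from a hot body to a cold body: the hot body
# loses entropy q/T_hot, the cold body gains q/T_cold, and since T_cold < T_hot
# the entropy of the combined pair increases.  Numbers are illustrative only.
q = 100.0        # joules of heat transferred
t_hot = 400.0    # K
t_cold = 300.0   # K

ds_hot = -q / t_hot            # entropy change of the hot body
ds_cold = q / t_cold           # entropy change of the cold body
print(ds_hot + ds_cold)        # about +0.083 J/K: a net increase for the pair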

The Third Law of Thermodynamics is due to the work of Boltzmann, of Boltzmann constant fame; it is a formal definition of Entropy as being related directly to the number of arrangements available to a thermal system, which may also be termed the number of states that the system may enter. This Law recognizes that any thermal system will tend to assume the maximum number of states available to it unless external work or energy restricts that freedom. As above, the definition of a thermal system may encompass anything from small volumes to the entire universe. In this discussion we see an example of the multi-body problem of Physics, where we encounter large combinatorial probabilities, or in other words large state space sizes with many statistical degrees of system freedom. So the real world is filled with Stochastic and Ergodic processes, for which a deterministic mentality, taught along with deterministic mathematics, forms an inadequate framework from which to examine most phenomena, including evolution. A corollary to the mathematical statement of the Third Law, often the form presented to students, is that the Entropy of a perfect crystal at absolute zero (on the Kelvin scale) is zero. The formal mathematical statement is as follows:

S = k * ln(w)

w = the total number of distinct states available to the system. 'w' is the total size of the combinatorial state space representing the total statistical degrees of freedom for the given Entropic system. An Entropic system would be defined as an atomic/molecular system that is tightly coupled and strongly interrelated from a causal and energy point of view or a macroscopic system where thermal transport between two or more bodies (or within a body) is significant.

k = Boltzmann's constant.
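
As a toy illustration of the formula (my own example, not a physically rigorous one), consider N particles that may each sit in either half of a container, so that w = 2^N distinguishable arrangements:

import math

k_boltzmann = 1.380649e-23   # J/K

def boltzmann_entropy(w_states):
    # S = k * ln(w): natural logarithm of the number of available states.
    return k_boltzmann * math.log(w_states)

# Toy system: N particles, each free to occupy either half of a container.
for n_particles in (1, 10, 100):
    w = 2 ** n_particles
    print(n_particles, w, boltzmann_entropy(w))

# A perfect crystal at absolute zero has w = 1, so S = k * ln(1) = 0,
# which is the student's form of the Third Law quoted above.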

Last but not least are our other old friends Enthalpy and Gibbs Free Energy, which are important in chemical applications. From the definition of the Free Energy we can see that Entropy acts to maximize the disorder of a particular thermal system, or to act as an energy leak to the surrounding environment. The following mathematical statement of the Gibbs free energy applies to chemical equilibria, but it should be noted that this is in the sense of determining which way the equilibrium will shift, thus applying Entropy to a dynamical system, contrary to the way it is usually perceived (a small numerical sketch follows the definitions below).

delta H = delta E + P * delta V (where delta E = q + W, from the First Law above)

delta G = delta H - T * delta S

W = Work

H = Enthalpy

T = temperature

S = Entropy; for reversible processes delta S = q / T, and this quantity sets a lower bound for the entropy change of typical real world (irreversible) processes and systems.
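
As a rough numerical sketch of how the sign of delta G sets the direction of change, consider the melting of ice; the values below are approximate literature-style numbers for one mole of ice, used purely for illustration:

# Sign of delta G = delta H - T * delta S decides which way the process goes.
delta_h = 6010.0    # J/mol, heat absorbed on melting (approximate)
delta_s = 22.0      # J/(mol*K), entropy gained on melting (approximate)

def delta_g(temperature_k):
    return delta_h - temperature_k * delta_s

for t in (263.0, 273.0, 283.0):   # about -10 C, 0 C and +10 C
    print(t, delta_g(t))
# Below roughly 273 K delta G is positive (ice persists); above it delta G is
# negative (ice melts): the T * delta S term eventually sets the direction.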

There is some confusion regarding the precise meaning of Entropy. There are thermodynamic systems where order would seem to be high and Entropy is low. It is true in general that Entropy is a measure of disorder in the Thermodynamical sense and that it will be maximized for a given system. However, when 'w' (hence the size of the system state space) is relatively low, then Entropy must also be relatively low, as it is a measure of system complexity. The major physical mechanisms of increases in Entropy are atomic/molecular diffusion, kinetic energy diffusion, Infra-Red electro-magnetic radiation and the seeking of the minimum energy (mainly electronic and gravitational) configuration for the system. At absolute zero we still have electron movement; we just have the electrons populating the ground state energy levels. Ice is an example of a system with a high degree of molecular ordering and crystal structure brought about by the high degree of hydrogen bonding and strong dipole moment of water. The reason ice is also a stable Entropic system is that it is a minimum energy system for water, with greatly reduced degrees of freedom (hence reduced 'w') available to the system, because there is not enough internal energy on average to break the molecular bonding. This system is also a minimum Entropic arrangement, because it represents the least system complexity available to the molecular ensemble at the freezing temperature. However, ice will still equilibrate its internal temperature within a contiguous chunk, thus still increasing Entropy with respect to any thermal gradients within or about the ice chunk. Thus Entropy still increases to a maximum for these systems, even though they seem to be highly ordered.

This conceptual problem with Entropy can also be seen with the 3-D or conformational structure of proteins, which would seem on the surface to violate Entropic principles, but don't, because proteins assume a minimum energy configuration within themselves and with the hydrating water molecules. Hydrophobic structures/groups are on the inside of the protein (some internal groups responsible for the 3-D structure are polar and/or use hydrogen bonding) and hydrophilic/charged structures are on the outside, hence low W and thus low S. In the multi-body problem, with large degrees of freedom for the system ensemble, it is intermolecular/atomic collisions and vibrations (thermal phonons) which diffuse thermal energy. Thus these systems still trend toward isothermal conditions and uniform concentrations of the species present (through diffusion).

Entropy can be seen in other contexts as noise or variation in measurements taken, or even in the results of manufacturing operations. Those who have experience in communication or in metrology will have an immediate resonance with this additional identification of Entropic processes. In electronics the movements of electrons (current) produce noise currents when passing through a resistance (Johnson noise) or through a semiconductor junction. The Johnson noise is due to collisions with atoms and the semiconductor noise is due to the recombination of electrons with "holes" within the semiconductor. Both sources of noise are arguably kinetic (i.e. due to movement) in nature and are random, stochastic processes. Other sources of noise in Electronics and communication are due to energy cross coupling from electric, magnetic and electro-magnetic fields, generated from within and without the given equipment. Measurement of fundamental physical things such as speed, pressure, length and time are all subject to measurement uncertainty, hence the measurement processes are victims of Entropy. Some have argued that Entropy is the fundamental measurement of the curse pronounced by God at the fall, an interesting notion. Others postulate the process of growing old as fundamentally Entropic in nature. This makes sense when one considers the cumulative damage to genetic information due to Thermal breakdown, free radical breakdown/alteration, and breakdown/alteration from stray ionizing radiations.

Let's consider the proverbial Humanist who zapped the gasses under glass and got some amino acids; to him, conclusive proof that he had the recipe for the primordial soup from whence we got spontaneous generation. There are many serious problems with this "proof" based upon sound thermodynamical and statistical mechanical principles, which are far more established scientifically. First of all, this guy didn't cheat the Second Law of Thermodynamics, as he had to do work on the system to beat diffusion and collect this particular gaseous mixture all in one reaction vessel. Subsequently he had to put energy into the system at the correct rate; his guess, not necessarily what the actual level was. From this experiment one might infer the possibility of such an atmosphere existing on the earth, but it is simply illogical to accept this as proof that this was the exact case, in the absence of more direct evidence. Obtaining this direct evidence is going to be extremely difficult unless somebody can find a gas pocket somewhere that can be dated reliably and can be shown not to have been contaminated over the large time frame involved. So let's say that we can get amino acids; simple combinatorial probability calculations for even modest-size proteins (100-200 amino acid units), without which life would be impossible for a given species, yield probabilities that are so low as to be practically impossible to have formed with chemical kinetics. The typical dodge around this problem is to invoke billions of years of time, but even simple reaction rate calculations, given unrealistically optimistic assumptions, show there has been nowhere near enough time.
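
For the curious, here is the kind of back-of-envelope combinatorial calculation I have in mind, sketched in Python with assumed and deliberately generous numbers; it is illustrative only, not a rigorous kinetic model:

# Chance that random assembly of a 100-residue chain from the 20 common amino
# acids hits one particular target sequence, ignoring all chemistry.
n_residues = 100
n_amino_acids = 20
p_single_trial = (1.0 / n_amino_acids) ** n_residues
print(p_single_trial)                       # about 7.9e-131

# Even granting an absurdly generous 1e20 random trials per second for about
# 1e17 seconds (a few billion years), the expected number of successes is:
print(p_single_trial * 1e20 * 1e17)         # about 7.9e-94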

As far as seeing evolution in progress, we don't, so they say it's a slow process. This seems a convenient rationalization and is not testable in the conventional scientific fashion. There are a few examples trotted out as support for evolution in real time, such as a particular species of English moth going from light to dark in color, but a careful examination of the data would simply show a system in equilibrium reacting to an external stress, according to the principle of Le Chatelier, to seek a new equilibrium. In other words, species cited in such examples do not develop new genetic information, but simply bring forth existing genetic possibilities inherent in the gene pool, which are simply recessive traits. In order to bring new information or possibilities into the gene pool, one needs to invoke mutations, which are almost inevitably detrimental to the survival of the species, as things with three heads don't survive very well.

That brings us to natural selection, which one could just as reasonably postulate as a mechanism to prevent change (i.e. evolution) from occurring, by suppressing mutations and other Entropic system decay, as one could see it as the primary mechanism by which a random walk to increased system complexity occurs. The idea of biological systems magically changing into more and more complicated entelechies just flies in the face of a universe ruled by Entropic processes. Living things get sick and die, species become extinct, our abodes get dirty, mechanical contrivances degrade and break, social systems are prone to chaos and anarchy... The list of things that touch our everyday experience goes on and on. Statistical calculations similar to those cited above for spontaneous generation yield extremely low probabilities of occurrence. Biological systems may use Entropic processes to survive, but it's a real stretch to say that random processes must surely lead to ordered systems. The mechanical contrivances of Man also use Entropic processes to perform their function, but the damn things just break, so where's the mechanical evolution except as a Teleological process?


Conversations:

David Cavanaugh: 1. You wrote quoting me: Correspondent: You said: I'm curious about this statement. I was never taught this and none of the chemistry or physics books I have place this kind of restriction on the second law.

Then you wrote: Correspondent: Let me say that I appreciate your question on the subject because it sort of reflects my response about two years ago when I first tried to track this issue after reading Gange. Let me offer this in response:

1. Please read "Origins and Destiny" by Robert Gange, the reference I listed. This is a good starting point. He does a good job of showing the large misunderstanding of entropy which exists. See particularly chapters 6 and 7 and appendices 2 and 3. He also lists lots of references for further study.

David Cavanaugh: Thank you for the reference. 2. You wrote: Correspondent:

2. Let me quote for you the second law of thermodynamics as stated by Arnold Sommerfeld in Vol. V of his Lectures on Theoretical Physics the title of which is 'Thermodynamics and Statistical Mechanics'. I drug this out of my past and it is an example of the "older view" or ungeneralized view of the second law.

"All thermodynamic systems posses a property which is called entropy. It is calculated by imagining that the state of the system is changed from an arbitrarily selected reference state to the actual state through a sequence of states of equilibrium and by summing up the quotients of the quantities of heat dQ introduced at each step and the "absolute temperature" T; the latter is to be defined simultaneously in this connection (first part of the Second Law). During real (i.e. non-ideal) processes the entropy of an isolated system increases (second part of the law)."

3. Note the restriction to thermodynamics and observations at equilibrium.

David Cavanaugh: I believe that a careful reading of this reference will show that it doesn't limit the applicability of the Second Law of Thermodynamics, but deals with the classic problem of how to handle the Entropy of irreversible processes. In other words, the classic definition of the Second Law defines an exact mathematical relation for reversible processes, in equilibrium, with the understanding that the relation delta S = delta Q / T forms a lower bound for irreversible processes, for which the mathematics rapidly becomes intractable. As a consequence, we can't do the mathematics without finding a way around this problem, as there is no analytical way to treat the problem without finding a way to model irreversible processes by reversible/equilibrium processes. Most of the discussion in the textbooks therefore deals with reversible processes in order to derive more advanced relationships. In the Thermodynamical sense, Entropy tracks the state of a given system; therefore the only things that must be considered are the initial state and the final state of the system. In a system involving irreversible processes, a reversible process may be devised to yield the same final system state, which then allows an exacting Thermodynamical calculation to be made for that system.

Also, the idea of "closed systems" is an idealization introduced to again make the mathematics more tractable, and should not be considered a real (or at least an important) system configuration. I would like to offer a quote from "Physical Chemistry" by Walter J. Moore, published 1972, to amplify the solution to this problem of reversible/irreversible Thermodynamic processes and the introduction of the "closed system" as an idealized conceptualization to aid the advanced mathematical derivations that follow.

"The change in entropy in going from an equilibrium state A to an equilibrium state B is always the same, irrespective of the path between A and B, since the entropy is a function of the state of the system alone. It makes no difference whether the path is reversible or irreversible. Only if the path is reversible, however, is the entropy change given by integral (dq / T):

delta S = S(B) - S(A) = integral A->B (dq / T) (reversible) (3.24)

To evaluate the entropy change for an irreversible process, we must devise a reversible method for going from the same initial to the same final state, and then apply (3.24). In the kind of thermodynamics we have formulated in this chapter (sometimes called thermostatics), the entropy S is defined only for equilibrium states. Therefore, to evaluate a change in entropy, we must design a process that consists of a succession of equilibrium states - i.e., a reversible process. In any completely isolated system, we are restricted to adiabatic processes because no heat can either enter or leave such a system (*). For a reversible process in an isolated system, therefore, dS = dq / T = 0 / T = 0, so that on integration, S = constant. If one part of the system increases in entropy, the remaining part must decrease by an exactly equal amount.

A fundamental example of an irreversible process is the transfer of heat from a warmer to a colder body. We can make use of an ideal gas to carry out the transfer reversibly, and thereby calculate the entropy change. [editorial note: this is the famous Carnot cycle]... We shall now prove that the entropy of an isolated system always increases during an irreversible process. The proof of this theorem is based on the inequality of Clausius...

(*) footnote: The completely isolated system is, of course, a figment of imagination. Perhaps our whole universe might be considered an isolated system, but no small section of it can be rigorously isolated. As usual, the precision and sensitivity of our experiment must be allowed to determine how the system is to be defined."

David Cavanaugh: 3. Quoting me: Correspondent: Would you please explain in what sense Hobson proved this? Was this based upon a mathematical, statistical mechanical derivation? It would seem that the original thermodynamical theoretical system applied to dynamic systems, in that it applied to machines. I think that it's interesting to note that Entropy in the strict thermodynamical sense is a parameter for which the exact physical mechanisms are obscure. That is why it's really necessary to drag statistical mechanics into the discussion, especially when discussing chemical and biochemical systems.

David Cavanaugh: You then said: Correspondent: Note that statistical mechanics is not drug into the discussion but actually defines the discussion.

David Cavanaugh: I agree that Statistical mechanics defines the modern understanding of Thermodynamics, but only in the sense that we have expanded our knowledge to deeper levels, not supplanted the basic Thermodynamical concepts. I would also argue that the Statistical mechanical treatment of Thermodynamics really has been taking place all of this century, not just from the 60's and 70's. I have a paper published by Max Planck in 1900, where he uses the concept of Entropy to develop his theory on the distribution of energy in the electromagnetic spectrum, with judicious use of combinatorial probabilistic reasoning. I would also argue that the point we should be making must be predicated on a holistic blend of all relevant Statistical Mechanical and Thermodynamical concepts, the statistical concept of noise, chemical kinetics, identification of Entropic processes and appropriate uses of the Information and coding theory definition of Entropy as a measure of order (see below). See, for example, my post "Ruminations on Entropy and related topics." For example (see Moore or a similar text), Thermodynamics was expanded at the end of the 19th century and early this century by formulations of the Third Law, including the famous statement by Boltzmann. You call this statement after Hobson "The generalized Second Law," but I would prefer to follow the convention that some texts use and call this the Third Law. Again from Moore: "As we did for the First and Second Laws, we therefore postulate the Third Law of Thermodynamics as an inductive generalization:

It is impossible by any procedure no matter how idealized to reduce the temperature of any system to the absolute zero in a finite number of operations. (Fowler and Guggenheim, Statistical Thermodynamics 1940)

...Thus, for any isothermal reversible process a -> b, the Third Law requires that in the limit as T -> 0, S0(a) - S0(b) = 0. This statement of the Third Law is similar to the famous heat theorem proposed by Walther Nernst in 1906.

Actually, the first satisfactory statement of the Third Law was given in 1923 in the first edition of the book Thermodynamics and the Free Energy of Chemical Substances, by G.N. Lewis and M. Randall:

If the entropy of each element in some crystalline state be taken as zero at the absolute zero of temperature, every substance has a finite positive entropy; but at the absolute zero of temperature the entropy may become zero, and does so become in the case of perfect crystalline substances ...

It may be unduly romantic to call Boltzmann, as some have done, a martyr for the atomic theory. His memorial is a white marble bust by Ambrosi, under which is engraved a short formula:

S = k ln W

[editorial note: Some texts call this the "Third Law of Thermodynamics." If W = 1, as in the perfect crystal, S = 0 as in the above statement.]

David Cavanaugh: 4. You wrote previously: Correspondent: 3. Then, based on Hobson's generalized 2nd law, the entropy (S) of the system, on average, changes with time as delta(S) >= 0 [where S = k*ln(omega), and omega is the number of ways mass and energy can be arranged, and k is Boltzmann's constant]. S is comprised of the thermodynamic entropy component, Sth, and the configurational component, Sc. Sc measures the randomness in the distribution of mass, while Sth measures the randomness in the distribution of energy. Coherent complexity of the system increases as Sc decreases. This 2nd law is a more general statement than that of the former "2nd law of thermodynamics" which only applies to thermodynamic observables and only then in the equilibrium state...

6. But, what known natural mechanisms are there to exchange Sc (in contrast to Sth) with the environment such that the system's Sc decreases on average? And, even if there is a mechanism, the increase of the system's coherent complexity (decrease of its Sc) cannot exceed that available from the environment.

David Cavanaugh: This relationship is pretty much the same as the Boltzmann statement of the Third Law of Thermodynamics, as stated above, and throughout much of this century the arrangement of mass has also been included in the theoretical tapestry. I don't believe that the classical Second Law of Thermodynamics has been restricted to systems in equilibrium, as discussed above. As an example of where the location or concentration of mass affects chemical energy (Sth) in the form of EMFs, I would offer the electrochemical cell. Electrolyte and electrode material concentration gradients define the EMF or voltage available for all batteries, an Oxygen gradient drives galvanic corrosion of metals, a Na/K gradient/differential drives the voltage signal of nerve impulses, etc., etc. These examples show that there is not always a clean separation of Sth and Sc, especially since much of the energy (e.g. kinetic energy) Entropy Sth is mediated by particles, whether they be electrons, atoms or molecules.
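
As an illustrative sketch of how a concentration gradient alone produces an EMF, here is the Nernst relation for a simple concentration cell; the concentrations, temperature and choice of a two-electron couple are my own assumptions, not figures from the correspondence:

import math

# EMF of a simple concentration cell from the Nernst equation:
# E = (R*T / (n*F)) * ln(c_high / c_low).
R = 8.314        # gas constant, J/(mol*K)
F = 96485.0      # Faraday constant, C/mol
T = 298.0        # K
n = 2            # electrons transferred per ion (assumed couple)

c_high = 1.0     # mol/L
c_low = 0.01     # mol/L

emf = (R * T / (n * F)) * math.log(c_high / c_low)
print(emf)       # about 0.059 V from a hundred-fold concentration difference alone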

When mixing various chemical species, which then react and experience a system shift to equilibrium, we have a dynamical system until the steady state condition is reached. During this time, the direction of the system shift is determined by the Gibbs free energy: delta G = delta H - T * delta S.

As can be seen by an inspection of the equation and a knowledge of the appropriate background concepts, the direction that the reaction goes depends on the sign of delta G, which can be determined by S should the T * delta S term predominate. This is thus another example of the relevance of S to non-equilibrium systems, as S can set the direction that a reaction goes during the dynamical period of time before statistical equilibrium is achieved.

Also, I know that I keep harping on the point that the Second Law applies to Thermodynamical machines, which are not equilibrium systems; this gets around the argument some Evolutionists make that living things are somehow immune to the Second Law because they are "open systems." Well, that argument is patently false, as the internal combustion engine in their own automobile is also an open system. As such, a car can't achieve 100% efficiency in converting chemical energy into mechanical energy (efficiency being limited by the Carnot cycle). What the classical Second Law statement does not give us is the extended understanding of Entropy after Boltzmann and 20th-century Statistical Mechanics. However, in the same fashion, living things can't enjoy 100% efficiency in extracting energy from food or in converting the stored chemical energy into catabolic, metabolic and perambulation processes. Also, living things suffer the same Entropic decay as do the artificial creations (machines) of man. Consider the following quote from Lehninger, "Biochemistry: The Molecular Basis of Cell Structure and Function," published 1972:

"Energy Transformations in Living Cells: The molecular complexity and the orderliness of structure of living organisms, in contrast to the randomness of inanimate matter, have profound implications to the physical scientist. The second law of thermodynamics, the branch of physics dealing with energy and its transformations, states that physical and chemical processes tend to increase the disorder, or randomness, in the world, that is, it's entropy. Natural processes never occur in such a way that the total disorder or entropy in the world decreases. How is it then, that living organisms can create and maintain their intricate orderliness in an environment that is relatively disordered and becoming more so with time?

Living organisms do not constitute exceptions to the laws of thermodynamics. Their high degree of molecular orderliness must be paid for in some way, since it cannot arise spontaneously from disorder. The first law of thermodynamics states that energy can be neither created nor destroyed. Living organisms thus cannot consume or use up energy; they can only transform one form of energy into another. They absorb from their environment a form of energy that is useful to them under the special conditions of temperature and pressure in which they live and then return to the environment an equivalent amount of energy in some other, less useful form. The useful form of energy that cells take in is called free energy, which may be simply defined as that type of energy that can do work at constant temperature and pressure. The less useful type of energy that cells return to their environment consists of heat and other forms, which quickly become randomized in the environment and thus increase its disorder, or in the molecular logic of the living state: Living organisms create and maintain their essential orderliness at the expense of their environment, which they cause to become more disordered and random. The environment of living organisms is absolutely essential to them, not only as a source of free energy, but also as a source of raw materials. In the language of thermodynamics, living organisms are "open" systems because they exchange both energy and matter with their environment and, in so doing, transform both. It is characteristic of open systems that they are not in equilibrium with their environment. Although living organisms may appear to be in equilibrium, because they may not change visibly as we observe them over a period of time, actually they exist in what is called a steady state, which is that condition of an open system in which the rate of transfer of matter and energy from the environment into the system is exactly balanced by the rate of transfer of matter and energy out of the system. It is therefore part of the molecular logic of the living state that the cell is a nonequilibrium open system, a machine for extracting free energy from the environment, which it causes to increase in entropy..."

My take on "coherent complexity" would be phrased something like:

a. You can have order at the cost of reduced system complexity and degrees of freedom for the system ensemble (as occurs for example in phase changes).

b. You can have complexity at the cost of reduced order due to more randomization and size in the system ensemble (due to diffusion and the equipartition of energy principle, or if you will, energy diffusion).

c. You can't (i.e. there is a vanishingly low probability of occurrence) have complex order without the intervention of external intelligence performing work on the system, or similarly by self regulating Thermodynamical machines created by an intelligence external to the system.

There is a small problem with the last statement, which asks "what known natural mechanisms are there to exchange Sc (in contrast to Sth) with the environment such that the system's Sc decreases on average?" One must again refer to the concepts of the Gibbs free energy and chemical kinetics to understand the favorability of the natural polymerization of certain organic and "pre-bio" polymers, such as is discussed in the abiogenesis literature, after the work by Stanley Miller and others. There are potential conditions under which the formation of polymers will occur, because it is favorable thermodynamically for them to form, as they represent a minimum system energy configuration.

In these cases, where the concept of Entropy comes in is that past a certain point the molecular weight/size probability becomes vanishingly small, much like the tail of any mono-modal probability distribution, in this case a molecular weight distribution. Also, because diffusion of the reaction constituents and chemical kinetics are inherently random phenomena in solution, informational Entropy comes into the picture due to the expected random distribution of the monomers (in the case of proteins, amino acids) in the resulting polymer(s). It's easier to talk about proteins, but DNA or RNA (if they were the "precursors") would be subject to the same considerations.

Once polymers get to a certain critical size, Entropy again comes into play through degradation processes, parasitic in nature, which break chemical bonds, thus producing two or more smaller fragments. Several Entropic mechanisms/processes exist that accomplish this role. Thermal kinetics is one factor: when long chain polymers are bombarded by molecular collisions, they vibrate and can and will assume resonant vibrational modes that become energetic enough to break bonds. Alternatively, molecules on the high end of the velocity distribution (which looks somewhat like a Poisson distribution at room temperature and is termed the Maxwell-Boltzmann molecular velocity distribution) possess enough energy to disrupt chemical bonds during a collision. Another mechanism for the disruption of chemical bonds in polymers is the absorption of high energy radiant energy (especially UV light), energy from Nuclear fission processes and cosmic (particle) radiation. Sometimes (quite a lot if the time span is long enough), the action of energetic radiation (actually any of the above mechanisms) can produce a homolytic bond dissociation, creating a free radical, which is highly reactive and would not do good things for the linear and 3-D structure of any adjacent "evolving" polymers. Another mechanism that would produce molecular fragmentation, or possibly a "denatured" polymer, is hydrolysis, or reaction with water. Thus Entropy in its various forms, parasitic in nature just like noise, causes the probability of formation of complex biological polymers to become vanishingly small under conditions not found within living cells.

Speaking of living cells, Entropy is up to its dastardly deeds there also. In fact, all of the Entropic mechanisms/processes mentioned above, which are big time heartburn for abiogenesis, are also at work in living cells. Information and function are subject to chemical noise just as transmitted signals are subject to environmentally radiated noise, Johnson noise, thermocouple effect induced EMF noise, etc., etc. Since copying is one of life's biggest activities, such as reproduction, cellular mitosis, generation of messenger RNA and the subsequent production of proteins, one would expect to see Entropy at work here also. In fact we do see this, with mutations or Entropic aging which eventually results in death or serious system malfunction and loss of viability (see for example "We Live Too Short and Die Too Long" by Walter M. Bortz MD). One can see the same system phenomena at work in the case of a document that has been photocopied sequentially (next copy made from the previous copy) so many times that the Nth generation xerox phenomenon makes parts of the document unreadable. Thus, a living system loses information (informational Entropy) in direct consequence of metabolic and catabolic processes throughout its existence, to the point where functionality/information loss has accumulated such that critical functions can't occur, or other organic disease processes (also Entropic) can overwhelm the defense mechanisms of the organism, and death subsequently occurs. Loss of information or the inducement of erroneous information (informational Entropy) is caused directly by Thermodynamical processes as indicated above.
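
As a toy illustration of the Nth generation copying problem, here is a small simulation of mine; the alphabet, message length and per-copy error rate are assumed values chosen only to show the trend:

import random

# Each copying pass mis-copies a small fraction of symbols at random.
random.seed(1)
alphabet = "ACGT"
error_rate = 0.01     # chance a symbol is re-drawn at random on each copy (assumed)
original = [random.choice(alphabet) for _ in range(1000)]

def copy_once(message):
    return [random.choice(alphabet) if random.random() < error_rate else symbol
            for symbol in message]

copy = list(original)
for generation in range(1, 101):
    copy = copy_once(copy)
    if generation in (1, 10, 50, 100):
        corrupted = sum(1 for a, b in zip(original, copy) if a != b)
        print(generation, corrupted / len(original))
# The corrupted fraction creeps upward generation by generation, heading toward
# the 75% expected once a 4-symbol message has been completely randomized.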

To use another example, Reliability theory for mechanical and electrical devices has a key theoretical concept in the cumulative damage model, similar to the discussion of Entropic decay in living systems above. The mechanisms which lead to degradation and loss of function in machines and other widgets are Entropic processes which result in damage that accumulates to the point of failure (in living systems, serious disease and/or death). Parasitic chemical reactions (such as corrosion) and solid state diffusion are leading causes of loss of reliability in electrical and mechanical systems. These mechanisms are directly related to the concept of Thermodynamical/Statistical mechanical Entropy. The life distribution (reliability probability distribution) curves of electrical widgets closely follow human patterns, with infant mortality, adolescent mortality and eventual wear-out mortality. It's also worth mentioning the concept of random stresses contributing to the cumulative damage, according to Miner's rule, where all stresses are normalized for their time of duration with respect to the maximum time of duration for a failure to be induced. Random environmental stresses in this aspect of the model may also be considered to be Entropic in nature.
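
A minimal sketch of Miner's rule, with a load history I have invented purely for illustration:

# Miner's rule: each stress level i applied for n_i cycles, out of the N_i
# cycles that level alone would need to cause failure, contributes n_i / N_i
# to the cumulative damage.  Failure is predicted when the sum reaches about 1.
load_history = [
    # (cycles applied, cycles-to-failure at that stress level)
    (1000, 50000),
    (10000, 500000),
    (100, 2000),
]

damage = sum(applied / to_failure for applied, to_failure in load_history)
print(damage)    # 0.09 -> roughly 9% of the part's fatigue life consumed so far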

David Cavanaugh: 5. You wrote: Correspondent: Finally you said: Are you using 'informational content' in the Shannon (e.g. Information and coding theory) context, or are you using it in the context of S = k ln W ? The answer, I believe, is both and Gange covers that in his book.

David Cavanaugh: For the sake of the audience (many of the audience according to the biographies will know this already), I would like to make a few comments on Information and Coding theory, which was developed in large part due to the work of Claude E. Shannon. Shannon used the concept of Entropy to treat problems in communication, especially in the face of noisy communication channels. In the theoretical framework of Shannon, Entropy is a measure of order at the message source, instead of disorder as in the Statistical Thermodynamical sense, and uncertainty at the message receiver. In both cases, the definition of Entropy flows from the size/number of distinct states of a system canonical ensemble. In the case of Thermodynamics after Boltzmann, S = k ln W. In the Shannon sense,

Entropy = H = - Sum over i of P(i) * log2 P(i),

where P(i) is the probability of the i-th symbol or state; for W equally likely states, P(i) = 1/W and H reduces to log2(W). P(i) or W can be illustrated easily from combinatorial probability calculations of the Entropy of poker hands. The Shannon entropy is measured in bits, not in Thermodynamical units, and one can think of it in terms of a binary decision tree, where the base-two logarithm gives the number of levels in the tree. So for Shannon, the more possible messages or symbols a source can send, the higher the level of uncertainty for the receiver, hence the higher the informational content of the message. Also, the Entropy of Shannon has the formal form of a statistical expected value, more commonly termed an average.
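
A small sketch of the Shannon formula, with two example source distributions of my own choosing:

import math

# Shannon entropy H = -sum_i P(i) * log2 P(i), in bits.
def shannon_entropy(probabilities):
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

uniform = [1.0 / 8] * 8       # eight equally likely symbols
skewed = [0.9, 0.05, 0.02, 0.01, 0.01, 0.005, 0.0025, 0.0025]

print(shannon_entropy(uniform))   # 3.0 bits, i.e. log2(W) for W = 8 equal states
print(shannon_entropy(skewed))    # well under 1 bit: the receiver is rarely surprised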

When properly understood, these are complementary definitions. I would like to also extend a caution about using Information theory, from "An Introduction to Information Theory: Symbols, Signals and Noise," by John R. Pierce, revised edition 1980:

"If we want to understand this Entropy of communication theory, it is best first to clear our minds of any ideas associated with the Entropy of physics. Once we understand Entropy as it is used in communication theory thoroughly, there is no harm in trying to relate it to the entropy of physics, but the literature indicates that some workers have never recovered from the confusion engendered by an early admixture of ideas concerning the Entropies of physics and communication theory. "

David Cavanaugh: 6. You wrote: Correspondent: 4. The modern understanding of entropy was formalized by Hobson in 1971, but was arrived at by others from 1956 through 1967 (Jaynes, Robertson, Schwegler, Zubarev, Scalapino, Kawasaki, and Mori). 5. The modern generalized understanding of Entropy recognizes it entirely as a statistical concept. As Gange states in Appendix 3, following his mathematical definition of Entropy, which you can read there, (words in brackets are mine),

David Cavanaugh: There seem to be some things in this quote which don't seem right. Maybe I don't have enough of the material to really put it into context and the right perspective. The several quotes that you've provided, including the one below, don't seem to fit the conventional forms that the discussion of Entropy is usually cast in, so I've attempted to recast the discussion in the light of what is more traditional in Chemistry, Physics and Information theories. I've been making this argument for the last 25 years, since I first heard it, and during that period of time have snapped up relevant material when I came across it and attempted to bring in novel material where I saw the possibility. In general we are probably saying much of the same thing, but I've introduced a considerable amount of additional material for consideration.

Correspondent: [Begin Quote] "...The failure to recognize entropy as the observer's uncertainty [actually of the uncertainty of the observer's data] arises from the intuitive notion of older ideas that entropy must somehow relate to a mechanical phase function or to some Hamiltonian variable. But this is wrong, and if one insists on understanding entropy in mechanical terms, entropy will never be understood. Entropy is a statistical concept, and has no meaning outside the context of a probability distribution. A probability measures our uncertainty of the occurrence of a single event. Conversely, entropy measures our uncertainty of the occurrence of a collection of events.

David Cavanaugh: I would agree that Entropy can be considered to be uncertainty in data, or noise as I've described it above. One of the best expositions of the statistical concept of signal and noise that I've seen may be found in the book "Statistics for Experimenters" by Box, Hunter and Hunter. More conventionally stated, Entropy is related to the number of distinct states of any system ensemble, or in statistical parlance, the number of degrees of freedom available to the system. Stochastic processes cause the constituent elements of the system ensemble to randomly populate the available levels / niches of the ensemble. The resulting interactions of the system constituent elements form a statistical population, which in equilibrium will develop certain average behaviors or population parameters. If the system is in a dynamical period of change, then the system average behaviors will also track with time, providing the time interval considered is sufficiently longer than the characteristic time constant of the stochastic process(es) in question. Due to the random nature of these system constituent interactions, particularly in the case of molecular kinetics, noise and variation in observable characteristics are introduced. Although Entropy is technically the size of the state space for the system canonical ensemble, it can then also be thought of as a measure of the state of the system and the disorder of the system, because the larger the number of available system states, the larger the potential for randomness when stochastic processes are at work and the more likely the system is in a unique state at a given instant in time.

I'm not completely sure what the author is getting at (perhaps because I don't have the background context) when disparaging the use of "mechanical phase functions" or "Hamiltonian variables" in trying to understand Entropy. The first thing that comes to mind when speaking of Hamiltonians is the Quantum Mechanical Hamiltonian operator. If so, Quantum Mechanical wave functions are conventionally interpreted as probability functions, hence are a statistical concept. Perhaps the mechanical phase function is what I would recognize as a statistical mechanical partition function, and if so, it is derived from the Boltzmann (read probability) distribution, thus is fundamentally also a statistical concept.

Correspondent: There are, therefore, as many different entropies as there are probability distributions, and if the distributions describe different entities, the entropies will be unrelated to one another [as is the entropy of the distribution of nucleotides along DNA (the entropy germane to life) different from the entropy of the distribution of energy levels (which is not the entropy germane to life)].

David Cavanaugh: If we are considering the potential for natural processes to produce abiogenesis, or the formation of life from precursor organic species, then Genetic Entropy must be related to conventional Thermodynamical Entropy, because the organic polymers must be formed by chemical kinetics. Even though there may be independent stochastic processes with resulting random distributions, we still need to consider them all jointly when trying to assess the probability of formation of a living system through natural processes postulated to explain abiogenesis.

Correspondent: When we speak about the entropy of a system, what we mean by this is the entropy of an observer's data of that system. Some have wrongly held that this forces entropy to be subjective because it then becomes relative to the observer. This is also untrue. Entropy is not relative to the observer, but to the observer's data. It is, therefore, truly objective because it is the same for all observers with the same data.

David Cavanaugh: Same comment about the statistical concept of signal and noise.

Correspondent: However, this is not true for mechanical perceptions such as phase functions and phase points. These are subjective concepts because they cannot be measured in many body systems. As such, they are the metaphysical residue of an older time period.

David Cavanaugh: So far I can't seem to understand the justification for denying the traditional statistical mechanical theoretical system, which deals with mechanical concepts in that kinetics are involved and IR radiant emissions and absorptions are due mainly to molecular stretching, bending, wagging and other vibratory modes. IR spectra are not subjective, and may be measured with any IR spectrophotometer with adequate resolution. We measure many things indirectly that have mechanical expressions at the molecular level; who, for example, would question the existence of an electron? There's just too much indirect data to ignore. I would have the same question about terminology as raised above for a more specific definition of phase function and phase point; are these the same things as partition functions? Fundamentally, statistics is the real world mathematics that allows us to treat the problem of the multi-body system, because we treat aggregate properties or population parameters. This has generally been the approach in Statistical Thermodynamics since early this century. Perhaps this is saying the same thing as I am when I argue that deterministic based mathematics is an inadequate basis for treating most real world problems, so one needs a statistical approach.

Correspondent: The generalized entropy S(t) [which he defines in this same appendix] reduces to the familiar thermodynamic entropy at thermal equilibrium, and in general increases [on average] (although it need not be monotonic) as the system approaches equilibrium, i.e.,

S(t=0) <= S(t) <= S(t -> infinity)

In addition, S(t) does not require thermodynamic variables but is valid for any macroscopic observable and in non-equilibrium situations, i.e.,

S(t=0) < S(t -> infinity)

These last two relations constitute the New Generalized Second Law of Thermodynamics for closed systems." [End quote]

David Cavanaugh: At this point, I'm still left scratching my head and wondering what is different between this statement and the classical statement made concerning the Second Law of Thermodynamics, with the additional exposition / consideration of the Third Law definition of Entropy and the deeper levels of understanding that have been achieved throughout this century in statistical mechanics. I believe that I've shown above that the applicability of the classical statement of the Second Law has always been universal in scope, even if it's difficult to calculate for non-isolated, non-equilibrium systems. Also refer to the comments above about closed systems, which are technically impossible, as there is always some energy coupling. It might be better to speak of systems that are "loosely coupled" from their environment in terms of energy transfer.

There's one last topic I'd like to introduce, especially since this is getting to tome length. Several years ago I did a research paper for a graduate algorithms class on the Simulated Annealing algorithm for combinatorial optimization. As I was doing this work, I was amazed at the profound potential of this theoretical tapestry to investigate Entropy and probability and its application to the possibility / probability of abiogenesis and/or the evolution of complex life forms. To provide a little background, there is a significant class of important problems in Computer Science for which there are no known deterministic algorithms that will solve the problems in any kind of practicable period of time. This class of problems would cause super-computers to wander around in the weeds for months and years without converging on an answer. These problems are technically known as the NP class (with several subgroups, such as the NP-complete problems), or Non-deterministic Polynomial class.

Algorithm complexity and time to completion may be predicted by a figure of merit known as the computational complexity, a function of the number of "points" (N) in the problem to be solved. Problems whose computational complexity is polynomial in N can be solved in reasonable periods of time, even for relatively large N. The hard NP problems, on the other hand, involve computational complexity terms of N! or exponentials of N, such as 2^N. This problem is due to the combinatorial explosion of arrangements of the elements of the problem to be solved. Many of these problems are related to scheduling, including the famous traveling salesman problem, where the salesman is on an N city tour and the challenge is to find the shortest total travel distance. It doesn't take too many cities (a few hundred will do) to choke a computer of any stripe using a deterministic algorithm. It turns out that any of the NP class problems may be cast as a combinatorial optimization problem, so now enter non-deterministic, Monte Carlo based algorithms such as Simulated Annealing. The particular problem to be solved must have an optimization function devised, a system configuration / input ensemble devised that is representable by some data structure, and a procedure for moving the system ensemble around in the solution space. It's easy to see why these problems are so hard when one tries to plug all possible configurations into the optimization function.
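
To make the combinatorial explosion concrete, here is a quick sketch with illustrative operation counts:

import math

# Growth of the work implied by different complexity terms: a polynomial such
# as N**2 stays manageable while 2**N and N! explode combinatorially.
for n in (10, 20, 50):
    print(n, n ** 2, 2 ** n, math.factorial(n))
# At N = 50 the polynomial term is 2500 steps, while 2**N is about 1e15 and
# N! is about 3e64, which is why exhaustive search over tours chokes so early.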

In the case of simulated annealing, the optimization function is cast in the form of an energy function that ends up being evaluated as an Entropic function to guide the termination of the algorithm, hence the profound relation to Thermodynamical, stochastic system ensembles. These Monte-Carlo optimization algorithms relax the optimization criteria from absolute maximum optimization to nearly optimal, and as such what predominates in terms of their computational complexity is the average behavior of the algorithm, instead of an absolute deterministic bound. It's not too hard to see the analogy to evolution, where the optimization function applies to viability, or Darwin's survival of the fittest. Interestingly enough, there is a sister class of algorithms to the simulated annealing algorithms known as micro-genetic algorithms; note the evolutionary inspiration. However much evolutionary terminology is used with these algorithms, there are some very important structural differences from natural systems and processes. If there's any interest we can pursue this approach further, but it's just too involved for more than this brief introduction.
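
For the interested reader, here is a minimal simulated annealing sketch on a tiny, randomly generated traveling salesman instance. The city layout, the segment-reversal move and the geometric cooling schedule are illustrative choices of mine, not part of the original research paper:

import math
import random

# The "energy" of a configuration is the tour length; a move reverses a random
# segment of the tour, and worse tours are accepted with probability
# exp(-delta_energy / temperature) while the temperature is slowly lowered.
random.seed(0)
cities = [(random.random(), random.random()) for _ in range(20)]

def tour_length(order):
    return sum(math.dist(cities[order[i]], cities[order[(i + 1) % len(order)]])
               for i in range(len(order)))

order = list(range(len(cities)))
energy = tour_length(order)
temperature = 1.0

while temperature > 1e-3:
    for _ in range(100):
        i, j = sorted(random.sample(range(len(order)), 2))
        candidate = order[:i] + order[i:j + 1][::-1] + order[j + 1:]
        delta = tour_length(candidate) - energy
        if delta < 0 or random.random() < math.exp(-delta / temperature):
            order, energy = candidate, energy + delta
    temperature *= 0.95      # cooling schedule: lower the "temperature" slowly

print(round(energy, 3))      # a near-optimal, not provably optimal, tour length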

Correspondent: For example, if a two-chambered vessel containing hydrogen and hydrogen sulfide gas is heated, but a difference in temperature is maintained between the two chambers (preventing the system from reaching equilibrium), the gases gradually separate from one another. The hotter region is richer in the lighter hydrogen gas, and the other chamber becomes richer in the heavier hydrogen sulfide gas. This is called thermodiffusion. The molecules are moving faster due to the energy input, but they have become more ordered. If thermodynamics does allow for increasing order and thus complexity, it may allow for evolution to occur. (example from "The Arrow of Time," Coveney and Highfield. 1990.)

David Cavanaugh: I'd like to add a couple of points to break down this phenomenon into basic, elementary principles and really produce an intuitive understanding of why this Evolutionary counter-argument is truly full of hot air:

1. There is a temperature gradient across the system, one of the basic thermodynamical requirements for a system to be capable of producing work, hence of being counter-Entropic.

2. The basic argument that the Evolutionist can use to counter the classical Creationist 2nd Law argument has to do with energy/temperature gradients producing the energy/work needed to provide a local reversal of Entropy at the expense of increasing Entropy in the surrounding environment. This argument doesn't hold water (or Entropy for that matter) because Entropy is reduced concomitantly with reducing the statistical degrees of freedom, hence the complexity available to the system. The formation of ice is an excellent example of the creation of order, where the ice system assumes a minimum energy configuration consistent with the temperature, which natural systems like to do, but at the price of a lessening of the number of available system states in the regular crystal structure.

3. Here's a proposed classification system that I like. Other equivalent classifications have been proposed, but this one has lots of advantages. Some classifications make a distinction between matter configuration and energy; I don't think that this is necessary, though it is valid:

a. Simple order. b. Complex Order. c. Simple disorder. d. Complex disorder.

4. The physical processes and mechanisms (diffusion of matter and energy being an example) drive systems toward state d, especially if a system can naturally pick up more degrees of freedom. An energy gradient can naturally suppress available system states, producing lower-order randomness, or state c. Systems always seek to assume minimum energy configurations, thus systems can naturally assume state a.

5. A few additional points about Entropy: a. Entropy is a system state function; its change from one time to some subsequent time is the change of Entropy.

b. A given change of Entropy for a particular system can be achieved by multiple paths (as in a above), but without observing the system between times T0 and T1, the observer can't know which path occurred.

c. Thermodynamical Entropy is a measure of disorder, because it's a measure of the size of the system state ensemble, or how many different configurations the system may occupy.

d. Information theory Entropy is a measure of information content, and like its Thermodynamical cousin it is derived from the size of the system state space/ensemble. The more possibilities, the more meaning a particular system arrangement carries. Therefore, c and d are flip sides of the same coin. (A short sketch of this, and of point i below, follows this list.)

e. Entropy increase is driven by noise (random, stochastic processes) that causes a system to spread over the total number of states available to it (such as with diffusion), at the cost of a loss of system order.

f. State 3b above is not natural for any system; it only occurs when intelligence purposefully arranges matter and/or energy, and it is sustained by an energy gradient at the expense of increasing the Entropy of the surroundings. Artifacts of intelligence (machines) use energy gradients to maintain their internal order and to produce other artifacts (e.g. cellular division) which may behave likewise. Notice the key concept that the artifacts are initiated by intelligence; we argue God.

g. State 3a above may happen in natural systems because of minimum energy arrangements, but the Quantum barrier posed by complexity cannot be bridged except by the intervention of intelligence. Thus Amino Acids may form naturally, but not complex proteins. Micelles can form naturally, such as with soaps, but a cellular membrane, though strongly similar, lies on the far side of the Quantum complexity barrier from the Micelle.

h. Any system that uses energy gradients to maintain internal order pays a price in cumulative internal damage, which eventually results in system dysfunction and "death." This is true with the artifacts of man as well as with Biological systems, which all die in their respective ways.

i. Biological systems cheat death with elaborate information coding and error detection and correction schemes that reduce information loss due to copying noise to extremely low rates. This, for example, is the reason for sexuality, a mechanism for combining two independent sources of genetic information whose redundancy pushes copying error rates down by many orders of magnitude. This kind of copying noise, Entropic in nature both Thermodynamically and in an Information theory sense, is profoundly similar to document copying/Xeroxing noise, which causes a steady decrease of document quality/clarity with each copy, eventually leading to total information loss and unreadability.
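To make points d and i concrete, here is a short Python sketch of my own construction (nothing in it comes from the correspondence). The first part computes Shannon entropy, which grows with the size of the state space; the second simulates repeated noisy copying of a bit string, with and without a simple triple-redundancy majority vote, to show how redundancy slows the Entropic loss of information. The per-bit error rate, message length, and number of generations are arbitrary assumptions.

import math
import random

def shannon_entropy(probabilities):
    """H = -sum(p * log2 p): grows with the number of equally likely states."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

for n in (2, 16, 256):
    print(f"{n:3d} equally likely states -> H = {shannon_entropy([1.0 / n] * n):.1f} bits")

def noisy_copy(bits, error_rate):
    """Copy a bit string, flipping each bit with probability error_rate (copying noise)."""
    return [b ^ 1 if random.random() < error_rate else b for b in bits]

def copy_with_redundancy(bits, error_rate):
    """Make three independent noisy copies and take a per-bit majority vote."""
    copies = [noisy_copy(bits, error_rate) for _ in range(3)]
    return [1 if sum(column) >= 2 else 0 for column in zip(*copies)]

random.seed(0)
original = [random.randint(0, 1) for _ in range(10000)]
error_rate = 0.01  # assumed per-bit copying error

plain, protected = original, original
for generation in range(50):
    plain = noisy_copy(plain, error_rate)
    protected = copy_with_redundancy(protected, error_rate)

def fraction_corrupted(copy):
    return sum(a != b for a, b in zip(original, copy)) / len(original)

print(f"after 50 generations, plain copying:         {fraction_corrupted(plain):.1%} of bits corrupted")
print(f"after 50 generations, majority-vote copying: {fraction_corrupted(protected):.1%} of bits corrupted")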

6. I've written extensively on Entropy, so I'll not repeat it all here. These somewhat long and technical discussions can be found in the archives, or I can provide a couple of good ones to individuals upon request.

7. Now back to H2 and H2S in a system with two chambers at different temperatures, joined by a tube:

a. Heating one chamber initially produces a pressure difference along with the temperature gradient, a consequence of the increased molecular speeds, energy and momentum; once the gas redistributes, the pressure evens out across the connected chambers (see the closing thought below).

b. The basic mechanism producing the concentration gradient really lies with the H2S: when it moves to the cooler side, it loses the molecular energy and momentum needed to overcome the energy gradient and diffuse back up the pipe to the other chamber.

c. H2S is not only denser than H2 but is also a much heavier molecule, and thus takes much more energy (as 1/2 MV^2) to bring to a given speed. Also, H2S is a much more polar molecule than H2, hence its collisions are much further from the elastic ideal, and the molecules tend to stick together more. H2, on the other hand, comes much closer to ideal-gas behavior over a large range of pressures. (A numerical comparison of thermal speeds follows this list.)

d. The Hydrogen (H2) is not so much attracted to the heat as it is forced above the concentrating H2S, and thus it concentrates by default in the head space above the H2S, which includes the pipe and the other chamber.

e. As mentioned above, the net work on the system (in concentrating the H2S) is in the direction of the heat flow from higher temperature to lower temperature, just as the laws of thermodynamics require. By the way, there is a back pressure from the H2S concentration that would tend to drive diffusion of the H2S across the entire system, a force due to "Entropy," but the temperature gradient provides the energy necessary to hold the system in a steady state, or dynamic equilibrium if you will.

f. Note also that the H2, being so much less massive than the H2S, will always pick up the energy necessary from collisions with H2S molecules to climb the energy gradient impressed upon the system.

g. Eventually, the H2S diffusion proceeds to the point where the regions of different gas composition maintain themselves, according to the Thermodynamical principles discussed above.

h. Thus the Evolutionary counter-argument is always shot down with penetrating analysis, patience, boldness and persistence.

i. This argument is limited to nuking Naturalism and Atheism; it cannot be logically extended to argue a young earth or to disprove various forms of Theistic Evolution, but it's not a bad start and it is my favorite approach. (That's a surprise to everyone ...)
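To put rough numbers on points c and f: at a given temperature both gases carry the same average kinetic energy per molecule, but the much lighter H2 moves roughly four times faster than H2S. The little Python sketch below computes root-mean-square speeds from kinetic theory (v_rms = sqrt(3kT/m)); the two chamber temperatures are arbitrary picks for illustration.

import math

K_B = 1.380649e-23       # Boltzmann constant, J/K
AMU = 1.66053906660e-27  # atomic mass unit, kg

def rms_speed(molar_mass_amu, temperature_k):
    """Root-mean-square molecular speed from kinetic theory: v_rms = sqrt(3 k T / m)."""
    m = molar_mass_amu * AMU
    return math.sqrt(3 * K_B * temperature_k / m)

for temperature in (300, 400):  # illustrative "cool" and "hot" chamber temperatures, K
    v_h2 = rms_speed(2.016, temperature)    # H2, ~2 atomic mass units
    v_h2s = rms_speed(34.08, temperature)   # H2S, ~34 atomic mass units
    print(f"T = {temperature} K: H2 ~ {v_h2:.0f} m/s, H2S ~ {v_h2s:.0f} m/s "
          f"(speed ratio ~ {v_h2 / v_h2s:.1f})")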

One final thought that occurred to me after writing this. Think about why we have a system here where there is a kinetic energy gradient (the same as the temperature gradient), yet the pressure is constant across the system. The kinetic energy is 1/2 MV^2 and the momentum is MV; what is the relationship, and what is the difference, between these two physical quantities? Momentum is also conserved in this system, such that the H2 momentum balances the H2S momentum, and thus the pressure is the same, from the impulse relationship F * delta T = M * delta V, which determines the force of the inter-molecular collisions at the H2/H2S gas pocket interface. The H2 gets its higher momentum from an increase in molecular velocity, achieving parity with the momentum of the H2S, which comes from its higher relative mass. Here's the interesting paradox, then: an energy gradient exists, yet a momentum gradient does not. Why do bodies conserve momentum and energy at the same time, these being attributes of the same bodies, where one increases linearly with velocity (momentum) yet the other increases with the square of the velocity? I don't need a lecture with the math, I understand it; I just want to stimulate some thought about the physical and philosophical aspects of this. (A small numerical illustration follows.)
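For readers who do want numbers, here is a tiny Python illustration of the scaling behind the paradox, using an arbitrary, made-up momentum value: at equal momentum p, kinetic energy is p^2 / (2m), so the lighter H2 molecule carries roughly 17 times the kinetic energy of an H2S molecule with the same momentum.

AMU = 1.66053906660e-27  # atomic mass unit, kg
m_h2, m_h2s = 2.016 * AMU, 34.08 * AMU

p = 1.0e-23  # an arbitrary, illustrative momentum, kg*m/s

for name, m in (("H2", m_h2), ("H2S", m_h2s)):
    v = p / m                 # velocity needed to carry momentum p
    ke = p * p / (2 * m)      # equivalently 0.5 * m * v**2
    print(f"{name:>3}: v = {v:7.0f} m/s, kinetic energy = {ke:.2e} J")

print(f"kinetic-energy ratio at equal momentum = {m_h2s / m_h2:.1f}")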

September 8, 1996. Updated January 20, 1997.

Addendum

I've felt for some time now that the "Laws" of Thermodynamics are badly in need of an overhaul, to specify and enumerate more laws than the three currently taught.

I would like to propose the following list as candidates for the new "laws" of Thermodynamics, which I shall call principles at this juncture:

1. Principle of the conservation of energy.

2. Principle of the free diffusion of energy, and constrained (by physical forces and matter interactions) diffusion of matter, from a region of high concentration to a region of lower concentration.

3. Principle of the saturated partition of energy between the possible internal energy states of matter at a given temperature, where the internal energy states of matter have distinct but finite capacities to store energy. When the possible internal energy states of a distinct system ensemble of matter have equal energy storage capacities, the principle of equal partition of energy follows as a necessary consequence. (A small numerical illustration follows this list.)

4. The principle of the exchange of energy between potential (stored) and free states.

5. The principle of interchangeability between forms of energy.

6. The principle of equivalence of matter and energy.

7. The principle of inefficiency; some energy is always degraded to an unusable form (lost as useful work) when it is converted from one form to another.

8. The principle of Entropy; a fundamental property of energy and matter. Entropy represents the number of statistical degrees of freedom available to any given thermodynamical ensemble or system. Entropy is determined by the unique energy constraints and configuration of the particular system. Thus, the Entropy of a perfect crystal at absolute thermal zero is zero, because every constituent settles into the single ground state of the crystal, leaving only one accessible configuration.

9. The principle of simplicity; all thermodynamical systems trend to the simplest configuration available, a configuration maximizing the size of the system ensemble state space (Entropy) subject to the inherent constraints of the system organization principles and total energy. No complex, coherent system arises spontaneously without the application of organizing work from a source external to the system.

10. The principle of minimization of system energy.

11. The principle of coherent work; energy cannot spontaneously move up an energy concentration gradient in any thermodynamical system without the application of coherent work from a source external to the system.

12. The principle of loss of information. Any system functioning as a thermodynamical machine, capable of producing organized and coherent work, must with time suffer informational loss critical to the maintenance, function and efficiency of the machine.

13. The principle of teleology; any thermodynamical system (machine) capable of coherent, self-directed, organized work is the product of intelligence and design.
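Returning to principle 3 above: the familiar equipartition result is that each quadratic degree of freedom stores, on average, (1/2) k T per molecule. The short Python sketch below is an illustration of my own using a made-up temperature; the degree-of-freedom counts (3 for a monatomic gas, 5 for a diatomic gas at ordinary temperatures) are standard kinetic-theory values.

K_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro's number, 1/mol

def internal_energy_per_mole(degrees_of_freedom, temperature_k):
    """U = (f/2) k T per molecule, times Avogadro's number for one mole."""
    return 0.5 * degrees_of_freedom * K_B * temperature_k * N_A

T = 300  # illustrative temperature, K
for label, f in (("monatomic gas (3 translational modes)", 3),
                 ("diatomic gas (3 translational + 2 rotational modes)", 5)):
    print(f"{label}: U ~ {internal_energy_per_mole(f, T):.0f} J/mol at {T} K")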

This is the first time I've attempted to formally enumerate this list, so I'm sure there's lots of room for refinement, but it's a start. (added February 18, 1999)
