Pier Maria Boria
Thermodynamics & life
TTC – Thermodynamic Theory of Creation (Refreshed in AD 2013)
Part 1 (of 4): Entropy

1.1 ENTROPY
To understand the meaning of Entropy¹, the first pillar of this paper, it is useful to start with its generalized, qualitative definition: entropy is an indicator of the state of disorder of a defined group of bodies. The greater the disorder, the greater the entropy ("Entropy: synonymous with disorder": Helmholtz, 1821-1894). We will proceed backwards to its first inception, which is strictly thermodynamic in origin. To provide clarity, let us consider the following situation: a room containing a table, and on the table a sealed bottle filled with smoke (of unknown nature). An observer can take a photograph, as a witness, of the initial state of order: clearly defined are the bottle, the table, the smoke (which occupies a well defined volume), as well as the room, which constitutes our "universe": the system being observed plus its environment.
Figure 1.1 – From the point of view of Thermodynamics, the only possible spontaneous transformation is that of increasing entropy.

Opening the bottle (Figure 1.1) will result in diffusion of the smoke into the room. After a certain period of time (let us say a day) the observer will be able to record a state of increased disorder: the smoke has come out of the bottle. One could imagine that after a million days the table could have disintegrated, or in any case the interaction of this universe with others (caused, for example, by a cataclysm) would have resulted in the destruction of the table and the bottle, and finally of the room itself: the observer will take a different photograph. Since the observations could be thought to extend over an unlimited time, the photographs, in succession, will indicate an increasing state of disorder, in other words of entropy. Adopting the language of Prigogine: transformations towards increasing entropy "produce" positive entropy (the difference in entropy between the final and initial states is ≥ 0), while those towards decreasing entropy produce negative entropy. In parentheses we note that the inverse transformation (the smoke re-entering the bottle) could not occur for at least two reasons, each sufficient in itself.
¹ En = inside, tropien = direction, in the sense of side: evolution.
First, the escape of the smoke is an asymptotic process: its concentration tends towards perfect uniformity in the volume only as time tends to infinity; second, without a concentration gradient no net movement of mass within the expanded smoke is possible.

Proceeding backwards in history, we observe that the concept of entropy made its first entry into physics thanks to the work of Clausius (Germany, 1822-1888), who was searching for the conservation principles which govern thermodynamics. Conservation principles (which answer the question: "what remains the same after a transformation?") are the pillars of any scientific discipline². Curiously, he stumbled upon a principle of non-conservation, and came to define an index of state, which some have called anomalous, and which he named Entropy. Initially, therefore, the concept of entropy was strictly thermodynamic (the state of the system under observation depends on variables such as temperature, pressure and volume), while the observations with which we started are, as stated, macroscopic, qualitative generalizations. It is understood that entropy is not a conserved quantity (except in reversible transformations, which are entirely theoretical): in transformations which can be performed in practice, in which there is an interaction between the system under observation and the environment, entropy increases after the transformation (this allows a prediction of the direction which the transformation will take).

Figure 1.2 shows another example: a "cold" body at temperature T1 is placed into contact with a "warm" body at temperature T2; the variables at play are the quantity of heat exchanged, Q, and the temperature T. Experience tells us that the heat Q will pass from the body at higher temperature to that at lower temperature (Clausius' postulate) until an equilibrium temperature Te, somewhere between the two, is reached.
Figure 1.2 – Clausius' Postulate

Clausius observes that the "quantity of heat transformed", Q, divided by the final and by the initial temperature gives

∆S = Q/Te − Q/T2 > 0,

because Te < T2.
² We are reminded, for example, of the principle of conservation of energy, the principle of conservation of angular momentum, etc.
Clausius called the ratio S = Q/T Entropy. Using current language, we can say that the heat exchanged has performed a transformation in which ∆S > 0.
1.2 A NUMERICAL APPLICATION
Let us now work through a simple numerical example using what we call the Clausius calorimeter: an adiabatic calorimeter containing water and a warm body (a cube of copper):
Figure 1.3 – The Clausius calorimeter

We postulate the following conditions:
• starting temperature of the water: 300 K;
• starting temperature of the copper: 400 K;
• equilibrium temperature: 310 K;
• quantity of heat exchanged: 50 J.
Since, as is well known, the elementary variation of entropy is

dS = δQ / T,

introducing the thermal capacity C (the mass m multiplied by its specific heat c) of each body we have δQ = m · c · dT = C · dT, hence dS = C · dT / T, and integrating for each of the two bodies we obtain, for the copper,
∆S(copper) = C(copper) · ln(310/400) = −0.255 · C(copper),

and for the water

∆S(water) = C(water) · ln(310/300) = +0.033 · C(water).
More simply we obtain the thermal capacity of each body:
C(copper) = 50/90 ≈ 0.55 J/K (for ∆T = 90 K),
C(water) = 50/10 = 5 J/K (for ∆T = 10 K),
and subsequently we can calculate the total variation in entropy of our closed system:

∆S = ∆S(copper) + ∆S(water) = −0.142 + 0.165 = +0.023 J/K > 0.    (1)

This is a preview of the second law of Thermodynamics. In equation (1) we found two addends of opposite sign, each representing a "local" variation of entropy: it follows that, even though the total entropy of the test universe increases, we can have local variations of opposite sign³. In fact, generally, in a thermal transformation some mass increases in temperature and another decreases; the heat exchanged is the same for both, and we can write ∆Sh > 0 (Tfinal > Tinitial) and ∆Sc < 0 (Tfinal < Tinitial); that is, the cooled body decreases its own enthalpy, in the opposite direction to the heated one (the meaning of the indices h and c is obvious). This observation will soon be useful when talking about "Entropy and Life".
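For readers who like to check the arithmetic, the following minimal sketch (in Python) reproduces the numbers of this section with the relation ∆S = C · ln(Tfinal/Tinitial); the values are exactly those postulated above.

```python
import math

# Clausius calorimeter example (values from Section 1.2)
T_water, T_copper, T_eq = 300.0, 400.0, 310.0   # K
Q = 50.0                                        # J exchanged

# Thermal capacities C = Q / |dT| for each body
C_copper = Q / (T_copper - T_eq)   # 50/90 ~ 0.55 J/K
C_water  = Q / (T_eq - T_water)    # 50/10 = 5 J/K

# Entropy change of each body: dS = C * ln(T_final / T_initial)
dS_copper = C_copper * math.log(T_eq / T_copper)  # ~ -0.142 J/K
dS_water  = C_water  * math.log(T_eq / T_water)   # ~ +0.165 J/K

dS_total = dS_copper + dS_water                   # ~ +0.023 J/K > 0
print(f"dS_Cu = {dS_copper:+.3f} J/K, dS_w = {dS_water:+.3f} J/K, total = {dS_total:+.3f} J/K")
```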
1.3 ANALOGY BETWEEN ENTROPY AND WEIGHT
The content of this paragraph is not essential for the purpose of this paper; however, we consider it useful for completing the understanding of entropy. Amongst the physicists of the XIX century, Zeuner (Germany, 1828-1907) proposed an interesting analogy between the gravitational potential energy of a weight P and the entropy of a mass with heat Q at temperature T. With reference to Figure 1.4, we know that the potential energy (i.e. the mechanical work which can be performed) of the water mass in the reservoir is L = P · ∆H.
³ It seems rational to accept the popular statement according to which the entropy of the astronomical universe increases indefinitely, in spite of our limited knowledge of that universe (see also the "Anthropic Principle"); in any case, take care not to confuse it with the closed test universe considered here.
Figure 1.4 – System to transform gravitational potential energy into mechanical energy at a motor shaft.

Zeuner studied the work obtainable from a thermal motor capable of transforming heat into work with a Carnot Cycle⁴, which allows the efficiency of the heat-to-work transformation to be expressed exclusively in terms of temperature (rather than of quantity of heat), and this leads us to our goal. In fact, as is widely known, the efficiency of the Carnot Cycle is
η = (T − T0)/T = 1 − T0/T,

where T − T0 is the difference in temperature between "source" and "coolant". Consequently, introducing the quantity of heat Q into the motor, the mechanical work L obtainable will be

L = Q · η = Q · (T − T0)/T = (Q/T) · ∆T,    (2)
or rather, the expression that appears in Figure 1.4, where the entropy Q/T is a factor of proportionality analogous to the weight P, and where the change in height ∆H corresponds to the change in temperature ∆T which the motor is able to exploit (from ∆T″ < ∆T′ one has, in proportion, L″ < L′, with the consequence that the residual internal energy, discharged and no longer transformable into work, will be U″ > U′). We can observe that a functional tie exists between Q and T, such that by increasing Q, T increases in direct proportion (considering as constant the specific heat of the mass which runs the cycle, with no latent heat exchanged); therefore, given a particular initial entropy, the work obtainable depends exclusively on the ∆T achievable. A motor which expels heat at a lower temperature produces more mechanical work for equal "consumption": this is the purpose of the comparison between the two thermal motors in Figure 1.5.
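A minimal sketch of that comparison follows; the numerical values (Q, T, and the two exhaust temperatures) are illustrative assumptions, not taken from the paper.

```python
# Sketch: work from a Carnot engine for two exhaust temperatures (cf. Figure 1.5).
# Illustrative values only.
Q = 1000.0      # J of heat drawn from the source
T = 600.0       # K, source temperature
for T0 in (400.0, 300.0):           # two different "coolant" temperatures
    eta = 1.0 - T0 / T              # Carnot efficiency
    L = Q * eta                     # work obtained, L = Q*(T - T0)/T = (Q/T)*dT
    rejected = Q - L                # heat rejected, no longer convertible here
    print(f"T0 = {T0:.0f} K: eta = {eta:.2f}, L = {L:.0f} J, rejected = {rejected:.0f} J")
```

The lower the exhaust temperature, the larger the work obtained for the same "consumption" Q, exactly as stated above.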
⁴ A car run on petrol performs an Otto Cycle, one on diesel a Diesel Cycle, an external-combustion (steam) motor a Rankine Cycle, etc.
Figure 1.5 – Comparison between the mechanical work obtained from two identical thermal motors functioning according to the Carnot cycle, for two different exhaust temperatures (T0″ in case A and T0′ in case B).

The point of view seen above, summarized in equation (2), might seem to favour high values of entropy tout court: to avoid erroneous generalizations it must be remembered that, as Gibbs showed, the maximum energetic gain in a thermal transformation, that is the maximum "free energy" G, is obtained by exploiting the total energy (enthalpy) H of the active mass while minimizing the entropy at discharge. In fact, the Gibbs equation states that

G = H − T · S,
where H represents the enthalpy of the mass transformed. We stress that it is necessary to compare two cases with identical initial temperature (as in Figure 1.5) and to consider that it is the factor ∆T which determines the efficiency of the transformation⁵. Sea water contains an enormous amount of thermal energy, but at a temperature T (of the source) very near to T0 (that of the coolant): in other words, the heat it contains is rendered unusable; we can state that sea water contains a "large" amount of thermal energy, but offers no practical possibility of making a thermal motor work (the thermal difference available is "practically nil"). Exactly for this reason a boiler which burns fossil fuel, capable of reaching "high" temperatures, merely in order to provide water at 90 °C is to be considered the perpetrator of a grave "thermodynamic crime": that fuel could be used to greater effect, for example, in a cogeneration plant where low-temperature water is a "waste" product.
⁵ Sources at high temperature are necessary to produce thermodynamic cycles with acceptable results. Our car engine, be it Otto or Diesel, develops a temperature of around 1500 °C in the combustion chamber and gives us a mechanical efficiency at the wheels of about 35% (approximately 30% remains "internal energy" and is expelled with the exhaust; the coolant temperature is that of the atmosphere; the remainder is transformed into heat by thermal losses and passive resistances and is dispersed mainly by the radiator).
1.4 ENTROPY AND LIFE
Livio Gratton (the Italian cosmologist from Trieste who died in 1991, considered the father of Italian astrophysics) observed that the phenomenon "life" contains something singular which does not fit the mechanism described up to this point. The appearance of life, in an electromagnetically structured universe, constitutes a singular moment which cannot be explained technically. In fact, an organism is alive when, within itself, it produces transformations of negative entropy (that is, with ∆S < 0), which contradicts the second principle. Let us observe a plant seed: if it is alive, in the conditions expected in nature, it germinates spontaneously and grows, capturing carbon from the atmosphere, giving body to the plant, and releasing oxygen through chlorophyll synthesis. A small wheat seedling, recently sprouted amongst the snow, germinates and grows, warming itself at the expense of the ground (who has not observed the melted snow around the seedling? The seedlings, under a thin blanket of snow, poke out and are clearly visible, green shoots on a white blanket, in the middle of a dark patch of earth, free of the snow that surrounds them!). Naturally, if we were also to consider the interaction of the plant with the quanta of solar energy and the surrounding minerals, we would find that the sum of the transformations has generated positive entropy (the affirmation that the entropy of the universe tends to increase without limit remains correct).

A living animal organism, should it be injured, is capable of healing itself: the vis vitalis, as our ancestors called it, produces such an effect, while a dead animal organism remains injured and decomposes with the passing of time (increase of disorder). One could consider the possibility of turning to entropy to define the state of life or death, about which we periodically debate, even in practical cases (Terri Schiavo, Eluana Englaro…): if the organism produces negative entropy it is alive; in the opposite case it is not… One could also suggest a crude experimental procedure, of a slightly Hitlerian nature..., which would settle the matter once and for all, consisting of injuring an organism whose state of life is in doubt and verifying its reactions in one entropic direction or the other… The vis vitalis departs even when all the mechanical organs are perfectly functional: we can think of the so-called cardiac arrest (a phrase that can be a savior for a corner of the art of medicine). One could certainly object that the arrest is the cause, while the departure of the vis vitalis is the effect: who knows!? The only certainty is that with death an irreversible process starts, with the production of positive entropy... and we fall back into line with the second principle!

In conclusion, it can be said that the property of entropy is that of increasing in every transformation that can be performed in practice (that is, in every irreversible transformation), except in the case of living organisms. How can nature produce the heating of the plant at the expense of the surrounding masses, and increase the order of the molecules to the point of "forcing" the carbon, taken from the most formless state in existence (that of gaseous CO2), to take on the shape of a trunk, giving rise to transformations of decreasing entropy?
An ordinary refrigerator can also produce a local decrease of entropy by expending some energy; in the following figure we represent the energy transformations occurring in it: at the end of the transformation we have the temperatures marked with an asterisk, after the energy Q* leaves the cool body and joins the warmer body together with the energy Q3 that is needed to run the refrigerator⁶.
⁶ The ratio (Q2 + Q* + Q3)/Q3 is the widely known COP (Coefficient Of Performance) of heat pumps.
Figure 1.6 – Heat pumping in a refrigerator

In this sketch the external energy Q3 appears essential and the system is open: the energy Q* increases its entropy, reaching the temperature T2* on entering the condenser. Restarting the numerical example of the Clausius calorimeter, we take again Q* = 50 J as the heat exchanged; in this condition it is easy to verify that the water temperature decreases by 10 K while that of the copper increases by 90 K. Assuming COP = 3, the final temperatures of the water and of the copper are
T1* = 290 K and T2* = 490 + 90/3 = 520 K;

proceeding as above, it follows that, for the copper,

∆S*(copper) = 0.55 · ln(520/400) ≈ +0.15 J/K,
and for the water

∆S*(water) = 5 · ln(290/300) = 5 · (−0.034) ≈ −0.17 J/K.
Therefore the quantity of transferred heat, Q*, is subject to the variation

∆S* = ∆S*(copper) + ∆S*(water) ≈ +0.15 − 0.17 = −0.02 J/K < 0:

thanks to the contribution of the external energy Q3, the exchanged heat decreases its entropy (a numerical cross-check of these figures is sketched below)! Now we will see in what way nature performs this heat pumping.
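The cross-check is a minimal sketch under the same assumptions as the Section 1.2 example: the thermal capacities are those computed there, and the final temperatures are the ones assumed above.

```python
import math

# Refrigerator variant of the calorimeter example (Section 1.4):
# Q* = 50 J is pumped from the water (300 K -> 290 K) into the copper,
# which also receives the drive energy Q3 (COP = 3), ending at 520 K.
C_copper, C_water = 50.0 / 90.0, 50.0 / 10.0      # J/K, as in Section 1.2

dS_copper = C_copper * math.log(520.0 / 400.0)    # ~ +0.15 J/K
dS_water  = C_water  * math.log(290.0 / 300.0)    # ~ -0.17 J/K
print(f"total dS ~ {dS_copper + dS_water:+.3f} J/K")  # ~ -0.02 J/K < 0
```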
Part 2 (of 4): Boltzmann's Distribution

2.1 THE BOLTZMANN DISTRIBUTION
We will reply to the question after having examined the second pillar on which this paper rests: Boltzmann's Distribution (Ludwig Boltzmann: Austria, 1844-1906). As can also be seen in excellent web pages, the disordered thermal velocities of the molecules of a gas (but also those of liquids and solids), at a given temperature, take on values which vary continuously and randomly, following a particular distribution represented graphically in Figure 2.1.
Figure 2.1 – Probability distribution of the velocity of the molecules of a gas as a function of the velocity itself, according to Boltzmann's statistics.
It is thanks to this distribution, discovered by Boltzmann, that living nature, vegetable and animal, can perform local transformations with decreasing entropy. The great masters have devised theoretical experiments based on devices capable of selecting those molecules of the colder gas having velocities higher than the average velocity of the molecules of the warmer gas (Maxwell: the demon; Polvani: the choosing porter; Amerio: the selecting valve), in order to allow them to pass from the lower-temperature environment to the adjacent, higher-temperature one, in this way obtaining a transformation which locally invalidates the second principle of thermodynamics. In Figure 2.2 it can be seen that, for every average velocity of the "warm" molecules, one can find a corresponding branch of the "cold" curve containing those particles which, should they pass to the warmer side, would cause an increase in that average velocity, and therefore in the temperature.
Figure 2.2 – The Maxwell demon allows the passage, from the colder to the warmer environment, only of those molecules which have a velocity higher than the weighted average velocity of the warmer molecules.

It would be necessary to perform a sorting of the molecules, one by one, with mechanical means not available to man, while experimental observations of the kind reported above suggest that nature is capable of it, operating at the molecular level in the realm of living organisms. Figure 2.3 represents the device which allows the "theoretical experiment", in the form proposed by Prof. Amerio of the Polytechnic of Milan (1955). Maxwell had proposed a "demon" as the selector of the molecules (1867); the selection device was later the object of particular attention on the part of Szilard (1929) and of Bennett (1981), with the aim of correctly accounting for the variation of entropy in the test universe and of calculating the energy required for the selection.
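The overlap just described can be illustrated with a toy numerical sketch: speeds are sampled as the magnitude of a three-component Gaussian velocity vector whose spread grows with temperature (arbitrary units; the temperatures 300 and 400 merely echo the earlier example and are not taken from the figure).

```python
import random, math

# Toy illustration of the overlap between "cold" and "warm" speed distributions.
random.seed(0)

def sample_speeds(temperature, n=50_000):
    sigma = math.sqrt(temperature)   # per-component spread grows with T (units folded in)
    return [math.sqrt(sum(random.gauss(0, sigma) ** 2 for _ in range(3))) for _ in range(n)]

cold, warm = sample_speeds(300.0), sample_speeds(400.0)
warm_mean = sum(warm) / len(warm)
fraction = sum(v > warm_mean for v in cold) / len(cold)
print(f"fraction of 'cold' molecules faster than the warm-gas mean speed: {fraction:.1%}")
```

A substantial fraction of the "cold" molecules is faster than the mean speed of the "warm" gas: these are the molecules a demon (or valve) would let through.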
Figure 2.3 – The selective valve allows the passage, from the colder to the warmer environment, only of the molecules which have a velocity higher than the weighted average velocity of the warmer molecules, as shown in Fig. 2.2.
These elementary applications of classical thermodynamics, based on the concept of entropy and on Boltzmann's Distribution, suggest to us that the phenomenon "life" is to be associated with a "vis vitalis" external to the dissipative mechanisms of which we have ample daily experience. Obviously it is impossible for man to build a Maxwell device, but in our research we have found a very interesting observation by Jacques Monod (Nobel Prize in 1965) which assigns the part of the demon to the natural enzymes⁷. According to this point of view we can convert Figure 1.6 as follows:
Figure 2.4 – The natural heat pumping performed by enzymes

We consider this sketch as typical of the phenomenon "life". The role played by the vis vitalis seems essential, because the electro-chemical energy and the enzymes are components easily obtainable in biological laboratories, yet nobody has been able to start life from these components⁸. There are those who attempt an approach to this argument with improper methods and with arbitrary applications of the concept of probability, which leads to theories devoid of the respect owed to sound scientific doctrine.
2.2 CONCLUSIONS FROM THE FIRST AND SECOND PART
Rivers of ink have been spilled on the origin of life, to the point that one can read the most bizarre theories, which completely ignore what is suggested by the Queen of Physics: Thermodynamics. Paleontology, Biology, extraterrestrials, UFOs, Cosmic Palingenesis and the like are all stirred together: numbers, equations, concepts of probability, principles of conservation, etc., are not used correctly.
⁷ Le hasard et la nécessité, 1970 – Arnoldo Mondadori Editore SpA – Milan – p. 58.
⁸ See the Stanley Miller experiment at the end of paragraph 5.4.
These are the only possible foundations for a correctly framed scientific discussion (there is no adjective more abused than "scientific"). The reader could (perhaps on a rainy Sunday) do some research on the "primordial soup" (which, if it is not Knorr – for whose brand, modestly, in our youth, we carried out thermodynamic design work – does not taste good!...), on the "cosmic tank", on the "typing monkeys", on the carbon and oxygen cycle (in relation to the demonization of CO2), on the hydrological cycle (water being a substance that cannot be "consumed", as one currently hears said, otherwise what cycle would it complete?): subjects often treated by substituting ideology for Science and by making ample use of the principle of superior authority (the ipse dixit of historical memory), upholding disjointed dogmas which are, however, politically correct. Sometimes one has the feeling of witnessing the squalid gossip of women at the fountain!...

It can be noted that in the observations made up to now we have practically not talked about energy, whose role in the economy of our discourse has been secondary. It is the definition of the entropy state index which changes the way we view the cosmos: we would not speak of it if it were possible to carry out reversible transformations. One comes to suspect that irreversibility is a "defect" of the cosmos, having the function of forcing it towards a gradual entropic enrichment (and therefore a degradation of energy), such that the final form of all the available energy becomes thermally and entropically unusable: therefore, by virtue of what has been discussed, at a certain point in the evolution of the universe, at a finite time, it will no longer be possible to perform any thermodynamic cycle in practice⁹. That is to say, the thermal death of the universe.
⁹ We would even be willing to suspect a decay of the cosmological properties correlated with original sin. Ah, free thought!...
Part 3 (of 4): Probability
3.1 PROBABILITY IN BOLTZMANN'S STATISTICS
Boltzmann obtained the probability graph, as a function of temperature, by postulating a certain number m of particles, indistinguishable from each other (which we will call A, B, C, ..., M), and a number n of possible states (a, b, c, ..., n) in which one or more particles (even all m of them) can find themselves; the presence of particles in each state can occur with different possibilities. If the identical particles are free to occupy the various states (as in the case of a gas), they continuously exchange states between themselves (for example thanks to reciprocal impacts, as in Figure 2.3), whilst "on average" maintaining a certain distribution: subject to the conditions around them (for example the temperature), a certain distribution of the possible configurations is typical of those conditions. Continuing with this example, if by the state of a particle we mean the possession of a certain amount of kinetic energy E associated with each molecule of the gas, then in a certain interval of energy ∆E there will be a stable quantity of molecules, even though continuous exchanges of energy occur amongst them: within the same interval, some particles enter and some leave. If, for the sake of imagination, in what follows particles are pictured as "balls" and states as levels of energy, the balls will represent the particles, while the levels will represent an interval of energy ∆E.

Let us start with a very simple case consisting of 3 particles (m = 3) able to be hosted by two levels (n = 2), as illustrated in Figure 3.1. In the left column we see all the possible combinations. In the central section we see that certain combinations repeat themselves in such a way that, if the particles are indistinguishable (column 3), they are to be considered the same amongst themselves. Therefore, three possibilities exist for each of the groups of combinations 2, 3, 4 and 5, 6, 7, and only one for the combinations 1 and 8. If we "normalize" the possibility (expressing it in unitary or percentage terms), it assumes the role of a probability (ratio between favorable cases and possible cases), which we have done in the last column by expressing it in percentage terms, as is common practice.
Figure 3.1 – A rather simple case to demonstrate how, given m = 3 and n = 2, it is possible to have different probabilities for each combination.
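A brute-force enumeration of the same case (m = 3 particles, n = 2 levels) can be sketched in a few lines; it reproduces the counts and percentages of the last column of Figure 3.1.

```python
from itertools import product
from collections import Counter

m, n = 3, 2   # 3 distinguishable particles, 2 levels (as in Figure 3.1)

# Each configuration assigns a level (0 or 1) to each particle: n**m = 8 in total.
occupancy_counts = Counter()
for config in product(range(n), repeat=m):
    occupancy = tuple(config.count(level) for level in range(n))  # e.g. (2, 1)
    occupancy_counts[occupancy] += 1

total = n ** m
for occupancy, count in sorted(occupancy_counts.items()):
    print(f"occupancy {occupancy}: {count} of {total} ways = {100 * count / total:.1f}%")
# -> (0,3) and (3,0) occur once each (12.5%); (1,2) and (2,1) occur 3 times each (37.5%)
```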
This allows us to draw the graph of Figure 3.2 where we can begin to see the Boltzmann distribution forming:
Figure 3.2 – The embryonic Boltzmann diagram: as the number of particles and of possible states increases, the envelope of the columns (not yet in this particular case) acquires the characteristic asymmetric bell shape.
Following in the footsteps of the great Ludwig, we move on to systems which are numerically more substantial: three combinations of seven states with an arbitrary arrangement of four particles, as represented in Figure 3.3; the three combinations are equivalent because the particles are, by hypothesis, indistinguishable.
Figure 3.3 - The three configurations are equivalent if the four particles are indistinguishable amongst themselves.
Each of the n states can be associated with A, B, C, etc. (that is, with each of the m particles), and since a single particle can each time occupy a different state (and the other particles other states), m times over, the possible combinations C are n × n × n × … × n (m factors equal to n):

C = n^m.

We can also convince ourselves of this by observing, for example, Figure 3.4, where it is assumed that n = 5 (it looks like a musical stave…) and m = 2 particles (therefore 5² = 25 combinations):
Figure 3.4 – Beyond the 25th beat the preceding configurations repeat, because A and B are indistinguishable. Within the range of the 25 possible configurations some are more favored, because they appear more frequently: for example 6 and 22, 9 and 25, etc. The unoccupied states are identified by a circle. As is fair to expect, configuration 1 is the least favored.
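A one-line check of the formula C = n^m for this "musical stave" case:

```python
from itertools import product

# C = n^m for n = 5 states and m = 2 particles (Figure 3.4).
n, m = 5, 2
configurations = list(product(range(n), repeat=m))
print(len(configurations), n ** m)   # 25 25
```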
We can arrive at the same result with a more practical method, suitable also for very large values of n and m, which we will use in what follows. It consists of a tabular method borrowed from Combinatorial Analysis which, for n and m of several units, avoids the need to write the hundreds or thousands of symbols used above. Let us take two rows and as many columns as there are states, thereby obtaining a grid: in Figure 3.5, to verify what has been said above, we have taken 2 rows and 5 columns (n = 5; m = 2).
Figure 3.5 – With this grid we obtain the number of possible configurations.
To demonstrate further, we will build a grid for n = 5 and m = 4, as in Figure 3.6, with enough rows to progressively list the number of particles (from 4 down to 1 in the first box of the first column of the occupancy numbers) and with n columns.
Figure 3.6 – Since 5^4 = 625, there are 625 possible combinations; the relative probabilities are listed in the last column: note the asymmetry.
It should be observed that, in the figure, the table of occupancy numbers recalls, not by chance, Tartaglia's Triangle, while the associated Boltzmann-type diagram, shown in Figure 3.7, takes on an almost familiar shape.
Figure 3.7 – Graphical representation of the table of Figure 3.6; the bars are asymmetric.
To provide an example, and referring to Figure 3.6, let us see how the 80 possibilities corresponding to its second line are obtained. If a box is occupied by 3 particles out of the available 4, the simple combinations of 4 objects taken 3 at a time (as taught by Combinatorial Analysis) are given by the binomial coefficient

C(4, 3) = 4,

and the four possible groups of three have five positions from which to choose: hence 4 × 5 = 20 possibilities for the group of three. The single remaining particle has the choice of the four remaining locations, and therefore 1 × 4 = 4 possibilities. The product 20 × 4 = 80 gives us the total number of possibilities in the case where the particles arrange themselves in two groups, one of three and one of a single particle, with five boxes available. It is easy to verify that we obtain the same result by considering first the single particle, which has five boxes available (five possibilities: 1 × 5 = 5), and then the group of three, which has the four remaining boxes (one is occupied by the single particle, therefore 4 × 4 = 16 and 5 × 16 = 80). Applying the procedure line by line produces the results shown.
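The same counts can be verified by brute force, as in the earlier sketch; the assertion at the end checks the 80 possibilities of the worked example above.

```python
from itertools import product
from collections import Counter
from math import comb

n, m = 5, 4   # 5 states, 4 particles (as in Figure 3.6)

# Brute-force count: how many of the n**m = 625 assignments share each
# (sorted) occupancy pattern, e.g. (3, 1) = one box with 3 particles, one with 1.
patterns = Counter()
for config in product(range(n), repeat=m):
    occ = tuple(sorted(Counter(config).values(), reverse=True))
    patterns[occ] += 1

for occ, count in sorted(patterns.items()):
    print(occ, count, f"{100 * count / n**m:.1f}%")

# Cross-check of the worked example: a group of 3 plus a single particle.
assert patterns[(3, 1)] == comb(4, 3) * 5 * 4 == 80
```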
Part 4 (of 4): Chance
4.1 CHANCE
A sharpshooter shoots at a target with an excellent rifle: he aims carefully, chooses the moment when his breathing will not interfere and the amount of force with which to pull the trigger so as not to move the barrel, fires the shot and hits the bull's-eye. Immediately afterwards he takes all the same precautions, but the shot ends up slightly off target: it could have been a slight disturbance to his sight, an involuntary variation in his breathing, an imperceptible abnormal movement of the finger, a very slight unpredictable wind, or who knows what else. The causes are many and imponderable, slight if each is considered in itself, but interacting differently each time, so that each shot has a different fate. This complex of innumerable causes of disturbance, which cannot be controlled or predicted and which cannot be taken into account one by one, is described by the Law of Probability (Gauss's Law)¹⁰: probability, for the reasons given, and law thanks to Carl Friedrich Gauss (1777-1855), who wrote an equation capable of taking into account, in a global manner, all those fleeting causes, so as to predict with good approximation how the shots will arrange themselves, percentage-wise, around the target at different distances from the bull's-eye. The approximation will be the more accurate the greater the number of shots fired.

Let us assume that the target is as represented in Figure 4.1, divided into two parts by the section AB, and that our sharpshooter fires many shots; we then count the number of shots which hit the target in each half:
Figure 4.1 – The segmented target
If the causes of error are truly random (a rifle without defects, such that it does not tend to deviate the shot systematically, and a sharpshooter without an analogous defect;
¹⁰ The example of the sharpshooter was published by the engineer Mario Manaira in No. 256 of the "Journal of Mechanics", together with our first article concerning thermodynamics, more than half a century ago (1961)!
there is no steady wind, etc.; in other words, there exists no cause which always influences the shots with the same bias, called a systematic cause), we will note the following:

1. The shots will be greater in number in the first band around the center;
2. The shots will progressively decrease in number in the subsequent bands as these move further from the center, until there are very few in the bands furthest away;
3. The shots in the two halves, right and left, in any corresponding band, will tend to be equal in number, and will match exactly if enough shots are fired.

It is therefore possible to represent the phenomenon graphically, as in the following figure:
Figure 4.2 – The random distribution of the shots in each band and the Gaussian distribution that would be obtained with an infinite number of shots fired.
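The three points above can be reproduced with a toy simulation; the spread of 1.5 and the number of shots are arbitrary assumptions chosen only for illustration.

```python
import random

# Toy simulation of the target bands of Figures 4.1/4.2: random shots with a
# purely random (Gaussian) horizontal error, counted per band of width 1.
random.seed(1)
shots = [random.gauss(0.0, 1.5) for _ in range(10_000)]   # arbitrary units

for k in range(4):
    left  = sum(-(k + 1) <= x < -k for x in shots)
    right = sum(k <= x < k + 1 for x in shots)
    print(f"band {k + 1}: left = {left}, right = {right}")
# Counts decrease with distance from the center and are nearly equal left/right.
```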
If the marksman were less capable, the concentration of shots near the zero of the abscissa would be reduced and the curve would flatten, while maintaining the characteristics described, as represented in Figure 4.3. The first observation is that the maximum height of the curve corresponds to the "target", in other words the goal of the operation, while the absence of systematic causes (the antithesis of randomness) ensures the symmetry of the curve with respect to the vertical which represents our target zero.
Figure 4.3 – If the marksman is less skilled, the Gaussian flattens.

In the case of a systematic cause of error, the curve loses its symmetry: if we assume that the test is performed with a constant wind blowing from left to right, the graph takes on the shape of Figure 4.4:
Figure 4.4 – When the Gaussian is asymmetric it implies that the phenomenon is not "entirely random"¹¹.

Let us now suppose that our sharpshooter is blindfolded and that the target becomes very large and is moved: he will have to shoot blindly (randomly), left and right, high and low. Given that the Gauss
¹¹ Gauss suggests that the analytical expression of the Law of Randomness is the function y = e^(−x²), where it can be seen that the curve is symmetrical with respect to the axis x = 0, decreases both to the left and to the right of this line, and has a maximum at x = 0. It can be shown, further, that the subtended area is ∫ e^(−x²) dx = √π (the integral being taken from −∞ to +∞). To make this area equal to unity, instead of √π, appropriate steps can be taken which, without changing the general properties illustrated, give the normalized Gauss Law.
function still applies, the probability curve will flatten while maintaining its essential characteristics: in particular, the two tails which tend asymptotically to the abscissa as it tends towards infinity, a maximum point, the points of inflection and the other characteristics illustrated in Figure 4.5.
Figure 4.5 – Typical characteristics of a normalized Gaussian.
Supposing once more that the Gauss function still applies, it would be logical to expect a distribution curve so flat that it would be difficult to see a maximum corresponding to the center of the target; it would be necessary to fire enough shots to occupy every position on the abscissa and so to hit the bull's-eye with 100% certainty. This implies that everything is possible, as long as an infinite number of shots is available (to use rhetorical language).
4.2 SOME PROPERTIES OF RANDOM EVENTS
The perplexities regarding the applicability of chance to the blind sharpshooter depend on the fact that the Gaussian assumes that a program has been set up to reach an objective, which implies that the operator is conscious of the objective; an element which in this case is absent. Both the existence of a program (the sharpshooter sets out to hit the bull's-eye) and the existence of an objective (the card with circles) appear essential in order to be able to talk about chance. Another example: let us imagine a machine programmed to produce a certain mechanical piece: the program is the design of the piece written in machine language, and the objective is the production of the piece. In mass production we find that, despite the working conditions being kept the same, each piece differs from the others, to the point that pieces which exceed the tolerances (and which would therefore not be interchangeable) are rejected. Innumerable examples could be presented, identifying in every case these two characteristics: a program and an objective. Statistics also operates in reverse: from the measurement of a group of subjects it creates a bar chart, whose envelope will be the curve of the random distribution. It gives us the average of the values measured; if the curve is symmetrical, it tells us that the phenomenon is not influenced by systematic causes; further, it tells us the value of the standard deviation, etc.
To fix this idea in our heads, let us suppose that we want to study the average height of a male population: we make many measurements on many subjects, creating a bar for every centimeter, and we obtain a graph similar to Figure 4.6.
Figure 4.6 – A practical application: the Gaussian deduced from experimental measurements for statistical purposes

In this statistical application, where are the program and the objective? They are there, they are there! They were contained in the information which the people naturally carried from conception; a matter of genes and of DNA (an observation coherent with "The Kid Equation"! See the "Introduction to Hyperspace"¹²). These considerations lead us to think that the meaning commonly given to the word "chance" does not make sense, that "chance" does not exist, and lead us to suspect that Anatole France had an inspired intuition when he said: "chance is God's pseudonym when He does not want to sign His name". This strongly agrees with what illustrious philosophers have been affirming for centuries: "Deus absconditus est" (Is XLV, XV).
¹² In our first volume "Caro amico mio…" – Ed. Pagine – 2010. In our second volume ("Verba volant, eqvuationes manent") there are further considerations on a fundamental theorem of Genetics: the Hardy-Weinberg theorem.
4.3 CHANCE & PROBABILITY
We can now summarize some salient features of the Boltzmann and Gauss distributions.

Boltzmann:
1. It deals with probability regarding the configurations that can be assumed by many identical particles having a certain number of available positions (Dirac and Fermi deal with particles which are distinguishable, but the correct reference for our observations is identical particles);
2. The function presents a maximum and aesthetically looks like a Gaussian, but it is not symmetrical;
3. It has only a single asymptote, to the right of the maximum, and its minimum at infinity coincides with zero: the origin of the reference system;
4. It is normalized so that the subtended area represents the total probability of 100%.
Gauss:
1. It deals with chance and is applicable when an objective exists that is defined by a program;
2. The phenomenon "purely by chance" is represented by a curve that is symmetrical about the axis x = 0;
3. The Gaussian has a maximum and no minimum at infinity;
4. It possesses two asymptotes, one to the right and one to the left of the maximum;
5. Well defined values of probability can be associated with multiples of the standard deviation;
6. It is normalized as for Boltzmann's.
4.4 EDDINGTON'S PARADOX¹³
Eddington's famous "Infinite monkey theorem" can be counted amongst the most discussed paradoxes, owing to the fact that it is often quoted by so-called "scientific popularizers". The original assertion states that "…a monkey hitting keys at random on a typewriter keyboard, for an infinite amount of time, will almost surely type a given text, such as the complete works of William Shakespeare". Even taking away the condition of an infinite amount of time, the paradox remains acceptable (provided we can demonstrate that a finite amount of time is sufficient). However, such a long period of time is necessary that the original statement can be seen as a rhetorical hyperbole! We have seen that random phenomena require a program in view of an objective. In the case of the typing monkeys, the program could include the elimination of duplicate pages (or rather, of identical pages, as we will see below) and the objective could consist in the conservation of the "good" pages arranged in the right sequence. Applying Boltzmann's statistics, let us assume that the typewriter has m = 30 keys (we can think of "blind" keys, without any markings and all identical) and that we want to write a book of only 10^6 letters (a thousand typed pages): as we observed in paragraph 3.1, all the possible combinations are
¹³ The reader can find all the details regarding these various arguments on the web.
C = n^m = (10^6)^30 = 10^180;

in other words there are 10^180 possible configurations. Let us assume that the monkeys are capable of striking 10 keys per second (skilled typists…); the time necessary would then be

t = 10^180 × 10^6 : 10 = 10^185 sec.

Since we can count about 10^16 seconds in a billion years, it is also possible to say that the time required will be 10^185 : 10^16 = 10^169 billion years (giga-years)! (Let us remember that the big bang has an age of "only" about 14 billion years.) In reality the situation is even "worse"; in fact, this calculation (which is generally accepted) is wrong, because we cannot speak of only thirty objects (letters, punctuation marks, spaces, etc.) to be arranged in 10^6 positions, otherwise in each of the 10^180 obtainable configurations we would find empty spaces, up to 10^6 − 30 in each configuration. It is necessary to postulate that there are 10^6 letters to be arranged: that is, to concede that the monkeys have to insert 10^6 objects, i.e. 10^6 key strokes. In other words it is necessary that n = m = 10^6, and in this case the formula for the combinations gives us an astronomical value:
C = n^m = m^m = (10^6)^(10^6)

combinations. At a rhythm of 10 key strokes per second the time corresponds to
t = (10^6)^(10^6) · 10^6 · 10^−1 = 10^6,000,005 sec ≡ 10^5,999,989 billion years.
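Because these numbers overflow ordinary arithmetic, a small sketch working with base-10 exponents (and the same assumptions as the text: a "book" of 10^6 key strokes, 10 strokes per second, 10^16 seconds per billion years) reproduces both estimates.

```python
import math

strokes_per_book = 10**6

# First estimate in the text: C = (10^6)^30 = 10^180 configurations.
log10_C1 = 30 * 6                      # exponent of 10
log10_t1 = log10_C1 + 6 - 1            # seconds: C * 1e6 strokes / 10 strokes per s
print("first estimate:", f"10^{log10_t1} s =", f"10^{log10_t1 - 16} billion years")

# Corrected estimate: C = m^m with m = 10^6, i.e. 10^6,000,000 configurations.
log10_C2 = strokes_per_book * math.log10(strokes_per_book)   # 6,000,000
log10_t2 = log10_C2 + 6 - 1                                   # 6,000,005 (seconds)
print("corrected estimate:", f"10^{log10_t2:.0f} s =",
      f"10^{log10_t2 - 16:.0f} billion years")                # 10^5,999,989
```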
Figure 4.7 – Summary table of the probabilities according to Boltzmann
In reality the situation is even "worse" still! In fact, the calculation of the combinations does not consider duplicate configurations (which must necessarily be considered possible); in other words, our monkeys could produce the same combination several times (that is, two or more identical pages), and the duplicates are useless for the compilation of our small book of only 10^6 letters! To this end we invoke chance (to attempt to appreciate the incidence of the repetition of identical pages): having constructed a Gaussian by arranging the frequencies of identical pages, we can reason as follows. Having produced all the astronomical combinations as above, in the time calculated (which we will call a cycle), the highest probability of identical pages is in pairs (to which
we will assign the maximum position), then in threes, and so on; at infinity, with a probability of zero, all the pages will be identical! It seems fair to presume that the standard deviation could be very large, corresponding to a very flat Gaussian; if we suppose that the pages to be rejected (one for each double, two for each triplet, etc.) are contained within the first standard deviation (as in Figure 4.8), then the duplicate pages to be thrown away would be about 68%.
Figure 4.8 – The postulated conditions (pages to be eliminated, because duplicated, equal to one standard deviation) make 68% of the pages produced useless.

Iterating the cycle, one could consider the duplication of further pages; however, it can be demonstrated that the phenomenon continues to imply finite times.
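The "about 68%" is the well-known probability mass of a normal distribution within one standard deviation of the mean; a one-line check:

```python
import math

# Probability within +/- 1 standard deviation of a normal distribution: erf(1/sqrt(2)).
p_within_1_sigma = math.erf(1 / math.sqrt(2))
print(f"{p_within_1_sigma:.4f}")   # ~ 0.6827, i.e. about 68%
```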
How much does the required time increase? To generalize, we take the length of a cycle as one unit and denote by K the average cost of the discarded pages (in the previous numerical case K = 0.68); we then observe Figure 4.9:
Figure 4.9 – Having exhausted the first cycle, of length 1, the second cycle, of length K, allows the replacement of the duplicate pages produced in the first; the third, of length K², replaces those produced in the second, and so on.
The time necessary to complete the cycles required to eliminate the duplicates as they are produced is given by the sum

Σ Kⁿ (summed over n from 0 to ∞) = 1 + K + K² + …,

which constitutes a geometric series. Mathematical Analysis teaches us that such a series converges for K < 1, as is confirmed in our case, where K takes the value 0.68:

S = 1/(1 − K),

which, for K = 0.68, gives

S = 1/(1 − 0.68) = 3.125.
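A quick numerical confirmation of the closed form, with K = 0.68 as assumed above:

```python
# Numerical check of the geometric-series argument: sum of K^n for n = 0, 1, 2, ...
K = 0.68                                     # assumed fraction of duplicated pages per cycle
S_closed_form = 1 / (1 - K)                  # = 3.125
S_partial = sum(K**n for n in range(200))    # partial sum, already fully converged
print(S_closed_form, round(S_partial, 6))    # 3.125 3.125
```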
Therefore, multiplying by 3.125 the time needed to complete a cycle (3.17·10^5,999,989 billion years), under the hypothesis made, we also eliminate all the duplicate pages of our little book of 10^6 key strokes. Changing the value of K (always < 1) one obtains different multipliers, but always of finite value: let the reader imagine the time necessary to type all of Shakespeare's works! It must be highlighted that the monkey operation, to be considered a success, requires the intervention of an external intelligence capable of selecting the useful pages (as contemplated by the Theory of Information) and of ordering them in the right sequence so as to obtain a final, legible manuscript: this obvious necessity implies that negative entropy be introduced into the system; as discussed at the beginning, this is like a living being taking the trouble to "put things in order", otherwise the "purely random" work would be entirely useless, because it would exclusively produce positive entropy.

All experiments attempted by man with the goal of demonstrating the random production of complex molecules (the first building blocks of living organisms) have the defect of requiring an a priori living system, such as man, to arrange this production. When, later, chaotic physical-chemical conditions are created (temperature, pressure, methane, electrical arcing, etc.), in the hope of obtaining that which Thermodynamics does not permit, the inventors of the moto perpetuo come to mind, who never give up. The research on the "primordial soup" belongs to this type of experimentation, a war horse of the followers of the "maîtres à penser" of the last century, and concerns, in particular, the laboratory experiments of Stanley Miller: the goal of his research, of a physical-chemical nature, was to recreate life; an understandable, but too ambitious, project for a researcher. The result was absolutely negative (Miller himself recognized it): however, this last piece of information is only whispered in scientific circles, so that we still find genuine touts who speak enthusiastically, coram populo, of the primordial soup and its miraculous effects¹⁴ with a self-assurance that is truly shameful!
4.5 CONCLUSION
On 4 July 2007, at the Kinetica Museum in London, a serious presentation of a perpetual motion machine was scheduled: a machine capable of supplying the user with more power than it absorbs. "Obviously", the presentation was postponed due to "a technical problem"¹⁵. It would appear impossible, but convinced advocates of such motion exist, and many inventors submit patent after patent, even though, already in illo tempore, Max Planck had declared himself contrary to such a possibility, which violates the principles of Thermodynamics. Based on the reasoning we have developed regarding entropy, probability and chance, the violation of such principles is implicit even in the attempts to obtain living organisms in a laboratory (living organisms being characterized, as we have seen, as producers of negative entropy), and as such a strong analogy can be seen between the advocates of perpetual motion and those aspiring to create life.
1. The correct application of Boltzmann's statistics, a phenomenon for which we invoke probability, demonstrates that combinations of large numbers of particles, to the point of generating complex organisms, require very long periods of time; in comparison, the age of the universe is but the blink of an eye.
2. The probabilities take on the largest values in correspondence with the most disordered configurations.
¹⁴ From "Corriere della Sera", October 2008: "La vita sulla Terra? Cominciò nei vulcani con un innesco chimico".
¹⁵ Source: Wikipedia.
3. The most ordered combinations are those which characterize organic structures, and the action of an intelligent being is necessary to select, order and conserve, over time, the favorable combinations.
4. The phrase "chance phenomenon" is somewhat less intuitive than what "common sense" would suggest. In fact, the Gaussian perspective implies that such phenomena are necessarily associated with a program; this program implies the existence of an objective, around which we have an increased concentration of events.
5. In every case it is necessary to postulate the existence of an intelligent design, without which the configurations and the favorable events constitute occurrences with no functional link between themselves.
6. The existence of life as a producer of negative entropy on a continuous (not cyclical) basis is a fact visible with our own eyes.
All this leads us to support the existence of a Co-ordinating Entity which possesses life "a priori" and which is capable of transmitting it to animals (animalis homo included!) and to plants. It can be noted that the two living kingdoms, animal and vegetable, complement each other, in the sense that the design requires that the emissions of the first (CO2) are nutrients for the second, and the by-products of the second (O2) are essential for the first: the carbon and oxygen cycles look as though they have been designed! According to the author there is only one explanation: we are in the presence of the greatest Design Physicist of all times: God the Creator! This is what we Christians call Him, while the Israelites call Him Jehovah, the Ishmaelites Allah, the Masons GADU (Great Architect of the Universe), etc. In other terms:
the Creation is a thermodynamic necessity!
Amen.