Errors of the Greenhouse Theory - BF


Errors of the Greenhouse Theory Version 16 June 2022, by Dr.-Ing. Bernd Fleischmann, info@klima-wahrheiten.de

1. Climate sensitivity: calculated or diced? The temperature increase when the CO2 content of the atmosphere doubles is called climate sensitivity. This doubling compared to 150 years ago (280 ppm) is likely to come, because China, India, and other countries continue to expand their use of fossil fuels, and cooling over the next few decades will not increase ocean CO2 uptake at the same rate. An often-cited paper is by Charney et al. from 1979 (Carbon Dioxide and Climate: A Scientific Assessment. Washington, DC: The National Academies Press). They calculated a climate sensitivity of 1.5 °C to 4.5 °C using a greenhouse model that already accounted for some feedback effects. Of course, the IPCC can "calculate" the greenhouse effect due to an increase of CO2 much more precisely than Arrhenius 125 years ago or Charney 43 years ago. In the IPCC Supplement of 1992, the result of the "calculations" of 8 teams was a temperature increase of 1.7 °C to 5.3 °C (including feedback effects by e.g. water vapor, https://www.ipcc.ch/report/climate-change-1992-the-supplementaryreport-to-the-ipcc-scientific-assessment/). 21 years later, in the IPCC Report of 2013, it could be "calculated" much better with the help of the latest and most expensive supercomputers, and the result is now (https://www.ipcc.ch/report/ar5/wg1/ page 16): a doubling of the CO2 content leads to a temperature increase of 1 °C to 6 °C with 85 % probability. Was this really "calculated", or were dice rolled over a few bottles of wine? The numbers from 1 to 6 automatically bring this to mind, especially when you read the small-print footnote about it: "No best estimate for equilibrium climate sensitivity can now be given because of a lack of agreement on values across assessed lines of evidence and studies." This means they cannot give a good estimate because the models don't fit the measurements! So much for "the science is settled". At least the reports have become more colorful...
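The "doubling" framing above rests on the standard logarithmic rule relating concentration to warming: every doubling of the CO2 content adds the same number of degrees, namely the climate sensitivity S. A minimal sketch of that rule (a textbook relation, not taken from the papers quoted above; the function name and parameters are my own):

```python
import math

def warming(c_ppm, sensitivity_per_doubling, c0_ppm=280.0):
    """Warming relative to the pre-industrial level c0_ppm (280 ppm),
    assuming the standard logarithmic dependence on concentration."""
    return sensitivity_per_doubling * math.log2(c_ppm / c0_ppm)

# At 560 ppm (one doubling of 280 ppm) the warming equals the sensitivity:
for s in (1.5, 4.5):                     # Charney's 1979 range
    print(s, "->", round(warming(560.0, s), 2))
```

Whatever sensitivity one plugs in, the rule itself only converts a concentration into a temperature; the entire dispute in this section is about the value of S.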
Considering the billions in research money that have been spent, one would certainly expect something more precise. In any case, it is not a consistent theory if, after 20 years of intensive calculations, the uncertainty is bigger than before. Maybe some professors just let students turn the screws for 20 years and then present "new findings" at world climate conferences in Buenos Aires, Marrakesh, Montreal, Bali, Cancun, Paris etc.? More than 20,000 people come to these climate conferences, most of them by plane. The last one, in Madrid in 2019, lasted 13 days... Let's talk briefly about feedback. Positive feedback means that a signal at the output of a system is fed back to the input in such a way that it superimposes constructively on the original input signal and amplifies it. Such systems are unstable unless they have large internal losses. Everyone knows this from whistling loudspeaker systems. Feedbacks exist not only in technical systems, but also in sociological ones: if a climate institute gets money for exaggerating negative effects of global warming, it will hire even more people to work on this exaggeration. The example in Wikipedia (https://en.wikipedia.org/wiki/Positive_feedback) is similar: if one animal in a herd of cattle starts to run, e.g. because it has mad cow disease, its neighbors may think there is danger and run along, until in the end the whole herd is running. This triggers associations with politics…


Back to climate models: John Christy of the University of Alabama and his colleagues compared the IPCC's 102 (sic!) models with measurements from satellites and weather balloons for the period 1979 to 2016. The comparison (graph at left) was made for the tropics (20° south to 20° north), where the changes in cloud cover and ocean cycles are not large, except for El Niño effects. The green line shows the trend of measured temperatures over atmospheric pressure (which corresponds to altitude) and the red line shows the mean of the simulations. The blue squares represent the "coolest" 2.5 % of the simulations and the red squares the "warmest" 2.5 % (partially outside the graph). The temperature value (X axis) is the trend in °C per decade for the period from 1979 to 2016 (https://www.tandfonline.com/doi/full/10.1080/01431161.2018.1444293 also for the graph). The measurements show a mean value over the different heights of 0.1 °C/decade. The IPCC calculations have a mean of 0.27 °C/decade with strong scatter. They are off by almost a factor of 3. Without the effect of the strong El Niño of 2015/2016, which shifted the measured trend strongly upwards, the difference would be even larger. But it gets even better: in 2005, Stainforth et al. ("Uncertainty in predictions of the climate response to rising levels of greenhouse gases") evaluated several thousand simulations and "explicitly resolved regional details": the result is a climate sensitivity of 1.9 °C to 11.5 °C. That would take two dice. They also write that some simulations were unstable - I'm not surprised - and that six of the simulations showed significant cooling with CO2 doubling. So much honesty is surprising, but it should have been included in the figures. A 2017 review by Knutti et al.
is similarly informative, probably the most recent on the topic (Beyond equilibrium climate sensitivity, Knutti et al., 2017, https://www.nature.com/articles/ngeo3017): calculations of the climate sensitivity of CO2 since 1998 yield results ranging from 0 °C to 10 °C, as shown in the graph on the next page. Noteworthy is Knutti's conclusion: "To limit warming to 2 °C, future CO2 emissions must remain strongly constrained, regardless of whether climate sensitivity is at the high or low end." True to the motto: it doesn't matter what the researchers calculate, the important thing is that everyone believes in climate catastrophe. The papers with the smallest uncertainty generally yield the lowest equilibrium climate sensitivity. Among them is the publication by Prof. Hermann Harde from 2017 ("Radiation transfer calculations and assessment of global warming by CO2", Int. J. Atmos. Sci. 2017), who thoroughly investigated all feedback mechanisms and arrives at a result of 0.6 °C for a CO2 doubling. He gives the contribution of the CO2 increase to global warming over the last 100 years as 40 % and the contribution of the Sun as 60 %. In addition, Prof. Harde has calculated in another paper that only a small part of the CO2 increase is due to fossil fuel burning ("What Humans Contribute to Atmospheric CO2: Comparison of Carbon Cycle Models with Observations", 2019, https://tinyurl.com/y8xpjoyb). Interim conclusion: the models of the IPCC are not even able to reproduce the recent past. Their calculations run too warm by a factor of 3 for the last 40 years, and the spread of the calculated climate sensitivity does not decrease but increases! How can anyone seriously have confidence in the IPCC projections for the next 50 or 100 years?



2. Radiation models show unphysical behavior in the troposphere A publication from NASA's astrobiology institute states: "we demonstrated how radiative equilibrium models produce super adiabatic regions in a planet's troposphere. Our radiative–convective model corrects this unphysical behavior" (https://www.researchgate.net/publication/236842439_An_Analytic_RadiativeConvective_Model_for_Planetary_Atmospheres). Super-adiabatic means that the temperature changes more with air pressure or altitude than is physically possible. This is not a new finding, as this graph from a 1964 publication shows: there, the temperature was calculated as a function of pressure assuming radiative equilibrium, and the (unphysical) result is the lower curve. It leads to a much too high surface temperature of the Earth of more than 332 K (59 °C). The other two curves show the results of convective equilibrium with dry air (dry adiabatic) and humid air (6.5 °C/km). One of the authors of the publication, Nobel laureate Syukuro Manabe, already stated at that time: "it is necessary to know how the convective adjustment alters some of the unrealistic features of a pure radiative equilibrium" (Thermal Equilibrium of the Atmosphere with a Convective Adjustment, Manabe and Strickler, 1964, https://tinyurl.com/ybr3j64q). This is how it is still done today: the greenhouse theorists calculate something with radiative equilibrium, find that it does not fit, and then take the convective-adiabatic model to reproduce the actual temperature gradient in the atmosphere. But for the calculation of the temperature increase with rising CO2 concentration they stick to their unphysical radiation model. Below the tropopause NASA uses the convective model, and only above the tropopause - where there is practically no more convection - the radiative model.
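The "convective adjustment" that Manabe and Strickler introduced can be sketched in a few lines (my own toy illustration, not their actual scheme, which additionally conserves energy): wherever a radiative-equilibrium profile cools faster with height than the critical lapse rate allows, the profile is relaxed to that lapse rate.

```python
def convective_adjustment(temps_c, dz_km=1.0, critical_lapse=6.5):
    """Cap the lapse rate of a temperature profile (°C, surface first,
    evenly spaced levels) at critical_lapse °C/km; super-adiabatic
    layers are pinned to the critical adiabat."""
    t = list(temps_c)
    max_drop = critical_lapse * dz_km
    for i in range(len(t) - 1):
        if t[i] - t[i + 1] > max_drop:   # super-adiabatic: unphysical
            t[i + 1] = t[i] - max_drop   # relax to the critical lapse rate
    return t

# A made-up, strongly super-adiabatic "radiative equilibrium" profile,
# levels at 0..5 km (the hot surface echoes the 59 °C mentioned above):
radiative = [59.0, 40.0, 28.0, 20.0, 15.0, 12.0]
print(convective_adjustment(radiative))
# → [59.0, 52.5, 46.0, 39.5, 33.0, 26.5]
```

The adjusted profile falls off at exactly 6.5 °C/km, which is the humid-air curve in the 1964 graph; a profile that is already sub-adiabatic passes through unchanged.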
As Tyler Robinson and David Catling, the authors of the NASA study cited above, write so aptly: "radiative equilibrium models tend to have regions in their tropospheres where the temperature decrease implies that convection should ensue, which is a process not incorporated into the models but is a part of the essential physics of planetary atmospheres. Convection is common to all planetary tropospheres known in the solar system ... so radiative equilibrium models neglect the basic physics of thermal structure." This means that all models essentially based on radiative equilibrium - i.e. all IPCC models - contradict the laws of physics. Therefore, IPCC models cannot be used to calculate realistic temperature gradients or absolute temperatures. Interim conclusion: the models of the IPCC are based on radiative equilibrium. They lead to unphysical results. What did Prof. Harold Lewis say about the climate hysteria? "It is the greatest and most successful pseudoscientific fraud I have seen in my long life as a physicist" (https://www.aei.org/carpe-diem/proflewis-climate-change-is-the-greatest-and-most-successful-pseudoscientific-fraud-i-have-seen/).

3. The greenhouse model leads to a vacuum on the ground In the standard model for the radiation budget, an upward spiral is labeled "convection", which transports 17 W/m² away from the surface. Why 17 W/m²? In the article by Trenberth et al. (Earth’s Global Energy
Budget, BAMS, 2009, https://tinyurl.com/yb86w8am loads the pdf) it says only succinctly "Surface sensible and latent heat estimates were based on other observations and analyses", without further explanation. Let that sink in: this combination of "standard work", "estimate" and "other observations". If one follows this up and looks at the older publication of Kiehl and Trenberth, to which Trenberth refers (Earth’s Annual Global Mean Energy Budget, BAMS, 1997), one finds that the sensible heat (convection) itself was not analyzed at all; instead, the value of 24 W/m² (later 17 W/m²) was used to balance the radiation budget so that the sum comes out to zero. In the work of Wild and colleagues (next chapter) the values for convection of 22 IPCC models are given. They range from 14 to 27 W/m². And the 2013 IPCC report says (https://www.ipcc.ch/site/assets/uploads/2018/02/WG1AR5_all_final.pdf, page 182): "Relative uncertainty in the globally averaged sensible heat flux estimate remains high owing to the very limited direct observational constraints." "The science is settled"? Certainly not! The sensible heat takes the form of kinetic energy of the gases and is transported away from the surface by the rising of warm air. In the process, the air expands (there is less pressure at the top) and thus cools, as described by the ideal gas law. Now, does a vacuum form on the ground when the warm air rises? Of course not. Horizontal air movements (wind) occur, and somewhere just as much cold air sinks as warm air has risen. This cold air is compressed and heats up. If the effect occurs massively on the lee side of mountains, it is called foehn. And in mines it is called auto-compression. Does nobody at the IPCC know this, or is it deliberately concealed? With the sinking and compression of the air masses, heat is transported to the ground. This effect has been completely "overlooked".
It is approximately as large - depending on the humidity - as the heat transport by convection away from the ground, because the atmosphere is in convective equilibrium. Whether the calculated/estimated amount is 24 W/m² or 17 W/m² or much higher is quite irrelevant, because this energy flow is basically the same in both directions. People who know about apples or thermals - e.g. all glider pilots - know it better than the greenhouse theorists: where there are updrafts, the downdrafts are not far, or: "What goes up must come down" (the sentence is attributed to Isaac Newton, https://www.goodreads.com/quotes/433926-what-goes-up-must-come-down). In the "Handbuch der Klimatologie" by Julius von Hann - who was nominated three times for the Nobel Prize in Physics - from the year 1883, it is written on page 166: "Therefore the general law is valid: Rising air masses cool down at the rate of 1 °C per 100 m (as long as no condensation of the water vapor occurs), and conversely they warm up at the same rate when descending... This is the condition of the indifferent (convective) equilibrium." The reasoning, "as the mechanical theory of heat teaches", is on the preceding page. In any case, the "forgotten" 14 to 27 W/m² are more than five times the alleged additional radiant power of 3 W/m² due to all "greenhouse gases" emitted by humans (carbon dioxide, methane, nitrous oxide, hydrocarbons) since 1750 (IPCC Climate Change 2013, The Physical Science Basis, p. 29).
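Von Hann's "1 °C per 100 m" is the dry-adiabatic lapse rate, which follows directly from gravity and the heat capacity of air. A quick check with standard textbook values (not taken from the source):

```python
g = 9.81      # gravitational acceleration, m/s²
c_p = 1004.0  # specific heat of dry air at constant pressure, J/(kg·K)

# Dry-adiabatic lapse rate Γ = g / c_p: the cooling of a rising
# (or, equally, the warming of a sinking) air parcel per unit altitude.
lapse_K_per_km = g / c_p * 1000.0
print(round(lapse_K_per_km, 2))   # ≈ 9.77 K/km, i.e. almost 1 °C per 100 m
```

The symmetry of the formula is the point of the section: the same Γ governs the descent, so the compressional warming of sinking air mirrors the expansion cooling of rising air.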

4. Not everything that evaporates comes down as rain The latent heat, i.e. evaporation and condensation, was also calculated incorrectly. The IPCC report of 2013 (https://www.ipcc.ch/site/assets/uploads/2018/02/WG1AR5_all_final.pdf page 182) states: "The global mean latent heat flux is required to exceed 80 W/m² to close the surface energy balance...The emerging debate reflects potential remaining deficiencies in the quantification of the radiative and non-radiative energy balance components and associated uncertainty ranges, as well as in the consistent representation of the global mean energy and water budgets". Meaning: "We don't know how we shall quantify evaporation and condensation, but we must use at least 80 W/m² for our energy balance to work out".


This has nothing to do with physics, but rather with a dogma that is defended tooth and nail! Wild's paper says: "The latent heat flux is the energy equivalent of the surface evaporation, which on a global mean basis must equal precipitation." This sounds reasonable at first glance, but not at a second. Wild himself speaks of considerable uncertainties due to the enormous non-uniformity of precipitation, systematic errors of land-based rain gauges, and inherent difficulties in satellite precipitation determination. In addition - which Wild does not mention - not everything that has previously evaporated and sublimated falls from the sky as rain or snow. Ever heard of dew? Or of deposition (resublimation)? Or that raindrops evaporate before they reach the desert floor (or other warm ground)? Or that ice melts and water freezes elsewhere? How do the satellites detect that? So the value for evaporation (more generally: for phase transitions) cannot be correct.

5. The "calculation" of the infrared radiation at the Earth's surface is also wrong It is also interesting to see how the greenhouse model evolved from the IPCC Report of 2007 to that of 2013 (https://www.ipcc.ch/site/assets/uploads/2018/02/WG1AR5_all_final.pdf page 181, image below). The back radiation (thermal down surface) is now no longer exactly 333 W/m² but varies in a range from 338 to 348 W/m², and thus - like the climate sensitivity - has become much more "precise", "adjusted within their uncertainty ranges to close the energy budget". This means something like "we adjust the numbers so that in the end the energy budget somehow works out" - that sounds like the budget planning of politicians. Chapter 2.3.3.2 of the IPCC report is dedicated to the back radiation and the infrared radiation emitted by the Earth's surface (thermal up surface, 398 W/m²). Where do these values come from? I would expect some details in the 1552-page book "Physical Science Basis" of the IPCC, especially about the measurements and how they were interpolated between measuring stations. After all, the back radiation caused by the "climate gases" is the alleged reason for global warming! In fact, this chapter 2.3.3.2 is one of the shortest of all, not even a third of a page long! Essentially it says that there are very few measuring stations and that these are spatially not representative. So where does the 398 W/m² really come from? The image was taken by the IPCC from a 2012 publication by Wild et al. (The global energy balance from a surface perspective, Clim Dyn, 2013, https://tinyurl.com/ycjmc924). They write that 22 different CMIP5 (Coupled Model Intercomparison Project 5) climate simulations were made. The simulations gave between 392.6 and 403.7 W/m², and the mean value for thermal up surface was 396.9 W/m².
These are slightly different values than those given in the graph - for all the other values, too, the text differs from the graph - but that doesn't matter, because they made the same mistake as their colleagues who calculated Earth's temperature without an atmosphere. 397 W/m² results from the temperature of a black-body radiator of 16 °C according to the Stefan-Boltzmann formula I = ε·σ·T⁴ with ε = 1. Assuming ε = 1 is not a big mistake, because thermal up surface also includes reflected infrared radiation, and if ε < 1, the ground simply reflects more of the incoming infrared radiation.
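The quoted mean is easy to check against the Stefan-Boltzmann law (a quick sketch; the constant σ is the only physical input):

```python
SIGMA = 5.670374e-8          # Stefan-Boltzmann constant, W/(m²·K⁴)

# A black body (ε = 1) at 16 °C:
t_kelvin = 16.0 + 273.15
intensity = SIGMA * t_kelvin ** 4
print(round(intensity, 1))   # ≈ 396.4 W/m², close to Wild's mean of 396.9
```

Inverting the same formula, Wild's 396.9 W/m² corresponds to a black-body temperature of roughly 16.1 °C.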


More information about the origin of the 397 W/m² is given in the publication of Trenberth et al. (2009, referenced above), to which Wild et al. ultimately refer. Trenberth et al. write correctly that for a correct calculation of the infrared radiation one must consider the temporal and spatial variation of the ground temperature. But then they do not do it, and write instead that the changes between day and night or winter and summer cancel each other out. This is not true, because the radiation must first be calculated for each place and time; only afterwards can one form an average. Because of the nonlinearity of the Stefan-Boltzmann law (the proportionality to the fourth power of the temperature), one must not first average the temperatures and then calculate the radiation for the average temperature. Trenberth makes another mistake here (from the cited publication):

"so that the T' and T3 terms vanish" would only be correct if the global temperature distribution were a linear function symmetric with respect to the mean temperature of 15 °C. But it is not, as this graph shows for the mean air temperatures in July (http://www.physicalgeography.net/fundamentals/7m.html):


There is a large area that is colder than -50 °C (Antarctica), but no area that is warmer than +80 °C, i.e. just as "far away" from the mean of 15 °C - as would be needed to compensate linearly. In the next sentence Trenberth et al. write that the variation of the annual mean temperature between -40 °C at the poles (which is again an inadmissible averaging between the Arctic and the much colder Antarctic) and 30 °C in the tropics must not be neglected. That is correct, but it contradicts the neglect of the previously described temperature variations (day/night, winter/summer). Let's calculate an example to illustrate the whole thing. In the tropics, the average temperature is about 30 °C. Let's assume that an area of the same size has a temperature of 0 °C (e.g. the Arctic and parts of the Southern Ocean in July); then the global mean temperature of 15 °C does not change, but the mean radiation is higher: 30 °C leads to a radiation of 478.9 W/m², 0 °C to a radiation of 315.7 W/m². The mean value is 397.3 W/m², and this corresponds to a temperature of 16.15 °C, i.e. 1.15 °C higher than the mean temperature of the two surfaces. So far so bad. Next, let's look at the variation between day and night, winter and summer. In the tropics it is admittedly almost negligible at about 10 °C, but the further away you move from the equator, the greater it becomes. In the humid continental areas of the mid-latitudes, such as Germany, a difference of almost 30 °C between winter nights and summer days is the normal case. In Verkhoyansk in Siberia it is 70 °C, in Antarctica (Vostok, see graphic, from https://de.wikipedia.org/wiki/WostokStation#cite_note-11) almost 100 °C, and in the Arctic (Thule) 35 °C. A temperature difference of 40 °C is a typical value for the continental areas of Africa, Asia and America. The oceans change their surface temperature less than land areas because of their high heat capacity. Here, temperature variations range from 4 °C at the equator to 17 °C in the Baltic Sea.
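The two-surface example can be reproduced in a few lines (a sketch of the arithmetic above; only the Stefan-Boltzmann constant enters):

```python
SIGMA = 5.670374e-8  # Stefan-Boltzmann constant, W/(m²·K⁴)

def radiance(t_celsius):
    """Black-body emission I = σ·T⁴ (with ε = 1) for a temperature in °C."""
    return SIGMA * (t_celsius + 273.15) ** 4

# Two equal areas at 30 °C and 0 °C: the mean temperature is 15 °C, but
# the mean radiation corresponds to a higher temperature, because of the
# fourth-power nonlinearity (the order of averaging matters).
mean_rad = (radiance(30.0) + radiance(0.0)) / 2   # ≈ 397.3 W/m²
t_equiv = (mean_rad / SIGMA) ** 0.25 - 273.15     # ≈ 16.2 °C, not 15 °C
print(round(mean_rad, 1), round(t_equiv, 2))
```

Averaging the temperatures first and then computing σ·T⁴ would give only 390.1 W/m² for 15 °C; averaging the radiances, as the text argues one must, gives the higher 397.3 W/m².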
Now I have to estimate and simplify because I don't know any global statistics for this: for the calculation, let's take a value for the temperature fluctuations of 35 °C for one third of the Earth's surface, i.e. in summer (16.15 + 35/2) °C = 33.65 °C (corresponding to 502.4 W/m²) and in winter -1.35 °C (309.5 W/m²). The average value is 405.9 W/m², which is 8.6 W/m² higher than the value of 397.3 W/m² corresponding to 16.15 °C. We divide this by 3, because we only apply it to one third of the Earth's surface, and we get a value of 2.9 W/m² for the error due to the neglect of the temperature fluctuations. The radiation from the Earth's surface is calculated too low by this amount. The correct value for thermal up surface is therefore about 400 W/m². A more precise analysis would yield lower values for the seasonal behavior (transition between winter and summer) and higher values for the actual geographical variations and daily temperatures (the above-mentioned temperature variations are monthly averages). Whether the error is 2 W/m² or 5 W/m²: it is a multiple of the alleged imbalance - with which the Earth supposedly heats up - of 0.2 to 1 W/m². Consequently, this value is also wrong. Let's have a look at the atmospheric back radiation (thermal down surface). One would think that this value, because it is supposed to be so important and depends on the CO2 content of the atmosphere, is measured in many places. And because it cannot be measured in space, but directly on Earth, the result
should be accurate, and climate models should reflect this value. In fact, this is the value that varies most between the various climate models and is measured most inaccurately. As Wild and colleagues write, there are very few measuring stations that provide useful values: 41 stations worldwide, to be exact, spread across the continents - and none on the oceans, which make up 71 % of the Earth's surface. Therefore, a global average cannot be measured, but only calculated via the climate models, which are matched against the stations. Since 95 % of the 21 climate models investigated deliver lower values than the respective measuring stations (with deviations of up to 37 W/m²), extrapolation, interpolation and estimation had to be done to get a "usable" result. One must assume that the result of 342 W/m² is similarly wrong as the other values for the Earth's surface. In any case, the uncertainty of the value is much larger than the range of 338 to 348 W/m² given in the image. The range of the difference between thermal up surface (power radiated from the ground) and thermal down surface (back radiation) is between 49 and 65 W/m² in the analyzed climate models. This 30 % difference between the models means "the science is settled" for the climate alarmists; for people with some common sense it means: these are only rough estimates and - as shown in chapter 2 - based on unphysical models. How Trenberth and Co. can determine a radiation imbalance of 0.6 W/m² from their partly estimated numbers is a mystery to me. That was probably rather a specification handed to the modelers, so that it fits the diced climate sensitivity. Those who study atmospheric physics intensively know that the IPCC model is too simplistic and wrong. That's why the following email, leaked during Climategate (https://wattsupwiththat.com/climategate/), is no real surprise.
On October 14, 2009, at 10:17 am, Kevin Trenberth wrote, apparently quite desperate: "Hi Tom how come you do not agree with a statement that says we are nowhere close to knowing where energy is going or whether clouds are changing to make the planet brighter. We are not close to balancing the energy budget. The fact that we can not account for what is happening in the climate system makes any consideration of geoengineering quite hopeless as we will never be able to tell if it is successful or not! It is a travesty! Kevin" (from https://tinyurl.com/yawg7gwf). Well, this was honest, but he doesn't say it in public! As a summary and for clarification, here is the current IPCC graphic with the erroneous values marked:


6. Back radiation by reflection from the clouds instead of greenhouse gases Also wrong is the assertion that the back radiation of 333 W/m² - or whatever the current "best estimate" is - is only "due to the heated greenhouse gases". Much of the back radiation is due to reflection from the water droplets and ice particles in the clouds, and re-emission from the clouds. These are completely different physical processes that are usually completely neglected. The "back radiation" of the atmospheric gases is ultimately a consequence of the temperature, which has been set by convective, advective and other effects. It is not the cause of the temperature on the ground.

7. The water vapor feedback dilemma The assumed additional back radiation from a doubling of the carbon dioxide content in the atmosphere is not sufficient, in any of the umpteen different IPCC greenhouse models, for a temperature increase of more than 1 °C. So amplification mechanisms had to be introduced to sound the alarm. The IPCC considers the most important mechanism to be the water vapor feedback. Water vapor is considered by all greenhouse theorists to be a much more potent "climate gas" than carbon dioxide, due to its broader absorption spectrum and higher concentration in the atmosphere. In the reports of the IPCC, e.g. Climate Change 2013, Chapter TS.3.7 Climate Feedbacks, it is stated that global warming increases the water vapor content of the air, which further increases the back radiation. This and "other feedbacks" result in the "calculated" value of 1 °C to 6 °C. Apart from the fact that the range of 1 °C to 6 °C already shows that these are rough estimates not based on physical equations, feedbacks are apparently not so easy to understand for non-engineers - including the Indian railway engineer who headed the IPCC. The climate modelers at NASA GISS (https://www.nasa.gov/topics/earth/features/vapor_warming.html) describe it this way: "Increasing water vapor leads to warmer temperatures, which causes more water vapor to be absorbed into the air. Warming and water absorption increase in a spiraling cycle." In addition, as the oceans get warmer, more carbon dioxide escapes from them (there is about 50 times as much CO2 in the oceans as in the atmosphere), which would further fuel the feedback loop. And where does this positive - i.e. amplifying - feedback end? Who tells the water vapor that it should stop the vicious circle - more water vapor, higher temperature, even more water vapor - at 1 °C or at 6 °C?
And who tells the water vapor that it should only start the vicious circle when the carbon dioxide content increases, and not already when the solar radiation increases, the sea ice recedes, or the Christ Child (El Niño) is once again hyperactive? Does one of the climate mystics tell it? All low-loss systems with overall positive feedback are unstable, as every engineer who has dealt with control engineering knows. This does not exclude locally and temporally limited positive feedbacks, e.g. the ice-albedo feedback. Every stable system has overall and long-term negative feedback. The positive feedback by water vapor is scientifically untenable if it is not accompanied by a simultaneous - and stronger - negative feedback through increasing cloudiness and/or a reduced temperature gradient (lapse rate). This topic runs through the history of the science, like the wrongly calculated 33 °C temperature increase by the "climate gases". Prof. Richard Lindzen, an atmospheric physicist at the Massachusetts Institute of Technology until 2013, has studied the surface temperature of the tropical oceans using satellite measurements of radiated energy (Earth Radiation Budget Experiment, ERBE) and the relevant climate models. He states: "All models agree as to positive feedback, and all models disagree very sharply with the observations." (https://www.researchgate.net/publication/241492246_On_the_determination_of_climate_feedbacks_from_ERBE_data).


This could also be because the calculation of cloud formation and the influence of clouds at different altitudes is still an unsolved problem for climate modelers. As Steven Sherwood and colleagues wrote resignedly in their comprehensive 2013 review paper (Climate Processes: Clouds, Aerosols and Dynamics, published in G.R. Asrar and J.W. Hurrell (eds.), Climate Science for Serving Society: Research, Modeling and Prediction Priorities, Springer Science+Business Media Dordrecht 2013) "After decades of effort, it remains clear that no current model can reliably simulate both clouds and climate simultaneously. However, the cloud and climate scales cannot be decoupled." I see only one way out of the water vapor dilemma: Increasing water vapor leads to increased cloud formation and thus increased reflection of solar radiation (higher albedo), i.e. a negative feedback loop which prevents heating up.
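The convergence question posed above - who tells the water vapor to stop at 1 °C or 6 °C? - has a simple engineering answer, sketched here as a toy loop (my own illustration, unrelated to any of the cited models): a direct warming dT0 with feedback factor f adds f·dT0, then f²·dT0, and so on. For |f| < 1 the series converges to dT0/(1−f); as f approaches 1 the total grows without bound, i.e. the system is unstable.

```python
def equilibrium_warming(dt0, f, n_rounds=10_000):
    """Iterate the feedback loop: each pass feeds a fraction f of the
    previous increment back into the warming. Converges only if |f| < 1."""
    total, increment = 0.0, dt0
    for _ in range(n_rounds):
        total += increment
        increment *= f
    return total

print(round(equilibrium_warming(1.0, 0.5), 3))    # 2.0, i.e. 1/(1-0.5)
print(round(equilibrium_warming(1.0, 0.83), 2))   # ≈ 5.88: f near 1, large
# For f >= 1 the series diverges: nothing internal stops the warming.
```

A finite answer like "6 °C per doubling" therefore implicitly assumes f stays safely below 1; the section's point is that the models assert this without saying what holds f there.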

8. The maximum possible CO2 greenhouse effect The "experiment" with the inversion layer and the gardener’s greenhouse experiment show where the upper limit for the temperature increase with increasing CO2 content of the atmosphere lies. The window glass absorbs almost 100 % of the infrared radiation emitted by the ground, while the CO2 of the atmosphere absorbs only about 14 %. If an infrared-transmissive rock salt plate is used instead of window glass, the temperature does not change - within the measurement tolerance. The CO2 of the atmosphere cannot have a stronger effect than the almost completely absorbing glass plate. In my opinion, this results in an estimate for the CO2 greenhouse effect of significantly less than 0.5 °C for a doubling of the concentration in the atmosphere. Mars, with its fairly uniform surface, lack of water and of phase transitions in the nearly pure CO2 atmosphere, is probably also well suited for determining the CO2 greenhouse effect. And also the Antarctic inversion experiment allows, in my opinion, an estimate of the upper limit. But this work should be done by the greenhouse theorists.

9. The climate tower - it speaks volumes that it does not exist Earth's atmosphere is a bad laboratory. It is constantly changing; in most places, conditions stay halfway constant for a few hours at best, and at any given time the changes are guaranteed to be significant on 90 % of Earth's surface. Besides, you can't quickly replace the N2-O2 atmosphere we have on Earth with 97 % CO2 as on Venus. That would be nice: one could simply measure the alleged "runaway greenhouse effect" of Venus - or confirm by measurement that it does not exist. Why isn't there a 50 m or 100 m high climate tower, mirrored on the inside so as not to influence the postulated back radiation, with a heatable floor and powerful spotlights on the ceiling emulating the sun or the "atmospheric back radiation", whose interior can be filled with the mentioned atmospheric gases and water vapor in any concentration? Sure, this tower would cost a few million, but that's peanuts compared to the hundreds of billions spent globally every year on "climate protection". If you don't build a tower but use an abandoned mine shaft instead, the cost and the visual pollution are lower than for any wind or solar power plant. The climate shaft would probably cost no more than a few years' worth of electricity for one of the supercomputers hogged by the greenhouse theorists. But for these theorists, by their definition, model simulations with various boundary conditions are already "experiments". So why bother with real experiments that might not produce the politically desired result? Within a very short time, one would have clarity about the truth of the greenhouse theory. And probably this is exactly the reason why this climate tower does not exist!


As a "waste product" of one of the largest structures in the world, there is at least one partial experiment. NASA's Vehicle Assembly Building in Florida is 160 m high so that the 111 m high Saturn V rockets could be assembled standing up. Because of the high humidity in Florida, the temperature gradient in the hall is not the dry-adiabatic one of about 1 °C/100 m, but the humid-adiabatic one of about 0.6 °C/100 m. Therefore, it is about 1 °C cooler just below the ceiling of the hall than on the ground. Clouds sometimes form, and it can rain inside the hall! There are, of course, large fans in the hall. When they are running, however, there is no temperature equalization between bottom and top; instead, the temperature gradient stabilizes (Loschmidt's Temperature Gradient Paradox, George Levy, 2020). Image: https://de.wikipedia.org/wiki/Vehicle_Assembly_Building
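The "about 1 °C cooler" figure is simple arithmetic from the numbers above:

```python
height_m = 160.0              # height of the Vehicle Assembly Building
moist_rate = 0.6 / 100.0      # humid-adiabatic lapse rate, °C per m
dry_rate = 1.0 / 100.0        # dry-adiabatic lapse rate, °C per m

# Temperature difference between floor and ceiling:
print(round(height_m * moist_rate, 2))   # 0.96 °C with humid Florida air
print(round(height_m * dry_rate, 2))     # 1.6 °C if the air were dry
```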

10. Conclusions:
- The greenhouse theory's back radiation has no significant effect, neither in the gardener's greenhouse nor in the atmospheres of the planets.
- An overall positive feedback is scientifically untenable.
- The famous radiation budget of the Earth is peppered with errors.
- The theory behind it partly contradicts the laws of nature and many observations.

