C2C Journal

A Planet that Might not Need Saving: Can CO2 Even Drive Global Temperature?

By Jim Mason
Climate change has ingrained itself so deeply in the public consciousness that it’s likely an article of faith that the foundational science was all conducted, tested and confirmed decades ago. Surely to goodness, exactly how carbon dioxide (CO2) behaves in the atmosphere is “settled science”, right? That more CO2 emitted equals more heat and higher temperature is a cornerstone of the ruling scientific paradigm. And yet, finds Jim Mason, the detailed dynamics of how, when and to what degree CO2 transforms radiation into atmospheric heat are anything but settled. So much remains unknown that recent academic research inquiring whether CO2 at its current atmospheric concentrations can even absorb more heat amounts to breaking new ground in climate science. If it can’t, notes Mason, then further changes in CO2 levels not only won’t drive “global boiling”, they won’t have any impact on climate at all.

Electric Vehicles (EVs) – by which are usually meant battery-powered electric vehicles, or BEVs – have long been touted in many countries as the central element in the strategy for stopping global climate change. Canada is no exception. The Liberal government’s Minister of Environment and Climate Change, Steven Guilbeault, has mandated that by 2030, 60 percent of all vehicles sold in Canada must be BEVs, rising to 100 percent by 2035. In anticipation of the accompanying surge in BEV sales, the federal and Ontario governments have offered huge subsidies to battery manufacturing companies. But now, EV sales are stagnating and automobile manufacturers that were rushing headlong into EV production have dramatically scaled back their plans. Ford Motor Company has even decided that, instead of converting its Oakville, Ontario plant to EV production, it will retool it to produce the Super Duty models of its best-selling – and internal combustion engine-powered – pickup truck line.

Heating up the rhetoric: “The era of global warming has ended; the era of global boiling has arrived,” UN Secretary-General Antonio Guterres (top left) has declared; prominent voices such as former U.S. Vice President Al Gore (top right) insist it’s “settled science” that “humans are to blame” for global warming; this view has been accepted by millions worldwide (bottom). (Sources of photos: (top left) UNclimatechange, licensed under CC BY-NC-SA 2.0; (top right) World Economic Forum, licensed under CC BY-NC-SA 2.0; (bottom) Takver from Australia, licensed under CC BY-SA 2.0)

A big part of the justification for forcing Canadians into EVs has been that we must “follow the science.” Namely, the “settled” science which holds that the planet’s atmosphere is heating dangerously and that humans are the cause of this via our prodigious emissions of heat-trapping gases – mainly carbon dioxide (CO2) – which are magnifying the atmosphere’s “greenhouse” effect. Over the past several decades the accompanying political rhetoric has also heated up, from initial concerns over global warming and a threatened planet – terms that at least accommodated political and scientific debate – to categorical declarations of a “climate emergency”. As UN Secretary-General Antonio Guterres asserted last year, “The era of global warming has ended; the era of global boiling has arrived.”

The foundational term “follow the science” is loaded, however. It is code for “follow the science disseminated by the UN’s Intergovernmental Panel on Climate Change (IPCC).” Article 1 of the UN’s Framework Convention on Climate Change actually defines climate change as “a change of climate which is attributed directly or indirectly to human activity”. Elsewhere the document clearly identifies CO2 emitted through the burning of fossil fuels as the causative human activity. So the UN and IPCC long ago limited the scope of what is presented to the public as a scientific investigation and decided not only on the cause of the problem but also the nature of the solution, namely radically driving down “anthropogenic emissions of carbon dioxide and other greenhouse gases.”

The worldwide climate change movement has proved remarkably successful in creating what is known as a “ruling paradigm”. This phenomenon is common in many fields and not necessarily harmful. But in this instance, what is billed as a scientific endeavour has strictly limited the role of open scientific inquiry. The “science” that the movement wants humanity to follow is the result of inferential, inductive interpretation of empirical observations, declared to be “settled science” on the basis of a claimed consensus rather than as a result of controlled, repeatable experiments designed not to reinforce the paradigm but to test it for falsifiability, in accordance with the scientific method. This paradigm has allowed Guterres, for example, to claim that “for scientists, it is unequivocal – humans are to blame.” But it is missing (or attempts to exclude) a key element: rigorous experimentation that subjects the theory on which the paradigm is built to disinterested scientific scrutiny.

Following whose science? The UN’s Intergovernmental Panel on Climate Change (IPCC) defines climate change as being solely “attributed directly or indirectly to human activity,” particularly the burning of fossil fuels, thus limiting not only the scope of public discourse but pertinent scientific inquiry as well. (Sources: (left photo) Robin Utrecht/abacapress.com; (middle photo) Qiu Chen/Xinhua/abacapress.com; (graph) IPCC, 2023: Summary for Policymakers, Climate Change 2023: Synthesis Report)

Thankfully, some scientists are still conducting this kind of research and, for those who value and follow properly-done science, the results can be eye-opening. Two recent scientific papers appear to be of particular interest in this regard. Each is aimed at assessing the role of CO2 in influencing atmospheric temperature. Nobody doubts that carbon dioxide is a greenhouse gas; the real questions are how much additional radiant energy CO2 is currently absorbing (such as due to the burning of fossil fuels) compared to the past, and what impact this has on the climate.

One might have thought such work would have been done 30 or more years ago but, apparently, it has not. That additional CO2 emitted into the atmosphere absorbs additional radiant energy, and that such additions do so in a linear fashion, are two of climate change theory’s critical premises. So it would seem crucial to establish scientifically whether gases emitted due to human activity – like CO2 – are capable of raising the Earth’s atmospheric temperature and, accordingly, justify their assigned role of causative agent in humanity’s planet-threatening villainy. The papers discussed below thus deal with questions of profound importance to climate change theory and the policy response of countries around the world.

If CO2 is not actually an effective current driver of atmospheric temperature, the implications are staggering.

How – and How Much – CO2 Traps Heat in the Atmosphere

The first paper developed a mathematically rigorous theory from first principles regarding the absorption of long-wavelength radiation (LWR) by a column of air as the concentration of CO2 (or other greenhouse gases such as water vapour, methane, ozone or nitrous oxide) increases in the atmosphere.

The Earth receives solar energy mainly in shorter wavelengths, including visible light. According to NASA, just under half of this incident radiation reaches the ground, where it is absorbed and transformed into heat. Of that half, just over one-third is radiated back into the atmosphere; about one-third of that amount is absorbed by heat-trapping gases, including CO2. (The air’s main constituents, oxygen and nitrogen, are essentially transparent to both incoming visible radiation and outgoing LWR.) Importantly, CO2 can absorb meaningful amounts of LWR only in two specific bands, but within those bands it can absorb all of it.

The greenhouse effect, oversimplified and distorted: This seemingly easy-to-understand info-graphic downplays the fact that a large proportion of incoming solar radiation returns to space, omits the key fact that there would be no life on Earth without the natural greenhouse effect, and leaves out the most significant greenhouse gas of all: water vapour. (Source of image: EDUCBA)

The paper employs formulae whose explanation exceeds the scope of this article, but in simplified terms the theory predicts that at some concentration – designated as “C” – one-half of the LWR in the absorbable band is absorbed. Importantly, the theory postulates that the absorption of LWR does not increase in a linear fashion along with the increase in atmospheric CO2. Instead, as the gas concentration increases, the incremental amount absorbed decreases. At twice the C value – 2C – only three-quarters of the incident radiation would be absorbed. And at 3C, seven-eighths.

By 10C, 99.9 percent of the absorbable LWR is being absorbed. In effect, the atmosphere has become “saturated” with the gas from a radiation-absorption perspective, and further increases in CO2 have negligible effect on absorption. This relationship is illustrated in Figure 1. As one can see, it is distinctly non-linear: absorption rises steeply at first and then saturates, asymptotically approaching 100 percent.
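The halving progression the theory predicts can be sketched in a few lines of Python (illustrative only; C is left symbolic, since its actual value is a matter for experiment):

```python
# Fraction of the absorbable LWR band absorbed at a concentration of n*C,
# where C is the concentration absorbing half the band. Each further
# increment of C halves the remaining transparent (unabsorbed) fraction.
def absorbed_fraction(n: float) -> float:
    return 1.0 - 0.5 ** n

for n in (1, 2, 3, 10):
    print(f"{n}C: {absorbed_fraction(n):.4f}")
# 1C: 0.5000, 2C: 0.7500, 3C: 0.8750, 10C: 0.9990
```

The last line is the “saturation” point described above: at 10C, only 0.1 percent of the absorbable radiation still escapes.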

Figure 1: Theoretical graphical depiction of absorption of incident radiation as a function of the concentration of an absorbing gas, in this case forming the core of a theory concerning how long-wave radiation emitted from the Earth’s surface is absorbed by atmospheric CO2. (Source of graph: Jim Mason)

This graph could appear quite scary at first glance. After all, as more CO2 is added to the atmosphere, more LWR is being absorbed, suggesting more heat is being retained instead of escaping to outer space. So doesn’t the Figure 1 curve suggest that, with CO2 concentrations rising and potentially doubling during our era compared to pre-Industrial times, global warming is indeed occurring, CO2 is indeed the cause, and the IPCC’s warnings are justified? The answers depend on what the value of C actually is for CO2 and what the concentration of CO2 is in the atmosphere today. In other words, on where the Earth now sits along that curve, and where it sat when the pre-Industrial era came to an end. That, in turn, will require actual physical experiments – and these are covered later.

Figure 2: View from end of air column showing random positions of radiation-absorbing gas molecules, with red circles representing their associated radiation-absorbing cross-section. (Source of illustration: Jim Mason)

The non-linear LWR/absorption relationship can be understood conceptually as follows. Each physical CO2 molecule has what amounts to a surrounding area of radiation-absorption capability for specific bands of LWR. This area is “opaque” to those bands: LWR rising from the Earth’s surface is absorbed if it passes through this area, and not otherwise. The areas can be thought of as little opaque spheres around each molecule, which when viewed externally look like circles. The area of these circles is referred to as the radiation-absorption cross-section.

Viewed from the end of the column of air, the circular cross-sections formed by all the CO2 molecules in the air column will effectively add up to some overall fraction of the air column’s cross-sectional area becoming opaque to the LWR. Radiation that strikes any of that area will be absorbed; radiation travelling through the rest of the column’s cross-sectional area will pass into space.

At some concentration of molecules – dubbed C in this essay, as mentioned – half of the column’s cross-section will be opaque and absorbing the incident LWR. This is illustrated in Figure 2. Because the gas molecules are randomly positioned within the column, some of their cross-sections will overlap when viewed from the end, and overlapping areas cannot absorb the same radiation twice. C is the concentration at which the effective opaque area, taking all the overlapping into account, is one-half the column’s cross-sectional area.

If the gas concentration is then increased by C, i.e. is doubled, the new molecules will also have an associated opaque area equal to half of the column’s cross-sectional area. Half of this, however, will coincide with the half that is already opaque, so will have no impact. The other half, or one-quarter of the column’s cross-section, will become newly opaque and start absorbing LWR. If the concentration is again increased by C, the new molecules will also have a total opaque area equal to one-half the column cross-section, but three-quarters of this will coincide with already-opaque area so only one-quarter of that one-half, or one-eighth in total, will become new radiation-absorbing opacity.
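The overlap argument can be checked with a small Monte Carlo simulation (a sketch of the reasoning above, not taken from either paper): each “batch” of C molecules is treated as covering a random half of the cross-section, and a sample point stays transparent only if every batch misses it.

```python
import random

random.seed(42)
N_POINTS = 100_000  # sample points across the column cross-section

def transparent_fraction(n_batches: int) -> float:
    """Fraction of the cross-section still transparent after n_batches of
    molecules, each batch independently covering a random half of it."""
    transparent = 0
    for _ in range(N_POINTS):
        # A point stays transparent only if every batch misses it; each
        # randomly placed batch covers any given point with probability 1/2.
        if all(random.random() >= 0.5 for _ in range(n_batches)):
            transparent += 1
    return transparent / N_POINTS

for n in (1, 2, 3):
    print(f"{n} x C: ~{1 - transparent_fraction(n):.2f} absorbed")
# approaches 0.50, 0.75, 0.875 as N_POINTS grows
```

The overlapping “wasted” coverage emerges automatically from the random placement, reproducing the 1/2, 3/4, 7/8 progression described in the text.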

Figure 3: Illustrative depiction of radiation-absorption cross-section illustrating how the transparent area, where additional molecules would be exposed to the radiant heat source and, therefore, would absorb radiation, is progressively reduced as more molecules are added; after several more iterations, this leads to radiation absorption “saturation” after which no further radiation is absorbed no matter how many more absorbing molecules are added, since all radiation in that wavelength band is already being absorbed. (Source of illustrations: Jim Mason)

This progression is illustrated in Figure 3, but is perhaps more easily visualized in Figure 4. Here, the half of the cross-sectional area of air column that was rendered opaque by the CO2 molecules is shown as being all on one side of the column. The opacity caused by each successive addition of C number of CO2 molecules is shown in a different colour and is positioned to highlight the impact on the remaining transparent cross-sectional area. As can be seen, each successive increase in concentration of C increases the amount of radiation absorption by decreasing amounts – by a factor of two. After ten such increases, the transparent fraction of the column would be reduced to 0.1 percent of its area so that 99.9 percent of the incident radiation is being absorbed.

Although the foregoing description is conceptually correct, a full understanding of natural processes and of the theory requires taking several other considerations into account, most of which are outside the scope of this discussion. One aspect that is important to understand: as mentioned above, CO2 and other greenhouse gases only absorb outgoing radiation efficiently in particular regions (bands) of the electromagnetic spectrum; they absorb little or none in other bands and are therefore “transparent” to any radiation in those bands. The above discussion applies to the regions of the spectrum where the gas can absorb radiant energy at 100 percent.

Figure 4: Alternative depiction of the reduction in incremental radiation-absorbing area as the absorbing gas concentration is increased in multiples of the concentration that absorbs 50 percent of the incident radiation. As in Figure 3, successive sets of molecules are indicated in red, orange and yellow, with another added set indicated in green, while blue represents the remaining transparent area. (Source of illustration: Jim Mason)

Another aspect – which becomes important in the following section – is that the Earth’s surface temperature varies by location, weather, time of year and time of day. This will affect how much radiant energy goes out in various wavelengths in various places, and the absorbing gas’s absorption capacity. While the theory holds that this does not alter the basic non-linearity of absorption nor the “saturation” phenomenon, it could alter the point at which “C”, or 50 percent absorption, is reached – something that can be tested through experimentation.

The net of all this is that, if the theoretical formulation is correct, one would expect to see a curve similar to Figure 1 for any individual greenhouse gas – or combination of gases – and a radiant energy source of any temperature, with the curve’s specific shape depending on the gas or mixture of gases and the temperature of the radiant energy source. From such a curve, it would be possible to determine the concentration at which the gas absorbs 50 percent of the maximum energy it will absorb as its concentration is increased to a very large value.

The paper that develops this theoretical formulation is entitled Dependence of Earth’s Thermal Radiation on Five Most Abundant Greenhouse Gases and was co-authored in June 2020 by William A. van Wijngaarden and William Happer. It is highly technical and would be very difficult for anyone without a strong background in mathematics and science, ideally physics, to understand. Accordingly, the accompanying figures in this article that illustrate the paper’s key ideas in a format accessible to the layperson were produced by me, using information derived from the paper’s figures and text.

Van Wijngaarden is a professor in the Department of Physics and Astronomy at York University in Toronto with a more than 40-year academic track record and nearly 300 academic papers to his credit, while Happer is Professor Emeritus of Physics at Princeton University in New Jersey, with a 50-year academic career and nearly 200 papers to his credit. Both authors also have numerous related academic achievements, awards and organizational memberships, and have mentored hundreds of graduate students. Happer is an open skeptic of the IPCC/UN climate change “consensus”, while Van Wijngaarden has testified in a court case that the IPCC’s climate models “systematically overstate global warming”, an assertion that is incontrovertibly true.

Although their paper was not peer-reviewed and, to date, has not been published in a major academic journal, and although both scientists have endured smears in news and social media as climate skeptics or “deniers”, there has not been any known attempt to refute their theory following publication, such as by identifying errors in the logic or mathematics of their theoretical formulation. Accordingly, their paper is in my opinion an example of good science: a coherent theory aimed at explaining a known phenomenon using rigorous scientific and mathematical principles and formulae, plus supporting evidence. It is also, critically, one that can be subjected to physical experimentation, i.e., is disprovable, as we shall soon see.

Running hot: William A. van Wijngaarden (top left) and William Happer (top right), two highly credentialed physicists with outstanding academic track records, are among scientists who are openly critical of the IPCC’s accepted climate models which, as 40 years of temperature observations have clearly shown, “systematically overstate global warming”. (Source of bottom graph: CEI.org)

This opinion is supported by the fact that the same phenomenon of non-linearity and absorption saturation, along with an associated equation referred to as the Beer-Lambert Law, is discussed by Thayer Watkins, a mathematician and physicist, and professor emeritus of economics at San José State University. “In order to properly understand the greenhouse effect one must take into account the nonlinearity of the effect of increased concentration of greenhouse gases,” Watkins notes. “The source of the nonlinearity may be thought of in terms of a saturation of the absorption capacity of the atmosphere in particular frequency bands.”
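Watkins’ Beer-Lambert framing can be written out concretely. Under that law the absorbed fraction is A(c) = 1 - exp(-kc); choosing the coefficient k so that A(C) = 1/2 (that is, kC = ln 2) reproduces the halving progression exactly. The value of k below is arbitrary, for illustration only:

```python
import math

def absorbed(c: float, k: float) -> float:
    # Beer-Lambert absorbed fraction for concentration c and coefficient k
    return 1.0 - math.exp(-k * c)

K = 1.0                   # arbitrary illustrative absorption coefficient
C = math.log(2) / K       # half-absorption concentration: K * C = ln 2
for n in (1, 2, 10):
    print(f"A({n}C) = {absorbed(n * C, K):.4f}")
# A(1C) = 0.5000, A(2C) = 0.7500, A(10C) = 0.9990
```

Because exp(-n·ln 2) = 2^(-n), this exponential law and the overlapping-circles picture are the same statement in different notation.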

Subjecting the LWR Absorption Theory to Experimentation – Or, Science the Way it Should be Done

The second paper was published in March of this year and reports on experiments conducted to test van Wijngaarden and Happer’s theory, in accordance with the standard scientific method, using several different greenhouse gases. If the experiments were properly designed to realistically duplicate natural processes and if they then generated results inconsistent with the theory, then the van Wijngaarden/Happer theory could be considered disproved. If the experiments produced results consistent with the theory, the theory would not be proved but would increase in plausibility and justify further experimentation.

The experimental setup is depicted in Figure 5. It was designed to allow the concentration of CO2 within a column of gas (in kilograms per square metre of column area, or kg/m2) to be varied in a controlled way and to measure the fraction of the incident radiation that is absorbed at any concentration. The “column” of gas was contained within a cylinder formed from a 1-metre length of 150 mm diameter PVC pipe, with polyethylene windows at either end to allow ingress and egress of the radiation. CO2 concentration was changed by injecting measured amounts of the gas via a central valve. Water valves on the cylinder bottom were used to allow an identical volume of gas to escape, thereby maintaining the pressure in the cell. (The background gases into which the CO2 was mixed are unimportant, since these remained constant, with only the CO2 concentration varied.)

Figure 5: Diagram of the laboratory setup for measuring the absorption of thermal radiation in CO2. (Source of illustration: Climatic consequences of the process of saturation of radiation absorption in gases, Figure 7)

The radiation source was a glass vessel with a flat side containing oil maintained at a constant temperature. Adjacent to the flat side was a copper plate with a graphite surface facing the gas cell. This ensured that the radiant source, as seen by the cell, was uniform in temperature over the cross-section of the cell and had the radiation profile of a black body at the chosen temperature. The selected temperatures of 78.6°C and 109.5°C were, states the paper, “chosen randomly but in a manner that allowed reliable measurement of radiation intensity and demonstrated the influence of temperature on the saturation mass value.”

Results for CO2 are illustrated in Figure 6, which is taken directly from the paper. The two selected temperatures are separately graphed. Figure 6 clearly shows experimental curves that are qualitatively the same as the theoretical curve in Figure 1 derived from van Wijngaarden/Happer’s paper and the equation noted in Watkins’ website discussion. From the graph it is possible to determine that the concentration of CO2 that results in absorption of 50 percent of the absorbable radiation – the value of C introduced earlier – is about 0.04 kg/m2 for a LWR temperature of 78.6 °C (the one that is closer to the actual average temperature of the Earth’s surface, which NASA lists as 15 °C).

Figure 6: Absorption of incident radiation versus concentration of CO2, with concentration expressed as a weight per cross-sectional area of atmospheric column (kg/m2), using two experimental LWR temperatures. Absorption is effectively measured as the fraction of the total incident radiation that is absorbed in the test column, which is determined by comparing it to an identical test column that maintains the zero-point concentration throughout. The reason that A saturates at less than 1 is because there are many wavelengths in the incident radiation that CO2 does not absorb, which pass through the column regardless of the CO2 concentration, with only the other wavelengths being absorbed. (Source of graph and mathematical formula: Climatic consequences of the process of saturation of radiation absorption in gases, Figure 8)

As the paper notes, the chosen temperatures, while higher than the Earth’s mean surface temperature, facilitate reliable measurements of the radiation intensities and clearly show the effect of temperature on the saturation mass value or, equivalently, the value of C. Specifically, the graphs clearly show that the value of C decreases as the temperature of the radiant source decreases (although with only two points, the nature of the relationship cannot be reliably determined). The implications are discussed in the following section.
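To illustrate how a C value can be read off such a curve, here is a sketch assuming the saturating form A(c) = A_sat · (1 - 2^(-c/C)). Both A_sat and the “measured” data point below are hypothetical stand-ins, not the paper’s data, and the paper’s own fitting procedure may differ:

```python
import math

# Sketch of recovering C from a measured (concentration, absorption) pair,
# assuming the saturating form A(c) = A_SAT * (1 - 2**(-c / C)).
A_SAT = 0.6        # asymptotic absorbed fraction (less than 1; see Figure 6)
C_TRUE = 0.04      # kg/m2, the half-absorption value quoted for 78.6 C

def model(c: float) -> float:
    return A_SAT * (1.0 - 2.0 ** (-c / C_TRUE))

# Invert a single measurement to recover C:
c_meas = 0.10
a_meas = model(c_meas)                       # stand-in for a measured value
C_est = -c_meas * math.log(2) / math.log(1.0 - a_meas / A_SAT)
print(f"recovered C = {C_est:.3f} kg/m2")    # recovered C = 0.040 kg/m2
```

In practice one would fit the whole curve rather than invert a single point, but the algebra is the same: the unabsorbed fraction 1 - A/A_sat decays as 2^(-c/C), so its logarithm yields C directly.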

This experimental paper is entitled Climatic consequences of the process of saturation of radiation absorption in gases and was co-authored by Jan Kubicki, Krzysztof Kopczyński and Jarosław Młyńczak. It was published in the journal Applications in Engineering Science and cites copious sources, though it does not appear to have been peer-reviewed. Kubicki is an assistant professor in the Institute of Optoelectronics in the Military University of Technology in Warsaw, Poland. Kopczyński appears to be a colleague at the same institution specializing in the atmospheric distribution of aerosols, while Młyńczak is an adjunct professor at the same institution. All three have authored or co-authored a number of scientific papers.

Is CO2 Even Capable of Driving Global Temperatures Higher?

According to a reputable atmospheric tracking website, on September 2, 2024 the Earth’s atmospheric CO2 concentration was 422.78 parts per million (ppm). Each ppm worldwide equates to 7.82 gigatonnes (Gt) of CO2. The cited concentration therefore amounts to about 3,300 Gt of CO2 in the Earth’s atmosphere. Since the Earth’s surface area is 5.1 x 10^14 m2, assuming a uniform CO2 distribution, this concentration can be translated into the units used in the above-cited experiment as 6.48 kg/m2 across the Earth’s surface.

This figure might appear at first glance to be a misprint, as 6.48 kg/m2 is approximately 160 times the CO2 C value of 0.04 kg/m2 – the concentration that absorbs 50 percent of the incident LWR. Six-point-six times the C value – the level that absorbs 99 percent of the incident LWR – is still only 0.264 kg/m2. Beyond this, further increases in gas concentration have no impact on absorption or, hence, on temperature. The Earth’s current concentration of CO2 is, accordingly, over 24 times as high as what is needed to achieve the 99 percent absorption value established by experimentation.
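The conversion and ratios in the two preceding paragraphs can be reproduced directly from the quoted figures:

```python
# Reproducing the article's unit conversion: atmospheric CO2 in ppm
# to a column mass density in kg per m^2 of the Earth's surface.
PPM = 422.78                 # CO2 concentration, parts per million
GT_PER_PPM = 7.82            # Gt of CO2 per ppm (figure quoted above)
EARTH_AREA_M2 = 5.1e14       # Earth's surface area, m^2

total_kg = PPM * GT_PER_PPM * 1e12         # 1 Gt = 1e12 kg
column_density = total_kg / EARTH_AREA_M2  # kg of CO2 per m^2
print(f"column density: {column_density:.2f} kg/m2")   # 6.48

C_VALUE = 0.04               # kg/m2: 50 percent absorption (from Figure 6)
print(f"multiple of C: {column_density / C_VALUE:.0f}")              # 162
print(f"multiple of 99% level: {column_density / (6.6 * C_VALUE):.1f}")  # ~24.6
```

The small differences from the rounded figures in the text (160 vs. 162, 24 vs. 24.6) are simply rounding in intermediate steps.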

Long past the point of change? The Earth’s currently estimated CO2 concentration of 422.78 parts per million (ppm) is 27 times the estimated CO2 saturation level; even in the pre-Industrial era, CO2 concentrations were more than 12 times that level, suggesting the current rise in CO2 concentration is incapable of driving global temperature. (Source of graph: climate.gov)

The implications of this are quite staggering. According to climate.gov, the CO2 concentration in the pre-Industrial era was 280 ppm, and prior to that it oscillated between about 180 ppm and 280 ppm. This means that even the pre-Industrial CO2 concentrations were roughly 70 to 107 times the C value, as well as being more than 10 times the concentration needed to reach 99 percent absorption. The CO2 concentration, then, was saturated multiple times over with respect to LWR absorption. A glance back at Figure 1 makes it clear that at neither of these CO2 concentration ranges (covering present times and the pre-Industrial era) were changes to CO2 concentration capable of having any substantive impact on the amount of LWR being absorbed or, consequently, on atmospheric temperatures – let alone the Earth’s whole climate.
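Applying the same per-ppm conversion to the pre-Industrial range gives a quick check of those multiples:

```python
# Pre-Industrial check using the same figures: convert ppm directly to a
# column mass density, then compare with the C value of 0.04 kg/m2.
KG_PER_PPM_PER_M2 = 7.82e12 / 5.1e14   # ~0.0153 kg/m2 per ppm
C_VALUE = 0.04                         # kg/m2, half-absorption concentration

for ppm in (180, 280):
    col = ppm * KG_PER_PPM_PER_M2
    print(f"{ppm} ppm -> {col:.2f} kg/m2, about {col / C_VALUE:.0f} x C")
# 180 ppm -> 2.76 kg/m2, about 69 x C
# 280 ppm -> 4.29 kg/m2, about 107 x C
```

Even the low end of the pre-Industrial range sits far above both C and the 6.6C level associated with 99 percent absorption.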

Further, they probably never did. According to the published paper Geocarb III: A Revised Model of Atmospheric CO2 Over Phanerozoic Time, the CO2 concentration has never been less than 180 ppm during the so-called Phanerozoic Eon, which is the entire time during which all the rock layers in the geological column, from the Cambrian upwards, were deposited. So there has never been any point during this significant span of Earth’s history when the concentration of CO2 in the atmosphere was not “saturated” from a LWR absorption perspective. Consequently, throughout that entire period, if the new theory and recent experimentation are correct, changes in CO2 concentration have been incapable of having any discernible impact on the amount of LWR absorbed by the atmosphere – or, accordingly, on the global climate.

It’s true that increasing CO2 concentration could be capable of driving global atmospheric temperature higher – but only if it began at vastly lower concentrations than exist at present or at any known previous time. If such a time ever existed, it is long past. At the current order of magnitude in its concentration, CO2 simply appears not to be a factor. If further experimentation also generates results consistent with the van Wijngaarden/Happer theory, it would appear that CO2 is incapable of having any impact on atmospheric temperature at all. It cannot, accordingly, be the primary source of “global warming” or “climate change”, let alone of a “climate emergency” or “global boiling”.

While experimental results consistent with a theory do not prove the theory to be true, and replication of the three Polish researchers’ initial results would be very desirable, the experimental results to date are certainly inconsistent with the current ruling paradigm that CO2 emissions from the burning of fossil fuels are the cause of current climate change and, indeed, that the effect of each increase in concentration is accelerating. According to the rules of decision-making in science, unless the experiment can be shown to be badly designed or fraudulent, an inconsistency between experiment and theory proves, scientifically, that the theory – here, the current paradigm – is false.

The incidence and recession of the Ice Age (top), the Medieval Warm Period (bottom left) and the more recent Little Ice Age (bottom right) are just a few examples of global temperature fluctuations that happened independently of the current era’s burning of fossil fuels or increasing CO2 levels. Shown at top, northern mammoths exhibit at the American Museum of Natural History’s Hall of North American Mammals; bottom left, peasants working on the fields next to the Medieval Louvre Castle, from The Very Rich Hours of the Duke of Berry, circa 1410; bottom right, Enjoying the Ice, by Hendrick Avercamp, circa 1615-1620. (Source of top photo: wallyg, licensed under CC BY-NC-ND 2.0)

Moreover, the new theory and experimental result are consistent with numerous empirical observations that are also inconsistent with the ruling IPCC/climate movement paradigm. Examples abound: the occurrence and recession of the Ice Age, the appearance and disappearance of the Roman and Medieval Warm Periods and the Little Ice Age – all without any CO2 from the burning of fossil fuels – the steady decline in atmospheric temperatures from 1940 to 1975 while CO2 levels steadily increased, and the relative flattening of atmospheric temperatures since about 2000 despite CO2 levels continuing to increase. But if the level of CO2 in the atmosphere has long been above “saturation”, then its variations have no real impact on the climate – as these observations indicate – and something else must have caused these climate variations.

To this can be added the failed predictions of the temperature models and of the dire consequences of not preventing further CO2 increases – such as the polar bears going extinct, the Arctic being free of ice, or Manhattan being covered by water, to list just a few. But again, if the atmosphere has long been CO2-saturated with respect to LWR absorption, then the additional CO2 will have no effect on the climate, which is what the failure of the predictions also indicates.

These results, at the very least, ought to give Climaggedonites pause, although probably they won’t. For the rest of us, they strongly suggest that EVs are a solution in search of a problem. It may be that the technology has a place. The alleged simplicity ought to have spinoff advantages, although the alleged spontaneous combustibility might offset these, and the alleged financial benefit might be simply the consequence of government subsidies. Left to its own devices, without government ideological distortions, the free marketplace would sort all this out.

Climate scare à la carte. (Source of screenshots: CEI.org)

More importantly, these results ought to cause politicians to re-examine their climate-related policies. As van Wijngaarden and Happer put it in their paper, “At current concentrations, the forcings from all greenhouse gases are saturated. The saturations of the abundant greenhouse gases H2O and CO2 are so extreme that the per-molecule forcing is attenuated by four orders of magnitude…” The term “forcings” refers to a complicated concept but, at bottom, signifies the ability (or lack) to influence the Earth’s energy balance. The words “forcings…are saturated” could be restated crudely in layperson’s terms as, “CO2 is impotent.”
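The notion of diminishing per-molecule effect can be illustrated with the standard simplified expression for CO2 radiative forcing from the mainstream literature (Myhre et al., 1998) – an expression not drawn from the papers discussed here, included only to show why “more CO2” does not translate linearly into “more forcing”:

```latex
% Standard simplified expression for CO2 radiative forcing
% (Myhre et al., 1998) -- not from the papers discussed in this article.
\Delta F = 5.35 \,\ln\!\left(\frac{C}{C_0}\right)\ \mathrm{W\,m^{-2}}
% Because the dependence is logarithmic, each doubling of the
% concentration (C = 2C_0) adds the same fixed increment of about
% 3.7 W/m^2, so the effect of each additional molecule shrinks as
% the concentration grows.
```

The dispute described in this article is, in effect, over whether even this logarithmic increment remains meaningful at current concentrations, or whether absorption is already so complete that further increases change essentially nothing.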

Kubicki, Kopczyński and Młyńczak are even more blunt. “The presented material shows that despite the fact that the majority of publications attempt to depict a catastrophic future for our planet due to the anthropogenic increase in CO2 and its impact on Earth’s climate, the shown facts raise serious doubts about this influence,” the three Polish co-authors write in their experimental paper. “In science, especially in the natural sciences, we should strive to present a true picture of reality, primarily through empirical knowledge.”

If, indeed, the CO2 concentration in the Earth’s atmosphere is well beyond the level where increases are causing additional LWR to be absorbed and, as a consequence, changing the climate, then all government policies intended to reduce/eliminate CO2 emissions in order to stop climate change are just as effective as King Canute’s efforts to stop the tides. The only difference is that Canute was aware of the futility.

Jim Mason holds a BSc in engineering physics and a PhD in experimental nuclear physics. His doctoral research and much of his career involved extensive analysis of “noisy” data to extract useful information, which was then further analyzed to identify meaningful relationships indicative of underlying causes. He is currently retired and living near Lakefield, Ontario.


C2C Journal

Natural Gas – Not Nuclear – Is the Key to Powering North America’s Future

From the C2C Journal

By Gwyn Morgan

After decades on the outs with environmentalists and regulators, nuclear power is being heralded as a key component for a “net zero” future of clean, reliable energy. Its promise is likely to fall short, however, due to some hard realities. As North America grapples with the challenge of providing secure, affordable and sustainable energy amidst soaring electricity demand, it is time to accept this fact: natural gas remains the most practical solution for powering our grid and economy.

Nuclear power’s limitations are rooted in its costs, risks and delays. Even under ideal circumstances, building or restarting a nuclear facility is arduous. Consider Microsoft’s much-publicized plan to restart the long-dormant Unit 1 reactor at Three Mile Island in Pennsylvania. This project is lauded as proof of an incipient “nuclear revival”, but despite leveraging existing infrastructure it will cost US$1.6 billion and take four years to bring online.

This is not a unique case. Across North America, nuclear energy projects face monumental lead times. The new generation of small modular reactors (SMRs), often touted as a game-changer, is still largely theoretical. In Canada – Alberta in particular – discussions around SMRs have been ongoing for years, with no concrete progress. The most optimistic projections estimate the first SMR in Western Canada might be operational by 2034.

The reality is that nuclear energy cannot scale quickly enough to meet urgent electricity needs. Canada’s power grid is already strained, and electricity demand is set to grow significantly, driven by electric vehicles and enormous data centres for AI applications. Nuclear power, even if expanded aggressively, cannot fill the gap within the necessary timeframes.

Natural gas, by contrast, is abundant, flexible, low-risk – and highly affordable. It accounts for 40 percent of U.S. electricity generation and plays a critical role in Canada’s energy mix. Unlike nuclear, natural gas infrastructure can be built rapidly, ensuring that new capacity comes online when it’s needed – not decades later. Gas-fired plants are cost-effective and capable of providing consistent, large-scale power while being capable of rapid starts and shut-downs, making them suitable for meeting both base-load and “peaking” power demands.

Climate-related concerns surrounding natural gas need to be put in perspective. Natural gas is the lowest-emission fossil fuel and produces less than half the carbon dioxide of coal per unit of energy output. It is also highly adaptable, supporting renewable energy integration by compensating for the intermittency of wind and solar power.
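The “less than half” comparison can be sanity-checked with a quick back-of-envelope calculation. The emission factors and plant efficiencies below are assumptions for illustration – figures in the range commonly cited by the U.S. Energy Information Administration – not numbers taken from this article:

```python
# Back-of-envelope check of the "less than half the CO2 of coal" claim.
# Emission factors (kg CO2 per MMBtu of fuel heat) and plant efficiencies
# are illustrative assumptions, roughly in line with U.S. EIA figures.
GAS_KG_PER_MMBTU = 53.1    # natural gas
COAL_KG_PER_MMBTU = 93.3   # bituminous coal

GAS_EFFICIENCY = 0.55      # modern combined-cycle gas plant (assumed)
COAL_EFFICIENCY = 0.35     # typical coal steam plant (assumed)

def kg_co2_per_mwh(kg_per_mmbtu: float, efficiency: float) -> float:
    """CO2 emitted per MWh of electricity delivered.

    One MWh of electricity requires (3.412 / efficiency) MMBtu of fuel heat,
    since 1 MWh = 3.412 MMBtu of thermal energy.
    """
    mmbtu_per_mwh_elec = 3.412 / efficiency
    return kg_per_mmbtu * mmbtu_per_mwh_elec

gas = kg_co2_per_mwh(GAS_KG_PER_MMBTU, GAS_EFFICIENCY)
coal = kg_co2_per_mwh(COAL_KG_PER_MMBTU, COAL_EFFICIENCY)
print(f"gas:  {gas:.0f} kg CO2/MWh")   # roughly 329
print(f"coal: {coal:.0f} kg CO2/MWh")  # roughly 910
print(f"ratio: {gas / coal:.2f}")      # about 0.36 -- well under half
```

Note the efficiency advantage matters: per unit of fuel heat alone, gas emits roughly 57 percent of coal’s CO2; it is the higher conversion efficiency of combined-cycle plants that pushes the per-MWh figure below half.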

Nuclear energy advocates frequently highlight its zero-emission credentials, yet they overlook its immense challenges, not just the front-end problems of high cost and long lead times, but ongoing waste disposal and future decommissioning.

Natural gas, by comparison, presents fewer risks. Its production and distribution systems are well-established, and North America is uniquely positioned to benefit from the vast reserves underlying all three countries on the continent. Despite low prices and ever-increasing regulatory obstacles, Canada’s natural gas production has been setting new records.

Streamlining regulatory processes and expanding liquefied natural gas (LNG) export capacity would help revive Canada’s battered economy, with plenty of natural gas left over to help meet growing domestic electricity needs.

Critics argue that investing in natural gas is at odds with the “energy transition” to a glorious net zero future, but this oversimplifies the related challenges and ignores hard realities. By reducing reliance on dirtier fuels like coal, natural gas can help lower a country’s greenhouse gas emissions while providing the reliability needed to support economic growth and renewable energy integration.

Europe’s energy crisis following the recent reduction of Russian gas imports underscores natural gas’s vital role in maintaining reliable electricity supplies. As nations like Germany still phase out nuclear power due to the sheer blind ideology of their left-wing parties, they’re growing more dependent on natural gas to keep the lights (mostly) on and the factories (partially) humming.

Europe is already a destination for LNG exported from the U.S. Gulf Coast, and American LNG exports will soon resume growth under the incoming Trump Administration. Canada has the resources and know-how to similarly scale up its LNG exports; all we need is a supportive federal government.

For all its theoretical benefits, nuclear power remains impractical for meeting immediate and medium-term energy demands. Its high costs, lengthy timelines and significant remaining public opposition make it unlikely to serve as North America’s energy backbone.

Natural gas, on the other hand, is affordable, scalable and reliable. It is the fuel that powers industries, keeps homes warm and provides the stability our electricity grid needs – whether or not we ever transition to “net zero”. By prioritizing investment in natural gas infrastructure and expanding its use, we can meet today’s energy challenges head-on while laying the groundwork for tomorrow’s innovations.

The original, full-length version of this article was recently published in C2C Journal.

Gwyn Morgan is a retired business leader who was a director of five global corporations.

C2C Journal

A Rush to the Exits: Forget Immigration, Canada Has an Emigration Crisis

From the C2C Journal

By Scott Inniss

Canada’s open immigration policy has often been hailed as a positive thing, contributing to the building of the country. Yet the Trudeau government’s decade-long determination to drive immigration numbers ever-higher – a policy that public outcry now has it scrambling away from – has obscured an important and discouraging phenomenon. Every year, tens of thousands of Canadians leave the country, taking their skills and ambitions with them, and leaving Canada diminished.

Emigration is the flipside of the immigration issue – a side that has been largely ignored. Statistics Canada estimates that more than 104,000 people left Canada in 2023-2024, a number that has been rising for the past few years. Another study put the number of Canadian citizens living abroad in 2016 at between 2.9 million and 5.5 million, with a “medium” scenario of 4,038,700 – or about 12.6 percent of the Canadian population that year (the latest for which this kind of analysis exists).

This trend isn’t just an abstract problem; it undermines the very economic goals policymakers hope to achieve through immigration. Emigrants are younger, better educated, and earn higher incomes than the average Canadian, according to StatCan’s study: “The departure of people with these characteristics raises concerns about the loss of significant economic potential and the retention of a highly skilled workforce.” Canada is losing its best and brightest, many of them to the U.S. A survey by the U.S. Census Bureau this year said the number of people moving from Canada to the U.S. was up 70 percent from a decade ago.

Canada’s economic decline is a big reason for the exodus. In 2022, all 10 Canadian provinces had median per capita incomes lower than that of the lowest-earning American state. Canada’s per capita GDP growth has also stagnated, with projections placing the country dead last among OECD nations out to 2060. Our productivity is in decline and business investment is moribund, meaning employers in other countries are able to pay more and compete for qualified labour.

The high cost of living, particularly skyrocketing housing costs, is an increasingly large factor pushing skilled Canadians abroad. A recent survey by Angus Reid reported that 28 percent of Canadians are considering leaving their province due to unaffordable housing, with 42 percent of those considering a move outside Canada.

Even immigrants to Canada are losing faith and moving on. A recent report from the Institute for Canadian Citizenship, entitled The Leaky Bucket, found that “onward” migration had been steadily increasing since the 1980s. A follow-up survey of more than 15,000 immigrants found that 26 percent said they are likely to leave Canada within two years, with the proportion rising to over 30 percent among federally selected economic immigrants – those with the highest scores in the points system.

“While the fairy tale of Canada as a land of opportunity still holds for many newcomers,” wrote Daniel Bernhard, CEO of the ICC, there is undeniably a “burgeoning disillusionment. After giving Canada a try, growing numbers of immigrants are saying ‘no thanks,’ and moving on.” It’s a particularly stark phenomenon considering that most immigrants have come from much poorer, less developed and often autocratic or unsafe nations. That these people find Canada – for decades considered the ultimate destination for those seeking a better life – so disappointing that their best response is to leave amounts to a damning indictment.

Consider Elena Secara, an immigrant from Romania who built a life here only to find Canada’s economic reality falling short of her expectations. After nearly two decades, Secara plans to return to Romania, a country she sees as improving, while Canada, she says, “is getting worse and worse. Canada is declining…In Romania there are much more opportunities for professionals, the medical system is better, the food is better.” And, she adds with a laugh, “Even the roads are better.” One of her sons has already voted with his feet, and is now living in Romania.

That a country like Romania, for years one of Europe’s poorest and most corrupt nations, can now attract emigrants from Canada should be sobering for policymakers. Canada is facing ever-greater competition just as it enters the second decade of what may be its longest and most serious economic deterioration since Confederation.

Each emigrant lost represents not just an individual choice but a systemic failure to provide opportunity at home. As the revolving door of emigration spins faster, Canada faces a reckoning. Our political leaders must address the housing crisis, lower tax burdens, and foster a more competitive economy to retain the talent Canada desperately needs. Without action, Canada’s silent exodus risks becoming a defining national failure – one that leaves the country less resilient, less innovative, and less prepared for the future.

The original, full-length version of this article was recently published in C2C Journal.

Scott Inniss is a Montreal writer.
