This is very well stated. For those interested in the claim that a 1 ppm increase of a molecule that makes up a small percentage of the atmosphere can cause all of this warming, here is a very in-depth and detailed YT series on CO2. It will take a while to watch, and there are parts that are very tedious, but it's also well worth it.
https://www.youtube.com/playlist?list=PLX2gX-ftPVXVzU5jGY3FaYEuuu3ANvMZb
1 ppm? Going from 280 to 419 ppm is a tad more than that. It's a 50% increase. That ignores the similar spikes in NOx and CH4 resulting from human activity.
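(For the arithmetic: (419 - 280) / 280 ≈ 0.496, or just under a 50% increase.)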
At 280 ppm we were at historically critical levels. At a 50% increase we are still at historically critical levels on a relative basis. We need another several hundred percent increase at minimum, but before then the Grand Solar Minimum will shift temps way lower and more CO2 mendacity will ensue.
Yet the oceans, forests, and grasslands of the world were thriving at this level which gave rise to human civilization. At current levels, none of that is true.
The grand solar minimum resulted in about a 1 degree (C) temperature decrease. If humans ever stop burning fossil fuels, the resulting loss of atmospheric particulates (which shade the planet and reduce temperatures) will more than overcome such small changes.
Perhaps our solar cycles won't follow recent patterns, if we're lucky. Oceans will still be acidified even if temps drop. The last time I checked, about 7 years ago, newly recorded record highs outnumbered record lows by a factor of 11:1.
There is not a single legitimate research paper that would in all decency claim that removing anthropogenic CO2 would produce any temperature change.
I am sorry, but you really should re-research this.
You do realize, I hope, that it was demonstrated in a laboratory in the 1800s (and is still very much reproducible) that CO2 is much more efficient at storing infrared (i.e., heat) energy than the dominant gases of the atmosphere, N2 and O2. Strangely enough, they suggested back then (1870s, if my memory serves) that global warming could occur from coal burning as well. Didn't even need Al Gore to tell them.
And how exactly did these 1800s researchers ascertain the PPM for this CO2?
And you believe that extrapolating all kinds of "climate" effects from an 1800s lab, when modern computing can't even accurately predict basic weather patterns a few days out, confirms all of your beliefs? Really?
I wish you all the very best in your research.
I won't claim to be a world-class scientist myself, but my grandfather was a physicist. Brought to the US as part of Project Paperclip, as a matter of fact. Suffice it to say that whenever I had a question from my high school physics or chemistry classes, he was a fantastic resource.
Similarly, we know that the rise in temps we've seen results from atmospheric changes, because we can measure the difference between average daytime highs and night-time lows. It works much the same as insulation in a house; add more insulation and the night-time low temps don't drop as much. That's exactly the trend we've observed over the last few decades. It's also why humidity gets worse (very noticeable where I'm at in Michigan!). Night-time lows are largely what squeeze atmospheric moisture from the air as dew. Even if daytime temps are identical, you'll notice that humidity is much higher after a warm night that allows the air to retain its moisture content.
If temperature increases were the result of increased solar activity, you would see a rise in daytime highs, not night-time lows.
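If anyone wants to check that trend for their own area, here is a rough sketch of how you might estimate the diurnal temperature range (daily high minus night-time low) trend from station data. The highs and lows below are made-up placeholders, so substitute real daily readings.

```python
# Rough sketch: estimate the trend in diurnal temperature range (DTR),
# i.e. daily high minus night-time low, from daily station readings.
# The highs/lows below are made-up placeholders; use real data.

def dtr_trend_per_year(highs, lows):
    """Least-squares slope of (high - low) over time, in degrees C per year."""
    dtr = [h - l for h, l in zip(highs, lows)]
    n = len(dtr)
    xs = list(range(n))
    mean_x = sum(xs) / n
    mean_y = sum(dtr) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, dtr))
    var = sum((x - mean_x) ** 2 for x in xs)
    return (cov / var) * 365  # slope per day, scaled to per year

# Made-up example: highs roughly flat, night-time lows creeping up,
# so the day/night spread (DTR) narrows over time.
days = range(3650)  # ~10 years of daily readings
highs = [30.0 + 0.0002 * d for d in days]
lows = [18.0 + 0.0010 * d for d in days]
print(dtr_trend_per_year(highs, lows))  # negative value => narrowing DTR
```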
It's not rocket science. Start with standard air, add a little CO2 (something as simple as fermenting wine would be a good source of nearly pure CO2), and see if it affects heat retention when exposed to sunlight. At standard atmospheric pressure, the number of CO2 molecules in a known volume is easily calculated. If you've ever had a chemistry class, you've probably done something very similar.
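As a rough sketch of that last calculation (assuming ideal-gas behavior at 1 atm; the 1 L volume, room temperature, and the two ppm values are just illustrative inputs):

```python
# Back-of-the-envelope: number of CO2 molecules in a known volume of air
# at a given ppm, assuming ideal-gas behavior (PV = nRT) at 1 atm.

R = 0.082057      # gas constant, L*atm / (mol*K)
N_A = 6.022e23    # Avogadro's number, molecules per mole

def co2_molecules(volume_L, ppm, temp_K=298.15, pressure_atm=1.0):
    """Total CO2 molecules in volume_L of air at the given mixing ratio (ppm)."""
    moles_air = pressure_atm * volume_L / (R * temp_K)  # total moles of air
    moles_co2 = moles_air * ppm / 1e6                   # CO2 fraction by volume
    return moles_co2 * N_A

# One liter of room-temperature air at the pre-industrial vs current ppm:
print(f"{co2_molecules(1.0, 280):.2e}")  # roughly 6.9e18 molecules
print(f"{co2_molecules(1.0, 419):.2e}")  # roughly 1.0e19 molecules
```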