# Thread: Does the Atmosphere Actually Increase the Energy the Earth Receives?

1. I have a question that is not really an "Environmental Science" question, but I think the people here may know the appropriate science to answer it.

Please note: This is NOT a question regarding "Global Warming" per se, and it certainly isn't a question about AGW or what more CO2 may or may not do to our mean temperatures. Please do not construe this as an attack on AGW.

My question is whether the Earth's atmosphere actually ends up increasing the amount of radiative energy the Earth receives. This is different from asking whether it increases the average temperature of the Earth.

Please let me explain.

There are three ways in which the atmosphere can decrease the amount of energy the Earth receives (or increase how rapidly it offloads it):
i) Convection/evaporation and other minor non-radiative cooling.
ii) The atmosphere reflects some sunlight that otherwise would have hit the Earth.
iii) The atmosphere absorbs some sunlight that otherwise would have hit the Earth.

And there is, obviously, one way for an atmosphere to increase the amount of energy absorbed by the earth: by (re)-radiating energy downward.

Now, just by comparing the Earth with the Moon, one would say the net effect of the atmosphere is to heat the Earth, because the average temperature of the Earth is greater than that of the Moon. This difference is around 40 degrees, but I've been told that part of it is due to the Moon's structure being different, and that the expected difference in mean temperature between an Earth with an atmosphere and an Earth without one is less.

But I don't think this tells the whole story...not if the question you are asking is "How much energy does the Earth's surface receive?" In that case, temperature itself is not a reliable measure. The amount of heat energy due to internal processes on Earth and the amount conveyed away by convection/evaporation are supposed to be pretty small, so roughly we can say:

Amount of energy the earth receives per square meter = Amount of energy the earth radiates away per square meter.

The issue being that the Earth has a roughly stable total energy budget, and the vast majority of the heat leaving is by radiation.

But if we use this equivalence to measure how much energy the Earth receives, we don't want to look at the temperature of the Earth...we want to look at the fourth power of temperature. That is what decides the amount of energy radiated away by a surface: not the temperature, but T^4. [Stefan-Boltzmann Law]

And this makes things interesting indeed, because the Moon has far more severe temperature swings than the Earth does...so even though the mean temperature of the Moon is less, my guess is that it actually radiates away more per square meter. [I'll model it in Excel tomorrow.] This would mean that the atmosphere does not really "heat" the Earth (in the sense of increasing the amount of energy it receives, even counting that from re-radiation); rather, the fact that it evenly distributes the energy conspires with our poor choice of temperature as a measure of heat intake/output to make it seem like it does.
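The T^4 point is easy to sketch numerically. A rough illustration (the temperatures below are illustrative round numbers, not measured lunar or terrestrial values, and emissivity is taken as 1):

```python
# Sketch of the Stefan-Boltzmann point: a surface with large temperature
# swings radiates more per square meter than a uniform surface at the
# same mean temperature.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiated_power(temps_k):
    """Mean radiated power per m^2 for a surface split evenly among the
    given temperatures (emissivity assumed to be 1)."""
    return sum(SIGMA * t**4 for t in temps_k) / len(temps_k)

moon_like = radiated_power([100.0, 390.0])   # cold night side, hot day side
earth_like = radiated_power([245.0, 245.0])  # same 245 K mean, uniform

print(moon_like, earth_like, moon_like > earth_like)
```

Both surfaces average 245 K, yet the swinging one radiates roughly three times as much per square meter, because the hot side's T^4 term dominates.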

So, as I said...this is not really about climate problems..which ARE [or WOULD BE] caused by temperature itself. It is purely a scientific question about whether the atmosphere is actually increasing the amount of radiative energy received by the Earth.

If it is not the case [as a back-of-the-envelope estimation using Stefan-Boltzmann Law suggests] that the atmosphere increases the amount of energy received, then it means we need to change the way we discuss "the greenhouse effect" [which is already a misnomer, since a greenhouse is warmed by a completely different mechanism.]

It might be more honest, then, to say that an atmosphere "warms" the Earth [in the sense of raising the average temperature] because the redistribution of energy [both by having an enveloping atmosphere and by convection/conduction] moderates temperatures and this moderation of temperature makes the planet less efficient at radiating its heat away. The mean temperature has to rise higher than it would otherwise be so that Earth can be in thermal equilibrium.

I'd be interested in anyone's thoughts on the above basic idea.

2.

3. My question is whether the Earth's atmosphere actually ends up increasing the amount of radiative energy the Earth receives. This is different from asking whether it increases the average temperature of the Earth.
Found your discussion rather confusing. The Earth's atmosphere is part of the Earth.

For the entire system, the exchange is almost entirely reflection of incoming radiation and outgoing IR radiation--that's it. There's a really tiny amount of energy leaving by transfer of rotational momentum to the moon and direct escape of atoms.

If you meant the surface of the earth, there's a whole host of ways to gain and lose energy, including condensation/evaporation, fusion/sublimation, conduction, horizontal advection (for oceans), visible gain/IR radiation loss, and others.

4. Originally Posted by FireBones
That is what decides the amount of energy radiated away by a surface, not the temperature but the T^4. [Stefan-Boltzmann Law]
Is the Earth a near-perfect black body? If it's not, does your argument still apply?

5. Originally Posted by Lynx_Fox
Found your discussion rather confusing. The Earth's atmosphere is part of the Earth.
That might be why you found my discussion confusing...I'm not considering the earth's atmosphere as "part of the earth." I'm considering the Sun-Earth-Atmosphere as a system with three elements.

And I think it is fair to do so because whenever "the greenhouse effect" is brought up, the idea that "the atmosphere reradiates energy downward to the surface" is part of that discussion. In other words, (in many places at least), the "greenhouse effect" is discussed as a mechanism for the Earth (that is, its surface) to receive more energy than it otherwise would have. I'm investigating whether that is really a fair description.

Originally Posted by Lynx_Fox
If you meant the surface of the earth, there's a whole host of ways to gain and lose energy, including condensation/evaporation, fusion/sublimation, conduction, horizontal advection (for oceans), visible gain/IR radiation loss, and others.
At least the tallies I have seen have suggested that condensation/evaporation/sublimation represents a small percentage of the heat lost by the earth. I don't see how horizontal advection represents a loss of heat by the Earth (as a planet).

The principal source of thermal energy for Earth is radiation [either from the sun or from the atmosphere], and the principal mechanism for exporting thermal energy from the Earth (as a planet) is blackbody radiation.

P.S. Why aren't my "quote" tags working?

6. Originally Posted by Ophiolite
Originally Posted by FireBones
That is what decides the amount of energy radiated away by a surface, not the temperature but the T^4. [Stefan-Boltzmann Law]
Is the Earth a near perfect black body? If it's not does your argument still apply?
Yes, the earth is a near blackbody.

7. I found the information I needed in the 2009 energy budget document at http://ams.allenpress.com/archive/15...7-90-3-311.pdf .

According to that, it does appear that the atmosphere increases the total energy input, but my guess is that the actual increase in AVERAGE TEMPERATURE is more due to the moderation of temperature at the Earth's surface, making it a less efficient blackbody.

I'll model this sometime soon and post my results later.

8. When you calculate the earth as a black-body without an atmosphere, with the appropriate emissivity, and 1/4 disk area to sphere area, you get about a -18 C average temperature. The moon is effectively the same distance from the sun, but varies because of a slightly different emissivity. The radiative forcing from the re-emitted radiation in the atmosphere is estimated to increase the average temperature by about 33 C, or to about 15 C.
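The calculation described above can be sketched in a few lines. A minimal version, assuming a solar constant of about 1361 W/m^2 and the commonly quoted planetary albedo of ~0.30 (the cloud-inclusive value, which a later post in this thread disputes using for an airless Earth):

```python
# Effective (blackbody) temperature of an airless, rapidly rotating sphere.
# Absorbed power = S * (1 - albedo) * pi * r^2 (the intercepted disk),
# emitted power = 4 * pi * r^2 * sigma * T^4 (the whole sphere),
# hence the factor of 1/4 (disk area over sphere area).
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1361.0              # approximate solar constant at 1 AU, W m^-2
ALBEDO = 0.30           # commonly quoted planetary albedo (includes clouds)

T = (S * (1 - ALBEDO) / (4 * SIGMA)) ** 0.25
print(round(T, 1), "K, about", round(T - 273.15, 1), "C")  # roughly -18 C
```

With these inputs T comes out near 255 K, i.e. about -18 C, matching the figure quoted above.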

9. Originally Posted by FireBones
Originally Posted by Lynx_Fox
Found you discussion rather confusing. The Earth's atmosphere is part of the earth.
That might be why you found my discussion confusing...I'm not considering the earth's atmosphere as "part of the earth." I'm considering the Sun-Earth-Atmosphere as a system with three elements.
Ok fair enough.

Originally Posted by Lynx_Fox
If you meant the surfure of the earth there's a whole host of ways to gain and loose energy including condensation/evaporation, fusion/sublimation, conduction, horizonal advection (for oceans) visible gain/IR radation loss and others.
At least the tallies I have seen have suggested that condensation/evaporation/sublimation represents a small percentage of the heat lost by the earth. I don't see how horizontal advection represents a loss of heat by the Earth (as a planet).
If by earth you mean the entire earth system, then those don't play any role whatsoever. There's only reflection of incoming radiation, emission of IR radiation, and whatever tiny bit is lost by escaping molecules and tidal transfer to the moon.

Greenhouse gases don't affect the earth system other than changing the spectrum of the IR radiation being emitted. The total amount being emitted is the same with or without any greenhouse gases (albedo being the same).

The principal source of thermal energy for Earth is radiation [either from the sun or from the atmosphere], and the principal mechanism for exporting thermal energy from the Earth (as a planet) is black body radiation.
You're back to the confusing terminology again.... referring to earth and earth's atmosphere as separate things, when above you said you were considering the atmosphere as part of the "earth."

Furthermore, the earth system doesn't emit like a black body--not even close, because the emission must pass through the atmosphere's transparent bands.

--
P.S. Why aren't my "quote" tags working?
Usually it's because there isn't an even number of quotes and slash-quotes.

10. Originally Posted by Wild Cobra
The radiative forcing from the re-emitted radiation in the atmosphere is estimated to increase the average temperature by about 33 C, or to about 15 C.
Of the near surface atmosphere, not the total atmosphere.

11. Originally Posted by Wild Cobra
When you calculate the earth as a black-body without an atmosphere, with the appropriate emissivity, and 1/4 disk area to sphere area, you get about a -18 C average temperature. The moon is effectively the same distance from the sun, but varies because of a slightly different emissivity. The radiative forcing from the re-emitted radiation in the atmosphere is estimated to increase the average temperature by about 33 C, or to about 15 C.
That number is inaccurate [though somehow widely quoted]. What is remarkable is that it is not hard to calculate the blackbody radiation of the Earth. You can see the derivation at http://en.wikibooks.org/wiki/Climate...uence_on_Earth

The commonly cited, erroneous -18 degree value is either due to:
1) Using an emissivity of 0.61 [which confuses two definitions of "emissivity": one being the standard physical property used in the S-B law, the other being the derived equivalent value of the Earth inside the Earth-Atmosphere system. Obviously, this latter has no relevance to a discussion of what the Earth would be like without an atmosphere.]
2) Using an albedo that includes clouds [which also makes no sense...without an atmosphere, there are no clouds].

12. Originally Posted by Lynx_Fox
Originally Posted by FireBones

At least the tallies I have seen have suggested that condensation/evaporation/sublimation represents a small percentage of the heat lost by the earth. I don't see how horizontal advection represents a loss of heat by the Earth (as a planet).
If by earth you mean the entire earth system, then those don't play any role whatsoever. There's only reflection of incoming radiation, emission of IR radiation, and whatever tiny bit is lost by escaping molecules and tidal transfer to the moon.

Greenhouse gases don't affect the earth system other than changing the spectrum of the IR radiation being emitted. The total amount being emitted is the same with or without any greenhouse gases (albedo being the same).

The principal source of thermal energy for Earth is radiation [either from the sun or from the atmosphere], and the principal mechanism for exporting thermal energy from the Earth (as a planet) is black body radiation.
You're back to the confusing terminology again.... referring to earth and earth's atmosphere as separate things, when above you said you were considering the atmosphere as part of the "earth."
Nah, in this case it was just that a source I looked at earlier misrepresented the amount of heat lost due to latent heat flux and convection.

13. I modeled this yesterday [or, rather, this morning] and verified the conclusion I had suggested earlier: most of the warming of Earth's surface due to the atmosphere comes not from the increased energy due to back radiation, but rather from temperature moderation lowering the effectiveness of Earth as a blackbody radiator.

Just to reiterate, by "lowering the effectiveness of Earth as a blackbody radiator" I do not mean "because some of the outgoing radiation is trapped." Radiation escaping earth's surface counts as radiation regardless of whether that radiation is subsequently absorbed by the atmosphere.

Rather, I'm referring to the fact that the effectiveness of a blackbody radiator is not linearly dependent on temperature. A radiator made of a hot side (400 K) and a cold side (200 K) radiates much more energy than a radiator at the middle temperature (300 K).

Just to give an impression of how much more energy, compare 2^4 + 4^4 [16 + 256 = 272] to 3^4 + 3^4 [81 + 81 = 162]...so the hot-and-cold radiator radiates about 68% more energy than the uniformly warm one, though they have the same "average" temperature.
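That arithmetic can be checked directly (same 200 K / 400 K versus 300 K comparison, working in units of (100 K)^4 so the numbers match):

```python
# Hot-and-cold radiator vs uniform radiator at the same mean temperature.
# The common factor sigma * (100 K)^4 cancels when taking the ratio.
hot_cold = 2**4 + 4**4   # 16 + 256 = 272
uniform = 3**4 + 3**4    # 81 + 81 = 162
excess = hot_cold / uniform - 1

print(hot_cold, uniform, round(excess * 100))  # 272 162 68
```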

You can estimate this effect by comparing:

A. The average surface temperature of the moon [~ -36 degrees Celsius]
B. The blackbody temperature of the Earth [5.5 degrees Celsius]
C. The blackbody temperature of the green-house affected Earth [16 degrees Celsius]

The gap between B and C is roughly the increase in temperature due to the fact that the Earth receives more radiation [due to back radiation mitigated by convection/latent heat flux]. The gap between A and B is approximately the difference due to moderation of temperatures.

(Actually, since regolith has a higher albedo than earth, the gap between A and B slightly overstates this difference. The difference in Bond albedo is about 7%, which suggests a difference in Kelvin temperature of about 1.8 percent, or about 5 degrees, as a correction.)
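A quick check of that correction, assuming the equilibrium temperature scales as (1 - albedo)^(1/4) and taking the 5.5 C (~278.7 K) blackbody temperature quoted above as the reference point:

```python
# A ~7% difference in the absorbed fraction (1 - albedo) changes the
# equilibrium temperature by a factor of (1.07)**0.25.
frac = 1.07 ** 0.25 - 1          # fractional change in Kelvin temperature

print(round(frac * 100, 1))      # ~1.7 percent
print(round(frac * 278.7, 1))    # ~5 K at Earth's blackbody temperature
```

This lands close to the "about 1.8 percent, or about 5 degrees" estimate in the parenthetical above.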

You can also see the same thing by comparing the surface temperature of the Moon to the temperature of the regolith underneath. The mean temperature of the Moon's regolith one meter down is significantly hotter than the mean temperature of the surface [by 35-40 degrees Celsius], showing the reverse effect: the Moon's surface [varying widely through the lunar day] is colder on average because this variable temperature makes it a more efficient radiator.

14. Originally Posted by FireBones
That number is inaccurate [though somehow widely quoted]. What is remarkable is that it is not hard to calculate the blackbody radiation of the Earth. You can see the derivation at http://en.wikibooks.org/wiki/Climate...uence_on_Earth

The commonly cited, erroneous -18 degree value is either due to:
1) Using an emissivity of 0.61 [which confuses two definitions of "emissivity": one being the standard physical property used in the S-B law, the other being the derived equivalent value of the Earth inside the Earth-Atmosphere system. Obviously, this latter has no relevance to a discussion of what the Earth would be like without an atmosphere.]
2) Using an albedo that includes clouds [which also makes no sense...without an atmosphere, there are no clouds].
I've heard that before, and never really explored it. I cited the lowering of the greenhouse effect once myself, but was scoffed at. The AGW crowd likes the 33 C greenhouse effect. Your cited 10 C greenhouse effect really limits how much CO2 and CH4 can potentially warm the atmosphere. If that science is true, then the AGW crowd has nothing to cry about.

15. Originally Posted by Lynx_Fox
Originally Posted by Wild Cobra
The radiative forcing from the re-emitted radiation in the atmosphere is estimated to increase the average temperature by about 33 C, or to about 15 C.
Of the near surface atmosphere, not the total atmosphere.
Correct. Sorry I didn't add the specificity.

16. A. The average surface temperature of the moon [~ -36 degrees Celsius]
B. The blackbody temperature of the Earth [5.5 degrees Celsius]
C. The blackbody temperature of the green-house affected Earth [16 degrees Celsius]

The gap between B and C is roughly the increase in temperature due to the fact that the Earth receives more radiation [due to back radiation mitigated by convection/latent heat flux]. The gap between A and B is approximately the difference due to moderation of temperatures.
That is incorrect. What you're measuring is the difference between blackbody and effective emission temperature. That is not the same as the near-surface temperature which results from greenhouse gases. In fact, greenhouse gases can increase the near-surface temperature by a large amount and decrease the mid and upper levels, thus increasing the lapse rate of the atmosphere while having little effect on the effective radiating temperature.

17. If you meant the surface of the earth there's a whole host of ways to gain and lose energy, including condensation/evaporation, fusion/sublimation, conduction, horizontal advection (for oceans), visible gain/IR radiation loss, and others.

By the way, nice post.
