One of the strongest predictions of global warming is that the stratosphere will *cool* - unlike the troposphere, which will warm, of course. See the IPCC here for example. This turns out to be not as useful for *detecting* climate change as it might be, because ozone decreases also lower the stratospheric temperature. However...
The interesting question is, *why* does the stratosphere cool? From asking colleagues, it's quite clear that very few people have thought about this, and of the few who have, fewer still get the right answer. Indeed, I'm not absolutely sure that what I've written below *is* the right answer, but I think it is. For a long (and possibly doomed) attempt to explain it, see this at RealClimate.
[Clarification: 2005/03/05: I fear I may not have been quite as explicit as I might have been: this post is about why the stratosphere cools if all you do is change the GHGs, e.g. CO2. It is not about what happens if you decrease the ozone - that, trivially, cools the stratosphere. Consequently, I am not talking about the observed decrease in temperature in the strat - which is caused by a mixture of ozone depletion and GHG increase - but about what *would* happen in a thought experiment in which GHGs are increased but ozone is held fixed.]
Anyway: my explanation (thanks HKR) is:
in a uniformly grey non-convecting atmosphere (ie, one equally opaque at all wavelengths, and uniformly so through its depth) heated from below (ie, solar radiation warming the surface; assuming, of course, that we relax the grey assumption enough to let the solar through), increasing the greenhouse gases (GHGs) *doesn't* lead to a cooling at the top: instead, the whole atmosphere warms, though not uniformly. You can see some calcs and pictures and code here;
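The grey-atmosphere point can be checked with a toy single-slab model (a sketch of my own, not the linked calcs: one atmospheric layer, transparent to solar, with LW emissivity `eps` standing in for the GHG amount). Solving the two energy-balance equations shows that raising `eps` warms both the surface and the layer:

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def grey_temps(S=240.0, eps=0.8):
    """Single-slab grey atmosphere: transparent to solar (SW),
    LW emissivity eps. The energy balances are
      TOA:   S = (1 - eps)*SIGMA*Ts**4 + eps*SIGMA*Ta**4
      layer: eps*SIGMA*Ts**4 = 2*eps*SIGMA*Ta**4
    hence Ts**4 = S / (SIGMA*(1 - eps/2)) and Ta = Ts / 2**0.25.
    """
    Ts = (S / (SIGMA * (1 - eps / 2))) ** 0.25
    return Ts, Ts / 2 ** 0.25

# More GHG (bigger eps) warms BOTH surface and atmospheric layer:
Ts1, Ta1 = grey_temps(eps=0.8)   # ~290 K, ~244 K
Ts2, Ta2 = grey_temps(eps=0.9)   # warmer everywhere
```

The values of `S` and `eps` are illustrative, not tuned to the real Earth; the point is only the sign of the change.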
of course, the real atmos does convect; isn't totally transparent to solar; etc; but the real difference is:
the reason the real atmosphere has a stratosphere at all is that ozone absorbs UV, warming that portion of the upper atmosphere;
hence the stratosphere is considerably warmer than it would be under just longwave (LW, or IR) forcing; and CO2 is only effective in LW frequencies;
hence, increasing CO2 *increases* the stratosphere's ability to radiate in the LW, but doesn't substantially increase its ability to gain heat, because most of that comes from the SW;
hence it cools.
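The chain of reasoning above can be sketched numerically (my toy numbers, not from a real model: `Q_uv` is the fixed solar/UV heating via ozone, `F_up` the upwelling LW from below, and `eps` the layer's LW emissivity, standing in for CO2):

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def strat_temp(eps, Q_uv=15.0, F_up=240.0):
    """Toy stratospheric layer.
    Gains: Q_uv, fixed SW/UV heating via ozone (CO2-independent),
           plus eps * F_up absorbed from the upwelling LW.
    Loses: 2 * eps * SIGMA * T**4 (emits both up and down).
    Balance: T**4 = Q_uv/(2*eps*SIGMA) + F_up/(2*SIGMA), so the
    ozone term shrinks as eps (ie CO2) grows: the layer cools.
    """
    return ((Q_uv + eps * F_up) / (2 * eps * SIGMA)) ** 0.25

T_low  = strat_temp(eps=0.10)  # ~242 K
T_high = strat_temp(eps=0.12)  # ~238 K: more CO2, colder stratosphere
```

Because emission scales with `eps` while the dominant (SW) heat source doesn't, the equilibrium temperature must fall as `eps` rises - which is the whole argument in one line.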
In the troposphere (ignoring convection etc etc; the real atmos is complex...) increasing CO2 increases both the ability to gain and lose heat, and this first-order argument doesn't tell you what will happen; as it turns out, it warms.
Note: of course the fact that many people couldn't explain this makes no difference at all to the fact that climate models produce the correct answer: they just integrate the equations, and don't care about *why* things happen.
[Update in response to comment: the troposphere is the lowest bit of the atmosphere - up to about 8km. Temperature generally decreases with height at about 7 °C/km. The stratosphere comes next: temperatures *increase* with height (the temperature minimum defines the interface, called the tropopause) up to the stratopause, above which they decline again through the mesosphere. See the IPCC glossary for more.
CO2 is only radiatively active in the LW - ie, the infrared portion of the spectrum. It's essentially transparent to visible (SW) light.]