Moberg et al: Highly variable Northern Hemisphere temperatures reconstructed from low- and high-resolution proxy data

The 10th Feb edition of Nature has a nice paper, Highly variable Northern Hemisphere temperatures reconstructed from low- and high-resolution proxy data, by Anders Moberg, Dmitry M. Sonechkin, Karin Holmgren, Nina M. Datsenko & Wibjörn Karlén
(doi:10.1038/nature03265). It is sure to attract a lot of attention, but I'm writing this without having read any reviews, except the News piece and News & Views in Nature. Having read what I've written, I suspect that this post requires a lot of background knowledge to make sense of... do feel free to ask questions, here or on sci.env.

What the paper does is to use wavelets to do the temperature reconstruction. It arrives at a more variable record than before, but the temperatures since 1990 are still the warmest in the last 2000 years. Something for everyone...

The novel thing about this paper is the wavelets. What they have done is to combine the high-frequency (< 80 y) signal from tree rings with the low-frequency (> 80 y) signal from lake sediments and other such non-annually resolved proxies. This does two things that they think are good: it allows the non-annually resolved proxies to be used (previous studies, e.g. MBH98, used only proxies with at least one value per year, to allow calibration against the instrumental record; this study can use data that provide only 50 y means); and it throws away the long-term signal from the tree rings, which they don't trust. There *are* problems with the long-term signal from tree rings, as I understand it, but I'm not sure that throwing all of it away is such a good idea.
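To make the band-splitting idea concrete, here is a rough sketch of that kind of merge. The paper uses a wavelet transform; the Fourier band-split below is a crude stand-in with the same intent, and the function name, the synthetic series and the 80-year cutoff handling are mine, not theirs.

```python
import numpy as np

def merge_proxies(tree_rings, low_res, cutoff_years=80):
    """Illustrative band-split merge (NOT the paper's actual wavelet
    method): keep periods shorter than `cutoff_years` from the
    annually resolved series, and periods longer than the cutoff
    (including the mean) from the low-resolution series."""
    n = len(tree_rings)
    freqs = np.fft.rfftfreq(n, d=1.0)   # cycles per year
    cutoff = 1.0 / cutoff_years

    hi = np.fft.rfft(tree_rings)
    lo = np.fft.rfft(low_res)

    hi[freqs <= cutoff] = 0.0   # discard tree-ring long-term signal
    lo[freqs > cutoff] = 0.0    # discard high frequencies of low-res data

    return np.fft.irfft(hi + lo, n)
```

Note what this implies: any long-term offset or trend in the tree-ring series is thrown away entirely, so the shape of the smoothed curve is set by the low-resolution proxies alone.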

The result is a signal with more long-term variability than, e.g., MBH98. They end up with two "warm peaks" in the smoothed record around AD 1000 and 1100, at about zero on their anomaly scale, with annual peaks up to +0.4 °C or a bit less; the most recent data from the instrumental record, post 1990, then peak at +0.6 or a bit more on the same scale. Their minimum around 1600 (the "LIA") is about -0.7.

On the politics side, this is bound to stir up interest, since the "hockey stick" has become so totemic. Moberg et al note the disparity with MBH (and Mann & Jones 2003; incidentally there is a nice chain of papers: Moberg et al; Jones and Moberg 2003; and then M&J03, which links Moberg to Mann) and unsurprisingly prefer their own version, quoting von Storch in their support, again no surprise. I strongly expect to see unbalanced quotes from this paper in the septic press, with the increased variability emphasised and no sign of "We find no evidence for any earlier periods in the last millennia with warmer conditions than the post-1990 period - in agreement with previous similar studies (1-4,7)", where (1) is MBH98, (2) is MBH99 and (7) is Mann and Jones '03. The "News" article in Nature explicitly rejects the idea that this means we're not causing the current warming, and it quotes von Storch: "it does not weaken in any way the hypothesis that recent observed warming is a result mainly of human activity".

Now for my concerns about the paper. I'm no palaeo person, so these comments are not to be taken too seriously, though I have elsewhere attempted to make sense of some of these issues:

I am slightly doubtful that the wavelets stuff has added much to the mix, though it looks impressive. As far as I can tell, they use the wavelets to merge the high-frequency data from the tree rings with low-frequency data from the other sources, which have lower temporal resolution. But... that means the low-res proxies are doing all the work, and the tree rings are just adding a pretty-looking fringe of noise that your eye reads as sort-of error bars. Or have I missed something?

Second, the lack of spatial averaging seems a bit unfortunate, though they say it doesn't matter.

Third, because they have used the wavelets, they end up with a non-dimensional signal which has to be normalised against the instrumental record from 1859 to 1979. So if (for example) their reconstruction was too flat in that period, the renormalisation would pump it up in the pre-industrial period; or if too noisy, it would get toned down. The adjustment is done to match "mean value and variance", but (this being Nature) the description is a bit brief: do they mean the variance of the smoothed or the full series? If the full series (which is the default, I suppose), then most of the variance is probably coming from the interannual variations of the tree rings, *but* the bit that's really interesting is the long-term signal. If it's the long-term signal that is being matched, then you only have 1-2 degrees of freedom to match in the period 1859-1979.
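The renormalisation itself is simple to sketch, assuming it is a plain mean-and-variance match over the calibration window (the paper doesn't give the details, so treat the function below as my guess at the kind of operation involved, with names of my own invention):

```python
import numpy as np

def calibrate(recon, instrumental, cal_slice):
    """Rescale a non-dimensional reconstruction so its mean and
    variance over the calibration window match the instrumental
    record. A sketch of the kind of renormalisation described,
    not the paper's exact procedure."""
    r = recon[cal_slice]
    scale = instrumental.std() / r.std()
    offset = instrumental.mean() - scale * r.mean()
    return scale * recon + offset
```

The concern above falls straight out of this: `scale` multiplies the *entire* series, so if the reconstruction happens to be too flat over 1859-1979, the pre-industrial part gets inflated by the same factor.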


  1. EurekAlert: Natural climate change may be larger than commonly thought
  2. Nature: news: Past climate comes into focus but warm forecast stays put; News & Views; Article (first para); nb Nature content requires subscription.
  3. Quark Soup (inc graph)

[Correction: v1.0 of this omitted the text "80y) signal from tree rings with the low-frequency ("]


Blogger Lumo said...

Hey William,

I am now the majority shareholder of your blog, see


So I expect you to post more balanced articles. For example, Steve McIntyre analyzed Moberg slightly more carefully than you.

If I am not satisfied with this blog, I will shut it off - the same with RealClimate.org. The first thing you should do on realclimate.org is to include a link to climateaudit.org to the main page.

All the best

10:13 pm  
Anonymous Peter Hearnden said...

It's a very fine blog, though, obviously, some don't see that.

However, if you want to 'buy' a fantasy version of it in some internet game, that's up to you. I prefer the real world.


9:40 am  
Blogger Lumo said...

Fortunately, most people don't think that this is a fine blog. Otherwise I would pay more than 500 blog dollars for this penny stock from my multimillion blog capital. ;-)

10:54 pm  
Anonymous Anonymous said...


Interesting post. I take it that you base the comment "But... that means the low-res proxies are doing all the work" on their figure showing the colourful frequency vs. time plot (I don't have the paper in front of me so I don't know the figure number).

If so, I have wondered about that as well. It seems that there is nothing of any consequence below 80 years, which I find strange. Most of the low-frequency proxies are subject to things like mixing problems (in the case of sediments); that is why they are low frequency. Consequently, it is not obvious why they should contribute so much.

On the other hand, this paper shows that the Bristle Cone Pine series don't contribute either. ;)

John Cross

2:28 am  
Anonymous Anonymous said...

Hi, could you please explain the following terms?

1. annually/non-annually resolved proxies --> does this mean that the data are measured equidistantly at each single year?

2. calibration against/normalised against the instrumental record --> what is this "calibration against" referring to?

Many Thanks!

4:22 pm  
