
Regarding the divergence of proxy and instrument temps in the latter half of the 20th century?

September 12, 2011 by admin  
Filed under Discussions

Question by Marc G: Regarding the divergence of proxy and instrument temps in the latter half of the 20th century?

If there is a divergence between proxy and instrument temperature data, what conclusions am I to draw?

If the proxies are not accurate in the current decades (compared to instruments), then how can I be assured that the historical proxies are accurate?

If the instruments are not accurate in current decades (compared to proxies), then how can I be assured that instrumental data is accurate?

It appears that there is a lot of work being done to eliminate this divergence, but it appears to be mathematical in nature. Is anyone working on refining the experimental parameters in the determination of temperature via proxy analysis?

http://www.agu.org/journals/jd/jd0717/2006JD008318/2006JD008318.pdf

Hopefully this link works…..
Keith -->

I can’t fix the link, I think you need access to the AGU papers.
Trevor -->

You don’t have access to the AGU journals?

Here is a link that may work. It is for the abstract:

http://www.agu.org/pubs/crossref/2007…/2006JD008318.shtml

I don’t know if it will work, since it is via the AGU as well.

Best answer:

Answer by Keith P
Well, the link doesn’t work, sorry.

There might be a lot of reasons for proxy divergence, depending on the proxy. But I suspect that what you’re concerned with most is tree-ring temperature proxies. The maximum density of a tree ring has been shown to be a good proxy for overall spring-summer temperature in a given year.

Tree-ring proxies have broken down in the latter half of the 20th century, and this is believed to be due to the effects of air pollution, principally ozone and its influence on UV radiation. Fortunately, we have actual thermometer records from this period, so the breakdown is not terribly significant to paleoclimate science.

http://www.sciencedirect.com/science?_ob=ArticleURL&_udi=B6VF0-49G5SBP-1&_user=10&_coverDate=01%2F31%2F2004&_rdoc=1&_fmt=&_orig=search&_sort=d&view=c&_acct=C000050221&_version=1&_urlVersion=0&_userid=10&md5=46100852a2c649210b281a7bbb93ec3b
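
(For anyone curious how such a calibration is usually set up, here is a rough Python sketch of fitting a tree-ring density series to instrumental summer temperatures and looking for divergence in the residuals. Every number below is invented for illustration; real studies use measured maximum latewood density and gridded instrumental data, not this toy series.)

```python
# Hypothetical illustration of proxy calibration and divergence detection.
# All series here are synthetic; real work uses measured maximum latewood
# density (MXD) and instrumental temperature records.
import numpy as np

years = np.arange(1900, 2001)
rng = np.random.default_rng(0)

# Pretend instrumental summer temperature anomalies (°C)
temps = 0.01 * (years - 1900) + rng.normal(0, 0.2, years.size)

# Pretend MXD series that tracks temperature until ~1960, then flattens off
mxd = 0.8 * temps + rng.normal(0, 0.1, years.size)
mxd[years >= 1960] -= 0.005 * (years[years >= 1960] - 1960)

# Calibrate on the pre-1960 overlap: least-squares fit of temperature on MXD
calib = years < 1960
slope, intercept = np.polyfit(mxd[calib], temps[calib], 1)
reconstructed = slope * mxd + intercept

# Divergence shows up as a growing negative bias in the later residuals
residuals = reconstructed - temps
print("mean residual pre-1960:  %+.3f °C" % residuals[calib].mean())
print("mean residual post-1960: %+.3f °C" % residuals[~calib].mean())
```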

Give your answer to this question below!

Comments

2 Responses to “Regarding the divergence of proxy and instrument temps in the latter half of the 20th century?”
  1. Bob says:

    It’s hard to comment without seeing your data. If you google [your reference] along with “abstract”, you could probably at least come up with an abstract for us.

    But the stuff I’ve seen shows excellent consistency between 20th century data and proxy data. Here are several examples on one graph.

    http://www.globalwarmingart.com/wiki/Image:1000_Year_Temperature_Comparison_png

    I’m sure some research on your part would yield many other such examples.

    EDIT – OK, I looked at the abstract. It said the proxy “slightly” under-predicted temperatures. Could you share the average amount of the under-prediction?

  2. Trevor says:

    Similarly, I couldn’t access the website (password required or need to purchase the report) so this answer will be somewhat restricted.

    In climatology the term ‘proxy’ has a different meaning to that used in other sciences and the general vernacular. It’s used to refer to data or information which by itself is of limited value but from which a variable of interest can be derived. Oxygen isotope and dendrological analyses are a couple of examples.

    By definition, proxy data is known to be unreliable or of little consequence. I’m wondering if what you’re referring to is ‘reconstructed data’, this being data preceding the instrumental record.

    It’s a shame I can’t access the link as this is the type of thing that some skeptics round on and then pronounce ‘oh look, the proxy data is wrong’ whilst conveniently omitting to add that climatologists know it’s wrong which is why it’s not directly used.

    What we do find is that the instrumental and reconstructed temperature records are remarkably accurate. There are some methods that are more accurate than others, and the further back in time you go, the lower the level of confidence. For example, isotopic analysis of multi-cellular organisms from half a billion years ago can only provide an average global temperature to within an accuracy of 1°C. This might sound fairly accurate, but compared to what we can do for a million or a thousand years ago it’s way off the mark.

    No single data set can provide accurate reconstructed temps any further back than 800,000 years, but by taking an average of several data sets and repeating the same research, a greater degree of accuracy can be achieved.

    A good way to gauge the accuracy of a method is to reconstruct recent temps then compare the results with the instrumental record for that same period. The methodology can then be tweaked if needs be.
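
    (A rough Python sketch of that sort of check, just to make the idea concrete. The function name `verification_stats` and the short `recon`/`instr` arrays are placeholders of my own, not any real reconstruction or observational series.)

    ```python
    # Sketch of verifying a reconstruction against the instrumental record over
    # their overlap period. 'recon' and 'instr' stand in for a reconstructed and
    # an observed temperature-anomaly series covering the same years (made up here).
    import numpy as np

    def verification_stats(recon, instr):
        """Correlation, RMSE and mean bias over the common period."""
        recon, instr = np.asarray(recon, float), np.asarray(instr, float)
        r = np.corrcoef(recon, instr)[0, 1]
        rmse = np.sqrt(np.mean((recon - instr) ** 2))
        bias = np.mean(recon - instr)
        return r, rmse, bias

    instr = np.array([-0.10, 0.00, 0.10, 0.20, 0.30, 0.40])   # invented anomalies, °C
    recon = np.array([-0.15, 0.05, 0.05, 0.15, 0.25, 0.30])   # invented reconstruction
    r, rmse, bias = verification_stats(recon, instr)
    print(f"r = {r:.2f}, RMSE = {rmse:.2f} °C, bias = {bias:+.2f} °C")
    ```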

    Another barometer of accuracy is to use several approaches to reconstructing temperatures and compare the results. The closer they are, the greater the accuracy. If one set of results deviates significantly from the mean, it calls for an investigation.
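
    (And a similarly hedged sketch of comparing several reconstructions against their ensemble mean and flagging an outlier. The series names, values and the 0.2°C threshold are all invented for the illustration.)

    ```python
    # Sketch of comparing several independent reconstructions and flagging any
    # that deviate strongly from the ensemble mean. All names, values and the
    # 0.2 °C threshold are invented.
    import numpy as np

    reconstructions = {
        "tree_rings":  np.array([0.00, 0.05, 0.10, 0.20, 0.30]),
        "ice_cores":   np.array([0.02, 0.04, 0.12, 0.18, 0.28]),
        "corals":      np.array([0.01, 0.06, 0.08, 0.22, 0.31]),
        "odd_one_out": np.array([0.40, 0.45, 0.50, 0.55, 0.60]),
    }

    ensemble_mean = np.vstack(list(reconstructions.values())).mean(axis=0)

    for name, series in reconstructions.items():
        rms_dev = np.sqrt(np.mean((series - ensemble_mean) ** 2))
        flag = "  <-- deviates from the mean, investigate" if rms_dev > 0.2 else ""
        print(f"{name:12s} RMS deviation: {rms_dev:.2f} °C{flag}")
    ```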

    Climatology is a young science and advances are continually being made. Most data sets are routinely adjusted: as advances are made, the data is revisited and homogeneity adjustments are applied. It’s not an attempt to ‘eliminate divergences’ but an overall improvement to the accuracy of the data.
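
    (To illustrate the idea only: a very crude homogeneity adjustment for a single known step change, such as a station move. This is not GISTemp’s actual procedure, just a sketch of the principle; the years, values and reference window are made up.)

    ```python
    # Very simplified sketch of a homogeneity adjustment: if metadata (or a
    # statistical breakpoint test) says a record has a step change at a known
    # year, the earlier segment can be offset so both sides line up.
    # This only illustrates the idea, not any particular dataset's procedure.
    import numpy as np

    def adjust_step(series, years, break_year, reference_window=10):
        """Offset the pre-break segment by the mean difference around the break."""
        series = np.asarray(series, dtype=float)
        before = series[(years >= break_year - reference_window) & (years < break_year)]
        after = series[(years >= break_year) & (years < break_year + reference_window)]
        offset = after.mean() - before.mean()
        adjusted = series.copy()
        adjusted[years < break_year] += offset
        return adjusted

    years = np.arange(1950, 1970)
    raw = np.where(years < 1960, 14.0, 14.6)  # pretend a station move added ~0.6 °C
    print(adjust_step(raw, years, break_year=1960))
    ```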

    One example of this that you may be aware of: the hottest year on record for the US was, until recently, believed to be 1998. After homogeneity adjustments were made to the GISTemp record, the new record holder is 1934 (if you read in the media that it was anything to do with the Y2K bug then ignore it; I think the media decided this was the case because the methodology changed in 2000 and there was a discrepancy between the pre- and post-2000 data).

    As the science of climatology improves and technological advances are made there will undoubtedly be further revisions to the data. Such revisions will be small; the changes being applied now are on the order of four decimal places.

    - - - - - - - - -

    RE: YOUR ADDED DETAILS

    Thanks Marc for the added link, I can access the abstract from there. I couldn’t access the AGU site before as I’m at home, I’m now hooked up to the office comp and am attempting to download from there but it’s not playing ball. I’ve ftp’d the datasets but not the written article and everything that goes with it.

    First impression is one of noise… but why? There appear to be considerable mismatches, with individual readings out by perhaps as much as 2.5°C. Not having any totals or averages makes comparison difficult; I suspect the means will be closer, as a quick sum of 30 values from the first set (Tatras) is pretty close to the instrumental mean.

    When it’s averaged out, or when 10-, 20- or 30-year means are taken, it will probably be more in line with other reconstructed data and the instrumental record.
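
    (A quick sketch of why those longer means help: averaging an invented noisy annual series over 10, 20 or 30 years pulls it much closer to its underlying trend. The trend and noise level below are made up purely for illustration.)

    ```python
    # Sketch of why 10-, 20- or 30-year means agree better with other records
    # than raw annual values: averaging knocks the year-to-year noise down.
    import numpy as np

    def running_mean(values, window):
        """Centred boxcar running mean (output is window-1 points shorter)."""
        kernel = np.ones(window) / window
        return np.convolve(values, kernel, mode="valid")

    rng = np.random.default_rng(1)
    signal = 0.01 * np.arange(150)              # slow underlying trend (made up)
    annual = signal + rng.normal(0, 0.5, 150)   # plus large year-to-year noise

    print("annual values: RMS departure from trend = %.2f °C"
          % np.sqrt(np.mean((annual - signal) ** 2)))
    for window in (10, 20, 30):
        rms = np.sqrt(np.mean((running_mean(annual, window)
                               - running_mean(signal, window)) ** 2))
        print("%d-yr means:   RMS departure from trend = %.2f °C" % (window, rms))
    ```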

    There are clear trends and the overall picture is the same as with other methods, but there’s a much greater degree of variability in the readings. I’m only looking at individual sites, which doesn’t help, but all the same, other reconstructed records from individual sites are generally more consistent.

    Can’t really comment on any divergence as all I have to work with are lists of numbers (would take too long to copy into a spreadsheet and analyse).

    I don’t normally work with raw tree-ring data, so maybe what appear to be anomalies are normal. I much prefer the ice core records; I’m directly involved with those and the data is reliable.

    If the report ever downloads I’ll come back to the question and add more details.
