‘BEST’ Refining of Surface Temp Estimates Could Help Improve Understanding of Global Land Temps

“BEST” researchers’ efforts aim to improve surface temperature estimates, and their work could bolster the public’s understanding of global land temperatures. It’s no surprise that the effort is not without some controversy.

BERKELEY, Calif. — The Berkeley Earth Surface Temperature (BEST) project is a recent effort on the part of a number of senior scientists and academics to improve surface temperature estimates by increasing the amount of data available and refining the methods used to analyze that data.

The project itself has attracted some criticism, particularly for reaching out to skeptics to try to address their concerns. But the methods the team is using are fundamentally sound, and the results, while unlikely to differ much at a global level from existing temperature records, can provide a good resource for improving regional temperature assessments and analyzing issues like the urban heat island effect. Those steps in turn can help improve public understanding of, and communications about, global land temperatures.

One of the unfortunate legacies of the hacked e-mail controversies over the past 16 months is a growing distrust of surface temperature records among some segments of the public. A Fox News executive was even moved to send out a memo to staff asking them to “refrain from asserting that the planet has warmed … without immediately pointing out that such theories are based upon data that critics have called into question.”

The heightened skepticism has given rise to a number of efforts among scientists and bloggers to bolster the methods and data used to create global temperature records.

Among these recent efforts are the Exeter conference a few months ago, and various initiatives by climate science bloggers.

The Berkeley project, undertaken by Richard Muller and his team, had a similar impetus. The team includes a rather distinguished list of scientific luminaries, including the statistician David Brillinger, and physicists Saul Perlmutter, Robert Jacobsen, and Arthur Rosenfeld. Climate scientist Judith Curry of Georgia Tech is advising the team, and Robert Rohde is the lead scientist on the project.

Based on a very early and not yet peer-reviewed sampling of just 2 percent of some 1.6 billion records, Pace University writer and blogger Andrew C. Revkin has written, Muller’s recent statements before a House committee “surely can’t be welcome to lawmakers hoping to undercut the credibility of climate science.” Revkin’s posting and the comments on it include many useful links for those wanting to explore the BEST situation further. Playing off Al Gore’s “Inconvenient Truth” title, the Revkin posting was headlined “Republicans Get Inconvenient Replies at Climate Hearing.”

Liberal economics columnist Paul Krugman also weighed in on the Muller testimony before the House committee, echoing views expressed in Revkin’s blog posting and writing in an April 4 op-ed column that Muller had reported that his group’s preliminary results find a global warming trend “very similar to that reported by the prior groups.”

More Station Data, and New Approaches

The BEST scientists and academics have incorporated additional station data, developed their own approach to combining short records, and developed a spatial interpolation method as an alternative to traditional gridding. They have also applied a particularly innovative approach that allows them to better deal with problems that plague the temperature record, such as station moves, time-of-observation changes, and instrument changes.

Their major innovation is to treat problems in station records as the start of a separate record. These problem areas are detected by comparing individual stations to their surrounding stations and looking for break points or discontinuities in a station record not reflected in the records of neighboring stations. Instead of trying to artificially correct for problems detected this way, the researchers simply treat them as break points: subsequent measurements from the same site are handled as a separate record and are optimally fit to the larger series. This method of record combination has the major benefit of allowing relatively short records to be combined without introducing biases.
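To make the splitting idea concrete, here is a minimal sketch of that kind of break-point detection (hypothetical data, names, and thresholds; a rough illustration rather than the BEST team’s actual code): compare a station with the average of its neighbors and cut the record wherever the difference series shifts abruptly.

```python
import numpy as np

def split_at_breakpoints(station, neighbor_mean, threshold=1.0, window=60):
    """Split a monthly temperature series into segments at suspected
    discontinuities, detected as abrupt shifts in the station-minus-neighbors
    difference series (an illustrative sketch, not the BEST algorithm itself).

    station, neighbor_mean : 1-D arrays of monthly temperatures (same length)
    threshold : shift in the mean difference (deg C) that triggers a split
    window : number of months averaged on each side of a candidate point
    """
    diff = station - neighbor_mean              # shared regional climate cancels out
    breaks = []
    for t in range(window, len(diff) - window):
        before = np.nanmean(diff[t - window:t])
        after = np.nanmean(diff[t:t + window])
        far_from_last = not breaks or t - breaks[-1] > window
        if abs(after - before) > threshold and far_from_last:
            breaks.append(t)
    # Each resulting segment is then treated as an independent record downstream.
    edges = [0] + breaks + [len(station)]
    return [station[a:b] for a, b in zip(edges[:-1], edges[1:])]
```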

The inclusion of numerous shorter records should make it easier for researchers to analyze many of the issues (both spurious and serious) that critics have raised about the surface temperature record. These include the changing number of measurement stations available over the last few decades (discussed here recently), the effect of urbanization on the temperature trend, the effect of locating stations at airports, and general station siting issues.

Having additional tools available to tackle these and other potential problems with the surface temperature record can help clarify existing issues and lead to a clearer overall understanding of the global land temperature.

The BEST scientists and academics are also taking the interesting approach of trying to perfect the method prior to running it against the entire data set. They see this approach as helping them avoid any conscious or unconscious bias that might result from knowing interim results prior to optimizing the methodology.

As of mid-March, they had run the method only on a small random sample of all instruments in their data set, though the results from that sample are quite similar to other published temperature series from NASA, NOAA, and the Hadley Centre. They have also been using their tool to examine various issues including station siting, airport vs. non-airport sites, and urban heat island effects.

Using Statistics to Quantify Error Ranges

With both statisticians and scientists as part of their project team, the collaboration is seeking to use the best statistical practices to quantify the error ranges for their results. Finally, they have been working on ways to improve the metadata associated with some of the stations, a frequent weak spot.
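One common way to put error bars on such a reconstruction is resampling: rebuild the combined series many times while withholding random subsets of stations and use the spread across rebuilds as a rough uncertainty band. The sketch below illustrates that general idea only; it is not the statistical machinery the BEST team is developing.

```python
import numpy as np

def resampling_uncertainty(records, combine, n_samples=100, drop_frac=0.2, seed=0):
    """Rough per-month uncertainty for a combined temperature series,
    estimated by repeatedly rebuilding it with a random subset of stations
    removed (illustrative only; not the BEST team's error model).

    records : 2-D array, n_stations x n_months, np.nan for missing values
    combine : function mapping such an array to a 1-D combined series
    """
    rng = np.random.default_rng(seed)
    n_stations = records.shape[0]
    rebuilds = []
    for _ in range(n_samples):
        keep = rng.random(n_stations) > drop_frac   # withhold ~20% of stations
        rebuilds.append(combine(records[keep]))
    return np.std(rebuilds, axis=0)                 # spread across the rebuilds
```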

While the ultimate results of the BEST project are unlikely to dramatically change current scientific understanding of surface warming, they will provide a more robust series incorporating more data than have prior efforts. Their vastly increased station coverage over the last 50 years can be helpful in better addressing a number of potential issues in the surface temperature record. The BEST team is expected to publish its findings over the next two months or so. Those findings are bound to be closely watched and scrutinized by all “sides” of the surface temperature debate looking for ways to bolster their own perspectives.

Zeke Hausfather

Zeke Hausfather, a data scientist with extensive experience with clean technology interests in Silicon Valley, is currently a Senior Researcher with Berkeley Earth. He is a regular contributor to The Yale Forum (E-mail: zeke@yaleclimatemediaforum.org, Twitter: @hausfath).

8 Responses to ‘BEST’ Refining of Surface Temp Estimates Could Help Improve Understanding of Global Land Temps

  1. How on earth can you conflate the stolen e-mail non-scandal with Fox News directives to confuse the public with regard to real climate change science?

    You’re goin’ waaaaay off the rails there.

  2. Zeke says:

    Tenney,

    I was simply pointing out that the Fox memo emerged in the aftermath of the email brouhaha, and capitalized on the public confusion about the reliability of the temperature record that occurred as a result (witness those, for example, who still seem to think that “hide the decline” refers to instrumental records).

  3. Malcolm Hughes says:

    I was disappointed to see this flawed account at the Yale Forum. Elementary fact checking and talking with experienced scholars of the instrumental climate record would have constituted real reporting, but this seems not to have been done. I don’t think this is the kind of example you intend to give. Here’s a specific example: Hausfather writes (of the B.E.S.T. group) “Their major innovation is to treat problems in station records as the start of a separate record. These problem areas are detected by comparing individual stations to their surrounding stations and looking for break points or discontinuities in a station record not reflected in the records of neighboring stations.” Anyone with the slightest knowledge of homogeneity testing of climate records would know that this kind of comparison has been common practice for many decades. For example, World Meteorological Organization Technical Note #79 “Climatic Change” by J. Murray Mitchell, Jr. and others, published in 1966, gives guidelines based on a literature going back to at least 1925 (see especially sections 2.3 and 2.4). As the article stands, it is impossible for the reader to tell whether ignorance of this long-standing practice is Muller’s, Hausfather’s or both. It would be particularly disquieting if the ignorance is on Muller’s part, but I would also find failure on Hausfather’s part to use old-fashioned “asking around” disturbing.

  4. Zeke says:

    Dr. Hughes,

    While using surrounding stations to automatically detect inhomogeneities is common practice (see the pairwise homogenization algorithm used in USHCN and GHCN per Menne and Williams 2009, for example), Robert Rohde’s innovation is to combine a least-squares station combination approach with the treatment of inhomogeneities as the start of separate records.

    There are three approaches to combining station records that are common in the literature: the common anomaly method (CAM), the first differences method (FDM), and the reference station method (RSM). The common anomaly method (CAM) is the most widely used approach (by NCDC, for example), but suffers from an inability to effectively deal with short records that do not overlap with the common anomaly period.
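    To make CAM’s short-record problem concrete, here is a minimal sketch (made-up arrays, not any group’s production code): each station is converted to anomalies relative to its own mean over the common baseline period, and a station with no data in that period simply has to be dropped.

    ```python
    import numpy as np

    def common_anomaly_average(records, baseline=slice(0, 360)):
        """Average aligned monthly station records with the common anomaly
        method (CAM). Stations lacking data in the baseline period cannot be
        baselined and are discarded -- CAM's weakness with short records.

        records : list of 1-D arrays on a shared time axis, np.nan = missing
        baseline : slice defining the common baseline period (e.g. 30 years)
        """
        usable = []
        for rec in records:
            base = rec[baseline]
            if np.all(np.isnan(base)):
                continue                    # short record: no baseline overlap, dropped
            usable.append(rec - np.nanmean(base))
        return np.nanmean(np.vstack(usable), axis=0)
    ```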

    The least squares method (LSM) is an improvement over the RSM that is insensitive to the order in which stations are assessed (for more details on the approach, see the BEST methodology doc from the Exeter conference, or this: http://statpad.wordpress.com/2010/02/19/combining-stations-plan-b/). Of all the methods available, it is the least biased method of combining fragmentary station records, by finding the best fit between an individual station’s de-baselined values and those of surrounding stations.
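    A minimal sketch of that least-squares idea (toy data and a simple alternating solver, not the BEST implementation): solve jointly for a shared series T(t) and one constant offset per station fragment, so fragments never need to overlap a fixed baseline period.

    ```python
    import numpy as np

    def least_squares_combine(records, n_iter=50):
        """Combine station fragments by least squares: find a shared series T(t)
        and per-record offsets b_i minimizing the sum over i, t of
        (x_i(t) - T(t) - b_i)^2, via simple alternating updates.

        records : 2-D array, n_records x n_months, np.nan for missing values
        Returns the combined series T and the offsets b.
        """
        x = np.asarray(records, dtype=float)
        b = np.zeros(x.shape[0])
        T = np.nanmean(x, axis=0)
        for _ in range(n_iter):
            T = np.nanmean(x - b[:, None], axis=0)   # best T given the offsets
            b = np.nanmean(x - T[None, :], axis=1)   # best offsets given T
            b -= np.nanmean(b)                       # pin down the arbitrary constant
        return T, b
    ```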

    The standard approach to correcting for inhomogeneities is to use surrounding stations both to detect and to artificially correct for step changes and other detectable breaks. Both Menne’s PHA and Hansen’s nightlight adjustments take this approach. Rohde’s novel approach is to treat a detected inhomogeneity as the start of a separate record (i.e., as a completely separate station) and combine it with other stations via the least squares method. This corrects somewhat for the tendency of adjustments to blend together stations at a local or regional level, and also reduces the chance that, say, urban stations would alias in a bit of urban-related warming when used to correct for station moves in nearby rural stations.

    In short, by developing a novel method of station combination and homogenization, adding in significant amounts of new station data (by incorporating GSOD data, among other sources), and developing spatial interpolation tools (which, if not completely novel, have not been widely adopted by the existing global records, all of which use gridding in one form or another), the BEST team seems poised to make a number of useful contributions to the field. It’s important to separate the actual work being done by Rohde and the rest of the BEST team from some of the more inflammatory statements made by Muller about paleoclimatology.
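    For what it’s worth, here is a simple stand-in for the spatial-interpolation step (inverse-distance weighting with made-up inputs; a gesture at interpolation methods such as kriging, not the BEST team’s actual method): estimate the anomaly field at arbitrary points from nearby stations instead of averaging within fixed grid boxes.

    ```python
    import numpy as np

    def idw_anomaly_field(st_lat, st_lon, st_anom, pt_lat, pt_lon, power=2.0):
        """Estimate temperature anomalies at arbitrary points by inverse-distance
        weighting of station anomalies (a simple stand-in for kriging-style
        interpolation, not the BEST method). Distances are great-circle, in km.
        """
        R = 6371.0
        slat, slon = np.radians(st_lat), np.radians(st_lon)
        out = np.empty(len(pt_lat))
        for k, (glat, glon) in enumerate(zip(np.radians(pt_lat), np.radians(pt_lon))):
            dlat, dlon = slat - glat, slon - glon
            a = np.sin(dlat / 2) ** 2 + np.cos(glat) * np.cos(slat) * np.sin(dlon / 2) ** 2
            d = 2 * R * np.arcsin(np.sqrt(a))        # haversine distance to each station
            w = 1.0 / np.maximum(d, 1.0) ** power    # cap distance to avoid dividing by zero
            out[k] = np.sum(w * st_anom) / np.sum(w)
        return out
    ```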

  5. Malcolm Hughes says:

    I’m afraid I didn’t communicate my point clearly enough in my original comment. I had hoped to contribute to the Yale Forum’s stated intent to “… analyze and discuss the process by which climate change is communicated through traditional and new media”. My comments were aimed specifically at the presentation of your article (especially given the label “Fact File” in its header), and were not meant to be a response to the content of Richard Muller’s public pronouncements or the BEST group’s publicity. Your article fails to place the BEST project in the scientific context of many decades of careful and thoroughly tested work, of procedures followed for good reasons, and of many ongoing projects. You partly correct this in your reply to my comment. These shortcomings are all too typical of the “Podunk University scientist in breakthrough discovery” school of reportage (sometimes for sure emanating from university press offices). Perhaps this variant could be described as the “caped crusader to the rescue” subspecies. My concern is that such stories fail to illuminate how science works and risk misleading lay readers as to how to evaluate claims relayed by the media. By the way, I look forward with interest to seeing the results of the BEST group’s work and that of other new analyses in the regular peer-reviewed literature, and then, the crucial step, how they stand up after several years of close examination by the scientific community.

  6. geostat says:

    Wow, what a couple of whiners. I thought this article was informative and well written. Regarding Berkeley Earth, it’s good to see researchers reaching out to skeptics and taking the estimation of uncertainty seriously.

    • Deep Climate says:

      One of those “whiners” is one of the world’s most prominent climate researchers. He knows more about climatology than the rest of us, including Zeke, can ever hope to.

      His last comment is especially pertinent, given the slow progress made by the Berkeley Earth team, which in turn seems related to the non-scientific part of their agenda.

      “By the way, I look forward with interest to seeing the results of the BEST group’s work and that of other new analyses in the regular peer-reviewed literature, and then, the crucial step, how they stand up after *several years* of close examination by the scientific community.” [Emphasis added]

      So far, it’s not standing up too well, except for the obvious confirmations of long-standing work. But it’s early years yet.

  7. Deep Climate says:

    Zeke says:

    “One of the unfortunate legacies of the hacked e-mail controversies over the past 16 months is a growing distrust of surface temperature records among some segments of the public. A Fox News executive was even moved to send out a memo to staff asking them to “refrain from asserting that the planet has warmed … without immediately pointing out that such theories are based upon data that critics have called into question.””

    “Unfortunate legacy?” “Fox News … moved”?

    I realize you don’t like to criticize and call a spade a spade, but let’s get real. “Climategate” was a largely manufactured scandal that became the latest chapter in a long history of anti-science disinformation PR, largely paid for by the fossil fuel industry. And Fox News and the Wall Street Journal are two media outlets that have been at the centre of that disinformation for a long, long time. Would it kill you to admit that? Of course, Fox News execs want to present AGW as questionable despite all the evidence – that’s what they do. “Climategate” didn’t “move” them to do this – rather it provided the latest talking points to be echoed over and over again.

    The real scandal is that commentators like you fail to point out the lack of integrity of Fox and the Wall Street Journal on this issue and their complicity in the anti-science campaigns of retrograde fossil fuel companies and their Republican surrogates.