Question 10

Are data recorded by ground-based stations a reliable indicator of surface temperature trends?

Expert response by Anthony Watts

Anthony Watts is a 25-year broadcast meteorology veteran and currently chief meteorologist for KPAY-AM radio. He got his start as on-air meteorologist for WLFI-TV in Lafayette, Indiana, and at KHSL-TV in Chico, California. In 1987, he founded ItWorks, which supplies broadcast graphics systems to hundreds of cable television, television, and radio stations nationwide. ItWorks supplies custom weather stations, Internet servers, weather graphics content, and broadcast video equipment. In 2007, Watts founded a Web site devoted to photographing and documenting the quality of weather stations across the U.S.

89% of US temperature stations fail official siting criteria

Global warming is one of the most serious issues of our times. Some experts claim the rise in temperature during the past century was “unprecedented” and proof that immediate action to reduce human greenhouse gas emissions must begin. Other experts say the warming was very modest and the case for action has yet to be made. The reliability of data used to document temperature trends is of great importance in this debate. We can’t know for sure if global warming is a problem if we can’t trust the data.

The official record of temperatures in the continental United States comes from a network of 1,221 climate-monitoring stations overseen by the National Weather Service, an agency of the National Oceanic and Atmospheric Administration (NOAA). Until now, no one had ever conducted a comprehensive review of the quality of the measurement environment of those stations.

During the past few years I recruited a team of more than 650 volunteers to visually inspect and photographically document more than 860 of these temperature stations. We were shocked by what we found. We found stations located next to the exhaust fans of air conditioning units, surrounded by asphalt parking lots and roads, on blistering-hot rooftops, and near sidewalks and buildings that absorb and radiate heat. We found 68 stations located at wastewater treatment plants, where the process of waste digestion causes temperatures to be higher than in surrounding areas.

In fact, we found that 89 percent of the stations – nearly 9 of every 10 – fail to meet the National Weather Service’s own siting requirement that stations must be 30 meters (about 100 feet) or more away from an artificial heating or radiating/reflecting heat source. In other words, 9 of every 10 stations are likely reporting higher or rising temperatures because they are badly sited.

It gets worse. We observed that changes in the technology of temperature stations over time have also caused them to report a false warming trend. We found major gaps in the data record that were filled in with data from nearby sites, a practice that propagates and compounds errors. We found that adjustments to the data by both NOAA and another government agency, NASA, cause recent temperatures to look even higher.

The conclusion is inescapable: The U.S. temperature record is unreliable. The errors in the record exceed by a wide margin the purported rise in temperature of 0.7°C (about 1.2°F) during the 20th century. Consequently, this record should not be cited as evidence of any trend in temperature that may have occurred across the U.S. during the past century.

Since the U.S. record is thought to be “the best in the world,” it follows that the global database is likely similarly compromised and unreliable.

This report presents actual photos of more than 100 temperature stations in the U.S., many of them demonstrating vividly the siting issues we found to be rampant in the network. Photographs of all 865 stations that have been surveyed so far can be found at the project’s Web site, where station photos can be browsed by state or searched for by name.

Upon seeing the B91 form for Marysville, my first thought was back to my college days in lab exercises, where if I were conducting an experiment and able to complete only 45 percent of the readings, my instructor would surely tell me to repeat the experiment until I could “do it right.” Yet here we had an official climate-monitoring station, dubbed part of the “high quality” USHCN network that provides data for use in scientific studies, actually measuring the temperature of a parking lot with air conditioners blowing exhaust air on it, and missing more than half of its data for the month of July!

I wondered if other researchers had expressed concern about the quality of the U.S. temperature record and found they had. In 2003, the National Climatic Data Center (NCDC) recognized that the existing USHCN network had problems and commissioned the new Climate Reference Network (CRN) to replace it. A report released at the time said:
The research community, government agencies, and private businesses have identified significant shortcomings in understanding and examining long-term climate trends and change over the U.S. and surrounding regions. Some of these shortcomings are due to the lack of adequate documentation of operations and changes regarding the existing and earlier observing networks, the observing sites, and the instrumentation over the life of the network.
These include inadequate overlapping observations when new instruments were installed and not using well maintained, calibrated high-quality instruments. These factors increase the level of uncertainty when government and business decision-makers are considering long-range strategic policies and plans.

Is the U.S. Temperature Record Reliable?

My search also led me to Dr. Roger Pielke Sr., senior research scientist at the Cooperative Institute for Research in Environmental Sciences (CIRES), University of Colorado in Boulder, and professor emeritus of the Department of Atmospheric Science, Colorado State University, Fort Collins. Dr. Pielke had done some studies on the quality of siting and measurements at USHCN climate-monitoring stations in Colorado and he confirmed my fears. He too had seen blatant violations of quality control that contaminated the temperature record.

The missing Marysville data (14 of 31 days) led me to research how missing data are handled in the climate record. I learned about a data algorithm used by NCDC called FILNET, short for Fill Missing Original Data in the Network, that is used to “infill” missing data using interpolations of data from surrounding stations. After reading about it, I came to the conclusion that NCDC uses FILNET to create “missing” data where none was ever actually measured. I looked up FILNET and, sure enough, missing data are created from nearby station estimates. According to a government report: “Estimates for missing data are provided using a procedure similar to that used in SHAP [Station History Adjustment Program]. This adjustment uses the debiased data from the SHAP and fills in missing original data when needed (i.e., calculates estimated data) based on a ‘network’ of the best correlated nearby stations. The FILNET program also completed the data adjustment process for stations that moved too often for SHAP to estimate the adjustments needed to debias the data.”
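The general idea behind this kind of infilling can be sketched in a few lines of code. This is not NCDC’s FILNET implementation – the station values, correlations, and weighting scheme below are invented for illustration – but it shows how a missing daily reading can be estimated as a correlation-weighted average of the best-correlated nearby stations:

```python
# Hypothetical sketch of correlation-weighted infilling in the spirit of
# FILNET. All numbers below are invented for illustration; the real
# program works on debiased monthly data from the SHAP step.

def infill(target, neighbors):
    """Fill None entries in `target` from correlation-weighted neighbors.

    target:    list of daily temps (°C), None where data are missing
    neighbors: list of (correlation, series) pairs, series same length
               as target
    """
    filled = []
    for day, value in enumerate(target):
        if value is not None:
            filled.append(value)
            continue
        num = sum(r * series[day] for r, series in neighbors
                  if series[day] is not None)
        den = sum(r for r, series in neighbors
                  if series[day] is not None)
        filled.append(round(num / den, 1) if den else None)
    return filled

marysville = [31.0, None, 30.5, None]          # two of four days missing
nearby = [(0.9, [30.0, 29.5, 30.0, 31.0]),     # well-correlated station
          (0.7, [32.0, 31.5, 31.0, 33.0])]     # less-correlated station
print(infill(marysville, nearby))              # -> [31.0, 30.4, 30.5, 31.9]
```

Note that any siting bias present at the neighbor stations is carried straight into the estimated values, which is the propagation-of-errors concern raised above.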

I asked myself: “With potential heat biases such as temperature measurement near parking lots, air conditioner vents, and radio equipment, plus significant amounts of missing data being interpolated from other stations that may also have issues, how could our national climatic dataset possibly be accurate?” After further discussion with Dr. Pielke, and evaluating how he had done his study there with photography of temperature stations, and realizing the importance of documenting the state of quality control in the U.S. Historical Climatology Network, I decided something needed to be done.

The Surface Stations Project

From my discussions with Dr. Pielke, the Surface Stations Project was born. The concept was simple: Create a network of volunteers to visit USHCN climate-monitoring stations and document, with photographs and site surveys, their quality.
I worked with Dr. Pielke to encapsulate his survey methods into simple instructions any member of the public could understand and follow. I created a Web site featuring an interactive online database that would allow for the uploading of photographs and site surveys, along with supporting data.
Since the project’s inception in the summer of 2007, more than 650 volunteer surveyors have registered, and as of this writing in February 2009, 865 of the 1,221 USHCN climate-monitoring stations have been surveyed, representing more than 70 percent of the operational climate-monitoring network in the continental United States.

To rate the quality of the station siting characteristics, we used the same metric developed by NOAA’s National Climatic Data Center to set up the Climate Reference Network (CRN). According to Section 2.2 of the Climate Reference Network (CRN) Site Information Handbook, “the most desirable local surrounding landscape is a relatively large and flat open area with low local vegetation in order that the sky view is unobstructed in all directions except at the lower angles of altitude above the horizon.”

Five classes of sites – ranging from most reliable to least – are defined:
Class 1: Flat and horizontal ground surrounded by a clear surface with a slope below 1/3 (less than 19°). Grass/low vegetation ground cover less than 10 centimeters high. Sensors located at least 100 meters from artificial heating or reflecting surfaces, such as buildings, concrete surfaces, and parking lots. Far from large bodies of water, except if representative of the area, and then located at least 100 meters away. No shading for a sun elevation greater than 3 degrees.
Class 2: Same as Class 1 with the following differences. Surrounding vegetation less than 25 centimeters. No artificial heating sources within 30 meters. No shading for a sun elevation greater than 5°.
Class 3: (error 1°C) Same as Class 2, except no artificial heating sources within 10 meters.
Class 4: (error greater than 2°C) Artificial heating sources less than 10 meters away.
Class 5: (error greater than 5°C) Temperature sensor located next to/above an artificial heating source, such as a building, rooftop, parking lot, or concrete surface.
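The class definitions above can be reduced, in simplified form, to a few distance thresholds. The sketch below keys only on the distance to the nearest artificial heating/reflecting surface – the actual CRN handbook also weighs vegetation height, slope, and shading, so this is an illustration of the threshold logic, not the full rating procedure:

```python
# Simplified sketch of the CRN siting classes, keyed only on distance
# (meters) to the nearest artificial heating/reflecting surface. The
# real handbook also considers vegetation, slope, water, and shading.

def crn_class(heat_source_distance_m, on_heat_source=False):
    """Return (class number, stated error in °C) for a station siting."""
    if on_heat_source:                 # sensor next to/above roof, lot, etc.
        return 5, ">= 5"
    if heat_source_distance_m >= 100:  # Class 1: no source within 100 m
        return 1, "none stated"
    if heat_source_distance_m >= 30:   # Class 2: no source within 30 m
        return 2, "none stated"
    if heat_source_distance_m >= 10:   # Class 3: no source within 10 m
        return 3, "1"
    return 4, ">= 2"                   # Class 4: source within 10 m

print(crn_class(120))                  # open field -> Class 1
print(crn_class(8))                    # parking lot 8 m away -> Class 4
```

Under this metric, the stations described earlier – sensors over parking lots or beside air-conditioner exhausts – would fall into Classes 4 and 5, the two categories with stated errors of 2°C or more.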

This rating system is duplicated for the Surface Stations Project. Distances to objects and surfaces are measured by volunteer surveyors, and in cases where hands-on measurements are not possible, due to the weather station being in a secured area (such as an airport) or otherwise inaccessible, measurements are made using aerial survey tools such as Google Earth and other aerial mapping and measurement systems. When the site is inaccessible and the quality of aerial photography is poor, photographic analysis of objects of known size and length that appear with the weather stations (such as chain-link fence segments) is used to determine distances.
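The photographic scaling trick is simple proportion. A minimal sketch, with invented example numbers: an object of known real-world length (say, a 3.05 m chain-link fence panel) sets a meters-per-pixel scale, and other spans in the image are read off against it. This only holds when both spans lie at roughly the same depth in the photo, parallel to the camera:

```python
# Hypothetical sketch of photo-based distance estimation: calibrate a
# meters-per-pixel scale from an object of known size, then measure other
# spans in the same image plane. Example numbers are invented.

def photo_distance(known_length_m, known_span_px, target_span_px):
    """Estimate a real-world distance from pixel spans in one photo."""
    scale = known_length_m / known_span_px   # meters per pixel
    return target_span_px * scale

# Fence panel: 3.05 m spanning 200 px; sensor-to-AC-unit gap: 450 px.
print(photo_distance(3.05, 200, 450))        # ~6.86 m, i.e. within 10 m
```

Perspective distortion makes this a rough estimate, which is why the project treats it as a fallback when neither on-site measurement nor good aerial imagery is available.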
Armed with these rating tools provided by NOAA, NWS, and NCDC, the Surface Stations Project was able to quantify the quality of the operational USHCN climate-monitoring network. Due to the open and accessible nature of the project, with all photographs and data available online for public viewing, the surveys are seen by dozens to hundreds of people, who readily point out errors or concerns, such as a misidentified station. Where an error is identified, the survey is removed from the database and redone when practical. Each USHCN site rating, once applied, is reviewed by three different individuals, ensuring it represents a true rating.

Full report available at:
