How Hard Would it Be???
5 years ago  ::  Dec 10, 2009 - 4:08PM #1
Bodean
Posts: 9,710

One of the really pissy issues with global warming is the thermometer data sets themselves.


GISS, GHCN, HadCRUT ... even the satellites, all apply blanket adjustments to the data to come up with a metric.


SO ... how hard would it be to actually go to each site and make site-specific adjustments based on that station, its history, and the changes associated purely with that single station???


I mean ... there are only 1200 or so stations in the US.  You could pay 1200 people 1000 bucks apiece to go out and collect the data, history, pictures ... the whole nine yards, and then make the relevant adjustments that need to be made.


For example ... the Darwin, Australia site ... where there is clearly no reason whatsoever to adjust anything, it gets adjusted ... just because.  This makes absolutely no sense.


Then ... and only then would we have a clue about what the true global temp is.  We'd also have a better grasp on the regional temps and changes, which, in my opinion, are far more important than any global change.

5 years ago  ::  Dec 10, 2009 - 9:10PM #2
PrHaug
Posts: 230

Bodean,


Don't talk that way. Don't you know reason, common sense, and the scientific method aren't allowed?

5 years ago  ::  Dec 14, 2009 - 4:17PM #3
eadler
Posts: 4,449

Dec 10, 2009 -- 4:08PM, Bodean wrote:


One of the really pissy issues with global warming is the thermometer data sets themselves.


GISS, GHCN, HadCRUT ... even the satellites, all apply blanket adjustments to the data to come up with a metric.


SO ... how hard would it be to actually go to each site and make site-specific adjustments based on that station, its history, and the changes associated purely with that single station???


I mean ... there are only 1200 or so stations in the US.  You could pay 1200 people 1000 bucks apiece to go out and collect the data, history, pictures ... the whole nine yards, and then make the relevant adjustments that need to be made.


For example ... the Darwin, Australia site ... where there is clearly no reason whatsoever to adjust anything, it gets adjusted ... just because.  This makes absolutely no sense.


Then ... and only then would we have a clue about what the true global temp is.  We'd also have a better grasp on the regional temps and changes, which, in my opinion, are far more important than any global change.




Unless you postulate a deliberate attempt to fake global warming data, some strange data for a specific station doesn't invalidate the global averages. The errors should cancel. And the new-style stations with Stevenson screens cause the temperature readings to drop 1.2 C relative to the old-style stations. Given that, the fact that the corrections are larger than the trend should not be surprising. This seems to have been lost on some of the deniersphere.
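As a rough illustration of the cancellation argument, here is a toy Python sketch (my own, not any agency's actual code) showing that independent per-station errors shrink in a network average roughly as 1/sqrt(N); the anomaly value, error size, and station count are all made up for the example:

```python
# Toy illustration (not any agency's actual method): independent
# per-station errors shrink in the network average roughly as 1/sqrt(N).
import random

random.seed(42)
TRUE_ANOMALY = 0.5   # hypothetical "true" regional anomaly, deg C
N_STATIONS = 1200    # roughly the size of the USHCN network

# Each station reading = truth + its own random error (sigma = 1.0 C).
readings = [TRUE_ANOMALY + random.gauss(0.0, 1.0) for _ in range(N_STATIONS)]

network_mean = sum(readings) / N_STATIONS
print(f"true anomaly:   {TRUE_ANOMALY:+.3f} C")
print(f"network mean:   {network_mean:+.3f} C")
print(f"residual error: {network_mean - TRUE_ANOMALY:+.3f} C")
# With N = 1200 the residual is about 1.0/sqrt(1200) ~ 0.03 C -- far
# smaller than any single station's error; this is the sense in which
# independent errors "cancel" (systematic biases, of course, do not).
```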


The Australian BOM has a web page where it explains all this:


www.bom.gov.au/climate/change/datasets/d...


A change in the type of thermometer shelter used at many Australian observation sites in the early 20th century resulted in a sudden drop in recorded temperatures which is entirely spurious. It is for this reason that these early data are currently not used for monitoring climate change. Other common changes at Australian sites over time include location moves, construction of buildings or growth of vegetation around the observation site and, more recently, the introduction of Automatic Weather Stations.
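For what a shelter-change correction amounts to in practice, here is a minimal Python sketch, assuming the changeover year and the 1.2 C offset are already known from station metadata; real homogenization (including BOM's) relies on statistical breakpoint tests and neighbour comparisons, and the station values below are invented:

```python
# Minimal sketch of a step-change ("breakpoint") adjustment. ASSUMES the
# changeover year and the offset are already known from station metadata;
# all numbers here are invented for illustration.

def adjust_for_shelter_change(series, change_year, offset):
    """Shift pre-changeover values so they line up with the
    post-changeover shelter. `series` maps year -> temperature (C)."""
    return {year: (temp - offset if year < change_year else temp)
            for year, temp in series.items()}

# Hypothetical record: the old shelter read about 1.2 C warm, so the
# raw series shows a spurious drop at the 1910 changeover.
raw = {1905: 26.8, 1907: 26.9, 1909: 27.0, 1911: 25.7, 1913: 25.8}
homogenized = adjust_for_shelter_change(raw, change_year=1910, offset=1.2)
print(homogenized)  # early years pulled down by 1.2 C, step removed
```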


The nit-picking being done by Watts really doesn't prove anything. Watts doesn't show the metadata that was used to make the corrections, which is referred to in this paper. He only gives an opinion about how he would make the changes, without any supporting data.


reg.bom.gov.au/amm/docs/2004/dellamarta....


An updated and improved version of the Australian high-quality annual mean temperature dataset of Torok and Nicholls (1996) has been produced. This was achieved by undertaking a thorough post-1993 homogeneity assessment using a number of objective and semi-objective techniques, by matching closed records onto continuing records, and by adding some shorter duration records in data-sparse regions. Each record has been re-assessed for quality on the basis of recent metadata, resulting in many records being rejected from the dataset. In addition, records have been re-examined for possible urban contamination using some new approaches. This update has highlighted the need for accurate and complete station metadata. It has also demonstrated the value of at least two years of overlapping observations for major site changes to ensure the homogeneity of the climate record. A total of 133 good-quality, homogenised records have been produced. A non-urban subset of 99 stations provides reliable calculations of Australia’s annual mean temperature anomalies with observation error variances between 15 and 25 per cent of the total variance and decorrelation length scales greater than the average inter-station separation.
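The "matching closed records onto continuing records" step the abstract mentions can be pictured with a small Python sketch: estimate the offset between two stations from their overlap period, then splice. The stations, years, and values here are hypothetical, and real work would use monthly data and significance tests:

```python
# Sketch of matching a closed record onto a continuing one via an
# overlap period, as the abstract describes. Stations, years and values
# are hypothetical; real work would use monthly data and significance tests.

def overlap_offset(closed, continuing):
    """Mean difference over the years both stations report."""
    common = set(closed) & set(continuing)
    if not common:
        raise ValueError("no overlapping observations")
    return sum(closed[y] - continuing[y] for y in common) / len(common)

old_site = {1990: 18.2, 1991: 18.4, 1992: 18.3}            # closed in 1992
new_site = {1991: 17.9, 1992: 17.8, 1993: 18.0, 1994: 18.1}

offset = overlap_offset(old_site, new_site)
# Splice: shift the closed record onto the continuing site's baseline.
merged = {y: t - offset for y, t in old_site.items() if y not in new_site}
merged.update(new_site)
print(f"estimated offset: {offset:+.2f} C")
print(sorted(merged.items()))
```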


If Watts were a responsible scientist who was genuinely skeptical, he would request access to the database, find the metadata that determines what the corrections should be, and compare it with the corrections that were made. Of course, the deniersphere doesn't really care whether Watts can justify his opinion; all they want to do is deny that global warming exists.
5 years ago  ::  Dec 14, 2009 - 4:32PM #4
eadler
Posts: 4,449

Bodean wrote:


I mean ... there are only 1200 or so stations in the US.  You could pay 1200 people 1000 bucks apiece to go out and collect the data, history, pictures ... the whole nine yards, and then make the relevant adjustments that need to be made.



To continue the theme of my previous post, what you suggest seems to have been done.


The raw data set contains the metadata that you want to see about each site.


www.ncdc.noaa.gov/oa/climate/research/us...


"The U.S. Historical Climatology Network (USHCN, Karl et al. 1990) is a high-quality moderate sized data set of monthly averaged maximum, minimum, and mean temperature and total monthly precipitation developed to assist in the detection of regional climate change. The USHCN is comprised of 1221 high-quality stations from the U.S. Cooperative Observing Network within the 48 contiguous United States. An additional data set containing 46 stations for Alaska is also available; however, data for these stations are not adjusted for inhomogeneities as outlined below for the USHCN. The period of record varies for each station but generally includes the period 1900-1995. The stations were chosen using a number of criteria including length of period of record, percent missing data, number of station moves and other station changes that may affect the data homogeneity, and spatial coverage. Included with the data set are metadata files that contain station history information about station moves, instrumentation, observing times, and elevation."


So anyone who wants to can study the USHCN data and the metadata to see if the adjustments are warranted. This is what was done to create the adjusted data set.

5 years ago  ::  Dec 14, 2009 - 5:15PM #5
Karma_yeshe_dorje
Posts: 12,808
The biggest scientific adjustment problem here is budgetary cuts!
 