Skimming through the often spaghetti-like code, the number of programs that subject the data to a mixed bag of transformative and filtering routines is simply staggering. Granted, many of these "alterations" range from benign smoothing algorithms (e.g., omitting rogue outliers) to moderate infilling mechanisms (e.g., estimating missing station data from nearby stations). But many others fall into the precarious range between the highly questionable (removing MXD data that correlate poorly with local temperature) and the downright fraudulent (replacing MXD data entirely with measured data to reverse a disorderly trend line).
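To make the two "benign" ends of that spectrum concrete, here is a minimal sketch of what outlier smoothing and neighbor-based infilling typically look like. This is an illustration only: the function names, the z-score threshold, and the simple neighbor-averaging scheme are my assumptions, not routines from the actual source code.

```python
import statistics

def remove_outliers(series, z_max=3.0):
    """Benign smoothing: drop values more than z_max standard
    deviations from the series mean (hypothetical example)."""
    mean = statistics.mean(series)
    sd = statistics.stdev(series)
    return [v for v in series if abs(v - mean) <= z_max * sd]

def infill_missing(station, neighbors):
    """Moderate infilling: estimate each missing (None) reading as the
    mean of the surrounding stations' readings for the same time step
    (hypothetical example)."""
    filled = []
    for i, v in enumerate(station):
        if v is None:
            vals = [n[i] for n in neighbors if n[i] is not None]
            filled.append(statistics.mean(vals) if vals else None)
        else:
            filled.append(v)
    return filled
```

Steps like these are defensible precisely because they are reversible and documented; the controversy below concerns routines that go well beyond them.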
In fact, workarounds for the post-1960 "divergence problem," as described by both RealClimate and Climate Audit, can be found throughout the source code. So much so that perhaps the most ubiquitous programmer's comment (REM) I ran across warns that the particular module "Uses 'corrected' MXD - but shouldn't usually plot past 1960 because these will be artificially adjusted to look closer to the real temperatures."
... But oddly enough, the series doesn’t begin its "decline adjustment" in 1960 -- the supposed year of the enigmatic "divergence." In fact, all data between 1930 and 1994 are subject to "correction."
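For illustration of what such a "correction" could amount to in code, here is a hedged sketch of one way a series might be blended toward the instrumental record across a 1930–1994 window. The linear-ramp weighting and the function itself are assumptions made for exposition; this is not the actual CRU routine, only a toy model of the kind of adjustment the comment describes.

```python
def decline_adjust(years, proxy, instrumental, start=1930, end=1994):
    """Hypothetical illustration: linearly blend a proxy series toward
    measured temperatures across a correction window, so any divergence
    is progressively masked. Window bounds and weighting are assumed."""
    adjusted = []
    for y, p, t in zip(years, proxy, instrumental):
        if y < start:
            adjusted.append(p)  # untouched before the window opens
        else:
            # weight ramps from 0 at `start` to 1 at `end` and beyond
            w = min(1.0, (y - start) / (end - start))
            adjusted.append((1 - w) * p + w * t)
    return adjusted
```

Note that under any scheme of this shape, values from the very first window year onward are altered at least slightly, which is consistent with the observation that the "correction" touches everything from 1930 to 1994, not just the post-1960 divergence era.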