FOIA\documents\osborn-tree6\mann\oldprog\maps12.pro
FOIA\documents\osborn-tree6\mann\oldprog\maps15.pro
FOIA\documents\osborn-tree6\mann\oldprog\maps24.pro
; Plots 24 yearly maps of calibrated (PCR-infilled or not) MXD reconstructions
; of growing season temperatures. Uses "corrected" MXD - but shouldn't usually
; plot past 1960 because these will be artificially adjusted to look closer to
; the real temperatures.
FOIA\documents\harris-tree\recon_esper.pro
; Computes regressions on full, high and low pass Esper et al. (2002) series,
; anomalies against full NH temperatures and other series.
; CALIBRATES IT AGAINST THE LAND-ONLY TEMPERATURES NORTH OF 20 N
;
; Specify period over which to compute the regressions (stop in 1960 to avoid
; the decline)
FOIA\documents\harris-tree\calibrate_nhrecon.pro
;
; Specify period over which to compute the regressions (stop in 1960 to avoid
; the decline that affects tree-ring density records)
;
FOIA\documents\harris-tree\recon1.pro
FOIA\documents\harris-tree\recon2.pro
FOIA\documents\harris-tree\recon_jones.pro
;
; Specify period over which to compute the regressions (stop in 1940 to avoid
; the decline)
;
FOIA\documents\HARRY_READ_ME.txt
17. Inserted debug statements into anomdtb.f90, discovered that
a sum-of-squared variable is becoming very, very negative! Key
output from the debug statements:
(..)
forrtl: error (75): floating point exception
IOT trap (core dumped)
..so the data value is unfeasibly large, but why does the
sum-of-squares parameter OpTotSq go negative?!!
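One plausible mechanism - an assumption, since the log never shows how OpTotSq is declared - is an unscreened missing-value sentinel reaching the accumulator: each stray -9999 contributes 9999**2 (about 1e8) to the sum of squares, and a 32-bit signed accumulator silently wraps negative once the running total passes 2**31 - 1. A minimal Python sketch of that failure mode, and the guard:

MISSING = -9999                      # hypothetical missing-value sentinel

def wrap32(x):
    """Simulate a 32-bit signed (two's complement) accumulator."""
    x &= 0xFFFFFFFF
    return x - 0x100000000 if x >= 0x80000000 else x

vals = [MISSING] * 30                # 30 sentinels that slipped past screening

acc = 0
for v in vals:
    acc = wrap32(acc + v * v)        # each step adds 9999**2 = 99980001

print(acc)                           # negative: the "sum of squares" wrapped

clean = [v for v in vals if v != MISSING]   # guard: drop sentinels first
print(sum(v * v for v in clean))            # 0, as it should be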
FOIA\documents\HARRY_READ_ME.txt
22. Right, time to stop pussyfooting around the niceties of Tim's labyrinthine software
suites - let's have a go at producing CRU TS 3.0! Since failing to do that will be the
definitive failure of the entire project..
FOIA\documents\HARRY_READ_ME.txt
Getting seriously fed up with the state of the Australian data. So many new stations have been
introduced, so many false references.. so many changes that aren't documented. Every time a
cloud forms I'm presented with a bewildering selection of similar-sounding sites, some with
references, some with WMO codes, and some with both. And if I look up the station metadata with
one of the local references, chances are the WMO code will be wrong (another station will have
it) and the lat/lon will be wrong too.
FOIA\documents\HARRY_READ_ME.txt
I am very sorry to report that the rest of the databases seem to be in nearly as poor a state as
Australia was. There are hundreds if not thousands of pairs of dummy stations, one with no WMO
and one with, usually overlapping and with the same station name and very similar coordinates. I
know it could be old and new stations, but why such large overlaps if that's the case? Aarrggghhh!
There truly is no end in sight.
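The duplicate pattern described - same name, near-identical coordinates, one record with a WMO code and one without - is mechanical enough to flag automatically. A sketch with made-up records and an assumed 25 km threshold:

from math import radians, sin, cos, asin, sqrt

def km_between(a, b):
    """Great-circle (haversine) distance between two (lat, lon) pairs, in km."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2)**2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2)**2
    return 2 * 6371 * asin(sqrt(h))

# Hypothetical records: (name, wmo_code or None, (lat, lon))
stations = [
    ("ALICE SPRINGS", 94326, (-23.80, 133.89)),
    ("ALICE SPRINGS", None,  (-23.79, 133.88)),
    ("DARWIN AIRPORT", 94120, (-12.42, 130.89)),
]

# Flag pairs with the same name, within 25 km, where exactly one record
# carries a WMO code - the "dummy station" pairs described above.
for i, (n1, w1, p1) in enumerate(stations):
    for n2, w2, p2 in stations[i + 1:]:
        if n1 == n2 and km_between(p1, p2) < 25 and (w1 is None) != (w2 is None):
            print("possible duplicate:", n1, "WMO", w1 or w2)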
FOIA\documents\HARRY_READ_ME.txt
28. With huge reluctance, I have dived into 'anomdtb' - and already I have
that familiar Twilight Zone sensation.
FOIA\documents\HARRY_READ_ME.txt
Wrote 'makedtr.for' to tackle the thorny problem of the tmin and tmax databases not
being kept in step. Sounds familiar, if worrying. Am I the first person to attempt
to get the CRU databases in working order?!!
FOIA\documents\HARRY_READ_ME.txt
Well, dtr2cld is not the world's most complicated program. Whereas cloudreg is, and I
immediately found a mistake! Scanning forward to 1951 was done with a loop that, for
completely unfathomable reasons, didn't include months! So we read 50 grids instead
of 600!!! That may have had something to do with it. I also noticed, as I was correcting
THAT, that I reopened the DTR and CLD data files when I should have been opening the
bloody station files!!
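The arithmetic behind "50 grids instead of 600": 1901-1950 is fifty years, and with one grid per month there are 50 x 12 = 600 records to pass over before 1951. A sketch of the bug, assuming a sequential one-grid-per-record file (the actual cloudreg layout is not shown in the log):

def skip_to_1951(read_grid, include_months=True):
    """Skip records up to the start of 1951 in a one-grid-per-record file."""
    skipped = 0
    for year in range(1901, 1951):
        for _ in range(12 if include_months else 1):
            read_grid()
            skipped += 1
    return skipped

print(skip_to_1951(lambda: None, include_months=False))  # 50  - the bug
print(skip_to_1951(lambda: None))                        # 600 - the fix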
FOIA\documents\HARRY_READ_ME.txt
Back to the gridding. I am seriously worried that our flagship gridded data product is produced by
Delaunay triangulation - apparently linear as well. As far as I can see, this renders the station
counts totally meaningless. It also means that we cannot say exactly how the gridded data is arrived
at from a statistical perspective - since we're using an off-the-shelf product that isn't documented
sufficiently to say that. Why this wasn't coded up in Fortran I don't know - time pressures perhaps?
Was too much effort expended on homogenisation, that there wasn't enough time to write a gridding
procedure? Of course, it's too late for me to fix it too. Meh.
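For reference, "linear Delaunay" gridding triangulates the station locations and interpolates linearly inside each triangle, so each grid value depends on at most three stations, whatever the nominal station count for that cell. A sketch using scipy's LinearNDInterpolator as a stand-in for the undocumented off-the-shelf routine (the stand-in, points and values are all assumptions):

import numpy as np
from scipy.interpolate import LinearNDInterpolator

# Hypothetical station anomalies at (lon, lat) positions
pts = np.array([[130.9, -12.4], [133.9, -23.8], [151.2, -33.9], [115.9, -31.9]])
vals = np.array([0.4, 0.9, 0.2, 0.5])

interp = LinearNDInterpolator(pts, vals)   # builds a Delaunay triangulation

# Evaluate on a half-degree grid; each cell takes its value from the three
# vertices of the enclosing triangle only, and is NaN outside the hull.
lon, lat = np.meshgrid(np.arange(114, 154, 0.5), np.arange(-36, -10, 0.5))
grid = interp(lon, lat)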
FOIA\documents\HARRY_READ_ME.txt
Here, the expected 1990-2003 period is MISSING - so the correlations aren't so hot! Yet
the WMO codes and station names/locations are identical (or close). What the hell is
supposed to happen here? Oh yeah - there is no 'supposed', I can make it up. So I have :-)
FOIA\documents\HARRY_READ_ME.txt
Well, it's been a real day of revelations, never mind the week. This morning I
discovered that proper angular weighted interpolation was coded into the IDL
routine, but that its use was discouraged because it was slow! Aaarrrgghh.
There is even an option to tri-grid at 0.1 degree resolution and then 'rebin'
to 720x360 - also deprecated! And now, just before midnight (so it counts!),
having gone back to the tmin/tmax work, I've found that most if not all of the
Australian bulletin stations have been unceremoniously dumped into the files
without the briefest check for existing stations.
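The "proper angular weighted interpolation" is presumably the angular-distance-weighted (ADW) scheme used in other CRU gridded products; that attribution is an assumption. A simplified, distance-only sketch (the directional term of full ADW is omitted, and the 500 km decay length is an assumed parameter, not a value from the code):

import numpy as np

def adw_point(lon0, lat0, st_lon, st_lat, st_val, cdd=500.0):
    """Distance-weighted estimate at one grid point from station values."""
    la0, lo0 = np.radians(lat0), np.radians(lon0)
    la, lo = np.radians(st_lat), np.radians(st_lon)
    # great-circle distances from the grid point to each station, in km
    d = 6371 * np.arccos(np.clip(
        np.sin(la0) * np.sin(la) + np.cos(la0) * np.cos(la) * np.cos(lo - lo0),
        -1, 1))
    w = np.exp(-d / cdd)               # correlation-decay weighting
    return np.sum(w * st_val) / np.sum(w)

# e.g. three stations around a grid point at (145E, 30S)
est = adw_point(145.0, -30.0,
                np.array([144.0, 146.5, 143.2]),
                np.array([-29.5, -31.0, -30.8]),
                np.array([0.3, 0.6, 0.4]))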
FOIA\documents\HARRY_READ_ME.txt
As we can see, even I'm cocking it up! Though recoverably. DTR, TMN and TMX need to be written as (i7.7).
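For the format itself: Fortran's i7.7 edit descriptor writes an integer zero-padded to a minimum of seven digits, keeping identifiers fixed-width. The Python equivalent, for illustration:

print(f"{123456:07d}")   # 0123456 - what Fortran's write(*,'(i7.7)') produces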
FOIA\documents\HARRY_READ_ME.txt
OH FUCK THIS. It's Sunday evening, I've worked all weekend, and just when I thought it was done I'm
hitting yet another problem that's based on the hopeless state of our databases. There is no uniform
data integrity, it's just a catalogue of issues that continues to grow as they're found.
FOIA\documents\osborn-tree6\mann\mxdgrid2ascii.pro
printf,1,'Osborn et al. (2004) gridded reconstruction of warm-season'
printf,1,'(April-September) temperature anomalies (from the 1961-1990 mean).'
printf,1,'Reconstruction is based on tree-ring density records.'
printf,1
printf,1,'NOTE: recent decline in tree-ring density has been ARTIFICIALLY'
printf,1,'REMOVED to facilitate calibration. THEREFORE, post-1960 values'
printf,1,'will be much closer to observed temperatures than they should be,'
printf,1,'which will incorrectly imply the reconstruction is more skilful'
printf,1,'than it actually is. See Osborn et al. (2004).'
FOIA\documents\osborn-tree6\summer_modes\data4sweden.pro
printf,1,'IMPORTANT NOTE:'
printf,1,'The data after 1960 should not be used. The tree-ring density'
printf,1,'records tend to show a decline after 1960 relative to the summer'
printf,1,'temperature in many high-latitude locations. In this data set'
printf,1,'this "decline" has been artificially removed in an ad-hoc way, and'
printf,1,'this means that data after 1960 no longer represent tree-ring'
printf,1,'density variations, but have been modified to look more like the'
printf,1,'observed temperatures.'
FOIA\documents\osborn-tree6\combined_wavelet_col.pro
;
; Remove missing data from start & end (end in 1960 due to decline)
;
kl=where((yrmxd ge 1402) and (yrmxd le 1960),n)
sst=prednh(kl)
FOIA\documents\osborn-tree6\mann\mxd_pcr_localtemp.pro
; Tries to reconstruct Apr-Sep temperatures, on a box-by-box basis, from the
; EOFs of the MXD data set. This is PCR, although PCs are used as predictors
; but not as predictands. This PCR-infilling must be done for a number of
; periods, with different EOFs for each period (due to different spatial
; coverage). *BUT* don't do special PCR for the modern period (post-1976),
; since they won't be used due to the decline/correction problem.
; Certain boxes that appear to reconstruct well are "manually" removed because
; they are isolated and away from any trees.
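A compact sketch of PCR in the sense used here - EOFs/PCs of the MXD anomaly matrix as predictors, each grid-box temperature regressed on the leading PCs, refit for each coverage period. Shapes and names are assumptions, not the original code:

import numpy as np

def pcr_fit_predict(mxd, temp, n_pc):
    """mxd: (years, boxes) proxy matrix; temp: (years,) one box's temperature.
    Returns the PCR prediction of temp from the leading n_pc PCs."""
    X = mxd - mxd.mean(axis=0)
    U, S, Vt = np.linalg.svd(X, full_matrices=False)   # EOF decomposition
    pcs = U[:, :n_pc] * S[:n_pc]                       # PC time series
    A = np.column_stack([np.ones(len(pcs)), pcs])      # intercept + PCs
    coef, *_ = np.linalg.lstsq(A, temp, rcond=None)    # regress temp on PCs
    return A @ coef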
FOIA\documents\osborn-tree6\briffa_sep98_d.pro
;mknormal,yyy,timey,refperiod=[1881,1940]
;
; Apply a VERY ARTIFICIAL correction for decline!!
;
yrloc=[1400,findgen(19)*5.+1904]
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
2.6,2.6,2.6]*0.75 ; fudge factor
(...)

;
; APPLY ARTIFICIAL CORRECTION
;
yearlyadj=interpol(valadj,yrloc,x)
densall=densall+yearlyadj
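Numerically: valadj is a 20-point adjustment curve defined at yrloc (1400, then 1904 to 1994 in 5-year steps), scaled by 0.75, and interpol stretches it linearly onto the yearly time axis before it is added to the density series. A Python rendering of the same two calls, with numpy.interp standing in for IDL's interpol:

import numpy as np

yrloc = np.concatenate(([1400], np.arange(19) * 5 + 1904))
valadj = np.array([0., 0., 0., 0., 0., -0.1, -0.25, -0.3, 0., -0.1, 0.3,
                   0.8, 1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6]) * 0.75

x = np.arange(1400, 1995)                  # yearly time axis of the series
yearlyadj = np.interp(x, yrloc, valadj)    # interpol(valadj, yrloc, x)
# densall = densall + yearlyadj            # e.g. adds ~1.95 at 1990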
FOIA\documents\osborn-tree6\summer_modes\pl_decline.pro
;
; Plots density 'decline' as a time series of the difference between
; temperature and density averaged over the region north of 50N,
; and an associated pattern in the difference field.
; The difference data set is computed using only boxes and years with
; both temperature and density in them - i.e., the grid changes in time.
; The pattern is computed by correlating and regressing the *filtered*
; time series against the unfiltered (or filtered) difference data set.
;
;*** MUST ALTER FUNCT_DECLINE.PRO TO MATCH THE COORDINATES OF THE
; START OF THE DECLINE *** ALTER THIS EVERY TIME YOU CHANGE ANYTHING ***
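A sketch of the pattern step as described - each box of the temperature-minus-density field regressed on the filtered difference series; the array shapes are assumptions:

import numpy as np

def decline_pattern(ts_filt, diff_field):
    """ts_filt: (years,) filtered difference series; diff_field: (years, boxes).
    Returns the regression slope of each box on the filtered series."""
    t = ts_filt - ts_filt.mean()
    f = diff_field - np.nanmean(diff_field, axis=0)
    return np.nansum(f * t[:, None], axis=0) / np.sum(t * t)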
FOIA\documents\osborn-tree6\mann\oldprog\calibrate_correctmxd.pro
; We have previously (calibrate_mxd.pro) calibrated the high-pass filtered
; MXD over 1911-1990, applied the calibration to unfiltered MXD data (which
; gives a zero mean over 1881-1960) after extending the calibration to boxes
; without temperature data (pl_calibmxd1.pro). We have identified and
; artificially removed (i.e. corrected) the decline in this calibrated
; data set. We now recalibrate this corrected calibrated dataset against
; the unfiltered 1911-1990 temperature data, and apply the same calibration
; to the corrected and uncorrected calibrated MXD data.
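The calibration step itself reads as ordinary least squares of temperature on the (corrected) MXD over 1911-1990, with the fitted coefficients then applied to the full-length series; a minimal sketch under that assumption:

import numpy as np

def calibrate(proxy, temp, years, lo=1911, hi=1990):
    """Fit proxy -> temperature by OLS over [lo, hi]; return the
    calibrated full-length series."""
    m = (years >= lo) & (years <= hi)
    slope, intercept = np.polyfit(proxy[m], temp[m], 1)
    return slope * proxy + intercept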
FOIA\documents\osborn-tree6\summer_modes\calibrate_correctmxd.pro
; No need to verify the corrected and uncorrected versions, since these
; should be identical prior to 1920 or 1930 or whenever the decline
; was corrected onwards from.
FOIA\documents\osborn-tree5\densplus188119602netcdf.pro
; we know the file starts at yr 440, but we want nothing till 1400, so we
; can skip lines (1400-440)/10 + 1 header line
; we now want all lines (10 yr per line) from 1400 to 1980, which is
; (1980-1400)/10 + 1 lines
(...)
; we know the file starts at yr 1070, but we want nothing till 1400, so we
; can skip lines (1400-1070)/10 + 1 header line
; we now want all lines (10 yr per line) from 1400 to 1991, which is
; (1990-1400)/10 + 1 lines (since 1991 is on line beginning 1990)
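The bookkeeping in these comments, made explicit: with one header line and ten years per data line, reaching 1400 in a file that starts at 440 means skipping (1400-440)/10 + 1 = 97 lines, and covering 1400-1980 means reading (1980-1400)/10 + 1 = 59 lines. A sketch assuming that layout:

def read_decadal_block(path, file_start, want_start, want_end):
    """Return the lines covering want_start..want_end from a file with one
    header line and ten years of data per line, starting at file_start."""
    skip = (want_start - file_start) // 10 + 1   # e.g. (1400-440)//10 + 1 = 97
    keep = (want_end - want_start) // 10 + 1     # e.g. (1980-1400)//10 + 1 = 59
    with open(path) as f:
        lines = f.readlines()
    return lines[skip:skip + keep]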