Some guides to the primary sources for the Front Range precipitation data.
Regional Climate Model Output
For this case study we consider a subset of a 20-year numerical simulation for
the Western US. Briefly, the control run is a simulation of
current climate and the future runs are based on one percent annual increases in
atmospheric CO2.
This output was shared by Lai-Yung (Ruby) Leung (Ruby.Leung@pnl.gov)
and her collaborators, and they
should be clearly acknowledged and cited in any publications.
Please include the following reference to these experiments as part of
any citation:
Leung, L.R., Y. Qian, X. Bian, W.M. Washington, J. Han, and J.O. Roads (2004). Mid-Century Ensemble Regional Climate Change Scenarios for the Western United States. Climatic Change, 62(1-3):75-113.

The IMAGe public data directory contains four large (250M) binary files.
The file README.RegCM is a general overview of this file (and 3 other experiments not provided), and the netCDF header listed here is a precise description of the contents of this file and the data origins. The provenance of these data is that an ASCII version was made available to Larry McDaniel (ISSE/NCAR) by Ruby Leung, and Tim Hoar (IMAGe/NCAR) subsequently reformatted the output as a self-describing netCDF file. (The file name follows the experiment id system used by Ruby to track her different model runs.)
The model output consists of the surface variables minimum temperature, maximum temperature, and precipitation at a 61X49 grid of lon/lat locations and for 7305 days comprising a 20-year simulation. The first day corresponds to the climate conditions of JUL 1 1995. But keep in mind that one would not expect the particular sequence of weather events observed over this period to be matched by the RegCM results. The correct interpretation is that the distribution of weather simulated by the RegCM should match the distribution of observed weather. Keep in mind that from a scientific perspective modelers often look to see whether the model reproduces the qualitative behavior of complex geophysical processes rather than just comparing mean values of surface variables. However, direct comparison to observed means is one aspect of assessing the value of these models. At least two grid boxes on each side should be ignored, as the simulations are affected by numerical boundary effects.
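As a sketch of the boundary trimming step, the snippet below (base R only, with a small synthetic array standing in for the actual RegCM output) drops two grid boxes from each edge of the 61X49 spatial grid:

```r
# Synthetic stand-in for the RegCM precipitation array: 61 x 49 grid,
# 10 days here instead of the full 7305, purely for illustration.
nx <- 61; ny <- 49; nday <- 10
precip <- array(runif(nx * ny * nday), c(nx, ny, nday))

trim <- 2                                  # grid boxes to drop on each side
inner <- precip[(trim + 1):(nx - trim), (trim + 1):(ny - trim), ]
dim(inner)                                 # 57 45 10
```

The same index ranges would be applied to the lon/lat matrices so the locations stay aligned with the trimmed fields.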
Finally, note that the grid is not equally spaced with respect to lon/lat; the locations are the result of an equally spaced Lambert projection centered on the Western US. So there are 61*49 unique lon and lat values to describe the grid box locations. A simple way to view these data on a map in R is to either use the quilt.plot function in fields or interpolate to an equally spaced grid with the akima package and then use image, contour, image.plot, drape.plot, etc. Of course interpolation of raw spatial data can always have unexpected artifacts.

As an example of how to process these data using R, the file RegCM.R is the R script that reads in the locations and the precipitation variables using the ncdf package and subsets the data for the Front Range domain. The resulting R objects Cday, Csummer, Cday should be identical to the corresponding data provided in the R data sets. Note that there are several minor errors in the logic of the script that have not been corrected, so that the R data objects match this processing.
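To illustrate the domain subsetting idea, the sketch below builds synthetic 61X49 lon/lat matrices (in practice these come from the netCDF file) and selects the grid boxes inside a bounding box. The box limits are made-up values for illustration, not the ones used in RegCM.R:

```r
# Synthetic curvilinear grid: every grid box has its own lon and lat.
nx <- 61; ny <- 49
lon <- matrix(seq(-125, -100, length.out = nx), nx, ny)
lat <- matrix(seq(30, 50, length.out = ny), nx, ny, byrow = TRUE)

# Hypothetical Front Range bounding box (assumed values).
inBox <- lon > -106 & lon < -104 & lat > 39 & lat < 41
sum(inBox)    # number of grid boxes falling inside the box
```

Because lon and lat are full matrices rather than separate axis vectors, the logical matrix inBox is the natural way to pick out a lon/lat region on this projected grid.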
As an intermediate data format the Front Range stations were retrieved from the NCAR MSS and the station records were organized into separate files. This directory is available as FRhourly_Precip.tar.gz (1.6M) on the Front Range Project public FTP site. In Unix:
gunzip FRhourly_Precip.tar.gz
tar -xvf FRhourly_Precip.tar

This will expand to a directory DataFiles with each station as a separate file with the convention hrxxxxxx.clean, where xxxxxx is the COOP id number for the station. ("Clean" refers to removing missing value codes from the raw records.) The R data sets order these stations in sorted order by COOP id, and Station.key is a text file with the basic station info including the station names and COOP ids. The first 10 lines of hr050843.clean illustrate the format: each line is a day with format station id, year/month/day, and 24 integers giving the recorded precipitation amount for each hour. The units are hundredths of an inch.
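As a sketch of reading one of these records, the snippet below parses a single line assuming, purely for illustration, that the fields are whitespace-separated numbers: COOP id, year, month, day, then 24 hourly amounts in hundredths of an inch. The example line is made up, not copied from hr050843.clean:

```r
# Hypothetical one-day record in the assumed whitespace-separated layout.
line <- "050843 1995 7 1 0 0 0 0 0 0 0 0 5 12 0 0 0 0 0 0 0 0 0 0 0 0 0 0"
f <- scan(text = line, quiet = TRUE)

hourly <- f[-(1:4)] / 100   # drop id/date fields, convert to inches
sum(hourly)                  # daily total in inches
```

Checking that length(hourly) is 24 is a cheap sanity test when looping such a parser over a whole station file.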
NCDC public data site
To go directly to NCDC for these data use http://www.ncdc.noaa.gov/oa/documentlibrary/ds-doc.html and then 'Data & Products' on the left menu.
Find the item Hourly Precipitation Data (TD 3240) and then choose Online User selection. Grab the stations by specifying the Station Range (by the COOP ids).
For the Front Range domain these are the ids from 050183 to 059285; Boulder is 050843. On the next page give the dates: start from 1948 to the present. When all stations in this range are selected the dataset size exceeds the 100Mb limit for NCDC ftp transfers. In addition, it appears that the Boulder station hourly data may not be archived here. But a subset of the station data can be accessed on this public site and checked against the intermediate files or R data sets.
A direct FTP-style web site at NCDC is http://www1.ncdc.noaa.gov/pub/data/prism100/
The documentation for these data is quite thorough. More to be added!