Identifying Local Fire Events From Sediment Charcoal Records Via Regularization

Post by Malcolm Itter, a graduate student with Andrew Finley at Michigan State University. Malcolm received an Outstanding Student Paper Award for this work at AGU 2016!

Charcoal particles deposited in lake sediments during and following wildland fires serve as records of local to regional fire history. As paleoecologists, we would like to apply these records to understand how fire regimes, including fire frequency, size, and severity, vary with climate and regional vegetation on a centennial to millennial scale. Sediment charcoal deposits arise from several sources including: 1) direct transport during local fires; 2) surface transport via wind and water of charcoal deposited within a lake catchment following regional fires; 3) sediment mixing within the sample lake concentrating charcoal in the lake center. A common challenge when using sediment charcoal records is the need to separate charcoal generated during local fire events from charcoal generated from regional and secondary sources. Recent work by PalEON collaborators including myself, Andrew Finley, Mevin Hooten, Phil Higuera, Jenn Marlon, Ryan Kelly, and Jason McLachlan applies statistical regularization to separate local and regional charcoal deposition allowing for inference regarding local fire frequency and regional fire dynamics. Here we describe the general concept of regularization as it relates to paleo-fire reconstruction. Additional details can be found in Itter et al. (Submitted).

Figure 1: Illustration of theoretical charcoal deposition to a lake if charcoal particles arising from regional fires were distinguishable from particles arising from local fires (in practice, charcoal particles from different sources are indistinguishable). The figure does not depict charcoal arising from secondary sources such as surface water runoff or sediment mixing.

Figure 1 illustrates primary and regional charcoal deposition to a sample lake. We can think of charcoal deposition to a sample lake as being driven by two independent processes in time: a foreground process driving primary charcoal deposition during local fires, and a background process driving regional and secondary charcoal deposition. In practice, charcoal particles arising from different sources are indistinguishable in sediment charcoal records. We observe a single charcoal count over a fixed time interval. Direct estimation of foreground and background processes is not possible without separate background and foreground counts. We overcome the lack of explicit background and foreground counts by making strong assumptions about the nature of the background and foreground processes. Specifically, we assume the background process is smooth, exhibiting low-frequency changes over time, while the foreground process is highly-variable, exhibiting high-frequency changes in charcoal deposition rates associated with local fires. These assumptions follow directly from a long line of paleoecological research, which partitions charcoal into: 1) a background component that reflects regional charcoal production varying as a function of long-term climate and vegetation shifts; 2) a peak component reflecting local fire events and measurement error.

We use statistical regularization to enforce these assumptions about the relative smoothness of the background process and volatility of the foreground process. Under regularization, we seek the solution to an optimization problem (such as maximizing the likelihood of a parameter) subject to a constraint. The purpose of the constraint, in the context of Bayesian data analysis, is to bound the posterior distribution to some reasonable range. In this way, the constraint resembles an informative prior distribution. Additional details on statistical regularization can be found in Hobbs & Hooten (2015) and Hooten & Hobbs (2015).
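
As a standard illustration of this penalty-prior equivalence (not specific to the charcoal model), the ridge-regression estimate is also the posterior mode under a zero-mean Gaussian prior on the coefficients, with the penalty weight set by the ratio of the error variance to the prior variance:

```latex
\hat{\beta}_{\mathrm{ridge}} = \arg\min_{\beta} \left\{ \lVert y - X\beta \rVert^{2} + \lambda \lVert \beta \rVert^{2} \right\},
\qquad
\lambda = \frac{\sigma^{2}}{\tau^{2}}
\quad \text{when } y \sim \mathcal{N}(X\beta, \sigma^{2} I), \; \beta \sim \mathcal{N}(0, \tau^{2} I).
```

A small prior variance corresponds to a large penalty and heavy shrinkage; a large prior variance corresponds to a weak penalty and a flexible fit.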

In the context of sediment charcoal records, we model two deposition processes under the constraint that the background process is smooth, while the foreground process is volatile. We use unique sets of regression coefficients to model the background and foreground processes. Both sets of regression coefficients are assigned prior distributions, but with different prior variances. The prior variance for the foreground coefficients is much larger than the prior variance for the background coefficients. The prior variance parameters serve as the regulators (equivalent to a penalty term in Lasso or ridge regression) and force the background process to be smooth, while allowing the foreground process to be sufficiently flexible to capture charcoal deposition from local fires.
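
To make the differential-shrinkage idea concrete, here is a minimal sketch in Python. It is an illustration only, not the model or code of Itter et al.: charcoal counts are treated as Poisson, the background and foreground processes are basis-function expansions of time, and the two coefficient vectors receive Gaussian priors with small and large variances, respectively. All function names, basis sizes, and variance values are hypothetical.

```python
# Minimal sketch of regularization via unequal prior variances (illustration only,
# not the model or code of Itter et al.). Charcoal counts y are treated as Poisson
# with rate = exp(background) + exp(foreground); both components are basis-function
# expansions of time, but the background coefficients get a small prior variance
# (heavy shrinkage, smooth) while the foreground coefficients get a large one
# (light shrinkage, volatile).
import numpy as np
from scipy.optimize import minimize

def gaussian_basis(t, n_basis, width_factor=1.5):
    """Evenly spaced Gaussian bumps over t (a simple stand-in for a spline basis)."""
    centers = np.linspace(t.min(), t.max(), n_basis)
    width = width_factor * (centers[1] - centers[0])
    return np.exp(-0.5 * ((t[:, None] - centers[None, :]) / width) ** 2)

def neg_log_posterior(params, y, Xb, Xf, var_b=0.01, var_f=10.0):
    """Poisson negative log-likelihood plus Gaussian (ridge-like) priors."""
    beta_b, beta_f = params[:Xb.shape[1]], params[Xb.shape[1]:]
    rate = np.exp(Xb @ beta_b) + np.exp(Xf @ beta_f)     # total deposition rate
    nll = np.sum(rate - y * np.log(rate))                # Poisson, up to a constant
    penalty = np.sum(beta_b**2) / (2 * var_b) + np.sum(beta_f**2) / (2 * var_f)
    return nll + penalty

# Hypothetical usage, given ages t and charcoal counts y for one lake:
# Xb, Xf = gaussian_basis(t, 10), gaussian_basis(t, 40)   # coarse vs. flexible bases
# fit = minimize(neg_log_posterior, np.zeros(Xb.shape[1] + Xf.shape[1]),
#                args=(y, Xb, Xf), method="L-BFGS-B")
```

In a fully Bayesian fit the same differential shrinkage enters through the two prior variances, with posterior draws of the background and foreground curves replacing the single penalized optimum.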

Figure 2: Model results for Screaming Lynx Lake, Alaska. Upper panel indicates observed charcoal counts along with the posterior mean charcoal count (blue line). Middle panel illustrates posterior mean foreground (orange line) and background (black line) deposition processes. Lower panel plots posterior mean probability of fire estimates for each observed time interval (black line) along with the upper and lower bounds of the 95 percent credible interval (gray shading) and an optimized local fire threshold (red line).

Figure 2 shows the background and foreground deposition processes separated via regularization from a single set of charcoal counts for Screaming Lynx Lake in Alaska. The probability-of-fire values presented in the lower panel of Figure 2 follow from the ratio of the foreground process to the sum of the background and foreground processes. We would not be able to identify the background and foreground processes without the strong assumptions on their dynamics over time and the corresponding regularization. The benefits of using such an approach to model sediment charcoal deposition are: 1) our model reflects scientific understanding of charcoal deposition to lakes during and after fire events; 2) we are able to identify local fire events from noisy sediment charcoal records; 3) the background process provides a measure of regional fire dynamics, which can be correlated with climate and vegetation shifts over time.
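
In symbols, if f_t and b_t denote the posterior foreground and background deposition rates in time interval t, the quantity plotted in the lower panel is the ratio described above,

```latex
p_t = \frac{f_t}{f_t + b_t},
```

the proportion of total deposition attributed to the foreground (local fire) process.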

References
1. Hobbs, N.T., Hooten, M.B. 2015. Bayesian Models: A Statistical Primer for Ecologists. Princeton University Press, Princeton, NJ.
2. Hooten, M.B., Hobbs, N.T. 2015. A guide to Bayesian model selection for ecologists. Ecological Monographs, 85, 3-28.
3. Itter, M.S., Finley A.O., Hooten, M.B., Higuera, P.E., Marlon, J.R., Kelly, R., McLachlan, J.S. (Submitted). A model-based approach to wildland fire reconstruction using sediment charcoal records. arXiv:1612.02382

State Data Assimilation and PalEON

Post by Michael Dietze and Ann Raiho

What is state data assimilation (SDA)?

SDA is the process of using observed data to update the internal STATE estimates of a model, as opposed to using data for validation or parameter calibration. The exact statistical methods vary, but generally this involves running models forward, stopping at times where data were observed, “nudging” the model back on track, and then restarting the model run (Figure 1). The approaches being employed by the modeling teams in PalEON are all variations of ENSEMBLE-based assimilation, meaning that in order to capture the uncertainty and variability in model predictions, during the analysis step (i.e. nudge) we update both the mean and the spread of the ensemble based on the uncertainties in both the model and the data. Importantly, we don’t just update the states that we observed, but we also update the other states in the model based on their covariances with the states that we do observe. For example, if we update composition based on pollen or NPP based on tree rings, we also update the carbon pools and land surface fluxes that co-vary with these.
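
As a concrete illustration of the analysis ("nudge") step described above, here is a minimal sketch of a generic ensemble Kalman filter update in Python. It is not the specific scheme used by any PalEON modeling team, and all names and dimensions are hypothetical, but it shows how the ensemble covariance propagates an update from observed states to unobserved ones.

```python
# Minimal sketch of a generic ensemble Kalman filter analysis ("nudge") step.
# This is a textbook perturbed-observation EnKF update, not the specific scheme
# used by any PalEON team; all names and dimensions are hypothetical.
import numpy as np

def enkf_analysis(X, y, H, R, rng=np.random.default_rng(0)):
    """Update an ensemble of model states X (n_state x n_ens) with observations y (n_obs,).

    H maps state space to observation space (n_obs x n_state); R is the observation
    error covariance (n_obs x n_obs).
    """
    n_ens = X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)            # ensemble anomalies
    P = A @ A.T / (n_ens - 1)                        # forecast covariance from the ensemble
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)     # Kalman gain
    # Each member is nudged toward its own noisy copy of the observations,
    # which keeps an ensemble spread consistent with model and data uncertainty.
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=n_ens).T
    return X + K @ (Y - H @ X)
```

Because the gain K contains the cross-covariances between all state variables and the observed quantities, an update constrained by pollen or tree rings also adjusts the carbon pools and fluxes that co-vary with them in the ensemble.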

Figure 1. Schematic of how state data assimilation works. From an initial state (shown as pink in the Forecast Step) you make a prediction (blue curve in the Analysis step). Then compare your data or new observation (green in the Analysis step) to the model prediction (blue) and calculate an updated state (pink in the Analysis step).

There are many components in the PalEON SDA and many people are involved. In all methods being employed by PalEON modeling teams, the uncertainty in the meteorological drivers is a major component of the model ensemble spread. Christy Rollinson has developed a workflow that generates an ensemble of ensembles of meteorological drivers — first she starts with an ensemble of different GCMs that have completed the ‘last millennium’ run (850-1850 AD) and then downscales each GCM in space and time, generating an ensemble of different meteorological realizations for each GCM that propagates the downscaling uncertainty. John Tipton and Mevin Hooten then update this ensemble of ensembles, providing weights to each based on their fidelity to different paleoclimate proxies over different timescales. In addition to the meteorological realizations, some of the techniques being employed also accommodate model parameter error and model process error (which is like a ‘residual’ error after accounting for observation error in the data).

Why are we doing SDA in PalEON?

In the PalEON proposals we laid out four high-level PalEON objectives: Validation, Inference, Initialization, and Improvement. Our previous MIP (Model Intercomparison Project) activities at the site and regional scale focused specifically on the first of these, Validation. By contrast, SDA directly informs the next two (Inference, Initialization). Both the SDA and the MIP indirectly support the fourth (Improvement).

In terms of Inference, the central idea here is to formally fuse models and data to improve our ability to infer the structure, composition, and function of ecosystems on millennial timescales. Specifically, by leveraging the covariances between observed and unobserved states, we’re hoping that models will help us better estimate what pre- and early-settlement ecosystems were like, in particular for variables not directly related to our traditional paleo proxies (e.g. carbon pools, GPP, NEE, water fluxes, albedo). The last millennium is a particularly important period to infer as it’s the baseline against which we judge anthropogenic impacts, but we lack measurements for many key variables for that baseline period. We want to know how much we can reduce the uncertainty about that baseline.

In terms of Initialization, a key assumption in many modeling exercises (including all CMIP / IPCC projections) is that we can spin ecosystems up to a pre-settlement ‘steady state’ condition. Indeed, it is this assumption that’s responsible for there being far less model spread at 1850 than for the modern period, despite far more observations being available for the modern period. However, no paleoecologist believes the world was at equilibrium prior to 1850. Our key question is “how much does that assumption matter”? Here we’re using data assimilation to force models to follow the non-equilibrium trajectories they actually followed and assessing how much impact that has on contemporary predictions.

Finally, SDA gives us a new perspective on model validation and improvement. In our initial validation activity, as well as all other MIPs and most other validation activities, if a model gets off to a wrong start, it will generally continue to perform poorly thereafter even if it correctly captures the processes responsible for further change over time. Here, by continually putting the model back ‘on track’ we can better assess the ability of models to capture the system dynamics over specific, fixed time steps, and when in time and space they make reasonable vs. unreasonable predictions.

SDA Example

Figure 2 shows a PalEON SDA example for a 30-year time period using tree ring estimates of aboveground biomass for four tree species from data collected at UNDERC and a forest gap model called LINKAGES. The two plots show the tree ring data for hemlock and yellow birch in green, the model prediction in purple, and the data-nudged update in pink. The correlation plot on the right represents the process error correlation matrix. That is, it shows which correlations are either missing in LINKAGES or are over-represented. For example, the negative correlations between hemlock vs. yellow birch and cedar suggest there’s a negative interaction between these species that is stronger than LINKAGES predicted, while at the same time yellow birch and cedar positively covary more than LINKAGES predicted. One interpretation of this is that hemlock is a better competitor in this stand, and yellow birch and cedar worse, than LINKAGES would have predicted. Similarly, the weak correlations of all other species with maple don’t imply that maples are not competing, but rather that the assumptions built into LINKAGES already capture the interaction of this species with its neighbors.

Figure 2. SDA example of aboveground biomass in the LINKAGES gap model. The left and middle plots are the biomass values through time given the data (green), the model predictions (purple), and the updated model-data output (pink). The plot on the right is a correlation plot representing the process error correlation matrix.

 

PalEON at AGU 2016

If you are going to AGU this year make sure to stop by and check out what PalEON has been working on!

AGU 2016 PalEON schedule color coded by day with lighter colors being posters, darker colors talks

Expert Elicitation to Interpret Pollen Data

Post by Andria Dawson, Post-Doc at the University of Arizona and the University of California-Berkeley

Fossil pollen counts from sediments collected from bogs, lakes, or forest hollows tell us something about the composition of surrounding forests. In a sediment core, pollen samples from multiple depths tell us about changes in these surrounding forests over time. Barring some rare and complex geophysical events, going deeper means going back in time. With some simplifying assumptions about how pollen travels from tree to sediment, we can use counts of sediment pollen grains to quantitatively reconstruct forests of the past.

However, correlating depth with time, or aging the sediment, is a difficult problem. Sediment accumulates at rates that vary through time, resulting in non-linear age-depth relationships. This means that knowing the sampling year – or the age of the surface sediment – is not enough to reliably estimate the ages of samples from further down in the sediment. Radiometric dating fills this gap. Small pieces of plant material from the surrounding environment find their way into the sediment; these are macrofossils. Isotope signatures from these macrofossils can be used to determine their approximate age and provide us with additional age-depth data points. Age-depth models can be constructed from these age-depth data points.
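
As a toy illustration of what an age-depth model does (dedicated tools such as Bacon or Bchron treat accumulation rates and dating uncertainty much more carefully), here is a minimal piecewise-linear sketch in Python; the depths and ages below are invented for illustration.

```python
# Toy age-depth model: piecewise-linear interpolation between dated depths.
# Real age-depth models propagate radiometric dating uncertainty and model
# accumulation rates explicitly; this sketch ignores both.
import numpy as np

depth_cm = np.array([0.0, 35.0, 80.0, 140.0])          # surface plus dated macrofossil depths
age_yr_bp = np.array([-60.0, 450.0, 1800.0, 4200.0])   # hypothetical calibrated ages

def age_at_depth(d):
    """Interpolate an age for any sampled depth, assuming monotone accumulation."""
    return np.interp(d, depth_cm, age_yr_bp)

print(age_at_depth(100.0))  # approximate age of a pollen sample at 100 cm depth
```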

Another way to link depth with age is to look for signatory changes in pollen representation over time. Hallmark changes in the representation of indicator taxa allow scientists to associate sediment depths with events whose dates (ages) are roughly known. In the upper midwestern US, European settlement led to significant land-use changes which resulted in increases in several agricultural indicator taxa, including Ambrosia (ragweed) and Rumex (docks and sorrels) (Figure 1). This change in pollen representation makes it possible to identify pre- and post-settlement depths in a pollen sediment core. This matters because some scientists (including some of us on PalEON) hypothesize that major land-use changes caused substantial changes in the pollen-vegetation relationship. Were these anthropogenically-induced changes in the pollen-vegetation relationship greater than what we would expect without this external forcing? We don’t know, and might never know.

Images of A) ragweed and B) sheep sorrel.

Nevertheless, we want to identify what we often refer to as the settlement horizon in the pollen records for at least two reasons. First, it allows us to compare pollen from the time of European settlement with public land survey records. Second, it is often used as an additional age-depth data point in the construction of age-depth models. But how easy is it to identify this settlement horizon? Recent work shows it is not as easy as one might have thought.

The unofficial PalEON mantra is that it is better to be correct and uncertain than certain and wrong. This line of thought led us to conduct an experiment using expert elicitation, where experts were tasked with identifying the settlement horizon in pollen records from the upper midwest. Four experts each considered 185 pollen records from the upper midwest USA. For 59 pollen records the experts agreed on the location of the settlement horizon (Figure 2). For the remaining records, there was some level of disagreement (Figure 3). This is not surprising, but does highlight the importance of thinking about uncertainty. Does this mean that we should disregard all previous attempts to identify the settlement horizon? The answer to this is a resounding no. The moral from all of this is that understanding your data is critical; understand its uncertainty and how this impacts your work. In the age of big-data and data-sharing, it becomes more difficult to really know your data, but the payoff is sound science. Know your data, and know it well.

To learn more about how we use results from the expert elicitation exercise referred to above, check out our recent Dawson et al. 2016 paper in Quaternary Science Reviews where we calibrate the pollen-vegetation relationship. Elicitation results have also been used to redefine controls for a new suite of age-depth models (Goring et al., in prep), which will in turn be used to assign dates to pollen samples used in vegetation reconstructions (Dawson et al., in prep).

 

Figure 2. Example of a pollen diagram from a site where experts were in complete agreement on the location of the representative pre-settlement sample. Samples identified by experts as pre-settlement are indicated by the dashed lines.

Pollen Diagram Figure 3

Figure 3. Example of a pollen diagram from a site where experts were in complete disagreement on the location of the representative pre-settlement sample. Samples identified by experts as pre-settlement are indicated by the dashed lines.

References

1. Dawson, Paciorek, McLachlan, Goring, Williams, Jackson. 2016. Quantifying pollen-vegetation relationships to reconstruct ancient forests using 19th-century forest composition and pollen data. Quaternary Science Reviews 137: 156-175.
2. Goring, Dawson, Grimm, et al. 2016. Semi-automated age model development for large scale databases. In prep for submission to Open Quaternary.
3. Dawson, Paciorek, McLachlan, Goring, Williams, Jackson. 2016. Pre-industrial baseline variation of upper midwestern US vegetation. In prep for submission to Quaternary Science Reviews.

Synthesizing Fire-History Records to Understand Fire-Regime Variability Across Alaska

Post by Tyler Hoecker, a graduate student with Philip Higuera at the University of Montana. Tyler received an Outstanding Student Paper Award when he presented this work at AGU 2015!

Over the past two decades the paleoecological research community has amassed dozens of sediment cores from across Alaska. These long-term records contain a range of clues about the character of ecosystems and climate that existed deep in the past. Some records extend back as far as 14,000 years, and have been used to reconstruct millennial-scale vegetation, climate and disturbance dynamics. For example, Higuera et al. (2009) identified the marked increase in biomass burning and fire frequency following the transition from forest-tundra to modern black-spruce dominated boreal forest ca. 5000-6000 years ago using cores from four lakes in the south-central Brooks Range (Figure 1).

Figure 1

Figure 1. Paleocharcoal record from a south-central Brooks Range lake presented in Higuera et al. 2009. Vertical bars in the top panel show the charcoal accumulation rate (CHAR; pieces cm⁻² year⁻¹) over the past 7,000 years, and crosses indicate peaks in CHAR that are inferred to represent local fire events. Bottom panel shows peak magnitude (pieces cm⁻² peak⁻¹) delineated by vegetation zone.

More recent work has focused in on the past two millennia to understand the sensitivity of fire regimes in the boreal forest to centennial-scale climate change, including the Medieval Climate Anomaly (MCA) and the Little Ice Age (LIA) (Kelly et al., 2013). Persisting from ca. 850-1200 A.D. (750-1100 calibrated years before present), the MCA has been proposed as a rough analog to modern and predicted climate warming. This work identified significant increases in biomass burning during the MCA based on 14 cores from the Yukon Flats region of boreal forest, likely in response to regional warming. Biomass burning tapered off before LIA cooling began, hypothesized to reflect negative feedbacks from fuel limitations (Kelly et al., 2013).

The synthesis work I presented at AGU built on the work of Kelly et al. (2013) and focused on characterizing centennial-scale variability over the past two millennia in an additional 12 published fire-history records from elsewhere in the Alaskan boreal forest and tundra (Copper River Basin, Brooks Range, Noatak River Watershed). I looked at these records as an Alaskan-wide composite (n=26) and as regional composites (n=14 in the Yukon Flats and n=4 in each of the other regions).
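
The sketch below illustrates, in Python, the general recipe for building such a composite: standardize each record (Z-scores of CHAR), average the records on a common age grid, and bootstrap over records to put a confidence band around the composite. It is an illustration of the approach only, not the analysis code used here, and `records` is a hypothetical list of (age, CHAR) arrays.

```python
# Sketch of compositing charcoal records: standardize each record (Z-score of
# CHAR), average across records on a common age grid, and bootstrap which
# records contribute to get a confidence band. Illustration only; `records`
# is a hypothetical list of (age, CHAR) array pairs with ages increasing.
import numpy as np

def composite_char(records, grid, n_boot=1000, rng=np.random.default_rng(1)):
    # Interpolate each standardized record onto the common age grid
    Z = np.array([
        np.interp(grid, age, (char - char.mean()) / char.std())
        for age, char in records
    ])
    mean = Z.mean(axis=0)
    # Resample records (lakes) with replacement to estimate composite uncertainty
    boots = np.array([
        Z[rng.integers(0, len(Z), len(Z))].mean(axis=0) for _ in range(n_boot)
    ])
    lo, hi = np.percentile(boots, [5, 95], axis=0)    # 90% confidence band
    return mean, lo, hi
```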

Figure 2

Figure 2. Maps of new and published paleocharcoal records used in the synthesis. Twenty-six paleocharcoal records were analyzed as an Alaskan-wide composite and as ecoregional composites (colored polygons). Eight new records were collected in the Kuskokwim Mountains ecoregion of interior boreal forest (dark green polygon, westernmost points) in June 2015. Wildfire burn perimeters since 1939 are shown in red.

Each of the composite records demonstrates pronounced high-frequency variability in fire activity, but long-term trends are also revealed. The Alaskan-wide composite suggests a relatively cohesive increase in biomass burning during the MCA and a reduction during the LIA, largely reflecting the influence of the Yukon Flats records (Figure 3).

Figure 3

Figure 3. Alaskan-wide composite of 26 paleocharcoal records. Top panel shows the Z-score of charcoal accumulation rate (CHAR; pieces cm⁻² year⁻¹) over the last 2500 years. Thick black line represents a 500-year mean, thin black line represents a 100-year mean, and gray band indicates bootstrapped 90% confidence intervals around the 500-year mean. Colored bars indicate the approximate persistence of the Medieval Climate Anomaly (MCA) and Little Ice Age (LIA). Bottom panel indicates the number of records contributing to the composite through time (22-26).

When considered as regional composites (Figure 4), variability across Alaska emerges. At this scale the Yukon Flats, Copper River Basin, and Noatak River Watershed all show a sensitivity to warming, with some variability in the timing of this response. However, the Brooks Range sites show little low-frequency variability during either the MCA or LIA. This suggests that vegetation feedbacks or regional-scale controls act to moderate the impact of warming on biomass burning. Alternatively, climate forcing could have been uneven across Alaska at this scale, and/or climate forcing may not have been substantial enough in some regions to elicit a response in fire regimes. The sensitivity observed in paleocharcoal records may also be a function of sample size; centennial-scale change in some regions may be subtler than can be detected with only four lake cores.

Figure 4

Figure 4. Regional composite paleocharcoal records. Time series are labeled by region (number of records). Each shows the Z-score of charcoal accumulation rate (CHAR; pieces cm⁻² year⁻¹) over the last 2500 years. Thick black line represents a 500-year mean, thin black line represents a 100-year mean, and gray band indicates bootstrapped 90% confidence intervals around the 500-year mean. Colored bars indicate the approximate persistence of the Medieval Climate Anomaly (MCA) and Little Ice Age (LIA).

Untangling the potential mechanisms driving the variability across time and space will require similar analyses with larger datasets. For my PalEON-supported MS thesis I will incorporate eight new lake-sediment records collected in 2015 to assess fire-regime sensitivity in a climatically distinct region (Figure 2). This network of lakes will lend itself to detecting the relatively short-term variability of the MCA and LIA. Spatially explicit reconstructions of climate over this time period will also help to explain the causes of variability in biomass burning and improve our understanding of the relative importance of climate and vegetation dynamics in driving fire-regime trends.

Citations
Higuera et al. 2009. Vegetation mediated the impacts of postglacial climate change on fire regimes in the south-central Brooks Range, Alaska. Ecological Monographs 79(2): 201-219

Kelly et al. 2013. Recent burning of boreal forests exceeds fire regime limits of the past 10,000 years. Proceedings of the National Academy of Sciences 110(32): 13055-13060

Science at Notre Dame

Post by Jody Peters, PalEON Program Manager

While PalEON members come from a multitude of institutions from around the US and across the globe, there are a number of individuals based at the University of Notre Dame. Here at Notre Dame we are remembering Father Ted’s impact on the growth and development of scientific research at ND.

To learn about Father Hesburgh’s influence see: http://research.nd.edu/news/64738-hesburghs-influence-on-science-at-notre-dame/

Empirically Reconstructing Biophysics with Remote Sensing Data

Post by Bethany Blakely, a graduate student with Adrian Rocha and Jason McLachlan at the University of Notre Dame

Biophysics is important . . . but only recently!

The exchange of energy between the land surface and the atmosphere (biophysics) plays a huge role in local and global climate. The polar ice-albedo feedback, where snowmelt reduces albedo and further accelerates melting, is an example that most readers will be familiar with. Vegetation is an important mediator of biophysical change (Figure 1). Changes in stature, phenology, water use, and other vegetation characteristics alter the exchange of energy between a vegetated surface and the surrounding atmosphere. Despite the potential climatic importance of these effects, they were often neglected in early efforts to understand vegetation-climate interactions. One contributor to the relative neglect of biophysics in past climate science may be its temporospatial misalignment with vegetation processes. Surface-atmosphere energy exchanges happen rapidly and at a local scale, making them hard to rectify with annual to centennial changes in landscape and climate.

Figure 1

Figure 1. Major ways in which vegetation affects surface-atmosphere exchange of energy. CRO = crops, DBF = deciduous broadleaf forests, ENF = evergreen needleleaf forests, GRA = grassland. Forested land tends to absorb more energy than non-forested land due to lower albedo but dissipates a larger portion of that energy through evaporation, limiting temperature increase (Zhao et al. 2014).

 

Remote sensing to the rescue?

Remote sensing is a useful tool for bridging the gap between biophysical and landscape processes. It offers a great deal of flexibility in scale and is well suited to represent the kinds of things that matter for biophysics. After all, remote sensing satellites literally measure outputs of energy from the land surface. Since the processed data products from these measurements are standardized at relatively fine temporospatial resolution but collected over annual to decadal periods of time, they facilitate highly data-informed generalizations about fine-scale processes.

The biophysics of the past

I’ve been spending the past two years using – okay, mostly learning to use – remote sensing data to empirically link vegetation and biophysics. The goal is to understand how vegetation change since European settlement has altered the biophysics of the land surface, and how that might affect climate.

Fig 2

Figure 2: PLS pre-settlement vegetation and MODIS modern vegetation

I used 10-year averages of MODIS remote sensing data to create regressions linking vegetation type to two important biophysical properties: albedo and surface temperature. I then projected that relationship onto the vegetation of the past. Think 30,000-color paint-by-number grid cells.
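
The sketch below shows, in Python, the general shape of this empirical approach: fit a regression of a MODIS-derived property on modern vegetation composition, then apply the fitted coefficients to pre-settlement (PLS-era) composition to hindcast that property. It is an illustration only, not the actual analysis code, and the array names are hypothetical placeholders.

```python
# Sketch of the empirical hindcasting approach: regress a biophysical property
# (e.g., 10-year mean albedo) on the fractional cover of vegetation types in each
# modern grid cell, then apply the fitted relationship to historic (PLS-era)
# vegetation fractions on the same grid. Illustration only; arrays are hypothetical.
import numpy as np

def hindcast_property(modern_frac, modern_prop, historic_frac):
    """modern_frac, historic_frac: (n_cells, n_types); modern_prop: (n_cells,)."""
    X = np.column_stack([np.ones(len(modern_frac)), modern_frac])   # intercept + composition
    coef, *_ = np.linalg.lstsq(X, modern_prop, rcond=None)          # least-squares fit
    Xh = np.column_stack([np.ones(len(historic_frac)), historic_frac])
    return Xh @ coef                                                # projected historic property

# change = modern_prop - hindcast_property(modern_frac, modern_prop, historic_frac)
# maps of such (modern - historic) differences are what Figure 3 shows
```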

Fig 3

Figure 3: Differences (modern – historic) in albedo and surface temperature

The biophysical changes in the land surface are striking. Typical albedo has increased in the winter – there are fewer trees (because of logging, urban expansion, etc.) to cover up the snow – but is mostly unchanged in the growing season. Surface temperature has decreased in winter but increased in summer. This trend is particularly interesting because it seems to suggest a loss of temperature regulation with the loss of forests; the modern surface is colder in the cold season and warmer in the warm season. For more detail on my findings, check out my recent AGU poster. Although we can’t know exactly how a centuries-gone landscape exchanged energy with its atmosphere, remote sensing data offer a way to construct a useful empirical baseline for changes in vegetation biophysics. In addition to offering its own scientific insights, this work could serve as an interesting comparison to outputs from PalEON models, a role I plan to pursue in the upcoming site-level model intercomparison project.

2015 AGU PalEON Talks & Poster Schedule

If you are going to AGU December 14-18 check out our PalEON talks and posters!

We can help fill up your week with great talks and posters, especially on Monday.

AGU 2015 Talks

 

AGU 2015 Posters

Reconstructing Multivariate Climate Using A Mechanistic Tree Ring Model

Post by John Tipton, statistics graduate student with Mevin Hooten at Colorado State University

Statistical Challenges of Paleoclimate Reconstructions
The ability to reconstruct paleoclimate from proxy data is important for understanding how climate has changed in the past and for exploring how changing climate influences ecological processes. Statistical reconstructions of paleoclimate have unique challenges because proxy data are noisy, indirect observations of climate. Thus, any statistical model must address the following challenges: change of temporal support, sparse data, and the prediction of unobserved climate variables. When reconstructing climate from tree ring widths, the change of temporal support arises because our climate data are monthly average temperature and log total precipitation, whereas tree ring growth is measured on an annual scale. Therefore, any statistical model must account for this temporal misalignment. To overcome estimation issues common in sparse data scenarios, many previous reconstructions used linear statistical methods with constraints to regress the tree ring widths onto climate. For a multivariate climate reconstruction (e.g., temperature and precipitation), predicting paleoclimate using linear regression requires inverting a many-to-one function that has potentially infinitely many solutions. Thus, multivariate climate reconstructions from univariate tree ring width time series are not commonly performed.

Mechanistic Models – A Promising Alternative
There is a need for rigorous statistical multivariate climate reconstructions, and hence we developed an alternative to linear statistical methods (Tipton et al., In Press): a mechanistic, biologically motivated model that “grows” tree rings to approximate the true growth process. By growing tree ring widths on a monthly time step, the mechanistic model aligns the monthly climate data with the annual tree ring width data. Extending the mechanistic model to allow each tree species to have a differential response to climate provides strong constraints on possible climate scenarios, ameliorating the difficulties that arise from having too many unknowns. We perform Bayesian inference to generate a probabilistic reconstruction that allows visual exploration of uncertainties. In contrast, many paleoclimate reconstructions generate point estimates that are not probabilistic in nature. The probabilistic reconstructions provide auxiliary information that can be used to determine at what time periods the reconstruction is informative. The use of a mechanistic growth model comes at a computational cost, so we fit our model using Markov chain Monte Carlo with compiled C++ code to increase computation speed.
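
As a rough illustration of what “growing” a ring on a monthly time step looks like, here is a minimal forward model in Python, in the spirit of threshold-based growth models such as VS-Lite. It is not the model of Tipton et al., and the thresholds below are hypothetical.

```python
# Minimal sketch of a mechanistic ring-growth forward model: monthly growth is the
# minimum of a temperature response and a moisture response (simple ramps between
# species-specific thresholds), and the annual ring width is proportional to the
# sum of monthly growth. Parameter values are hypothetical.
import numpy as np

def ramp(x, lo, hi):
    """Piecewise-linear response: 0 below lo, 1 above hi, linear in between."""
    return np.clip((x - lo) / (hi - lo), 0.0, 1.0)

def grow_rings(temp, precip, t_lo=4.0, t_hi=17.0, p_lo=20.0, p_hi=90.0):
    """temp, precip: (n_years, 12) arrays of monthly climate for one site."""
    g_temp = ramp(temp, t_lo, t_hi)                # monthly temperature limitation
    g_moist = ramp(precip, p_lo, p_hi)             # monthly moisture limitation
    monthly = np.minimum(g_temp, g_moist)          # growth limited by the scarcer resource
    return monthly.sum(axis=1)                     # relative annual ring widths
```

In a Bayesian reconstruction, a forward model like this sits inside the likelihood: candidate monthly temperature and precipitation series are proposed, “grown” into ring widths, and compared with the observed chronologies, which is how monthly climate and annual rings end up on a common footing.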

Reconstruction of T and P – at the Same Time!
Our motivating goal was to generate a reconstruction of spatially explicit climate (temperature and precipitation) in the Northeastern United States that can be used to better understand how vegetation patterns have changed due to both climate change and direct human activity. Our work focuses on the Hudson Valley of New York, although in future work this model framework could be extended to the entire Northeastern United States. We focus on the Hudson Valley because there have been previous efforts to reconstruct the Palmer Drought Severity Index (PDSI), a combination of temperature and precipitation, which we can directly compare to our reconstruction, exploring the benefits and costs of different modeling frameworks. Figure 1 shows our joint temperature and precipitation reconstruction, with the darkness of the shaded areas proportional to the probabilistic model predictions. For comparison, the black line in the log precipitation plot represents the previous centered and scaled PDSI reconstruction. Interestingly, there is little learning about temperature from our model (although the uncertainties are reasonable), while the log precipitation reconstruction is highly correlated (r=0.72) with the preexisting PDSI reconstruction. This result is in line with ecological expectations – drought in the Hudson Valley is strongly associated with precipitation (Pederson et al., 2015). When comparing our reconstruction to previous efforts, our method has the added benefit of providing uncertainty estimates that illuminate where in time the reconstruction is informative, without relying on statistically improper scoring rules like RE and CE commonly used in the paleoclimate literature. The use of proper scoring rules for assessing predictive ability is vital, because improper scoring rules can lead to incorrect inference about predictive skill.

Figure 1. Plot of probabilistic reconstruction of temperature and log precipitation using a mechanistic tree ring growth model. The reconstructions are shaded according to posterior predictive probabilities with the dotted lines giving the 95% credible interval. The solid black line in the log precipitation plot is a centered and scaled reconstruction of PDSI using the same data. The black lines at the far right of each reconstruction are the observational records.

What Did We Learn?
Reconstructing climate based on ecological proxies is tricky! Our simulations showed that the model we developed can reconstruct multivariate climate with great skill when tree growth responds to both temperature and precipitation. However, in the Hudson Valley, and many other temperate regions, trees respond mainly to precipitation and the bulk of tree growth occurs in a very limited range of temperatures. Thus, while the reconstruction of precipitation in these regions is both accurate and precise, the reconstruction of temperature is inherently more uncertain. The main benefit of a fully rigorous approach for obtaining the reconstructions is that the proper uncertainty can then be factored in to our scientific understanding of the climate process as well as accounted for in other modeling efforts (e.g., ecosystem computer models that depend on historic climate reconstructions).

References
Pederson, N., A.W. D’Amato, J.M. Dyer, D.R. Foster, D. Goldblum, J.L. Hart, A.E. Hessl, L.R. Iverson, S.T. Jackson, and D. Martin-Benito. (2015). Climate remains an important driver of post-European vegetation change in the eastern United States. Global Change Biology, 21(6), 2105-2110.

Tipton, J.R., M.B. Hooten, N. Pederson, M.P. Tingley, and D. Bishop. (In Press). Reconstruction of late Holocene climate based on tree growth and mechanistic hierarchical models. Environmetrics.

PalEON at ESA and JSM 2015

Post by Jody Peters, PalEON Program Manager

Next week is a big week for PalEON at two meetings, the 100th annual meeting of the Ecological Society of America (ESA) in Baltimore, Maryland, August 9-14, and the Joint Statistical Meetings (JSM) in Seattle, Washington, August 8-13.

From the contingent of PalEON-ites at JSM, Colorado State graduate student John Tipton will be giving both an invited talk and an invited poster on “A multi-scale reconstruction of bivariate paleoclimate from tree ring widths using a biologically motivated growth model.”
The poster is hosted by STATMOS from 9:30-10:15 on Sunday, August 9 and the talk is an invited ASA ENVR Student Paper Awards session from 8:30-10:20 on Tuesday, August 11.

We also have a large number of PalEON-ites that will be going to ESA. Below is the schedule of PalEON talks and posters.

It is going to be a great week of sharing the work PalEON has been doing!

ESA 2015 schedule

Models Part 3: Using Ecosystem Models to Advance Ecology

Post by Christine Rollinson, Post-doc at Boston University working with Michael Dietze.

After a bit of a break and some time back out in the woods – for the PalEON blog and myself – it’s time to wrap up our series of posts on the ecosystem modeling side of PalEON.  In the first post, we talked about why a paleoecological research group is using ecosystem models.  The second post took us behind the curtain and down the rabbit hole to describe a bit of the how of the modeling process.  Now it’s time to take a step back again and talk about what we are learning from the PalEON Model Inter-comparison Project (MIP).

Figure 1

Above is a figure showing aboveground biomass at Harvard Forest from the different PalEON MIP models. The first question most people want to know is: Which model is right (or at least closest to reality)?  Well, the fact of the matter is, we don’t really know.  We know that modern biomass at the Harvard Forest is around 100 Mg C per hectare.  Most models are in the right ballpark, but each took a very different path to get there.  When we go back in time beyond the past few decades, the data available to validate most of these models are extremely sparse.  PalEON has gathered pollen samples at Harvard Forest and other MIP sites as well as pre-settlement vegetation records for most of the northeast.  The settlement vegetation records have provided a biomass benchmark for some of our western MIP sites, but aside from that, all other pre-1700 information currently available for comparison with models is based on forest composition.

So how do we evaluate these different models when we don’t actually know what these forests were like 1,000 years ago?  In the absence of empirical benchmarks, we are left comparing the models to each other.  A typical approach would then be to look at each model and try to explain why it has a particular pattern or is different from another model.  For example, why does ED have that giant spike in the 11th century? (We’re still trying to figure that one out; see previous post on debugging.)

Figure 2: The PalEON MIP isn’t the only model comparison project to see models predicting vastly different conditions in time periods that have no data available for comparison.  This figure shows similar problems in CMIP5¹.


The purpose of a multi-model comparison doesn’t necessarily need to be identifying a single “best” or most accurate model.  As an ecologist with PalEON, my goal is to avoid dissecting individual models and instead focus on the big picture and answer questions where output from a single model would be insufficient.

I like to view the PalEON MIP in the same way I would view an empirical-based study that is trying to find cohesive ecological patterns across many field sites or ecosystems.  An incredible amount of detailed information can be gleaned from a study based in a single site or region, but these types of studies have their limitations.  Single-site studies have extraordinarily detailed information about plant physiology, ecosystem interactions, and direct effects of warming through experimentation.  I think working with a single ecosystem model is very similar to performing a single-site field study.  With one model, you can run simulations to identify potential cause-and-effect relationships, but at the end of the day, the observed response may only be an artifact of that particular model’s structure and not representative of what happens in the real world.

Working with a multi-model comparison is like working with a continental- or global-scale data set.  There’s always a temptation to dig into the story of individual sites (or in this case models).  Each site or model has its own interesting quirks, stories, and contributions to ecology that a researcher could easily use to build a career (as many have).  However, if we think about ideas that have formed the fundamentals of ecology – ideas such as natural selection, carrying capacity, or biogeography – these are ideas that apply across taxa and ecosystems.  To draw the parallel with modeling, if we want to move away from a piecemeal approach to ecosystem modeling and find synthetic trends, test ecological theories, and identify areas of greatest uncertainty, we need to draw comparisons across many models.  The PalEON MIP, with detailed output from more than 10 models on ecosystem structure and function from millennial-length runs at six sites, is the perfect model data set to look at these sorts of questions.

Figure 3

Current PalEON MIP analyses are investigating a wide range of ecological patterns and processes, such as the role of temporal scale in understanding causes of ecosystem variation, transitions in vegetation composition, and ecological resilience after extreme events.  With the PalEON MIP, we are first identifying patterns that we know exist in nature, such as concurrent shifts in climate and species composition or a given sensitivity of NPP to temperature.  We can then find out if and when these patterns occur in the models, and only then look at similarities or differences among models that might explain their behavior.  Additionally, my personal ecological research interests have led me to use these models to test conceptual ecological theories regarding the role of temporal scale in paleoecological data that have thus far been difficult to assess with empirical data (Fig. 4).

Figure 4

If there’s one point I want you to take away from this series on ecological modeling, it is this: ecosystem models are tools that let us test ecological theories in ways not possible with empirical data.  Models allow us to test our understanding of individual ecosystem processes as well as how those processes scale across space and time.  Because ecosystem models are representations of how we think the world works, modelers aren’t just programmers; modelers are ecologists too!

I want to take this opportunity to thank all of the PalEON team and collaborators, and particularly the PalEON MIP participants, for contributing runs and helping get me up to speed on ecological modeling.  Keep your eyes out at ESA next month for PalEON-related talks, including a session featuring several PalEON presentations and some actual MIP results!

  1. P. Friedlingstein et al., Uncertainties in CMIP5 climate projections due to carbon cycle feedbacks. J. Clim. 27, 511–526 (2014).

 

A Living Forest

Post by Dave Moore, Associate Professor at the School of Natural Resources and the Environment, University of Arizona

Reposted from http://djpmoore.tumblr.com/

Just a couple more photos from our walk in the woods with Prof Kerry Woods.

Since settlement of the US, Eastern Hemlock has been lost from many forests. Hemlock, once established, is a fantastic competitor and maintains its own dark, moist micro-climate beneath its branches. This effectively excludes other species from the location but allows the shade-tolerant Hemlock seedlings to thrive.

hemlock1

This particular Hemlock is growing out of the darkness… reaching out for light in a gap caused by a blowdown. The tree is taking advantage of a temporary resource that will likely disappear in the next few years.

hemlock2

Dr. Kerry Woods explains the gap dynamics of the Huron Mountain Wildlife Foundation to our group. Kerry is standing waist deep in Sugar Maple seedlings – the trees are competing with each other to close the gap and make it to the canopy – but most of them will not survive.

hemlock3

Yellow Birch needs to start out on nurse logs on the forest floor. This is the reason you often find this tree growing in straight lines in a natural forest.

hemlock4

Rachel @rachelgallerys and Kelly (ND)

hemlock5

Kelly and Ann (from ND)

hemlock6

Evan (ND) and our driver

Huron Mountain Wildlife

Post by Dave Moore, Associate Professor at the School of Natural Resources and the Environment, University of Arizona
Reposted from http://djpmoore.tumblr.com/

Kerry1

We’ve spent a few days walking in the woods in Michigan and Wisconsin.  Over the last few years the PalEON team has been trying to work out how to challenge and improve terrestrial biosphere models using long-term records of vegetation in the Northeast and Upper Midwest of the US. It’s a diverse team of scientists who use empirical measurements, statistics, and modeling approaches to explore how plants and climate have changed in tandem over the last 1-2 thousand years.

This trip was a great opportunity to get away from the models and data and stick our noses firmly in the dirt, leaves and clouds of mosquitoes of the Upper Midwest.

Kerry2

On our first day we had the pleasure of staying with Dr. Kerry Woods at the Huron Mountain Wildlife Foundation. Kerry directs the Foundation’s research efforts, which range from biodiversity studies to population genetics, community dynamics, aquatic biology, and climate. The station is set in old growth forest not far from Marquette, MI.

Kerry3

Kerry gave us a tour of the forest he’s been watching and studying for years. Kerry read the forest to us like a story – history, life history strategies, windstorms and mysteries. It was a pleasure.

Kerry4 Kerry5

Models Part 2: A Day in the Life of an Ecological Modeler

Post by Christine Rollinson, Post-doc at Boston University working with Michael Dietze.

This post is the second in a 3-part series describing the PalEON Model Inter-Comparison Project (MIP). The first post provided an overview of why we are using models to study paleoecology. This post will describe the process and series of decisions modelers make and the challenges we face when running a model.

The modeling process actually is similar to designing experiments, and as with any research project, the process begins with a testable central question or hypothesis about the way the world works.  Like field studies, model simulations can fall into two classes: 1) sensitivity analyses, which are similar to manipulative experiments in that individual model parameters, such as maximum photosynthetic capacity, are altered one at a time to determine their influence on ecosystem dynamics; and 2) observational studies, in which a given set of conditions, such as future (or in our case past) climate change, is given to the model and the resulting ecosystem dynamics are described.  The paleoecologists, field ecologists, and each PalEON model member are all using the second approach to answer the question of how climate change has affected forest ecosystems over the past millennium.  However, the MIP (Model Inter-comparison Project) team is borrowing an element of the model sensitivity analysis approach: instead of varying individual model parameters, it is comparing how ecosystems in entirely different models are influenced by past climate change.

The modeling process can be broken down into three steps: 1) preparation of inputs; 2) running of the model; and 3) analyzing the model output.  This blog post will focus on the first two steps and the third and final post will discuss what we do once the models have been run.

ModelFig2.1

Model Inputs
At the core of the PalEON MIP is the idea that, despite the myriad differences among the participating models, they are all being run with common meteorology drivers.  Meteorology drivers are a set of environmental conditions, including temperature, precipitation, solar radiation, and CO2, that vary through time and provide external forcing on the models.  In the context of scientific experimental design, the meteorology drivers are the independent variables driving the responses (the changes through time) in the models.

Developing the model inputs is not merely a technical obstacle.  Several of the PalEON MIP models simulate ecosystem dynamics on sub-hourly time scales and require sub-daily meteorology drivers for the full 1,200-year PalEON MIP temporal domain.  This posed an incredible scaling challenge that is common in ecology.  Our meteorology drivers originate with two sets of daily data from CMIP5 plus 6-hourly CRUNCEP data.  Previous PalEON postdocs over the past several years had to temporally downscale each data set to 6-hourly data and align and bias-correct these data sets spatially to provide a continuous meteorology product.
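
As a toy illustration of one piece of this (the actual PalEON workflow is far more involved, covering multiple variables, bias correction, and spatial alignment), temporal downscaling can be as simple as spreading a daily mean across sub-daily blocks using a diurnal-cycle shape borrowed from a higher-resolution reference product. Everything in the sketch below is hypothetical.

```python
# Toy temporal downscaling: spread a daily mean temperature across four 6-hourly
# values by adding a diurnal anomaly pattern (e.g., one estimated from a 6-hourly
# reference product such as CRUNCEP). Illustration only; values are hypothetical.
import numpy as np

def downscale_daily_to_6hourly(daily_mean, diurnal_anomaly):
    """daily_mean: (n_days,); diurnal_anomaly: (4,) mean departure of each
    6-hour block from the daily mean; returns an (n_days, 4) array."""
    return daily_mean[:, None] + diurnal_anomaly[None, :]

# Hypothetical example: coldest overnight, warmest in the early afternoon block
print(downscale_daily_to_6hourly(np.array([10.0, 12.5]),
                                 np.array([-4.0, -1.0, 5.0, 0.0])))
```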

Choices regarding model-specific representations of plant physiology or ecosystem processes are another set of inputs that must be carefully considered from an ecological point of view. Most models rely on broad generalizations of plant functional types that often include groupings by green period (evergreen or deciduous), leaf type (broadleaf or needle), and generic biome (tropical, temperate, boreal).  The model I’ve been working with (ED) requires a number of physiological parameters for each plant functional type, such as cold- and shade-tolerance, leaf cuticular conductance, and root respiration.  These are the types of parameters that are often “tuned” or manually adjusted to produce patterns that we ecologists deem reasonable, even if it means the values of the parameters themselves are not.  In other words, manipulation of these variables can cause “the right answer for the wrong reasons.”  However, because empirical data on many of these processes are surprisingly sparse, determining what is a “reasonable” value is not as straightforward as it may seem.

ModelFig2.2

 

Model Runs & Debugging
Once the model inputs and settings are squared away, it’s time to actually run the model and let its internal processes work themselves out.  When I joined PalEON, I had naively assumed that because many of these models have been in use for decades, all of the major kinks and “bugs” would have been ironed out and they would be able to easily handle multi-centennial simulations.  What I didn’t realize is that all of these models are still rapidly growing, and there are many technical and conceptual bugs to be fixed.  Some bugs have existed for years but don’t become a problem except in very rare situations. For example, this spring we discovered a slight bug with temperature and light thresholds for phenology that could cause all deciduous vegetation to die if a certain pattern existed in late winter temperatures. Since the PalEON temporal domain is much longer than most other simulation attempts, based on pure probability we are more likely to generate and experience these rare exceptions that cause the model to break.  Other more challenging bugs are those where everything looks reasonable over the course of a few years or decades, but slight drifts over time can result in some very strange and difficult-to-explain patterns.  While some bugs are simple typos or oversights in the code, some turn out to be conceptual flaws with how ecology is represented in the model.  These bugs in particular illustrate why model- and field-based ecologists must work together.  Ecologists are needed not only to help identify the bugs, but also to propose reasonable solutions or alternatives for the models.

Model Outputs
Just like field studies, the scientific process isn’t over once the data, or output in the case of models, are produced.  For me, analyzing the output to determine what’s actually happening in an ecosystem is where the fun begins.  One might think that because ecosystem models are all mathematically structured, linking cause and effect would be easy.  However, the complexity of relationships and feedbacks within each model means that disentangling relationships between multiple climatic drivers, disturbance, and observed responses can be extraordinarily challenging and full of surprises.

These surprises will be the topic of the next post that will talk more about how the PalEON MIP is being used to answer classic questions in ecology.

Models Part 1: The PalEON Model Inter-Comparison Project Comes to Life

Post by Christine Rollinson, Post-doc at Boston University working with Michael Dietze.

While writing what was intended as a single post about ecological modeling and PalEON, I discovered I had a lot more to say on the topic than I initially thought.  In the past year, I have moved from being an ecologist who spends every summer in the field (at a minimum) to one now mired in the world of models and computation.  I now see myself as something of a modeling ambassador whose goal is to demystify the modeling process and increase communication and collaboration between modeling and field-based ecologists.

This post is the first in a series on the ecological modeling side of PalEON and will focus on describing modeling in general and how it fits into the PalEON goals. Subsequent posts will discuss the modeling process in greater detail and how scientists like myself use these models to answer fundamental ecological questions.

ModelFig1

PalEON is collecting and synthesizing data from an incredible array of sources including tree rings, pollen, and historical records.  Although we have taken great care to geographically co-locate as much data collection as possible, these data contain information about very different aspects of ecosystems and at very different points in time.  How can we combine information on centennial-scale changes in species composition from pollen records with sub-minute carbon fluxes from flux towers?  The answer is through models.

Before joining the PalEON modeling team, I, like many other field ecologists, assumed that because terrestrial ecosystem models have been used for a few decades now, the MIP (Model Inter-comparison Project) would merely be a matter of plugging in some meteorology drivers and then analyzing surprises in the output.  Oh how wrong I was!

Ecological Modeling 101
Whether we think about it or not, almost all scientists use models in one form or another.  The ANOVAs and simple linear regressions we use to statistically analyze our data are models.  The terrestrial ecosystem models being used by the PalEON modeling group are essentially dozens, or sometimes hundreds, of these simple models combined.  Ecosystem modelers think about these complex models as scaffolds that let us relate physiological mechanisms for change (such as photosynthetic response to temperature) to coarser long-term patterns (such as changes in plant community composition).

As any ecologist will readily tell you, there is a lot about ecosystems that we don’t fully understand yet.  There are certain physiological processes, such as photosynthesis and respiration, that act as the building blocks for how terrestrial plant ecosystems function.  However, there are still competing methods of characterizing those building blocks and linking them together.  These different representations of how ecosystems function have led to the creation of multiple ecosystem simulation models¹.  Rather than limit itself to a single model and theory of how ecosystems function, PalEON is working with an international group of researchers to explore how as many models as possible predict ecosystem responses to past environmental variability.

Modeling Past Ecosystem Change
ModelFig2

Most of the media attention that discusses ecosystem models focuses on predictions of ecosystem responses to climate change over the next century.  However, many of these models were built, parameterized, and tuned to work well for current conditions over a few decades, at most.  As scientists, we’ve all been warned about the dangers of extrapolating beyond the range for which we have data (Fig. 2).  With PalEON, we are working around this issue by simulating forest responses to past climate change from 850 A.D. through the present. Even though the empirical data for this period are temporally and spatially sparse, they are still better than the literal nothing available for models of the future.

One of the most eye-opening experiences for me has been the discovery that many ecosystem models actually can’t run for hundreds of years at a time.  Problems that prevent the models from running range from memory and computation limitations to simulated thermodynamics that produce physically impossible states.  More on the challenges of long-term ecological modeling in the next installment: A day in the life of an ecological modeler.  (Spoiler alert: it involves lots of trial and error, communication, and XKCD cartoons.)


¹Here I specify simulation model to indicate a class of process-based models that can be given different scenarios (such as climatic conditions) to predict a range of ecosystem states.  This contrasts with statistical models, which are typically more descriptive in nature and lack explicit representation of processes such as photosynthesis or disturbance.


Edge of the Prairie

Posted by Jody Peters, PalEON Program Manager

Prior to major land use changes, the tall grass prairie was a widespread ecotone in North America extending east into Minnesota, Illinois and Indiana.  Caitlin Broderick, a University of Notre Dame undergraduate working in the McLachlan lab, wanted to learn more about the edge of the prairie this spring semester.  She could have listened to NPR’s Garrison Keillor describe life in Lake Wobegon on the edge of the prairie in Minnesota.  But given that Notre Dame has been compiling historical records of trees from Indiana and Illinois, she decided to look a bit closer to home and focused on townships that are within the Yellow River watershed.  Coincidentally, this watershed is split almost in two by what is currently US 31 and what was historically known in the Public Land Survey (PLS) notes as the Michigan Road Lands (Fig 1).  The following are some of the results Caitlin presented at the University of Notre Dame College of Science Joint Annual Spring Meeting.
Prairie_Fig1

Prairie_Fig2
Currently, the Yellow River region is dominated by agriculture and deciduous forests (Fig 2).  However, a 1935 depiction of the extent of the prairie prior to Euro-American settlement shows prairie ecosystems extending into northwest Indiana, including the Yellow River region (Fig 3; Transeau 1935).  Caitlin explored what the edge of the prairie in the Yellow River region looked like during 1829-1837 using historical forest data obtained from the PLS.
Prairie_Fig3
The survey notes provide the identification of 1-2 trees, their diameters, and their distances from corner posts set every mile in each township (n=30 townships; 1055 corners). For this study, trees at each corner were categorized as “Oak” (only oaks were present), “Other” (22 non-oak tree taxa, dominated by beech, ash, maple and elms), or “Oak + Other” (a combination of oak and non-oak trees).  There were also corners categorized as “Water” (in lakes, rivers, creeks, etc.), “No Tree” (a post set in a mound but with no trees nearby), or “No Data” (no information provided in the notes).

Caitlin used Arc GIS to map the tree composition at each corner (Fig 4).  Much to our surprise, she did not see many corners with no trees in the areas of Transeau’s prairie in the Yellow River region.  However, there was a striking pattern in the distribution of the trees, with the majority of the trees to the west of the Michigan Road Lands being oaks and those to the east being other hardwoods.

Prairie_Fig4

In addition to examining tree composition, Caitlin wanted to look at the structure of the trees and the physical environment across the region. To better define the two groups of trees (Oaks to the west and Other hardwoods to the east), Caitlin created buffers around individual corners and dissolved buffers with matching tree classifications into two groups (Fig 5A). Comparing the two groups, Caitlin found no difference in tree diameter (mean (cm) ± se; Oak: 40.9 ± 0.5; Other: 40.2 ± 0.6). However, the trees in the Oak group were significantly farther from the corner posts than those in the Other group (mean (m) ± se; Oak: 36.2 ± 4.3; Other: 13.6 ± 2.4; p<0.001).  Given the differences in tree composition and distance from the corners, Caitlin looked at climatic and physical features that could help explain the contrast between the oaks of the west and the other hardwoods of the east.  There was no ecologically meaningful difference in either temperature or precipitation (mean ± se; Temperature (ºC): Oak: 9.88 ± 0.003; Other: 9.89 ± 0.003; Precipitation (mm): Oak: 1010 ± 0.59; Other: 1007 ± 0.74).  However, although northern Indiana is quite flat, there was a striking difference in elevation across the Yellow River region that mirrored the change in tree distribution.  Trees in the Oak group were at significantly lower elevation than trees in the Other group (mean (m) ± se; Oak: 225.5 ± 0.7, Other: 251.0 ± 0.6; p<0.01; Fig 5B).

Prairie_Fig5ab
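
For readers curious about what this kind of two-group comparison looks like in practice, here is a hedged sketch in R with simulated distances; all numbers below are invented, and the original analysis may well have used different tests or software.

set.seed(42)
oak_dist   <- rlnorm(500, meanlog = log(30), sdlog = 1)   # hypothetical Oak-corner distances (m)
other_dist <- rlnorm(500, meanlog = log(13), sdlog = 1)   # hypothetical Other-corner distances (m)
t.test(oak_dist, other_dist)        # compare mean distance to the corner post
wilcox.test(oak_dist, other_dist)   # rank-based alternative, useful for skewed distances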

Prairie_Fig6
In addition to exploring what the historical prairie-forest boundary looked like, Caitlin compared the two groups using the National Commodity Crop Productivity Index (NCCPI) to see whether the physical factors that contributed to the differences in historical tree composition and structure have a lasting impact on current crop production.  The NCCPI models the ability of soils, landscapes, and climates to foster non-irrigated commodity crop productivity (Dobos et al. 2012). Caitlin found that there was, in fact, a difference between the NCCPI of the two groups, with crop productivity being significantly lower in the western area historically dominated by low-density Oak tree communities (Fig 6).

Typically when people think of prairies, they think of open expanses of grass.  However, over the spring semester Caitlin found that in the Yellow River region of northern Indiana, the prairie described by Transeau actually looked more like a low-density oak community than the closed forests farther east.  At first glance we can no longer see the prairie’s edge in the current vegetation of northern Indiana (Fig 2). But when Caitlin looked closer, the factors that once controlled the boundary between the closed forests to the east and the open savannas to the west may still be at work, limiting crop production in what was once savanna relative to the historically wooded uplands (exactly what those factors are remains an open question).

As a fun follow-up to Caitlin’s research, nine of us took a lab canoe trip down the Yellow River.  Although we didn’t see the same striking pattern of oaks to the west and other hardwoods to the east, as we drove the 50 minutes south on US 31 to get to the river, we did see small stands of hardwoods intermingled with a number of lone oaks standing in the middle of corn and soybean fields.  On our way to the ice cream store after our canoe trip, we also saw some of the lasting impacts of the oak savannas as we drove through Burr Oak, Indiana.  According to McDonald (1908), the village plat was filed in 1882 and was “nearly in the center of what is known as the ‘Burr Oak Flats’, which is as beautiful and productive a region as can be found anywhere.”

The trip down the Yellow River was a wonderful way to get outside on a beautiful spring day and experience firsthand the watershed/ecosystem that Caitlin has spent this spring studying in the lab.  The river was an easy enough paddle that novice canoers would be comfortable, but had enough downed trees to provide excitement for the more experienced canoer.  If you are ever in the area, connect with the Little Bit Canoe Rental. They do a wonderful job providing canoes and transport.  Here are a couple of pictures of the fun!

Prairie_Fig7


References

Dobos, R.R., H.R. Sinclair, Jr., and M.P. Robotham. 2012. User Guide for the National Commodity Crop Productivity Index (NCCPI), Version 2.0.

McDonald, D. 1908. A Twentieth Century History of Marshall County, Indiana, Volume 1. Lewis Publishing Company, p. 134.

Transeau, E. N. 1935. The Prairie Peninsula. Ecology 16(3): 423-437.

Pollen Dispersal II: Quantitative Reconstructions

Post by Kevin Burke, Graduate Student at the University of Wisconsin, Madison working with Jack Williams.  

Since Lennart von Post presented the first modern pollen diagram in 1916, the palynological community has been working to better understand the relationship between pollen abundances recovered from sediments and the vegetation that produces this pollen (1,2). Throughout the twentieth century, great strides have been made in quantitatively reconstructing past vegetation from relative pollen percentages. Mechanistic transport models by Colin Prentice and Shinya Sugita use taxonomic abundance and pollen productivity to estimate pollen source area, while more recent work by PalEON member Andria Dawson studies this relationship using sophisticated Bayesian hierarchical models (3,4). However, these approaches all tend to rely on simplifying assumptions, including constant, uniform wind speeds and a circular source area. Our work seeks to challenge and improve upon some of these underlying assumptions by developing a pollen transport model that includes realistic, variable wind speeds and directions.  Moreover, we are using the historical estimates of vegetation developed by the PalEON project.
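
As a rough illustration of the logic behind these source-area models (not the actual Prentice/Sugita equations, and not our new transport model), here is a toy R sketch in which the pollen loading at a lake is a productivity-weighted, distance-weighted sum over trees on the landscape; every grid, kernel, and parameter value below is hypothetical.

set.seed(7)
n_cells <- 2500
dist_km <- sqrt(runif(n_cells, 0, 50^2))      # distance of each grid cell from the lake (km)
trees   <- rpois(n_cells, lambda = 40)        # hypothetical tree counts per grid cell
alpha   <- 1.2                                # hypothetical relative pollen productivity
kernel  <- exp(-dist_km / 5) / 5              # simple isotropic exponential dispersal kernel
sum(alpha * trees * kernel)                   # expected pollen loading (arbitrary units)

Replacing that fixed, isotropic kernel with one driven by measured wind speeds and directions and by taxon-specific settling velocities is, in essence, what the improved transport model described below aims to do.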

Using North American Regional Reanalysis (NARR) wind data, a vegetation dataset of pre-Euro-American-settlement tree composition, fossil pollen records from the Neotoma Paleoecology Database, and taxon-specific measurements of settling velocities for pollen grains, we are producing improved estimates of pollen loading (in grains per square meter) for lakes across the Prairie-Forest ecotone in the upper Midwest.  This work recently won an Outstanding Student Paper Award at the 2014 Fall Meeting of the American Geophysical Union.

Fig1

While the source area is largely governed by the presence or absence of taxa in the vegetation dataset, things become much more interesting when looking at the ratio of the relative proportion of pollen loading to the relative proportion of trees per grid cell. We see that some taxa are much better represented in the pollen record than others, given the number of trees on the landscape.

Fig2

We can also see that whether a given taxon is over- or underrepresented varies spatially, based on distance to lake and the other taxa present in a given grid cell.  These results shed light on the potential magnitude of a regional pollen source area, and help elucidate the relationship between pollen in a sediment core and trees on the landscape.  Ultimately, this kind of work can be used to build more accurate reconstructions of past vegetation from fossil pollen records.  From there, these paleovegetation reconstructions can be used to better constrain and improve terrestrial ecosystem models – a central goal of the PalEON project.

To see more results, including preliminary comparisons to pollen counts from sediment cores, you can check out our American Geophysical Union poster here.

References:

  1. Fries, M. Review of Palaeobotany and Palynology, 1967.
  2. Marquer et al. Quaternary Science Reviews, 2014.
  3. Prentice, C. Quaternary Research, 1985.
  4. Sugita, S. Quaternary Research, 1993.

2014 AGU PalEON Talks & Poster Schedule

If you are going to be at AGU next week, Dec 15-19, be sure to see our PalEON-related presentations and posters.  We can help fill up your schedule all week, as we have people presenting every day except Wednesday!

There are two PalEON-related sessions being led by people in our group.

1. B24B Constraining Ecosystem Carbon Uptake and Long-Term Storage Using Models and Data II
Tuesday, December 16, 4:00 – 6:00 PM in Moscone West, 2002
Conveners: David Moore, Valerie Trouet, Ankur Desai, Michael Dietze

2. PP44B Understanding Uncertainties in Paleoclimate and Paleoecology: Age Models, Proxy Processes, and Beyond II
Thursday, December 18, 4:00 – 6:00 PM in Moscone West, 2010
Conveners: Connor Nolan, John Williams, Lorraine Lisiecki, Deborah Khider

The following is the schedule of PalEON presentations with the title, presenter, and session number.  Presentations are color coded by location.

PalEON presentation & poster schedule for AGU 2014. Presentations are color coded by location.

In case you are still looking for ways to fill up your time at AGU, we have a number of PalEON participants who will be discussing other work they are involved with.  Drop by to see the range of projects PalEON-ites are up to.

Other, non-PalEON presentations & posters given by PalEON members. Presentations are color coded by location.


In a New Light

Post by Neil Pederson, Senior Ecologist at Harvard Forest
Reposted from The BroadLeaf Papers

We all love the colors of autumn. Fall brings to mind the vivid reds, oranges, yellows, and deep purples of September and October. By November in the Northeast, the leaves are gone and the sky often tilts into various shades of pale grey. The weather can be bone-chilling in a damp kind of way. It can be a bad time to be in the field. November in New England was the closest I’ve ever been to hypothermia. I now relish fieldwork in November, however, because of its light. November Light has helped me see things in a new way.

Filtered Light. Photo: N. Pederson

The first time I experienced the long-lasting glow of November Light was late in my dissertation field campaign. I recently had some great luck with a collection from an old-growth forest and wanted to see if I could squeeze out a few more diamonds before I called it a dissertation (and work would be lit by fluorescent light).

Kevin was my most reliable volunteer field assistant. I could call at a moment’s notice to see if he wanted to hit the field. He always said yes.

We bolted to southeast Pennsylvania and the weather was on our side. Blue skies and warm temperatures. We scoured this tiny patch of old forest to see if I had missed much on a prior trip. Soon after a brief lunch, it became apparent that we had done about all that was possible in that forest and we were slipping into lazy. So, we leaned back, chatted, and stared at the vernal roof.

At some point I kept checking the time on my GPS. My eyes kept telling me it was getting late. In reality, it was just approaching mid-afternoon. It dawned on me that the angle of the Earth in that part of the Northern Hemisphere was delivering us an everlasting gobstopper dose of diffuse light. It felt like “sunset” above the Arctic Circle during summer. The light was low and hitting at all kinds of slanted angles. Colors glowed. It was glorious.

At the same time, it dawned on us that we were south of the last glaciation. Elk, woolly mammoth, and other megafauna likely used the game trails we were using that day more than 100,000 years ago. More glory.

Bronzed Canopy. Photo: N. Pederson

Just this past week we rolled up our field tapes for the last time during the 2014 PalEON season. It was a glorious feeling. Putting in ecological plots for tree-ring analysis is long and rather repetitive work. It is exhausting in a deeply different way than reconstructing climate from tree rings. It was nice to know we had done a ton of work this year and that we were done. I imagine farmers get these feelings this time of year, too. Dropping a coiled field tape into a backpack was instantly satisfying. We were putting our loyal field equipment down for a long winter’s rest.

Dan Bishop and Javier Martin Fernandez sampling in November Light. Photo: N. Pederson

We scheduled this last field campaign more than a month in advance. It is risky scheduling that far in advance in central New England this late in the year. But, after a Nor’Easter and a cold snap, the atmosphere shifted in our favor.

Blue skies. Brilliant Fagaceae colors. Stark contrast of a wide range of brightly-lit yellow leaves with the dark bark of red oak….and black oak?

While installing ‘nests’ around an older plot, or as a rather poetic colleague termed it, installing ‘doughnuts’, I ‘discovered’ a new species. Of course, black oak (Quercus velutina) was always there. It was just not talked about as much and, being hard to identify and often hybridizing with northern red oak (Quercus rubra), it is often put in the red oak category. But, there it was, right in front of our eyes.

Black oak reaching for the upper canopy among towering northern red oaks. Photo: N. Pederson

Black oak bark. Photo: N. Pederson

Perhaps it was the November Light that made it ‘appear’? Maybe it was the showering of diffuse, angled light that made black oak jump out of the forest. Whatever it was, I now saw black oak everywhere. It wasn’t, of course. It was often red oak borrowing some of the velutinous traits of its sleeker, rarer cousin.

The glorious nature of November returned this past week and I saw these forests in a new light.

Canopy dominant red oak. Photo: N. Pederson

Classic red oak bark, bronzed. Photo: N. Pederson

Oh there are some conifers in this forest (Pinus strobus). Photo: N. Pederson

Glowing American beech (Fagus grandifolia). Photo: N. Pederson

American beech. Photo: N. Pederson

It comes in bronze, too. Photo: N. Pederson

Fading Light. Photo: N. Pederson

Late November Light. Photo: N. Pederson

Pollen Dispersal I: Why We Get Sediment Pollen

Post by Andria Dawson, Post-Doc at University of California-Berkeley

Pollen and seed dispersal are important reproductive processes in plants and, in part, determine the abundance and extent of a species. With a recent push to understand how species will respond to global climate change, dispersal ecology has gained increasing interest. We really want to know whether dispersal is limiting, or whether species can migrate quickly enough to maintain survival in a changing environment. Addressing this question presents challenges, many of which arise from trying to understand and quantify dispersal in ecosystems that are changing in both space and time (Robledo-Arnuncio 2014).
Pollen
In PalEON, one of our efforts involves estimating the past relative abundance of vegetation from fossil pollen records. To do this, we need to quantify the pollen-vegetation relationship, which includes modelling pollen dispersal processes.

For many trees, including those we study in the PalEON domain, pollination is exclusively (or at least predominantly) anemophilous (carried out by wind). In angiosperms, wind pollination is thought to have evolved as an alternative means of reproduction for use when animal pollinators are scarce (Cox 1991). It was previously understood that wind pollination was less efficient than insect pollination (Ackerman 2000), but Friedman and Barrett (2009) showed that this may not be the case. To estimate the efficiency of wind pollination, they compared pollen captured by stigmas to pollen produced, and found that the mean pollen capture was about 0.32%, which is similar to estimates of animal pollination efficiency. Although both dispersal vectors are comparable with respect to efficiency, these quantities indicate that both are still pretty inefficient – and this is great news for paleoecologists! Some of the pollen that is not captured ends up in the sediment.

Now we know that we expect to find pollen in the sediment, but how do we begin to quantify how much pollen of any type we expect to find at a given location? The route a pollen grain takes to arrive at its final location is governed by atmospheric flow dynamics (among other things). These dynamics are complicated by landscape heterogeneity and climate, and differ among and within individuals because not all pollen grains are created equal. However, we usually aren’t as interested in the route as we are in the final location – in particular, we want to know the dispersal location relative to the source location. The distribution of dispersal locations from a source defines a dispersal kernel, which can be empirically estimated with dispersal location data. Often these kernels are modelled using parametric distributions, usually isotropic, and often stationary with respect to both space and time. Are these approximations adequate? If so, at what scale? These are some of the questions we hope to address by using Bayesian hierarchical modelling to quantify the pollen-vegetation relationship in the Upper Midwest.
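
To make the kernel idea concrete, here is a minimal R sketch, using simulated distances and an assumed isotropic exponential kernel, of what “empirically estimating a dispersal kernel” can look like under the simplifying assumptions mentioned above; nothing here comes from our actual data or model.

set.seed(3)
true_scale <- 2.5                               # hypothetical mean dispersal distance (km)
d <- rexp(1000, rate = 1 / true_scale)          # simulated dispersal distances from a source
negloglik <- function(log_scale) -sum(dexp(d, rate = 1 / exp(log_scale), log = TRUE))
fit <- optimize(negloglik, interval = c(-5, 5)) # maximum-likelihood fit of the kernel scale
exp(fit$minimum)                                # estimated scale, close to true_scale

Relaxing isotropy and stationarity amounts to letting this scale (and a directional component) vary with location, time, and wind, which is where the hierarchical Bayesian machinery comes in.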

References
1. Robledo-Arnuncio, JJ, et al. Movement Ecology, 2014.
2. Cox, PA. Philosophical Transactions of the Royal Society B: Biological Sciences, 1991.
3. Friedman, J, Barrett SCH. Annals of Botany, 2009.
4. Ackerman, JD. Plant Systematics and Evolution, 2000.

Underwater In New England

Post by Bryan Shuman, Associate Professor of Geology & Geophysics at the University of Wyoming

To evaluate how forests have responded to climate change in the past, we need to reconstruct the climate history. Fortunately, in terms of moisture, lakes provide a geological gauge of precipitation (P) minus evapotranspiration (ET). As effective moisture (P-ET) changes, the water tables and lake surfaces rise and fall in elevation. When this happens, sands and other materials that typically accumulate near the shore of a lake are either moved deeper into the lake during low water or shift out from the lake’s center as water levels rise. Ongoing work in New England is building on existing datasets to provide a detailed picture of the multi-century trends in effective moisture. Here are a few highlights of recent progress.

First – the fun part – was fieldwork that I conducted while on sabbatical in New England. The work included a cold but fun day on the ice of Twin Pond in central Vermont with Laurie Grigg and students from Norwich University (pictured).

Coring at Twin Ponds

This trip was a follow up to a previous trip that coincided with Hurricane Sandy’s visit to New England in 2012. As the result of both trips, we now have a series of three cores that record shoreline fluctuations at the pond. Because the sediment contains both carbonate minerals and organic compounds, we have also been able to examine the ratios of oxygen and hydrogen isotopes in the sediment and provide some constraints on the temperature history too.

Ice makes coring easy (it’s stable), but the swimming was not as good as in the summer when I worked in southern New England with Wyatt Oswald (Emerson College), Elaine Doughty (Harvard Forest), and one of Harvard Forest’s REU students, Maria Orbay-Cerrato. Over several days, we collected new cores that record the Holocene water-level changes at West Side Pond in Goshen, Connecticut, and Green Pond, near Montague, Massachusetts. Floating on a pair of canoes, we enjoyed the early summer sun, told jokes, ate delightful snacks brought from home by Wyatt, and strained our muscles to pull about 5 cores out of each lake. Near shore, the cores from both lakes contained alternating layers of sand and mud consistent with fluctuating water levels. In the lake center at West Side Pond, we also obtained two overlapping cores about 14 m long, which promise to provide a detailed pollen record. Both lakes proved to be excellent swimming holes too!

Second, on a more earnest note, the existing geological records of lake-level change from Massachusetts have been synthesized in a recent (2014) paper in Geophysical Research Letters by Paige Newby et al. The figure shown here summarizes the results and compares the reconstructions with the pollen-inferred deviation from modern annual precipitation levels from a paper by University of Wyoming graduate student, Jeremiah Marsicek, last year (2013) in Quaternary Science Reviews.

Figure 4 from Newby et al. 2014

Figure 4 from Newby et al. 2014

All of the records show a long-term rise in effective moisture since >7000 years ago as well as meaningful multi-century deviations. By accounting for the age uncertainties from the reconstructions, we were able to show that a series of 100-800 year long droughts at 4200-3900, 2900-2100, and 1300-1200 years before AD 1950 affected lake levels (blue curves with reconstruction uncertainty shown) on Cape Cod (Deep Pond), the coastal Plymouth area (New Long Pond), and the inland Berkshire Hills (Davis Pond) – as well as the forest composition as recorded by the pollen from Deep Pond (red line). Interestingly, an earlier drought in the Berkshires at 5700-4900 years ago was out of phase with high water recorded in the eastern lakes. This difference is one of the motivations for the new work in Vermont, Connecticut and central Massachusetts, as well as other ongoing work with Connor Nolan in central Maine: what are the spatial patterns of drought?

The Magic of Science is its Complexity

Post by Lizzy Hare, sociocultural PhD student at the University of California, Santa Cruz. Her dissertation research is on the contributions of paleosciences to the development of forecast models that could be used for policy and management. PalEON is one component of her dissertation.

As an anthropologist of science, my goal is to try to discourage obsolete and idealized views of science through the development of more open and realistic accounts. My training is in cultural anthropology, a sub-discipline that has traditionally worked through the medium of ethnography – descriptive accounts of the customs and practices of people and cultures.

Credit: Climate Change Encyclopedia

In my research, I am learning about the process of producing scientific knowledge about climate change adaptation so that I may write about this for a general audience. I hope to share the daily practices, the complexities, the passions, the concerns, as well as the monotony, the frustration and the many absolutely mundane decisions that go on “behind the scenes” of knowledge production, so that we can move beyond idealized understandings of science that have caused political trouble around issues pertaining to climate change adaptation.

There is a great body of information that could aid policymakers and land managers in developing climate change adaptation strategies, but the political climate is such that the issue is avoided, as if ignoring it will make it go away, or is discussed obliquely, as in Florida governor Rick Scott’s recent efforts to begin to address the consequences of climate change without mentioning its causes.

Credit: Rice University/Photos.com

Part of the problem is declining trust in science by those who identify as politically conservative or moderate (Gauchat 2012). But thinking of the issue as simply a matter of conservatives versus liberals is a gross oversimplification. This is a part of the same cultural phenomenon that has led educated, high-income (and generally politically liberal) mothers to opt out of childhood vaccination schedules (Reich 2014). Both climate change skeptics and anti-vaxxers eschew scientific consensus, favoring instead the right to individual freedom to weigh evidence and make independent decisions. Adherents of this position see their method of knowledge acquisition (through shared first-hand accounts, anecdotal evidence, and sometimes even religious texts) to be equivalent to that produced in mainstream science.
Further, because it has become such a politically contentious topic, the polemics on both sides can make it difficult to take seriously those with dissenting opinions.  Politicians with largely anti-science constituencies, such as Senator Coburn, have found it politically advantageous to scrutinize science in general and the NSF in particular, and the findings of his report were mocked in the conservative-serving media. On the other side of the spectrum, Michael Mann’s pugnacious and often condescending public persona demonstrates an utter lack of interest in the reasons why people choose not to follow science.

Anthropologist of science Myanna Lahsen and self-described conservative journalist Pascal-Emmanuel Gobry have written about how the public’s idealized perception of science contributes to the contention. According to Lahsen (2013) and Gobry, the public generally believes that science should be an objective broker of truth, independent of culture and politics. Members of the public are thus understandably confused, frustrated, and skeptical when scientific findings emphasize uncertainty, change from year to year, or get pulled into heated political debates. After all, if scientists are brokering in absolute Truth, a scientific finding would be the final word on a matter. Lahsen, following anthropologist Christopher Toumey (1996), calls the belief in an idealized science “scientific fundamentalism”. Gobry simply and more bluntly calls it a “botched” understanding of science.

Whether you see the idealized understanding of science as fundamentalist or flawed is relatively inconsequential. Of more immediate concern are the unrealistic expectations that these idealized understandings place on scientific findings. Not only do they discount the tremendous amount of work that goes into continually adjusting, refining, and occasionally revolutionizing scientific knowledge, but they also set up an expectation that science should be wholly without cultural or political influence. This is something that science simply cannot do. Lahsen (2013) shows how unrealistic expectations of science made the Climategate controversy more problematic than it ought to have been, because the “troubling” material that the hackers found in the Climatic Research Unit’s emails can only be considered problematic if there is an a priori assumption that science does not include making subjective decisions about data, analysis, and findings. Following Climategate, contrarian interpretations of climate change gained support in the United States, because it produced evidence that mainstream science is “flawed” by politics and therefore cannot produce Truth (capital T). If this is the case, so the logic goes, then other politically-motivated interpretations of science must be equally valid. (A great example of this kind of interpretative symmetry can be found in the Heartland Institute’s response to the article by Jankó et al. (2014) that PalEON member Simon Goring described in his blog posts here). 

To try to put an end to this argument, the belief that science is a perfect, objective, apolitical, knowledge-producing machine needs to be laid to rest. In its place we need a narrative that explains the awesome, dynamic complexity of science in practice. There is absolutely no reason to exaggerate or make science appear magical; the history of science is impressive enough on its own. After all, science has made possible unprecedented advances in knowledge, technology and quality of life.

References Cited:

Gauchat, G. (2012). Politicization of Science in the Public Sphere: A Study of Public Trust in the United States, 1974-2010. American Sociological Review 77(2):167-187.
Lahsen, M. (2013). Climategate: The role of the Social Sciences. Climatic Change 119:547-558.
Reich, J. (2014). Neoliberal Mothering and Vaccine Refusal: Imagined Gated Communities and the Privilege of Choice. Gender & Society 28(5):679-704.
Toumey, C. (1996). Conjuring Science: Scientific Symbols and Cultural Meanings in American Life. Rutgers University Press: New Brunswick.

Big process, small data: Reconstructing climate from historical U.S. fort data

Post by John Tipton, statistics graduate student with Mevin Hooten at Colorado State University, about work John and Mevin are doing with Jack Williams and Simon Goring.

“Big data” has very rapidly become a popular topic. What are big data? The concept of big data in statistics is the analysis of very large datasets with the goal of obtaining inference in a reasonable time frame. The paleoclimate world often has the opposite problem: taking small amounts of data and expanding them to produce a spatially and temporally rich result while accounting for uncertainty. How do you take a handful of temperature observations and predict a temperature surface at 20,000 locations for a period of 73 years in the past? Perhaps some of the techniques used in big data analysis can help.

Figure 1. Four representative years of temperature records (ºC) from the historical fort network.

The U.S. fort data consist of temperature records from military forts in the Upper Midwest region of the United States from 1820-1893. A subset of these instrumental temperature records (Figure 1) illustrates the sparse nature of the historical U.S. fort data relative to the spatial area of interest, especially in the earlier two years (1831 and 1847). From the small set of temperature observations collected each year, we seek to reconstruct average July temperature at a fine grid of 20,000 prediction locations. Techniques such as optimal spatial prediction, dimension reduction, and regularization allow us to provide formal statistical inference for this very large underlying process using a relatively small set of observational data.

To ameliorate the sparsity of the fort data, we used patterns from recent temperature fields (i.e., PRISM products) as predictor variables in a Bayesian hierarchical empirical orthogonal function regression that includes a correlated spatial random effect. A strength of this modeling technique is that the primary patterns of temperature should remain stable even though their magnitude might change (e.g., it will always be cooler in the north than in the south). Another characteristic of this methodology is that it allows for localized differences in prediction to arise through the correlated spatial random effect. The correlated spatial random effect is too computationally expensive to calculate using traditional methods, so the effect is estimated using big data techniques. Specifically, any remaining correlation that ties the fort locations together beyond that predicted by combinations of the primary temperature patterns is approximated in a lower-dimensional space. This greatly reduces the computational effort needed to fit the model. We also employ a model selection technique called regularization to borrow strength from years with more data. This results in predictions that are close to the historical mean when there are few observations in a given year, while allowing for more detailed predictions in years with more data. To make the model selection computationally feasible, we fit the model in a highly parallelized, high-performance cluster computing environment.
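
As a rough, self-contained illustration of the two main ingredients described above (spatial patterns from modern fields used as predictors, plus shrinkage so that sparse years collapse toward the mean), here is a toy R sketch with fake data and simple ridge regularization; it is a caricature of the idea, not the authors’ hierarchical model.

set.seed(11)
n_loc <- 400; n_modern_yrs <- 30
modern <- matrix(rnorm(n_loc * n_modern_yrs), n_loc, n_modern_yrs)  # stand-in for modern fields
eof    <- prcomp(t(modern))$rotation[, 1:5]        # leading spatial patterns (EOFs)
obs_id <- sample(n_loc, 12)                        # a sparse "fort year": 12 stations observed
y      <- 20 + eof[obs_id, ] %*% c(3, -1, 0.5, 0, 0) + rnorm(12, sd = 0.5)
X      <- eof[obs_id, ]
lambda <- 1                                        # ridge penalty (the regularization)
beta   <- solve(crossprod(X) + lambda * diag(5), crossprod(X, y - mean(y)))
pred   <- mean(y) + eof %*% beta                   # predicted surface at all 400 locations

With few observations the penalty pulls beta toward zero and pred stays near the mean field; with many observations the data dominate and more spatial detail appears, which mirrors the behavior described in the results below.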

The use of big data techniques for large paleoclimate reconstruction allows for statistical estimation of climate surfaces with spatially explicit uncertainties. Results of the mean July temperature for the subset of four years are shown in Figure 2, while the associated spatially explicit uncertainties are shown in Figure 3. These figures illustrate the strengths of the modeling techniques used. In the two earlier years, the predictions are similar to the historical mean with uncertainty increasing as a function of distance from observations. In the two later years with more data, the predictive surfaces have more spatial complexity and less associated uncertainty.

Figure 2. Reconstruction based on the posterior mean July temperature (ºC) for four representative years of the historical fort network.

Figure 3. Posterior standard deviation surface of mean July temperature (ºC) for four representative years of the historical fort network.

By explicitly accounting for latent correlated spatial structure and moderating model complexity using regularization, spatio-temporal predictions of paleoclimate are improved. Furthermore, big data techniques allow us to fit the statistical models in a reasonable time frame (i.e., on the order of days rather than weeks). The relatively small sample sizes commonly associated with paleoclimate data would not normally fall into the “big data” realm of analyses. However, the processes on which we seek inference are quite large, and thus “big data” techniques are tremendously helpful.


Quaternary Science . . . on Mars . . . three billion years ago.

Post by Simon Goring, Research Assistant at the University of Wisconsin, Madison
Originally posted on OpenQuaternary Discussions.

For a curious person, one of the great benefits of being a Quaternary researcher is the breadth of research that is relevant to your own research questions.  The recent publication of fifty key questions in paleoecology (Seddon et al., 2014) reflects this breadth, spanning a broad range of questions that touch on human needs, biogeophysical processes, ecological processes, and a host of other issues.  The editorial board of Open Quaternary also reflects this incredible disciplinary breadth.  To me it is clear that the Quaternary sciences are an amalgam of multiple disciplines and, at the same time, a broadly interdisciplinary pursuit.  To be successful one must maintain deep disciplinary knowledge in a core topic, as well as disciplinary breadth across topics such as ecology, anthropology, and geology (and specifically geochronology), along with a good grounding in statistics and climatology.

One of the things that is not always quite as apparent is the breadth of research affected by the Quaternary sciences.  My first exposure to the utility of paleoecology for understanding interplanetary dynamics came as the result of a paper we published two years ago.  In 2012, my co-authors and I developed a regional scale estimate of sediment deposition times in lakes across eastern North America for the Holocene (Goring et al, 2012).  We did this because we were looking toward re-building age models for all cores in eastern North America and wanted to use reliable priors for Bacon (Blaauw and Christen, 2011).  Our priors wound up becoming the default in Bacon, which is great, but the results have also helped inform the lacustrine history of the red planet, Mars.


Figure 1.  Paleo-lake level reconstruction in the Melas Basin of Mars.  From Williams and Weitz (2014).

Williams and Weitz (2014) were examining the Melas Basin, a feature of the Martian surface that appears to show evidence of lacustrine activity at some time in the past.  Given a set of lacustrine features and channel beds in the basin, they began the process of trying to reconstruct lacustrine activity on the surface of Mars.  It seems clear that if our own understanding of geophysical processes during the Quaternary is based on Whewell and Lyell’s concept of uniformitarianism, then that uniformity of process should not be limited to Earth.

While we might assume uniformity, there are limits to how much modern or paleo terrestrial analysis can be applied to the Martian surface.  Although the basin age is dated roughly using meteor strikes, the dating of lacustrine establishment and termination is much more difficult.  For one, Holocene age models rely on 14C dates. While it may be possible to obtain some form of geochronological information from the Martian surface, at some point it likely requires dating techniques we don’t have on hand.  However, researchers can develop experimental procedures to test the possibility of using other dating techniques, and it seems like the development of these techniques is already underway with K-Ar dating (Hurowitz et al., 2012).

Another limitation of applying Quaternary analysis is that our Holocene estimates rely on the assumption that vegetation cover is near-modern, that sediment transport and flow rates are similar to modern, and that the distribution and types of sediment are similar to modern.  Even in this assumption we know there are exceptions.  Sediments deposited immediately following deglaciation are often very fine grained, we often see strong increases in organic content during the Holocene, and a major inflection point in deposition rates is a persistent feature of the near-modern era.

Regardless, to understand how long a lake was present in the Melas Basin there are few options but to look at Earth systems.  Williams and Weitz (2014) looked at both deltaic sediments and lake sediments in the basin, and estimate lacustrine activity using sedimentation rates from large deltas in the United States and Russia and our sedimentation rates for lacustrine environments.  Interestingly, the deltaic and lacustrine sedimentation rates give estimates that differ by orders of magnitude.  In Goring et al. (2012) we report a mean deposition time of approximately 20 yr/cm, meaning the lacustrine environments of the Melas Basin might have persisted for almost 90,000 years, while sedimentation rates from the deltas produce estimates of between 1,000 and 4,000 years.
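
The arithmetic behind those numbers is simple enough to sketch in a couple of lines of R; the thicknesses below are round, hypothetical figures chosen only to show how a deposition time converts sediment thickness into duration.

deposition_time_yr_per_cm <- 20                  # approximate lacustrine mean from Goring et al. (2012)
thickness_m <- c(1, 10, 45)                      # hypothetical thicknesses of lacustrine fill
thickness_m * 100 * deposition_time_yr_per_cm    # implied durations: 2,000, 20,000, 90,000 years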

In a Holocene or Quaternary context, estimates that range from about 1,000 to nearly 100,000 years seem incredibly broad.  But, when we consider that we are examining the surface of another planet, and that the lake formation dates to the Hesperian period almost 3 billion years ago, the temporal certainty that Quaternary science can provide for interplanetary research is in fact astounding.

References Cited:

Blaauw, M, & Christen, JA (2011). Flexible paleoclimate age-depth models using an autoregressive gamma process. Bayesian Analysis, 6(3), 457-474.

Hurowitz, JA, et al. (2012). A New Approach to In-Situ K-Ar Geochronology. LPI Contributions, 1683, 1146.

Goring, S, et al. (2012). Deposition times in the northeastern United States during the Holocene: establishing valid priors for Bayesian age models. Quaternary Science Reviews, 48, 54-60.

Seddon, AW, et al.  (2014). Looking forward through the past: identification of 50 priority research questions in palaeoecology. Journal of Ecology, 102(1), 256-267.

Camp PEON Day 6: LAST DAY

We wrapped up the last day with age-depth models and independent project presentations.
keepclam
Three classical age models are linear interpolation (connecting the dots), regression (linear or polynomial), and smooth splines.  If you use classical age models, CLAM (classical age models) is an R package that is a good step forward.
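
Here is a tiny illustration in R, with made-up depths and ages, of the simplest of those classical approaches, linear interpolation between dated depths; CLAM wraps this and the other classical methods together with proper radiocarbon calibration and error estimation.

dated_depth_cm <- c(10, 55, 120, 240)               # hypothetical dated depths
dated_age_yrBP <- c(300, 1800, 4200, 9000)          # hypothetical calibrated ages
age_model <- approxfun(dated_depth_cm, dated_age_yrBP)
age_model(c(30, 100, 200))                          # interpolated ages at undated depths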

The field is moving away from classical models and starting to use Bayesian age models (which got their start in the mid-1990s).  The age model we discussed and applied to actual data was BACON.

Here is an example of the age model the class walked through for sediment collected from Glimmerglass Lake.  It is a nice example of a model that is well mixed, fits well, and passes through all of the radiocarbon dates.

Glimmerglass BACON
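
For anyone who wants to try this at home, here is a minimal sketch of running Bacon from R via the rbacon package on its bundled example core (MSB2K); the Glimmerglass Lake run above used the same basic workflow, just with its own dates, depths, and settings.

# install.packages("rbacon")   # requires the rbacon package (and JAGS is not needed; Bacon has its own sampler)
library(rbacon)
Bacon("MSB2K", thick = 5)      # runs the MCMC and plots the age-depth model (expect a few interactive prompts)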

The afternoon was dedicated to student presentations on their independent projects. There were a wide range of independent project topics including:

  1. State space model for plant macrofossils
  2. Quantify variance between lake pollen loadings using observed pollen proportions
  3. 2000 years of inferred forest structure in the southern Sierra Nevada from a wet meadow
  4. Empirical succession model application to the fading record problem and uncertainty in forest density trajectories
  5. Alison and Susana – Colombian data “the burn-outs”, Volcanoes influenced the
  6. Using pollen and squirrel fossils to explore two datasets with irregular spacing of time
  7. Exploring Colombian vegetation using 4 proxies: 13C, volcanic minerals, archaeological artifacts and charcoal
  8. Fire signals in southern CA
  9. Pollen in Australian lakes
  10. MCMC of Alaska Picea glauca (white spruce)
  11. Detecting dust storms in lake sediment cores – are the dust storms increasing?
  12. Using eddy flux data and terrestrial models to estimate NPP and biomass from tree rings

For only having 6 hours of work time, the projects turned out really well, with lots of converged iterations.  It was great to see how much the students were able to pick up over the past 6 days.

We wrapped up a big week of hard work by celebrating with a classic Friday night fish fry dinner at Bent’s Camp!

The group!


Camp PEON Days 4 & 5: DATA ASSIMILATION!

These two days focused on learning about and applying data assimilation in the lab.

Image source: http://afamily.vn/tam-su-ban-doc/toi-tim-duoc-chong-hai-qua-mang-3402.chn

Image source: http://bit.ly/VI9YI7

Students learned about PDA – parameter data assimilation, NOT public displays of affection (although successful parameter data assimilation could make you want to celebrate with public displays of affection for your computer).

During these two days students learned:

  1. The need to test-run your model.  If you know the model is wrong, tuning the parameters is not going to help.
  2. Adjusting your priors after your model run is a cardinal sin.  The solution is not to move your prior, but rather to go back and work on your model!
  3. An interesting problem in the iterative process of modeling is knowing when to stop.  When are your models good enough?  How much uncertainty is acceptable?  This is especially important when thinking about the interface between science and policy.  What level of uncertainty is needed in order to make policy?  While the course didn’t provide the answers to these questions – it did give students food for thought.

The students were able to assimilate the tree-ring data measured from their tree cores into the PEcAn ecosystem modeling framework they had learned about on Day 3.

The following are some of the results from the tree ring data assimilation.

Figure 1. The ring widths from Eastern Hemlocks and Yellow Birches. Hemlocks went back to mid/early 1700s, birches to probably at least the early 1800s.

Figure 2. Tree Ring Year effects: For tree-ring folks, this is essentially your master chronology with uncertainty around it.

Figure 3. The above year effects can get combined with growth to look at annual aboveground biomass change (top) or cumulative aboveground biomass change (below).

Figure 4. The tree rings can also be used to get a (rough) disturbance histories of the two plots (Red plot on top and Yellow plot on bottom).

Camp Peon Day 3: BAYESIAN STATISTICS AND ECOSYSTEM MODELS

DSC_0922

Chris Paciorek, Research Statistician at the University of California, Berkeley

The lake sediment and tree cores have gotten us out to the field and demonstrated the types of methods used to collect paleodata.  They have also provided us with new paleodata for the course. However, although going to the field to collect data is fun and important, it is the Bayesian statistics and ecosystem modeling that are the focus for camp. Without the statistical and ecosystem modeling tools, our analysis of the data would be very limited.

Our course is set up to introduce topics and then build on them throughout the week. For example, over Days 1 and 2 we covered introductions to R, probability, and Bayesian statistics, as well as dendrochronology and ecosystem modeling. Day 3 was a big day for starting to put the pieces together, with a statistical discussion in the morning by Chris Paciorek on running MCMC using JAGS, followed by an ecosystem model discussion by Mike Dietze about using PEcAn.
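
To give a flavor of the JAGS workflow from that morning session, here is a minimal, self-contained example in R using the rjags package; the data are simulated and the model (a normal mean with vague priors) is a generic illustration rather than one of the course exercises. It assumes JAGS itself is installed.

library(rjags)
set.seed(5)
y <- rnorm(30, mean = 2, sd = 1)                     # fake observations
model_string <- "model {
  for (i in 1:N) { y[i] ~ dnorm(mu, tau) }
  mu  ~ dnorm(0, 0.001)                              # vague prior on the mean
  tau ~ dgamma(0.01, 0.01)                           # vague prior on the precision
}"
jm <- jags.model(textConnection(model_string),
                 data = list(y = y, N = length(y)), n.chains = 3)
update(jm, 1000)                                     # burn-in
samples <- coda.samples(jm, variable.names = c("mu", "tau"), n.iter = 5000)
summary(samples)                                     # posterior means and quantiles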

Mike Dietze, Professor of Ecosystem Modeling at Boston University

The following are a few nuggets that were covered on Day 3:
1. Bayes theorem, priors and posteriors.
2. Bayesian credible intervals vs. 95% confidence intervals – the Bayesian credible interval is what many ecologists typically think their 95% confidence interval is! (See the short example after this list.)
3. Bayesian approaches are really nice for common issues in paleoecology – sampling that is not orthogonal, and data that are not evenly spaced through time.
4. Acknowledging uncertainty and including it in statistical models is important.
5. When building ecosystem models, it is important to quantify BOTH the uncertainty in the data and the uncertainty in the models.
6. Think of models as scaffolds for data synthesis. Models are working theories of how things work. Models represent different spatial and temporal scales. A model acts like one giant covariance matrix.
7. Observations inform models, but models can also inform what is measured in the field.
Lastly, it’s important to have fun and collaborate with your peers!

Day 2 of Camp PEON: TREE RINGS

Post by Jody Peters at the University of Notre Dame

Hemlock on Guido Rahr Sr. Tenderfoot Nature Reserve

In addition to the lake sediment coring that took place on Day 1, there was also a lot of tree coring that happened. We set up two 0.5-ha plots in The Nature Conservancy’s Guido Rahr Sr. Tenderfoot Forest Reserve, where we recorded the dbh (diameter at breast height) of every tree, tagged those that were large enough, and took cores from trees greater than 10 cm dbh. The two stands were mainly composed of beautiful, large, old hemlocks, as well as some yellow birch, sugar maples, and northern white cedars.

Measuring dbh

Tree 1156

DSC_1420

Coring a hemlock with dbh 71.8 cm

Extracting a core

After the Day 1 coring, the job on Day 2 was to mount the cores and prepare them for drying.

DSC_1428 DSC_1433
DSC_1443 DSC_1482

On Day 3 we will move on to sanding and measuring the tree rings so that we can use them as part of a data assimilation exercise led by Michael Dietze to estimate Net Primary Production using the software package PEcAn on Day 5.

DSC_1424

Day 1 of Camp PEON: WORKING ON THE FROZEN FINGER!

Post by Simon Goring, Postdoc at the University of Wisconsin-Madison.
This post originally appeared at downwithtime.

We’re at Camp PalEON this week. It’s lots of fun and I think that the attendees get a lot out of it. Effectively we’re trying to distill the process associated with the entire seven-year project into one week of intensive learning. We teach probability theory, Bayesian methods, ecosystem modelling, dendrochronology, paleoecology and pollen analysis, age modelling, and vegetation reconstruction to seventeen lucky early-career researchers in six intensive days (people were still plunking away at 11pm last night, our first day!).

We spend a lot of our time at the University of Notre Dame’s Environmental Research Center indoors looking at computers, but we had a very nice time yesterday afternoon. I hung out on a raft with Jack Williams and Jason McLachlan, coring with a frozen finger. The frozen finger is a special kind of corer, used to recover lake sediment that preserves the sediment stratigraphy in a much cleaner way than many other coring techniques.

Jack Williams and Jason McLachlan filling the core casing with ethanol so that the cold slurry conducts to the outer wall. (photo credit: Jody Peters)

When using the frozen finger we fill the base of the corer with dry ice, and suspend the dry ice in ethanol to create an incredibly cold surface. We then drop the casing into the lake sediment. The sediment freezes to the surface of the core casing over the course of ten or fifteen minutes before we pull the corer back to the surface of the lake. The freeze-corer (or frozen finger) is often used for ancient DNA studies (e.g., Anderson-Carpenter et al., 2011) since the freezing process helps stabilize DNA in the lake sediment until the core can be brought back to the lab and analyzed.

The sediment on the outside of the core casing is peeled off carefully and wrapped before it is stored in a cooler of dry ice. (photo credit: Jody Peters)

Jason McLachlan and I are going to go sieve the sediment ourselves later this afternoon to give the workshop participants a chance to take a look at lake sediment, pick charcoal and find macrofossils later tonight. Meanwhile everyone is hard-coding an MCMC model in R and, later today, learning about Midwestern Paleoecology. All in all, it’s a great course and I’m happy to be involved with it for a second time. Hopefully we’ll have some more posts, but in the meantime we’ve made the preliminary readings open to the public on our project wiki, and most of our R work is up and available on GitHub so that you can take a look and work along.

Maine Fieldwork Part 2: The Bog

Post by Bob Booth,  Associate Professor at Lehigh University; Steve Jackson, Center Director for the U.S. Department of the Interior’s Southwest Climate Science Center; Connor Nolan, Steve’s PhD Student at University of Arizona, and Melissa Berke, Assistant Professor at University of Notre Dame

Read about Maine Fieldwork Part 1.

Maine Fieldwork Part 2
Our adventures in bog coring, lobster consumption, dehydration, lake scouting, dipteran-slapping, and driving (lots of driving) began on July 6 when Bob Booth, Steve Jackson, Melissa Berke, and Connor Nolan rendezvoused in Portland, Maine, and drove to Bangor, our home base for coring at Caribou Bog. A testate-amoeba record of water-table depth from the bog will be compared to a lake-level record from Giles Pond (cored by Connor and Bryan Shuman back in November). These two sites are the new paleoclimate proxies for our Howland Forest HIPS (Highly Integrated Proxy Site).  We also plan to use these records to better understand how lakes and peatlands respond to and record climate variation.

Map of Caribou Bog

Caribou Bog is a huge (~2200 hectares) ombrotrophic bog that has been the subject of many past investigations. We targeted a part of the bog that had been worked on in the 1980s by Feng Sheng Hu and Ron Davis. Coring took two full days (check out the video below to really appreciate the dipterans and the team’s jumping abilities). On the first day, we surveyed the bog with probe rods to select a coring site. Then we hauled all of the heavy coring gear from the car, down a logging trail into the forest, through the ‘moat’, and then across the lumpy bog. Every part of the walk from the van to the bog and back was challenging, each for different reasons. The trail was hot and infested with deerflies and mosquitoes, the forest had no trail and low clearance and forced us to wrestle with young trees, the moat provided ample opportunity for losing boots and called for some gymnastic moves while carrying large and heavy stuff, and finally walking the 300 meters across the bog was like being on a demented stairmaster as we sank a foot or two into the bog with every step.

After three trips to haul all of our gear, we cored the bog, collecting the upper peat (~3-4 meters) with a modified piston corer and the overlaps and deeper sections with a Russian corer.  Although we thought we had ample drinking water the first day, we didn’t, and we chose not to drink the brown bog-water.  Once we returned to the van, we headed straight to the nearest rural convenience store (only 3 miles away) and restored electrolytes and fluids.
 
We completed the coring on the second day, and dragged everything back to the van in three trips.  After dropping Bob off to meet his family in Portland, the rest of us enjoyed a seafood extravaganza at Fore Street restaurant in downtown Portland.  

Portland Head Light

Lobster Feast

The cores went to Lehigh with Bob, but will eventually be analyzed by Connor. We will count testate amoebae and pollen in the core to get records of paleohydrology and paleovegetation spanning the past 2000 years.  Stay tuned!

Watch on YouTube: Caribou Bog 2014


Hu, F. S., & Davis, R. B. (1995). Postglacial development of a Maine bog and paleoenvironmental implications. Canadian Journal of Botany, 73(4), 638–649.

 

PalEON Sessions at AGU, December 15-19, 2014

We have a number of PalEONistas leading, co-leading, or giving invited talks for AGU sessions that revolve around topics central to PalEON.  If you are going to the American Geophysical Union conference in December and these sessions are of interest to you, be sure to submit your abstract by Wednesday, August 6!

Sessions

1. Finding Signal in the Noise: Dealing with Multiple Sources of Uncertainty in Paleoclimate and Paleoecology, Session ID: 3722 

The session is sponsored by Paleoceanography and Paleoclimatology and co-sponsored by Biogeosciences,  Global Environmental Change, and Nonlinear Geophysics.

Paleoclimatic and paleoecological proxies contain useful signals of past environmental state and variability that are confounded by multiple forms of uncertainty. Identifying meaningful signals and rigorously quantifying the multiple sources of uncertainty is essential to making inferences about past environments and applying these inferences to validate or improve earth system models or to inform decision-makers. Uncertainty can arise from multiple sources including inexact chronologies, unrepresented processes, uncertainties in the mechanisms that sense and archive past environmental change, and our capacity to describe these processes mathematically. In this session we will explore advances in disentangling this complexity via the use of mechanistic forward models, hierarchical statistical models, and other techniques to make robust inferences about past environments with well-quantified uncertainties.

PalEONistas involved: Jack Williams, Connor Nolan, Andria Dawson, John Tipton

2. Constraining Ecosystem Carbon Uptake and Long Term Storage Using Models and Data, Session ID: 2624

This session will focus on both short- and long-term processes controlling ecosystem carbon uptake and storage, using both modeling and observational approaches. Find more details HERE.

PalEONistas involved: Dave Moore, Valerie Trouet, Mike Dietze

3. Ecological Disturbance: Observing and Predicting Disturbance Impacts, Session ID: 2482

This session focuses on studies that address the effects of ecological disturbance on carbon, water, and nutrient dynamics, as well as methods for understanding non-equilibrium conditions. Find more details HERE.

PalEONistas involved: Jaclyn Hatala Matthes, Dave Moore, Mike Dietze

4. Phenology, Session ID: 3265

We encourage the submission of abstracts that address topics including phenological modeling, scaling from organisms to ecosystems, fusion of models and data, ecological forecasting, relationships between phenology and ecosystem processes and services, and the role of phenology in policy decision-making. Find more details HERE.

PalEONistas involved: Dave Moore

5. Inter-site Syntheses to Explore the Biophysical Controls on Ecosystem Mass and Energy Cycling, Session ID: 3725

The session is sponsored by Biogeosciences and co-sponsored by Global Environmental Change and Hydrology.

Data syntheses across a small number of geographically or ecologically similar research sites, or a ‘micro-network’, can be very useful for exploring the response of ecosystem mass and energy cycling to a constrained set of biophysical driving variables. Thus, micro-network syntheses represent an important bridge between studies focused on hypothesis testing and model development at the site level, and the upscaling of those results to regional and continental landscapes characterized by wide gradients in climate and land cover regimes. In this session, we welcome studies focused on cross-site data syntheses from a small number (2-7) of field sites to better understand how ecosystem carbon, water, or energy fluxes respond to meteorological drivers, edaphic conditions, and/or land cover change. We invite contributions that draw from a range of leaf-level, tree-level, plot-level, and/or ecosystem-scale ecophysiological data.

PalEONistas involved: Neil Pederson (with many PalEONista co-authors)

 

 

You Are Suffering For the Greater Good of Science

Post by Simon Goring, Postdoc at the University of Wisconsin-Madison.
This post originally appeared on downwithtime.

“When you have hayfever you are suffering for the greater good of science.“
-Me. The Larry Meiller Show, WPR. July 16, 2014 [Program Archive]

Figure 1. Your pain is science’s gain. Pollen may go into your nose, but it also enters aquatic environments where it is preserved in lake sediments. Photo Credit: flickr/missrogue

Of course, I was talking paleoecology and the way we use airborne pollen trapped in lake sediments to help improve models of future climate change. We improve models by reconstructing forests of the past. This is one of the central concepts in PalEON (not suffering, paleoecology): Improve ecosystem model predictions for the future by testing them on independent reconstructions of the past. Give greater weight to models that perform well, and improve models that perform poorly.

I was lucky to be on the Larry Meiller Show along with Paul Hanson to discuss PalEON and GLEON, two large scale ecological projects with strong links to The University of Wisconsin. We talked a bit about climate change, large scale research, science funding, open science and historical Wisconsin. It was lots of fun and you can check out the archive here.

I feel like I was a little more prepared for this interview than I have been in the past. Jack Williams passed along his (autographed) copy of Escape from the Ivory Tower by Nancy Baron. The book helped me map out my “message box” and gave me a much better sense of what people might want to hear, as opposed to the things I wanted to talk about (how much can I talk about uncertainty, age modelling and temporal connectivity?). It was useful, and I hope I came off as well prepared and excited by my research (because I am). Regardless, just like learning R, public outreach is a skill, and one that I am happy to practice, if only because I intend to keep doing it.

Anyway, enough science outreach for one week. With this blog post and WPR I’m well above quota!

Sneak Peek at Results for Tree Composition Pre-Euro-American Settlement (ca. 1700-1850 AD)

Posted by Jody Peters with input from Simon Goring and Chris Paciorek

Just as many trees make up a mighty forest, many participants are needed to pull together and analyze data for PalEON.  Together we gain a deeper understanding of past forest dynamics, and use this knowledge to improve long-term forecasting capabilities.  Major components needed to understand past forest dynamics are tree composition, density and biomass prior to Euro-American settlement. In true macrosystems ecology fashion, over the past 3 years (and in some cases longer) individuals from multiple institutions (see Table and Figure captions, and Figure 3 here) have been working on collecting the data and developing a statistical multinomial model for tree composition in the Northeast and Midwest United States.  Our first task has been to estimate percent composition for several of the dominant forest taxa, and to provide reliable estimates of uncertainty.

We are excited to announce we have finally collected enough data to run the model across the entire northeastern United States! Figure 1 provides an example of the composition results and associated uncertainty for beech and pine. In addition to these two genera we have similar results for taxa such as oak, birch, hickory, maple, spruce, etc. We can use these results to compare pre-Euro-American settlement forest composition to modern forests from US Forest Service Forest Inventory and Analysis data, as well as to estimates extending 2000 years into the past using pollen data and STEPPS2 analyses (see this University of Wisconsin Press Release). As we move forward with this project we will continue to update our datasets that have dispersed sampling (e.g., Indiana, Illinois and Ohio: Table 1), and we are in the process of developing maps of estimated density and biomass by tree taxon.

Stay tuned as more results come in and as the manuscripts get published!

 

Figure 1. Estimated composition (top maps) and associated uncertainty (bottom maps) created March 2014. Estimates come from a spatial multinomial model on an 8 km Albers grid, developed by Andy Thurman from the University of Iowa and Chris Paciorek and Andria Dawson from the University of California, Berkeley. The MCMC was run for 150,000 iterations, with the first 25,000 discarded as burn-in, and the remaining iterations subsampled (to save on storage and computation) to give 500 posterior draws.
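
For readers curious about the bookkeeping behind the caption, discarding the burn-in and subsampling to 500 draws is straightforward. Below is a schematic sketch in R; the object names are invented and this is not the project code, just an illustration of the procedure described in the caption.

```r
# Schematic MCMC post-processing (illustrative; not the project code).
# 'chain' stands in for the model output: 150,000 iterations by (here) 2 parameters.
n_iter  <- 150000
n_burn  <- 25000
n_draws <- 500

chain <- matrix(rnorm(n_iter * 2), nrow = n_iter)         # placeholder for real output

kept <- chain[(n_burn + 1):n_iter, , drop = FALSE]        # discard the first 25,000 iterations
idx  <- round(seq(1, nrow(kept), length.out = n_draws))   # evenly spaced subsample of what remains
posterior_draws <- kept[idx, , drop = FALSE]              # 500 retained posterior draws
dim(posterior_draws)                                      # 500 x 2
```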

Table 1. Source of tree data from Public Land Surveys from locations in the Northeast and Midwest United States. The “Sampling” column indicates whether data came from the entire location (complete) or from a dispersed sample of townships or towns within the location.

Location | Sampling | Source
Minnesota | Complete | David Mladenoff (University of Wisconsin – Madison)
Wisconsin | Complete | David Mladenoff (University of Wisconsin – Madison)
Michigan's Upper Peninsula and northern Lower Peninsula | Complete | Michigan Department of Natural Resources; David Mladenoff
Michigan's southern Lower Peninsula | Dispersed | Jack Williams & Simon Goring (University of Wisconsin – Madison)
Illinois | Dispersed | Jason McLachlan (University of Notre Dame)
Illinois (Chicago area) | Complete | Marlin Bowles & Jenny McBride (The Morton Arboretum)
Illinois (St. Louis area) | Complete | Dick Brugam (Southern Illinois University) & Paul Kilburn (Jefferson County Nature Association)
Indiana | Dispersed | Jason McLachlan (University of Notre Dame)
Ohio | Dispersed | Charles Cogbill
New England, New York, Pennsylvania and New Jersey | Dispersed | Charles Cogbill

 

 

PalEON on TV

Posted by Jody Peters, PalEON Program Manager

The elevator pitch (a 30-second to 2-minute synopsis of your research) is critical for sharing science with other scientists and the general public. However, developing this pitch does not come naturally to most people. It is something that needs to be practiced. Recently, Jason McLachlan and Sam Pecararo from the University of Notre Dame had the opportunity to practice their pitches in featured segments on Outdoor Elements, a show on our local PBS station. Not only did Jason and Sam have to prepare their elevator pitches, but they also had to come up with visual props that would be interesting to view on TV. We think they both did a great job condensing their science stories into a few minutes!

Jason’s segment, Paleobotany & Climate Change, originally aired on February 9, 2014 and focused on PalEON in general, specifically describing some of our work with tree data from the Public Land Survey. After he was taped for this segment last fall, Jason wrote a blog post about what he wished he had said. Compare what he wished he had said to what actually aired!

Jason McLachlan on Outdoor Elements

Sam’s segment, Tree Coring, originally aired on February 16, 2014 and featured Sam coring a tree and talking about using tree rings to get an idea of how climate or other environmental variables influence tree growth.

Sam Pecararo on Outdoor Elements

Check out these segments to see Jason and Sam’s elevator pitches for some of the work of PalEON! Click on the links or photos above and scroll down to where it says “Play segment” to view. Each segment is approximately 7 minutes long.

 

Self thin you must

Yoda

Post by Dave Moore, Professor at The University of Arizona
This post also appeared on the Paleonproject Tumblr

We spent a lot of time last week in Tucson discussing sampling protocols for PalEON’s tree ring effort that will happen this summer. The trouble is that trees (like other plants) will self-thin over time, so when we collect tree cores to reconstruct aboveground biomass increment we have to be careful about how far back in time we push our claims. Bonus points if you can explain the photo in ecological terms! I stole it from Rachel Gallery’s Ecology class notes.
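
For context, and as a hint for the bonus question, the classic self-thinning relationship (not spelled out in the original post) is usually written as a power law linking mean plant mass to the density of surviving stems, the so-called -3/2 rule often attributed to Yoda and colleagues:

$$ \bar{w} \propto \rho^{-3/2} $$

where \(\bar{w}\) is mean plant mass and \(\rho\) is stem density. Because stands shed stems as they grow, the trees still standing today are a biased sample of the historical stand, which is exactly why ring-based reconstructions of aboveground biomass increment become less reliable the further back in time we push them.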

Neil Pederson and Amy Hessl will be taking the lead in the Northeast, while Ross Alexander, working with Dave Moore and Val Trouet (LTRR), will push our sampling into the Midwest and beyond the PalEON project domain westwards. This is a neat collaboration between the PalEON project and another project funded by the DOE. Francesc Montane and Yao Liu, who recently joined my lab, will be helping to integrate these data into the Community Land Model. Also, Mike Dietze’s group will be using the ED model to interpret the results.

Because we want to integrate these data into land surface models, we need a robust statistical framework, so we had some equally robust discussions about statistical considerations with Chris Paciorek, Jason McLachlan, and other members of the PalEON team.

Forests in a Changing Climate

Post by Andria Dawson, University of Notre Dame/University of California, Berkeley Postdoc


Forests play an important role in the global carbon cycle by storing and releasing carbon through processes such as establishment, growth, mortality, and disturbance. Forests can be carbon sources if they release more carbon than they absorb, or carbon sinks if they absorb more than they release. Knowing that forests affect the carbon budget, it is natural to ask about the interactions between forests and the changing climate. Do forests mitigate climate change? The answer to this question is seemingly complex. Here are a few of the many reasons why…

Albedo

Some of the solar radiation that reaches the Earth is absorbed, while some is reflected. The reflectivity of a surface, or albedo, is a measure of how much of that incoming radiation the surface reflects. Snow has a high albedo, while open ocean has a low albedo. Forests typically have a low albedo. To keep the earth cool, all we need to do is absorb less of this incoming radiation. Researchers at Dartmouth College found that in regions where snow is common and forest productivity is low, it is beneficial to the economy and the climate to clear those forests, which moderates temperature by increasing albedo [1].
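
To pin the term down a little (my gloss, not part of the original post), albedo is simply the fraction of incoming solar radiation that a surface reflects:

$$ \alpha = \frac{SW_{\text{reflected}}}{SW_{\text{incoming}}}, \qquad 0 \le \alpha \le 1 $$

Typical textbook values put fresh snow around 0.8–0.9, open ocean near 0.06, and conifer forest roughly 0.1, which is why replacing dark canopy with snow-covered open ground reflects more energy back to space.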

Carbon dioxide

Trees use carbon dioxide to photosynthesize. As atmospheric CO2 increases, trees are expected to experience increased growth, at least up to a point. CO2 is absorbed through the stomata, but while these stomata are open and readily absorbing CO2, they are also allowing the tree to lose moisture. When CO2 is more readily available, trees don’t have to open their stomata as wide to absorb it. This leads to less moisture loss through the stomata, leaving the tree with additional resources for other processes, such as growth [2]. And in turn, increased growth leads to increased CO2 consumption. But only up to a point.
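
One standard way to make this mechanism concrete (a textbook definition, not a quotation from Keenan et al.) is intrinsic water-use efficiency, the ratio of carbon assimilation to stomatal conductance:

$$ \mathrm{iWUE} = \frac{A}{g_s} $$

where \(A\) is the photosynthetic rate and \(g_s\) is stomatal conductance. If rising CO2 lets a tree maintain \(A\) while partially closing its stomata (lower \(g_s\)), iWUE goes up, which is the reduced moisture loss described above.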

Terpenes

Forests interact with the atmosphere by releasing biological aerosols as well as compounds known as terpenes. Terpenes react to form aerosols, which seed clouds and in part determine how much light is reflected back to space. Spracklen et al. found that terpenes from a simulated pine forest increased cloud thickness, causing an additional 5% of solar radiation to be reflected back to space [3].

Changes in natural disturbance regimes

Climate change has the potential to affect disturbance regimes. Dale et al. succinctly wrote that ‘climate change can affect forests by altering the frequency, intensity, duration, and timing of fire, drought, introduced species, insect and pathogen outbreaks, hurricanes, windstorms, ice storms, or landslides’ [4]. These disturbance events affect forests in different ways, from causing widespread mortality to causing changes in structure, composition, and function.

How to make sense of all of this?

The take-home message is that an important relationship exists between forests and climate. The cumulative effects of these feedback mechanisms are difficult to disentangle, and further collaborative research based on ecosystem and atmospheric models confronted with data is key as we move forward. PalEON is one such collaborative effort, drawing on a range of talents to work towards a better understanding of forest systems.

References
[1] Updated citation as of 4-17-14: Lutz, David A., and Richard B. Howarth. “Valuing albedo as an ecosystem service: implications for forest management.” Climatic Change (2014). doi:10.1007/s10584-014-1109-0
Original citation: Dartmouth College. “Can cutting trees help fight global warming? More logging, deforestation may better serve climate in some areas, study finds.” ScienceDaily. ScienceDaily, 5 December 2013.

[2] Keenan, Trevor F., et al. “Increase in forest water-use efficiency as atmospheric carbon dioxide concentrations rise.” Nature 499.7458 (2013): 324-327.

[3] Spracklen, Dominick V., Boris Bonn, and Kenneth S. Carslaw. “Boreal forests, aerosols and the impacts on clouds and climate.” Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 366.1885 (2008): 4613-4626.

[4] Dale, Virginia H., et al. “Climate Change and Forest Disturbances Climate change can affect forests by altering the frequency, intensity, duration, and timing of fire, drought, introduced species, insect and pathogen outbreaks, hurricanes, windstorms, ice storms, or landslides.” BioScience 51.9 (2001): 723-734.

Macrosystems Ecology: The More We Know The Less We Know.

Post by Simon Goring, Postdoc at the University of Wisconsin-Madison.
This post originally appeared at downwithtime.

Dynamic Ecology had a post recently asking why there wasn’t an Ecology Blogosphere. One of the answers was simply that as ecologists we often recognize the depth of knowledge of our peers and, as such, are unlikely (or unwilling) to comment in an area in which we have little expertise. This is an important point. I often feel like the longer I stay in academia, the more I am surprised when I can explain a concept outside my (fairly broad) subject area clearly and concisely. It surprises me that I have depth of knowledge in a subject that I don’t directly study.

Of course, it makes sense. We are constantly exposed to ideas outside our disciplines in seminars, papers, on blogs & twitter, and in general discussions, but at the same time we are also exposed to people with years of intense disciplinary knowledge, who understand the subtleties and implications of their arguments. This is exciting and frightening. The more we know about a subject, the more we know what we don’t know. Plus, we’re trained to listen to other people. We ‘grew up’ academically under the guidance of others, who often had to correct us, so when we get corrected outside our disciplines we are often likely to defer, rather than fight.

This speaks to a broader issue though, and one that is addressed in the latest issue of Frontiers in Ecology and the Environment. The challenges of global change require us to come out of our disciplinary shells and to address them with a new approach, defined here as Macrosystems Ecology. At large spatial and temporal scales – the kinds of scales at which we experience life – ecosystems cease being disciplinary. Jim Heffernan and Pat Soranno, in the lead paper (Heffernan et al., 2014), detail three ecological systems that can’t be understood without cross-scale synthesis using multi-disciplinary teams.

Figure 1. From Heffernan et al. (2014), multiple scales and disciplines interact to explain patterns of change in the Amazon basin.

The Amazonian rain forest is a perfect example of a region that is imperiled by global change, and can benefit from a Macrosystems approach. Climate change and anthropogenic land use drive vegetation change, but vegetation change also drives climate (and, ultimately, land use decisions). This is further compounded by teleconnections related to societal demand for agricultural products around the world and the regional political climate. To understand and address ecological problems in this region, then, we need to understand cross-scale phenomena in ecology, climatology, physical geography, human geography, economics and political science.

Macrosystems proposes a cross-scale effort, linking disciplines through common questions to examine how systems operate at regional to continental scales, and at multiple temporal scales.  These problems are necessarily complex, but by bringing together researchers in multiple disciplines we can begin to develop a more complete understanding of broad-scale ecological systems.

Interdisciplinary research is not something that many of us have trained for as ecologists (or biogeographers, or paleoecologists, or physical geographers... but that’s another post). It is a complex, interpersonal interaction that requires understanding of the cultural norms within other disciplines. Cheruvelil et al. (2014) do a great job of describing how to achieve and maintain high-functioning teams in large interdisciplinary projects, and Kendra also discusses this further in a post on her own academic blog.

Figure 2. From Goring et al., (2014). Interdisciplinary research requires effort in a number of different areas, and these efforts are not recognized under traditional reward structures.

In Goring et al. (2014) we discuss a peculiar issue that is posed by interdisciplinary research.  The reward system in academia is largely structured to favor disciplinary research.  We refer to this in our paper as a disciplinary silo.  You are in a department of X, you publish in the Journal of X, you go to the International Congress of X and you submit grant requests to the X Program of your funding agency.  All of these pathways are rewarded, and even though we often claim that teaching and broader outreach are important, they are important inasmuch as you need to not screw them up completely (a generalization, but one I’ve heard often enough).

As we move towards greater interdisciplinarity we begin to recognize that simply superimposing the traditional rewards structure onto interdisciplinary projects (Figure 2) leaves a lot to be desired.  This is particularly critical for early-career researchers.  We are asking these researchers (people like me) to collaborate broadly with researchers around the globe, to tackle complex issues in global change ecology, but, when it comes time to assess their research productivity we don’t account for the added burden that interdisciplinary research can require of a researcher.

Now, I admit, this is self-serving. As an early career researcher, and member of a large interdisciplinary team (PalEON), much of what we propose in Goring et al. (2014) strongly reflects my own personal experience. Outreach activities, the complexities of dealing with multiple data sources, large multi-authored papers, posters and talks, and the coordination of researchers across disciplines are all realities for me, and for others in the project, but ultimately, we get evaluated on grants and papers. The interdisciplinary model of research requires effort that never gets valued by hiring or tenure committees.

That’s not to say that hiring committees don’t consider this complexity, and I know they’re not just looking for Nature and Science papers, but at the same time, there is a new landscape for researchers out there, and we’re trying to evaluate them with an old map.

In Goring et al. (2014) we propose a broader set of metrics against which to evaluate members of large interdisciplinary teams (or small teams, there’s no reason to be picky).  This list of new metrics (here) includes traditional metrics (numbers of papers, size of grants), but expands the value of co-authorship, recognizing that only one person is first in the authorship list, even if people make critical contributions; provides support for non-disciplinary outputs, like policy reports, dataset generation, non-disciplinary research products (white papers, books) and the creation of tools and teaching materials; and adds value to qualitative contributions, such as facilitation roles, helping people communicate or interact across disciplinary divides.

This was an exciting set of papers to be involved with, all arising from two meetings associated with the NSF Macrosystems Biology program (part of NSF BIO’s Emerging Frontiers program).  I was lucky enough to attend both meetings, the first in Boulder CO, the second in Washington DC.  As a post-doctoral researcher these are the kinds of meetings that are formative for early-career researchers, and clearly, I got a lot out of it.  The Macrosystems Biology program is funding some very exciting programs, and this Frontiers issue attempts to get to the heart of the Macrosystems approach.  It is the result of many hours and days of discussion, and many of the projects are already coming to fruition.  It is an exciting time to be an early-career researcher, hopefully you agree!

PEONs at AGU

If you are going to AGU, be sure to look around for PalEONs. In addition to a number of PEONs attending AGU, we will have 3 PalEON posters and 1 talk. Here are the details. Check it out!

Posters – Monday morning
1) Goring et al. Effects of Euro-American settlement and historic climate variability on species-climate relationships and the co-occurrence of dominant tree species. AGU 2013. (Poster, B11G-0438)

2) Matthes et al. Representations of historic vegetation dynamics in CMIP5 and MsTMIP models (Poster B11E-0401)

3) Dawson et al. Spatio-temporal changes in forest composition inferred from fossil pollen records in the Upper Midwestern USA (Poster, B11G-0446)

Talk – Thursday morning
1) Steinkamp and Hickler. Is drought-induced forest dieback globally increasing? (Talk, B42B. Ecological Disturbance: Observing and Predicting the Impacts of Landscape Disturbance III)

The Invasion of the Zombie Maples

Post by Ana Camila Gonzalez, Undergraduate Researcher with Neil Pederson and the Tree Ring Laboratory at Columbia’s Lamont-Doherty Earth Observatory

As an undergraduate student interning at the Tree Ring Lab at Lamont-Doherty Earth Observatory, my involvement with PalEON has been rather localized to the data production side of things. My knowledge of the dynamics of climate and the models involved in forecasting future climate change is obviously limited as a second-year student. My knowledge of how frustrating it can be to cross-date the rings in maple trees, however, is more extensive.

This past summer I was able to join the Tree Ring Lab on a fieldwork trip to Harvard Forest in Petersham, MA. My main task was to map each plot where we cored, recording the species of each tree cored, its distance to the plot center, its DBH, its canopy position, its compass orientation, and any defining characteristics (the tree was rotten, hollow, had two stems, etc.). The forest was beautiful, but it became more beautiful every time I wrote down the letters QURU (Quercus rubra). I had plenty of experience with oaks, and knew that they did not often create false or missing rings and are thus a fairly easy species to cross-date. I shuddered a little every time I had to write down BEAL (Betula alleghaniensis), however, since I had looked at a few yellow birches before and knew the rings were sometimes impossible to see, let alone cross-date. I had no reaction to the letters ACRU (Acer rubrum), however, since I had never looked at a red maple core before. I was happy that it was a tree I could easily identify, and so I didn’t mind that the letters kept coming up. Had I known what was to come, I would’ve found a way to prevent anyone from putting a borer to a red maple.

At first, the maples seemed to be my friends. The rings were sensitive enough that multiple marker years helped me figure out where the missing rings were, what was false and what was real. I morbidly became a fan of the gypsy moth outbreak of 1981, because in many cases (but not all) it produced a distinct white ring that marked that year very clearly. This was definitely challenging, as the trees also seemed to be locally sensitive—a narrow ring in one tree might not at all be present in another—but all in all it seemed to be going well.

And then came the Zombie Maples.

Fig (a) Anatomy of a White Ring: Above is a core collected in 2003. It was alive. The white ring in the center of the image is 1981, the year of the regional gypsy moth outbreak in New York and New England.

That white ring you’re seeing above is the characteristic 1981 ring from a Zombie Maple cored in 2003. Between the 1990 ring and the bark we can only see four rings—but this tree was alive when cored, which means there should be 13 rings after 1990 (Fig b). That means roughly nine rings are missing.

Fig (b) Anatomy of a Zombie Maple: Above is a core collected in 2003. It was alive. The 1990 ring is marked in the image just right of center. There should be 13 rings between 1990 and the bark. You can only see four. Is it Alive? Is it Dead? Eek! It is a Zombie!!

This kind of suppression in the last two decades was present in multiple cores, and it made many perfectly alive trees seem like they should have been dead. Nine rings missing in a little over one millimeter. We see even more severe cases in our new collection: 15 rings where there should be 30 rings in about 2 millimeters—how is this tree supporting itself?

Cross-dating these cores took a lot longer than planned, and at times I was tempted to pretend my box of maples went missing, but afterwards I felt I was a much stronger cross-dater, and I’m realizing more and more that this really matters. If you’re going to base a model off of data that involves ring-width measurements from particular years, you better make sure you have the right years. What if we didn’t know the gypsy moth outbreak occurred in 1981, and somebody counting the rings back on the Zombie maple core above was led to believe it occurred in 1996? Our understanding of the trigger for this event would be incorrect because we would be looking for evidence from the wrong decade.

In a way, the Maples are still my friends. They were almost like the English teacher in high school who graded harshly and whom you didn’t appreciate until you realized how much better your writing had become.

PalEON Goes Into the Field

Post by Connor Nolan, University of Arizona Graduate Student

On Sunday November 3, Bryan Shuman and I (Connor Nolan) packed up a rental van with coring gear and hit the road for the 5.5 hour drive from Woods Hole, MA to Bangor, ME. 

Our aim was to identify and core a lake for lake-level reconstruction near-ish to the Howland Experimental Forest flux site. We can survey lakes for evidence of past lake-level changes using ground penetrating radar. The first day we had adventures in learning to navigate Maine’s back roads, and we surveyed 2 lakes – Crystal Lake and Pickerel Pond. Both were beautiful sites, but the past lake-level story was not very clear.

Day 2 included surveys of 3 more lakes — Peep Lake, Salmon Pond, and Giles Pond — and an excursion through the largest industrial blueberry farm in North America (an interesting-looking site called Rocky Lake is on their property; we did not survey it this trip due to big no trespassing signs on the property…). Just as we were starting to wonder if we would find the right lake on this trip, we surveyed Giles Pond and it turned out to be the one! We arrived in the perfect light to take a great picture. Day 3 we cored Giles Pond along a transect. We ended up with 4 cores in all from this trip, with lots of sand layers (a good thing for this kind of work!!). The Younger Dryas has a very distinctive lithology in this region, a light gray clay, and we have this lithology in some of our cores, so we should have a record that goes back nearly 15,000 years!

It was my first lake coring experience and it was a lot of fun! The cores are currently at Woods Hole Oceanographic Institution with Bryan. I will make a return trip there before long to do some initial analyses and then ship the cores back to Arizona for initial dating and more work!