Developing a 21st Century framework for lake-specific eutrophication assessment using quantile regression
Limnology and Oceanography: Methods, 13, 2015
Abstract: Over the past 30+ years, researchers and water resource managers have often relied on a set of regression-based equations to describe the relationships between Secchi depth (SD), chlorophyll (Chl) and total phosphorus (TP) and to quantitatively assess lake trophic status after Carlson (1977). Here, we develop a revised framework for eutrophication assessment that incorporates recent statistical advances in ecology and leverages the increasing availability of lake-specific datasets in the 21st Century. Long-term (1992–2012) water quality data from Lake Champlain (LC) are used to revisit and revise classic equations of trophic state indices (TSIChl/TP). The upper boundaries of the SD–ln(Chl) and ln(Chl)–ln(TP) distributions within this dataset are fit with quantile regression (99th quantile; QR) to generate LC-specific TSIChl/TP equations. Our results illustrate that Carlson's (1977) original TSIChl/TP equations overestimate the trophic status of LC relative to the LC-specific equations, and highlight the power of the QR-derived TSIChl/TP metric. We combine TSISD and TSIChl into one metric to indicate pseudoeutrophication and pseudomesotrophication of oligotrophic waters as well as pseudoeutrophication of mesotrophic waters, identifying waters threatened by a potential trophic shift. Additionally, TSIChl and TSITP were coupled as a complementary dual metric to indicate potential risks of excessive phosphorus loading to oligotrophic and mesotrophic waters. With these dual metric schemes, we performed cluster analysis of 15 locations to spatially assess trophic status and phosphorus risks across LC. This study describes a relatively simple and robust approach for lake-specific status assessment, the structure of which can be broadly utilized within monitoring and research communities.
Bongard's work focuses on understanding the general nature of cognition, regardless of whether it is found in humans, animals or robots. This unique approach focuses on the roles that morphology and evolution play in cognition. Addressing these questions has taken him into the fields of biology, psychology, engineering and computer science.
Danforth is an applied mathematician interested in modeling a variety of physical, biological, and social phenomena. He has applied principles of chaos theory to improve weather forecasts as a member of the Mathematics and Climate Research Network, and developed a real-time remote sensor of global happiness using messages from Twitter: the Hedonometer. Danforth co-runs the Computational Story Lab with Peter Dodds, and helps run UVM's reading group on complexity.
Laurent studies the interaction of structure and dynamics. His research involves network theory, statistical physics and nonlinear dynamics along with their applications in epidemiology, ecology, biology, and sociology. Recent projects include comparing complex networks of different nature, the coevolution of human behavior and infectious diseases, understanding the role of forest shape in determining stability of tropical forests, as well as the impact of echo chambers in political discussions.
Hines' work broadly focuses on finding ways to make electric energy more reliable and more affordable, with less environmental impact. Particular topics of interest include understanding the mechanisms by which small problems in the power grid become large blackouts, identifying and mitigating the stresses caused by large amounts of electric vehicle charging, and quantifying the impact of high penetrations of wind/solar on electricity systems.
Bagrow's interests include: Complex Networks (community detection, social modeling and human dynamics, statistical phenomena, graph similarity and isomorphism), Statistical Physics (non-equilibrium methods, phase transitions, percolation, interacting particle systems, spin glasses), and Optimization (glassy techniques such as simulated/quantum annealing, (non-gradient) minimization of noisy objective functions).