Abstract: In terms of its soluble precursors, the coagulation proteome varies quantitatively among apparently healthy individuals. The significance of this variability remains obscure, in part because it is the backdrop against which the hemostatic consequences of more dramatic composition differences are studied. In this study we have defined the consequences of normal-range variation in components of the coagulation proteome using a mechanism-based computational approach that translates coagulation factor concentration data into a representation of an individual's thrombin generation potential. A novel graphical method is used to integrate standard measures that characterize thrombin generation in both empirical and computational models (e.g., max rate, max level, total thrombin, time to 2 nM thrombin ('clot time')) to visualize how normal-range variation in coagulation factors results in unique thrombin generation phenotypes. Unique ensembles of the 8 coagulation factors encompassing the limits of normal-range variation were used as initial conditions for the computational modeling, each ensemble representing 'an individual' in a theoretical healthy population. These 'individuals' with unremarkable proteome composition were then compared to actual normal and 'abnormal' individuals, i.e., factor ensembles measured in apparently healthy individuals, actual coagulopathic individuals, or artificially constructed factor ensembles representing individuals with specific factor deficiencies. A sensitivity analysis was performed to rank either individual factors or all possible pairs of factors in terms of their contribution to the overall distribution of thrombin generation phenotypes.
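The ensemble construction described above can be sketched as follows. This is a minimal illustration, not the study's actual setup: the factor names assume a set like II, V, VII, VIII, IX, X plus the inhibitors TFPI and AT, as used in common mechanism-based thrombin generation models, and the numeric ranges are illustrative placeholders rather than the study's measured normal limits.

```python
from itertools import product

# Hypothetical normal ranges (nM) for 8 coagulation proteome components.
# Factor choice and values are illustrative assumptions, not study data.
normal_ranges = {
    "II": (1000, 1800), "V": (15, 25), "VII": (7, 13), "VIII": (0.5, 0.9),
    "IX": (60, 120), "X": (120, 200), "TFPI": (1.5, 3.5), "AT": (2200, 4000),
}

# Every combination of each factor at its lower or upper normal limit:
# 2**8 = 256 'individuals' spanning the corners of the normal range,
# each usable as an initial-condition ensemble for a computational model.
factors = list(normal_ranges)
ensembles = [dict(zip(factors, combo))
             for combo in product(*[normal_ranges[f] for f in factors])]
print(len(ensembles))  # 256
```

Intermediate (e.g., mid-range) levels per factor would enlarge the grid accordingly; the corner construction above is just the smallest ensemble set that brackets the normal range.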
Key findings of these analyses include: normal range variation of coagulation factors yields thrombin generation phenotypes indistinguishable from individuals with some, but not all, coagulopathies examined; coordinate variation of certain pairs of factors within their normal ranges disproportionately results in extreme thrombin generation phenotypes, implying that measurement of a smaller set of factors may be sufficient to identify individuals with aberrant thrombin generation potential despite normal coagulation proteome composition.
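The four summary measures named in the abstract (max rate, max level, total thrombin, time to 2 nM 'clot time') can be extracted from any simulated thrombin time course. The sketch below assumes a thrombin curve sampled on a time grid; the synthetic pulse standing in for model output is a placeholder, not the study's mechanism-based model.

```python
import numpy as np

def thrombin_metrics(t, thrombin, clot_threshold_nM=2.0):
    """Summarize a thrombin generation curve: max level, max rate,
    total thrombin (area under the curve), and time to 2 nM."""
    max_level = float(thrombin.max())
    max_rate = float(np.gradient(thrombin, t).max())
    # Trapezoidal area under the curve (nM * s).
    total = float(np.sum(0.5 * (thrombin[1:] + thrombin[:-1]) * np.diff(t)))
    above = np.nonzero(thrombin >= clot_threshold_nM)[0]
    clot_time = float(t[above[0]]) if above.size else float("inf")
    return {"max_level_nM": max_level, "max_rate_nM_per_s": max_rate,
            "total_nM_s": total, "clot_time_s": clot_time}

# Placeholder curve: a skewed pulse roughly shaped like a thrombin
# generation profile, peaking at 300 nM around t = 300 s.
t = np.linspace(0, 1200, 1201)  # seconds
thrombin = 300 * (t / 300) ** 4 * np.exp(4 * (1 - t / 300))
print(thrombin_metrics(t, thrombin))
```

Ranking factors (or factor pairs) by sensitivity then amounts to recomputing these metrics across the ensemble while holding all but one or two factors fixed, and comparing the resulting spread of phenotypes.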
Bongard's work focuses on understanding the general nature of cognition, regardless of whether it is found in humans, animals or robots. This unique approach focuses on the roles that morphology and evolution play in cognition. Addressing these questions has taken him into the fields of biology, psychology, engineering and computer science.
Danforth is an applied mathematician interested in modeling a variety of physical, biological, and social phenomena. He has applied principles of chaos theory to improve weather forecasts as a member of the Mathematics and Climate Research Network, and developed a real-time remote sensor of global happiness using messages from Twitter: the Hedonometer. Danforth co-runs the Computational Story Lab with Peter Dodds, and helps run UVM's reading group on complexity.
Laurent studies the interaction of structure and dynamics. His research involves network theory, statistical physics and nonlinear dynamics, along with their applications in epidemiology, ecology, biology, and sociology. Recent projects include comparing complex networks of different types, the coevolution of human behavior and infectious diseases, understanding the role of forest shape in determining the stability of tropical forests, as well as the impact of echo chambers in political discussions.
Hines' work broadly focuses on finding ways to make electric energy more reliable and more affordable, with less environmental impact. Particular topics of interest include understanding the mechanisms by which small problems in the power grid become large blackouts, identifying and mitigating the stresses caused by large amounts of electric vehicle charging, and quantifying the impact of high penetrations of wind and solar power on electricity systems.
Bagrow's interests include: Complex Networks (community detection, social modeling and human dynamics, statistical phenomena, graph similarity and isomorphism), Statistical Physics (non-equilibrium methods, phase transitions, percolation, interacting particle systems, spin glasses), and Optimization (glassy techniques such as simulated/quantum annealing, (non-gradient) minimization of noisy objective functions).