Publications
Genomic mining for complex disease traits with 'Random Chemistry'
Genetic Programming and Evolvable Machines, 8, 395-411, 2007
Status: Published

Abstract: Our rapidly growing knowledge regarding genetic variation in the human genome offers great potential for understanding the genetic etiology of disease. This, in turn, could revolutionize detection, treatment, and in some cases prevention of disease. While genes for most of the rare monogenic diseases have already been discovered, most common diseases are complex traits, resulting from multiple gene–gene and gene–environment interactions. Detecting epistatic genetic interactions that predispose for disease is an important, but computationally daunting, task currently facing bioinformaticists. Here, we propose a new evolutionary approach that attempts to hill-climb from large sets of candidate epistatic genetic features to smaller sets, inspired by Kauffman's 'random chemistry' approach to detecting small auto-catalytic sets of molecules from within large sets. Although the algorithm is conceptually straightforward, its success hinges upon the creation of a fitness function able to discriminate large sets that contain subsets of interacting genetic features from those that don't. Here, we employ an approximate and noisy fitness function based on the ReliefF data mining algorithm. We establish proof-of-concept using synthetic data sets, where individual features have no marginal effects. We show that the resulting algorithm can successfully detect epistatic pairs from up to 1,000 candidate single nucleotide polymorphisms in time that is linear in the size of the initial set, although success rate degrades as heritability declines. Research continues into seeking a more accurate fitness approximator for large sets and other algorithmic improvements that will enable us to extend the approach to larger data sets and to lower heritabilities.
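
The sketch below is a rough, hypothetical illustration of the set-shrinking idea described in the abstract, not the authors' implementation. It replaces the paper's ReliefF-based fitness approximator with a toy noisy score that rewards candidate sets still containing a hidden interacting pair; the names noisy_fitness, random_chemistry, and the SNP labels are all illustrative assumptions.

import random

# Toy stand-in for the ReliefF-based fitness approximator described in the
# abstract: score a candidate set higher (plus Gaussian noise) when it still
# contains the hidden interacting pair. In the real algorithm this signal
# would come from ReliefF weights estimated on case/control genotype data.
def noisy_fitness(candidate_set, hidden_pair, noise=0.3):
    signal = 1.0 if hidden_pair <= candidate_set else 0.0
    return signal + random.gauss(0.0, noise)

def random_chemistry(all_snps, hidden_pair, shrink=0.5, samples=20, target_size=2):
    # Hill-climb from the full SNP set down to a small set by repeatedly
    # sampling random subsets of the current set and keeping the fittest one.
    current = set(all_snps)
    while len(current) > target_size:
        subset_size = max(target_size, int(len(current) * shrink))
        best_subset, best_fit = None, float("-inf")
        for _ in range(samples):
            subset = set(random.sample(sorted(current), subset_size))
            fit = noisy_fitness(subset, hidden_pair)
            if fit > best_fit:
                best_subset, best_fit = subset, fit
        current = best_subset
    return current

if __name__ == "__main__":
    snps = [f"SNP{i}" for i in range(1000)]
    truth = {"SNP17", "SNP423"}  # hypothetical epistatic pair
    found = random_chemistry(snps, truth)
    print("recovered:", found, "correct:", found == truth)

Because the candidate set shrinks by a constant factor each round and each round evaluates a fixed number of subsets, the total work grows roughly linearly with the size of the initial set, consistent with the scaling claimed in the abstract.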

Bongard's work focuses on understanding the general nature of cognition, regardless of whether it is found in humans, animals, or robots. This unique approach focuses on the role that morphology and evolution play in cognition. Addressing these questions has taken him into the fields of biology, psychology, engineering, and computer science.
Resilient Machines Through Continuous Self-Modeling. Science 314, 1118 (2006).

Danforth is an applied mathematician interested in modeling a variety of physical, biological, and social phenomena. He has applied principles of chaos theory to improve weather forecasts as a member of the Mathematics and Climate Research Network, and developed a real-time remote sensor of global happiness using messages from Twitter: the Hedonometer. Danforth co-runs the Computational Story Lab with Peter Dodds, and helps run UVM's reading group on complexity.

Laurent studies the interaction of structure and dynamics. His research involves network theory, statistical physics, and nonlinear dynamics, along with their applications in epidemiology, ecology, biology, and sociology. Recent projects include comparing complex networks of different types, the coevolution of human behavior and infectious diseases, understanding the role of forest shape in determining the stability of tropical forests, and the impact of echo chambers in political discussions.

Hines' work broadly focuses on finding ways to make electric energy more reliable and more affordable, with less environmental impact. Particular topics of interest include understanding the mechanisms by which small problems in the power grid become large blackouts, identifying and mitigating the stresses caused by large amounts of electric vehicle charging, and quantifying the impact of high penetrations of wind and solar on electricity systems.

Bagrow's interests include: Complex Networks (community detection, social modeling and human dynamics, statistical phenomena, graph similarity and isomorphism), Statistical Physics (non-equilibrium methods, phase transitions, percolation, interacting particle systems, spin glasses), and Optimization (glassy techniques such as simulated/quantum annealing, (non-gradient) minimization of noisy objective functions).