Publications
Crowd ideation of supervised learning problems
Preprint, 2018
Status: Published

Abstract: Crowdsourcing is an important avenue for collecting machine learning data, but it
can go beyond simple data collection by drawing on the creativity and wisdom of crowd
workers. Yet crowd participants are unlikely to be experts in statistics or predictive modeling, and
it is not clear how well non-experts can contribute creatively to the process of machine learning.
Here we study an end-to-end crowdsourcing algorithm where groups of non-expert workers propose
supervised learning problems, rank and categorize those problems, and then provide data to train
predictive models on those problems. Problem proposal includes and extends feature engineering because
workers propose the entire problem, not only the input features but also the target variable. We
show that workers without machine learning experience can collectively construct useful datasets and
that predictive models can be learned on these datasets. In our experiments, the problems proposed
by workers covered a broad range of topics, from politics and current events to problems capturing
health behavior, demographics, and more. Workers also favored questions exhibiting positively
correlated relationships, which has interesting implications given that many supervised learning
methods perform equally well with strong negative correlations. Proper instructions are crucial for non-experts,
so we also conducted a randomized trial to understand how different instructions may influence the
types of problems proposed by workers. In general, shifting the focus of machine learning tasks
from designing and training individual predictive models to problem proposal allows crowdsourcers
to design requirements for problems of interest and then guide workers towards contributing to the
most suitable problems.
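The end-to-end workflow the abstract describes (workers propose entire problems, rank them, then contribute data to train models) could be sketched roughly as below. Every name, vote count, and data row here is invented for illustration, and a majority-class baseline stands in for the predictive models actually trained in the paper.

```python
from collections import Counter

# Hypothetical data illustrating the pipeline from the abstract: workers
# propose entire problems (features *and* target), rank them by voting,
# and then contribute labeled rows for the top-ranked problem.
proposed_problems = [
    {"target": "exercises regularly", "features": ["age", "hours of sleep"], "votes": 12},
    {"target": "voted in last election", "features": ["age", "follows news"], "votes": 7},
]

def select_problem(problems):
    """Ranking step: keep the problem with the most worker votes."""
    return max(problems, key=lambda p: p["votes"])

def train_baseline(rows):
    """Training step: a majority-class baseline standing in for the
    predictive models trained in the paper."""
    majority = Counter(row["label"] for row in rows).most_common(1)[0][0]
    return lambda features: majority

problem = select_problem(proposed_problems)

# Worker-contributed data for the selected problem (invented values).
rows = [
    {"features": {"age": 30, "hours of sleep": 7}, "label": 1},
    {"features": {"age": 22, "hours of sleep": 5}, "label": 0},
    {"features": {"age": 41, "hours of sleep": 8}, "label": 1},
]
model = train_baseline(rows)
prediction = model({"age": 35, "hours of sleep": 6})  # -> 1 (majority class)
```

Shifting effort to problem proposal, as the abstract notes, means the crowdsourcer mainly designs the requirements and instructions; the selection and training steps above are then driven entirely by worker contributions.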

Bongard's work focuses on understanding the general nature of cognition, regardless of whether it is found in humans, animals or robots. This unique approach focuses on the roles that morphology and evolution play in cognition. Addressing these questions has taken him into the fields of biology, psychology, engineering and computer science.
Continuous Self-Modeling. Science 314, 1118 (2006).

Danforth is an applied mathematician interested in modeling a variety of physical, biological, and social phenomena. He has applied principles of chaos theory to improve weather forecasts as a member of the Mathematics and Climate Research Network, and developed a real-time remote sensor of global happiness using messages from Twitter: the Hedonometer. Danforth co-runs the Computational Story Lab with Peter Dodds, and helps run UVM's reading group on complexity.

Laurent studies the interaction of structure and dynamics. His research involves network theory, statistical physics and nonlinear dynamics, along with their applications in epidemiology, ecology, biology, and sociology. Recent projects include comparing complex networks of different types, the coevolution of human behavior and infectious diseases, understanding the role of forest shape in determining the stability of tropical forests, and the impact of echo chambers in political discussions.

Hines' work broadly focuses on finding ways to make electric energy more reliable and more affordable, with less environmental impact. Particular topics of interest include understanding the mechanisms by which small problems in the power grid become large blackouts, identifying and mitigating the stresses caused by large amounts of electric vehicle charging, and quantifying the impact of high penetrations of wind and solar power on electricity systems.

Bagrow's interests include: Complex Networks (community detection, social modeling and human dynamics, statistical phenomena, graph similarity and isomorphism), Statistical Physics (non-equilibrium methods, phase transitions, percolation, interacting particle systems, spin glasses), and Optimization (glassy techniques such as simulated/quantum annealing, (non-gradient) minimization of noisy objective functions).