Selection models of language production support informed text partitioning: an intuitive and practical bag-of-phrases framework for text analysis
Abstract: The task of text segmentation, or 'chunking,' may occur at many levels in text analysis, depending on whether it is most beneficial to break a text down by the paragraphs of a book, the sentences of a paragraph, etc. Here, we focus on a fine-grained segmentation task, which we refer to as text partitioning, where we apply methodologies to segment sentences or clauses into phrases, or lexical constructions of one or more words. In the past, we have explored (uniform) stochastic text partitioning, a process on the gaps between words whereby each gap assumes one of two states, fixed (word binding) or broken (word separating), with some probability. In that work, we narrowly explored perhaps the most naive version of this process: random, or uniform, stochastic partitioning, where all word-word gaps are prescribed a uniformly-set breakage probability, q. Under this framework, the breakage probability is a tunable parameter, and was set to be pure-uniform: q = 1/2. In this work, we explore phrase frequency distributions under variation of the parameter q, and define non-uniform, or informed, stochastic partitions, where q is a function of surrounding information. Using a crude but effective function for q, we go on to apply informed partitions to over 20,000 English texts from the Project Gutenberg eBooks database. In these analyses, we connect selection models to generate a notion of goodness of fit for the 'bag-of-terms' (words or phrases) representations of texts, and find informed (phrase) partitions to be an improvement over the q = 1 (word) and q = 1/2 (phrase) partitions in most cases. This, together with the scalability of the methods proposed, suggests that the bag-of-phrases model should more often than not be implemented in place of the bag-of-words model, setting the stage for a paradigm shift in feature selection, which lies at the foundation of text analysis methodology.
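The uniform process described in the abstract can be illustrated with a minimal sketch: each word-word gap independently breaks with probability q, so q = 1 recovers the bag-of-words partition and smaller q yields longer phrases. The function below is an illustrative assumption for exposition, not the authors' implementation.

```python
import random

def stochastic_partition(sentence, q=0.5, seed=None):
    """Uniform stochastic text partitioning (illustrative sketch):
    each gap between adjacent words breaks with probability q.
    q = 1 gives one word per phrase (bag-of-words);
    q = 0 leaves the whole sentence as a single phrase."""
    words = sentence.split()
    if not words:
        return []
    rng = random.Random(seed)
    phrases, current = [], [words[0]]
    for word in words[1:]:
        if rng.random() < q:
            # gap is 'broken' (word separating): start a new phrase
            phrases.append(" ".join(current))
            current = [word]
        else:
            # gap is 'fixed' (word binding): extend the current phrase
            current.append(word)
    phrases.append(" ".join(current))
    return phrases
```

An informed partition, as proposed in the paper, would replace the constant q with a function of the surrounding words at each gap; the sampling loop itself is unchanged.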
Bongard's work focuses on understanding the general nature of cognition, regardless of whether it is found in humans, animals, or robots. This unique approach focuses on the roles that morphology and evolution play in cognition. Addressing these questions has taken him into the fields of biology, psychology, engineering, and computer science.
Danforth is an applied mathematician interested in modeling a variety of physical, biological, and social phenomena. He has applied principles of chaos theory to improve weather forecasts as a member of the Mathematics and Climate Research Network, and developed a real-time remote sensor of global happiness using messages from Twitter: the Hedonometer. Danforth co-runs the Computational Story Lab with Peter Dodds, and helps run UVM's reading group on complexity.
Laurent studies the interaction of structure and dynamics. His research involves network theory, statistical physics, and nonlinear dynamics, along with their applications in epidemiology, ecology, biology, and sociology. Recent projects include comparing complex networks of different natures, the coevolution of human behavior and infectious diseases, understanding the role of forest shape in determining the stability of tropical forests, and the impact of echo chambers in political discussions.
Hines' work broadly focuses on finding ways to make electric energy more reliable and affordable, with less environmental impact. Particular topics of interest include understanding the mechanisms by which small problems in the power grid become large blackouts, identifying and mitigating the stresses caused by large amounts of electric vehicle charging, and quantifying the impact of high penetrations of wind and solar power on electricity systems.
Bagrow's interests include: Complex Networks (community detection, social modeling and human dynamics, statistical phenomena, graph similarity and isomorphism), Statistical Physics (non-equilibrium methods, phase transitions, percolation, interacting particle systems, spin glasses), and Optimization (glassy techniques such as simulated/quantum annealing, (non-gradient) minimization of noisy objective functions).