Krieger-Eisenhower Professor of Cognitive Science
Precise theories of higher cognitive domains like language and reasoning rely crucially on complex symbolic rule systems like those of grammar and logic. According to traditional cognitive science and artificial intelligence, such symbolic systems are the very essence of higher intelligence. Yet intelligence resides in the brain, where computation appears to be numerical, not symbolic; parallel, not serial; quite distributed, not as highly localized as in symbolic systems. Furthermore, when observed carefully, much of human behavior is remarkably sensitive to the detailed statistical properties of experience; hard-edged rule systems seem ill-equipped to handle these subtleties. My research attempts to identify the proper roles within a unified theory of cognition for symbolic computation, numerical neural computation, and statistical computation.
More specifically, the basic questions driving this research include: What are the central general principles of computation in connectionist -- abstract neural -- networks? How can these principles be reconciled with those of symbolic computation? Addressing these questions over the past two decades, my work has led to a new computational architecture for cognition which integrates connectionist and symbolic computation. Can this framework further the theory of higher cognition, by connecting it with lower-level principles derived from neural computation?
The connectionist conception of intuitive knowledge as a collection of conflicting soft constraints, interacting via optimization of well-formedness or Harmony, led, in joint research with Géraldine Legendre, to the connectionist-based formalism of Harmonic Grammar. Incorporating the richly structured representations and universal well-formedness constraints of symbolic linguistic theory, Alan Prince and I developed a grammar formalism called Optimality Theory, which brings general connectionist computational principles of optimization into the heart of the symbolic theory of universal grammar. The optimization that emerges is no longer inherently numerical: constraint strengths are encoded in a hierarchy of constraints, ranked from strongest to weakest; each constraint is stronger than all weaker constraints combined.
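The relation between the two formalisms can be made concrete in a minimal sketch (all names here are illustrative, not from the original work): in Harmonic Grammar, Harmony is a weighted sum of constraint violations, while under Optimality Theory's strict domination, comparing candidates reduces to lexicographic comparison of their violation vectors. Strict domination emerges from numerical Harmony when each constraint's weight exceeds the combined weight of all weaker constraints, e.g. exponentially spaced weights.

```python
def harmony(violations, weights):
    """Harmonic Grammar: Harmony = negative weighted sum of violations."""
    return -sum(w * v for w, v in zip(weights, violations))

def ot_prefers(a, b):
    """OT strict domination: candidate a beats b iff a's violation vector
    (constraints ordered strongest first) is lexicographically smaller."""
    return a < b  # Python tuple comparison is lexicographic

# Weights chosen so each exceeds the sum of all weaker ones:
weights = [8, 4, 2, 1]
a = (0, 1, 1, 1)  # violates the three weakest constraints once each
b = (1, 0, 0, 0)  # violates only the strongest constraint

assert ot_prefers(a, b)                           # a wins by strict domination
assert harmony(a, weights) > harmony(b, weights)  # and under these weights
```

With such weights, no accumulation of weaker violations can outweigh a single violation of a stronger constraint, which is exactly the non-numerical ranking described above.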
According to Optimality Theory (OT), possible human languages share a common set of universal constraints on well-formedness. These constraints are highly general, and hence conflict; thus some must be violated in optimal, i.e., grammatical, structures. The different surface patterns of the world's languages emerge via different priority rankings of the fixed set of universal constraints: each ranking is a language-particular grammar, a means of resolving the inherent conflicts among the universal constraints.
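A toy illustration of this idea, with invented constraint names and candidates (a simplification of standard OT syllable-structure examples, not drawn from the original text): the same two candidate outputs compete under the same constraint set, and re-ranking the constraints changes which candidate is optimal.

```python
def optimal(candidates, ranking):
    """Return the candidate whose violation profile, read in ranking order
    (strongest constraint first), is lexicographically minimal."""
    return min(candidates, key=lambda c: tuple(c[1][k] for k in ranking))

# Hypothetical constraints:
#   NOCODA: syllables should not end in a consonant;
#   FAITH:  the output should match the input (no deletion).
candidates = [
    ("tan", {"NOCODA": 1, "FAITH": 0}),  # faithful form, with a coda
    ("ta",  {"NOCODA": 0, "FAITH": 1}),  # repaired form, coda deleted
]

lang_A = ["FAITH", "NOCODA"]  # faithfulness outranks markedness
lang_B = ["NOCODA", "FAITH"]  # markedness outranks faithfulness

assert optimal(candidates, lang_A)[0] == "tan"  # codas tolerated
assert optimal(candidates, lang_B)[0] == "ta"   # codas repaired
```

Each ranking resolves the conflict between the two constraints differently, so the two "languages" surface different forms from the same input, mirroring how language-particular grammars arise in OT.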
My current research addresses multiple aspects of OT. These include superadditive constraint interaction ('local conjunction' of constraints), especially in phonology (vowel harmony; Obligatory Contour Principle; sonority and syllable structure), as well as numerical and connectionist implementation of OT constraint interaction.