Highlighting the close relationship between linguistic explanation and learnability, Bruce Tesar and Paul Smolensky examine the implications of Optimality Theory (OT) for language learnability. They show how the core principles of OT lead to the learning principle of constraint demotion, the basis for a family of algorithms that infer constraint rankings from linguistic forms. Of primary concern to the authors are the ambiguity of the data received by the learner and the resulting interdependence of the core grammar and the structural analysis of overt linguistic forms. The authors argue that iterative approaches to such interdependencies, inspired by work in statistical learning theory, can be successfully adapted to address the interdependencies of language learning. Both OT and constraint demotion play critical roles in their adaptation. The authors support their findings both formally and through simulations, and they illustrate how their approach could be extended to other language learning issues, including subset relations and the learning of phonological underlying forms.
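To give a flavor of the idea, here is a minimal sketch of recursive constraint demotion in the spirit the blurb describes: from winner-loser pairs of candidate forms, the learner repeatedly ranks as high as possible every constraint that prefers no loser, then discards the pairs those constraints account for. The data representation (sets of constraints preferring each candidate) and all names are illustrative, not the book's own notation.

```python
def rcd(constraints, pairs):
    """Infer a stratified constraint ranking (highest stratum first).

    constraints: set of constraint names.
    pairs: list of (winner_prefs, loser_prefs) tuples, where each
           element is the set of constraints preferring that candidate
           in one winner-loser comparison.
    """
    remaining = set(constraints)
    active = list(pairs)
    strata = []
    while remaining:
        # Constraints that prefer no loser in any still-unexplained
        # pair may be ranked as high as possible: the next stratum.
        stratum = {c for c in remaining
                   if not any(c in loser for _, loser in active)}
        if not stratum:
            raise ValueError("data inconsistent: no ranking exists")
        strata.append(stratum)
        remaining -= stratum
        # Pairs whose winner is preferred by a newly ranked constraint
        # are now accounted for and drop out.
        active = [(w, l) for (w, l) in active if not (w & stratum)]
    return strata

# Hypothetical data: C must outrank A (second pair), A must outrank B.
example = rcd({"A", "B", "C"}, [({"A"}, {"B"}), ({"C"}, {"A"})])
# example == [{"C"}, {"A"}, {"B"}]
```

All constraints start at the top and are only ever demoted below the constraints that dominate them, which is the intuition behind the constraint-demotion family of algorithms the book develops.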
Paul Smolensky is Professor of Cognitive Science at Johns Hopkins University. He was a leading member of the PDP connectionist research group, and is the recipient of the 2005 David E. Rumelhart Prize in Cognitive Science, which is awarded annually to an individual or collaborative team making a significant contribution to the formal analysis of human cognition.
- ISBN: 9780262201261