  • 1 ELTE Eötvös Loránd University, Budapest

Most linguistic theories postulate structures containing covert information that is not directly recoverable from utterances. Hence, learners have to interpret their data before drawing conclusions from them. Within the framework of Optimality Theory (OT), Tesar & Smolensky (1998) proposed Robust Interpretive Parsing (RIP), suggesting that learners rely on their still imperfect grammars to interpret the learning data. I introduce an alternative, more cautious approach, Joint Robust Interpretive Parsing (JRIP). The learner entertains a population of several grammars, which join forces to interpret the learning data. A standard metrical phonology grammar is employed to demonstrate that JRIP performs significantly better than RIP.
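The contrast described above can be sketched in a few lines of code. This is a rough illustration only, not the paper's actual algorithm: the constraint names, the toy candidate set, and the majority-vote aggregation are all invented here for expository purposes. The idea is that a RIP-style learner asks its single (possibly wrong) ranking to pick the covert parse of an overt form, while a JRIP-style learner lets a population of rankings decide jointly.

```python
# Toy OT setup (hypothetical): an overt stress pattern is compatible with
# several covert foot structures; a "grammar" is a strict ranking of
# violation-counting constraints, compared lexicographically.

CANDIDATES = {
    # overt form -> covert parses with their constraint-violation profiles
    "sigma-SIGMA": {
        "(sigma SIGMA)": {"Trochee": 1, "Iamb": 0},
        "sigma (SIGMA)": {"Trochee": 0, "Iamb": 0, "ParseSyl": 1},
    }
}

def violations(profile, ranking):
    """Violation vector of a parse, ordered by the grammar's ranking."""
    return tuple(profile.get(c, 0) for c in ranking)

def best_parse(overt, ranking):
    """RIP-style step: a single grammar picks the most harmonic parse."""
    parses = CANDIDATES[overt]
    return min(parses, key=lambda p: violations(parses[p], ranking))

def joint_parse(overt, population):
    """JRIP-style step (sketch): a population of grammars votes on the
    parse, and the majority interpretation wins."""
    votes = {}
    for ranking in population:
        p = best_parse(overt, ranking)
        votes[p] = votes.get(p, 0) + 1
    return max(votes, key=votes.get)
```

For example, a population in which most rankings place ParseSyl low will jointly select the monosyllabic-foot parse `sigma (SIGMA)`, even though a minority of its members would have parsed the form differently on their own; a RIP learner, by contrast, is at the mercy of whichever single ranking it currently holds.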

  • Biró, Tamás. 2003. Quadratic alignment constraints and finite state Optimality Theory. In Proceedings of the Workshop on Finite-State Methods in Natural Language Processing (FSMNLP), held within EACL-03, Budapest. 119–126. ROA-600.

  • Biró, Tamás. 2006. Finding the right words: Implementing Optimality Theory with simulated annealing. Doctoral dissertation. University of Groningen. ROA-896.

  • Biró, Tamás. 2010. OTKit: Tools for Optimality Theory. A software package. http://www.birot.hu/OTKit/

  • Biró, Tamás. 2013. Towards a robuster interpretive parsing: Learning from overt forms in Optimality Theory. Journal of Logic, Language and Information 22. 139–172.

  • Boersma, Paul. 1997. How we learn variation, optionality, and probability. Proceedings of the Institute of Phonetic Sciences, Amsterdam (IFA) 21. 43–58.

  • Boersma, Paul. 1998. Functional Phonology: Formalizing the interactions between articulatory and perceptual drives. Doctoral dissertation. University of Amsterdam. (Published by Holland Academic Graphics, The Hague.)

  • Boersma, Paul. 2003. Review of B. Tesar & P. Smolensky (2000): Learnability in OT. Phonology 20. 436–446.

  • Boersma, Paul. 2009. Some correct error-driven versions of the Constraint Demotion algorithm. Linguistic Inquiry 40. 667–686.

  • Boersma, Paul and Bruce Hayes. 2001. Empirical tests of the Gradual Learning Algorithm. Linguistic Inquiry 32. 45–86.

  • Boersma, Paul and Joe Pater. 2008. Convergence properties of a gradual learning algorithm for Harmonic Grammar. ROA-970.

  • Eisner, Jason. 1997. Efficient generation in Primitive Optimality Theory. In Proceedings of the 35th Annual Meeting of the Association for Computational Linguistics (ACL-1997) and 8th EACL. Madrid. 313–320. Also: ROA-206.

  • Hayes, Bruce. 1995. Metrical stress theory: Principles and case studies. Chicago: The University of Chicago Press.

  • Jäger, Gerhard. 2003. Simulating language change with functional OT. In S. Kirby (ed.) Language evolution and computation. Proceedings of the Workshop at ESSLLI, Vienna. 52–61.

  • Jarosz, Gaja. 2013. Learning with hidden structure in Optimality Theory and Harmonic Grammar: Beyond Robust Interpretive Parsing. Phonology 30. 27–71.

  • Magri, Giorgio. 2011. An online model of the acquisition of phonotactics within Optimality Theory. In L. Carlson, C. Hölscher and T. F. Shipley (eds.) Expanding the space of cognitive science: Proceedings of the 33rd Annual Conference of the Cognitive Science Society. Austin, TX: Cognitive Science Society. 2012–2017.

  • Magri, Giorgio. 2012. Convergence of error-driven ranking algorithms. Phonology 29. 213–269.

  • McCarthy, John J. 2003. OT constraints are categorical. Phonology 20. 75–138.

  • Niyogi, Partha. 2006. The computational nature of language learning and evolution. Cambridge, MA: MIT Press.

  • Pater, Joe. 2008. Gradual learning and convergence. Linguistic Inquiry 39. 334–345.

  • Prince, Alan and Paul Smolensky. 1993. Optimality theory: Constraint interaction in generative grammar. Technical Report TR-2, Center for Cognitive Science, Rutgers University, New Brunswick, N.J. and Technical Report CU-CS-697-93, Department of Computer Science, University of Colorado, Boulder.

  • Pulleyblank, Douglas and William J. Turkel. 2000. Learning phonology: Genetic algorithms and Yoruba tongue-root harmony. In J. Dekkers, F. van der Leeuw and J. van de Weijer (eds.) Optimality Theory: Phonology, syntax, and acquisition. Oxford: Oxford University Press. 554–591.

  • Reeves, Colin R. (ed.). 1995. Modern heuristic techniques for combinatorial problems. London: McGraw-Hill.

  • Riggle, Jason. 2004. Contenders and learning. In B. Schmeiser, V. Chand, A. Kelleher and A. Rodriguez (eds.) Proceedings of the 23rd West Coast Conference on Formal Linguistics (WCCFL 23). Somerville, MA: Cascadilla Press.

  • Riggle, Jason. 2009. Generating contenders. Technical report, ROA-1044.

  • Samek-Lodovici, Vieri and Alan Prince. 1999. Optima. ROA-363.

  • Tesar, Bruce and Paul Smolensky. 1998. Learnability in Optimality Theory. Linguistic Inquiry 29. 229–268.

  • Tesar, Bruce and Paul Smolensky. 2000. Learnability in Optimality Theory. Cambridge, MA: MIT Press.

  • Turkel, Bill. 1994. The acquisition of Optimality Theoretic systems. ROA-11.

  • Yang, Charles D. 2002. Knowledge and learning in natural language. Oxford: Oxford University Press.
