Grammar induction

May 30, 2011 — August 9, 2021

generative art
grammar
language
machine learning
stringology

Mathematically speaking, inferring a “formal language” that can describe a given set of expressions. In the slightly looser sense used by linguists studying natural human language, discovering the syntactic rules of a given language, which is kinda the same thing, but with every term sloppier and the subject matter itself messier.
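To make that concrete, here is a toy sketch of the string-flavoured version of the problem in Python (my own illustrative names, not any particular library): build a prefix tree acceptor from positive example strings, the usual starting point for regular-language induction algorithms such as RPNI or ALERGIA, which then generalise by merging compatible states.

```python
# Toy sketch of regular-language induction, step zero: a prefix tree acceptor
# (PTA) that accepts exactly the training strings. Real induction algorithms
# (RPNI, ALERGIA, ...) generalise from here by merging compatible states;
# this code is illustrative only.

def build_pta(examples):
    """Trie-shaped DFA: transitions[state][symbol] -> next state, plus accepting states."""
    transitions = {0: {}}
    accepting = set()
    fresh = 1
    for word in examples:
        state = 0
        for symbol in word:
            if symbol not in transitions[state]:
                transitions[state][symbol] = fresh
                transitions[fresh] = {}
                fresh += 1
            state = transitions[state][symbol]
        accepting.add(state)
    return transitions, accepting

def accepts(transitions, accepting, word):
    state = 0
    for symbol in word:
        if symbol not in transitions[state]:
            return False
        state = transitions[state][symbol]
    return state in accepting

transitions, accepting = build_pta(["ab", "aab", "aaab"])
print(accepts(transitions, accepting, "aab"))    # True: a training string
print(accepts(transitions, accepting, "aaaab"))  # False: the bare PTA never generalises
```

Everything interesting in grammar induction lives in the generalisation step this sketch omits: deciding which states to merge, and under what guarantee.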

This is already a crazily complex area, and, being naturally perverse, I am interested in an especially esoteric corner of it, to wit, grammars of things that aren’t speech. Inferring design grammars, say, could allow you to produce more things off the same “basic plan” from a few examples of the thing; look at enough trees and you know how to build the rest of the forest. That is called inverse procedural modelling, apparently. I’m especially interested in things expressed not as a sequence of symbols from a finite alphabet (i.e. not over the free monoid), or observed only partially. This is a boutique interest, although not entirely novel.
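Going in the forward direction first: design grammars in the Smith (1984) tradition are rewriting systems, and a tiny L-system shows how one small rule set generates a whole family of branching structures. The rule below is a standard textbook bracketed L-system, included purely as illustration; inverse procedural modelling is the (much harder) problem of recovering such rules from finished designs.

```python
# A toy L-system: a parallel string-rewriting grammar. "F" means draw forward,
# "+"/"-" turn, "[" / "]" push/pop the drawing state. One small rule generates
# an open-ended family of plant-like designs.

def rewrite(axiom, rules, steps):
    s = axiom
    for _ in range(steps):
        s = "".join(rules.get(symbol, symbol) for symbol in s)
    return s

plant_rules = {"F": "F[+F]F[-F]F"}  # classic textbook branching rule
print(rewrite("F", plant_rules, 2))
# F[+F]F[-F]F[+F[+F]F[-F]F]F[+F]F[-F]F[-F[+F]F[-F]F]F[+F]F[-F]F
```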

Fun aside: In the 19th century it was fashionable to think of design as a grammatical thing, although I am not exactly clear on how similar their notion of grammar was to mine.

Figure 2: Owen Jones, Grammar of ornament

I also care about probabilistic grammars, i.e. assigning probability measures to these things, so that a grammar defines a distribution over expressions rather than a bare set of them.

Normally design grammars deal with simple languages, such as, say, “regular” languages. I’m interested in things a rung or two up the Chomsky hierarchy: context-free grammars, maybe even context-sensitive ones.
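To pin both of the last two points down, here is a toy probabilistic context-free grammar in Chomsky normal form for the (non-regular) language {aⁿbⁿ}, plus a brute-force inside computation of string probabilities in the spirit of Lari and Young (1990). The grammar and code are my own illustration, not anyone’s published model.

```python
from collections import defaultdict

# Toy PCFG in Chomsky normal form for {a^n b^n}, n >= 1.
# Binary rules: (lhs, rhs1, rhs2, probability); lexical rules: (lhs, terminal, probability).
binary_rules = [("S", "A", "T", 0.5), ("S", "A", "B", 0.5), ("T", "S", "B", 1.0)]
lexical_rules = [("A", "a", 1.0), ("B", "b", 1.0)]

def inside_probability(tokens, start="S"):
    """CYK-style inside algorithm: total probability that the grammar generates `tokens`."""
    n = len(tokens)
    chart = defaultdict(lambda: defaultdict(float))  # chart[(i, j)][X] = P(X =>* tokens[i:j])
    for i, token in enumerate(tokens):
        for lhs, terminal, p in lexical_rules:
            if terminal == token:
                chart[(i, i + 1)][lhs] += p
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):
                for lhs, r1, r2, p in binary_rules:
                    chart[(i, j)][lhs] += p * chart[(i, k)][r1] * chart[(k, j)][r2]
    return chart[(0, n)][start]

print(inside_probability(list("ab")))    # 0.5
print(inside_probability(list("aabb")))  # 0.25
print(inside_probability(list("abab")))  # 0.0 -- not in the language
```

Fitting the rule probabilities from data, rather than hand-specifying them as here, is what the inside-outside algorithm is for.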

See also design grammars, iterated function systems, and my defunct research proposal in this area. Cosma Shalizi’s inevitable mention in 3, 2, 1… go.

A nice position paper by Peter Norvig on Chomsky and statistical versus explanatory models of natural language syntax. Full of sick burns.

Basically, Chomsky says

It’s true there’s been a lot of work on trying to apply statistical models to various linguistic problems. I think there have been some successes, but a lot of failures. There is a notion of success … which I think is novel in the history of science. It interprets success as approximating unanalyzed data.

Norvig then says that, actually, opaque, predictive approximations are OK and scientifically interesting. That was 2011, and since then the scruffy, dense, opaque, predictive models have only become more ascendant in language processing, particularly transformer networks, which do a disturbingly good job of handling language without an explicit grammar.

1 References

Angluin. 1987. “Learning Regular Sets from Queries and Counterexamples.” Information and Computation.
———. 1988. “Identifying Languages from Stochastic Examples.” No. YALEU/DCS/RR-614.
Badii, and Politi. 1999. Complexity: Hierarchical Structures and Scaling in Physics. Cambridge Nonlinear Science Series.
Bolhuis, Tattersall, Chomsky, et al. 2014. “How Could Language Have Evolved?” PLoS Biol.
Boulanger-Lewandowski, Bengio, and Vincent. 2012. “Modeling Temporal Dependencies in High-Dimensional Sequences: Application to Polyphonic Music Generation and Transcription.” In 29th International Conference on Machine Learning.
Casacuberta, and de la Higuera. 2000. “Computational Complexity of Problems on Probabilistic Grammars and Transducers.” In Grammatical Inference: Algorithms and Applications.
Charniak. 1996a. “Tree-Bank Grammars.” In Proceedings of the Thirteenth National Conference on Artificial Intelligence.
———. 1996b. Statistical Language Learning.
Chater, and Manning. 2006. “Probabilistic Models of Language Processing and Acquisition.” Trends in Cognitive Sciences.
Chomsky, N. 1956. “Three Models for the Description of Language.” IRE Transactions on Information Theory.
Chomsky, Noam. 2002. Syntactic Structures.
Clark, Alexander, and Eyraud. 2005. “Identification in the Limit of Substitutable Context-Free Languages.” In Algorithmic Learning Theory. Lecture Notes in Computer Science.
Clark, Peter, Tafjord, and Richardson. 2020. “Transformers as Soft Reasoners over Language.” In IJCAI 2020.
de la Higuera. 2000. “Current Trends in Grammatical Inference.” In Advances in Pattern Recognition. Lecture Notes in Computer Science.
———. 2005. “A Bibliographical Study of Grammatical Inference.” Pattern Recognition.
———. 2010. Grammatical Inference: Learning Automata and Grammars.
de la Higuera, Piat, and Tantini. 2006. “Learning Stochastic Finite Automata for Musical Style Recognition.” In Implementation and Application of Automata. Lecture Notes in Computer Science.
Dehbi, Hadiji, Gröger, et al. 2017. “Statistical Relational Learning of Grammar Rules for 3D Building Reconstruction.” Transactions in GIS.
Duvenaud, Lloyd, Grosse, et al. 2013. “Structure Discovery in Nonparametric Regression Through Compositional Kernel Search.” In Proceedings of the 30th International Conference on Machine Learning (ICML-13).
Elman. 1990. “Finding Structure in Time.” Cognitive Science.
———. 1991. “Distributed Representations, Simple Recurrent Networks, and Grammatical Structure.” Machine Learning.
———. 1993. “Learning and Development in Neural Networks: The Importance of Starting Small.” Cognition.
Fee, Kozhevnikov, and Hahnloser. 2004. “Neural Mechanisms of Vocal Sequence Generation in the Songbird.” Annals of the New York Academy of Sciences.
Gaillard, and Baudin. 2014. “A Consistent Deterministic Regression Tree for Non-Parametric Prediction of Time Series.” arXiv:1405.1533 [Cs, Math, Stat].
Giesa, Spivak, and Buehler. 2011. “Reoccurring Patterns in Hierarchical Protein Materials and Music: The Power of Analogies.” BioNanoScience.
Gold. 1967. “Language Identification in the Limit.” Information and Control.
Grosse, Salakhutdinov, Freeman, et al. 2012. “Exploiting Compositionality to Explore a Large Space of Model Structures.” In Proceedings of the Conference on Uncertainty in Artificial Intelligence.
Grünwald. 1996. “A Minimum Description Length Approach to Grammar Inference.” In Connectionist, Statistical, and Symbolic Approaches to Learning for Natural Language Processing. Lecture Notes in Computer Science.
Hannun, Pratap, Kahn, et al. 2020. “Differentiable Weighted Finite-State Transducers.” arXiv:2010.01003 [Cs, Stat].
Heckel, Lajios, and Menge. 2004. “Stochastic Graph Transformation Systems.” In Graph Transformations. Lecture Notes in Computer Science.
Hopcroft, and Ullman. 1979. Introduction to Automata Theory, Languages and Computation.
Hsu, Chater, and Vitányi. 2011. “The Probabilistic Analysis of Language Acquisition: Theoretical, Computational, and Experimental Analysis.” Cognition.
Jackendoff. 2002. Foundations of Language: Brain, Meaning, Grammar, Evolution.
Jin, and Kozhevnikov. 2011. “A Compact Statistical Model of the Song Syntax in Bengalese Finch.” PLoS Comput Biol.
Johnson. 2004. “Gold’s Theorem and Cognitive Science.” Philosophy of Science.
Jones, Waring, Westwood, et al. 1868. The grammar of ornament.
Keller, and Lutz. 1997. “Evolving Stochastic Context-Free Grammars from Examples Using a Minimum Description Length Principle.” In 1997 Workshop on Automata Induction Grammatical Inference and Language Acquisition.
Kontorovich, Cortes, and Mohri. 2006. “Learning Linearly Separable Languages.” In Algorithmic Learning Theory. Lecture Notes in Computer Science 4264.
Lari, and Young. 1990. “The Estimation of Stochastic Context-Free Grammars Using the Inside-Outside Algorithm.” Computer Speech & Language.
López, Sempere, and García. 2004. “Inference of Reversible Tree Languages.” IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics.
Manning. 2002. “Probabilistic Syntax.” In Probabilistic Linguistics.
Manning, and Schütze. 1999. Foundations of Statistical Natural Language Processing.
Meyer. 1900. Handbook of ornament; a grammar of art, industrial and architectural designing in all its branches, for practical as well as theoretical use.
Mohri, and Riley. 2016. “A Disambiguation Algorithm for Weighted Automata.” Theoretical Computer Science.
Nevill-Manning, and Witten. 1997. “Identifying Hierarchical Structure in Sequences: A Linear-Time Algorithm.”
Nowak, Komarova, and Niyogi. 2001. “Evolution of Universal Grammar.” Science.
Pinker, and Bloom. 1990. “Natural Language and Natural Selection.” Behavioral and Brain Sciences.
Shalizi, and Crutchfield. 2000. “Computational Mechanics: Pattern and Prediction, Structure and Simplicity.”
Smith. 1984. “Plants, Fractals, and Formal Languages.” In SIGGRAPH Comput. Graph.
Talton, Yang, Kumar, et al. 2012. “Learning Design Patterns with Bayesian Grammar Induction.” In Proceedings of the 25th Annual ACM Symposium on User Interface Software and Technology - UIST ’12.
Tong, Bickett, Christiansen, et al. 2007. “Learning Grammatical Structure with Echo State Networks.” Neural Networks.
Valiant, L. G. 1984. “A Theory of the Learnable.” Commun. ACM.
Valiant, Leslie G. 2009. “Evolvability.” J. ACM.
Vidal, Thollard, de la Higuera, et al. 2005. “Probabilistic Finite-State Machines - Part I.” IEEE Transactions on Pattern Analysis and Machine Intelligence.
Vinyals, Kaiser, Koo, et al. 2015. “Grammar as a Foreign Language.” In Advances in Neural Information Processing Systems 28.