Book contents
- Frontmatter
- Contents
- Preface
- Acknowledgements
- 1 Introduction
- 2 The data and some applications
- Part I The Tools
- Part II What Does Learning a Language Mean?
- Part III Learning Algorithms and Techniques
- 11 Text learners
- 12 Informed learners
- 13 Learning with queries
- 14 Artificial intelligence techniques
- 15 Learning context-free grammars
- 16 Learning probabilistic finite automata
- 17 Estimating the probabilities
- 18 Learning transducers
- 19 A very small conclusion
- References
- Index
11 - Text learners
from Part III - Learning Algorithms and Techniques
Published online by Cambridge University Press: 05 July 2014
Summary
Man is in search of a new language to which the grammar of no tongue will have anything to say.
Guillaume Apollinaire

The method that will be used is equivalent to finding a PSL (context-free phrase structure language, Chomsky (1956)) that in some sense best 'fits' the set [α1]. The measure of goodness of fit will be the product of the a-priori probability of the language selected and the probability that the language selected would produce [α1] as a set of acceptable sentences.
Ray Solomonoff (Solomonoff, 1964)

From the results of the previous section, it is quite clear that DFAs cannot be identified in the limit from text. If the chosen task is indeed to learn DFAs from text, we therefore need either some help (perhaps some extra information about the strings, or some structure), or else we must add an extra bias and reduce the class of automata.
Note that adding more bias also amounts to adding more information (something like 'we know that the automaton has this property').
Window languages
It is well known that the regular languages are exactly the languages that can be parsed using a bounded memory, but that bound is fixed a priori. We may therefore consider the subclass of languages whose parsing uses only a memory of size k, meaning that the next letter to be read depends only on the k − 1 previous characters.
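The idea of a size-k memory window can be made concrete with a small sketch of a learner for k-testable languages in the strict sense, which are characterised by their allowed prefixes, suffixes, and sliding windows of length k. The code below is an illustrative sketch, not the book's own algorithm; the function names `learn_ktss` and `accepts` are hypothetical.

```python
def learn_ktss(sample, k):
    """Collect, from positive strings only, the allowed prefixes and
    suffixes of length k-1 and the allowed windows of length k."""
    prefixes, suffixes, windows, short = set(), set(), set(), set()
    for w in sample:
        if len(w) < k:
            short.add(w)              # strings shorter than k are memorised
            continue
        prefixes.add(w[:k - 1])
        suffixes.add(w[-(k - 1):])
        for i in range(len(w) - k + 1):
            windows.add(w[i:i + k])   # every factor of length k
    return prefixes, suffixes, windows, short

def accepts(model, w, k):
    """A string is accepted when its prefix, suffix and every
    length-k window were seen in the learning sample."""
    prefixes, suffixes, windows, short = model
    if len(w) < k:
        return w in short
    return (w[:k - 1] in prefixes
            and w[-(k - 1):] in suffixes
            and all(w[i:i + k] in windows
                    for i in range(len(w) - k + 1)))
```

For instance, learning from {ab, abab} with k = 2 yields the windows {ab, ba}, so abababab is accepted while aab is rejected (the window aa was never observed). Because the learner only ever adds windows as new positive strings arrive, it converges on a fixed description from text alone, which is exactly the kind of restricted bias the passage above calls for.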
- Type
- Chapter
- Information
- Grammatical Inference: Learning Automata and Grammars, pp. 217-236
- Publisher: Cambridge University Press
- Print publication year: 2010