© Copyright 2005 Peri Hankey - documentation license Gnu FDL - code license Gnu GPL

a paradigm shift

In these comments (adapted from an email exchange) I explain why my presentation of the language machine appears not to take account of much that has happened since 1957. The point is that almost all subsequent work in formal grammar has been dominated by the generative model of language. The language machine goes back to some fundamental ideas in Chomsky's original papers, and reinterprets them as central to a paradigm that concentrates on recognising sequences of grammatical symbols and substituting for them, with a system of variable bindings - the analytic model of language.


some numbers

I am of course aware that there have been many developments since Chomsky's original paper. But if you consider the grammatical frameworks that are listed in Wikipedia, you have a group of 4 varieties of "functional grammar" which concern themselves more or less with language as social interaction, a group of 7 frameworks that relate more or less directly to the generative tradition, and one (x-bar) which I am at present unable to place, but which I suspect is some kind of 'go-faster stripe' for context-free grammars.

It can also be instructive to look at some searches on Google:

 "analytic grammar"     produces about     200 results   
 "recognition grammar"  produces about  34,700 results  ("speech recognition grammar" : 28,100)
 "generative grammar"   produces about 228,000 results
 "unrestricted grammar" produces about     817 results
 "context-free grammar" produces about 256,000 results

the generative model

In the generative model, the rewriting rules of a grammar go from the grammar to the sentences - the grammar generates the language by successively rewriting strings of symbols until a sentence emerges. Analysis is seen as a matter of deciding whether a particular sentence can be generated from the grammar. Actual analysers cannot work this way, so as a way of thinking about analysis it is deeply unhelpful and far removed from direct utility, as the sketch below suggests. As far as I know there is almost no work in this tradition that seriously attempts to make sense of the unrestricted rewriting rules in type-0 grammars (van Wijngaarden w-grammars are an exception, though to my knowledge they have never been successfully implemented). The limitations of context-free grammars are very well known, so they have to be augmented with additional mechanisms to be useful. The result of all this is additional complexity, and a desolate sense of dead end.
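To see why, consider a minimal sketch in Python (a hypothetical toy grammar, invented for illustration and nothing to do with the language machine's own notation): generating is a simple walk outwards from the start symbol, but 'analysis' in this direction can only mean searching the space of derivations for one that happens to produce the sentence in hand.

    import random

    # A toy context-free grammar, generative direction: rewrite the start
    # symbol until only terminal symbols remain.
    rules = {
        "sentence": [["noun", "verb"]],
        "noun":     [["birds"], ["fish"]],
        "verb":     [["fly"], ["swim"]],
    }

    def generate(symbols=("sentence",)):
        out = []
        for s in symbols:
            if s in rules:                       # nonterminal: rewrite it
                out.extend(generate(random.choice(rules[s])))
            else:                                # terminal: emit it
                out.append(s)
        return out

    # Generating is easy; 'analysing' in the generative direction means
    # searching all derivations for one that yields the given sentence:
    def derives(sentence, symbols=("sentence",)):
        if not symbols:
            return not sentence
        head, rest = symbols[0], symbols[1:]
        if head in rules:
            return any(derives(sentence, tuple(alt) + rest) for alt in rules[head])
        return bool(sentence) and sentence[0] == head and derives(sentence[1:], rest)

    print(generate())                 # e.g. ['birds', 'swim']
    print(derives(["fish", "fly"]))   # True - but only found by blind search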

the analytic model

The analytic model of language that David Hendry and I developed shares with Chomsky's original theory the need for two different kinds of symbol, and a system of rewriting or substitution rules. But it concentrates on doing analysis by recognition and substitution, and it adds a system of variable bindings - without these you emerge at the end of the sentence with nothing to show for it but the magic symbol "sentence" which stands for all possible sentences. Beyond the elements that it has in common with Chomsky's original theory, it goes its own way.
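Here is a minimal sketch of the analytic direction, again in Python with invented rules (the system's actual metalanguage looks nothing like this): recognise a sequence in the input, substitute for it, and carry a variable binding so that the analysis leaves more behind than the bare symbol 'sentence'.

    # Analytic rewriting: scan for a recognisable sequence, substitute the
    # rule's result, and record a binding as a side effect. Toy rules only.
    def step(syms, env):
        for i in range(len(syms)):
            # rule 1: recognise  digit 'plus' digit  - substitute 'number'
            if (i + 2 < len(syms) and syms[i].isdigit()
                    and syms[i + 1] == "plus" and syms[i + 2].isdigit()):
                env["value"] = int(syms[i]) + int(syms[i + 2])  # variable binding
                return syms[:i] + ["number"] + syms[i + 3:]
            # rule 2: recognise  'number'  - substitute 'sentence'
            if syms[i] == "number":
                return syms[:i] + ["sentence"] + syms[i + 1:]
        return None

    syms, env = ["3", "plus", "4"], {}
    while (nxt := step(syms, env)) is not None:
        syms = nxt
    print(syms, env)   # ['sentence'] {'value': 7}

The bindings are what rescue the analysis from vacuity: the symbols reduce to 'sentence', but the environment keeps a record of what was actually recognised along the way.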

the language machine contains the lambda calculus

The lambda experiment demonstrates that the system of rules and variables at the heart of the language machine effectively contains the lambda calculus, and so is in effect a universal machine. The lambda notation is not in itself of any particular importance to me except as a way of making this point - it's just another language, albeit an interesting and immensely influential one. But it is a significant point - my system claims to implement unrestricted grammars, and the languages that are generated by unrestricted grammars are the languages that can be recognised by universal machines. So my machine had better be universal, or else I have egg on my face.
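To make the point concrete, here is a minimal sketch in Python of the lambda calculus itself (standard beta reduction, not the language machine's own lambda experiment): reduction is precisely a recognise-and-substitute rule, applied until no redex remains.

    # Terms: a variable name, ('lam', x, body), or ('app', f, a). Naive
    # substitution, fine for terms whose bound names are distinct.
    def subst(term, name, value):
        if isinstance(term, str):
            return value if term == name else term
        if term[0] == "lam":
            _, x, body = term
            return term if x == name else ("lam", x, subst(body, name, value))
        _, f, a = term
        return ("app", subst(f, name, value), subst(a, name, value))

    def reduce_(term):
        if isinstance(term, str):
            return term
        if term[0] == "app":
            f = reduce_(term[1])
            if isinstance(f, tuple) and f[0] == "lam":      # recognise a redex...
                return reduce_(subst(f[2], f[1], term[2]))  # ...substitute the body
            return ("app", f, reduce_(term[2]))
        return ("lam", term[1], reduce_(term[2]))

    identity = ("lam", "x", "x")
    print(reduce_(("app", identity, "y")))                              # 'y'
    print(reduce_(("app", ("lam", "f", ("app", "f", "y")), identity)))  # 'y'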

The correspondence between unrestricted rules in my system and functions in the lambda calculus shows pretty conclusively that unrestricted rules as written in the generative direction are equivalent to functions in the lambda calculus that have been written back-to-front, with the name and arguments after the body of the function. So it is not surprising that they always seemed hard.
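For example (in an invented arrow notation, not the system's own): a successor rule reads in the analytic direction as 'succ :n -> n + 1', with the name and argument first and the body after, exactly like a function definition; the same rule taken in the generative direction runs 'n + 1 -> succ n', with the body before the name and argument - a function written back-to-front.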

the lm-diagram

My lm-diagram directly shows what happens when you apply unrestricted analytical rewriting rules to an input stream, and so can represent anything such rules can do - which is to say, any conceivable process of grammatical analysis or computation. The software can generate the diagram directly in real time, so you can see what it does as it does it. It shows exactly why you cannot apply unrestricted grammar online, symbol by symbol as each arrives, unless you have two tightly coupled processes that recurse independently in counterpoint to each other, and it shows what you have to do to recover from partial ambiguity. You can also relate directly to the diagram a whole structure of variable reference scopes which enable rules to construct multiple simultaneous representations of what has been analysed.

Finally, a purely generative system cannot do any kind of analysis, while an analytic system that can substitute more symbols than it recognises can operate as a generative system. A generative grammar cannot implement the language machine. The language machine could implement generative grammar - if such a thing exists in any practical sense.
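The asymmetry is easy to demonstrate with the same kind of scanner as in the earlier sketches: give it a toy rule (again invented for illustration) whose substituted side is longer than the side it recognises, and repeated application expands rather than reduces - generation rather than analysis.

    import random

    # One symbol recognised, several substituted: the recognise-and-
    # substitute machinery now runs generatively. The recursive choice
    # may expand repeatedly; it terminates with probability 1.
    def expand(syms):
        for i, s in enumerate(syms):
            if s == "sentence":                          # recognise one symbol
                return syms[:i] + random.choice(
                    [["noun", "verb"],                   # ...substitute two
                     ["sentence", "and", "sentence"]]    # ...or three
                ) + syms[i + 1:]
        return None

    syms = ["sentence"]
    while (nxt := expand(syms)) is not None:
        syms = nxt
    print(syms)   # e.g. ['noun', 'verb', 'and', 'noun', 'verb']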

the paradigm shift

My experience has been that you can't get from the generative model to the analytic model without going through a kind of paradigm shift. My emphasis on the early theory was intended as a way of providing some common ground - this is what we have in common, now hold onto it, stand on your feet, and look at the whole thing from a completely different angle - you'll find that apples confusingly (at first) hang down from the branches. After a time you'll see that this way round, some things that looked difficult turn out to be easy. No amount of analysis of subsequent embellishments of the original theory can help you make this transition.

the implementation

However, I am fairly certain that at present there is no other system that exists, works, is so fundamentally simple, contains both computation and the most general category of grammar, is directly applicable, and brings with it a complete system for visualising the process and structure of linguistic and computational analysis as it unfolds in time.

Incidentally, the system is not, as some people seem to assume, a 'proposed' or 'toy' system - it already exists in its entirety as free software that is available for anyone to use, modify and improve, subject only to the terms of the Gnu GPL. The full text of the system and its metalanguage compilers, with numerous examples including the lambda experiment, is presented on the website on pages that are themselves produced by a fairly mundane application of its own technology. In its previous incarnations since the early 1980s it remained in everyday use in several real-life projects, running with little attention for years at a time.
