A categorial grammar consists of two parts: a lexicon, which assigns a set of types (also called categories) to each basic symbol, and some type inference rules, which determine how the type of a string of symbols follows from the types of the constituent symbols. It has the advantage that the type inference rules can be fixed once and for all, so that the specification of a particular language grammar is entirely determined by the lexicon.
A categorial grammar shares some features with the simply typed lambda calculus. Whereas the lambda calculus has only one function type, a categorial grammar typically has two function types: one which is applied on the left, one on the right. For example, a simple categorial grammar might have two function types B/A and A\B. The first, B/A, is the type of a phrase that results in a phrase of type B when followed (on the right) by a phrase of type A. The second, A\B, is the type of a phrase that results in a phrase of type B when preceded (on the left) by a phrase of type A.
As Lambek explains, the notation is based upon algebra. A fraction, when multiplied by (i.e. concatenated with) its denominator, yields its numerator. Since concatenation is not commutative, it makes a difference whether the denominator occurs to the left or right. The concatenation must be on the same side as the denominator for it to cancel out.
The first and simplest kind of categorial grammar is called a basic categorial grammar, or sometimes an AB-grammar (after Ajdukiewicz and Bar-Hillel). Given a set of primitive types Prim, let Tp(Prim) be the set of types constructed from primitive types. In the basic case, this is the least set such that Prim ⊆ Tp(Prim), and if X, Y ∈ Tp(Prim) then X/Y, Y\X ∈ Tp(Prim). Think of these as purely formal expressions freely generated from the primitive types; any semantics will be added later. Some authors assume a fixed infinite set of primitive types used by all grammars, but by making the primitive types part of the grammar, the whole construction is kept finite.
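The free generation of Tp(Prim) can be mirrored by a small algebraic datatype. The following is a minimal Python sketch; the class and variable names are my own choices, not part of the formal definition:

```python
from dataclasses import dataclass
from typing import Union

@dataclass(frozen=True)
class Prim:
    name: str                 # a primitive type

@dataclass(frozen=True)
class RSlash:                 # X/Y: gives an X when followed by a Y on the right
    result: "Type"
    arg: "Type"

@dataclass(frozen=True)
class LSlash:                 # Y\X: gives an X when preceded by a Y on the left
    arg: "Type"
    result: "Type"

Type = Union[Prim, RSlash, LSlash]

# Freely generated: any finite nesting of / and \ over primitives is a type.
A, B = Prim("A"), Prim("B")
t = RSlash(LSlash(A, B), A)   # the type (A\B)/A
```

Because the dataclasses are frozen, structural equality comes for free, which is all that is needed to compare types during reduction.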
A basic categorial grammar is a tuple (Σ, Prim, S, ◁) where Σ is a finite set of symbols, Prim is a finite set of primitive types, S ∈ Tp(Prim) is the distinguished type of complete sentences, and ◁ ⊆ Σ × Tp(Prim).
The relation ◁ is the lexicon, which relates types to symbols: (σ, T) ∈ ◁ means that the symbol σ may have type T. Since the lexicon is finite, it can be specified by listing a set of pairs like σ:T.
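Because the lexicon is a relation rather than a function, a single symbol may receive several types. A minimal sketch, with type names written as plain strings purely for illustration:

```python
# A lexicon as a finite set of (symbol, type) pairs.
lexicon = {
    ("the", "NP/N"),
    ("boy", "N"),
    ("eats", "NP\\S"),          # intransitive reading
    ("eats", "(NP\\S)/NP"),     # transitive reading: one symbol, two types
}

def types_of(symbol):
    """All types the lexicon assigns to a symbol."""
    return {t for (s, t) in lexicon if s == symbol}
```

A parser would try each type assigned to an ambiguous word and keep whichever assignment lets the whole string reduce.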
Such a grammar for English might have three basic types (N, NP and S), assigning count nouns the type N, complete noun phrases the type NP, and sentences the type S. Then an adjective could have the type N/N, because if it is followed by a noun then the whole phrase is a noun. Similarly, a determiner has the type NP/N, because it forms a complete noun phrase when followed by a noun. Intransitive verbs have the type NP\S, and transitive verbs the type (NP\S)/NP. Then a string of words is a sentence if it has overall type S.
For example, take the string "the bad boy made that mess". Now "the" and "that" are determiners, "boy" and "mess" are nouns, "bad" is an adjective, and "made" is a transitive verb, so the lexicon is {bad:N/N, boy:N, made:(NP\S)/NP, mess:N, that:NP/N, the:NP/N},
and the sequence of types in the string is

  the    bad   boy  made        that   mess
  NP/N,  N/N,  N,   (NP\S)/NP,  NP/N,  N

Now find functions and appropriate arguments and reduce them according to the two inference rules X/Y Y → X and Y Y\X → X:

  NP/N, N/N, N, (NP\S)/NP, NP/N, N
  → NP/N, N, (NP\S)/NP, NP/N, N     (bad boy → N)
  → NP, (NP\S)/NP, NP/N, N          (the [bad boy] → NP)
  → NP, (NP\S)/NP, NP               (that mess → NP)
  → NP, NP\S                        (made [that mess] → NP\S)
  → S
The fact that the result is S means that the string is a sentence, while the sequence of reductions shows that it must be parsed as ((the (bad boy)) (made (that mess))).
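The whole example can be sketched as a short Python program. All names below are my own; for simplicity it reduces greedily, leftmost first, which happens to succeed on this string, whereas a general parser would need to backtrack over reduction orders and lexical ambiguity:

```python
from dataclasses import dataclass
from typing import Union

@dataclass(frozen=True)
class Prim:
    name: str

@dataclass(frozen=True)
class RSlash:              # X/Y: yields X when followed by a Y
    result: "Type"
    arg: "Type"

@dataclass(frozen=True)
class LSlash:              # Y\X: yields X when preceded by a Y
    arg: "Type"
    result: "Type"

Type = Union[Prim, RSlash, LSlash]

def reduce_once(seq):
    """Apply one reduction, leftmost first: X/Y Y -> X  or  Y Y\\X -> X."""
    for i in range(len(seq) - 1):
        a, b = seq[i], seq[i + 1]
        if isinstance(a, RSlash) and a.arg == b:
            return seq[:i] + [a.result] + seq[i + 2:]
        if isinstance(b, LSlash) and b.arg == a:
            return seq[:i] + [b.result] + seq[i + 2:]
    return None            # no reduction applies

N, NP, S = Prim("N"), Prim("NP"), Prim("S")
lexicon = {
    "the": RSlash(NP, N), "that": RSlash(NP, N),   # NP/N
    "bad": RSlash(N, N),                           # N/N
    "boy": N, "mess": N,                           # N
    "made": RSlash(LSlash(NP, S), NP),             # (NP\S)/NP
}

seq = [lexicon[w] for w in "the bad boy made that mess".split()]
while len(seq) > 1:
    seq = reduce_once(seq)

print(seq == [S])   # the string reduces to S, i.e. it is a sentence
```

Tracing the loop reproduces exactly the reduction sequence shown above.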
Categorial grammars of this form (having only function application rules) are equivalent in generative capacity to context-free grammars and are thus often considered inadequate for theories of natural language syntax. Unlike CFGs, categorial grammars are lexicalized, meaning that only a small number of (mostly language-independent) rules are employed, and all other syntactic phenomena derive from the lexical entries of specific words.
Another appealing aspect of categorial grammars is that it is often easy to assign them a compositional semantics, by first assigning interpretation types to all the basic categories, and then associating all the derived categories with appropriate function types. The interpretation of any constituent is then simply the value of a function at an argument. With some modifications to handle intensionality and quantification, this approach can be used to cover a wide variety of semantic phenomena.
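As a toy illustration of such a compositional semantics (every denotation and name below is invented for this sketch, not taken from the text): interpret S as a truth value, NP as an entity, and N as a function from entities to truth values; then N/N denotes a function from noun meanings to noun meanings, NP/N from noun meanings to entities, and NP\S from entities to truth values.

```python
# Hypothetical two-entity model; entities are strings.
DOMAIN = ["Bill", "Tom"]

boy = lambda x: True                                    # N    : e -> t
bad = lambda noun: (lambda x: noun(x) and x == "Bill")  # N/N  : (e->t) -> (e->t)
sleeps = lambda x: x == "Bill"                          # NP\S : e -> t

def the(noun):                                          # NP/N : (e->t) -> e
    """Pick the unique entity satisfying the noun (presupposes uniqueness)."""
    matches = [x for x in DOMAIN if noun(x)]
    assert len(matches) == 1
    return matches[0]

# "((the (bad boy)) sleeps)": interpretation is just function application,
# mirroring the syntactic derivation step by step.
value = sleeps(the(bad(boy)))
print(value)   # True in this toy model
```

The point is that the bracketing found by the syntactic reductions dictates exactly which function is applied to which argument.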