Tuesday, February 5, 2013

In linguistics, what is Constraint-based Grammar? Why is it identified with the feature of reversibility?

Linguistic knowledge is innate, implicit, and shared by every member of our species. Moreover, any language in the world is acquired rapidly and effortlessly by a human child once he or she is exposed to an environment in which that language is spoken. Within the first three years of life, the child begins to infer, deduce and generalize language-specific rules from very limited language input. Noam Chomsky calls this the problem of the “poverty of the stimulus”: the input is limited because, for many linguistic rules, there is no explicit positive or negative feedback. Yet the child never makes structure-independent errors (Chomsky, 1971). Acquiring a language, then, is not much of a problem for a child even at the age of 2-3 years. It is, however, quite a task for the language researcher, who already knows a great deal about the structure of a given language, to decode the rules of this complex system and define them explicitly enough to make machines learn human language.


Constraint-based grammars, or constraint-based theories of grammar, express this implicit linguistic knowledge in the form of well-defined, explicit mathematical rules or algorithms. They have gained popularity in Computational Linguistics and, even more so, in Natural Language Processing (NLP), and they have been developed for many languages besides English. Formal, uniform, context-dependent rules written in clear linguistic terminology are used to analyze a given text of a language. Constraints are introduced at different grammatical levels (words, phrases, sentences, etc.), and tags are applied to tokens of a particular type in the text before parsing and semantic interpretation take place.
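To make the mechanism concrete, here is a minimal sketch in Python of how a word-level tag and a sentence-level agreement constraint interact. The unify function, the toy lexicon and the feature names ("cat", "num", "per") are invented for illustration and do not come from any particular formalism.

# A minimal sketch of checking an agreement constraint by
# unifying feature structures. The lexicon and the feature
# names ("cat", "num", "per") are invented for illustration.

def unify(fs1, fs2):
    """Merge two feature structures (dicts); return None on conflict."""
    result = dict(fs1)
    for feature, value in fs2.items():
        if feature not in result:
            result[feature] = value
        elif result[feature] != value:
            return None  # conflicting values: the constraint fails
    return result

# Word-level tags: each token carries a small feature structure.
lexicon = {
    "she":  {"cat": "NP", "num": "sg", "per": 3},
    "runs": {"cat": "V",  "num": "sg", "per": 3},
    "run":  {"cat": "V",  "num": "pl"},
}

def agree(subject, verb):
    """Sentence-level constraint: S -> NP V succeeds only if the
    subject's and verb's agreement features unify."""
    keep = ("num", "per")
    return unify({k: subject[k] for k in keep if k in subject},
                 {k: verb[k] for k in keep if k in verb})

print(agree(lexicon["she"], lexicon["runs"]))  # {'num': 'sg', 'per': 3}
print(agree(lexicon["she"], lexicon["run"]))   # None (number clash)

Unification either merges the two feature structures or fails, which is how a constraint rules out an analysis declaratively, without any procedural ordering.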


Constraint-based grammar formalisms used to have one major drawback: although they supported language analysis through computational and mathematical models of grammar, they failed to accommodate theories of language learnability. Lexicalized Well-Founded Grammars (LWFG) are an improvement over traditional constraint-based grammars in that they take the learnability factor into account and thus aid a deeper understanding of a language system. In LWFG, constraints are introduced at the semantic level, which allows a text to be interpreted during syntactic parsing.
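As a rough illustration of a semantic-level constraint applied while a syntactic structure is being built, the sketch below uses a selectional restriction (a verb that requires a liquid object). The lexicon, the requires_object field and the liquid feature are all hypothetical, not taken from the LWFG literature.

# A hedged sketch of a semantic constraint enforced during parsing:
# the verb's selectional restriction is checked while the VP is
# built, so the ill-formed reading is rejected mid-parse. The
# "requires_object" field and the "liquid" feature are invented.

lexicon = {
    "drinks": {"cat": "V", "requires_object": {"liquid": True}},
    "water":  {"cat": "N", "liquid": True},
    "bread":  {"cat": "N", "liquid": False},
}

def build_vp(verb, obj):
    """Combine V + NP into a VP only if the object satisfies the
    verb's semantic constraint; return None otherwise."""
    for feature, value in verb["requires_object"].items():
        if obj.get(feature) != value:
            return None  # interpretation fails during the parse
    return {"cat": "VP", "verb": verb, "object": obj}

print(build_vp(lexicon["drinks"], lexicon["water"]))  # VP is built
print(build_vp(lexicon["drinks"], lexicon["bread"]))  # None: rejected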


An important feature of constraint-based grammars is that they combine information from different levels of linguistic analysis. Complex language structures are described and built up step by step from smaller structures. Such a description is uniform, and the resulting structure is consistent and independent of the order of the steps. In other words, these grammar formalisms rest on a declarative base in which what matters is the type of constraints that are applied, not the order in which they are applied. Because the grammar is a set of declarative constraints rather than a fixed procedure, the same grammar can be used in both directions: to parse a sentence into a representation, or to generate a sentence from a representation. It is in this sense that constraint-based grammars are reversible.
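The order-independence claim can be shown directly: unifying the same set of feature structures in two different orders produces the same result. The structures below are again invented for illustration.

# A small demonstration of order-independence: applying the same
# three feature constraints in different orders yields the same
# combined structure.

def unify(fs1, fs2):
    if fs1 is None or fs2 is None:
        return None
    result = dict(fs1)
    for feature, value in fs2.items():
        if result.setdefault(feature, value) != value:
            return None
    return result

a = {"cat": "NP", "num": "sg"}
b = {"num": "sg", "per": 3}
c = {"cat": "NP", "per": 3}

one_order     = unify(unify(a, b), c)
another_order = unify(unify(c, b), a)
print(one_order == another_order)  # True: the order did not matter
print(one_order)                   # {'cat': 'NP', 'num': 'sg', 'per': 3}

Because the result depends only on which constraints are applied, not on when, the same constraint set can drive analysis or generation equally well.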
