
transformational (generative) grammar definition

Saturday, 19 December 2009

Linguistics: a system of linguistic analysis consisting of a set of rules that generate basic syntactic structures, in the form of simple independent clauses, and a set of transformational rules that operate on those structures so as to produce questions, complex sentences, etc., and thus to account for every possible sentence of a language.
generative grammar


A linguistic theory that attempts to describe a native speaker's tacit grammatical knowledge by a system of rules that in an explicit and well-defined way specify all of the well-formed, or grammatical, sentences of a language while excluding all ungrammatical, or impossible, sentences.


Finite set of formal rules that will produce all the grammatical sentences of a language. The idea of a generative grammar was first definitively articulated by Noam Chomsky in Syntactic Structures (1957). The generative grammarian's task is ideally not just to define the interrelation of elements in a particular language, but also to characterize universal grammar — that is, the set of rules and principles intrinsic to all natural languages, which are thought to be an innate endowment of the human intellect. See also grammar, syntax.
The theory of language structures first proposed in Chomsky's Syntactic Structures (1957). Just as physics studies the forms of physically possible processes, so linguistics should study the form of possible human languages. This would define the limits of language by delimiting the kinds of processes that can occur in language from those that cannot. The result would be a universal grammar, from which individual languages would derive as, in effect, different ways of doing the same thing. Variations such as vocabulary
and principles governing word order would be revealed as different applications of the same underlying rules. Tacit knowledge of this universal grammar is pre-programmed, an innate biological endowment of normal human infants. The argument with which Chomsky supported the claim for such an endowment is known as the argument from the ‘poverty of stimulus’: it is argued that language-learning proceeds so fast in response to such a relatively slender body of ‘data’ that the infant must be credited with an innate propensity to follow the grammar of everybody else. The extent to which this argument treats the infant as a theorist or ‘little linguist’ has been much debated (see also language of thought hypothesis).

In Chomsky's original model, language consists of phrase structure rules and transformations. Phrase structure rules represent the grammatically basic constituent parts of the sentence (e.g. a sentence might be a noun phrase + a verb phrase). Transformation rules change relations (as in the active/passive transformation) and determine how complex sentences may be formed from simpler ones. This latter function was taken over by the phrase structure rules in the later work (Aspects of the Theory of Syntax, 1965) that introduced what became known as the standard theory. In this, phrase structure rules perform the task of defining the deep structure of a sentence, from which its surface structure is thought of as derived by means of possibly repeated transformations.

Deep structure bears some affinity to the idea of the logical structure of a sentence, thought of as the representation of the sentence that reveals its inferential properties. The notion did not survive in generative semantics, one of the successors to Chomsky's standard theory, in which transformation rules map semantic representations onto surface structures.
The introduction of semantics is often thought to be a necessary amendment to the purely syntactic and grammatical approach of Chomsky's early theory.

After the standard theory came the extended standard theory, and eventually government-binding theory, both of which maintain an abstract and mathematical approach to the discovery of linguistic principles of the highest generality. Philosophically most interest has centred on the claim that complex grammatical principles might be innate, and on the relationship between syntax and semantics that is presupposed in the idea of a generative grammar. In general, philosophical formalists have been more interested in the possibility of unravelling concealed semantic structure, rather than in the more purely grammatical problems of linguistics.

What is the definition of generative grammar?
In linguistics, generative grammar generally refers to a proof-theoretic framework for the study of syntax, partially inspired by formal grammar theory and pioneered by Noam Chomsky. A generative grammar is a set of rules that recursively "specify" or "generate" the well-formed expressions of a natural language. This encompasses a large set of different approaches to grammar. The term generative grammar is also broadly used to refer to the school of linguistics where this type of formal grammar plays a major part, including:

The Standard Theory (ST) (also widely known as Transformational grammar (TG))
The Extended Standard Theory (EST) (also widely known as Transformational grammar (TG))
Principles and Parameters Theory (P&P) which includes both Government and Binding Theory (GB) and the Minimalist Program (MP)
Relational Grammar (RG)
Lexical-functional Grammar (LFG)
Generalized Phrase Structure Grammar (GPSG)
Head-Driven Phrase Structure Grammar (HPSG)
Generative grammar should be distinguished from traditional grammar, which is often strongly prescriptive rather than purely descriptive, is not mathematically explicit, and has historically investigated a relatively narrow set of syntactic phenomena. In the "school of linguistics" sense it should be distinguished from other linguistically descriptive approaches to grammar, such as various functional theories.

The term generative grammar can also refer to a particular set of formal rules for a particular language; for example, one may speak of a generative grammar of English. A generative grammar in this sense is a formal device that can enumerate ("generate") all and only the grammatical sentences of a language. In an even narrower sense, a generative grammar is a formal device (or, equivalently, an algorithm) that can be used to decide whether any given sentence is grammatical or not.
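The "all and only" idea can be made concrete in a few lines of code. The sketch below uses a hypothetical toy grammar (invented for illustration, not a grammar of English) and enumerates every sentence its finite rule set generates:

```python
import itertools

# A toy generative grammar in the "formal device" sense: a finite rule set
# mapping each non-terminal to its possible right-hand sides.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["D", "N"]],
    "VP": [["V", "NP"]],
    "D":  [["the"]],
    "N":  [["dog"], ["bone"]],
    "V":  [["ate"], ["saw"]],
}

def generate(symbol):
    """Yield every terminal word-list derivable from `symbol`."""
    if symbol not in GRAMMAR:          # terminal word: yields itself
        yield [symbol]
        return
    for production in GRAMMAR[symbol]:
        # Expand each right-hand-side symbol, then combine the expansions.
        for parts in itertools.product(*(generate(s) for s in production)):
            yield [w for part in parts for w in part]

sentences = [" ".join(words) for words in generate("S")]
print(sentences)  # 8 sentences, including "the dog ate the bone"
```

Because this toy grammar has no recursive rule, it generates exactly eight sentences; the device enumerates all of them and nothing else, which is the narrow sense described above.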

In most cases, a generative grammar is capable of generating an infinite number of strings from a finite set of rules. These properties are desirable for a model of natural language, since human brains are of finite capacity, yet humans can generate and understand a very large number of distinct sentences. Some linguists go so far as to claim that the set of grammatical sentences of any natural language is indeed infinite.
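The infinite-from-finite point rests on recursion: a rule whose right-hand side can reinvoke the rule itself. A minimal sketch, with a rule and vocabulary invented purely for illustration:

```python
# One recursive rule is enough to make the generated language infinite:
#   S -> "the dog sleeps" | "the dog knows that" S   (hypothetical toy rule)
def sentence(depth):
    """Build 'the dog knows that ... the dog sleeps' with `depth` embeddings."""
    if depth == 0:
        return "the dog sleeps"
    return "the dog knows that " + sentence(depth - 1)

print(sentence(2))
# the dog knows that the dog knows that the dog sleeps
```

Since `depth` can be any natural number, one finite rule pair generates an unbounded set of distinct sentences.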

Generative grammars can be described and compared with the aid of the Chomsky hierarchy proposed by Noam Chomsky in the 1950s. This sets out a series of types of formal grammars with increasing expressive power. Among the simplest types are the regular grammars (type 3); Chomsky claims that regular languages are not adequate as models for human language, because all human languages allow the embedding of strings within strings in a hierarchical way.
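The embedding claim is usually illustrated with the textbook pattern aⁿbⁿ: the single context-free rule S → a S b (with S → ε to stop) generates it, while no regular (type-3) grammar generates it for unbounded n, because matching each "a" with its "b" requires keeping track of nesting. A sketch of the derivation:

```python
def derive(n):
    """Apply the context-free rule S -> 'a' S 'b' n times, then S -> '' (epsilon)."""
    if n == 0:
        return ""                        # S -> epsilon
    return "a" + derive(n - 1) + "b"     # S -> a S b

print(derive(3))  # aaabbb
```

Center-embedded clauses in natural language ("the dog [the cat chased] ran") show the same nested dependency structure, which is why Chomsky rejected regular grammars as models of human language.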

At a higher level of complexity are the context-free grammars (type 2). The derivation of a sentence by a context-free grammar can be depicted as a derivation tree. Linguists working in generative grammar often view such derivation trees as a primary object of study. According to this view, a sentence is not merely a string of words, but rather a tree with subordinate and superordinate branches connected at nodes.

Essentially, the tree model works something like this example, in which S is a sentence, D is a determiner, N a noun, V a verb, NP a noun phrase and VP a verb phrase:



The resulting sentence could be The dog ate the bone. Such a tree diagram is also called a phrase marker. Phrase markers can be represented more compactly in text form, though the result is less easy to read.
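One common text form is labelled bracketing. The sketch below encodes the phrase marker for "The dog ate the bone" as nested tuples (a hypothetical encoding chosen only for illustration) and renders it:

```python
# Each node is (label, child, child, ...); leaves are plain strings.
tree = ("S",
        ("NP", ("D", "The"), ("N", "dog")),
        ("VP", ("V", "ate"),
               ("NP", ("D", "the"), ("N", "bone"))))

def bracket(node):
    """Render a tree in labelled-bracket text form, e.g. [N dog]."""
    if isinstance(node, str):
        return node
    label, *children = node
    return "[" + label + " " + " ".join(bracket(c) for c in children) + "]"

print(bracket(tree))
# [S [NP [D The] [N dog]] [VP [V ate] [NP [D the] [N bone]]]]
```

Reading the brackets back recovers exactly the subordinate and superordinate branches of the diagram.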

However, Chomsky argued that phrase structure grammars, too, are inadequate for describing natural languages. To address this, he formulated the more complex system of transformational grammar.
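A transformation maps one structure to another; subject–auxiliary inversion, which turns a declarative into a yes/no question, is the standard classroom example. The toy function below operates on a flat word list rather than a phrase marker, so it is only a sketch of the idea, not a real transformational rule:

```python
def invert(words):
    """Toy 'transformation': front the auxiliary of a declarative clause
    to form a yes/no question (e.g. 'the dog can run' -> 'Can the dog run')."""
    auxiliaries = {"is", "can", "will"}   # a tiny illustrative set
    for i, w in enumerate(words):
        if w in auxiliaries:
            # Move the auxiliary to the front; keep everything else in order.
            return [w.capitalize()] + words[:i] + words[i + 1:]
    return words  # no auxiliary found: leave the clause unchanged

print(" ".join(invert(["the", "dog", "can", "run"])))
# Can the dog run
```

In Chomsky's system the rule instead operates on the tree, which is what lets it apply correctly to clauses of arbitrary internal complexity.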

When generative grammar was first proposed, it was widely hailed as a way of formalizing the implicit set of rules a person "knows" when they know their native language and produce grammatical utterances in it. However, Chomsky has repeatedly rejected that interpretation: according to him, the grammar of a language is a statement of what a person has to know in order to recognise an utterance as grammatical, not a hypothesis about the processes involved in either understanding or producing language.

In any case, most native speakers would reject many sentences produced even by a phrase structure grammar. For example, although very deep embeddings are allowed by the grammar, sentences with deep embeddings are not accepted by listeners; the limit of acceptability is an empirical matter that varies between individuals, not something easily captured in a formal grammar. Consequently, the influence of generative grammar in empirical psycholinguistics has declined considerably.
Source(s):
Wikipedia, the free encyclopedia

