
Generative grammar

From Wikipedia, the free encyclopedia


In theoretical linguistics, generative grammar refers to a particular approach to the study of syntax. A generative grammar of a language attempts to give a set of rules that will correctly predict which combinations of words will form grammatical sentences. In most approaches to generative grammar, the rules will also predict the morphology of a sentence.[citation needed]

Generative grammar originates in the work of Noam Chomsky, beginning in the late 1950s. Early versions of Chomsky's theory were called transformational grammar, and this term is still used as a collective term that includes his subsequent theories. There are a number of competing versions of generative grammar currently practiced within linguistics. Chomsky's current theory is known as the Minimalist program. Other prominent theories include or have included head-driven phrase structure grammar, lexical functional grammar, categorial grammar, relational grammar, link grammar, and tree-adjoining grammar.[citation needed]

Chomsky has argued that many of the properties of a generative grammar arise from an "innate" universal grammar. Proponents of generative grammar have argued that most grammar is not the result of communicative function and is not simply learned from the environment (see the poverty of the stimulus argument). In this respect, generative grammar takes a point of view different from cognitive grammar and from functional and behaviorist theories.[citation needed]

Most versions of generative grammar characterize sentences as either grammatically correct (also known as well formed) or not. The rules of a generative grammar typically function as an algorithm to predict grammaticality as a discrete (yes-or-no) result. In this respect, it differs from stochastic grammar, which considers grammaticality as a probabilistic variable. However, some work in generative grammar (e.g. recent work by Joan Bresnan) uses stochastic versions of optimality theory.[citation needed]
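The contrast between a discrete and a probabilistic view of grammaticality can be sketched in a toy way (the mini-language and weights below are purely illustrative, not drawn from any of the theories named above):

```python
# Purely illustrative: a categorical grammar returns a yes-or-no verdict,
# while a stochastic grammar assigns each string a probability.

GRAMMATICAL = {"the dog barks", "the cat sleeps"}   # hypothetical mini-language

def well_formed(sentence):
    """Discrete (yes-or-no) judgment, as in most generative grammars."""
    return sentence in GRAMMATICAL

# A stochastic grammar instead weights strings; an ill-formed order gets
# a low, but not necessarily zero, probability.
WEIGHTS = {"the dog barks": 0.60, "the cat sleeps": 0.39, "dog the barks": 0.01}

def probability(sentence):
    """Graded judgment, as in a stochastic grammar."""
    return WEIGHTS.get(sentence, 0.0)

print(well_formed("dog the barks"))   # False: categorically excluded
print(probability("dog the barks"))   # 0.01: merely very unlikely
```

The two functions answer the same question in different currencies: membership in a set versus position on a probability scale.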

Generative grammar theory of linguistics

Posted in Brain and language learning, General language learning, Learn Polish, Linguistics

Generative grammar is the idea that although our brains are limited, and our experience with a language is always limited (we have not heard all possible combinations of our native or second language), we have an innate ability to generate and understand an infinite number of sentences. This means that although I have never heard a combination of words in a particular sequence, I will be able to understand it. Further, although I have never heard a given sentence or grammar structure, I will be able to create one. There is a finite set of grammar rules in a language. Noam Chomsky is the originator of the theory of generative grammar. Many linguists love Chomsky and talk about his theories all the time. I think he has made a huge contribution to linguistics.

Although I have not given this much thought, my first reaction is, with all due respect: it's a nice theory, but it has limited practical application and is not fully true. I can generally create sentences in Polish that are grammatically correct. However, if I have not heard a sequence, even in a limited way, it is very hard to be perfectly correct. Further, when I give Polish native speakers crazy fun tests on their own grammar, they do not know it. I will take foreign or fictional or elf-language words or strange ideas and ask them to make a sentence with all the proper word endings, and they have to think about and debate the declensions and grammar, as they have never heard such words declined. Even if there is a rule, they do not have an innate feel for it; they have to start thinking. Many times they will get it wrong according to the rule and say, oh well, we do not usually change foreign words or phrases. This is true, but the point is that it is hard for them to generate forms they have not heard or encountered. Maybe I am missing the point of generative grammar theory, but in reality it never sat well with me. I think you need massive exposure to a language to speak it.

Etymology: Adopting the term generative from mathematics, linguist Noam Chomsky introduced the concept of generative grammar in the 1950s.


"A significant break in linguistic tradition came in 1957, the year American Noam Chomsky's Syntactic Structures appeared and presented the concept of a 'transformational generative grammar.' A generative grammar is essentially one that 'projects' one or more given sets of sentences that make up the language one is describing, a process characterizing human language's creativity. Modified in its theoretical principles and methods over succeeding years by many linguists, principally in the USA, a transformational generative grammar attempts to describe a native speaker's linguistic competence by framing linguistic descriptions as rules for 'generating' an infinite number of grammatical sentences.

"A generative grammar, as understood by Chomsky, must also be explicit; that is, it must precisely specify the rules of the grammar and their operating conditions." (Steven Roger Fischer, A History of Language. Reaktion Books, 1999)

"Simply put, a generative grammar is a theory of competence: a model of the psychological system of unconscious knowledge that underlies a speaker's ability to produce and interpret utterances in a language. . . . A good way of trying to understand [Noam] Chomsky's point is to think of a generative grammar as essentially a definition of competence: a set of criteria that linguistic structures must meet to be judged acceptable." (Frank Parker and Kathryn Riley, Linguistics for Non-Linguists. Allyn and Bacon, 1994)

What is linguistics?
Linguistics is the study of language -- but there are many approaches to the study of language.

One might study language as a cultural phenomenon that binds people together or divides them, as a tool for social interaction, or as an artistic medium. One might take a historical perspective and study the familial relations among languages or how languages have changed over time. There is also an approach which sees language as interesting because it is a structured and accessible product of the human mind. As such, language offers a means to study the nature of the mind that produces it. This is the approach taken by the linguists in the Program in Linguistics at Princeton. More specifically, our work is carried out within the framework of generative grammar. So...

What is generative grammar?

Linguists who work within the framework of generative grammar strive to develop a general theory that reveals the rules and laws that govern the structure of particular languages, and the general laws and principles governing all natural languages. The basic areas of study include phonology (the study of the sound patterns of language), morphology (the study of the structure and meaning of words), syntax (the study of the structure of sentences), and semantics (the study of linguistic meaning).

A signature feature of generative grammar is the view that humans have an innate "language faculty" and that the universal principles of human language reflect intrinsic properties of this language faculty. In learning their native languages, children acquire specific rules that determine the sound and meaning of utterances in the language. These rules interact with each other in complex ways, and the entire system is learned in a relatively short time and with little or no apparent conscious effort.

The most plausible explanation for the success of human language learners is that they have access to a highly restrictive set of principles which does not require (or permit) them to consider many alternatives in order to account for a particular construction, but instead limits them to a few possible rules from which a choice can be made -- if necessary, without much further evidence. Since there is no evidence that the principles that define the class of possible rules and systems of rules are learned, it is thought that these principles serve as the preconditions for language learning, forming part of the innate capacity of every normal child. Viewed in this light, the principles we are attempting to discover are part of the genetic endowment of all humans. It follows that an understanding of these principles is necessary to an understanding of the mental makeup of the human species.
Only after extensive parts of the grammars of different languages have been formulated is it possible to ask questions concerning the ways in which various languages differ or the ways in which all languages are the same. Consequently, a large part of our effort is devoted to the study of linguistic detail (for example, the interpretation of English verb phrase ellipsis, the morpho-semantics of the Greek perfect, the syntax of multiple questions, or prosodic phrasing in Korean). The ultimate goal is not merely to understand these details, but to use them as a bridge to understanding the human language faculty in general.

By Kamil Wiśniewski, Aug 20th, 2007

Generative grammar is a notion that was developed in the 1950s by Noam Chomsky. Although numerous scholars disagreed with Chomsky's claims, he gained many supporters, and the idea was both developed and challenged at the same time. His works have exerted considerable influence on psycholinguistics, cognitive linguistics, and applied linguistics, as well as language teaching methodology, and with time generative grammar received a broader meaning than it initially had. Based partially on mathematical notions, a generative grammar is a set of rules that provide a framework for all the grammatically possible sentences in a language, excluding those which would be considered ungrammatical. A classical generative grammar consists of four elements:

- a limited number of nonterminal signs;
- a beginning sign, which is contained among the nonterminal signs;
- a limited number of terminal signs;
- a finite set of rules which enable rewriting nonterminal signs as strings of terminal signs.

The rules can be applied freely; the only requirement is that the final result must be a grammatically correct sentence. What is more, generative grammar is recursive, which means that any output of the application of rules can be the input for a subsequent application of the same rule. That makes it possible to generate sentences such as "the daughter of the father of the brother of his cousin". Chomsky considered language to be a species-specific property which is a part of the human mind. Chomsky studied the internal language (I-language), a mental faculty for language. He also wanted to account for the linguistic competence of native speakers and the knowledge of language present in language users' minds. As he argued:
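The four classical components and the recursive step can be sketched as a toy rewriting system. The mini-grammar below is hypothetical (chosen only to produce recursive "X of Y" phrases like the example above), not taken from any of the sources cited here:

```python
import random

# Toy illustration of the four components of a classical generative grammar.
NONTERMINALS = {"NP", "PP", "N"}
START = "NP"                                  # the "beginning sign"
TERMINALS = {"the", "his", "of", "daughter", "father", "brother", "cousin"}
RULES = {
    "NP": [["the", "N"], ["his", "N"], ["the", "N", "PP"]],
    "PP": [["of", "NP"]],                     # PP embeds an NP: the recursive step
    "N":  [["daughter"], ["father"], ["brother"], ["cousin"]],
}

def generate(symbol=START):
    """Rewrite nonterminal signs recursively until only terminal signs remain."""
    if symbol in TERMINALS:
        return [symbol]
    expansion = random.choice(RULES[symbol])
    words = []
    for s in expansion:
        words.extend(generate(s))
    return words

print(" ".join(generate()))  # e.g. "the brother of his cousin"
```

Because "NP" can expand to contain "PP", which in turn contains another "NP", a finite rule set yields phrases of arbitrary depth, which is the recursive property the text describes.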

- People know which sentences are grammatically well formed in their native language.
- They have this knowledge also of previously unheard sentences.
- So they must rely on mentally represented rules and not only on memory.
- Generative grammars might be regarded as models of mentally represented rules.
- The ability to acquire such sets of rules is most probably uniquely human.

Moreover, Chomsky argued that people possess a kind of language faculty which is part of humans' natural biological endowment. The innate linguistic knowledge that enables practically any child to learn any of the roughly 6,000 existing languages (at a given point in time) is sometimes known as Universal Grammar. This theory is often supported by the argument that creole languages are created in a natural way, with their users inventing their own linguistic systems. What is more, it appears that creole languages share certain features even despite distances that do not allow for contact between two different creoles.

References:

Brown, K. (ed.) 2005. Encyclopedia of Language and Linguistics, 2nd edition. Oxford: Elsevier.
Wilson, R. A. (ed.) 1999. The MIT Encyclopedia of the Cognitive Sciences. London: The MIT Press.

By Kamil Wiśniewski, Aug 29th, 2007

Applied linguistics is an umbrella term that covers a wide set of areas of study connected by a focus on language as it is actually used. The emphasis in applied linguistics is on language users and the ways in which they use languages, in contrast to theoretical linguistics, which studies language in the abstract, without referring it to any particular context or language (like Chomskyan generative grammar, for example). Interestingly, even among applied linguists there is a difference of opinion as to the scope, domains, and limits of applied linguistics. There are many issues investigated by applied linguists, such as discourse analysis, sign language, stylistics, and rhetoric, as well as language learning by children and adults, both as a mother tongue and as a second or foreign language. The correlation of language and gender, as well as the transfer of information in media and interpersonal communication, are also analyzed by applied linguists. In addition, forensic linguistics, interpretation and translation, foreign language teaching methodology, and language change are all developed within applied linguistics. Shortly after the introduction of the term, applied linguistics was associated mainly with first, second, and foreign language teaching; nowadays, however, it is seen as a more interdisciplinary branch of science. Although in certain parts of the world language teaching remains the major concern of applied linguists, issues such as speech pathologies, determining the levels of literacy of societies, and language processing, along with differences in communication between various cultural groups, all gain interest elsewhere. In the European Union the focus of applied linguistics is on issues connected with the language policy of this multilingual community. The primary aim is to keep a balance between fulfilling the need for a lingua franca and maintaining smaller languages so that they do not become devalued.
This is a pressing matter, as with the migration of people within the European Union and from outside its borders the mixture of languages is getting more and more complex. Therefore, the focus is also put on analyzing language attitudes, adopting a common language policy, and creating teaching textbooks and other materials. As can be seen, there are many trends in applied linguistics, some interconnected, others not having much in common. There are, however, some very general tendencies among applied linguists to put more effort into certain investigations, such as languages of wider communication, corpus analysis, or critical applied linguistics. When it comes to languages of wider communication, it is clear that with the increasing number of international journeys and technological advances the need for an international language rises. As English is the contemporary lingua franca, applied linguists attempt to include language policy and planning in their interests, but the field is also concerned with analyzing language and identity, and with special educational needs. Corpus analysis takes both a quantitative and a qualitative approach to the study of language, and applied linguists focus on the identification of patterns of language use depending on social context, audience, genre, and setting. Critical applied linguistics is interested in social problems connected with language, such as unemployment, illiteracy, and pedagogy.

Brown, K. (ed.) 2005. Encyclopedia of Language and Linguistics, 2nd edition. Oxford: Elsevier.

By Kamil Wiśniewski, Aug 19th, 2007

Corpus linguistics is not another branch of science, but rather a term that denotes a set of methodologies and approaches to the analysis of language. A corpus is a collection of either spoken or written texts in a given language (less often in two languages), nowadays usually consisting of more than a million words. Different types of corpora enable analyzing various kinds of discourse in order to find quantitative evidence of the existence of patterns in language, or to verify theories. At first corpus studies focused on single words, their frequency and occurrence, yet with the development of technology and more precise search engines the possibilities increased dramatically. Now it is possible to search for a word restricted to a particular word class, or for entire patterns such as preposition + noun, determiner + noun, or a word followed by a specific word class. Such investigations make it easy, for example, for dictionary publishers to find collocations. Corpus linguistics is also applied to translation studies, where, with the use of corpora of two languages, it became apparent that the meanings of words and their supposed equivalents might differ in use or in collocates. Moreover, some grammar aspects strongly connected to lexis enable linguists to show differences in the use of certain grammar structures in translations, even if similar grammar structures are available in the source and target languages. In the case of English, differences between its British and American varieties can also be easily analyzed thanks to corpora. The historical change of words' meanings and of grammar is analyzed as a result of corpora development, and although the number of old texts available in electronic form is much smaller than the amount of contemporary texts, the work is doable.
Thus, differences in the use of the passive voice were traced, and it turns out that from the 19th century onward the passive voice in English started to be used more and more often. When written and spoken corpora became available, linguists started analyzing them in order to check whether there are any patterns of difference between speech and writing. It appears that apart from some quite obvious features, such as false starts and hesitations, which occur in speech but not in writing, large numbers of deictic expressions are also more frequent in oral discourse. It is probably because of extra-linguistic signals that the spoken language is more vague. Additionally, certain grammatical features apparent in speech might be considered ungrammatical in writing. Unlike other scholars, linguists following the corpus linguistics methodology attempt to describe naturally occurring language, supporting their views with large amounts of evidence found in corpora. Moreover, statistical operations are often involved in work on corpora, especially when frequencies of use of some linguistic features are measured. Large databases of naturally occurring language have helped to make progress in the study of phraseology, especially once it was discovered that certain meanings of words correlate with the grammatical structures in which they are used. Corpus linguistics has found application in many fields, such as critical discourse analysis, stylistics, and forensic linguistics, as well as translation and language teaching. In translation it is helpful, since using parallel corpora enables a better choice of equivalents and grammar structures that reflect the desired meaning. Additionally, studying corpora revealed that translators do not translate individual words in texts, but larger units: clauses or sentences. Corpus studies have probably had an even bigger influence on language teaching.
First of all, they influenced the ways dictionaries are made; secondly, learners' language has been studied to improve teachers' knowledge of it; and learners are nowadays encouraged to make use of corpora on their own, in order to increase their language awareness. Moreover, the results of studying information gathered from corpora have influenced the design and content of language workbooks.

Brown, K. (ed.) 2005. Encyclopedia of Language and Linguistics, 2nd edition. Oxford: Elsevier.
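The pattern-counting work described above can be roughly illustrated with a toy corpus (a few made-up sentences; real corpora run to many millions of words). Even simple adjacent-word counts surface collocation candidates of the kind dictionary publishers look for:

```python
from collections import Counter

# Hypothetical mini-corpus, pre-tokenized with spaces around punctuation.
text = (
    "she made strong tea . he drinks strong tea daily . "
    "a powerful engine and a powerful car ."
)
words = text.split()

# Count adjacent word pairs (bigrams) to surface collocation candidates.
bigrams = Counter(zip(words, words[1:]))

print(bigrams[("strong", "tea")])    # 2
print(bigrams[("powerful", "tea")])  # 0: "powerful tea" never occurs here
```

The asymmetry between "strong tea" and "powerful tea" is a classic collocation example; at corpus scale, the same counting, refined with part-of-speech tags and statistical association measures, drives the dictionary and translation applications described above.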

By Kamil Wiśniewski, Aug 17th, 2007

Cognitive linguistics is still a very young discipline, which had its beginnings in the 1970s and whose quick development and extension of investigated issues date to the mid-1980s. Since then the scope of interest of this branch of science has grown to include various areas such as syntax, discourse, phonology, and semantics, all of which are looked upon as representations of conceptual organization in language. Probably the most developed idea that emerged from cognitive linguists' efforts is that of cognitive grammar. The aim of cognitive grammar is to formulate a theory of meaning and grammar which would be cognitively plausible and would fulfill the requirement that the only structures allowed in the grammar are:

- symbolic, semantic, or phonological structures actually occurring in linguistic expressions (the Saussurean sign);
- schemas for such structures (acquired by exposure to multiple examples of the pattern);
- categorizing relationships among the above-mentioned elements.

Apart from that, cognitive linguistics is interested in issues such as the processes by which, and the patterns in which, conceptual content is organized in language. Therefore, the structuring of concepts such as scenes and events, space and time, force and causation, together with motion and location, attracts cognitive linguists' interest. Moreover, the ideational and affective categories ascribed to cognitive agents, such as expectation and affect, volition and intention, as well as attention and perspective, are examined. By and large, cognitive linguists' intention is to ascertain the integrated organization of conceptual structuring in language by approaching such issues as the semantic structure of lexical and morphological forms, together with syntactic patterns. The interrelationships of conceptual structures, as in the gathering of conceptual categories into large structuring systems, are also investigated.

Brown, K. (ed.) 2005. Encyclopedia of Language and Linguistics, 2nd edition. Oxford: Elsevier.
Wilson, R. A. (ed.) 1999. The MIT Encyclopedia of the Cognitive Sciences. London: The MIT Press.