This chapter discusses the principles that determine legitimate minimal domains for antecedents of reflexive forms. It offers novel critiques of the idea that these principles reduce to some elementary statement involving c‑command or analogs thereof, and proposes a relational account for certain documented constraints.
In this paper, the morphology, syntax, semantics, and diachrony of expressions like twenty-odd are described, based on the results of a corpus study which considers data from the British National Corpus, the Oxford English Dictionary, and Google. The -odd suffix appears most frequently with twenty, and in collocations with temporal nominals such as years, days, etc. Distributionally it appears to be a derivational suffix on numerals, occurring inside additional suffixation such as ordinal -th. It originated from the use of odd to denote a surplus or remainder, a usage that has existed for several hundred years. It is distinct from other English approximatives, and from approximatives in other languages, in that -odd expresses an indeterminate range above the cardinality of the modified numeral, but not below it, while other approximative expressions (like about) allow the actual number to be either above or below the reference number.
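The directional asymmetry described above can be sketched as a pair of denotations (a toy illustration only: the exact interval bounds, in particular the upper cap on the -odd reading, are assumptions for the sketch, not findings of the corpus study):

```python
def n_odd(base: int, n: int) -> bool:
    """'twenty-odd'-style reading: true of n strictly above the base numeral.
    The cap below the next round number is an illustrative assumption."""
    return base < n < base + 10

def about(base: int, n: int, tolerance: int = 3) -> bool:
    """'about twenty'-style reading: n may fall above OR below the base
    (tolerance is likewise an illustrative assumption)."""
    return abs(n - base) <= tolerance
```

The contrast is that n_odd(20, 18) is false while about(20, 18) can be true: only the about-type approximative ranges below the reference number.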
This chapter discusses elements of communicative content that are not expressed by overt elements of a sentence. In the 1970s and 1980s, mostly inspired by the work of Grice, forms of ‘unexpressed elements of content’ not contemplated by linguistic theory of the time began to surface under a variety of labels, collectively called ‘impliciture’ here. It is argued in this chapter that recent experimental work suggests that certain forms of impliciture are tied to language via “standardization” which provides a pragmatic scenario that does not require access to potentially unbounded domains of general background information.
In this chapter, arguments against several variants of the modern syntax-based analyses of deverbal nominalizations are presented, and the classic lexicalist approach deriving from Chomsky’s 1970 Remarks on nominalization is defended. The modern approaches of Alexiadou (2001), Fu, Roeper and Borer (2001), Harley and Noyer (1998), which revive in various forms the sentential Generative Semantics analyses of event nominals, are each considered and rejected in turn. In such approaches, argument-structure nominals contain some amount of verbal structure as a proper subpart. Yet, all such nominals exhibit surface syntactic patterns that resemble exactly those of nonderived nominals. The absence of verb-phrase syntax within nominalizations is a fundamental generalization about such nominals, and is very problematic for analyses which propose such substructure.
Our goal here is to explore an unusual approach to the long-standing problem of coordination in natural language — the problem of accommodating subordinate and coordinate structures within a consistent and empirically sound syntax. In what follows we’ll offer a brief overview of the problem and identify a central assumption about the syntax of coordinates (the Homogeneity Thesis) that seems to be very widely shared by investigators working on coordination regardless of their theoretical orientation. We will then review some recent experimental results that seem to clash with certain implications of the Homogeneity Thesis. Though the evidence reviewed here is far from definitive, we argue that serious consideration of alternatives to the Homogeneity Thesis is in order.
It is commonly assumed that the occurrence and distribution of processing errors offer a “window” into the architecture of cognitive processors. In recent years, psycholinguists have drawn inferences about syntactic encoding processes in language production by examining the distribution and rate of subject–verb agreement (SVA) errors in different contexts. To date, dozens of studies have used a sentence repetition-completion paradigm to elicit SVA errors. In this task, participants hear a sentence fragment (or “preamble”), repeat it, and provide a well-formed completion. These experiments have shown that when a singular head is modified by a phrase containing a plural NP (e.g. The bill for the accountants...), a significant number of SVA errors may occur. Several experiments have shown that, in English, the phonological form of words within a subject NP plays virtually no role in the rate of error occurrence. Yet recent data from our lab suggest that overt morphophonological case information does matter: speakers are more likely to produce the error The bill for the accountants were outrageous than The bill for them were outrageous. In this paper, we will present the results of this case-marking study and discuss the implications for models of language production.
In this chapter, we provide a brief overview of the history of studies of coordination. We then report the results of a previously unreported experiment concerning the acquisition of coordination in English, “The Mud-Puddle” study, and set it in the context of this history and our developing quest to understand the nature of linguistic coordination and the fundamental competence that underlies its acquisition. This experiment was conducted by a then honors student at Cornell (Krawiec 1980); its data have been preserved and have now been reanalyzed. Its results, although preliminary, bear on the nature of the syntax–semantics interface that coordinate structures involve, integrate with certain current theoretical advances, and suggest future research possibilities.
We have long loved Langendoen (1970) — a paper on the theoretical justification of “transformations, their effects on the structure of sentences, and the conditions under which they are optional or obligatory” (p. 102). In that paper, Langendoen argued that acceptability and grammaticality are “partially independent [and] partially dependent notions” (p. 103). We are struck by the implications of this contrast for language learning. If the learner’s grammar is a set of probabilistic patterns and not (also or instead) a set of grammatical rules, one might expect high frequency elements to be ‘grammatical’ and low frequency elements to be ‘ungrammatical.’ In other words, grammaticality and acceptability should be similar if frequency is the determining factor. But Langendoen (1970) hypothesized that grammatical competence contributes to grammaticality while processing factors contribute to acceptability. Our research shows clearer effects of frequency on the latter than on the former and thus relates to Langendoen’s observation.
This chapter explores the role of frequency in children’s syntactic and morphophonological development. One study compares relative clauses involving different extraction sites, which constructions vary considerably in their frequency of occurrence. Children’s production of these relatives suggests that frequency affects sentence planning, but their judgments of the same relatives are out of synchrony with the frequency rates. The other study presented here concerns the a and an forms of the indefinite article, which distinction is acquired relatively late even though the forms occur frequently. These studies show that frequency cannot be the whole story. We conclude that children’s mastery of a system of rules proceeds — at least to some extent — independently of frequency patterns in the input.
This paper uses the syntactic category of determiner to address the issue of innateness in language acquisition. Reviewing data from infants and toddlers, I propose that categories are innate and that children show continuity in category acquisition. As development proceeds, children learn the individual words in each category in the target language and the specific syntactic properties of those words, but they do not construct the categories themselves.
Finite state approaches to phonology usually make use of transducers to model the mapping of input to output forms. This applies to both rule-based approaches and more recent approaches inspired by Optimality Theory. Here, we develop an alternative approach based on automata where phonological generalizations and lexical regularities are encoded as regular expressions and these expressions are combined by intersection and concatenation. The system that results captures the full range of phonological systems, but does so with simpler automata, rather than transducers. In addition, the resulting system bears interesting similarities to Optimality Theory. We also compare the approach to other finite state approaches.
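The intersection idea can be sketched with ordinary regular expressions filtering an enumerated candidate set. This is a toy illustration, not the system the chapter develops: the alphabet, the three constraints, and the brute-force enumeration are all assumptions, and a real implementation would intersect the constraint automata directly rather than test strings one by one.

```python
import itertools
import re

# Toy alphabet: six stops and three vowels (an illustrative assumption).
ALPHABET = "ptkbdgaiu"

# Each constraint is a whole-string regular expression, i.e. a regular
# language; a surface form is well-formed iff it lies in the intersection
# of all of these languages. The constraints themselves are hypothetical.
CONSTRAINTS = [
    r"(?:[ptkbdg][aiu])+",        # lexical shape: strictly CV syllables
    r"[ptk].*",                   # phonotactic: word-initial voiceless stop
    r"(?!.*[bdg][aiu][bdg]).*",   # no voiced stops in adjacent onsets
]

def in_intersection(form: str) -> bool:
    """A form is accepted iff every constraint automaton accepts it."""
    return all(re.fullmatch(c, form) for c in CONSTRAINTS)

def generate(max_len: int):
    """Enumerate the finite slice (up to max_len) of the intersection language."""
    for n in range(1, max_len + 1):
        for chars in itertools.product(ALPHABET, repeat=n):
            form = "".join(chars)
            if in_intersection(form):
                yield form

# Concatenation of languages is simply concatenation of the expressions,
# e.g. a stem pattern followed by a (hypothetical) suffix pattern:
stem, suffix = r"(?:[ptkbdg][aiu])+", r"ta"
assert re.fullmatch(stem + suffix, "pata")
```

The design point the sketch tries to make concrete is that acceptor-style constraints compose by plain intersection and concatenation of regular languages, with no input-to-output transduction step of the kind a transducer-based system requires.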
This contribution expounds on ideas put forth by a group of New York City generative grammarians that language possesses certain essential features that are uncaused and adhere to an abstract ideal form. An analogy of the situation with language is made with certain natural properties of numbers. It is also noted that this situation contrasts with that of the functional structure of human color vision. This idea is viewed alongside recent work in biolinguistics and is compared to the neoplatonist view of language, namely that language is discovered by the child learner and not triggered. The main consequences of this idea are discussed both within the historical context and with respect to current theories on language acquisition.
The Internet has given us a new playing field for global collaboration. It could transform the practice of linguistics through universal access to huge quantities of digital language documentation and description. But this transformation can happen only if certain aspects of community practice are formalized by defining and adhering to shared standards. After expanding on the vision for what linguistics could be like in the twenty-first century, this essay attempts to clarify the role of standards by considering two case studies of life with and without standards — using solar time versus standard time, and using language names versus language identifiers. The essay then develops two metaphors that seek to put standards in a positive light: “linguistics as community” and “development as freedom.” The ultimate conclusion is that only by submitting to the constraints of shared standards will the community be free to develop the riches of knowledge it is seeking.
In this chapter Sherwin Cody’s well-known correspondence course is analyzed within a historical context and according to the norms of the early part of the 20th century. Details of the course are summarized, and several notable examples are given concerning the prescriptive rules for pronunciation, practical grammar, and grammatical correctness. The course itself and aspects of the successful marketing campaign are discussed. Cody’s prescriptivist and descriptivist approaches are evaluated according to early 20th-century society; it is argued that Cody’s course was influenced by several, sometimes opposing, factors. The views of language experts of the day, including educators and linguists, are taken into account.