A Note

Entries from January-July 2016 are for ENGL 3580: The English Language, taught by Dr. Larry Swain. It was my great pleasure to be Dr. Swain’s TA for this class, and it was also my great pleasure to deliver several lectures throughout the course about language myths; notes from these lectures are available on this blog. Here is a citation for the text, if anyone is interested:
Bauer, Laurie, and Peter Trudgill, eds. Language Myths. London: Penguin Books, 1998. Print.


Lecture Notes for Myth 21: “America is Ruining the English Language”– John Algeo

In this myth, we’re dealing with a standard, and language standards are difficult to define by what they are; it’s easier to define a language standard by what it is not. And for many, we in America are not respecting or living up to a standard, because we use some words differently than British English does, pronounce words differently, coin words of questionable social value (like twerk), etc.

The standard that the Prince of Wales talked about twenty-one years ago is a pretty specifically ethnocentric standard: “English” English. As he is quoted as saying at the top of p. 177, “We must act now to insure that English…maintains its position as the world language.”

There have been numerous attempts to control English over hundreds of years. The prescriptivists and grammarians who rose to prominence in the seventeenth century voiced the same concerns, yet English, in all its varieties, has changed since then. Languages change, yes, and many see that change as something vile that needs to be stopped.

This may just be me, so take it with a grain or two of salt, but my sense is that when people say language shouldn’t change, or that people are ruining English, they’re making an incomplete statement. What I think a lot of people are getting at is that language shouldn’t change from the way they use it and understand it, and when language does change in a way they don’t like, it’s being ruined.

It’s not that we have to like every change the language undergoes. We have many, many lists of words/phrases that grate on our ears. And that’s fine. As academics, one avenue we should consider is not necessarily to just change our opinions, but to inspect why we do not like something, what our own standards are, and why our standards are what they are.

For instance, as I’ve mentioned, I do not care for the singular “they.” But, now, I have a better understanding of why it’s used and what specific reasons I have for not liking it. I can appreciate its role in the language, even if I don’t like it.

Consider, too, that if languages didn’t change, as is some people’s druthers, we’d all be speaking Old English. Except that we wouldn’t; we’d all be speaking Proto-Germanic. Except that we wouldn’t; we’d all be speaking Proto-Indo-European. Except that we wouldn’t; we’d all be speaking the mother tongue, and there would be no such thing as “another language.” Which would make understanding everyone and writing dictionaries pretty easy, but we’d also be losing out on quite a bit.

So what we think of as “good” and “bad” in terms of language is highly subjective—that goes for how a language sounds, individual words and phrases, syntax, writing systems, etc. I have a friend who once said (and I think he was kidding—at least, I hope he was kidding) that he thinks Japanese people are faking it, because no one could figure out that alphabet.

From here, I challenge you to think about some language nuances that grate on your nerves, and spend some time thinking about why. Dig for something deeper than “Well, I just don’t like it.” Try to figure out why, and see how it affects your perspective.

Lecture Notes for Myth 20: “Everyone Has an Accent Except Me”– John H. Esling

In short, we all have an accent (spoiler alert). Accents are distinguishing characteristics in the way one speaks, and no one is devoid of characteristics.

One’s accent is influenced by, and often created by, the people one spends the most time around, usually in a particular area of the world whose members have a distinct way of speaking. (“Distinct” is not akin to “bad”; it is akin to “characteristic.”)

Recall from Chapter 2 of Linguistics for Everyone, in the section about first-language acquisition, that as infants, and even in the womb, we can detect the intonation patterns of our native language (37-8). And when we’re babbling in the prelinguistic stage, that babbling is going to reflect the intonation patterns of our native language; so we’re developing our accents before we can even talk (38). A baby conceived, born, and raised in Bemidji is not going to sound like he/she is from Boston once he/she begins speaking.

Many of us think that we don’t have an accent, though, and this may be because we’re surrounded by people who speak with the same accent that we do. It’s relative. We don’t realize the differences until we experience something different; i.e. when we hear a person with a different accent speak.

-Many of us in Bemidji don’t distinguish between the mid-back vowel /ɔ/ and the low-back vowel /ɑ/, because there’s no separation between the two in our accents. But in other accents there is a difference, and these two vowels are distinct sounds.

-What furthers the thought that we don’t have an accent, especially for people like us from the Midwest, is that our accent is the received standard for most newscasters on TV. We’re exposed to our own accent on TV and in our non-TV lives, which can perpetuate that thought. But it’s important to remember that many of these people train and practice the Midwestern accent. For example, Stephen Colbert is from South Carolina, and he actively suppressed his Southern accent because of the stereotype that people with Southern accents are not intelligent. We know that accent has nothing to do with intellect, yet this myth persists.

If we leave the area where our accent developed, we might begin to change our accents. But then again, we might not. A friend of mine who grew up in northern Minnesota has lived in England for ten years and still sounds like he lives in Nashwauk. A different friend spent a summer in Scotland and had a light brogue when he came back to Minnesota.

Many of us are quick to recognize other accents as “others,” and there’s nothing wrong with that (provided that we aren’t discriminating or generalizing as a result), but it’s just as important that we recognize our own accents as just that. Our accents are relative to our own linguistic features as well as our own experiences, and this helps give English its rich array of diversity.

We often try to place an accent and figure out where it’s from; we try to determine the “otherness” of it, and again, we usually don’t notice a difference until we experience it. It’s always good to experience diversity, but when that experience turns into negative perceptions about a group of people based on one factor, problems can, and often do, arise. What we need to remember is that social and historical factors have a huge influence, to whatever degree, on how we speak; everyone has an accent, and there’s nothing wrong with that.

Lecture Notes for Myth 16: “You Shouldn’t Say ‘It is Me’ because ‘Me’ is Accusative”– Laurie Bauer

Here is a very, very brief crash course in case as it pertains to language: case is a system of markers on a word, usually word endings, that indicate the word’s role in a sentence.

-Let’s take Old English, for example. The nominative case marks the subject of the verb (the “naming” case). The accusative case marks the direct object (the thing that receives the action of the verb). The dative case marks the indirect object (to whom or for whom something is done). The genitive case indicates possession or ownership. There’s also an instrumental case (a thing with which something is done), but it is not terribly common. Other cases exist in other languages, too: Latin has a vocative case, an ablative case, and a locative case in addition to the cases OE has. (Comedian Eddie Izzard weighs in on this.)

Since word endings, and sometimes the specific article that precedes a word, dictate what words are doing in a sentence, word order in Old English doesn’t matter quite as much. (For instance, “Se hund bāt þone wer” and “Þone wer bāt se hund” both mean “the dog bit the man,” because “se” marks the nominative and “þone” the accusative.) In ModE, word order receives much more attention, because ModE contains more elements of an analytic language than Old English does. Specifically, ModE is sometimes called an SVO (subject-verb-object) language. Languages in which word order doesn’t matter as much are called synthetic languages. Most languages will contain some elements of both, but can be labeled as synthetic or analytic fairly easily.

Since word order receives much more attention in ModE than some other languages, we don’t put much stress on word endings (even though they exist in ModE), which makes the concept of case rather foreign to many. But because we don’t put much emphasis on case, saying that one ought not say “It’s me” because “me” is accusative is a little trivial in ModE. And I mean “trivial” in a similar way as the “rule” that one shouldn’t split an infinitive—we’re capable of doing both in ModE.

One other function of the nominative is to show what the subject complement is. As Bauer elaborates on p. 133, “A subject complement is a phrase like the teacher in sentences such as Miss Smith is the teacher.” In grammar, a complement is a word or phrase that combines with the head of a phrase or clause to form a larger phrase. A subject complement, then, “refers to the same person [or thing] as the subject of the sentence,” and since it’s referring to the subject, the nominative comes into play (133).

But despite some similarities, we’re again touching on prescriptive vs. descriptive. With “It is I” vs. “It is me,” the meaning is there in both sentences, and pretty much everyone understands what’s being communicated.

As “It is me” became a more popular sentence, some decided to argue against it, basing that argument on Latin grammar, because, as we’ve talked about, Latin was so revered. But, as we’ve also talked about, English and Latin have their differences. They have different sets of grammatical functions and rules.

Beyond those differences, it’s difficult to enforce prescriptive rules at all, because people use language in non-prescriptive ways all the time.

Lecture Notes for Language Myth 14: “Double Negatives are Illogical”– Jenny Cheshire

The types of double negatives that most people dislike pair negative words like “not,” “never,” and “no” with a negative verb like “isn’t,” “aren’t,” “ain’t,” “won’t,” etc. As Cheshire notes, these constructions “make [people’s] blood boil” and are “appall[ing]” (114).

Most speakers of ModE approach this mathematically: two negatives equal a positive. But the truths that human language and mathematical language represent are not the same. Mathematical figures represent an exact truth, either right or wrong, whereas words have definitions and connotations, and their meanings are much richer, and sometimes more problematic, than mathematical figures. So when we encounter a double negative, as in the sentence “I ain’t going to no clinic today!”, we can fairly assume that the speaker isn’t uttering this with math in mind, but is rather making an emphatic statement about the desire not to go to a clinic.

Too, if we were to interpret this mathematically (just for the sake of argument), we run into some problems, especially since we’re given no context with this sentence. If the two negatives cancel each other out, what’s left? That the speaker is going to a clinic today, sure, but what clinic or which clinic? A clinic? The clinic? Any clinic? Anything that could be considered a “clinic,” like a general practitioner’s office, or a dance clinic, or an animal clinic? The point is, if we apply mathematical rules of negation to language, we can’t then refuse to apply math to the rest of the utterance, lest we catch ourselves in a contradiction. A point Cheshire makes at the top of p. 115 is that “when we have two negatives to deal with…the question is not just whether or not they are illogical, but precisely which logical issues are involved and how they interrelate with each other and with the rest of the utterance.”
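
For anyone who likes seeing the “two negatives make a positive” assumption spelled out, here is a minimal sketch (mine, not Cheshire’s; the function names are hypothetical labels for the two readings) contrasting Boolean cancellation, where each negative flips the truth value, with negative concord, where any number of negatives expresses a single negation:

```python
# Two ways to interpret "I ain't going to no clinic today!"

def boolean_reading(negatives: int, proposition: bool) -> bool:
    """The 'mathematical' view: each negative flips the truth value."""
    for _ in range(negatives):
        proposition = not proposition
    return proposition

def concord_reading(negatives: int, proposition: bool) -> bool:
    """Negative concord: one or more negatives together express a single negation."""
    return not proposition if negatives > 0 else proposition

going = True  # the proposition "I am going to a clinic today"
print(boolean_reading(2, going))  # True  -- "ain't" and "no" cancel out
print(concord_reading(2, going))  # False -- both negatives mark one emphatic denial
```

On the concord reading, which is how speakers actually use the sentence, adding more negatives (“I ain’t never going to no clinic”) doesn’t flip the meaning back and forth; it just piles on emphasis.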

Dealing more directly with negatives and double negatives, the distinction between “negative” and “not negative” may not always be so simple or so binary. Let’s take this example: compare a Lucky Strike (a cigarette with no filter) to a Marlboro Ultralight, and ask which is less bad for one’s health. Most people will say the Ultralight, but it’s important that we use the phrase “less bad,” because there are no cigarettes that are good for one’s health. So one is bad, one is worse, which means that the first is less bad. Considering the context, it just doesn’t feel right to ask “which is better for one’s health?”, even though the mathematical standards some of us impose on double negatives would dictate that we do, in fact, say “better” instead of “less bad.”

Another example, which I heard several times when I was working in a bakery: people would ask which donut was healthiest. Really, there’s no such thing as a “healthy donut” (it’s dough fried in lard, after all), and as with the other example, it might be more appropriate to ask which donut is the least horrible for one’s health. A plain cake donut is less bad than a blueberry bismarck topped with buttercream frosting, but try explaining to your thighs that the plain cake donut is healthier.

Often, we don’t distinguish the world in such binary ways. Consider the difference between “not mean” and “nice.” They aren’t necessarily interchangeable terms, but sometimes we need that greyer area to describe things. For those going into teaching, too, there’s a big difference between saying that a student’s essay sucks and saying that the essay needs work. Sometimes we need that wiggle room, and language gives us that opportunity.

There are other examples, too, in which very few would interpret two negative terms as cancelling each other out and equaling a positive or affirmative term. If someone said to a small child, “No, dear, don’t drink that. Put it back under the sink,” no one would interpret that as “Drink that.” Or if a waiter asks if you’d like to see the dessert menu and you say, “Oh, no, not for me, thanks,” the waiter probably isn’t going to look at that as two negatives cancelling each other out and wheel out the cake tray. Sometimes we use negative terms for emphasis, and that rarely gets called into question.

To bring this to a larger scale, quite a few languages use double negatives. Spanish, Ukrainian, Persian, Russian, and Italian all use double negatives, and speakers of those languages regard them as normal; since they’re a standard part of those languages, double negatives don’t come under the scrutiny that they do in ModE. And speaking of English, another language that uses double negatives is our precursor: Old English.

Some instances of double negatives are going to stand out more to us than others, especially since ours is a language in which double negatives are stigmatized. But if we consider their use in context, and pay closer attention to all the less-noticeable instances of double negatives we encounter, we might not be so quick to jump to conclusions and cry foul.

Lecture Notes for Language Myth 12: “Bad Grammar Is Slovenly”– Lesley Milroy

As we’ve talked about on several occasions, this myth deals with prescriptive vs. descriptive grammar. Prescriptive grammar is an imposition of rules (and I use that term loosely), some of which are arbitrary. Descriptive grammar, on the other hand, describes how we use language in a more natural sense; it is more context-specific and takes into account dialect and register. People use English differently from each other and differently from the received standard, and that’s fine. It contributes to the richness and variety of English, which, historically speaking, is a pretty varied language as it is.

When we get into the questions posed at the beginning of the essay, we run into some issues with definitions. In what context do we mean “grammar,” and more specifically, how are we defining “bad” grammar? Further, we usually have some idea of what we’re considering and the particular standards we use to justify our claims of labeling an utterance as “bad grammar,” but what aren’t we considering when we use these terms?

In the first three examples beginning on p. 94, we have three descriptive sentences and their prescriptive counterparts. And while the descriptive sentences do not meet the standards of prescriptive grammar, consider the communicative act itself—is there any ambiguous meaning or difficulty in understanding what these descriptive sentences are communicating? I’d be hard-pressed to believe that there is.

Milroy, the essay’s author, mentions codification on p. 95, which is a standardizing of a language. And there are certainly some benefits to having a standard, or at least an idea of a standard. It helps us determine what constitutes good academic writing. It helps us understand how to communicate in highly formal settings. By knowing the standard, we’re better able to criticize it constructively. But imposing a standard renders anything below it as something lesser, and, usually, as something not good enough in that specific context. And telling people that their language isn’t good enough certainly has its drawbacks.

Prescriptivism is largely based on “cultural, [social], and political pressures” (Milroy 96). This is why some standards are based on Latin, even though Latin grammar and English grammar function differently in many ways. As we’ve talked about before, Latin was a revered language, and to give English a similar reverence, those writing the rules based many English standards on Latin.

Milroy mentions that prescriptivism is something of a “linguistic etiquette,” and of course, there are certain contexts in which a heightened etiquette is a good idea (96): writing one’s thesis, speaking to a judge while on trial, speaking to a job interviewer, etc. If, however, every speaker of English used that heightened register in every communicative occurrence, not only would we sound insufferably stuffy, but the language would be stripped of its variety and color, and the diversity with which we use the language would severely diminish, and possibly disappear. And again, what do we stand to lose if everyone used the highest form of language etiquette all the time?

Towards the bottom of p. 98, Milroy makes an excellent point: “A grammatical sentence…follows the rules of the language as it is used by its native speakers” (emphasis mine). Even if we can’t name the linguistic phenomena or identify certain parts of a sentence, like the direct object, most of us still produce utterances that are understandable, and in that case, grammatical. (There are exceptions: children, those who have suffered strokes or have aphasia, etc.) Those instances notwithstanding, ungrammatical sentences like “That book is my!” or “I didn’t went to school yesterday” are not going to occur naturally in the language. Our unconscious knowledge of grammar (our mental grammar) provides this understanding, whether we’re aware of it or not.

Towards the bottom of p. 99, Milroy makes another point, which is quite interesting: “[L]inguistic prescriptivism [is] the last open door to discrimination.” And that happens quite often; I’m sure we’ve all made judgments about a person based on his/her grammar. When we’re in positions to pass judgment about someone’s use of grammar in certain contexts, like if we’re English teachers reading students’ essays, we’re probably going to bring an elevated sense of grammatical use to the table. However, in most cases, passing judgment about one’s grammar and how it reflects a person is rather uncalled for, and may say more about us than it does about the other person. Which is worse: “bad” grammar or judgmental pedantry?

Too, if we’re talking about grammar, we also have to consider the medium, and be aware that there are some big differences between writing and speaking. Speaking is often ephemeral and fleeting; there’s no chance to revise a spoken utterance. Sure, you can go back and correct yourself, or acknowledge that you misspoke, but that doesn’t change the fact that the misspeaking is already out and the listener heard it. If we pay attention, speech is more prone to “slovenliness” (and again, I’m using that term loosely): “um,” “uh,” “er,” “like,” “y’know,” and other verbal tics, nervous throat-clearing, unfinished sentences, etc. are rampant in most people’s speech. And as with commenting on someone’s grammar, there’s a certain context in which telling people that they say “um” too much may be received as constructive criticism and not unnecessary judgment.

To wrap this up, definitions of “bad grammar” and “good grammar” are quite difficult to maintain when we take some necessary facets of language use into account. How people define these terms varies across geographic and dialectal regions, and to remain steadfast to one definition of “bad grammar” is to impose a standard that is not as black-and-white as we might think; the catchall term is exceedingly difficult to maintain in the grand perspective of the use of English.

Lecture Notes for Language Myth 11: “Italian Is Beautiful, German Is Ugly”– Howard Giles and Nancy Niedzielski

This is something of a tough myth, because we have to keep in mind that beauty is in the eye (or ear) of the beholder.

“Certain languages are more aesthetically pleasing than others” (p. 85)

-Italian and French are, because of how they sound, seen by many to be more elegant, sophisticated, sonorous, and romantic compared to languages like German and Arabic, which many see as harsh, ugly, and unpleasant.

-On the spectrum of aesthetics, English is somewhere in the middle—not receiving much praise, but also not much scorn for its sound.

-This, however, gets into perspectives about English dialects—some of which “sound better” than others. Compare Posh English (Upper Received Pronunciation) to a Cockney accent, which many find grating. Or a Midwestern accent compared to a Georgia drawl.

What this very quickly gets into is that opinions about how a language or dialect sounds often extend to a similar opinion about the language itself, and can (and often do) influence the way we think about the speakers of the language themselves.

-In this, perceptions of French and Italian lead to the positive stereotypes of art, romance, fine wine, gourmet food, and generally pleasant things, whereas perceptions of German lead to stereotypes of humorless, barbarous war-mongers. And that often overlooks that what’s revered about French and Italian culture can also be found in German culture; the language has little to do with the aesthetic cultural contributions. The language one uses, or the way one pronounces words, is not directly correlated to one’s intelligence, beliefs, or cultural contributions. And yes, evil German-speaking people exist, as do people with southern accents who actually are dumb rednecks and elegant Italian-speaking people, but the converse of each is also true.

So what makes ugly sounds “ugly” and vice versa?

-one approach, noted at the bottom of p. 86, is called the “inherent value hypothesis,” a rather contentious idea which holds that some languages/dialects/accents are simply more pleasing to the ear. And because some languages sound more pleasant to the human ear, they receive more prestige.

-because of this, social attitudes regarding the sounds one produces can affect one’s own opinions about one’s language/dialect/accent, and can lead to what is called “linguistic self-hatred,” in which speakers demean or deprecate themselves because of how they speak (p. 87). If you’re given enough grief about sounding Minnesotan, especially by society at large, you may come to hate your accent and way of speaking.

-as it is, though, jabs at Minnesota-speak are fairly playful and don’t carry as much negative significance as jabs at, say, southern accents.

There’s also something called the “social connotation hypothesis,” which is that the sounds of a language are perceived in correlation to opinions about the nationality or ethnicity of the speakers themselves (p. 88).

-by and large, the closer we speak to a received standard, the better we’ll fare in terms of how we’re socially perceived, which can be a big deal when we’re job hunting. If you have the same qualifications as a person with a thick southern accent and you both apply for the same job, the person with the southern accent is less likely to get it, because his/her way of speaking is perceived as less pleasant.

-thus, many people in America associate the sounds of the Arabic language with negative perceptions of Arab people, perceptions that many of us realize are untrue and unfair, yet this view persists.

-So, hypothetically, if the various social conventions had deemed southern dialects standard, and all non-southern dialects nonstandard or substandard, our Midwestern accent might have been the one receiving the negative perceptions.

However you perceive a language in terms of its sound is one thing. What you think is pleasing or grating is mostly out of your control. But if your perceptions about the way a language sounds directly influence what you think of the people who speak that language, that’s another thing entirely.