This is the first of many posts meant to pursue the question of whether Spanglish as we know it can be considered its own language. My purpose was to find out whether the community of Spanglish speakers follows its own rules, consistent within the community, and whether enough of the range of life's experiences happens in a Spanglish environment that the language becomes thoroughly adapted, with its rules applying everywhere and children growing up knowing only Spanglish, or functioning entirely in Spanglish.
Now one obvious problem is that the Spanglish of the Puerto Rican boroughs of New York may differ from the Spanglish of Texas, or California, or Miami, or even Gibraltar, but I'll set that aside for now. There may be pockets of people living in Spanglish in any of many places, and they may differ from each other, or use it for different purposes. I'd like to focus on what is heard in Texas and northeastern Mexico, because that's where I can get the most information. But I still don't have much of a sense of how much is spoken, where it is spoken, how relentlessly the languages are mixed, and so on.
One claim I have heard more than once is that Spanglish is English, with Spanish words used whenever more emotion is being expressed. English thus has the sound of being colder and more analytical, so by nature saying Dios mio is more emotional than My God, simply because Spanish carries more expressive weight and feels stronger on the emotional side.
Now this would qualify as a rule, if it were consistent, if every speaker agreed on it, and if all emotional expressions were then consistently in Spanish. We could then study the phenomenon of Spanish replacement, which would assume that these expressions were placed into an English frame, with the frame adjusting to their Spanishness, while they in some cases adapted to the English sentence they were in.
I can use that as a working theory as I investigate what actually happens. I can tell you that these days there is plenty of data out there. Please remember that I myself have made no claims about what happens; I have just arrived in this area and really don't have much of a picture of it. Do people grow up in this language? Is it spoken enough that the community has agreements about how it's spoken and what is used? I have no idea. I'll use this as a springboard to find out.
My undergraduates have just finished writing term papers which were, to me, very interesting. They had three choices: 1) choose an uncommon or endangered language and write about it, 2) take a stand on whether "Spanglish" can be considered a language of its own, 3) investigate linguistic forensics and its application to a court case. About six jumped on #3, and it could be assumed that they are forensics majors; many of my students are. Only one took #2 (so far - in fact I have only received 34 out of about 40 papers); and the rest did endangered or obscure languages.
So let's talk about those. There were papers on Native American languages and their programs to survive and teach their children; similarly, there was an Australian one. There were two on Ainu and some on other Asian languages; there were also two papers on European languages, one in Poland and one in Belarus.
One was on Klingon, which was invented by a guy in response to the enthusiastic reaction to Star Trek and a group of aliens who appear on it. The guy supposedly was less concerned about grammar than about other elements of the language, which he spelled out very carefully. Classes have been held in Klingon, and Hamlet has been rewritten in Klingon. It has an enthusiastic fan base, but no group of native speakers. There is a tension between people who believe that only what the originator specified is valid, and people who would experiment with, change, and adapt the language. It raises questions about whether it is a language at all: interest in it is growing, and even maturing, yet these restrictions keep it from settling in as a full-fledged language.
Another paper was on Boontling, a language spoken in the small California town of Boonville. Apparently these rural people, in the late nineteenth century, started systematically making up new words and using them in place of English words, until their Boontling slang was indecipherable to outsiders; ultimately it became something of a souvenir, glorified in YouTube videos and in tourist literature. But the question is: did it ever have its own grammar, or were new words just plugged in systematically wherever English would have content words? My question, really, is whether it can be called its own language if it used English grammar and phonology entirely and simply replaced all the words. In the case of this language, there were generations of people who grew up within it, unclear about what common things (such as a phone) were really called; these native speakers ended up with a huge vocabulary, larger than that of either language alone.
A final question was about saving native languages, and Hawaiian came up as one that has received significant attention. Hawaiian is an interesting and unique language, and it attracted this attention partly for that reason. The University of Hawaii set about teaching Hawaiian as a second language and soon built up a culture of non-native speakers who were very interested in perpetuating the language. Meanwhile a group of native Hawaiians holed up on one of the islands, where outsiders were prohibited and where they could successfully perpetuate the language and a culture. But the inevitable happened, or is happening: this culture survives only by being isolated. Its elders die; its young generation seeks more action in the larger cities. Now, of the two populations who speak Hawaiian, it can be argued that the university one may be more vibrant, or that, due to lack of contact, the two variants will inevitably split from each other. Either group is just as likely to change anything in its variant, as we saw with English in colonial days. The set of changes that befall a language invariably creates a different language, when and if the groups don't maintain contact.
I'll read these papers more carefully as soon as I can, and report back. I find it an interesting subject, and want to pursue it more.
Labels: endangered languages, language, languages, ttu
Is Simply Being Able to Speak English Enough? (guest post)
The following is contributed by Eve Pearce, a guest writer. I gladly give over some of my weblog to consider her opinions, and wish her luck as a writer making her way in this digital world.
Is Simply Being Able to Speak English Enough?
The problem with learning English as a second language with a view to
working with natives is that simply being educated in standard English
may not be good enough. It may suffice for communicating with pen pals,
passing exams and surviving in a schooling environment; but none of
these scenarios accounts for the day-to-day abbreviations,
colloquialisms and slang that are simply not found in a textbook.
Even after the most studious of pupils have achieved the highest award
possible under any type of ESL course or similar variant,
they will not be prepared for the unruly English employment search
that is sure to leave them bemused. An important factor that is simply not taught under standard ESL guidelines
is how to use regional dialect. Regional dialect is a form of speech
particular to a particular region (hence the name), and it is essential
to understanding even the most basic day-to-day conversations. People
living in Newcastle in England, for instance, are referred to as
‘Geordies’, and the younger generations in particular make use of their
own slang, with words such as ‘mortal’, meaning inebriated.
Even English natives from a
different area might not comprehend such a shift in vocabulary – so how
would non-natives manage?
Similarly, inhabitants of Alabama in America may use the term
‘mudflap’ to refer to human hair, normally as a form of negative diction;
this subsequently carries negative connotations regarding the aesthetics of
one’s hair. Even if the descriptor is translated into a more
standardised form of dialect and taught as part of an ESL course, the
tonality and implications of the word cannot be expressed in
written form. This can lead to awkward situations that are not only
embarrassing, but can get the poor, confused individual into a lot of
trouble. For example, ‘spinster’ in commonly exchanged English refers to
a single woman in the same way that bachelor refers to a single male;
however, although bachelor clearly has positive connotations, spinster
is mostly viewed as a highly offensive term and implies the subject is
undesirable. A lady, I’m sure, would not take kindly to such a reference
made on her behalf! The list is almost endless in the sense that,
as soon as older terms become ‘unfashionable’, newer terms are
invented, either officially as additions to the Oxford English
Dictionary, or unofficially in the form of regional dialect or an
individual’s personal idiolect (their own distinct style of speech).
Evolution of Language
This evolution of language is nothing new or out of the norm; if you
consider Chaucer and Shakespeare, two of the most influential figures in
the history of the English language, and contemplate how they spoke, you
will see the vast contrast between their idiolects and the dialect we
now consider to be ‘standard English’. Natives who speak English
regularly would be somewhat perplexed if we communicated with each other
using such terms as ‘thou’ and ‘shalt’ and referred to objects with
different terminology, e.g. a sword as a rapier, as Shakespeare would have
done. The evolution of the English language is unfortunately something
to be expected, both formally and unofficially; the only way to truly
keep up to date with at least one variation of a particular dialect of
English is to immerse yourself in the culture of the area, which is not
something that can be done straight away if you can only afford to move
to the area once a suitable placement has been found.
One solution to the problems posed by regional colloquialisms, which truly
hinder a non-native’s ability to function, is to introduce a secondary
course once candidates have graduated from learning standard English.
This should be unique to the location the individual wishes to move to
and should incorporate practical elements that are required for day-to-day
living, such as trips to corner stores, doctors, public toilets,
offices and even local bars and clubs. Potential employees do not seem
to appreciate the value of being able to socialise efficiently. Business
meetings are often conducted over meals in restaurants, clients of
firms are often enticed in bars, and partnerships and deals are often
established only after a social platform, as well as a professional one, has
been established.
Some 20-30% of landing a dream role comes down not to the expertise and
qualifications held on a curriculum vitae, but to the charisma and
allure of the candidate. If the interviewee is unable to comprehend
native humour, manner and customs, they will fall at the first hurdle
before they have a chance to exhibit any of their professional qualities,
which will, undoubtedly, remain hidden no matter how impeccable and
invaluable they may be. If this weren’t the case, and clinical, cold
credentials were the only criterion for job roles, we would not have
face-to-face interviews. A very simple fact of life that inexplicably
seems to have been overlooked by those teaching English as a second
language is that the key to success, and even survival, is interaction.
Of course this is the fundamental reason for learning a language in the
first place, but, as human beings, we use a lot more than just an array
of precisely strung-together sentences to voice our opinions - we
gesticulate, and change our poise, tone and overall persona to adapt to an
environment. All of these factors differ from country to country, county
to county, state to state and even city to city, and are by no means
solely reliant on the English language.
Online Schools Under Fire: guest post by Brianna Meiers
Brianna Meiers, an education blogger who writes regular profiles of schools offering distance learning classes, takes a look at online higher education from a variety of standpoints in today’s article. The blog has talked about trends in online education before,
and this piece rounds out the discussion by highlighting some of the many ways in which online learning is continuing to change and improve.
Online Schools Under Fire: Assessing the Effort to Improve
The Internet is reshaping the face of modern education, bringing with it many benefits in terms of accessibility and universality. More programs than ever before are offering courses and degrees online. An increasingly diverse group of students is responding, which in many respects is good, but it also raises questions about legitimacy and true equality. While some online programs are arguably top-notch, not all are. Between non-accredited for-profit universities and so-called “diploma mills,” there is a lot of room for abuse in the digital space. E-student enrollments are on the rise across the board, which means that issues of legitimacy and value deserve a second look.
Digital learning has come a long way even in the last five years. The online-only schools that once dominated the space by catering primarily to adult learners have been supplemented by a growing number of undergraduate degree programs geared towards more traditional first-time students. Many of these are offered by for-profit online academies, but a growing number are extension programs of more traditional
universities. Some schools, notably Stanford, Harvard, and the Massachusetts Institute of Technology, also offer some courses for free online through the Coursera and edX platforms.
The trend is catching on quickly. For schools, offering digital lessons can be a great way to attract more students and broaden the tuition base. Students enrolling online often benefit from lower costs, as well as exceptional flexibility. “Roughly one-in-four college graduates report that they have taken a class online. However, the share doubles to 46% among those who have graduated in the past ten years,” a 2011 report published by the Pew Internet and American Life Project said. “Among all adults who have taken a class online, 39% say the format’s educational value is equal to that of a course taken in a classroom.”
This trend may be indicative of what some call a “disruptive innovation” in the educational sector. Following this model, a movement starts off small, usually by targeting just a few customers or, in this case, students. It picks up speed incrementally, effecting a slow permeation until the landscape has changed forever.
One of the biggest concerns with online education has to do with its quality. Though immensely popular, for-profit Internet schools traditionally have a lower reputation than in-person institutions. If online learning truly is a disruptive innovation, however, there is a
strong likelihood that its quality will only progress and evolve with market needs.
“A disruptive innovation always starts out at a lower quality,” Louis Soares, director of postsecondary education at the Center for American Progress, told U.S. News & World Report. “But if you take that for-profit energy out of higher education, online education wouldn’t have grown the way it has in the last 10 years.”
And growing it still is, in some sectors eclipsing enrollment numbers at more traditional universities. This has some worrying that online education will replace the need for classroom-based learning, though most experts agree that these fears are too far out on the horizon to be a concern. More likely, most say, is a radical technological integration into traditional classrooms. Many of the college classes of the future are likely to be “flipped,” with lectures and lessons happening online in a student’s homework time. Class time is then devoted to more collaborative learning and the person-to-person teamwork and interaction that is more difficult to simulate in the online space.
“Although traditional lectures are a standard and acceptable teaching method, they are much less effective for today's students, who have grown up on high-speed Internet, video games and mobile gadgets,” EdTech magazine reported in 2012. “In moving away from the lecture-only model, faculty and students are using recorded class lectures; notebook computers and tablets; and digital content and learning management systems. Smartphones, student response systems and blogs are also on the rise as learning technologies.”
Digital learning is still somewhat new when it comes to educational developments, however, and lawmakers are quick to caution against moving too quickly. Innovate too fast, and key aspects of learning may be lost, many say. An October 2012 meeting of government and education policy officials in Washington, D.C. tried to flesh out the questions related to online schools in more depth, mirroring the conversations happening in state
governments and university boardrooms across the nation.
Top of the national meeting agenda were questions of cheating and fraud online—common when students cannot definitively authenticate their identity—and the value and needed rigor of accreditation. Arne Duncan, the U.S. Secretary of Education, played a key role in moderating the day’s discussions, trying to strike a balance between the sometimes questionable quality of online institutions and the unprecedented openness they bring to the market.
Inside Higher Ed, which covered the meeting, focused on the debate over how much oversight the government should have of online learning platforms. “Duncan echoed other participants throughout the day in his call for better data, standards, and accountability in higher education,” an Inside Higher Ed report said the day after the meeting. “That naturally leads to tension in creating quality-protecting policies without becoming an overly prescriptive regulatory regime.”
Though the overall value of online higher education is sometimes in doubt, there is little question that it is here to stay. Not only are more students than ever before earning online degrees and completing credited coursework online, more traditional programs are looking for ways to leverage technology for residential students. Rather than asking whether these programs should exist, scholars and academics may be better served wondering how they can be tailored to provide the best possible educational experience.
Leverett, T. (2009, Mar.). Red ink and the OK Corral, written for Error correction frontier: The good, the bad and the ugly (Discussion, Writing IS, TESOL 2009, Denver CO USA). Available: https://docs.google.com/document/pub?id=1JCPRiIt2yBGSGCMdwDTbaalPipTvGw1fw3JD52L7DYk
Leverett, T. (2009). Red ink and the mud-wrestler’s grasp. Available: https://docs.google.com/document/pub?id=1zpEfLQpYIdR2b682ggi0D9DUUTq5OZR6VLc6thAt2xw
These two, when I went looking for them, were hard to find. I'd written them and put them on SIU's web; when SIU took down its web, I put them on Google Docs, but when I went to find them the other day, Google made me log in. That, plus the fact that even typing one title into Google elicited nothing, makes me pretty sure they're too hard to find.
One thing I think happens is that Google gets tired of dead links and gets a little too aggressive about wiping them out. Since these titles were associated with dead links, all links with these titles were suspect, and pretty soon there were no links to the articles themselves. But the other question involves Google Docs itself. Do you really have to log in to read them? If so, I can see how the free web now has no links to them. I'd like to correct that, for what it's worth. The idea was for people to be able to read them. Not that they do.
This brings me to the topic itself. It is a running argument in my present workplace. We have heard the standard line (Truscott's) that grammatical correction will not improve students' English as much as simply reading more will. Many of us believe that correction is an important step in the process, whether you see improvement directly (the day after) or not; we know from watching people learn whether they have learned anything, whether we've wasted our time, whether they needed to hear it. One thing I find interesting is that in a lab setting you get to watch individual people more closely, and because you know them, you actually see whether things did in fact work. You learn better ways of saying things, better ways of handling situations. I'm enjoying that; it's interesting. It is, by nature, a different crowd, and therefore a different question from the one dealt with in the articles above. But similar logic applies: we have to decide what's the best use of our time. And it's still an open question.
Labels: bib, grammar, sla