Here’s an intriguing study for those interested in how language affects how we think. It’s also of interest to those who speak more than one language or are interested in learning another, because it deals with the long-debated question of whether bilinguals working in their non-native language automatically access native-language representations in long-term memory, or whether they can ‘switch off’ their native language and rely solely on target-language memory codes.
The study follows on from an earlier one by the same researchers, which indicated, through hidden priming effects, that bilinguals subconsciously access their first language when reading in their second. In this new study, 45 university students (15 native English speakers, 15 native Chinese speakers, and 15 Chinese-English bilinguals) were shown two blocks of 90 word pairs. The pairs could have positive valence (e.g., honesty-program), negative valence (failure-poet), or neutral valence (aim-carpenter), and could be semantically related (virus-bacteria; love-rose) or unrelated (weather-gender). The English or Chinese words were flashed on the screen one at a time, with a brief interval between the first and second word. The students had to indicate whether the second word was related in meaning to the first, and their brain activity was monitored.
The English and Chinese speakers acted as controls — it was the bilinguals, of course, who were the real interest. For some of the English word pairs, the Chinese translations shared a sound. If the Chinese words were automatically activated, therefore, this sound repetition would produce a priming effect.
This is indeed what was found (confirming the earlier finding and supporting the idea that native language translations are automatically activated) — but here’s the interesting thing: the priming effect occurred only for positive and neutral words. It did not occur when the bilinguals saw negative words such as war, discomfort, inconvenience, and unfortunate.
The finding, which surprised the researchers, is nonetheless consistent with previous evidence that anger, swearing, and discussions of intimate feelings carry more power in a speaker's native language. Parents, too, tend to speak to their infants in their native tongue. Emotion, it seems, is more strongly linked to our first language.
It’s traditionally thought that second-language processing is fundamentally determined by age of acquisition and level of proficiency. The differences in emotional resonance have, naturally enough, been attributed to the native language being acquired first. This finding suggests the story is a little more complicated.
The researchers theorize that they have touched on the mechanism by which emotion controls our fundamental thought processes. They suggest that the brain tries to protect us by minimizing the impact of distressing or disturbing emotional content: it shuts down unconscious access to the native language, in which the negative words would be more strongly felt.
A few more technical details for those interested:
The Chinese controls showed longer reaction times than the English controls, which suggests (given that 60% of the Chinese word pairs had overt sound repetition but no semantic relatedness) that this combination made the task substantially more difficult. The bilinguals, however, had reaction times comparable to those of the English controls. The Chinese controls showed no effect of emotional valence, but did show priming effects from the overt sound manipulation, and these were equal across all emotion conditions.
The native Chinese speakers had recently arrived in Britain to attend an English course. The bilinguals had been exposed to English since the age of 12 and had lived in Britain for an average of 20.5 months.