News Topic: Perception

About these topic collections

I’ve been reporting on memory research for over ten years, and these topic pages are simply collections of all the news items I have written on a particular topic. They do not pretend to be in any way exhaustive! I cover far too many areas within memory to come anywhere close to that. What I aim to provide is breadth rather than depth. Outside my own area of cognitive psychology, it is difficult to know how much weight to give to any single study (I urge you to read my blog post on what constitutes scientific evidence). That (among other reasons) is why my news reporting is based predominantly on replication and consistency. It's about the aggregate. So here is the aggregate of those reports I have at one point considered of sufficient interest to discuss. If you know of any research you would like to add to the collection, feel free to write about it in a comment (please provide a reference).

Latest news

A new study provides more evidence that meditation changes the brain, and different types of meditation produce different effects.

More evidence that even an 8-week meditation training program can have measurable effects on the brain comes from an imaging study. Moreover, the type of meditation makes a difference to how the brain changes.

The study involved 36 participants drawn from three different 8-week courses: mindful attention meditation, compassion meditation, and health education (the control group). The courses involved only two hours of class time each week, with meditation students encouraged to meditate for an average of 20 minutes a day outside class. There was a great deal of individual variability in the total amount of meditation done by the end of the course (210-1491 minutes for the mindful attention training course; 190-905 minutes for the compassion training course).

Participants’ brains were scanned three weeks before the courses began, and three weeks after the end. During each brain scan, the volunteers viewed 108 images of people in situations that were either emotionally positive, negative or neutral.

In the mindful attention group, the second brain scan showed a decrease in activation in the right amygdala in response to all images, supporting the idea that meditation can improve emotional stability and response to stress. In the compassion meditation group, right amygdala activity also decreased in response to positive or neutral images, but, among those who reported practicing compassion meditation most frequently, right amygdala activity tended to increase in response to negative images. No significant changes were seen in the control group or in the left amygdala of any participant.

The findings support the idea that meditation can be effective in improving emotional control, and that compassion meditation can indeed increase compassionate feelings. Increased amygdala activation was also correlated with decreased depression scores in the compassion meditation group, which suggests that having more compassion towards others may also be beneficial for oneself.

The findings also support the idea that the changes brought about by meditation endure beyond the meditative state, and that the changes can start to occur quite quickly.

These findings are all consistent with other recent research.

One point is worth emphasizing, in light of how difficult it has proved to develop a training program that improves working memory itself rather than simply performance on the task being practiced. These findings suggest that, unlike most cognitive training programs, meditation training might produce learning that is process-specific rather than stimulus- or task-specific, perhaps giving it wider generality than most cognitive training.

A study indicates that difficulty in seeing the whole, rather than the elements making it up, is associated with impairment in perceptual grouping, which is more common with age.

A standard test of how we perceive local vs global features of visual objects uses Navon figures — large letters made up of smaller ones (see below for an example). As in the Stroop test, where a color word is printed in a conflicting ink color (the word RED printed in green, say), the viewer can focus either on the large letter or the smaller ones. When the viewer is faster at seeing the larger letter, they are said to be showing global precedence; when they’re faster at seeing the component letters, they are said to be showing local precedence. Typically, the greater the number of component letters, the easier it is to see the larger letter. This is consistent with the Gestalt principles of proximity and continuity — elements that are close together and form smooth lines tend to be perceptually grouped together and seen as a unit (the greater the number of component letters, the closer together they are, and the smoother the line).

In previous research, older adults have often demonstrated local precedence rather than global, although the results have been inconsistent. One earlier study found that older adults performed poorly when asked to report in which direction (horizontal or vertical) dots formed smooth lines, suggesting an age-related decline in perceptual grouping. The present study therefore investigated whether this decline was behind the decrease in global precedence.

In the study, 20 young men (average age 22) and 20 older men (average age 57) were shown Navon figures and asked whether the target letter formed the large letter or the smaller letters (e.g., “Is the big or the small letter an E?”). The number of component letters was systematically varied across five quantities. Under such circumstances, it is expected that at a certain level of letter density everyone will switch to global precedence; if a person is impaired at perceptual grouping, however, this switch will occur at a higher density.

The young men were, unsurprisingly, markedly faster than the older men in their responses. They were also significantly faster at responding when the target was the global letter, compared to when it was the local letter (i.e. they showed global precedence). The older adults, on the other hand, had equal reaction times to global and local targets. Moreover, they showed no improvement as the letter-density increased (unlike the young men).

It is noteworthy that the older men, while they failed to show global precedence, also failed to show local precedence (remember that results are based on group averages; this suggests that the group was evenly balanced between those showing local precedence and those showing global precedence). Interestingly, previous research has suggested that women are more likely to show local precedence.

The link between perceptual grouping and global precedence is further supported by individual differences — older men who were insensitive to changes in letter-density were almost exclusively the ones that showed persistent local precedence. Indeed, increases in letter-density were sometimes counter-productive for these men, leading to even slower reaction times for global targets. This may be the result of greater distractor interference, to which older adults are more vulnerable, and to which this sub-group of older men may have been especially susceptible.

Example of a Navon figure:

FFFFFF
F
FFFFFF
F
FFFFFF
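
For readers who like to see the construction spelled out, here is a minimal sketch, in Python, of how such a figure can be generated. It is purely illustrative (the navon() helper and its 5x5 letter bitmaps are my own invention, not anything from the study), but it shows the idea of a global letter built out of repeated local letters:

def navon(big, small):
    # Crude 5x5 bitmaps for a couple of capital letters: 1 = draw the small letter, 0 = leave blank
    patterns = {
        "E": ["11111", "10000", "11111", "10000", "11111"],
        "H": ["10001", "10001", "11111", "10001", "10001"],
    }
    for row in patterns[big.upper()]:
        print("".join(small if cell == "1" else " " for cell in row))

navon("E", "F")   # prints a big E made of small Fs, as in the example above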

New research suggests that meditation can improve your ability to control alpha brainwaves, thus helping you block out distraction.

As I’ve discussed on many occasions, a critical part of attention (and working memory capacity) is being able to ignore distraction. There has been growing evidence that mindfulness meditation training helps develop attentional control. Now a new study helps fill out the picture of why it might do so.

The alpha rhythm is particularly active in neurons that process sensory information. When you expect a touch, sight or sound, the focusing of attention toward the expected stimulus induces a lower alpha wave height in neurons that would handle the expected sensation, making them more receptive to that information. At the same time the height of the alpha wave in neurons that would handle irrelevant or distracting information increases, making those cells less receptive to that information. In other words, alpha rhythm helps screen out distractions.

In this study, six participants who completed an eight-week mindfulness meditation program (MBSR) were found to generate larger alpha waves, and generate them faster, than the six in the control group. Alpha wave activity in the somatosensory cortex was measured while participants directed their attention to either their left hand or foot. This was done on three occasions: before training, at three weeks of the program, and after the program.

The MBSR program involves an initial two-and-a-half-hour training session, followed by daily 45-minute meditation sessions guided by a CD recording. The program focuses on training participants first to pay close attention to body sensations, then to focus on sensations in a specific area, and then to disengage and shift the focus to another body area.

Apart from helping us understand why mindfulness meditation training seems to improve attention, the findings may also explain why this meditation can help sufferers of chronic pain.

A new study suggests that one-off learning (that needs no repetition) occurs because the amygdala, center of emotion in the brain, judges the information valuable.

Most memory research has concerned itself with learning over time, but many memories, of course, become fixed in our mind after only one experience. The mechanism by which we acquire knowledge from single events is not well understood, but a new study sheds some light on it.

The study involved participants being presented with images degraded almost beyond recognition. After a few moments, the original image was revealed, generating an “aha!” type moment. Insight is an experience that is frequently remembered well after a single occurrence. Participants repeated the exercise with dozens of different images.

Memory for these images was tested a week later, when participants were again shown the degraded images, and asked to recall details of the actual image.

Around half the images were remembered. But what’s intriguing is that the initial learning experience took place in a brain scanner, and to the researchers’ surprise, one of the highly active areas during the moment of insight was the amygdala. Moreover, high activity in the amygdala predicted that those images would be remembered a week later.

It seems the more we learn about the amygdala, the further its involvement extends. In this case, it’s suggested that the amygdala signals to other parts of the brain that an event is significant. In other words, it gives a value judgment, decreeing whether an event is worthy of being remembered. Presumably the greater the value, the more effort the brain puts into consolidating the information.

Judging from the images used, it is not thought that those associated with high activity in the amygdala were more ‘emotional’ than the other images.

The intraparietal sulcus appears to be a hub for connecting the different sensory-processing areas as well as higher-order processes, and may be key to attention problems.

If our brains are full of clusters of neurons that resolutely respond only to specific features (as suggested in my earlier report), how do we bring it all together, and how do we switch from one point of interest to another? A new study using resting-state data from 58 healthy adolescents and young adults has found that the intraparietal sulcus, situated at the intersection of visual, somatosensory, and auditory association cortices and known to be a key area for processing attention, contains a miniature map of all the things we can pay attention to (visual, auditory, and motor stimuli, etc.).

Moreover, this map is copied in at least 13 other places in the brain, all of which are connected to the intraparietal sulcus. Each copy appears to do something different with the information. For instance, one map processes eye movements while another processes analytical information. This map of the world may be a fundamental building block for how information is represented in the brain.

There were also distinct clusters within the intraparietal sulcus that showed different levels of connectivity to auditory, visual, somatosensory, and default mode networks, suggesting they are specialized for different sensory modalities.

The findings add to our understanding of how we can shift our attention so precisely, and may eventually help us devise ways of treating disorders where attention processing is off, such as autism, attention deficit disorder, and schizophrenia.

[1976] Anderson, J. S., Ferguson M. A., Lopez-Larson M., & Yurgelun-Todd D. (2010).  Topographic maps of multisensory attention. Proceedings of the National Academy of Sciences. 107(46), 20110 - 20114.

Every moment a multitude of stimuli compete for our attention. Just how this competition is resolved, and how we control it, is not known. But a new study adds to our understanding.

Following on from earlier studies that found individual neurons were associated with very specific memories (such as a particular person), new research has shown that we can actually regulate the activity of specific neurons, increasing the firing rate of some while decreasing the rate of others.

The study involved 12 patients implanted with deep electrodes for intractable epilepsy. On the basis of each individual’s interests, four images were selected for each patient. Each of these images was associated with the firing of specific neurons in the medial temporal lobe. The firing of these neurons was hooked up to a computer, allowing the patients to make their particular images appear by thinking of them. When another image was superimposed as a distraction, creating a composite image, patients were asked to focus on their particular image; doing so successfully brightened the target image while the distractor image faded. The patients were successful 70% of the time in brightening their target image, and this success was primarily associated with increased firing of the specific neurons associated with that image.

I should emphasize that the use of a composite image meant that the participants had to rely on a mental representation rather than the sensory stimuli, at least initially. Moreover, when the feedback given was fake — that is, the patients’ efforts were no longer linked to the behavior of the image on the screen — success rates fell dramatically, demonstrating that their success was due to a conscious, directed action.

Different patients used different strategies to focus their attention. While some simply thought of the picture, others repeated the name of the image out loud or focused their gaze on a particular aspect of the image.

Resolving the competition of multiple internal and external stimuli is a process which involves a number of different levels and regions, but these findings help us understand at least some of the process that is under our conscious control. It would be interesting to know more about the relative effectiveness of the different strategies people used, but this was not the focus of the study. It would also be very interesting to compare effectiveness at this task across age, but of course this procedure is invasive and can only be used in special cases.

The study offers hope for building better brain-machine interfaces.

Researchers trained blindfolded people to recognize shapes through coded sounds, demonstrating the abstract nature of perception.

We can see shapes and we can feel them, but we can’t hear a shape. However, in a dramatic demonstration of just how flexible our brain is, researchers have devised a way of coding spatial relations in terms of sound properties such as frequency, and trained blindfolded people to recognize shapes by their sounds. They could then match what they heard to shapes they felt. Furthermore, they were able to generalize from their training to novel shapes.

The findings not only offer new possibilities for helping blind people, but also emphasize that sensory representations simply require systematic coding of some kind. This provides more evidence for the hypothesis that our perception of a coherent object ultimately occurs at an abstract level beyond the sensory input modes in which it is presented.

[1921] Kim, J. - K., & Zatorre R. J. (2010).  Can you hear shapes you touch?. Experimental Brain Research. 202(4), 747 - 754.

A strong preference for looking at moving shapes rather than active people was evident among toddlers with autism spectrum disorder.

A study involving 110 toddlers (aged 14-42 months), of whom 37 were diagnosed with an autism spectrum disorder and 22 with a developmental delay, has compared their behavior when watching a 1-minute movie depicting moving geometric patterns (a standard screen saver) on one side of a video monitor and children engaged in high-action activities, such as dancing or doing yoga, on the other.

It was found that only one of the 51 typically-developing toddlers preferred the shapes, but 40% of the ASD toddlers did, as well as 9% of the developmentally delayed toddlers. Moreover, all those who spent over 69% of the time focusing on the moving shapes were those with ASD.

Additionally, those with ASD who preferred the geometric images also showed a particular pattern of saccades (eye movements) when viewing the images — a reduced number of saccades, demonstrated in a fixed stare. It’s suggested that a preference for moving geometric patterns, combined with lengthy absorption in such images, might be an early identifier of autism. Such behavior should be taken as a signal to look for other warning signs, such as reduced enjoyment during back-and-forth games like peek-a-boo; an unusual tone of voice; failure to point at or bring objects to show; and failure to respond to their name.

[1891] Pierce, K., Conant D., Hazin R., Stoner R., & Desmond J. (2010).  Preference for Geometric Patterns Early in Life As a Risk Factor for Autism. Arch Gen Psychiatry. Advance online publication, doi:10.1001/archgenpsychiatry.2010.113.

A new study provides evidence for the theory that sensory integration is impaired in autism.

Children with autism often focus intently on a single activity or feature of their environment. A study involving 17 autistic children (6-16 years) and 17 controls has compared brain activity as they watched a silent video of their choice while tones and vibrations were presented, separately and simultaneously.

A simple stimulus takes about 20 milliseconds to arrive in the brain. When information from multiple senses registers at the same time, integration takes about 100 to 200 milliseconds in normally developing children. But those with autism took an average of 310 milliseconds to integrate the noise and vibration when they occurred together. The children with autism also showed weaker signal strength, signified by lower amplitude brainwaves.

The findings are consistent with theories that automatic sensory integration is impaired in autism, and may help explain autism’s characteristic sensitivity to excessive sensory stimulation.

Older news items (pre-2010) brought over from the old website

Perception affected by mood

An imaging study has revealed that when people were shown a composite image with a face surrounded by "place" images, such as a house, and asked to identify the gender of the face, those in whom a bad mood had been induced didn’t process the places in the background. However, those in a good mood took in both the focal and background images. These differences in perception were coupled with differences in activity in the parahippocampal place area. Increasing the amount of information is of course not necessarily a good thing, as it may result in more distraction.

[1054] Schmitz, T. W., De Rosa E., & Anderson A. K. (2009).  Opposing Influences of Affective State Valence on Visual Cortical Encoding. J. Neurosci.. 29(22), 7199 - 7207.

http://www.eurekalert.org/pub_releases/2009-06/uot-pww060309.php

What we perceive is not what we sense

Perceiving a simple touch may depend as much on memory, attention, and expectation as on the stimulus itself. A study involving macaque monkeys has found that the monkeys’ perception of a touch (varied in intensity) was more closely correlated with activity in the medial premotor cortex (MPC), a region of the brain's frontal lobe known to be involved in making decisions about sensory information, than activity in the primary somatosensory cortex (which nevertheless accurately recorded the intensity of the sensation). MPC neurons began to fire before the stimulus even touched the monkeys' fingertips — presumably because the monkey was expecting the stimulus.

[263] de Lafuente, V., & Romo R. (2005).  Neuronal correlates of subjective sensory experience. Nat Neurosci. 8(12), 1698 - 1703.

http://www.eurekalert.org/pub_releases/2005-11/hhmi-tsi110405.php

Varied sensory experience important in childhood

A new baby has far more connections between neurons than necessary; from birth to about age 12 the brain trims 50% of these unnecessary connections while at the same time building new ones through learning and sensory stimulation — in other words, tailoring the brain to its environment. A mouse study has found that without enough sensory stimulation, infant mice lose fewer connections — indicating that connections need to be lost in order for appropriate ones to grow. The findings support the idea that parents should try to expose their children to a variety of sensory experiences.

[479] Zuo, Y., Yang G., Kwon E., & Gan W. - B. (2005).  Long-term sensory deprivation prevents dendritic spine loss in primary somatosensory cortex. Nature. 436(7048), 261 - 265.

http://www.sciencentral.com/articles/view.htm3?article_id=218392607

Brain regions that process reality and illusion identified

Researchers have now identified the regions of the brain involved in processing what’s really going on versus what we think is going on. Macaque monkeys played a virtual reality video game in which they were tricked into thinking that they were tracing ellipses with their hands, although they were actually moving their hands in a circle. Monitoring of nerve cells revealed that the primary motor cortex represented the actual movement, while signals from cells in a neighboring area, the ventral premotor cortex, corresponded to the ellipses the monkeys perceived themselves to be tracing. Knowing how the brain works to distinguish between action and perception will help efforts to build biomedical devices that can control artificial limbs, one day enabling the disabled to move a prosthetic arm or leg by thinking about it.

[1107] Schwartz, A. B., Moran D. W., & Reina A. G. (2004).  Differential Representation of Perception and Action in the Frontal Cortex. Science. 303(5656), 380 - 383.

http://news-info.wustl.edu/tips/page/normal/652.html
http://www.eurekalert.org/pub_releases/2004-02/wuis-rpb020704.php

Memory different depending on whether information received via eyes or ears

Carnegie Mellon scientists using magnetic resonance imaging found quite different brain activity patterns for reading and listening to identical sentences. During reading, the right hemisphere was not as active as expected, suggesting a difference in the nature of comprehension experienced when reading versus listening. When listening, there was greater activation in a part of Broca's area associated with verbal working memory, suggesting that there is more semantic processing and working memory storage in listening comprehension than in reading. This should not be taken as evidence that comprehension is better in one or other of these situations, merely that it is different. "Listening to an audio book leaves a different set of memories than reading does. A newscast heard on the radio is processed differently from the same words read in a newspaper."

[2540] Michael, E. B., Keller T. A., Carpenter P. A., & Just M. A. (2001).  fMRI investigation of sentence comprehension by eye and by ear: Modality fingerprints on cognitive processes. Human Brain Mapping. 13(4), 239 - 252.

http://www.eurekalert.org/pub_releases/2001-08/cmu-tma081401.php

The chunking of our lives: the brain "sees" life in segments

We talk about "chunking" all the time in the context of memory. But the process of breaking information down into manageable bits occurs, it seems, right from perception. Magnetic resonance imaging reveals that when people watched movies of common, everyday, goal-directed activities (making the bed, doing the dishes, ironing a shirt), their brains automatically broke these continuous events into smaller segments. The study also identified a network of brain areas that is activated during the perception of boundaries between events. "The fact that changes in brain activity occurred during the passive viewing of movies indicates that this is how we normally perceive continuous events, as a series of segments rather than a dynamic flow of action."

Zacks, J. M., Braver, T. S., Sheridan, M. A., Donaldson, D. I., Snyder, A. Z., Ollinger, J. M., Buckner, R. L., & Raichle, M. E. (2001). Human brain activity time-locked to perceptual event boundaries. Nature Neuroscience, 4(6), 651-655.

http://www.eurekalert.org/pub_releases/2001-07/aaft-bp070201.php

Amygdala may be critical for allowing perception of emotionally significant events despite inattention

We choose what to pay attention to, what to remember. We give more weight to some things than others. Our perceptions and memories of events are influenced by our preconceptions, and by our moods. Researchers at Yale and New York University have recently published research indicating that the part of the brain known as the amygdala is responsible for the influence of emotion on perception. This builds on previous research showing that the amygdala is critically involved in computing the emotional significance of events. The amygdala is connected to those brain regions dealing with sensory experiences, and the theory that these connections allow the amygdala to influence early perceptual processing is supported by this research. Dr. Anderson suggests that “the amygdala appears to be critical for the emotional tuning of perceptual experience, allowing perception of emotionally significant events to occur despite inattention.”

[968] Anderson, A. K., & Phelps E. A. (2001).  Lesions of the human amygdala impair enhanced perception of emotionally salient events. Nature. 411(6835), 305 - 309.

http://www.eurekalert.org/pub_releases/2001-05/NYU-Infr-1605101.php

Odor

Rosemary is a herb long associated with memory. A small study now provides some support for the association, and for the possible benefits of aromatherapy. And a rat study indicates that your attitude to work might change how stimulants affect you.

A small study involving 20 people has found that those who were exposed to 1,8-cineole, one of the main chemical components of rosemary essential oil, performed better on mental arithmetic tasks. Moreover, there was a dose-dependent relationship — higher blood concentrations of the chemical were associated with greater speed and accuracy.

Participants were given two types of test: serial subtraction and rapid visual information processing. These tests took place in a cubicle smelling of rosemary. Participants sat in the cubicle for either 4, 6, 8, or 10 minutes before taking the tests (this was in order to get a range of blood concentrations). Mood was assessed both before and after, and blood was tested at the end of the session.

While blood levels of the chemical correlated with accuracy and speed on both tasks, the effects were significant only for the mental arithmetic task.

Participants didn’t know that the scent was part of the study, and those who asked about it were told it was left over from a previous study.

There was no clear evidence that the chemical improved attention, but there was a significant association with one aspect of mood, with higher levels of the scent correlating with greater contentment. Contentment was the only aspect of mood that showed such a link.

It’s suggested that this chemical compound may affect learning through its inhibiting effect on acetylcholinesterase (an important enzyme in the development of Alzheimer's disease). Most Alzheimer’s drugs are cholinesterase inhibitors.

While this is very interesting (although obviously a larger study needs to confirm the findings), what I would like to see is the effects on more prolonged mental efforts. It’s also a little baffling to find the effect being limited to only one of these tasks, given that both involve attention and working memory. I would also like to see the rosemary-infused cubicle compared to some other pleasant smell.

Interestingly, a very recent study also suggests the importance of individual differences. A rat study compared the effects of amphetamines and caffeine on cognitive effort. First of all, giving the rats the choice of easy or hard visuospatial discriminations revealed that, as with humans, individuals could be divided into those who tended to choose difficult trials (“workers”) and those who preferred easy ones (“slackers”). (Easy trials took less effort, but earned commensurately smaller reward.)

Amphetamine, it was found, made the slackers work harder, but made the workers take it easier. Caffeine, too, made the workers slack off, but had no effect on the slackers.

The extent to which this applies to humans is of course unknown, but the idea that your attitude to cognitive effort might change how stimulants affect you is an intriguing one. And of course this is a more general reminder that factors, whatever they are, have varying effects on individuals. This is why it’s so important to have a large sample size, and why, as an individual, you can’t automatically assume that something will benefit you, whatever the research says.

But in the case of rosemary oil, I can’t see any downside! Try it out; maybe it will help.

A rat study reveals how training can improve or impair smell perception.

The olfactory bulb is in the oldest part of our brain. It connects directly to the amygdala (our ‘emotion center’) and our prefrontal cortex, giving smells a more direct pathway to memory than our other senses. But the olfactory bulb is only part of the system processing smells. It projects to several other regions, all of which are together called the primary olfactory cortex, and of which the most prominent member is the piriform cortex. More recently, however, it has been suggested that it would be more useful to regard the olfactory bulb as the primary olfactory cortex (primary in the sense that it is first), while the piriform cortex should be regarded as association cortex — meaning that it integrates sensory information with ‘higher-order’ (cognitive, contextual, and behavioral) information.

Testing this hypothesis, a new rat study has found that, when rats were given training to distinguish various odors, each smell produced a different pattern of electrical activity in the olfactory bulb. However, only those smells that the rat could distinguish from others were reflected in distinct patterns of brain activity in the anterior piriform cortex, while smells that the rat couldn’t differentiate produced identical brain activity patterns there. Interestingly, the smells that the rats could easily distinguish were ones in which one of the ten components in the target odor had been replaced with a new component. The smells they found difficult to distinguish were those in which a component had simply been deleted.

When a new group of rats was given additional training (8 days vs the 2 days given the original group), they eventually learned to discriminate between the odors the first animals couldn’t distinguish, and this was reflected in distinct patterns of brain activity in the anterior piriform cortex. When a third group were taught to ignore the difference between odors the first rats could readily distinguish, they became unable to tell the odors apart, and similar patterns of brain activity were produced in the piriform cortex.

The effects of training were also quite stable — they were still evident after two weeks.

These findings support the idea of the piriform cortex as association cortex. It is here that experience modified neuronal activity. In the olfactory bulb, where all the various odors were reflected in different patterns of activity right from the beginning (meaning that this part of the brain could discriminate between odors that the rat itself couldn’t distinguish), training made no difference to the patterns of activity.

Having said that, it should be noted that this is not entirely consistent with previous research. Several studies have found that odor training produces changes in the representations in the olfactory bulb. The difference may lie in the method of neural recording.

How far does this generalize to the human brain? Human studies have suggested that odors are represented in the posterior piriform cortex rather than the anterior piriform cortex. They have also suggested that the anterior piriform cortex is involved in expectations relating to the smells, rather than representing the smells themselves. Whether these differences reflect species differences, task differences, or methodological differences, remains to be seen.

But whether or not the same exact regions are involved, there are practical implications we can consider. The findings do suggest that one road to olfactory impairment is through neglect — if you learn to ignore differences between smells, you will become increasingly less able to do so. An impaired sense of smell has been found in Alzheimer’s disease, Parkinson's disease, schizophrenia, and even normal aging. While some of that may well reflect impairment earlier in the perception process, some of it may reflect the consequences of neglect. The burning question is, then, would it be possible to restore smell function through odor training?

I’d really like to see this study replicated with old rats.

High-tech X-ray scans of ancient fossil skulls have revealed that the increase in brain size that began with the first mammals was driven by improvements in smell and touch.

The 190-million-year-old fossil skulls of Morganucodon and Hadrocodium, two of the earliest known mammal species, have revealed that even at this early stage of mammalian evolution, mammals had larger brains than would be expected for their body size. High-resolution CT scans of the skulls have now shown that this increase in brain size can be attributed to an increase in those regions dealing with smell and touch (mammals have a uniquely well developed ability to sense touch through their fur).

Comparison of these fossils with seven fossils of early reptiles (close relatives of the first mammals), 27 other primitive mammals, and 270 living mammals, has further revealed that the size of the mammalian brain evolved in three major stages. First, an initial increase in the olfactory bulb and related areas (including the cerebellum) by 190 million years ago; then another jump in the size of these regions shortly after that time; and finally an increase in those regions that control neuromuscular coordination by integrating different senses by 65 million years ago.

It’s speculated that the initial increase in smell and touch was driven by early mammals being nocturnal — dinosaurs being active during the day.

[2301] Rowe, T. B., Macrini T. E., & Luo Z. - X. (2011).  Fossil Evidence on Origin of the Mammalian Brain. Science. 332(6032), 955 - 957.

Previous research suggesting loss of smell function may serve as an early marker of Alzheimer's disease has now been supported by evidence from genetically engineered mice.

The supporting evidence is a finding that, in genetically engineered mice, loss of smell function is associated with amyloid-beta accumulation in the brain, and that amyloid pathology occurs first in the olfactory region. It was striking how sensitive olfactory performance was to even the smallest amount of amyloid in the brain, as early as three months of age (equivalent to a young adult).


Older news items (pre-2010) brought over from the old website

Why smells can be so memorable

Confirming the common experience of how strongly certain smells can evoke emotions or memories, an imaging study has found that when people were presented with a visual object together with one set of pleasant and unpleasant odors and sounds, and later with a second set, and were then shown the same objects a week later, there was unique activation in particular brain regions for their first olfactory (but not auditory) associations. This unique signature existed in the hippocampus regardless of how strong the memory was — that is, it was specific to olfactory associations. Regardless of whether the associations were smelled or heard, people remembered early associations more clearly when they were unpleasant.

[2543] Yeshurun, Y., Lapid H., Dudai Y., & Sobel N. (2009).  The Privileged Brain Representation of First Olfactory Associations. Current Biology. 19, 1869 - 1874.

http://www.physorg.com/news176649240.html

Difficulty identifying odors may predict cognitive decline

Older adults who have difficulty identifying common odors may have a greater risk of developing mild cognitive impairment, increasingly recognized as a precursor to Alzheimer’s disease. A study of nearly 600 older adults (average age 79.9) found that 30.1% developed mild cognitive impairment over the five-year period of the study. Risk of developing mild cognitive impairment was greater for those who scored worse on an odor identification test given at the start of the study. For example, those who scored below average (8 out of 12) were 50% more likely to develop MCI than those who scored above average (11). This association did not change when stroke, smoking habits, or other factors that might influence smell or cognitive ability were considered. Impaired odor identification was also associated with lower cognitive scores at the beginning of the study, and with a more rapid decline in episodic memory (memory of past experiences), semantic memory (memory of words and symbols), and perceptual speed. The odor test involved identifying 12 familiar odors, with four possible alternatives to choose from for each.

[1130] Wilson, R. S., Schneider J. A., Arnold S. E., Tang Y., Boyle P. A., & Bennett D. A. (2007).  Olfactory Identification and Incidence of Mild Cognitive Impairment in Older Age. Arch Gen Psychiatry. 64(7), 802 - 808.

http://www.eurekalert.org/pub_releases/2007-07/jaaj-dio062807.php

Odor can help memory, in some circumstances

A study in which students played a computer version of a common memory game (turning over pairs of cards to find each one's match) found that those who played in a rose-scented room, and were later exposed to the same scent during slow-wave sleep, remembered the locations of the cards significantly better than people who didn't have that experience (97% vs 86%). Those exposed to the odor during REM sleep, however, saw no memory boost. Imaging revealed the hippocampus was activated when the odor was presented during slow-wave sleep. Having the smell available throughout sleep wouldn’t help, however, because we adapt to smells very quickly. Being exposed to the smell when being tested didn’t help either. Nor did experiencing the odor during slow-wave sleep help when the memory task involved a different type of memory — learning a finger-tapping sequence — probably because procedural memory doesn’t depend on the hippocampus.

[1206] Rasch, B., Buchel C., Gais S., & Born J. (2007).  Odor Cues During Slow-Wave Sleep Prompt Declarative Memory Consolidation. Science. 315(5817), 1426 - 1429.

http://www.physorg.com/news92647884.html
http://www.nature.com/news/2007/070305/full/070305-10.html

Scent of fear impacts cognitive performance

A study involving 75 female students found that those who were exposed to chemicals from fear-induced sweat performed more accurately on word-association tasks than did women exposed to chemicals from other types of sweat or no sweat at all. When processing meaningfully related word pairs, the participants exposed to the fear chemicals were significantly more accurate than those in either the neutral sweat or the control (no-sweat) condition. When processing word pairs that were ambiguous in threat content, such as one neutral word paired with a threatening word or a pair of neutral words, subjects in the fear condition were significantly slower in responding than those in the neutral sweat condition.

Chen, D., Katdare, A. & Lucas, N. 2006. Chemosignals of Fear Enhance Cognitive Performance in Humans. Chemical Senses, Advance Access published on March 9, 2006

http://www.eurekalert.org/pub_releases/2006-03/ru-sof033106.php

Brain region involved in recalling memories from smell identified

We all know the power of smell in triggering the recall of memories. New research has found the specific area of the brain involved in this process: a section of the hippocampus called CA3. The hippocampus has long been known to play a crucial part in forming new memories. It appears that the CA3 region of the hippocampus is crucial for recalling memories from partial representations of the original stimulus.

[1060] Wilson, M. A., Tonegawa S., Nakazawa K., Quirk M. C., Chitwood R. A., Watanabe M., et al. (2002).  Requirement for Hippocampal CA3 NMDA Receptors in Associative Memory Recall. Science. 297(5579), 211 - 218.

http://www.eurekalert.org/pub_releases/2002-05/bcom-tr052902.php
http://news.bbc.co.uk/hi/english/health/newsid_2017000/2017321.stm

Hearing difficulties

A new study has found that errors in perceptual decisions occurred only when there was confused sensory input, not because of any ‘noise’ or randomness in the cognitive processing. The finding, if replicated across broader contexts, will change some of our fundamental assumptions about how the brain works.

The study unusually involved both humans and rats — four young adults and 19 rats — who listened to streams of randomly timed clicks coming into both the left ear and the right ear. After listening to a stream, the subjects had to choose the side from which more clicks originated.

The errors made by both humans and rats occurred invariably when two clicks overlapped. In other words, and against previous assumptions, the errors did not occur because of any ‘noise’ in the brain’s processing, but only when noise occurred in the sensory input.

The researchers report having ruled out alternative sources of confusion, such as “noise associated with holding the stimulus in mind, or memory noise, and noise associated with a bias toward one alternative or the other.”

However, before concluding that the noise which is the major source of variability and errors in more conceptual decision-making likewise stems only from noise in the incoming input (in this case external information), I would like to see the research replicated in a broader range of scenarios. Nevertheless, it’s an intriguing finding, and if indeed, as the researchers say, “the internal mental process was perfectly noiseless. All of the imperfections came from noise in the sensory processes”, then the ramifications are quite extensive.

The findings do add weight to recent evidence that a significant cause of age-related cognitive decline is sensory loss.

http://www.futurity.org/science-technology/dont-blame-your-brain-for-that-bad-decision/

[3376] Brunton, B. W., Botvinick M. M., & Brody C. D. (2013).  Rats and Humans Can Optimally Accumulate Evidence for Decision-Making. Science. 340(6128), 95 - 98.

A large study finds that hearing loss significantly increases the rate of cognitive decline in old age.

I’ve written before about the gathering evidence that sensory impairment, particularly visual impairment and hearing loss, is a risk factor for age-related cognitive decline and dementia. Now a large, long-running study provides more support for the association between hearing loss and age-related cognitive decline.

The study involved 1,984 older adults (aged 75-84) whose hearing and cognition was tested at the start of the study, with cognitive performance again assessed three, five, and six years later.

Those with hearing loss showed significantly faster cognitive decline than those with normal hearing — some 30-40% faster (41% on the MMSE; 32% on the Digit Symbol Substitution Test), with the rate directly related to the amount of hearing loss.

On average, older adults with hearing loss developed significant cognitive impairment 3.2 years sooner than those with normal hearing — a very significant difference indeed.

It has been suggested that increasing social isolation and loneliness may underlie some, if not all, of this association. It may also be that difficulties in hearing force the brain to devote too much of its resources to processing sound, leaving less for cognition. A third possibility is that some common factor underlies both hearing loss and cognitive decline — however, the obvious risk factors, such as high blood pressure, diabetes and stroke, were taken account of in the analysis.

The findings emphasize the importance of getting help for hearing difficulties, rather than regarding them as ‘natural’ in old age.

[3293] Lin, F. R., Yaffe K., Xia J., et al. (2013).  Hearing loss and cognitive decline in older adults. JAMA Internal Medicine. 1 - 7.

More evidence that learning a musical instrument in childhood, even for a few years, has long-lasting benefits for auditory processing.

Adding to the growing evidence for the long-term cognitive benefits of childhood music training, a new study has found that even a few years of music training in childhood has long-lasting benefits for auditory discrimination.

The study involved 45 adults (aged 18-31), of whom 15 had no music training, 15 had one to five years of training, and 15 had six to eleven years. Participants were presented with different complex sounds ranging in pitch while brainstem activity was monitored.

Brainstem response to the sounds was significantly stronger in those with any sort of music training, compared to those who had never had any music training. This was a categorical difference — years of training didn’t make a difference (although some minimal length may be required — only one person had only one year of training). However, recency of training did make a difference to brainstem response, and it does seem that some fading might occur over long periods of time.

This difference in brainstem response means that those with music training are better at recognizing the fundamental frequency (lowest frequency sound). This explains why music training may help protect older adults from hearing difficulties — the ability to discriminate fundamental frequencies is crucial for understanding speech, and for processing sound in noisy environments.

[3074] Skoe, E., & Kraus N. (2012).  A Little Goes a Long Way: How the Adult Brain Is Shaped by Musical Training in Childhood. The Journal of Neuroscience. 32(34), 11507 - 11510.

More evidence that music training protects older adults from age-related impairment in understanding speech, adding to the potential benefits of music training in preventing dementia.

I’ve spoken before about the association between hearing loss in old age and dementia risk. Although we don’t currently understand that association, it may be that preventing hearing loss also helps prevent cognitive decline and dementia. I have previously reported on how music training in childhood can help older adults’ ability to hear speech in a noisy environment. A new study adds to this evidence.

The study looked at a specific aspect of understanding speech: auditory brainstem timing. Aging disrupts this timing, degrading the ability to precisely encode sound.

In this study, automatic brain responses to speech sounds were measured in 87 younger and older normal-hearing adults as they watched a captioned video. It was found that older adults who had begun musical training before age 9 and engaged consistently in musical activities through their lives (“musicians”) not only significantly outperformed older adults who had no more than three years of musical training (“non-musicians”), but encoded the sounds as quickly and accurately as the younger non-musicians.

The researchers qualify this finding by saying that it shows only that musical experience selectively affects the timing of sound elements that are important in distinguishing one consonant from another, not necessarily all sound elements. However, it seems probable that it extends more widely, and in any case the ability to understand speech is crucial to social interaction, which may well underlie at least part of the association between hearing loss and dementia.

The burning question for many will be whether the benefits of music training can be accrued later in life. We will have to wait for more research to answer that, but, as music training and enjoyment fit the definition of ‘mentally stimulating activities’, this certainly adds another reason to pursue such a course.

Two large studies respectively find that common health complaints and irregular heartbeat are associated with an increased risk of developing Alzheimer’s, while a rat study adds to evidence that stress is also a risk factor.

A ten-year study involving 7,239 older adults (65+) has found that each common health complaint increased dementia risk by an average of about 3%, and that these individual risks compounded. Thus, while a healthy older adult had about an 18% chance of developing dementia after 10 years, those with a dozen of these health complaints had, on average, closer to a 40% chance.

It’s important to note that these complaints were not for serious disorders that have been implicated in Alzheimer’s. The researchers constructed a ‘frailty’ index, involving 19 different health and wellbeing factors: overall health, eyesight, hearing, denture fit, arthritis/rheumatism, eye trouble, ear trouble, stomach trouble, kidney trouble, bladder control, bowel control, feet/ankle trouble, stuffy nose/sneezing, bone fractures, chest problems, cough, skin problems, dental problems, other problems.

Not all complaints are created equal. The most common complaint — arthritis/rheumatism — was only slightly more common among those with dementia. Two of the largest differences were poor eyesight (3% of the non-demented group vs 9% of those with dementia) and poor hearing (3% and 6%).

At the end of the study, 4,324 (60%) were still alive, and of these, 416 (9.6%) had Alzheimer's disease, 191 (4.4%) had another sort of dementia and 677 (15.7%) had other cognitive problems (but note that 1,023 were of uncertain cognitive ability).

While these results need to be confirmed in other research — the study used data from broader health surveys that weren’t specifically designed for this purpose, and many of those who died during the study will have probably had dementia — they do suggest the importance of maintaining good general health.

Common irregular heartbeat raises risk of dementia

In another study, which ran from 1994 to 2008 and followed 3,045 older adults (mean age 74 at study start), those with atrial fibrillation were found to have a significantly greater risk of developing Alzheimer’s.

At the beginning of the study, 4.3% of the participants had atrial fibrillation (the most common kind of chronically irregular heartbeat); a further 12.2% developed it during the study. Participants were followed for an average of seven years. Over this time, those with atrial fibrillation had a 40-50% higher risk of developing dementia of any type, including probable Alzheimer's disease. Overall, 18.8% of the participants developed some type of dementia during the course of the study.

While atrial fibrillation is associated with other cardiovascular risk factors and disease, this study shows that atrial fibrillation increases dementia risk more than just through this association. Possible mechanisms for this increased risk include:

  • weakening the heart's pumping ability, leading to less oxygen going to the brain;
  • increasing the chance of tiny blood clots going to the brain, causing small, clinically undetected strokes;
  • a combination of these plus other factors that contribute to dementia such as inflammation.

The next step is to see whether any treatments for atrial fibrillation reduce the risk of developing dementia.

Stress may increase risk for Alzheimer's disease

And a rat study has shown that increased release of stress hormones leads to cognitive impairment and to one characteristic of Alzheimer’s disease: tau tangles. The rats were subjected to stress for an hour every day for a month, by such means as overcrowding or being placed on a vibrating platform. These rats developed increased hyperphosphorylation of tau protein in the hippocampus and prefrontal cortex, and these changes were associated with memory deficits and impaired behavioral flexibility.

Previous research has shown that stress leads to that other characteristic of Alzheimer’s disease: the formation of beta-amyloid.

Another study confirms the cognitive benefits of extensive musical training that begins in childhood, at least for hearing.

A number of studies have demonstrated the cognitive benefits of music training for children. Now research is beginning to explore just how long those benefits last. This is the second study I’ve reported on this month that points to childhood music training protecting older adults from aspects of cognitive decline. In this study, 37 adults aged 45 to 65, of whom 18 were classified as musicians, were tested on their auditory and visual working memory, and their ability to hear speech in noise.

The musicians performed significantly better than the non-musicians at distinguishing speech in noise, and on the auditory temporal acuity and working memory tasks. There was no difference between the groups on the visual working memory task.

Difficulty hearing speech in noise is among the most common complaints of older adults, but age-related hearing loss only partially accounts for the problem.

The musicians had all begun playing an instrument by age 8 and had consistently played an instrument throughout their lives. Those classified as non-musicians had no musical experience (12 of the 19) or less than three years at any point in their lives. The seven with some musical experience rated their proficiency on an instrument at less than 1.5 on a 10-point scale, compared to at least 8 for the musicians.

Physical activity levels were also assessed. There was no significant difference between the groups.

The finding that visual working memory was not affected supports the idea that musical training helps domain-specific skills (such as auditory and language processing) rather than general ones.

Another study builds on earlier indications that hearing loss is a risk factor for dementia, and emphasizes the need for early intervention.

Data from the Baltimore Longitudinal Study on Aging, begun in 1958, has revealed that seniors with hearing loss are significantly more likely to develop dementia than those who retain their hearing. The study involved 639 people whose hearing and cognitive abilities were tested between 1990 and 1994, then re-tested every one to two years. By 2008, 58 (9%) of them had developed dementia (37 of which were Alzheimer’s).

Those with hearing loss at the beginning of the study were significantly more likely to have developed dementia. The degree of hearing loss also correlated with greater risk: those with mild, moderate, and severe hearing loss had, respectively, twofold, threefold, and fivefold the risk of developing dementia over time. The association was maintained after other risk factors (high blood pressure, smoking, education, age, sex, race) were taken into account.

The reason for the association is not yet known. It’s possible that a common pathology may underlie both, or that the strain of decoding sounds over the years may make the brain more vulnerable to dementia, or that hearing loss makes people more socially isolated (a known risk factor for dementia).

The findings do suggest that hearing loss should be regarded more seriously, and not simply accepted as a natural part of growing old.

Two recent studies point to how those lacking one sense might develop enhanced abilities in their other senses, and to what limits this ability.

An experiment with congenitally deaf cats has revealed how deaf or blind people might come to have enhanced abilities in their remaining senses. The deaf cats showed only two specific enhanced visual abilities: visual localization in the peripheral field and visual motion detection. This enhancement was associated with the parts of the auditory cortex that would normally pick up peripheral and moving sound (the posterior auditory cortex for localization; the dorsal auditory cortex for motion detection) having been switched to processing this information for vision.

This suggests that only those abilities that have a counterpart in the unused part of the brain (auditory cortex for the deaf; visual cortex for the blind) can be enhanced. The findings also point to the plasticity of the brain. (As a side-note, did you know that apparently cats are the only animal besides humans that can be born deaf?)

The findings (and their broader implications) receive support from an imaging study involving 12 blind and 12 sighted people, who carried out an auditory localization task and a tactile localization task (reporting which finger was being gently stimulated). While the visual cortex was mostly inactive when the sighted people performed these tasks, parts of the visual cortex were strongly activated in the blind. Moreover, the accuracy of the blind participants correlated directly with the strength of the activation in the spatial-processing region of the visual cortex (the right middle occipital gyrus). This region was also activated in the sighted during spatial visual tasks.

A comprehensive review of the recent research into the benefits of music training for learning and the brain concludes that music training in schools should be strongly supported.

A review of the many recent studies into the effects of music training on the nervous system strongly suggests that the neural connections made during musical training also prime the brain for other aspects of human communication, including learning. It’s suggested that actively engaging with musical sounds not only enhances the brain’s plasticity, but also helps provide a stable scaffolding of meaningful patterns. Playing an instrument primes the brain to choose what is relevant in a complex situation, and trains it to make associations between complex sounds and their meaning, something that is also important in language.

Music training can also provide skills that enable speech to be better heard against background noise, which is useful not only for those with some hearing impairment (a common difficulty as we get older), but also for children with learning disorders. The review concludes that music training tones the brain for auditory fitness, analogous to the way physical exercise tones the body, and that the evidence justifies serious investment in music training in schools.

[1678] Kraus, N., & Chandrasekaran, B. (2010). Music training for the development of auditory skills. Nature Reviews Neuroscience, 11(8), 599-605.

A month's training in sound discrimination reversed normal age-related cognitive decline in the auditory cortex in old rats.

A rat study demonstrates how specialized brain training can reverse many aspects of normal age-related cognitive decline in targeted areas. The month-long study involved daily hour-long sessions of intense auditory training targeted at the primary auditory cortex. The rats were rewarded for picking out the oddball note in a rapid sequence of six notes (five of them of the same pitch). The difference between the oddball note and the others became progressively smaller. After the training, aged rats showed substantial reversal of their previously degraded ability to process sound. Moreover, measures of neuron health in the auditory cortex had returned to nearly youthful levels.
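The key feature of this kind of training is that difficulty adapts to performance: the pitch difference shrinks after correct responses and grows again after errors, keeping the task at the edge of the subject’s ability. The study doesn’t spell out the exact rule it used, but a simple adaptive "staircase" procedure of the sort commonly used in discrimination training might look roughly like the following sketch (all names and parameter values here are illustrative, not taken from the study):

    import random

    def run_oddball_session(n_trials=200, start_diff=12.0, min_diff=0.25,
                            step_down=0.9, step_up=1.2):
        """Toy adaptive ('staircase') oddball discrimination task.

        Each trial presents six notes, five at a standard pitch and one oddball
        shifted by `diff` semitones; a correct pick shrinks the difference and
        an error enlarges it, so difficulty tracks ability. All parameters are
        illustrative; the study doesn't report the exact rule it used.
        """
        diff = start_diff
        diffs = []
        for _ in range(n_trials):
            # Stand-in for a real listener: chance is 1/6 on a six-note trial,
            # and detection improves as the pitch difference grows.
            p_correct = 1 / 6 + (5 / 6) * min(1.0, diff / 10.0)
            correct = random.random() < p_correct
            diffs.append(diff)
            diff = max(min_diff, diff * step_down) if correct else diff * step_up
        return diffs

    print("final pitch difference (semitones):", round(run_oddball_session()[-1], 2))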

New research reveals that understanding speech depends on detecting change in the sound signal, making "low" vowels the most important sounds and "stop" consonants the least important.

As I get older, the question of how we perceive speech becomes more interesting (people don’t talk as clearly as they used to!). So I was intrigued by this latest research, which reveals that it is not so much a question of whether consonants or vowels are more important (although consonants do appear to be less important than vowels, the opposite of what is true for written language), but a matter of transitions. What matters are the very brief changes in amplitude and frequency that make sound-handling neurons fire more readily; after all, as we know from other perception research, we are built to notice and respond to change. The sounds most likely to rate as high-change are "low" vowels, such as the "ah" in "father" or "top", which draw the jaw and tongue downward. Least likely to produce much change are "stop" consonants, such as the "t" and "d" in "today". The physical measure of change corresponds closely with the linguistic construct of sonority (vowel-likeness).

[1632] Stilp, C. E., & Kluender, K. R. (2010). Cochlea-scaled entropy, not consonants, vowels, or time, best predicts speech intelligibility. Proceedings of the National Academy of Sciences, 107(27), 12387-12392.
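To make the "physical measure of change" idea concrete, here is a minimal sketch of measuring frame-to-frame spectral change in a waveform. To be clear, this is not the authors’ cochlea-scaled entropy metric (which filters the spectrum through cochlea-like frequency bands); it simply illustrates the general principle of scoring a sound by how much each brief slice of the signal differs from the one before it:

    import numpy as np

    def spectral_change(signal, sr, frame_ms=16):
        """Rough frame-to-frame spectral change in an audio signal.

        Not Stilp & Kluender's cochlea-scaled entropy; just an illustration:
        slice the waveform into short frames, compute each frame's spectrum,
        and measure how much successive frames differ.
        """
        frame_len = int(sr * frame_ms / 1000)
        n_frames = len(signal) // frame_len
        spectra = []
        for i in range(n_frames):
            frame = signal[i * frame_len:(i + 1) * frame_len]
            mag = np.abs(np.fft.rfft(frame * np.hanning(frame_len)))
            spectra.append(mag / (np.linalg.norm(mag) + 1e-12))  # normalize energy
        # Large distances = rapid change (vowel-like transitions); small = little change.
        return [float(np.linalg.norm(spectra[i + 1] - spectra[i]))
                for i in range(len(spectra) - 1)]

    # A steady tone changes very little from frame to frame; a frequency sweep changes a lot.
    sr = 16000
    t = np.linspace(0, 0.5, int(sr * 0.5), endpoint=False)
    steady = np.sin(2 * np.pi * 220 * t)
    sweep = np.sin(2 * np.pi * (220 + 800 * t) * t)
    print(np.mean(spectral_change(steady, sr)), np.mean(spectral_change(sweep, sr)))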

Older news items (pre-2010) brought over from the old website

Music training helps you hear better in noisy rooms

I’ve often talked about the benefits of musical training for cognition, but here’s a totally new benefit. A study involving 31 younger adults (aged 19-32) with normal hearing has found that musicians (at least 10 years of music experience, music training begun before age 7, and practice more than 3 times weekly within the previous 3 years) were significantly better than the non-musicians at hearing and repeating sentences in increasingly noisy conditions. The number of years of music practice also correlated positively with better working memory and better tone discrimination ability. Hearing speech in noisy environments is of course difficult for everyone, but particularly for older adults, who are likely to have some hearing and memory loss, and for poor readers.

[960] Parbery-Clark, A., Skoe, E., Lam, C., & Kraus, N. (2009). Musician enhancement for speech-in-noise. Ear and Hearing, 30(6), 653-661.

http://www.eurekalert.org/pub_releases/2009-08/nu-tum081709.php

Why it's hard to hear in a crowded room

New research helps explain why it’s difficult for those with impaired hearing to follow conversation involving several different people, particularly in a busy setting such as a restaurant or at a party. It appears that as you attend to a continuous auditory stream (such as one person speaking from one location), your attention becomes refined and improves over time. However, if that person keeps changing location, or if you have to focus on more than one speaker, performance degrades as attention is switched and has to begin the process of building up again. It’s speculated that the same sort of attentional selectivity may occur with objects in a complex visual scene (think of “Where’s Wally”).

[1148] Best, V., Ozmeral, E. J., Kopco, N., & Shinn-Cunningham, B. G. (2008). Object continuity enhances selective auditory attention. Proceedings of the National Academy of Sciences, 105(35), 13174-13178.

http://www.eurekalert.org/pub_releases/2008-08/bu-mta082108.php

Memory impairment associated with sound processing disorder

Central auditory processing dysfunction refers to the situation where hearing in quiet settings is normal or near normal but is substantially impaired in the presence of competing noise or in other difficult listening situations. Such a problem is not helped by amplification and requires alternative rehabilitation strategies. Central auditory processing has been found to be impaired in those with dementia. Now a study comparing individuals with dementia, those with mild memory impairment but without a dementia diagnosis, and those without memory loss, has found that scores on central auditory processing tests were significantly lower in both the group with dementia and in the group with mild memory impairment, compared to controls.

[302] Gates, G. A., Anderson, M. L., Feeney, P. M., McCurry, S. M., & Larson, E. B. (2008). Central auditory dysfunction in older persons with memory impairment or Alzheimer dementia. Archives of Otolaryngology--Head & Neck Surgery, 134(7), 771-777.

http://www.eurekalert.org/pub_releases/2008-07/jaaj-mia071708.php

Hearing loss in older adults may compromise cognitive resources for memory

A study comparing older adults with good hearing to a group with mild-to-moderate hearing loss has found that even when those with hearing loss could hear words well enough to repeat them, their ability to memorize and remember those words was poorer than that of individuals of the same age with good hearing. The researchers suggest that the extra effort spent on making out the words leaves fewer cognitive resources for higher-level comprehension and memory. Working memory capacity also tends to diminish as we age.

[394] Wingfield, A., Tun, P. A., & McCoy, S. L. (2005). Hearing Loss in Older Adulthood. Current Directions in Psychological Science, 14(3), 144-148.

http://www.eurekalert.org/pub_releases/2005-08/bu-hli082905.php
