Aging - how cognitive function declines

About these topic collections

I’ve been reporting on memory research for over ten years, and these topic pages are simply collections of all the news items I have written on a particular topic. They do not pretend to be in any way exhaustive! I cover far too many areas within memory to come anywhere close to that. What I aim to do is provide breadth, rather than depth. Outside my own area of cognitive psychology, it is difficult to know how much weight to give to any study (I urge you to read my blog post on what constitutes scientific evidence). That (among other reasons) is why my approach in my news reporting is based predominantly on replication and consistency. It's about the aggregate. So here is the aggregate of those reports I have at one point considered of sufficient interest to discuss. If you know of any research you would like to add to the collection, feel free to write about it in a comment (please provide a reference).

Latest news

Analysis of brain scans and cognitive scores of 64 older adults from the NIA's Baltimore Longitudinal Study of Aging (average age 76) has found that, between the most cognitively stable and the most declining (over a 12-year period), there was no significant difference in the total amount of amyloid in the brain, but there was a significant difference in the location of amyloid accumulation. The stable group showed relatively early accumulation in the frontal lobes, while the declining group showed it in the temporal lobes.

[3624] Yotter, R. A., Doshi J., Clark V., Sojkova J., Zhou Y., Wong D. F., et al. (2013).  Memory decline shows stronger associations with estimated spatial patterns of amyloid deposition progression than total amyloid burden. Neurobiology of Aging. 34(12), 2835 - 2842.

Data from two longitudinal studies of older adults (a nationally representative sample of older adults, and the Alzheimer’s Disease Neuroimaging Initiative) has found that a brief cognitive test can distinguish memory decline associated with healthy aging from more serious memory disorders, years before obvious symptoms show up.

Moreover, the data challenge the idea that memory continues to decline through old age: after excluding the cognitively impaired, there was no evidence of further memory declines after the age of 69.

The data found that normal aging showed declines in recollective memory (recalling a word or event exactly) but not in reconstructive memory (recalling a word or event by piecing it together from clues about its meaning, e.g., recalling that “dog” was presented in a word list by first remembering that household pets were presented in the list). However, declines in reconstructive memory were reliable predictors of future progression from healthy aging to mild cognitive impairment and Alzheimer’s.

[3556] Brainerd, C. J., Reyna V. F., Gomes C. F. A., Kenney A. E., Gross C. J., Taub E. S., et al. (2014).  Dual-retrieval models and neurocognitive impairment. Journal of Experimental Psychology: Learning, Memory, and Cognition. 40(1), 41 - 65.

Data from the very large, long-running UK National Child Development Study has revealed that those who exercised at least four times weekly as both a child and an adult performed better on cognitive tests at age 50 than those who exercised two to three times per month or less, and the latter in turn performed better than those who hadn’t regularly exercised at all.

The data was collected through face-to-face interviews of more than 9,000 people at the ages of 11, 16, 33, 42, 46, and 50. Cognitive score was based on an immediate and delayed recall task (ten unrelated words), ability to name as many animals as possible in one minute, and time taken to cross out specified letters in a series.

The findings add a further perspective to the pile of evidence for the value of regular exercise in fighting age-related cognitive decline.

[3336] Dregan, A., & Gulliford M. C. (2013).  Leisure-time physical activity over the life course and cognitive functioning in late mid-adult years: a cohort-based investigation. Psychological Medicine. FirstView, 1 - 12.

A small study of “Super Agers” has found a key difference between them and typical older adults: an unusually large anterior cingulate (involved in attention), with four times as many von Economo neurons.

Scientific American article

Preliminary findings from a small study show that older adults, after learning to use Facebook, performed about 25% better on tasks designed to measure their ability to continuously monitor and to quickly add or delete the contents of their working memory (updating).

Report on Futurity

A new study confirms the role slow-wave sleep plays in consolidating memories, and reveals that one reason for older adults’ memory problems may be the quality of their sleep.

Recent research has suggested that sleep problems might be a risk factor in developing Alzheimer’s, and in mild cognitive impairment. A new study adds to this gathering evidence by connecting reduced slow-wave sleep in older adults to brain atrophy and poorer learning.

The study involved 18 healthy young adults (mostly in their 20s) and 15 healthy older adults (mostly in their 70s). Participants learned 120 word–nonsense word pairs and were tested for recognition before going to bed. Their brain activity was recorded while they slept. Brain activity was also measured in the morning, when they were tested again on the word pairs.

As has been found previously, older adults showed markedly less slow-wave activity (both over the whole brain and specifically in the prefrontal cortex) than the younger adults. Again, as in previous studies, the biggest difference between young and older adults in terms of gray matter volume was found in the medial prefrontal cortex (mPFC). Moreover, significant differences were also found in the insula and posterior cingulate cortex. These regions, like the mPFC, have also been associated with the generation of slow waves.

When mPFC volume was taken into account, age no longer significantly predicted the extent of the decline in slow-wave activity — in other words, the decline in slow-wave activity appears to be due to the brain atrophy in the medial prefrontal cortex. Atrophy in other regions of the brain (precuneus, hippocampus, temporal lobe) was not associated with the decline in slow-wave activity when age was considered.

Older adults did significantly worse on the delayed recognition test than young adults. Performance on the immediate test did not predict performance on the delayed test. Moreover, the highest performers on the immediate test among the older adults performed at the same level as the lowest young adult performers — nevertheless, these older adults did worse the following day.

Slow-wave activity during sleep was significantly associated with performance on the next day’s test. Moreover, when slow-wave activity was taken into account, neither age nor mPFC atrophy significantly predicted test performance.

In other words, age relates to shrinkage of the prefrontal cortex, this shrinkage relates to a decline in slow-wave activity during sleep, and this decline in slow-wave sleep relates to poorer cognitive performance.

The findings confirm the importance of slow-wave brainwaves for memory consolidation.

All of this suggests that poorer sleep quality contributes significantly to age-related cognitive decline, and that efforts should be made to improve quality of sleep rather than just assuming lighter, more disturbed sleep is ‘natural’ in old age!

Women who undergo surgical menopause at an earlier age may have an increased risk of cognitive decline.

The issue of the effect of menopause on women’s cognition, and whether hormone therapy helps older women fight cognitive decline and dementia, has been a murky one. Increasing evidence suggests that the timing and type of therapy is critical. A new study makes clear that we also need to distinguish between women who experience early surgical menopause and those who experience natural menopause.

The study involved 1,837 women (aged 53-100), of whom 33% had undergone surgical menopause (removal of both ovaries before natural menopause). For these women, earlier age of the procedure was associated with a faster decline in semantic and episodic memory, as well as overall cognition. The results stayed the same after factors such as age, education and smoking were taken into consideration.

There was also a significant association between age at surgical menopause and the plaques characteristic of Alzheimer's disease. However, there was no significant association with Alzheimer’s itself.

On the positive side, hormone replacement therapy was found to help protect those who had surgical menopause, with duration of therapy linked to a significantly slower decline in overall cognition.

Also positively, age at natural menopause was not found to be associated with rate of cognitive decline.

Bove, R. et al. 2013. Early Surgical Menopause Is Associated with a Spectrum of Cognitive Decline. To be presented at the American Academy of Neurology's 65th Annual Meeting in San Diego, March 21, 2013.

A mouse study shows that weakening unwanted or out-of-date connections is as important as making new connections, and that neurological changes as we age reduce our ability to weaken old connections.

A new study adds more support to the idea that the increasing difficulty in learning new information and skills that most of us experience as we age is not down to any difficulty in acquiring new information, but rests on the interference from all the old information.

Memory is about strengthening some connections and weakening others. A vital player in this process of synaptic plasticity is the NMDA receptor in the hippocampus. This glutamate receptor has two subunits (NR2A and NR2B), whose ratio changes as the brain develops. Children have higher ratios of NR2B, which lengthens the time neurons talk to each other, enabling them to make stronger connections, thus optimizing learning. After puberty, the ratio shifts, so there is more NR2A.

Of course, there are many other changes in the aging brain, so it’s been difficult to disentangle the effects of this changing ratio from other changes. This new study genetically modified mice to have more NR2A and less NR2B (reflecting the ratio typical of older humans), thus avoiding the other confounds.

To the researchers’ surprise, the mice were found to be still good at making strong connections (‘long-term potentiation’ - LTP), but instead had an impaired ability to weaken existing connections (‘long-term depression’ - LTD). This produces too much noise: bear in mind that each neuron has, on average, some 3,000 potential points of contact (i.e., synapses), and you will see the importance of turning down the noise!

Interestingly, LTD responses were only abolished within a particular frequency range (3-5 Hz), and didn’t affect 1Hz-induced LTD (or 100Hz-induced LTP). Moreover, while the mice showed impaired long-term learning, their short-term memory was unaffected. The researchers suggest that these particular LTD responses are critical for ‘post-learning information sculpting’, which they suggest is a step (hitherto unknown) in the consolidation process. This step, they postulate, involves modifying the new information to fit in with existing networks of knowledge.

Previous work by these researchers has found that mice genetically modified to have an excess of NR2B became ‘super-learners’. Until now, the emphasis in learning and memory has always been on long-term potentiation, and the role (if any) of long-term depression has been much less clear. These results point to the importance of both these processes in sculpting learning and memory.

The findings also seem to fit in with the idea that a major cause of age-related cognitive decline is the failure to inhibit unwanted information, and confirm the importance of keeping your mind actively engaged and learning, because this ratio is also affected by experience.

A mouse study indicates that caffeine can help prevent inflammation occurring in the brain, by blocking an early response to cell damage.

Caffeine has been associated with a lower risk of developing Alzheimer's disease in some recent studies. A recent human study suggested that the reason lies in its effect on proteins involved in inflammation. A new mouse study provides more support for this idea.

In the study, two groups of mice, one of which had been given caffeine, were exposed to hypoxia, simulating what happens in the brain during an interruption of breathing or blood flow. When re-oxygenated, caffeine-treated mice recovered their ability to form a new memory 33% faster than the other mice, and the caffeine was observed to have the same anti-inflammatory effect as blocking interleukin-1 (IL-1) signaling.

Inflammation is a key player in cognitive impairment, and IL-1 has been shown to play a critical role in the inflammation associated with many neurodegenerative diseases.

It was found that the hypoxic episode triggered the release of adenosine, the main component of ATP (your neurons’ fuel). Adenosine is released when a cell is damaged, and this leakage into the environment outside the cell begins a cascade that leads to inflammation (the adenosine activates an enzyme, caspase-1, which triggers production of the cytokine IL-1β).

But caffeine blocks adenosine receptors, stopping the cascade before it starts.

The finding gives support to the idea that caffeine may help prevent cognitive decline and impairment.

A new understanding of why dementia sometimes occurs with HIV, even when treated, may also suggest a new approach to other neurological disorders, including age-related cognitive decline.

HIV-associated dementia occurs in around 30% of untreated HIV-positive patients. Surprisingly, it is also occasionally found in some patients (2-3%) who are being successfully treated for HIV (and show no signs of AIDS).

A new study may have the answer for this mystery, and suggest a solution. Moreover, the answer may have general implications for those experiencing cognitive decline in old age.

The study found that HIV, although it doesn’t directly infect neurons, tries to stop the development of BDNF. BDNF has long been known to be crucial for memory and learning, and the reduced production of mature BDNF results in axons and dendrites shortening — meaning connections between neurons are lost. That, in turn, brings about the death of some neurons.

It seems that the virus interferes with the normal process of development in BDNF, whereby one form of it, called proBDNF, is cut by certain enzymes into a new form called mature BDNF. It is in this form that it has its beneficial effect on neuron growth. Unfortunately, in its earlier form it is toxic to neurons.

This imbalance in the proportions of mature BDNF and proBDNF also appears to occur as we age, and in depression. It may also be a risk factor in Parkinson's and Huntington's diseases.

However, these findings suggest a new therapeutic approach.

Compounds in green tea and chocolate may help protect brain cells

In which context, it is interesting to note another new study, which has been busy analyzing the effects on brain cells of 2,000 compounds, both natural and synthetic. Of the 256 that looked to have protective effects, nine were related to epicatechin, which is found in cocoa and green tea leaves.

While we’ve been aware for some time of these positive qualities, the study specifically identified epicatechin and epigallocatechin gallate (EGCG) as being the most effective at helping protect neurons by inducing production of BDNF.

One of the big advantages these compounds have is in their ability to cross the blood-brain barrier, making them a good candidate for therapy.

While green tea, dark chocolate, and cocoa are particularly good sources, many fruits also have good levels, in particular black grapes, blackberries, apples, cherries, pears, and raspberries. (See this UC Davis document (pdf) for more detail.)

Daily consumption of a high level of cocoa was found to improve cognitive scores, insulin resistance and blood pressure, in older adults with mild cognitive impairment.

Back in 2009, I reported briefly on a large Norwegian study that found that older adults who consumed chocolate, wine, and tea performed significantly better on cognitive tests. The association was assumed to be linked to the flavanols in these products. A new study confirms this finding, and extends it to older adults with mild cognitive impairment.

The study involved 90 older adults with MCI, who consumed either 990 milligrams, 520 mg, or 45 mg of a dairy-based cocoa drink daily for eight weeks. Their diet was restricted to eliminate other sources of flavanols (such as tea, red wine, apples and grapes).

Cognitive assessment at the end of this period revealed that, although scores on the MMSE were similar across all groups, those consuming higher levels of flavanol cocoa took significantly less time to complete Trail Making Tests A and B, and scored significantly higher on the verbal fluency test. Insulin resistance and blood pressure were also lower.

Those with the highest levels of flavanols did better than those on intermediate levels on the cognitive tests. Both did better than those on the lowest levels.

Changes in insulin resistance explained part, but not all, of the cognitive improvement.

One caveat: the group were generally in good health without known cardiovascular disease — thus, not completely representative of all those with MCI.


A honey bee study shows how old foraging bees quickly start to decline cognitively, and how this can be reversed in some if they return to more social domestic duties in the hive.

I often talk about the importance of attitudes and beliefs for memory and cognition. A new honey bee study provides support for this in relation to the effects of aging on the brain, and suggests that this principle extends across the animal kingdom.

Previous research has shown that bees that stay in the nest and take care of the young remain mentally competent, but they don’t nurse for ever. When they’re older (after about 2-3 weeks), they become foragers, and foraging bees age very quickly — both physically and mentally. Obviously, you would think, bees ‘retire’ to foraging, and their old age is brief (they begin to show cognitive decline after just two weeks).

But it’s not as simple as that, because in artificial hives where worker bees are all the same age, nurse bees of the same age as foragers don’t show the same cognitive and sensory decline. Moreover, nurse bees have been found to maintain their cognitive abilities for more than 100 days, while foragers die within 18 days and show cognitive declines after 13-15 days (although their ability to assess sweetness remains intact).

The researchers accordingly asked a very interesting question: what happens if the foragers return to babysitting?

To achieve this, they removed all of the younger nurse bees from the nest, leaving only the queen and babies. When the older, foraging bees returned to the nest, activity slowed down for several days, and then they re-organized themselves: some of the old bees returned to foraging; others took on the babysitting and housekeeping duties (cleaning, building the comb, and tending to the queen). After 10 days, around half of these latter bees had significantly improved their ability to learn new things.

This cognitive improvement was also associated with a change in two specific proteins in their brains: one that has been linked to protection against the oxidative stress and inflammation seen in Alzheimer's disease and Huntington's disease in humans (Prx6), and another dubbed a “chaperone” protein because it protects other proteins from being damaged when brain or other tissues are exposed to cell-level stress.

Precisely what it is about returning to the hive that produces this effect is a matter of speculation, but this finding does show that learning impairment in old bees can be reversed by changes in behavior, and this reversal is correlated with specific changes in brain protein.

Having said this, it shouldn’t be overlooked that only some of the worker bees showed this brain plasticity. This is not, apparently, due to differences in genotype, but may depend on the amount of foraging experience.

The findings add weight to the idea that social interventions can help our brains stay younger, and are consistent with growing evidence that, in humans, social engagement helps protect against dementia and age-related cognitive impairment.

The (probably) experience-dependent individual differences shown by the bees is perhaps mirrored in our idea of cognitive reserve, but with a twist. The concept of cognitive reserve emphasizes that accumulating a wealth of cognitive experience (whether through education or occupation or other activities) protects your brain from the damage that might occur with age. But perhaps (and I’m speculating now) we should also consider the other side of this: repeated engagement in routine or undemanding activities may have a deleterious effect, independent of and additional to the absence of more stimulating activities.

A study qualifies evidence that occupational exposure to solvents increases the risk of cognitive impairment later in life.

The study involved 4,134 people (average age 59) who worked at the French national gas and electric company, of whom most worked at the company for their entire career. Their lifetime exposure to chlorinated solvents, petroleum solvents, benzene and non-benzene aromatic solvents was estimated, and they were given the Digit Symbol Substitution Test to assess cognitive performance. Cognitive impairment was defined as scoring below the 25th percentile. Most of the participants (88%) were retired.

For analysis, participants were divided into two groups based on whether they had less than a secondary school education or not. This revealed an interesting finding: higher rates of solvent exposure were associated with cognitive impairment, in a dose-dependent relationship — but only in those with less than a high school education. Recency of solvent exposure also predicted worse cognition among the less-educated (suggesting that at least some of the damage was recoverable).

However, among those with secondary education or higher, there was no significant association between solvent exposure (quantity or recency) and cognition.

Over half the participants (58%) had less than a high school education. Of those, 32% had cognitive impairment — twice the rate in those with more education.

The type of solvent also made a difference, with non-benzene aromatic solvents the most dangerous, followed by benzene solvents, and then chlorinated and petroleum solvents (the rates of cognitive impairment among the highly-exposed, less-educated group were 36%, 24%, and 14%, respectively).

The findings point to the value of cognitive reserve, but I have several caveats. (Unfortunately, this study appears in a journal to which I don’t have access, so it’s possible the first of these at least is answered in the paper.) The first is that those with less education had higher rates of exposure, which raises the question of a threshold effect. The second is that the cognitive assessment was made at only one point in time, lacking both a baseline (do we know what sort of average score adults of this age and with this little education would achieve? A quick online search threw up no such appropriate normative data) and a time-comparison that would give a rate of decline. The third is that the cognitive assessment is very limited, being based on only one test.

In other words, the failure to find an effect among those with at least a high school education may well reflect the lack of sensitivity in the test (designed to assess brain damage). More sensitive tests, and test comparisons over time, may well give a different answer.

On its own, then, this finding is merely another data-point. But accumulating data-points is how we do science! Hopefully, in due course there’ll be a follow-up that will give us more information.

A small study has found that, in older adults, their sense of control fluctuates over the course of a day, and this affects their cognitive abilities.

Previous research has pointed to a typical decline in our sense of control as we get older. Maintaining a sense of control, however, appears to be a key factor in successful aging. Unsurprisingly, in view of the evidence that self-belief and metacognitive understanding are important for cognitive performance, a stronger sense of control is associated with better cognitive performance. (By metacognitive understanding I mean the knowledge that cognitive performance is malleable, not fixed, and that strategies and training are effective in improving cognition.)

In an intriguing new study, 36 older adults (aged 61-87, average age 74) had their cognitive performance and their sense of control assessed every 12 hours for 60 days. Participants were asked questions about whether they felt in control of their lives and whether they felt able to achieve goals they set for themselves.

The reason I say this is intriguing is that it’s generally assumed that a person’s sense of control — how much they feel in control of their lives — is reasonably stable. While, as I said, it can change over the course of a lifetime, until recently we didn’t think that it could fluctuate significantly in the course of a single day — which is what this study found.

Moreover, those who normally reported having a low sense of control performed much better on inductive reasoning tests during periods when they reported feeling a higher sense of control. Similarly, those who normally reported feeling a high sense of control scored higher on memory tests when feeling more in control than usual.

Although we can’t be sure (since this wasn’t directly investigated), the analysis suggests that the improved cognitive functioning stems from the feeling of improved control, not vice versa.

The study builds on an earlier study that found weekly variability in older adults’ locus of control and competency beliefs.

Assessment was carried out in the form of a daily workbook, containing a number of measures, which participants completed twice daily. Each assessment took around 30-45 minutes to complete. The measures included three cognitive tests (14 alternate forms of each of these were used, to minimize test familiarity):

  • Letter series test: 30 items in which the next letter in a series had to be identified. [Inductive reasoning]
  • Number comparison: 48 items in which two number strings were presented beside each other, and participants had to identify where there was any mismatch. [Perceptual speed]
  • Rey Auditory Verbal Learning Task: participants had to study a list of 15 unrelated words for one minute, then on another page recall as many of the words as they could. [Memory]

Sense of control over the previous 12 hours was assessed by 8 questions, to which participants indicated their agreement/disagreement on a 6-point scale. Half the questions related to ‘locus of control’ and half to ‘perceived competence’.

While, unsurprisingly, compliance wasn’t perfect (it’s quite an arduous regime), participants completed on average 115 of 120 workbooks. Of the possible 4,320 results (36 x 120), only 166 were missing.

One of the things that often annoys me is the subsuming of all within-individual variability in cognitive scores into averages. Of course averages are vital, but so is variability, and this too often is glossed over. This study is, of course, all about variability, so I was very pleased to see people’s cognitive variability spelled out.

Most of the variance in locus of control was of course between people (86%), but 14% was within-individual. Similarly, the figures for perceived competence were 88% and 12%. (While locus of control and perceived competence are related, only 26% of the variability in within-person locus of control was associated with competence, meaning that they are largely independent.)

By comparison, within-individual variability was much greater for the cognitive measures: for the letter series (inductive reasoning), 32% was within-individual and 68% between-individual; for the number matching (perceptual speed), 21% was within-individual and 79% between-individual; for the memory test, an astounding 44% was within-individual and 56% between-individual.
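For those curious how such percentages are derived, here is a minimal sketch (with made-up numbers, not the study's data) of the basic idea: total variance in repeated scores splits into a between-person component (how much people's averages differ from each other) and a within-person component (how much each person fluctuates around their own average).

```python
# Hypothetical repeated-measures data: rows = people, columns = repeated assessments.
scores = [
    [10, 11, 10, 12],   # person A: fluctuates a little around ~11
    [20, 19, 21, 20],   # person B: fluctuates a little around 20
    [15, 14, 16, 15],   # person C: fluctuates a little around 15
]

def variance(xs):
    """Population variance of a list of numbers."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# Within-person variance: average of each person's variance around their own mean.
within = sum(variance(row) for row in scores) / len(scores)

# Between-person variance: variance of the person means.
person_means = [sum(row) / len(row) for row in scores]
between = variance(person_means)

total = within + between
print(f"between-person share: {between / total:.0%}")
print(f"within-person share:  {within / total:.0%}")
```

With these made-up numbers, almost all the variance is between people (because each person is quite stable); the study's cognitive measures showed a much larger within-person share, which is exactly the point being made above.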

Some of this within-individual variability in cognitive performance comes down to practice effects, which were significant for all cognitive measures. For the memory test, time of day was also significant, with performance being better in the morning. For the letter series and number comparison tests, previous performance also had a small effect on perceived competence. For the number comparison, increase in competence subsequent to increased performance was greatest for those with lower scores. However, lagged analyses indicated that beliefs preceded performance to a greater extent than performance preceded beliefs.

While it wasn’t an aspect of this study, it should also be noted that a person’s sense of control may well vary according to domain (e.g., cognition, social interaction, health) and context. In this regard, it’s interesting to note the present findings that sense of control affected inductive reasoning for low-control individuals, but memory for high-control individuals, suggesting that the cognitive domain also matters.

Now this small study was a preliminary one and there are several limitations that need to be tightened up in subsequent research, but I think it’s important for three reasons:

  • as a demonstration that cognitive performance is not a fixed attribute;
  • as a demonstration of the various factors that can affect older adults’ cognitive performance;
  • as a demonstration that your beliefs about yourself are a factor in your cognitive performance.

[2794] Neupert, S. D., & Allaire J. C. (2012).  I think I can, I think I can: Examining the within-person coupling of control beliefs and cognition in older adults. Psychology and Aging. Advance online publication.

A large Swedish twin study reveals the prevalence of age-related cognitive impairment and points to the greater importance of environment over genes. Another very large study points to marked regional variation in mild cognitive impairment.

Data from 11,926 older twins (aged 65+) has found measurable cognitive impairment in 25% of them and subjective cognitive impairment in a further 39%, meaning that 64% of these older adults were experiencing some sort of cognitive impairment.

Although subjective impairment is not of sufficient magnitude to register on our measurement tools, that doesn’t mean that people’s memory complaints should be dismissed. It is likely, given the relative crudity of standard tests, that people are going to be aware of cognitive problems before they grow large enough to be measurable. Moreover, when individuals are of high intelligence or well-educated, standard tests can be insufficiently demanding. [Basically, subjective impairment can be thought of as a step before objective impairment, which itself is a step before mild cognitive impairment (MCI is a formal diagnosis, not simply a descriptive title), the precursor to Alzheimer’s. Note that I am calling these “steps” as a way of describing a continuum, not an inevitable process. None of these steps means that you will inevitably pass to the next step, but each later step will be preceded by the earlier steps.]

Those with subjective complaints were younger, more educated, more likely to be married, and to have higher socio-economic status, compared to those with objective impairment — supporting the idea that these factors provide some protection against cognitive decline.

The use of twins reveals that environment is more important than genes in determining whether you develop cognitive impairment in old age. For objective cognitive impairment, identical twins had a concordance rate of 52% compared to 50% in non-identical same-sex twins and 29% in non-identical different-gender twins. For subjective impairment, the rates were 63%, 63%, and 42%, respectively.
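To get a feel for why these concordance rates point to environment rather than genes, here’s a rough back-of-envelope calculation using Falconer’s classic formula (my illustration, not the researchers’ analysis; note that the formula strictly applies to twin correlations, not raw concordance rates, so treat the numbers as indicative only):

```python
# Falconer's rough heritability estimate: h2 ≈ 2 * (r_identical - r_fraternal).
# Strictly this uses twin correlations, not concordance rates, so the
# numbers below are illustrative only.
def falconer_h2(c_identical, c_fraternal):
    return 2 * (c_identical - c_fraternal)

# Same-sex concordance rates from the Swedish twin study:
objective_h2 = falconer_h2(0.52, 0.50)   # about 0.04: genes explain very little
subjective_h2 = falconer_h2(0.63, 0.63)  # 0.0: no genetic signal at all
```

The tiny gap between identical and fraternal twins is what drives the conclusion that environment matters more.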

National variation in MCI prevalence

Another very large study, involving 15,376 older adults (65+), has explored the prevalence of amnestic MCI in low- and middle-income countries: Cuba, Dominican Republic, Peru, Mexico, Venezuela, Puerto Rico, China, and India. Differences between countries were marked, with only 0.6% of older adults in China having MCI compared to 4.6% in India (Cuba 1.5%, Dominican Republic 1.3%, Peru 2.6%, Mexico 2.8%, Venezuela 1%, Puerto Rico 3% — note that I have selected the numbers after they were standardized for age, gender, and education, but the raw numbers are not greatly different).

Studies to date have focused mainly on European and North American populations, and have provided prevalence estimates ranging from 2.1%-11.5%, generally hovering around 3-5% (for example, Finland 5.3%, Italy 4.9%, Japan 4.9%, the US 6% — but note South Korea 9.7% and Malaysia 15.4%).

What is clear is that there is considerable regional variation.

Interestingly, considering their importance in Western countries, the effects of both age and education on the prevalence of aMCI were negligible. Granted, age and education norms were used in the diagnosis, but this is still curious. It may be that there was less variance in educational level in these populations. Socioeconomic status was, however, a factor.

Participants were also tested on the 12-item WHO disability assessment schedule (WHODAS-12), which assesses five activity-limitation domains (communication, physical mobility, self-care, interpersonal interaction, life activities and social participation). MCI was found to be significantly associated with disability in Peru, India, and the Dominican Republic (but negatively associated in China). Depression (informant-rated) was also only associated with MCI in some countries.

All of this, I feel, emphasizes the situational variables that determine whether an individual will develop cognitive impairment.

Caracciolo B, Gatz M, Xu W, Pedersen NL, Fratiglioni L. 2012. Differential Distribution of Subjective and Objective Cognitive Impairment in the Population: A Nation-Wide Twin-Study. Journal of Alzheimer's Disease, 29(2), 393-403.

[2801] Sosa, A. L., Albanese E., Stephan B. C. M., Dewey M., Acosta D., Ferri C. P., et al. (2012).  Prevalence, Distribution, and Impact of Mild Cognitive Impairment in Latin America, China, and India: A 10/66 Population-Based Study. PLoS Med. 9(2), e1001170 - e1001170.


A large ten-year study of middle-aged to older adults (45-70) has found that cognitive decline begins in the 45-55 decade, with reasoning ability the most affected by age.

The age at which cognitive decline begins has been the subject of much debate. The Seattle longitudinal study has provided most of the evidence that it doesn’t begin until age 60. A more recent, much larger study that allows both longitudinal and cross-sectional analysis suggests that, depressingly, mid-to-late forties might be closer to the mark.

A long-term British study known as Whitehall II began in 1985, when all civil servants aged 35-55 in 20 London-based departments were invited to participate. In 1997-9, 5198 male and 2192 female civil servants, aged 45-70 at this point, were given the first of three rounds of cognitive testing. The second round took place in 2002-4, and the third in 2007-9.

Over these ten years, all cognitive scores except vocabulary declined in all five age categories (45-49, 50-54, 55-59, 60-64, and 65-70 at baseline). Unsurprisingly, the decline was greater with increasing age, and greatest for reasoning. Men aged 45-49 at baseline showed a 3.6% decline in reasoning, compared to a 9.6% decline for those aged 65-70. Women were less affected by age: while showing the same degree of decline when younger, the oldest showed a 7.4% decline.

None of the other cognitive tasks showed the same age-related deterioration as reasoning, which displayed a consistently linear decline with advancing age. The amount of decline over ten years was roughly similar for each age group for short-term memory and phonemic and semantic fluency (although the women displayed more variability in memory, in a somewhat erratic pattern which may perhaps reflect hormonal changes — I’m speculating here). Moreover, the amount of decline in each decade for these functions was only about the same as reasoning’s decline in the younger decades — about -4% in each decade.

Men and women differed significantly in education (33% of men attended university compared to 21% of women; 57% of women never finished secondary school compared to 39% of men). It is therefore unsurprising that men performed significantly better on all cognitive tests except memory (noting that the actual differences in score were mostly quite small: 16.9/35 vs 16.5 for phonemic fluency; 16.7/35 vs 15.8 for semantic fluency; 25.7/33 vs 23.1 for vocabulary; 48.7/65 vs 41.6 for reasoning).

The cognitive tests included a series of 65 verbal and mathematical reasoning items of increasing difficulty (testing inductive reasoning), a 20-word free recall test (short-term verbal memory), recalling as many words as possible beginning with “S” (phonemic fluency) and recalling members of the animal category (semantic fluency), and a multi-choice vocabulary test.

The design of the study allowed both longitudinal and cross-sectional analyses to be carried out. Cross-sectional data, although more easily acquired, has been criticized as conflating age effects with cohort differences. Generations differ on several relevant factors, of which education is the most obvious. The present study partly confirmed this, finding that cross-sectional data considerably over-estimated cognitive decline in women but not men, reflecting the fact that education changed far more for women than for men in the relevant time periods. For example, in the youngest group of men, 30% had less than a secondary school education and 42% had a university degree, and the women showed a similar pattern, with 34% and 40%. However, for those aged 55-59 at baseline, the corresponding figures were 38% and 29% for men compared to 58% and 17% for women.

The principal finding is of course that measurable cognitive decline was evident in the youngest group, meaning that at some point during that 45-55 decade, cognitive faculties begin to decline. Of course, it should be emphasized that this is a group effect — individuals will vary in the extent and timing of any cognitive decline.

(A side-note: During the ten year period, 305 participants died. The probability of dying was higher in those with poorer cognitive scores at baseline.)

New findings show the T variant of the KIBRA gene improves episodic memory through its effect on hippocampal activity. Another study finds the met variant of the BDNF gene is linked to greater age-related cognitive decline.

Previous research has found that carriers of the so-called KIBRA T allele have been shown to have better episodic memory than those who don’t carry that gene variant (this is a group difference; it doesn’t mean that any carrier will remember events better than any non-carrier). A large new study confirms and extends this finding.

The study involved 2,230 Swedish adults aged 35-95. Of these, 1040 did not have a T allele, 932 had one, and 258 had two.  Those who had at least one T allele performed significantly better on tests of immediate free recall of words (after hearing a list of 12 words, participants had to recall as many of them as they could, in any order; in some tests, there was a concurrent sorting task during presentation or testing).

There was no difference between those with one T allele and those with two. The effect increased with increasing age. There was no effect of gender. There was no significant effect on performance of delayed category cued recall tests or a visuospatial task, although a trend in the appropriate direction was evident.

It should also be noted that the effect on immediate recall, although statistically significant, was not large.

Brain activity was studied in a subset of this group, involving 83 adults aged 55-60, plus another 64 matched on sex, age, and performance on the scanner task. A further group of 113 adults aged 65-75 was included for comparison purposes. While in the scanner, participants carried out a face-name association task. Having been presented with face-name pairs, participants were tested on their memory by being shown the faces with three letters, of which one was the initial letter of the name.

Performance on the scanner task was significantly higher for T carriers — but only for the 55-60 age group, not for the 65-75 age group. Activity in the hippocampus was significantly higher for younger T carriers during retrieval, but not encoding. No such difference was seen in the older group.

This finding is in contrast with an earlier, and much smaller, study involving 15 carriers and 15 non-carriers, which found higher activation of the hippocampus in non-T carriers. This was taken at the time to indicate some sort of compensatory activity. The present finding challenges that idea.

Although higher hippocampal activation during retrieval is generally associated with faster retrieval, the higher activity seen in T carriers was not fully accounted for by performance. It may be that such activity also reflects deeper processing.

KIBRA-T carriers were neither more nor less likely to carry other ‘memory genes’: APOE ε4, COMT val158met, or BDNF val66met.

The findings, then, fail to support the idea that non-carriers engage compensatory mechanisms, but do indicate that the KIBRA T allele helps episodic memory by improving hippocampal function.

BDNF gene variation predicts rate of age-related decline in skilled performance

In another study, this time into the effects of the BDNF gene, performance on an airplane simulation task was compared on three annual occasions. The study involved 144 pilots, all healthy Caucasian males aged 40-69, of whom 55 (38%) turned out to have at least one copy of a BDNF gene containing the ‘met’ variant. This variant is less common, occurring in about one in three Asians, one in four Europeans and Americans, and about one in 200 sub-Saharan Africans.

While performance dropped with age for both groups, the rate of decline was much steeper for those with the ‘met’ variant. Moreover, there was a significant inverse relationship between age and hippocampal size in the met carriers — and no significant correlation between age and hippocampal size in the non-met carriers.

Comparison over a longer time-period is now being undertaken.

The finding is more evidence for the value of physical exercise as you age — physical activity is known to increase BDNF levels in your brain. BDNF levels tend to decrease with age.

The met variant has been linked to higher likelihood of depression, stroke, anorexia nervosa, anxiety-related disorders, suicidal behavior and schizophrenia. It differs from the more common ‘val’ variant in having methionine rather than valine at position 66 on this gene. The BDNF gene has been remarkably conserved across evolutionary history (fish and mammalian BDNF have around 90% agreement), suggesting that mutations in this gene are not well tolerated.

Chimpanzee brains don’t shrink with age as humans’ do. It may be that cognitive impairment and even dementia are our lot because we work our brains too hard for too long.

Comparison of 99 chimpanzee brains ranging from 10-51 years of age with 87 human brains ranging from 22-88 years of age has revealed that, unlike the humans, chimpanzee brains showed no sign of shrinkage with age. But the answer may be simple: we live much longer. In the wild, chimps rarely live past 45, and although human brains start shrinking as early as 25 (as soon as they reach maturity, basically!), the shrinkage doesn’t become significant until around 50.

The answer suggests one reason why humans are uniquely vulnerable to Alzheimer’s disease — it’s all down to our combination of large brain and long life. There are other animals that experience some cognitive impairment and brain atrophy as they age, but nothing as extreme as that found in humans (a 10-15% decline in volume over the life-span). (Elephants and whales have the same two attributes as humans — large brains and long lives — but we lack information on how their brains change with age.)

The problem may lie in the fact that our brains use so much more energy than chimps’ (being more than three times larger than theirs) and thus produce a great deal more damaging oxidation. Over a longer life-span, this accumulates until it significantly damages the brain.

If that’s true, it reinforces the value of a diet high in antioxidants.

[2500] Sherwood, C. C., Gordon A. D., Allen J. S., Phillips K. A., Erwin J. M., Hof P. R., et al. (2011).  Aging of the cerebral cortex differs between humans and chimpanzees. Proceedings of the National Academy of Sciences. 108(32), 13029 - 13034.

A study indicates that difficulty in seeing the whole, vs elements of the whole, is associated with impairment in perceptual grouping, and this is more common with age.

A standard test of how we perceive local vs global features of visual objects uses Navon figures — large letters made up of smaller ones (see below for an example). As in the Stroop test, where a color word is printed in a conflicting ink color (the word RED printed in blue, say), the viewer can focus either on the large letter or the smaller ones. When the viewer is faster at seeing the larger letter, they are said to be showing global precedence; when they’re faster at seeing the component letters, they are said to be showing local precedence. Typically, the greater the number of component letters, the easier it is to see the larger letter. This is consistent with the Gestalt principles of proximity and continuity — elements that are close together and form smooth lines will tend to be perceptually grouped together and seen as a unit (the greater the number of component letters, the closer they will be, and the smoother the line).
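If you’d like to see what a Navon figure looks like, it’s easy to sketch one in a few lines of code (the H-made-of-S’s below is just an arbitrary example of my own, not a stimulus from the study):

```python
# Render a Navon-style figure as text: a large letter (here an H) built
# from many copies of a smaller letter (here S). The 7x7 grid is arbitrary.
H_SHAPE = [
    "X.....X",
    "X.....X",
    "X.....X",
    "XXXXXXX",
    "X.....X",
    "X.....X",
    "X.....X",
]

def navon(shape, small="S"):
    # Replace each marked cell with the small letter, everything else with space.
    return "\n".join(
        "".join(small if cell == "X" else " " for cell in row) for row in shape
    )

print(navon(H_SHAPE))
```

The global letter (H) only pops out when the small letters are dense and well aligned, which is the letter-density manipulation the study exploits.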

In previous research, older adults have often demonstrated local precedence rather than global, although the results have been inconsistent. One earlier study found that older adults performed poorly when asked to report in which direction (horizontal or vertical) dots formed smooth lines, suggesting an age-related decline in perceptual grouping. The present study therefore investigated whether this decline was behind the decrease in global precedence.

In the study 20 young men (average age 22) and 20 older men (average age 57) were shown Navon figures and asked whether the target letter formed the large letter or the smaller letters (e.g., “Is the big or the small letter an E?”). The number of component letters was systematically varied across five quantities. Under such circumstances it is expected that at a certain level of letter density everyone will switch to global precedence, but if a person is impaired at perceptual grouping, this will occur at a higher level of density.

The young men were, unsurprisingly, markedly faster than the older men in their responses. They were also significantly faster at responding when the target was the global letter, compared to when it was the local letter (i.e. they showed global precedence). The older adults, on the other hand, had equal reaction times to global and local targets. Moreover, they showed no improvement as the letter-density increased (unlike the young men).

It is noteworthy that the older men, while they failed to show global precedence, also failed to show local precedence (remember that results are based on group averages; this suggests that the group was evenly balanced between those showing local precedence and those showing global precedence). Interestingly, previous research has suggested that women are more likely to show local precedence.

The link between perceptual grouping and global precedence is further supported by individual differences — older men who were insensitive to changes in letter-density were almost exclusively the ones that showed persistent local precedence. Indeed, increases in letter-density were sometimes counter-productive for these men, leading to even slower reaction times for global targets. This may be the result of greater distractor interference, to which older adults are more vulnerable, and to which this sub-group of older men may have been especially susceptible.

Example of a Navon figure: a large letter, such as an H, built from many copies of a smaller letter, such as S.

New research explains why fewer new brain cells are created in the hippocampus as we get older.

It wasn’t so long ago we believed that only young brains could make neurons, that once a brain was fully matured all it could do was increase its connections. Then we found out adult brains could make new neurons too (but only in a couple of regions, albeit critical ones). Now we know that neurogenesis in the hippocampus is vital for some operations, and that the production of new neurons declines with age (leading to the idea that the reduction in neurogenesis may be one reason for age-related cognitive decline).

What we didn’t know is why this happens. A new study, using mice genetically engineered so that different classes of brain cells light up in different colors, has now revealed the life cycle of stem cells in the brain.

Adult stem cells differentiate into progenitor cells that ultimately give rise to mature neurons. It had been thought that the stem cell population remained stable, but that these stem cells gradually lose their ability to produce neurons. However, the mouse study reveals that during the mouse's life span, the number of brain stem cells decreased 100-fold. Although the rate of this decrease actually slows with age, and the output per cell (the number of progenitor cells each stem cell produces) increases, nevertheless the pool of stem cells is dramatically reduced over time.

The reason this happens (and why it wasn’t what we expected) is explained in a computational model developed from the data. It seems that stem cells in the brain differ from other stem cells. Adult stem cells in the brain wait patiently for a long time until they are activated. They then undergo a series of rapid divisions that give rise to progeny that differentiate into neurons, before ‘retiring’ to become astrocytes. What this means is that, unlike blood or gut stem cells (that renew themselves many times), brain stem cells are only used once.
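The described life cycle is easy to caricature in a toy simulation (to be clear, this is my own sketch with made-up rates, not the researchers’ computational model): because each stem cell is used only once, the pool can only shrink, and it shrinks roughly exponentially, so the absolute rate of loss slows with age even as the pool collapses.

```python
import random

# A toy caricature of the single-use stem-cell life cycle described above.
# NOT the researchers' model: pool size, activation rate, and burst size
# are arbitrary numbers chosen purely for illustration.

def simulate(pool=10_000, activation_rate=0.05, burst=8, steps=100, seed=1):
    rng = random.Random(seed)
    progenitor_output = []
    for _ in range(steps):
        # Each quiescent stem cell activates with some small probability...
        activated = sum(1 for _ in range(pool) if rng.random() < activation_rate)
        # ...produces a burst of progenitors, then retires as an astrocyte,
        # so the stem-cell pool only ever shrinks.
        pool -= activated
        progenitor_output.append(activated * burst)
    return pool, progenitor_output

final_pool, output = simulate()
# With these arbitrary rates the pool shrinks more than 100-fold, and the
# absolute loss per step slows over time (exponential-style decay).
```

Note how the decline slows in absolute terms while the pool still ends up dramatically depleted, matching the pattern the mouse study reports.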

This raises a somewhat worrying question: if we encourage neurogenesis (e.g. by exercise or drugs), are we simply using up stem cells prematurely? The researchers suggest the answer depends on how the neurogenesis has been induced. Parkinson's disease and traumatic brain injury, for example, activate stem cells directly, and so may reduce the stem cell population. However, interventions such as exercise stimulate the progenitor cells, not the stem cells themselves.

A new study reveals that older adults’ greater problems with multitasking stem from their impaired ability to disengage from an interrupting task and restore the original task.

Comparison of young adults (mean age 24.5) and older adults (mean age 69.1) in a visual memory test involving multitasking has pinpointed the greater problems older adults have with multitasking. The study involved participants viewing a natural scene and maintaining it in mind for 14.4 seconds. In the middle of the maintenance period, an image of a face popped up and participants were asked to determine its sex and age. They were then asked to recall the original scene.

As expected, older people had more difficulty with this. Brain scans revealed that, for both groups, the interruption caused their brains to disengage from the network maintaining the memory and reallocate resources to processing the face. But the younger adults had no trouble disengaging from that task as soon as it was completed and re-establishing connection with the memory maintenance network, while the older adults failed both to disengage from the interruption and to reestablish the network associated with the disrupted memory.

This finding adds to the evidence that an important (perhaps the most important) reason for cognitive decline in older adults is a growing inability to inhibit processing, and extends the processes to which that applies.

A new study of older adults indicates atrophy of the cerebellum is an important factor in cognitive decline for men, but not women.

Shrinking of the frontal lobe has been associated with age-related cognitive decline for some time. But other brain regions support the work of the frontal lobe. One in particular is the cerebellum. A study involving 228 participants in the Aberdeen Longitudinal Study of Cognitive Ageing (mean age 68.7) has revealed that there is a significant relationship between grey matter volume in the cerebellum and general intelligence in men, but not women.

Additionally, a number of other brain regions showed an association between gray matter and intelligence, in particular Brodmann Area 47, the anterior cingulate, and the superior temporal gyrus. Atrophy in the anterior cingulate has been implicated as an early marker of Alzheimer’s, as has the superior temporal gyrus.

The gender difference was not completely unexpected — previous research has indicated that the cerebellum shrinks proportionally more with age in men than women. More surprising was the fact that there was no significant association between white matter volume and general intelligence. This contrasts with the finding of a study involving older adults aged 79-80. It is speculated that this association may not develop until greater brain atrophy has occurred.

It is also interesting that the study found no significant relationship between frontal lobe volume and general intelligence — although the effect of cerebellar volume is assumed to occur via its role in supporting the frontal lobe.

The cerebellum is thought to play a vital role in three relevant areas: speed of information processing; variability of information processing; development of automaticity through practice.

An animal study points to confusion between memories being central to amnesia, rather than a failure to recall.

We have thought of memory problems principally in terms of forgetting, but a new experimental method used with amnesic animals has revealed that confusion between memories, rather than loss of memory, may be more important.

While previous research has found that amnesic animals couldn't distinguish between a new and an old object, the new method allows responses to new and old objects to be measured separately. Control animals, shown an object and then shown either the same or another object an hour later, spent more time (as expected) with the new object. However, amnesic animals spent less time with the new object, indicating they had some (false) memory of it.

The researchers concluded that the memory problems were the result of the brain's inability to register complete memories of the objects, and that the remaining, less detailed memories were more easily confused. In other words, it’s about poor encoding, not poor retrieval.

Excitingly, when the amnesic animals were put in a dark, quiet space before the memory test, they performed perfectly on the test.

The finding not only points to a new approach for helping those with memory problems (for example, emphasizing differentiating details), but also demonstrates how detrimental interference from other things can be when we are trying to remember something — an issue of particular relevance in modern information-rich environments. The extent to which these findings apply to other memory problems, such as dementia, remains to be seen.

New research shows that many old bees, like many older humans, have trouble replacing out-of-date knowledge with new memories.

I love cognitive studies on bees. The whole notion that those teeny-tiny brains are capable of the navigation and communication feats bees demonstrate is so wonderful. Now a new study finds that, just like us, aging bees find it hard to remember the location of a new home.

The study builds on earlier lab research demonstrating that old bees find it harder to learn floral odors. In this new study, researchers trained bees to a new nest box while their former nest was closed off. Groups composed of mature and old bees were given several days in which to learn the new home location and to extinguish their memory of the unusable former nest box. The new home was then disassembled, and groups of mixed-age bees were given three alternative nest locations to choose from (including the former nest box). Some old bees (those with symptoms of senescence) preferentially went to the former nest site, despite the experience that should have told them it was unusable.

The findings demonstrate that memory problems and increasing inflexibility with age are not problems confined to mammals.

Findings from a large Swedish study are consistent with the hypothesis that more education and better healthcare have produced less cognitive impairment in present-day older adults.

Beginning in 1971, healthy older adults in Gothenburg, Sweden, have been participating in a longitudinal study of their cognitive health. The first H70 study started in 1971 with 381 residents of Gothenburg who were 70 years old; a new one began in 2000 with 551 residents and is still ongoing. For the first cohort (born in 1901-02), low scores on non-memory tests turned out to be a good predictor of dementia; however, these tests were not predictive for the generation born in 1930. Those from the later cohort also performed better in the intelligence tests at age 70 than their predecessors had.

It’s suggested that the higher intelligence is down to the later cohort’s better pre- and postnatal care, better nutrition, higher-quality education, and better treatment of high blood pressure and cholesterol. And possibly the cognitive demands of modern life.

Nevertheless, the researchers reported that the incidence of dementia at age 75 was little different (5% in the first cohort and 4.4% in the later). However, since a substantially greater proportion of the first cohort were dead by that age (15.7% compared to 4.4% of the second cohort), it seems quite probable that there really was a higher incidence of dementia in the earlier cohort.
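A back-of-envelope calculation shows how differential mortality could hide a real difference (the 3x risk ratio below is purely hypothetical; the study doesn’t report dementia risk among those who died):

```python
# Survivorship sketch (illustrative only): what would the whole-cohort
# dementia rate look like if those who died before 75 had been likelier
# to be developing dementia than the survivors who were actually tested?

def adjusted(p_dementia_survivors, p_dead, risk_ratio):
    # Whole-cohort rate if decedents carried risk_ratio times the
    # survivors' dementia rate (capped at 100%).
    p_dementia_decedents = min(1.0, p_dementia_survivors * risk_ratio)
    return p_dementia_survivors * (1 - p_dead) + p_dementia_decedents * p_dead

# Survivor dementia rates and mortality from the study; the 3x ratio is
# a hypothetical assumption of mine.
early = adjusted(0.050, 0.157, 3)   # first cohort: rises to about 6.6%
late = adjusted(0.044, 0.044, 3)    # second cohort: rises only to about 4.8%
```

Because so many more of the first cohort died before testing, any link between cognitive decline and death widens the gap between the cohorts, which is exactly the argument being made.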

The fact that low scores on non-memory cognitive tests were predictive in the first cohort of both dementia and death by age 75 supports this argument.

The fact that low scores on non-memory cognitive tests were not predictive of dementia or death in the later cohort is in keeping with the evidence that higher levels of education help delay dementia. We will need to wait for later findings from this study to see whether that is what is happening.

The findings are not inconsistent with those from a very large U.S. national study that found older adults (70+) are now less likely to be cognitively impaired (see below). It was suggested then also that better healthcare and more education were factors behind this decline in the rate of cognitive impairment.

Previous study:

A new nationally representative study involving 11,000 people shows a downward trend in the rate of cognitive impairment among people aged 70 and older, from 12.2% to 8.7% between 1993 and 2002. It’s speculated that factors behind this decline may be that today’s older people are much likelier to have had more formal education, higher economic status, and better care for risk factors such as high blood pressure, high cholesterol and smoking that can jeopardize their brains. In fact the data suggest that about 40% of the decrease in cognitive impairment over the decade was likely due to the increase in education levels and personal wealth between the two groups of seniors studied at the two time points. The trend is consistent with a dramatic decline in chronic disability among older Americans over the past two decades.

A new study suggests that inconsistencies in rate of age-related cognitive decline may be partly due to practice effects, but though decline does occur it is slower than some have estimated.

Reports on cognitive decline with age have, over the years, come out with two general findings: older adults do significantly worse than younger adults; older adults are just as good as younger adults. Part of the problem is that there are two different approaches to studying this, each with their own specific bias. You can keep testing the same group of people as they get older — the problem with this is that they get more and more practiced, which mitigates the effects of age. Or you can test different groups of people, comparing older with younger — but cohort differences (e.g., educational background) may disadvantage the older generations. There is also argument about when it starts. Some studies suggest we start declining in our 20s, others in our 60s.

One of my favorite cognitive aging researchers has now tried to find the true story using data from the Virginia Cognitive Aging Project involving nearly 3800 adults aged 18 to 97 tested on reasoning, spatial visualization, episodic memory, perceptual speed and vocabulary, with 1616 tested at least twice. This gave a nice pool for both cross-sectional and longitudinal comparison (retesting ranged from 1 to 8 years and averaged 2.5 years).

From this data, Salthouse has estimated the size of practice effects and found them to be as large as or larger than the annual cross-sectional differences, although they varied depending on the task and the participant’s age. In general the practice effect was greater for younger adults, possibly because younger people learn better.

Once the practice-related "bonus points" were removed, age trends were flattened, with much less positive changes occurring at younger ages, and slightly less negative changes occurring at older ages. This suggests that change in cognitive ability over an adult lifetime (ignoring the effects of experience) is smaller than we thought.
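The logic of the correction is simple arithmetic (the numbers below are invented for illustration; they are not Salthouse’s estimates):

```python
# Invented numbers purely to illustrate the subtraction. The observed
# retest change conflates true age-related change with the practice
# (retest) benefit, so the correction subtracts the estimated benefit.

def true_change(observed_retest_change, estimated_practice_effect):
    return observed_retest_change - estimated_practice_effect

# A middle-aged adult who scores 0.02 SD higher at retest, but whose
# estimated practice benefit is 0.05 SD, has actually declined slightly:
change = true_change(0.02, 0.05)   # a small negative value, about -0.03
```

A small apparent gain at retest can thus conceal a genuine, if modest, decline.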

Healthy older adults reporting subjective cognitive impairment are dramatically more likely to progress to MCI or dementia, and decline significantly faster.

Subjective cognitive impairment (SCI), such as noticing that you can no longer remember names, or where you recently placed important objects, the way you used to, is experienced by between one-quarter and one-half of the population over the age of 65. A seven-year study involving 213 adults (mean age 67) has found that healthy older adults reporting SCI are dramatically more likely to progress to MCI or dementia than those free of SCI (54% vs 15%). Moreover, those who had SCI declined significantly faster.

Reisberg, B. et al. 2010. Outcome over seven years of healthy adults with and without subjective cognitive impairment. Alzheimer's & Dementia, 6 (1), 11-24.

A new study finds a decision-making advantage to the increased difficulty older brains have in filtering out irrelevant information.

It’s now well established that older brains tend to find it harder to filter out irrelevant information. But now a new study suggests that that isn’t all bad. The study compared the performance of 24 younger adults (17-29) and 24 older adults (60-73) on two memory tasks separated by a 10-minute break. In the first task, they were shown pictures overlapped by irrelevant words, told to ignore the words and concentrate on the pictures only, and to respond every time the same picture appeared twice in a row. The second task required them to remember how the pictures and words were paired together in the first task. The older adults showed a 30% advantage over younger adults in their memory for the preserved pairs. It’s suggested that older adults encode extraneous co-occurrences in the environment and transfer this knowledge to subsequent tasks, improving their ability to make decisions.

[276] Campbell, K. L., Hasher L., & Thomas R. C. (2010).  Hyper-binding: a unique age effect. Psychological Science: A Journal of the American Psychological Society / APS. 21(3), 399 - 405.

More data from the National Survey of Midlife Development in the United States has revealed that cognitive abilities reflect to a greater extent how old you feel, not how old you actually are.

More data from the National Survey of Midlife Development in the United States has revealed that cognitive abilities reflect to a greater extent how old you feel, not how old you actually are. Of course that may be because cognitive ability contributes to a person’s wellness and energy. But it also may reflect benefits of trying to maintain a sense of youthfulness by keeping up with new trends and activities that feel invigorating.

[171] Schafer, M. H., & Shippee T. P. (2009).  Age Identity, Gender, and Perceptions of Decline: Does Feeling Older Lead to Pessimistic Dispositions About Cognitive Aging? The Journals of Gerontology Series B: Psychological Sciences and Social Sciences. 65B(1), 91 - 96.

A new study provides evidence that it's not age per se that affects the quality of decision-making, but individual differences in processing speed and memory.

A study involving 54 older adults (66-76) and 58 younger adults (18-35) challenges the idea that age itself causes people to become more risk-averse and to make poorer decisions. Analysis revealed that it is individual differences in processing speed and memory that affect decision quality, not age. The stereotype has no doubt arisen because older people are more likely to process information slowly and to have poorer memory. The finding points to the need to find ways of presenting information that reduce the demand on memory, or the need to process information very quickly, to enable those in need of such help (both young and old) to make the best choices. Self-knowledge also helps — recognizing whether you need to take more time to make a decision.

The discovery that a particular type of dendritic spine is lost with age not only provides a target for therapy, but also emphasizes the importance of building skills and expertise when young.

A rhesus monkey study has revealed which dendritic spines are lost with age, providing a new target for therapies to help prevent age-associated cognitive impairment. It appears that it is the thin, dynamic spines in the dorsolateral prefrontal cortex, which are key to learning new things, establishing rules, and planning, that are lost. Learning of a new task was correlated with both synapse density and average spine size, but was most strongly predicted by the head volume of thin spines. There was no correlation with size or density of the large, mushroom-shaped spines, which were very stable across age and probably mediate long-term memories, enabling the retention of expertise and skills learned early in life. Nor was there any correlation with any of these spine characteristics once the task was learned. The findings underscore the importance of building skills and broad expertise when young.

An imaging study has found differences in brain activity that explain why older adults are better at remembering positive events.

An imaging study reveals why older adults are better at remembering positive events. The study, involving young adults (ages 19-31) and older adults (ages 61-80) being shown a series of photographs with positive and negative themes, found that while there was no difference in brain activity patterns between the age groups for the negative photos, there were age differences for the positive photos. In older adult brains, but not the younger, two emotion-processing regions (the ventromedial prefrontal cortex and the amygdala) strongly influenced the memory-encoding hippocampus.

Examination of the brains from 9 “super-aged” — people over 80 whose memory performance was at the level of 50-year-olds — has found that some of them had almost no tau tangles. Are they genetically protected, or reaping the benefits of a preventive lifestyle?

Examination of the brains from 9 “super-aged” — people over 80 whose memory performance was at the level of 50-year-olds — has found that some of them had almost no tau tangles. The accumulation of tau tangles has been thought to be a natural part of the aging process; an excess of them is linked to Alzheimer’s disease. The next step is to work out why some people are immune to tangle formation, while others appear immune to the effects. Perhaps the first group is genetically protected, while the others are reaping the benefits of a preventive lifestyle.

The findings were presented March 23 at the 239th National Meeting of the American Chemical Society (ACS).

An imaging study involving 79 volunteers aged 44 to 88 has found more brain atrophy and faster rates of decline in brain regions particularly affected by aging, among those ranked high in neuroticism traits.

An imaging study involving 79 volunteers aged 44 to 88 has found lower volumes of gray matter and faster rates of decline in the frontal and medial temporal lobes of those who ranked high in neuroticism traits, compared with those who ranked high in conscientious traits. These are brain regions particularly affected by aging. The idea that this might occur derived from the well-established effects of chronic stress on the brain. This is the first study to investigate whether the rate and extent of cognitive decline with age are influenced by personality variables. Extraversion, also investigated, had no effect. The study does not, however, rule out the possibility that it is reduction in brain tissue in these areas that is affecting personality. There is increasing evidence that people tend to become more neurotic and less conscientious in early-stage Alzheimer's.

[174] Jackson, J., Balota D. A., & Head D. (Submitted).  Exploring the relationship between personality and regional brain volume in healthy aging. Neurobiology of Aging. In Press, Corrected Proof.

The role of the dopamine-regulating COMT gene in cognitive function has been the subject of debate. Now a large study of older adults has revealed that the Met variant of the COMT gene was linked to a greater decline in cognitive function. This effect was more pronounced for African-Americans.

The role of the catechol-O-methyltransferase (COMT) gene in cognitive function has been the subject of some debate. The gene, which affects dopamine, comes in two flavors: Val and Met. One recent study found no difference between healthy carriers of these two gene variants in terms of cognitive performance, but did find differences in terms of neural activity. Another found that, although the gene did not affect Alzheimer’s risk on its own, it acted synergistically with the Alzheimer’s gene variant to do so. Now an eight-year study of nearly 3000 adults in their 70s has revealed that the Met variant of the COMT gene was linked to a greater decline in cognitive function. This effect was more pronounced for African-Americans. This is interesting because it has been the Val genotype that in other research has been shown to have a detrimental effect. It seems likely that this genotype must be considered in its context (age, race, gender, and ApoE status have all been implicated in research).

Older news items (pre-2010) brought over from the old website

Factors helping you maintain cognitive function in old age

An 8-year study of over 2,500 seniors in their 70s has found that 53% showed normal age-related decline, 16% showed major cognitive decline, and an encouraging 30% had no change or improved on the tests over the years. The most important factors in determining whether a person maintained their cognitive health were education and literacy: those with a ninth grade literacy level or higher were nearly five times as likely to stay sharp as those with lower literacy levels; those with at least a high school education were nearly three times as likely to stay sharp as those with less education. Lifestyle factors were also significant: non-smokers were nearly twice as likely to stay sharp as smokers; those who exercised moderately to vigorously at least once a week were 30% more likely to maintain their cognitive function than those who did not exercise that often; and people who worked or volunteered, and people who reported living with someone, were 24% more likely to maintain cognitive function.

Better cognitive performance from US seniors compared to British

A study involving over 8,000 older Americans and over 5,000 British seniors has found a significant difference in cognitive performance between the two nationalities, with the Americans scoring on average as if they were ten years younger than the British. The U.S. advantage in "brain health" was greatest for the oldest old, those aged 85 and older. Part of the difference can be accounted for by higher levels of education and net worth in the United States, and part by significantly lower levels of depressive symptoms (possibly attributable to the much greater use of medication for depression in the US). It was also found that dramatically more U.S. seniors reported no alcohol use (over 50%), compared to the British (15.5%). It is also speculated that the earlier retirement age in Britain, and the greater prevalence of untreated hypertension there, may be factors.

Memory gets worse with age if you think about it

Confirming earlier research (and what I’ve been saying for ten years), merely believing that memory diminishes with age is sufficient for some elderly people to score lower on cognitive tests. Moreover, and confirming other research relating to gender and race, the study also found that a senior's ability to remember something was heavily influenced by the activation or inactivation of negative stereotypes (for example, by being told before the test that older people perform more poorly on that type of memory test). The effects of negative stereotypes were felt more by those in their sixties than by older participants (although those in their seventies performed worse when they felt stigmatized), and more by the very well-educated. There was some indication that these effects operate through motivation.

Circadian clock may be critical for remembering what you learn

We know circadian rhythm affects learning and memory in that we find it easier to learn at certain times of day than others, but now a study involving Siberian hamsters has revealed that having a functioning circadian system is in itself critical to being able to remember. The finding has implications for disorders such as Down syndrome and Alzheimer's disease. The critical factor appears to be the amount of the neurotransmitter GABA, which acts to inhibit brain activity. The circadian clock controls the daily cycle of sleep and wakefulness by inhibiting different parts of the brain by releasing GABA. It seems that if it’s not working right, if the hippocampus is overly inhibited by too much GABA, then the circuits responsible for memory storage don't function properly. The effect could be fixed by giving a GABA antagonist, which blocks GABA from binding to synapses. Recent mouse studies have also demonstrated that mice with symptoms of Down syndrome and Alzheimer's also show improved learning and memory when given the same GABA antagonist. The findings may also have implications for general age-related cognitive decline, because age brings about a degradation in the circadian system. It’s also worth noting that the hamsters' circadian systems were put out of commission by manipulating the hamsters' exposure to light, in a technique that was compared to "sending them west three time zones." The effect was independent of sleep duration.

Occasional memory loss tied to lower brain volume

A study of 503 seniors (aged 50-85) with no dementia found that 453 of them (90%) reported having occasional memory problems, such as having trouble thinking of the right word or forgetting things that happened in the last day or two, or thinking problems such as having trouble concentrating or thinking more slowly than they used to. Such problems have been attributed to white matter lesions, which are very common in older adults, but all of the participants in the study had white matter lesions in their brains, and the amount of lesions was not tied to occasional memory problems. However, it was found that those who reported having such problems had a smaller hippocampus than those who had no cognitive problems. This was most noteworthy in subjects with good objective cognitive performance.

Decline of mental skills in years before death

A long-running study of 288 people with no dementia, who were followed from age 70 to death, has found that there was substantial acceleration in cognitive decline many years prior to death. Time of onset and rate of terminal decline varied considerably across cognitive abilities, with verbal ability beginning its terminal decline 6.6 years prior to death, spatial ability 7.8 years before death, and perceptual speed 14.8 years before death. With verbal ability, the terminal decline appeared to be driven not by age alone but by health issues.

Aging impairs the 'replay' of memories during sleep

During sleep, the hippocampus repeatedly "replays" brain activity from recent experiences, in a process believed to be important for memory consolidation. A new rat study has found reduced replay activity during sleep in old compared to young rats, and rats with the least replay activity performed the worst in tests of spatial memory. The best old rats were also the ones that showed the best sleep replay. Indeed, the animals who more faithfully replayed the sequence of neural activity recorded during their earlier learning experience were the ones who performed better on the spatial memory task, regardless of age. The replay activity occurs during slow-wave sleep.

White-matter changes linked to gait and balance problems

A three-year study involving 639 adults between the ages of 65 and 84 has found that people with severe white matter changes (leukoaraiosis) were twice as likely to score poorly on walking and balance tests as those people with mild white matter changes. The study also found people with severe changes were twice as likely as the mild group to have a history of falls. The moderate group was one-and-a-half times as likely as the mild group to have a history of falls. Further research will explore the effect of exercise.

Lack of imagination in older adults linked to declining memory

In a study in which older and younger adults were asked to think of past and future events, older adults were found to generate fewer details about past events — and this correlated with an impaired ability to imagine future events. The number of details remembered by older adults was also linked to their relational memory abilities. The findings suggest that our ability to imagine future events is based on our ability to remember the details of previously experienced ones, extract relevant details and put them together to create an imaginary event.

Brain systems become less coordinated with age, even in the absence of disease

An imaging study of the brain function of 93 healthy individuals from 18 to 93 years old has revealed that normal aging disrupts communication between different regions of the brain. The finding is consistent with previous research showing that normal aging slowly degrades white matter. The study focused on the links within two critical networks, one responsible for processing information from the outside world and one, known as the default network, which is more internal and kicks in when we muse to ourselves. “We found that in young adults, the front of the brain was pretty well in sync with the back of the brain [but] in older adults this was not the case. The regions became out of sync and they were less correlated with each other.” However, older adults with normal, high correlations performed better on cognitive tests. Among older individuals whose brain systems did not correlate, all of the systems were not affected in the same way. The default system was most severely disrupted with age. The visual system was very well preserved.

Why neurogenesis is so much less in older brains

A rat study has revealed that the aging brain produces progressively fewer new nerve cells in the hippocampus (neurogenesis) not because there are fewer of the immature cells (neural stem cells) that can give rise to new neurons, but because they divide much less often. In young rats, around a quarter of the neural stem cells were actively dividing, but only 8% of cells in middle-aged rats and 4% in old rats were. This suggests a new approach to improving learning and memory function in the elderly.

Senior’s memory complaints should be taken seriously

A study involving 120 people over 60 found those who complained of significant memory problems who still performed normally on memory tests had a 3% reduction in gray matter density in their brains. This compares to 4% in those diagnosed with mild cognitive impairment. This suggests that significant memory loss complaints may indicate a very early "pre-MCI" stage of dementia for some people.

Alzheimer's pathology related to episodic memory loss in those without dementia

A study of 134 participants from the Religious Orders Study or the Memory and Aging Project has found that, although they didn't have cognitive impairment at the time of their death, more than a third of the participants (50) met criteria for a pathologic diagnosis of Alzheimer's disease. This group also scored significantly lower on tests for episodic memory, such as recalling stories and word lists. The results provide further support for the idea that a ‘cognitive reserve’ can allow people to tolerate a significant amount of Alzheimer's pathology without manifesting obvious dementia. It also raises the question whether we should accept any minor episodic memory loss in older adults as 'normal'.

Does IQ drop with age or does something else impact intelligence?

As people grow older, their IQ scores drop. But is it really that they lose intelligence? A study has found that if college students had to perform under conditions that mimic the perception deficits many older people have, their IQ scores would also take a drop.

Walking in older people is related to cognitive skills

A study of 186 adults aged 70 and older tested gait speed with and without interference (walking while reciting alternate letters of the alphabet). Walking speed was predictable from performance on cognitive tests of executive control and memory, particularly when the participant was required to recite at the same time. The findings suggest that in old age, walking involves higher-order executive-control processes, suggesting that cognitive tests could help doctors assess risk for falls. Conversely, slow gait could alert them to check for cognitive impairment.

Immune function important for cognition

New research overturns previous beliefs that immune cells play no part in — and may indeed constitute a danger to — the brain. Following on from an earlier study that suggested that T cells — immune cells that recognize brain proteins — have the potential to fight off neurodegenerative conditions such as Alzheimer’s, researchers have found that neurogenesis in adult rats kept in stimulating environments requires these immune cells. A further study found that mice with these T cells performed better at some tasks than mice lacking the cells. The researchers suggest that age-related cognitive decline may be related to this, as aging is associated with a decrease in immune system function, suggesting that boosting the immune system may also benefit cognitive function in older adults.

Early life stress can lead to memory loss and cognitive decline in middle age

Age-related cognitive decline is probably a result of both genetic and environmental factors. A rat study has demonstrated that some of these environmental factors may occur in early life. Among the rats, emotional stress in infancy showed no ill effects by the time the rats reached adulthood, but as the rats reached middle age, cognitive deficits started to appear in those rats who had had stressful infancies, and progressed much more rapidly with age than among those who had had nurturing infancies. Middle-aged rats who had been exposed to early life emotional stress showed deterioration in brain-cell communication in the hippocampus.

Older people with the 'Alzheimer's gene' find it harder to remember intentions

It has been established that those with a certain allele of a gene called ApoE have a much greater risk of developing Alzheimer’s (those with this allele on both copies of the gene have 8 times the risk; those with it on one copy have 3 times the risk). Recent studies also suggest that such carriers are more likely to show signs of deficits in episodic memory – but that these deficits are quite subtle. In the first study to look at prospective memory in seniors with the “Alzheimer’s gene”, involving 32 healthy, dementia-free adults between the ages of 60 and 87, researchers found a marked difference in performance between those who had the allele and those who did not. The results suggest an exception to the thinking that ApoE status has only a subtle effect on cognition.

Some brains age more rapidly than others

Investigation of the patterns of gene expression in post-mortem brain tissue has revealed two groups of genes with significantly altered expression levels in the brains of older individuals. The most significantly affected are mostly those related to learning and memory. One of the most interesting, and potentially useful, findings is that patterns of gene expression are quite similar across the brains of younger adults. Very old adults also show similar patterns, although the similarity is less. But the greatest degree of individual variation occurs in those aged between 40 and 70. Some of these adults show gene patterns that look more like the young group, whereas others show gene patterns that look more like the old group. It appears that gene changes start around 40 in some people, but not in others. It also appears that those genes that are affected by age are unusually vulnerable to damage from agents such as free radicals and toxins in the environment, suggesting that lifestyle in young adulthood may play a part in deciding the rate and degree of cognitive decline in later years.

Drugs to improve memory may worsen memory in some

Drugs that increase the activity of an enzyme called protein kinase A improve long-term memory in aged mice and have been proposed as memory-enhancing drugs for elderly humans. However, the type of memory improved by this activity occurs principally in the hippocampus. A new study suggests that increased activity of this enzyme has a deleterious effect on working memory (which principally involves the prefrontal cortex). In other words, a drug that helps you remember a recent event may worsen your ability to remember what you’re about to do (to take an example).

Memory-enhancing drugs for elderly may impair working memory and other executive functions

A number of pharmaceutical companies are working on developing memory-enhancing drugs not only for patients with clinical memory impairment, but also for perfectly healthy people. Although some drugs have been found that can improve cognitive function in those suffering from impairment, the side effects preclude their use among healthy people. However, a recent study has found evidence that a well-established drug used for narcolepsy (excessive daytime sleepiness) may improve cognition in normal people, without side effects. The drug seems to particularly affect some tasks requiring planning and working memory (and in a further, as yet unpublished study, appears helpful for adults with ADHD). Whether the drug (modafinil) has anything over caffeine in terms of the cognitive benefits it brings is still debated. More interestingly, and in line with the sometimes conflicting results of these kinds of drugs on different people, the researchers suggest that the effect of drugs on cognitive function depends on the level at which the individual cognitive system is operating: if your system is mildly below par, the right brain chemical could improve performance; if it’s well below par, the same dose will have a much smaller effect; if (and this is the interesting one) it’s already operating at peak, the chemical could in fact degrade performance.

Magnetic resonance imaging may help predict future memory decline

A six-year imaging study of 45 healthy seniors assessed changes in brain scans against cognitive decline. The researchers found that progressive atrophy in the medial temporal lobe was the most significant predictor of cognitive decline, which occurred in 29% of the subjects.

Mouse study suggests new approach to reducing age-related cognitive decline

Young and old mice learned that a particular tone was associated with a mild electric footshock. When the tone was immediately followed by a shock, both young and aged mice easily remembered the association on the following day. When the tone was separated from the shock by several seconds, the old mice were strongly impaired in comparison to the young mice. The researchers found highly elevated levels of a calcium-activated potassium channel, the so-called SK3 channel, in the hippocampus of old, but not of young mice. When the researchers selectively downregulated SK3 channels in the hippocampus of aged mice, the impairment in learning and memory was prevented. This suggests a new approach to treating age-related memory decline.

Blank, T., Nijholt, I., Kye, M-J., Radulovic, J. & Spiess, J. 2003. Small-conductance, Ca2+-activated K+ channel SK3 generates age-related memory and LTP deficits. Nature Neuroscience, 6(9), 911-912. Published online: 27 July 2003, doi:10.1038/nn1101

Rat study offers more complex model of brain aging

A study of young, middle-aged, and aged rats, trained on two memory tasks, has revealed 146 genes connected with brain aging and cognitive impairment. Importantly, the changes in gene activity had mostly begun in mid-life, suggesting that changes in gene activity in the brain in early adulthood might set off cellular or biological changes that could affect how the brain works later in life. The study provides more information on genes already linked to aging, including some involved in inflammation and oxidative stress, and also describes additional areas in which gene activity might play a role in brain aging: declines in cellular energy metabolism; changes in the activity of neurons (nerve cells) and in their ability to make new connections with each other; increases in cellular calcium levels, which could trigger cell death; and changes in cholesterol synthesis, iron metabolism, and the breakdown of the insulating myelin sheaths that, when intact, facilitate efficient communication among neurons.

Is a dwindling brain chemical responsible for age-related cognitive decline?

A study of what are probably the world's oldest monkeys may explain age-related mental decline. The study found that in the very old monkeys, nerves in the visual cortex lose their ability to discriminate between one signal and another, and that this loss was directly related to the presence of a chemical called gamma-aminobutyric acid (GABA), a neurotransmitter that appears to dwindle in old age. If a lack of GABA is indeed responsible for the old neurons' indiscriminate firing, this problem may be simple enough to treat. There already exist drugs that increase GABA production, although these drugs have yet to be carefully tested on the elderly.

Rat studies provide more evidence on why aging can impair memory

Among aging rats, those that have difficulty navigating water mazes have no more signs of neuron damage or cell death in the hippocampus, a brain region important in memory, than do rats that navigate with little difficulty. Nor does the extent of neurogenesis (birth of new cells in an adult brain) seem to predict poorer performance. Although the researchers have found no differences in a variety of markers for postsynaptic signals between elderly rats with cognitive impairment and those without, decreases in a presynaptic signal are correlated with worse cognitive impairment. That suggests that neurons in the impaired rat brains may not be sending signals correctly.

Gallagher, M. 2002. Markers for memory decline. Paper presented at the Society for Neuroscience annual meeting in Orlando, Florida, 5 November.

An enzyme that helps us to forget

A series of experiments on genetically altered laboratory mice showed those with low levels of the enzyme protein phosphatase-1 (PP1), were less likely to forget what they had learned. This enzyme appears to be critical in helping us forget unwanted information, but it may also be partly responsible for an increase in forgetting in older adults. It was found that as the mice aged, the level of PP1 increased. When the action of PP1 was blocked, the mice recovered their full learning and memory abilities.

Age-related changes in brain dopamine may underpin the normal cognitive problems of aging

A new model suggests why and how many cognitive abilities decline with age, and offers hope for prevention. Research in the past few years has clarified and refined our ideas about the ways in which cognitive abilities decline with age, and one of these ways is in a reduced ability to recall the context of memories. Thus, for example, an older person is less likely to be able to remember where she has heard something. According to this new model, context processing is involved in many cognitive functions — including some once thought to be independent — and therefore a reduction in the ability to remember contextual information can have wide-reaching implications for many aspects of cognition. The model suggests that context processing occurs in the prefrontal cortex and requires a certain level of the brain chemical dopamine. It may be that in normal aging, dopamine levels become low or erratic. Changes in dopamine have also been implicated in Alzheimer’s, as well as other brain-based diseases.
