Memory in Normal Aging

Cognitive decline in normal aging

How cognitive function declines with age

Older adults commonly need to practice more than younger adults to achieve the same level of performance. Such age deficits are at least partly due to poorer monitoring of their learning.

Failing to immediately retrieve well-known information becomes more common with age, with an increase in "tip-of-the-tongue" experiences evident as early as the mid-thirties. Older people also tend to be less likely than younger people to actively pursue a missing word.

Older adults are less likely than younger ones to use the appropriate brain regions when performing a memory task, and more likely to use cortical regions that are not as useful. But this can be at least partly overcome if the seniors are given specific strategy instructions.

Older adults appear to be particularly impaired in context processing — most clearly seen in an inability to remember where they heard (or read, or saw) something. Because context is involved in many memory processes, this may have far-reaching implications. An impaired ability to remember context may reflect frontal-lobe inefficiency rather than aging per se.

Decreased ability to remember past events is linked to an impaired ability to imagine future events.

Older adults may compensate for cognitive decline by using additional brain regions. However, the downside is that these brain regions are then not available when a task requires them specifically. This may explain older adults' poorer performance on complex short-term memory tasks.

An important (perhaps even the most important) reason for cognitive decline in older adults is now seen to be a growing inability to filter out irrelevant or distracting information and inhibit its processing. There can, however, be a decision-making/problem-solving advantage to this inclusion of apparently irrelevant information.

Older adults’ greater problems with multitasking stem from their impaired ability to disengage from an interrupting task and restore the original task.

There is growing evidence that memory problems (even amnesia) reflect confusion between memories more than loss of memory, and that age-related difficulties reflect increasing difficulty in replacing out-of-date information with new information, or in distinguishing between the two.

There do seem to be some gender differences in how brains change with age, which is consistent with the evidence that general intelligence is reflected in different brain attributes for men and women.

While IQ tends to drop with age, this may simply reflect perceptual deficits, not cognitive ones.

Brain regions especially affected by age include the frontal lobe (particularly the prefrontal cortex), the medial temporal lobe (particularly the hippocampus), and, in men only, the cerebellum, all of which shrink. Aging also tends to degrade white matter, leading to brain networks growing less coordinated; the default network is the most severely disrupted. Levels of the inhibitory neurotransmitter GABA also tend to decline with age, as do levels of dopamine. Both are important for learning and memory.


Rate of cognitive decline

White matter appears to decrease faster than grey matter, but doesn't begin to decline until the forties. Presumably this relates to the decline in processing speed that is the most evident characteristic of age-related decline.

Grey matter, on the other hand, declines at a fairly constant rate from adolescence, mirroring a decline in processing ability that seems to start as early as the twenties.

Cognitive decline seems to be faster in women than men. This presumably reflects apparent gender differences in brain activity. For example, while women seem to have a greater density of brain cells in the prefrontal cortex, they also show a steeper rate of decline so that, in old age, the density is similar between the genders.

There is some evidence that individual differences in processing speed and memory are more important than age, and that personality attributes affect the rate of cognitive decline and brain atrophy.

Some gene variants, including the so-called Alzheimer’s gene, are associated with a faster rate of decline, or an earlier start. These may be triggered by activity in early adulthood. Head size in adulthood also seems to affect rate of decline. Head size in adulthood reflects not only head size at birth, but growth in the early years — pointing to the importance of providing both proper nourishment and intellectual stimulation in these early years.


Extent of cognitive decline in the population

Most older adults do not suffer cognitive impairment. Around 30-40% of adults over 65 have the type of cognitive loss we regard as a normal consequence of age — a measurable (but slight) decline on memory tests, and a feeling that you're not quite as sharp, or as good at remembering, as you used to be (age-related cognitive impairment). Around 10% of adults over 65 develop mild cognitive impairment (MCI), which does impact everyday living and can be a precursor of Alzheimer's.

There are significant differences in prevalence as a function of age. For example, in the U.S., a large sample found MCI in 9% of those aged 70 to 79 and nearly 18% of those 80 to 89. Prevalence decreased with years of education: 25% in those with up to eight years of education, 14% in those with nine to 12 years, 9% in those with 13 to 16 years, and 8.5% in those with greater than 16 years.

Large-scale population surveys of mild cognitive impairment in the elderly have produced large differences in national levels, ranging from 10% to 26%.

Although women may decline at a faster rate than men, prevalence of decline may be greater among men. For example, a large Dutch survey of those aged 85 and older found more women than men had good memory (41% vs 29%) and mental speed (33% vs 28%), despite the fact that more women than men had a limited education.

However, severe memory problems in the elderly have become more rare. The main reasons seem to be better physical fitness (partly due to better healthcare), higher levels of education, and greater personal wealth.


 

Forgetfulness in old age may be related to changes in retrieval strategy

A study of younger and older adults indicates that memory search tends to decline with age because, with reduced cognitive control, seniors’ minds tend to ‘flit’ too quickly from one information cluster to another.

Evidence is accumulating that age-related cognitive decline is rooted in three related factors: processing speed slows down (because of myelin degradation); the ability to inhibit distractions becomes impaired; working memory capacity is reduced.

A new study adds to this evidence by looking at one particular aspect of age-related cognitive decline: memory search.

The study put 185 adults aged 29-99 (average age 67) through three cognitive tests: a vocabulary test, digit span (a working memory test), and the animal fluency test, in which you name as many animals as you can in one minute.

Typically, in the animal fluency test, people move through semantic categories such as ‘pets’, ‘big cats’, and so on. The best performers are those who move from category to category with optimal timing — i.e., at the point where the category has been sufficiently exhausted that efforts would be better spent on a new one.

Participants recalled on average 17 animal names, with a range from 5 to 33. While there was a decline with age, it wasn’t particularly marked until the 80s (an average of 18.3 for those in their 30s, 17.5 for those in their 60s, 16.5 for the 70s, 12.8 for the 80s, and 10 for the 90s). Digit span did show a decline, but it was not significant (from 17.5 down to 15.3), while vocabulary (consistent with previous research) showed no decline with age.

But all this is by the by — the nub of the experiment was to discover how individuals were searching their memory. This required a quite complicated analysis, which I will not go into, except to mention two important distinctions. The first is between:

  • global context cue: activates each item in the active category according to how strong it is (how frequently it has been recalled in the past);
  • local context cue: activates each item in relation to its semantic similarity to the previous item recalled.

A further distinction was made between static and dynamic processes: in dynamic models, it is assumed the user switches between local and global search. This, it is further assumed, is because memory is ‘patchy’ – that is, information is represented in clusters. Within a cluster, we use local cues, but to move from one cluster to another, we use global cues.
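The dynamic model just described lends itself to a simple simulation. Below is a minimal Python sketch of the idea (the animal names, strengths, and cluster labels are invented for illustration; nothing here is taken from the study's actual model): within a cluster, a local cue retrieves the strongest related item, while a global cue handles the slower jump to another cluster, so a searcher who abandons clusters too readily pays a time cost for every jump and recalls fewer items overall.

```python
# Toy lexicon: animal -> (retrieval strength, semantic cluster).
# Names and numbers are illustrative only, not taken from the study.
LEXICON = {
    "dog": (0.90, "pets"),      "cat": (0.85, "pets"),       "hamster": (0.30, "pets"),
    "lion": (0.70, "big cats"), "tiger": (0.65, "big cats"), "lynx": (0.25, "big cats"),
    "cow": (0.60, "farm"),      "sheep": (0.50, "farm"),     "goat": (0.35, "farm"),
}

def dynamic_search(patience, time_budget=10):
    """Sketch of dynamic (patch-based) memory search.

    Within the current cluster, the local cue retrieves the strongest
    unrecalled related item; when focus lapses (or the cluster runs dry),
    the global cue jumps to the strongest item in another cluster.
    Global jumps cost extra time, so switching too early reduces yield.
    `patience` = number of weak retrievals tolerated before abandoning
    the current cluster (a crude stand-in for cognitive control).
    """
    remaining = dict(LEXICON)
    recalled, transitions = [], 0
    cluster, weak_streak, t = None, 0, 0
    while remaining and t < time_budget:
        local = [a for a in remaining if remaining[a][1] == cluster]
        if local and weak_streak < patience:
            item = max(local, key=lambda a: remaining[a][0])   # local cue
            t += 1
        else:
            outside = [a for a in remaining if remaining[a][1] != cluster]
            pool = outside or remaining
            item = max(pool, key=lambda a: remaining[a][0])    # global cue
            if cluster is not None:
                transitions += 1                               # cluster transition
            cluster, weak_streak = remaining[item][1], 0
            t += 2  # jumping between clusters is slower
        strength = remaining.pop(item)[0]
        if strength < 0.4:
            weak_streak += 1
        recalled.append(item)
    return recalled, transitions

focused = dynamic_search(patience=2)  # maintains local focus
flitty = dynamic_search(patience=0)   # switches at every opportunity
```

In this toy run, the "patient" searcher recalls more animals with fewer cluster transitions than the "flitty" one under the same time budget, mirroring the suggestion that switching clusters too readily makes memory search less efficient.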

The point of all this was to determine whether age-related decline in memory search has to do with:

  • Reduced processing speed,
  • Persisting too long on categories, or
  • Inability to maintain focus on local cues (this would relate it back to the inhibition deficit).

By modeling the exact recall patterns, the researchers ascertained that the recall process is indeed dynamic, although the points of transition are not clearly understood. The number of transitions from one cluster to another was negatively correlated with age; it was also strongly positively correlated with performance (number of items recalled). Digit span, assumed to measure ‘cognitive control’, was also negatively correlated with number of transitions, but, as I said, was not significantly correlated with age.

In other words, there appears to be a qualitative change with age: although the absolute number of transitions fell with age (as did the number of items recalled), older adults tended to switch clusters prematurely, before the current cluster was exhausted, and reduced cognitive control appears to be behind this, although it doesn’t explain it all (perhaps because we’re still not able to fully measure cognitive control).

At a practical level, the message is that memory search may become less efficient because, as people age, they tend to change categories too frequently, before they have exhausted their full potential. While this may well be a consequence of reduced cognitive control, it seems likely (to me at least) that making a deliberate effort to fight the tendency to move on too quickly will pay dividends for older adults who want to improve their memory retrieval abilities.

Nor is this restricted to older adults — since age appears to be primarily affecting performance through its effects on cognitive control, it is likely that this applies to those with reduced working memory capacity, of any age.

Reference: 

[3378] Hills, T. T., Mata, R., Wilke, A., & Samanez-Larkin, G. R. (2013). Mechanisms of age-related decline in memory search across the adult life span. Developmental Psychology. No pagination specified.

Cognitive decline in old age related to poorer sleep

A new study confirms the role slow-wave sleep plays in consolidating memories, and reveals that one reason for older adults’ memory problems may be the quality of their sleep.

Recent research has suggested that sleep problems might be a risk factor in developing Alzheimer’s, and in mild cognitive impairment. A new study adds to this gathering evidence by connecting reduced slow-wave sleep in older adults to brain atrophy and poorer learning.

The study involved 18 healthy young adults (mostly in their 20s) and 15 healthy older adults (mostly in their 70s). Participants learned 120 pairs, each consisting of a real word and a nonsense word, and were tested for recognition before going to bed. Their brain activity was recorded while they slept. Brain activity was also measured in the morning, when they were tested again on the word pairs.

As has been found previously, older adults showed markedly less slow-wave activity (both over the whole brain and specifically in the prefrontal cortex) than the younger adults. Again, as in previous studies, the biggest difference between young and older adults in terms of gray matter volume was found in the medial prefrontal cortex (mPFC). Moreover, significant differences were also found in the insula and posterior cingulate cortex. These regions, like the mPFC, have also been associated with the generation of slow waves.

When mPFC volume was taken into account, age no longer significantly predicted the extent of the decline in slow-wave activity — in other words, the decline in slow-wave activity appears to be due to the brain atrophy in the medial prefrontal cortex. Atrophy in other regions of the brain (precuneus, hippocampus, temporal lobe) was not associated with the decline in slow-wave activity when age was considered.

Older adults did significantly worse on the delayed recognition test than young adults. Performance on the immediate test did not predict performance on the delayed test. Moreover, the highest performers on the immediate test among the older adults performed at the same level as the lowest young adult performers — nevertheless, these older adults did worse the following day.

Slow-wave activity during sleep was significantly associated with performance on the next day’s test. Moreover, when slow-wave activity was taken into account, neither age nor mPFC atrophy significantly predicted test performance.

In other words, age relates to shrinkage of the prefrontal cortex, this shrinkage relates to a decline in slow-wave activity during sleep, and this decline in slow-wave sleep relates to poorer cognitive performance.

The findings confirm the importance of slow-wave brainwaves for memory consolidation.

All of this suggests that poorer sleep quality contributes significantly to age-related cognitive decline, and that efforts should be made to improve quality of sleep rather than just assuming lighter, more disturbed sleep is ‘natural’ in old age!

Why learning gets harder as we get older

A mouse study shows that weakening unwanted or out-of-date connections is as important as making new connections, and that neurological changes as we age reduce our ability to weaken old connections.

A new study adds more support to the idea that the increasing difficulty in learning new information and skills that most of us experience as we age is not down to any difficulty in acquiring new information, but rather to interference from all the old information.

Memory is about strengthening some connections and weakening others. A vital player in this process of synaptic plasticity is the NMDA receptor in the hippocampus. This glutamate receptor has two subunits (NR2A and NR2B), whose ratio changes as the brain develops. Children have higher ratios of NR2B, which lengthens the time neurons talk to each other, enabling them to make stronger connections, thus optimizing learning. After puberty, the ratio shifts, so there is more NR2A.

Of course, there are many other changes in the aging brain, so it’s been difficult to disentangle the effects of this changing ratio from other changes. This new study genetically modified mice to have more NR2A and less NR2B (reflecting the ratio typical of older humans), thus avoiding the other confounds.

To the researchers’ surprise, the mice were found to be still good at making strong connections (‘long-term potentiation’, or LTP), but had an impaired ability to weaken existing connections (‘long-term depression’, or LTD). This produces too much noise: bear in mind that each neuron has, on average, 3,000 potential points of contact (i.e., synapses), and you will see the importance of turning down the noise!

Interestingly, LTD responses were only abolished within a particular frequency range (3-5 Hz); 1 Hz-induced LTD (and 100 Hz-induced LTP) were unaffected. Moreover, while the mice showed impaired long-term learning, their short-term memory was unaffected. The researchers suggest that these particular LTD responses are critical for ‘post-learning information sculpting’, which they propose is a hitherto unknown step in the consolidation process. This step, they postulate, involves modifying the new information to fit in with existing networks of knowledge.

Previous work by these researchers has found that mice genetically modified to have an excess of NR2B became ‘super-learners’. Until now, the emphasis in learning and memory has always been on long-term potentiation, and the role (if any) of long-term depression has been much less clear. These results point to the importance of both these processes in sculpting learning and memory.

The findings also seem to fit in with the idea that a major cause of age-related cognitive decline is the failure to inhibit unwanted information, and confirm the importance of keeping your mind actively engaged and learning, because this ratio is also affected by experience.

Hearing loss accelerates cognitive decline in older adults

A large study finds that hearing loss significantly increases the rate of cognitive decline in old age.

I’ve written before about the gathering evidence that sensory impairment, visual impairment and hearing loss in particular, is a risk factor for age-related cognitive decline and dementia. Now a large long-running study provides more support for the association between hearing loss and age-related cognitive decline.

The study involved 1,984 older adults (aged 75-84) whose hearing and cognition were tested at the start of the study, with cognitive performance again assessed three, five, and six years later.

Those with hearing loss showed significantly faster cognitive decline than those with normal hearing — some 30-40% faster (41% on the MMSE; 32% on the Digit Symbol Substitution Test), with rate directly related to the amount of hearing loss.

On average, older adults with hearing loss developed significant cognitive impairment 3.2 years sooner than those with normal hearing — a very significant difference indeed.

It has been suggested that increasing social isolation and loneliness may underlie some, if not all, of this association. It may also be that difficulties in hearing force the brain to devote too much of its resources to processing sound, leaving less for cognition. A third possibility is that some common factor underlies both hearing loss and cognitive decline — however, the obvious risk factors, such as high blood pressure, diabetes and stroke, were taken into account in the analysis.

The findings emphasize the importance of getting help for hearing difficulties, rather than regarding them as ‘natural’ in old age.

Reference: 

[3293] Lin, F. R., Yaffe, K., Xia, J., et al. (2013). Hearing loss and cognitive decline in older adults. JAMA Internal Medicine, 1-7.

Menopause forgetfulness greatest early in postmenopause

A smallish study suggests that the cognitive effects of menopause are greatest in the first year after menopause.

Being a woman of a certain age, I generally take notice of research into the effects of menopause on cognition. A new study adds weight, perhaps, to the idea that cognitive complaints in perimenopause and menopause are not directly a consequence of hormonal changes, and, more particularly, shows that early postmenopause may be the most problematic time.

The study followed 117 women from four stages of life: late reproductive, early and late menopausal transition, and early postmenopause. The late reproductive period is defined as when women first begin to notice subtle changes in their menstrual periods, but still have regular menstrual cycles. Women in the transitional stage (which can last for several years) experience fluctuation in menstrual cycles, and hormone levels begin to fluctuate significantly.

Women in the early stage of postmenopause (the first year after menopause), as a group, were found to perform more poorly on measures of verbal learning, verbal memory, and fine motor skill than women in the late reproductive and late transition stages. They also performed significantly worse than women in the late menopausal transition stage on attention/working memory tasks.

Surprisingly, self-reported symptoms such as sleep difficulties, depression, and anxiety did not predict memory problems. Neither were the problems correlated with hormone levels (although fluctuations could be a factor).

This seemingly contradicts earlier findings from the same researchers, who, in a slightly smaller study, found that those experiencing poorer working memory and attention were more likely to have poorer sleep, depression, and anxiety. That study, however, only involved women approaching and in menopause. Moreover, these aspects were mentioned only in the press release, not in the abstract of the paper, and because I don’t have access to this particular journal, I cannot say whether there is something in the data that explains the discrepancy. Accordingly, I am not inclined to put too much weight on this point.

But we may perhaps take the findings as support for the view that cognitive problems experienced earlier in the menopause cycle are, when they occur, not a direct result of hormonal changes.

The important result of this study is the finding that the cognitive problems often experienced by women in their 40s and 50s are most acute during the early period of post menopause, and the indication that the causes and manifestations are different at different stages of menopause.

It should be noted, however, that there were only 14 women in the early postmenopause stage. So, we shouldn’t put too much weight on any of this. Nevertheless, it does add to the picture research is building up about the effects of menopause on women’s cognition.

While the researchers said that this effect is probably temporary — which was picked up as the headline in most media — this was not in fact investigated in this study. It would be nice to have some comparison with those, say, two or three and five years post menopause (but quite possibly this will be reported in a later paper).

Reference: 

[3237] Weber, M. T., Rubin, L. H., & Maki, P. M. (2013). Cognition in perimenopause. Menopause: The Journal of The North American Menopause Society.

Dopamine decline underlies episodic memory decline in old age

Findings supporting dopamine’s role in long-term episodic memory point to a decline in dopamine levels as part of the reason for cognitive decline in old age, and perhaps in Alzheimer’s.

The neurotransmitter dopamine is found throughout the brain and has been implicated in a number of cognitive processes, including memory. It is well-known, of course, that Parkinson's disease is characterized by low levels of dopamine, and is treated by raising dopamine levels.

A new study of older adults has now demonstrated the effect of dopamine on episodic memory. In the study, participants (aged 65-75) were shown black and white photos of indoor scenes and landscapes. The subsequent recognition test presented them with these photos mixed in with new ones, and required them to note which photos they had seen before. Half of the participants were first given Levodopa (‘L-dopa’), and half a placebo.

Recognition tests were given two and six hours after being shown the photos. There was no difference between the groups at the two-hour test, but at the six-hour test, those given L-dopa recognized up to 20% more photos than controls.

The failure to find a difference at the two-hour test was expected if dopamine’s role is to help strengthen the memory code for long-term storage, a process that occurs after 4-6 hours.

Individual differences indicated that the ratio between the amount of Levodopa taken and body weight is key for an optimally effective dose.

The findings therefore suggest that at least part of the decline in episodic memory typically seen in older adults is caused by declining levels of dopamine.

Given that episodic memory is one of the first and greatest types of memory hit by Alzheimer’s, this finding also has implications for Alzheimer’s treatment.

Caffeine improves recognition of positive words

Another recent study also demonstrates, rather more obliquely, the benefits of dopamine. In this study, 200 mg of caffeine (equivalent to 2-3 cups of coffee), taken 30 minutes earlier by healthy young adults, was found to improve recognition of positive words, but had no effect on the processing of emotionally neutral or negative words. Positive words are consistently processed faster and more accurately than negative and neutral words.

Because caffeine is linked to an increase in dopamine transmission (an indirect effect, stemming from caffeine’s inhibitory effect on adenosine receptors), the researchers suggest that this effect of caffeine on positive words demonstrates that the processing advantage enjoyed by positive words is driven by the involvement of the dopaminergic system.

Old honeybees can regain youthful cognition when they return to youthful duties

A honey bee study shows how old foraging bees quickly start to decline cognitively, and how this can be reversed in some if they return to more social domestic duties in the hive.

I often talk about the importance of attitudes and beliefs for memory and cognition. A new honey bee study provides support for this in relation to the effects of aging on the brain, and suggests that this principle extends across the animal kingdom.

Previous research has shown that bees that stay in the nest and take care of the young remain mentally competent, but they don’t nurse forever. When they’re older (after about 2-3 weeks), they become foragers, and foraging bees age very quickly — both physically and mentally. In effect, then, bees ‘retire’ to foraging, and their old age is brief (they begin to show cognitive decline after just two weeks).

But it’s not as simple as that, because in artificial hives where worker bees are all the same age, nurse bees of the same age as foragers don’t show the same cognitive and sensory decline. Moreover, nurse bees have been found to maintain their cognitive abilities for more than 100 days, while foragers die within 18 days and show cognitive declines after 13-15 days (although their ability to assess sweetness remains intact).

The researchers accordingly asked a very interesting question: what happens if the foragers return to babysitting?

To achieve this, they removed all of the younger nurse bees from the nest, leaving only the queen and babies. When the older, foraging bees returned to the nest, activity slowed down for several days, and then they re-organized themselves: some of the old bees returned to foraging; others took on the babysitting and housekeeping duties (cleaning, building the comb, and tending to the queen). After 10 days, around half of these latter bees had significantly improved their ability to learn new things.

This cognitive improvement was also associated with a change in two specific proteins in their brains: one (Prx6) that has been associated with protection against the oxidative stress and inflammation linked to Alzheimer’s disease and Huntington’s disease in humans, and another dubbed a “chaperone” protein because it protects other proteins from being damaged when brain or other tissues are exposed to cell-level stress.

Precisely what it is about returning to the hive that produces this effect is a matter of speculation, but this finding does show that learning impairment in old bees can be reversed by changes in behavior, and this reversal is correlated with specific changes in brain protein.

Having said this, it shouldn’t be overlooked that only some of the worker bees showed this brain plasticity. This is not, apparently, due to differences in genotype, but may depend on the amount of foraging experience.

The findings add weight to the idea that social interventions can help our brains stay younger, and are consistent with growing evidence that, in humans, social engagement helps protect against dementia and age-related cognitive impairment.

The (probably) experience-dependent individual differences shown by the bees are perhaps mirrored in our idea of cognitive reserve, but with a twist. The concept of cognitive reserve emphasizes that accumulating a wealth of cognitive experience (whether through education or occupation or other activities) protects your brain from the damage that might occur with age. But perhaps (and I’m speculating now) we should also consider the other side of this: repeated engagement in routine or undemanding activities may have a deleterious effect, independent of and additional to the absence of more stimulating activities.


Controlling diabetes important for slowing cognitive decline

Findings from a large, long-running study add to growing evidence that poorly controlled diabetes is associated with faster cognitive decline.

The latest finding from the large, long-running Health, Aging, and Body Composition (Health ABC) Study adds to the evidence that preventing or controlling diabetes helps prevent age-related cognitive decline.

The study involves 3,069 older adults (70+), of whom 717 (23%) had diabetes at the beginning of the study in 1997. Over the course of the study, a further 159 developed diabetes. Those with diabetes at the beginning had lower cognitive scores and showed faster decline; those who developed diabetes during the study declined at a rate intermediate between that faster rate and the slower rate of those who never developed diabetes.

Among those with diabetes, those who had higher levels of a blood marker called glycosylated hemoglobin had greater cognitive impairment. Higher levels of this blood marker reflect poorer control of blood sugar.

In other words, both duration and severity of diabetes are important factors in determining rate of cognitive decline in old age.

Effect of blood pressure on the aging brain depends on genetics

For those with the Alzheimer’s gene, higher blood pressure, even though within the normal range, is linked to greater brain shrinkage and reduced cognitive ability.

I’ve reported before on the evidence suggesting that carriers of the ‘Alzheimer’s gene’, APOE4, tend to have smaller brain volumes and perform worse on cognitive tests, despite being cognitively ‘normal’. However, the research hasn’t been consistent, and now a new study suggests the reason.

The e4 variant of the apolipoprotein (APOE) gene not only increases the risk of dementia, but also of cardiovascular disease. These effects are not unrelated: apolipoprotein is involved in the transportation of cholesterol. In older adults, it has been shown that other vascular risk factors (such as elevated cholesterol, hypertension or diabetes) worsen the cognitive effects of having this gene variant.

This new study extends the finding, by looking at 72 healthy adults from a wide age range (19-77).

Participants were tested on various cognitive abilities known to be sensitive to aging and the effects of the e4 allele. Those abilities include speed of information processing, working memory and episodic memory. Blood pressure, brain scans, and of course genetic tests, were also performed.

There are a number of interesting findings:

  • The relationship between age and hippocampal volume was stronger for those carrying the e4 allele (shrinkage of this brain region occurs with age, and is significantly greater in those with MCI or dementia).
  • Higher systolic blood pressure was significantly associated with greater atrophy (i.e., smaller volumes), slower processing speed, and reduced working memory capacity — but only for those with the e4 variant.
  • Among those with the better and more common e3 variant, working memory was associated with lateral prefrontal cortex volume and with processing speed. Greater age was associated with higher systolic blood pressure, smaller volumes of the prefrontal cortex and prefrontal white matter, and slower processing. However, blood pressure was not itself associated with either brain atrophy or slower cognition.
  • For those with the Alzheimer’s variant (e4), older adults with higher blood pressure had smaller volumes of prefrontal white matter, and this in turn was associated with slower speed, which in turn linked to reduced working memory.

In other words, for those with the Alzheimer’s gene, age differences in working memory (which underpin so much of age-related cognitive impairment) were produced by higher blood pressure, reduced prefrontal white matter, and slower processing. For those without the gene, age differences in working memory were produced by reduced prefrontal cortex and prefrontal white matter.

Most importantly, the blood pressure increases we are talking about are well within the normal range (although at the higher end).

The researchers make an interesting point: that these findings are in line with “growing evidence that ‘normal’ should be viewed in the context of individual’s genetic predisposition”.

What it comes down to is this: those with the Alzheimer’s gene variant (and no doubt other genetic variants) have a greater vulnerability to some of the risk factors that commonly increase as we age. Those with a family history of dementia or serious cognitive impairment should therefore pay particular attention to controlling vascular risk factors, such as hypertension and diabetes.

This doesn’t mean that those without such a family history can safely ignore such conditions! When they get to the point of being clinically diagnosed as problems, then they are assuredly problems for your brain regardless of your genetics. What this study tells us is that these vascular issues appear to be problematic for Alzheimer’s gene carriers before they get to that point of clinical diagnosis.
