Depression in the elderly linked to Alzheimer's risk

Preventing major depression in adults with mild symptoms

Previous research has indicated that about a quarter of older adults who become mildly depressed will go on to become seriously depressed within a year or two. A study comparing problem-solving therapy for primary care — a seven-step approach, delivered by non-mental-health professionals, that helps patients resolve difficulties and so build coping skills and confidence — with a program of dietary coaching (the same number of sessions and hours) has found that older adults with mild symptoms of depression responded equally well to both treatments.

Late-life depression increases dementia risk


Late-life depression is associated with an increased risk for all-cause dementia, Alzheimer’s disease, and, most strongly, vascular dementia, a new study shows.

Feeling lonely linked to increased dementia risk

A study that attempts to separate the effects of social isolation from subjective feelings of loneliness concludes that feelings of loneliness have a greater effect on dementia risk.

There's quite a bit of evidence now that socializing — having frequent contact with others — helps protect against cognitive impairment in old age. We also know that depression is a risk factor for cognitive impairment and dementia. There have been hints that loneliness might also be a risk factor. But here’s the question: is it being alone, or feeling lonely, that is the danger?

A large Dutch study, following 2173 older adults for three years, suggests that it is the feeling of loneliness that is the main problem.

At the start of the study, some 46% of the participants were living alone, and some 50% were no longer or never married (presumably the discrepancy is because many older adults have a spouse in a care facility). Some 73% said they had no social support, while 20% reported feelings of loneliness.

Those who lived alone were significantly more likely to develop dementia over the three-year study period (9.3% compared with 5.6% of those who lived with others). The unmarried were also significantly more likely to develop dementia (9.2% vs 5.3%).

On the other hand, among those without social support, 5.6% developed dementia, compared with 11.4% of those with social support! This seems to contradict everything we know, not to mention the other results of the study, but the answer presumably lies in what is meant by ‘social support’. Social support was assessed by the question: do you get help from family, neighbours or home support? It doesn’t ask whether help would be available if it were needed. So it is less a measure of social networks than of how much help you currently need. This interpretation is supported by the finding that those receiving social support had more health problems.

So, although the researchers originally counted this question as part of the measure of social isolation, it is clearly a poor reflection of it. Effectively, then, that leaves cohabitation and marriage as the only indices of social isolation, which is obviously inadequate.

However, we still have the interesting question regarding loneliness. The study found that 13.4% of those who said they felt lonely developed dementia, compared with 5.7% of those who didn’t feel this way. This is a greater difference than that found for the ‘socially isolated’ (as measured!). Moreover, once other risk factors, such as age, education, and other health factors, were accounted for, the association between living alone and dementia disappeared, while the association with feelings of loneliness remained.
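
As a rough back-of-the-envelope comparison, the unadjusted risk ratios can be computed directly from the percentages quoted above. This is purely illustrative arithmetic, not the study's adjusted analysis (which controlled for age, education and health):

    # Crude (unadjusted) risk ratios from the percentages reported above
    lonely, not_lonely = 13.4, 5.7        # % developing dementia over three years
    alone, not_alone = 9.3, 5.6
    unmarried, married = 9.2, 5.3

    print(f"feeling lonely: {lonely / not_lonely:.1f}x the risk")   # ~2.4x
    print(f"living alone:   {alone / not_alone:.1f}x the risk")     # ~1.7x
    print(f"unmarried:      {unmarried / married:.1f}x the risk")   # ~1.7x

On these crude numbers, feeling lonely corresponds to a larger relative risk (about 2.4 times) than either living alone or being unmarried (about 1.7 times each), and, as noted above, only the loneliness association survived adjustment.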

Of course, this still doesn’t tell us what the association is! It may be that feelings of loneliness simply reflect cognitive changes that precede Alzheimer’s, but it may be that the feelings themselves are decreasing cognitive and social activity. It may also be that those who are prone to such feelings have personality traits that are in themselves risk factors for cognitive impairment.

I would like to see another large study using better metrics of social isolation, but, still, the study is interesting for its distinction between being alone and feeling lonely, and its suggestion that it is the subjective feeling that is more important.

This is not to say there is no value in having people around! For a start, as discussed, the measures of social isolation are clearly inadequate. Moreover, other people play an important role in helping with health issues, which in turn greatly impact cognitive decline.

Although there was a small effect of depression, the relationship between feeling lonely and dementia remained after this was accounted for, indicating that it is a separate factor (on the other hand, feelings of loneliness were a risk factor for depression).

The decline in cognitive score (MMSE) was also significantly greater for those who reported feeling lonely, suggesting that loneliness is also a factor in age-related cognitive decline.

The point is not so much that loneliness is more detrimental than being alone, but that loneliness in itself is a risk factor for cognitive decline and dementia. This suggests that we should develop a better understanding of loneliness, how to identify the vulnerable, and how to help them.

Social isolation decreases myelin

A mouse study demonstrates that prolonged social isolation can lead to a decrease in myelin, an effect implicated in a number of disorders, including age-related cognitive decline.

Problems with myelin are increasingly being implicated in a wide range of disorders, whether through demyelination (seen most dramatically in multiple sclerosis, but also occurring in normal aging and in depression) or through a failure to develop sufficient myelin in childhood and adolescence. A new animal study adds to that evidence by showing that social isolation brings about both depression-like behavior and loss of myelin.

In the study, adult mice were isolated for eight weeks (which is, of course, a much longer period for a mouse than it is for us) to induce a depressive-like state. They were then introduced to a mouse they hadn’t seen before. Although mice are typically very social animals, those that had been socially isolated showed no interest in interacting with the new mouse, mirroring the social withdrawal commonly seen in human depression.

Analysis of their brains revealed significantly lower transcription of oligodendrocyte genes in the prefrontal cortex (oligodendrocytes are the cells that produce myelin). This appeared to be caused by reduced formation of heterochromatin (tightly packed DNA) in the oligodendrocytes’ nuclei, leaving the cells less mature.

Interestingly, even short periods of isolation were sufficient to produce changes in chromatin and myelin, although behavior wasn’t affected.

Happily, however, regardless of length of isolation, myelin production went back to normal after a period of social integration.

The findings add to the evidence that environmental factors can have significant effects on brain development and function, and support the idea that socializing is good for the brain.

Self-imagination helps memory in both healthy and memory-impaired

A small study involving patients with acquired brain injury has found that the most effective learning strategies are those that call on the self-schema rather than episodic memory, and that the best of these involves self-imagination.

Some time ago, I reported on a study showing that older adults could improve their memory for a future task (remembering to regularly test their blood sugar) by picturing themselves going through the process. Imagination has been shown to be a useful strategy for improving memory (and also motor skills). A new study extends and confirms previous findings by testing free recall and comparing self-imagination to more traditional strategies.

The study involved 15 patients with acquired brain injury who had impaired memory and 15 healthy controls. Participants memorized five lists of 24 adjectives that described personality traits, using a different strategy for each list. The five strategies were:

  • think of a word that rhymes with the trait (baseline),
  • think of a definition for the trait (semantic elaboration),
  • think about how the trait describes you (semantic self-referential processing),
  • think of a time when you acted out the trait (episodic self-referential processing), or
  • imagine acting out the trait (self-imagining).

For both groups, self-imagination produced the highest rates of free recall of the list (an average of 9.3 for the memory-impaired, compared to 3.2 using the baseline strategy; 8.1 vs 3.2 for the controls — note that the controls were given all 24 items in one list, while the memory-impaired were given 4 lists of 6 items).

Additionally, those with impaired memory did better using semantic self-referential processing than episodic self-referential processing (7.3 vs 5.7). In contrast, the controls did much the same in both conditions. This adds to the evidence that patients with brain injury often have a particular problem with episodic memory (knowledge about specific events). Episodic memory is also particularly affected in Alzheimer’s, as well as in normal aging and depression.

It’s also worth noting that all the strategies that involved the self were more effective than the two strategies that didn’t, for both groups (also, semantic elaboration was better than the baseline strategy).

The researchers suggest that self-imagination (and semantic self-referential processing) might be of particular benefit to memory-impaired patients, because it encourages them to use information they can more easily access: information about their own personality traits, identity roles, and lifetime periods, which together make up what is termed the self-schema. They also suggest that future research should explore ways in which self-imagination could support everyday memory tasks, such as learning new skills and remembering recent events.

Improving memory for specific events can help depression

A small study suggests that training in recalling personal memories can significantly help those with depression.

We know that people with depression tend to focus on, and remember, negative memories rather than positive. Interestingly, it’s not simply an emotion effect. People with depression, and even those at risk of depression (including those who have had depression), tend to have trouble remembering specific autobiographical memories. That is, memories of events that happened to them at a specific place and time (as opposed to those generalized event memories we construct from similar events, such as the ‘going to the dentist’ memory).

This cognitive difficulty seems to exacerbate their depression, probably through its effect on social encounters and relationships.

A new study, however, has found that a particular training program (“Memory Specificity Training”) can help both their memory for specific events and their symptoms of depression.

The study involved 23 adolescent Afghan refugees living in Iran, all of whom had lost their fathers in the war in Afghanistan and showed symptoms of depression. Half were randomly assigned to the five-week memory training program; the other half received no training.

The training program involved a weekly 80-minute group session, in which participants learned about different types of memory and memory recall, and practiced recalling specific memories after being given positive, neutral, and negative keywords.

Participants’ memory for specific events was tested at the start of the study, at the end of the five-week training period, and two months after the end of the training. Compared to the control group, those given the training were able to provide more specific memories after the training, and showed fewer symptoms of depression at the two-month follow-up (but not immediately after the end of training).

The study follows on from a pilot study in which ten depressed female patients were given four weekly one-hour sessions of memory training. Improvements in memory retrieval were associated with less rumination (dwelling on things), less cognitive avoidance, and improvements in problem-solving skills.

It’s somewhat unfortunate that the control group were given no group sessions, and indeed no contact of any kind apart from the tests. Nevertheless, and bearing in mind that these are still very small studies, the findings do suggest that it would be helpful to include a memory training component in any cognitive behavioral therapy for depression.

Rapamycin makes young mice learn better and prevents decline in old mice

Further evidence from mouse studies that the Easter Island drug improves cognition, in young mice as well as old.

I have reported previously on research suggesting that rapamycin, a bacterial product first isolated from soil on Easter Island and used to prevent organ rejection in transplant patients, might improve learning and memory. Following on from this research, a new mouse study has extended these findings by adding rapamycin to the diet of healthy mice throughout their life span. Excitingly, the drug improved cognition in young mice and prevented the normal cognitive decline in older mice.

Anxiety and depressive-like behavior were also reduced, and the mice’s behavior demonstrated that rapamycin was acting like an antidepressant. This effect was found across all ages.

Three "feel-good" neurotransmitters — serotonin, dopamine and norepinephrine — all showed significantly higher levels in the midbrain (but not in the hippocampus). As these neurotransmitters are involved in learning and memory as well as mood, it is suggested that this might be a factor in the improved cognition.

Other recent studies have suggested that rapamycin inhibits a pathway in the brain that interferes with memory formation and facilitates aging.

Aging successfully

In a recent news report, I talked about a study of older adults which found that their sense of control over their lives fluctuates significantly over the course of a day, and that this affects their cognitive abilities, including reasoning and memory.

Diet linked to brain atrophy in old age

A more rigorous measurement of diet finds that dietary factors account for nearly as much of the variation in brain shrinkage as age, education, APOE genotype, depression and high blood pressure combined.

The study involved 104 healthy older adults (average age 87) participating in the Oregon Brain Aging Study. Analysis of the nutrient biomarkers in their blood revealed that those with diets high in omega 3 fatty acids and in vitamins C, D, E and the B vitamins had higher scores on cognitive tests than people with diets low in those nutrients, while those with diets high in trans fats tended to score more poorly on cognitive tests.

These associations were dose-dependent, with each standard deviation increase in the vitamin BCDE score associated with a 0.28 SD increase in global cognitive score, and each SD increase in the trans fat score associated with a 0.30 SD decrease in global cognitive score.
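
To put those standardized effects in concrete terms, here is a minimal sketch; the 10-point standard deviation used for the global cognitive score is a made-up illustration, not a figure from the study:

    # Converting SD-unit (standardized) effects into raw score units.
    def implied_shift(beta_sd, outcome_sd):
        """Raw-unit change in the outcome for a 1 SD change in the predictor."""
        return beta_sd * outcome_sd

    # Hypothetical example: if the global cognitive score had an SD of 10 points,
    # the reported associations would translate to roughly:
    print(f"{implied_shift(0.28, 10):+.1f} points")   # per SD of the vitamin BCDE score
    print(f"{implied_shift(-0.30, 10):+.1f} points")  # per SD of the trans fat score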

Trans fats are primarily found in packaged, fast, fried and frozen food, baked goods and margarine spreads.

Brain scans of 42 of the participants found that those with diets high in vitamins BCDE and omega 3 fatty acids were also less likely to have the brain shrinkage associated with Alzheimer's, while those with high trans fats were more likely to show such brain atrophy.

Those with higher omega-3 scores also had fewer white matter hyperintensities. However, this association became weaker once depression and hypertension were taken into account.

Overall, the participants had good nutritional status, but 7% were deficient in vitamin B12 (I’m surprised it’s so low, but bear in mind that these are already a select group, being healthy at such an advanced age) and 25% were deficient in vitamin D.

The nutrient biomarkers accounted for 17% of the variation in cognitive performance, while age, education, APOE genotype (presence or absence of the ‘Alzheimer’s gene’), depression and high blood pressure together accounted for 46%. Diet was more important for brain atrophy: here, the nutrient biomarkers accounted for 37% of the variation, while the other factors accounted for 40% (meaning that diet was nearly as important as all these other factors combined!).
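
‘Accounted for X% of the variation’ is a statement about variance explained (R-squared) in a regression model. Purely as an illustration of the kind of comparison involved, here is a minimal sketch using synthetic data; the variable names and coefficients are placeholders, not the Oregon study’s data or its actual analysis:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 104                                  # same sample size as the study; the data are synthetic
    nutrients = rng.normal(size=(n, 3))      # stand-ins for nutrient biomarker scores
    other = rng.normal(size=(n, 5))          # stand-ins for age, education, APOE, depression, BP
    outcome = nutrients @ [0.4, 0.3, 0.2] + other @ [0.5, 0.4, 0.3, 0.2, 0.2] + rng.normal(size=n)

    def r_squared(X, y):
        """Share of the variance in y explained by a least-squares fit on X (with intercept)."""
        X1 = np.column_stack([np.ones(len(y)), X])
        coef, *_ = np.linalg.lstsq(X1, y, rcond=None)
        resid = y - X1 @ coef
        return 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

    print(r_squared(nutrients, outcome))     # variance explained by the nutrient markers alone
    print(r_squared(other, outcome))         # variance explained by the other risk factors alone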

The findings add to the growing evidence that diet has a significant role in determining whether or not, and when, you develop Alzheimer’s disease.
