episodic memory

How stress affects your learning

October, 2012

A small study shows that stress makes learning more likely to rely on complex, subconscious processes supported by brain regions involved in habit and procedural learning.

We know that stress has a complicated relationship with learning, but in general its effect is negative, and part of that is due to stress producing anxious thoughts that clog up working memory. A new study adds another perspective to that.

The brain scanning study involved 60 young adults, of whom half were put under stress by having a hand immersed in ice-cold water for three minutes under the supervision of a somewhat unfriendly examiner, while the other group immersed their hand in warm water without such supervision (cortisol and blood pressure tests confirmed the stress difference).

About 25 minutes after this (cortisol reaches peak levels around 25 minutes after stress), participants’ brains were scanned while they alternated between a classification task and a visual-motor control task. The classification task required them to look at cards with different symbols and learn to predict which combinations of cards announced rain and which sunshine. Afterward, they were given a short questionnaire to determine their knowledge of the task. The control task was similar but had no learning demands (they looked at cards on the screen and made a simple perceptual decision).

In order to determine the strategy individuals used to do the classification task, ‘ideal’ performance was modeled for four possible strategies, of which two were ‘simple’ (based on single cues) and two ‘complex’ (based on multiple cues).
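
For readers who like to see the mechanics, here is a minimal sketch of how this kind of strategy fitting can work: simulate an ‘ideal responder’ for each candidate strategy and score which one best matches a participant’s actual choices. The cue patterns, weights and simulated responses below are invented for illustration; they are not the study’s actual models.

```python
# Toy sketch of strategy analysis for a weather-prediction-style task.
# Cue patterns, outcome rules and strategies are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)

# Each trial shows 4 binary cues (cards present/absent).
trials = rng.integers(0, 2, size=(200, 4))

def single_cue_strategy(cues, cue_index):
    """'Simple' strategy: predict rain whenever one particular cue is present."""
    return cues[:, cue_index]

def multi_cue_strategy(cues, weights):
    """'Complex' strategy: weigh all cues and predict rain if the sum crosses 0.5."""
    return (cues @ weights > 0.5).astype(int)

strategies = {
    "single_cue_0": single_cue_strategy(trials, 0),
    "single_cue_1": single_cue_strategy(trials, 1),
    "multi_cue_a": multi_cue_strategy(trials, np.array([0.4, 0.3, 0.2, 0.1])),
    "multi_cue_b": multi_cue_strategy(trials, np.array([0.1, 0.2, 0.3, 0.4])),
}

# A (simulated) participant who mostly follows one single-cue strategy.
responses = np.where(rng.random(200) < 0.9, strategies["single_cue_0"],
                     rng.integers(0, 2, 200))

# Fit = proportion of trials on which the ideal responder matches the participant.
fits = {name: float(np.mean(pred == responses)) for name, pred in strategies.items()}
best = max(fits, key=fits.get)
print(fits, "-> best-fitting strategy:", best)
```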

Here’s the interesting thing: while both groups successfully learned the task, they learned it in different ways. Far more of the non-stressed group activated the hippocampus, pursuing a simple, deliberate strategy focused on individual symbols rather than combinations of symbols. The stressed group, on the other hand, were far more likely to rely on the striatum alone, processing symbol combinations in a more complex, subconscious way.

The stressed group also remembered significantly fewer details of the classification task.

There was no difference between the groups on the (simple, perceptual) control task.

In other words, it seems that stress interferes with conscious, purposeful learning, causing the brain to fall back on more ‘primitive’ mechanisms that involve procedural learning. Striatum-based procedural learning is less flexible than hippocampus-based declarative learning.

Why should this happen? Well, the non-conscious procedural learning going on in the striatum is much less demanding of cognitive resources, freeing up your working memory to do something important — like worrying about the source of the stress.

Unfortunately, such learning will not become part of your more flexible declarative knowledge base.

The finding may have implications for stress disorders such as depression, addiction, and PTSD. It may also have relevance for a memory phenomenon known as “forgotten baby syndrome”, in which parents forget their babies in the car. This may be related to the use of non-declarative memory, because of the stress they are experiencing.

Reference: 

Schwabe, L., & Wolf, O. T. (2012). Stress modulates the engagement of multiple memory systems in classification learning. The Journal of Neuroscience, 32(32), 11042-11049.

Sleep learning making a comeback?

August, 2012

Two new studies provide support for the judicious use of sleep learning — as a means of reactivating learning that occurred during the day.

Back when I was young, sleep learning was a popular idea: a tape would play while you slept, and learning would seep into your brain effortlessly. It was particularly advocated for language learning. Subsequent research, unfortunately, rejected the idea, and it has gradually faded (although not completely). Now a new study may presage a comeback.

In the study, 16 young adults (mean age 21) learned how to ‘play’ two artificially-generated tunes by pressing four keys in time with repeating 12-item sequences of moving circles — the idea being to mimic the sort of sensorimotor integration that occurs when musicians learn to play music. They then took a 90-minute nap. During slow-wave sleep, one of the tunes was repeatedly played to them (20 times over four minutes). After the nap, participants were tested on their ability to play the tunes.

A separate group of 16 students experienced the same events, but without the tune being played during sleep. A third group stayed awake, spending the 90 minutes on a demanding working memory task. White noise was played in the background, with the melody covertly embedded in it.

Consistent with the idea that sleep is particularly helpful for sensorimotor integration, and that reinstating information during sleep produces reactivation of those memories, the sequence ‘practiced’ during slow-wave sleep was remembered better than the unpracticed one. Moreover, the amount of improvement was positively correlated with the proportion of time spent in slow-wave sleep.

Among those who didn’t hear any sounds during sleep, improvement likewise correlated with the proportion of time spent in slow-wave sleep. The level of improvement for this group was intermediate between that of the practiced and unpracticed tunes in the sleep-learning group.

The findings add to growing evidence of the role of slow-wave sleep in memory consolidation. Whether the benefits for this very specific skill extend to other domains (such as language learning) remains to be seen.

However, another recent study carried out a similar procedure with object-location associations. Fifty everyday objects were associated with particular locations on a computer screen, and presented at the same time with characteristic sounds (e.g., a cat with a meow and a kettle with a whistle). The associations were learned to criterion before participants slept for two hours in an MR scanner. During slow-wave sleep, auditory cues related to half the learned associations were played, as well as ‘control’ sounds that had not been played previously. Participants were tested after a short break and a shower.

A difference in brain activity was found for associated sounds and control sounds — associated sounds produced increased activation in the right parahippocampal cortex — demonstrating that even in deep sleep some sort of differential processing was going on. This region overlapped with the area involved in retrieval of the associations during the earlier, end-of-training test. Moreover, when the associated sounds were played during sleep, parahippocampal connectivity with the visual-processing regions increased.

All of this suggests that, indeed, memories are being reactivated during slow-wave sleep.

Additionally, brain activity in certain regions at the time of reactivation (mediotemporal lobe, thalamus, and cerebellum) was associated with better performance on the delayed test. That is, those who had greater activity in these regions when the associated sounds were played during slow-wave sleep remembered the associations best.

The researchers suggest that successful reactivation of memories depends on responses in the thalamus, which if activated feeds forward into the mediotemporal lobe, reinstating the memories and starting the consolidation process. The role of the cerebellum may have to do with the procedural skill component.

The findings are consistent with other research.

All of this is very exciting, but of course this is not a strategy for learning without effort! You still have to do your conscious, attentive learning. But these findings suggest that we can increase our chances of consolidating the material by replaying it during sleep. Of course, there are two practical problems with this: the material needs an auditory component, and you somehow have to replay it at the right time in your sleep cycle.

Alzheimer’s biomarkers present decades before symptoms

July, 2012
  • A study of people with a strong genetic risk of early-onset Alzheimer’s has revealed a progression of brain changes beginning 25 years before symptoms are evident.

A study involving those with a strong genetic risk of developing Alzheimer’s has found that the first signs of the disease can be detected 25 years before symptoms are evident. Whether this is also true of those who develop the disease without having such a strong genetic predisposition is not yet known.

The study involved 128 individuals with a 50% chance of inheriting one of three mutations that are certain to cause Alzheimer’s, often at an unusually young age. On the basis of participants’ parents’ medical history, an estimate of age of onset was calculated.

The first observable brain marker was a drop in cerebrospinal fluid levels of amyloid-beta proteins, which could be detected 25 years before the anticipated age of onset. Amyloid plaques in the precuneus became visible on brain scans 15-20 years before memory problems were expected to appear; brain atrophy in the hippocampus was evident 15 years before, and elevated cerebrospinal fluid levels of the tau protein 10-15 years before. Ten years before symptoms, the precuneus showed reduced use of glucose, and slight impairments in episodic memory (as measured in the delayed-recall part of the Wechsler Logical Memory subtest) were detectable. Global cognitive impairment (measured by the MMSE and the Clinical Dementia Rating scale) was detected 5 years before expected symptom onset, and patients met diagnostic criteria for dementia an average of 3 years after expected onset.

Family members without the risky genes showed none of these changes.

The risky genes are PSEN1 (present in 70 participants), PSEN2 (11), and APP (7) — note that together these account for 30-50% of early-onset familial Alzheimer’s, although only 0.5% of Alzheimer’s in general. The ‘Alzheimer’s gene’ APOE4 (which is a risk factor for sporadic, not familial, Alzheimer’s) was no more likely to be present in these carriers (25%) than in noncarriers (22%), and there were no gender differences. The average parental age of symptom onset was 46 (note that this pushes the first biomarker back to age 21! Can we speculate a connection to noncarriers having significantly more education than carriers — 15 years vs 13.9?).
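
To make the timeline concrete, here is a small sketch that converts the reported offsets (years before expected symptom onset) into approximate ages, using the average parental onset age of 46. The midpoints used for the reported ranges are my own simplification.

```python
# Convert the reported biomarker offsets (years before expected symptom onset)
# into approximate ages, using the average parental onset age of 46.
expected_onset_age = 46

years_before_onset = {
    "CSF amyloid-beta drop": 25,
    "amyloid plaques visible (precuneus)": 17.5,   # reported as 15-20 years; midpoint
    "hippocampal atrophy": 15,
    "elevated CSF tau": 12.5,                      # reported as 10-15 years; midpoint
    "reduced glucose use / episodic memory dip": 10,
    "global cognitive impairment": 5,
    "dementia diagnosis": -3,                      # ~3 years after expected onset
}

for marker, offset in years_before_onset.items():
    print(f"{marker}: ~age {expected_onset_age - offset:g}")
```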

The results paint a clear picture of how Alzheimer’s progresses, at least in this particular pathway. First come increases in the amyloid-beta protein, followed by amyloid pathology, tau pathology, brain atrophy, and decreased glucose metabolism. Following this biological cascade, cognitive impairment ensues.

The degree to which these findings apply to the far more common sporadic Alzheimer’s is not known, but evidence from other research is consistent with this progression.

It must be noted, however, that the findings are based on cross-sectional data — that is, pieced together from individuals at different ages and stages. A longitudinal study is needed to confirm.

The findings do suggest the importance of targeting the first step in the cascade — the over-production of amyloid-beta — at a very early stage.

Researchers encourage people with a family history of multiple generations of Alzheimer’s diagnosed before age 55 to register at http://www.DIANXR.org/, if they would like to be considered for inclusion in any research.

Reference: 

Bateman, R. J., Xiong, C., Benzinger, T. L. S., Fagan, A. M., Goate, A., Fox, N. C., et al. (2012). Clinical and biomarker changes in dominantly inherited Alzheimer's disease. New England Journal of Medicine.

Effect of blood pressure on the aging brain depends on genetics

July, 2012
  • For those with the Alzheimer’s gene, higher blood pressure, even when within the normal range, is linked to greater brain shrinkage and reduced cognitive ability.

I’ve reported before on the evidence suggesting that carriers of the ‘Alzheimer’s gene’, APOE4, tend to have smaller brain volumes and perform worse on cognitive tests, despite being cognitively ‘normal’. However, the research hasn’t been consistent, and now a new study suggests the reason.

The e4 variant of the apolipoprotein (APOE) gene not only increases the risk of dementia, but also of cardiovascular disease. These effects are not unrelated: apolipoprotein is involved in the transport of cholesterol. In older adults, other vascular risk factors (such as elevated cholesterol, hypertension or diabetes) have been shown to worsen the cognitive effects of carrying this gene variant.

This new study extends the finding by looking at 72 healthy adults across a wide age range (19-77).

Participants were tested on various cognitive abilities known to be sensitive to aging and to the effects of the e4 allele: speed of information processing, working memory, and episodic memory. Blood pressure readings, brain scans, and of course genetic tests were also taken.

There are a number of interesting findings:

  • The relationship between age and hippocampal volume was stronger for those carrying the e4 allele (shrinkage of this brain region occurs with age, and is significantly greater in those with MCI or dementia).
  • Higher systolic blood pressure was significantly associated with greater atrophy (i.e., smaller volumes), slower processing speed, and reduced working memory capacity — but only for those with the e4 variant.
  • Among those with the better and more common e3 variant, working memory was associated with lateral prefrontal cortex volume and with processing speed. Greater age was associated with higher systolic blood pressure, smaller volumes of the prefrontal cortex and prefrontal white matter, and slower processing. However, blood pressure was not itself associated with either brain atrophy or slower cognition.
  • For those with the Alzheimer’s variant (e4), older adults with higher blood pressure had smaller volumes of prefrontal white matter, which in turn was associated with slower processing speed, which in turn was linked to reduced working memory.

In other words, for those with the Alzheimer’s gene, age differences in working memory (which underpin so much of age-related cognitive impairment) were produced by higher blood pressure, reduced prefrontal white matter, and slower processing. For those without the gene, age differences in working memory were produced by reduced prefrontal cortex and prefrontal white matter.
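
As an illustration of the kind of analysis that can reveal such genotype-dependent effects, here is a minimal sketch of a regression in which blood pressure, carrier status, and their interaction predict a brain measure. The data are simulated and the variable names are placeholders; this is not the study’s actual analysis.

```python
# Toy illustration of testing whether the effect of blood pressure on a brain
# measure depends on APOE e4 carrier status (simulated data, not the study's).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 72
e4 = rng.integers(0, 2, n)                     # 1 = e4 carrier
sbp = rng.normal(120, 10, n)                   # systolic blood pressure (mmHg)
# Simulate: blood pressure hurts white-matter volume only in e4 carriers.
volume = 100 - 0.3 * e4 * (sbp - 120) + rng.normal(0, 3, n)

df = pd.DataFrame({"e4": e4, "sbp": sbp, "volume": volume})
model = smf.ols("volume ~ sbp * e4", data=df).fit()
print(model.summary())   # look for the sbp:e4 interaction term
```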

Most importantly, the blood pressure levels in question are well within the normal range (although at the higher end).

The researchers make an interesting point: that these findings are in line with “growing evidence that ‘normal’ should be viewed in the context of individual’s genetic predisposition”.

What it comes down to is this: those with the Alzheimer’s gene variant (and no doubt other genetic variants) have a greater vulnerability to some of the risk factors that commonly increase as we age. Those with a family history of dementia or serious cognitive impairment should therefore pay particular attention to controlling vascular risk factors, such as hypertension and diabetes.

This doesn’t mean that those without such a family history can safely ignore such conditions! When they get to the point of being clinically diagnosed as problems, then they are assuredly problems for your brain regardless of your genetics. What this study tells us is that these vascular issues appear to be problematic for Alzheimer’s gene carriers before they get to that point of clinical diagnosis.

Walking through doorways causes forgetting

March, 2012

A series of experiments indicates that walking through doorways creates event boundaries, requiring us to update our awareness of current events and making information about the previous location less available.

We’re all familiar with the experience of going to another room and forgetting why we’ve done so. The problem has been largely attributed to a failure of attention, but recent research suggests something rather more specific is going on.

In a previous study, a virtual environment was used to explore what happens when people move through several rooms. The virtual environment was displayed on a very large (66 inch) screen to provide a more immersive experience. Each ‘room’ had one or two tables. Participants ‘carried’ an object, which they would deposit on a table, before picking up a different object. At various points, they were asked if the object was, say, a red cube (memory probe). The objects were not visible at the time of questioning. It was found that people were slower and less accurate if they had just moved to a new room.

To assess whether this effect depends on a high degree of immersion, a recent follow-up replicated the experiment using standard 17” monitors rather than the giant screens. The experiment involved 55 students and once again demonstrated a significant effect of shifting rooms. Specifically, when the probe was positive, the error rate was 19% in the shift condition compared to 12% on trials when the participant ‘traveled’ the same distance but didn’t change rooms. When the probe was negative, the error rate was 22% in the shift condition vs 7% in the non-shift condition. Reaction time was less affected — there was no difference when the probes were positive, but a marginally significant difference on negative-probe trials.

The second experiment went to the other extreme. Rather than reducing the immersive experience, researchers increased it — to a real-world environment. Unlike the virtual environments, distances couldn’t be kept constant across conditions. Three large rooms were used, and no-shift trials involved different tables at opposite ends of the room. Six objects, rather than just one, were moved on each trial. Sixty students participated.

Once again, more errors occurred when a room-shift was involved. On positive-probe trials, the error rate was 28% in the shift condition vs 23% in the non-shift. On negative-probe trials, the error rate was 21% and 18%, respectively. The difference in reaction times wasn’t significant.

The third experiment, involving 48 students, tested the idea that forgetting might be due to the difference in context at retrieval compared to encoding. To do this, the researchers went back to the more immersive virtual environment (the 66” screen) and added conditions in which the participant either returned to the original room to be tested (return) or continued on to a new room to be tested (double-shift) — the idea being to hold the number of spatial shifts constant.

There was no evidence that returning to the original room produced the sort of advantage expected if context-matching were the important variable. Memory was best in the no-shift condition, next best in the shift and return conditions (no difference between them), and worst in the double-shift condition. In other words, it was the number of new rooms entered that appears to be important.

This is in keeping with the idea that we break the action stream into separate events using event boundaries. Passing through a doorway is one type of event boundary. A more obvious type is the completion of an action sequence (e.g., mixing a cake — the boundary is the action of putting it in the oven; speaking on the phone — the boundary is the action of ending the call). Information being processed during an event is more available, foregrounded in your attention. Interference occurs when two or more events are activated, increasing errors and sometimes slowing retrieval.
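
As a toy illustration of this idea (my own sketch, not the researchers’ model), imagine items being filed under the event in which they were handled, with probes about items from a past event answered less reliably than probes about the current one. The retrieval probabilities are invented.

```python
# Toy illustration of the event-boundary idea: items are filed under the event
# (room) in which they were handled, and probes about items from earlier events
# are answered less reliably. Retrieval probabilities are invented.
import random

random.seed(42)

events = []              # each completed event is a list of items handled in that 'room'
current_event = []

def walk_through_doorway():
    """Close off the current event and start a new one."""
    global current_event
    events.append(current_event)
    current_event = []

def handle(item):
    current_event.append(item)

def probe(item, p_current=0.95, p_past=0.80):
    """Answer a memory probe; items from past events are retrieved less reliably."""
    p = p_current if item in current_event else p_past
    return random.random() < p

handle("red cube")
walk_through_doorway()   # event boundary: the red cube is now 'backgrounded'
handle("blue cone")

trials = 1000
acc_past = sum(probe("red cube") for _ in range(trials)) / trials
acc_now = sum(probe("blue cone") for _ in range(trials)) / trials
print(f"accuracy, current-event item: {acc_now:.2f}; past-event item: {acc_past:.2f}")
```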

All of this has greater ramifications than simply helping to explain why we so often go to another room and forget why we’re there. The broader point is that everything that happens to us is broken up and filed, and we should look for the boundaries of these events and be aware of their consequences for our memory. Moreover, these contextual factors are important elements of our filing system, and we can use that knowledge to construct more effective tags.

Read an article on this topic at Mempowered

Reference: 

Radvansky, G. A., Krawietz, S. A., & Tamplin, A. K. (2011). Walking through doorways causes forgetting: Further explorations. The Quarterly Journal of Experimental Psychology.

Negative stereotypes about aging affect how well older adults remember

March, 2012

Another study has come out supporting the idea that negative stereotypes about aging and memory affect how well older adults remember. In this case, older adults reminded of age-related decline were more likely to make memory errors.

In the study, 64 older adults (60-74; average 70) and 64 college students were compared on a word recognition task. Both groups first took a vocabulary test, on which they performed similarly. They were then presented with 12 lists of 15 semantically related words. For example, one list could have words associated with "sleep," such as "bed," "rest," "awake," "tired" and "night" — but not the word “sleep”. They were not told they would be tested on their memory of these words; rather, they were asked to rate each word for pleasantness.

They then engaged in a five-minute filler task (a Sudoku) before a short text was read to them. For some, the text had to do with age-related declines in memory. These participants were told the experiment had to do with memory. For others, the text concerned language-processing research. These were told the experiment had to do with language processing and verbal ability.

They were then given a recognition test containing 36 of the studied words, 48 words unrelated to the studied words, and 12 words related to the studied words (e.g. “sleep”). After recording whether or not they had seen each word before, they also rated their confidence in that answer on an 8-point scale. Finally, they were given a lexical decision task to independently assess stereotype activation.
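
For the curious, here is a small sketch of how such a recognition test might be assembled and scored. The 36/48/12 composition follows the description above; the word lists and the simulated response rates are placeholders.

```python
# Sketch of assembling and scoring a DRM-style recognition test
# (36 studied words, 48 unrelated words, 12 related-but-unstudied lures).
# The word lists themselves are placeholders.
import random

random.seed(7)

studied = [f"studied_{i}" for i in range(36)]
unrelated = [f"unrelated_{i}" for i in range(48)]
related_lures = [f"lure_{i}" for i in range(12)]   # e.g. 'sleep' for the sleep list

test_items = studied + unrelated + related_lures
random.shuffle(test_items)

def score(responses):
    """responses: dict mapping item -> True if judged 'old' (seen before)."""
    hits = sum(responses[w] for w in studied) / len(studied)
    false_unrelated = sum(responses[w] for w in unrelated) / len(unrelated)
    false_related = sum(responses[w] for w in related_lures) / len(related_lures)
    return hits, false_unrelated, false_related

# Simulated responder with a high false-recognition rate for related lures.
responses = {w: random.random() < 0.85 for w in studied}
responses.update({w: random.random() < 0.10 for w in unrelated})
responses.update({w: random.random() < 0.70 for w in related_lures})
print("hits, false alarms (unrelated), false alarms (related):", score(responses))
```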

While young adults showed no effects from the stereotype manipulation, older adults were much more likely to falsely recognize related words that had not been studied if they had heard the text on memory. Those who heard the text on language were no more likely than the young adults to falsely recognize related words.

Note that there is always quite a high level of false recognition of such items: young adults, and older adults in the low-threat condition, falsely recognized around half of the related lures, compared to around 10% of unrelated words. But in the high-threat condition, older adults falsely recognized 71% of the related words.

Moreover, older adults’ confidence was also affected. While young adults’ confidence in their false memories was unaffected by threat condition, older adults in the high-threat condition were more confident of their false memories than older adults in the low-threat condition.

The idea that older adults were affected by negative stereotypes about aging was supported by the results of the lexical decision task, which found that, in the high-threat condition, older adults responded more quickly to words associated with negative stereotypes than to neutral words (indicating that they were more accessible). Young adults did not show this difference.

Reference: 

Thomas, A. K., & Dubois, S. J. (2011). Reducing the burden of stereotype threat eliminates age differences in memory distortion. Psychological Science, 22(12), 1515-1517. doi:10.1177/0956797611425932

The problem in correcting false knowledge

February, 2012

Whether corrections to students’ misconceptions ‘stick’ depends on the strength of the memory of the correction.

Students come into classrooms filled with inaccurate knowledge they are confident is correct, and overcoming these misconceptions is notoriously difficult. In recent years, research has shown that such false knowledge can be corrected with feedback. The hypercorrection effect, as it has been termed, expresses the finding that when students are more confident of a wrong answer, they are more likely to remember the right answer if corrected.

This is somewhat against intuition and experience, which would suggest that it is harder to correct more confidently held misconceptions.

A new study tells us how to reconcile experimental evidence and belief: false knowledge is more likely to be corrected in the short-term, but also more likely to return once the correction is forgotten.

In the study, 50 undergraduate students were tested on basic science facts. After rating their confidence in each answer, they were told the correct answer. Half the students were then retested almost immediately (after a 6 minute filler task), while the other half were retested a week later.

There were 120 questions in the test. Examples include: What is stored in a camel's hump? How many chromosomes do humans have? What is the driest area on Earth? The average percentage of correct responses on the initial test was 38%, and, as expected, performance on the second test was significantly better for the immediate group than for the delayed group (90% vs 71%).

Students who were retested immediately gave the correct answer on 86% of their previous errors, and they were more likely to correct their high-confidence errors than those made with little confidence (the hypercorrection effect). Those retested a week later also showed the hypercorrection effect, albeit at a much lower level: they corrected only 56% of their previous errors. (More precisely, on the immediate test, corrected answers rose from 79% at the lowest confidence level to 92% at the highest. On the delayed test, corrected answers rose from 43% at the lowest confidence level to 70% at the second-highest, dropping to 64% at the highest.)

In those instances where students had forgotten the correct answer, they were much more likely to reproduce the original error if their confidence had been high. Indeed, on the immediate test, the same error was rarely repeated, regardless of confidence level (the proportion of repeated errors hovered at 3-4% pretty much across the board). On the delayed test, on the other hand, there was a linear increase, with repeated errors steadily rising from 14% to 23% as confidence level rose (with the same odd exception — at the second-highest confidence level, the proportion of repeated errors suddenly fell).

Overall, students were more likely to correct their errors if they remembered their error than if they didn’t (72% vs 65%). Unsurprisingly, those in the immediate group were much more likely to remember their initial errors than those in the delayed group (85% vs 61%).

In other words, it’s all about relative strength of the memories. While high-confidence errors are more likely to be corrected if the correct answer is readily accessible, they are also more likely to be repeated once the correct answer becomes less accessible. The trick to replacing false knowledge, then, is to improve the strength of the correct information.

Thus, as recency fades, you need to engage frequency to make the new memory stronger. The finding therefore points to a special need for repeated practice if you are hoping to correct entrenched false knowledge. The success of immediate testing indicates that properly spaced retrieval practice is probably the best way of replacing incorrect knowledge.
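
As a practical illustration, here is a minimal sketch of one common way to space retrieval practice: an expanding schedule in which the gap doubles after each successful recall and resets after a failure. The specific intervals are illustrative, not drawn from the study.

```python
# Minimal expanding-interval retrieval schedule: the gap doubles after each
# successful recall and shrinks back after a failure. Intervals are illustrative.
from datetime import date, timedelta

def next_review(last_interval_days: int, recalled_correctly: bool) -> int:
    """Return the next interval in days."""
    if recalled_correctly:
        return max(1, last_interval_days * 2)   # expand: 1, 2, 4, 8, ... days
    return 1                                    # failure: start over at 1 day

interval = 1
today = date.today()
for outcome in [True, True, True, False, True]:  # a hypothetical review history
    today += timedelta(days=interval)
    interval = next_review(interval, outcome)
    print(f"reviewed {today}, next review in {interval} day(s)")
```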

Of course, these findings apply well beyond the classroom!

Reference: 

Butler, A. C., Fazio, L. K., & Marsh, E. J. (2011). The hypercorrection effect persists over a week, but high-confidence errors return. Psychonomic Bulletin & Review, 18(6), 1238-1244.

Why a select group of seniors retain their cognitive abilities

December, 2011
  • Comparing the brains of octogenarians whose memories match those of middle-aged people with those of cognitively-normal seniors reveals important differences.

A certain level of mental decline in the senior years is regarded as normal, but some fortunate few don’t suffer from any decline at all. The Northwestern University Super Aging Project has found seniors aged 80+ who match or better the average episodic memory performance of people in their fifties. Comparison of the brains of 12 super-agers, 10 cognitively-normal seniors of similar age, and 14 middle-aged adults (average age 58) now reveals that the brains of super-agers also look like those of the middle-aged. In contrast, brain scans of cognitively average octogenarians show significant thinning of the cortex.

The difference between the brains of super-agers and the others was particularly marked in the anterior cingulate cortex. Indeed, the super-agers appeared to have a much thicker left anterior cingulate cortex than even the middle-aged group. Moreover, the brain of a super-ager who died revealed that, although there were some plaques and tangles (characteristic, in much greater quantities, of Alzheimer’s) in the mediotemporal lobe, there were almost none in the anterior cingulate. (But note an earlier report from the researchers.)

Why this region should be of special importance is somewhat mysterious, but the anterior cingulate is part of the attention network, and perhaps it is this role that underlies the superior abilities of these seniors. The anterior cingulate also plays a role in error detection and motivation; it will be interesting to see if these attributes are also important.

While the precise reason for the anterior cingulate being critical to retaining cognitive abilities might be mysterious, the lack of cortical atrophy, together with the suggestion that super-agers’ brains have much reduced levels of the sort of pathological damage seen in most older brains, adds weight to the growing evidence that cognitive aging reflects clinical problems, which unfortunately are all too common.

Sadly, there are no obvious lifestyle factors involved here. The super-agers don’t have a lifestyle any different from their ‘cognitively average’ counterparts. However, while genetics might be behind these people’s good fortune, that doesn’t mean that lifestyle choices don’t make a big difference to those of us not so genetically fortunate. It seems increasingly clear that, for most of us without ‘super-protective genes’, health problems largely resulting from lifestyle choices are behind much of the damage done to our brains.

It should be emphasized that these unpublished results are preliminary only. This conference presentation reported on data from only 12 of 48 subjects studied.

Reference: 

Harrison, T., Geula, C., Shi, J., Samimi, M., Weintraub, S., Mesulam, M., & Rogalski, E. (2011). Neuroanatomic and pathologic features of cognitive SuperAging. Poster presented at the 2011 Society for Neuroscience conference.

Memory genes vary in protecting against age-related cognitive decline

November, 2011

New findings show the T variant of the KIBRA gene improves episodic memory through its effect on hippocampal activity. Another study finds the met variant of the BDNF gene is linked to greater age-related cognitive decline.

Carriers of the so-called KIBRA T allele have previously been shown to have better episodic memory than those who don’t carry that gene variant (this is a group difference; it doesn’t mean that any carrier will remember events better than any non-carrier). A large new study confirms and extends this finding.

The study involved 2,230 Swedish adults aged 35-95. Of these, 1040 did not have a T allele, 932 had one, and 258 had two.  Those who had at least one T allele performed significantly better on tests of immediate free recall of words (after hearing a list of 12 words, participants had to recall as many of them as they could, in any order; in some tests, there was a concurrent sorting task during presentation or testing).

There was no difference between those with one T allele and those with two. The effect increased with increasing age. There was no effect of gender. There was no significant effect on performance of delayed category cued recall tests or a visuospatial task, although a trend in the appropriate direction was evident.

It should also be noted that the effect on immediate recall, although statistically significant, was not large.

Brain activity was studied in a subset of this group: 83 adults aged 55-60, plus another 64 matched on sex, age, and performance on the scanner task. A further 113 adults aged 65-75 were included for comparison purposes. While in the scanner, participants carried out a face-name association task. Having been presented with face-name pairs, participants were tested on their memory by being shown the faces with three letters, of which one was the initial letter of the name.

Performance on the scanner task was significantly higher for T carriers — but only for the 55-60 age group, not for the 65-75 age group. Activity in the hippocampus was significantly higher for younger T carriers during retrieval, but not encoding. No such difference was seen in the older group.

This finding is in contrast with an earlier, and much smaller, study involving 15 carriers and 15 non-carriers, which found higher activation of the hippocampus in non-T carriers. This was taken at the time to indicate some sort of compensatory activity. The present finding challenges that idea.

Although higher hippocampal activation during retrieval is generally associated with faster retrieval, the higher activity seen in T carriers was not fully accounted for by performance. It may be that such activity also reflects deeper processing.

KIBRA T carriers were neither more nor less likely to carry other ‘memory genes’ — APOE e4, COMT val158met, or BDNF val66met.

The findings, then, fail to support the idea that non-carriers engage compensatory mechanisms, but do indicate that the KIBRA T allele helps episodic memory by improving hippocampal function.

BDNF gene variation predicts rate of age-related decline in skilled performance

In another study, this time into the effects of the BDNF gene, performance on an airplane simulation task was compared across three annual sessions. The study involved 144 pilots, all healthy Caucasian males aged 40-69, of whom 55 (38%) turned out to have at least one copy of a BDNF gene containing the ‘met’ variant. This variant is less common, occurring in about one in three Asians, one in four Europeans and Americans, and about one in 200 sub-Saharan Africans.

While performance dropped with age for both groups, the rate of decline was much steeper for those with the ‘met’ variant. Moreover, there was a significant inverse relationship between age and hippocampal size in the met carriers — and no significant correlation between age and hippocampal size in the non-met carriers.

Comparison over a longer time-period is now being undertaken.

The finding is more evidence for the value of physical exercise as you age — physical activity is known to increase BDNF levels in your brain. BDNF levels tend to decrease with age.

The met variant has been linked to higher likelihood of depression, stroke, anorexia nervosa, anxiety-related disorders, suicidal behavior and schizophrenia. It differs from the more common ‘val’ variant in having methionine rather than valine at position 66 on this gene. The BDNF gene has been remarkably conserved across evolutionary history (fish and mammalian BDNF have around 90% agreement), suggesting that mutations in this gene are not well tolerated.

Cannabis disrupts synchronized brain activity

November, 2011

Effects of a cannabis-like drug on rats help explain why cannabis is linked to schizophrenia and how it might impair cognition, and also add support to our understanding of how working memory works.

Research into the effects of cannabis on cognition has produced inconsistent results. Much may depend on extent of usage, timing, and perhaps (this is speculation) genetic differences. But marijuana abuse is common among sufferers of schizophrenia, and recent studies have shown that the psychoactive ingredient of marijuana can induce some symptoms of schizophrenia in healthy volunteers.

Now new research helps explain why marijuana is linked to schizophrenia, and why it might have detrimental effects on attention and memory.

In this rat study, a drug that mimics the psychoactive ingredient of marijuana (by activating the cannabinoid receptors) produced significant disruption in brain networks, with brain activity becoming uncoordinated and inaccurate.

In recent years it has become increasingly clear that synchronized brainwaves play a crucial role in information processing — especially coordination between the hippocampus and prefrontal cortex (see, for example, my reports last month on theta waves improving retrieval and the effect of running on theta and gamma rhythms). Interactions between the hippocampus and prefrontal cortex seem to be involved in working memory functions, and may provide the mechanism for bringing together memory and decision-making during goal-directed behaviors.

Consistent with this, during decision-making on a maze task, hippocampal theta waves and prefrontal gamma waves were impaired, and the theta synchronization between the two was disrupted. These effects correlated with impaired performance on the maze task.

These findings are consistent with earlier findings that drugs that activate the cannabinoid receptors disrupt the theta rhythm in the hippocampus and impair spatial working memory. This experiment extends that result to coordinated brainwaves beyond the hippocampus.

Similar neural activity is observed in schizophrenia patients, as well as in healthy carriers of a genetic risk variant.

The findings add to the evidence that working memory processes involve coordination between the prefrontal cortex and the hippocampus through theta rhythm synchronization. The findings are consistent with the idea that items are encoded and indexed along the phase of the theta wave into episodic representations and transferred from the hippocampus to the neocortex as a theta phase code. By disrupting that code, cannabis makes it more difficult to retain and index the information relevant to the task at hand.
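
As a toy illustration of the general idea (not the authors’ model), the sketch below assigns items held in working memory to successive phase slots within a theta cycle and shows how jittering those phases scrambles the readout, roughly the kind of disruption described above.

```python
# Toy illustration of theta phase coding: items held in working memory are
# assigned to successive phase slots within each ~8 Hz theta cycle. Disrupting
# the phases (as a cannabinoid agonist might) scrambles which slot an item
# occupies. All numbers are illustrative only.
import numpy as np

items = ["A", "B", "C", "D"]                                    # items currently held
slots = np.linspace(0, 2 * np.pi, len(items), endpoint=False)   # preferred phase of each item

def decode(phases):
    """Read items back out by sorting them by the phase at which they fire."""
    order = np.argsort(phases % (2 * np.pi))
    return [items[i] for i in order]

print("intact phase code:   ", decode(slots))

# An arbitrary, strong phase disruption standing in for the drug's effect.
jitter = np.array([2.0, -1.5, 3.0, -2.5])
print("disrupted phase code:", decode(slots + jitter))
```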
