object recognition

Even short periods of exercise help you learn and remember

  • A small study of young adults found that 10 minutes of light exercise improved memory for details and increased relevant brain activity.
  • Another study found that 15 minutes of more intense exercise after learning a new motor skill resulted in better skill performance a day later.

Ten minutes of light exercise boosts memory

Following on from rat studies, a study involving 36 healthy young adults found that 10 minutes of light exercise (such as tai chi, yoga, or walking) significantly improved detailed memory processing and increased activity in the hippocampus.

It also boosted connectivity between the hippocampus and cortical regions that support detailed memory processing (the parahippocampal, angular, and fusiform gyri), and the degree of improvement in this connectivity predicted the size of an individual’s memory improvement.

The memory task involved remembering details of pictures of objects from everyday life, some of which were very similar to other pictures, requiring participants to distinguish between the different memories.

Mood change was also assessed, and the researchers ruled this out as a cause of the improved memory.

https://www.theguardian.com/science/2018/sep/24/10-minutes-of-exercise-a-day-improves-memory

Exercise after learning helps you master new motor skills

Another recent study found that 15 minutes of cardiovascular exercise after learning a new motor skill resulted in better skill learning when tested a day later.

Exercise was also found to decrease desynchronization in beta brainwaves and increase their connectivity between hemispheres. The degree of improvement in skill learning reflected changes in beta-wave desynchronization. It appears that exercise helped the brain become more efficient in performing the skill.

The motor skill consisted of gripping an object akin to a gamer’s joystick and using varying degrees of force to move a cursor up and down to connect red rectangles on a computer screen as quickly as possible.

Note that there was no difference between the two groups (those who exercised and those who didn’t) 8 hours after learning — the difference didn’t appear until after participants had slept. Sleep helps consolidate skill learning.

https://www.eurekalert.org/pub_releases/2018-07/mu-1oe071118.php

https://www.futurity.org/15-minutes-exercise-brain-motor-skills-1805322

References: 

Suwabe, K. (2018). Rapid stimulation of human dentate gyrus function with acute mild exercise. Proceedings of the National Academy of Sciences, 115(41), 10487–10492. doi:10.1073/pnas.1805668115

Dal Maso, F., Desormeau, B., Boudrias, M.-H., & Roig, M. (2018). Acute cardiovascular exercise promotes functional changes in cortico-motor networks during the early stages of motor memory consolidation. NeuroImage, 174, 380–392.

 


Exercise might help your vision

  • A small study found that low-intensity exercise significantly boosted activation in the visual cortex above what occurred during rest or high-intensity exercise.

A study involving 18 volunteers who performed a simple orientation-discrimination task while on a stationary bicycle has found that low-intensity exercise boosted activation in the visual cortex, compared with activation levels at rest or during high-intensity exercise.

The changes suggest that the neurons in the visual cortex were most sensitive to the orientation stimuli during the low-intensity exercise condition relative to the other conditions. It’s suggested that this reflects an evolutionary pressure for the visual system to be more sensitive when the individual is actively exploring the environment (as opposed to, say, running away).

http://www.futurity.org/vision-exercise-brains-1400422-2/

Reference: 

Bullock, T., Elliott, J. C., Serences, J. T., & Giesbrecht, B. (2016). Acute exercise modulates feature-selective responses in human cortex. Journal of Cognitive Neuroscience, 29(4), 605–618.


Simple semantic task reveals early cognitive problems in older adults

January, 2013

A study finds early semantic problems in those with MCI, correlating with a reduced capacity to carry out everyday tasks.

A small study shows that those on the road to Alzheimer’s have semantic problems long before memory problems arise, and that such problems can affect daily life.

The study compared 25 patients with amnestic MCI, 27 patients with mild-to-moderate Alzheimer's, and 70 cognitively fit older adults (aged 55-90) on a task involving size comparisons (for example, “What is bigger: a key or a house?”; “What is bigger: a key or an ant?”). The comparisons were presented in three different ways: as words; as images reflecting real-world differences; and as incongruent images (e.g., a big ant and a small house).

Both those with MCI and those with AD were significantly less accurate, and significantly slower, in all three conditions compared to healthy controls, and they had disproportionately more difficulty on those comparisons where the size distance was smaller. But MCI and AD patients experienced their biggest problems when the images were incongruent – the ant bigger than the house. Those with MCI performed at a level between that of healthy controls and those with AD.

This suggests that perceptual information is having undue influence in a judgment task that requires conceptual knowledge.

Because semantic memory is organized according to relatedness, and because this sort of basic information was acquired long ago, this simple test is quite a good way to assess semantic knowledge. As previous research has indicated, the problem doesn’t seem to be one of memory retrieval, but one reflecting an actual loss or corruption of semantic knowledge. But perhaps, rather than a loss of data, it reflects a failure of selective attention/inhibition — an inability to inhibit immediate perceptual information in favor of more relevant conceptual information.

How much does this matter? Poor performance on the semantic distance task correlated with impaired ability to perform everyday tasks, accounting (together with delayed recall) for some 35% of the variance in everyday functioning scores — while other cognitive abilities, such as processing speed, executive function, verbal fluency, and naming, did not have a significant effect. Everyday functional capacity was assessed using a short form of the UCSD Skills Performance Assessment scale (a tool generally used to identify everyday problems in patients with schizophrenia), which presents scenarios such as planning a trip to the beach, determining a route, dialing a telephone number, and writing a check.

The finding indicates that semantic memory problems are starting to occur early in the deterioration, and may be affecting general cognitive decline. However, if the problems reflect an access difficulty rather than data loss, it may be possible to strengthen these semantic processing connections through training — and thus improve general cognitive processing (and ability to perform everyday tasks).


Autism therapy can normalize face processing

November, 2012

A small study shows that an intensive program to help young children with autism not only improves cognition and behavior, but can also normalize brain activity for face processing.

The importance of early diagnosis for autism spectrum disorder has been highlighted by a recent study demonstrating the value of an educational program for toddlers with ASD.

The study involved 48 toddlers (18-30 months) diagnosed with autism and age-matched normally developing controls. Those with ASD were randomly assigned to participate in a two-year program called the Early Start Denver Model, or a standard community program.

The ESDM program involved two-hour sessions with trained therapists twice a day, five days a week. Parent training also enabled ESDM strategies to be used during daily activities. The program emphasizes interpersonal exchange, social attention, and shared engagement. It also includes training in face recognition, using individualized booklets of color photos of the faces of four familiar people.

The community program involved evaluation and advice, annual follow-up sessions, programs at Birth-to-Three centers and individual speech-language therapy, occupational therapy, and/or applied behavior analysis treatments.

All of those in the ESDM program were still participating at the end of the two years, compared to 88% of the community program participants.

At the end of the program, children were assessed on various cognitive and behavioral measures, as well as brain activity.

Compared with children who participated in the community program, children who received ESDM showed significant improvements in IQ, language, adaptive behavior, and autism diagnosis. Average verbal IQ for the ESDM group was 95 compared to an average of 75 for the community group, and 93 vs 80 for nonverbal IQ. These are dramatic differences, although it must be noted that individual variability was high.

Moreover, for the ESDM group, brain activity in response to faces was similar to that of normally-developing children, while the community group showed the pattern typical of autism (greater activity in response to objects compared to faces). This was associated with improvements in social behavior.

Again, there were significant individual differences. Specifically, 73% of the ESDM group, 53% of the normally developing controls, and 29% of the community group showed a pattern of faster response to faces. (Bear in mind, regarding the controls, that these children are all still quite young.) It should also be borne in mind that it was difficult to get usable EEG data from many of the children with ASD — these results come from only 60% of the children with ASD.

Nevertheless, the findings are encouraging for parents looking to help their children.

It should also be noted that, although obviously earlier is better, the findings don’t rule out benefits for older children or even adults. Relatively brief targeted training in face recognition has been shown to affect brain activity patterns in adults with ASD.

Reference: 

Dawson, G., Jones, E. J. H., Merkle, K., Venema, K., Lowy, R., Faja, S., et al. (2012). Early behavioral intervention is associated with normalized brain activity in young children with autism. Journal of the American Academy of Child & Adolescent Psychiatry, 51(11), 1150–1159.


Cut ‘visual clutter’ to help MCI & Alzheimer’s

October, 2012

A small study shows that those with MCI perform poorly on a visual discrimination task under high interference conditions, suggesting that reducing interference may improve cognitive performance.

Memory problems in those with mild cognitive impairment may begin with problems in visual discrimination and vulnerability to interference — a hopeful discovery in that interventions to improve discriminability and reduce interference may have a flow-on effect to cognition.

The study compared the performance on a complex object discrimination task of 7 patients diagnosed with amnestic MCI, 10 older adults considered to be at risk for MCI (because of their scores on a cognitive test), and 19 age-matched controls. The task involved the side-by-side comparison of images of objects, with participants required to say, within 15 seconds, whether the two objects were the same or different.

In the high-interference condition, the objects were blob-like and presented as black-and-white line drawings, with some comparison pairs identical, while others varied only slightly in either shape or fill pattern. Objects were rotated to discourage a simple feature-matching strategy. In the low-interference condition, these line drawings were interspersed with color photos of everyday objects, for which discrimination was dramatically easier. The two conditions were separated by a short break, with the low-interference condition run in two blocks, before and after the high-interference condition.

A control task, in which the participants compared two squares that could vary in size, was run at the end.

The study found that those with MCI, as well as those at risk of MCI, performed significantly worse than the control group in the high-interference condition. There was no difference in performance between those with MCI and those at risk of MCI. Neither group was impaired in the first low-interference condition, although the at-risk group did show significant impairment in the second low-interference condition. It may be that they had trouble recovering from the high-interference experience. However, the degree of impairment was much less than it was in the high-interference condition. It’s also worth noting that the performance on this second low-interference task was, for all groups, notably higher than it was on the first low-interference task.

There was no difference between any of the groups on the control task, indicating that fatigue wasn’t a factor.

The interference task was specifically chosen as one that involved the perirhinal cortex, but not the hippocampus. The task requires the conjunction of features — that is, you need to be able to see the object as a whole (‘feature binding’), not simply match individual features. The control task, which required only the discrimination of a single feature, shows that MCI doesn’t interfere with this ability.

I do note that the amount of individual variability on the interference tasks was noticeably greater in the MCI group than the others. The MCI group was of course smaller than the other groups, but variability wasn’t any greater for this group in the control task. Presumably this variability reflects progression of the impairment, but it would be interesting to test this with a larger sample, and map performance on this task against other cognitive tasks.

Recent research has suggested that the perirhinal cortex may provide protection from visual interference by inhibiting lower-level features. The perirhinal cortex is strongly connected to the hippocampus and entorhinal cortex, two brain regions known to be affected very early in MCI and Alzheimer’s.

The findings are also consistent with other evidence that damage to the medial temporal lobe may impair memory by increasing vulnerability to interference. For example, one study has found that story recall was greatly improved in patients with MCI if they rested quietly in a dark room after hearing the story, rather than being occupied in other tasks.

There may be a working memory component to all this as well. Comparison of two objects does require shifting attention back and forth. This, however, is separate to what the researchers see as primary: a perceptual deficit.

All of this suggests that reducing “visual clutter” could help MCI patients with everyday tasks. For example, buttons on a telephone tend to be the same size and color, with the only difference lying in the numbers themselves. Perhaps those with MCI or early Alzheimer’s would be assisted by a phone with varying sized buttons and different colors.

The finding also raises the question: to what extent is the difficulty Alzheimer’s patients often have in recognizing a loved one’s face a discrimination problem rather than a memory problem?

Finally, the performance of the at-risk group — people who had no subjective concerns about their memory, but who scored below 26 on the MoCA (Montreal Cognitive Assessment — a brief screening tool for MCI) — suggests that vulnerability to visual interference is an early marker of cognitive impairment that may be useful in diagnosis. It’s worth noting that, across all groups, MoCA scores predicted performance on the high-interference task, but not on any of the other tasks.

So how much cognitive impairment rests on problems with interference?

References: 

Newsome, R. N., Duarte, A., & Barense, M. D. (2012). Reducing perceptual interference improves visual discrimination in mild cognitive impairment: Implications for a model of perirhinal cortex function. Hippocampus, 22, 1990–1999. doi:10.1002/hipo.22071

Della Sala, S., Cowan, N., Beschin, N., & Perini, M. (2005). Just lying there, remembering: Improving recall of prose in amnesic patients with mild cognitive impairment by minimising interference. Memory, 13, 435–440.


How exercise affects the brain, and who it benefits

June, 2012

New research indicates that the cognitive benefits of exercise depend on the gene variant you carry.

I’ve mentioned before that, for a few people, exercise doesn’t seem to have a benefit, and that the benefits of exercise for fighting age-related cognitive decline may not apply to those carrying the Alzheimer’s gene.

New research suggests there is another gene variant that may influence exercise’s effects. The new study follows on from earlier research that found that physical exercise during adolescence had more durable effects on object memory and BDNF levels than exercise during adulthood. In this study, 54 healthy but sedentary young adults (aged 18-36) were given an object recognition test before participating in either (a) a 4-week exercise program, with exercise on the final test day, (b) a 4-week exercise program, without exercise on the final test day, (c) a single bout of exercise on the final test day, or (d) remaining sedentary between test days.

Exercise both improved object recognition memory and reduced perceived stress — but only in one group: those who exercised for 4 weeks, including on the final day of testing. In other words, both regular exercise and recent exercise were needed to produce a memory benefit.

But there is one more factor — and this is where it gets really interesting — the benefit in this group didn’t happen for every member of the group. Only those carrying a specific genotype benefited from regular and recent exercise. This genotype has to do with the brain protein BDNF, which is involved in neurogenesis and synaptic plasticity, and which is increased by exercise. The BDNF gene comes in two flavors: Val and Met. Previous research has linked the less common Met variant to poorer memory and greater age-related cognitive decline.

In other words, it seems that the Met allele affects how much BDNF is released as a result of exercise, and this in turn affects cognitive benefits.

The object recognition test involved participants seeing a series of 50 images (previously selected as being highly recognizable and nameable), followed by a 15-minute filler task, before seeing 100 images (the previous 50 and 50 new images) and indicating which had been seen previously. The filler task involved surveys of state anxiety, perceived stress, and mood. On the first (pre-program) visit, a survey of trait anxiety was also completed.

Of the 54 participants, 31 carried two copies of the Val allele, and 23 had at least one Met allele (19 Val/Met; 4 Met/Met). The population frequency of carrying at least one Met allele is 50% in Asians, 30% in Caucasians, and 4% in African-Americans.

Although exercise decreased stress and increased positive mood, the cognitive benefits of exercise were not associated with mood or anxiety. Neither was genotype associated with mood or anxiety. However, some studies have found an association between depression and the Met variant, and this study is of course quite small.

A final note: this study is part of research looking at the benefits of exercise for children with ADHD. The findings suggest that genotyping would enable us to predict whether an individual — a child with ADHD or an older adult at risk of cognitive decline or impairment — would benefit from this treatment strategy.


How your hands affect your thinking

October, 2011

Two recent studies in embodied cognition show that hand movements and hand position are associated with less abstract thinking.

I always like studies about embodied cognition — that is, about how what we do physically affects how we think. Here are a couple of new ones.

The first study involved two experiments. In the first, 86 American college students were asked questions about gears in relation to each other. For example, “If five gears are arranged in a line, and you move the first gear clockwise, what will the final gear do?” The participants were videotaped as they talked their way through the problem. But here’s the interesting thing: half the students wore Velcro gloves attached to a board, preventing them from moving their hands. The control half were similarly prevented from moving their feet — giving them the same experience of restriction without the limitation on hand movement.

Those who gestured commonly used perceptual-motor strategies (simulation of gear movements) in solving the puzzles. Those who were prevented from gesturing, as well as those who chose not to gesture, used abstract, mathematical strategies much more often.

The second experiment confirmed the results with 111 British adults.

The findings are consistent with the hypothesis that gestures highlight and structure perceptual-motor information, and thereby make such information more likely to be used in problem solving.

That can be helpful, but not always. Even when we are solving problems that have to do with motion and space, more abstract strategies may sometimes be more efficient, and thus an inability to use the body may force us to come up with better strategies.

The other study is quite different. In this study, college students searched for a single letter embedded within images of fractals and other complex geometrical patterns. Some did this while holding their hands close to the images; others kept their hands in their laps, far from the images. This may sound a little wacky, but previous research has shown that perception and attention are affected by how close our hands are to an object. Items near our hands tend to take priority.

In the first experiment, eight randomly chosen images were periodically repeated 16 times, while the other 128 images were only shown once. The target letter was a gray “T” or “L”; the images were colorful.

As expected, finding the target letter was faster the more times the image had been presented. Hand position didn’t affect learning.

In the second experiment, a new set of students were shown the same single-presentation images, while 16 versions of the eight repeated images were created, varying in their color components. In this circumstance, learning was slower when hands were held near the images. That is, people found it harder to recognize the commonalities among identical but differently colored patterns, suggesting they were too focused on the details to see the similarities.

These findings suggest that processing near the hands is biased toward item-specific detail. This is in keeping with earlier suggestions that the improvements in perception and attention near the hands are item-specific. It may indeed be that this increased perceptual focus is at the cost of higher-order function such as memory and learning. This would be consistent with the idea that there are two largely independent visual streams, one of which is mainly concerned with visuospatial operations, and the other of which is primarily for more cognitive operations (such as object identification).

All this may seem somewhat abstruse, but it is worryingly relevant in these days of hand-held technological devices.

The point of both these studies is not that one strategy (whether of hand movements or hand position) is wrong. What you need to take away is the realization that hand movements and hand position can affect the way you approach problems, and the things you perceive. Sometimes you want to take a more physical approach to a problem, or pick out the fine details of a scene or object — in these cases, moving your hands, or holding something in or near your hands, is a good idea. Other times you might want to take a more abstract/generalized approach — in these cases, you might want to step back and keep your body out of it.


Negative gossip sharpens attention

July, 2011

Faces of people about whom something negative was known were perceived more quickly than faces of people about whom nothing, or something positive or neutral, was known.

Here’s a perception study with an intriguing twist. In my recent round-up of perception news I spoke of how images with people in them were more memorable, and of how some images ‘jump out’ at you. This study showed different images to each participant’s left and right eye at the same time, creating a contest between them. The amount of time it takes the participant to report seeing each image indicates the relative priority granted by the brain.

So, 66 college students were shown faces of people, and told something ‘gossipy’ about each one. The gossip could be negative, positive or neutral — for example, the person “threw a chair at a classmate”; “helped an elderly woman with her groceries”; “passed a man on the street.” These faces were then shown to one eye while the other eye saw a picture of a house.

The students had to press one button when they could see a face and another when they saw a house. As a control, some faces were used that the students had never seen. The students took the same length of time to register seeing the unknown faces and those about which they had been told neutral or positive information, but pictures of people about whom they had heard negative information registered around half a second quicker, and were looked at for longer.

A second experiment confirmed the findings, and showed that subjects looked at faces linked to negative gossip for longer than faces of people about whom they had heard of upsetting personal experiences.

Reference: 

[2283] Anderson, E., Siegel E. H., Bliss-Moreau E., & Barrett L F.
(2011).  The Visual Impact of Gossip.
Science. 332(6036), 1446 - 1448.


Role of expectation on memory consolidation during sleep

March, 2011

A new study suggests sleep’s benefits for memory consolidation depend on you wanting to remember.

Two experiments involving a total of 191 volunteers have investigated the parameters of sleep’s effect on learning. In the first experiment, people learned 40 pairs of words, while in the second experiment, subjects played a card game matching pictures of animals and objects, and also practiced sequences of finger taps. In both experiments, half the volunteers were told immediately after the tasks that they would be tested in 10 hours. Some of the participants slept during this time.

As expected, those who slept performed better on all the tests (word recall, visuospatial, and procedural motor memory), but the really interesting finding is that only those who slept and also knew a test was coming showed improved recall. These people showed greater brain activity during deep or "slow-wave" sleep, and for these people only, the greater the activity during slow-wave sleep, the better their recall.

Those who didn’t sleep, however, were unaffected by whether they knew there would be a test or not.

Of course, this doesn’t mean you never remember things you don’t intend or want to remember! There is more than one process going on in the encoding and storing of our memories. However, it does confirm the importance of intention, and cast light perhaps on some of your learning failures.

Reference: 

[2148] Wilhelm, I., Diekelmann S., Molzow I., Ayoub A., Mölle M., & Born J.
(2011).  Sleep Selectively Enhances Memory Expected to Be of Future Relevance.
The Journal of Neuroscience. 31(5), 1563 - 1569.


Better reading may mean poorer face recognition

January, 2011

Evidence that illiterates use a brain region involved in reading for face processing to a greater extent than readers do suggests that reading may have hijacked the network used for object recognition.

An imaging study of 10 illiterates, 22 people who learned to read as adults, and 31 who did so as children has confirmed that the visual word form area (involved in linking sounds with written symbols) showed more activation in better readers, although everyone had similar levels of activation in that area when listening to spoken sentences. More importantly, it also revealed that this area was much less active among the better readers when they were looking at pictures of faces.

Other changes in activation patterns were also evident (for example, readers showed greater activation in the planum temporale in response to speech), and most of the changes occurred even among those who acquired literacy in adulthood — showing that this brain restructuring doesn’t depend on a particular time-window.

The finding of competition between face and word processing is consistent with the researcher’s theory that reading may have hijacked a neural network used to help us visually track animals, and raises the intriguing possibility that our face-perception abilities suffer in proportion to our reading skills.

