New research from Dr. Neal Barnard suggests that what you eat may make a difference in the onset of Alzheimer's.
Two new experiments, one involving people and the other animals, suggest that regular exercise can substantially improve memory, although different types of exercise seem to affect the brain quite differently. The news may offer consolation to the growing numbers of us who are entering age groups most at risk for cognitive decline.
It was back in the 1990s that scientists at the Salk Institute for Biological Studies in La Jolla, Calif., first discovered that exercise bulks up the brain. In groundbreaking experiments, they showed that mice given access to running wheels produced far more cells in an area of the brain controlling memory creation than animals that didn’t run. The exercised animals then performed better on memory tests than their sedentary labmates.
Since then, scientists have been working to understand precisely how, at a molecular level, exercise improves memory, as well as whether all types of exercise, including weight training, are beneficial.
The new studies provide some additional and inspiring clarity on those issues, as well as, incidentally, on how you can get lab rats to weight train.
For the human study, published in The Journal of Aging Research, scientists at the University of British Columbia recruited dozens of women ages 70 to 80 who had been found to have mild cognitive impairment, a condition that makes a person’s memory and thinking more muddled than would be expected at a given age.
Mild cognitive impairment is also a recognized risk factor for dementia. Seniors with the condition develop Alzheimer's disease at much higher rates than peers of the same age with sharper memories.
Earlier, the same group of researchers had found that after weight training, older women with mild cognitive impairment improved their associative memory, or the ability to recall things in context — a stranger’s name and how you were introduced, for instance.
Now the scientists wanted to look at more essential types of memory, and at endurance exercise as well. So they randomly assigned their volunteers to six months of supervised exercise. Some of the women lifted weights twice a week. Others briskly walked. And some, as a control measure, skipped endurance exercise and instead stretched and toned.
At the start and end of the six months, the women completed a battery of tests designed to assess their verbal and spatial memory. Verbal memory is, among other things, your ability to remember words, and spatial memory is your memory of where things were placed in space. Both deteriorate with age, a loss that's exaggerated in people with mild cognitive impairment.
And in this study, after six months, the women in the toning group scored worse on the memory tests than they had at the start of the study. Their cognitive impairment had grown.
But the women who had exercised, either by walking or weight training, performed better on almost all of the cognitive tests after six months than they had before.
There were, however, differences.
While both exercise groups improved almost equally on tests of spatial memory, the women who had walked showed greater gains in verbal memory than the women who had lifted weights.
What these findings suggest, the authors conclude, is that endurance training and weight training may have different physiological effects within the brain and cause improvements in different types of memory.
That idea tallies nicely with the results of the other recent study of exercise and memory, in which lab rats either ran on wheels or, to the extent possible, lifted weights. Specifically, the researchers taped weights to the animals’ tails and had them repeatedly climb little ladders to simulate resistance training.
After six weeks, the animals in both exercise groups scored better on memory tests than they had before they trained. But it was what was going on in their bodies and brains that was revelatory. The scientists found that the runners’ brains showed increased levels of a protein known as BDNF, or brain-derived neurotrophic factor, which is known to support the health of existing neurons and coax the creation of new brain cells. The rat weight-trainers’ brains did not show increased levels of BDNF.
The tail trainers, however, did have significantly higher levels of another protein, insulinlike growth factor, in their brains and blood than the runners did. This substance, too, promotes cell division and growth and most likely helps fragile newborn neurons to survive.
What all of this new research suggests, said Teresa Liu-Ambrose, an associate professor in the Brain Research Center at the University of British Columbia who oversaw the experiments with older women, is that for the most robust brain health, it’s probably advisable to incorporate both aerobic and resistance training.
It seems that each type of exercise “selectively targets different aspects of cognition,” she said, probably by sparking the release of different proteins in the body and brain.
But, she continued, no need to worry if you choose to concentrate solely on aerobic or resistance training, at least in terms of memory improvements. The differences in the effects of each type of exercise were subtle, she said, while the effects of exercise – any exercise – on overall cognitive function were profound.
“When we started these experiments,” she said, “most of us thought that, at best, we’d see less decline” in memory function among the volunteers who exercised, which still would have represented success. But beyond merely stemming people’s memory loss, she said, “we saw actual improvements,” an outcome that, if you’re waffling about exercising today, is worth remembering.
Adding more color to your diet in the form of berries is encouraged by many nutrition experts. The protective effect of berries against inflammation has been documented in many studies. Diets supplemented with blueberries and strawberries have also been shown to improve behavior and cognitive functions in stressed young rats.
To evaluate the protective effects of berries on brain function, specifically the brain's ability to clear toxic accumulations, researchers from the Human Nutrition Research Center on Aging at Tufts University and the University of Maryland, Baltimore County recently fed rats a berry diet for two months and then examined their brains after irradiation, a model for accelerated aging. The rats were divided into two groups: one was evaluated 36 hours after irradiation and the other 30 days after.
"After 30 days on the same berry diet, the rats experienced significant protection against radiation compared to control," said investigator Shibu Poulose, PhD. "We saw significant benefits to diets with both of the berries, and speculate it is due to the phytonutrients present."
The researchers looked at neurochemical changes in the brain, in particular what is known as autophagy, which can regulate the synthesis, degradation and recycling of cellular components. It is also the way in which the brain clears toxic accumulations. "Most diseases of the brain such as Alzheimer's and Parkinson's have shown an increased amount of toxic protein. Berries seem to promote autophagy, the brain's natural housekeeping mechanism, thereby reducing the toxic accumulation," said Poulose.
For many years, breast cancer patients have reported experiencing difficulties with memory, concentration and other cognitive functions following cancer treatment. Whether this mental "fogginess" is psychosomatic or reflects underlying changes in brain function has been a bone of contention among scientists and physicians.
Now, a new study led by Dr. Patricia Ganz, director of cancer prevention and control research at UCLA's Jonsson Comprehensive Cancer Center, demonstrates a significant correlation between poorer performance on neuropsychological tests and memory complaints in post-treatment, early-stage breast cancer patients -- particularly those who have undergone combined chemotherapy and radiation.
"The study is one of the first to show that such patient-reported cognitive difficulties -- often referred to as 'chemo brain' in those who have had chemotherapy -- can be associated with neuropsychological test performance," said Ganz, who is also a professor of health policy and management at UCLA's Fielding School of Public Health and a professor of medicine at the David Geffen School of Medicine at UCLA.
The study was published April 18 in the online edition of the Journal of the National Cancer Institute and will appear in an upcoming print edition of the journal.
Ganz and her colleagues looked at 189 breast cancer patients who enrolled in the study about one month after completing their initial breast cancer treatments and before beginning endocrine (hormone) therapy; 70 percent planned to undergo hormone therapy. Two-thirds of the women had had breast-conserving surgery, more than half had received chemotherapy, and three-quarters had undergone radiation therapy. The average age of study participants was 52.
Because cognitive complaints following cancer treatment have often been associated with anxiety and depressive symptoms, limiting confidence that "chemo brain" and similar difficulties reflect a cancer treatment toxicity, the researchers excluded women with serious depressive symptoms. They also took careful account of the cancer treatments used and whether or not menopause and hormonal changes could be influencing the cognitive complaints. A sample of age-matched healthy women who did not have breast cancer was used as a control group.
The researchers gave the women a self-report questionnaire and found that those with breast cancer reported, on the whole, more severe complaints than the healthy controls; 23.3 percent of the patients reported elevated complaints about their memory, and 19 percent reported elevated complaints about higher-level cognition (problem-solving, reasoning, etc.). Significantly, the breast cancer patients who reported more severe memory and higher-level cognition problems were more likely to have undergone both chemotherapy and radiation.
While earlier studies had not identified a consistent association between cognitive complaints and neuropsychological testing abnormalities, the UCLA research team found that even when patients reported subtle changes in their memory and thinking, neuropsychological testing showed detectable differences.
In particular, they discovered that poorer performance on the neuropsychological test was associated both with higher levels of cognitive complaints and with combined radiation-and-chemotherapy treatment, as well as with symptoms related to depression.
"In the past, many researchers said that we can't rely on patients' self-reported complaints or that they are just depressed, because previous studies could not find this association between neuropsychological testing and cognitive complaints," Ganz said. "In this study, we were able to look at specific components of the cognitive complaints and found they were associated with relevant neuropsychological function test abnormalities."
The findings are part of an ongoing study that seeks to examine the extent to which hormone therapy contributes to memory and thinking problems in breast cancer survivors, and this pre-hormone therapy assessment was able to separate the effects of initial treatments on these problems. Earlier post-treatment studies of breast cancer patients were difficult to interpret, as they included women already taking hormone therapy.
"As we provide additional reports on the follow-up testing in these women, we will track their recovery from treatment, as well as determine whether hormone therapy contributes to worsening complaints over time," Ganz said.
This research was supported by the National Cancer Institute and the Breast Cancer Research Foundation, and by funding from the National Institutes of Health to the Cousins Center for Psychoneuroimmunology.
Brain scans are increasingly able to reveal whether or not you believe you remember some person or event in your life. In a new study presented at a cognitive neuroscience meeting today, researchers used fMRI brain scans to detect whether a person recognized scenes from their own lives, as captured in some 45,000 images by digital cameras. The study is seeking to test the capabilities and limits of brain-based technology for detecting memories, a technique being considered for use in legal settings.
"The advancement and falling costs of fMRI, EEG, and other techniques will one day make it more practical for this type of evidence to show up in court," says Francis Shen of the University of Minnesota Law School, who is chairing a session on neuroscience and the law at a meeting of the Cognitive Neuroscience Society (CNS) in San Francisco this week. "But technological advancement on its own doesn't necessarily lead to use in the law." Still, as the technology has advanced and the legal system has sought more empirical evidence, neuroscience and the law have intersected more often than in previous decades.
In U.S. courts, neuroscientific evidence has been used largely in cases involving brain injury litigation or questions of impaired ability. In some cases outside the United States, however, courts have used brain-based evidence to check whether a person has memories of legally relevant events, such as a crime. New companies also are claiming to use brain scans to detect lies - although judges have not yet admitted this evidence in U.S. courts. These developments have rallied some in the neuroscience community to take a critical look at the promise and perils of such technology in addressing legal questions - working in partnership with legal scholars through efforts such as the MacArthur Foundation Research Network on Law and Neuroscience.
Recognizing your own memories
What inspired Anthony Wagner, a cognitive neuroscientist at Stanford University, to test fMRI uses for memory detection was a case in June 2008 in Mumbai, India, in which a judge cited EEG evidence as indicating that a murder suspect held knowledge about the crime that only the killer could possess. "It appeared that the brain data held considerable sway," says Wagner, who points out that the methods used in that case have not been subject to extensive peer review.
Since then, Wagner and colleagues have conducted a number of experiments to test whether brain scans can discriminate between stimuli that people perceive as old or new, as well as, more objectively, whether they have previously encountered a particular person, place or thing. To date, Wagner and colleagues have had success in the lab using fMRI-based analyses to determine whether someone recognizes a person or perceives them as unfamiliar, but not in determining whether they have in fact seen that person before.
In a new study presented today, his team sought to take the experiments out of the lab and into the real world by outfitting participants with digital cameras around their necks that automatically took photos of the participants' everyday experiences. Over a multi-week period, the cameras yielded 45,000 photos per participant.
Wagner's team then took brief photo sequences of individual events from the participants' lives and showed them to the participants in the fMRI scanner, along with photo sequences from other subjects as control stimuli. The researchers analyzed the participants' brain patterns to determine whether they recognized the sequences as their own. "We did quite well with most subjects, with a mean accuracy of 91% in discriminating between event sequences that the participant recognized as old and those that the participant perceived as unfamiliar," Wagner says. "These findings indicate that distributed patterns of brain activity, as measured with fMRI, carry considerable information about an individual's subjective memory experience - that is, whether or not they are remembering the event."
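The classification step can be sketched in miniature. Everything below is illustrative, not the authors' pipeline: simulated "voxel" values stand in for real fMRI data, and a simple nearest-centroid rule stands in for whatever classifier the team actually used. The point is only to show how decoding accuracy is measured against the 50% chance level.

```python
# Toy sketch of fMRI pattern classification. All values are simulated;
# a nearest-centroid rule stands in for the study's actual classifier.
import random

random.seed(0)

def simulate_trial(recognized, n_voxels=20, signal=0.8):
    # recognized=1 shifts each simulated voxel's mean slightly upward
    return [random.gauss(signal * recognized, 1.0) for _ in range(n_voxels)]

# Training trials: learn the mean activity pattern (centroid) per class.
train = [(lbl, simulate_trial(lbl)) for lbl in [0, 1] * 100]

def centroid(label):
    rows = [x for lbl, x in train if lbl == label]
    return [sum(col) / len(rows) for col in zip(*rows)]

c_new, c_old = centroid(0), centroid(1)

def classify(x):
    # Assign the trial to the nearer class centroid (squared distance).
    d_new = sum((a - b) ** 2 for a, b in zip(x, c_new))
    d_old = sum((a - b) ** 2 for a, b in zip(x, c_old))
    return 0 if d_new < d_old else 1

# Held-out trials: accuracy should land well above the 0.5 chance level.
test = [(lbl, simulate_trial(lbl)) for lbl in [0, 1] * 50]
acc = sum(classify(x) == lbl for lbl, x in test) / len(test)
print(f"decoding accuracy: {acc:.2f}")
```

With a real scanner the features would be thousands of voxels and the classifier would be cross-validated per subject, but the accuracy bookkeeping is the same.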
In another new study, Wagner and colleagues tested whether people can "beat the technology" by using countermeasures to alter their brain patterns. Back in the lab, the researchers showed participants individual faces and later asked them whether the faces were old or new. "Halfway through the memory test, we stopped and told them 'What we are actually trying to do is read out from your brain patterns whether or not you are recognizing the face or perceiving it as novel, and we've been successful with other subjects in doing this in the past. Now we want you to try to beat the system by altering your neural responses.'" The researchers instructed the participants to think about a familiar person or experience when presented with a new face, and to focus on a novel feature of the face when presented a previously encountered face.
"In the first half of the test, during which participants were just making memory decisions, we were well above chance in decoding from brain patterns whether they recognized the face or perceived it as novel. In the second half, however, we were unable to classify whether they recognized the face, or whether the face was objectively old or new," Wagner says. Within a forensic setting, Wagner says, it is conceivable that a suspect could use such countermeasures to try to mask the brain patterns associated with memory.
Wagner says that his work to date suggests that the technology may have some utility in reading out brain patterns in cooperative individuals but that the uses are much more uncertain with uncooperative individuals. However, Wagner stresses that the method currently does not distinguish well between whether a person's memory reflects true or false recognition. He says that it is premature to consider such evidence in the courts because many additional factors await future testing, including the effects of stress, practice, and time between the experience and the memory test.
Overgeneralizing the adolescent brain
A general challenge to the use of neuroscientific evidence in legal settings, Wagner says, is that most studies are at the group rather than the individual level. "The law cares about a particular individual in a particular situation right in front of them," he says, and the science often cannot speak to that specificity.
Indeed, B.J. Casey of the Weill Medical College of Cornell University says that we too often overgeneralize adolescents' lack of self-control. Although adolescents as a group do show poor self-control, some situations and individuals are more prone to this breakdown than others.
"It is not that teens can't make decisions, they can and they can do so efficiently," Casey says. "It is when they must make decisions in the heat of the moment - in presence of potential or perceived threats, among peers - that the court should consider diminished responsibility of teens while still holding them accountable for their behavior." Research suggests that this diminished ability is due to the immature development of circuitry involved in processing of negative or positive cues in the environment in the subcortical limbic regions and then in regulating responses to those cues in the prefrontal cortex.
The body of research to date is at the group-level, however, and is not yet able to comment on the neurobiological maturity of an individual adolescent. To help provide more guidance on this issue in legal settings, Casey and colleagues are working alongside legal scholars on a developmental imaging study, funded by the MacArthur Foundation, that is examining behaviors relevant to juvenile criminal behavior, including impulsivity and peer influence.
Making real-world connections
The same type of work - to connect brain imaging to particular behaviors in the real world - is ongoing in a number of other areas, including fMRI-based lie detection and linking negligence to specific mental states. "It's a big leap to go from a laboratory setting, in which impulse control may be measured by one's ability to not press a button in response to a stimulus, to the real world, where the question is whether someone had the requisite self-control not to tie up an innocent person and throw them off a bridge," Shen says. "I don't see neuroscience solving these big problems anytime soon, and so the question for law becomes: What do we do with this uncertainty? I think this is where we're at right now, and where we'll be for some time."
"With a few notable exceptions such as death penalty cases, cases where a juvenile is facing a very stiff sentence, and litigating brain injury claims, 'law and neuroscience' is not familiar to most lawyers," Shen says. "But this might change - and soon." The ongoing work is vital, he says, for laying a foundation for a future that's yet to come, and he hopes that more neuroscientists will increasingly collaborate with legal scholars.
Source: Cognitive Neuroscience Society
UCLA researchers have developed a new brain-imaging tool to help identify signs of cognitive decline early on in individuals who don't yet show symptoms of dementia.
The connection between stroke risk and cognitive decline has been well established by previous research. Individuals with higher stroke risk, as measured by factors like high blood pressure, have consistently performed worse on tests of memory, attention and abstract reasoning.
The current small study demonstrated that not only stroke risk, but also the burden of plaques and tangles, as measured by a UCLA brain scan, might influence cognitive decline.
The imaging tool used in the study was developed at UCLA and reveals early evidence of amyloid beta "plaques" and neurofibrillary tau "tangles" in the brain - the hallmarks of Alzheimer's disease.
The study demonstrated that taking both stroke risk and the burden of plaques and tangles into account may offer a more powerful assessment of factors determining how people are doing now and will do in the future.
"The findings reinforce the importance of managing stroke risk factors to prevent cognitive decline even before clinical symptoms of dementia appear," said first author Dr. David Merrill, an assistant clinical professor of psychiatry and biobehavioral sciences at the Semel Institute for Neuroscience and Human Behavior at UCLA.
This is one of the first studies to examine both stroke risk and plaque and tangle levels in the brain in relation to cognitive decline before dementia has even set in, Merrill said.
According to the researchers, the UCLA brain-imaging tool could prove useful in tracking cognitive decline over time and offer additional insight when used with other assessment tools.
For the study, the team assessed 75 people who were healthy or had mild cognitive impairment, a risk factor for the future development of Alzheimer's. The average age of the participants was 63.
The individuals underwent neuropsychological testing and physical assessments to calculate their stroke risk using the Framingham Stroke Risk Profile, which examines age, gender, smoking status, systolic blood pressure, diabetes, atrial fibrillation (irregular heart rhythm), use of blood pressure medications, and other factors.
In addition, each participant was injected with a chemical marker called FDDNP, which binds to deposits of amyloid beta plaques and neurofibrillary tau tangles in the brain. The researchers then used positron emission tomography (PET) to image the brains of the subjects - a method that enabled them to pinpoint where these abnormal proteins accumulate.
The study found that greater stroke risk was significantly related to lower performance in several cognitive areas, including language, attention, information-processing speed, memory, visual-spatial functioning (e.g., ability to read a map), problem-solving and verbal reasoning.
The researchers also observed that FDDNP binding levels in the brain correlated with participants' cognitive performance. For example, volunteers who had greater difficulties with problem-solving and language displayed higher levels of the FDDNP marker in areas of their brain that control those cognitive activities.
"Our findings demonstrate that the effects of elevated vascular risk, along with evidence of plaques and tangles, are apparent early on, even before vascular damage has occurred or a diagnosis of dementia has been confirmed," said the study's senior author, Dr. Gary Small, director of the UCLA Longevity Center and a professor of psychiatry and biobehavioral sciences who holds the Parlow-Solomon Chair on Aging at UCLA's Semel Institute.
Researchers found that several individual factors in the stroke assessment stood out as predictors of decline in cognitive function, including age, systolic blood pressure and use of blood pressure-related medications.
Small noted that the next step in the research would be studies with a larger sample size to confirm and expand the findings.
The study has been published in the April issue of the Journal of Alzheimer's Disease. (ANI)
Dementia care, including care for Alzheimer's disease and other conditions that cause cognitive losses, costs so much that it tops the care costs of the two leading causes of death in the United States, heart disease and cancer, according to a study funded by the National Institutes of Health and published today in the New England Journal of Medicine.
Costs of Dementia Care to Individuals and Health Care
Unlike heart disease and cancer, the two leading causes of death in the United States according to the U.S. Centers for Disease Control and Prevention, where the high costs of treatment stem from surgery, chemotherapy, radiation and ongoing medication, many of the costs associated with dementia fall on individuals and their families.
The yearly costs associated with dementia include out-of-pocket spending averaging $6,200 per person, outdistancing Medicare's cost of $2,700 per person. Informal home care was valued two ways: at $27,800 per person when caregiving time was priced at replacement cost, and at $13,200 when priced at forgone wages.
Although those individual costs may seem to pale beside annual nursing home costs of $13,000 per person with dementia, many people with dementia are cared for at home or in a family member's home before ever entering a nursing home. The grand totals per person were $56,300 for care purchased in the marketplace plus caregiving time valued at replacement cost, and $41,700 for marketplace care plus caregiving time valued at forgone wages.
All the costs were adjusted to 2010 dollars.
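The per-person figures are internally consistent, and a quick arithmetic check shows how the grand totals decompose. One caveat: the "unlisted" remainder below is our inference, since the article itemizes only some of the marketplace costs.

```python
# Arithmetic check on the per-person annual dementia figures (2010 dollars).
caregiving_replacement = 27_800   # informal care valued at replacement cost
caregiving_forgone = 13_200       # informal care valued at forgone wages
total_replacement = 56_300        # marketplace care + replacement-cost caregiving
total_forgone = 41_700            # marketplace care + forgone-wage caregiving

# Both grand totals imply the same marketplace-care figure:
market = total_replacement - caregiving_replacement
assert market == total_forgone - caregiving_forgone   # 28,500 either way

# The itemized components cover only part of that figure; the remainder
# is marketplace spending the article does not break out (an inference).
itemized = 6_200 + 2_700 + 13_000   # out-of-pocket + Medicare + nursing home
print(market, market - itemized)    # 28500 6600
```

In other words, the two grand totals differ only in how the same block of unpaid caregiving time is priced, with roughly $28,500 of purchased care underneath both.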
Comparison of Total Annual Costs for Dementia, Heart Disease and Cancer
Care purchased in the marketplace for 2010 totaled $109 billion for dementia, $102 billion for heart disease and $77 billion for cancer, reported the AP.
Statistics compiled by the NIH-funded study show that in 2010, total dementia care costs were $215 billion when caregiving time was valued at replacement cost and $159 billion when valued at forgone wages.
Estimated costs for each of the three cost areas -- marketplace, replacement costs and forgone wages -- for 2040 show that these total costs will jump to $259 billion, $511 billion and $379 billion, respectively.
Outlook for Dementia Care in the Future
Much of the care associated with dementia falls to patients' spouses and family members. As Richard Hodes, M.D., director of the National Institute on Aging, pointed out to WebMD.com, such care may face tougher times in the future because younger baby boomers have smaller families. That will mean fewer family members to care for a rapidly aging population, among whom the incidence of dementia increases with the years.
The National Alzheimer's Project Act, signed into law in January 2011, is an attempt to approach Alzheimer's disease and dementia in general in a proactive manner, an effort the Alzheimer's Association has already begun to build on with its National Alzheimer's Plan.
Individuals and families need to plan now for what the future may hold should dementia strike an older loved one. Important documents such as advance directives and financial plans should be drawn up before a dementia diagnosis, preserving control over these important decisions and lessening the burden on family members.
A new study finds risk factors for heart disease and stroke are an effective method to predict future declines in cognitive abilities, or memory and thinking.
Researchers were surprised that the cardiovascular risk factors appeared to predict cognitive decline better than a risk score designed specifically to predict dementia.
“This is the first study that compares these risk scores with a dementia risk score to study decline in cognitive abilities 10 years later,” said Sara Kaffashian, Ph.D., with the French National Institute of Health and Medical Research.
The study involved 7,830 men and women with an average age of 55. Risk of heart disease and stroke (cardiovascular disease) and risk of dementia were calculated for each participant at the beginning of the study.
The heart disease risk score included the following risk factors: age, blood pressure, treatment for high blood pressure, high density lipoprotein (HDL) cholesterol, total cholesterol, smoking, and diabetes.
The stroke risk score included age, blood pressure, treatment for high blood pressure, diabetes, smoking, history of heart disease, and presence of cardiac arrhythmia (irregular heart beat).
The dementia risk score included age, education, blood pressure, body mass index (BMI), total cholesterol, exercise, and whether a person had the APOE ε4 gene, a gene associated with dementia.
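Risk scores of this kind are typically computed as a weighted sum of the component factors. The sketch below is purely illustrative: the weights and factor encodings are made-up placeholders, not the published Framingham or dementia-score point tables, and only show the mechanics of turning a factor list into a single number.

```python
# Illustrative only: hypothetical weights, NOT the published point tables.
def risk_score(factors, weights):
    """Sum weighted factor values into one score (higher = higher risk)."""
    return sum(weights[name] * value for name, value in factors.items())

# Made-up weights for the stroke-score factors listed above.
weights = {"age_over_55": 3, "high_bp": 2, "bp_treatment": 1,
           "diabetes": 2, "smoker": 2, "heart_disease": 2, "arrhythmia": 2}

# A hypothetical participant: 1 = factor present, 0 = absent.
person = {"age_over_55": 1, "high_bp": 1, "bp_treatment": 0,
          "diabetes": 0, "smoker": 1, "heart_disease": 0, "arrhythmia": 0}

print(risk_score(person, weights))  # 3 + 2 + 2 = 7
```

The real instruments assign graded points (for example, by blood-pressure band and age bracket) and map the total to a probability, but the additive structure is the same.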
Memory and thinking abilities were measured three times over 10 years.
The study found that all three risk scores predicted 10-year decline in multiple cognitive tests.
However, heart disease risk scores showed stronger links with cognitive decline than a dementia risk score. Both heart and stroke risk were associated with decline in all cognitive tests except memory; dementia risk was not linked with decline in memory and verbal fluency.
“Although the dementia and cardiovascular risk scores all predict cognitive decline starting in late middle age, cardiovascular risk scores may have an advantage over the dementia risk score for use in prevention and for targeting changeable risk factors, since they are already used by many physicians,” Kaffashian said.
Experts also believe the findings show that high cholesterol and high blood pressure not only increase the risk of heart disease but also have a negative impact on cognitive abilities.
The European study is published in Neurology.