The bow and arrow have long been regarded as a possible indicator of culture in prehistoric times. Bows and arrows appear to have been in use for some 64,000 years, given evidence from South Africa. Until recently, their significance in human cognitive ability was unclear. Now two researchers have been able to decode the conceptual foundations of the bow and arrow. The results of the study, by Miriam Haidle of the Heidelberg Academy's ROCEEH project (sponsored by the Senckenberg Research Institute) and the University of Tübingen and Marlize Lombard of the University of Johannesburg, appear in the latest edition of the Cambridge Archaeological Journal.
Using archaeological finds and ethnological parallels, the two researchers reconstructed the steps needed to make a bow and arrows. These are complementary tools -- separate, but developed interdependently. The bow is the controlling element, while the arrows can be used more flexibly and are interchangeable. About 2.5 million years ago, humans first used tools to make other tools; later they made tools assembled from different parts to form a unit with particular qualities, such as wooden spears with stone spearheads (ca. 200,000-300,000 years ago). The bow and arrow and other complementary tool sets made it possible for prehistoric humans to greatly increase the flexibility of their reactions.
There are many basic complementary tool sets: needle and thread, fishing rod and line, hammer and chisel. The bow and arrow are a particularly complex example. The reconstruction of the technique shows that no fewer than ten different tools are needed to manufacture a simple bow and arrows with foreshafts. It takes 22 raw materials, three semi-finished goods (binding materials, multi-component glue) and five production phases to make a bow, and further steps to make the arrows to go with it. The study was able to show a high level of complexity in the use of tools at an early stage in the history of Homo sapiens.
The Heidelberg Academy of Sciences and Humanities project "The Role of Culture in Early Expansions of Humans" (ROCEEH) incorporates archaeologists, paleoanthropologists, palaeobiologists and geographers, working together to find out where the first humans arose, where they moved to in Africa and Eurasia, and why. The project covers the time between three million years ago and the last glacial maximum 20,000 years ago. The focus is on when, where, and in what form a changing climate, evolution and cultural development of early humans enabled them to expand the behavioral niche of a large primate within Africa and to find new roles outside of Africa. The University of Tübingen and the Senckenberg Research Institute in Frankfurt have been cooperating on this 20-year Heidelberg Academy project since 2008.
Preventing diabetes or delaying its onset has been thought to stave off cognitive decline -- a connection strongly supported by the results of a 9-year study led by researchers at the University of California, San Francisco (UCSF) and the San Francisco VA Medical Center.
Earlier studies have looked at cognitive decline in people who already had diabetes. The new study is the first to demonstrate that the greater risk of cognitive decline is also present among people who develop diabetes later in life. It is also the first study to link the risk of cognitive decline to the severity of diabetes.
The result is the latest finding to emerge from the Health, Aging, and Body Composition (Health ABC) Study, which enrolled 3,069 adults over 70 at two community clinics in Memphis, TN and Pittsburgh, PA beginning in 1997. All the patients provided periodic blood samples and took regular cognitive tests over time.
When the study began, hundreds of those patients already had diabetes. A decade later, many more of them had developed diabetes, and many also suffered cognitive decline. As described this week in Archives of Neurology, those two health outcomes were closely linked.
People who had diabetes at the beginning of the study showed a faster cognitive decline than people who developed it during the course of the study -- and these people, in turn, tended to be worse off than people who never developed diabetes at all. The study also showed that patients with more severe diabetes who did not control their blood sugar levels as well suffered faster cognitive declines.
"Both the duration and the severity of diabetes are very important factors," said Kristine Yaffe, MD, the lead author of the study. "It's another piece of the puzzle in terms of linking diabetes to accelerated cognitive aging."
An important question for future studies, she added, would be to ask if interventions that would effectively prevent, delay or better control diabetes would also lower people's risk of cognitive impairment later in life.
Yaffe is the Roy and Marie Scola Endowed Chair of Psychiatry; professor in the UCSF departments of Psychiatry, Neurology and Epidemiology and Biostatistics; and Chief of Geriatric Psychiatry and Director of the Memory Disorders Clinic at the San Francisco VA Medical Center.
Diabetes and Cognitive Decline
Diabetes is a chronic and complex disease marked by high levels of sugar in the blood that arise due to problems with the hormone insulin, which regulates blood sugar levels. It is caused by an inability to produce insulin (type 1) or an inability to respond correctly to insulin (type 2).
The human brain can recognize thousands of different objects, but neuroscientists have long grappled with how the brain organizes object representation; in other words, how the brain perceives and identifies different objects. Now researchers at the MIT Computer Science and Artificial Intelligence Lab (CSAIL) and the MIT Department of Brain and Cognitive Sciences have discovered that the brain organizes objects based on their physical size, with a specific region of the brain reserved for recognizing large objects and another reserved for small objects. Their findings, to be published in the June 21 issue of Neuron, could have major implications for fields like robotics, and could lead to a greater understanding of how the brain organizes and maps information.
"Prior to this study, nobody had looked at whether the size of an object was an important factor in the brain's ability to recognize it," said Aude Oliva, an associate professor in the MIT Department of Brain and Cognitive Sciences and senior author of the study.
"It's almost obvious that all objects in the world have a physical size, but the importance of this factor is surprisingly easy to miss when you study objects by looking at pictures of them on a computer screen," said Dr. Talia Konkle, lead author of the paper. "We pick up small things with our fingers, we use big objects to support our bodies. How we interact with objects in the world is deeply and intrinsically tied to their real-world size, and this matters for how our brain's visual system organizes object information."
As part of their study, Konkle and Oliva took 3D scans of brain activity during experiments in which participants were asked to look at images of big and small objects or visualize items of differing size. By evaluating the scans, the researchers found that there are distinct regions of the brain that respond to big objects (for example, a chair or a table), and small objects (for example, a paperclip or a strawberry).
By looking at the arrangement of the responses, they found a systematic organization of big to small object responses across the brain's cerebral cortex. Large objects, they learned, are processed in the parahippocampal region of the brain, an area located by the hippocampus, which is also responsible for navigating through spaces and for processing the location of different places, like the beach or a building. Small objects are handled in the inferior temporal region of the brain, near regions that are active when the brain has to manipulate tools like a hammer or a screwdriver.
The work could have major implications for the field of robotics, in particular in developing techniques for how robots deal with different objects, from grasping a pen to sitting in a chair.
"Our findings shed light on the geography of the human brain, and could provide insight into developing better machine interfaces for robots," said Oliva.
Many computer vision techniques currently focus on identifying what an object is without much guidance about the size of the object, which could be useful in recognition. "Paying attention to the physical size of objects may dramatically constrain the number of objects a robot has to consider when trying to identify what it is seeing," said Oliva.
The study's findings are also important for understanding how the organization of the brain may have evolved. The work of Konkle and Oliva suggests that the human visual system's method for organizing thousands of objects may also be tied to human interactions with the world. "If experience in the world has shaped our brain organization over time, and our behavior depends on how big objects are, it makes sense that the brain may have established different processing channels for different actions, and at the center of these may be size," said Konkle.
Researchers who examined the link between metabolic syndrome and cognitive disorders have stressed the need for new lines of research to identify effective therapeutic targets.
No effective treatments are currently available for the prevention or cure of Alzheimer's disease (AD), the most frequent form of dementia in the elderly.
The most recognized risk factors, advancing age and having the apolipoprotein E ε4 allele, cannot be modified or treated. Increasingly, scientists are looking toward other risk factors to identify preventive and therapeutic strategies. Much attention recently has focused on the metabolic syndrome (MetS), with a strong and growing body of research suggesting that metabolic disorders and obesity may play a role in the development of dementia.
Now, a new supplement to the Journal of Alzheimer's Disease has provided a state-of-the-art assessment of research into the link between metabolic syndrome and cognitive disorders.
The supplement is guest edited by Vincenza Frisardi, of the Department of Neurological and Psychiatric Sciences, University of Bari, and the Geriatric Unit and Gerontology-Geriatrics Research Laboratory, IRCCS, Foggia, Italy, and Bruno P. Imbimbo, Research and Development Department, Chiesi Farmaceutici, Parma, Italy.
The prevalence of MetS and obesity has increased over the past several decades. MetS is a cluster of vascular and metabolic risk factors including obesity, hypertension, an abnormal cholesterol profile, and impaired blood glucose regulation.
"Although molecular mechanisms underlying the relationship between MetS and neurological disorders are not fully understood, it is becoming increasingly clear that cellular and biochemical alterations observed in MetS may represent a pathological bridge between MetS and various neurological disorders," explained Dr. Frisardi.
Type 2 diabetes (T2D) has been linked with cognitive impairment in a number of studies. The risk for developing both T2D and AD increases proportionately with age, and evidence shows that individuals with T2D have a nearly twofold higher risk of AD than nondiabetic individuals.
Paula I. Moreira, of the Faculty of Medicine and Center for Neuroscience and Cell Biology, University of Coimbra, Portugal, outlines some of the likely mechanisms.
Both AD and T2D present similar abnormalities in the mitochondria, the organelles that play a pivotal role in cellular energy processes; these abnormalities impair the cell's ability to regulate oxidation. Human amylin, a peptide that forms deposits in the pancreatic cells of T2D patients, shares several properties with the amyloid-beta plaques of the Alzheimer's brain.
Insulin resistance is another feature shared by both disorders. Impairment of insulin signalling is directly involved in the development of tau tangles and amyloid beta plaques.
"Understanding the key mechanisms underlying this deleterious interaction may provide opportunities for the design of effective therapeutic strategies," Dr. Moreira noted. (ANI)
Computers, which are pushing humans to unprecedented levels of work achievement and productivity, can now be used to monitor "cognitive overload" at the desk. Workers suffering from mental overload stop accepting new information, become emotional and stressed, make more mistakes and their reasoning is impaired.
However, technology developed at the information and communications technology research centre NICTA monitors voice signals humans can't hear.
The voice's resonant frequency through the vocal tract can be picked up via desktop microphone headsets and measured, allowing employers to tell in real time when their workers' brains are being overtaxed and, if necessary, intervene to avert disaster.
The technology, marketed as BrainGauge, is being sold as a recruitment tool for use in call centers.
Usually, less than 1 percent of calls in call centres can be manually reviewed by management, explained Bruce Whitby, managing director of BrainGauge.
"But with the technology you can do 100 per cent screening of the calls, and highlight the calls with a very high cognitive load," the Sydney Morning Herald quoted Whitby as saying.
A 15-minute web-based assessment analyses candidate overload when faced with specific tasks. "By selecting the right candidate in the first place, you can save the money of training people who eventually will leave anyway. So you get a more stable workforce which will serve the customer [better] in the long run," said Fang Chen, BrainGauge research group general manager.
You're watching your weight, you've quit smoking and you've begun eating a heart-healthy diet -- but that still might not be enough to get you approved for life insurance.
Physical health is just one part of the equation when you get to your golden years, as life insurance companies now often test your cognitive abilities as well as your physical fitness.
For years, testing for cognitive impairment has been standard procedure if you're applying for long-term care insurance, says Ray Dinstel, senior vice president for life and long-term care insurance underwriting at Genworth Financial.
Now, the procedure has been expanded to include those applying for life insurance if they're older than a certain age. At Genworth, testing for cognitive impairment is standard for anyone who is age 70 and above. But other life insurance companies start testing for those who are 60, while still other insurers begin testing at age 80, says Dinstel.
Research repeatedly has shown that "cognitive impairment leads to early mortality," Dinstel says.
A report published by the Alzheimer's Association states more than 60 percent of those who have Alzheimer's disease at age 70 are expected to die before the age of 80, compared to 30 percent of those who don't have the disease.
Alzheimer's disease is the sixth-leading cause of death in the United States. While deaths from other diseases, such as breast cancer and heart disease, fell between 2000 and 2008, deaths from Alzheimer's disease soared 66 percent, according to the report.
Those statistics "may have raised the red flag for insurance carriers," says Brian Ashe, treasurer of the board of directors of the Life and Health Insurance Foundation for Education (LIFE), which aims to help consumers make wise insurance choices.
Thanks for the memories: word recall test for life insurance
With Genworth, a delayed word recall test is used to assess cognitive function. You meet face-to-face with a tester who gives you a list of 10 words. Later in the meeting, you're asked to recall as many of the words as you can. Dinstel says well over 90 percent of those who take the test pass.
Life insurance companies "start worrying" if someone recalls fewer than five words, Dinstel says, but "they don't necessarily take negative action." They'll also check to see if you're employed or active in the community.
"We're very careful not to take action based on one piece of information," Dinstel says.
"If they're employed or active in the community, the person is probably fine, they just probably didn't do well in our screening," he says.
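As a rough sketch of how such a screen behaves, the pass/fail logic can be expressed in a few lines. Everything here except the 10-word list size and the five-word concern level is invented for illustration; it is not Genworth's actual scoring procedure.

```python
# Hypothetical sketch of a delayed word-recall screen. The 10-word list is
# invented for illustration; the five-word threshold is the "start worrying"
# level described above, not a published underwriting rule.
WORD_LIST = ["apple", "river", "candle", "wagon", "forest",
             "button", "ladder", "ocean", "pencil", "garden"]

def recall_score(recalled):
    """Number of list words the applicant repeated back later in the meeting."""
    return len(set(WORD_LIST) & {w.lower() for w in recalled})

def flags_further_review(recalled, threshold=5):
    """True if the score is low enough that an insurer would look closer."""
    return recall_score(recalled) < threshold

print(flags_further_review(["apple", "river", "candle", "wagon"]))  # True (4 of 10)
print(flags_further_review(["apple", "river", "ocean", "pencil", "garden"]))  # False (5 of 10)
```

Even in this toy version, a low score only flags the applicant for a closer look; as Dinstel notes, insurers weigh it against factors like employment and community activity before taking any action.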
Life insurance Rx: Buy early, doctor's note nixes consideration
But even being active and employed doesn't necessarily mean someone will be approved for life insurance, says Ashe, who is also president of the insurance agency Brian Ashe and Associates Ltd. in Lisle, Ill.
Ashe has a client who is 73. The man works and is active in the community. Because the client has a couple of physical problems, Ashe made a preliminary inquiry with a life insurance company to see if he'd be approved.
While the client didn't take a cognitive impairment test, his medical records regarding his physical issues included doctor's notes that made mention of concern over the man's cognitive abilities.
Because of those comments, 13 life insurers have denied the man's life insurance application, Ashe says.
Ashe says that he wouldn't be surprised if in the future, life insurance applications contain questions about family members' cognitive abilities, as they do today about issues like cancer and heart disease. He cites the addition of questions about HIV/AIDS to life insurance applications following the start of the AIDS epidemic.
"If cognitive disorders become a huge factor, I expect more questions will be asked," he says.
There's been an uptick in the number of older people applying for life insurance, Ashe says. With longer life expectancies, many people worry that if they die, their spouse will run out of money. Others may be looking to make up losses suffered in the stock market decline.
To Ashe, the concern over cognitive impairment is a good reason why consumers should purchase life insurance when they are young and healthy, and rates are lower. "They ought to get as much life insurance as they can."
Is coffee a cure-all for chronic disease? Previous studies have tied drinking coffee to protective benefits against Parkinson's disease, stroke, diabetes, heart disease and some cancers.
In a new study of the effects of soy supplements for postmenopausal women, researchers at the Stanford University School of Medicine and the USC Keck School of Medicine found no significant differences -- positive or negative -- in overall mental abilities between those who took supplements and those who didn't.
While questions have swirled for years around a possible link between soy consumption and changes in cognition, this research offers no evidence to support such claims. "There were no large effects on overall cognition one way or another," said the study's lead author, Victor Henderson, MD, professor of health research and policy and of neurology and neurological sciences at Stanford.
The findings from the 2.5-year study in middle-aged and older women, which was larger and longer than any previous trials on soy use, appear in the June 5 issue of Neurology, the medical journal of the American Academy of Neurology. The results are in line with the largest previous study in this area: a 12-month trial of Dutch women during which daily soy intake showed "no significant effect on cognitive endpoints." That work was published in a 2004 issue of the Journal of the American Medical Association.
Still, there are a number of randomized clinical trials on soy's effect on cognition and memory in women that have presented conflicting takes about its benefits and harms. While improved cognition was seen in some findings, other research suggested that soy could have an adverse effect on memory.
Soy and soy-based products contain estrogen-like compounds called isoflavones, and some women choose to take soy supplements as an alternative to estrogen. It has been thought that isoflavones might be able to boost memory and perhaps overall brain function. The hippocampus, the part of the brain that controls memory, is rich in estrogen beta receptors, and isoflavones are known to activate these receptors.
Henderson's interest in the matter is part of his broader research agenda on finding new strategies to improve cognitive function in aging.
For this work, he and his colleagues conducted the National Institutes of Health-sponsored Women's Isoflavone Soy Health Trial, which was done between 2004 and 2008 to determine the effect of soy isoflavones on the progression of atherosclerosis and, secondarily, the effect on cognition. During this study, 350 healthy women ages 45-92 were randomized to receive daily 25 grams of isoflavone-rich soy protein (a dose comparable to that of traditional Asian diets) or a placebo. A battery of neuropsychological tests was given to the participants at the start of the study and again 2.5 years later.
Cardinal Health today announced that it will manufacture and distribute Amyvid, Eli Lilly and Company's new diagnostic imaging agent that aids in the assessment of adult patients with cognitive impairment who are being evaluated for Alzheimer's Disease and other causes of cognitive decline. The commercial launch of Amyvid is scheduled for June 1, 2012.
Amyvid works by binding to amyloid plaques, one of the necessary pathological features of Alzheimer's Disease, and is detected using PET scan images of the brain. A negative Amyvid scan indicates sparse to no amyloid plaques are currently present, which is inconsistent with a neuropathological diagnosis of Alzheimer's Disease and reduces the likelihood that a patient's cognitive impairment is due to Alzheimer's Disease. A positive Amyvid scan indicates moderate to frequent amyloid plaques are present; this amount of amyloid plaque is present in patients with Alzheimer's Disease, but may also be present in patients with other types of neurologic conditions and in older people with normal cognition.
It is important to note that Amyvid is an adjunct to other diagnostic evaluations, and a positive Amyvid scan does not establish a diagnosis of Alzheimer's Disease or other cognitive disorder. Additionally, the safety and effectiveness of Amyvid have not been established for predicting development of dementia or other neurologic condition, or when used to monitor responses to therapies.
Radioactive agents like Amyvid are often likened to "shrinking ice cubes," because once they are manufactured, their radioactivity quickly decays. Amyvid loses over half of its radioactivity every two hours, so the sites that manufacture it need to be located in close proximity to hospitals and health centers that will use it.
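The "shrinking ice cube" behavior is ordinary exponential decay. As a rough illustration (not part of the announcement), the fraction of a dose's activity remaining in transit can be computed from the roughly 110-minute half-life of fluorine-18, the isotope Amyvid is labeled with:

```python
# Fraction of a dose's radioactivity remaining after `minutes` in transit.
# Assumes the ~110-minute half-life of fluorine-18, the isotope used to label
# Amyvid; this matches the article's "loses over half of its radioactivity
# every two hours".
HALF_LIFE_MIN = 110.0

def fraction_remaining(minutes: float) -> float:
    """Exponential decay: N(t) / N(0) = 0.5 ** (t / half_life)."""
    return 0.5 ** (minutes / HALF_LIFE_MIN)

print(round(fraction_remaining(120), 3))  # two hours in transit: ~0.469
print(round(fraction_remaining(240), 2))  # four hours: ~0.22
```

The arithmetic makes the siting constraint concrete: every additional two hours between manufacture and injection costs more than half of the remaining usable dose, which is why production sites must sit close to the hospitals and imaging centers they serve.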
Cardinal Health will manufacture and distribute Amyvid at seven sites throughout the U.S. beginning in June 2012, with a potential expansion to 12 locations by the end of the calendar year.