Gladiators' brain Exercise - Play
During the Roman Empire, gladiatorial combat became increasingly popular as entertainment, reaching a peak in the 2nd and 3rd centuries A.D. Especially spectacular were the Secular Games honoring the 1,000th anniversary of the (supposed) founding of Rome, celebrated in the year 247 by the Emperor Philip 'the Arab' (who is still featured on the currency of Syria) at the Colosseum.
However, as the Empire suffered invasions and financial crises, the games diminished in stature. Some argue that the Emperor Theodosius ended the games in 388; however, games were still exhibited in the Colosseum after the "Fall" of the Empire by Theodoric, the Gothic king of Italy, acting as representative for the Byzantine Emperor, who still claimed to rule all of the former Roman lands in the West. While the personal-combat aspect of the games declined, chariot racing in the Hippodrome of Constantinople, modeled after the Circus Maximus in Rome, continued until at least the 10th and 11th centuries.
There is some controversy over the severity of the combat. The traditional view of scholars was that the grisly fighting continued until the death or severe injury of one of the combatants. Recently, however, scholars have revised their view of single-combat gladiatorial games, placing them closer to a more dangerous version of professional wrestling. Combatants were to engage in dramatic combat with intent to injure, but not to kill or mortally wound, their opponents.
The reason is the expense of training, equipping, and promoting each gladiator in a precious-metal-based monetary economy. A career cut short would constitute a bad investment. Also, each match raised large amounts of money in wagering, and follow-up rematches would raise even more. Therefore, there was a significant incentive for owners to amortize their investments, and profits, over a predictable term, so death in the top tier of performers would have been a fairly rare event.
These restrictions would not apply to the lowest strata of performers - condemned criminals, prisoners of war, and beasts - who in effect participated in mass public executions couched as entertainment.
Labels: gladiators brain gym
Three related ideas: your health, your genes, and your brain speed. Brain speed might be better termed "neural conduction velocity."
If your body were a PC or a hi-def tuner, the assessment might be called 'bandwidth.' It is a measure of how quickly your brain can process symbolic functions. It is no coincidence that brain speed is correlated with IQ. The kind of exercise that measures 'brain speed' - known as ECTs, or elementary cognitive tasks - was viewed initially as another, possibly superior, version of the IQ test.
For example, early IQ tests, and even the Stanford-Binet test, had the well-known cultural limitations apparent in such constructs as the Scholastic Aptitude Test (SAT), which favored white, upper-middle-class students with a suburban frame of reference. The closer you were to this archetype, the more likely you were to do well. Take the SAT, designed for suburban kids in Princeton, New Jersey, and give it to a Dinka tribesman of similar age in Sudan. Obviously the Dinka would barely be able to write his name on such a test and would not understand any of the associations and analogies. Even the quantitative-reasoning section would be biased, and translating the test would not improve the outcome.
Now take an extremely simple measure without words, or with a simple yes/no option - such as how quickly you can push a button when a light comes on or a symbol flashes - and you have an instrument that can be used across cultures, and in some cases even across species. The result is actionable information that can be correlated with human genetic types and variations and with disease and health conditions, and used to create a personalized plan for optimization, including greater longevity and cognitive ability.
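As a minimal sketch, a simple reaction-time trial of this kind can be timed in a few lines of Python. The delay range and overall structure here are illustrative assumptions, not any standardized test:

```python
import random
import time

def timed_response(respond, delay=None):
    """Flash a 'stimulus' after an unpredictable delay and time the response.
    `respond` is any blocking callable, e.g. waiting for a keypress."""
    time.sleep(delay if delay is not None else random.uniform(1.0, 3.0))
    start = time.perf_counter()
    respond()                      # the 'button press'
    return time.perf_counter() - start

def mean_reaction_time(trials):
    """Average a list of per-trial reaction times, in seconds."""
    return sum(trials) / len(trials)

# A real session would block on a keypress, e.g.:
#   times = [timed_response(lambda: input("*** PRESS ENTER ***"))
#            for _ in range(5)]
```

Because nothing in the trial depends on language or schooling, the same few lines would work for the suburban teenager and the Dinka tribesman alike.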
In other words, the process is now coming into place for us to make a major jump in evolution and perceptive ability - which is not necessarily the same thing as our ability to design and build devices or machines that 'simplify' our lives. Going back into human history, there was a time when humans could understand symbols as a form of communication and expression despite having only elementary manipulative technology to alter their environment. To some extent, perception of symbolic meaning may be dampened in direct proportion to our reliance on manipulative technologies (tools), which reduce our ability to visualize by filling our perceptive fields (sight, sound, smell, and more) with a distracting level of activity that serves no purpose other than justifying its own existence.
By improving human bandwidth, or brain speed, with repetitive symbolic exercise, we train ourselves to develop more flexible, effective, and enhanced cognitive abilities - a benchmark of health, longevity, and clear thinking into the future.
Science moves closer to Artificial life, not just Cloning
J. Craig Venter, he-of-the-Phosphorescent-blue-eyes, works on Artificial life of a sort (research to be published in Science)
Scientists have taken a first step toward making synthetic life by transferring genetic material from one bacterium into another, transforming the second microbe into a copy of the first.
They intend to use their technique to custom-design bacteria to perform functions such as producing artificial fuel or cleaning up toxic waste, the researchers report in Friday's issue of the journal Science.
"This is equivalent to changing a Macintosh computer to a PC by inserting a new piece of software," Craig Venter, a genome pioneer who now heads his own institute in Rockville, Maryland, told reporters in a telephone briefing.
"I think eventually we could make artificial cells," Venter added. "This is a first step."
Venter has been trying for years to create a microbe from scratch. This is not quite it, but his team re-programmed one species of bacteria by adding in the genetic material from a closely related species.
They gene-engineered the replacement chromosome to resist an antibiotic and then flooded their experiment with the drug. The bacteria that survived all carried only the genes that had been spliced in.
They believe all the others simply died, but they are in fact not sure how the new DNA re-programmed some of the bacteria or what happened to the original DNA.
"I think that we don't know for certain how the donor genome takes over," Venter Institute researcher Ham Smith told reporters.
Nonetheless, Venter's team has applied for a patent on the process and they hope to exploit it industrially. Venter believes it will be relatively straightforward to build a new chromosome from scratch, one that performs the desired functions, to create a custom-made bacterium.
"What we are reporting in this Science paper is not anything about a synthetic organism," Venter said.
BOOTING UP LIFE
"It's a key enabling step so that once we have a synthetic chromosome we know it is now possible to boot that up. So synthetic biology itself and synthetic genomics is much closer to being proven," Venter added.
"We look forward to having fuels from genetically modified organisms within the next decade and perhaps in half that time."
The key to the experiment was using a very simple bacterium called Mycoplasma capricolum, which often infects goats. Unlike cells from more complex organisms, bacteria do not have a nucleus.
The research team injected a chromosome from a related species called Mycoplasma mycoides.
They do not know how well it worked but at least some of the M. capricolum were transformed into what looked and acted like M. mycoides.
The scientists concede it will be much more difficult to do this with more complex organisms - even bacteria that have cell walls and all sorts of defensive mechanisms to keep out foreign DNA.
A non-profit Canadian organization called the ETC Group expressed concern about the experiment and Venter's patent application. "We are extremely concerned about the breadth and implications of this patent and of its monopoly claims," the group's Jim Thomas said in an e-mail.
"We will be requesting that patent offices worldwide refuse this patent."
But Venter defended the patent. "At every stage of what the team has done here over the past several years, we have had to develop novel technologies and approaches that have not existed before because the field has not existed before," he said.
Using the early-warning network and data provided by the Japan Meteorological Agency via the Internet, the appliance sounds a loud countdown of up to 20 seconds before the moment the tremor begins.
Security firm SunShine says this should give people enough time to hide under tables, turn off gas and fire sources, or even just to move away from potentially dangerous furniture.
Starting in October, the JMA warnings will also be broadcast on television and radio, and sent to mobile phones equipped to receive them, which will go on sale later this year.
But the company hopes that its EQGuard, which will also be available in October, will help people who just happened not to be watching television.
"There are 51 million households in Japan, and we expect this system to catch on with at least 20 percent of the households," said Kazuo Sasaki, SunShine's president.
Japan accounts for about 20 percent of the world's earthquakes of magnitude 6 or greater.
The appliance sends alerts once it detects primary waves - the first waves of an earthquake, which do not cause major rattling but travel faster than the secondary waves responsible for the actual shaking.
The alerts could precede the shaking by 10 to 20 seconds, though the period would be much shorter--and in some cases absent--if the tremor's center is near.
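The arithmetic behind that lead time is straightforward. A minimal sketch, using typical crustal wave speeds (illustrative values, not the JMA's):

```python
# Typical crustal wave speeds (illustrative assumptions):
V_P = 7.0   # primary (P) waves, km/s - fast, little shaking
V_S = 4.0   # secondary (S) waves, km/s - slower, cause the damage

def lead_time_seconds(distance_km, vp=V_P, vs=V_S):
    """Seconds between P-wave arrival and S-wave arrival at a site
    `distance_km` from the tremor's center."""
    return distance_km / vs - distance_km / vp

# The farther the center, the longer the warning; nearby tremors
# leave almost no time, as noted above.
for d in (20, 100, 200):
    print(f"{d:>4} km -> ~{lead_time_seconds(d):.1f} s of warning")
```

With these speeds, a quake 100 km away yields roughly 10 seconds of warning, consistent with the 10-to-20-second window described above.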
Read more (CNET)
A Nearby Death Star - Eta Carinae?
Keep your brain active by pondering this scenario. In the U.S., our greatest risk of catastrophic natural disaster is probably the Yellowstone caldera, a gigantic simmering volcano capable of ending life in the western U.S. and changing weather worldwide for a number of years, possibly launching an ice age. It is highly active: in recent years geysers have been changing their flow patterns at an accelerated pace - Old Faithful is less faithful and less impressive, while other, even larger geysers once dormant have become increasingly active. Apparently there is no immediate threat, however.
In the skies, people like Stephen Hawking and Arthur C. Clarke have been calling for mankind to colonize space, not merely for vanity's sake but as a matter of species survival. People speculate about rogue asteroids or comets impacting earth and creating a vision of destruction and flames in the atmosphere - but an asteroid may not be the only risk, and the timing might be more urgent than is commonly perceived.
Relatively close to the solar system, at 7,500 light-years away, is a giant, unpredictable star. This star, Eta Carinae, is about 100 times more massive than the sun and more than 4,000,000 times brighter. For a while in the 19th century it was the second-brightest star in the sky. Astronomers have known about the star for some time. In the year 2000 it showed strange perturbations, and by 2007 it had started growing brighter again.
The estimated age of Eta Carinae is around 300,000 years - extremely young. However, the lifespan of a star this massive is 1 million years or less. This means that at any time, from now to a few hundred thousand years in the future, Eta Carinae will explode in a supernova so large it is called a hypernova.
Quite possibly this has already occurred - because of the light-travel time, we would not know for up to 7,500 years. The rapid changes in the star, visible over just two centuries from our Earth vantage point, might be compared to watching the lit fuse on a time bomb.
The intense gamma-ray burst from Eta Carinae will be strong enough to at least damage satellites and could possibly end all life on Earth - no one really knows. If Earth and its atmosphere are shielded from the gamma-ray burst and the shockwave accompanying the overwhelming visible light - which may briefly be brighter than all the stars of the Milky Way galaxy - possibly all will be well. However, this outcome is unknown. Otherwise, the atmosphere could be set aflame, with a proliferation of nitric oxides, radioactive carbon-14, and total loss of ozone; the sun's radiation would hit the Earth with full force. This is "An Inconvenient Truth" on a cosmic scale. Comforted as we are by the stability of the sun, which will not reach red-giant size for about 5 billion years, Eta Carinae has evolved in a cosmic blink of an eye, and it is capable of ending life just as quickly. Conversely, through their destruction, supernovas also replenish the heavy elements necessary for life, as stardust scattered by the shockwave. If you happen to read this message, please pass it on or post it to encourage debate; discussion is the first step to awareness, followed by solutions.
There is no Sanctuary
Remember Logan's Run? This pic comes from IMDB, which will not let you link to any of its photos. It's anti-user-friendly; however, it's easy to save and post the content anyway (for free), so the DRM effort is a worthless distraction that probably keeps the site from being more popular. They (Amazon) need to create an IMDB widget or something more user-friendly. Here's a link to IMDB, but what a hassle.
Anyway, right before age 30 men and women were 'transformed' through Carrousel, and were literally 'zapped' into dust. I never thought I could reach age 30, but now that seems relatively young.
Needless to say, in this dystopia they wouldn't need Aubrey de Grey, caloric restriction, Andrew Weil, or Deepak Chopra, and just about nobody would have Alzheimer's.
However, through our ability to understand our own genes, we may be participating in our own Festival...look below - it's a Youtube snippet:
Remember Nintendo sent a BrainAge game along with a Nintendo DS to George W for his last birthday?
Well, Cognitive Labs has been experimenting with the Presidential brain gym...which uses our proprietary and patented test 1.
See a pic of a founding father, honest abe, growth-minded Polk, or Richard "you won't have dick nixon to kick around anymore" Nixon...
Then there's a pause with Jerry Ford, Jimmy Carter, Ronald Reagan, George I, Bill Jeff Clinton, and George II. Several presidents ascended to the office accidentally, including Truman, LB Johnson, and Ford. For decades after the Civil War, the country was run by ex-Union generals and soldiers, all the way up through McKinley, who was a common soldier at Antietam; his simple heroism attracted the notice of his commander, Rutherford B. Hayes, and he was promoted on the battlefield to an officer's rank. I don't recall this being emphasized much in school, but that's where the data takes you.
You may learn some facts you didn't know
Next up, Gladiator's brain gym, a real mental workout/gymnasium or lyceum.
Egypt's Secretary General of the Supreme Council for Antiquities, Zahi Hawass, today announced the discovery of the mummy of Sennefer in Tomb 99 in the Valley of the Kings.
Sennefer was high priest of Amun in the 18th Dynasty. Translating the hieroglyphs in this tomb is easy, as they closely approximate the standard texts used to teach 'Middle Egyptian,' without either archaic forms or later ideograms thrown in. Also, scribes and rock-cutters of this period had good penmanship and literacy, at least to our 21st-century eyes.
I believe the Tomb of Sennefer is quite well-known, featuring some famous pictures - boatmen ferrying the host and his lady to the afterlife; and details of bread and beer offerings.
Another part of the tomb features hanging grapes painted on the ceiling to cover up a bit of uneven rock...quite creative.
These vignettes were shown because the Egyptians believed you could 'take it with you' so long as you had either small models or later, painted images - of your helpers, livestock, agricultural plenty, and other status symbols which would be transformed by the magic of Osiris and the ennead into facsimiles of the original form...
read the Yahoo! Story, which missed some of these details.
A probe of the upper echelons of the human brain's chain-of-command has found strong evidence that there are not one but two complementary commanders in charge of the brain, according to neuroscientists at Washington University School of Medicine in St. Louis.
Scientists exploring the upper reaches of the brain's command hierarchy were astonished to find not one but two brain networks in charge, represented by the differently-colored spheres on the brain image above. Starting with a group of several brain regions implicated in top-down control (the spheres on the brain), they used a new brain-scanning technique to identify which of those regions work with each other. When they graphed their results (bottom half), using shapes to represent different brain regions and connecting brain regions that work with each other with lines, they found the regions grouped together into two networks. The regions in each network talked to each other often but never talked to brain regions in the other network.
It's as if Captains James T. Kirk and Jean-Luc Picard were both on the bridge and in command of the same starship Enterprise.
In reality, these two captains are networks of brain regions that do not consult each other but still work toward a common purpose — control of voluntary, goal-oriented behavior. This includes a vast range of activities from reading a word to searching for a star to singing a song, but likely does not include involuntary behaviors such as control of the pulse rate or digestion.
"This was a big surprise. We knew several brain regions contribute to top-down control, but most of us thought we'd eventually show all those regions linking together in one system, one little guy up top telling everyone else what to do," says senior author Steven Petersen, Ph.D., James S. McDonnell Professor of Cognitive Neuroscience and professor of neurology and psychology.
>> read total article
WASHINGTON - More than 26 million people worldwide have Alzheimer's disease, and a new forecast says the number will quadruple by 2050. At that rate, one in 85 people will have the brain-destroying disease within about 40 years, researchers from Johns Hopkins University conclude. The new estimates, presented Sunday at an Alzheimer's Association conference in Washington, are not very different from previous projections of the looming global dementia epidemic as the world's population grays.
But they serve as a sobering reminder of the toll to come if scientists cannot find better ways to battle Alzheimer's and protect aging brains.
"If we can make even modest advances in preventing Alzheimer's disease, or delay its progression, we could have a huge global public health impact," said Johns Hopkins public health specialist Ron Brookmeyer, who led the new study.
The biggest jump is projected for densely populated Asia, home of almost half of today's Alzheimer's cases, 12.6 million. By 2050, Asia will have 62.8 million of the world's 106 million Alzheimer's patients, the study projects.
A recent U.S. study estimated that this nation's Alzheimer's toll will reach 16 million by 2050, compared with more than 5 million today. The new estimate is significantly lower, suggesting only 3.1 million North American cases today and 8.8 million by 2050.
Among the estimates for other regions are:
_Africa, 1.3 million today and 6.3 million in 2050.
_Europe, 7.2 million and 16.5 million.
_Latin America and the Caribbean, 2 million and 10.8 million.
_Oceania, 200,000 and 800,000.
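As a quick sanity check on those projections, the overall multiple and the implied annual growth rate can be computed from the article's own figures; the rate calculation itself is just my illustration:

```python
# Figures from the article: >26 million cases today, ~106 million by 2050.
cases_2007 = 26e6
cases_2050 = 106e6
years = 2050 - 2007

growth = cases_2050 / cases_2007          # overall multiple (~4x)
annual_rate = growth ** (1 / years) - 1   # implied compound annual rate
print(f"~{growth:.1f}x overall, ~{annual_rate * 100:.1f}% per year")
```

A roughly fourfold rise over 43 years works out to only a few percent per year, which is why the projections track the slow, steady graying of the population rather than any sudden epidemic.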
This Flash movie has 160,000 years of human development in just a few minutes - starting in East Central Africa. About 90,000 years ago there was a rapid and deep freezing which wiped out most of early man (a scenario which has been repeated occasionally).
The spread of genetic factors (such as APOEe4, which allowed some individuals to better weather famine) follows along with the migration routes.
Australia was populated long before most of Europe and the Americas.
Ptolemy V - Silver Tetradrachm, Alexandria mint
Here's an excerpt of what is essentially an announcement of a tax cut by Greek ruler Ptolemeos V Epiphaneos, descendant of Alexander the Great's general Ptolemy, and founder of the Macedonian line of Kings - to be posted in the relevant temples. A few other copies of Ptolemy's decree have been found in Egypt:
In the reign of the new king, who was Lord of the diadems, great in glory, the stabilizer of Egypt, and also pious in matters relating to the gods, Superior to his adversaries, rectifier of the life of men, Lord of the thirty-year periods like Hephaestus the Great, King like the Sun, the Great King of the Upper and Lower Lands, offspring of the Parent-loving Gods, whom Hephaestus has approved, to whom the Sun has given victory, living image of Zeus, Son of the Sun, Ptolemy the ever-living, beloved by Ptah;
In the ninth year, when Aëtus, son of Aëtus, was priest of Alexander and of the Savior Gods and the Brother Gods and the Benefactor Gods and the Parent-loving Gods and the God Manifest and Gracious; Pyrrha, the daughter of Philinius, being athlophorus for Bernice Euergetis; Areia, the daughter of Diogenes, being canephorus for Arsinoë Philadelphus; Irene, the daughter of Ptolemy, being priestess of Arsinoë Philopator: on the fourth of the month Xanicus, or according to the Egyptians the eighteenth of Mecheir.
THE DECREE: The high priests and prophets, and those who enter the inner shrine in order to robe the gods, and those who wear the hawks wing, and the sacred scribes, and all the other priests who have assembled at Memphis before the king, from the various temples throughout the country, for the feast of his receiving the kingdom, even that of Ptolemy the ever-living, beloved by Ptah, the God Manifest and Gracious, which he received from his Father, being assembled in the temple in Memphis this day, declared:
Since King Ptolemy, the ever-living, beloved by Ptah, the God Manifest and Gracious, the son of King Ptolemy and Queen Arsinoë, the Parent-loving Gods, has done many benefactions to the temples and to those who dwell in them and also to all those subjects to his rule, being from the beginning a god born of a god and a goddess—like Horus, the son of Isis and Osirus, who came to the help of his Father Osirus—being benevolently disposed toward the gods, has concentrated to the temples revenues both of silver and of grain, and has generously undergone many expenses in order to lead Egypt to prosperity and to establish the temples... the gods have rewarded him with health, victory, power, and all other good things, his sovereignty to continue to him and his children forever
Demotic comes from the Greek word "demos" which means what? Just think 'democracy.'
Demotic originated in the Delta area after the 25th Dynasty, long after the invasion of the Peoples of the Sea and probably the diaspora of the Hebrews referred to in the Book of Exodus (if indeed this was a historical event and not an allegorical 'beginning' to the saga that was told by word of mouth.)
Rubbings of the stone can be had, and sometimes duplicates - the greatest contribution towards the understanding of the stone was perhaps made by 3 University of Pennsylvania undergraduates, whose concordance was duly hailed by the British Antiquities Society in London.
Following our experience with that text, we contributed to RosettaNet and supply-chain-related EDI, whose main sponsor was the company Marshall Industries. One of our contributions was the ANSI EDI 204 transaction set, which governed information flows in the small-package industry. This EDI specification was invented by UPS.
Having contributed to cross-platform supply chain rationalization and the real Rosetta Stone via the much less marketable Egyptian language, the third incidence of the Rosetta experience is the human genome, the grok of which is often compared to the transliteration of the Rosetta Stone. Our contribution here is a simple tool for evaluation of genetic risks as they relate to the brain and cognition, via Internet-enabled technologies - and this is the essence of the Revolution...
But on a more serious note, see this piece from the Scotsman (UK)
AN INCIDENT of reduced oxygen to the brain caused by a stroke, heart attack, or even heavy snoring could make people more vulnerable to Alzheimer's disease, according to scientists. It can leave the patient more open to the gradual build-up of toxic chemicals which can cause Alzheimer's, a team at Leeds University said. This means a stroke victim may still be more at risk of developing Alzheimer's decades after they have made a full recovery.
Professor Chris Peers, of the school of medicine, who led the research, said: "Our research is looking into what happens when oxygen levels in the brain are reduced by a number of factors, from long-term conditions like emphysema and angina, to sudden incidents such as a heart attack, stroke or head trauma.
"Even though the patient may outwardly recover, the hidden cell damage may be irreversible.
"It could even be an issue for people who snore heavily. It can be anything that stops the heart and lungs working together." Professor Susanne Sorensen, head of research at the Alzheimer's Society, said: "This is exciting because rather than focusing on neurons they looked at processes in the brain, which up until now have not been researched in much detail."... read more of the article
Using a technique known as interferometry, astronomers obtained a collage-like image of the star Altair, about 15 light-years away in the constellation Aquila (the Eagle). Altair is one of the three stars of the Summer Triangle; the others are Deneb and Vega.
ANN ARBOR, Mich.—University of Michigan astronomers combined light from four widely separated telescopes to produce the first picture showing surface details on a sun-like star beyond our solar system.
The image of the rapidly rotating, hot star Altair is the most detailed stellar picture ever made using an innovative light-combining technique called optical interferometry, said U-M astronomer John Monnier.
Beyond this technical milestone, the Altair observations provide surprising new insights that will force theorists to revise ideas about the behavior of rapid rotators like Altair.
"This powerful new tool allows us to zoom in on a star that's a million times farther away than the sun," said Monnier, lead author of a paper to be published online Thursday by the journal Science. "We're testing the theories of how stars work in much more detail than ever before."
Monnier and U-M graduate student Ming Zhao led an international team that made the Altair observations using four of the six telescopes at Georgia State University's Center for High Angular Resolution Astronomy (CHARA) interferometric array on Mount Wilson, Calif.
The four telescopes were separated by nearly 300 yards. Vacuum tubes carried starlight from the four scopes to a U-M-built device called the Michigan Infrared Combiner, or MIRC.
The combiner allowed researchers to merge infrared light from four of CHARA's telescopes for the first time, simulating a single giant instrument three football fields across. The result was an image of unprecedented detail—roughly 100 times sharper than pictures from the Hubble Space Telescope.
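A rough back-of-the-envelope sketch shows why a 300-meter baseline beats a single mirror; the wavelengths and the 1.22 diffraction factor below are my own illustrative assumptions, not figures from the announcement:

```python
import math

RAD_TO_MAS = 180 / math.pi * 3600 * 1000   # radians -> milliarcseconds

def resolution_mas(wavelength_m, aperture_m, factor=1.0):
    """Angular resolution ~ factor * lambda / D, in milliarcseconds."""
    return factor * wavelength_m / aperture_m * RAD_TO_MAS

# Interferometer: the 330 m baseline acts as the aperture, in H-band IR.
chara = resolution_mas(1.65e-6, 330.0)
# Single telescope: Hubble's 2.4 m mirror at visible wavelengths.
hubble = resolution_mas(550e-9, 2.4, factor=1.22)

print(f"CHARA ~{chara:.2f} mas, Hubble ~{hubble:.1f} mas, "
      f"ratio ~{hubble / chara:.0f}x")
```

With these assumed wavelengths the ratio comes out at several dozen, the same order as the "roughly 100 times sharper" claim; the exact factor depends on which wavelengths you compare.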
While solar astronomers can view sunspots and storms on our home star's roiling surface in exquisite detail, most other stars have—until very recently—appeared as simple points of light through even the most powerful telescopes.
But in the past decade, advances in the emerging field of optical interferometry have launched a new era of stellar imaging.
Other research teams have used the technique to acquire surface images of giant stars hundreds of times bigger than Altair. But the U-M-led study provides the first picture of a so-called main sequence star, one that generates energy mainly from hydrogen-to-helium nuclear fusion reactions in its core. Main sequence stars include the sun and most of the stars we see in the night sky.
"This is just a monumental stepping stone for us," said Harold McAlister, director of CHARA and a regent's professor of astronomy at Georgia State University in Atlanta. "Main sequence stars are far and away the largest population of stars out there, and being able to make a picture of one creates tremendous opportunities for future research."
One likely target for future studies: Imaging planets around stars beyond our solar system, said U-M's Zhao. "Imaging stars is just the start," he said.
Altair is the brightest star in the constellation Aquila, The Eagle, and is clearly visible with the naked eye in Northern Hemisphere skies. The nearby star is hotter and younger than the sun and nearly twice its size. Altair rotates at 638,000 mph at its equator, roughly 60 times faster than our home star.
"It's really whipping around and that's why, of course, it's spread out like a twirling ball of pizza dough," said Monnier, an assistant professor of astronomy.
Previous studies revealed that Altair, unlike most stars, is not a perfect sphere. Instead, its rapid spin rate creates centrifugal forces that flatten it into an oval: Its radius is significantly larger at the equator than at the poles.
In 1924, astronomer Hugo von Zeipel predicted that rapid rotators would display just this type of equatorial bulge. He also surmised that these stars would sport a dark band along the equator called gravity darkening. The bloated equator would appear dark because it is farther from the star's fiery nuclear core, and therefore cooler than the poles.
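Von Zeipel's relation can be sketched numerically: effective temperature scales roughly as effective gravity to the 1/4 power for radiative envelopes, and rotation lowers effective gravity at the equator. The Altair-like parameters below are illustrative assumptions, not measured values from the study:

```python
# Illustrative, Altair-like parameters (assumptions, not from the article):
G      = 6.674e-11           # gravitational constant, m^3 kg^-1 s^-2
M      = 1.8 * 1.989e30      # stellar mass, kg
R_pole = 1.64 * 6.957e8      # polar radius, m
R_eq   = 2.03 * 6.957e8      # equatorial radius, m
v_eq   = 2.85e5              # equatorial rotation speed, m/s (~638,000 mph)
BETA   = 0.25                # von Zeipel exponent for radiative envelopes

g_pole = G * M / R_pole**2                  # surface gravity at the pole
g_eq   = G * M / R_eq**2 - v_eq**2 / R_eq   # gravity minus centrifugal term

# von Zeipel: T_eff scales as g_eff**BETA, so the bulged equator runs cooler.
temp_ratio = (g_eq / g_pole) ** BETA
print(f"equator/pole temperature ratio ~{temp_ratio:.2f}")
```

Under these assumptions the equator comes out roughly 20-25 percent cooler than the poles, which is the dark equatorial band the model predicts.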
The CHARA picture of Altair, the result of observations made on two nights last summer, is "the first image of a star that allows us to visually confirm that basic idea" of gravity darkening, Monnier said. But the Altair image displays even more equatorial darkening than standard models predict, pointing to flaws in current models, he said.
U-M astronomer David Berger was a co-author of the Science paper. The team also included researchers from St. Andrews University, Cambridge University, Georgia State University, California Institute of Technology, Cornell University, the Laboratoire d'Astrophysique de Grenoble in France, the Michelson Science Center, and the National Optical Astronomy Observatory. Funding for the Altair project was provided by the National Science Foundation and NASA.
Although it’s true that sleep deprivation can lead to short-term memory loss, new research suggests it’s not that our minds become too tired to absorb new material, but that the information doesn’t get relayed past the eyes.
Experiments show that visual processing is impaired in the sleep-deprived, scientists reported last month.
Those short on sleep can only see and take in a small number of objects at a time. Anything over a certain threshold is lost.
“When people are sleep-deprived, they may not be seeing what they think they should be seeing, and it appears that this is what contributes to memory declines following sleep deprivation,” said Dr. Michael Chee, a neuroscientist at the National University of Singapore Graduate Medical School. He headed the team of researchers from his institution and Duke University.
The study involved 30 healthy volunteers whose memory was tested after a regular night’s sleep and after going 24 hours with no sleep.
“We generally think of memory decline as a result of faulty storage of information,” Chee said. “However, if the information is not properly handled by the visual system, either as a result of a failure to direct attention appropriately or a failure of visual areas (of the brain) to process what is seen, you can forget about the later stages of information consolidation and storage.”
Chee’s findings were published online by the Proceedings of the National Academy of Sciences.
He noted that a small group of sleep-deprived volunteers who had better performance in the tests were more able to tune out distractions, “but even they suffered from compromised visual attention and processing.”
In his next round of experiments, Chee wants to use brain imaging during tests to see if there are structural differences in the brains of people who are more or less susceptible to attention deficits from sleep deprivation.
Individuals who are less susceptible might make better candidates for long shifts at air traffic control centers, power plants or emergency dispatch centers, the researcher suggests.
And while we’re talking about attention deficits, can we not talk about them in a crowded car?
Australian researchers – including some who first noted that a driver who chats on a cell phone quadruples his risk of ramming another car – now say that driving with passengers substantially increases the risk of a serious crash, no matter how old the driver is. This is especially the case when driving with two or more passengers.
“Drivers with passengers were more than 60 percent more likely to have a motor-vehicle crash resulting in hospitalization, irrespective of their age group,” said Dr. Susanne McEvoy, lead investigator for the project at the George Institute for International Health in Sydney.
“The likelihood of a crash was more than doubled in the presence of two or more passengers,” McEvoy added.
The distraction level was not as great as that reported from phone use. But a lot more people drive with passengers than talk on phones in transit, so the toll from this kind of distraction is likely higher.
In addition, the researchers said the distraction risk is probably just one part of the equation for those most easily diverted – teenagers. They run extra risks from riding with other teens because peer pressure and showing off make them do stupid things.
6.03.07 - DNA has been deciphered, thanks to Nobel Prize-winning Dr. James Watson, along the lines of the work of Jean-Francois Champollion, who translated Egyptian hieroglyphs. Here is a piece on Slashdot. The next step is personal DNA mapping.
This should usher in the era of personalized 'alert' services as a preliminary step. If you have spent time here and taken a few tests, you have already entered this new world, where the web itself and your computer become your lab assistant and your key to self-discovery. Several of our tests have correlated with possible genetic predisposition for Alzheimer's, so your score, in coming months and years, will likely be a gateway into this exciting new world of Knowing. Dr. Watson himself was unsure whether he wanted to know about his predisposition for Alzheimer's, but many people do want to find out so they can take action. Another benefit of speed-based brain exercise, more so than other forms, is the development of additional cognitive reserve.