Sunday, March 15, 2009

Blocking protein may help ease painful nerve condition

Scientists have identified the first gene that pulls the plug on ailing nerve cell branches from within the nerve cell, possibly helping to trigger the painful condition known as neuropathy.

The condition is a side effect of some forms of chemotherapy and can also afflict patients with cancer, diabetes, viral infections, neurodegenerative disorders and other ailments.

Researchers at Washington University School of Medicine in St. Louis showed that blocking the dual leucine zipper kinase (DLK) gene inhibits degeneration of ailing cell branches, possibly preventing neuropathy.

"Neuropathy can become so extraordinarily painful that some patients stop taking their chemotherapy, regardless of the consequences in their fight against cancer," says co-senior author Aaron DiAntonio, M.D., Ph.D., associate professor of developmental biology. "So we're very excited about the possibilities this gene may offer for reducing that pain."

The findings are published online March 15 in Nature Neuroscience.

Scientists have known since 1850 that nerve cells have ways to prune injured branches (also known as axons). Although axon pruning is also a normal part of early human development, inappropriate loss of axons in the adult nervous system causes sensations that have been compared to burning or freezing and have come to be known as neuropathy.

DiAntonio's lab previously revealed that the fruit fly's version of DLK helps establish synapses, junctures where two nerve cells communicate. But they found the gene doesn't do the same thing in mice.

Curious about DLK's role in mammals, Bradley Miller, an M.D./Ph.D. student in DiAntonio's lab, consulted with co-senior author Jeffrey Milbrandt, M.D., Ph.D., the David Clayson Professor of Neurology, who studies the role of various proteins in nerve degeneration. With support from the University's Hope Center for Neurological Disorders, they showed that in mice with a mutated DLK gene, the long axons of the sciatic nerve resisted degeneration after the nerve was surgically cut.

In follow-up tests, Miller and Craig Press, an M.D./Ph.D. student in Milbrandt's lab, took nerve cells in culture and treated their axons with the chemotherapy drug vincristine. Normal axons degenerated rapidly after exposure to the drug, but axons where DLK's activity had been blocked were protected from degeneration.

"The pain of neuropathy is often a key factor that limits the dose in cancer chemotherapy," DiAntonio notes. "We know when patients are going to start their treatment, so one day it might be possible to start patients on a DLK-blocking drug before their chemotherapy and spare them considerable pain."

DLK appears to act like a contractor that calls in wrecking crews, DiAntonio notes. It helps make the decision to eradicate an axon, but the actual demolition is left to other processes called up by DLK.

"We want to more fully understand the chain of molecular reactions that carry out DLK's decision, because that might reveal a better opportunity to block the effect with a drug," says DiAntonio.

DiAntonio and Milbrandt also plan to test whether blocking DLK stops neurodegeneration in other forms of injury and stress, including the harm inflicted on the optic nerve by glaucoma and central nervous system conditions such as stroke and Parkinson's disease.

More information: Miller BR, Press C, Daniels RW, Sasaki Y, Milbrandt J, DiAntonio A. A DLK-dependent axon self-destruction program promotes Wallerian degeneration. Nature Neuroscience, online March 15.

Source: Washington University School of Medicine

A natural approach for HIV vaccine

For 25 years, researchers have tried and failed to develop an HIV vaccine, primarily by focusing on a small number of engineered "super antibodies" to fend off the virus before it takes hold. So far, these magic bullet antibodies have proved impossible to produce in people. Now, in research to be published March 15 online by Nature, scientists at The Rockefeller University have laid out a new approach. They have identified a diverse team of antibodies in "slow-progressing" HIV patients whose coordinated pack hunting knocks down the virus just as well as their super-antibody cousins fighting solo.

By showcasing the dynamic, natural immune response in these exceptional patients, the research, led by Michel C. Nussenzweig, Sherman Fairchild Professor and head of the Laboratory of Molecular Immunology, suggests that an effective vaccine may come from a shotgun approach that elicits a wide range of natural antibodies rather than a single engineered super antibody.

"We wanted to try something different, so we tried to reproduce what's in the patient. And what's in the patient is many different that individually have limited neutralizing abilities but together are quite powerful," says Nussenzweig, who also is a Howard Hughes Medical Institute investigator. "This should make people think about what an effective vaccine should look like."

HIV strains mutate rapidly, making them especially wily adversaries of the immune system. But one element is shared almost universally among the diverging strains — a protein on the envelope of the virus called gp140 that HIV needs to infect immune cells. Prior research has shown that four randomly engineered antibodies that block the activity of that protein prevent the virus from infecting immune cells in culture, but all attempts to coax the human body into producing those four have failed.

So Johannes Scheid, a visiting student in Nussenzweig's lab who is now a doctoral candidate, turned his attention to the antibodies produced by six people infected with HIV whose immune systems put up an exceptionally strong fight. The patients represent the roughly 10 to 20 percent of HIV patients who are able to control the virus and are very slow to progress to disease. Their immune systems' memory B cells produce high levels of antivirus antibodies, but until now, researchers have known little about the antibodies or how effective they are.

With help from Rockefeller's Center for Clinical and Translational Science and Rockefeller scientists David D. Ho and Jeffrey V. Ravetch, Scheid and colleagues isolated 433 antibodies from these individuals' blood serum that specifically targeted the envelope protein — the chink in HIV's protean armor. He cloned the antibodies and produced them in bulk, mapped which part of the envelope protein each targeted, and gauged how effective each was in neutralizing the virus. In the process, he identified a new structure within the envelope protein — called the gp120 core — that had never been recognized as a potential target for antibodies. "It's the first time that anyone has defined what is really happening in the B cell response in these patients," says Scheid.

Scheid's work shows that it's common for these antibodies to have neutralizing activity, says Nussenzweig. But each antibody alone has limited ability to fight the virus. "Individually, they're not as strong as the Famous Four," says Nussenzweig, referring to the high-profile super antibodies on which several vaccine attempts have been based. But in high concentrations, a combination of the sets of antibodies cloned from the individual patients seemed to act as teams to knock down the virus in cell culture as well as any single antibody studied to date. These natural antibodies were also able to recognize a range of HIV strains, indicating that their diversity may be an advantage over a single super antibody that focuses on only one part of the virus, which can mutate. The findings suggest that research into vaccines that mimic this natural antibody response could pay off.

Tuesday, March 10, 2009

MicroRNA-based Diagnostic Identifies Squamous Lung Cancer with 96% Sensitivity

A new study shows for the first time that a microRNA-based diagnostic test can objectively identify squamous lung cancer with 96% sensitivity, according to Harvey Pass, M.D., of the NYU Cancer Institute at NYU Langone Medical Center, one of the authors of the study published online ahead of print in the Journal of Clinical Oncology.

In a paper titled “Diagnostic Assay Based on hsa-miR-205 Expression Distinguishes Squamous From Non-Squamous Non-Small-Cell Lung Carcinoma,” researchers looked at 252 patients with non-small-cell lung cancer and sent their tumor samples to a lab, where a single microRNA biomarker identified squamous lung carcinomas with 96% sensitivity and 90% specificity. This is important because studies have shown that as many as 30% of squamous lung cancers are misclassified. If the type of lung cancer is not identified correctly, patients may suffer avoidable side effects from inappropriate treatments and medications. For example, squamous histology carries an increased risk of severe or fatal bleeding with certain targeted biological therapies and other drugs in development. Other approved therapies, such as pemetrexed (Alimta), are indicated for non-squamous lung cancer only.
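For readers unfamiliar with the two figures, sensitivity is the fraction of true squamous tumors the assay correctly flags, and specificity is the fraction of non-squamous tumors it correctly rules out. The short Python sketch below shows the standard calculation; the confusion-matrix counts are hypothetical, chosen only to land near the reported percentages, and are not the study's actual data.

```python
# Minimal sketch: sensitivity and specificity of a binary diagnostic test.
# All counts below are hypothetical (illustrative only), not taken from the paper.

def sensitivity(true_pos: int, false_neg: int) -> float:
    """Fraction of squamous tumors the assay correctly flags as squamous."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg: int, false_pos: int) -> float:
    """Fraction of non-squamous tumors the assay correctly calls non-squamous."""
    return true_neg / (true_neg + false_pos)

# Hypothetical confusion-matrix counts for a 252-sample cohort.
tp, fn = 96, 4      # squamous cases: detected vs. missed
tn, fp = 137, 15    # non-squamous cases: correctly excluded vs. misclassified

print(f"sensitivity = {sensitivity(tp, fn):.0%}")   # -> 96%
print(f"specificity = {specificity(tn, fp):.0%}")   # -> 90%
```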

The study, funded by Rosetta Genomics, was conducted at the NYU Cancer Institute at NYU Langone Medical Center in collaboration with researchers from Columbia University and Sheba Medical Center.

“The results of this study are very encouraging,” says Harvey Pass, MD, professor of cardiothoracic surgery and director, thoracic surgery and oncology at the NYU Cancer Institute at NYU Langone Medical Center. “The study has demonstrated that a microRNA biomarker successfully identifies squamous lung cancer with high reproducibility, sensitivity and specificity. The study certainly demonstrates the power of microRNAs in correctly classifying lung cancer and hopefully can immediately translate into more accurate choices of targeted therapies as well as cytotoxics for the disease.”

Dr. Pass is the vice chairman of the medical advisory board for Rosetta Genomics (Nasdaq: ROSG), the company that makes a test based on the same microRNA biomarker evaluated in the study. The test offers similar accuracy (97% sensitivity) and is now commercially available through Rosetta Genomics' CLIA-certified lab in Philadelphia.

Provided by New York University Langone Medical Center

City Kids May Breathe Easier in the Country

Children with asthma have an easier time breathing if they spend even a few days in the country, safeguarded from urban air pollution, a study led by Giovanni Piedimonte, M.D., professor and chairman of the Department of Pediatrics at the West Virginia University School of Medicine, finds.

The study, published in the March issue of the journal Pediatrics, shows for the first time that limiting allergic children's exposure to outdoor air pollutants can improve lung function while reducing inflammation of the airways.

“This finding is significant because inflammation creates health risks for children with chronic respiratory problems,” Dr. Piedimonte explains. “Now we know that simply providing a cleaner environment in terms of air quality helps provide relief fairly rapidly for children with asthma.”

He adds, “This study suggests that possibly we could manage asthmatic children with much less medication if the air they breathed was cleaner.”

Researchers from the United States and Italy studied 37 Italian children with allergies and mild but persistent asthma, transporting them to a relatively pristine countryside setting - with lower levels of pollution - for a week.

Children recruited for the study were patients ages 7 to 14 at an asthma clinic in Pescara, Italy. For the rural part of the study, the children stayed in a hotel during a school camp in Ovindoli, Italy. They remained medicine-free and treatment-free for the duration of the study so the researchers could make correlations between the environmental air quality and the biomarkers that signal inflammation.

Air pollution levels, pollen counts and meteorological conditions were monitored at both sites.

“A whole host of pollutants in the air of cities in economically developed countries has contributed to a worldwide rise in asthma rates among children,” says Piedimonte, who is also physician in chief of WVU Children's Hospital and director of the WVU Pediatric Research Institute. “Even knowing that, I was surprised to see how much better the children’s lung functions were after just a few days of cleaner air.”

Some of the problem pollutants in the air of industrialized countries are ozone, carbon monoxide and benzene - all of which can trigger emergency room visits and hospitalizations of asthmatic children. “In addition, we have new data suggesting that ultrafine particles may be especially toxic to the airways of children with asthma,” Piedimonte says.

The Health Statistics Center of the West Virginia Department of Health and Human Resources reports that 31,000 children in West Virginia have asthma. Until 2003, hospitalization rates for asthma were higher in the United States than in West Virginia. Now the opposite is true.

“West Virginia is experiencing an epidemic of asthma worse than in the rest of the United States,” Piedimonte says. “Among the contributing risk factors are high levels of air pollution plus low socioeconomic status and high rates of obesity and smoking.”

The United Health Foundation’s recent health rankings gave West Virginia a rank of 39 among the states for overall health, and it named high levels of air pollution as one of the state’s top challenges. “Our study shows how vital air quality is in terms of triggering asthma and allergies in children," Piedimonte says. "It’s something to evaluate carefully before considering government cutbacks in regulatory agencies that affect the air we breathe and set limits on industrial pollution.”

Provided by West Virginia University Health Sciences Center

Wednesday, March 4, 2009

Influence of 'obesity gene' can be offset by healthy diet

Children who carry a gene strongly associated with obesity could offset its effect by eating a low energy density diet, according to new research from UCL (University College London) and the University of Bristol published today in PLoS ONE.

The study, based on data from a sample of 2275 children from the Bristol-based ALSPAC study (Children of the 90s) provides evidence that people might be able to avoid becoming obese if they adopt a healthier diet with a low energy density - even those who carry the FTO gene, identified as being a high risk gene for obesity.

Dietary energy density (DED) refers to the amount of energy consumed per unit weight of food, or number of calories per bite. A low dietary energy density can be achieved by eating lots of water-rich foods like fruits and vegetables and limiting foods high in fat and sugar like chocolate and biscuits.
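As a rough worked example of that definition (the foods and calorie figures below are illustrative approximations, not values from the study), dietary energy density is simply energy divided by food weight:

```python
# Minimal sketch: dietary energy density (DED) = energy / weight of food.
# The kcal-per-100-g figures are rough, illustrative values, not study data.

def energy_density(kcal: float, grams: float) -> float:
    """Energy density in kcal per gram of food."""
    return kcal / grams

# A water-rich food vs. an energy-dense snack (approximate values).
print(energy_density(kcal=52, grams=100))    # apple: ~0.5 kcal/g (low DED)
print(energy_density(kcal=500, grams=100))   # chocolate biscuit: ~5 kcal/g (high DED)
```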

The researchers looked at how DED affected the build-up of fat in the body over a period of three years in children aged 10 to 13. They found that children with a more energy dense diet (more calories per bite) tended to have more fat mass three years later and also confirmed that those carrying the high risk gene had greater fat mass overall.

When the researchers looked at whether children with the high-risk FTO variant had a stronger reaction to an energy-dense diet than children with lower genetic risk, they found that they did not. These results indicate that if a child with a high genetic risk eats a diet with fewer calories per bite, they may be able to offset the effect of the gene on weight gain and so stay a healthy weight.

Lead author Dr Laura Johnson, UCL Epidemiology and Public Health, said: "This is an important finding because it provides evidence that it's easier to eat too much energy and gain weight when your diet is packed tight with calories, so adopting a diet with more bulk and less energy per bite could help people avoid becoming obese regardless of their genetic risk. Obesity is not inevitable if your genes give you a higher risk because if you change the types of foods you eat this will help curb excessive weight gain."

"This shows that although our genetic make-up does have an influence on our health, it's certainly not the only defining factor. Those with high risk genes can, in some cases, resist their genetic lot if they alter their lifestyle in the right way - in this case, their diet."

FTO is the first common obesity gene to be identified in Caucasian populations. Previous studies have shown that adults with two copies of the high-risk FTO variant are on average 3kg heavier, and individuals with a single copy are on average 1.5kg heavier, than those without it.

More information: The paper "Dietary energy density affects fat mass in early adolescence and is not modified by FTO variants" is published online ahead of print in PLoS ONE.

Source: University College London

Power and the illusion of control: Why some make the impossible possible and others fall short

Power holders often seem misguided in their actions. Leaders and commanders of warring nations regularly underestimate the costs in time, money, and human lives required for bringing home a victory. CEOs of Fortune 500 companies routinely overestimate their capacity to turn mergers and acquisitions into huge profits, leading to financial losses for themselves, their companies, and their stockholders. Even ordinary people seem to take on an air of invincibility after being promoted to a more powerful position. The consequences of these tendencies, especially when present in the world's most powerful leaders, can be devastating.

In a new study, Nathanael Fast and Deborah Gruenfeld at Stanford Graduate School of Business, Niro Sivanathan at the London Business School and Adam Galinsky at the Kellogg School of Management at Northwestern University, show that power can literally "go to one's head," causing individuals to think they have more personal control over outcomes than they, in fact, do.

"We conducted four experiments exploring the relationship between power and illusory control - the belief that one has the ability to influence outcomes that are largely determined by chance," said Galinksy, "In each experiment, whether the participant recalled power by an experience of holding power or it was manipulated by randomly assigning participants to Manager-Subordinate roles, it led to perceived control over outcomes that were beyond the reach of the individual. Furthermore, the notion of being able to control a 'chance' result led to unrealistic optimism and inflated self-esteem."

For example, in one experiment, power holders were presented with a pair of dice, offered a reward for predicting the outcome of a roll, and then asked if they would like to roll the dice or have someone else do it for them. Every participant in the high-power group chose to roll the dice themselves, compared with fewer than 70% of low-power and neutral participants, supporting the notion that simply experiencing power can lead individuals to grossly overestimate their abilities - in this case, their ability to influence the outcome of the roll by rolling the dice personally.

These results, reported in Psychological Science, a journal of the Association for Psychological Science, have implications for how power, once attained, is maintained or lost. The authors note that positive illusions can be adaptive, helping power holders make the seemingly impossible possible. But the relationship between power and illusory control might also contribute directly to losses in power, by causing leaders to make poor choices. They conclude that "the illusion of personal control might be one of the ways in which power often leads to its own demise."

Source: Association for Psychological Science

Musicians' Brains 'Fine-Tuned' to Identify Emotion

Looking for a mate who in everyday conversation can pick up even your most subtle emotional cues? Find a musician, Northwestern University researchers suggest.

In a study in the latest issue of European Journal of Neuroscience, an interdisciplinary Northwestern research team for the first time provides biological evidence that musical training enhances an individual's ability to recognize emotion in sound.

"Quickly and accurately identifying emotion in sound is a skill that translates across all arenas, whether in the predator-infested jungle or in the classroom, boardroom or bedroom," says Dana Strait, primary author of the study.

A doctoral student in the Henry and Leigh Bienen School of Music, Strait does research in the Auditory Neuroscience Laboratory directed by neuroscientist Nina Kraus. The laboratory has done pioneering work on the neurobiology underlying speech and music perception and learning-associated brain plasticity.

Kraus, Northwestern's Hugh Knowles Professor of Communication Sciences and Neurobiology; Richard Ashley, associate professor of music cognition; and Auditory Neuroscience Laboratory manager Erika Skoe co-authored the study titled "Musical Experience and Neural Efficiency: Effects of Training on Subcortical Processing of Vocal Expressions in Emotion."

The study, funded by the National Science Foundation, found that the more years of musical experience musicians possessed, and the earlier they began their music studies, the better their nervous systems were at processing emotion in sound.

"Scientists already know that emotion is carried less by the linguistic meaning of a word than by the way in which the sound is communicated," says Strait. A child's cry of "Mommy!" -- or even his or her wordless utterance -- can mean very different things depending on the acoustic properties of the sound.

The Northwestern researchers measured brainstem processing of three acoustic correlates (pitch, timing and timbre) in musicians and non-musicians in response to a scientifically validated emotional sound. The musicians, who learn to use all their senses to practice and perform a musical piece, were found to have "finely tuned" auditory systems.

This fine-tuning appears to lend broad perceptual advantages to musicians. "Previous research has indicated that musicians demonstrate greater sensitivity to the nuances of emotion in speech," says Ashley, who explores the link between emotion perception and musical experience. One of his recent studies indicated that musicians might even be able to sense emotion in sounds after hearing them for only 50 milliseconds.

The 30 right-handed men and women with and without music training in the European Journal of Neuroscience study were between the ages of 19 and 35. Subjects with music training were grouped using two criteria -- years of musical experience and onset age of training (before or after age 7).

Study participants were asked to watch a subtitled nature film to keep them entertained while they were hearing, through earphones, a 250-millisecond fragment of a distressed baby's cry. Sensitivity to the sound, and in particular to the more complicated part of the sound that contributes most to its emotional content, was measured through scalp electrodes.

The results were not exactly what the researchers expected. They found that musicians' brainstems lock onto the complex part of the sound known to carry more emotional elements but de-emphasize the simpler (less emotion conveying) part of the sound. This was not the case in non-musicians.

In essence, musicians more economically and more quickly focus their neural resources on the important -- in this case emotional -- aspect of sound. "That their brains respond more quickly and accurately than the brains of non-musicians is something we'd expect to translate into the perception of emotion in other settings," Strait says.

The authors of the study also note that the acoustic elements that musicians process more efficiently are the very same ones that children with language disorders, such as dyslexia and autism, have problems encoding. "It would not be a leap to suggest that children with language processing disorders may benefit from musical experience," says Kraus.

Strait, a pianist and oboe player who formerly worked as a therapist with autistic children, goes a step further. Noting that impaired emotional perception is a hallmark of autism and Asperger's syndrome, she suggests that musical training might promote emotion processing in these populations.

Source: Northwestern University

Friday, February 27, 2009

Scientists discover why teeth form in a single row

A system of opposing genetic forces determines why mammals develop a single row of teeth, while sharks sport several, according to a study published today in the journal Science. When completely understood, the genetic program described in the study may help guide efforts to re-grow missing teeth and prevent cleft palate, one of the most common birth defects.

Gene expression is the process by which information stored in genes is converted into proteins that make up the body's structures and carry its messages. As the baby's face takes shape in the womb, the development of teeth and palate are tightly controlled in space and time by gene expression. Related abnormalities result in the development of teeth outside of the normal row, missing teeth and cleft palate, and the new insights suggest ways to combat these malformations.

The current study adds an important detail to the understanding of the interplay between biochemicals that induce teeth formation, and others that restrict it, to result in the correct pattern. Specifically, researchers discovered that turning off a single gene in mice resulted in development of extra teeth, next to and inside of their first molars. While the study was in mice, past studies have shown that the involved biochemical players are active in humans as well.

"This finding was exciting because extra teeth developed from tissue that normally does not give rise to teeth," said Rulang Jiang, Ph.D., associate professor of Biomedical Genetics in the Center for Oral Biology at the University of Rochester Medical Center, and corresponding author on the Science paper. "It takes the concerted actions of hundreds of genes to build a tooth, so it was amazing to find that deleting one gene caused the activation of a complete tooth developmental program outside of the normal tooth row in those mice. Finding out how the extra teeth developed will reveal how nature makes a tooth from scratch, which will guide tooth regeneration research."

Why Extra Teeth Formed

When we lose our baby teeth, the permanent teeth grow in to replace them, but permanent teeth when lost are lost for good. U.S. adults aged 20 years and older are missing an average of four teeth due to gum disease, trauma or congenital defects. Tooth loss makes chewing difficult, causes speech problems, accelerates oral disease, and disfigures the face. Current treatments for missing teeth include dentures or dental implants, but each procedure comes with disadvantages. The idea of growing teeth to replace missing ones has captured the imaginations of scientists, with many labs investigating ways to regenerate teeth.

In the current study, Jiang and colleagues generated mice that lacked the oddskipped related-2 (Osr2) gene, which encodes one of many transcription factors that turn genes on or off. "Knocking out" (deleting) the Osr2 gene resulted in cleft palate, a birth defect where the two halves of the roof of the mouth fail to join up properly, leaving a gap. Secondly, and surprisingly, the Osr2 "knockout" mice developed teeth outside of the normal tooth row. Jiang decided to focus his research first on the effect of Osr2 on teeth patterning (vs. cleft palate) because much more was known at the time about teeth development pathways.

Although teeth usually do not become visible until after birth, their formation starts early in development. Teeth develop from the epithelium and mesenchyme, two key tissue layers within the mammalian embryo. The first sign of tooth development in mammals is the thickening of the epithelium along the jaw line to form a band of cells called the dental lamina. Because all teeth subsequently form from the dental lamina, the assumption was that some special quality of epithelial cells there made them "tooth competent." Classical experiments, however, found that the developing tooth mesenchyme was capable of inducing tooth formation from epithelial tissues that normally would not participate in tooth development. Researchers confirmed that it was indeed the mesenchyme that carried tooth initiation signals later in development, but how those signals were restricted to the area beneath the tooth row was unknown.

Past studies in other labs had shown bone morphogenic protein 4 (BMP4) to be an important factor for the initiation of teeth, and that a protein called Msx1 amplifies the BMP4 tooth-generating signal. Jiang and colleagues suggested for the first time that some unknown factor was restricting the growth of teeth into one row by opposing the Bmp4 signal.

The current study provides the first solid proof that the precise space where mammals can develop teeth (the "tooth morphogenetic field") is shaped and restricted by the effect of Osr2 on the expression of the Bmp4 gene within the mesenchymal cell layer. Jiang's team has shown not only that removing the Osr2 gene results in extra teeth outside of the normal row, but also that Osr2 is expressed in increasing concentration in the jaw mesenchyme as you move from the cheek toward the tongue in the mouse embryo, the exact opposite of the BMP4 concentration gradient. Osr2 restricts Bmp4 expression to the tooth mesenchyme under the dental lamina, and in Osr2's absence, Bmp4 gene expression expands into the jaw mesenchyme outside of the tooth row.

A second major finding of the study backs up another emerging theory which holds that careful regulation of competing pro- and anti-tooth initiation signals controls how mammalian teeth come one by one in sequence. As each tooth develops, something must prevent it from forming too close to the next or mammals would have no gaps between their teeth. When this mechanism occasionally falters, adjacent teeth come in fused together. Since evolution is not perfect, wisdom teeth (third molars) often come in too close to their predecessors, and must be pulled to make space.

Jiang and colleagues also engineered a group of mice with both the Osr2 and Msx1 genes removed. While mice without Msx1 failed to grow any teeth, mice lacking both Msx1 and Osr2 grew the first molars, but no additional teeth. Thus, without Osr2, enough BMP4 was expressed for the first molar teeth to grow, but without Msx1, the BMP4 signal was not amplified to the point where it could kick off the next tooth in the row. With these results, Jiang argues that BMP4 cooperates with other factors to create a temporary zone around each tooth where no other tooth can grow. When the tooth gets closer to maturity, Msx1 overwhelms decreasing levels of inhibitory factors to start the BMP4-driven development of the next tooth. Since the jaw is growing at the same time teeth are forming, it follows that each tooth must also receive signals that enough jaw has grown in for the next tooth to start forming atop it.

The implications of the current results may go beyond tooth development, researchers said. Thanks to the work of Jiang and others, some of the biochemical pathways involved in cleft lip/cleft palate development are now recognized, and may include BMP4, Msx1 and OSR2 as well as several others. In humans, Msx1 mutations have been linked with cleft lip/palate and with the failure to develop one or more teeth. In the next phase of the team's work, researchers will look at what other factors may be regulated by Msx1 and Osr2 to begin pinpointing the genetic network that controls teeth patterning and palate development. Their goal is to manipulate stem cells to treat malformations and to develop prevention strategies for cleft palate (e.g. the inclusion of folic acid in prenatal vitamins prevents neural tube defects in some cases). Cleft lip/palate occurs in one in 700 live births.

Along with Jiang, the work was led by Zunyi Zhang and Yu Lan within the Center for Oral Biology and Department of Biomedical Genetics at the Medical Center. Yang Chai collaborated on the effort from the Center for Craniofacial Molecular Biology at University of Southern California School of Dentistry in Los Angeles. The work was sponsored by the National Institute of Dental and Craniofacial Research, part of the National Institutes of Health.

"Beyond medical applications, our results suggest that diversity in the number of tooth rows across species may be due to evolutionary changes in the control of the BMP4/Msx1 pathway," Jiang said. "In mammals, Osr2 suppresses this pathway to restrict teeth within a single row."

Source: University of Rochester Medical Center

Psychologists shed light on origins of morality

In everyday language, people sometimes say that immoral behaviours "leave a bad taste in your mouth". But this may be more than a metaphor according to new scientific evidence from the University of Toronto that shows a link between moral disgust and more primitive forms of disgust related to poison and disease.

"Morality is often pointed to as the pinnacle of human evolution and development," says lead author Hanah Chapman, a graduate student in the Department of Psychology. "However, disgust is an ancient and rather primitive emotion which played a key evolutionary role in survival. Our research shows the involvement of disgust in morality, suggesting that moral judgment may depend as much on simple emotional processes as on complex thought." The research is being published in Science on February 27, 2009.

In the study, the scientists examined facial movements when participants tasted unpleasant liquids and looked at photographs of disgusting objects such as dirty toilets or injuries. They compared these to their facial movements when they were subjected to unfair treatment in a laboratory game. The U of T team found that people make similar facial movements in response to both primitive forms of disgust and moral disgust.

The research employed electromyography, a technique that uses small electrodes placed on the face to detect the electrical activation that occurs when facial muscles contract. In particular, the researchers focused on movement of the levator labii muscle, which acts to raise the upper lip and wrinkle the nose, movements that are thought to be characteristic of the facial expression of disgust.

"We found that people show activation of this muscle region in all three situations - when tasting something bad, looking at something disgusting and experiencing unfairness," says Chapman.

"These results shed new light on the origins of morality, suggesting that not only do complex thoughts guide our moral compass, but also more primitive instincts related to avoiding potential toxins," says Adam Anderson, principal investigator on the project and the Canada Research Chair in Affective Neuroscience. "Surprisingly, our sophisticated moral sense of what is right and wrong may develop from a newborn's innate preference for what tastes good and bad, what is potentially nutritious versus poisonous."

Source: University of Toronto

Is HIV testing during labor feasible?

Cameroon is a sub-Saharan African country with high HIV rates, yet many pregnant women do not know their HIV status. Research published in the open access journal BMC Pregnancy and Childbirth has shown that HIV testing during labour is a suitable way of improving detection rates and may help mothers and their infants receive appropriate antiretroviral treatment.

Eugene Kongnyuy of the Liverpool School of Tropical Medicine and his collaborators from the University of Yaounde I, Cameroon, investigated the acceptability of rapid HIV testing among 2413 women of unknown HIV status at four hospitals in the capital city, Yaounde. They found that 88.3% of the women were willing to accept HIV testing during labour. Furthermore, their study revealed a higher rate of HIV infection among women screened during labour (10.1%) than was previously estimated in a national health survey (6.8%) which, according to the authors, highlights the importance of HIV testing during labour.

About 3.2 million infants and young children worldwide are infected with HIV and in most cases the infection is a consequence of mother-to-child transmission (MTCT). Rapid HIV testing during labour or delivery represents the last opportunity for treatment before delivery to reduce MTCT. While this investigation has shown that HIV testing in the delivery room is feasible, it is nevertheless a challenging task, especially in resource-constrained settings. The authors recommend "an opt-out approach for HIV testing during labour in Cameroon (i.e. women are informed that HIV testing will be routine during labour if HIV status is unknown but each person may decline to be tested). Such an approach will decrease the proportion of women who give birth with unknown HIV status and increase the number of mother-infant pairs who receive appropriate treatment for preventing MTCT of HIV".

The team proposes that the cost-effectiveness of HIV counselling and testing during labour be evaluated before the approach is implemented nationwide.

More information: Acceptability of intrapartum HIV counselling and testing in Cameroon, Eugene Kongnyuy, Enow Mbu, Francois Mbopi-Keou, Nelson Fomulu, Philip Nana, Pierre Tebeu, Rebecca Tonye and Robert Leke, BMC Pregnancy and Childbirth (in press), http://www.biomedcentral.com/bmcpregnancychildbirth/

Source: BioMed Central

Do doodle: Research shows doodling can help memory recall

Doodling while listening can actually help with remembering details, rather than being a sign that the mind is wandering, as is commonly assumed. According to a study published today in the journal Applied Cognitive Psychology, subjects given a doodling task while listening to a dull phone message had 29% better recall than their non-doodling counterparts.

Forty members of the research panel of the Medical Research Council's Cognition and Brain Sciences Unit in Cambridge were asked to listen to a two-and-a-half-minute tape giving several names of people and places, and were told to write down only the names of people going to a party. Twenty of the participants were asked to shade in shapes on a piece of paper at the same time, without paying attention to neatness. Participants were not simply asked to doodle naturally, so that they would not become self-conscious. None of the participants were told it was a memory test.

After the tape had finished, all participants were asked to recall the eight names of the party-goers they had written down, as well as eight additional place names that were included as incidental information. The doodlers recalled on average 7.5 names of people and places, compared with only 5.8 for the non-doodlers.
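The 29% figure follows directly from those two averages; a one-line Python check using only the numbers quoted above:

```python
# Sanity check of the reported ~29% improvement, using the averages quoted above.
doodlers, non_doodlers = 7.5, 5.8                    # mean items recalled
improvement = (doodlers - non_doodlers) / non_doodlers
print(f"relative improvement: {improvement:.0%}")    # -> 29%
```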

"If someone is doing a boring task, like listening to a dull telephone conversation, they may start to daydream," said study researcher Professor Jackie Andrade, Ph.D., of the School of Psychology, University of Plymouth. "Daydreaming distracts them from the task, resulting in poorer performance. A simple task, like doodling, may be sufficient to stop daydreaming without affecting performance on the main task."

"In psychology, tests of memory or attention will often use a second task to selectively block a particular mental process. If that process is important for the main cognitive task then performance will be impaired. My research shows that beneficial effects of secondary tasks, such as doodling, on concentration may offset the effects of selective blockade," added Andrade. "This study suggests that in everyday life doodling may be something we do because it helps to keep us on track with a boring task, rather than being an unnecessary distraction that we should try to resist doing."

Drug improves mobility for some MS patients

The experimental drug fampridine (4-aminopyridine) improves walking ability in some individuals with multiple sclerosis (MS). That is the conclusion of a multi-center Phase 3 clinical trial, the results of which were published today in the journal The Lancet.

"This study indicates that fampridine could represent an important new way to treat multiple sclerosis and perhaps become the first drug to improve certain symptoms of the disease," said neurologist Andrew Goodman, M.D., chief of the Multiple Sclerosis Center at the University of Rochester Medical Center (URMC) and lead author of the study. "The data suggest that, for a sub-set of MS patients, nervous system function is partially restored while taking the drug."

The study evaluated a sustained-release formulation of the drug, Fampridine-SR, which is being developed by Acorda Therapeutics, Inc. The company, which funded the study, submitted a new drug application to the U.S. Food and Drug Administration earlier this month. Goodman has been a consultant and advisor to Acorda for its fampridine studies in MS.

Multiple sclerosis is a disease of the central nervous system and is the most common cause of neurological disability in young adults. Worldwide, it is estimated that more than a million people are affected by MS, which is typically characterized early in its course by recurrent relapses followed by periods of remission. The symptoms of the disease vary from person to person, but commonly consist of muscle weakness, gait difficulties, numbness or tingling in the arms and legs, difficulty with coordination and balance, blurred vision, and slurred speech. Over time, the effects of the disease tend to become more permanent and debilitating.

While the precise cause is unknown, it is understood that the immune system in individuals with MS attacks myelin, a fatty tissue in the central nervous system that wraps the fibers - or axons - that connect nerve cells. Similar to the insulation on an electrical wire, myelin allows for the efficient conduction of nerve impulses. When myelin is lost or damaged in the disease, signals between nerve cells are delayed, disrupted, or even blocked.

It is believed that fampridine improves the transmission of signals in the central nervous system of some MS patients by blocking potassium ion channels. These channels serve as gates on the surface of cells and regulate the normal electrical activity. In laboratory experiments involving nerve fibers with myelin that was damaged in a manner that mimics MS, scientists found that blocking these channels results in a recovery of signal conduction.

In the Phase 3 study published today, the effects of Fampridine-SR were tested in 301 adult MS patients at 33 locations in the U.S. and Canada over a 14-week period. Three quarters of the participants took the drug and the rest were given a placebo.

Typically, MS drugs have been evaluated based on their ability to prevent relapses. Because the goal of this study was to assess changes in function, the researchers instead sought to evaluate participants' mobility and muscle strength - as opposed to the disease process. In prior studies, Goodman and his URMC colleague, the late Steven Schwid, M.D., had validated new methods to measure changes in gait, or walking speed over distance. Employing these methods in The Lancet study, they found that 34.8% of those receiving the drug experienced an improvement (an average increase of about 25%) in the speed at which they could walk 25 feet, compared with only 8.3% of the placebo group.
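To illustrate how such a gait measure works (the walk times below are hypothetical, not trial data), walking speed on a timed 25-foot walk and the percent change between two test sessions can be computed like this:

```python
# Minimal sketch: percent change in walking speed from a timed 25-foot walk.
# The times are hypothetical, for illustration only; they are not trial results.

DISTANCE_FT = 25.0

def walk_speed(seconds: float) -> float:
    """Walking speed in feet per second over the 25-foot course."""
    return DISTANCE_FT / seconds

baseline_time, on_drug_time = 10.0, 8.0   # illustrative times in seconds
change = (walk_speed(on_drug_time) - walk_speed(baseline_time)) / walk_speed(baseline_time)
print(f"speed change: {change:.0%}")       # -> 25% faster
```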

"During the course of the disease, many MS patients experience a decline in mobility and this disability has a major impact in terms of quality of life," said Goodman. "As a clinician, I can say that improvement in walking speed could have important psychological value; it may give individuals the potential to regain some of the independence that they may have lost in their daily lives."

Several other drugs have been approved to treat MS. These treatments either counter the nervous system inflammation that is a characteristic of the disease or suppress the immune system generally. While these drugs can be effective at preventing new relapses and slowing the progression of the disease, there are no treatments currently available that improve impaired function, such as mobility problems, for people with MS. Participants in the trial were allowed to continue to take most other medications for MS and researchers did not observe any negative interactions. However, a total of eleven patients (4.8%) in the fampridine-treated group discontinued the study due to side effects. Only two of these were considered by the investigators to be possibly related to treatment.

Source: University of Rochester Medical Center