Sunday, March 15, 2009

Blocking protein may help ease painful nerve condition

Scientists have identified the first gene that pulls the plug on ailing nerve cell branches from within the nerve cell, possibly helping to trigger the painful condition known as neuropathy.

The condition is a side effect of some forms of chemotherapy and can also afflict patients with cancer, diabetes, viral infections, neurodegenerative disorders and other ailments.

Researchers at Washington University School of Medicine in St. Louis showed that blocking the dual leucine zipper kinase (DLK) gene inhibits degeneration of ailing cell branches, possibly preventing neuropathy.

"Neuropathy can become so extraordinarily painful that some patients stop taking their chemotherapy, regardless of the consequences in their fight against cancer," says co-senior author Aaron DiAntonio, M.D., Ph.D., associate professor of developmental biology. "So we're very excited about the possibilities this gene may offer for reducing that pain."

The findings are published online on March 15 in Nature Neuroscience.

Scientists have known since 1850 that nerve cells have ways to prune branches (also known as axons) that are injured. Although axon pruning is also a normal part of early human development, inappropriate loss of axons in the adult nervous system causes pain that has been compared to burning or freezing and has come to be known as neuropathy.

DiAntonio's lab previously revealed that the fruit fly's version of DLK helps establish synapses, junctures where two nerve cells communicate. But they found the gene doesn't do the same thing in mice.

Curious about DLK's role in mammals, Bradley Miller, an M.D./Ph.D. student in DiAntonio's lab, consulted with co-senior author Jeffrey Milbrandt, M.D., Ph.D., the David Clayson Professor of Neurology, who studies the roles of various proteins in the nervous system. With support from the University's Hope Center for Neurological Disorders, they showed that in mice with a mutated DLK gene, the long axons of the sciatic nerve resisted degeneration after the nerve was surgically cut.

In follow-up tests, Miller and Craig Press, an M.D./Ph.D. student in Milbrandt's lab, took nerve cells in culture and treated their axons with the chemotherapy drug vincristine. Normal axons degenerated rapidly after exposure to the drug, but axons where DLK's activity had been blocked were protected from degeneration.

"The pain of neuropathy is often a key factor that limits the dose in cancer chemotherapy," DiAntonio notes. "We know when patients are going to start their treatment, so one day it might be possible to start patients on a DLK-blocking drug before their chemotherapy and spare them considerable pain."

DLK appears to act like a contractor that calls in wrecking crews, DiAntonio notes. It helps make the decision to eradicate an axon, but the actual demolition is left to other processes called up by DLK.

"We want to more fully understand the chain of molecular reactions that carry out DLK's decision, because that might reveal a better opportunity to block the effect with a drug," says DiAntonio.

DiAntonio and Milbrandt also plan to test if blocking DLK stops neurodegeneration in other forms of injury and stress, including the harm inflicted on the optic nerve by glaucoma and central nervous system phenomena like stroke and Parkinson's disease.

More information: Miller BR, Press C, Daniels RW, Sasaki Y, Milbrandt J, DiAntonio A. A DLK-dependent axon self-destruction program promotes Wallerian degeneration. Nature Neuroscience, online March 15.

Source: Washington University School of Medicine

A natural approach for HIV vaccine

For 25 years, researchers have tried and failed to develop an HIV vaccine, primarily by focusing on a small number of engineered "super antibodies" to fend off the virus before it takes hold. So far, these magic bullet antibodies have proved impossible to produce in people. Now, in research to be published March 15 online by Nature, scientists at The Rockefeller University have laid out a new approach. They have identified a diverse team of antibodies in "slow-progressing" HIV patients whose coordinated pack hunting knocks down the virus just as well as their super-antibody cousins fighting solo.

By showcasing the dynamic, natural antibody response in these exceptional patients, the research, led by Michel C. Nussenzweig, Sherman Fairchild Professor and head of the Laboratory of Molecular Immunology, suggests that an effective HIV vaccine may come from a shotgun approach using a wide range of natural antibodies rather than a single engineered one.

"We wanted to try something different, so we tried to reproduce what's in the patient. And what's in the patient is many different antibodies that individually have limited neutralizing abilities but together are quite powerful," says Nussenzweig, who also is a Howard Hughes Medical Institute investigator. "This should make people think about what an effective vaccine should look like."

HIV strains mutate rapidly, making them especially wily adversaries of the immune system. But one element is shared almost universally among the diverging strains - a protein on the envelope of the virus called gp140 that HIV needs to infect immune cells. Prior research has shown that four engineered antibodies that block the activity of that protein prevent the virus from infecting immune cells in culture, but all attempts to coax the human body into producing those four have failed.

So Johannes Scheid, a visiting student in Nussenzweig's lab who is now a doctoral candidate, turned his attention to the antibodies produced by six people infected with HIV whose immune systems put up an exceptionally strong fight. The patients represent the roughly 10 to 20 percent of HIV patients who are able to control the virus and are very slow to progress to disease. Their immune systems' memory B cells produce high levels of antivirus antibodies, but until now, researchers have known little about the antibodies or how effective they are.

With help from Rockefeller's Center for Clinical and Translational Science and Rockefeller scientists David D. Ho and Jeffrey V. Ravetch, Scheid and colleagues isolated 433 antibodies from these individuals' blood serum that specifically targeted the envelope protein - the chink in HIV's protean armor. He cloned the antibodies and produced them in bulk, mapped which part of the envelope protein each targeted, and gauged how effective each was in neutralizing the virus. In the process, he identified a new structure within the envelope protein - called the gp120 core - that had never been recognized as a potential target for antibodies. "It's the first time that anyone has defined what is really happening in the B cell response in these patients," says Scheid.

Scheid's work shows that it's common for these antibodies to have neutralizing activity, says Nussenzweig. But each antibody alone has limited ability to fight the virus. "Individually, they're not as strong as the Famous Four," says Nussenzweig, referring to the high-profile super antibodies on which several vaccine attempts have been based. But in high concentrations, a combination of the sets of antibodies cloned from the individual patients seemed to act as teams to knock down the virus in cell culture as well as any single antibody studied to date. These natural antibodies were also able to recognize a range of HIV strains, indicating that their diversity may be an advantage over a single super antibody that focuses on only one part of the virus, which can mutate. The findings suggest that research into vaccines that mimic this natural antibody response could pay off.

Tuesday, March 10, 2009

MicroRNA-based Diagnostic Identifies Squamous Lung Cancer with 96% Sensitivity

A new study shows for the first time that a microRNA-based diagnostic test can objectively identify squamous lung cancer with 96% sensitivity, according to Harvey Pass, M.D., of the NYU Cancer Institute at NYU Langone Medical Center, one of the authors of the study, published online ahead of print in the Journal of Clinical Oncology.

In a paper titled, “Diagnostic Assay Based on hsa-miR-205 Expression Distinguishes Squamous From Non-Squamous Non-Small-Cell Lung Carcinoma,” researchers looked at 252 patients with non-small-cell lung cancer and sent their tumor samples to a lab, where a single microRNA biomarker identified squamous lung carcinomas with 96% sensitivity and 90% specificity. This is important because studies have shown that as many as 30% of squamous lung cancers are misclassified. If the type of lung cancer is not identified correctly, patients may suffer avoidable side effects from ill-suited treatments and medications. For example, squamous lung cancer carries an increased risk of severe or fatal bleeding with certain targeted biological therapies, including bevacizumab (Avastin) and other drugs in development. Other approved therapies, such as pemetrexed (Alimta), are indicated for non-squamous lung cancer only.
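Sensitivity and specificity here carry their standard diagnostic meanings: the fraction of true squamous tumors the assay flags, and the fraction of non-squamous tumors it correctly clears. As a minimal sketch with made-up counts (not the study's data), both figures fall out of a simple confusion matrix:

```python
# Minimal sketch: how diagnostic sensitivity and specificity are computed
# from a confusion matrix. Counts below are hypothetical, not the study's.

def sensitivity(true_pos: int, false_neg: int) -> float:
    """Fraction of true squamous cases the test correctly flags."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg: int, false_pos: int) -> float:
    """Fraction of non-squamous cases the test correctly clears."""
    return true_neg / (true_neg + false_pos)

# Hypothetical cohort: 100 squamous and 100 non-squamous tumors.
tp, fn = 96, 4    # squamous tumors classified correctly / missed
tn, fp = 90, 10   # non-squamous tumors classified correctly / mislabeled

print(f"sensitivity = {sensitivity(tp, fn):.0%}")  # sensitivity = 96%
print(f"specificity = {specificity(tn, fp):.0%}")  # specificity = 90%
```

Note that a test can score well on one measure and poorly on the other, which is why the paper reports both.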

The study, funded by Rosetta Genomics, was conducted at the NYU Cancer Institute at NYU Langone Medical Center in collaboration with researchers from Columbia University and Sheba Medical Center.

“The results of this study are very encouraging,” says Harvey Pass, MD, professor of cardiothoracic surgery and director of thoracic surgery and oncology at the NYU Cancer Institute at NYU Langone Medical Center. “The study has demonstrated that a microRNA biomarker successfully identifies squamous lung cancer with high reproducibility, sensitivity and specificity. The study certainly demonstrates the power of microRNAs in correctly classifying lung cancer and hopefully can immediately translate into more accurate choices of targeted therapies as well as cytotoxics for the disease.”

Dr. Pass is vice chairman of the medical advisory board for Rosetta Genomics (Nasdaq: ROSG), the company that makes a test based on the same microRNA biomarker evaluated in the study. The test offers similar accuracy (97% sensitivity) and is now commercially available through Rosetta Genomics' CLIA-certified lab in Philadelphia.

Provided by New York University Langone Medical Center

City Kids May Breathe Easier in the Country

Children with asthma have an easier time breathing if they spend even a few days in the country, safeguarded from urban air pollution, a study led by Giovanni Piedimonte, M.D., professor and chairman of the Department of Pediatrics at the West Virginia University School of Medicine, finds.

The study, published in the March issue of the journal Pediatrics, shows for the first time that limiting allergic children's exposure to outdoor air pollutants can improve lung function while reducing inflammation of the airways.

“This finding is significant because inflammation creates health risks for children with chronic respiratory problems,” Dr. Piedimonte explains. “Now we know that simply providing a cleaner environment in terms of air quality helps provide relief fairly rapidly for children with asthma.”

He adds, “This study suggests that possibly we could manage asthmatic children with much less medication if the air they breathed was cleaner.”

Researchers from the United States and Italy studied 37 Italian children with allergies and mild but persistent asthma, transporting them to a relatively pristine countryside setting - with lower levels of pollution - for a week.

Children recruited for the study were patients ages 7 to 14 at an asthma clinic in Pescara, Italy. For the rural part of the study, the children stayed in a hotel during a school camp in Ovindoli, Italy. They remained medicine-free and treatment-free for the duration of the study so the researchers could make correlations between the environmental air quality and the biomarkers that signal inflammation.

Air pollutant levels, pollen counts and meteorological conditions were monitored at both sites.

“A whole host of pollutants in the air of cities in economically developed countries has contributed to a worldwide rise in asthma rates among children,” says Piedimonte, who is also physician in chief of WVU Children's Hospital and director of the WVU Pediatric Research Institute. “Even knowing that, I was surprised to see how much better the children’s lung functions were after just a few days of cleaner air.”

Some of the problem pollutants in the air of industrialized countries are ozone, carbon monoxide and benzene - all of which can trigger emergency room visits and hospitalizations of asthmatic children. “In addition, we have new data suggesting that ultrafine particles may be especially toxic to the airways of children with asthma,” Piedimonte says.

The Health Statistics Center of the West Virginia Department of Health and Human Resources reports that 31,000 children in West Virginia have asthma. Until 2003, hospitalization rates for asthma were higher in the United States than in West Virginia. Now the opposite is true.

“West Virginia is experiencing an epidemic of asthma worse than in the rest of the United States,” Piedimonte says. “Among the contributing risk factors are high levels of air pollution plus low socioeconomic status and high rates of obesity and smoking.”

The United Health Foundation’s recent health rankings gave West Virginia a rank of 39 among the states for overall health, and it named high levels of air pollution as one of the state’s top challenges. “Our study shows how vital air quality is in terms of triggering asthma and allergies in children," Piedimonte says. "It’s something to evaluate carefully before considering government cutbacks in regulatory agencies that affect the air we breathe and set limits on industrial pollution.”

Provided by West Virginia University Health Sciences Center

Wednesday, March 4, 2009

Influence of 'obesity gene' can be offset by healthy diet

Children who carry a gene strongly associated with obesity could offset its effect by eating a low energy density diet, according to new research from UCL (University College London) and the University of Bristol published today in PLoS ONE.

The study, based on data from a sample of 2275 children from the Bristol-based ALSPAC study (Children of the 90s) provides evidence that people might be able to avoid becoming obese if they adopt a healthier diet with a low energy density - even those who carry the FTO gene, identified as being a high risk gene for obesity.

Dietary energy density (DED) refers to the amount of energy consumed per unit weight of food, or number of calories per bite. A low dietary energy density can be achieved by eating lots of water-rich foods like fruits and vegetables and limiting foods high in fat and sugar like chocolate and biscuits.
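As a back-of-the-envelope illustration (the nutrient figures below are rough approximations, not values from the study), energy density is simply calories divided by food weight:

```python
# Dietary energy density (DED): calories per gram of food eaten.
# Nutrient figures below are rough textbook values, for illustration only.

def energy_density(kcal: float, grams: float) -> float:
    """Energy density in kcal per gram."""
    return kcal / grams

apple = energy_density(kcal=52, grams=100)       # water-rich: ~0.5 kcal/g
chocolate = energy_density(kcal=530, grams=100)  # fat/sugar-rich: ~5.3 kcal/g

# The same number of calories buys roughly ten times more food by weight
# as apple than as chocolate: more bulk, fewer calories per bite.
print(f"apple: {apple:.2f} kcal/g, chocolate: {chocolate:.2f} kcal/g")
```

This is why swapping calorie-dense snacks for water-rich foods lowers a diet's overall energy density without necessarily shrinking portion sizes.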

The researchers looked at how DED affected the build up of fat in the body over a period of three years in children aged between 10 and 13 years old. They found that children with a more energy dense diet (more calories per bite) tended to have more fat mass three years later and also confirmed that those carrying the high risk gene had greater fat mass overall.

When the researchers looked at whether children with the FTO gene had a stronger reaction to an energy dense diet than children with a lower genetic risk they found that they did not. These results indicate that if a child with a high genetic risk eats a diet with fewer calories per bite, they may be able to offset the effect of the gene on weight gain and so stay a healthy weight.

Lead author Dr Laura Johnson, UCL Epidemiology and Public Health, said: "This is an important finding because it provides evidence that it's easier to eat too much energy and gain weight when your diet is packed tight with calories, so adopting a diet with more bulk and less energy per bite could help people avoid becoming obese regardless of their genetic risk. Obesity is not inevitable if your genes give you a higher risk because if you change the types of foods you eat this will help curb excessive weight gain."

"This shows that although our genetic make-up does have an influence on our health, it's certainly not the only defining factor. Those with high risk genes can, in some cases, resist their genetic lot if they alter their lifestyle in the right way - in this case, their diet."

FTO is the first common obesity gene to be identified in Caucasian populations. Previous studies have shown that adults with two copies of the high-risk FTO variant are on average 3kg heavier, and individuals with a single copy are on average 1.5kg heavier, than those without it.

More information: The paper "Dietary energy density affects fat mass in early adolescence and is not modified by FTO variants" is published online ahead of print in PLoS ONE.

Source: University College London

Power and the illusion of control: Why some make the impossible possible and others fall short

Power holders often seem misguided in their actions. Leaders and commanders of warring nations regularly underestimate the costs in time, money, and human lives required for bringing home a victory. CEOs of Fortune 500 companies routinely overestimate their capacity to turn mergers and acquisitions into huge profits, leading to financial losses for themselves, their companies, and their stockholders. Even ordinary people seem to take on an air of invincibility after being promoted to a more powerful position. The consequences of these tendencies, especially when present in the world's most powerful leaders, can be devastating.

In a new study, Nathanael Fast and Deborah Gruenfeld at Stanford Graduate School of Business, Niro Sivanathan at the London Business School and Adam Galinsky at the Kellogg School of Management at Northwestern University, show that power can literally "go to one's head," causing individuals to think they have more personal control over outcomes than they, in fact, do.

"We conducted four experiments exploring the relationship between power and illusory control - the belief that one has the ability to influence outcomes that are largely determined by chance," said Galinsky. "In each experiment, whether power was primed by recalling an experience of holding power or manipulated by randomly assigning participants to manager-subordinate roles, it led to perceived control over outcomes that were beyond the reach of the individual. Furthermore, the notion of being able to control a 'chance' result led to unrealistic optimism and inflated self-esteem."

For example, in one experiment, power holders were presented with a pair of dice, offered a reward for predicting the outcome of a roll, and then asked whether they would like to roll the dice themselves or have someone else roll for them. Every participant in the high-power group chose to roll the dice themselves, compared with less than 70% of low-power and neutral participants - supporting the notion that simply experiencing power can lead an individual to grossly overestimate their abilities, in this case, the ability to influence the outcome of the roll by personally rolling the dice.
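The dice task works because who throws the dice has no bearing on the odds. A quick simulation (purely illustrative, not part of the study) shows that a predicted roll succeeds at the chance rate of 1/6 regardless of the roller, which is what makes the felt control illusory:

```python
# Illustrative simulation of the dice task: a predicted roll succeeds at
# the chance rate of 1/6 no matter who rolls, so the extra control that
# high-power participants acted on is illusory.
import random

random.seed(42)  # fixed seed for reproducible runs

def predict_and_roll(n_trials: int) -> float:
    """Return the fraction of trials where the prediction matched the roll."""
    hits = 0
    for _ in range(n_trials):
        prediction = random.randint(1, 6)  # the participant's guess
        outcome = random.randint(1, 6)     # the roll, whoever throws it
        hits += prediction == outcome
    return hits / n_trials

# Converges on 1/6 (about 0.167) whether "you" or "someone else" rolls.
print(predict_and_roll(100_000))
```

Rolling personally changes nothing in this model; only the roller's belief about their influence differs between conditions.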

These results, reported in Psychological Science, a journal of the Association for Psychological Science, have implications for how power, once attained, is maintained or lost. The authors note that positive illusions can be adaptive, helping power holders make the seemingly impossible possible. But the relationship between power and illusory control might also contribute directly to losses in power, by causing leaders to make poor choices. They conclude that "the illusion of personal control might be one of the ways in which power often leads to its own demise."

Source: Association for Psychological Science

Musicians' Brains 'Fine-Tuned' to Identify Emotion

Looking for a mate who in everyday conversation can pick up even your most subtle emotional cues? Find a musician, Northwestern University researchers suggest.

In a study in the latest issue of the European Journal of Neuroscience, an interdisciplinary Northwestern research team for the first time provides biological evidence that musical training enhances an individual's ability to recognize emotion in sound.

"Quickly and accurately identifying emotion in sound is a skill that translates across all arenas, whether in the predator-infested jungle or in the classroom, boardroom or bedroom," says Dana Strait, primary author of the study.

A doctoral student in the Henry and Leigh Bienen School of Music, Strait does research in the Auditory Neuroscience Laboratory directed by neuroscientist Nina Kraus. The laboratory has done pioneering work on the neurobiology underlying speech and music perception and learning-associated brain plasticity.

Kraus, Northwestern's Hugh Knowles Professor of Communication Sciences and Neurobiology; Richard Ashley, associate professor of music cognition; and Auditory Neuroscience Laboratory manager Erika Skoe co-authored the study titled "Musical Experience and Neural Efficiency: Effects of Training on Subcortical Processing of Vocal Expressions in Emotion."

The study, funded by the National Science Foundation, found that the more years of musical experience musicians had, and the earlier in life they began their music studies, the better their nervous systems were at processing emotion in sound.

"Scientists already know that emotion is carried less by the linguistic meaning of a word than by the way in which the sound is communicated," says Strait. A child's cry of "Mommy!" -- or even his or her wordless utterance -- can mean very different things depending on the acoustic properties of the sound.

The Northwestern researchers measured brainstem processing of three acoustic correlates (pitch, timing and timbre) in musicians and non-musicians in response to a scientifically validated emotional sound. The musicians, who learn to use all their senses to practice and perform a musical piece, were found to have "finely tuned" auditory systems.

This fine-tuning appears to lend broad perceptual advantages to musicians. "Previous research has indicated that musicians demonstrate greater sensitivity to the nuances of emotion in speech," says Ashley, who explores the link between emotion perception and musical experience. One of his recent studies indicated that musicians might even be able to sense emotion in sounds after hearing them for only 50 milliseconds.

The 30 right-handed men and women with and without music training in the European Journal of Neuroscience study were between the ages of 19 and 35. Subjects with music training were grouped using two criteria -- years of musical experience and onset age of training (before or after age 7).

Study participants were asked to watch a subtitled nature film to keep them entertained while they were hearing, through earphones, a 250-millisecond fragment of a distressed baby's cry. Sensitivity to the sound, and in particular to the more complicated part of the sound that contributes most to its emotional content, was measured through scalp electrodes.

The results were not exactly what the researchers expected. They found that musicians' brainstems lock onto the complex part of the sound known to carry more emotional elements but de-emphasize the simpler (less emotion conveying) part of the sound. This was not the case in non-musicians.

In essence, musicians more economically and more quickly focus their neural resources on the important -- in this case emotional -- aspect of sound. "That their brains respond more quickly and accurately than the brains of non-musicians is something we'd expect to translate into the perception of emotion in other settings," Strait says.

The authors of the study also note that the acoustic elements that musicians process more efficiently are the very same ones that children with language disorders, such as dyslexia and autism, have problems encoding. "It would not be a leap to suggest that children with language processing disorders may benefit from musical experience," says Kraus.

Strait, a pianist and oboe player who formerly worked as a therapist with autistic children, goes a step further. Noting that impaired emotional perception is a hallmark of autism and Asperger's syndrome, she suggests that musical training might promote emotion processing in these populations.

Source: Northwestern University