
New Cancer Treatment? Universal Donor Immune Cells


ScienceDaily (July 26, 2011) One of the latest attempts to boost the body's defenses against cancer is called adoptive cell transfer, in which patients receive a therapeutic injection of their own immune cells. This therapy, currently tested in early clinical trials for melanoma and neuroblastoma, has its limitations: Removing immune cells from a patient and growing them outside the body for future re-injection is extremely expensive and not always technically feasible.
Weizmann Institute scientists have now tested in mice a new form of adoptive cell transfer, which overcomes these limitations while enhancing the tumor-fighting ability of the transferred cells. The research, reported recently in Blood, was performed in the lab of Prof. Zelig Eshhar of the Institute's Immunology Department, by graduate student Assaf Marcus and lab technician Tova Waks.

The new approach should be more readily applicable than existing adoptive cell transfer treatments because it relies on a donor pool of immune T cells that can be prepared in advance, rather than on the patient's own cells. Moreover, using a method pioneered by Prof. Eshhar more than two decades ago, these T cells are outfitted with receptors that specifically seek out and identify the tumor, thereby promoting its destruction.

In the study, the scientists first suppressed the immune system of mice with a relatively mild dose of radiation. They then administered a controlled dose of the modified donor T cells. The mild suppression temporarily prevented the donor T cells from being rejected by the recipient, but it didn't prevent the cells themselves from attacking the recipient's body, particularly the tumor. This approach was precisely what rendered the therapy so effective: The delay in the rejection of the donor T cells gave these cells sufficient opportunity to destroy the tumor.

If this method works in humans as well as it did in mice, it could lead to an affordable cell transfer therapy for a wide variety of cancers. Such therapy would rely on an off-the-shelf pool of donor T cells equipped with receptors for zeroing in on different types of cancerous cells.

Prof. Zelig Eshhar's research is supported by the M.D. Moross Institute for Cancer Research; the Kirk Center for Childhood Cancer and Immunological Disorders; the Leona M. and Harry B. Helmsley Charitable Trust; and the estate of Raymond Lapon.

In Pregnancy, Diabetes-Obesity Combo a Major Red Flag


ScienceDaily (July 25, 2011) Type 2 diabetes and obesity in pregnancy are a daunting duo, according to new research published this month in The Journal of Maternal-Fetal and Neonatal Medicine. The study shows that both conditions independently contribute to higher risks, opening the door to a wide range of pregnancy, delivery and newborn complications.

Study authors say the findings are important because obesity and type 2 diabetes are skyrocketing in women of childbearing age. A study in The Journal of the American Medical Association reports that between 2007 and 2008 the prevalence of obesity among adult women in the United States was more than 35 percent. A report from the Centers for Disease Control and Prevention states that approximately 11 percent of women above the age of 20 had diabetes in 2010. Loralei Thornburg, M.D., senior study author and a high-risk pregnancy expert at the University of Rochester Medical Center, emphasizes that the research is needed now more than ever. "We've never seen the degree of obesity and type 2 diabetes in women that we are seeing right now, because for a very long time diabetes was a disease of an older population, so we rarely dealt with it in prenatal care. We hope this new knowledge will help physicians better understand and care for this rapidly expanding group of high-risk women."

While numerous studies have established that obesity, in the absence of diabetes, is associated with problems in pregnancy -- preterm birth, birth trauma, blood loss and a prolonged hospital stay, to name a few -- less is known about type 2 diabetes and what causes difficulties when the two conditions coexist. Researchers from Rochester wanted to determine if obesity alone accounts for the increased risks in this "dual-diagnosis" group, or if diabetes plays a role as well.

To determine the influence of obesity and type 2 diabetes when the conditions coexist in pregnancy, Thornburg and lead study author Kristin Knight, M.D., used clinical records and the hospital's birth certificate database to identify 213 pairs of women who delivered babies at the Medical Center between 2000 and 2008. Each pair included a diabetic and a non-diabetic patient with approximately the same pre-pregnancy body mass index (BMI). The majority of women in the study were overweight, obese or morbidly obese. "We matched the pairs pound for pound, because if obesity was the main problem, we'd see similar outcomes between women, whether they had diabetes or not. But if we saw different outcomes between pairs, we'd know the diabetes was impacting outcomes as well," said Thornburg. Using mathematical models and controlling for outside factors, such as age and tobacco use, researchers found that the patients with type 2 diabetes had overall worse pregnancy, delivery and newborn outcomes than their BMI-matched counterparts. Specifically, diabetic patients had higher rates of preeclampsia, cesarean delivery, shoulder dystocia, preterm delivery, large-for-gestational-age infants, fetal anomaly and admission to the neonatal intensive care unit.

"Women and their physicians need to be aware that each condition on its own increases risk in pregnancy, so when they coexist the situation is even more worrisome," said Knight, a maternal fetal medicine fellow at Rochester. "Pregnancy is a time of great change, and fortunately many women are very open to making modifications during this period in their life. Anything a woman can do to improve her condition, from controlling blood sugar and exercising, to eating nutritious foods and maintaining an optimal weight, will help her deliver a healthier baby."

Knight originally focused her research on the effects of type 1 and type 2 diabetes on pregnancy. In a previous study, she found that women with type 2 diabetes, most of whom were also obese, had poorer outcomes. Consequently, her research turned to obese, type 2 diabetics and their experiences in pregnancy. "If a woman enters pregnancy obese, but hasn't developed type 2 diabetes, she is in a better place than if she had both," concluded Thornburg.

In addition to Knight and Thornburg, Eva K. Pressman, M.D., and David N. Hackney, M.D., from the Medical Center, also participated in the research.
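The matched-pair comparison described above can be illustrated with a short sketch. The Python below is only a toy example with invented numbers and hypothetical names (Patient, match_pairs, discordant_counts); it shows the general idea of pairing diabetic and non-diabetic patients by pre-pregnancy BMI and then counting discordant outcomes within pairs. It is not the study's actual analysis, which drew on clinical records and used regression models controlling for factors such as age and tobacco use.

```python
# Hypothetical sketch of a BMI-matched pair comparison (not the study's code or data).
# Each diabetic patient is paired with a non-diabetic patient of approximately equal
# pre-pregnancy BMI; outcomes are then compared within pairs via discordant-pair counts.

from dataclasses import dataclass

@dataclass
class Patient:
    bmi: float          # pre-pregnancy body mass index
    diabetic: bool      # type 2 diabetes before pregnancy
    complication: bool  # e.g., preeclampsia or cesarean delivery

def match_pairs(diabetic, non_diabetic, tolerance=1.0):
    """Greedily pair each diabetic patient with the closest-BMI non-diabetic patient."""
    pool = sorted(non_diabetic, key=lambda p: p.bmi)
    pairs = []
    for d in sorted(diabetic, key=lambda p: p.bmi):
        if not pool:
            break
        best = min(pool, key=lambda p: abs(p.bmi - d.bmi))
        if abs(best.bmi - d.bmi) <= tolerance:
            pairs.append((d, best))
            pool.remove(best)
    return pairs

def discordant_counts(pairs):
    """Count pairs in which only one member had the complication."""
    only_diabetic = sum(1 for d, c in pairs if d.complication and not c.complication)
    only_control = sum(1 for d, c in pairs if c.complication and not d.complication)
    return only_diabetic, only_control

if __name__ == "__main__":
    # Toy data: if diabetes adds risk beyond obesity, discordant pairs should mostly
    # be "diabetic member had the complication, BMI-matched control did not."
    diabetics = [Patient(34.0, True, True), Patient(29.5, True, False), Patient(41.2, True, True)]
    controls = [Patient(33.8, False, False), Patient(29.7, False, False), Patient(40.9, False, True)]
    pairs = match_pairs(diabetics, controls)
    print(discordant_counts(pairs))  # prints (1, 0) for this toy data
```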

Corporal Punishment May Have Long-Term Negative Effects On Children's Intelligence


ScienceDaily (July 26, 2011) Children in a school that uses corporal punishment performed significantly worse in tasks involving "executive functioning" -- psychological processes such as planning, abstract thinking, and delaying gratification -- than those in a school relying on milder disciplinary measures such as time-outs, according to a new study involving two private schools in a West African country. The findings, published by the journal Social Development, suggest that a harshly punitive environment may have long-term detrimental effects on children's verbal intelligence and their executive-functioning ability. As a result, children exposed to a harshly punitive environment may be at risk for behavioral problems related to deficits in executive functioning, the study indicates.

The study -- by Prof. Victoria Talwar of McGill University, Prof. Stephanie M. Carlson of the University of Minnesota, and Prof. Kang Lee of the University of Toronto -- involved 63 children in kindergarten or first grade at two West African private schools. Their families lived in the same urban neighborhood. The parents were largely civil servants, professionals and merchants. In one school, discipline in the form of beating with a stick, slapping of the head, and pinching was administered publicly and routinely for offenses ranging from forgetting a pencil to being disruptive in class. In the other school, children were disciplined for similar offenses with the use of time-outs and verbal reprimands.

While overall performance on the executive-functioning tasks was similar in the younger children from both schools, the Grade 1 children in the non-punitive school scored significantly higher than those in the punitive school. These results are consistent with research findings that punitive discipline may make children immediately compliant -- but may reduce the likelihood that they will internalize rules and standards. That, in turn, may result in lower self-control as children get older.

"This study demonstrates that corporal punishment does not teach children how to behave or improve their learning," Prof. Talwar said. "In the short term, it may not have any negative effects; but if relied upon over time it does not support children's problem-solving skills, or their abilities to inhibit inappropriate behaviour or to learn."

Despite the age-old debate over the effects of corporal punishment, few studies have examined its effects on executive-functioning ability. This new study uses a quasi-experimental design to derive data from a naturally occurring situation in which children were exposed to two different disciplinary environments. The parents of children in both schools endorsed physical punishment equally, suggesting that the school environment can account for the differences found.

Many questions remain unanswered. "We are now examining whether being in a punitive environment day in and day out will have other negative impacts on children such as lying or other covert antisocial behaviors. Also, we are pursuing the long term consequences of experiencing corporal punishment. For example, what would children's cognitive and social development be 5 or 10 years down the road?" said Prof. Kang Lee.

The findings are relevant to current controversy over the practice. "In the U.S., 19 states still allow corporal punishment in schools, although more of them are now asking for parent permission to use it. With this new evidence that the practice might actually undermine children's cognitive skills needed for self-control and learning, parents and policy makers can be better informed," said Prof. Stephanie M. Carlson.

Genes Play Greater Role in Heart Attacks Than Stroke, Researchers Say
ScienceDaily (July 27, 2011) People are significantly more likely to inherit a predisposition to heart attack than to stroke, according to research reported in Circulation: Cardiovascular Genetics, an American Heart Association journal. The study results have implications for better understanding the genetics of stroke and suggest the need for separate risk assessment models for the two conditions. "We found that the association between one of your parents having a heart attack and you having a heart attack was a lot stronger than the association between your parent having a stroke and you having a stroke," said senior author Peter M. Rothwell, M.D., Ph.D., professor of clinical neurology at Oxford University in England. "That suggests the susceptibility to stroke is less strongly inherited than the susceptibility to heart attack." A second analysis, which included patients' siblings as well as parents, yielded the same result: Family history proved a stronger risk predictor for heart attack than for stroke.

Rothwell and his colleagues conducted the study to clarify and confirm evidence suggesting a large difference in genetic predisposition between heart attacks and strokes. "We had found previously that much of the heritability of stroke is related to the genetics of high blood pressure, which doesn't seem to be the case for heart attack," Rothwell said. Hypertension appears to be more closely related to stroke than to heart attack, which is why a family history of hypertension is related to a higher risk of stroke. In the report just published, all patients were enrolled in the ongoing Oxford Vascular Study. OXVASC, as the study is known, began in 2002 to study strokes, heart attacks and other acute vascular events in a part of Oxfordshire County where more than 91,000 people are served by one hospital. Previous analyses in the same population, conducted by lead author Amitava Banerjee, M.P.H., Ph.D., have shown the particular importance of family history in mother-daughter transmission in both heart attacks and stroke. "Family history of heart attacks and family history of strokes have rarely been studied in the same population," Banerjee said. The researchers used data from 906 patients (604 men) with acute heart ailments and 1,015 patients (484 men) who suffered acute cerebral events. Among the study's findings:

- In the heart patients, 30 percent had one parent who'd had a heart attack, 21 percent had at least one sibling who had suffered a heart attack, 7 percent had two or more siblings who had heart attacks, and 5 percent had two parents with heart attack.
- Among the patients with a stroke or transient ischemic attacks (TIAs, often called ministrokes or warning strokes), 21 percent had one parent who had a stroke, and 2 percent had two parents with stroke. Eight percent had at least one sibling with a stroke and 1.4 percent had at least two siblings with stroke.
- The risk of a sibling developing acute heart problems was similar for those with heart attack or stroke.
- The risk for an acute cardiac event was six times greater if both parents had suffered a heart attack and one-and-a-half times greater if one parent had a heart attack. In contrast, the likelihood of stroke did not change significantly with parents' stroke history.

The findings, if confirmed by additional studies, hold two significant implications, Rothwell said. "First, the way physicians predict the odds of a healthy person suffering a heart attack or stroke needs refining," he said. "Currently, most risk models lump a patient's family history of stroke and heart attack together. We probably should model family history of stroke and heart attack separately in the future." The new data also indicated that using the same criteria to predict both medical events overestimates the risk of stroke, he added. "The knowledge of genetic factors in stroke lags behind that in coronary artery disease," Rothwell said. The discovery that genes play a significantly smaller role in stroke could mean that genetic studies of stroke may not be critical to the field, he added.

Co-authors are Amitava Banerjee, M.P.H., Ph.D.; Louise E. Silver, R.G.N., Ph.D.; Carl Heneghan, Ph.D.; Sarah J. V. Welch, R.G.N., M.A.; Ziyah Mehta, Ph.D.; and Adrian P. Banning, M.D.

Researchers Capture Breakthrough Data On Cervical Spine Injuries


ScienceDaily (July 26, 2011) A high school football player's broken neck -- from which he's recovered -- has yielded breakthrough biomechanical data on cervical spine injuries that could ultimately affect safety and equipment standards for athletes. University of New Hampshire associate professor of kinesiology Erik Swartz collaborated on the study, which appears in a letter in the New England Journal of Medicine.

Swartz and lead author Steven Broglio of the University of Michigan captured this groundbreaking spinal fracture data while studying concussions. Broglio had fitted the helmets of football players at a high school in the Midwest with padded sensors as part of the Head Impact Telemetry System (HITS), which measures the location and magnitude of impacts to the helmet. During a head-down tackle, an 18-year-old cornerback in the study suffered both a concussion and a fracture of his cervical spine, or neck. (He has since fully recovered.)

"This is really novel," says Swartz, explaining that all previous research on cervical spine injuries has been done on cadavers, animals, or via mathematical modeling. "You can't create a cervical spine fracture in a healthy human, but here you have an actual event where we captured data during an actual cervical spine injury," he says. Swartz notes that this research will bring real-world information to the study of axial load impact to the head and its effects on the spine. "We now have data that we know caused a serious spine injury in a healthy, 18-year-old strong-bodied athlete," he says.

Swartz, who teaches athletic training, was tapped by Broglio for his expertise in cervical spine injuries in athletes. Swartz helped analyze the acceleration data from the in-helmet sensors, together with sideline video footage of the tackle, to describe the effects of the impact on the player.

The authors see far-reaching implications for this work in the quest for greater safety in youth sports. In the journal letter, they note that sports and recreation activities are the second most common cause of cervical spine injuries for people under age 30, with an average lifetime cost of more than $3 million. While concussions are far more common than broken necks among high school or college athletes, Broglio notes that media attention has been focused on professional sports. "To us, the larger public health issue is with the 1.5 million high school kids that play football each year. Not the 1,500 that play in the NFL," he says.

Swartz adds that this work will inform ongoing discussions about the safety and long-term effects of head-down tackles. "It sends a huge message to the athletic community about head-down impact," he says.

Eliminating Protein in Specific Brain Cells Blocks Nicotine Reward


ScienceDaily (July 27, 2011) Removing a protein from cells located in the brain's reward center blocks the anxiety-reducing and rewarding effects of nicotine, according to a new animal study in the July 27 issue of The Journal of Neuroscience. The findings may help researchers better understand how nicotine affects the brain.

Nicotine works by binding to proteins called nicotinic receptors on the surface of brain cells. In the new study, researchers led by Tresa McGranahan, Stephen Heinemann, PhD, and T. K. Booker, PhD, of the Salk Institute for Biological Studies, found that removing a specific type of nicotinic receptor from brain cells that produce dopamine -- a chemical released in response to reward -- makes mice less likely to seek out nicotine. The mice also did not show reductions in anxiety-like behaviors normally seen after nicotine treatment. Smokers commonly report anxiety relief as a key factor in continued smoking or relapse. "These findings show that the rewarding and anxiety-reducing properties of nicotine, thought to play a key role in the development of tobacco addiction, are related to actions at a single set of brain cells," said Paul Kenny, PhD, an expert on drug addiction at Scripps Research Institute, who was unaffiliated with the study.

Previous studies showed that blocking the alpha4 nicotinic receptor within the ventral tegmental area (VTA) -- a brain region important in motivation, emotion, and addiction -- decreases the rewarding properties of nicotine. Because alpha4 receptors are present on several cell types in the VTA, it was unclear how nicotine produced pleasurable feelings. To zero in on the circuit important in the brain's response to nicotine, researchers developed mice with a mutation that left them unable to produce the alpha4 receptor, but only in dopamine-producing brain cells. Mice lacking alpha4 receptors in these cells spent less time looking to obtain nicotine compared with normal mice, suggesting the alpha4 receptors are required for the rewarding effects of nicotine. Nicotine also failed to reduce anxiety-like behaviors in the mutant mice, as it normally does in healthy mice.

"Identification of the type of nicotinic receptors necessary for two key features of nicotine addiction -- reward and anxiety -- may help us better understand the pathway that leads to nicotine dependence, and potential treatment for the one billion cigarette smokers worldwide," McGranahan said. Diseases from tobacco use remain a major killer throughout the world, causing more than 5 million deaths per year.

The findings could guide researchers to a better understanding of the mechanisms of tobacco addiction and assist in the development of new drugs to treat tobacco addiction and provide relief from anxiety disorders, Kenny added. The research was supported by the National Institute of Neurological Disorders and Stroke, the National Institute on Alcohol Abuse and Alcoholism, and the National Institute on Drug Abuse.

Short-Term Use of Amphetamines Can Improve ADHD Symptoms in Adults, Review Finds
ScienceDaily (July 28, 2011) Giving amphetamines to adults with Attention Deficit Hyperactivity Disorder (ADHD) can help them control their symptoms, but the side effects mean that some people do not manage to take them for very long. These conclusions were drawn by a team of five researchers working at Girona and Barcelona Universities in Spain, and published in a new Cochrane Systematic Review.

ADHD is a childhood-onset disorder, but half of people with it find that the symptoms of hyperactivity, mood instability, irritability, difficulties in maintaining attention, lack of organization and impulsive behaviours persist into adulthood. "We wanted to see whether amphetamines could reverse the underlying neurological problems that feature in ADHD, and so improve ADHD symptoms," says Xavier Castells, who led the study and works in the Unit of Clinical Pharmacology at the University of Girona.

After searching through the medical literature, the team identified seven studies, which had enrolled a total of 1,091 participants in clinical trials. The three amphetamine-based medicines they considered (dextroamphetamine, lisdexamphetamine and mixed amphetamine salts (MAS)) all reduced ADHD symptoms, although there was no evidence that higher doses worked better than lower ones. The researchers did not find any difference in effectiveness between formulations that release the amphetamines rapidly and those with a sustained release. While there was evidence that people taking amphetamines were slightly more likely than those on placebo to drop out of treatment due to adverse events, the researchers were keen to point out that only 9% of people taking amphetamines withdrew from treatment. Looking at the different formulations of amphetamines, those on MAS had lower drop-out rates than those on other versions of the drug. Furthermore, most studies had a duration of between 2 and 7 weeks, therefore precluding the possibility of drawing conclusions regarding amphetamine's efficacy and safety in the long term.

In many clinical trials, doctors randomly allocate some patients to a 'treatment group' and give them the active medication, while others are placed in a 'control group' and receive a placebo -- a treatment that looks and feels like the real thing, but has no active ingredient in it. The idea is that the patient doesn't know which one they are on. This helps researchers determine how much of any apparent treatment effect is actually due to the therapy, and how much is due to other factors unrelated to drug effects, such as what the person believes about the efficacy of the intervention or the natural history of the disease. This experimental system only works, though, if the patients have no idea which group they are in.

"One of the problems with trying to make sense of this research is that you cannot do a properly controlled study because the amphetamines have such a distinct set of effects. Patients instantly know whether they are on the treatment or the placebo, so you have to be more cautious about the way you interpret the data," says Castells. "Given that other drugs, like atomoxetine or methylphenidate, have also been shown to reduce ADHD symptoms in adults, it would be of great interest to compare the efficacy of amphetamines to these interventions," says Castells.

Gastric Bypass Surgery Changes Food Preferences So That Patients Eat Less High-Fat Food
ScienceDaily (July 27, 2011) Gastric bypass surgery alters people's food preferences so that they eat less high fat food, according to a new study led by scientists at Imperial College London. The findings, published in the American Journal of Physiology -- Regulatory, Integrative, and Comparative Physiology, suggest a new mechanism by which some types of bariatric surgery lead to long-term weight loss. A growing number of obese patients are choosing to undergo bariatric surgery in order to lose weight, with over 7,000 such procedures being carried out on the NHS in 2009-10. The most common and the most effective procedure is the 'Roux-en-Y' gastric bypass, which involves stapling the stomach to create a small pouch at the top, which is then connected directly to the small intestine, bypassing most of the stomach and the duodenum (the first part of the small intestine). This means that patients feel full sooner. The new study involved data from human trials as well as experiments using rats. The researchers used data from 16 participants in a study in which obese people were randomly assigned either gastric bypass surgery or another type of operation, vertical-banded gastroplasty, in which the stomach volume is reduced but no part of the intestine is bypassed. The participants who had had gastric bypass had a significantly smaller proportion of fat in their diet six years after surgery, based on questionnaire responses. In the rat experiments, rats given gastric bypass surgery were compared with rats that were given a sham operation. Rats that had gastric bypass surgery ate less food in total, but they specifically ate less high fat food and more low fat food. When given a choice between two bottles with different concentrations of fat emulsions, the rats that had gastric bypass surgery showed a lower preference for high fat concentrations compared with rats that had a sham operation.

"It seems that people who've undergone gastric bypass surgery are eating the right food without even trying," said Mr Torsten Olbers from Imperial College London, who performed the operations on patients in the study at Sahlgrenska University Hospital in Gteborg, Sweden. Dr Carel le Roux, from the Imperial Weight Centre at Imperial College London, who led the research, said: "It appears that after bypass surgery, patients become hungry for good food and avoid junk food not because they have to, but because they just don't like it any more. If we can find out why this happens, we might be able to help people to eat more healthily without much effort." The rat experiments suggested that the reduced preference for high fat food was partly due to the effects of digesting the food. There was no difference in preferences between gastric bypass rats and sham-operated rats when the rats were only given access to the bottles for a few seconds, suggesting that bypass rats did not dislike the taste of high fat emulsions when they were only allowed small volumes at a time. Rats can learn to avoid foods that they associate with illness, so the researchers tested whether high fat foods would condition them to avoid certain tastes. They gave the rats saccharineflavoured water while infusing corn oil into their stomachs. The gastric bypass rats learned to avoid saccharine, but the sham-operated rats did not, suggesting that the effect of digesting corn oil was unpleasant to the rats that had had gastric bypass surgery. Levels of the satiety-promoting hormones GLP-1 and PYY were higher after feeding in the gastric bypass rats compared with sham-operated rats, suggesting a possible mechanism for the changes in food preferences. The team at Imperial plan to study the role of these hormones further to see if it might be possible to mimic the effects of gastric bypass without using surgery.

Unexpected Discovery On Hormone Secretion


ScienceDaily (July 27, 2011) A team of geneticists at the Institut de recherches cliniques de Montréal (IRCM), directed by Dr. Jacques Drouin, made an unexpected discovery on hormone secretion. Contrary to common belief, the researchers found that pituitary cells are organized in structured networks. The scientific breakthrough was published July 26 by the scientific journal Proceedings of the National Academy of Sciences (PNAS). The pituitary gland, located at the base of the brain, secretes the hormones that preserve the balance between all other glands of the endocrine system, which includes all hormone-producing organs.

"Each hormone in the pituitary gland is secreted by a specific type of cells," explains Dr. Drouin, Director of the Molecular Genetics research unit at the IRCM. "Until now, we believed that these cells were randomly distributed throughout the pituitary gland." By using three-dimensional imaging, the researchers discovered that the pituitary gland's secreting cells are rather organized into highly-structured networks. Inside these networks, each cell remains in contact with other cells of the same type, so as to form continuous sheets of cells. In fact, cells of the same lineage can recognize, exchange signals and even act in concert with one another. "We were the first to reveal this three-dimensional organization," says Lionel Budry, graduate student in Dr. Drouin's laboratory and first co-author of the study. "In addition to discovering the cell's structure, we showed its importance for the development and function of the pituitary gland." "We studied two networks of cells: cells that modulate our responses to stress, and cells that control reproduction," adds Dr. Drouin. "Disturbing these networks could be associated with hormone deficiencies." This research project was conducted in collaboration with the team of experts in threedimensional imaging at the Universit de Montpellier directed by Dr. Patrice Mollard, which includes Chrystel Lafont, who is first co-author of the article with Lionel Budry. Research carried out at the IRCM was funded by the Canadian Institutes of Health Research (CIHR) and the Canadian Cancer Society.

Could Patients' Own Kidney Cells Cure Kidney Disease? Reprogrammed Kidney Cells Could Make Transplants and Dialysis Things of the Past
ScienceDaily (July 27, 2011) Approximately 60 million people across the globe have chronic kidney disease, and many will need dialysis or a transplant. Breakthrough research published in the Journal of the American Society of Nephrology (JASN) indicates that patients' own kidney cells can be gathered and reprogrammed. Reprogramming patients' kidney cells could mean that in the future, fewer patients with kidney disease would require complicated, expensive procedures that affect their quality of life. In the first study, Sharon Ricardo, PhD (Monash University, in Clayton, Australia) and her colleagues took cells from an individual's kidney and coaxed them to become progenitor cells, immature cells that can form any cell type in the kidney. Specifically, they inserted several key reprogramming genes into the renal cells that made them capable of forming other cells.

In a second study, Miguel Esteban, MD, PhD (Chinese Academy of Sciences, in Guangzhou, China) and his colleagues found that kidney cells collected from a patient's urine can also be reprogrammed in this way. Using cells from urine makes the technology easy to implement in a clinical setting. Even better, the urine cells could be frozen and later thawed before they were manipulated. If researchers can expand the reprogrammed cells -- called induced pluripotent stem cells (iPSCs) -- and return them to the patient, these iPSCs may restore the health and vitality of the kidneys. In addition to providing a potentially curative therapy for patients, the breakthroughs might also help investigators to study the causes of kidney disease and to screen new drugs that could be used to treat it.

In an accompanying editorial, Ian Rogers, PhD (Mount Sinai Hospital, in Toronto, Ontario, Canada) noted that "together, these two articles demonstrate the feasibility of using kidney cells as a source of iPSCs, and efficient production of adult iPSCs from urine means that cells can be collected at any time." Just as exciting, the ease of collection and high frequency of reprogramming described in these articles may help improve future therapies in many other areas of medicine.

Dr. Ricardo's co-authors include Bi Song, Jonathan Niclis, Maliha Alikhan, Samy Sakkal, Aude Sylvain, Andrew Laslett, Claude Bernard (Monash University, in Clayton, Australia); and Peter Kerr (Monash Medical Centre, in Clayton, Australia). Dr. Esteban's co-authors include Ting Zhou, Christina Benda, Yinghua Huang, Xingyan Li, Yanhua Li, Xiangpeng Guo, Guokun Cao, Shen Chen, Duanqing Pei (Chinese Academy of Sciences, in Guangzhou, China); Sarah Duzinger (University of Natural Resources and Life Sciences); Lili Hao, Jiayan Wu (Chinese Academy of Sciences, Beijing, China); Yau-Chi Chan, Kwong-Man Ng, Jenny Cy Ho, Hung-Fat Tse (University of Hong Kong, Pokfulam, in Hong Kong, HKSAR, China); Matthias Wieser (University of Natural Resources and Life Sciences and Austrian Center for Industrial Biotechnology (ACIB), in Vienna, Austria); Heinz Redl (Austrian Cluster for Tissue Regeneration, Vienna, Austria); and Johannes Grillari and Regina Grillari-Voglauer (University of Natural Resources and Life Sciences and Evercyte GmbH, in Vienna, Austria).

How Memory Is Lost: Loss of Memory Due to Aging May Be Reversible


ScienceDaily (July 28, 2011) Yale University researchers can't tell you where you left your car keys -- but they can tell you why you can't find them.

A new study published July 27 in the journal Nature shows the neural networks in the brains of the middle-aged and elderly have weaker connections and fire less robustly than in youthful ones. Intriguingly, the research suggests that this condition is reversible. "Age-related cognitive deficits can have a serious impact on our lives in the Information Age as people often need higher cognitive functions to meet even basic needs, such as paying bills or accessing medical care," said Amy Arnsten, Professor of Neurobiology and Psychology and a member of the Kavli Institute for Neuroscience. "These abilities are critical for maintaining demanding careers and being able to live independently as we grow older."

As people age, they tend to forget things more often, are more easily distracted and disrupted by interference, and have greater difficulty with executive functions. While these age-related deficits have been known for many years, the cellular basis for these common cognitive difficulties has not been understood. The new study examined for the first time age-related changes in the activity of neurons in the prefrontal cortex (PFC), the area of the brain that is responsible for higher cognitive and executive functions. Networks of neurons in the prefrontal cortex generate persistent firing to keep information "in mind" even in the absence of cues from the environment. This process is called "working memory," and it allows us to recall information, such as where the car keys were left, even when that information must be constantly updated. This ability is the basis for abstract thought and reasoning, and is often called the "Mental Sketch Pad." It is also essential for executive functions, such as multi-tasking, organizing, and inhibiting inappropriate thoughts and actions.

Arnsten and her team studied the firing of prefrontal cortical neurons in young, middle-aged and aged animals as they performed a working memory task. Neurons in the prefrontal cortex of the young animals were able to maintain firing at a high rate during working memory, while neurons in older animals showed slower firing rates. However, when the researchers adjusted the neurochemical environment around the neurons to be more similar to that of a younger subject, the neuronal firing rates were restored to more youthful levels.

Arnsten said that the aging prefrontal cortex appears to accumulate excessive levels of a signaling molecule called cAMP, which can open ion channels and weaken prefrontal neuronal firing. Agents that either inhibited cAMP or blocked cAMP-sensitive ion channels were able to restore more youthful firing patterns in the aged neurons. One of the compounds that enhanced neuronal firing was guanfacine, a medication that is already approved for treating hypertension in adults, and prefrontal deficits in children, suggesting that it may be helpful in the elderly as well. Arnsten's finding is already moving to the clinical setting. Yale is enrolling subjects in a clinical trial testing guanfacine's ability to improve working memory and executive functions in elderly subjects who do not have Alzheimer's Disease or other dementias.

Social Deficits Associated With Autism, Schizophrenia Induced in Mice With New Technology
ScienceDaily (July 27, 2011) Researchers at Stanford University School of Medicine have been able to switch on, and then switch off, social-behavior deficits in mice that resemble those seen in people with autism and schizophrenia, thanks to a technology that allows scientists to precisely manipulate nerve activity in the brain. In synchrony with this experimentally induced socially aberrant behavior, the mice exhibited a brain-wave pattern called gamma oscillation that has been associated with autism and schizophrenia in humans, the researchers say. The findings, to be published online in Nature on July 27, lend credence to a hypothesis that has been long floated but hard to test, until now. They mark the first demonstration, the researchers said, that elevating the brain's susceptibility to stimulation can produce social deficits resembling those of autism and schizophrenia, and that then restoring the balance eases those symptoms. Autism spectrum disorder and schizophrenia each affect nearly 1 percent of all people. At present, there are no good drugs for mitigating the social-behavioral deficits of either disorder. While they differ in many ways, each syndrome is extremely complex, involving diverse deficits including social dysfunction. Mice are social animals, and there are many well-established tests of sociability in these animals. Social behavior can't be ascribed to a single brain region, said Karl Deisseroth, MD, PhD, associate professor of psychiatry and behavioral sciences and of bioengineering and the study's senior author. "To form a coherent pattern of another individual, you need to quickly integrate all kinds of sensations. And that's just the tip of the iceberg," said Deisseroth, a practicing psychiatrist who routinely sees autistic-spectrum patients. "It's all changing, millisecond by millisecond, as both you and the other individual act and react. You have to constantly alter your own predictions about what's coming next. This kind of interaction is immensely more uncertain than, for example, predator/prey activity. It seems that it has to involve the whole brain, not just one or another part of it."

One intriguing hypothesis holds that social dysfunctions characteristic of autism and schizophrenia may stem from an altered balance in the propensity of excitatory versus inhibitory nerve cells in the brain to fire, resulting in an overall hyper-responsiveness to stimulation. Evidence for this hypothesis includes the higher seizure rate among patients with autism, and the fact that many autistic children's brains exhibit elevated levels of a high-frequency brain-wave pattern -- known as "gamma oscillation" -- that can be picked up by an electroencephalogram. Many schizophrenics also exhibit social deficits as well as higher levels of this anomalous brain-wave pattern, even at rest. In addition, said Deisseroth, "autistic kids seem to be over-responding to environmental stimuli." For instance, they find eye contact overwhelming, or may cover their ears if there are too many people talking at once.

There has been no direct way to test the "excitation/inhibition-balance" hypothesis, Deisseroth said. It's been impossible to experimentally shift the balance between excitation and inhibition in the brain by selectively raising the firing propensities of one class of nerve cells but not the opposing class, because there have been no drugs or electrophysiological methods that act only on excitatory cells of the brain, or only on inhibitory cells.

But Deisseroth's team has a way of doing that, with a new technology pioneered in his laboratory and called optogenetics: selectively bioengineering specific types of nerve cells so that they respond to light. These cells can be bioengineered to be either more or less likely -- depending on the researchers' intent -- to relay an impulse to the next nerve cell in a circuit. So with the flick of a switch, the scientists can activate a nerve circuit in the brain or inhibit it. Nerve cells can also be rendered responsive, in various ways, to different frequencies of light, allowing several circuits to be manipulated at once. (The optogenetic technique cannot be used in humans at this time as it requires still-experimental genetic modifications to brain cells.)

For the experiments in this study, the investigators targeted excitatory and inhibitory nerve cells in the medial prefrontal cortex, the most advanced part of the mouse brain, Deisseroth said. This region is very well connected to everywhere else in the brain and is involved in processes such as planning, execution, personality and social behavior, he said. "We didn't want to precisely direct the firing patterns of excitatory or inhibitory cells," Deisseroth said. "We wouldn't know where to start, because we don't know the neural codes of behavior. We just wanted to bias excitability." Instead, the researchers bioengineered the nerve cells to respond to specific wavelength bands of light by becoming, for extended periods of time, either more or less likely to fire. "Nerve cells have an all-or-nothing tipping point," Deisseroth said. "Up to that point, they won't do much. But at a certain threshold, they fire."

The study's two first co-authors, postdoctoral researcher Ofer Yizhar, PhD (now at Weizmann Institute of Science in Rehovot, Israel), and Lief Fenno, a graduate student in the medical school's MD/PhD program, devised ways of activating or inhibiting brain circuits with a light pulse for up to a half-hour, variously increasing or decreasing the firing propensity of nerve cells in those circuits. This time period was long enough to let the animals engage in various tests of social behavior. The researchers subjected the mice they'd bioengineered to standard assays of rodent behavior, and compared the results to outcomes using normal mice.

The experimental mice exhibited no difference from the normal mice in tests of their anxiety levels, their tendency to move around or their curiosity about new objects. But, the team observed, the animals in whose medial prefrontal cortex excitability had been optogenetically stimulated lost virtually all interest in engaging with other mice to whom they were exposed. (The normal mice were much more curious about one another.) "Boosting their excitatory nerve cells largely abolished their social behavior," Deisseroth said. In addition, these mice's brains showed the same gamma-oscillation pattern that is observed among many autistic and schizophrenic patients. "When you raise the firing likelihood of excitatory cells in the medial prefrontal cortex, you see an increased gamma oscillation right away, just as one would predict if this change in the excitatory/inhibitory balance were in fact relevant." And when the scientists restored that balance by revving up inhibitory nerve-cell firing in the medial prefrontal cortex, they saw a moderate but significant recovery of social function. "The behavioral results and the correspondence of gamma-oscillation changes to alterations in the animals' excitatory/inhibitory balance suggest that what we're observing in animals could be relevant to people," said Deisseroth.

The study was performed in collaboration with experimental biophysics professor Peter Hegemann, PhD, and his colleagues at Humboldt University in Berlin, and John Huguenard, PhD, professor of neurology and neurological sciences at Stanford. Additional Stanford co-authors were bioengineering postdoctoral researchers Thomas Davidson, PhD, Vikaas Sohal, PhD, and Inbal Goshen, PhD; neurology postdoctoral researcher Jeanne Paz, PhD; neuroscience graduate student Daniel O'Shea; bioengineering research associate Joel Finkelstein; and bioengineering laboratory manager Charu Ramakrishnan. Funding came from the Yu, Woo, Snyder and Keck foundations, and from the National Institute of Mental Health, National Institute on Drug Abuse, National Institute of Neurological Disorders and Stroke, the DARPA REPAIR program and the California Institute for Regenerative Medicine, as well as the CNC program at Stanford.

Yoga Boosts Stress-Busting Hormone, Reduces Pain, Study Finds


ScienceDaily (July 27, 2011) A new study by York University researchers finds that practicing yoga reduces the physical and psychological symptoms of chronic pain in women with fibromyalgia.

The study is the first to look at the effects of yoga on cortisol levels in women with fibromyalgia. The condition, which predominantly affects women, is characterized by chronic pain and fatigue; common symptoms include muscle stiffness, sleep disturbances, gastrointestinal discomfort, anxiety and depression. Previous research has found that women with fibromyalgia have lower-than-average cortisol levels, which contribute to pain, fatigue and stress sensitivity. According to the study, participants' saliva revealed elevated levels of total cortisol following a program of 75 minutes of hatha yoga twice weekly over the course of eight weeks.

"Ideally, our cortisol levels peak about 30-40 minutes after we get up in the morning and decline throughout the day until we're ready to go to sleep," says the study's lead author, Kathryn Curtis, a PhD student in York's Department of Psychology, Faculty of Health. "The secretion of the hormone, cortisol, is dysregulated in women with fibromyalgia," she says. Cortisol is a steroid hormone that is produced and released by the adrenal gland and functions as a component of the hypothalamic-pituitary-adrenal (HPA) axis in response to stress. "Hatha yoga promotes physical relaxation by decreasing activity of the sympathetic nervous system, which lowers heart rate and increases breath volume. We believe this in turn has a positive effect on the HPA axis," says Curtis.

Participants completed questionnaires to determine pain intensity pre- and post-study; they reported significant reductions in pain and associated symptoms, as well as psychological benefits. They felt less helpless, were more accepting of their condition, and were less likely to "catastrophize" over current or future symptoms. "We saw their levels of mindfulness increase -- they were better able to detach from their psychological experience of pain," Curtis says. Mindfulness is a form of active mental awareness rooted in Buddhist traditions; it is achieved by paying total attention to the present moment with a non-judgmental awareness of inner and outer experiences. "Yoga promotes this concept -- that we are not our bodies, our experiences, or our pain. This is extremely useful in the management of pain," she says. "Moreover, our findings strongly suggest that psychological changes in turn affect our experience of physical pain."

The study -- Curtis' thesis -- was published July 26 in the Journal of Pain Research. It is co-authored by her supervisor, York professor Joel Katz, Canada Research Chair in Health Psychology, and Anna Osadchuk, a York University undergraduate student. Curtis was supported by a Canadian Institutes of Health Research (CIHR) Canada Graduate Scholarship and a CIHR Strategic Training Grant Fellowship in Pain: Molecules to Community.

How Memory Is Lost: Loss of Memory Due to Aging May Be Reversible


ScienceDaily (July 28, 2011) Yale University researchers can't tell you where you left your car keys -- but they can tell you why you can't find them. A new study published July 27 in the journal Nature shows the neural networks in the brains of the middle-aged and elderly have weaker connections and fire less robustly than in youthful ones. Intriguingly, the research suggests that this condition is reversible. "Age-related cognitive deficits can have a serious impact on our lives in the Information Age as people often need higher cognitive functions to meet even basic needs, such as paying bills or accessing medical care," said Amy Arnsten, Professor of Neurobiology and Psychology and a member of the Kavli Institute for Neuroscience. "These abilities are critical for maintaining demanding careers and being able to live independently as we grow older." As people age, they tend to forget things more often, are more easily distracted and disrupted by interference, and have greater difficulty with executive functions. While these age-related deficits have been known for many years, the cellular basis for these common cognitive difficulties has not been understood. The new study examined for the first time age-related changes in the activity of neurons in the prefrontal cortex (PFC), the area of the brain that is responsible for higher cognitive and executive functions. Networks of neurons in the prefrontal cortex generate persistent firing to keep information "in mind" even in the absence of cues from the environment. This process is called "working memory," and it allows us to recall information, such as where the car keys were left, even when that information must be constantly updated. This ability is the basis for abstract thought and reasoning, and is often called the "Mental Sketch Pad." It is also essential for executive functions, such as multi-tasking, organizing, and inhibiting inappropriate thoughts and actions. Arnsten and her team studied the firing of prefrontal cortical neurons in young, middle-aged and aged animals as they performed a working memory task. Neurons in the prefrontal cortex of the young animals were able to maintain firing at a high rate during working memory, while neurons in older animals showed slower firing rates. However, when the researchers adjusted the neurochemical environment around the neurons to be more similar to that of a younger subject, the neuronal firing rates were restored to more youthful levels. Arnsten said that the aging prefrontal cortex appears to accumulate excessive levels of a signaling molecule called cAMP, which can open ion channels and weaken prefrontal neuronal firing. Agents that either inhibited cAMP or blocked cAMP-sensitive ion channels were able to restore more youthful firing patterns in the aged neurons. One of the compounds that enhanced neuronal firing was guanfacine, a medication that is already approved for treating hypertension in adults, and prefrontal deficits in children, suggesting that it may be helpful in the elderly as well.

Arnsten's finding is already moving to the clinical setting. Yale is enrolling subjects in a clinical trial testing guanfacine's ability to improve working memory and executive functions in elderly subjects who do not have Alzheimer's Disease or other dementias.

Social Deficits Associated With Autism, Schizophrenia Induced in Mice With New Technology
ScienceDaily (July 27, 2011) Researchers at Stanford University School of Medicine have been able to switch on, and then switch off, social-behavior deficits in mice that resemble those seen in people with autism and schizophrenia, thanks to a technology that allows scientists to precisely manipulate nerve activity in the brain. In synchrony with this experimentally induced socially aberrant behavior, the mice exhibited a brain-wave pattern called gamma oscillation that has been associated with autism and schizophrenia in humans, the researchers say. The findings, to be published online in Nature on July 27, lend credence to a hypothesis that has been long floated but hard to test, until now. They mark the first demonstration, the researchers said, that elevating the brain's susceptibility to stimulation can produce social deficits resembling those of autism and schizophrenia, and that then restoring the balance eases those symptoms. Autism spectrum disorder and schizophrenia each affect nearly 1 percent of all people. At present, there are no good drugs for mitigating the social-behavioral deficits of either disorder. While they differ in many ways, each syndrome is extremely complex, involving diverse deficits including social dysfunction. Mice are social animals, and there are many well-established tests of sociability in these animals. Social behavior can't be ascribed to a single brain region, said Karl Deisseroth, MD, PhD, associate professor of psychiatry and behavioral sciences and of bioengineering and the study's senior author. "To form a coherent pattern of another individual, you need to quickly integrate all kinds of sensations. And that's just the tip of the iceberg," said Deisseroth, a practicing psychiatrist who routinely sees autistic-spectrum patients. "It's all changing, millisecond by millisecond, as both you and the other individual act and react. You have to constantly alter your own predictions about what's coming next. This kind of interaction is immensely more uncertain than, for example, predator/prey activity. It seems that it has to involve the whole brain, not just one or another part of it." One intriguing hypothesis holds that social dysfunctions characteristic of autism and schizophrenia may stem from an altered balance in the propensity of excitatory versus inhibitory nerve cells in the brain to fire, resulting in an overall hyper-responsiveness to stimulation. Evidence for this hypothesis includes the higher seizure rate among patients with autism, and the fact that many autistic children's brains exhibit elevated levels of a high-frequency brain-wave pattern -- known as "gamma oscillation" -- that can be picked up by an electroencephalogram.

Many schizophrenics also exhibit social deficits as well as higher levels of this anomalous brainwave pattern, even at rest. In addition, said Deisseroth, "autistic kids seem to be over-responding to environmental stimuli." For instance, they find eye contact overwhelming, or may cover their ears if there are too many people talking at once. There has been no direct way to test the "excitation/inhibition-balance" hypothesis, Deisseroth said. It's been impossible to experimentally shift the balance between excitation and inhibition in the brain by selectively raising the firing propensities of one class of nerve cells but not the opposing class, because there have been no drugs or electrophysiological methods that act only on excitatory cells of the brain, or only on inhibitory cells. But Deisseroth's team has a way of doing that, with a new technology, pioneered in his laboratory and called optogenetics: selectively bioengineering specific types of nerve cells so that they respond to light. These cells can be bioengineered to be either more or less likely -depending on the researchers' intent -- to relay an impulse to the next nerve cell in a circuit. So with the flick of a switch, the scientists can activate a nerve circuit in the brain or inhibit it. Nerve cells can also be rendered responsive, in various ways, to different frequencies of light, allowing several circuits to be manipulated at once. (The optogenetic technique cannot be used in humans at this time as it requires still-experimental genetic modifications to brain cells.) For the experiments in this study, the investigators targeted excitatory and inhibitory nerve cells in the medial prefrontal cortex, the most advanced part of the mouse brain, Deisseroth said. This region is very well-connected to everyplace else in the brain and is involved in processes such as planning, execution, personality and social behavior, he said. "We didn't want to precisely direct the firing patterns of excitatory or inhibitory cells," Deisseroth said. "We wouldn't know where to start, because we don't know the neural codes of behavior. We just wanted to bias excitability." Instead, the researchers bioengineered the nerve cells to respond to specific wavelength bands of light by becoming, for extended periods of time, either more or less likely to fire. "Nerve cells have an all-or-nothing tipping point," Deisseroth said. "Up to that point, they won't do much. But at a certain threshold, they fire." The study's two first co-authors, postdoctoral researcher Ofer Yizhar, PhD, (now at Weizmann Institute of Science in Rehovot, Israel), and Lief Fenno, a graduate student in the medical school's MD/PhD program, devised ways of activating or inhibiting brain circuits by a light pulse for up to a half-hour, variously increasing or decreasing the firing propensity of nerve cells in those circuits. This time period was long enough to let the animals engage in various tests of social behavior. The researchers subjected the mice they'd bioengineered to standard assays of rodent behavior, and compared the results to outcomes using normal mice.

The experimental mice exhibited no difference from the normal mice in tests of their anxiety levels, their tendency to move around or their curiosity about new objects. But, the team observed, the animals in whose medial prefrontal cortex excitability had been optogenetically raised lost virtually all interest in engaging with other mice to whom they were exposed. (The normal mice were much more curious about one another.) "Boosting their excitatory nerve cells largely abolished their social behavior," Deisseroth said. In addition, the brains of these mice showed the same gamma-oscillation pattern that is observed among many autistic and schizophrenic patients. "When you raise the firing likelihood of excitatory cells in the medial prefrontal cortex, you see an increased gamma oscillation right away, just as one would predict if this change in the excitatory/inhibitory balance were in fact relevant." And when the scientists restored that balance by revving up inhibitory nerve-cell firing in the medial prefrontal cortex, they saw a moderate but significant recovery of social function. "The behavioral results and the correspondence of gamma-oscillation changes to alterations in the animals' excitatory/inhibitory balance suggest that what we're observing in animals could be relevant to people," said Deisseroth. The study was performed in collaboration with experimental biophysics professor Peter Hegemann, PhD, and his colleagues at Humboldt University in Berlin, and John Huguenard, PhD, professor of neurology and neurological sciences at Stanford. Additional Stanford co-authors were bioengineering postdoctoral researchers Thomas Davidson, PhD, Vikaas Sohal, PhD, and Inbal Goshen, PhD; neurology postdoctoral researcher Jeanne Paz, PhD; neuroscience graduate student Daniel O'Shea; bioengineering research associate Joel Finkelstein; and bioengineering laboratory manager Charu Ramakrishnan. Funding came from the Yu, Woo, Snyder and Keck foundations, and from the National Institute of Mental Health, National Institute on Drug Abuse, National Institute of Neurological Disorders and Stroke, the DARPA REPAIR program and the California Institute for Regenerative Medicine, as well as the CNC program at Stanford.


The First True View of Global Erosion


ScienceDaily (July 27, 2011) Every mountain and hill shall be made low, declared the ancient prophet Isaiah. In other words: erosion happens. But for the modern geologist a vexing question remains: how fast does this erosion happen? For more than a century, scientists have looked for ways to measure and compare erosion rates across differing landscapes around the globe -- but with limited success. "Knowing the background rate of erosion for a place is extremely important," says University of Vermont geologist Paul Bierman, "if you want to compare it to what's coming off the landscape today because of human impacts like agriculture, development, and forestry." Since the mid-1980s, measurements of a rare radioactive element -- beryllium-10, which appears in quartz bombarded by cosmic rays in the top few feet of Earth's surface -- have greatly improved geologists' ability to estimate erosion rates. But these experiments have been done on a local or regional scale, using a variety of methods, calculation constants, and corrections. Comparisons between climate zones and differing rock types have been difficult -- cutting off a global perspective. Now Bierman and his graduate student, Eric Portenga, have taken twenty years' worth of this disparate data, compiled 1,599 measurements from eighty-seven sites around the world, and recalculated it with a single, up-to-date method. Their work "provides the first broad, standardized view of pre-human, geologic erosion rates," they write in "Understanding Earth's eroding surface with 10Be," published in the August edition of GSA Today, an open-access journal, available online July 26, 2011.

Sustainable Soil

"Nobody has stepped back far enough to look at this big picture," says Bierman. "We all work on our little postage stamps of the world -- Africa, South America, the western US." But many of the pressing questions about erosion are global in scale. Most urgently, the ability to support the nine billion people forecast to be living on Earth by midcentury rests directly on the resiliency of soil systems and the health of water supplies. And these two pillars of sustainability are directly and deeply affected by erosion. The method used in this new study can provide a good tool for measuring the sustainability of modern agricultural practices, Bierman notes, since the beryllium-10 data shows the rate at which landscapes have been changing in the recent geologic past: the last thousand to several hundred thousand years. "If human impacts result in rates faster than we measure, it's not sustainable," he says.
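
In the standard steady-state treatment of in-situ cosmogenic nuclides, the measured beryllium-10 concentration N in surface quartz balances production P against radioactive decay and removal of rock by erosion, N = P / (lambda + rho * epsilon / Lambda), which can be inverted for the erosion rate epsilon. The sketch below applies that widely used relation with typical constants; it is an illustrative simplification, not the standardized recalculation scheme Portenga and Bierman applied to the 1,599 compiled measurements.

```python
# Illustrative steady-state cosmogenic 10Be erosion-rate inversion.
# Assumes N = P / (decay_constant + density * rate / attenuation): the surface
# concentration balances production against decay plus removal by erosion.
# Constants are typical textbook values, not site-specific.

DECAY_10BE = 4.99e-7    # 10Be decay constant, 1/yr (half-life ~1.39 Myr)
ATTENUATION = 160.0     # spallation attenuation length, g/cm^2 (typical)
DENSITY = 2.7           # rock density, g/cm^3 (assumed)

def erosion_rate_cm_per_yr(concentration, production):
    """Invert a measured 10Be concentration for a steady-state erosion rate.

    concentration : measured 10Be in quartz, atoms per gram
    production    : local surface production rate, atoms per gram per year
    Returns the erosion rate in cm per year.
    """
    return (production / concentration - DECAY_10BE) * ATTENUATION / DENSITY

# Toy numbers only: ~5 atoms/g/yr production and 2e5 atoms/g measured give an
# erosion rate of roughly 15 mm per thousand years.
rate = erosion_rate_cm_per_yr(2e5, 5.0)
print(f"{rate * 1e4:.1f} mm per thousand years")
```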

Portenga sees how this study can help managers in contested landscapes like the Chesapeake Bay. "Regulators may want to stipulate an ideal amount of sediment coming out of a river system and they may say that they want to get this back to 'normal' standards or 'normal rate.' But what is that rate? What was the erosion like before people started interacting with the landscape?" he says. Not being able to answer that question well has contributed to many regulatory conflicts. "This work can help give a better idea of what is normal," says Portenga, who was the lead author on the study.

No Smoking Gun

This new study also goes fairly far in identifying the environmental factors -- including latitude, annual precipitation, and, especially, slope -- that drive erosion rates in drainage basins. The mechanisms controlling erosion on outcrops of bedrock are less clear. Using several statistical tests, Portenga and Bierman were able to explain about sixty percent of what controls differing erosion rates in drainage basins around the world. But their study only explains about thirty percent of the variability between outcrops of bedrock. "This means geologists are missing a lot of the crucial information about what is controlling bedrock erosion," Portenga says. Little-studied variables -- like the density of fractures in bedrock, the strength of rocks, and their chemistry -- may be controlling erosion rates, the study suggests. "I don't think we'll ever find the single smoking gun of erosion," says Portenga. "The natural world is so complex and there are so many factors that contribute to how landscapes change over time. But as this method develops, we will have a better sense of what variables are important -- and which are not -- in this erosion story." For example, it has been a truism of geology for decades that rainfall is the biggest driver of erosion. Semi-arid landscapes with little vegetation and occasional major storms were understood to have the greatest rates of erosion. But this study challenges that idea. "It turns out that the greatest control on erosion is not mean annual precipitation," says Bierman. Instead, look at slope. "People had always thought slope was important," Bierman says, "but these data show that slope is really important."

Modeling the Future

Their new study, supported by the National Science Foundation, is part of a larger long-term goal of creating a global model that can predict the background rate and patterns of erosion across the whole planet -- and how these erosion rates will respond to changes like human-induced climate change.
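
Figures such as "about sixty percent" and "about thirty percent" describe how much of the site-to-site spread in erosion rate a statistical model of environmental factors accounts for -- in effect a coefficient of determination (R squared). A hedged sketch of that style of analysis on purely synthetic data (not the authors' dataset, variables or exact tests), using ordinary least squares:

```python
import numpy as np

# Synthetic stand-in data: erosion rate loosely driven by basin slope, weakly
# by precipitation and latitude, plus noise. All values are invented for the
# illustration and have nothing to do with the study's compilation.
rng = np.random.default_rng(0)
n = 200
slope = rng.uniform(0, 40, n)         # mean basin slope, degrees
precip = rng.uniform(200, 2500, n)    # mean annual precipitation, mm
latitude = rng.uniform(0, 60, n)      # absolute latitude, degrees
erosion = 3.0 * slope + 0.002 * precip + 0.1 * latitude + rng.normal(0, 25, n)

# Ordinary least squares with an intercept, then the coefficient of
# determination (the share of variance the predictors explain).
X = np.column_stack([np.ones(n), slope, precip, latitude])
beta, *_ = np.linalg.lstsq(X, erosion, rcond=None)
residuals = erosion - X @ beta
r_squared = 1 - residuals.var() / erosion.var()
print(f"Variance explained by the model: {r_squared:.0%}")
```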

"Following this study, we can start to answer big questions like 'How does climate drive erosion?'" says Bierman. In other words, a clearer picture of what global erosion has looked like in the recent past will start to illuminate what is likely to happen in the future as human impacts and land-use decisions play out. "We want a predictive model," says Bierman. "We want to be able to have somebody say, 'here's my drainage basin, here's the climate, here's the rock type, here's the slope, here's the mean annual precipitation: how quickly is this eroding?' That's what you need for land management."

Newly Developed Fluorescent Protein Makes Internal Organs Visible


ScienceDaily (July 26, 2011) Researchers at Albert Einstein College of Medicine of Yeshiva University have developed the first fluorescent protein that enables scientists to clearly "see" the internal organs of living animals without the need for a scalpel or imaging techniques that can have side effects or increase radiation exposure. The new probe could prove to be a breakthrough in whole-body imaging -- allowing doctors, for example, to noninvasively monitor the growth of tumors in order to assess the effectiveness of anti-cancer therapies. In contrast to other body-scanning techniques, fluorescent-protein imaging does not involve radiation exposure or require the use of contrast agents. The findings are described in the July 17 online edition of Nature Biotechnology. For the past 20 years, scientists have used a variety of colored fluorescent proteins, derived from jellyfish and corals, to visualize cells and their organelles and molecules. But using fluorescent probes to peer inside live mammals has posed a major challenge. The reason: hemoglobin in an animal's blood effectively absorbs the blue, green, red and other wavelengths used to stimulate standard fluorescent proteins, along with any wavelengths emitted by the proteins when they do light up. To overcome that roadblock, the laboratory of Vladislav Verkhusha, Ph.D., associate professor of anatomy and structural biology at Einstein and the study's senior author, engineered a fluorescent protein from a bacterial phytochrome (the pigment that a species of bacteria uses to detect light). This new phytochrome-based fluorescent protein, dubbed iRFP, both absorbs and emits light in the near-infrared portion of the electromagnetic spectrum -- the spectral region in which mammalian tissues are nearly transparent. The researchers targeted their fluorescent protein to the liver -- an organ particularly difficult to visualize because of its high blood content. Adenovirus particles containing the gene for iRFP were injected into mice. Once the viruses and their gene cargoes infected liver cells, the infected cells expressed the gene and produced iRFP protein. The mice were then exposed to near-infrared light, and the resulting emitted fluorescence was visualized using a whole-body imaging device. Fluorescence of the liver in the infected mice was first detected on the second day after infection and reached a peak at day five. Additional experiments showed that the iRFP fluorescent protein was nontoxic.

"Our study found that iRFP was far superior to the other fluorescent proteins that reportedly help in visualizing the livers of live animals," said Grigory Filonov, Ph.D., a postdoctoral fellow in Dr. Verkhusha's laboratory at Einstein and the first author of the Nature Biotechnology paper. "iRFP not only produced a far brighter image, with higher contrast than the other fluorescent proteins, but was also very stable over time. We believe it will significantly broaden the potential uses for noninvasive whole-body imaging." Dr. Filonov noted that fluorescent-protein imaging involves none of the radiation exposure that occurs with standard X-rays and computed tomography (CT) scanning. And unlike magnetic resonance imaging (MRI), in which contrasting agents must sometimes be swallowed or injected to make internal body structures more visible, the contrast provided by iRFP is so vibrant that contrasting agents are not needed.
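
Claims of a "far brighter image, with higher contrast" typically come down to comparing mean signal in a region of interest, such as the liver, against mean background in the whole-body image. A minimal, hypothetical sketch of such a target-to-background calculation on an intensity array -- not the actual quantification in the Nature Biotechnology paper:

```python
import numpy as np

def contrast_ratio(image, target_mask, background_mask):
    """Mean target intensity divided by mean background intensity.

    image           : 2-D array of pixel intensities
    target_mask     : boolean mask over the organ region (e.g. the liver)
    background_mask : boolean mask over surrounding tissue
    """
    return image[target_mask].mean() / image[background_mask].mean()

# Toy example: a bright 20x20 "liver" patch on a dim background.
img = np.full((100, 100), 50.0)
img[40:60, 40:60] = 400.0
liver = np.zeros_like(img, dtype=bool)
liver[40:60, 40:60] = True
print(f"Target-to-background ratio: {contrast_ratio(img, liver, ~liver):.1f}")
```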
