Wednesday, November 30, 2011

Yale Researchers Develop A Way To Monitor Engineered Blood Vessels As They Grow In Patients

New research in the FASEB Journal suggests magnetic resonance imaging allows researchers to study and monitor how new vessels perform while they are 'under construction' in patients

Using magnetic resonance imaging (MRI) and nanoparticle technology, researchers from Yale have devised a way to monitor the growth of laboratory-engineered blood vessels after they have been implanted in patients. This advance represents an important step toward ensuring that blood vessels, and possibly other tissues engineered from a patient's own biological material, are taking hold and working as expected. Until now, there has been no way to monitor the growth and progress of engineered tissues once they were implanted. This research was published in the December 2011 issue of the FASEB Journal (http://www.fasebj.org).

"We hope that the important findings from our study will serve as a valuable tool for physicians and scientists working to better understand the biological mechanisms involved in tissue engineering," said Christopher K. Breuer, M.D., co-author of the study from the Interdepartmental Program in Vascular Biology and Therapeutics at Yale University School of Medicine in New Haven, CT. "Resulting advances will hopefully usher in a new era of personalized medical treatments where replacement vessels are specifically designed for each patient suffering from cardiac anomalies and disease."

To make this advance, scientists used two different groups of cells to make tissue-engineered blood vessels. In the first group, the cells were labeled with the MRI contrast agent. In the second group, the cells were normal and did not have an MRI label. Cells from each group were then used to create separate laboratory-engineered blood vessels, which were implanted into mice. The purpose was to see whether the laboratory-engineered blood vessels made from cells that were labeled with the contrast agent would indeed be visible on MRI and to make sure that the addition of the contrast agent did not negatively affect the cells or the function of the laboratory-engineered vessels. Researchers imaged the mice with MRI and found that it was possible to track the cells labeled with contrast agent, but not possible to track the cells that were not labeled. This suggests that using MRI and cellular contrast agents to study cellular changes in the tissue-engineered blood vessels after they are implanted is an effective way to monitor these types of vessels.

"This is great news for patients with congenital heart defects, who have to undergo tissue grafting, but that's only the tip of the scalpel," said Gerald Weissmann, M.D., Editor-in-Chief of the FASEB Journal. "As we progress toward an era of personalized medicine—where patients' own tissues and cells will be re-engineered into replacement organs and treatments—we will need noninvasive ways to monitor what happens inside the body in real time. This technique fulfills another promise of nanobiology."

Contacts and sources:
Cody Mooneyhan
Federation of American Societies for Experimental Biology

Scientists Use Laser Imaging To Assess Safety Of Zinc Oxide Nanoparticles In Sunscreen

Ultra-tiny zinc oxide (ZnO) particles with dimensions less than one-ten-millionth of a meter appear on the ingredient lists of some commercially available sunscreen products, raising concerns about whether the particles may be absorbed beneath the outer layer of skin.

Overlay of the confocal/multiphoton image of the excised human skin. Yellow represents skin autofluorescence excited at 405 nm; purple represents zinc oxide nanoparticle distribution in the skin (stratum corneum) excited at 770 nm, with faint collagen-induced SHG signals in the dermal layer.
Credit: Biomedical Optics Express

To help answer these safety questions, an international team of scientists from Australia and Switzerland has developed a way to optically test the concentration of ZnO nanoparticles at different skin depths. They found that the nanoparticles did not penetrate beneath the outermost layer of cells when applied to patches of excised skin. The results, which were published this month in the Optical Society's (OSA) open-access journal Biomedical Optics Express, lay the groundwork for future studies in live patients.

The high optical absorption of ZnO nanoparticles in the UVA and UVB range, along with their transparency in the visible spectrum when mixed into lotions, makes them appealing candidates for inclusion in sunscreen cosmetics. However, the particles have been shown to be toxic to certain types of cells within the body, making it important to study the nanoparticles' fate after being applied to the skin. By characterizing the optical properties of ZnO nanoparticles, the Australian and Swiss research team found a way to quantitatively assess how far the nanoparticles might migrate into skin.


Zinc oxide (ZnO) nanoparticle distribution in excised human skin. The black line represents the surface of the skin (top), blue represents ZnO nanoparticle distribution in the skin (stratum corneum), and pink represents skin.
Credit: Timothy Kelf, Macquarie University

The team used a technique called nonlinear optical microscopy, which illuminates the sample with short pulses of laser light and measures a return signal. Initial results show that ZnO nanoparticles from a formulation that had been rubbed into skin patches for 5 minutes, incubated at body temperature for 8 hours, and then washed off, did not penetrate beneath the stratum corneum, or topmost layer of the skin. The new optical characterization should be a useful tool for future non-invasive in vivo studies, the researchers write.


Contacts and sources:
Angela Stark
Optical Society of America

Paper: "Characterization of optical properties of ZnO nanoparticles for quantitative imaging of transdermal transport," Biomedical Optics Express, Vol. 2, Issue 12, pp. 3321-3333 (2011).



20 Essential Novels For African-American Women

What makes literature such a beautiful and compelling field of study is its fruitful bounty of diversity. Unfortunately, however, syllabi across the United States still tend toward books by dead white men, with everyone else competing for what few available slots remain. Progress has been made, of course, and dead white men still have plenty to say and offer. But the canon could easily do much, much better for itself. Whether historical, romantic, fantastic, mysterious or some combination thereof (or something else entirely), the following reads represent some of the finest voices of African-American women of today and generations past. Though by no means definitive or emblematic of all experiences and perspectives, the list still provides a great sample of some amazing books deserving of more consideration. Or, in some cases, fully deserving of the hefty recognition they have already earned.

The Color Purple by Alice Walker

Alice Walker's Pulitzer-winning classic gives an empowering voice to women marginalized along racial, sexual and economic lines, setting her story during the Great Depression. Protagonist Celie ultimately finds empowerment despite such severe social, political, filial and financial hardships thanks to the loving sexual guidance of her bombastic friend and lover Shug Avery.

Beloved by Toni Morrison

Another sterling Pulitzer winner and rightfully lauded mainstay in the literary canon, Beloved compares and contrasts the times before, during and after the American Civil War. Haunting and intense, it features some horrifying depictions of slavery's reality and what lengths some might have gone to in order to escape it, including murdering loved ones.

Their Eyes Were Watching God by Zora Neale Hurston

Featuring one of the strongest female leads in all of literature, Zora Neale Hurston's undeniable magnum opus follows a Florida woman through many different loves: some horrid, some amazing, and all of them shaping her into the self-assured, somewhat traumatized and frequently gossiped-about individual she eventually becomes.

Incidents in the Life of a Slave Girl by Harriet Jacobs

This fiercely feminist slave narrative comes so laden with autobiography it may as well be shelved as a memoir. Harriet Jacobs, here cast as Linda, recounts how masters tortured their female slaves more egregiously than their male counterparts, not infrequently involving sexual assault and rape. While graphic and heart-wrenching, the novel carries historical significance that makes it an essential read.

Waiting to Exhale by Terry McMillan

Four middle-aged women show each other love and support through times of triumph and times of tragedy, both inter- and intrapersonal. Although their individual stories draw much of their characterization from the women's relationships with men, the novel still turns a realistic eye toward dating and marriage problems.

The Serpent's Gift by Helen Elaine Lee

Set at the turn of the 20th century, The Serpent's Gift chronicles a tale of two families whose lives begin overlapping in interesting ways, some good and some bad, as time marches onward. For almost 100 years, they love, share and suffer through their middle-class Midwestern existence, impacted by some of America's most influential historical moments.

The Women of Brewster Place by Gloria Naylor

Short vignettes bound together by common themes and characters greatly humanize the female inhabitants of a decaying urban neighborhood. They cycle through victories and tragedies, their emotions running the gamut from joy to despair to homicidal rage.

Kindred by Octavia E. Butler

Science fiction and fantasy author Octavia E. Butler tackles time travel in her narrative of a young woman flung to a pre-Civil War plantation. There, she must serve as a slave in order to protect her identity – and ensure she even exists in the future.

The Street by Ann Petry

Published in 1946, The Street takes a long look at the experiences of a young, single mother in Harlem harboring a love of books and Ben Franklin. The latter serves as her inspiration to keep pressing forward, working hard and ensuring the safest possible life for her beloved son.

Betsey Brown by Ntozake Shange

The eponymous protagonist comes of age as the daughter of a doctor during school desegregation, witnessing firsthand the beginnings of the Civil Rights movement. Ntozake Shange juxtaposes Betsey's experiences with those of her parents Jane and Greer to showcase the different attitudes the generations held about social change.

Push by Sapphire

Though illiterate, impoverished, twice-pregnant because of her father's repeated rapes and suffering under an abusive mother, the 16-year-old girl around whom Push revolves pines for a healthier, happier life. Sapphire leaves her ending ambiguous, but by the end an alternative school has already bolstered her reading skills.

Coffee Will Make You Black by April Sinclair

Bildungsroman buffs might want to pick up this novel about a young woman crippled by poverty and racism on Chicago's South Side during the 1960s. Appropriate for teens and adults, it offers up some sobering lessons about universal and historical themes alike.

What Looks Like Crazy on an Ordinary Day by Pearl Cleage

An Atlanta-based hairdresser relocates to her Michigan origins following a devastating and unexpected HIV diagnosis. She reunites with her sister, adopts a baby, rediscovers love and finds excitement in the city she once deemed unworthy.

Iola Leroy; or, Shadows Uplifted by Frances E.W. Harper

Iola Leroy stands as one of the first novels ever published by an African-American woman and concerns itself with the mixed-race daughter of a former slave owner and the wife he once owned. Once the planter dies, the daughter winds up thrust into servitude of her own before being freed and piecing together the broken fragments of her family.

Blanche on the Lam by Barbara Neely

Barbara Neely's debut novel introduced mystery aficionados to cook and housekeeper Blanche White, who eventually winds up playing detective while running from fraud charges. Her position as a doubly marginalized individual, along both class and race lines, allows her to go about her investigations more smoothly – handy, considering her first case involves a murdered gardener.

The Bondwoman's Narrative by Hannah Crafts

Speculation about The Bondwoman's Narrative abounds, with many scholars believing it might be the very first novel ever written by an African-American woman; it wasn't published until 2002, however. This slave story offers another first-person account of the horrors faced by people dehumanized by those who wrongfully forced them into bondage.

Water in a Broken Glass by Odessa Rose

Odessa Rose's sensuous story twists and turns throughout an attraction triangle shared by a popular sculptress, a man she loves and the woman she ends up loving even more. It's a joyous journey through eroticism and art alike, and many readers consider it a major triumph of African-American lesbian literature.

The Color of Love by Sandra Kitt

Even skeptics toward the romance genre can still appreciate The Color of Love for its frank, grounded depiction of the unique challenges interracial couples frequently face. Few authors ever put forth the effort to explore the realities behind such relationships, and fewer still with as much gravitas and intelligent commentary as Sandra Kitt.

Praisesong for the Widow by Paule Marshall

At age 64, protagonist Avey Johnson heads out on a cruise to Carriacou to find herself and better connect with her heritage after widowhood. Interspersed throughout her experiences on the Caribbean island are scenes taken from her childhood, marriage and motherhood to help her come to terms with where she's been and where she may very well go.

Corregidora by Gayl Jones

Through the powerful voice of haunted blues chanteuse Ursa Corregidora, her brutal family history of slavery collides with the realities and experiences of African-Americans in the 1930s. Her newly-acquired inability to bear children challenges her to think of the bitter past that scarred her mother and grandmother.


Contacts and sources:
Emma Taylor 

Food Served In Children's Hospitals Rated Largely Unhealthy

Given the obesity epidemic among the nation's young, one would hope that children's hospitals would serve as a role model for healthy eating. But hospitals in California fall short, with only 7 percent of entrees classified as "healthy" according to a new study published in Academic Pediatrics.

Researchers from UCLA and the RAND Corporation assessed 14 food venues at the state's 12 major children's hospitals and found there was a lot of room for improvement in their offerings and practices.

"As health professionals, we understand the connection between healthy eating and good health, and our hospitals should be role models in this regard," said Dr. Lenard Lesser, primary investigator and a physician in the Robert Wood Johnson Foundation Clinical Scholars Program in the Department of Family Medicine, David Geffen School of Medicine at UCLA. "Unfortunately, the food in many hospitals is no better – and in some cases worse – than what you would find in a fast food restaurant."

The study authors developed a modified version of the Nutrition Environment Measures Scale for Restaurants (NEMS-R) as an assessment tool for rating the food offerings in hospital cafeterias. This measurement system takes into account pricing, availability of vegetables, nutrition labeling, combo promotions and healthy beverages.

Overall, the average score for the 14 hospital food venues was 19.1 on a scale of 0 (least healthy) to 37 (most healthy). Of the total 359 entrees the hospitals served, only 7 percent were classified as healthy according to the NEMS criteria. And while nearly all the hospitals offered healthy alternatives such as fruit, less than one third had nutrition information at the point of sale or signs to promote healthy eating.
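For scale, the headline figures above can be restated in a couple of lines (this is only a back-of-envelope restatement of the numbers already reported, not an additional analysis):

```python
# Restating the study's headline figures.
avg_score, max_score = 19.1, 37.0
print(f"average venue score: {avg_score / max_score:.0%} of the healthiest possible")  # 52%

total_entrees = 359
print(f"healthy entrees: roughly {round(total_entrees * 0.07)} of {total_entrees}")  # about 25
```

In other words, the average venue scored only about half of the maximum, and only about 25 of the 359 entrees met the healthy criteria.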

Among the other key findings:
  • All 14 food venues offered low-fat or skim milk and diet soda
  • 81 percent offered high-calorie, high-sugar items such as cookies and ice cream near the cash register
  • 25 percent sold whole wheat bread
  • Half the hospitals did not provide any indication that they carried healthy entrees
  • 44 percent did not have low calorie salad dressings
Since no one has previously documented the healthfulness of the food in these hospitals, researchers provided hospital administrators with their scores to encourage improvement. In the time since the study was conducted in July 2010, some of the hospitals surveyed have taken steps to improve their fare or reduce unhealthy offerings. For example, some have eliminated fried food, lowered the price of salads, and increased the price of sugary beverages or eliminated them altogether from their cafeterias.

"The steps some hospitals are already taking to improve nutrition and reduce junk food are encouraging," Lesser said. "We plan to make this nutritional quality measurement tool available to hospitals around the country to help them assess and improve their food offerings."

Researchers said hospitals can improve the health of their food offerings by providing more fruits, vegetables, whole grains and smaller portions; shrinking the number of low-nutrient choices; and using low-cost strategies to promote healthy eating, such as signage and keeping unhealthy impulse items away from the checkout stand.

"If we can't improve the food environment in our hospitals, how do we expect to improve the health of food in our community?" Lesser said. "By serving as role models for healthy eating, we can make a small step toward helping children prevent the onset of dietary-related chronic diseases."


Contacts and sources:
Enrique Rivero
University of California - Los Angeles Health Sciences

Hospitals the researchers surveyed were: Children's Hospital Central California; Children's Hospital Los Angeles; Children's Hospital of Orange County; Children's Hospital & Research Center at Oakland; Loma Linda University Children's Hospital; Lucile Salter Packard Children's Hospital at Stanford; Miller Children's Hospital; Rady Children's Hospital - San Diego; Mattel Children's Hospital UCLA; University Children's Hospital at University of California, Irvine; University of California, Davis Children's Hospital; University of California, San Diego Children's Hospital; University of California, San Francisco Children's Hospital; Children's Center at Sutter Medical Center, Sacramento.

The Robert Wood Johnson Foundation funded this study. In addition to Lesser, researchers on this study were Dana E. Hunnes, Phedellee Reyes, Robert H. Brook and Lenore Arab of UCLA; and Gery W. Ryan and Deborah A. Cohen of the RAND Corporation.

The research findings presented here are those of the researcher and are not necessarily the views of the Robert Wood Johnson Foundation.

The UCLA Department of Family Medicine provides comprehensive primary care to entire families, from newborns to seniors. The department also provides low-risk obstetrical services and prenatal and in-patient care at Santa Monica–UCLA Medical Center and Orthopedic Hospital and out-patient care at the Les Kelley Family Health Center in Santa Monica and the Mid-Valley Family Health Center, located in a Los Angeles County Health Center in Van Nuys, Calif. The department is also a leader in family medicine education, for both medical students and residents, and houses a significant research unit focusing on geriatric issues and health care disparities among immigrant families and minority communities in Los Angeles and California.

The Robert Wood Johnson Foundation Clinical Scholars® program has fostered the development of physicians who are leading the transformation of health care in the United States through positions in academic medicine, public health and other leadership roles for three decades. Through this program, future leaders learn to conduct innovative research and to work with communities, organizations, practitioners and policy-makers on issues important to the health and well-being of all Americans. For more information, visit http://www.rwjcsp.unc.edu.

The RAND Corporation is a nonprofit institution that helps improve policy and decision-making through research and analysis.

Study: Working Moms Multitask More And Have A Worse Time Doing So Than Dads

Not only are working mothers multitasking more frequently than working fathers, but their multitasking experience is more negative as well, according to a new study in the December issue of the American Sociological Review.

"Gender differences in multitasking are not only a matter of quantity but, more importantly, quality," said Shira Offer, the lead author of the study and an Assistant Professor in the Department of Sociology and Anthropology at Bar-Ilan University in Israel. "Our findings provide support for the popular notion that women are the ultimate multitaskers and suggest that the emotional experience of multitasking is very different for mothers and fathers."

In terms of quantity, the study found that working mothers spend about 10 more hours per week multitasking than do working fathers, 48.3 hours per week for moms compared to 38.9 for dads.

"This suggests that working mothers are doing two activities at once more than two-fifths of the time they are awake, while working fathers are multitasking more than a third of their waking hours," said study coauthor Barbara Schneider, the John A. Hannah Chair and University Distinguished Professor in the College of Education and Department of Sociology at Michigan State University.
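The fractions in Schneider's quote follow directly from the weekly totals, assuming roughly 16 waking hours per day (the 16-hour figure is an assumption for this sketch, not something stated in the study):

```python
# Back-of-envelope check of the quoted waking-hours fractions,
# assuming ~16 waking hours per day (an assumed figure, not from the study).
waking_hours_per_week = 16 * 7  # 112 hours

moms = 48.3 / waking_hours_per_week
dads = 38.9 / waking_hours_per_week

print(f"mothers multitask {moms:.0%} of waking hours")  # 43% -- more than two-fifths
print(f"fathers multitask {dads:.0%} of waking hours")  # 35% -- more than a third
```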

But the authors said an even bigger issue than the time discrepancy is the difference in the way multitasking makes working mothers and fathers feel. "There is a considerable disparity in the quality of the multitasking experience for working moms and dads," Offer said. "For mothers, multitasking is—on the whole—a negative experience, whereas it is not for fathers. Only mothers report negative emotions and feeling stressed and conflicted when they multitask at home and in public settings. By contrast, multitasking in these contexts is a positive experience for fathers."

The Offer-Schneider study relies on data from the 500 Family Study, a multi-method investigation of how middle-class families balance family and work experiences. The 500 Family Study collected comprehensive information from 1999 to 2000 on families living in eight urban and suburban communities across the United States. Most parents in the 500 Family Study are highly educated, employed in professional occupations, and work, on average, longer hours and report higher earnings than do middle-class families in other nationally representative samples. Although the 500 Family Study is not a representative sample of families in the United States, it reflects one of the most time pressured segments of the population. The Offer-Schneider study uses a subsample of 368 mothers and 241 fathers in dual-earner families from the 500 Family Study.

According to Offer and Schneider, their study shows that at least some of the difference in the way multitasking makes working mothers and fathers feel is related to the types of activities they perform.

"When they multitask at home, for example, mothers are more likely than fathers to engage in housework or childcare activities, which are usually labor intensive efforts," Offer said. "Fathers, by contrast, tend to engage in other types of activities when they multitask at home, such as talking to a third person or engaging in self-care. These are less burdensome experiences."

The study found that among working mothers, 52.7 percent of all multitasking episodes at home involve housework, compared to 42.2 percent among working fathers. Additionally, 35.5 percent of all multitasking episodes at home involve childcare for mothers versus 27.9 percent for fathers.

The authors also believe that multitasking—particularly at home and in public—is a more negative experience for working mothers than for fathers because mothers' activities are more susceptible to outside scrutiny.

"At home and in public are the environments in which most household- and childcare-related tasks take place, and mothers' activities in these settings are highly visible to other people," Schneider said. "Therefore, their ability to fulfill their role as good mothers can be easily judged and criticized when they multitask in these contexts, making it a more stressful and negative experience for them than for fathers."

Working fathers don't typically face these types of pressures, the authors said. "Although they are also expected to be involved in their children's lives and do household chores, fathers are still considered to be the family's major provider," Offer said. "As a result, fathers face less normative pressures and are under less scrutiny when they perform and multitask at home and in public."

So, what can be done to improve the situation for mothers? It's pretty simple—fathers need to step up.

"The key to mothers' emotional well-being is to be found in the behavior of fathers," Offer said. "I think that in order to reduce mothers' likelihood of multitasking and to make their experience of multitasking less negative, fathers' share of housework and childcare has to further increase."

Policymakers and employers can help facilitate this, the authors said. "Policymakers and employers should think about how to alter current workplace cultures, which constitute serious obstacles when it comes to getting fathers more involved in their families and homes," Offer said.

"For example, I think that fathers should have more opportunities to leave work early or start work late, so they can participate in important family routines; to take time off for family events; and to limit the amount of work they bring home, so they can pay undivided attention to their children and spouse during the evening hours and on weekends. The goal is to initiate a process that will alter fathers' personal preferences and priorities and eventually lead to more egalitarian norms regarding mothers' and fathers' parenting roles."

Contacts and sources:
Daniel Fowler
American Sociological Association

About the American Sociological Association and the American Sociological Review

The American Sociological Association (www.asanet.org), founded in 1905, is a non-profit membership association dedicated to serving sociologists in their work, advancing sociology as a science and profession, and promoting the contributions to and use of sociology by society. The American Sociological Review is the ASA's flagship journal.

Health Gap Has Grown Among Young US Adults, Study Finds

Levels of health disparity have increased substantially for people born in the United States after 1980, according to new research.

The study also found that health disparity tends to increase as people move into middle age, before declining as people reach old age.

These two results suggest that the gap between the healthiest and least healthy people in the United States as a whole will grow larger for the next one or even two decades as the younger generations grow older and replace previous generations.

"As young people today reach middle age and preceding cohorts with a smaller health gap die off, we expect health disparities in the whole population to grow even larger," said Hui Zheng, lead author of the study and assistant professor of sociology at Ohio State University.

A lot will depend on whether future generations will continue the trend, seen in post-baby boomers, of large health disparities.

"If that trend continues, as I expect it will, health disparities in the whole population will increase in the coming decades," Zheng said.

The health gap has not always been growing, according to the study. Health disparities continuously declined from those born early in the 20th century to the baby boomer cohort, before increasing for post-baby boomer cohorts, especially those born after 1980.

Zheng conducted the study with Yang Yang of the University of North Carolina-Chapel Hill and Kenneth Land of Duke University. Their results appear in the December issue of the journal American Sociological Review.

This study provides one of the clearest, most comprehensive pictures ever of health disparities in the United States because of a methodological innovation, Zheng said.

Zheng and his colleagues combined two statistical models that allowed them, for the first time, to disentangle how health disparity over time is affected by three factors: people's age, when they were born, and the time period when their health is assessed.

"We have never before been able to look at all three of these factors together and see how each interacts with the others to affect changes in health disparities," Zheng said.
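Why these three factors are so hard to separate can be shown in a few lines: age, period, and cohort are linked by an exact identity (age = survey year minus birth year), so a naive regression including all three is unidentified, which is what the combined-model innovation works around. The sketch below uses made-up respondents, not NHIS data:

```python
# Illustration of the age-period-cohort identification problem:
# age = period - cohort holds exactly, so the three predictors are
# linearly dependent. Synthetic, hypothetical respondents only.
import numpy as np

rng = np.random.default_rng(0)
n = 200
cohort = rng.integers(1900, 1985, size=n)  # birth year
period = rng.integers(1984, 2008, size=n)  # survey year
age = period - cohort                      # exact linear identity

X = np.column_stack([age, period, cohort])
print(np.linalg.matrix_rank(X))  # 2 -- one column is redundant, so OLS on all three fails
```

Because the design matrix has rank 2 rather than 3, an ordinary regression cannot attribute variation uniquely to age, period, or cohort; methods like the one used here impose additional structure to break that dependence.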

The study is based on data from the National Health Interview Survey for the 24-year period from 1984 to 2007. The survey, which includes about 30,000 people each year, is conducted by the National Center for Health Statistics.

The survey asked respondents to rate their own health on a five-point scale from poor to excellent. While this is a self-report and not based on any objective health data, previous studies have shown that self-reported health is a good indicator of objective health and is actually better at predicting mortality among the elderly than doctor assessments, Zheng said.

The researchers took into account a variety of other factors that may affect health, including gender, race, marital status, work status, education and income.

Overall, the study found that late baby boomers – those born from 1955 to 1964 – reported better health than any other generation. In addition, self-rated health has significantly declined since the late 1990s.

One of the key findings was the large gap in self-reported health that opened up for people born since 1980. That means people are more spread out among the five health categories, from excellent to poor, Zheng said.

This data can't explain why health disparities grew, but research by other sociologists provides potential explanations, Zheng said.

For one, income inequality increased dramatically in the past three decades in the United States, which could impact accessibility to health care and other important resources.

Also, an increase in immigrants, both documented and undocumented, has probably changed the distribution of health ratings in the country, while the growing obesity crisis has added to those in poor health.

Finally, a growing "digital divide" in access to medical and health information on the internet has created disparities in health knowledge among different populations, which can affect health choices and outcomes.

The main reason that health disparity is expected to grow in the whole population in the coming decades has to do with what is happening among young adults born since 1980, Zheng said.

Current young adults have a larger health gap than preceding cohorts and, in addition, disparity rises as people move from youth to middle age, peaking at about age 55. Disparity then declines among the elderly, according to the study.

Those two factors will work together to increase disparity in the whole population as young adults replace their elders in the population.

Most young people are generally healthy, which keeps disparities low, Zheng said. As people age and some develop health problems and diseases, disparity grows. But these disparities fall again in old age as sicker people die and only the healthier people remain. The narrower disparities in old age may also result from the fact that all older people suffer frailty and tend to share the same health risk factors. Another factor may be equalization of health care usage and protections through Medicare coverage after age 65.

The study also found gender differences in health across the lifespan. There is a relatively large gap in early adulthood, with men being healthier than women. This gap narrows until about age 61, as men are more likely to develop severe forms of chronic conditions, such as heart disease.

The gap widens again after age 61, as only the healthier men survive and there is a relatively larger share of women with poorer health alive at older ages.

While this study focused on health, Zheng said this new model can be used to study other types of inequality, such as income, education or wealth.

"This model provides a powerful framework to identify and study the evolution of inequalities across age, period and cohorts," Zheng said.


Contacts and sources:
Written by Jeff Grabmeier
Hui Zheng
Ohio State University

Want To Defeat A Proposed Public Policy? Just Label Supporters As “Extreme”

 New research shows how support for a generally liked policy can be significantly lowered, simply by associating it with a group seen as “radical” or “extreme.”

In one experiment, researchers found that people expressed higher levels of support for a gender equality policy when the supporters were not specified than when the exact same policy was attributed to “radical feminist” supporters.

These findings show why attacking political opponents as “extremists” is so popular – and so effective, said Thomas Nelson, co-author of the study and associate professor of political science at Ohio State University.

“The beauty of using this ‘extremism’ tactic is that you don’t have to attack a popular value that you know most people support,” Nelson said.

“You just have to say that, in this particular case, the supporters are going too far or are too extreme.”

Nelson conducted the study with Joseph Lyons and Gregory Gwiasda, both former graduate students at Ohio State. The findings were published in a recent issue of the journal Political Psychology.

For the study, the researchers did several related experiments.

In one experiment, 233 undergraduate students were asked to read and comment on an essay that they were told appeared on a blog. The blog entry discussed the controversy concerning the Augusta National Golf Club’s “men only” membership policy. The policy caused a controversy in 2003 before the club hosted the Masters Tournament.

Participants read one of three versions of an essay which argued that the PGA Tour should move the Masters Tournament if the club refused to change this policy.

One group read that the proposal to move the tournament was led simply by “people” and “citizens.” Another group read that the proposal was led by “feminists.” The third group read that the proposal was led by “radical feminists,” “militant feminists,” and “extremists.” Additional language reinforced the extremist portrayals by describing extreme positions that the groups allegedly held on other issues, such as getting rid of separate locker room and restroom facilities for men and women.

Participants were then asked to rate how much they supported Augusta changing its membership rules to allow women members, whether they supported the Masters tournament changing its location, and whether, if they were a member, they would vote to support female membership at the club.

The findings showed that participants were more supportive of the golf club and its rules banning women when the proposal to move the tournament was attributed to “radical feminists.” They were also less likely to support moving the tournament, and less likely to support female membership.

“All three groups in the study read the exact same policy proposals. But those who read that the policy was supported by ‘radical feminists’ were significantly less likely to support it than those who read it was supported by ‘feminists’ or just ‘citizens,’” Nelson said.

By associating a policy with unpopular groups, opponents are able to get people to lose some respect for the value it represents, like feminism or environmentalism, Nelson said.

The researchers were able to show that in a separate experiment. In this case, 116 participants read the same blog entry used in the previous experiment. Again, the blog entry supported proposals to allow women to join the golf club. One version simply attributed the proposal to citizens, while the other two attributed it to feminists or radical feminists.

Next, the subjects ranked four values in order of their importance as they thought about the issue of allowing women to join the club: upholding the honor and prestige of the Masters golf tournament; freedom of private groups to set up their own rules; equal opportunities for both men and women; and maintaining high standards of service for members of private clubs.

How people felt about the relative importance of these values depended on what version of the essay they read.

Of those participants who read the proposal attributed simply to citizens, 42 percent rated equality above the other three values. But only 32 percent who read the same proposal attributed to extremists thought equality was the top value.

On the other hand, 41 percent rated group freedom as the top value when they read the proposal attributed to citizens. But 52 percent gave freedom the top ranking when they read the proposal attributed to extremists.

“Tying the proposal to feminist extremists directly affected the relative priority people put on gender equality vs. group freedom, which in turn affected how they felt about this specific policy,” Nelson said.

“Perhaps thinking about some of the radical groups that support gender equality made some people lose respect for that value in this case.”

This tactic of attacking a policy by tying it to supposedly extremist supporters goes on all the time in politics, Nelson said.

For example, foes of President Obama’s health-care reform initiative attacked the policy by calling Obama a “socialist” and comparing the president to Adolf Hitler.

These tactics can work when people are faced with competing values and are unsure what their priorities should be, Nelson said.

Environmental values, for example, may sometimes conflict with economic values if clean air or clean water laws make it more difficult for companies to earn a profit.

“If you want to fight against a proposed environmental law, you can’t publicly say you’re against protecting the environment, because that puts you in the position of fighting a popular value,” Nelson said.

“So instead, you say that proponents of the proposed law are going to extremes, and are taking the value too far.”

One problem with this tactic for society, though, is that it can hurt support of the underlying values, as well as the specific policy.

“If you use this extremism language, it can make people place less of a priority on the underlying value. People may become less likely to think environmentalism or gender equality are important values.”


Contacts and sources:
Thomas Nelson
Written by Jeff Grabmeier
Ohio State University

Vaccine Targeting Latent TB Enters Clinical Testing

Statens Serum Institut and Aeras today announce the initiation of the first Phase I clinical trial of a new candidate TB vaccine designed to protect people latently infected with TB from developing active TB disease. The trial is being conducted by the South African Tuberculosis Vaccine Initiative (SATVI) at its field site in Worcester, in the Western Cape province of South Africa. Dr. Hassan Mahomed is the principal investigator.

"Two billion men, women and children live with latent TB infection," said Jim Connolly, President and Chief Executive Officer of Aeras. "It's daunting to comprehend that there is a vast reservoir of people with a 5-10% lifetime risk of becoming sick with TB. A vaccine that prevents TB disease in this population could save millions of lives, and this trial is a first step in assessing a vaccine candidate designed for this purpose."

The candidate TB vaccine (SSI H56-IC31) is a subunit vaccine containing recombinant TB proteins formulated in a proprietary adjuvant IC31® from Intercell. It is being developed by a consortium of researchers led by Peter Andersen at the Statens Serum Institut (SSI) based in Copenhagen. The consortium is supported as part of the Grand Challenges in Global Health, an initiative that fosters scientific breakthroughs needed to prevent, treat and cure diseases of the developing world.

"The development of urgently needed new TB vaccines requires a global effort," said Prof. Peter Andersen, the Vice President of Vaccine Research & Development at SSI. "The advancement of this candidate from an idea to the clinic working in collaboration first with the Grand Challenges consortium and now with Aeras and SATVI is an important and exciting milestone for all the researchers involved."

This clinical trial will be the first to test this vaccine candidate in people. It will assess the safety and immunogenicity of SSI H56-IC31 in 25 adults, including participants with and without latent TB infection. SSI H56-IC31 has been tested in several pre-clinical studies with no safety concerns, and has shown efficacy in small animal models when administered both before infection and to latently infected animals. The vaccine was also shown to control clinical disease and reactivation in a non-human primate model. This is the first time a South African research institute has led a first-in-human Phase I clinical trial of a new TB vaccine.

"SATVI is delighted to be part of the trial at this early stage, which is testament to the high-regard that the developers have for our TB vaccine clinical research expertise to conduct these crucial early trials in humans," said SATVI Director, Professor Willem Hanekom.

SSI H56-IC31 is being developed for both adolescent and adult populations. The trial has been approved by the Medicines Control Council of South Africa. Preliminary results of this trial are expected at the end of 2012.


Contacts and sources:
Annmarie Leadman
Aeras

About Statens Serum Institut (SSI)

SSI (http://www.ssi.dk) is a state owned enterprise under the Danish Ministry of Health and Prevention. The Institute is integrated in the national Danish health services. SSI's mission is to prevent and control infectious diseases, biological threats, and congenital disorders. The institute strives to be a highly-regarded and internationally recognized research, production and service enterprise.

About Aeras

Aeras (http://www.aeras.org) is a non-profit product development organization dedicated to the development of effective vaccines and biologics to prevent TB across all age groups in an affordable and sustainable manner. Aeras has invented or supported the development of six TB vaccine candidates, which are undergoing Phase I and Phase II clinical testing in Africa, Asia, North America and Europe. Aeras receives funding from the Bill & Melinda Gates Foundation, other private foundations, and governments. Aeras is based in Rockville, Maryland, USA where it operates a state-of-the-art manufacturing and laboratory facility, and Cape Town, South Africa.

Gone With The Wind – Why The Fast Jet Stream Winds Cannot Contribute Much Renewable Energy After All

The assumption that high jet stream wind speeds in the upper atmosphere correspond to high wind power has now been challenged by researchers of the Max Planck Institute for Biogeochemistry in Jena, Germany.

Taking into account that the high wind speeds result from the near absence of friction and not from a strong power source, Axel Kleidon and colleagues found that the maximum extractable energy from jet streams is approximately 200 times less than reported previously. Moreover, climate model simulations show that energy extraction by wind turbines from jet streams alters their flow, and this would profoundly impact the entire climate system of the planet.

Artificial picture illustrating potential extraction of jet stream wind power.
Credit: Max Planck Institute for Biogeochemistry

Jet streams are regions of continuous wind speeds greater than 25 m/s that occur at altitudes of 7-16 km. Their high speeds seem to suggest an almost unlimited source of renewable energy that would only need airborne wind energy technology to utilize it. Claims that this potential energy source could “continuously power all civilization” sparked large investments into exploitation of this potential energy resource. 
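The intuitive appeal of those speeds can be sketched with the standard kinetic power density of a wind flow, P/A = ½ρv³. The numbers below are illustrative assumptions, not figures from the study (air at jet stream altitude is taken as roughly 0.4 kg/m³, about a third of the sea-level value):

```python
# Kinetic power density of an air stream: P/A = 0.5 * rho * v**3.
# Assumed illustrative values: rho ~ 0.4 kg/m^3 at jet stream altitude,
# v = 25 m/s as the threshold speed quoted above; a 10 m/s surface wind
# at sea-level density (~1.2 kg/m^3) for comparison.

def power_density(rho, v):
    """Kinetic energy flux per unit area (W/m^2) through a cross-section."""
    return 0.5 * rho * v**3

jet = power_density(0.4, 25.0)      # jet stream threshold speed
surface = power_density(1.2, 10.0)  # brisk surface wind for comparison

print(f"jet stream: {jet:.0f} W/m^2")      # -> 3125 W/m^2
print(f"surface:    {surface:.0f} W/m^2")  # -> 600 W/m^2
```

Note that this flux measures energy being transported past a point, not energy being generated; as the article goes on to explain, that distinction is exactly where the earlier estimates went wrong.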

 However, just like any other wind and weather system on Earth, jet streams are ultimately caused by the fact that the equatorial regions are heated more strongly by the sun than are polar regions. This difference in heating results in large differences in temperature and air pressure between the equator and the poles, which are the driving forces that set the atmosphere into motion and create wind. It is this differential heating that sets the upper limit on how much wind can be generated and how much of this could potentially be used as a renewable energy resource.

It is well known in meteorology that the high wind speeds of jet streams result from the near absence of friction. In technical terms, this fact is referred to in meteorology as “geostrophic flow”. This flow is governed by an accelerating force caused by pressure differences in the upper atmosphere, and the so-called Coriolis force arising from the Earth’s rotation. Because the geostrophic flow takes place in the upper atmosphere, far removed from the influence of the surface and at low air density, the slow-down by friction plays a very minor role. Hence, it takes only very little power to accelerate and sustain jet streams. 
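The geostrophic balance described above can be written as v_g = |∇p| / (ρf), where f is the Coriolis parameter. A minimal sketch, using assumed illustrative values rather than numbers from the study, shows that a very modest pressure gradient is enough to sustain jet-strength winds once friction is negligible:

```python
# Geostrophic wind speed from the balance of the pressure-gradient force
# and the Coriolis force: v_g = |dp/dn| / (rho * f).
# All input values are assumed for illustration.

import math

def geostrophic_speed(dp_dn, rho, lat_deg):
    """Wind speed (m/s) for a horizontal pressure gradient dp_dn (Pa/m)
    in air of density rho (kg/m^3) at latitude lat_deg (degrees)."""
    omega = 7.292e-5  # Earth's rotation rate (rad/s)
    f = 2 * omega * math.sin(math.radians(lat_deg))  # Coriolis parameter
    return dp_dn / (rho * f)

# ~1.6 hPa per 100 km at 45 degrees latitude, thin upper-level air.
v = geostrophic_speed(1.6e-3, 0.4, 45.0)
print(f"{v:.0f} m/s")  # -> 39 m/s
```

Because no friction term appears in this balance, the flow needs almost no power input to keep going, which is the article's key point.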

Graphics depicting the calculations for high kinetic energy transport (upper panel) versus maximum kinetic energy extraction rates (lower panel) from jet streams. Please note the units.
Credit: Max Planck Institute for Biogeochemistry

“It is this low energy generation rate that ultimately limits the potential use of jet streams as a renewable energy resource”, says Dr. Axel Kleidon, head of the independent Max Planck Research Group ‘Biospheric Theory and Modelling’. Using this approach based on atmospheric energetics, Kleidon’s group used climate model simulations to calculate the maximum rate at which wind energy can be extracted from the global atmosphere. Their estimate of a maximum of 7.5 TW (1 TW = 10^12 W) is 200 times less than previously reported and could cover merely about half of the global human energy demand of 17 TW in 2010.
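The headline figures are easy to check against each other. The sketch below uses only the numbers quoted in the article:

```python
# Back-of-the-envelope check of the article's figures (all from the text).
max_extractable_tw = 7.5   # Kleidon group's estimate of extractable power
reduction_factor = 200     # "200 times less than previously reported"
demand_2010_tw = 17.0      # global human energy demand in 2010

earlier_estimate_tw = max_extractable_tw * reduction_factor
share_of_demand = max_extractable_tw / demand_2010_tw

print(f"implied earlier estimate: {earlier_estimate_tw:.0f} TW")  # -> 1500 TW
print(f"share of 2010 demand:     {share_of_demand:.0%}")         # -> 44%
```

The 44% share is what the article rounds to "about half" of global demand.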

Max Planck researchers also estimated the climatic consequences that would arise if jet stream wind power were used as a renewable energy resource. As any wind turbine must add some drag to the flow to extract the energy of the wind and convert it into electricity, the balance of forces of the jet stream must also change as soon as energy is extracted. If 7.5 TW were extracted from jet streams as a renewable energy source, this would alter the natural balance of forces that shape the jet streams to such an extent that the driving atmospheric pressure gradient between the equator and the poles is depleted. 

“Such a disruption of jet stream flow would slow down the entire climate system. The atmosphere would generate 40 times less wind energy than what we would gain from the wind turbines”, explains Lee Miller, first author of the study. “This results in drastic changes in temperature and weather”.

 Contacts and sources:
Max Planck Institute for Biogeochemistry

Citation:  L.M. Miller, F. Gans, & A. Kleidon, 2011: Jet stream wind power as a renewable energy resource: little power, big impacts; Earth Syst. Dynam. 2, 201–212, 2011, doi: 10.5194/esd-2-201-2011.

Switch Your Car To 'Autopilot' And Don’t Worry About Collisions: Faster Transistors Drive Better Safety And Security

Imagine being able to switch your car to 'autopilot' and not having to worry about collisions. EU-funded researchers are working on new technologies, such as longer-range car radar, which could make such idle dreams possible. And the early results are already in commercial production.

Credit:  © Shutterstock

But such applications need to use higher radio frequency electronics than ever before, and they therefore rely on the development of new, faster microchips in order to work. The EU-funded project 'Towards 0.5 Terahertz Silicon/Germanium hetero-junction bipolar technology', or 'DotFive', has developed faster transistors that will lay the foundations for these new technologies. 


Increasing the running speed of microelectronics can open up new application areas: high-speed wireless communications, car collision avoidance or high-definition non-invasive imaging for security scanners. But microcircuits that can operate at over 100GHz, necessary to implement these new products, demand performance of three times this speed at transistor level.

This is where the three-year DotFive project comes in, with the goal of designing 'Hetero-junction bipolar transistors' (HBTs) that could reach 500GHz (or 0.5THz). The challenge was to double the frequency compared to the state of the art at the submission of the project.

'And we did reach those numbers!' says the project coordinator Gilles Thomas of STMicroelectronics, France.

The project consortium included four technology providers: two companies, Infineon and STMicroelectronics, and two research institutes. Consult the DotFive project record on CORDIS for more partner details. All partners made substantial progress; the two research institutes have produced transistors running at the target speed, with Germany's IHP Microelectronics GmbH getting the best results so far.

The DotFive team tried more than one approach to the problem: one of the project's work packages tried to build on existing architectures, while another tried 'breakthrough' architectures. As Mr Thomas explains: 'The architecture that did best is the one that minimises most of the "parasitic effects" in the transistor (such as capacitances, resistances and access resistance) and shows the best "self-alignment" of base, emitter and collector.'

Working together

One of the key successes of the project was therefore getting all the partners aligned on using the same methodology, electrical characterisation technique and modelling techniques so that the results were comparable among the project participants.

'In order to operate at these speeds, we had to understand factors never encountered before, as well as the physics of their effects,' says Mr Thomas. 'This could not be done without collaboration.'

The kind of technology development carried out by DotFive is pre-competitive, he explains further. The teams shared their 'Computer aided design' (CAD) platforms, measurement techniques, model parameters and some data processing.

Just as when the GSM standard for mobile phones was agreed, even if companies involved in a new technology are in competition with each other at the product level, they still need to collaborate to develop and agree the basic technology and standards.

Europe is the leading producer of this type of technology, according to Mr Thomas. 'We established the roadmap for radio frequency (RF) technology, so we'd better cooperate to maintain our lead.'

This is one reason why the EU contributed funding of EUR 9.7 million towards an overall project budget of EUR 14.74 million. 'In order to produce the products that will enlarge the market five years down the road, we need to talk to each other today,' he suggests.

Results in commercial production

'In terms of commercialisation,' continues Mr Thomas, 'we wanted to complete three cycles of learning - with incremental improvement of the design, process and tools - over the three years of the project.'

The results from the first year's cycle are already embedded in the circuit designs in preparation, with an increase in circuit speeds from 77GHz to 120GHz.

'We are now in the qualifying stage of the results of the third cycle,' says Mr Thomas, with radar demos running at 140GHz.

'Car radars have moved into a new generation thanks to this project,' says Mr Thomas. In addition to the 77GHz band allocated by international standards, the project expects a new 120GHz band to be opened up for longer-range radar.

'We would also like to develop imaging systems using millimetre waves,' he says. These lie above the 100GHz range, between microwaves and infra-red radiation. Such imaging systems could contribute to public safety by improving security scanners.

'Currently, such systems exist but they are expensive, bulky and use a lot of electricity,' he explains, as they are built from discrete components not microcircuits. And because they don't use integrated components, they cannot be assembled into large arrays, meaning their resolution remains poor.

'For scanners, if we can succeed with the miniaturisation and integration of our new high-speed components on silicon,' says Mr Thomas, 'it will be like moving from 1950s computers, which filled an air-conditioned room, to PCs.'

The different project partners are now taking different routes to market with their products. Having established the basic technologies, the work is moving from pure research towards commercial development.

'We have now begun a new project, funded by the Eureka programme's Catrene cluster on microelectronics, to develop a BiCMOS technology based on 500GHz HBTs plus digital CMOS for industrial production,' says Mr Thomas.

These would be able to integrate the RF components with the digital image processing on the same chip. Such revolutionary microcircuits may well contribute to continuing European success in these markets, as well as changing our lives through radically new applications.

DotFive was funded by the EU's Seventh Framework Programme (FP7), under the ICT sub-programme and budget line for 'Next-generation nanoelectronics components and electronics integration'. 

Scientists Create Continent-Wide Telecoms Test Lab


EU-funded researchers have developed new tools and processes to link up or 'federate' specialised telecoms research test laboratories and infrastructure. The end result is an award-winning pan-European platform which makes it faster and cheaper for companies to test and deliver new telecoms devices and services under more realistic conditions and at a larger scale. And commercial spin-offs beckon.

Credit: © Shutterstock

Last year, smart phone sales exceeded those of PCs for the first time, and since the advent of the iPad in April 2010, highly portable tablet computers have fast become the most popular way to access the internet. Both smart phones and tablets typically use a mix of WiFi and 3G networks for voice, text, video chat and internet access.

These trends indicate that communications and even internet access will be increasingly mediated by always-connected mobile devices rather than desktops and laptops. This emerging access paradigm will create both new needs and opportunities, leading to new mobile products and services that will require rapid and rigorous testing. That is what the EU-funded 'Pan-European laboratory infrastructure implementation' (PII) project delivers.

The main goal of PII was to link relevant innovation clusters in Europe into a federation of telecoms testing facilities, building on the work carried out in an earlier European project which first tackled the idea of a 'pan-European lab' or Panlab for ICT testing.

Large-scale testing facilities like PII's enable hardware and software to be put through their paces under real telecoms conditions, while isolated from public networks. As such, these testbeds act as a 'sandbox', where researchers can create any service and see how it performs on a real-world network, without the risks associated with a 'live test'.

But while Europe has many of these labs, before PII they were fragmented and even the largest laboratories were relatively limited and could not provide large-scale, continent-wide testing facilities.

Reversing the fragmentation problem

'The starting point was to reverse the fragmentation of testbeds and testing infrastructures in Europe,' explains Anastasius Gavras, the coordinator of the PII project.

To achieve this, the PII team developed mechanisms and tools to describe, store, locate and orchestrate testing services. The project also created tools which simplified the work of researchers, allowing them to automatically access composite testbeds across multiple administrative domains.

The team also developed new mechanisms that will be able to combine and accommodate clean-slate testing methods in the future which enable scientists to better define the testing environment. This is a major advance, permitting telecoms engineers to prototype products and services regardless of the underlying networks in use.

One of the project's key achievements was the definition and implementation of a common abstract control framework, which enables the interconnection of diverse testbeds. PII established and elaborated quality assurance processes and tools which reassure the users of the test platform or facility that their results are reliable and handled well. The team also looked at the long-term sustainability of the federation model.

'We fulfilled all our objectives and among the many initiatives we undertook and completed, the main one is a set of tools, collectively referred to as Teagle,' explains Mr Gavras.

Teagle offers a number of central services to testbed providers and users. It makes it possible to describe testbed resources in a consistent way, and to register, manage and deliver them, even across a wide variety of different labs and technologies.

'Different resource types can be handled, such as physical and virtual machines, devices, software, as well as abstract concepts and services,' Mr Gavras notes. 'All the current Teagle and federation framework prototypes were developed in the Panlab/PII project and released as open source.'

Faster and cheaper testing

The PII results will let companies test ICT services and products under more realistic conditions and at a larger scale, but the project will also have a very direct effect on telecoms innovation generally, because it makes it faster and cheaper to do interesting experiments.

PII is likely to have a rapid impact because many of the partners are already taking the work further in their own companies and organisations. Some of the partners are developing a business plan for setting up bases or 'Panlab' offices to exploit business opportunities. 'We plan to invest funds and human resources in order to find out whether the concept is financially viable and self-sustaining in the market,' Mr Gavras reveals.

Panlab will follow an internet-style entrepreneurship model. That means putting it to the market to see if it works and making any necessary adjustments along the way. Then if it doesn't work, it 'fails fast but fails cheap' says Mr Gavras. But if it works, then you 'expand fast and expand to other sectors'.

The business plan assumes the close collaboration between a number of PII project partners who own and operate testbeds. 'These testbeds will form the nucleus of the Panlab federation offering,' remarks Mr Gavras. 'Future plans will attempt to incorporate additional testbed owners.'

That is not the only vehicle which is taking the PII project's results forward. Academics and industrial researchers have picked up the results and are carrying them further as well. 'The framework and tools are considered generic, so they are being used for general purpose deployment of service platforms,' notes Mr Gavras.

In addition, an important work-in-progress is the establishment of a business offering for brokering testbed resources, including service guarantees. PII's achievements were also duly recognised when the project received an award at last year's Future Internet Awards.

The research is set to continue under another FP7 project, called OpenLab. Started in September 2011, it brings together the essential ingredients for an open, general purpose and sustainable large-scale experimental platform. This shared resource will build on PII's work to provide a means for developing early, successful prototypes for Future Internet Research and Experimentation. The EU support for a follow-on project is therefore a crowning achievement for PII.

The PII project received EUR 5.7 million in research funding (out of a EUR 8.38 million total budget) under the EU's Seventh Framework Programme, sub-programme 'New paradigms and experimental facilities'.


Contacts and sources:
Anastasius Gavras, Eurescom, Germany, coordinator of the PII project
CORDIS Features

Research Shows The Power Of Hand Gestures In Police Interviews

A University of Hertfordshire PhD student who graduated earlier this month conducted research which proved that hand gestures influence eyewitnesses in police interviews.

In a research thesis entitled The Misleading Potential of Communicative Hand Gestures in a Forensic Interview, Daniel Gurney conducted a series of four studies based on role plays of police interview scenarios which proved that hand gestures can exert an influence on witnesses and skew their responses when questioned.

“We found that eyewitnesses could be led to believe they saw something they didn’t when the interviewer performed misleading hand gestures,” he said. "For example, many people remembered a man having a beard when they saw the interviewer rubbing his chin.”

According to Dr Gurney, this is the first study to show that eyewitnesses can be misled non-verbally and continues research into how gestures can communicate carried out by his supervisor, Professor Karen Pine.

Contacts and sources:
University of Hertfordshire 

Excellent Heavy-Ion Performance For The Large Hadron Collider

One of the first lead-ion events recorded by ALICE in the 2011 lead-ion run


Credit: CERN

The Large Hadron Collider (LHC) has harvested a healthy crop of lead-ion collisions. In the two weeks since the beginning of the 2011 lead-ion run, some 10 times more luminosity (a measure of the number of collisions) has been delivered than in the entire 2010 lead-ion run. Analysis is in full swing for the three experiments gathering lead-ion data: ALICE, ATLAS and CMS. By studying lead-ion data, physicists probe matter as it would have been in the first instants of the Universe's life. One of the main goals is to produce tiny quantities of such matter, known as Quark Gluon Plasma, and to study how it has evolved into the kind of matter that makes up the Universe today.


Contacts and sources:
CERN

Hebrew University Researchers Discover Molecular Machinery Responsible For Production Of Proteins Involved In Bacterial Cell Death

Researchers at the Hebrew University of Jerusalem and the University of Vienna have revealed for the first time a stress-induced machinery of protein synthesis that is involved in bringing about cell death in bacteria.

Their work opens a new chapter in the understanding of protein synthesis under stress conditions, which are the conditions bacteria usually face, both in humans and elsewhere in nature, and could pave the way for the design of novel antibiotics that would help overcome serious public health problems, the researchers believe.

Prof. Hanna Engelberg-Kulka
Credit:  Hebrew University of Jerusalem

In the last 50 years, the biological machinery responsible for protein synthesis has been extensively studied, in particular in the gut bacterium Escherichia coli (E. coli). The machinery of protein synthesis operates primarily through ribosomes -- small particles present in large numbers in every living cell whose function is to convert genetic information into protein molecules -- and messenger RNAs (mRNAs), which transfer the genetic information from the genome to the ribosomes and thereby direct the synthesis of cell proteins.

In an article in a recent issue of the journal Cell, Prof. Hanna Engelberg-Kulka of the Institute for Medical Research Israel-Canada (IMRIC) at the Hebrew University–Hadassah Medical School and her students describe the discovery of a novel molecular machinery for protein synthesis that is generated and operates under stress conditions in E. coli. The work described in the Cell article was done in collaboration with the laboratory of Prof. Isabella Moll of the University of Vienna.

Their study represents a breakthrough: it shows, for the first time, that under stress conditions such as nutrient starvation or antibiotic exposure, the synthesis of a specific toxic protein is induced that changes the protein-synthesizing machinery of the bacteria. This toxic protein cleaves parts of the ribosome and the mRNAs, thereby preventing the usual interaction between these two components.

As a result, an alternative protein-synthesizing machinery is generated. It includes a specialized sub-class of ribosomes, called “stress ribosomes,” which carry out the selective synthesis of proteins directed by the cleaved mRNAs, and it is responsible for bacterial cell death.

Practically speaking, the discovery of a “stress-induced protein-synthesizing machinery” may offer a new route to the design of improved antibiotics that would exploit this stress-induced mechanism to cripple pathogenic bacteria more efficiently.


Contacts and sources:
Hebrew University of Jerusalem

Citation: In Cell, September 30, 2011 

Selective Translation of Leaderless mRNAs by Specialized Ribosomes Generated by MazF in Escherichia coli

Oliver Vesper,1 Shahar Amitai,2 Maria Belitsky,2 Konstantin Byrgazov,1 Anna Chao Kaberdina,1
Hanna Engelberg-Kulka,2,* and Isabella Moll1,*
1Max F. Perutz Laboratories, Center for Molecular Biology, Department of Microbiology, Immunobiology and Genetics, University of Vienna,
Dr. Bohrgasse 9/4, 1030 Vienna, Austria
2Department of Microbiology and Molecular Genetics, IMRIC, The Hebrew University-Hadassah Medical School, Jerusalem 91120, Israel
*Correspondence: hanita@cc.huji.ac.il (H.E.-K.), isabella.moll@univie.ac.at (I.M.)
DOI 10.1016/j.cell.2011.07.047



Mr. Potato Head: Human Brain And The Potato - Similarities Within The Mitochondrial Ion Channels


Certain elements of the membranes surrounding cellular mitochondria, responsible for transporting potassium ions, are identical in the potato and in mammalian brain, suggest researchers from the Nencki Institute of Experimental Biology of the Polish Academy of Sciences in Warsaw. Their claim is based on research conducted in collaboration with scientists from the Adam Mickiewicz University in Poznan.

Potassium ions (red) flow through the ion channel (yellow) in mitochondrial membrane. Mitochondrial potassium channels are investigated at the Nencki Experimental Biology Institute of the Polish Academy of Sciences in Warsaw. 
Source: Nencki Institute, Maciej Frołow

Membranes surrounding the mitochondria contain proteins that control the movement of ions, called ion channels. Research conducted by scientists from the Nencki Experimental Biology Institute of the Polish Academy of Sciences in Warsaw (Nencki Institute) and the Institute of Molecular Biology and Biotechnology at the Adam Mickiewicz University (IBMiB UAM) in Poznan showed that certain types of mitochondrial potassium channels in potato cells are identical, in both structure and function, to their counterparts in the mitochondria of neurons in the mammalian brain.

Mitochondria, the energy centres of cells, are organelles a few micrometres in length. They are present inside eukaryotic cells (cells with a nucleus), in numbers ranging from a few hundred to a few thousand per cell. Mitochondria are responsible for important life functions: among other things, they produce a chemical compound called adenosine triphosphate (ATP), the main carrier of chemical energy in cells. Its fundamental importance is highlighted by the fact that each day a human being turns over an amount of ATP comparable to the mass of his or her entire body.

Ion channels are proteins that allow the controlled flow of large quantities of certain types of ions. In the membranes surrounding either cells or mitochondria, there are channels specializing in the transport of potassium, sodium, calcium or chloride ions. Scientists from the Nencki Institute and IBMiB UAM investigated mitochondrial potassium channels controlled by ATP and by calcium ions.

“The problem with ion channels in the mitochondrial membranes is that they really should not exist at all. Modern models of energy production in the cell indicate that channels in mitochondrial membranes would lower the effectiveness of the process. But since the channels do exist, they must have provided significant evolutionary advantage. We, therefore, face the following question: when in the history of life on Earth has this advantage played a role?”, comments Prof. Adam Szewczyk from Nencki Institute, co-investigator.

Ion channels are opened and closed by specific activators or blockers. An example of a blocker affecting the mitochondria of human cells is iberiotoxin, a component of scorpion venom. Measurements of the current flowing through the potassium channels in potato mitochondria, taken at the Nencki Institute, have shown that these proteins not only function like mammalian mitochondrial channels, but also react to the same toxins. “This is extraordinary. Proteins responsible for the transport of potassium ions seem to be evolutionarily conserved in the mitochondria”, concludes Prof. Wiesława Jarmuszkiewicz from IBMiB UAM.

Membranes surrounding the mitochondria in potato cells and neurons in mammalian brain contain similar ion channels. Mitochondrial potassium channels are investigated at the Nencki Experimental Biology Institute of the Polish Academy of Sciences in Warsaw.
 Source: Nencki Institute, Grzegorz Krzyżewski

Ion channel research may have great medical significance. Drugs affecting mitochondrial ion channels could significantly limit the effects of heart attacks and strokes. However, introducing a new drug to the market is a very expensive and long process. For this reason the effects of mitochondrial ion channel research will likely first impact the cosmetics industry.

The Nencki Institute of Experimental Biology PAS intends to begin a long-term collaboration with the Dr. Irena Eris Cosmetic Laboratory. They have jointly submitted a project under the INNOTECH initiative run by the National Centre for Research and Development. By studying protective substances that act on mitochondrial potassium channels, they hope to develop new dermocosmetics. “If everything goes as planned, in a couple of years everyone will be able to try a new dermocosmetic product and appreciate the benefits of basic mitochondrial research”, concludes Prof. Szewczyk.

The Nencki Institute of Experimental Biology of the Polish Academy of Sciences was established in 1918 and is the largest non-university centre for biological research in Poland. Priority fields for the Institute include neurobiology, neurophysiology, cellular biology, biochemistry and molecular biology, at levels of complexity ranging from tissues and organisms through cellular organelles to proteins and genes.

The Institute houses 31 laboratories, among them a modern Laboratory of Confocal Microscopy, a Laboratory of Cytometry, a Laboratory of Electron Microscopy, and facilities for behavioural and electrophysiological testing. The Institute is equipped with state-of-the-art research equipment and a modernized animal house where laboratory animals, including transgenic animals, are bred in accordance with the highest standards. The quality of its experiments and publications, and its close ties with the international science community, place the Institute among the leading biological research centres in Europe.


Contacts and sources:
Nencki Institute of Experimental Biology

Tiny Life Forms Thriving Again In Lake Zurich

While elsewhere species extinction is proceeding at an ever-increasing rate, plankton biodiversity in Lake Zurich is apparently benefiting from rising temperatures and from the successful measures against over-fertilization begun in the 1970s. This development may have a long-term positive effect on fish diversity, although it is too soon to tell. The new species are being watched closely by the water supply company, for some of them can produce harmful substances.

Plankton biodiversity in a freshwater lake
Credit: EAWAG: Swiss Federal Institute of Aquatic Science and Technology

Not only the variety but also the number of plant and animal plankton in Lake Zurich has increased in the last 30 years, as ascertained by an Eawag research group working with experts from the Zurich Water Supply Company (WVZ). Their results have just been published in the journal Oikos. While in the 1970s about 40 phytoplankton species and only 7 zooplankton species were found in the lake, in 2008 there were more than 100 “plant” and 15 animal species. Parallel to the increase in biodiversity of these tiny organisms suspended in the lake's waters is the growth of their total biomass, probably because there are now more species that tolerate reduced nutrients and profit from warmer temperatures even in the deeper layers of the lake. Earlier, algae growth was limited to the uppermost layers.

The work was made possible by the unusually detailed chemical, physical and biological data measured at the deepest part of the lake and recorded by the Zurich Water Supply Company since 1977. Temperature, pH, phosphorus, nitrogen and light, among other parameters, were monitored regularly at 14 different depths, from the surface down to 135 m. In addition, samples of phytoplankton and zooplankton were counted, analysed and classified. Using complex statistical methods, the scientists have now evaluated the data and identified the driving factors behind the result.

Characteristic of the period since 1977 are, above all, a slight but steady increase in water temperature (circa 0.2°C) and a marked decrease in phosphorus concentration (from ca. 90 to 20 µg phosphate-P per litre). Phosphorus concentrations today also show less seasonal variation, but greater variation across the lake's depth levels.

All of these changes had a positive effect on species diversity and have led to more stable populations than existed 30 years ago. According to project leader Francesco Pomati, the lake has, put simply, developed more ecological niches in which even less competitive organisms can find space, light and nourishment to survive. The increase in phytoplankton species variety has in turn fostered the growing number of zooplankton species, as these feed on the phytoplankton. This has occurred despite the rise in water temperatures, which, according to other projects, tends to decrease zooplankton biodiversity.

Number of phytoplankton and zooplankton species in Lake Zurich, sliding 5-year averages
Credit: EAWAG: Swiss Federal Institute of Aquatic Science and Technology
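The sliding 5-year averages plotted in the figure are straightforward to compute: each point is the mean of five consecutive yearly values. A minimal sketch, using illustrative placeholder counts rather than the actual Lake Zurich data:

```python
# Sketch: sliding (moving) 5-year average, as used in the species-richness
# figure. The yearly counts below are hypothetical placeholders, not the
# actual Lake Zurich measurements.
def sliding_average(values, window=5):
    """Return the mean of each consecutive `window`-sized run of values."""
    return [sum(values[i:i + window]) / window
            for i in range(len(values) - window + 1)]

# Hypothetical yearly phytoplankton species counts, 1977 onwards.
yearly_counts = [40, 42, 45, 44, 48, 50, 53]
print(sliding_average(yearly_counts))  # -> [43.8, 45.8, 48.0]
```

Smoothing over five years in this way suppresses year-to-year sampling noise so that the long-term trend in species richness stands out.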

“The warming of the climate and the successful reduction of over-fertilization are leading to more variation over the whole range of lake depths. This means the presence of more species in a given space”, says Pomati. The aquatic biologist is convinced that the results from Lake Zurich also apply to other lakes of similar depth. “And our work should certainly enrich the discussion about the detrimental effects of environmental change brought about by humans”, he says.

Among the species profiting from the changed circumstances are also some not appreciated by everyone, for example Planktothrix rubescens, which can produce toxic microcystins. For this cyanobacterium, the stable thermal layering of the lake and the supply of phosphate in the deeper layers of water are advantageous. Its increased growth is watched especially carefully by the Zurich Water Supply Company. “We’re keeping a very close eye on this development, especially at those depths where lake water is taken in for the water utilities”, says microbiologist Oliver Köster from the WVZ. There is certainly no cause for alarm among consumers, for today's filters and oxidizing agents such as ozone dependably ensure that the organisms and their harmful substances do not get into Zurich's water pipes.


Contacts and sources:
EAWAG: Swiss Federal Institute of Aquatic Science and Technology

Citation: Pomati, F., Matthews, B., Jokela, J., Schildknecht, A. and Ibelings, B. W. (2011), Effects of re-oligotrophication and climate warming on plankton richness and community stability in a deep mesotrophic lake. Oikos. doi: 10.1111/j.1600-0706.2011.20055.x

10 Good Reasons Parents Want TV Show Ratings

With the advent of cable and satellite TV, households can now receive hundreds of different channels. This is an overwhelming number of television shows for concerned parents to monitor. How could they possibly filter out inappropriate viewing for their young children while still being able to watch the more mature shows they enjoy after the children have gone to bed? In 1997, TV Parental Guidelines were implemented to help parents filter out television programs they don’t want their children to watch. Here are 10 good reasons parents want TV show ratings.

  1. Helpful tool – Even though these TV ratings are less than perfect, they are a helpful tool for parents to use. By examining what the different ratings are and what they stand for, parents have a starting point to work from when determining which shows they will allow their children to watch.
  2. Watchdogs – The TV Parental Guidelines Monitoring Board determines how the various shows are rated, so there is a panel of watchdogs reviewing these programs for parents. The panel consists of experts from the television industry and public interest advocates, and it receives complaints from concerned parents who may not agree with its ratings.
  3. Volume of shows – The sheer volume of shows available on television makes it impossible for parents to review them all. The ratings system helps to sort a multitude of programs by age group. For instance, any show rated TV-MA is intended for mature audiences only and is not appropriate for children.
  4. At a glance – Once parents are familiar with the various ratings, they can tell at a glance whether the show should be safe for young children to watch or may need further review.  The ratings appear in the upper left-hand corner of the TV screen at the beginning of the program and again after each commercial break.
  5. Saves time – The TV show ratings system saves precious time for busy parents. As mentioned before, nobody wants to take the time to review the massive number of shows available. The ratings are visible at a glance and whole groups of shows can be blocked using the V-Chip technology built into most television sets.
  6. V-Chip – Since 2000, television sets have been equipped with V-Chip technology to help parents filter out programming they feel is inappropriate for their children. The V-Chip reads the rating transmitted with each program, and on-screen menus can be used to block whichever ratings parents choose.
  7. Can’t always be there – Since parents can’t always be there when their children are watching TV, blocking adult programming gives them some control even when they’re not at home. Although the system isn’t perfect, it can be improved with monitoring and adjustments.
  8. Can be used for discipline – Parents can even use the TV show ratings as a form of discipline. The various ratings can be used for either punishment or rewards. Parents can block violent shows for kids who get into a fight or unblock them as a reward for staying out of trouble.
  9. Flexibility – The seven different ratings and the five different content labels give parents a wide range of flexibility when determining which shows may or may not be appropriate. Each family has different values and concerns. What may be considered taboo for some parents could be ok with others, so these ratings take that into account.
  10. Peace of mind – Parents who are diligent about using the TV parental guidelines can have more peace of mind about what their kids are watching even when they’re not being supervised.
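The core of the V-Chip mechanism described above is simple: ratings form an ordered scale, parents pick a threshold, and any program rated above it is blocked. A minimal sketch of that comparison, using the main TV Parental Guidelines age categories (the function and variable names are illustrative, not an actual V-Chip API):

```python
# Sketch: V-Chip-style blocking by rating threshold. The main age-based
# ratings are listed from most to least child-friendly; content labels
# (V, S, L, D, FV) are omitted for simplicity.
RATING_ORDER = ["TV-Y", "TV-Y7", "TV-G", "TV-PG", "TV-14", "TV-MA"]

def is_blocked(program_rating, max_allowed):
    """Block any program rated above the parent's chosen threshold."""
    return RATING_ORDER.index(program_rating) > RATING_ORDER.index(max_allowed)

print(is_blocked("TV-MA", "TV-PG"))  # True: blocked
print(is_blocked("TV-G", "TV-PG"))   # False: allowed
```

A real V-Chip also honors the separate content labels, so a household could, for example, allow TV-14 in general while still blocking TV-14 programs flagged for violence.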
Many television and movie producers use violence, explicit sex and foul language to compete with each other. Parents need some form of control to limit the amount of this content their children are exposed to. Very young children are simply not capable of distinguishing well between what is real and what is fantasy on TV. Although some kids may not be happy about how their parents use the ratings system to control their television viewing, it’s a valuable tool for families.



Contacts and sources:
Yelin George
http://www.cabletvproviders.net/blog/2011/10-good-reasons-parents-want-tv-show-ratings/