Sunday, July 31, 2011

Doctors: Colon Cleansing Has No Benefit But Many Side Effects Including Vomiting And Death

Colon cleansing - it's been described as a natural way to enhance well-being, but Georgetown University doctors say there's no evidence to back that claim. In fact, their review of scientific literature, published today in the August issue of The Journal of Family Practice, demonstrates that colon cleansing can cause side effects ranging from cramping to renal failure and death.

The procedure, sometimes called colonic irrigation or colonic hydrotherapy, often involves the use of chemicals followed by flushing the colon with water through a tube inserted in the rectum. The practice has ancient roots and was discredited by the American Medical Association in the early 1900s, yet it has staged a comeback.

"There can be serious consequences for those who engage in colon cleansing whether they have the procedure done at a spa or perform it at home," says the paper's lead author, Ranit Mishori, M.D., a family medicine physician at Georgetown University School of Medicine. "Colon cleansing products in the form of laxatives, teas, powders and capsules with names such as Nature's Bounty Colon Cleaner tout benefits that don't exist." She also says it's important to remember the U.S. Food and Drug Administration has no authority to monitor these products.

Mishori and her colleagues examined 20 studies published in the medical literature over the last decade. She says that while these reports show little evidence of benefit, there is an abundance of studies noting side effects following the use of cleansing products, including cramping, bloating, nausea, vomiting, electrolyte imbalance and renal failure.

"Some herbal preparations have also been associated with aplastic anemia and liver toxicity," she says.

And Mishori points out that colon cleansing services are increasingly being offered at spas or clinics by practitioners who call themselves 'colon hygienists' but have no significant medical training. In fact, organizations such as the National Board for Colon Hydrotherapy and others that promote colon cleansing require hygienists to have little more than a high school diploma.

Mishori says there are much better ways to enhance well-being: "Eat a balanced diet, exercise regularly, get six to eight hours of sleep and see a doctor regularly."

In addition to Mishori, other authors include Aye Otubu, M.D., M.P.H. and Aminah Alleyne Jones, M.D., M.P.H. of the Georgetown University and Providence Hospital Family Medicine Residency Program in Washington, D.C. The authors report no personal financial interests related to the study.

Contacts and sources:
Karen Mallet
Georgetown University Medical Center

US Sets Drought Monitor's 'Exceptional Drought' Record In July

Worst classification for drought in nearly 12 percent of contiguous US.

US Drought Monitor, July 26, 2011

The percent of contiguous U.S. land area experiencing exceptional drought in July reached the highest levels in the history of the U.S. Drought Monitor, an official at the National Drought Mitigation Center at the University of Nebraska-Lincoln said.

Nearly 12 percent of the contiguous United States fell into the "exceptional" classification during the month, peaking at 11.96 percent on July 12. That level of exceptional drought had never before been seen in the monitor's 12-year history, said Brian Fuchs, UNL assistant geoscientist and climatologist at the NDMC.

The monitor uses a ranking system that begins at D0 (abnormal dryness) and moves through D1 (moderate drought), D2 (severe drought), D3 (extreme drought) and D4 (exceptional drought).

Exceptional drought's impacts include widespread crop and pasture losses, as well as shortages of water in reservoirs, streams and wells, creating water emergencies.

Currently, 18 percent of the country is classified as under either extreme or exceptional drought, Fuchs said. Much of it is in the south, particularly Texas, where the entire state is experiencing drought -- three-fourths considered exceptional.

The most recent drought monitor report, released late last week, indicated that 59 percent of the United States was drought-free, while 41 percent faced some form of abnormal dryness or drought. Two weeks ago, 64 percent of the country was drought-free.

Other states that are at least 85 percent abnormally dry or in drought according to the report include:
  • New Mexico (100 percent in drought, 48 percent exceptional)
  • Louisiana (100 percent abnormally dry or in drought, 33 percent exceptional)
  • Oklahoma (100 percent abnormally dry or in drought, 52 percent exceptional)
  • South Carolina (97 percent abnormally dry or in drought, 16 percent extreme to exceptional)
  • Georgia (95 percent abnormally dry or in drought, 68 percent extreme to exceptional)
  • Arkansas (96 percent abnormally dry or in drought, 6 percent extreme to exceptional)
  • Florida (89 percent abnormally dry or in drought, 20 percent extreme to exceptional)
In the next two to three weeks, some affected areas may see improvement. The wake of Tropical Storm Don should bring rainfall to the central and western Gulf Coast states, but the degree of drought relief will depend on the storm's intensity, as well as its track and speed.

"Whenever there is a lot of moisture in a short period of time, the potential exists for rapid improvement," Fuchs said. "But while that possibility exists, it won't necessarily mean the end of drought in those areas. It will likely only improve by one drought category for those areas not impacted by any tropical storms or where drought related impacts improve."

The drought monitor combines numeric measures of drought and experts' best judgment into a weekly map. It is produced by the NDMC, the U.S. Department of Agriculture and the National Oceanic and Atmospheric Administration and incorporates review from 300 climatologists, extension agents and others across the nation.

Each week, the previous map is revised based on rain, snow and other weather events, as well as observers' reports of how drought is affecting crops, wildlife and other indicators.

To examine current and archived national, regional and state-by-state drought maps and conditions, go to http://droughtmonitor.unl.edu.

Contacts and sources:

20 Miles on Mars: Rover Opportunity Passes Landmark Distance

More than seven years into what was planned as a three-month mission on Mars, NASA's Mars Exploration Rover Opportunity has driven more than 20 miles, which is more than 50 times the mission's original distance goal.

A drive of 407 feet (124 meters) completed on July 17 took Opportunity past the 20-mile mark (32.2 kilometers). It brought the rover to within a few drives of the rim of Endeavour crater, the rover team's long-term destination since mid-2008. Endeavour is about 14 miles (22 kilometers) in diameter, and its western rim exposes outcrops that record information older than any Opportunity has examined so far. The rover is now about eight-tenths of a mile (about 1.3 kilometers) from the site chosen for arriving at the rim.

NASA's Mars Exploration Rover Opportunity used its navigation camera to record this view in the eastward driving direction after completing a drive on July 17, 2011, that took the rover's total driving distance on Mars beyond 20 miles.
Image Credit: NASA/JPL-Caltech

"The numbers aren't really as important as the fact that driving so much farther than expected during this mission has put a series of exciting destinations within Opportunity's reach," said Alfonso Herrera, a rover mission manager at NASA's Jet Propulsion Laboratory, Pasadena, Calif. who has worked on the rover missions since before launch in 2003.

The latest drive included an autonomous hazard detection portion during which the rover paused at intervals to check for obstacles before proceeding.

Herrera said, "Autonomous hazard detection has added a significant portion of the driving distance over the past few months. It lets us squeeze 10 to 15 percent more distance into each drive."

The milestone-setting drive was on the 2,658th Martian day, or "sol," of the rover's exploration of Mars. Opportunity drove backward. Backward driving is a technique to extend the life of a motor in the right-front wheel that sometimes draws more current than the other five wheels' drive motors.

JPL's Bill Nelson, chief of the mission's engineering team, said, "Opportunity has an arthritic shoulder joint on her robotic arm and is a little lame in the right front wheel, but she is otherwise doing remarkably well after seven years on Mars -- more like 70 in 'rover years.' The elevated right front wheel current is a concern, but a combination of heating and backwards driving has kept it in check over the past 2,000-plus sols."

Opportunity and its rover twin, Spirit, completed their three-month prime missions on Mars in April 2004. Both rovers continued for years of bonus, extended missions. Spirit stopped communicating with Earth in March 2010. Both rovers have made important discoveries about wet environments on ancient Mars that may have been favorable for supporting microbial life.

NASA's Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the Mars Exploration Rover Project for the NASA Science Mission Directorate, Washington. More information about the rovers is online at: http://www.nasa.gov/rovers .

Contacts and sources:
Guy Webster
Jet Propulsion Laboratory, Pasadena, Calif.

World-Record Algorithm Calculates Over 3 Trillion Particles In 11 Minutes

Computer simulations can be performed much faster with a method refined by Jülich scientists. During a test with the JUGENE supercomputer, researchers calculated a system comprising 3,011,561,968,121 particles in just over eleven minutes – a world record!

The method involves an optimized implementation of one of the top ten algorithms for scientific simulations, namely the fast multipole method (FMM). Scientists Ivo Kabadshow and Holger Dachsel at the Jülich Supercomputing Centre (JSC) are now making the source code available to interested users.


Comparison of the different methods. To calculate a system comprising three trillion particles directly, a normal PC would need a billion years. With the Jülich FMM, in contrast, it needs only 220 days. Germany's fastest computer, JUGENE, finished the job in just over eleven minutes.
Credit: Jülich Supercomputing Centre (JSC) 


Much smaller applications can also benefit from the optimized algorithm. The fast multipole method is generally used to calculate spatially unlimited interactions between particles. These include what are often the most important forces in practical applications: gravitation and electromagnetic interaction. The latter underlies the propagation of light, electricity and chemical reactions, as well as the structure of solids, molecules and atoms. Because each particle in such systems interacts with every other particle, the total number of interactions that have to be considered grows quadratically and quickly assumes huge proportions.

If you wanted to calculate the interactions between three trillion particles directly, a supercomputer such as JUGENE with 294,912 processors would require 32,000 years for a single run. A normal PC would take as long as a billion years. Using the fast multipole method, particles that are far apart can be combined in clusters described by multipole moments. This means that interactions no longer have to be calculated individually, which in turn shortens the computing time. Using the algorithm optimized in Jülich on Germany's fastest supercomputer, JUGENE, the time was reduced to 695 seconds.
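
For orientation only, here is a minimal Python sketch of the direct pairwise summation that the fast multipole method replaces (this is not the Jülich code, and the particle data are made up). Every particle interacts with every other one, so the work grows with the square of the particle count:

```python
import numpy as np

def direct_coulomb_energy(positions: np.ndarray, charges: np.ndarray) -> float:
    """Total pairwise Coulomb energy by direct summation (arbitrary units).

    All N*(N-1)/2 pairs are visited, which is why the cost grows
    quadratically with N; the fast multipole method avoids this by
    grouping distant particles into multipole expansions.
    """
    n = len(charges)
    energy = 0.0
    for i in range(n):
        # Vectorize over the particles that come after particle i.
        diffs = positions[i + 1:] - positions[i]
        dists = np.sqrt(np.sum(diffs * diffs, axis=1))
        energy += np.sum(charges[i] * charges[i + 1:] / dists)
    return energy

# Made-up example: 1,000 random particles run in a blink on a PC,
# but three trillion would be hopeless with this direct approach.
rng = np.random.default_rng(0)
pos = rng.random((1000, 3))
q = rng.choice([-1.0, 1.0], size=1000)
print(direct_coulomb_energy(pos, q))
```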

In the past, large-scale simulations, such as those in astrophysics on the evolution of the universe, were limited to several hundred billion particles. In order to push this boundary back, the Jülich scientists tinkered with the memory requirements. "Supercomputers like JUGENE often have little memory per processor despite their huge computing power – often less than a PC. The number of particles therefore tends to be limited more by memory than by processor performance," says Kabadshow.

In order to optimize the method, the Jülich team developed a new algorithm that allows automatic error checking and reduces the computing time. This also decreased the memory requirements and accelerated the calculation. "FMM was always considered a fast method. But up to now, it was almost impossible to adjust it optimally. The required computing time depends on three different parameters, which mutually influence each other and in principle have to be continuously readjusted. If the parameters are not adequately adjusted, the computing time can quickly increase tenfold to a hundredfold," explains Jülich researcher Holger Dachsel.

Users will therefore profit especially from the simplicity of the improved method: the Jülich FMM adjusts all parameters automatically and continuously, allowing easier access to the algorithm. The library, developed in cooperation with Argonne National Laboratory (ANL) and TU Chemnitz, is now freely accessible.

Contacts and sources:


Jülich Supercomputing Centre (JSC)

More information:
Information on the Jülich Supercomputing Centre (JSC):
http://www.fz-juelich.de/ias/jsc/DE/Home/home_node.html

Information on the Jülich supercomputers:

http://www.fz-juelich.de/portal/DE/Forschung/Informationstechnologie/Supercomputer/_node.html

Link:
Top 10 algorithms of the 20th century:
http://www.siam.org/news/news.php?id=637

Dates for your diary:
Visit http://www.fz-juelich.de/termine to find out about upcoming conferences and events organized in and by Forschungszentrum Jülich, including the CECAM – Jülich Summer School 2011, during which the fast multipole method optimized at Jülich will be presented.


Discovery Of A New Magnetic Order: Atomic-Scale Magnetic Lattice Of Cycloidal Vortices


Physicists from Jülich, Kiel and Hamburg identify an atomic-scale magnetic lattice of cycloidal vortices in a thin metal film

Physicists at Forschungszentrum Jülich and the universities of Kiel and Hamburg are the first to discover a regular lattice of stable magnetic skyrmions – radial spiral structures made up of atomic-scale spins – on a surface instead of in bulk materials. Such tiny formations could one day form the basis of a new generation of smaller and more efficient data storage units in the field of information technology. The scientists discovered the magnetic spirals, each made up of just 15 atoms, in a single atomic layer of iron on iridium. They present their results in the current issue of the scientific journal Nature Physics (DOI: 10.1038/NPHYS2045).

Researchers at Jülich have discovered that magnetic moments in thin metal films can take on only one particular order. In the figure, the red and green arrows represent the so-called "spins", which can be regarded as small elementary magnets. The top picture shows the arrangement that exists; the bottom shows its mirror image, which does not occur.
Illustration: University of Hamburg

The existence of magnetic skyrmions was already predicted over 20 years ago, but was first proven experimentally in 2009; a group of research scientists from the Technische Universität München (TUM) had identified lattices of magnetic vortices in manganese silicon in a weak magnetic field. Unlike these structures, the ones now discovered by physicists at Jülich, Kiel and Hamburg exist without an external magnetic field and are located on the surface of the materials examined, instead of inside them. Their diameter amounts to just a few atoms, making them at least one order of magnitude smaller than the skyrmions which have been identified to date.

"The magnetically-stable entities that we have discovered behave like particles and arrange themselves like atoms in a two-dimensional lattice", explains Prof. Stefan Blügel, Director at the Peter Grünberg Institute and the Institute for Advanced Simulation in Jülich. "This discovery is for us a dream come true". Already in 2007, the same scientific team had discovered a new type of magnetic order in a thin manganese film on tungsten and demonstrated the critical significance of the so-called Dzyaloshinskii-Moriya interaction for the formation of its wave-like structure. The same interaction is also necessary for the formation of the spiral-shaped skyrmions.

The scientists did not discover the skyrmion lattice at the first attempt. Originally, they wanted to prepare a one-atom-thick layer of chromium on iridium in order to investigate the presumed existence of a different magnetic state. As those experiments were unsuccessful, they tried other metals. Using spin-polarized scanning tunnelling microscopy to study iron on iridium at the University of Hamburg, the researchers noticed regular magnetic patterns that were not consistent with the crystalline structure of the metal surface. "We were sure straightaway that we had discovered skyrmions", says Blügel. Intricate calculations on the Jülich supercomputers subsequently proved him right.

The result is a model describing the formation of the spin alignment through a complex interplay of three interactions: the chiral Dzyaloshinskii-Moriya interaction, the conventional interaction between spins plus a non-linear interaction involving four spins. The model should help, in the future, to selectively influence magnetic structures on surfaces. "We are now planning to investigate the effect of electricity on skyrmions; how do the electron spins of an electric current "ride" the spirals, how do they influence resistance and how are the spirals affected?", says Blügel.


Contacts and sources:
Angela Wenzik
Helmholtz Association of German Research Centres
Prof. Stefan Blügel, Quantum Theory of Materials, Forschungszentrum Jülich

Original publication: Spontaneous atomic-scale magnetic skyrmion lattice in two dimensions; Stefan Heinze, Kirsten von Bergmann, Matthias Menzel, Jens Brede, André Kubetzka, Roland Wiesendanger, Gustav Bihlmayer, Stefan Blügel; Nature Physics, published online: 31.07.2011; DOI: 10.1038/NPHYS2045

Further information:

Forschungszentrum Jülich: http://www.fz-juelich.de/portal/EN/Home/home_node.html
Link to the press release from 10.05.2007 "Supercomputer shows that nanolayers have turning sense" http://www2.fz-juelich.de/portal/index.php?index=163&jahr=2007&cmd=show&mid=480
Research at the Institute "Quantum Theory of Materials": http://www.fz-juelich.de/sid_2C0C0844209B1401BD3B0B651A1E88C0/pgi/pgi-1/EN/Home/home_node.html

Physics Could Be Behind The Secrets Of Crop-Circle Artists

In this month's edition of Physics World, Richard Taylor, director of the Materials Science Institute at the University of Oregon, takes a serious, objective look at a topic that critics might claim is beyond scientific understanding – crop circles.

A crop circle at Windmill Hill, near Avebury, Wiltshire.
Credit: www.cropcircleconnector.com/2011/windmillhill2/windmillhill2011b.html

As the global crop-circle phenomenon grows alongside advances in science and technology, Taylor notes how physics and the arts are coming together to produce more impressive and spectacular crop-circle patterns that still manage to maintain their mystery.

Today's crop-circle designs are more complex than ever, with some featuring up to 2,000 different shapes. Mathematical analysis has revealed the use of construction lines, invisible to the eye, that are used to design the patterns, although exactly how crop circles are created remains an open question.

According to Taylor, physics could potentially hold the answer, with crop-circle artists possibly using the Global Positioning System (GPS) as well as lasers and microwaves to create their patterns, dispensing with the rope, planks of wood and bar stools that have traditionally been used.

Microwaves, Taylor suggests, could be used to make crop stalks fall over and cool in a horizontal position – a technique that could explain the speed and efficiency of the artists and the incredible detail that some crop circles exhibit.

Indeed, one research team claims to be able to reproduce the intricate damage inflicted on crops using a handheld magnetron, readily available from microwave ovens, and a 12 V battery.

As Taylor writes, "Crop-circle artists are not going to give up their secrets easily. This summer, unknown artists will venture into the countryside close to your homes and carry out their craft, safe in the knowledge that they are continuing the legacy of the most science-oriented art movement in history."

Matin Durrani, Editor of Physics World, says, "It may seem odd for a physicist such as Taylor to be studying crop circles, but then he is merely trying to act like any good scientist – examining the evidence for the design and construction of crop circles without getting carried away by the side-show of UFOs, hoaxes and aliens."

Also in this month's issue:

End of an era – an interview with veteran CERN theorist John Ellis, who is back in the UK after almost four decades at the Geneva lab but still searching for the elusive Higgs boson.

Contacts and sources:
Michael Bishop
Institute of Physics
http://www.physicsworld.com

How The Internet Changed Our Brains

The rise of Internet search engines like Google has changed the way our brain remembers information, according to research by Columbia University psychologist Betsy Sparrow published July 14 in Science.

Betsy Sparrow talks about her research, which examines the changing nature of human memory. (3:08)

“Since the advent of search engines, we are reorganizing the way we remember things,” said Sparrow. “Our brains rely on the Internet for memory in much the same way they rely on the memory of a friend, family member or co-worker. We remember less through knowing information itself than by knowing where the information can be found.”

Sparrow’s research reveals that we forget things we are confident we can find on the Internet. We are more likely to remember things we think are not available online. And we are better able to remember where to find something on the Internet than we are at remembering the information itself. This is believed to be the first research of its kind into the impact of search engines on human memory organization.

Sparrow’s paper in Science is titled, “Google Effects on Memory: Cognitive Consequences of Having Information at Our Fingertips.” With colleagues Jenny Liu of the University of Wisconsin-Madison and Daniel M. Wegner of Harvard University, Sparrow explains that the Internet has become a primary form of what psychologists call transactive memory—recollections that are external to us but that we know when and how to access.

The research was carried out in four studies.

First, participants were asked to answer a series of difficult trivia questions. Then they were immediately tested to see if they had increased difficulty with a basic color naming task, which showed participants words in either blue or red. Their reaction time to search engine-related words, like Google and Yahoo, indicated that, after the difficult trivia questions, participants were thinking of Internet search engines as the way to find information.

Second, the trivia questions were turned into statements. Participants read the statements and were tested for their recall of them when they believed the statements had been saved—meaning accessible to them later as is the case with the Internet—or erased. Participants did not learn the information as well when they believed the information would be accessible, and performed worse on the memory test than participants who believed the information was erased.

Third, the same trivia statements were used to test memory of both the information itself and where the information could be found. Participants again believed that the information either would be saved in general, saved in a specific spot, or erased. They recognized the statements they believed had been erased better than those in either of the two saved categories.

Fourth, participants believed all trivia statements that they typed would be saved into one of five generic folders. When asked to recall the folder names, they did so at greater rates than they recalled the trivia statements themselves. A deeper analysis revealed that people do not necessarily remember where to find certain information when they remember what it was, and that they particularly tend to remember where to find information when they can’t remember the information itself.

According to Sparrow, a greater understanding of how our memory works in a world with search engines has the potential to change teaching and learning in all fields.

“Perhaps those who teach in any context, be they college professors, doctors or business leaders, will become increasingly focused on imparting greater understanding of ideas and ways of thinking, and less focused on memorization,” said Sparrow. “And perhaps those who learn will become less occupied with facts and more engaged in larger questions of understanding.”

The research was funded by the National Institutes of Health and Columbia’s department of psychology.

Contacts and sources:

mChip Diagnoses Infectious Diseases Like HIV And Syphilis At Patients' Bedsides; New Device Could Streamline Blood Testing Worldwide

Samuel K. Sia, assistant professor of biomedical engineering at Columbia Engineering, has developed an innovative strategy for an integrated microfluidic-based diagnostic device—in effect, a lab-on-a-chip—that can perform complex laboratory assays, and do so with such simplicity that these tests can be carried out in the most remote regions of the world. In a paper published in Nature Medicine online on July 31, Sia presents the first published field results on how microfluidics—the manipulation of small amounts of fluids—and nanoparticles can be successfully leveraged to produce a functional low-cost diagnostic device in extreme resource-limited settings.

Sia and his team performed testing in Rwanda over the last four years in partnership with Columbia's Mailman School of Public Health and three local non-government organizations in Rwanda, targeting hundreds of patients. His device, known as mChip (mobile microfluidic chip), requires only a tiny finger prick of blood, effective even for a newborn, and gives—in less than 15 minutes—quantitative objective results that are not subject to user interpretation. This new technology significantly reduces the time between testing patients and treating them, providing medical workers in the field results that are much easier to read at a much lower cost. New low-cost diagnostics like the mChip could revolutionize medical care around the world.

"We have engineered a disposable credit card-sized device that can produce blood-based diagnostic results in minutes," said Sia. "The idea is to make a large class of diagnostic tests accessible to patients in any setting in the world, rather than forcing them to go to a clinic to draw blood and then wait days for their results."

Sia's lab at Columbia Engineering has developed the mChip devices in collaboration with Claros Diagnostics Inc., a venture capital-backed startup that Sia co-founded in 2004. (The company has recently been named by MIT's Technology Review as one of the 50 most innovative companies in the world.) The microchip inside the device is formed through injection molding and holds miniature forms of test tubes and chemicals; the cost of the chip is about $1 and the entire instrument about $100.

Sia hopes to use the mChip to help pregnant women in Rwanda who, while they may be suffering from AIDS and sexually transmitted diseases, cannot be diagnosed with any certainty because they live too far away from a clinic or hospital with a lab. "Diagnosis of infectious diseases is very important in the developing world," said Sia. "When you're in these villages, you may have the drugs for many STDs, but you don't know who to give treatments to, so the challenge really comes down to diagnostics." A version of the mChip that tests for prostate cancer has also been developed by Claros Diagnostics and was approved in 2010 for use in Europe.

Sia's work also focuses on developing new high-resolution tools to control the extracellular environments around cells, in order to study how they interact to form human tissues and organs. His lab uses techniques from a number of different fields, including biochemistry, molecular biology, microfabrication, microfluidics, materials chemistry, and cell and tissue biology.

Contacts and sources:

Sia was named one of the world's top young innovators for 2010 by MIT's Technology Review for his work in biotechnology and medicine, and by NASA as one of 10 innovators in human health and sustainability. In 2008, he received a CAREER award from the National Science Foundation that included a $400,000 grant to support his other research specialty in three-dimensional tissue engineering. A recipient of the Walter H. Coulter Early Career Award in 2008, Sia participated in the National Academy of Engineering's U.S. Frontiers of Engineering symposium for the nation's brightest young engineers in 2007. He earned his B.Sc. in biochemistry from the University of Alberta, and his Ph.D. in biophysics from Harvard University, where he was also a postdoctoral fellow in chemistry and chemical biology.

The mChip project has been supported by funding from the National Institutes of Health and Wallace Coulter Foundation.

Columbia Engineering

Columbia University's Fu Foundation School of Engineering and Applied Science, founded in 1864, offers programs in nine departments to both undergraduate and graduate students. With facilities specifically designed and equipped to meet the laboratory and research needs of faculty and students, Columbia Engineering is home to NSF-NIH funded centers in genomic science, molecular nanostructures, materials science, and energy, as well as one of the world's leading programs in financial engineering. These interdisciplinary centers are leading the way in their respective fields while individual groups of engineers and scientists collaborate to solve some of society's more vexing challenges. http://www.engineering.columbia.edu/

Remnants of 19th-Century Village Beneath Central Park

Columbia University archaeologist Nan Rothschild walks her dog in Central Park each morning, not far from where William G. Wilson used to live—more than 150 years ago.

Archaeologist Nan Rothschild talks about the Seneca Village Project. (2:32)
Credit: Columbia University

Rothschild has unearthed what is left of Wilson’s home, as well as other remnants of Seneca Village, the first community of African American property owners in New York City. The village existed from the 1820s until 1857, when its inhabitants were evicted to make way for the creation of Central Park.

“Seneca Village was autonomous,” said Rothschild, director of museum studies at Columbia and research professor at Barnard College. “It had its own institutions, so its residents could live free from the everyday burdens of racism. It was a refuge. In a way, it was both in the city and out of the city—located about three miles from the densely settled portion of Manhattan.”

Seneca Village was located within New York’s famous grid street system between 81st and 89th Streets and 7th and 8th Avenues, in what is now a portion of Central Park just east of Central Park West.

The Seneca Village Project, started in 1999, is managed by Rothschild and her co-directors, Diana Wall, of City College of New York and CUNY Graduate Center, and Cynthia Copeland, of New York University. Preliminary work included research of historical documents and soil analysis. Excavation, which began two months ago, is being conducted by the three scholars, as well as 10 undergraduates from colleges across New York City. The excavation portion of their research will conclude on July 29, but research of the artifacts will continue.


The team utilized ground-penetrating radar to study the area long before the dig. The radar showed them what they thought were artifacts concentrated in one place; they later learned that it had in fact found the walls of Wilson’s home. The discovery of his 19-foot by 21-foot home was, according to Rothschild, an accident. After identifying the walls of the structure, the team excavated metal roofing, a stoneware beer bottle, kitchen utensils and clothing remnants from Wilson’s home. They also discovered ceramics and butchered animal bones near the home of another villager named Nancy Moore.

“Seneca Village was a middle-class African American community,” said Rothschild. “Our notions of what African Americans were like in the 19th century do not usually include class variations. In time, the village came to include Irish immigrants, which is counter to our ideas about how these two groups got along in that era.”

During its more than three decades, Seneca Village grew into a community of nearly 300 people. Two-thirds of its villagers were of African descent, while the rest were predominantly of Irish descent. The community included a school, as well as three churches—one of which was racially integrated.

“We know a great deal about Seneca Village from historic documents, but the archaeology gives us evidence of the fabric of peoples’ lives,” said Rothschild. “What foods—meat especially—they ate, what dishes they chose for their homes, how their homes were built. These are all details that are completely missing from the historical record and are really important in understanding, for example, how expressions of class—so visible in the purchase of home furnishings—were manifest in the village.”

The research would not have been possible without the extensive support of the New York City Department of Parks and Recreation and the Central Park Conservancy.

The excavation and identification of artifacts was funded by the National Science Foundation, National Geographic, the Durst Foundation, PSC-CUNY, the Richard Gilder Foundation and private contributions.

Contacts and sources:
Columbia University

Engineers Develop Material That Could Speed Telecommunications

Researchers at Columbia Engineering School have demonstrated that, under certain conditions, light can travel through an artificial material without leaving a trace, a technology that could have many applications, from the military to telecommunications.

In this illustration, light hits Kocaman’s and Wong’s specially engineered material without leaving a trace. The actual material is no thicker than one hundredth of the diameter of a strand of hair.
Credit: Columbia Engineering School

In a study published July 10 on the Nature Photonics website, Serdar Kocaman, an electrical engineering Ph.D. candidate, and Chee Wei Wong, associate professor of mechanical engineering, demonstrated how an optical nanostructure can be built that controls the way light bounces off it.

When light travels, it bends—in technical terms, it disperses and incurs “phase,” an oscillating curve that leaves a trail of information behind it. Those oscillations show an object’s properties, such as shape and size, which can identify it. However, light hits Kocaman’s and Wong’s specially engineered material without leaving a trace.

Every known natural material has a positive refractive index: when light hits it, the light bends, or refracts. The researchers engineered a structure in which they etched tiny holes, creating a material known as a “photonic crystal” that behaves as though it has zero index, so light can travel with an ultrafast velocity in this environment. The material, a coating no thicker than one hundredth of the diameter of a strand of hair, has properties that don’t occur in nature.

“We’re very excited about this. We’ve engineered and observed a metamaterial with zero refractive index,” said Kocaman. “Even in a vacuum, light propagates with a phase advancement. With the zero phase advancement, what we’ve seen is that the light travels through the material as if the entire space is missing.”
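
As a rough, textbook-level illustration (not a formula from the paper): the phase a plane wave accumulates while crossing a slab of thickness L and effective refractive index n at free-space wavelength λ is

\[
\varphi = n\,k_0 L = \frac{2\pi n L}{\lambda},
\]

so as the effective index approaches zero, the accumulated phase approaches zero as well, and the light emerges as if the space occupied by the slab were simply missing.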

“We can now control the flow of light, the fastest thing known to us,” Wong said. “This can enable self-focusing light beams, highly directive antennas, and even potentially an approach to hide objects, at least in the small scale or a narrow band of frequencies.”

The zero-index material was based on a negative refractive index material and a superlattice material demonstrated consecutively in 2008 and 2009 by the scientists. In the new paper Kocaman and Wong, together with colleagues, demonstrate that the optical phase advancement can be controlled and even eliminated under certain conditions.

The study was led by Wong and Kocaman, in collaboration with scientists at University College London, Brookhaven National Laboratory and the Institute of Microelectronics of Singapore. It is the first time phase and zero-index observations have been made on both a photonic chip scale and at infrared wavelengths. These photonic chip circuits could be useful in fiber-optic networks.

This research was supported by grants from the National Science Foundation and the Defense Advanced Research Projects Agency.

Contacts and sources:

NASA Satellite Tracks Severity of African Drought

Surface relative humidity anomalies in percent, during July 2011 compared to the average surface relative humidity over the previous eight years, as measured by the Atmospheric Infrared Sounder (AIRS) instrument on NASA’s Aqua spacecraft. The driest areas are shown in oranges and reds. 
Image credit: NASA/JPL-Caltech 

Northeast Africa continues to reel from the effects of the worst drought to strike the region in decades. The arid conditions are contributing to famines that the U.S. Department of State says are affecting more than 11.5 million people, particularly in Somalia, Ethiopia, Kenya and Djibouti. 

The drought is tied to strong La Niña conditions that prevailed in late 2010 and early 2011. La Niña shifts ocean temperatures and air pressure over the Pacific Ocean, causing effects that ripple through weather patterns around the world. In East Africa, La Niña typically brings drought.

The current dry conditions are illustrated in this new map, created using nine years of data on surface relative humidity from the Atmospheric Infrared Sounder (AIRS) instrument on NASA's Aqua spacecraft. Surface relative humidity measures the percent of water vapor in the air nearest to Earth's surface, where people, animals and plants live.

Scientists at NASA's Jet Propulsion Laboratory, Pasadena, Calif., created a climatology for the region by averaging eight years of July AIRS surface relative humidity data from 2003 through 2010, and then subtracting the result from the AIRS relative humidity data for July 1-18, 2011. Areas shown in greens, yellows, oranges and reds represent regions that are drier in July 2011 than the average of all the previous Julys dating back to 2003. The driest conditions, shown in red, are found in northeast Africa, while large regions throughout the Middle East are moderately dry. Areas in blue were moister in 2011 than in the previously studied years. White areas represent data voids caused primarily by the effects of mountain and highland topography.
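
A minimal sketch of that anomaly arithmetic, assuming the monthly relative humidity fields have already been loaded onto a common latitude-longitude grid (the array shapes and loading step are assumptions, not the actual AIRS data format):

```python
import numpy as np

def july_relative_humidity_anomaly(july_means_2003_2010: np.ndarray,
                                   july_2011: np.ndarray) -> np.ndarray:
    """Anomaly of surface relative humidity, in percentage points.

    july_means_2003_2010: shape (8, nlat, nlon), one mean field per July
    from 2003 through 2010; july_2011: shape (nlat, nlon) for July 1-18, 2011.
    Negative values mark regions drier than the eight-year July average.
    """
    climatology = july_means_2003_2010.mean(axis=0)
    return july_2011 - climatology
```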

In regions that are traditionally dry, the additional drying of more than 15 percent relative humidity is very stressful to crops, causing them to dry out and die.

AIRS is managed by NASA's Jet Propulsion Laboratory, Pasadena, Calif., under contract to NASA. JPL is a division of the California Institute of Technology in Pasadena.

More information about AIRS can be found at http://airs.jpl.nasa.gov.
Contacts and sources:
Alan Buis
Jet Propulsion Laboratory, Pasadena, Calif.

Sun-Free Photovoltaics: Materials Engineered To Give Off Precisely Tuned Wavelengths Of Light When Heated Are Key To New High-Efficiency Generating System.

A new photovoltaic energy-conversion system developed at MIT can be powered solely by heat, generating electricity with no sunlight at all. While the principle involved is not new, a novel way of engineering the surface of a material to convert heat into precisely tuned wavelengths of light — selected to match the wavelengths that photovoltaic cells can best convert to electricity — makes the new system much more efficient than previous versions. 

A variety of silicon chip micro-reactors developed by the MIT team. Each of these contains photonic crystals on both flat faces, with external tubes for injecting fuel and air and ejecting waste products. Inside the chip, the fuel and air react to heat up the photonic crystals. In use, these reactors would have a photovoltaic cell mounted against each face, with a tiny gap between, to convert the emitted wavelengths of light to electricity. 
Photo: Justin Knight

The key to this fine-tuned light emission, described in the journal Physical Review A, lies in a material with billions of nanoscale pits etched on its surface. When the material absorbs heat — whether from the sun, a hydrocarbon fuel, a decaying radioisotope or any other source — the pitted surface radiates energy primarily at these carefully chosen wavelengths.

Based on that technology, MIT researchers have made a button-sized power generator fueled by butane that can run three times longer than a lithium-ion battery of the same weight; the device can then be recharged instantly, just by snapping in a tiny cartridge of fresh fuel. Another device, powered by a radioisotope that steadily produces heat from radioactive decay, could generate electricity for 30 years without refueling or servicing — an ideal source of electricity for spacecraft headed on long missions away from the sun.

According to the U.S. Energy Information Administration, 92 percent of all the energy we use involves converting heat into mechanical energy, and then often into electricity — such as using fuel to boil water to turn a turbine, which is attached to a generator. But today's mechanical systems have relatively low efficiency, and can't be scaled down to the small sizes needed for devices such as sensors, smartphones or medical monitors.

"Being able to convert heat from various sources into electricity without moving parts would bring huge benefits," says Ivan Celanovic ScD '06, research engineer in MIT's Institute for Soldier Nanotechnologies (ISN), "especially if we could do it efficiently, relatively inexpensively and on a small scale."

It has long been known that photovoltaic (PV) cells needn't always run on sunlight. Half a century ago, researchers developed thermophotovoltaics (TPV), which couple a PV cell with any source of heat: A burning hydrocarbon, for example, heats up a material called the thermal emitter, which radiates heat and light onto the PV diode, generating electricity. The thermal emitter's radiation includes far more infrared wavelengths than occur in the solar spectrum, and "low band-gap" PV materials invented less than a decade ago can absorb more of that infrared radiation than standard silicon PVs can. But much of the heat is still wasted, so efficiencies remain relatively low.

An ideal match

The solution, Celanovic says, is to design a thermal emitter that radiates only the wavelengths that the PV diode can absorb and convert into electricity, while suppressing other wavelengths. "But how do we find a material that has this magical property of emitting only at the wavelengths that we want?" asks Marin Soljačić, professor of physics and ISN researcher. The answer: make a photonic crystal by taking a sample of material and creating nanoscale features on its surface — say, a regularly repeating pattern of holes or ridges — so light propagates through the sample in a dramatically different way.
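
One hedged way to picture this design goal uses a standard radiometric relation (not a formula from the MIT paper): the power a hot surface radiates at each wavelength is the blackbody (Planck) spectrum weighted by the surface's emissivity, so engineering the nanostructure amounts to shaping the emissivity ε(λ):

\[
I(\lambda, T) = \varepsilon(\lambda)\, B(\lambda, T),
\qquad
B(\lambda, T) = \frac{2 h c^2}{\lambda^5}\,\frac{1}{e^{h c / \lambda k_B T} - 1}.
\]

An ideal TPV emitter would have ε(λ) close to 1 only at photon energies the PV diode can convert (wavelengths shorter than its band-gap wavelength) and close to 0 elsewhere, so little heat is wasted at wavelengths the cell cannot use.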

"By choosing how we design the nanostructure, we can create materials that have novel optical properties," Soljačić says. "This gives us the ability to control and manipulate the behavior of light."

The team — which also includes Peter Bermel, research scientist in the Research Laboratory for Electronics (RLE); Peter Fisher, professor of physics; and Michael Ghebrebrhan, a postdoc in RLE — used a slab of tungsten, engineering billions of tiny pits on its surface. When the slab heats up, it generates bright light with an altered emission spectrum because each pit acts as a resonator, capable of giving off radiation at only certain wavelengths.

This powerful approach — co-developed by John D. Joannopoulos, the Francis Wright Davis Professor of Physics and ISN director, and others — has been widely used to improve lasers, light-emitting diodes and even optical fibers. The MIT team, supported in part by a seed grant from the MIT Energy Initiative, is now working with collaborators at MIT and elsewhere to use it to create several novel electricity-generating devices.

Mike Waits, an electronics engineer at the Army Research Laboratory in Adelphi, Md., who was not involved in this work, says this approach to producing miniature power supplies could lead to lighter portable electronics, which is "critical for the soldier to lighten his load. It not only reduces his burden, but also reduces the logistics chain" to deliver those devices to the field. "There are a lot of lives at stake," he says, "so if you can make the power sources more efficient, it could be a great benefit."

The button-like device that uses hydrocarbon fuels such as butane or propane as its heat source — known as a micro-TPV power generator — has at its heart a "micro-reactor" designed by Klavs Jensen, the Warren K. Lewis Professor of Chemical Engineering, and fabricated in the Microsystems Technology Laboratories. While the device achieves a fuel-to-electricity conversion efficiency three times greater than that of a lithium-ion battery of the same size and weight, Celanovic is confident that with further work his team can triple the current energy density. "At that point, our TPV generator could power your smartphone for a whole week without being recharged," he says.

Celanovic and Soljačić stress that building practical systems requires integrating many technologies and fields of expertise. "It's a really multidisciplinary effort," Celanovic says. "And it's a neat example of how fundamental research in materials can result in new performance that enables a whole spectrum of applications for efficient energy conversion."
Contacts and sources:
Nancy W. Stauffer, MITEI
MIT
David L. Chandler contributed to this story.

Real Star Trek Tricorder! Cell Phones Now A Global Health Tool, UCLA Bio-Photonic Lab-in-a-Phone

Holy tricorder, Batman! UCLA researchers have transformed a cell phone into a bio-photonic laboratory able to perform on-the-spot analysis, much like the fictional tricorder of Star Trek fame.

Flow cytometry, a technique for counting and examining cells, bacteria and other microscopic particles, is used routinely in diagnosing disorders, infections and cancers and evaluating the progression of HIV and AIDS. But flow cytometers are big, bulky contraptions that cost tens of thousands of dollars, making them less than ideal for health care in the field or other settings where resources are limited.

The Ozcan group's cell phone-based cytometer.
Credit: UCLA

Now imagine you could achieve the same results using a device that weighs about half an ounce and costs less than five dollars.

Researchers at the BioPhotonics Laboratory at the UCLA Henry Samueli School of Engineering and Applied Science have developed a compact, lightweight and cost-effective optofluidic platform that integrates imaging cytometry and fluorescent microscopy and can be attached to a cell phone. The resulting device can be used to rapidly image bodily fluids for cell counts or cell analysis.

The research, which was led by Aydogan Ozcan, a professor of electrical engineering and bioengineering and a member of the California NanoSystems Institute at UCLA, is currently available online in the journal Analytical Chemistry.

"In this work, we developed a cell phone–based imaging cytometry device with a very simple optical design, which is very cost-effective and easy to operate," said Hongying Zhu, a UCLA Engineering postdoctoral scholar at the BioPhotonics Lab and co-author of the research. "It has great potential to be used in resource-limited regions to help people there improve the quality of their health care."

The device is the latest advance by Ozcan's research team, which has developed a number of innovative, scaled-down, cell phone–based technologies that have the potential to transform global health care.

"We have more than 5 billion cell phone subscribers around the world today, and because of this, cell phones can now play a central role in telemedicine applications," Ozcan said. "Our research group has already created a very nice set of tools, including cell phone microscopes, that can potentially replace most of the advanced instruments used currently in laboratories."

How it works

Ozcan's group integrated compact optical attachments to create the optofluidic fluorescent cytometry platform. The platform, which weighs only 18 grams, includes:

• 1 simple lens (less than $3)

• 1 plastic color filter (less than $1)

• 2 LEDs (less than 30 cents each)

• Simple batteries

The microfluidic assembly is placed just above a separate, inexpensive lens that is put in contact with the cell phone's existing camera unit. This way, the entire cross-section of the microfluidic device can be mapped onto the phone's CMOS sensor-chip. The sample fluid is delivered continuously through a disposable microfluidic channel via a syringe pump.

The device is illuminated from the side by the LEDs using a simple butt-coupling technique. The excitation light is then guided within the cross-section of the device, uniformly exciting the specimens in the imaging fluid. The optofluidic pumping scheme also allows for the use of an inexpensive plastic absorption filter to create the dark-field background needed for fluorescent imaging.

In addition, video post-processing, together with contour-detection and tracking algorithms, is used to count and label the cells or particles passing through the microfluidic chip.
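
A minimal sketch of that counting step, assuming a recorded dark-field video and OpenCV 4.x (this is illustrative only, not the UCLA group's software; the threshold and area values are placeholders):

```python
import cv2

def count_fluorescent_objects(video_path: str, min_area: float = 20.0) -> list:
    """Count bright fluorescent blobs in each frame of a dark-field video.

    Returns one count per frame. The global threshold (60) and minimum
    contour area are illustrative placeholders, not calibrated settings.
    """
    counts = []
    cap = cv2.VideoCapture(video_path)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Bright cells on a dark-field background: simple global threshold.
        _, binary = cv2.threshold(gray, 60, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        # Reject tiny contours that are more likely noise than cells.
        counts.append(sum(1 for c in contours if cv2.contourArea(c) >= min_area))
    cap.release()
    return counts
```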

In order to demonstrate proof of concept for the new platform, the team used the device to measure the density of white blood cells in human whole-blood samples, as white blood cell density is routinely tested to diagnose various diseases and infections, including leukemia, HIV and bone marrow deficiencies.

"For the next step, we'd like to explore other potential applications of this device," Zhu said. "For example, we also want to utilize this device to count potential waterborne parasites for water-quality monitoring."

"We'd like to translate our devices for testing in the field and start using them in places they're supposed to be used," Ozcan said. "So I think the next stage for several of our technologies, including this one, is to deploy and test them in extremely poor-resource countries."

Contacts and sources: 
UCLA

This study was funded by the National Institutes of Health, the National Science Foundation, the Office of Naval Research, the Gates Foundation and the Vodafone Americas Foundation.

The UCLA Henry Samueli School of Engineering and Applied Science, established in 1945, offers 28 academic and professional degree programs and has an enrollment of almost 5,000 students. The school's distinguished faculty are leading research to address many of the critical challenges of the 21st century, including renewable energy, clean water, health care, wireless sensing and networking, and cybersecurity. Ranked among the top 10 engineering schools at public universities nationwide, the school is home to seven multimillion-dollar interdisciplinary research centers in wireless sensor systems, nanoelectronics, nanomedicine, renewable energy, customized computing, and the smart grid, all funded by federal and private agencies. (www.engineer.ucla.edu | www.twitter.com/uclaengineering)


Tamoxifen's Lasting Benefits See Breast Cancer Deaths Down By A Third

The benefits of using tamoxifen to prevent recurrence of breast cancer after surgery continue to accrue long after women stop taking the drug, a study led by Oxford University has found.

The findings suggest that for women with the most common type of breast cancer, full compliance with daily tamoxifen therapy for five years would reduce the long-term chances of dying by at least a third.

Oxford researchers used data from 20 trials comparing tamoxifen use for five years against no tamoxifen.
Clinical trials have shown the benefit of using tamoxifen against breast cancer recurrence.
Credit: US National Cancer Institute

‘Breast cancer is a nasty disease because it can come back years later,’ says Dr Christina Davies of the Clinical Trial Service Unit at Oxford University, and one of the lead investigators. ‘This study now shows that tamoxifen produces really long-term protection.

‘For ER-positive disease, tamoxifen reduces 15-year breast cancer mortality by at least a third, whether or not chemotherapy has been given.’

Most breast cancers are oestrogen receptor (ER)-positive – in the US and UK, about 4 out of 5 breast cancers are ER-positive.

Since tamoxifen acts on the ER protein in breast cancer cells, it can have an effect only if those cells contain some ER protein. But a simple test on surgically removed breast cancers can determine whether the cancer is ER-positive or not.

Various treatments can be given after apparently successful breast cancer surgery to prevent any tiny residual fragments eventually causing the cancer to come back as an incurable disease.

Many randomised trials have been conducted to try to determine the best treatment options, and every 5 years for the past 25 years the Early Breast Cancer Trialists’ Collaborative Group (EBCTCG) has brought together all the evidence from all of these trials.

In the current study, funded by Cancer Research UK, the Medical Research Council and the British Heart Foundation, the researchers brought together individual patient data for over 20,000 women with early-stage breast cancer from 20 randomised trials. The trials compared treatment with tamoxifen for 5 years against no tamoxifen, with participants showing 80% compliance in taking the daily pill.

Most of the trials of tamoxifen began in the 1980s, meaning there is now lots of data available on the long-term effects of treatment after women stopped taking the drug. That long-term analysis, published in the Lancet medical journal, now reveals large additional benefits of tamoxifen in reducing breast cancer deaths, not only during the first decade but also during the second decade after treatment began.

The researchers found that in women with ER-positive disease, 5 years of daily tamoxifen safely reduced the long-term (15-year) risks of breast cancer recurrence and death. It was effective whether or not chemotherapy had been given.

Remarkably, the researchers found a highly significant reduction in breast cancer mortality, not only during the five years of treatment and the five years following, but also during years 10–14.

Even in weakly ER positive disease, tamoxifen substantially reduced the likelihood of the cancer recurring.

There is a newer class of drugs called aromatase inhibitors (AIs) that offer an alternative to tamoxifen for some patients, but AIs are effective only in post-menopausal women.

Dr Davies explains that tamoxifen was developed 50 years ago and is long out of patent, so is relatively cheap. But even if costs are ignored it remains a major first-line treatment option for women with ER-positive breast cancer, she says – especially for women pre-menopause.

Moreover, the rare life-threatening side-effects of tamoxifen (uterine cancer and blood clots) are mainly experienced by women over 55 years of age, so there is little risk from giving tamoxifen to younger women.

Worldwide, half of all new patients diagnosed with breast cancer are younger than 55 years – that’s 0.7 million women.

Contacts and sources:
The Lancet/Oxford University

Why Plant 'Clones' Aren’t Identical

A new study of plants that are reproduced by ‘cloning’ has shown why cloned plants are not identical.

Scientists have known for some time that ‘clonal’ (regenerant) organisms are not always identical: their observable characteristics and traits can vary, and this variation can be passed on to the next generation. This is despite the fact that they are derived from genetically identical founder cells.

Clones of the plant 'thalecress' were analysed. 
Photo: Alberto Salguero
Now, a team from Oxford University, UK, and King Abdullah University of Science and Technology, Saudi Arabia, believe they have found out why this is the case in plants: the genomes of regenerant plants carry relatively high frequencies of new DNA sequence mutations that were not present in the genome of the donor plant.

The team report their findings in this week’s Current Biology.

‘Anyone who has ever taken a cutting from a parent plant and then grown a new plant from this tiny piece is actually harnessing the ability such organisms have to regenerate themselves,’ said Professor Nicholas Harberd of Oxford University’s Department of Plant Sciences, lead author of the paper. ‘But sometimes regenerated plants are not identical, even if they come from the same parent. Our work reveals a cause of that visible variation.’

Using DNA sequencing techniques that can decode the complete genome of an organism in one go (so-called ‘whole genome sequencing’) the researchers analysed ‘clones’ of the small flowering plant ‘thalecress’ (Arabidopsis). They found that observable variations in regenerant plants are substantially due to high frequencies of mutations in the DNA sequence of these regenerants, mutations which are not contained in the genome of the parent plant.
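
As a rough illustration of what that comparison involves, the short Python sketch below treats each plant's variant calls from whole-genome sequencing as a set of (chromosome, position, reference, alternative) records and lists the variants found in a regenerant but not in its parent. The file names, file format and helper function are invented for the example; this is not the research team's actual analysis pipeline.

def load_variants(path):
    """Read a simple tab-separated variant list: chromosome, position, ref, alt."""
    variants = set()
    with open(path) as handle:
        for line in handle:
            if line.startswith("#") or not line.strip():
                continue
            chrom, pos, ref, alt = line.rstrip("\n").split("\t")[:4]
            variants.add((chrom, int(pos), ref, alt))
    return variants

# Hypothetical input files, one per sequenced plant.
parent = load_variants("parent_variants.tsv")
regenerant = load_variants("regenerant_variants.tsv")

# Variants present in the regenerant but absent from the parent genome are the
# candidate new mutations of the kind the study describes.
new_mutations = regenerant - parent
print(len(new_mutations), "candidate new mutations in the regenerant")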

‘Where these new mutations actually come from is still a mystery,’ said Professor Harberd. ‘They may arise during the regeneration process itself or during the cell divisions in the donor plant that gave rise to the root cells from which the regenerant plants are created. We are planning further research to find out which of these two processes is responsible for these mutations. What we can say is that Nature has safely been employing what you might call a ‘cloning’ process in plants for millions of years, and that there must be good evolutionary reasons why these mutations are introduced.’

The new results suggest that variation in clones of plants may have different underlying causes from variation in clones of animals, where the effect of environmental factors on how animal genes are expressed is believed to be more important and no similarly high frequencies of mutations have been observed.

Professor Harberd said: ‘Whilst our results highlight that cloned plants and animals are very different, they may give us insights into how both bacterial and cancer cells replicate themselves, and how mutations arise during these processes which, ultimately, have an impact on human health.’

A report of the research, ‘Regenerant Arabidopsis Lineages Display a Distinct Genome-Wide Spectrum of Mutations Conferring Variant Phenotypes’, is published this week online in Current Biology.

The project is a collaboration between scientists at Oxford University’s Department of Plant Sciences, Oxford University’s Wellcome Trust Centre for Human Genetics, and King Abdullah University of Science and Technology (KAUST), Saudi Arabia. The research was supported by KAUST and the UK’s Biotechnology and Biological Sciences Research Council.

Contacts and sources:
Oxford University

Lost Gospels Found? Ancient Lives Project Seeks Public Help Translating Ancient Papyri

Members of the public are being asked to help decode papyri, in order to find fragments of lost gospels, works of literature, and letters about everyday life in ancient Egypt, in a new project launched by Oxford University.

Oxyrhynchus Papyrus 5072 (3rd century AD), Uncanonical Gospel: papyri of a lost gospel from the Oxyrhynchus collection.
Credit: Photo courtesy of the Egypt Exploration Society and Imaging Papyri Project, Oxford. All rights reserved.

Ancient Lives (ancientlives.org), which launches today, is putting hundreds of thousands of images of fragments of papyri written in Greek online. Researchers say that ‘armchair archaeologists’ visiting the website can help with cataloguing the collection, and could make amazing finds, such as the recent discovery of fragments of a previously unknown ‘lost’ gospel which describes Jesus Christ casting out demons.

Nobody knows who wrote this lost gospel: it is part of a treasure trove of papyri recovered in the early 20th century from the Egyptian city of Oxyrhynchus, the ‘City of the Sharp-Nosed Fish’. The texts were written in Greek during a period when Egypt was under the control of a Greek (and later Roman) settler class. Many of the papyri had not been read for over a thousand years.

Because of the huge number of images involved, researchers need volunteers to look through and catalogue them, or to transcribe the text using a simple web interface that displays both known and unknown texts.

‘It’s with the digital advancements of our own age that we're able to open up this window into the past and see a common human experience in that intimate, traditional medium, handwriting,’ said lead developer and designer William MacFarlane of Oxford University’s Department of Physics.

Experts have been studying the collection for over a hundred years. It is because of Oxyrhynchus that we now have masterpieces that went missing during the medieval period: the lost poetry of Sappho, the lost comedies of Menander and the lost plays of Sophocles. There are personal documents too: we learn from one letter that Aurelius the sausage-maker has taken out a loan of 9,000 silver denarii, perhaps to expand his business, while in another letter, from 127 AD, a grandmother called Sarapias asks that her daughter be brought home so that she can be present at the birth of her grandchild.

‘Discovering new texts is always exciting,’ explains team papyrologist Dr James Brusuelas, ‘but the fact that you’re reading a piece of literature or a private letter that hasn’t been read in over a thousand years, that’s what I like about papyrology.’ Paul Ellis, an imaging specialist who assisted with the digitization of the papyrus texts, said: ‘Online images are a window into ancient lives.’

The project is a collaboration between Oxford University papyrologists, the Egypt Exploration Society, and a team in Oxford University’s Department of Physics who specialise in building ‘citizen science’ projects that allow anyone to make an authentic contribution to research.

‘Until now only experts could explore this incredible collection,’ said project leader Dr Chris Lintott of Oxford University’s Department of Physics, ‘but with so much of the collection unstudied there’s plenty for everyone. We’re excited to see what visitors to ancientlives.org can unearth.’

‘Papyrologists are well known for friendship among those interested in ancient texts,’ said Project Director Dr Dirk Obbink, Lecturer in Papyrology and Greek Literature at the University of Oxford. ‘This effort is pervaded by a spirit of collaboration. We aim to transcribe as much as possible of the original papyri, and then identify and reconstruct the text. No single pair of eyes can see and read everything. From scientists and professors to school students and ancient enthusiasts, everyone has something to contribute – and gain.’

Ancientlives.org is part of the www.zooniverse.org network of public participation projects, which includes Old Weather, a project that aims to rescue weather records contained in World War I ships' logs; more than 500,000 logbook pages have been transcribed so far. The original Zooniverse project was Galaxy Zoo, and more than 400,000 people in total have registered to take part.

The project was supported by a grant from the Arts and Humanities Research Council and the John Fell Fund, and is indebted to the Oxford University Department of Classics and the Egypt Exploration Society, London, which together oversee the Oxyrhynchus Collection in the Sackler Library, Oxford, as part of a wide range of scholarly and outreach activities.

Contacts and sources:
Oxford University

Bigger Eyes and Brains Evolved In Northern Countries Spurred By Gloomy Light Conditions

The farther that human populations live from the equator, the bigger their brains, according to a new study by Oxford University. But it turns out that this is not because they are smarter, but because they need bigger vision areas in the brain to cope with the low light levels experienced at high latitudes.

Scientists have found that people living in countries with dull, grey, cloudy skies and long winters have evolved bigger eyes and brains to help them process what they see in low light, reports the journal Biology Letters.

Skulls from the 1800s used in the study
Credit:  Oxford University

Scientists from Oxford University measured the eye-socket and brain volumes of 55 skulls from museum collections, representing inhabitants of twelve different countries from Scandinavia to Micronesia. The volumes of the eye sockets and brain cavities were then plotted against the latitude of the central point of each individual’s country of origin. The researchers found that the size of both the brain and the eyes could be directly linked to the latitude of the country from which the individual came.
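
As a rough sketch of the kind of relationship the researchers examined, the short Python example below fits an ordinary least-squares line relating distance from the equator to eye-socket volume. The data values are invented purely for illustration and are not the study's measurements.

def least_squares(xs, ys):
    """Return slope and intercept of the ordinary least-squares fit y = slope*x + intercept."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    covariance = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    variance = sum((x - mean_x) ** 2 for x in xs)
    slope = covariance / variance
    return slope, mean_y - slope * mean_x

# Invented example data: degrees of latitude from the equator, and eye-socket volume in ml.
latitudes = [1, 10, 25, 40, 55, 65]
orbit_volumes = [25.1, 25.4, 26.0, 26.4, 27.0, 27.2]

slope, intercept = least_squares(latitudes, orbit_volumes)
print("Eye-socket volume increases by roughly %.3f ml per degree of latitude" % slope)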

Lead author Eiluned Pearce said: ‘As you move away from the equator, there's less and less light available, so humans have had to evolve bigger and bigger eyes. Their brains also need to be bigger to deal with the extra visual input. Having bigger brains doesn't mean that higher latitude humans are smarter, it just means they need bigger brains to be able to see well where they live.’

Co-author Professor Robin Dunbar, Director of the Institute of Cognitive and Evolutionary Anthropology, said: ‘Humans have only lived at high latitudes in Europe and Asia for a few tens of thousands of years, yet they seem to have adapted their visual systems surprisingly rapidly to the cloudy skies, dull weather and long winters we experience at these latitudes.’

The explanation appears to be the need to compensate for low light levels at high latitudes: visual sharpness measured under natural daylight conditions is constant across latitudes, suggesting that the visual processing system has adapted to ambient light conditions as human populations moved across the globe.

The study takes into account a number of potentially confounding effects, including the effect of phylogeny (the evolutionary links between different lineages of modern humans), the fact that humans living in the higher latitudes are physically bigger overall, and the possibility that eye socket volume was linked to cold weather (and the need to have more fat around the eyeball by way of insulation).

The skulls used in the study came from the indigenous populations of England, Australia, the Canary Islands, China, France, India, Kenya, Micronesia, Scandinavia, Somalia, Uganda and the United States. Measurements of the brain cavities suggest that the biggest brains belonged to populations living in Scandinavia and the smallest to Micronesians.

This study adds weight to other research that has looked at the links between eye size and light levels. Previous studies have shown that birds with relatively bigger eyes are the first to sing at dawn, in low light, and that across primates eyeball size is associated with when species eat and forage, with the largest-eyed species being those that are active at night.

Contacts and sources:
Oxford University

DARPA’s Micro-Technology for Positioning, Navigation, and Timing

Warfighters have depended for decades on Global Positioning System (GPS) technology, and have incorporated it into guided munitions and other platforms to meet rigid requirements for guidance and navigation. This creates a potential challenge in instances where an intended target is equipped with high-powered jammers or where the GPS constellation is compromised.

DARPA’s Micro-Technology for Positioning, Navigation, and Timing (Micro-PNT) program seeks to overcome these potential challenges by developing technologies for self-contained, chip-scale inertial navigation and precision guidance. Size, weight, and power are key concerns in the overall system design of guided munitions. Breakthroughs in microfabrication techniques now allow all the necessary devices to be incorporated into a single package: a small, low-power timing and inertial measurement unit.

DARPA’s Micro-Technology for Positioning, Navigation, and Timing
Credit: DARPA

On-chip calibration would allow constant internal error correction to reduce drift and thereby enable more accurate devices. Moving away from ultra-low-drift sensors toward a self-calibration approach will allow revolutionary breakthroughs in technology for positioning, navigation, and timing.
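
As a conceptual illustration of what continual self-calibration can mean in practice, the toy Python sketch below re-estimates a gyroscope's slowly drifting bias whenever the device is assumed to be stationary and subtracts that estimate from later readings. The class, the smoothing factor and the sample values are invented for the example; this is not DARPA's on-chip algorithm.

class SelfCalibratingGyro:
    """Toy model of a gyro that continually re-estimates and removes its own bias."""

    def __init__(self, smoothing=0.05):
        self.bias_estimate = 0.0
        self.smoothing = smoothing  # how quickly the bias estimate tracks slow drift

    def update_bias(self, raw_rate):
        # Called during periods when the device is assumed to be stationary:
        # the true rotation rate is ~0, so the reading is essentially bias.
        self.bias_estimate += self.smoothing * (raw_rate - self.bias_estimate)

    def corrected_rate(self, raw_rate):
        # Remove the current bias estimate from a reading taken during motion.
        return raw_rate - self.bias_estimate

gyro = SelfCalibratingGyro()
for reading in [0.021, 0.019, 0.020, 0.022]:  # stationary readings in deg/s (invented)
    gyro.update_bias(reading)
print(gyro.corrected_rate(5.120))  # a reading during actual rotation, with the bias estimate removed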

The Micro-PNT program recently developed a micro-nuclear magnetic resonance gyro that uses the spin of atomic nuclei to measure rotation, achieving navigation-grade performance with a two orders-of-magnitude reduction in size, weight, and power compared with the state-of-the-art navigation-grade gyroscopes currently used in inertial measurement units. This will allow micro-nuclear magnetic resonance gyros to be used in systems for personal navigation, navigation in GPS-denied areas, and on micro-UAVs.

Andrei Shkel, DARPA’s Micro-PNT program manager, described this latest development in the following manner: “The micro-nuclear magnetic resonance gyro uses gyroscopic spin of nuclear particles in a magnetic field to determine orientation. This gyro has no moving parts and is not sensitive to acceleration and vibration. Others, such as silicon-based MEMS gyros, are much more susceptible to vibration, which keeps them from meeting performance expectations.”

Contacts and sources:
Dr. Andrei Shkel 

15 Startling Facts about America’s Infrastructure

The infrastructure of a nation is what holds civilization together. It includes roads, water supplies, sewers, electrical grids, and telecommunications — things without which the world might prove a difficult place to navigate. While Americans enjoy a better infrastructure than many places in the world, the reality is that it is outdated, inefficient, and — in many places around the nation — currently crumbling to pieces.
Sadly, things are only going to get worse before they get better, as roads fill with potholes, bridges collapse, and electrical grids brown out more regularly, all unable to provide for the needs of the populace. If you had any doubts about the sad state of American infrastructure, read on to learn just how bad things really are.
  1. More than 25% of bridges in the United States need significant repairs or are handling more traffic than they were designed to carry.
This translates to a whopping 150,000 bridges that aren’t up to snuff. In recent years, bridge and overpass collapses have even led to death. One of the most notable of these was the I-35 bridge in Minneapolis, which collapsed in 2007, killing 13 and injuring 145. If bridges are not updated or repaired, these kinds of accidents could become more common.

  2. An inefficient, heavily overburdened electrical grid results in rolling blackouts and losses of $80 billion a year.
In a world that relies heavily on technology for everything from health care to business, losing power can be a big deal. In the past decade, huge blackouts have left much of the Northeast and Florida without power for several days. This costs money, time, and can create unsafe conditions for residents.

  3. Over 4,095 dams in America were deemed “unsafe” by the American Society of Civil Engineers.
This means that they have deficiencies that leave them more susceptible to failure, especially during flooding or earthquakes. The number of dams in the United States that could fail has grown 134% since 1999, and now comprises 3,346 dams nationwide. More than 1,300 of these dangerous dams are considered “high hazard” because their collapse could threaten the lives of those living nearby.

  4. More than a third of all dam failures or near-failures since 1874 have happened in just the last decade.
The rate of failures is increasing at a disturbingly fast rate, as America’s dams age and deteriorate. Can’t remember any recent dam failures? In 2004, 30 different dams in New Jersey’s Burlington County failed or were damaged after a period of particularly heavy rainfall.

  5. Nearly a third of all highway fatalities are related to substandard road conditions, obsolete road designs, or roadside hazards.
The Federal Highway Administration estimates that poor road conditions play a role in more than 14,300 traffic fatalities each year.

  6. By 2035, highway usage (and shipping by truck) is expected to double, leaving Americans to spend an average of 160 hours a year in traffic.
If you think traffic is bad now, just wait a few years. Over the next quarter-century, experts estimate that traffic on American roads is going to be much, much worse. Commuting between work and home could be a nightmare for many, taking up nearly a week of time over the course of the year. Also, keep in mind that this number is just an average, and in high-traffic urban areas, the estimates are much higher.

  7. More than half of America’s interstate miles are at 70% of traffic capacity, and nearly 25% of the miles are strained at more than 95% capacity.
Americans love their cars, and the roads are clogged with drivers as a result. Much of the interstate system in the U.S. is struggling to keep up with the number of people who use it each day, leading to traffic jams and accidents at much higher rates.

  8. It is estimated that over one-third of America’s major roads are in poor or mediocre condition.
If you hadn’t already noticed that the streets in your city were littered with potholes and cracks, this stat will let you in on the secret: American roads are falling apart. With many states teetering on the edge of bankruptcy and unable to keep up with maintenance, this situation isn’t likely to change soon.

  9. Traffic jams caused by poor transit infrastructure cost Americans 4 billion hours and nearly 3 billion gallons of gasoline a year.
Highways designed to carry fewer cars than they’re currently managing, poorly timed lights, and badly designed transit systems all contribute to traffic jams. These jams keep drivers on the road longer, wasting gallon upon gallon of gas and hour upon hour of time.

  10. A study by the EPA exposed the dirty truth about America’s aging sewer systems: they spill an estimated 1.26 trillion gallons of untreated sewage every single year.
Not only is this a health and environmental concern, but it’s also a financial one. Cleaning up these spills costs an estimated $50 billion every year.

  11. The United States must invest $225 billion per year over the next 50 years to maintain and adequately enhance roads and other transportation systems to meet demand.
Currently, the U.S. is spending less than 40% of this amount, which will make it impossible to effectively keep up with and expand the transit system.

  12. In 2005, U.S. infrastructure earned a D rating from the American Society of Civil Engineers. This was down from a D+ in 2001 and 2003.
It’s no joke that the infrastructure of the U.S. is getting worse and worse. In some areas, the quality of water, electricity, and roads has been compared to that of a developing nation. Major changes need to be made to keep up, modernize, and allow America to remain competitive in the world market.

  13. By 2020, every major U.S. container port is projected to be handling at least double the volume it was designed for.
Imports and exports are major, major business for the U.S., and in the future, this isn’t likely to change. Yet the ports we use to do our trading are going to be seriously overloaded and will need a major overhaul to adequately deal with the number of ships coming in and out.

  14. Costs attributed to airline delays related to congestion and outdated air traffic control systems are expected to triple to $30 billion from 2000 to 2015.
Sitting on the tarmac waiting to take off or deplane isn’t just annoying — it’s costing businesses billions of dollars each year. The amount of time lost or wasted on flights is continually rising, up to 170 total years (15 minutes lost on 1.6 million flights) in 2007 from just 70 years lost in 2003.

  15. Railroads are expected to need over $200 billion in investment through 2035.
Railroads are a viable, if not quick, means of transporting people and goods the world over, but in the U.S. many lines are painfully inefficient and falling apart. While money is being poured into modernizing train systems (most notably high-speed rail on some Amtrak lines), much more will be needed to keep pace with the amount of rail traffic in coming years. Not to mention everything it will take to make rail travel an appealing option for Americans, who are notoriously reluctant to ride the train.

Contacts and sources:
Story by Jennifer Lynch
http://www.carinsurance.org/2011/07/15-startling-facts-about-americas-infrastructure/