Wednesday, July 31, 2013

Capturing Black Hole Spin Could Further Understanding of Galaxy Growth

Astronomers have found a new way of measuring the spin in supermassive black holes, which could lead to better understanding about how they drive the growth of galaxies. The scientists at Durham University in the UK publish their work in a paper in the Oxford University Press journal Monthly Notices of the Royal Astronomical Society.

The team of astronomers observed a black hole -- with a mass 10 million times that of our Sun -- at the centre of a spiral galaxy 500 million light years from Earth while it was feeding on the surrounding disc of material that fuels its growth and powers its activity.
By observing the optical, ultraviolet and soft X-ray emission generated by heat as the black hole fed, they were able to measure how far the disc was from the black hole.
This distance depends on the black hole's spin, the researchers said, because a fast-spinning black hole pulls the disc in closer to itself. Using the distance between the black hole and the disc, the scientists were able to estimate the spin of the black hole.
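A rough sense of how strongly spin sets the disc's inner edge comes from the standard general-relativistic formula for the innermost stable circular orbit (ISCO). The sketch below is a generic illustration of that relation, not the Durham team's actual modelling.

```python
import numpy as np

def isco_radius(a):
    """Innermost stable circular orbit radius, in units of GM/c^2, for a
    black hole with dimensionless spin a (prograde orbits, 0 <= a <= 1).
    Standard Bardeen-Press-Teukolsky expression."""
    z1 = 1 + (1 - a**2)**(1/3) * ((1 + a)**(1/3) + (1 - a)**(1/3))
    z2 = np.sqrt(3 * a**2 + z1**2)
    return 3 + z2 - np.sqrt((3 - z1) * (3 + z1 + 2 * z2))

for a in (0.0, 0.5, 0.9, 0.998):
    print(f"spin a = {a:5.3f}  ->  disc inner edge ~ {isco_radius(a):4.2f} GM/c^2")
# a = 0 gives 6 GM/c^2; spins near 1 pull the inner edge in towards ~1 GM/c^2.
```

The faster the hole spins, the closer the inner edge of the disc sits, which is why the disc-to-hole distance can be turned into a spin estimate.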
The scientists said that understanding spin could lead to greater understanding of galaxy growth over billions of years.
Black holes lie at the centres of almost all galaxies, and can spit out incredibly hot particles at high energies that prevent intergalactic gases from cooling and forming new stars in the outer galaxy. Scientists don't yet understand why the jets are ejected into space, but the Durham experts believe that their power could be linked to the spin of the black hole. This spin is difficult to measure as it only affects the behaviour of material really close to the black hole.
Lead researcher Professor Chris Done, in the Department of Physics, at Durham University, said: "We know the black hole in the centre of each galaxy is linked to the galaxy as a whole, which is strange because black holes are tiny in relation to the size of a galaxy. This would be like something the size of a large boulder (10m), influencing something the size of Earth.
"Understanding this connection between stars in a galaxy and the growth of a black hole, and vice-versa, is the key to understanding how galaxies form throughout cosmic time.
"If a black hole is spinning it drags space and time with it and that drags the accretion disc, containing the black hole's food, closer towards it. This makes the black hole spin faster, a bit like an ice skater doing a pirouette.
"By being able to measure the distance between the black hole and the accretion disc, we believe we can more effectively measure the spin of black holes.
"Because of this, we hope to be able to understand more about the link between black holes and their galaxies."
The Durham scientists were able to measure the spin of the black hole using soft X-ray, optical and ultraviolet images captured by the European Space Agency's XMM-Newton satellite.

Planetary 'Runaway Greenhouse' More Easily Triggered, Research Shows

 It might be easier than previously thought for a planet to overheat into the scorchingly uninhabitable "runaway greenhouse" stage, according to new research by astronomers at the University of Washington and the University of Victoria published July 28 in the journal Nature Geoscience.

In the runaway greenhouse stage, a planet absorbs more solar energy than it can radiate back to space, so it cannot remain in equilibrium. As a result, the world overheats, boiling its oceans and filling its atmosphere with steam, which leaves the planet glowing-hot and forever uninhabitable, as Venus is now.
One estimate of the inner edge of a star's "habitable zone" is where the runaway greenhouse process begins. The habitable zone is that ring of space around a star that's just right for water to remain in liquid form on an orbiting rocky planet's surface, thus giving life a chance.
Revisiting this classic planetary science scenario with new computer modeling, the astronomers found a lower thermal radiation threshold for the runaway greenhouse process, meaning that stage may be easier to initiate than had been previously thought.
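As a back-of-the-envelope illustration of the threshold idea, one can compare the stellar energy a planet absorbs with the maximum thermal radiation a steam-laden atmosphere can emit; the limit value below is an assumed, rounded figure for illustration, not the paper's exact result.

```python
# Illustrative runaway-greenhouse check: a planet runs away when the
# stellar flux it absorbs exceeds the maximum outgoing thermal radiation
# its moist atmosphere can sustain. The limit below is an assumed value.

RUNAWAY_LIMIT_W_M2 = 282.0   # assumed radiation limit, W/m^2 (illustrative)

def absorbed_flux(stellar_constant_w_m2, bond_albedo):
    """Globally averaged absorbed stellar flux in W/m^2."""
    return stellar_constant_w_m2 * (1.0 - bond_albedo) / 4.0

# Present-day Earth: solar constant ~1361 W/m^2, Bond albedo ~0.3
earth = absorbed_flux(1361.0, 0.30)
verdict = "runaway" if earth > RUNAWAY_LIMIT_W_M2 else "stable"
print(f"Earth absorbs ~{earth:.0f} W/m^2 -> {verdict} "
      f"against a {RUNAWAY_LIMIT_W_M2:.0f} W/m^2 limit")
```

Lowering the radiation limit, as the new modeling does, moves the inner boundary of the habitable zone outward: planets absorbing only somewhat more energy than Earth can already be past the threshold.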
"The habitable zone becomes much narrower, in the sense that you can no longer get as close to the star as we thought before going into a runaway greenhouse," said Tyler Robinson, a UW astronomy postdoctoral researcher and second author on the paper. The lead author is Colin Goldblatt of the University of Victoria.
Though further research is called for, the findings could lead to a recalibration of where the habitable zone begins and ends, with some planets having their candidacy as possible habitable worlds revoked.
"These worlds on the very edge got 'pushed in,' from our perspective -- they are now beyond the runaway greenhouse threshold," Robinson said.
Subsequent research, the astronomers say, is needed in part because their computer modeling was done in a "single-column, clear-sky model," or a one-dimensional measure averaged around a planetary sphere that does not account for the atmospheric effect of clouds.
The findings apply to planet Earth as well. As the sun increases in brightness over time, Earth, too, will move into the runaway greenhouse stage -- but not for a billion and a half years or so. Still, it inspired the astronomers to write, "As the solar constant increases with time, Earth's future is analogous to Venus's past."
Other co-authors are Kevin J. Zahnle of the NASA Ames Research Center in Moffett Field, Calif.; and David Crisp of the Jet Propulsion Laboratory in Pasadena, Calif.

Spitzer Discovers Young Stars With a 'Hula Hoop'

 Astronomers using NASA's Spitzer Space Telescope have spotted a young stellar system that "blinks" every 93 days. Called YLW 16A, the system likely consists of three developing stars, two of which are surrounded by a disk of material left over from the star-formation process.

As the two inner stars whirl around each other, they periodically peek out from the disk that girds them like a hula hoop. The hoop itself appears to be misaligned from the central star pair, probably due to the disrupting gravitational presence of the third star orbiting at the periphery of the system. The whole system cycles through bright and faint phases, with the central stars playing a sort of cosmic peek-a-boo as the tilted disk twirls around them. It is believed that this disk should go on to spawn planets and the other celestial bodies that make up a solar system.
Spitzer observed infrared light from YLW 16A, emitted by the warmed gas and dust in the disk that still swathes the young stars. Other observations came from the ground-based 2MASS survey, as well as from the NACO instrument at the European Southern Observatory's Very Large Telescope in Chile.
YLW 16A is the fourth example of a star system known to blink in such a manner, and the second in the same star-forming region, Rho Ophiuchi. The finding suggests that these systems might be more common than once thought. Blinking star systems with warped disks offer scientists a way to study how planets form in these environments. The planets can orbit one or both of the stars in the binary star system. The famous science-fiction planet Tatooine in "Star Wars" orbits two stars, hence its double sunsets. Such worlds are referred to as circumbinary planets. Astronomers can record how light is absorbed by planet-forming disks during the bright and faint phases of blinking stellar systems, which in turn reveals information about the materials that comprise the disk.
"These blinking systems offer natural probes of the binary and circumbinary planet formation process," said Peter Plavchan, a scientist at the NASA Exoplanet Science Institute and Infrared Processing and Analysis Center at the California Institute of Technology, Pasadena, Calif., and lead author of a new paper accepted for publication in Astronomy & Astrophysics.
NASA's Jet Propulsion Laboratory, Pasadena, Calif., manages the Spitzer Space Telescope mission for NASA's Science Mission Directorate, Washington. Science operations are conducted at the Spitzer Science Center at Caltech. Data are archived at the Infrared Science Archive housed at the Infrared Processing and Analysis Center. Caltech manages JPL for NASA.

Sunday, July 14, 2013

Promise of 'Human Computing Power' Via Crowdsourcing to Speed Medical Research

"Human computing power" harnessed from ordinary citizens across the world has the potential to accelerate the pace of health care research of all kinds, a team from the Perelman School of Medicine at the University of Pennsylvania, writes in a new review published online in the Journal of General Internal Medicine. In fact, they suggest, crowdsourcing -- a research method that allows investigators to engage thousands of people to provide either data or data analysis, usually via online communications -- could even improve the quality of research while reducing the costs.

But the field is new, and the team's findings suggest that standardized guidelines for health care crowdsourcing ventures are needed so that data can be collected, reported, and replicated most efficiently.
"While the concept of 'citizen science' has been in existence for more than a century and crowdsourcing has been used in science for at least a decade, it has been utilized primarily by non-medical fields and little is known about its potential in health research," said the study's senior author Raina Merchant, MD, an assistant professor of Emergency Medicine at Penn.
Merchant and colleagues successfully utilized crowdsourcing in a recent study to locate and catalog the locations of lifesaving automated external defibrillators (AEDs) throughout Philadelphia in the MyHeartMap Challenge. Their study led to the identification of more than 1,400 AEDs in public places, and they hope to replicate the study in other major cities across the U.S.
For the current review, in addition to a traditional database search, her team employed crowdsourcing again to perform a literature search for health and medical research articles using two free websites: Yahoo! Answers and Quora. Through this approach, they were able to collect and analyze 21 health-related studies that used crowdsourcing techniques. The studies, which collectively engaged a crowd of over 136,000 people, ranged in focus from tracking H1N1 influenza outbreaks in near real time to classifying different types of polyps in the colon.
"There is understandably some apprehension about letting the lay public in on medical research or even assisting with making medical diagnoses because the stakes are so high in medicine. However, studies we reviewed showed that the crowd can be very successful, such as solving novel complex protein structure problems or identifying malaria infected red blood cells with a similar accuracy as a medical professional," said the study's first author Benjamin Ranard, a third year medical student in the Perelman School of Medicine.
The research team found that the studies centered around four main categories of tasks: problem solving, data processing, surveillance/monitoring and surveying.
However, they found considerable variability in the amount and type of data reported about the crowd and the experimental setup, which would make it difficult for other researchers to replicate or model their work for their own research. For instance, the articles rarely reported data about the demographics of the crowd participating, including information standard to most clinical trials such as the size of the cohort, age, gender, and geographic location. They also noted that the limited number of studies they found is surprising given the potential benefits of this approach.
The authors recommend that other health and medical investigators should look at their own research projects and consider involving the public through crowdsourcing. Whenever research requires human processing that computers alone cannot do, such as visually sorting pictures or other data, they say there is a potential to involve the crowd. Crowdsourcing can also be used to take advantage of problem solving skills members of the public may have (such as solving three-dimensional puzzles), or to employ the crowd to act as human sensors reporting data about the environment (for example, reporting cases of influenza-like symptoms).
"Every health field from studying chronic diseases to global health has a potential need for human computing power that crowdsourcing could fill to accelerate research. Prior work has heralded crowdsourcing as a feasible method for data collection, but a clear roadmap for the types of questions crowdsourcing could answer and the ways it could be applied has been lacking," said Merchant. "This review points to the need for streamlining the process and implementing more rigorous guidelines for this approach."
They call for continued study of the scope of crowdsourcing to determine where it might be as useful as traditional data. To further explore the power of crowdsourcing and other research approaches via social media, Merchant was recently appointed director of the Social Media Lab at the Penn Medicine Center for Health Care Innovation. In this role, she will lead a program exploring ways in which new communication channels can enhance Penn's ability to understand and improve the health and health care of patients and other populations.
Other Penn Medicine authors include Yoonhee P. Ha, MSc; Zachary F. Meisel, MD, MPH, MS; David A. Asch, MD, MBA; and Lance B. Becker, MD.


Brain Region Implicated in Emotional Disturbance in Dementia Patients

A study by researchers at Neuroscience Research Australia (NeuRA) is the first to demonstrate that patients with frontotemporal dementia (FTD) lose the emotional content/colour of their memories. These findings explain why FTD patients may not vividly remember an emotionally charged event like a wedding or funeral.

The research team discovered that a region of the brain, called the orbitofrontal cortex, plays a key role in linking emotion and memories.
"This step forward in the mapping of the brain will improve how we diagnose different types of dementia," says the study's lead author, Associate Professor Olivier Piguet.
The fact that we vividly remember events infused with emotion -- like birthday parties -- is well established. Patients with frontotemporal dementia (FTD) -- a degenerative condition that affects the frontal and temporal lobes of the brain -- show profound difficulty understanding and expressing emotion. Yet the extent to which such deficits weaken emotional enhancement of memory remains unknown.
To find out, the NeuRA team showed patients images that prompt an emotional reaction in healthy people. Healthy control subjects and patients with Alzheimer's disease remembered more emotional than neutral images. The FTD patients, however, did not.
Professor Piguet says, "Up until now, we knew that emotional memories were supported by the amygdala, a brain region also involved with emotion regulation. This study is the first to demonstrate the involvement of the orbitofrontal cortex in this process. This is an important development in how we understand the relations between emotions and memory and the disturbance of the emotional system in this type of dementia."
NeuRA researcher Fiona Kumfor says the findings will help carers better understand why their loved ones may find personal interactions difficult. "Imagine if you attended the wedding of your daughter, or met your grandchild for the first time, but this event was as memorable as doing the groceries. We have discovered that this is what life is like for patients with FTD," says Kumfor.
"This is the first study that has looked at memory and emotion together in FTD and that is exciting. We now have new insight into the disease and can demonstrate that emotional memories are affected differently, depending on the type of dementia.
"This information could help us create diagnostic tools and change how we diagnose certain types of dementias and differentiate between them. We have basically found the source of the deficit driving these impairments in patients, which brings us a step closer to understanding what it means to have FTD," she concluded.
 

Geothermal Power Facility Induces Earthquakes, Study Finds

An analysis of earthquakes in the area around the Salton Sea Geothermal Field in southern California has found a strong correlation between seismic activity and operations for production of geothermal power, which involve pumping water into and out of an underground reservoir.

"We show that the earthquake rate in the Salton Sea tracks a combination of the volume of fluid removed from the ground for power generation and the volume of wastewater injected," said Emily Brodsky, a geophysicist at the University of California, Santa Cruz, and lead author of the study, published online in Science on July 11.
"The findings show that we might be able to predict the earthquakes generated by human activities. To do this, we need to take a large view of the system and consider both the water coming in and out of the ground," said Brodsky, a professor of Earth and planetary sciences at UCSC.
Brodsky and coauthor Lia Lajoie, who worked on the project as a UCSC graduate student, studied earthquake records for the region from 1981 through 2012. They compared earthquake activity with production data for the geothermal power plant, including records of fluid injection and extraction. The power plant is a "flash-steam facility" which pulls hot water out of the ground, flashes it to steam to run turbines, and recaptures as much water as possible for injection back into the ground. Due to evaporative losses, less water is pumped back in than is pulled out, so the net effect is fluid extraction.
During the period of relatively low-level geothermal operations before 1986, the rate of earthquakes in the region was also low. Seismicity increased as the operations expanded. After 2001, both geothermal operations and seismicity climbed steadily.
The researchers tracked the variation in net extraction over time and compared it to seismic activity. The relationship is complicated because earthquakes are naturally clustered due to local aftershocks, and it can be difficult to separate secondary triggering (aftershocks) from the direct influence of human activities. The researchers developed a statistical method to separate out the aftershocks, allowing them to measure the "background rate" of primary earthquakes over time.
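The paper's statistical machinery is more involved, but a minimal sketch of the general idea -- remove likely aftershocks with a simple time window, count the remaining background events, and compare that rate with extraction records -- might look like the following; the data here are synthetic and the method is a generic illustration, not the authors' algorithm.

```python
import numpy as np

def background_rate(event_times, window_days=10.0, bin_days=365.0):
    """Crude declustering: drop any event that follows a kept event within
    `window_days` (treated as an aftershock), then count the remaining
    'background' events in fixed time bins."""
    times = np.sort(np.asarray(event_times, dtype=float))
    keep = np.ones(times.size, dtype=bool)
    last_kept = -np.inf
    for i, t in enumerate(times):
        if t - last_kept < window_days:
            keep[i] = False          # likely an aftershock
        else:
            last_kept = t
    bins = np.arange(times.min(), times.max() + bin_days, bin_days)
    counts, _ = np.histogram(times[keep], bins=bins)
    return counts

# Hypothetical inputs: event times (days) and net extraction per bin (m^3)
rng = np.random.default_rng(0)
event_times = np.cumsum(rng.exponential(5.0, size=2000))
rate = background_rate(event_times)
net_extraction = rng.normal(1.0e7, 1.0e6, size=rate.size)
print("correlation:", np.corrcoef(rate, net_extraction)[0, 1])
```

With the real earthquake catalog and production records, the equivalent comparison is what yields the correlation the team reports.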
"We found a good correlation between seismicity and net extraction," Brodsky said. "The correlation was even better when we used a combination of all the information we had on fluid injection and net extraction. The seismicity is clearly tracking the changes in fluid volume in the ground."
The vast majority of the induced earthquakes are small, and the same is true of earthquakes in general. The key question is what is the biggest earthquake that could occur in the area, Brodsky said. The largest earthquake in the region of the Salton Sea Geothermal Field during the 30-year study period was a magnitude 5.1 earthquake.
The nearby San Andreas fault, however, is capable of unleashing extremely destructive earthquakes of at least magnitude 8, Brodsky said. The location of the geothermal field at the southern end of the San Andreas fault is cause for concern due to the possibility of inducing a damaging earthquake.
"It's hard to draw a direct line from the geothermal field to effects on the San Andreas fault, but it seems plausible that they could interact," Brodsky said.
At its southern end, the San Andreas fault runs into the Salton Sea, and it's not clear what faults there might be beneath the water. A seismically active region known as the Brawley Seismic Zone extends from the southern end of the San Andreas fault to the northern end of the Imperial fault. The Salton Sea Geothermal Field, located on the southeastern edge of the Salton Sea, is one of four operating geothermal fields in the area.


Insect Discovery Sheds Light On Climate Change

Simon Fraser University biologists have discovered a new, extinct family of insects that will help scientists better understand how some animals responded to global climate change and the evolution of communities.

"The Eocene Apex of Panorpoid Family Diversity," a paper by SFU's Bruce Archibald and Rolf Mathewes, plus David Greenwood from Brandon University, was recently published in the Journal of Paleontology.
The researchers named the new family the Eorpidae, after the Eocene Epoch, the age when these insects lived some 50 million years ago. The fossils were found in British Columbia and Washington state, most prominently at the McAbee Fossil Beds near Cache Creek, B.C.
This new family raises questions about its extinction. Insect families have steadily accumulated since before the Eocene, with few, scattered losses -- apart from the distinct exception of a cluster of family extinctions within a group of scorpionflies that includes the Eorpidae.
"The Eorpidae was part of a cluster of six closely related families in the Eocene, but today this group is reduced to two. Why were these different?" says Archibald. "We believe the answer may lay in a combination of two large-scale challenges that would have hit them hard: the evolutionary diversification of a strong competitive group and global climate change."
In a major evolutionary diversification, ants evolved from a small group to become major ecological players in the Eocene, now competing with these scorpionflies for the same food resource in a whole new, efficient manner.
Global climates were much warmer 50 million years ago, associated with increased atmospheric carbon, a relationship that scientists see today. Along with this, winters were mild, even in the cool, higher elevations where these insects lived. Average temperatures there were similar to modern Vancouver, but with few -- if any -- frost days.
When climates outside of the tropics later cooled, temperature seasonality also widened, forming the modern pattern of hot summers and freezing winters. Plant and animal groups that inhabited Eocene uplands either had to evolve tolerance for colder winters, migrate to the hot tropics and adapt to that climate, or go extinct.
"These scorpionfly families appear to have retained their need to inhabit cooler climates, but to persist there, they would need to evolve toleration for cold winters, a feat that only the two surviving families may have accomplished," Archibald explains. "Understanding the evolutionary history of these insects adds another piece to the puzzle of how animal communities change as climate does -- but in this case, when an interval of global warming ends."
 

Disks Don't Need Planets to Make Patterns, NASA Study Shows

Many young stars known to host planets also possess disks containing dust and icy grains, particles produced by collisions among asteroids and comets also orbiting the star. These debris disks often show sharply defined rings or spiral patterns, features that could signal the presence of orbiting planets. Astronomers study the disk features as a way to better understand the physical properties of known planets and possibly uncover new ones.

But a new study by NASA scientists sounds a cautionary note in interpreting rings and spiral arms as signposts for new planets. Thanks to interactions between gas and dust, a debris disk may, under the right conditions, produce narrow rings on its own, no planets needed.
[Simulation video: the changing dust density and growth of structure in a simulated debris disk, which extends about 100 times farther from its star than Earth's orbit around the sun. At left, the disk is seen from a 24-degree angle; at right, it's face-on. Lighter colors show higher dust density.]
"When the mass of gas is roughly equal to the mass of dust, the two interact in a way that leads to clumping in the dust and the formation of patterns," said lead researcher Wladimir Lyra, a Sagan Fellow at NASA's Jet Propulsion Laboratory in Pasadena, Calif. "In essence, the gas shepherds the dust into the kinds of structures we would expect to be see if a planet were present."
A paper describing the findings was published in the July 11 issue of Nature.
The warm dust in debris disks is easy to detect at infrared wavelengths, but estimating the gas content of disks is a much greater challenge. As a result, theoretical studies tend to focus on the role of dust and ice particles, paying relatively little attention to the gas component. Yet icy grains evaporate and collisions produce both gas and dust, so at some level all debris disks must contain some amount of gas.
"All we need to produce narrow rings and other structures in our models of debris disks is a bit of gas, too little for us to detect today in most actual systems," said co-author Marc Kuchner, an astrophysicist at NASA's Goddard Space Flight Center in Greenbelt, Md.
Here's how it works. When high-energy ultraviolet light from the central star strikes a clump of dust and ice grains, it drives electrons off the particles. These high-speed electrons then collide with and heat nearby gas.
The rising gas pressure changes the drag force on the orbiting dust, causing the clump to grow and better heat the gas. This interaction, which the astronomers refer to as the photoelectric instability, continues to cascade. Clumps grow into arcs, rings, and oval features in tens of thousands of years, a relatively short time compared to other forces at work in a young solar system.
A model developed by Lyra and Kuchner shows the process at work.
"We were fascinated to watch this structure form in the simulations," Lyra said. "Some of the rings begin to oscillate, and at any moment they have the offset appearance of dust rings we see around many stars, such as Fomalhaut."
In addition, dense clumps with many times the dust density elsewhere in the disk also form during the simulation. When a clump in a ring grows too dense, the ring breaks into arcs and the arcs gradually shrink until only a single compact clump remains. In actual debris disks, some of these dense clumps could reflect enough light to be directly observable.
"We would detect these clumps as bright moving sources of light, which is just what we're looking for when we search for planets," adds Kuchner.
The researchers conclude that the photoelectric instability provides a simple and plausible explanation for many of the features found in debris disks, making the job of planet-hunting astronomers just a little bit harder.
 

Race for New Temperature Definition: Most Accurate Measurement Yet of Boltzmann Constant

Scientists at the UK's National Physical Laboratory (NPL) have performed the most accurate measurement yet of the Boltzmann constant.

While the impact of such an achievement is not immediately obvious, the measurement could revolutionise the way we define temperature, replacing the standard method that has been used for over 50 years.
The new measurement is 1.380 651 56 (98) × 10⁻²³ J K⁻¹, where the (98) shows the uncertainty in the last two digits, which amounts to an uncertainty of 0.7 parts per million -- almost half the previous lowest uncertainty.
The result has been published today, 11 July, in the journal Metrologia by IOP Publishing on behalf of the Bureau International des Poids et Mesures (BIPM).
Scientists currently define the kelvin and the degree Celsius using the temperature of the triple point of water -- the point at which liquid water, solid ice and water vapour can all exist in equilibrium.
This 'standard temperature' has been defined as 273.16 K exactly. The temperature measurements we make in everyday life are an assessment of how much hotter or colder an object is compared to this value.
As temperature measurements need to be made with increasing accuracy across a wide range of disciplines, fixing a single temperature as a standard becomes problematic, especially when you're measuring extremely hot or cold temperatures.
Lead author of the study, Dr Michael de Podesta, said: "The further away one measures from the temperature of the triple point of water, the harder it gets to precisely determine the ratio of exactly how much hotter or colder the temperature is than the standard temperature. This adds uncertainty to temperature measurements on top of the normal practical difficulties."
The solution is to redefine the kelvin using a fixed constant of nature, just as the metre has moved from a physical piece of metal to the length of the path travelled by light in vacuum over a specified number of nanoseconds. The suggested method is to use the Boltzmann constant, which is a measure of the relationship between the kinetic energy of molecules and temperature.
"It is fascinating that we worked out how to measure temperature long before we knew what temperature actually was. Now we understand that the temperature of an object is related to the energy of motion of its constituent atoms and molecules. When you touch an object and it feels 'hot' you are literally sensing the 'buzzing' of the atomic vibrations. The new definition directly links the unit of temperature to this basic physical reality," continued de Podesta.
In this study, the researchers, in collaboration with Cranfield University and the Scottish Universities Environmental Research Centre (SUERC), used acoustic thermometry to make the measurement by building an acoustic resonator and making amazingly precise measurements of the speed of sound in argon gas.
The researchers first cooled the resonator to the temperature of the triple point of water so they knew the temperature exactly in the current definition and filled it with argon gas that had its isotopic purity assessed by the SUERC team.
Then they used the speed of sound measurement to calculate the average speed of the argon molecules and hence the average amount of kinetic energy that they had -- from this they were able to calculate the Boltzmann constant with an extremely high accuracy.
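The underlying relation is simple: in a (nearly ideal) monatomic gas such as argon, the speed of sound depends only on the temperature and the mass of the atoms, so measuring it at the exactly defined triple-point temperature pins down the Boltzmann constant. A schematic version of that calculation follows, with an illustrative speed-of-sound value rather than NPL's measured one, and none of the many corrections the real experiment applies.

```python
# k_B from the speed of sound in a monatomic ideal gas:
#   c^2 = gamma * k_B * T / m   =>   k_B = c^2 * m / (gamma * T)

N_A   = 6.02214076e23    # Avogadro constant, 1/mol
T_TPW = 273.16           # triple point of water, K (exact by definition)
GAMMA = 5.0 / 3.0        # heat-capacity ratio of a monatomic gas
M_AR  = 39.948e-3        # molar mass of natural argon, kg/mol

c_sound = 307.8          # m/s, illustrative value for argon near T_TPW

m_atom = M_AR / N_A
k_B = c_sound**2 * m_atom / (GAMMA * T_TPW)
print(f"k_B ~ {k_B:.4e} J/K")   # comes out near 1.38e-23 J/K
```

The experimental effort goes into pinning down the speed of sound, the gas composition and the resonator geometry precisely enough that this simple relation delivers a parts-per-million result.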
In order to achieve this high accuracy, the researchers also needed to measure the radius of the resonator with a high accuracy.
The team at Cranfield University used a single-crystal diamond cutting tool to produce four copper hemispheres. The best pair, when put together, formed a triaxially-ellipsoidal resonator that was the correct shape to within one thousandth of a millimetre. The radius was then calculated using the specific frequencies at which the wavelength of microwaves exactly fit into the resonator and was measured with an overall uncertainty of 11.7 nm, which is the thickness of about 600 atoms.
"This experiment has been exhilarating, and after six years we are exhausted. Every aspect of the experiment has required pushing science and engineering to the limit. In this kind of work we need to worry constantly about all the things which might go wrong, and how they might affect the results. We are looking forward to worrying a little less and getting on with exploiting some of the new technology we have invented in the course of the project," continued de Podesta.


Engineers Test Microelectronic Cooling System in Zero Gravity On Board the Novespace Airbus

Three years of preparation supported by NASA paid off this June for researchers from the University of Illinois at Chicago who conducted experiments while floating weightless on a Novespace & European Space Agency (ESA) plane.

Under the direction of Alexander Yarin, UIC professor of mechanical and industrial engineering, Suman Sinha Ray, a post-doctoral fellow and recent UIC graduate, and his brother Sumit Sinha Ray, a graduate student, braved high- and zero-gravity to test a cooling system Yarin's team developed for hot-running microelectronics.
When liquid on a hot surface evaporates, it carries away heat. Yarin and his colleagues were interested in learning how well the evaporative cooling system they developed would work under conditions of twice-normal or zero gravity.
The UIC team, one of 12 international teams monitoring experiments onboard the Novespace Airbus plane, was collaborating with Professor C. Tropea from the Technische Universität Darmstadt, in Germany, who sent three students on board.
The Novespace Airbus' parabolic flights produced conditions of weightlessness and nearly double normal gravity. As a plane flies a parabola, gravitational force increases when it climbs or descends steeply. As the plane reaches its peak and floats over the top of the curve, passengers experience weightlessness.
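As a rough kinematic sketch (with assumed round numbers, not Novespace's actual flight profile): during the weightless phase the aircraft is essentially in free fall, so the zero-g interval is set by how much vertical speed the plane carries into the parabola. The rest of each manoeuvre is spent pulling up, recovering and flying level.

```python
import math

G = 9.81  # m/s^2

def weightless_seconds(speed_m_s, climb_angle_deg):
    """Approximate zero-g duration: the aircraft follows a ballistic arc,
    so weightlessness lasts roughly as long as gravity needs to reverse
    the initial vertical velocity. Drag and piloting shorten the real figure."""
    v_vertical = speed_m_s * math.sin(math.radians(climb_angle_deg))
    return 2.0 * v_vertical / G

# Assumed example: entering the parabola at ~150 m/s with a ~45-degree climb
print(f"~{weightless_seconds(150.0, 45.0):.0f} s of weightlessness per parabola")
```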
The flights were physically demanding but "exciting and fun," said Suman Sinha Ray. They were warned not to try to float around the cabin as astronauts do in the space station. When you move your head while weightless, he said, "your senses don't match," and many people become nauseated, even after a pre-flight anti-nausea injection.
Suman was right at home, his brother Sumit said, and "worked on the computer very naturally as we ran the experiment."
Sumit photographed the experiment while keeping a close eye on the pressurized rig running the system. Holding the camera steady as his feet floated off the floor was a challenge, he said, as was keeping an eye on his fellow travelers. He accidentally kicked a Japanese scientist who floated too close.
The UIC researchers were testing their cooling system for potential application in near or outer space. Satellites, rockets and drones have elaborate electro-optical and infrared sensors, recording equipment and data processing systems. All of these electronics are designed with smaller and smaller elements that generate heat and can burn out.
"This is a problem that is very acute," said Yarin. "We are very nearly at the limit of miniaturization because of the problem of heat removal."
Yarin and his group have developed novel nano-textured surfaces that dramatically increase cooling efficiency. Their cooling system covers high-heat surfaces with mats made from tangles of nanofibers. The extremely thin fibers of the mat trap coolant against the surface so that evaporation is rapid and complete.
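The heat budget behind this is straightforward: every gram of coolant that evaporates carries away its latent heat of vaporization. A back-of-the-envelope sketch with assumed numbers (not measured figures from the UIC system):

```python
# Cooling power from evaporation:  Q = m_dot * h_fg
H_FG_WATER = 2.26e6   # J/kg, latent heat of vaporization of water

def cooling_power_w(evaporation_rate_g_per_s, h_fg=H_FG_WATER):
    """Heat removed per second (W) for a given evaporation rate (g/s)."""
    return (evaporation_rate_g_per_s / 1000.0) * h_fg

# Evaporating 0.05 g of water per second from a nanofiber mat:
print(f"~{cooling_power_w(0.05):.0f} W of heat removed")   # roughly 113 W
```

The question the flights were designed to answer is whether that evaporation stays rapid and complete when gravity no longer helps hold the liquid against the hot surface.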
Over three days, the group flew three flights. On each 3 ½-hour flight, the plane flew 31 parabolas -- five minutes through the curve, five minutes rest, then another five-minute parabola, with a rest after every third parabola. They will now analyze the data collected during the high-gravity and zero-gravity portions of the parabolic flights to understand how the cooling system works under the conditions of space applications.
Yarin's group and the international collaboration with the Technische Universität Darmstadt was supported by NASA through grant NNX10AR99G.
 

Women at Risk of Developing Postpartum Psychosis Need Close Monitoring, Says New Review

There are clear risk factors for postpartum psychosis that all women should be asked about antenatally to ensure early recognition and prompt treatment of the condition, says a new review published today (12 July) in The Obstetrician & Gynaecologist (TOG).

Postpartum psychosis is a severe mental illness with a dramatic onset shortly after childbirth, affecting approximately 1-2 in 1000 deliveries. However, the review notes that the true incidence may be higher.
Common symptoms include mania, severe depression, delusions and hallucinations, confusion, bewilderment or perplexity, all of which increase the risk for both mother and child.
The review notes that there is consistent evidence of a specific relationship between postpartum psychosis and bipolar disorder. Women with bipolar disorder have at least a 1 in 4 risk of suffering postpartum psychosis. Genetics are also a factor and women with bipolar disorder and a personal or family history of postpartum psychosis are at particularly high risk with greater than 1 in 2 deliveries affected by postpartum psychosis.
However, half of women who develop postpartum psychosis have no family history or previous risk factors that put them in a high risk group of suffering from the condition.
The review emphasises the need for close contact and review from a multidisciplinary team during the perinatal period for at least three months following delivery, even if the woman is well, and recommends a written plan covering pregnancy and the postnatal period, which should be discussed with the woman and her family.
Dr Ian Jones, Reader in Perinatal Psychiatry, Cardiff University and co-author of the review said:
"Women at high risk of postpartum psychosis need very careful care before conception, throughout pregnancy and during the postpartum period, including pre-conception counselling and close monitoring and psychiatric assessment after childbirth.
"Postpartum psychosis is a true psychiatric emergency and it is vital that is recognised early and treated immediately. Admission to hospital is usually necessary and women should ideally be offered a specialist mother and baby unit where the best treatment options can be established."
Jason Waugh, TOG's Editor-in-chief, said: "This review emphasises the importance of identifying women at high risk of postpartum psychosis as well as the early recognition and prompt treatment of women who develop the condition.
"This paper also underlines that half of women who experience postpartum psychosis have no previous risk factors. It is therefore vital that all women are made aware of the condition and its signs and symptoms."


Sculpting Flow: Supercomputers Help Microfluidics Researchers Make Waves at the Microscopic Level

Have you ever noticed the way water flows around boulders in a fast-moving river, creating areas of stillness and intense motion? What if those forces of fluid flow could be controlled at the smallest levels?

In May 2013, researchers from UCLA, Iowa State and Princeton reported results in Nature Communications about a new way of sculpting tailor-made fluid flows by placing tiny pillars in microfluidic channels. By altering the speed of the fluid and stacking many pillars with different widths, placements and orientations in the fluid's path, they showed that it is possible to create an impressive array of controlled flows.
Why does this matter?
Because such a method will allow clinicians to separate white blood cells from other cells in a blood sample, increase mixing in industrial applications, and more quickly perform lab-on-a-chip-type operations, like DNA sequencing and chemical detection. Each of these could form the foundation for a multi-million dollar industry. Together, they could revolutionize microfluidics.
"Most microfluidic flow is at a very low speed," said Baskar Ganapathysubramaniam, assistant professor of mechanical engineering at Iowa State and one of the lead researchers. "At that speed, the flow hugs the cylinder and there's fore-aft symmetry. Whatever's happening upstream is exactly mirrored downstream. But if you increase the speed -- or more technically, the Reynolds number -- slightly, you can break this symmetry and get wakes, vortices, and non-trivial deformations." All of which create distinct flows.
Hashing out the idea with Dino Di Carlo, associate professor of bioengineering at UCLA, the two researchers asked themselves if they could control the flow of fluids in microfluidic channels by placing pillars in specific locations in the path. Using both experimental methods and numerical simulations, they explored the possibilities offered by this approach and found that they could indeed create a range of predictable flows.
"Each pillar has a unique deformation signature to it," Ganapathysubramaniam said. "By stacking these pillars together, we can create an astounding variety of deformations, and these can be tuned for specific purposes."
"Engineering tools like this allow a scientists to easily develop and manipulate a flow to a shape of their interest," Di Carlo said. "There hasn't been that platform available in the fluids community."
The equations used to determine the fluid flows are fairly straightforward, but the number of configurations needed to solve the problem required them to use the Ranger supercomputer at the Texas Advanced Computing Center (TACC). Ranger, funded by the National Science Foundation (NSF), served the national open science community for five years and was replaced by Stampede (the sixth most powerful supercomputer in the world) in January 2013.
Using several thousand processors concurrently, the researchers ran more than 1,000 different problems, each representing a combination of different speeds, thicknesses, heights or offsets.
"Each of these gives us one transformation and together, they form what we call a library of transformations," Di Carlo explained.
With this method, Ganapathysubramanian says it's possible to create a sequence of pillars that would push white cells to the boundaries of a channel to separate them, and then return them to the center to be recaptured. He is also excited to study the potential of pillars to enhance mixing, which would be useful for removing heat from microprocessor fabrication as well as nano- and micro-scale controlled manufacturing.
Eventually, Di Carlo and Ganapathysubramanian want to crowd-source the identification of critical flow transformations that will have implications for industry.
"Once we have the library, we envision creating a video game where we ask the player to design a specific kind of flow transformation," Ganapathysubramanian explained. "They pick different pillars, stack them together, and see if they can get that configuration."
It's this kind of out-of-the-box thinking that characterizes the Iowa State scientist's research. Recently, partnering with Manish Parashar, the director of the Rutgers Discovery Informatics Institute (RDI2) at Rutgers University, and with Rutgers research professor Jaroslaw Zola, Ganapathysubramanian undertook another experiment typical of his knack for creative problem-solving.
Using federated computing enabled by CometCloud, the project brought together a team of researchers with access to 10 supercomputers at six high performance computing (HPC) centers across three continents to continue and extend Ganapathysubramanian's microfluidics simulations. The consortium included TACC's new Stampede system, as well as resources from the Department of Energy, FutureGrid, and international HPC centers.
Using CometCloud, the researchers ran 12,845 flow simulations, consuming more than 2.5 million core-hours and generating 400 gigabytes of data over the course of 16 days.
"The experiment allowed us to explore an alternate paradigm for doing computational science and demonstrate that we can support applications using this paradigm," Parashar said. "Many applications have a similar workflow so this could be a model for supporting researchers without all of them going to one resource or another. This could be used to provide compute resources to a wide range of applications."
The computations enabled by CometCloud brought Ganapathysubramanian halfway to his dream of a complete library of microscopic fluid flows. However, the entire library would take much more computing. Fortunately, supercomputers are getting relentlessly faster, and with new technologies come new opportunities for industry, science and medicine.


Caribbean's Native Predators Unable to Stop Aggressive Lionfish Population Growth

"Ocean predator" conjures up images of sharks and barracudas, but the voracious red lionfish is out-eating them all in the Caribbean -- and Mother Nature appears unable to control its impact on local reef fish. That leaves human intervention as the most promising solution to the problem of this highly invasive species, said researchers at the University of North Carolina at Chapel Hill.

"Lionfish are here to stay, and it appears that the only way to control them is by fishing them," said John Bruno, professor of biology in UNC's College of Arts and Sciences and lead investigator of the study. The research has important implications not just for Caribbean reefs, but for the North Carolina coast, where growing numbers of lionfish now threaten local fish populations.
Lionfish, native to the Indo-Pacific region, have long been popular aquarium occupants, with their striking stripes and soft, waving fins. They also have venomous spines, making them unpleasant fare for predators, including humans -- though once the spines are carefully removed, lionfish are generally considered safe to eat, Bruno said.
They have become big marine news as the latest invasive species to threaten existing wildlife populations. Bruno likened their extraordinary success to that of ball pythons, now eating their way through Florida Everglades fauna, with few predators other than alligators and humans.
"When I began diving 10 years ago, lionfish were a rare and mysterious species seen deep within coral crevices in the Pacific Ocean," said Serena Hackerott, lead author and master's student in marine sciences, also in UNC's College of Arts and Sciences. "They can now been seen across the Caribbean, hovering above the reefs throughout the day and gathering in groups of up to ten or more on a single coral head."
The international research team looked at whether native reef predators such as sharks and groupers could help control the population growth of red lionfish in the Caribbean, either by eating them or out-competing them for prey. They also wanted to evaluate scientifically whether, as some speculate, that overfishing of reef predators had allowed the lionfish population to grow unchecked.
The team surveyed 71 reefs, in three different regions of the Caribbean, over three years. Their results indicate there is no relationship between the density of lionfish and that of native predators, suggesting that, "interactions with native predators do not influence" the number of lionfish in those areas, the study said.
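A minimal sketch of the kind of comparison behind that statement -- correlate lionfish density with native-predator abundance across surveyed reefs and see whether reefs with more predators hold fewer lionfish -- is shown below with synthetic data; the study's actual analysis and records are more involved.

```python
import numpy as np

# Hypothetical survey data: one value per surveyed reef (71 reefs).
rng = np.random.default_rng(1)
predator_biomass = rng.lognormal(mean=3.0, sigma=0.8, size=71)   # kg / 100 m^2
lionfish_density = rng.poisson(lam=4.0, size=71).astype(float)   # fish / 100 m^2

r = np.corrcoef(predator_biomass, lionfish_density)[0, 1]
print(f"correlation across 71 reefs: r = {r:+.2f}")
# A correlation near zero gives no evidence that reefs with more native
# predators hold fewer lionfish -- the pattern the study reports.
```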
The researchers did find that lionfish populations were lower in protected reefs, attributing that to targeted removal by reef managers, rather than consumption by large fishes in the protected areas. Hackerott noted that during 2013 reef surveys, there appeared to be fewer lionfish on popular dive sites in Belize, where divers and reef managers remove lionfish daily.
The researchers support restoration of large reef predators as a way to achieve better balance and biodiversity, but they are not optimistic that this would affect the burgeoning lionfish population.
"Active and direct management, perhaps in the form of sustained culling, appears to be essential to curbing local lionfish abundance and efforts to promote such activities should be encouraged," the study concluded.
 

Scientists Cast Doubt On Theory of What Triggered Antarctic Glaciation

A team of U.S. and U.K. scientists has found geologic evidence that casts doubt on one of the conventional explanations for how Antarctica's ice sheet began forming. Ian Dalziel, research professor at The University of Texas at Austin's Institute for Geophysics and professor in the Jackson School of Geosciences, and his colleagues report the findings today in an online edition of the journal Geology.

The Antarctic Circumpolar Current (ACC), an ocean current flowing clockwise around the entire continent, insulates Antarctica from warmer ocean water to the north, helping maintain the ice sheet. For several decades, scientists have surmised that the onset of a complete ACC played a critical role in the initial glaciation of the continent about 34 million years ago.
Now, rock samples from the central Scotia Sea near Antarctica reveal the remnants of a now-submerged volcanic arc that formed sometime before 28 million years ago and might have blocked the formation of the ACC until less than 12 million years ago. Hence, the onset of the ACC may not be related to the initial glaciation of Antarctica, but rather to the subsequent well-documented descent of the planet into a much colder "icehouse" glacial state.
"If you had sailed into the Scotia Sea 25 million years ago, you would have seen a scattering of volcanoes rising above the water," says Dalziel. "They would have looked similar to the modern volcanic arc to the east, the South Sandwich Islands."
Using multibeam sonar to map seafloor bathymetry, which is analogous to mapping the topography of the land surface, the team identified seafloor rises in the central Scotia Sea. They dredged the seafloor at various points on the rises and discovered volcanic rocks and sediments created from the weathering of volcanic rocks. These samples are distinct from normal ocean floor lavas and geochemically identical to the presently active South Sandwich Islands volcanic arc to the east of the Scotia Sea that today forms a barrier to the ACC, diverting it northward.
Using a technique known as argon isotopic dating, the researchers found that the samples range in age from about 28 million years to about 12 million years. The team interpreted these results as evidence that an ancient volcanic arc, referred to as the ancestral South Sandwich arc (ASSA), was active in the region during that time and probably much earlier. Because the samples were taken from the current seafloor surface and volcanic material accumulates from the bottom up, the researchers infer that much older volcanic rock lies beneath.
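Argon isotopic (argon-argon) ages come from a standard relation between the measured ratio of radiogenic argon-40 to reactor-produced argon-39 and the decay constant of potassium-40. The sketch below uses hypothetical values for the irradiation parameter J and the isotope ratio; it illustrates the method and is not a recalculation of the paper's dates.

```python
import math

LAMBDA_K40 = 5.543e-10   # total decay constant of 40K, per year

def ar_ar_age_myr(j_factor, ar40_ar39_ratio):
    """Age in millions of years from the standard Ar-Ar age equation
       t = (1/lambda) * ln(1 + J * (40Ar*/39Ar_K)).
       Both inputs here are hypothetical, for illustration only."""
    return math.log(1.0 + j_factor * ar40_ar39_ratio) / LAMBDA_K40 / 1e6

print(f"~{ar_ar_age_myr(0.01, 1.56):.1f} Myr")   # ~28 Myr for these assumed values
```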
Combined with models of how the seafloor sinks vertically with the passage of time, the team posits that the ASSA originally rose above sea level and would have blocked deep ocean currents such as the ACC.
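The subsidence argument leans on a well-known empirical rule: oceanic crust rides high when young and sinks roughly with the square root of its age as the underlying plate cools. A sketch using a textbook-style coefficient (not values from this study) gives the flavour:

```python
import math

def subsidence_m(age_myr, coeff=350.0):
    """Approximate thermal subsidence of oceanic lithosphere relative to its
    starting depth, ~ coeff * sqrt(age) for crust younger than ~70 Myr.
    Textbook-style coefficient, for illustration only."""
    return coeff * math.sqrt(age_myr)

for age in (1, 5, 12, 28):
    print(f"after {age:>2} Myr: ~{subsidence_m(age):.0f} m of subsidence")
# Volcanic islands standing a few hundred metres above sea level can thus
# end up submerged by a kilometre or more within a few tens of millions of years.
```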
Two other lines of evidence support the notion that the ACC didn't begin until less than 12 million years ago. First, the northern Antarctic Peninsula and southern Patagonia didn't become glaciated until less than approximately 12 million years ago. And second, certain species of microscopic creatures called dinoflagellates that thrive in cold polar water began appearing in sediments off southwestern Africa around 11.1 million years ago, suggesting colder water began reaching that part of the Atlantic Ocean.
 

Saturday, July 13, 2013

Stem Cell Clues Uncovered

Proper tissue function and regeneration are supported by stem cells, which reside in so-called niches. New work from Carnegie's Yixian Zheng and Haiyang Chen identifies an important component for regulating stem cell niches, with impacts on tissue building and function. The results could have implications for disease research. The work is published in Cell Stem Cell.

Lamins are proteins that form the major structural component of the material that lines the inside of a cell's nucleus. Lamins have diverse functions, including suppressing gene expression. It has been difficult to understand how mutations in lamins cause diseases in specific tissues and organs, such as skeletal muscles, heart muscle, and fat.
A group of human diseases called laminopathies, which include premature aging, are caused by defects in proteins called lamins. Zheng and her team, which included Xin Chen of Johns Hopkins University, decided to examine whether lamins would link stem cell niche function to healthy tissue building and maintenance.
To understand the tissue-specific effects of lamin mutations, the team focused on fruit fly testis, one of the best-studied stem cell niche systems. In the fruit fly testis, biochemical cross-signaling between the different types of cells that make up the niche environment ensures proper maintenance and differentiation of the testis system's stem cells.
Using an advanced array of techniques available in fruit fly studies, the team demonstrated that lamins were a necessary component of supporting niche organization, which in turn regulates proper proliferation and differentiation of germline stem cells in fruit fly testis.
"These results could have implications for the role of lamins in other types of stem cell niches," Zheng said. "These findings could contribute to the study of diseases caused by lamina-based tissue degeneration. For example, different lamin mutations could disrupt the organization of different niches in the body, which then leads to degeneration in tissues."
 

As Ice Cover Disappears, Life in Frigid Antarctic Moves Fast

It might be cold in the Antarctic, but that doesn't mean that life there necessarily moves slowly. A report appearing in Current Biology, a Cell Press publication, on July 11 reveals the discovery of a surprisingly fast-growing community of glass sponges in an area formerly covered by permanent ice. With the ice at the surface disappearing, those little-known sponges are launching a seafloor takeover.

That's a surprise, given that glass sponges were thought to have very long and slow lives, the researchers say. The boldest estimates suggest lifetimes of more than 10,000 years.
"By comparing identical tracks video-surveyed by remotely operated underwater vehicle in one of the least accessible parts of the Antarctic, we found two- and three-fold increases in the biomass and abundance of glass sponges, respectively, from 2007 to 2011," says Claudio Richter of the Alfred Wegener Institute in Germany. "This is much faster than any of us would have thought possible."
"A general principle to be learned from our study is that benthic communities are very dynamic, even under the extreme environmental conditions prevailing in the Antarctic," says the study's first author, Laura Fillinger. "Only four years ago, the study area was dominated by a species of sea squirt. Now this pioneer species has all but disappeared, giving way to a community dominated by young individuals of a glass sponge."
Richter and Fillinger plan to keep going back to this polar site, to see what might happen next. They suspect the seafloor there will ultimately reach a climax community that looks like those in other shallow and seasonally ice-covered Antarctic waters; at this rate, that could happen within decades, not centuries.
Exactly what this will mean for the rest of the Antarctic or the planet is impossible to say. Glass sponges serve as important habitat for diverse communities of fish and invertebrates, but there is still a lot that no one really knows about them.
Ultimately, the future is anyone's guess. If you are a glass sponge, it does appear to be good news. "If the alarming rate of ice shelf disintegration continues… glass sponges may find themselves on the winners' side of climate change," the researchers write.
 

Writing Computer Programs Using Ordinary Language: Systems Convert Ordinary Language to Code

In a pair of recent papers, researchers at MIT's Computer Science and Artificial Intelligence Laboratory have demonstrated that, for a few specific tasks, it's possible to write computer programs using ordinary language rather than special-purpose programming languages.

The work may be of some help to programmers, and it could let nonprogrammers manipulate common types of files -- like word-processing documents and spreadsheets -- in ways that previously required familiarity with programming languages. But the researchers' methods could also prove applicable to other programming tasks, expanding the range of contexts in which programmers can specify functions using ordinary language.
"I don't think that we will be able to do this for everything in programming, but there are areas where there are a lot of examples of how humans have done translation," says Regina Barzilay, an associate professor of computer science and electrical engineering and a co-author on both papers. "If the information is available, you may be able to learn how to translate this language to code."
In other cases, Barzilay says, programmers may already be in the practice of writing specifications that describe computational tasks in precise and formal language. "Even though they're written in natural language, and they do exhibit some variability, they're not exactly Shakespeare," Barzilay says. "So again, you can translate them."
The researchers' recent papers demonstrate both approaches. In work presented in June at the annual Conference of the North American Chapter of the Association for Computational Linguistics, Barzilay and graduate student Nate Kushman used examples harvested from the Web to train a computer system to convert natural-language descriptions into so-called "regular expressions": combinations of symbols that enable file searches that are far more flexible than the standard search functions available in desktop software.
In a paper being presented at the Association for Computational Linguistics' annual conference in August, Barzilay and another of her graduate students, Tao Lei, team up with professor of electrical engineering and computer science Martin Rinard and his graduate student Fan Long to describe a system that automatically learned how to handle data stored in different file formats, based on specifications prepared for a popular programming competition.
Regular irregularities
As Kushman explains, computer science researchers have had some success with systems that translate questions written in natural language into special-purpose formal languages -- languages used to specify database searches, for instance. "Usually, the way those techniques work is that they're finding some fairly direct mapping between the natural language and this formal representation," Kushman says. "In general, the logical forms are handwritten so that they have this nice mapping."
Unfortunately, Kushman says, that approach doesn't work with regular expressions, strings of symbols that can describe the data contained in a file with great specificity. A regular expression could indicate, say, just those numerical entries in a spreadsheet that are three columns over from a cell containing a word of any length whose final three letters are "BOS."
But regular expressions, as ordinarily written, don't map well onto natural language. For example, Kushman explains, the regular expression used to search for a three-letter word starting with "a" would contain a symbol indicating the start of a word, another indicating the letter "a," a set of symbols indicating the identification of a letter, and a set of symbols indicating that the previous operation should be repeated twice. "If I'm trying to do the same syntactic mapping that I would normally do," Kushman says, "I can't pull out any sub-chunk of this that means 'three-letter.'"
What Kushman and Barzilay determined, however, is that any regular expression has an equivalent that does map nicely to natural language -- although it may not be very succinct or, for a programmer, very intuitive. Moreover, using a mathematical construct known as a graph, it's possible to represent all equivalent versions of a regular expression at once. Kushman and Barzilay's system thus has to learn only one straightforward way of mapping natural language to symbols; then it can use the graph to find a more succinct version of the same expression.
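To make the mapping problem concrete, here is a minimal, hypothetical Python sketch (not the researchers' system) contrasting the compact, programmer-style regular expression for "a three-letter word starting with 'a'" with a semantically equivalent but more verbose form whose pieces line up more naturally with the phrases of the English description:

    import re

    # Compact, programmer-style form: word boundary, the letter 'a',
    # then exactly two more word characters, then another boundary.
    compact = r"\ba\w{2}\b"

    # An equivalent but more verbose form, closer to how the phrase
    # decomposes in English ("a word" / "starting with a" / "three letters long"):
    verbose = r"\b(?=a)(?=\w{3}\b)\w+"

    text = "ant art area apt banana"
    print(re.findall(compact, text))   # ['ant', 'art', 'apt']
    print(re.findall(verbose, text))   # ['ant', 'art', 'apt']

The researchers' actual representation and learning method are more sophisticated; the point here is only that equivalent regular expressions can differ greatly in how cleanly their parts correspond to natural-language phrases, which is the gap their graph construction bridges.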
When Kushman presented the paper he co-authored with Barzilay, he asked the roomful of computer scientists to write down the regular expression corresponding to a fairly simple text search. When he revealed the answer and asked how many had gotten it right, only a few hands went up. So the system could be of use to accomplished programmers, but it could also allow casual users of, say, spreadsheet and word-processing programs to specify elaborate searches using natural language.
Opening gambit
The system that Barzilay, Rinard, Lei and Long developed is one that can automatically write what are called input-parsing programs, essential components of all software applications. Every application has an associated file type -- .doc for Word programs, .pdf for document viewers, .mp3 for music players, and so on. And every file type organizes data differently. An image file, for instance, might begin with a few bits indicating the file type, a few more indicating the width and height of the image, and a few more indicating the number of bits assigned to each pixel, before proceeding to the bits that actually represent pixel colors.
Input parsers figure out which parts of a file contain which types of data: Without an input parser, a file is just a random string of zeroes and ones.
The MIT researchers' system can write an input parser based on specifications written in natural language. They tested it on more than 100 examples culled from specifications prepared for the programming competition mentioned above.
The system begins with minimal information about how written specifications might correspond to parser programs. It knows a handful of words that should consistently refer to particular data types -- the word "integer," for instance -- and it knows that the specification will probably describe some data structures that are nested in others: An image file, for instance, could consist of multiple chunks, and each chunk would be headed by a few bytes indicating how big it is.
Otherwise, the system just tries lots of different interpretations of the specification on a few sample files; in the researchers' experiments, the samples, too, were provided on the competition website. If the resulting parser doesn't seem to work on some of the samples, the system varies its interpretation of the specification slightly. Moreover, as it builds more and more working parsers, it becomes more adept at recognizing regularities in the way that parsers are specified. It took only about 10 minutes of calculation on an ordinary laptop for the system to produce its candidate parsers for all 100-odd specifications.
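As a point of reference for the kind of program the system produces, here is a small, hypothetical Python sketch of a hand-written input parser for an invented chunked image-like format; the magic number, field widths and chunk layout are assumptions made purely for illustration, not a real file type or the researchers' output:

    import struct

    def parse_toy_file(data: bytes) -> dict:
        """Hand-written parser for a hypothetical chunked format: a 2-byte
        magic number and 16-bit width/height fields, followed by chunks
        that each begin with a 1-byte length."""
        magic, width, height = struct.unpack_from(">HHH", data, 0)
        offset = struct.calcsize(">HHH")
        chunks = []
        while offset < len(data):
            (size,) = struct.unpack_from(">B", data, offset)
            offset += 1
            chunks.append(data[offset:offset + size])
            offset += size
        return {"magic": magic, "width": width, "height": height, "chunks": chunks}

    # A minimal sample file: a 2x2 "image" followed by two small data chunks.
    sample = struct.pack(">HHH", 0xBEEF, 2, 2) + bytes([2, 10, 20, 3, 1, 2, 3])
    print(parse_toy_file(sample))

Writing such a parser by hand is routine for a programmer but opaque to everyone else; generating it automatically from the written specification, and checking candidates against sample files, is what the MIT system does.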
"This is a big first step toward allowing everyday users to program their computers without requiring any knowledge of programming language," says Luke Zettlemoyer, an assistant professor of computer science and engineering at the University of Washington. "The techniques they have developed should definitely generalize to other related programming tasks."
 

Asian Origins of Native American Dogs Confirmed

Once thought to have been extinct, native American dogs are on the contrary thriving, according to a recent study that links these breeds to ancient Asia.

The arrival of Europeans in the Americas has generally been assumed to have led to the extinction of indigenous dog breeds; but a comprehensive genetic study has found that the original population of native American dogs has been almost completely preserved, says Peter Savolainen, a researcher in evolutionary genetics at KTH Royal Institute of Technology in Stockholm.
In fact, American dog breeds trace their ancestry to ancient Asia, Savolainen says. In these native breeds, 30 percent or less of the ancestry has been replaced by modern European dogs, he says.
"Our results confirm that American dogs are a remaining part of the indigenous American culture, which underscores the importance of preserving these populations," he says.
Savolainen's research group, in cooperation with colleagues in Portugal, compared mitochondrial DNA from Asian and European dogs, ancient American archaeological samples, and American dog breeds, including Chihuahuas, Peruvian hairless dogs and Arctic sled dogs.
They traced the American dogs' ancestry back to East Asian and Siberian dogs, and also found direct relations between ancient American dogs and modern breeds.
"It was especially exciting to find that the Mexican breed, Chihuahua, shared a DNA type uniquely with Mexican pre-Columbian samples," he says. "This gives conclusive evidence for the Mexican ancestry of the Chihuahua."
The team also analysed stray dogs, confirming them generally to be runaway European dogs; but in Mexico and Bolivia they identified populations with high proportions of indigenous ancestry.
Savolainen says that the data also suggests that the Carolina Dog, a stray dog population in the U.S., may have an indigenous American origin.
Savolainen works at the Science for Life Laboratory (SciLifeLab www.scilifelab.se), a collaboration involving KTH Royal Institute of Technology, Stockholm University, the Karolinska Institutet and Uppsala University.
 

One More Homo Species? 3D-Comparative Analysis Confirms Status of Homo Floresiensis as Fossil Human Species

Ever since the discovery of the remains in 2003, scientists have been debating whether Homo floresiensis represents a distinct Homo species, possibly originating from a dwarfed island Homo erectus population, or a pathological modern human. The small size of its brain has been argued to result from a number of diseases, most importantly from the condition known as microcephaly.


Based on the analysis of 3-D landmark data from skull surfaces, scientists from Stony Brook University New York, the Senckenberg Center for Human Evolution and Palaeoenvironment, Eberhard-Karls Universität Tübingen, and the University of Minnesota provide compelling support for the hypothesis that Homo floresiensis was a distinct Homo species.
The study, titled "Homo floresiensis contextualized: a geometric morphometric comparative analysis of fossil and pathological human samples," is published in the July 10 edition of PLOS ONE.
The ancestry of the Homo floresiensis remains is much disputed. The critical questions are: Did it represent an extinct hominin species? Could it be a Homo erectus population, whose small stature was caused by island dwarfism?
Or, did the LB1 skull belong to a modern human with a disorder that resulted in an abnormally small brain and skull? Proposed possible explanations include microcephaly, Laron Syndrome or endemic hypothyroidism ("cretinism").
The scientists applied the powerful methods of 3-D geometric morphometrics to compare the shape of the LB1 cranium (the skull minus the lower jaw) to many fossil humans, as well as a large sample of modern human crania suffering from microcephaly and other pathological conditions. Geometric morphometrics methods use 3D coordinates of cranial surface anatomical landmarks, computer imaging, and statistics to achieve a detailed analysis of shape.
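For readers unfamiliar with the technique, the following minimal Python sketch (illustrative only, with made-up landmark coordinates rather than the study's data) shows the core step of geometric morphometrics: a Procrustes superimposition that strips away position, scale and orientation so that only shape differences remain:

    import numpy as np
    from scipy.spatial import procrustes

    np.random.seed(0)

    # Two made-up configurations of five 3-D cranial landmarks (rows = landmarks).
    # In a real analysis these would be digitised from skull surfaces.
    skull_a = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.1], [1.0, 1.0, 0.0],
                        [0.0, 1.0, 0.2], [0.5, 0.5, 0.8]])
    # The second configuration is the first one reoriented, rescaled and jittered,
    # standing in for a second specimen.
    skull_b = 1.7 * skull_a[:, [1, 0, 2]] + np.random.normal(0, 0.02, skull_a.shape)

    # Superimposition removes position, scale and orientation; the remaining
    # "disparity" is a measure of pure shape difference between specimens.
    _, _, disparity = procrustes(skull_a, skull_b)
    print(f"shape disparity after superimposition: {disparity:.4f}")

The study's analysis is far more extensive, comparing many landmark configurations statistically, but this is the basic operation on which such shape comparisons rest.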
This was the most comprehensive study to date to simultaneously evaluate the two competing hypotheses about the status of Homo floresiensis.
The study found that the LB1 cranium shows greater affinities to the fossil human sample than it does to pathological modern humans. Although some superficial similarities were found between fossil, LB1, and pathological modern human crania, additional features linked LB1 exclusively with fossil Homo. The team could therefore refute the hypothesis of pathology.
"Our findings provide the most comprehensive evidence to date linking the Homo floresiensis skull with extinct fossil human species rather than with pathological modern humans. Our study therefore refutes the hypothesis that this specimen represents a modern human with a pathological condition, such as microcephaly," stated the scientists.


Solar Tsunami Used to Measure Sun's Magnetic Field

A solar tsunami observed by NASA's Solar Dynamics Observatory (SDO) and the Japanese Hinode spacecraft has been used to provide the first accurate estimates of the Sun's magnetic field.


Solar tsunamis are produced by enormous explosions in the Sun's atmosphere called coronal mass ejections (CMEs). As the CME travels out into space, the tsunami travels across the Sun at speeds of up to 1000 kilometres per second.
Similar to tsunamis on Earth, the shape of solar tsunamis is changed by the environment through which they move. Just as sound travels faster in water than in air, solar tsunamis have a higher speed in regions of stronger magnetic field. This unique feature allowed the team, led by researchers from UCL's Mullard Space Science Laboratory, to measure the Sun's magnetic field. The results are outlined in a paper soon to be published in the journal Solar Physics.
Dr David Long, UCL Mullard Space Science Laboratory, and lead author of the research, said: "We've demonstrated that the Sun's atmosphere has a magnetic field about ten times weaker than a normal fridge magnet."
Using data obtained using the Extreme ultraviolet Imaging Spectrometer (EIS), a UK-led instrument on the Japanese Hinode spacecraft, the team measured the density of the solar atmosphere through which the tsunami was travelling.
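The paper's actual calculation is not reproduced in this article, but the underlying idea can be sketched with a standard coronal-seismology relation: if the tsunami front travels at roughly the Alfvén speed, v_A = B / sqrt(mu0 * rho), then a measured wave speed and plasma density together yield the field strength. The sketch below uses illustrative round numbers, not the study's values:

    import math

    MU0 = 4e-7 * math.pi            # vacuum permeability, SI units

    def field_strength(wave_speed_m_s: float, density_kg_m3: float) -> float:
        """Illustrative estimate: assuming the tsunami front moves at roughly
        the Alfven speed v_A = B / sqrt(mu0 * rho), the magnetic field follows
        from the measured wave speed and plasma density."""
        return wave_speed_m_s * math.sqrt(MU0 * density_kg_m3)

    # Illustrative numbers only: a ~500 km/s front through plasma of
    # ~1e-12 kg/m^3 (roughly 10^8-10^9 protons per cubic centimetre).
    b_tesla = field_strength(5e5, 1e-12)
    print(f"B ~ {b_tesla * 1e4:.1f} gauss")

With these round numbers the estimate comes out at a few gauss, consistent with the "about ten times weaker than a normal fridge magnet" figure quoted above.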
The combination of imaging and spectral observations provides a rare opportunity to examine the magnetic field which permeates the Sun's atmosphere.
Dr Long noted: "These are rare observations of a spectacular event that reveal some really interesting details about our nearest star."
Visible as loops and other structures in the Sun's atmosphere, the Sun's magnetic field is difficult to measure directly and usually has to be estimated using intensive computer simulations. The Hinode spacecraft has three highly sensitive telescopes, which use visible, X-ray and ultraviolet light to examine both slow and rapid changes in the magnetic field.
The instruments on Hinode act like a microscope to track how the magnetic field around sunspots is generated, shapes itself, and then fades away. These results show just how sensitive these instruments can be, measuring magnetic fields that were previously thought too weak to detect.
The explosions that produce solar tsunamis can send CMEs hurtling towards the Earth. Although protected by its own magnetic field, the Earth is vulnerable to these solar storms as they can adversely affect satellites and technological infrastructure.
Dr Long said: "As our dependency on technology increases, understanding how these eruptions occur and travel will greatly assist in protecting against solar activity."

Story source: http://www.sciencedaily.com/releases/2013/07/130711113424.htm

Gang Members Found to Suffer Unprecedented Levels of Psychiatric Illness

Young men who are gang members suffer unprecedented levels of psychiatric illness, placing a heavy burden on mental health services, according to new research led by Queen Mary, University of London.

The study, funded by the National Institute for Health Research (NIHR) and the Maurice & Jacqueline Bennett Charitable Trust, surveyed 4,664 men aged 18 to 34 in Britain. The survey covered measures of psychiatric illness, violence and gang membership. It is the first time research has looked into whether gang violence is associated with psychiatric illness other than substance misuse.
The survey sample was weighted to include significant numbers from areas with high gang membership (Hackney and Glasgow East), lower social classes and areas with a higher than average population of ethnic minority residents.
Of the total sample, 3,284 (70.4 per cent) reported that they had not been violent in the past five years, 1,272 (27.3 per cent) said they had assaulted another person or been involved in a fight, and 108 (2.1 per cent) said they were currently a member of a gang. Using these results the participants were split into three groups -- gang members, violent men and non-violent men for the analysis.
Both violent men and gang members were found to be younger than non-violent men, more likely to have been born in the UK and more likely to be unemployed.
In terms of mental health, gang members and violent men were significantly more likely to suffer from a mental disorder and access psychiatric services than non-violent men. The exception was depression, which was significantly less common among gang members and violent men.
Violent ruminative thinking, violent victimisation and fear of further victimisation were significantly higher in gang members, and were believed to account for the high levels of psychosis and anxiety disorder in this group.
The findings showed that, of the 108 gang members surveyed:
  • 85.8 per cent had an antisocial personality disorder;
  • Two-thirds were alcohol dependent;
  • 25.1 per cent screened positive for psychosis;
  • More than half (57.4 per cent) were drug dependent;
  • Around a third (34.2 per cent) had attempted suicide; and
  • More than half (58.9 per cent) had an anxiety disorder.
Professor Jeremy Coid, Director of the Forensic Psychiatry Research Unit at Queen Mary, and lead author of the paper said: "No research has previously investigated whether gang violence is related to psychiatric illness, other than substance misuse, or if it places a burden on mental health services.
"Here we have shown unprecedented levels among this group, identifying a complex public health problem at the intersection of violence, substance misuse, and mental health problems among young men.
"It is probable that, among gang members, high levels of anxiety disorder and psychosis were explained by post-traumatic stress disorder (PTSD), the most frequent psychiatric outcome of exposure to violence. However this could only partly explain the high prevalence of psychosis, which warrants further investigation.
"With street gangs becoming increasingly evident in UK cities, membership should be routinely assessed in young men presenting to healthcare services with psychiatric illness in urban areas with high levels of gang activity."
The authors suggest that the higher rate of attempted suicide among gang members may be associated with other psychiatric illness, but could also correspond with the notion that impulsive violence may be directed both outwardly and inwardly.
Street gangs are concentrated in inner urban areas characterised by socioeconomic deprivation, high crime rates and multiple social problems. The authors report that around one per cent of 18 to 34-year-old men in Britain are gang members. The level rises to 8.6 per cent in the London borough of Hackney, where one in five black men reported gang membership.
Professor Coid added: "A potential limitation of the study is that survey participants were aged 18 to 34 and the average age for gang membership is 15. So gang members in this study should be considered 'core' gang members who have not stopped in early adulthood. We need further longitudinal studies to see if our findings are due to factors specific to this group."
The research is published today in the American Journal of Psychiatry.

Novel Bicycle Saddle Prevents Chafing, Pain and Other Damage Associated With the Genital Area

Researchers at the University of Alicante have developed a novel bicycle saddle that prevents chafing, pain and other problems associated with the genital area, such as impotence and prostatitis.

It is a hinged, articulated saddle in which a narrow front section and a wide, coccyx-supporting rear section are joined by a pivot. The front part is mobile while the back is fixed, and their relative positions can be changed at the user's will.
This new concept of bicycle saddle is designed and patented by researchers from the University of Alicante's Institute of Physics Applied to Science and Technology and the Department of Physics at the Polytechnic Higher School.
The UA researcher Alfonso Panchón Ruiz, head of the research work, explained that "the main advantage of this new design compared with traditional saddles is that it allows the rider, at will, to rest and relieve the perineal area, which otherwise endures prolonged, intense compression for which it is not anatomically designed."
"The classic bicycle saddle has a unitary structure, a rigid body running front to back, which means the perineal tissues -- soft tissues not built to withstand such forces -- are permanently compressed, whatever position the rider adopts. For this reason, soon after starting to ride, nerves and arteries reach high levels of compression, causing problems linked to reduced blood supply, such as numbness and impaired genital function in both men and women; in the long run, significant pathologies requiring medical treatment may appear," Alfonso Panchón says.
Until now, only two remedies have existed: standing up on the pedals, a posture familiar from professional races and gyms alike, or dismounting and standing beside the bike, thereby abandoning the exercise.
In this sense, Alfonso Panchón explains that "with this new design it is not the user who must rise off the seat; it is the saddle itself that pivots or slides away from the rider's perineal area. This radically removes pressure from that area, immediately improving blood supply and allowing the affected tissues to recover."
Another advantage is that the rider never loses balance or control, regardless of the conditions of use (racing, touring, gym, mountain and so on), and can make new lateral movements on the mobile front part as well as immediately recover the traditional full seat, at will, with a slight initial backward movement.
Also, with this model of saddle, more than ten centimetres of clearance can be freed between the seat and the handlebars, a competitive advantage on descents because it allows a more aerodynamic position on very steep slopes.
The research team has built a prototype with which it is checking the health and medical benefits of this innovative saddle concept. Nothing similar currently exists on the market, so the technology has great potential for international commercialisation.

Where Do Muscles Get Their Power? Fifty-Year-Old Assumptions About Strength Muscled Aside

Doctors have a new way of thinking about how to treat heart and skeletal muscle diseases. Body builders have a new way of thinking about how they maximize their power. Both owe their new insight to high-energy X-rays, a moth and cloud computing.

The understanding of how muscles get their power has been greatly expanded with new results published online July 10 in the Royal Society journal Proceedings of the Royal Society B. The Royal Society is the U.K.'s national academy of sciences.
The basics of how a muscle generates power remain the same: Filaments of myosin tugging on filaments of actin shorten, or contract, the muscle -- but the power doesn't just come from what's happening straight up and down the length of the muscle, as has been assumed for 50 years.
Instead, University of Washington-led research shows that as muscles bulge, the filaments are drawn apart from each other, the myosin tugs at sharper angles over greater distances, and it's that action that deserves credit for half the change in muscle force scientists have been measuring.
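A toy trigonometric caricature (not the UW group's model) helps show why lattice spacing matters: if a myosin head pulls with a fixed force along the line joining it to the actin filament, then moving the filaments farther apart steepens that line, converting axial (shortening) force into radial force. All numbers below -- the reach, head force and spacings -- are invented for illustration:

    import math

    def force_components(spacing_nm: float, reach_nm: float = 10.0, f_head_pN: float = 5.0):
        """Toy geometry: a myosin head of fixed force bridges an actin filament
        a distance spacing_nm away while reaching reach_nm along the fibre axis.
        Wider lattice spacing steepens the bridge angle, shifting force from the
        axial (shortening) direction into the radial direction."""
        angle = math.atan2(spacing_nm, reach_nm)     # bridge angle from the fibre axis
        axial = f_head_pN * math.cos(angle)
        radial = f_head_pN * math.sin(angle)
        return axial, radial

    for spacing in (10.0, 13.0, 16.0):               # nm, invented but of plausible order
        axial, radial = force_components(spacing)
        print(f"spacing {spacing:4.1f} nm -> axial {axial:.2f} pN, radial {radial:.2f} pN")

The real model tracks thousands of crossbridges in three dimensions, but even this caricature shows how a change in lattice spacing alone redistributes force between the axial and radial directions.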
Researchers made this discovery when using computer modeling to test the geometry and physics of the 50-year-old understanding of how muscles work. The computer results of the force trends were validated through X-ray diffraction experiments on moth flight muscle, which is very similar to human cardiac muscle. The X-ray work was led by co-author Thomas Irving, an Illinois Institute of Technology professor and director of the Biophysics Collaborative Access Team (Bio-CAT) beamline at the Advanced Photon Source, which is housed at the U.S. Department of Energy's Argonne National Laboratory.
Until recently, ready access to the necessary computational power and to X-ray diffraction facilities was hard to come by, two reasons these findings are only now being documented, speculated lead author C. David Williams, who earned his doctorate at the UW while conducting the research and is now a postdoctoral researcher at Harvard University. Currently, X-ray light sources have a waiting list of about three researchers for every active experiment; the APS is undergoing an upgrade that will greatly increase access and research power and expedite data collection.
The new understanding of muscle dynamics derived from this study has implications for the research and use of all muscles, including organs.
"In the heart especially, because the muscle surrounds the chambers that fill with blood, being able to account for forces that are generated in several directions during muscle contraction allows for much more accurate and realistic study of how pressure is generated to eject blood from the heart," said co-author Michael Regnier, a UW bioengineering professor. "The radial and long axis forces that are generated may be differentially compromised in cardiac diseases and these new, detailed models allow this to be studied at a molecular level for the first time. They also take us to a new level in testing therapeutic treatments targeted to contractile proteins for both cardiac and skeletal muscle diseases. "
This study gives scientists and doctors a new basis for interpreting experiments and understanding the mechanisms that regulate muscle contraction. Researchers have known for some time that the muscle filament lattice spacing changes over the length-tension curve, but its importance in generating the steep length dependence of force had not previously been demonstrated.
"The predominant thinking of the last 50 years is that 100 percent of the muscle force comes from changes as muscles shorten and myosin and actin filaments overlap. But when we isolated the effects of filament overlap we only got about half the change in force that physiologists know muscles are capable of producing," Williams said.
The rest of the force, he said, should be credited to the lattice work of filaments as it expands outward in bulging muscle -- whether in a body builder's buff biceps or the calves of a sinewy marathon runner.
"One of the major discoveries that David Williams brought to light is that force is generated in multiple directions, not just along the long axis of muscle as everyone thinks, but also in the radial direction," said Thomas Daniel, UW professor of biology and co-author on the paper.
"This aspect of muscle force generation has flown under the radar for decades and is now becoming a critical feature of our understanding of normal and pathological aspects of muscle," Daniel added.
Since the 1950s scientists have had a formula -- the so-called length-tension curve -- that accurately describes the force a muscle exerts at all points from fully outstretched, when every weight lifter knows there is little strength, to the middle points that display the greatest force, to the completely shortened muscle when, again, strength is minimized.
Williams developed computer models to consider the geometry and physics at work on the filaments at all those points.
"The ability to model in three dimensions and separate the effects of changes in lattice spacing from changes in muscle length wouldn't even have been possible without the advent of cloud computing in the last 10 years, because it takes ridiculous amounts of computational resources," Williams said Story source