Protected areas not safe from light pollution


Protected areas, such as nature reserves and national parks, are thought to provide a refuge for wildlife, but according to a new study, many of these areas are not safe from light pollution. Thanks to increasing urbanization, many nocturnal skies are no longer dark. 

Although helpful for humans, artificial night lighting can impact nocturnal wildlife by disrupting natural reproductive cycles, disorienting migratory species, and increasing the risk of predation. To assess how well protected areas shelter wildlife from light pollution and preserve natural darkness, researchers analyzed satellite images of Earth collected at night by the Defense Meteorological Satellite Program between 1992 and 2010. Individual pixels, each representing approximately 3 square kilometers, were assigned a number based on their degree of illumination, ranging from 0 (complete darkness) to 63 (brightly lit urban areas). 

More than 170,000 unique protected areas were identified using the International Union for Conservation of Nature’s World Database on Protected Areas. 

The degree of nighttime illumination in protected areas was then compared with that in unprotected areas for each continent over the 2 decades. Although 86% of the world’s landmasses remain in relative darkness at night, darkness declined slightly in all regions over the study period.
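The comparison described above can be sketched in a few lines. This is a minimal illustration using synthetic pixel values on the 0-63 brightness scale from the article, not the researchers' actual data or code; the threshold and distributions are assumptions for demonstration only.

```python
import numpy as np

# Hypothetical digital-number grids on the 0-63 scale described above:
# 0 = complete darkness, 63 = brightly lit urban core.
rng = np.random.default_rng(0)
protected = rng.integers(0, 8, size=1000)      # mostly dark pixels
unprotected = rng.integers(0, 40, size=1000)   # wider range of lighting

# One simple metric: classify a pixel as "relatively dark" below some
# brightness threshold, then compare the dark fraction of each category.
DARK_THRESHOLD = 6  # assumed cutoff, for illustration only
dark_frac_protected = np.mean(protected < DARK_THRESHOLD)
dark_frac_unprotected = np.mean(unprotected < DARK_THRESHOLD)
print(dark_frac_protected > dark_frac_unprotected)
```

Repeating such a comparison per continent and per year is the kind of analysis that would reveal the gradual decline in darkness the study reports.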

 Protected areas were still generally darker than unprotected areas, yet protected areas experienced widespread increases in nighttime light exposure between 1992 and 2010, the team reports online this month in Conservation Biology. In Europe, Asia, and South and Central America, up to 42% of protected areas have experienced significant increases in nighttime lighting. 

A smaller percentage of protected areas in Europe (24%) and North America (17%) exhibited high levels of nighttime lighting in all years. Based on their findings, researchers propose reduced lighting zones be established around existing refuges to preserve their natural darkness and biodiversity. 

A step closer to explaining high-temperature superconductivity?


For years some physicists have been hoping to crack the mystery of high-temperature superconductivity—the ability of some complex materials to carry electricity without resistance at temperatures well above absolute zero—by simulating crystals with patterns of laser light and individual atoms. Now, a team has taken—almost—the next-to-last step in such "optical lattice" simulation by reproducing the pattern of magnetism seen in high-temperature superconductors, from which the resistance-free flow of electricity emerges.

"It's a very big improvement over previous results," says Tilman Esslinger, an experimentalist at the Swiss Federal Institute of Technology in Zurich, who was not involved in the work. "It's very exciting to see steady progress."

An optical lattice simulation is essentially a crystal made of light. A real crystal contains a repeating 3D pattern of ions, and electrons flow from ion to ion. In the simulation, spots of laser light replace the ions, and ultracold atoms moving among spots replace the electrons. Physicists can adjust the pattern of spots, how strongly the spots attract the atoms, and how strongly the atoms repel one another. 

That makes the experiments ideal for probing physics such as high-temperature superconductivity, in which materials such as mercury barium calcium copper oxide carry electricity without resistance at temperatures up to 138 K, far higher than ordinary superconductors such as niobium can manage.

Just how the copper-and-oxygen, or cuprate, superconductors work remains unclear. The materials contain planes of copper and oxygen ions with the coppers arranged in a square pattern. Repelling one another, the electrons get stuck in a one-to-a-copper traffic jam called a Mott insulator state. They also spin like tops, and at low temperatures neighboring electrons spin in opposite directions, creating an up-down-up-down pattern of magnetism called antiferromagnetism. Superconductivity sets in when impurities soak up a few electrons and ease the traffic jam. The remaining electrons then pair to glide freely along the planes.

Theorists do not yet agree how that pairing occurs. Some think that wavelike ripples in the antiferromagnetic pattern act as a glue to attract one electron to the other. Others argue that the pairing arises, paradoxically, from the repulsion among the electrons alone. Theorists can write down a mathematical model of electrons on a checkerboard plane, known as the Fermi-Hubbard model, but it is so hard to "solve" that nobody has been able to show whether it produces superconductivity.
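For readers who want the equation, the textbook form of the Fermi-Hubbard Hamiltonian (standard in the field, though not spelled out in the article) is:

```latex
H = -t \sum_{\langle i,j \rangle,\, \sigma}
      \left( c^{\dagger}_{i\sigma} c_{j\sigma} + \text{h.c.} \right)
    + U \sum_{i} n_{i\uparrow}\, n_{i\downarrow}
```

Here $t$ is the amplitude for an electron (or cold atom) to hop between neighboring lattice sites, $U$ is the on-site repulsion between opposite-spin particles, and $n_{i\sigma}$ counts particles of spin $\sigma$ on site $i$. At one particle per site with $U \gg t$, the model yields the Mott insulator and antiferromagnetic order described above; whether it also yields superconductivity is the open question. In an optical lattice, the laser intensity tunes $t$ and $U$ directly.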

Experimentalists hope to reproduce the Fermi-Hubbard model in laser light and cold atoms to see if it yields superconductivity. In 2002, Immanuel Bloch, a physicist at the Max Planck Institute for Quantum Optics (MPQ) in Garching, Germany, and colleagues realized a Mott insulator state in an optical lattice. Six years later, Esslinger and colleagues achieved the Mott state with atoms with the right amount of spin to mimic electrons. Now, Randall Hulet, a physicist at Rice University in Houston, Texas, and colleagues have nearly achieved the next-to-last step along the way: antiferromagnetism.

Hulet and colleagues trapped between 100,000 and 250,000 lithium-6 atoms in laser light. They then ramped up the optical lattice and ramped it back down to put the atoms in order. Shining laser light of a specific wavelength on the atoms, they observed evidence of an emerging up-down-up-down spin pattern. The laser light was redirected, or diffracted, at a particular angle by the rows of atoms—just as x-rays diffract off the ions in a real crystal. Crucially, the light probed the spin of the atoms: The light wave flipped if it bounced off an atom spinning one way but not the other. Without that flipping, the diffraction wouldn't have occurred, so its observation confirms the emergence of the up-down-up-down pattern, Hulet says.
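The diffraction geometry follows the usual Bragg condition from crystallography (standard physics, not detailed in the article): constructive interference occurs at angles $\theta$ satisfying

```latex
n\lambda = 2d \sin\theta
```

where $\lambda$ is the probe wavelength and $d$ the spacing of the scattering planes. Because the up-down-up-down spin order doubles the magnetic period of the lattice relative to the atomic spacing, a spin-sensitive Bragg peak appears at an angle set by that doubled spacing, one that would be absent without antiferromagnetic order. That is the signature the team looked for.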

Hulet's team solved a problem that has plagued other efforts. Usually, turning the optical lattice on heats the atoms. To avoid that, the researchers added another laser that slightly repelled the atoms, so that the most energetic ones were just barely held by the trap. Then, as the atoms heated, the most energetic ones "evaporated" like steam from hot soup to keep the other ones cool, the researchers report online this week in Nature. They didn't quite reach a full stable antiferromagnetic pattern: The temperature was 40% too high. But the technique might get there and further, Hulet says. "We don't have a good sense of what the limit of this method is," he says. "We could get a factor of two lower, we could get a factor of 10 lower."

"It is indeed very promising," says Tin-Lun "Jason" Ho, a theorist at Ohio State University, Columbus. Reducing the temperature by a factor of two or three might be enough to reach the superconducting state, he says. However, MPQ's Bloch cautions that it may take still other techniques to get that cold. 

"There are several cooling techniques that people are developing and interesting experiments coming up," he says.

Physicists are also exploring other systems and problems with optical lattices. The approach is still gaining steam, Hulet says: "It's an exciting time."

Article: ScienceMag

NIH sets aside more than $40 million for study of human placenta


The Human Placenta Project, launched last year by the National Institutes of Health (NIH) despite uncertainty over how much money would back the effort, has just received a whopping $41.5 million in 2015 to study the vital mass of tissue that sustains a developing fetus.

The placenta carries nutrients and oxygen to a fetus from its mother’s bloodstream and removes waste; problems with its performance may contribute to health concerns ranging from preterm birth to adult diabetes. Yet it is the least understood human organ, according to Alan Guttmacher, director of NIH’s National Institute of Child Health and Human Development (NICHD). 

Last year, Science reported on a NICHD workshop where planning began for a Human Placenta Project that would aim to monitor the placenta during a woman’s pregnancy, using new imaging approaches, tests for fetal molecules shed into a mother’s blood, and other tools.

That plan is reflected in the title of a 26 February request for grant applications, from NICHD and the National Institute of Biomedical Imaging and Bioengineering (NIBIB), that calls for “Paradigm-Shifting Innovations” in how to assess the human placenta. One objective is to learn how environmental factors such as a mother’s diet and exposure to pollutants affect the placenta. The $41.5 million will support eight to nine awards lasting up to 4 years.

The new funding commitment for the project comes on top of about $4.5 million in 2015 that NICHD and NIBIB have already set aside for research on tools to study the placenta. An NIH representative says that some of the additional $41.5 million could come from leftover funding from the National Children’s Study (NCS), a controversial plan to follow the health of 100,000 children for 21 years that NIH canceled in December. NIH is now looking for ways to spend $140 million that Congress appropriated for the NCS in 2015 on related studies.

Article: ScienceMag

Astronauts breeze through spacewalk to rig station for U.S. space taxis


Two U.S. astronauts whipped through a third spacewalk outside the International Space Station on Sunday to rig parking spots for new U.S. space taxis.

Station commander Barry "Butch" Wilmore and flight engineer Terry Virts expected to spend about seven hours installing antennas, cables and navigation aids on the station's exterior truss. Instead, the astronauts, who were making their third spacewalk in eight days, were back inside the space station in 5.5 hours.

The purpose of the outings was to prepare berthing slips for spaceships being developed by Boeing and Space Exploration Technologies, or SpaceX.

Wilmore and Virts floated outside the Quest airlock shortly after 7 a.m. EST/1200 GMT, a NASA Television broadcast showed. Their job was to install more than 400 feet (122 meters) of cables, a pair of antennas and reflectors that the new spaceships will use to navigate toward and dock with the station, a $100 billion laboratory that flies about 260 miles (418 km) above Earth.

After the spacewalk, Virts reported that a small amount of water had seeped into his helmet, a situation that also occurred after a spacewalk last week.

"It's no issue to crew safety," mission commentator Daniel Huot said.

In July 2013, Italian astronaut Luca Parmitano nearly drowned when water began leaking into his helmet. NASA immediately aborted the spacewalk and suspended spacewalks while engineers figured out the cause of the problem. The incident with the suit Virts was wearing is unrelated, NASA said.

Sunday's outing followed two spacewalks last week to rig power and data cables for a pair of docking port adapters that are due to arrive later this year. One adapter will be installed at the berthing slip once used by NASA's space shuttles, which were retired in 2011. The second docking system will be located at an adjacent hatch on the Harmony connecting node.

Since the shuttles' retirement, the United States has been dependent on Russia to fly crew to and from the station, a joint project of 15 nations.

NASA aims to break Russia's monopoly before the end of 2017 by buying rides from Boeing and SpaceX.

Article: Reuters

Let’s call it: 30 years of above average temperatures means the climate has changed


If you’re younger than 30, you’ve never experienced a month in which the average surface temperature of the Earth was below average.

Each month, the US National Climatic Data Center calculates Earth’s average surface temperature using temperature measurements that cover the Earth’s surface. Then, another average is calculated for each month of the year for the twentieth century, 1901-2000. For each month, this gives one number representative of the entire century. Subtract this overall 1900s monthly average – which for February is 53.9F (12.1C) – from each individual month’s temperature and you’ve got the anomaly: that is, the difference from the average.
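The anomaly calculation described above is a single subtraction. This sketch uses the February baseline quoted in the text (53.9F for 1901-2000); the sample observed value is hypothetical, chosen purely for illustration.

```python
# 20th-century (1901-2000) average for February, in Fahrenheit, from the article.
FEB_BASELINE_F = 53.9

def anomaly(observed_f, baseline_f=FEB_BASELINE_F):
    """Difference between an observed monthly temperature and its century baseline."""
    return observed_f - baseline_f

# A hypothetical February averaging 55.0F would sit 1.1F above the baseline.
print(round(anomaly(55.0), 1))  # → 1.1
```

A positive anomaly means the month was warmer than the 20th-century norm; the article's point is that every monthly anomaly since February 1985 has been positive.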

The last month that was at or below that 1900s average was February 1985. Ronald Reagan had just started his second presidential term and Foreigner had the number one single with “I Want to Know What Love Is.”

These temperature observations make it clear the new normal will be systematically rising temperatures, not the stability of the last 100 years. The traditional definition of climate is the 30-year average of weather. The fact that – once the official records are in for February 2015 – it will have been 30 years since a month was below average is an important measure that the climate has changed.

Temperature history for all Februaries from 1880-2014  NCDC


How the Earth warms

As you can see in the graphic above, ocean temperature doesn’t vary as much as land temperature. This fact is intuitive to many people because they understand that coastal regions don’t experience highs and lows as extreme as the interiors of continents do. Since oceans cover the majority of the Earth’s surface, the combined land and ocean graph strongly resembles the graph just for the ocean. Looking at only the ocean plots, you have to go all the way back to February 1976 to find a month below average. (That would be under President Gerald Ford’s watch.)

You can interpret variability over land as the driver of the ups and downs seen in the global graph. There are four years from 1976 onwards when the land was below average; the last time the land temperature was cool enough for the globe to be at or below average was February 1985. The flirtation with below-average temps was tiny – primarily worth noting in the spirit of accurate record keeping. Looking at any of these graphs, it’s obvious that earlier times were cooler and more recent times are warmer. None of the fluctuations over land since 1976 provide evidence contrary to the observation that the Earth is warming.

Some of the most convincing evidence that the Earth is warming is actually found in measures of the heat stored in the oceans and the melting of ice. However, we often focus on the surface air temperature. One reason for that is that we feel the surface air temperature; therefore, we have intuition about the importance of hot and cold surface temperatures. Another reason is historical; we have often thought of climate as the average of weather. We’ve been taking temperature observations for weather for a long time; it is a robust and essential observation.

Temperature history for every year from 1880-2014.  NOAA National Climatic Data Center


Despite variability, a stable signal

Choosing one month, February in this instance, perhaps overemphasizes that time in 1985 when we had a below average month. We can get a single yearly average for all the months in an entire year, January-December. If we look at these annual averages, then the ups and downs are reduced. In this case, 1976 emerges as the last year in which the global-average temperature was below the 20th century average of 57.0F (13.9C) – that’s 38 years ago, the year that Nadia Comaneci scored her seven perfect 10s at the Montreal Olympics.

I am not a fan of tracking month-by-month or even year-by-year averages and arguing over the statistical minutia of possible records. We live at a time when the Earth is definitively warming. And we know why: predominantly, the increase of greenhouse gas warming due to increasing carbon dioxide in the atmosphere. Under current conditions, we should expect the planet to be warming. What would be more important news would be if we had a year, even a month, that was below average.

The variability we observe in surface temperature comes primarily from understood patterns of weather. Many have heard of El Niño, when the eastern Pacific Ocean is warmer than average. The eastern Pacific is so large that when it is warmer than average, the entire planet is likely to be warmer than average. As we look at averages, 30 years, 10 years, or even one year, these patterns, some years warmer, some cooler, become less prominent. The trend of warming is large enough to mask the variability. The fact that there have been 30 years with no month below the 20th century average is a definitive statement that climate has changed.

To see a cooler Earth any time soon, you’ll need to carve one out of ice.  Kirsten Spry, CC BY-NC-SA


The 30-year horizon

There are other reasons that this 30-year span of time is important. Thirty years is a length of time in which people plan. This includes personal choices – where to live, what job to take, how to plan for retirement. There are institutional choices – building bridges, building factories and power plants, urban flood management. There are resource management questions – assuring water supply for people, ecosystems, energy production and agriculture. There are many questions concerning how to build the fortifications and plan the migrations that sea-level rise will demand. Thirty years is long enough to be convincing that the climate is changing, and short enough that we can conceive, both individually and collectively, what the future might hold.

Finally, 30 years is long enough to educate us. We have 30 years during which we can see what challenges a changing climate brings us. Thirty years that are informing us about the next 30 years, which will be warmer still. This is a temperature record that makes it clear that the new normal will be systematically rising temperatures, not the ups and downs of the last 100 years.

Those who are under 30 years old have not experienced the climate I grew up with. In thirty more years, those born today will also be living in a climate that, by fundamental measures, will be different from the climate of their birth. Future success will rely on understanding that the climate in which we are all now living is changing and will continue to change with accumulating consequences.

Story: TheConversation

[VIDEO] Australian researchers unveil world's first 3D printed jet engine


Australian researchers unveiled the world's first 3D-printed jet engine on Thursday, a manufacturing breakthrough that could lead to cheaper, lighter and more fuel-efficient jets.

Engineers at Monash University and its commercial arm are making top-secret prototypes for Boeing Co, Airbus Group NV, Raytheon Co and Safran SA in a development that could be the savior of Australia's struggling manufacturing sector.


"This will allow aerospace companies to compress their development cycles because we are making these prototype engines three or four times faster than normal," said Simon Marriott, chief executive of Amaero Engineering, the private company set up by Monash to commercialize the product.

Marriott said Amaero plans to have printed engine components in flight tests within the next 12 months and certified for commercial use within the next two to three years.

Australia has the potential to corner the market. It has one of only three of the necessary large-format 3D metal printers in the world - France and Germany have the other two - and is the only place that makes the materials for use in the machine.

It is also the world leader in terms of intellectual property (IP) regarding 3D printing for manufacturing.


"We have personnel that have 10 years' experience on this equipment and that gives us a huge advantage," Marriott told Reuters by phone from the Avalon Airshow outside Melbourne.

3D printing makes products by layering material until a three-dimensional object is created. Automotive and aerospace companies use it for producing prototypes as well as creating specialized tools, moldings and some end-use parts.

Marriott declined to comment in detail on Amaero's contracts with companies, including Boeing and Airbus, citing commercial confidentiality. Those contracts are expected to pay in part for the building of further large format printers, at a cost of around A$3.5 million ($2.75 million) each, to ramp up production of jet engine components.

3D printing can cut production times for components from three months to just six days.

Ian Smith, Monash University's vice-provost for research, said it was very different to the melting, molding and carving of the past.


"This way we can very quickly get a final product, so the advantages of this technology are, firstly, for rapid prototyping and making a large number of prototypes quickly," Smith said. "Secondly, for being able to make bespoke parts that you wouldn't be able to with classic engineering technologies."



(Editing by Paul Tait)

Scientists discover black hole so big it contradicts growth theory


Scientists say they have discovered a black hole so big that it challenges the theory about how they grow.

Scientists said this black hole was formed about 900 million years after the Big Bang.

But with measurements indicating it is 12 billion times the mass of the Sun, the black hole challenges a widely accepted hypothesis of growth rates.

"Based on previous research, this is the largest black hole found for that period of time," Dr Fuyan Bian, of the Research School of Astronomy and Astrophysics at the Australian National University (ANU), told Reuters on Wednesday.

"Current theory is for a limit to how fast a black hole can grow, but this black hole is too large for that theory."

The creation of supermassive black holes remains an open topic of research. However, many scientists have long believed the growth rate of black holes was limited.

Black holes grow, scientific theory suggests, as they absorb mass. However, as it is absorbed, the mass is heated, creating radiation pressure, which pushes further mass away from the black hole.

"Basically, you have two forces balanced together which sets up a limit for growth, which is much smaller than what we found," said Bian.
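The balance Bian describes is conventionally expressed as the Eddington luminosity (standard accretion physics, though the article does not name it): the luminosity at which outward radiation pressure on infalling ionized gas equals the inward pull of gravity,

```latex
L_{\mathrm{Edd}} = \frac{4\pi G M m_p c}{\sigma_T}
\approx 1.3 \times 10^{38} \left( \frac{M}{M_\odot} \right) \ \mathrm{erg\,s^{-1}}
```

where $M$ is the black-hole mass, $m_p$ the proton mass, and $\sigma_T$ the Thomson scattering cross-section. Because accretion at this limiting rate caps how quickly mass can grow, a 12-billion-solar-mass black hole only 900 million years after the Big Bang is hard to reconcile with the limit.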

The black hole was discovered by a team of scientists led by Xue-Bing Wu at Peking University, China, as part of the Sloan Digital Sky Survey, which provided imaging data of 35 percent of the northern hemisphere sky.

The ANU is leading a comparable project, known as SkyMapper, to carry out observations of the Southern Hemisphere sky.

Bian expects more black holes to be observed as the project advances.

(Editing by Robert Birsel)