A global shift towards healthy and more plant-based diets, halving food loss and waste, and improving farming practices and technologies are required to feed 10 billion people sustainably by 2050, a new study finds. Adopting these options reduces the risk of crossing global environmental limits related to climate change, the use of agricultural land, the extraction of freshwater resources, and the pollution of ecosystems through overapplication of fertilizers, according to the researchers.
The study is the first to quantify how food production and consumption affect the planetary boundaries that describe a safe operating space for humanity, beyond which Earth’s vital systems could become unstable.
“No single solution is enough to avoid crossing planetary boundaries. But when the solutions are implemented together, our research indicates that it may be possible to feed the growing population sustainably,” says Dr Marco Springmann of the Oxford Martin Programme on the Future of Food and the Nuffield Department of Population Health at the University of Oxford, who led the study.
“Without concerted action, we found that the environmental impacts of the food system could increase by 50-90% by 2050 as a result of population growth and the rise of diets high in fats, sugars and meat. In that case, all planetary boundaries related to food production would be surpassed, some of them by more than twofold.”
The study, funded by EAT as part of the EAT-Lancet Commission for Food, Planet and Health and by Wellcome’s “Our Planet, Our Health” partnership on Livestock Environment and People, combined detailed environmental accounts with a model of the global food system that tracks the production and consumption of food across the world. With this model, the researchers analysed several options that could keep the food system within environmental limits. They found:
- Climate change cannot be sufficiently mitigated without dietary changes towards more plant-based diets. Adopting more plant-based “flexitarian” diets globally could reduce greenhouse gas emissions by more than half, and also reduce other environmental impacts, such as fertilizer application and the use of cropland and freshwater, by a tenth to a quarter.
- In addition to dietary changes, improving management practices and technologies in agriculture is required to limit pressures on agricultural land, freshwater extraction, and fertilizer use. Increasing agricultural yields from existing cropland, balancing application and recycling of fertilizers, and improving water management, could, along with other measures, reduce those impacts by around half.
- Finally, halving food loss and waste is needed for keeping the food system within environmental limits. Halving food loss and waste could, if globally achieved, reduce environmental impacts by up to a sixth (16%).
“Many of the solutions we analysed are being implemented in some parts of the world, but it will need strong global co-ordination and rapid upscale to make their effects felt,” says Springmann.
“Improving farming technologies and management practices will require increasing investment in research and public infrastructure, the right incentive schemes for farmers, including support mechanisms to adopt best available practices, and better regulation, for example of fertilizer use and water quality,” says Line Gordon, executive director of the Stockholm Resilience Centre and an author on the report.
“Tackling food loss and waste will require measures across the entire food chain, from storage and transport, through food packaging and labelling, to changes in legislation and business behaviour that promote zero-waste supply chains.”
“When it comes to diets, comprehensive policy and business approaches are essential to make dietary changes towards healthy and more plant-based diets possible and attractive for a large number of people. Important aspects include school and workplace programmes, economic incentives and labelling, and aligning national dietary guidelines with the current scientific evidence on healthy eating and the environmental impacts of our diet,” adds Springmann.
All large-scale energy systems have environmental impacts, and the ability to compare the impacts of renewable energy sources is an important step in planning a future without coal or gas power. Extracting energy from the wind causes climatic impacts that are small compared to current projections of 21st century warming, but large compared to the effect of reducing US electricity emissions to zero with solar. Research publishing in the journal Joule on October 4 reports the most accurate modelling yet of how increasing wind power would affect climate, finding that large-scale wind power generation would warm the Continental United States by 0.24 degrees Celsius because wind turbines redistribute heat in the atmosphere.
“Wind beats coal by any environmental measure, but that doesn’t mean that its impacts are negligible,” says senior author David Keith, an engineering and public policy professor at Harvard University. “We must quickly transition away from fossil fuels to stop carbon emissions. In doing so, we must make choices between various low-carbon technologies, all of which have some social and environmental impacts.”
“Wind turbines generate electricity but also alter the atmospheric flow,” says first author Lee Miller. “Those effects redistribute heat and moisture in the atmosphere, which impacts climate. We attempted to model these effects on a continental scale.”
To compare the impacts of wind and solar, Keith and Miller started by establishing a baseline for the 2012-2014 US climate using a standard weather forecasting model. Then they added in the effect on the atmosphere of covering one third of the Continental US with enough wind turbines to meet present-day US electricity demand. This is a relevant scenario if wind power plays a major role in decarbonizing the energy system in the latter half of this century. This scenario would warm the surface temperature of the Continental US by 0.24 degrees Celsius.
Their analysis focused on the comparison of climate impacts and benefits. They found that it would take about a century to offset that effect with wind-related reductions in greenhouse gas concentrations. This timescale was roughly independent of the specific choice of total wind power generation in their scenarios.
“The direct climate impacts of wind power are instant, while the benefits accumulate slowly,” says Keith. “If your perspective is the next 10 years, wind power actually has — in some respects — more climate impact than coal or gas. If your perspective is the next thousand years, then wind power is enormously cleaner than coal or gas.”
More than ten previous studies have now observed local warming caused by US wind farms. Keith and Miller compared their simulated warming to observations and found rough consistency between the observations and model.
They also compared wind power’s impacts with previous projections of solar power’s influence on the climate. They found that, for the same energy generation rate, solar power’s impacts would be about 10 times smaller than wind’s. But both sources of energy have their pros and cons.
“In terms of temperature difference per unit of energy generation, solar power has about 10 times less impact than wind,” says Miller. “But there are other considerations. For example, solar farms are dense, whereas the land between wind turbines can be co-utilized for agriculture.” The density of wind turbines and the time of day during which they operate can also influence the climatic impacts.
Keith and Miller’s simulations do not consider any impacts on global-scale meteorology, so it remains somewhat uncertain how such a deployment of wind power may affect the climate in other countries.
“The work should not be seen as a fundamental critique of wind power. Some of wind’s climate impacts may be beneficial. So rather, the work should be seen as a first step in getting more serious about assessing these impacts,” says Keith. “Our hope is that our study, combined with the recent direct observations, marks a turning point where wind power’s climatic impacts begin to receive serious consideration in strategic decisions about decarbonizing the energy system.”
Human evolution used to be depicted as a straight line, gradually progressing from an ape-like ancestor to modern Homo sapiens. But thanks to next-generation sequencing — as well as the discovery of genetic material from extinct subspecies of early humans — findings in recent years have shown that it wasn’t quite so orderly. The human family tree is full of twists and branches that helped shape what we are today. Now, a study published in the journal Cell is reporting new details about the role of viruses in shaping evolution, in particular viral interactions between modern humans and Neanderthals.
“It’s not a stretch to imagine that when modern humans met up with Neanderthals, they infected each other with pathogens that came from their respective environments,” says study author David Enard. “By interbreeding with each other, they also passed along genetic adaptations to cope with some of those pathogens.”
Current thinking is that modern humans began moving out of Africa and into Eurasia about 70,000 years ago. When they arrived, they met up with Neanderthals who, along with their own ancestors, had been adapting to that geographic area for hundreds of thousands of years. The Eurasian environment shaped Neanderthals’ evolution, including the development of adaptations to viruses and other pathogens that were present there but not in Africa.
The Cell study provides new details about the role of adaptive introgression, or hybridization between species, in human evolution. “Some of the Neanderthals had adaptive mutations that gave them advantages against these pathogens, and they were able to pass some of these mutations on to modern humans,” explains Enard, who completed the work while he was a postdoctoral researcher at Stanford University. “That’s called positive natural selection — it favors certain individuals that carry these advantageous mutations.”
Their earlier research focused on how viruses impacted the evolution of humans. In 2016, they reported that about one-third of protein adaptations since humans split from other great apes were driven by a response to infectious viruses. The new work, building on those findings, looked at which of those adaptations may have come from Neanderthals.
In the current study, the investigators annotated thousands of genes in the human genome that are known to interact with pathogens — more than 4,000 of the 25,000 total genes. “We focused on these genes because the ones that interact with viruses are much more likely to have been involved in adaptation against infectious disease compared with genes that don’t have anything to do with viruses.”
They then looked at whether there was an enrichment of stretches of Neanderthal DNA in those 4,000 genes. Earlier studies from other groups have shown that Neanderthal DNA is present in humans. Those sequences are publicly available to investigators in the field. Based on the analysis, Enard and Petrov found strong evidence that adaptive genes that provided resistance against viruses were shared between Neanderthals and modern humans.
“Many Neanderthal sequences have been lost in modern humans, but some stayed and appear to have quickly increased to high frequencies at the time of contact, suggestive of their selective benefits at that time,” Petrov says. “Our research aims to understand why that was the case. We believe that resistance to specific RNA viruses provided by these Neanderthal sequences was likely a big part of the reason for their selective benefits.”
“One of the things that population geneticists have wondered about is why we have maintained these stretches of Neanderthal DNA in our own genomes,” Enard adds. “This study suggests that one of the roles of those genes was to provide us with some protection against pathogens as we moved into new environments.”
A team of scientists has uncovered the neural processes mice use to ignore their own footsteps, a discovery that offers new insights into how we learn to speak and play music.
“The ability to ignore one’s own footsteps requires the brain to store and recall memories and to make some pretty stellar computations,” explains David Schneider, an assistant professor at New York University’s Center for Neural Science and one of the paper’s lead authors. “These are the building blocks for other, more important sound-generating behaviors, like recognizing the sounds you make when learning how to speak or to play a musical instrument.”
The research centered on an intuition — that we are usually unaware of the sound of our own footsteps — as a vehicle for understanding larger neural phenomena: how this behavior reveals the ability to monitor, recognize, and remember the sounds of one’s own movements in relation to those of the larger environment.
“The capacity to anticipate and discriminate these movement-related sounds from environmental sounds is critical to normal hearing,” Schneider explains. “But how the brain learns to anticipate the sounds resulting from our movements remains largely unknown.”
To explore this, Schneider and his colleagues designed an “acoustic virtual reality system” for the mice. Here, the scientists controlled the sounds the mice made walking on a treadmill while monitoring the animals’ neural activity, allowing them to identify the neural circuit mechanisms that learn to suppress movement-related sounds.
Overall, they found a flexibility in neural function — the mice developed an adjustable “sensory filter” that allowed them to ignore the sounds of their own footsteps. In turn, this allowed them to better detect other sounds arising from their surroundings.
“For mice, this is really important,” said Schneider. “They are prey animals, so they really need to be able to listen for a cat creeping up on them, even when they’re walking and making noise.”
Being able to ignore the sounds of one’s own movements is likely important for humans as well. But the ability to anticipate the sounds of our actions is also important for more complex human behaviors such as speaking or playing music.
“When we learn to speak or to play music, we predict what sounds we’re going to hear — such as when we prepare to strike keys on a piano — and we compare this to what we actually hear,” explains Schneider. “We use mismatches between expectation and experience to change how we play — and we get better over time because our brain is trying to minimize these errors.”
Being unable to make predictions like this is also thought to be involved in a spectrum of afflictions.
“Overactive prediction circuits in the brain are thought to lead to the voice-like hallucinations associated with schizophrenia while an inability to learn the consequences of one’s actions could lead to debilitating social paralysis, as in autism,” explains Schneider. “By figuring out how the brain normally makes predictions about self-generated sounds, we open the opportunity for understanding a fascinating ability — predicting the future — and for deepening our understanding of how the brain breaks during disease.”
A new study finds that the enzymes the Bermuda fireworm uses to glow are unique among bioluminescent animals and entirely unlike those seen in fireflies. The study also examines genes associated with some of the dramatic — and reversible — changes that happen to the fireworms during reproduction.
The beautiful bioluminescence of the Bermuda fireworm, which lives throughout the Caribbean, was first documented in 1492 by Christopher Columbus and his crew just before landing in the Americas. The observations described the lights as “looking like the flame of a small candle alternately raised and lowered.”
The phenomenon went unexplained until the 1930s, when scientists matched the historic description with the unusual and precisely timed mating behavior of fireworms. During summer and autumn, beginning at 22 minutes after sunset on the third night after the full Moon, spawning female fireworms secrete a bright bluish-green luminescence that attracts males. “It’s like they have pocket watches,” said lead author Mercer R. Brugler, a Museum research associate and assistant professor at New York City College of Technology (City Tech).
“The female worms come up from the bottom and swim quickly in tight little circles as they glow, which looks like a field of little cerulean stars across the surface of jet black water,” said Mark Siddall, a curator in the American Museum of Natural History’s Division of Invertebrate Zoology and corresponding author of the study. “Then the males, homing in on the light of the females, come streaking up from the bottom like comets — they luminesce, too. There’s a little explosion of light as both dump their gametes in the water. It is by far the most beautiful biological display I have ever witnessed.”
To further investigate this phenomenon, Siddall, together with Brugler; Michael Tessler, a postdoctoral fellow in the Museum’s Sackler Institute for Comparative Genomics; and M. Teresa Aguado, a former postdoctoral fellow in the same institute who is now at the Autonomous University of Madrid, analyzed the transcriptome — the full set of RNA molecules — of a dozen female fireworms from Ferry Reach in Bermuda.
Their findings support previous work showing that fireworms “glow” because of a special luciferase enzyme they produce. These enzymes are the principal drivers of bioluminescence across the tree of life, in organisms as diverse as copepods, fungi, and jellyfish. However, the luciferases found in Bermuda fireworms and their relatives are distinct from those found in any other organism to date.
The work also took a close look at genes related to the precise reproductive timing of the fireworms, as well as the changes that take place in the animals’ bodies just prior to swarming events. These changes include the enlargement and pigmentation of the worms’ four eyes and the modification of the nephridia — an organ similar to the kidney in vertebrates — to store and release gametes.
A new study led by scientists at the University of Bristol has warned that unless we mitigate current levels of carbon dioxide emissions, Western Europe and New Zealand could revert to the hot tropical climate of the early Paleogene period — 56-48 million years ago.
As seen from the ongoing heat wave, the knock-on effects of such extreme warmth include arid land and fires as well as impacts on health and infrastructure.
The early Paleogene is a period of great interest to climate change scientists as carbon dioxide levels (around 1,000 ppmv) are similar to those predicted for the end of this century.
Dr David Naafs from the University of Bristol’s School of Earth Sciences said: “We know that the early Paleogene was characterised by a greenhouse climate with elevated carbon dioxide levels.
“Most of the existing estimates of temperatures from this period are from the ocean, not the land — what this study attempts to answer is exactly how warm it got on land during this period.”
Scientists used molecular fossils of microorganisms in ancient peat (lignite) to estimate land temperatures 50 million years ago. This demonstrated that annual land temperatures in Western Europe and New Zealand were actually higher than previously thought — between 23 and 29 °C — some 10 to 15 °C higher than current average temperatures in these areas.
These results suggest that temperatures similar to those of the current heat wave that is influencing western Europe and other regions would become the new norm by the end of this century if CO2 levels in the atmosphere continue to increase.
Professor Rich Pancost, Co-author and Director of the University of Bristol Cabot Institute, added: “Our work adds to the evidence for a very hot climate under potential end-of-century carbon dioxide levels. “Importantly, we also study how the Earth system responded to that warmth. For example, this and other hot time periods were associated with evidence for arid conditions and extreme rainfall events.”
The research team will now turn their attentions to geographical areas in lower-latitudes to see how hot land temperatures were there.
Dr Naafs said: “Did the tropics, for example, become ecological dead zones because temperatures in excess of 40 °C were too high for most forms of life to survive?
“Some climate models suggest this, but we currently lack critical data.
“Our results hint at the possibility that the tropics, like the mid-latitudes, were hotter than present, but more work is needed to quantify temperatures from these regions.”
Obscured by thick clouds of absorbing dust, the closest supermassive black hole to the Earth lies 26,000 light years away at the centre of the Milky Way. This gravity monster, which has a mass four million times that of the Sun, is surrounded by a small group of stars orbiting at high speed. This extreme environment — the strongest gravitational field in our galaxy — makes it the perfect place to test gravitational physics, particularly Einstein’s general theory of relativity.
New infrared observations from the exquisitely sensitive GRAVITY, NACO and SINFONI instruments on ESO’s Very Large Telescope (VLT) have now allowed astronomers to follow one of these stars, called S2, as it passed very close to the black hole during May 2018 at a speed in excess of 25 million kilometres per hour — three percent of the speed of light — and at a distance of less than 20 billion kilometres.
These extremely delicate measurements were made by an international team led by Reinhard Genzel of the Max Planck Institute for Extraterrestrial Physics (MPE) in Garching, Germany, in conjunction with collaborators around the world. The observations form the culmination of a 26-year series of ever more precise observations of the centre of the Milky Way using ESO instruments. ‘This is the second time that we have observed the close passage of S2 around the black hole in our galactic centre. But this time, because of much improved instrumentation, we were able to observe the star with unprecedented resolution’, explains Genzel. ‘We have been preparing intensely for this event over several years, as we wanted to make the most of this unique opportunity to observe general relativistic effects.’
The new measurements clearly reveal an effect called gravitational redshift. Light from the star is stretched to longer wavelengths by the very strong gravitational field of the black hole. And the stretch in wavelength of light from S2 agrees precisely with that predicted by Einstein’s theory of general relativity. This is the first time that this deviation from the predictions of simpler Newtonian gravity has been observed in the motion of a star around a supermassive black hole. The team used SINFONI to measure the motion of S2 towards and away from Earth and the GRAVITY interferometric instrument to make extraordinarily precise measurements of the position of S2 in order to define the shape of its orbit. GRAVITY creates such sharp images that it can reveal the motion of the star from night to night as it passes close to the black hole — 26,000 light years from Earth.
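To leading order, the measured redshift combines a gravitational term with a transverse (special-relativistic) Doppler term. The back-of-envelope sketch below uses only the figures quoted in this article and is illustrative, not the team’s actual analysis:

```latex
% Leading-order redshift of light from S2 at closest approach:
% a gravitational term plus a transverse Doppler term.
z \;\approx\; \frac{GM}{rc^{2}} \;+\; \frac{v^{2}}{2c^{2}}
% With the article's figures (M \approx 4\times10^{6}\,M_{\odot},
% r \approx 2\times10^{10}\,\mathrm{km}, v \approx 0.03c):
% GM/rc^{2} \approx 3\times10^{-4} and v^{2}/2c^{2} \approx 4.5\times10^{-4},
% so the total shift is of order 10^{-4} -- tiny, which is why such
% precise instruments were needed to detect it.
```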
‘Our first observations of S2, about two years ago, already showed that we would have the ideal black hole laboratory’, adds Frank Eisenhauer (MPE), Co-Principal Investigator of the GRAVITY instrument. ‘During the close passage, we managed not only to precisely follow the star on its orbit, we could even detect the faint glow around the black hole on most of the images.’ By combining the position and velocity measurements from SINFONI and GRAVITY, as well as previous observations using other instruments, the team could compare them to the predictions of Newtonian gravity, general relativity and other theories of gravity. As expected, the new results are inconsistent with Newtonian predictions and in excellent agreement with the predictions of general relativity. More than one hundred years after he published his paper setting out the equations of general relativity, Einstein has been proved right once more.
The hardware contribution of the Institute of Physics I of the University of Cologne was the development and construction of the two spectrometers of GRAVITY. The spectrometers analyse the wavelength of the observed stellar light and convert the received photons into electronic signals. ‘GRAVITY is a technological challenge. However, after more than two decades of astrophysical research on the high velocity stars in the Galactic Centre and on the development of astronomical instrumentation, the effort has been rewarded with an excellent result in experimental physics’, says Andreas Eckart from the University of Cologne.
Continuing observations are expected to reveal another relativistic effect later in the year: a small rotation of the star’s orbit, known as Schwarzschild precession, as S2 moves away from the black hole.
There may be more habitable planets in the universe than we previously thought, according to researchers who suggest that plate tectonics — long assumed to be a requirement for suitable conditions for life — are in fact not necessary.
When searching for habitable planets or life on other planets, scientists look for biosignatures of atmospheric carbon dioxide. On Earth, atmospheric carbon dioxide increases surface heat through the greenhouse effect. Carbon also cycles to the subsurface and back to the atmosphere through natural processes.
“Volcanism releases gases into the atmosphere, and then through weathering, carbon dioxide is pulled from the atmosphere and sequestered into surface rocks and sediment,” said Bradford Foley, assistant professor of geosciences. “Balancing those two processes keeps carbon dioxide at a certain level in the atmosphere, which is really important for whether the climate stays temperate and suitable for life.”
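The balance Foley describes can be sketched as a one-box model in which volcanism adds carbon dioxide at a constant rate and weathering removes it in proportion to how much is in the atmosphere. All rates and the linear weathering law below are illustrative assumptions, not values from the study:

```python
# Minimal box model of the volcanism-weathering balance: volcanic outgassing
# adds CO2 at a constant rate, weathering removes it in proportion to the
# atmospheric amount. All numbers here are arbitrary illustrative units.

def atmospheric_co2(volcanic_input, weathering_rate, c0, dt, steps):
    """Integrate dC/dt = volcanic_input - weathering_rate * C (forward Euler)."""
    c = c0
    for _ in range(steps):
        c += (volcanic_input - weathering_rate * c) * dt
    return c

# With constant outgassing, CO2 relaxes toward the equilibrium
# volcanic_input / weathering_rate, which keeps the climate temperate.
equilibrium = 2.0 / 0.01  # = 200.0 in these arbitrary units
final = atmospheric_co2(volcanic_input=2.0, weathering_rate=0.01,
                        c0=50.0, dt=1.0, steps=2000)
```

Starting from any initial amount, the model settles at the same equilibrium, which is the “certain level” the quote refers to; too much outgassing (or too little weathering) simply shifts that equilibrium upward.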
Most of Earth’s volcanoes are found at the border of tectonic plates, which is one reason scientists believed they were necessary for life. Subduction, in which one plate is pushed deeper into the subsurface by a colliding plate, can also aid in carbon cycling by pushing carbon into the mantle.
Planets without tectonic plates are known as stagnant lid planets. On these planets, the crust is one giant, spherical plate floating on the mantle, rather than separate pieces. These are thought to be more widespread than planets with plate tectonics. In fact, Earth is the only planet with confirmed tectonic plates.
Foley and Andrew Smye, assistant professor of geosciences, created a computer model of the lifecycle of a planet. They looked at how much heat its climate could retain based on its initial heat budget, or the amount of heat and heat-producing elements present when a planet forms. On Earth, radioactive elements such as uranium, thorium and potassium release heat as they decay.
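The heat budget described above can be sketched as exponential decay: each heat-producing isotope’s output falls off with its half-life. The half-lives below are standard published values, but the initial heat shares are purely illustrative, not the composition used in the study:

```python
# Sketch of a planet's radiogenic heat budget. Each isotope's heat output
# decays exponentially with its half-life (standard values, in billions of
# years); the equal initial heat shares are an illustrative assumption.
HALF_LIVES_GYR = {"U-238": 4.47, "Th-232": 14.0, "K-40": 1.25}

def radiogenic_heat(t_gyr, initial_fractions):
    """Total heat output at time t (Gyr), relative to the output at t = 0."""
    return sum(frac * 0.5 ** (t_gyr / HALF_LIVES_GYR[iso])
               for iso, frac in initial_fractions.items())

# Hypothetical budget: equal heat shares from the three isotopes at formation.
fractions = {"U-238": 1 / 3, "Th-232": 1 / 3, "K-40": 1 / 3}
heat_now = radiogenic_heat(4.5, fractions)  # after ~4.5 Gyr, well below 1.0
```

Short-lived potassium-40 dominates the early decline while long-lived thorium-232 sustains heating late, which is why the mix of elements a planet starts with, not just the total, shapes how long volcanism and degassing can persist.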
After running hundreds of simulations to vary a planet’s size and chemical composition, the researchers found that stagnant lid planets can sustain conditions for liquid water for billions of years. At the highest extreme, they could sustain life for up to 4 billion years, roughly Earth’s life span to date.
“You still have volcanism on stagnant lid planets, but it’s much shorter lived than on planets with plate tectonics because there isn’t as much cycling,” said Smye. “Volcanoes result in a succession of lava flows, which are buried like layers of a cake over time. Rocks and sediment heat up more the deeper they are buried.”
The researchers found that at high enough heat and pressure, carbon dioxide gas can escape from rocks and make its way to the surface, a process known as degassing. On Earth, Smye said, the same process occurs with water in subduction fault zones.
This degassing process increases based on what types and quantities of heat-producing elements are present in a planet up to a certain point, said Foley.
“There’s a sweet spot range where a planet is releasing enough carbon dioxide to keep the planet from freezing over, but not so much that the weathering can’t pull carbon dioxide out of the atmosphere and keep the climate temperate,” he said.
According to the researchers’ model, the presence and amount of heat-producing elements were far better indicators for a planet’s potential to sustain life.
“One interesting take-home point of this study is that the initial composition or size of a planet is important in setting the trajectory for habitability,” the researchers said. “The future fate of a planet is set from the outset of its birth.”
A new way of arranging advanced computer components called memristors on a chip could enable them to be used for general computing, which could cut energy consumption by a factor of 100.
This would improve performance in low power environments such as smartphones or make for more efficient supercomputers.
“Historically, the semiconductor industry has improved performance by making devices faster. But although the processors and memories are very fast, they can’t be efficient because they have to wait for data to come in and out,” said Wei Lu, U-M professor of electrical and computer engineering and co-founder of memristor startup Crossbar Inc.
Memristors might be the answer. Named as a portmanteau of memory and resistor, they can be programmed to have different resistance states — meaning they store information as resistance levels. These circuit elements enable memory and processing in the same device, cutting out the data transfer bottleneck experienced by conventional computers in which the memory is separate from the processor.
However, unlike ordinary bits, which are 1 or 0, memristors can have resistances that are on a continuum. Some applications, such as computing that mimics the brain (neuromorphic), take advantage of the analog nature of memristors. But for ordinary computing, trying to differentiate among small variations in the current passing through a memristor device is not precise enough for numerical calculations.
Lu and his colleagues got around this problem by digitizing the current outputs — defining current ranges as specific bit values (i.e., 0 or 1). The team was also able to map large mathematical problems into smaller blocks within the array, improving the efficiency and flexibility of the system.
Computers with these new blocks, which the researchers call “memory-processing units,” could be particularly useful for implementing machine learning and artificial intelligence algorithms. They are also well suited to tasks that are based on matrix operations, such as simulations used for weather prediction. The simplest mathematical matrices, akin to tables with rows and columns of numbers, can map directly onto the grid of memristors.
Once the memristors are set to represent the numbers, operations that multiply and sum the rows and columns can be taken care of simultaneously, with a set of voltage pulses along the rows. The current measured at the end of each column contains the answers. A typical processor, in contrast, would have to read the value from each cell of the matrix, perform multiplication, and then sum up each column in series.
“We get the multiplication and addition in one step. It’s taken care of through physical laws. We don’t need to manually multiply and sum in a processor,” Lu said.
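The crossbar arithmetic Lu describes can be simulated in a few lines: each memristor’s conductance stores a matrix entry, row voltages encode the input vector, Ohm’s law does each multiplication, and Kirchhoff’s current law does the summation at the foot of each column. The values below are illustrative, not the team’s hardware:

```python
import numpy as np

# Toy simulation of a memristor crossbar performing a matrix-vector product.
# Conductance G[i][j] stores a matrix entry; applying voltage v[i] on row i
# drives current v[i] * G[i][j] through each device (Ohm's law), and the
# currents in column j sum at its output wire (Kirchhoff's current law).
G = np.array([[1.0, 2.0],
              [3.0, 4.0]])   # conductances = stored matrix entries
v = np.array([0.5, 1.0])     # row voltages = input vector

# The analog array yields all column currents in one step:
I = v @ G                    # I_j = sum_i v[i] * G[i][j]

# A conventional processor instead reads each cell, multiplies, and sums
# in series -- the bottleneck the memristor array avoids:
I_serial = [sum(v[i] * G[i][j] for i in range(2)) for j in range(2)]
```

Digitizing the measured column currents into defined ranges, as the article describes, is what turns these analog sums back into exact bit values for numerical work.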
His team chose to solve partial differential equations as a test for a 32×32 memristor array — which Lu imagines as just one block of a future system. These equations, including those behind weather forecasting, underpin many problems in science and engineering but are very challenging to solve. The difficulty comes from the complicated forms and multiple variables needed to model physical phenomena.
When solving partial differential equations exactly is impossible, solving them approximately can require supercomputers. These problems often involve very large matrices of data, so the memory-processor communication bottleneck is neatly solved with a memristor array.
A team of scientists from the University of California, Irvine has found evidence of significant mass loss in East Antarctica’s Totten and Moscow University glaciers, which, if they fully collapsed, could add 5 meters (16.4 feet) to the global sea level.
The glaciologists estimate that between April 2002 and September 2016, the two glaciers lost about 18.5 billion tons of ice per year — equivalent to 0.7 millimeters (0.03 inches) of global sea level rise over the analyzed time period.
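The conversion from ice loss to sea level can be checked with simple arithmetic: spread the meltwater volume over the global ocean surface. The ocean area used below is a standard approximate value, not a figure from the study:

```python
# Back-of-envelope check of the article's conversion from glacier mass loss
# to global sea level rise. Ocean area is a standard approximation.
OCEAN_AREA_M2 = 3.61e14        # approximate global ocean surface area
GT_TO_KG = 1e12                # 1 gigatonne = 10^12 kg
WATER_DENSITY = 1000.0         # kg per m^3

years = 14.4                   # April 2002 to September 2016
mass_loss_gt = 18.5 * years    # ~266 Gt of ice lost in total

# Meltwater volume spread uniformly over the ocean surface:
rise_m = mass_loss_gt * GT_TO_KG / WATER_DENSITY / OCEAN_AREA_M2
rise_mm = rise_m * 1000.0      # ~0.7 mm, consistent with the reported figure
```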
UCI’s researchers discovered this by applying a locally optimized technique to data from NASA’s Gravity Recovery and Climate Experiment (GRACE) satellite mission, combined with mass balance approximations from regional atmospheric climate models and ice discharge measurements from NASA’s Operation IceBridge and MEaSUREs projects.
“For this research, we used an improved methodology with GRACE data to retrieve the mass loss in an area undergoing rapid change,” said the study’s lead author, a graduate student in UCI’s Department of Earth System Science. “By overlaying these data with independent measurements, we improve our confidence in the results and the conclusion that Totten and Moscow University are imperiled.”
Making up roughly two-thirds of the Antarctic continent, East Antarctica has been viewed by polar researchers as less threatened by climate change than the volatile ice sheets in West Antarctica and the Antarctic Peninsula.
“Both of these glaciers are vulnerable to the intrusion of warm ocean water and hold considerable potential for sea level rise,” said co-author Eric Rignot, Donald Bren Professor and chair of Earth system science at UCI. “This work highlights that East Antarctic glaciers are as important to our future as those in the continent’s western regions.”
According to co-author Isabella Velicogna, professor of Earth system science, it’s challenging to study the Totten and Moscow University glaciers because the signal of change is much weaker than that of their counterparts in the west.
“In this remote part of the world, the data from GRACE and other satellite missions are critical for us to understand the glacier evolution.”